Data Center in a Box?

Market researcher IDC has an event they've held for three years running called the IDC IT Forum & Expo. This year's event was held at the Westin Waterfront in Boston. I was an IDC analyst prior to joining Virtualization Review, so in between sessions I had a chance to catch up with former colleagues in the Enterprise Computing Group headed up by Vernon Turner. (Fun fact about Vernon, by the way: he likes to bungee jump in his spare time and has a penchant for showing live footage of his experiences during team meetings.)

Research VP Michelle Bailey gave an interesting presentation on data center trends, including virtualization. IDC's current projection for the number of servers, both physical and virtual, that will be deployed by 2010 is 60 million (that's worldwide). There was also the inevitable discussion of the "management gap" between core technologies and systems management and automation tools. Bailey says it's now a big enough problem to cause some of the companies that IDC tracks to pull back from moving to virtualization, at least for the time being.

A few other data points that caught my attention: some companies are beginning to use alternative energy, including solar and wind turbines, as direct power sources for their data centers. And although the expensive and time-consuming (3.5 years on average) task of building a data center was described as "a last resort for many companies," IDC survey data shows that roughly a third of companies doing so are "going green" in terms of their planning and development. Finally, there's a new trend toward containerized data centers that can be drop-shipped to a site, a kind of data-center-in-a-box. Both Sun and -- interestingly -- Google are working on these types of solutions.

Posted by Tom Valovic on 06/05/2008 at 12:49 PM | 2 comments


The Buzz About B-hive

So what's the deal with VMware's snapping up of B-hive? Its core product is a virtual appliance already partner-integrated with VMware's management tools. The integrated solution can reallocate VMs on the fly and create additional ones if needed. From a 50,000-foot perspective, this move fits nicely into the company's vision of automated virtual data centers.

Those are the strategic considerations. At a more tactical level, it appears that much of the rationale for the acquisition has to do with the fact that more and more mission-critical apps are being virtualized. I recently spoke with Alex Bakman, CEO of VKernel, about the announcement.

His virtualization management company doesn't compete with B-hive but offers a VMware-certified suite of virtual appliances. Like B-hive, the company focuses on resource utilization, but from a different perspective: VKernel looks at utilization in hosts and clusters, whereas B-hive looks at end-user performance as sourced from the network.

Being a perpetually curious sort, I asked Bakman for his perspective on why VMware seems so interested in what's going on inside the "black box" of the VM. Pointing out that management is the new battlefield in virtualization, he stressed that many more mission-critical applications are being virtualized now than a few years ago.

"Companies are virtualizing SAP and Exchange Servers. They're not going to do that unless they understand performance aspects. If you asked VMware about a year ago about end user performance, they would say 'We don’t get into that… every VM is a black box.' That response no longer flies with customers."

Contrary to some other reports you might have heard, sources at VMware told me the company intends to release a beta product in late 2008, with an initial release in 2H09. B-hive's Conductor appliance is tied into both VMotion and VirtualCenter today, but further integration with the management suite is likely.
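To make the mechanics a bit more concrete, here's a minimal sketch (in Python) of the general closed-loop idea behind this class of tool: watch an end-user-facing metric against a service-level target and react by adding capacity. Every name, number, and the provisioning call here is my invention for illustration -- this is the concept, not B-hive's actual logic.

# Hypothetical sketch of a performance-driven VM control loop.
# Names, thresholds, and the provisioning step are invented for
# illustration -- this is not B-hive's actual implementation.

SLA_LATENCY_MS = 500  # target end-user response time

def check_and_react(app_name, observed_latency_ms, vm_pool):
    """Add a VM to the pool when end-user latency breaches the SLA."""
    if observed_latency_ms > SLA_LATENCY_MS:
        new_vm = f"{app_name}-vm-{len(vm_pool) + 1}"
        vm_pool.append(new_vm)  # stand-in for a real provisioning API call
        print(f"{app_name}: {observed_latency_ms} ms breaches "
              f"{SLA_LATENCY_MS} ms SLA -- provisioned {new_vm}")
    else:
        print(f"{app_name}: {observed_latency_ms} ms within SLA, no action")

pool = ["exchange-vm-1"]
check_and_react("exchange", 350, pool)  # healthy, nothing happens
check_and_react("exchange", 720, pool)  # breach, a second VM is added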

What's your take? Comment below or fire off an e-mail. And for more on the acquisition, check out Alex's blog.

Posted by Tom Valovic on 06/03/2008 at 12:49 PM | 0 comments


VDI Taxonomy Comments

Every once in a while I like to post interesting or especially insightful comments from readers. Here's a response from Bert on "Analyze This: The Flavors of Virtualization":

A decent taxonomy will benefit the whole industry! On the desktop virtualization definition, I would suggest further subdividing into the two flavors "client-side" (VMware ACE, VirtualBox, etc.) and "server-side" (essentially all VDI flavors), as these are sometimes mixed up. Also, IDC made a taxonomy of what they call "Virtualized Client Computing" (click here for a graphical overview and link).

Brian Madden recently made a flowchart comparing the Server-Based Computing paradigm to desktop virtualization, an area I find is also confusing a lot of clients.

Finally, I also like Manlio Vecchiet's (Microsoft) definition of Virtual Desktop Architecture: "the storage and execution of a desktop workload (OS, apps, data) on a virtual machine in the datacenter and the presentation of the UI via a remote desktop protocol (such as RDP) to user devices such as thin and rich clients." This is what I would call "server-hosted" desktop virtualization, and his definition encompasses both the Server-Based Computing model (i.e., Terminal Services) as well as Virtual Desktop Infrastructures (VDI).

Thanks, Bert. If any other readers have any thoughts or interesting links, send them my way.

Posted by Tom Valovic on 05/30/2008 at 12:49 PM | 0 comments


Telecom Virtualization Part 2

In my last blog I talked about the connection between what some Tier 1 telecom carriers were doing in the digital home and the emerging market for hosted desktop virtualization. I also suggested that -- in theory -- some of them might eventually offer some sort of "quintuple play" consisting of wireline and wireless phone service, Internet access, IPTV, and hosted desktop as a service.

Some skeptics out there might be thinking "in your dreams," and you know what? I'm inclined to be pretty skeptical too. But I think it's still worthwhile to pursue this little thought experiment a bit.

Here's the thing: many of the problems being experienced by home users aren't all that different from what IT departments are contending with. That's why Firedog and Geek Squad have grown the way they have. I counted something like 17 Firedog trucks lined up in the parking lot of a Circuit City near my decidedly ex-urban town recently.

These businesses are booming because home PCs are not easy to use and maintain. In other words, manageability, cost, and complexity are all issues that home users have to deal with -- the very same issues that many IT departments already contend with. Oh, and did I mention all the folks (like my Mom in Florida) who could be online and using a computer if they could only figure out how to do it? (Read: new market penetration.)

This could be a win-win. The carrier gets a broader market and a new revenue stream, and gets hooked into the home user's computing ecosystem. And hosted desktop in the home could alleviate many end-user hassles, extend capabilities to new classes of users, and lay an important cornerstone in making utility computing more than a buzzword. There are plenty of users who would be more than happy with a plug-and-play thin client that gets them into the game but doesn't require truck rolls, lots of time and attention, and occasional elevations of blood pressure to maintain.

But wait, there's more (as they like to say in those commercials for kitchen-appliances-you-will-never-use). Fewer security hassles, automated backup (what does a service like Carbonite cost these days?), a help desk option, and -- if the carriers are smart -- you pay only for the applications you use. Add high performance via fiber to the home, and it all starts to look pretty attractive.

This is all technologically possible. But is it market-feasible? Carriers could easily stumble against the likes of IT-centric cloud computing providers (think Google/IBM) who might decide to go after this market on both ease of use and marketing, two factors that are closely linked. Let me know what you think. Post your comments here or send an email to the address below.

Posted by Tom Valovic on 05/27/2008 at 12:49 PM | 0 comments


Virtualization Notebook 5.24.08

On HP's Radar? Shane Robison, HP's chief technology and strategy officer, was recently interviewed by Newsweek about his thoughts on the company's overall technology direction. While there was mention of HP's efforts in data center automation and IT outsourcing, the subject of virtualization didn't come up -- even in the context of a discussion about green IT. Curious...

A Few Data Points on Virtualization Planning. Cirba's co-founder and CTO Andrew Hillier recently stopped by our offices to discuss the company's latest product announcement, Version 5.0 of its planning analysis software, dubbed DCI. I asked him whether large enterprise customers were doing the planning for virtualization themselves or hiring systems integrators or consultants to do it. His view is that there's no one dominant trend, and that approaches depend on which vertical segment you're talking about. Another interesting data point had to do with Microsoft's future direction in developing management tools under the System Center VMM banner. He agreed with our view that Microsoft's features and functionality would have a strong appeal to the SMB market, given certain aspects of price/performance.

Netuitive Warns About Silos. Editor Keith Ward and I spoke recently with Marketing VP David Heimlich about the company's direction in virtualization. The Virginia-based company has been in the IT management market since 2002 but specializes in a patented algorithm that "learns" the best metrics for assessing physical server performance (as opposed to using pre-selected, one-size-fits-all metrics) and now does the same for VMs. That's one ingredient in its secret sauce. The other (more specifically related to virtualization) is that its solution can gather data from VMware's B-hive product, which is more end-user- and application-centric, and correlate it with host performance. Interestingly, Heimlich observed that in some IT shops there's a trend toward virtualization becoming yet another organizational silo, which over the long term can put a dent in the efficiencies that virtualization promises.
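To make that "learning" idea concrete, here's a toy Python sketch of the general approach: rank candidate host metrics by how well their history tracks the thing you actually care about (say, end-user response time), rather than hard-coding a CPU threshold. The data is made up, and this is my own illustration of the concept -- emphatically not Netuitive's patented algorithm.

# Toy illustration of "learning" which metric matters: rank candidate
# host metrics by correlation with observed end-user response time.
# Conceptual sketch only, with invented data -- not Netuitive's algorithm.
from statistics import correlation  # Python 3.10+

response_time_ms = [210, 230, 480, 250, 510, 220, 490]  # end-user view
host_metrics = {
    "cpu_percent":    [35, 44, 41, 38, 36, 42, 37],
    "disk_queue_len": [1, 1, 9, 2, 11, 1, 10],
    "net_mbps":       [80, 85, 82, 88, 79, 83, 81],
}

# Pick the metric whose history best explains the response-time series.
ranked = sorted(
    host_metrics,
    key=lambda m: abs(correlation(host_metrics[m], response_time_ms)),
    reverse=True,
)
for metric in ranked:
    r = correlation(host_metrics[metric], response_time_ms)
    print(f"{metric:15s} r = {r:+.2f}")
# In this made-up data, disk queue length -- not CPU -- is the telling
# metric, which a fixed "watch CPU" rule would have missed.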

Posted by Tom Valovic on 05/24/2008 at 12:49 PM | 0 comments


Telecom and Virtualization

Like riddles? Here’s one. What’s the connection between virtualization, Firedog, Verizon, and Nicholas Carr? In this and the next few blogs, I’m going to float a bit of a thought experiment and I hope you’ll find the outcome interesting. The terrain to be explored: the connection between telecom and virtualization.

Virtualization is usually thought of strictly in an IT context, and even network virtualization falls into that category. But virtualization also has many linkages to the world of telecom, even though some of them are still formative.

During my years as an IDC analyst, a large part of my practice focused on convergence. Just as there are many types of virtualization, there are many types of convergence. The one I'd like to focus on is what's happening in the digital home.

Telecom giants like AT&T and Verizon have "quad play" strategies, which involve providing TV, wireless and wireline phone service, and Internet access to consumers. In this context, AT&T has something called a "3-screen strategy," which involves the PC as well.

Cable companies are following suit. When one of these companies starts to successfully integrate these offerings (and most are still not there yet), you start to get something like the notion of a utility service, like water or electricity. That's where Nicholas Carr comes in. Taking IT and computing to the next level in this way is something he's been talking about for a while now.

Now take a step back and add another puzzle piece: the home PC. A telecom carrier could use desktop virtualization from a company like Desktone to provide a desktop-as-a-service, or DaaS, capability -- adding one more important piece to the digital home's utility model, and quintuple play in the bargain. Farfetched? Not at all -- telecom carriers such as Verizon are already evaluating it. In my next blog entry, I'll talk more about how this could happen and suggest some future scenarios.

Posted by Tom Valovic on 05/22/2008 at 12:49 PM | 0 comments


Virtualization Hype: A Reality Check

Most of us involved with virtualization agree that it’s the hottest trend in IT right now. But we have to be careful not to get too carried away. So here’s my list of a few reality checks to keep in mind:

1. Virtualization does not serve as an adequate substitute for a cup of Starbucks Komodo Dragon blend on a rainy Monday morning.

2. Virtualization will not win the election this fall.

3. Virtualization is not a guarantee of future performance and involves risks and uncertainties which are often difficult to predict. This means that actual future results may differ from what has been forecasted or projected in forward-looking statements.

4. Neither Radiohead nor the Rolling Stones have concept albums planned around the notion of virtualization.

5. Virtualization will not take out the trash. (Please do not tell this to my wife. It’s been my fallback for the last few weeks.)

6. Virtualization will not bring about world peace.

7. Virtualization cannot deliver pizza. (See "Virtualization With Anchovies".)

Posted by Tom Valovic on 05/19/2008 at 12:49 PM | 4 comments


What Do IT Managers Think About Storage Virtualization?

There are surveys, and there are surveys. One conducted recently by Symantec was hardly what you would call scientific, because the sample size just wasn't large enough (a catch-and-grab at a trade show). But it did shed some light on the current thinking of IT managers about some of the trials and tribulations that might be expected in implementing storage virtualization.

The survey targeted IT storage managers, system administrators, managers, and C-level executives. Some of the findings: both storage management and backup/recovery were the top challenges in VMware environments, with nearly a third of respondents indicating that their data protection software for virtual environments is lacking.

In addition, the top two backup challenges cited were "scheduling backups" (46 percent) and "too much I/O overhead" (41 percent). Other key data points: nearly 20 percent of respondents indicated that they back up the entire physical server instead of individual virtual machines, and only 30 percent said that recovering individual files from a virtual machine is easy.

In our July issue, we’ll be looking into what’s driving the storage market with an industry snapshot. Stay tuned.

Posted by Tom Valovic on 05/15/2008 at 12:49 PM | 0 comments


Analyze This: The Flavors of Virtualization

"I wanna say things that nobody really understands." -- Pete Townsend, Coolwalking

After being an analyst for many years, I began to notice an interesting phenomenon: sometimes I couldn't switch the analyst off. Problem with the car? Let's generate a quick logic tree diagram and fix that puppy. Can't get the top off the pickle jar? No problem, we'll just quickly jot down a three-dimensional matrix, triangulate all the variables, and have lunch...

But it wasn't just me, I found out. One day I walked into the office at IDC and fellow analysts at the proverbial water cooler were discussing a Red Sox game. But here's the thing: they weren't just talking about the game, they were analyzing it, poring over every little nuance, nook, and cranny of the previous night's contest with an intensity that was impressive if also slightly depressing. (I myself would probably have been happy with "The idiots lost again.")

I guess what I'm getting at here is that analysts sometimes have a tendency to complicate things. But the best analysts can cut through a market and model it, after a fashion. This is particularly useful when trying to grok an emerging market, because otherwise you get the blind-men-and-the-elephant phenomenon, where there's a lot of confusion about terminology.

Emerging markets have unformed taxonomies and fuzzy competitive dynamics, and capturing all of that can be a bit like nailing Jell-O to the wall. That's why discussing them sometimes feels like speaking the same language, but in different, slightly unintelligible dialects. And that's also why getting the definitions straight, until industry parlance can settle in (i.e., standardize a bit), becomes really important.

The emerging virtualization market is now struggling with this problem and is fraught with semantic wormholes. But in trying to sort it out, it's usually better to start somewhere and level set on a smaller topic, rather than trying to boil the ocean.

So let's start with client virtualization. Analyst Natalie Lambert over at Forrester has come up with a good schema for getting a handle on it. Basically, she lumps four sub-types under the client virtualization umbrella:

  1. local desktop
  2. hosted desktop
  3. local application
  4. hosted application

Lambert also tosses two more types of client virtualization into the mix: OS streaming and workspace virtualization (such as what's being provided by RingCube). She's also provided some useful definitions for both desktop and application virtualization. Desktop virtualization is defined as

"a computing environment, consisting of an operating system, applications, and associated data, that is abstracted from the user's PC."

Simple enough, right? Application virtualization is also defined:

"An application that is abstracted from the user's OS and runs in isolation from other applications."

I know this seems like pretty basic stuff to some of you -- but trust me, definitions are going to become increasingly important as this market gets more complex and mature.

In future blogs, I'll try to tackle some other taxonomy and definitional issues. In the meantime, let me know your thoughts on what some of the most challenging points of confusion are out there on the different types of virtualization.

Posted by Tom Valovic on 05/08/2008 at 12:49 PM | 0 comments


Throw Away Your Laptop, Says MokaFive

If virtualization were a state, Stanford University would be its capital. That's where Diane Greene, Mendel Rosenblum, and others worked on the technology that would eventually lead to the founding of VMware in 1998, allowing virtualization to cross the chasm into potential widespread adoption.

Now another brain trust spun off from Stanford is off and running with a startup called MokaFive that's going after desktop virtualization in a unique way. The company has some formidable competition, including the big three (Microsoft, Citrix, and VMware) and the thin-client providers. It has also targeted the desktop-as-a-service (DaaS) market that other startups such as Desktone are going after with a vengeance.

I always enjoy looking under the hood and finding out what makes a new start-up tick. But tech moves so fast that history sometimes gets shortchanged. We're all so focused on pursuing the Next Big Thing that it's hard to take the time to appreciate how the technology was actually developed. That's why, in this case, it was nice to visit the Website and see a company cognizant of its roots. There, a downloadable PDF of the original paper that started it all, "The Collective: A Cache-Based System Management Architecture," was prominently highlighted. (Not that anyone except maybe some competitors is actually going to read it.)

With a slew of new vendors knocking on Virtualization Review's door for briefings, we're trying to keep up. Editor Keith Ward and I recently spoke with John Whaley, one of the founders and the CTO of MokaFive, to get a handle on what they're all about. According to the site, John is responsible for "technical vision," and it's a pretty interesting one, bringing one version of the notion of utility computing a step closer to reality. (In this context -- and speaking of history -- it's interesting to look back at what John McCarthy said about utility computing at the MIT Centennial back in, uh, 1961: "If computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility ... The computer utility could become the basis of a new and important industry." Welcome to 1961, folks!)

So let me see if I can lay it out a bit. Let's say you're a road warrior and you walk into your hotel room. There are some "utility" devices in the room: a TV, a refrigerator, and a general-purpose computer. The computer has some chops but no personality. The real computer is hanging around your neck or on your key chain -- a thumb drive containing your PC as a virtual machine. It's the ultimate sneakernet: plug it into the USB port, boot up, and you're good to go. Such an approach, if widely adopted, would give rise to a wide proliferation of similar utility devices in public places. And unlike the Internet kiosks you see in airport terminals, these might actually get used.

MokaFive's LivePC virtual machines contain an entire operating system and application stack that fits on a USB flash drive and updates automatically whenever there's Internet connectivity. And of course the company also offers DaaS, with a model that stands in opposition to the thin-client approach on technical performance. I asked Whaley what he thought of new approaches to thin clients like what Wyse is offering with its recently announced Viance portfolio. (See our May/June issue for a story on this.) He contends that MokaFive's PC-based offering can provide a much better multimedia experience than what's available from bandwidth-constrained thin clients, even with these enhancements.
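As a thought experiment on how the auto-update piece of any system like this might work -- and I stress that everything below is my guesswork, with an invented manifest format and URL, not MokaFive's actual design -- the drive could carry a version stamp that gets compared against a published manifest whenever connectivity is available:

# Guesswork sketch of a portable-VM update check -- NOT MokaFive's
# actual protocol. The manifest format and URL are invented.
import json
from urllib.request import urlopen

LOCAL_IMAGE_VERSION = 41  # version stamp stored alongside the VM image

def needs_update(manifest_url):
    """Compare the on-drive image version against a published manifest."""
    with urlopen(manifest_url) as resp:
        manifest = json.load(resp)
    return manifest["image_version"] > LOCAL_IMAGE_VERSION

# On plug-in, the player might run something like:
# if needs_update("https://example.com/livepc/manifest.json"):
#     download_and_apply_delta()  # hypothetical helper, not shown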

We'll have a lot more to say about this company and its competitors going forward. But for now, let us know what you think of MokaFive's vision for utility computing and whether it's likely to enjoy widespread adoption.

Posted by Tom Valovic on 05/05/2008 at 12:49 PM | 0 comments


Is Uncle Sam Virtualizing? Here Are Some Answers

How are federal government IT managers viewing virtualization? To find out, ScienceLogic asked more than 100 federal agency IT managers, systems administrators, and network engineers at the FOSE conference to spill the beans. ScienceLogic is an IT management solutions company that offers a product called the EM7 Meta-Appliance, a centralized data and reporting repository. XM Satellite Radio is one of its customers.

I recently talked with Dave Link, the company's CEO, about what they found. Dave is an affable gent who spoke to me from Interop in Vegas where his company has been tasked with doing the network management for the show itself. Not a bad gig to have, and Dave didn't seem too disappointed about the marketing value of that opportunity.

The survey addressed a number of issues including, but not limited to, IPv6 readiness, virtualization, ITIL/CMDB, "green IT," and Web 2.0 use. However, for some reason I found myself irresistibly drawn to the topic of virtualization. Here the results were more in the way of confirmation than blockbuster revelations.

Fewer than 15 percent of respondents actually have a solution installed (comparable, of course, to the numbers we hear for the commercial market), and 42 percent of agency IT shops plan to implement virtualization next year. Finally, the survey took a look at "green IT" as a separate and distinct phenomenon. The key takeaways? It's important (75 percent), but a low priority. And only 13 percent of those surveyed actually have any tools in place.

The full results of the survey are here. To see comparisons between the 2007 and 2008 results, look here.

Posted by Tom Valovic on 04/29/2008 at 12:49 PM | 0 comments


Getting a Better Handle on I/O Virtualization

OK, I admit it: I'm intrigued by the names of companies. Take xkoto, for example, a company that Virtualization Review recently spoke with (see Keith's recent blog on this). Obligingly, the company's Website actually tells you how the name was created: "xkoto is a transliteration of two Greek words which taken together convey the idea of 'out of darkness, out of chaos.'" I just want to go on the record here by saying that I heartily endorse the idea of moving out of chaos...

3Leaf Systems is another company that sounds like it might have an interesting story behind its name; unfortunately, its Website doesn't illuminate. We could, however, surmise that the three leaves are the three major areas of virtualization the company is addressing: memory, CPU, and I/O. Right now I just want to talk about one "leaf," namely I/O virtualization, the only area where the company is currently shipping product.

You're no doubt hearing a lot these days about desktop, server, and storage virtualization, and less about I/O. Yet this is an important area for any redesigned data center architecture looking to capitalize on the full benefits of virtualization -- i.e., keeping the resource base highly flexible and adaptive. In many respects, however, I/O development has not kept pace with the other forms of virtualization technology.

Virtualization is about making enterprise infrastructure hardware-independent, but it's about Layer 1 independence as well. Server I/O is limited by cabling and physical connections, which require manual intervention by IT staff to make changes. This makes it hard to scale and keeps the process of virtualizing servers from attaining optimal flexibility. I/O virtualization solutions can alleviate this by virtualizing the connections between server NICs or HBAs and their associated switches.
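Here's a rough Python sketch of what that buys you conceptually: once the server-to-uplink association is a table in software rather than a cable, "rewiring" becomes an assignment statement. The model below is mine, radically simplified for illustration.

# Simplified model of I/O virtualization: each server sees virtual
# NICs/HBAs whose binding to physical uplinks is a software mapping.
# Re-homing a server means updating the table, not pulling cables.

io_map = {
    # virtual adapter -> physical uplink port
    "server01.vnic0": "eth-switch-A.port3",
    "server01.vhba0": "fc-switch-B.port7",
    "server02.vnic0": "eth-switch-A.port3",  # uplinks can be shared
}

def rehome(virtual_adapter, new_uplink):
    """Move a virtual adapter to a different physical port in software."""
    old = io_map[virtual_adapter]
    io_map[virtual_adapter] = new_uplink
    print(f"{virtual_adapter}: {old} -> {new_uplink} (no recabling)")

rehome("server01.vnic0", "eth-switch-A.port5")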

But let's loop back to the company. 3Leaf provides ASICs, which lay the foundation for addressing this problem. The Santa Clara-based company is headed up by B.V. Jagadeesh, and Intel Capital is one of its venture partners. In a nutshell, the V-8000 Virtual I/O Server enables configurable "soft connections" between servers and network and storage switches. Recently the company announced what it calls the Virtual Compute Environment, its roadmap for a fully virtualized x86 data center infrastructure. Memory and CPU products are forthcoming.

3Leaf claims that the V-8000 reduces the number of NICs and HBAs by up to 85 percent and the number of cables by up to 70 percent, certainly an interesting value proposition.
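For a feel of where numbers like that could come from -- and these are my own back-of-the-envelope figures, not 3Leaf's methodology -- consider a rack where each server's half-dozen dedicated adapters get replaced by a single shared fabric link:

# Back-of-the-envelope consolidation arithmetic with made-up figures --
# illustrative only, not 3Leaf's published methodology.
servers = 16
adapters_before = servers * (4 + 2)  # 4 NICs + 2 HBAs per server = 96
adapters_after = servers * 1         # one shared fabric link each = 16

reduction = 1 - adapters_after / adapters_before
print(f"{adapters_before} adapters -> {adapters_after} "
      f"({reduction:.0%} reduction)")  # 96 adapters -> 16 (83% reduction)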

Before I close, let me just briefly mention that, going forward, as we talk to this company and others, we'll be digging a little deeper into these kinds of improvement metrics to help readers get a better handle on the types of calculations that support them. As we do this, I hope you'll share any success stories (or challenges) with us to keep it real. A few reality checks here and there couldn't hurt, right? Thanks and stay tuned.

Posted by Tom Valovic on 04/29/2008 at 12:49 PM | 0 comments

