More Rumors About Cisco and VMware

So have things settled down for VMware now that Diane Greene is out and Paul Maritz is in? Not really, it would appear, as more rumors are churning about a possible sale to Cisco. As recently discussed here, Cisco is already a minority shareholder with a strong interest both in network virtualization and in developing many of the connectivity and infrastructure management piece parts of the so-called next generation data center. The company has the market heft and the cash to pull off an acquisition like this, plus a great track record in the fine art of integrating acquisitions into the Cisco technology base and product portfolio.

Whether it's Cisco or some other suitor, some change in the current ownership structure seems likely, according to Tom Bittman over at Gartner. Tom told me this morning that he thought "a slow divestiture" of VMware was what could be expected. Bittman thinks that EMC will slide from being an 85 percent owner today to 50 percent or less by the end of 2009. Why? His view is that unless there is a tighter technology coupling of product portfolios and marketing, the current arrangement is really not helping either company.

Posted by Tom Valovic on 09/02/2008 at 12:49 PM | 0 comments


Will AT&T Succeed in Cloud Computing?

Cloud computing is a big tent and there are a lot of interested parties. That includes telecom carriers, with AT&T recently launching a utility computing service called AT&T Synaptic Hosting. The company describes it as a "next-generation utility computing service with managed networking, security and storage for businesses." AT&T is the largest carrier in the US, but it also has a huge global footprint as an international carrier, which it will likely leverage.

I’ll have a lot more to say about carriers getting into cloud computing in future blogs. But for now, to me this announcement just underscores the fact that IT and telecom convergence is alive and well. This means not only a convergence of technology but also of markets, with telecom providers increasingly competing with IT suppliers and systems integrators to provide voice and unified communications as well as raw compute-based and Web-based services.

Why is this happening? For one thing, carriers are moving to an IT-based model and the telecom central office of the future will be – guess what? – a data center. This means they can offer outsourced capabilities not just for voice but for IT services as well. And if companies like Google and Yahoo can provide voice services, then turnabout is fair play. Telecom carriers already provide various hosted data services resident in their own privately owned and managed, geographically distributed data centers.

Will carriers be successful at this? This is where it gets interesting. As I talked about in a previous blog on telecom and virtualization, this isn’t an easy question to answer. One clue: if cloud computing represents the fulfillment of Sun’s now slightly dusty but still viable slogan that “the network is the computer,” then the importance of the network in any cloud offering shouldn’t be overlooked.

In this sense, the fact that carriers own and operate their own networks is a major trump card. They also have organizations with longstanding commitments to SLAs and five-nines reliability. On the flip side, carriers have less expertise in computing as a target market and don’t understand the enterprise data center environment, especially newer trends such as Web 2.0, as well as cloud providers coming out of the IT space do. And finally, as Mike Eaton, CEO of Cloudworks, has pointed out, their Achilles heel could very well be customer service.

Let me know what you think about their chances of success.

Posted by Tom Valovic on 09/02/2008 at 12:49 PM | 0 comments


Can Wyse Marry Thin Clients and Unified Communications?

Jeff McNaught is the “Chief Marketing Officer and Chief Customer Advocate” at Wyse. (No kidding, that’s his title; check it out. In fact, all senior executives at the company have “Chief Customer Advocate” after their titles, an interesting touch. There’s also a “Director of People Development.”)

I always enjoy speaking to Jeff, in no small measure because he seems to truly enjoy evangelizing the Wyse thin client story. Ask him about a competitor and you can tell he’s quietly thanking you for the opportunity to tick off, in rapid-fire fashion, all the reasons why the company’s portfolio is better than what competitors offer. No ducking, dodging, or equivocation.

Recently I had a chance to talk with him about what I consider to be a fascinating data point: namely, that Wyse is working on a project involving thin clients and unified communications (UC), two things you wouldn’t ordinarily associate. Since I spearheaded IDC’s research in this area for several years, any touch points between virtualization and UC tend to get my inner nerd working overtime.

Here’s the deal. Wyse is working with at least three partners on this, all with strong credentials in the space: Cisco, Avaya, and Nortel. Wyse expects to come out with a SIP-compliant product announcement for non-mobile enterprise deployments next year. McNaught said the company is working much more closely with one of these suppliers but couldn’t provide any additional detail. UC has major momentum in the enterprise these days, which means simply that, sooner or later, any supplier in the value chain associated with the hosted desktop virtualization market needs to have it on its radar screen. Stay tuned.
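
For readers who haven't worked with SIP directly, "SIP-compliant" essentially means the endpoint speaks the standard session-signaling protocol that Cisco, Avaya, and Nortel gear understands. Purely as an illustration (this isn't based on anything Wyse has shown me, and the server name and user below are hypothetical), here's a minimal Python sketch that sends a SIP OPTIONS request, the usual way to ask a SIP server what it supports:

```python
import socket
import uuid

# Hypothetical SIP server and user -- replace with real values to try this.
SIP_SERVER = ("sip.example.com", 5060)
LOCAL_USER = "thinclient"


def build_options_request(server_host: str) -> bytes:
    """Build a minimal RFC 3261 OPTIONS request (a capabilities probe)."""
    branch = "z9hG4bK" + uuid.uuid4().hex[:12]   # per-request transaction ID
    lines = [
        f"OPTIONS sip:{server_host} SIP/2.0",
        f"Via: SIP/2.0/UDP client.example.com:5060;branch={branch}",
        "Max-Forwards: 70",
        f"To: <sip:{server_host}>",
        f"From: <sip:{LOCAL_USER}@{server_host}>;tag={uuid.uuid4().hex[:8]}",
        f"Call-ID: {uuid.uuid4().hex}",
        "CSeq: 1 OPTIONS",
        "Accept: application/sdp",
        "Content-Length: 0",
        "",  # blank line ends the headers; no body follows
        "",
    ]
    return "\r\n".join(lines).encode()


def probe(server=SIP_SERVER, timeout=3.0) -> str:
    """Send the probe over UDP and return whatever the server replies."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(build_options_request(server[0]), server)
        reply, _ = sock.recvfrom(65535)
        return reply.decode(errors="replace")


if __name__ == "__main__":
    print(probe())
```

A real UC endpoint would go on to REGISTER and negotiate media via SDP, but an OPTIONS probe gives the flavor of the handshake a SIP-compliant thin client would have to speak.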

Posted by Tom Valovic on 08/25/2008 at 12:49 PM | 0 comments


Google's Darkening Cloud

When I was having problems using my Gmail account the other day, I didn’t worry too much about it, although it did give me pause. (Is “loading” the new “hourglassing”?) But when I got into the office and saw some reports about major Google outages, curiosity turned into concern. Good thing Google doesn’t have a tech support hotline for Gmail (good thing for Google, I mean).

According to a report in Computerworld, some users experienced outages that lasted more than 24 hours. The glitch also affected end users relying on the Google Apps suite of collaboration tools. Characteristically, Google provided little color as to the details of what happened and why. (How ironic would it be if, as dependency on Google increases, the company were to become every bit as unresponsive an info-bureaucracy as pre-divestiture AT&T?)

In a nicely framed overview of the hazards of cloud computing, J. Nicholas Hoover over at InformationWeek cited other recent outages, including one involving Citrix's GoToMeeting and another with Amazon’s Simple Storage Service. The article pointed out that IT managers should take a close look at their options, including but not limited to “storing data with multiple service providers and regularly backing up SaaS data on on-premises servers.” Trouble is, having to back up your own data from the cloud seems a bit off the mark.
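
Off the mark or not, the back-up-your-own-cloud-data advice is at least actionable. As one rough sketch (the account details below are hypothetical, and it assumes IMAP access is enabled on the mailbox), a short Python script run on a schedule can pull Gmail down to an on-premises disk:

```python
import imaplib
import pathlib

# Hypothetical account details -- assumes IMAP access is turned on for the mailbox.
HOST = "imap.gmail.com"
USER = "someone@example.com"
PASSWORD = "app-specific-password"
BACKUP_DIR = pathlib.Path("gmail_backup")


def backup_inbox() -> int:
    """Copy every INBOX message to a local .eml file; return the number saved."""
    BACKUP_DIR.mkdir(exist_ok=True)
    conn = imaplib.IMAP4_SSL(HOST)
    try:
        conn.login(USER, PASSWORD)
        conn.select("INBOX", readonly=True)      # read-only so the source is never touched
        _, data = conn.search(None, "ALL")
        saved = 0
        for num in data[0].split():
            _, msg_data = conn.fetch(num, "(RFC822)")
            raw_message = msg_data[0][1]         # the full RFC 822 bytes of the message
            (BACKUP_DIR / f"{num.decode()}.eml").write_bytes(raw_message)
            saved += 1
        return saved
    finally:
        conn.logout()


if __name__ == "__main__":
    print(f"Backed up {backup_inbox()} messages to {BACKUP_DIR}/")
```

Run something like that from cron and the next day-long outage at least doesn't lock you out of your own mail.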

Trusting the cloud in terms of reliability is a huge issue. Savvy IT managers won’t hesitate to hold a service provider’s feet to the fire in terms of SLAs and service guarantees. In the meantime, Google, the standard bearer for cloud computing in many respects, should rethink its Walled Fortress approach, offer some transparency, and start talking more to customers and the industry in general about its commitment to these kinds of issues.

Do you think Google needs to change the way it does business as well as the way it positions itself as a company? Send me an email and let me know what you think or post here.

Posted by Tom Valovic on 08/25/2008 at 12:49 PM | 1 comment


The Best Jazz Is On...Cable?

If you’re at all into jazz, I hope you’ve read Keith’s posting on the subject. I’m also a fan, as is colleague Ed Scannell, editor of Redmond magazine. Ed attended the New Orleans Jazz and Heritage Festival back in May and has some interesting stories to tell about it as well as his experiences meeting some jazz luminaries over the years.

Jazz fans are a small but mildly fanatical contingent, even though the genre has an audience share of something like a whopping 3 percent. If you peel away the Kenny G. and glorified elevator music stuff from that, the number drops even lower (so I guess we’ll have to keep Kenny in there to make the stats look good). But hey, there’s something to be said for keeping a great tradition alive even if sometimes it feels like it’s on life support.

Some followers and semi-followers wonder where all the new and innovative artists are hiding out these days. For a while I was wondering too. Finding new artists, CDs, and venues takes some time and effort. Even in major cities like Boston (1105 Media has an office in Framingham, a Boston burb), it seems like new talent only trickles into a small number of venues, and keeping up is a challenge. Still, I’ve found a great way to stay plugged in, even if it’s a bit mundane: my local cable provider.

My provider is Charter Communications, which offers a series of free music-only channels, something like 30 or 40. In fact, I didn’t even know they were there until I did some accidental channel surfing. The channels are outsourced from an outfit with the nondescript name of Music Choice, but they do a good job.

And here’s the sweet part. The jazz channel is first-rate because it consistently turns up a large number of new artists and CDs with the particulars of each song labeled. (Many of the excellent streaming stations out there still focus on the more classic stuff.) All that’s missing (and this is the stuff of dreams, or maybe just some new generation of IPTV): a button on the remote to click and purchase, or to redirect favorites to a folder somewhere in the cloud. Ah well.

It’s heartening to know that new CDs from new artists are coming out all the time, and encouraging that really good jazz and creatively updated interpretations of the tradition are alive and well, even if -- for me at least -- the best way to stay updated is on cable.

Posted by Tom Valovic on 08/18/2008 at 12:49 PM | 0 comments


Hyper9: Search Meets Virtualization Management

If management and automation are the new battleground for virtualization, then it will be interesting to see where the innovation comes from in this area. The large system vendors have the advantage of a huge installed base of physical management offerings but at the same time are saddled with the limitation of having to innovate incrementally “at the edges” of their existing products via release upgrades.

On the other hand, a company like HP has its hooks into just about every aspect of the next generation data center and is in a great position to leverage the increasingly de-siloed and virtualized components of server, storage, network, and applications, and to aggregate individual console feeds into the much-vaunted "single pane of glass". And just for the record, although the largely unattainable but desirable "single pane of glass" goal gets invoked by just about every management vendor we speak to, we're a long way from this becoming a reality. In fact, I invite you to consider that the problem will get worse before it gets better as, in many cases, we add yet another management layer just for virtual systems.

That said, it’s worth taking a look at some of the start-ups out there trying to innovate. Given the complexity involved, ease of use and an elegantly simplified presentation of real-time and historical data are going to be at the heart of the evolving dynamic data center. One of the companies working this problem is Hyper9. The company is based in Austin, Texas, privately held, and backed by Matrix Partners and Silverton Partners.

Editor Keith Ward and I recently spoke with Chris Ostertag, President and CEO, and Dave McCrory, CTO. Hyper9 has two innovation plays. First, the product is being offered as freeware, with optional value-add modules to come as the roadmap develops. Second, to reduce the complexity that virtualization introduces, the value prop is centered on a "Google-like" search engine as opposed to the classic hunt-and-find logic-tree approach. The initial virtual appliance works in the VMware ESX environment, and later versions will work with other hypervisors.
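
To make the "Google-like" idea concrete: instead of drilling down host by host through a console tree, you type a query and get back every object whose metadata matches. I haven't seen Hyper9's internals, so the sketch below is purely illustrative -- a toy inverted index over made-up VM inventory records -- but it captures the search-first interaction model:

```python
from collections import defaultdict

# Made-up inventory records; a real tool would pull these from the hypervisor's API.
INVENTORY = [
    {"name": "web01",  "host": "esx-a", "os": "linux",   "notes": "apache front end"},
    {"name": "sql07",  "host": "esx-b", "os": "windows", "notes": "finance database"},
    {"name": "build3", "host": "esx-a", "os": "linux",   "notes": "idle since march"},
]


def build_index(records):
    """Map every lowercase token in every field to the records containing it."""
    index = defaultdict(set)
    for i, record in enumerate(records):
        for value in record.values():
            for token in value.lower().split():
                index[token].add(i)
    return index


def search(index, records, query):
    """Return records matching all query terms (simple AND semantics)."""
    terms = query.lower().split()
    if not terms:
        return []
    hits = set.intersection(*(index.get(term, set()) for term in terms))
    return [records[i] for i in sorted(hits)]


if __name__ == "__main__":
    idx = build_index(INVENTORY)
    for vm in search(idx, INVENTORY, "esx-a linux"):
        print(vm["name"], "-", vm["notes"])
```

The hard part Hyper9 presumably solves is keeping an index like that current against live and historical data from ESX; the point here is simply the interaction model.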

Is search the future of virtualization management? Send an email and let me know your thoughts.

Posted by Tom Valovic on 08/18/2008 at 12:49 PM | 2 comments


Yankee on Oracle: “Explosive Growth”

The Yankee Group’s Third Annual Global Virtualization Survey has turned up some great data on vendor market share. I’ll mention just a few data points that I think are especially noteworthy. The survey points to some interesting shifts in the vendor landscape. One conclusion is that Oracle is poised for “explosive growth in application virtualization and virtual appliances”.

Yankee ties this prediction to the company’s recent acquisition of BEA. In addition, it sees other IT systems vendors coming on strong. Combined, Oracle, HP and IBM account for about an 11 percent market share, but this has nowhere to go but up. (IBM’s share, at about 5 percent, by the way, includes both x86 and Power Series sales.)

Another data point that grabbed my attention is that market penetration of Xen-based solutions has jumped significantly since the last time Yankee did the survey and is now up to 17 percent. This includes a Citrix XenServer share of 11 percent plus another 6 percent from Novell, Red Hat and other open source suppliers. Yankee says “this represents the biggest market share jump for any virtualization hypervisor solution.”

Finally, a comment on VMware’s new challenges. The survey data showed major market share advances from some key strategic suppliers: Microsoft (26 percent and solidly in the number two position), Citrix, Oracle, Parallels, Virtual Iron, and Sun. Laura DiDio, a research fellow with the Yankee Group who was involved in the effort (and who also, I recently learned, happens to be a neighbor of mine in Blackstone Valley horse country), stressed to me that as a result of this market movement, VMware’s once healthy two-year lead has now been whittled down to six to nine months.

Posted by Tom Valovic on 08/12/2008 at 12:49 PM | 0 comments


IT Innovation: Don’t Forget the End User

" I know engineers. They love to change things."
- Leonard McCoy (Star Trek: The Motion Picture)

I’ve been watching the Vista debacle from a safe distance, since Microsoft is looking to use application virtualization software from its purchase of Kidaro to address some issues there. The knock, of course, is that in developing Vista, Microsoft fixed a lot of things that weren’t broken. Ah yes, the impulse to tinker.

The whole thing got me thinking about Tony Picardi’s interesting research on the software complexity crisis when he was at IDC years ago. It also got me thinking about what I’ll call “the problem of innovation that exceeds the bounds of rational use” and how that works to create a new level of complexity that now haunts a lot of the IT landscape and frequently makes hapless end users into unhappy campers.

So when does innovation become excessive, unwanted, or unneeded? There are plenty of examples in other markets. Does the world really need a five-blade shaving razor? Don’t think so, and (earth-shattering disclosure) I’m still using a mere two blades and haven’t been kicked out of any high-end Boston dining establishments yet. And how about those new traffic lights that came out a few years ago that you couldn’t actually see because of the glare?

Don’t get me wrong: innovation is what drives our industry. But here’s what I don’t get. There are human factors engineers working in the auto industry and in many other SIC code sectors and industries. Their job is simple and straightforward: make things work with the end user’s needs in mind. But somehow, when it comes to IT and telecom, runaway innovation often stops otherwise savvy vendors in their tracks, and usability gets the short end of the product development stick.

I applaud Apple’s efforts to try to change this with the iPhone. They clearly get it. And Google, the Wal-Mart of Cloud Computing, has, it must be admitted, made great strides in making applications easier to use and has hired many user experience advocates over the last several years. (Actually, this is kind of a “must have” for its offerings, given that there’s little or no access to any real tech support.) And maybe -- just maybe -- hosted desktop virtualization will be done in such a way as to actually improve the end user experience (and not just the IT manager experience). Let’s hope so. But it remains one of the industry’s biggest and most underestimated challenges.

What are your thoughts about innovation tripping up the end user experience? Weigh in here or fire off an email.

Posted by Tom Valovic on 08/11/2008 at 12:49 PM | 0 comments


Cloud Computing Doesn't Have to Be Vague

I've looked at clouds from both sides now
From up and down, and still somehow
It's cloud illusions I recall
I really don't know clouds at all
-- Joni Mitchell, "Both Sides, Now"

Cloud computing is hot. But you knew that. Trouble is, it seems to have become one of those trends that generates so much buzz that vendors feel an irresistible compulsion to stand in front of the parade while deftly (or not so deftly) tweaking the definition just enough to get their product or service offering under the tent. Alas, the devil will serve lemonade before this changes.

Like virtualization, cloud computing is pretty amorphous and therefore open to marketing spin. Analysts often take on the job of deconstructing the hype and providing frameworks and taxonomies that sort out the resulting confusion. Jeff Kaplan, managing director of an independent company called ThinkStrategies, has taken a good shot at this in a recent blog where he discusses the difference between SaaS and cloud computing.

Jeff talks about an e-mail he received from a frustrated company executive discussing a trade publication that was conflating cloud computing and SaaS (it wasn't Virtualization Review -- honest!). Acknowledging that the frustration is shared by many, he goes on to make some good distinctions:

"I view cloud computing as a broad array of Web-based services aimed at allowing users to obtain a wide range of functional capabilities on a 'pay-as-you-go' basis that previously required tremendous hardware/software investments and professional skills to acquire.

"Cloud computing is the realization of the earlier ideals of utility computing without the technical complexities or complicated deployment worries. With this precept in mind, I see SaaS as a subset or segment of the cloud computing market."

I like Jeff's distinction. Cloud computing is a broad concept and can come in a wide variety of flavors. It can range from pure computing capacity resources, such as Amazon's EC2, to specific services or applications, such as what salesforce.com -- the classic SaaS poster child -- offers.
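
To put a little code behind the "pure capacity" end of that range: with EC2 you ask for raw machines by API call and pay for what you use, while with salesforce.com you never see a machine at all. Here's a minimal sketch using Amazon's boto3 SDK (the AMI ID is a placeholder, and it assumes AWS credentials are already configured in the environment):

```python
import boto3

# Assumes AWS credentials are configured (e.g. via environment variables);
# the AMI ID below is a placeholder, not a real image.
ec2 = boto3.client("ec2", region_name="us-east-1")


def rent_capacity(ami_id="ami-0123456789abcdef0", instance_type="t3.micro"):
    """Provision one on-demand instance -- raw compute capacity, billed by use."""
    response = ec2.run_instances(
        ImageId=ami_id,
        InstanceType=instance_type,
        MinCount=1,
        MaxCount=1,
    )
    return response["Instances"][0]["InstanceId"]


def release_capacity(instance_id):
    """Hand the capacity back; billing stops once the instance terminates."""
    ec2.terminate_instances(InstanceIds=[instance_id])


if __name__ == "__main__":
    instance = rent_capacity()
    print("Running:", instance)
    release_capacity(instance)
```

The SaaS end of the spectrum hands you an application instead of machines, which is exactly Jeff's subset-of-the-cloud point.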

But an interesting question comes up in this context: What are the touch points between virtualization and cloud computing? Seen from one vantage point, Google Gmail is a virtualized application that runs in Google's increasingly all-encompassing cloud. But this isn't true virtualization, is it? I asked Natalie Lambert over at Forrester what she thought:

"Great question -- and one that I get asked a lot. Honestly, I see Gmail as a classic Web application. It uses a browser to execute code and uses the processing power of the local PC -- not what I typically think about when talking about hosted app virt. In addition, Web apps can store local data (such as cookies). This is all to say that there really isn't a clear abstraction layer, thus again, not really hosted app virtualization. However, it is not the hypervisor that is my hold up -- it is the lack of abstraction."

In yet another incarnation, a cloud offering can use a form of desktop virtualization essentially as an enabling technology. In this case, that cloud offering would be called Desktop as a Service (DaaS) and be provided by either a telecom carrier or the outsourcing arm of an IT systems vendor or systems integrator, using products from companies such as Desktone. Finally, server virtualization can be viewed as an enabling technology and an accelerator of cloud computing, and SaaS providers have used it to optimize their offerings.

What are your thoughts on the sometimes tricky relationship between virtualization and cloud computing? Post here or send me an e-mail.

Posted by Tom Valovic on 08/04/2008 at 12:49 PM | 0 comments


"Sadly, VMware is No Google"

So reads the title of a recent blog post by Adam Lashinsky (who wrote a widely read profile of Diane Greene last year that appeared in Fortune). The posting made a solid point about the performance of the stock: "With shares of virtualization software maker VMware (VMW) in free-fall, it's worth noting that for all the comparisons that have been made between the EMC-controlled (EMC) company and Google (GOOG), VMware's stock has now done something Google's never did: It trades well below its opening price in 2007. As I noted when VMware went public not quite a year ago, the shares were offered at $29 but opened at $52. At one point VMware's shares soared above $125. Thursday they were down about 5% at $35 shortly before the close." However, I also found the posting interesting in that, reading between the lines, it gives an insightful glimpse into the inflated investor expectations mentality that has raised the bar so high for VMware.

Posted by Tom Valovic on 08/04/2008 at 12:49 PM | 0 comments


Surf's Up

Editor-in-chief Doug Barney passed along an interesting piece of info to Keith and me about a new EMC marketing initiative. The outreach involves an e-mail promo not from VMware but directly from EMC's Voyence marketing team. (Voyence is a configuration and change management company that EMC acquired last fall.) The promo discusses the benefits of virtualization along the lines of "Imagine if you could visualize virtual-to-physical relationships across the data center [and] enforce VMware best practice policies, such as improving utilization by finding VMs no longer in use...." and is signed simply "sincerely, EMC". Clearly EMC didn't waste much time putting itself at the head of the VMware parade.
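
That "finding VMs no longer in use" line is worth a concrete illustration. I don't know how Voyence actually implements it, but a crude proxy is simply asking vCenter for every powered-off VM; the sketch below does that with the pyVmomi SDK against a hypothetical vCenter host and credentials:

```python
import ssl

from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

# Hypothetical vCenter endpoint and credentials.
VCENTER, USER, PASSWORD = "vcenter.example.com", "readonly", "secret"


def powered_off_vms():
    """Return the names of powered-off VMs -- a crude proxy for 'no longer in use'."""
    context = ssl._create_unverified_context()   # lab-only shortcut; verify certs in production
    si = SmartConnect(host=VCENTER, user=USER, pwd=PASSWORD, sslContext=context)
    try:
        content = si.RetrieveContent()
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.VirtualMachine], True)
        return [vm.name for vm in view.view
                if vm.runtime.powerState == vim.VirtualMachinePowerState.poweredOff]
    finally:
        Disconnect(si)


if __name__ == "__main__":
    for name in powered_off_vms():
        print(name)
```

Powered-off isn't the same as abandoned, of course; a real tool would fold in last-boot times and utilization history. But it shows how little plumbing the basic question requires.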

So how long before we see a photo of Paul Maritz windsurfing?

Posted by Tom Valovic on 08/04/2008 at 12:49 PM | 0 comments


Thanks for the Use of the Market!

In a recent Computerworld article, "Microsoft's Online Woes Hint at Larger Vulnerability," Elizabeth Montalbano writes:

"Microsoft has built its massive software business by watching other companies take the lead in emerging technology markets and then following fast with competitive products that eventually become dominant once those markets begin to pay out. The company did it against IBM during the birth of the PC, Netscape during the browser wars, and is currently making a strong showing against Sony and Nintendo in the game-console market. However, Microsoft's inability so far to capitalize on online advertising and services and its inability to make any headway against Google shows that, despite its huge cash reserves, this strategy may no longer be effective."

Not so for virtualization, of course. As Montalbano alludes to later in the article, there's a very credible school of thought that holds this is exactly the strategy Microsoft is pursuing in virtualization, and it very well may succeed with it.

Posted by Tom Valovic on 08/04/2008 at 12:49 PM | 0 comments

