When Will IT Technology Mature?

If you step back for a moment from the breakneck speed with which the Internet and IT have developed over the last 15 years, you realize that IT in general is still a very immature market. Mature markets have certain characteristics, and one is that at some point the seemingly endless cycles of innovation begin to fade. At the same time, technology curves, products, and end users' familiarity and comfort with actually using the products involved (there's a concept!) settle into a known model with reasonably predictable characteristics.

Microsoft’s Vista saga is a good example of the beginning of the end of a self-fulfilling sales and innovation cycle that has reached the point of diminishing returns. It may mark a segue into a more mature phase for that segment, but I doubt it. Given what we know about the impact of cloud and virtualization on IT today, it’s much more likely that we’re just entering a new phase of even more disruptive innovation. To anyone reading this who was hoping for things to settle down a bit: sorry, I just don’t see it happening. But it does point to the need for IT professionals to constantly upgrade their skill sets.

So if IT is showing no signs of really maturing as a market (at least by the way I’ve defined market maturity), what can we expect? For starters, innovation will continue to be radical. Consider the mix: cloud computing, virtualization, IT/telecom convergence, and multicore on the technology side; more robust and pervasive (i.e., wireless) communications in the middle; and unprecedented levels of end user empowerment, processing power, and creativity on both the consumer and business sides via Web 2.0 and social networking tools. It’s a volatile recipe for continuing radical change (and that doesn’t even factor in the imploding global economy). Fasten your seat belts.

Posted by Tom Valovic on 11/20/2008 at 12:49 PM


VirtualLogix on VMware’s Trango Strategy

To get the kind of perspective on VMware’s Trango acquisition that only a competitor can provide, Editor Keith Ward and I recently spoke with one of the more visible companies in this space, VirtualLogix. (Other competitors include Open Kernel Labs, HipLogix, and Green Hills Software.) Fadi Nasser, VP of product marketing, and Mike Seashols, chairman, provided some color on the deal from their vantage point. As a competitor, the company of course has its own bias. Still, I thought the comments were useful.

I asked Mike Seashols if he thought this was a major strategic direction for VMware, given Trango’s small size and the fact that no new business unit has been established. His comment was that the potential market is “huge,” although I didn’t have much luck pressing him for market research projections on the specific revenue opportunity. Mike said he thought the deal did in fact represent a major strategic direction for VMware, but that the company had a “back of napkin strategy about how to execute it.” Another interesting comment: in his opinion, VirtualLogix is “a couple of years ahead of Trango,” though he confirmed that the two companies have indeed bumped into each other competitively.

Posted by Tom Valovic on 11/17/2008 at 12:49 PM


VMware Tackles Telecom

VMware’s acquisition of a small French start-up called Trango Virtual Processors signals a new direction for the company. But how big a deal is this? Trango is tiny: around 20 people. Nevertheless, VMware being VMware, the move got its share of industry attention.

Trango’s product, an embedded bare-metal hypervisor for virtualizing mobile devices, is called the Mobile Virtualization Platform, or MVP. A key unanswered question right now is whether VMware/Trango is going after the same customer base that competitors like VirtualLogix currently serve. This includes mainstream telecom giants like Avaya, Alcatel-Lucent, and NEC.

The mobile device market is in a state of flux right now because cell phones are, at bottom, special-purpose computers, and multicore is going to amp their capability to an impressive level. This is just another area where IT/telecom convergence (one of the areas I used to cover as an analyst) is making its mark. (You can see it in just about every nook and cranny of the telecom market, actually.)

Traditional mobile phone suppliers like Nokia and Motorola are feeling pressure from IT players like Apple (iPhone) and Google (Android), which are busy trying to reinvent the very notion of what a mobile handset is and does. Add the fact that desktop capability is already migrating to mobile devices like iPhones and BlackBerrys, and the value for VMware becomes apparent. But here’s the kicker question: the traditional telecom supplier market is huge, although complex and highly specialized. Will VMware go after it?

What are your thoughts on the VMware deal? Post here or send me an email.

Posted by Tom Valovic on 11/16/2008 at 12:49 PM


Keep an Eye on this Company

Simtone is an interesting company to watch, and I’ll tell you why: it’s way ahead of the thin client curve. If the trends it’s helping to jumpstart actually gain serious momentum (which could take a while…and of course the company needs to execute on its vision), Simtone could become successful very fast. The market? Mobile thin clients hosted by carriers along the lines of a BlackBerry model, and maybe even devices that can accommodate voice (although be advised that this last part is my own speculation; CEO Mario Dal Canto would not spill those particular beans in a recent conversation I had with him and Lorenzo Mejia, the company’s Senior EVP of Carrier Business).

While mobile thin clients sound like a nice idea, we’re nowhere near that reality yet because the wireless bandwidth just isn’t there to support it. That will change when carriers start offering GSM-based HSPA (High-Speed Packet Access). Dal Canto thinks we’ll start to see this come onstream sometime next year. The company currently has trials going with three major European carriers and a U.S. operator involving its USP and VSP software platforms for delivering cloud computing services. The solution currently supports both VMware and Microsoft virtualization configurations.

Posted by Tom Valovic on 11/12/2008 at 12:49 PM


VMware in the Wild

Will virtualization catch on in the Mojave Desert? How about on the North Dakota plains? I have no information whatsoever on these questions, but virtualization is alive and well in the wilds of New Hampshire, where VMware’s New England user group recently held a meeting. Why New Hampshire? I’m not sure, but many of the attendees I met from Maine and New Hampshire were happy about the logistics.

In any event, I was able to spend the day there and found it time well spent. After lunch, Mark Bowker, an analyst with Enterprise Strategy Group (ESG), gave an excellent presentation on the state of the virtualization market. Mark has one of the best presentation styles I’ve seen: he engages the audience and doesn’t shy away from the “tough” virtualization questions, like what happens to IT head count when a data center gets virtualized. (In at least one case that was mentioned, the answer was a 70% staff reduction!)

Bowker asked the audience of approximately 100 attendees how many had attained the somewhat rare condition of being 100% virtualized. About three people raised their hands. I’ll do some more blogging on this event and the presentation, but another key point he made had to do with something we’ve discussed before: many shops are going down the “virtualize first, then optimize” path, as he put it. Once they’ve virtualized the sweet spot, though, the real work often begins: taking on the challenge of scaling storage and reconfiguring backup capability.

Posted by Tom Valovic on 11/09/2008 at 12:49 PM


Reader Comments on VDI and Web 2.0

It’s always great to hear from readers; it’s one of the most enjoyable parts of my job. Here’s a response that came in recently to the blog post on Web 2.0 and VDI. It’s from Amy Hodler, Director of Product Management with a company called Tranxition.

“I read your post…regarding the collision course between DaaS/VDI and Web 2.0 and appreciate your bringing up a trend that people are just starting to recognize and few seem to want to discuss. I believe the “countervailing” trend you mention is larger in scope than you’ve pointed out in the article.

The reality is that people are performing personal tasks and storing personal items on work equipment, and that creates risk and IT “drag.” What must evolve is a framework that acknowledges the behavioral reality but protects corporate assets. The lines between work and personal life have already become intertwined, and technology (and IT processes) will eventually adapt, but as we are reactive creatures it might get worse before it gets better.

This issue has been on my mind for some time because of the nature of my work. I talk to a lot of customers, partners, and analysts about how best to manage end user data/settings/state from an enterprise standpoint…and there’s a noticeable change in the dialogue lately around serious management issues related to the collision of personal and business tasks/software/platforms/behavior. It’s a very interesting trend, and I have some ideas on where we will end up, but I’m not sure what it will look like in the middle. Good article.”

Posted by Tom Valovic on 11/04/2008 at 12:49 PM


Heavy (Not Bare) Metal Virtualization

OK, I admit it: I’ve been remiss in sprinkling enough clever pop culture references into my blog posts. And having been an analyst for nine years, I still have a tendency to slip into the ex cathedra mode of “analyst speak.” Old habits die hard; what can I say? I was reminded of this recently while reading a post by my colleague right down the hall, Lee Pender. Lee wrote about unified communications, a post I enjoyed reading and referenced in another blog, so I won’t do it again here. But I was also glad to see that Lee managed to work a good Led Zeppelin reference into the piece, since I happen to be a musician and a great Led Zeppelin fan. (In fact, my college roommate from Boston University back in the early Paleolithic period is Steve Davis, who wrote the bestselling book about Zep, “Hammer of the Gods.”) Now let’s see, what’s a good band to reference in conjunction with network virtualization? Hmm, let me think…

Posted by Tom Valovic on 11/03/2008 at 12:49 PM


Zeus Technology: Virtualize Everything?

You’ve undoubtedly heard the conventional wisdom that virtualization impacts just about everything in the kingdom of IT. I have to say, based on the long list of vendors we’ve been talking to over the last several months, that this certainly seems to be the case. A lot of mainstream IT vendors have adapted their marketing strategies to the virtualization market and are selling into virtualized environments.

Zeus Technology is one of them. The company’s core product is an application delivery controller called ZXTM that also incorporates load balancing capability. Customers include BT, China Telecom, and NASA. The product is available as software, as a hardware appliance, or as a virtual appliance for VMware's Virtual Infrastructure 3 offering.

The company’s pitch is that since its product is software, it can replace hardware-based systems from companies such as F5 and Citrix, reside on the virtualized Web server (its target market), and offer a more cost-effective approach to scaling up. The company’s marketing tagline is “virtualize everything,” meaning all of the ancillary hardware that supports server or desktop virtualization.
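For readers who want the concept made concrete, here’s a minimal sketch in Python of the round-robin selection at the heart of any software load balancer. It’s purely illustrative: this is not ZXTM’s code or API, and the backend addresses are invented.

```python
# A toy round-robin load balancer core. Illustrative only: not Zeus's
# ZXTM product or API; the backend addresses below are made up.
import itertools

class RoundRobinBalancer:
    """Cycles through a fixed pool of backend servers."""

    def __init__(self, backends):
        self._cycle = itertools.cycle(backends)

    def pick(self):
        # Each call returns the next backend in the rotation.
        return next(self._cycle)

lb = RoundRobinBalancer(["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"])
for _ in range(6):
    print(lb.pick())  # cycles through the three backends twice
```

The point of doing this in software rather than in a dedicated box is that the same logic can run inside a virtual appliance and be scaled by adding instances rather than hardware.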

Some of this is pure market positioning, but there’s also substance here: virtualization is also about the shift from hardware to software and services, and that shift is how data centers will, over time, become more agile. The company is a certified VMware virtual appliance partner but says it has plans to support Hyper-V. It will also be offering an OVF-compliant version as the standard progresses.

Posted by Tom Valovic on 11/02/2008 at 12:49 PM


VDI: An IT "Range War"?

According to Jeff Jennings, VP of Desktop Products and Solutions at VMware, the top three drivers for hosted desktop solutions are cost, management, and security. Improved security is especially valuable in regulated industries like financial services and healthcare: with hosted desktops, security can be monitored more closely in the data center, and virus threats can be reduced.

An item in Computerworld by Jim Duffy over at Network World sheds some interesting light on this problem. The article discusses “behavioral risks taken by employees in increasingly distributed and remote locations,” drawing on a study commissioned by Cisco. The study, conducted by InsightExpress LLC, surveyed 2,000 employees and IT professionals in 10 countries, focusing on security in the context of increasingly “untethered” work environments.

Among the key risky behaviors found: altering security settings to bypass IT policy, and unauthorized use of applications. On the latter point, the article states that “seven out of 10 IT professionals said employee access of unauthorized applications and Web sites ultimately resulted in as many as half of their companies' data loss incidents. This belief was most common in the U.S. (74%) and India (79%).”

What that tells me is this: a tough battle is shaping up between end users and IT departments over the locking down of computers. My former colleague Dan Kusnetzky had this to say in a blog post about personalization vendor AppSense:

“Organizations have been asking for the best of both worlds: a locked-down, encapsulated environment and the ability to let staff members personalize their work environment. They’ve learned that if the IT organization proceeds down a path toward a fully locked-down environment, they often appear heavy-handed and insensitive to staff members’ needs. In some organizations, this has led to a “range war” between the IT department and the world at large. Everyone loses when the situation degenerates into open conflict.”

What are your thoughts about the likelihood of an IT “range war” as Dan described it? Post here or send me an email at [email protected].

Posted by Tom Valovic on 10/26/2008 at 12:49 PM


Employee-owned PCs: Good Idea?

Jeff Jennings, who heads up VMware’s VDI solutions, recently told me he thinks something like 80% of desktop machines are candidates for virtualization. I told Jeff that seems high to me, in light of the fact that at least one analyst firm projects that only 5% of desktops will be virtualized by 2011. But Jennings thinks that if the trend toward employee-owned computers now seen in Asia catches on, it could fuel the movement toward VDI adoption. He says the top three drivers VMware is seeing for hosted desktop these days are cost, manageability, and security.

With the employee-owned approach, users are given a stipend from the company to go out and purchase their own computers. Jennings cited the example of recent college graduates coming into the corporate workforce who might want to keep using their Macs. With VDI, they can bring their Macs to the office and, via hosted desktop virtualization, access all the corporate applications they need, or even a full hosted desktop. Time will tell whether this approach will translate well to U.S. corporate culture.

Posted by Tom Valovic on 10/26/2008 at 12:49 PM


VMM 2008 Released

As expected, Microsoft has released System Center Virtual Machine Manager 2008 (otherwise, and far more succinctly, known as VMM 2008) more or less on schedule. VMM 2008 will compete head to head with VMware’s own management offering, VirtualCenter, rebranded as vCenter at this year’s VMworld. As Editor Keith Ward has blogged previously, this is a very important product for Microsoft as it works to establish its credibility and the richness of its virtualization portfolio.

Microsoft’s secret sauce for VMM 2008 is a simple but powerful feature: it manages physical as well as virtual machines, a major element in the so-called “single pane of glass” value proposition. In addition, Microsoft is taking a more “open” road to management, since the product can also manage VMware ESX. Along this path toward openness, Microsoft is working closely with Citrix to push the DMTF’s OVF standard for VM metadata, which is likely to have a major impact on creating more management interoperability. By contrast, VMware’s idea of openness is to continue to pull partners into its orbit by exposing APIs for the development of virtual appliances.

Management is the next battleground for the “big three” virtualization vendors (VMware, Microsoft, and Citrix). Citrix does not have a strong management offering, but it can, as in other areas, enjoy the coattail effect of working with Microsoft. Practically speaking, the real competition is between VMware and Microsoft.

But that’s not the whole management picture by any means. The big four vendors in IT management, namely CA, IBM, HP, and BMC, are not sitting on their hands. CA, for example, recently announced its Data Center Automation Manager, which, like VMM 2008, manages both virtual and physical environments.

Add to the mix some interesting startups such as Fortisphere and Hyper 9, and the landscape gets even more intriguing. Hyper 9, for example, brings the element of search into the management mix, and Fortisphere is pushing the envelope in areas like configuration management with automated policies, a kind of innovation it claims the 800-pound gorillas won’t be able to match.
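To see why search resonates as VM sprawl sets in, here’s a toy sketch in Python of keyword search over a VM inventory. This is a conceptual illustration only, not Hyper 9’s product or data model; the inventory records are invented.

```python
# Toy keyword search over a VM inventory. Conceptual illustration only:
# not Hyper 9's product; all records below are invented.
inventory = [
    {"name": "web-01",  "host": "esx-a", "os": "Windows Server 2003", "tags": ["prod", "iis"]},
    {"name": "db-02",   "host": "esx-b", "os": "SUSE Linux",          "tags": ["prod", "oracle"]},
    {"name": "test-07", "host": "esx-a", "os": "Windows XP",          "tags": ["dev"]},
]

def search(term):
    """Return the names of VMs whose metadata contains the term."""
    term = term.lower()
    return [vm["name"] for vm in inventory
            if term in " ".join([vm["name"], vm["host"], vm["os"]] + vm["tags"]).lower()]

print(search("prod"))     # ['web-01', 'db-02']
print(search("windows"))  # ['web-01', 'test-07']
```

Trivial at three records, but the same idea applied to thousands of VMs, snapshots, and configuration attributes is what makes search a management feature rather than a convenience.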

Posted by Tom Valovic on 10/21/2008 at 12:49 PM


Tideway’s CEO on the Green Data Center

It's often hard for people outside the slightly specialized world of distributed computing and data centers to get their heads around just how critical and complex these systems have become. So I find the attention the media is now lavishing on the data center industry bittersweet. “Sweet” because it certainly serves to put the dark art of data center computing into perspective. A recent article in the Economist noted that by 2010, the total number of servers in the U.S. is expected to grow to 15.8 million, located in 7,000 data centers nationwide – the biggest of which currently contain up to 80,000 servers each.

“Bitter” because of the increasingly significant impact data center energy consumption is having on the planet. Between 2000 and 2005, annual usage rose from just over 50 billion kWh to over 150 billion kWh. Over half of this energy is used to power servers, and 40% is used to keep those servers cool enough to operate. According to the EPA, data centers now account for 1.5% of all electricity consumption in the U.S., up from 0.6% in 2000 and 1% in 2005. McKinsey and the Uptime Institute estimate that data centers globally account for more annual carbon dioxide emissions than the entire country of Argentina.

In this context, the new key metric is “performance-per-watt,” and there is plenty of scope to improve it. The problem is that the people, processes, and systems required to drive this initiative forward are seriously lagging behind. A research program on best practices set up by the EPA has only 54 volunteers. Most data center administrators are nowhere close to knowing what’s running on which servers in their data centers. That makes adopting more efficient hardware, and co-locating software on a single box through virtualization, risky, slow, and arduous.
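To make the metric concrete, here’s a back-of-envelope sketch in Python. Only the 10-15% utilization figure (cited later in this piece) comes from the text; the server count, wattage, and consolidation target are assumed purely for illustration.

```python
import math

# Back-of-envelope performance-per-watt arithmetic. The 12% utilization is
# the midpoint of the 10-15% figure cited in the post; everything else
# (server count, wattage, consolidation target) is an assumed example value.
SERVERS = 100               # hypothetical fleet
UTILIZATION = 0.12          # midpoint of the cited 10-15%
WATTS_PER_SERVER = 400      # assumed average draw
PEAK_PERF = 100.0           # arbitrary performance units per server at 100%

useful_perf = SERVERS * PEAK_PERF * UTILIZATION   # work actually being done
power_before = SERVERS * WATTS_PER_SERVER
print(f"Before: {useful_perf / power_before:.3f} perf-units per watt")

# Consolidate the same workload onto virtualized hosts run at 60% utilization.
TARGET_UTILIZATION = 0.60
hosts = math.ceil(useful_perf / (PEAK_PERF * TARGET_UTILIZATION))
power_after = hosts * WATTS_PER_SERVER
print(f"After:  {useful_perf / power_after:.3f} perf-units per watt "
      f"({hosts} hosts, {1 - power_after / power_before:.0%} less power)")
```

Under these assumed numbers, consolidating onto 20 hosts quintuples performance-per-watt and cuts power draw by 80%, which is the whole argument for virtualization as a green strategy in miniature.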

Power efficiency is only one constraint when it comes to data center design, planning, and operation. Consider compliance regulations, real estate costs, and the placement of servers and compute cycles to allow for latency, cooling, and power supply – and the truly mind-bending challenge of data center optimization becomes clear. As virtualization and cloud computing in all their variants really take off, energy consumption and cost may be optimized by moving workloads from one virtual machine to another, but this will only increase the cost and complexity of managing the applications.

The right IT management tools can help a company reduce its carbon footprint – and save considerable costs in the process. That’s not a myth. The process isn’t necessarily simple, and there is certainly additional work involved when we’re talking about adopting an enterprise-wide green strategy. But before they can get to the big work in the data center – decommissioning inefficient or useless servers, virtualizing existing resources, relocating – organizations need to be able to approach these initiatives with accurate and thorough intelligence about their IT assets and the relationships between infrastructure, specific business services, and energy consumption. Without this intelligence, it can be hard to know what to do to reduce emissions, much less how to do it without risking the interruption of business-critical, revenue-generating services.

The onus is on IT to get the basics right. In the average data center, we’ll find most power-hungry servers running at only 10-15% utilization. Combine this with the 30-plus percent of servers the Uptime Institute estimates are obsolete or decommissioned – but still drawing power – and it should be clear that there are significant and immediate cost and green savings to be had by implementing processes that quickly identify and remove these inefficiencies. In a journey of a thousand steps, these are the first strides.

Posted by Tom Valovic on 10/20/2008 at 12:49 PM

