Just in case you haven't caught one of my blogs, please allow me to introduce myself as the new editor in chief of Virtualization Review. My name is Bruce Hoard, and to put bookends on my career, I started out with Computerworld back in the early '80s and was founding editor of Network World. I then spent 20 years as a freelance writer and editor before being hired into this new role by Doug Barney, who obviously has an eye for talent, not to mention craggy good looks.
Prior to taking this job, I wrote a few white papers and magazine articles on virtualization. But all that was really only dabbling compared to what I've been doing in my first few weeks on the job: a flurry of vendor briefings, user interviews and immersion in all things virtual.
It didn't take long for me to get a handle on the importance of managing and controlling virtualization environments. In fact, it seems like almost every day a new survey drives home that idea.
Case in point: This past summer, Network World reported on a survey conducted by Network Instruments LLC, a vendor of network-analysis software tools. In this survey, 442 IT pros revealed that while 75 percent of respondents were deploying virtualization, "They lacked the adequate tools, visibility or information about their virtual environments to troubleshoot their problems."
Titled "Security and Trust: The Backbone of Doing Business over the Internet," the survey also discovered that respondents might not have the money required to invest in the tools they needed to manage their advanced environments. Why? Because 73 percent were being asked to do more with fewer resources. Among those facing tightening budgets, more than half said the reduction in financial resources could lead to IT degradation and failures.
Talk about a recipe for disaster: You install this disruptive technology that touches your servers and networks, and then you realize you don't have the means to manage it. What happens if there are major problems? How can you resolve them if you lack the necessary tools and information?
Meanwhile, virtualization technology continues to improve and proliferate. The Network Instruments survey went on to note that in 2009, an estimated 27 percent of applications were running on virtual servers. The survey also predicted that number would rise to 44 percent in 2010 and 60 percent by 2011.
Thomas Bittman, vice president and distinguished analyst at Gartner Inc., is also forecasting bullish numbers. He says that the installed base of virtual machines (VMs) will increase by a factor of 10, reaching 58 million VMs by 2012.
Asked what decision makers such as CIOs dislike about virtualization, Bittman cites "a new layer of complexity" that requires them to be very careful about how they manage their resource pools. In his worst-case scenario, companies that mismanage their virtualization installations run the risk of undoing key configuration-management and performance-management processes that may have been defined, and refined, over the course of years.
"All those processes have to change when I virtualize," he notes. "I have to change how I do capacity planning; how I deal with speed and virtualization; how I deal with offline images. I have to change all my processes, so it's very disruptive."
It is with these challenges in mind that we offer our 2010 Ultimate Virtualization Buyer's Guide, which is included in this issue. In addition to offering a slew of management tools from a variety of vendors large and small, it contains a wealth of other helpful products that cover the virtualization market from A to Z.
Check it out, and let us know what you think.
Bruce Hoard is the new editor of Virtualization Review. Prior to taking this post, he was founding editor of Network World and spent 20 years as a freelance writer and editor in the IT industry.