Here in the Boston area, if you live outside of Route 495 you're considered to be, like in medieval times, "beyond the pale." I like that just fine actually. My little Massachusetts town has six horse farms, lots of open space and rolling hills, less congestion, more fresh air, and I can actually see the stars at night. The town center is quintessentially New England with the mandatory three white steepled churches and a picturesque town common ... so much so that (I'm told) it was used to depict, guess what, a quintessential New England town in a Hollywood film years ago.
I typically work at home a few days a week. But working at home occasionally has its drawbacks. Like woodpeckers. Pesky, persistent, "can't get a clue" woodpeckers. Like the one that seems to show up for work every day intent on systematically deconstructing the cornerboard of my house, and my concentration along with it.
Woody's got great timing. He invariably shows up right outside my second-floor home office when I'm writing. (Granted, that's pretty often, but it usually seems to happen when I'm working on something that requires a lot of concentration.) In the past, when Woody showed up and started hammering away about three feet from my head, I had to run downstairs, out the back door and around the side of the house to chase him away. As you might expect, doing this three or four times in the course of an afternoon can get a little tedious. Now when he (or she) goes to work, I grab a bamboo stick left over from a martial arts class I took a few years ago and start tapping back on the wall. Seems to work like a charm. (Well, most of the time anyway.)
Posted by Tom Valovic on 12/29/2008 at 12:49 PM
John Suit is CTO and founder of Fortisphere, a startup that focuses on automation technologies for virtual environments. He made some predictions for the virtual space in 2009 and shared them with us recently. One of them I found particularly intriguing. Imagine this scenario: a business unit owner goes to the IT admin and says, "How come I can go to Amazon and get an EC2 cloud instance for $15 a month and 15 cents an hour, and I have to pay you guys $9,000 a year?"
Suit points out that if this becomes a trend, IT will "have to have a really good answer." Indeed. He also says his company is seeing this kind of thing happening already. If the economy's current duck dive provides even more momentum to cloud (highly likely, in my opinion), the trend might get legs. And while we're talking Amazon, a fun fact from David Lynch, VPM over at Embotics, with whom I had a similar conversation: Amazon's external, cloud-based IT transaction activity recently crossed a threshold and is now greater than the company's internally generated activity.
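For what it's worth, the arithmetic behind that hypothetical complaint is easy to check. Here's a back-of-the-envelope sketch in Python, using only the rates from the quote above (the $9,000 figure is the scenario's, not a benchmark, and a real EC2 bill would add storage and bandwidth charges):

```python
# Back-of-the-envelope comparison using the rates quoted above.
# Assumes a single instance running around the clock; storage,
# bandwidth and support costs are deliberately ignored.

BASE_MONTHLY = 15.00       # $/month, per the quoted scenario
HOURLY_RATE = 0.15         # $/hour, per the quoted scenario
INTERNAL_ANNUAL = 9000.00  # $/year, the figure the business unit owner cites

hours_per_year = 24 * 365
ec2_annual = 12 * BASE_MONTHLY + hours_per_year * HOURLY_RATE

print(f"EC2, always on: ${ec2_annual:,.2f}/year")    # ~$1,494
print(f"Internal IT:    ${INTERNAL_ANNUAL:,.2f}/year")
print(f"Ratio: {INTERNAL_ANNUAL / ec2_annual:.1f}x") # ~6x
```

Even if the real numbers shift by a factor of two in IT's favor, the gap is wide enough to explain why business unit owners are starting to ask the question.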
Posted by Tom Valovic on 12/17/2008 at 12:49 PM
I'm not an economist, nor do I play one on TV. Nor would I want to play one on TV or, for that matter, be an economist, a practitioner of what historian Thomas Carlyle called the dismal science. Frankly, I still can't quite figure out what economists actually do, other than spin out fanciful theories about why things do or do not work that seem to have little relationship to whether things do or do not work. Still, it must be nice to have a job like that. In any event, here are a few of my own non-expert opinions on what's going on right now with the global economy.
Clearly, aspects of the economy have gotten so complex that they elude the understanding of even very sophisticated thinkers. On numerous occasions the experts have been baffled, standing before the public without cogent answers or explanations. Combine the complexity of exotic designer financial instruments with the ability to move money around the planet on a near-instantaneous basis and, voilà, you get the kind of volatility we're seeing today. But here's the thing: Having the valuation of your life savings, retirement money or any huge chunk of personal financial resources move up and down like a yo-yo is simply not a sustainable economic model. That said, it's surprising to me how many people have accepted this as if there were no alternative.
As I wrote in Digital Mythologies, I believe that the combined social and economic forces of IT and telecom foster wealth-generating mechanisms that are still not fully understood (I rest my case on this one, folks -- look around). If you're interested, e-mail me and I can point you to research that documents this phenomenon. But this mechanism has not mapped well onto the old "smokestack" economic models. Result: what we're seeing today.
Cure? I have no idea, other than that some radical, enlightened and creative new thinking is called for and, once proposed, will most certainly need some sort of global buy-in.
Posted by Tom Valovic on 12/15/2008 at 12:49 PM
VMware recently held an event in Boston called Virtualization Forum 2008. I'd estimate attendance in the 400-600 range. My only gripe is that the company seems to have a penchant for piping in high-decibel hard rock music just before the first morning session (the same thing happened at VMworld before the Maritz keynote, as some of you may recall). While I'm an avid rock fan, I would prefer to just sit there in a quiet, coffee-sipping lump at that time of day. But hey, that's just me.
The Boston session wasn't exactly full of surprises, although I found it curious that VMware appears to be a bit dodgy about the progress of the vStorage initiative it was so eager to talk about in Vegas. Something might be in the works that has caused the company to freeze its messaging. An anecdotal comment I heard the other day suggesting more incipient EMC involvement tends to bolster that line of thinking.
One of the interesting data points to come out of this session is that VMware is now considered the tenth-largest software company. At the event, Michelle Bailey from IDC gave a first-rate presentation, noting that VDI deployment levels now stand at about 5 percent. Considering that server virtualization stands at 10-12 percent, that's an impressive stat. Bailey also talked a bit about the data center power challenges facing companies these days, pointing out that for many of them the problem isn't cost; it's simply getting enough power off the grid to meet their needs.
Posted by Tom Valovic on 12/15/2008 at 12:49 PM
I was joking with a former IDC colleague the other day that I seem to see more folks from IDC's Enterprise Computing Group (my former business unit) now than when I was there. Of course, analysts are on the road or working from home more often than not. But I recently had an enjoyable lunch with John Humphreys and Julie Geer from Citrix at our usual business hangout, Legal Seafoods. Until recently, John covered the virtualization market for IDC; he's now senior director of solutions marketing with Citrix, with responsibilities that cut across a number of different business units there.
John had some interesting observations on a number of topics. We talked quite a bit about Citrix' cloud strategy, and he pointed out that most of today's cloud providers (including Amazon EC2) are using open source Xen hypervisors, and many have developed their own management systems. This is an opportunity for Citrix to upsell products like its Workflow Studio and NetScaler.
He also mentioned that Citrix plans to add a local client virtualization product in Q1, which could address a critical issue we've discussed here before -- i.e., knowledge workers not accepting VDI because they're used to customizing their PCs with personal programs, gadgets and Web 2.0 resources. (See "Two IT Trends on a Collision Course.") A partitioning hypervisor on the PC or laptop would in effect keep the corporate "half" of the PC unsullied while giving end users their own sandbox for whatever applications they deem useful -- in theory, a win-win for both the end user and the IT staff, which would have far less labor-intensive maintenance to perform.
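To make the split concrete, here's a purely illustrative sketch in Python. The class, names and policy fields are all hypothetical -- this is my toy model of the concept, not any announced Citrix (or other vendor) product:

```python
# Toy model of a client-side partitioning hypervisor policy.
# Every name and field here is hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class Partition:
    name: str
    managed_by: str           # who owns patching and updates
    user_installable: bool    # can the end user add software?
    synced_to_datacenter: bool

laptop_policy = [
    # The corporate "half": locked down, centrally patched,
    # synced back so IT always holds a clean golden image.
    Partition("corporate", managed_by="IT",
              user_installable=False, synced_to_datacenter=True),
    # The personal sandbox: the user's programs, gadgets and
    # Web 2.0 tools live here and never touch the corporate image.
    Partition("personal", managed_by="end user",
              user_installable=True, synced_to_datacenter=False),
]

for p in laptop_policy:
    installs = "allowed" if p.user_installable else "blocked"
    print(f"{p.name}: managed by {p.managed_by}, user installs {installs}")
```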
What do you think? Is this the answer to the so-called "desktop dilemma" as VMware likes to call it? Post here or shoot me an e-mail.
Posted by Tom Valovic on 12/10/2008 at 12:49 PM
I kind of like IBM's clever marketing idea. An ad click parlayed off the theme of defining server virtualization takes you to a special-purpose Web site, sponsored by both IBM and Intel. The site is a kind of workspace to share definitions of server virtualization and even allows you to draw a diagram using a software program called Whiteboard.
So here's how it works. You create your definition. Then site visitors get to vote on each one. Each contributor's definition is shown along with the total number of votes. Right now there are 22 definitions posted (though some are half-hearted efforts, to be sure). In addition, you can click on links to "most recent," "most discussed" or "most votes."
Here's the one that, when I checked at least, had the most votes:
There's no question, virtualization has matured nicely over the past few years. Recently, with data center battle-tested products like VMware, it has become a key enabling technology for an even broader scope of markets. For the data center and enterprise, the benefits of virtualization are numerous and obvious. The need for high availability platforms that scale on demand has paved the way for larger, application-aware and multiple OS capable architectures. In addition, server consolidation to provide efficiencies in power consumption, maintenance and other overhead costs, has become critical. There are lots of other areas where virtualization reduces costs and provides efficiencies, including cooling, application/OS testing and associated man hours, as well as reduced backup, security and OS software licensing fees. For many in the enterprise, virtualization is a virtual no-brainer. In fact, many current business models in IT wouldn't even exist without virtualization today.
On the other end of the spectrum, the mainstream consumer or small business has been living in a "maintain and upgrade every 2-3 years" paradigm for a very long time for their generalized computing requirements. It has been only recently that the average consumer end-user has seen the benefits of virtualization technology but they are becoming more apparent here as well. Though perhaps it's not the be-all, end-all of general computing, virtualization has shown its merits to an ever-increasing base of end-user types. Though some may claim that cloud computing and virtualization are different, there are many commonalities between what the enterprise and data center markets call virtualization and what end users have at their disposal now for online application, backup, and synching services. From MobileMe to Amazon's EC2, virtualization has now officially gone mainstream and there's no end in sight with its numerous application potential and extremely low cost model. Though the computing enthusiast or gadget freak may not be comfortable with a reality where all of their processing and storage resources are handled virtually, let's face it, the mainstream end user simply doesn't have much use for all that hardware.
A few years from now, many end users will be comfortable with a simple netbook or a thin client as their desktop and then the rest of all that technology will reside in the cloud. In short, from the data center to the enterprise and now the end user, virtualization is here to stay and it's not just for CIOs and Senior Technicians anymore.
Some interesting points there. Only one problem: It's not a definition!
Posted by Tom Valovic on 12/08/2008 at 12:49 PM
Tarkan Maner, President and CEO of Wyse, and CMO Jeff McNaught stopped by our offices in Framingham to chat with Redmond magazine editor-in-chief Doug Barney, executive editor Lee Pender and yours truly about the company's future direction. Maner is not your typical CEO in many respects ... as I could tell when he and Jeff immediately gravitated toward the Rock 'Em Sock 'Em Robots game in our conference room as soon as they arrived.
Tarkan is candid about the industry's blind spots and passionate about his company's value in the marketplace, and conveys both with an animated style. He has the air of someone who knows the wind is at his back and that his company is in the right place at the right time. He told us that Wyse has made a major investment in R&D and now has about 300 R&D employees, all working on virtualization software. (Hardware development is now outsourced to Taiwan.)
Another interesting data point: Wal-Mart is a customer with 200,000 thin clients running. "All the stores use thin clients and it used to be PCs 20 years ago. Wal-Mart doesn't do anything without doing extensive internal ROI," he pointed out.
Posted by Tom Valovic on 12/08/2008 at 12:49 PM
Today IBM and two business partners (Virtual Bridges and Canonical) announced general availability of what the company described as its "first Linux-based, 'Microsoft-free' virtual PC solution," claiming that it cuts desktop costs in half compared to a Microsoft-based environment. (Note: The cost estimates were provided not by a third party but by IBM.) A statement says those savings will come from license costs, desk-side PC support and help desk services, among other things.
IBM's got the wind at its back on this one, and it's an ill wind called recession. The timing couldn't be better, given the Vista downdraft and the need for companies to trim budgets. The partner-based solution will be offered worldwide, with the intent of replacing Windows with Canonical's Ubuntu Linux and Microsoft Office with IBM's open standards-based (and somewhat elaborately named) Open Collaboration Client Solution (OCCS). OCCS includes e-mail, calendaring, word processing, spreadsheets and even unified communications, wrapped in with Notes and Symphony.
What's interesting about this announcement is the congruence of two cost-sensitive value propositions: VDI and Linux. VDI (a.k.a. hosted desktop) is not quite in the same league as server virtualization with respect to ROI. Still, Tarkan Maner, the CEO of Wyse, told us on a recent visit to our offices that Wyse has customers who have gotten three- to six-month paybacks on thin client deployments. Combining the economic benefits of VDI, thin clients and Linux with the negatives associated with Vista, dissatisfaction with Redmond's licensing model and a seriously hurting economy could very well supply all the elements needed for a perfect storm. Stay tuned.
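Payback periods that short are easy to sanity-check. Here's a minimal sketch; the per-seat numbers are invented purely for illustration (Maner didn't share the underlying figures), so only the formula matters:

```python
# Toy payback calculation for a thin client rollout.
# Both inputs are hypothetical; only the arithmetic is the point.

upfront_cost_per_seat = 150.0    # thin client hardware, invented figure
monthly_savings_per_seat = 40.0  # support, power, licensing, invented figure

payback_months = upfront_cost_per_seat / monthly_savings_per_seat
print(f"Payback: {payback_months:.1f} months")  # 3.8 months at these rates
```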
Posted by Tom Valovic on 12/04/2008 at 12:49 PM
An interesting e-mail came in the other day from Phil Koster, a systems engineer for Data Strategy, a consulting company that specializes in VMware solutions. Phil had read my blog on the maturity of the IT market (or, more accurately, the lack thereof) and said he was going to comment here on the site but thought the response might be too long. Instead, he replied in his own blog. Phil disagreed with some of my comments and raised some interesting points for future discussion. (I want to think about his comments a little bit before I respond.) In any event, here's the first part of Phil's post, followed by a link to his site if you'd like to read the whole thing.
I was reading a blog posting on when IT technology will mature. I disagree with Tom in that according to his definition, the IT industry as a whole is not mature. I also disagree in the appropriateness of using that definition. Tom defines market maturity as "Mature markets have certain characteristics and one is that at some point seemingly endless cycles of innovation begin to fade. At the same time, technology curves, products, and end user familiarity and comfort levels...settles in to a known model with reasonably predictable characteristics."

There is no defined timeline in this definition. I think that alone leads to confusion. How many TRUE technological advances has the IT industry seen in the last 15 years? Once every few years a TRULY revolutionary technology comes along. Not a mild enhancement on an old idea but a true, new technology. Look at virtualization, for example. IBM mainframes started doing virtualization in what, the '60s, '70s? VMware, Citrix, Microsoft, et al. are just moving that concept to a new platform (the x86). While this is "new" and cool, it is conceptually decades old (admittedly with some GREAT tweaks depending on what specific product you look at). But what is MS Terminal Services other than an MS-proprietary form of the old terminal/mainframe idea? Adding thin clients only increases that parallel.

How is this different than cars or motorcycles or telephones or anything else? The only difference is the frequency with which these innovations occur. For cars, it can be decades. For the PC, it can be less than a year.

(Click here to continue.)
Posted by Tom Valovic on 12/03/2008 at 12:49 PM
Is virtualization confusing to newcomers to the topic? You bet. Could certain aspects of it easily be made less confusing? Absolutely. One of them is the nomenclature surrounding VDI.
Nomenclature wars are mostly border skirmishes that take place between vendors trying to stake out conceptual territory in emerging markets. Vendors hope their terminology will catch on because if it does, presto, the best kind of viral marketing gets jumpstarted. But you can also see it happening elsewhere. For example, as mentioned in an earlier blog, Forrester and Gartner both have their own terms for VDI: hosted desktop virtualization and server-based client virtualization, respectively.
There are plenty of other terms out there as well, such as "centralized desktop virtualization" or the Citrix usage, "desktop virtualization." But you might be thinking: So what if there are 10 different terms for VDI floating around out there? The problem is that confusion helps no one -- not vendors, partners or customers, and prospective buyers are especially affected. Eventually the market muddles through and sorts all this out, even though it's usually not a particularly smooth or easy process. In the meantime, there's little that can practically be done to solve this particular semantic quagmire (aside from editors like myself grousing in blogs).
Posted by Tom Valovic on 12/01/2008 at 12:49 PM
I ran across an interesting blog from Gordon Payne, the GM of Citrix' Delivery Systems Division. I always like to see senior managers in a company blogging, although the quality can vary. I found this one a bit more sales-pitch oriented in tone than some other executive blogs I've seen ... forgivable as long as the information is in there (which in this case it is).
First a little history. I met Gordon years ago when he was heading up a VOIP startup company called Tundo and I was an analyst covering IT and telecom convergence. Gordon also had a stint with telecom giant Nortel Networks. It’s interesting to note how many people with communications backgrounds are now working in one capacity or another in the VDI space (but not particularly surprising given how telecom-intensive the market is).
A couple of data points that caught my eye:
-- A single shared server running XenApp can generally accommodate 300-400 users, whereas VDI (using XenDesktop) is limited to 30-50 users per server.
-- Payne says that early VMware VDI solutions "seemed designed to increase IT's spend on back end storage" but states that "XenDesktop has been architected to optimize storage requirements by dynamically assembling users' desktop at the time when they logon. The only unique storage required for each user is their profile and application data." However, it's worth noting that VMware is also moving in this direction. (A rough storage sketch follows this list.)
-- Citrix is working with Xen.org's Xen Client Initiative (XCI) to create embedded hypervisors for laptops, PCs and PDAs. It would be interesting to see how this effort squares with what VMware is up to with its acquisition of Trango.
-- A third-party evaluation of XenDesktop is available at this Web site (InfoWorld).
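As for that storage sketch: the savings from dynamically assembling desktops are easy to rough out. The sizes below are invented for illustration (none of these figures come from Citrix); the point is the shape of the math, not the numbers:

```python
# Rough storage comparison: full per-user desktop images versus one
# shared golden image plus small per-user deltas. Sizes are invented.

users = 1000
full_image_gb = 20.0     # complete desktop image, hypothetical size
per_user_delta_gb = 1.0  # profile + application data, hypothetical size

full_clones = users * full_image_gb
dynamic_assembly = full_image_gb + users * per_user_delta_gb

print(f"Full clones:      {full_clones:,.0f} GB")       # 20,000 GB
print(f"Dynamic assembly: {dynamic_assembly:,.0f} GB")  # 1,020 GB
```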
Posted by Tom Valovic on 11/30/2008 at 12:49 PM
As you can imagine, we have a lot of discussions with virtualization management vendors of all stripes. I tend to find startups especially intriguing since many of them have a particular value proposition that they’re trying to bring to the party. In some cases, they’re trying to fill a product gap in the marketplace (something overlooked by the big systems vendors or just overlooked in general) and in others, just trying to advance the state of the art.
A few examples spring to mind. A company called Netuitive is focusing on adaptive self-learning -- a mechanism that correlates metrics already being generated by VMware's vCenter and vApp, representing server performance and end-user performance, respectively. The idea is that this provides a holistic view not obtainable by looking at either set of metrics on its own, and allows predictive models to be developed.
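The correlation idea is easy to illustrate with a toy example. To be clear, this sketch is mine, not Netuitive's algorithm: it just computes a rolling correlation between a server-side metric and an end-user metric and flags the moment their usual relationship breaks down.

```python
# Toy illustration of correlating two metric streams (not Netuitive's
# actual method). When metrics that normally move together diverge,
# the divergence is itself a useful, predictive signal.

from statistics import correlation  # requires Python 3.10+

server_cpu   = [30, 35, 40, 45, 50, 55, 60, 65, 70]           # % utilization
user_latency = [100, 110, 120, 130, 140, 150, 160, 170, 600]  # ms

window = 5
for i in range(len(server_cpu) - window + 1):
    r = correlation(server_cpu[i:i + window], user_latency[i:i + window])
    flag = "  <-- relationship breaking down" if r < 0.8 else ""
    print(f"t={i}..{i + window - 1}: r={r:+.2f}{flag}")
```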
Hyper9, a company based in Austin, Texas, is attempting to reduce the complexity that virtualization introduces with a "Google-like" search engine, as opposed to the classic hunt-and-find logic-tree approach. Another company, Fortisphere, is pushing the envelope in configuration management using automated policies.
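What "Google-like" search over a virtual environment might mean is easier to picture with a toy example. Again, this sketch is mine rather than Hyper9's implementation: it builds a tiny inverted index over VM metadata so a free-text query replaces clicking down a folder tree.

```python
# Toy inverted index over VM metadata -- an illustration of search-style
# lookup, not Hyper9's actual product.

from collections import defaultdict

vms = {
    "vm-web01":  "windows 2003 iis production boston",
    "vm-db02":   "linux oracle production austin",
    "vm-test07": "windows vista test snapshot austin",
}

index = defaultdict(set)
for vm, metadata in vms.items():
    for token in metadata.split():
        index[token].add(vm)

def search(query):
    """Return the VMs whose metadata matches every term in the query."""
    hits = set(vms)  # start with everything, narrow per term
    for term in query.lower().split():
        hits &= index.get(term, set())
    return sorted(hits)

print(search("production austin"))  # ['vm-db02']
print(search("windows"))            # ['vm-test07', 'vm-web01']
```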
Eventually these capabilities will find their way into the marketplace, and IT shops will vote with their pocketbooks as to which are worth having and which are not. Then, sooner or later, the innovative approaches that succeed are likely to get imitated, migrate into other product development cycles, and end up in the offerings of the larger established vendors. What's on your virtualization management wish list that's not being offered currently? I'd love to get your thoughts. Post here or send me an e-mail.
Posted by Tom Valovic on 11/23/2008 at 12:49 PM