The Many Faces of Desktop Virtualization

It can be argued that VDI is a one-off use case that VMware invented years ago: hosting a virtual copy of a desktop operating system in the datacenter. In other words, the company took what it had already proven for server operating systems and attempted to recreate the same magic (and, presumably, the same ROI) for desktop operating systems.

Since then, Citrix has also entered the hosted virtual desktop (HVD) space (arguably what VDI is all about), and other vendors, such as Quest, are driving home the HVD concept as well.

Now, there are other technologies in the field of desktop virtualization that are used to (you got it) virtualize the desktop:

Operating System Provisioning: OS provisioning can deliver an image to either VMs in the datacenter or physical PCs at the desktop. An always-on network is required in these scenarios, which is why the laptop use case is not a preferred fit. If the image is to be delivered to bare-metal desktops, there is also a significant hardware-support issue: the hardware must support network boot (as in the Citrix Provisioning Server model), and the image shipped to each PC will most likely need every single driver known to man to guarantee that the OS functions correctly on the specific target hardware once it finally gets there. Citrix Provisioning Server, Teradici PCoIP and HP's RGS are some examples.

Remote Desktop Services: Of course there are still Remote Desktop Services (RDS), which may or may not include the Citrix XenApp layer. This has long been the de facto model for delivering server-based desktops, or simply standard Windows applications, to users with minimal requirements at the client side. RDS continues to offer the best bang for your buck if ROI is all you are looking for, because scalability is far better: only one instance of the operating system runs on the hardware, as opposed to one per user with VDI.

Client Hypervisors: Client hypervisors, from vendors such as Virtual Computer, Citrix and Neocleus (now Intel), are very interesting because they potentially offer all of the management benefits of HVDs with the added mega-bonus of using the client hardware for guaranteed performance; local hardware will always outperform something hosted in a datacenter for desktop operations. Their challenge is that a client hypervisor supports only specific hardware. So the ideal use case, letting a user bring in her own laptop and run a VM on it, is somewhat thwarted by the strong likelihood that her device is not on the supported list, in which case the VM simply will not run.

Client-side Hosted Virtual Desktops: Client-side HVD vendors attempt to reduce this risk by running the VM on top of an existing operating system, so in effect there are two operating systems at the desktop to support. That complexity can be minimized, and the approach does eliminate the problem of being unable to run on desktops with unsupported hardware.

These technologies also benefit the end user because they work in a disconnected state, so the laptop use case can be realized more easily. MokaFive, RingCube and Wanova are examples here.

Both HVD and CVD (client virtual desktop) require a change of mentality for the systems administrator, because far more involvement is needed to ensure that the operating system image and the subsequent application packages are delivered to the user's desktop. Arguably the CVD is the bigger mind shift, because the IT administrator needs a significant amount of education before even considering this new model of IT management.

Application Virtualization: Application virtualization became a recognized form of desktop virtualization almost five years ago, when Microsoft acquired a company called Softricity. Microsoft Application Virtualization (App-V) separates Windows applications from the underlying operating system and (optionally) isolates each application from all others, enabling multiple different (and potentially conflicting) applications to execute on the same Windows platform. The technology lets the administrator package corporate applications into containers that are then delivered (or streamed) to the desktop on demand, enabling a far more efficient form of application management. Besides Microsoft, there are many application virtualization vendors in the market today (VMware ThinApp and Symantec Workspace Virtualization, among others), all of which are helping administrators build a more manageable enterprise application pool.
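
To make the isolation idea concrete, here is a minimal conceptual sketch in Python (illustrative only; real products such as App-V interpose on the filesystem and registry at a much lower level, and all names and paths below are hypothetical) of how a virtualization layer can redirect each packaged application's writes into its own private namespace:

    # Conceptual sketch: each packaged application sees a private copy of
    # the shared locations it writes to, so two conflicting applications
    # can coexist on one Windows instance. Names and paths are illustrative.

    def virtualize_path(package_name, requested_path, sandbox_root="C:\\AppVirtSandbox"):
        """Redirect a write aimed at a shared location into the package's sandbox."""
        shared_prefixes = ("C:\\Windows", "C:\\Program Files")
        for prefix in shared_prefixes:
            if requested_path.startswith(prefix):
                relative = requested_path[len(prefix):].lstrip("\\")
                return "\\".join([sandbox_root, package_name, relative])
        return requested_path  # user-data paths pass through untouched

    # Two packages "install" the same DLL without clobbering each other:
    print(virtualize_path("AppA", "C:\\Program Files\\Common\\shared.dll"))
    print(virtualize_path("AppB", "C:\\Program Files\\Common\\shared.dll"))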

At the End of the Day...
As we can see, desktop virtualization is a whole lot more than just VDI, and complexity can easily run rampant if we allow it. The one constant, however, is that all of these technologies require a user virtualization solution to ensure the user is properly serviced and his or her settings always apply, regardless of the application or the model by which it was delivered. The user experience is critical in all cases.

Posted by Simon Rust on 04/14/2011 at 12:49 PM


DNA of the User

The term DNA may conjure images of everything from "CSI" to Dolly the sheep or the Human Genome Project. In the context of this column, it refers to that je ne sais quoi that makes each user unique and causes challenges for wide-scale virtualization deployments.

As in genetics, the DNA of every user is singular to him or her, no matter how similar it may be to someone else's. That DNA plays a fundamental role in determining the success of enterprises' virtualization efforts.

User DNA should not be confused with personalization of the user profile, which is currently possible across Microsoft platforms. To me, the real DNA of the user comes into play when we consider other platforms and application types, like software as a service (SaaS) or cloud-based applications. In a couple of years, it will be essential for the makeup of "me" and the makeup of "you" to be found in a multitude of business applications that are not solely Windows-based.

Clearly, Windows applications are the ones most users work with today. Tomorrow, we shall see native applications for tablet PCs, iPads, the new TouchPad, the Pre, whatever device you can imagine.

At present, the focus is on ensuring that a user's information from Outlook (for example) will work in the virtual desktop environment; however, those personal settings are not transferred onto a different e-mail client. An e-mail you have sent using your desktop Outlook client will have your personalized signature emblazoned across the bottom, but the e-mails you send through the Web-based version of that same application will not. The reason is that they are very different platforms and so must be managed differently.

Soon it will no longer be about making sure Outlook works correctly, but about making sure my personal information management (PIM) works correctly overall. To make this possible, there needs to be some form of common, abstracted language so that PIM settings can plug in anywhere, giving users their choice of applications in the same way that they have a choice of mobile devices or tablets. We will need platforms able to convert that data.

In short, we must catch up to the expectations that users already have. If I send some emails and create some data through Microsoft Office on my Mac whilst sitting on my couch at home tonight, I would like my settings and all of that data I created--part of my user DNA--to be available on my PC in the office tomorrow, or on my iPhone, TouchPad or whatever device I am using. That is not the case today, of course, because even though Microsoft Office works similarly on the PC and the Mac, each platform stores its underlying user information in a very different way. The user data is dictated to the application by the platform on which it runs.

In fairness, we users are attempting to cram a load of incompatible items into the same space, and clearly that is not going to work.

We do expect it nonetheless.

Developing a DNA Standard
User DNA ultimately will need to be some form of ISO standard, and a plethora of vendors will need to be involved in defining it: how the user's email address is stored, the user name, the time zone and so forth. Inevitably, the applications we use will have their settings translated into this standard, though that is still some distance from realization.

For a high-level example, think of Google Translate. When you type in a phrase to translate from English into French, the first step is for Google to translate the English phrase into Google's internal representation. The translation from Google-speak into French comes next. There is always an intermediary: English-Google-French. Obviously a lot more is done in the background with regard to how sentences are constructed in the various languages, but the point is simple: there needs to be an intermediary step.
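
To make the intermediary idea concrete, here is a minimal sketch (every format and field name below is hypothetical) of how a user setting such as an email signature might be pivoted through a canonical form, rather than translated pairwise between every client:

    # Per-platform adapters translate to and from one canonical form, so a
    # new client needs one adapter, not one per existing client.
    # All formats and field names below are hypothetical illustrations.

    def outlook_to_canonical(outlook_settings):
        """Read (hypothetical) Outlook-style settings into the canonical form."""
        return {
            "signature": outlook_settings["MailSignature"],
            "timezone": outlook_settings["TimeZoneName"],
        }

    def webmail_from_canonical(canonical):
        """Render the canonical form into a (hypothetical) webmail client's settings."""
        return {
            "sig_html": "<p>%s</p>" % canonical["signature"],
            "tz": canonical["timezone"],
        }

    # English -> Google -> French, or here: Outlook -> canonical -> webmail.
    desktop = {"MailSignature": "Regards, Simon", "TimeZoneName": "GMT Standard Time"}
    print(webmail_from_canonical(outlook_to_canonical(desktop)))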

Or, let's say people from all different backgrounds--someone from the UK, another from Belgium, one from Kenya, someone else from Japan, etc.--have been shoved into the same room. A couple of them will be able to communicate with each other, but they will not all be able to communicate with and understand everyone else in the room. Some form of intermediary language is required for them all to communicate together.

Technologically speaking, it is fundamentally about application interoperability. People want to use an application to perform a specific function without worrying about who built it or how it is delivered. All they care about is using the application easily and reaching all of their data.

For the foreseeable future, there will continue to be multiple platforms and devices. Apple iOS is a market leader, Google's Android will become a leader, and Microsoft Windows 7 is not disappearing any time soon. There is no way that within the next four to five years we will see only one platform and one developer of that platform and have everyone decide to use it. Some may die, but we are not going to have all but one of them disappear. Combine that with the fact that our personal and professional lives continue to become more enmeshed on the technology front, and it becomes increasingly important to have all of these applications stitched together via this universal user DNA.

There is clearly a long way to go, but we know we must form the digital DNA in order to interoperate across any device at any time to allow our professional lives and personal lives to function at a digital level as they already do in our body and mind.

Posted by Simon Rust on 03/01/2011 at 12:49 PM


Veni, VDI, Vici: Campaign to Conquer Cross-Platform Personalization

As a culture, our minds have changed about how IT should be used. No longer do we think of it only as a necessary evil or a handy tool to make a few tasks simpler in the workplace. Information technology is an essential part of doing business.

Along with this mindset and reliance on technology come high expectations. We expect always-on access and all of our personal settings to be available on every device we use whenever we use it. We don't care who does it, what technology lies underneath or how it is made possible. We only care about where (everywhere), when (now) and why (because we want it).

Users are one of the biggest challenges to tackle in desktop virtualization adoption within the enterprise, as I've mentioned before.  Therefore, it makes sense that companies would try to achieve cross-platform personalization across operating systems, devices and any other related bits and pieces.

Cross-Microsoft Platform Personalization
First, let's look at the personalization movement already underway with the Microsoft platform.

Some virtualization companies have been working to migrate users from Windows Server 2003/Windows XP to Windows Server 2008 (R2)/Windows 7. Technically, the platforms are similar, but they have different underlying user profile types. Technology solutions have centered on helping enterprises transition from the version 1 user profile of Windows Server 2003/Windows XP to the version 2 profile of Windows Server 2008 (R2)/Windows 7. Because this has been accomplished, and it is all on a Microsoft platform, we also can move between the version 1 profile on the desktop and the version 2 profile on the server, and vice versa.
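
To illustrate one small part of what such a migration involves, here is a simplified sketch of the folder mapping between the two profile versions (real migration tools also rewrite registry hives, apply exclusions and rename the roaming profile share to the .V2 convention; this shows only the path translation):

    # Map well-known version 1 (XP/2003) profile folders onto their
    # version 2 (Vista/7/2008 R2) locations.
    V1_TO_V2 = {
        "Local Settings\\Application Data": "AppData\\Local",
        "Application Data":                 "AppData\\Roaming",
        "My Documents\\My Pictures":        "Pictures",
        "My Documents":                     "Documents",
    }

    def migrate_path(v1_relative_path):
        """Translate a path relative to the v1 profile root to its v2 location."""
        # Longest prefix wins, so "My Documents\My Pictures" beats "My Documents".
        for old in sorted(V1_TO_V2, key=len, reverse=True):
            if v1_relative_path.startswith(old):
                return V1_TO_V2[old] + v1_relative_path[len(old):]
        return v1_relative_path  # unmapped folders keep their relative location

    print(migrate_path("Application Data\\Microsoft\\Signatures"))
    # -> AppData\Roaming\Microsoft\Signatures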

For example, a user could access his files at his desk in New York; tomorrow, he could be on an airplane accessing a Terminal Services session. As far as he's concerned, his experience will be exactly the same. Those two things--the laptop experience and server-based computing experience--would not have been the same without user virtualization technology.

Windows 7 Migration
At the same time, companies have been migrating their traditional desktop OS from Windows XP to Windows 7. Along the way, many of them have decided to make the management of Windows smarter and simpler at the same time, and they are embracing application virtualization as a more cost-effective approach to delivering the application set to their user population.

My earlier post about Windows 7 covered the emergence of an organic grouping in which customers are combining their physical and virtual desktops running Windows 7 with an application virtualization solution. This means they are crossing boundaries between server administrators and desktop administrators, so naturally it is pulling together the platforms and creating a need for cross-platform personalization.

Proliferation of Platforms
Soon, operating only on the Windows platform will not be enough. Today, it's cross-Microsoft platforms. Tomorrow, perhaps as soon as 2012, it needs to be Android, iOS, BlackBerry or even some form of Linux.

Typically, it has been mobile applications and devices driving the consumerization of IT. Consider your average IT help desk: those brave souls are now fielding questions about how to get a user's Slate, Streak, Xoom or iPad to work with the office network and applications, as opposed to questions about getting a MacBook to work.

The Users of Tomorrow
The aforementioned high expectations for IT functionality are only going to increase as younger generations enter the workforce. My four-year-old and 18-month-old children illustrate this point perfectly. They both know how to use a tablet as well as I can to access educational games, videos and so forth. Do you think they have any tolerance when the iPad battery dies, I hand them my iPhone as a substitute, and they can't get the same videos, games and content?

Not a chance.

They scream, get cross and throw the iPhone at me. They don't understand why the same items are not just there. Those are the Users of Tomorrow. They have an expectation of total synchronicity, and their tolerance level is as close to zero as you are ever going to get.

Even though I am of the pre-Internet generation that remembers the dark days when computers were not common household appliances, I find my own tolerance getting significantly lower by the day. Not a day goes by that I don't use a mobile phone, a tablet and a PC, so it is intolerable that my contacts, calendar, bookmarks and e-mail signature do not automatically appear on all of my devices, and that I must manually transfer or re-enter all of that information.

That data is a key aspect of what I call the "DNA of the user," a topic to be explored in more depth in a future post.

Posted by Simon Rust on 02/08/2011 at 12:49 PM


How Web-Based Applications Impact VDI

The number and effect of Web-based applications have not yet grown big enough to directly affect VDI. They are a relevant topic for discussion, however, because they significantly lighten the administration and desktop-delivery burden for the enterprise, and ultimately will deliver cost savings.

The rise of Web-based applications will make a huge difference in how applications are delivered to users and, more importantly, how they are managed. In effect, all an end-user needs with his desktop computing device is a browser, making the composition of the desktop significantly simpler, since a browser is typically available by default on any current computing platform.

Administration Requirements
Web-based applications invariably make life significantly easier for IT desktop administrators. They no longer need to worry about how to get the client-side application to the user's desktop device, given that a browser is all that's required. Any data migration or application upgrades are performed in the data center, and therefore once again require no interaction with the client device since everything is stored and accessed centrally.

Let's consider Salesforce.com, the uber Web application example. A company using Salesforce.com only has to manage which users are allowed to access what parts of the database. How that data is used and dealt with is managed by the application itself in the Salesforce.com data center. As long as users have relevant credentials and a device with a Web browser such as Internet Explorer, Firefox or Safari, they are ready to go.

The catch with Web-based applications is that you must have an always-on connection to the location where the application is stored. It may be the public Internet for accessing a cloud solution such as Salesforce.com, or a virtual private network for accessing a private cloud solution such as an internal finance application. So in the example of Salesforce.com, the IT team really has only to concern itself with whether the user has Internet access and a browser.

Application Delivery into the Virtual Desktop
So, let's come full circle to the initial point about application types impacting VDI. A key aspect of how applications impact VDI is the basic question of how to get the application into the virtual desktop, and consequently to the user. Most early adopters chose to build all of the required applications into the virtual desktop and then provide desktops to users on an individual basis. This was the only method available at the time, and it was very costly due to significant storage requirements.

Today, with the benefit of hindsight and the significant advances in technology over the last couple of years, the preferred model is to make use of application virtualization (such as Microsoft App-V) and/or to stream applications in real time to the desktop. This enables the componentized desktop, reducing the cost to build, deliver and manage virtual desktops within the enterprise. The clear downside of the latter approach (streaming applications into the virtual desktop) is the delay before the application reaches a usable state at the desktop while it is spun up in real time.

For example, one of the most prevalent applications is Microsoft Office, which typically weighs in at roughly 600MB as a virtualized package. That is a sizable chunk of data to deliver or stream to every virtual desktop, for every user, every single day. Granted, this traffic stays within the confines of the data center, but it is still data that must move around daily, and it becomes costly when you multiply out the sheer volume of transfers across the user population.
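
A quick back-of-the-envelope calculation shows why (the user count and stream rate are assumed figures for illustration, not measurements):

    # Rough illustration of daily streaming volume for a virtualized
    # Office package across a user population.
    package_mb = 600      # approximate size of Office as a virtualized package
    users = 1000          # hypothetical user population
    streams_per_day = 1   # assume the package streams once per user per day

    total_gb = package_mb * users * streams_per_day / 1024.0
    print("%.0f GB moved inside the data center per day" % total_gb)
    # -> roughly 586 GB per day for 1,000 users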

With a Web-based application, you need not worry about application provisioning into the virtual desktop image. By virtue of your use of a VDI desktop, you already have connectivity; therefore, if you are using VDI it is a given that you will have access to Web-based applications. 

So the Web-based application must be the Holy Grail the enterprise has been waiting for, mustn't it? Central management, simple desktop administration with nothing to stream or download to the virtual desktop, and easy centralized upgrades. Well, not quite. There is little provision for offline access, and we are some way from ubiquitous connectivity, unfortunately. On top of that, we must wait for the application set to be converted to the Web-based model. And then we walk into a whole different challenge regarding personalization across multiple Web applications -- but that is a future blog topic in waiting.

Posted by Simon Rust on 01/21/2011 at 12:49 PM


Why Myths Should Be Left to the Gods

As the year winds down, it seems appropriate to clear out a few myths in order to start the New Year with a fresh understanding of Virtual Desktop Infrastructure (VDI).

Let us look at four of the most prevalent myths and how, when VDI is implemented properly, they are revealed as baseless fears.

Myth 1: VDI is storage-heavy and therefore cost-heavy.

Truth: One reason this myth exists is the belief that, in order for VDI to run successfully, an organization must rely on a surplus of storage, which quickly adds to the cost of the infrastructure required to support VDI. Both parts of the myth really come down to the fact that early adopters of VDI had to take a one-to-one approach to the virtual machine images held in the datacenter in order to win user acceptance.

Today, the technology exists, and is being used, to make the one-to-many approach a reality while retaining user acceptance, rapidly reducing the storage requirement in turn. The truth is that if the IT team has understood the requirement and planned the deliverable, VDI should not require a surplus of storage. Furthermore, so long as IT breaks the deliverable down into managed, standardized components (also known as componentization), the overall cost of managing the componentized desktop can be significantly less than managing the existing desktop estate.

The component model can also be leveraged across the existing physical estate, thus bringing management savings across the entire desktop estate (physical and virtual). User virtualization also plays its part in reducing desktop management costs by treating the user environment as separate from the desktop and application delivery methods. This separation enables IT to standardize the corporate assets (desktop and application sets) and automate the delivery of the user's working environment, significantly reducing operational costs.
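
To see why the one-to-many model changes the storage equation so dramatically, consider a rough illustration (the image and delta sizes here are assumed figures, not measurements):

    # One-to-one: every user holds a full copy of the desktop image.
    # One-to-many: one master image plus a small per-user differencing disk.
    users = 1000
    full_image_gb = 40   # hypothetical dedicated desktop image
    delta_gb = 2         # hypothetical per-user delta disk

    one_to_one = users * full_image_gb
    one_to_many = full_image_gb + users * delta_gb

    print("one-to-one:  %.1f TB" % (one_to_one / 1024.0))   # ~39.1 TB
    print("one-to-many: %.1f TB" % (one_to_many / 1024.0))  # ~2.0 TB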

Myth 2: VDI is too hard.

Truth: There are two key elements that play into this perceived difficulty.

One is on the technological side, promoting the myth that VDI is too complex and therefore too hard to implement and seamlessly integrate with existing systems.

The other aspect contributing to complexity is the human element in the form of user acceptance. Users are accustomed to being in control of their data, applications and desktop environment, which can make it hard for them to relinquish control and swap their customized computers for a cookie-cutter (componentized) virtual desktop.

It is widely understood that IT projects lacking user acceptance are destined to fail. One of the main challenges for VDI, particularly with the growing number of employees who want to use their own mobile devices to connect to the business and perform work activities, will be delivering services to secure, managed endpoints while giving users enough flexibility and a sense of control over their working environment. By separating the user layer, it is possible to personalize the desktop experience and foster a feeling of ownership within the user community, while IT can tailor desktop management by user, application or scenario and still retain the ability to enforce corporate access and security policies.

Myth 3: Virtualization and the cloud are one and the same.

Truth: At the desktop level, virtualization and the cloud use similar technologies to reach the user, although they are hosted in very different places. Desktop as a Service (DaaS) is the cloud variant, delivering standard desktops to enterprise users from an external provider at low cost; VDI uses similar technology, but the enterprise manages the delivery and the technology itself.

Any company looking to improve enterprise desktop management capabilities should consider desktop virtualization and cloud when determining its desktop management strategy. In the case of cloud computing, the enterprise may well be able to take advantage of the cloud’s improved economics to deliver lower cost desktop estate management. Transferring certain desktop and application virtualization technology solutions to the cloud, for example, can unlock their full potential.

Some enterprises may choose to go all the way and migrate all desktops and applications to the cloud, while others may prefer to keep their data close to the vest, choosing to use the cloud only for particular application services. This is a key advantage of cloud services: One does not have to jump in with both feet, so migrations can occur one service at a time as appropriate to the enterprise.

Overall, enterprises should be seeking out solutions that help them understand the practical implications of the user experience across the desktop, whether it is physical, virtual or in the cloud. At this point, the most applicable solutions can be considered and tested for suitability to the user requirements.

Myth 4: VDI is merely an extension of Server Based Computing (SBC).

Truth: Given the similar goal of providing users with a common desktop from a central datacenter location, it would be easy to assume that VDI and SBC are somewhat interchangeable. There are similarities, but it is key to understand each technology's primary limitations before assuming they are one and the same and opting for the SBC model because of its far more attractive TCO.

SBC was a good first step toward standardization: a cost-effective, reliable way to deliver a small common set of applications to a large number of users in a standardized format. However, there are limitations, including the fact that many applications simply cannot be delivered via a Windows Server operating system, given that users are effectively sharing the server's resources. VDI goes beyond SBC, combining its benefits with those of a desktop operating system: the application set does not have to share the underlying operating system, which removes many of the application-conflict challenges. When we add user management capabilities into the mix, we have a very powerful combination indeed.

By abstracting the user layer from the OS and applications, it is possible to have a VDI implementation that can dynamically scale to satisfy needs on-demand instead of holding memory and storage hostage "just in case" they might be required. IT can scale its environment to fit the number of users that are currently running and then provision additional virtual machines as needed to accommodate any additional users in real time. Managing the desktop and application set independently of the user environment allows IT to build standard images on-demand as components for each user in real time.

IT can keep the components separate and assemble them as required in real time, allowing a high degree of standardization that would have been impossible before while keeping users happy with their service deliverable. Ultimately, IT gets great flexibility in the selection of how applications are delivered.
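
As a sketch of that on-demand scaling idea (the provisioning calls here are hypothetical stand-ins, not any vendor's API), the pool is sized to the users actually connected plus a small warm buffer, rather than pre-allocated per possible user "just in case":

    SPARE = 5  # keep a few VMs warm so new logons are not delayed

    def provision_vm():
        return {"state": "ready"}      # stand-in for a real clone operation

    def decommission_vm(vm):
        vm["state"] = "destroyed"      # stand-in for a real teardown

    def rebalance_pool(connected_users, pool):
        """Grow or shrink the non-persistent pool to fit current demand."""
        target = len(connected_users) + SPARE
        while len(pool) < target:
            pool.append(provision_vm())
        while len(pool) > target:
            decommission_vm(pool.pop())
        return pool

    pool = rebalance_pool(["user%d" % i for i in range(20)], [])
    print(len(pool))  # -> 25: twenty connected users plus five spares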

What other important myths have you heard about VDI? Please share them in the comments section.

Posted by Simon Rust on 12/22/2010 at 12:49 PM


Windows 7: Where Is It Now and What is Next?

We humans have a fascination with benchmarking progress, evaluating development, and determining success or failure along the way. Case in point: the undying popularity of "Where are they now?" television shows.

The same can be said for the IT world, with its numerous product and technology launches that are touted as revolutionary, game-changing innovations. Such hype begs for retrospection to see whether it was much ado about nothing or whether the product/technology/company/wunderkind really did start a revolution and change the game.

Just over a year ago and with much fanfare, Microsoft introduced Windows 7, so it's only natural to compare it to its predecessor, Windows Vista.

A year after its launch, Windows Vista had less than 10 percent adoption. It is therefore natural to expect that 12 months in, Windows 7 would also be around the 10 percent adoption mark. Surprisingly, it is actually double that, with Windows 7 clearly demonstrating itself to be the platform that Windows Vista should have been.

Perhaps even more surprising than Windows 7's speedier adoption--significant in itself--is its dominance in desktop virtualization projects. Most desktop virtualization projects currently under way are using Windows 7, which reinforces the belief that Windows 7 is the way forward and a significant driver of desktop virtualization.

What I've seen from customers is that those adopting Windows 7 are also pressing ahead with application virtualization, and most predominantly (circa 90 percent) with App-V. These companies are not only changing the desktop paradigm with desktop virtualization but are realizing that application virtualization enables better application management and maintenance, at lower deployment cost.

As stated, the majority of customers I've seen use App-V today, further reinforcing my belief that App-V is quickly becoming the de facto standard for application virtualization. Additionally, application virtualization is being used in more and more enterprises for general application delivery across the broader desktop, thanks to the benefits realized during desktop virtualization projects. This leads me to believe that App-V is becoming the de facto standard not only for application virtualization but for application delivery as well.

A few months ago, many in the industry thought componentization was the end goal, yet as the market has evolved we've learned that desktop virtualization is just another desktop delivery model, not the holy grail itself. The heterogeneous desktop is rapidly becoming accepted as the model by which most enterprises will deliver service to users, with a need to ensure the user experience remains consistent and well-performing regardless of which model a given user is on.

Interestingly, not only are customers using Windows 7 together with an application virtualization technology on their virtual desktops, but those selfsame customers are also running Windows 7 (with the same application virtualization solution) on physical desktops, with a desire to pull the two together in terms of user experience. Windows 7 thus seems to be the key trigger that brings together areas of the business that were previously segregated: desktop and/or server virtualization and the physical desktop. User virtualization is clearly pulling these areas together, making for a more user-friendly approach to desktop computing in the enterprise.

Now, let's look ahead to the next 12 months and my predictions. I don't see any reason why market adoption won't be 30 to 50 percent within the next year, and it won't be solely Windows 7 either. App-V together with Windows 7 is a powerful combination that will doubtlessly be driving the upgrade process over the next 12 months.

Companies may well have to perform physical hardware upgrades in order to run Windows 7 on laptops, but that pain will be alleviated somewhat by their ability to simultaneously upgrade the application set with application virtualization technologies, improving on application delivery and reducing desktop management costs. At this time, the enterprise will also concentrate on the user experience by managing user virtualization as part of the migration management.

As we navigate our path out of the economic meltdown of recent times, all organizations are looking to maximize efficiency and reduce cost. Application delivery remains an expensive part of the IT cost center today, so application virtualization has already begun to demonstrate the potential to dramatically reduce packaging and testing time while improving application compatibility and enabling an enterprise to manage software assets far more efficiently.

So, in a sense, Windows 7 and App-V are perfectly aligned and timed to hit home where it counts--the enterprise IT budget.

From observing the introduction, adoption and implementation of Windows 7, it's clear to me that the operating system's popularity is not wholly due to its superiority over Windows Vista--though that certainly helps--but that it's also due to Windows 7's potency when combined with application virtualization. What will be interesting to see is how this reciprocal relationship matures and evolves in the coming year and what other changes it might influence in the enterprise IT environment and the user experience itself.

Posted by Simon Rust on 11/15/2010 at 12:49 PM


Preparation for Successful Cloud Delivery

The cloud has exploded as the next breakthrough technology most people are considering for their IT service delivery. An IDC report released in June 2010 estimated that in 2009 cloud-based services accounted for $16 billion of the approximately $2.5 trillion in global spending on IT, and that number would rise to $55.5 billion by 2014--representing 12 percent of all spending on IT. While still in its infancy, cloud computing shows extreme promise in providing IT departments with a more flexible and cost-effective way of delivering services to their enterprises.

However, similar to desktop virtualization, cloud computing on its own, without the right technologies accompanying it, cannot be fully realized.

In an environment where the corporate desktop is a combination of accessing device, outsourced OS and applications, internal applications, and user settings, the management of user-based policy becomes a crucial enabler of a successful dynamic desktop environment. Currently, most organizations repackage applications to include settings specific to the business, often multiple times over, because different departments have differing needs. These settings can enable or disable features, assign resources such as databases and file systems, incorporate language differences, and so on. Repackaging is, of course, time-consuming and expensive. The role of IT then becomes managing application delivery, figuring out how these applications work together and, most importantly, what the user experience will be in this world of multiple outsourced desktop components.

Now, as cloud services are included into the mix, the potential complexity of the combined solution escalates but the crucial nature of the user experience cannot change. For example, how will the enterprise ensure that the user is able to access the mixture of applications (whether cloud or local) without needing multiple sets of authentication details? And once within the applications, will relative time zones be accurately mapped between the applications and will the applications actually integrate together? Should we be concerned that the same dictionary is used across all applications regardless of how it is delivered? There are many such questions that you can bring to light when considering the implications of the technology in your own environment -- just think of what the user experience will be like. Will it be simple?

One solution to this problem is user-based policy, which applies these settings on the fly as applications execute, eliminating repackaging and providing a far more efficient way to configure applications for users. This is a critical precursor to cloud-based delivery of applications into organizations, and another reminder of how critical the user experience is to the cloud's success.
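
As a minimal sketch of the idea (all rule data below is hypothetical), the same packaged application receives different settings at the moment it launches, depending on who is running it, instead of being repackaged per department:

    # User-based policy: resolve settings on the fly at application launch.
    POLICY_RULES = [
        # (condition on the user, settings injected at launch)
        (lambda u: u["department"] == "Finance", {"database": "FIN_DB", "language": "en-GB"}),
        (lambda u: u["department"] == "Sales",   {"database": "CRM_DB", "language": "en-US"}),
        (lambda u: True,                         {"autosave_minutes": 5}),  # everyone
    ]

    def resolve_settings(user):
        """Merge every rule that matches this user, in order."""
        settings = {}
        for matches, values in POLICY_RULES:
            if matches(user):
                settings.update(values)
        return settings

    # One package, many configurations:
    print(resolve_settings({"name": "simon", "department": "Finance"}))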

Additionally, organizations will face the "many-to-one" challenge where services are delivered by multiple cloud providers. In a traditional PC or virtual desktop, there is a single point of integration for all applications: the client OS. In a future model of multiple cloud-based services, this single execution point no longer exists. The implication for the user is that outsourced applications will not work together; data from one application cannot be used in another, and applications cannot invoke one another. One example is .doc attachments failing to open in Word from e-mail, when the Word application and the e-mail application could easily be supplied from different vendors' services.

In order to address this "many-to-one" problem, a new infrastructure is needed that spans all the delivery services coming together for the organization, department, role and user. There are already multiple single sign-on solutions to help with the authentication challenge, and I foresee a couple of vendors looking to help with the policy-driven aspects to bring together some of the application integration pieces.

We have already seen vendors announcing support for the cloud, but there are still many areas that have yet to be addressed, including the problems mentioned above. I believe that these issues will come to the forefront as cloud computing becomes more of a reality. And, as seems to be a common theme in my posts, the user will play a vital role in determining which technologies will be the most effective for enterprises.

Posted by Simon Rust on 10/26/2010 at 12:49 PM


The Crucial Role of Users in Accelerating Desktop Virtualization

No one can ignore the fact that desktop virtualization adoption is on the rise, but for all the benefits it promises to deliver I would think that deployments would be rolling out a lot more quickly.

After all, this still new technology promises a significant decrease in desktop TCO and fewer headaches for IT managers. By centrally managing each desktop, enterprises can dramatically save on desktop management costs and decrease help desk calls. Desktop virtualization can also be extremely helpful in the event of server or client hardware failures.

The reason for adoption varies with each organization. CIOs and top-level executives look at the cost savings the technology can deliver, while the IT department considers the time savings. Either one of these benefits is attractive to enterprises and plays a key role in why they consider adopting desktop virtualization.

However, like most scenarios that offer benefits, there are challenges along the route of desktop virtualization, with the main ones being storage optimization and performance, standardization, application delivery, desktop performance, high availability, and user acceptance. These are all important aspects that must be managed effectively in order for desktop virtualization to be successful enterprise-wide. 

In addition to ensuring that storage optimization is suitably managed, several key decisions must be made before creating a desktop virtualization solution: which application virtualization model to use, where the user population resides, where the data will reside, and what constitutes an acceptable user experience.

To make an informed decision and preempt user issues, it is vital to know what the user population looks like and what they require from their application set, as well as the connectivity available from their working locations. Having the full picture of how users need to function puts the IT administrator in a far better position to create the correct overall solution. In most cases this is new ground for the enterprise, which previously just installed things on a user's desktop and let the user get on with it. Until now, IT administrators have typically not needed to worry about how much bandwidth application delivery would consume or how that would affect the user experience.

This all comes under the heading of user experience, and we must not lose sight of the fact that the user experience is the most important aspect of desktop virtualization. In fact, it is becoming the most important aspect of desktop delivery, period. This is largely due to the prevalence of devices users now interact with daily at home, and the fact that the PC has become part of everyday life outside work. As a result, users typically have more power in their home PC and expect their work system to be as good as, or better than, the one they use at home. Today's user has a louder voice when it comes to the technology he is expected to work with, and projects fail when the user experience falls short, whether the project is desktop virtualization or something else.

Planning the project in a structured manner is key to accelerating any desktop virtualization deployment. The most important aspects relate to simplification: standardizing the moving parts, sharing the operating system image across the user population to simplify building the desktops, and then layering in the application set with a solid application virtualization solution. This is the only way to minimize the cost of building the solution at the outset and then realize maximum operational savings once it is in place.

Therefore, crucial elements of simplification include selecting a suitable application virtualization technology and driving its success within the enterprise by capturing as many applications as possible and packaging them appropriately for the target user population. If there are too many applications to virtualize within the project timeframe, the enterprise needs to decide which are the most important and/or most frequently used, virtualize those, and use third-party technology to handle the remaining user-installed applications by policy. This ability may be the key to user satisfaction and, ultimately, acceptance of the desktop virtualization solution.

The final element that will accelerate desktop virtualization adoption is a personalization solution, which ensures that the digital identity of the user is managed independently of application delivery. By delivering this essential component, IT administrators ensure the user always receives the same personal experience across their desktop and application set.

As the economy slowly improves, IT departments are wary of which technologies to adopt. So, while the number of desktop virtualization deployments has greatly increased in the last five years, there is still a long way to go. I think one of the biggest factors in an organization's decision-making process will be hearing other success stories before venturing into virtualization themselves.

Posted by Simon Rust on 10/15/2010 at 12:49 PM


BYOPC Morphs into HAPC

Ever wish you could take your home computer to work because it has all of your personal settings saved, not to mention the multitude of pictures of your pets, family and latest vacation? With the continued restraints on IT budgets, organizations are looking for innovative ways to save money, and Bring-Your-Own-PC (BYOPC) is the latest cost-cutting idea. Some recently released research has suggested that a BYOPC model can achieve around $300 per employee per month in cost savings--but is employee PC ownership the way to go?

While the concept of BYOPC makes sense to me in theory, in reality this model is unlikely to be widely adopted by enterprises. Support and warranty issues will cause unnecessary headaches for users and many will inevitably contact their own company support desk for assistance anyway--negating the entire purpose of BYOPC. This concept does free IT from carrying the capital expense for a lot of resources, and gives users the choice that they appreciate. However, these benefits will fail to outweigh the downside. And we all know that if users aren't happy their IT department isn't either.

A more viable alternative is using desktop virtualization in a Home Access PC (HAPC) model. In this scenario, employees leave their work PC at the office and use their static home PC for any after-hours work. This model is already in practice in many organizations around the globe. For example, a couple of London-based finance houses I met with several years ago were using the Citrix XenApp product line to enable their users to work at home. Their studies showed they were actually getting 60 to 70 percent more productivity out of employees, who would work Sunday afternoons to prepare for the working week, as well as most evenings after 8 or 9 p.m. The understanding was that once the children went to bed, employees took the opportunity to get a couple of hours of work in before shutting down for the night.

As desktop virtualization continues to surge, we will see more employees making use of their corporate desktop from home, with little or no requirements placed on their home infrastructure. A virtual desktop also eliminates the security risk associated with allowing corporate access from unmanaged, unknown endpoints. A non-persistent virtual desktop model works well in this scenario, as long as the employee has a predictable and personal experience across both devices.

I am very interested in hearing your thoughts. Do you see the virtual desktop as a means of extending your users' working day? Has your organization already implemented a BYOPC or HAPC model? Comment here.

Posted by Simon Rust on 10/01/2010 at 12:49 PM


User Rights Management: Giving Users the Ability To Access What They Need

IT increasingly faces the problem of users having administrative rights on their desktops. Giving them access to all areas of the desktop is an accident waiting to happen, often leading to high support costs and a compromised user experience. The worst-case scenario is a security breach, whether through the loss of data or an attack from malicious software sitting in wait on the network.

The typical method of granting administrative rights today is an on/off switch: the user has either full admin rights or none. In most cases, however, the user only needs admin rights to certain elements of the desktop in order to complete a task. Many applications require administrative rights to execute: applications that change hardware settings such as network adapters, applications that install drivers for devices like printers, and applications that write to secure parts of the registry. Until recently, IT has been unable to grant users this access without compromising their systems.

User rights management (URM) addresses this issue by ensuring that only certain users have administrative rights to certain applications in pre-defined situations. That may sound like a lot of moving parts, but this new technology aims to give organizations a means of balancing user needs with IT cost by elevating or reducing user rights on a per-user, per-application or business-rule basis. For example, many legacy applications still need to write data to system areas of the operating system. While these can be managed with file and registry access control lists (ACLs), it is far easier to manage them by elevating the user account.

In addition, there are many daily tasks such as changing wireless network settings, date and time, system updates, etc. that require a user to be an admin. URM is able to give users admin rights for these certain areas while restricting access to other areas that are strictly managed by IT -- letting users maintain their productivity while ensuring their time is used most efficiently and the business is not exposed to unnecessary risks and costs.
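
A minimal sketch of the decision logic (the rules, roles and executable names below are all hypothetical) shows how elevation becomes a per-user, per-application, per-rule decision rather than a global switch:

    ELEVATION_RULES = [
        # (applies to user, executable, decision)
        (lambda u: u["role"] == "engineer", "printer_driver_setup.exe", "elevate"),
        (lambda u: True,                    "datetime_settings.exe",    "elevate"),
        (lambda u: True,                    "iexplore.exe",             "run_as_user"),
    ]

    def decide(user, executable):
        """Return the first matching elevation decision for this launch."""
        for applies, app, decision in ELEVATION_RULES:
            if app == executable and applies(user):
                return decision
        return "run_as_user"  # default: never elevate what no rule covers

    print(decide({"role": "engineer"}, "printer_driver_setup.exe"))  # elevate
    print(decide({"role": "engineer"}, "iexplore.exe"))              # run_as_user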

I met with a finance company in New York just this week whose approach is to elevate everything in the "Program Files" folder on the local PC, excluding Internet Explorer. The policy exists because the company was all too aware that elevating Internet Explorer would be a significant risk. They seemed to miss the point that the other applications may pose significant risks too. Are they not creating a potential security hole without proper consideration?
Too little control, and unlicensed software, possibly even malware and viruses, can land on an organization's network and quickly wreak havoc. Too much control, and IT limits users' ability to do their jobs by making something as simple as installing a custom printer driver far more complicated than it should be.

URM provides the balance required to reduce management costs while giving users a greater level of personal control over a standardized environment. So let me ask you this: how is your organization currently managing this balance?

Posted by Simon Rust on 08/26/2010 at 12:49 PM


Mid-2010: What's Happened So Far, What's Next?

Yes, it is already that time again. We are in full swing in the middle of 2010, and before you know it we will be counting down to the New Year. A midyear check is always a good idea, just to make sure the years don't blend together. In the technology field this is especially true, since the market seems to change as fast as the predictions for our economic rebound. Here are a few comments on some up-and-coming technologies and what I have been seeing in the market.

Hosted Virtual Desktops
It was widely predicted that 2010 would be the proof year for hosted virtual desktops (HVD). Gartner has predicted around 66 million HVDs by 2014, and we are already starting to see that number become more of a reality.

So far this year the number of HVD pilots has risen significantly, and we are finally beginning to see the non-persistent pool model used in customer accounts. As predicted, the ROI for persistent HVDs simply does not exist for the majority of use cases, so the early adopters, typically in the finance industry, are now looking to maximize their ROI by adopting non-persistent pools.

The HVD pilots that I am seeing at this point in 2010 are ignoring the persistent pool option completely and are focusing on non-persistent with some form of user virtualization solution.

Client Hypervisors
Another hot topic predicted for 2010 was Type 1 client hypervisors, slated to be one of the main events in terms of innovative product areas. At Synergy earlier this year, Citrix released a technology preview of its XenClient product line, which allows desktops to run in a virtual machine installed directly on a user's laptop rather than on a server inside the data center. VMware's client hypervisor, unfortunately, has been delayed, with delivery looking unlikely during the second half of 2010.

Other vendors in this space, such as Neocleus and Virtual Computer, are already enjoying some limited success, which is likely to grow as and when Citrix and VMware enter the market with generally available products and begin to generate noise.

User Installed Applications
User-installed applications (UIA) are certain to be a controversial issue for the remainder of 2010, as we see more and more heated debate around users' need to install their own applications into the corporate desktop. These debates grow more interesting as UIA becomes a reality, and they show no signs of slowing down.

We will soon see more real use cases emerge in organizations; in fact, the early signs are here already. Just this week, during a short tour of customers in the New York area, we were asked several times about browser helper objects and how they were a problem area for planned non-persistent HVD solutions. As a consequence of such conversations, I am seeing organizations turn to UIA solutions as a way to make HVD viable for their users.

Others continue to believe that giving users any control over their environment is extremely bad news. The key to finding the balance is making sure UIA happens in a controlled and safe manner, with full auditing of all user-installed applications. The remainder of 2010 is going to be eye-opening as UIA technologies reach the market. By the end of the year we should have a better idea of how UIA is being used and its effect on HVDs.

One thing missing so far this year is vendors delivering their applications to customers as pre-packaged application virtualization packages. This will become more visible in the remainder of 2010. Microsoft already delivers Office 2010 in this model using its own application virtualization technology, App-V; it seems obvious that if any ISV were to lead the way in shipping a new product as an App-V package, it should be Microsoft. There are still challenges, because App-V is only available to MDOP customers, and that may affect adoption by ISVs. That said, I am aware of a couple of other vendors about to deliver their technology as an App-V package, and I look forward to seeing how their offerings compare. Let's sit back and watch this space.

Please let me know if there is anything that was missed that you think has made a major impact so far in 2010. I am interested in your feedback.

Posted by Simon Rust on 08/06/2010 at 12:49 PM


Virtual User Infrastructure: What, Why and How It Enables Desktop Virtualization

In an effort to reduce the cost of managing user desktops, enterprises have begun to think about delivering virtual desktops that can be generated on-the-fly and enable standardization to make desktop management easier.

On the market today, we see a number of solutions, in various stages of development, designed to let a single disk image support thousands of user desktops, minimizing the cost of the desktop. We are also seeing a wealth of application virtualization technologies that enable enterprise applications to be easily layered on top of the operating system. These two technologies have had a tremendous amount of thought and development put into them over the past few years and are by far the most obvious routes to lower-cost delivery of applications to end users.

If we look at the mainstream desktop virtualization vendors, they have focused almost exclusively on these two aspects. The best way to visualize this is as Lego blocks: with the right blocks we can build anything we desire. Each block represents an operating system component or an application component, all of which can simply be plugged together to form the user's desktop. When integrated, however, these technologies can become quite tricky; sometimes we build a figure, discover it does not function the way it needs to, and have to rebuild.

With these technologies, enterprises realize that isolating layers, so they can be managed individually, is the most cost-effective method. However, the integration and interoperability of the applications and operating system are crucial to the success of the desktop, and to end-user satisfaction for that matter.

Consider how things worked yesterday: applications were all locally installed on the desktop, delivered as packaged MSIs or even installed by IT from CD/DVD/USB drive. All applications sat together with no isolation from one another, which meant no integration worries when it came to applications being aware of each other, but it routinely created incompatibilities between applications, which is in many ways exactly why application virtualization vendors exist today. Here we have created a Catch-22: the very technology created to fix application compatibility issues causes an application interoperability issue, making the desktop harder to manage for the user.

It can be argued that the user experience is without question the MOST important aspect of desktop delivery, and this remains true whether the desktop is physical or virtual. Studies have shown that if users do not accept the solution during proof of concept or pilot, virtual desktops will simply not be adopted in that enterprise.

In order to find the balance between delivering the best user experience and reducing desktop management costs, some form of Virtual User Infrastructure (VUI) needs to be implemented. Its role would be to pull together the various forms of application virtualization at the desktop (regardless of whether that desktop is virtual, physical, terminal services or a mixture of these) and let the user work with applications without being hampered by the aforementioned interoperability challenge. VUI is all about ensuring the user has a pleasant desktop experience.

VUI needs to cater to:

  • User personalization independent of
    • Platform (physical, virtual, RDS etc)
    • Application virtualization technology
  • User installed application management
    • Business applications that are required but have not yet been packaged for the desktop
  • User data management
    • User documents and other data that is created by users within their applications

These are commonly seen as the three key areas that a VUI solution needs to be able to deliver against in order to satisfy end users and ensure enterprises are being most cost-effective.
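
One way to picture the user layer is as a single per-user record carrying those three areas, managed independently of how any particular desktop is composed (the field names here are illustrative, not drawn from any product):

    user_layer = {
        "personalization": {              # follows the user across platforms and
            "wallpaper": "beach.jpg",     # application virtualization boundaries
            "custom_dictionary": ["virtualization", "componentization"],
        },
        "user_installed_apps": [          # business apps IT has not yet packaged
            {"name": "TradeAnalyzer", "installed_by": "simon", "audited": True},
        ],
        "user_data": {                    # documents and data created in-app
            "Documents": "\\\\fileserver\\users\\simon\\Documents",
        },
    }

    # Whatever delivers the desktop (physical, VDI, RDS), the same record is
    # layered in at logon so the user's environment looks the same everywhere.
    print(sorted(user_layer))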

I am extremely interested to hear your feedback on this…how do you define the role of the user?

Posted by Simon Rust on 07/29/2010 at 12:49 PM

