Sean's Virtual Desktop

Better VDI Graphics Through Virtual GPU

Virtual GPU, a new feature in vSphere 6.0, is VMware's best attempt yet at supporting 3D graphics in a virtual desktop environment.

Building virtual desktop environments to support high-end applications can be challenging. One of the biggest challenges in this realm is supporting 3D graphics inside the virtual desktop. Although VMware has supported 3D graphics since Horizon View 5.2, there are some significant drawbacks.

VMware has attempted to address this with a new feature in vSphere 6.0 called Virtual GPU. Virtual GPU, or vGPU for short, allows an NVIDIA GRID GPU to be shared among multiple virtual desktops while providing direct access to the hardware using standard NVIDIA drivers. Although this is new to vSphere, it's not a new technology.

Citrix XenServer has had this feature for some time, and Microsoft added a similar feature, RemoteFX vGPU, in Windows Server 2012 R2. RemoteFX vGPU does not use the NVIDIA GRID vGPU software that Citrix and VMware use; instead, it relies on technology Microsoft developed in-house.

History of GPU Technologies in vSphere
vGPU isn't VMware's first attempt at providing 3D graphics acceleration to virtual desktops; vSphere already offers two other methods:

  • Virtual Shared Graphics Acceleration (vSGA): Introduced in vSphere 5.1 Update 1 for supported NVIDIA Quadro and GRID cards, vSGA allows multiple virtual desktops to share physical GPUs. To enable this, the GPU vendor's driver is installed in the hypervisor, and the vSphere Soft3D driver is installed in the guest OS. vSGA supports a maximum of 512MB of video RAM per virtual machine (VM); half of this VRAM is reserved on the GPU, and the other half is reserved from the ESXi host's RAM. It only supports DirectX 9.0c and OpenGL 2.1, so it may not be suitable for the latest 3D-enabled applications.
  • Virtual Dedicated Graphics Acceleration (vDGA): Introduced in vSphere 5.1 Update 1 and supported in Horizon View 5.3, vDGA attaches a physical GPU directly to a VM using PCI pass-through, and the standard GPU drivers are installed in the guest OS. This enables the VM to utilize the full capabilities of the GPU. There are two tradeoffs to this approach. The first is that the VM is essentially pinned to the host and can't be migrated, because physical hardware is passed into the guest OS. The second is capacity: only a limited number of GPUs can be installed in a physical host, and this dictates both the number of VMs that can use vDGA and the number of hosts required (the capacity math for both methods is sketched in the example after this list).
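
To make those capacity tradeoffs concrete, here's a quick back-of-the-envelope sketch in Python. The hardware numbers in it (4GB of VRAM per physical GPU, two GPUs per host) are hypothetical placeholders, not vendor specifications; only the 512MB vSGA ceiling and its half-GPU, half-host-RAM split come from the descriptions above.

    # Rough capacity math for vSGA vs. vDGA on a single host.
    # GPU_VRAM_MB and GPUS_PER_HOST are assumed example values.
    VSGA_MAX_VRAM_PER_VM_MB = 512   # vSGA's per-VM video RAM ceiling
    GPU_VRAM_MB = 4096              # assumed VRAM per physical GPU
    GPUS_PER_HOST = 2               # assumed GPUs installed per host

    # vSGA splits each VM's allocation evenly: half is reserved on
    # the GPU, half is reserved from the ESXi host's RAM.
    gpu_share_mb = VSGA_MAX_VRAM_PER_VM_MB // 2   # 256MB on the GPU
    host_share_mb = VSGA_MAX_VRAM_PER_VM_MB // 2  # 256MB of host RAM

    vsga_vms_per_host = (GPU_VRAM_MB // gpu_share_mb) * GPUS_PER_HOST

    # vDGA passes a whole physical GPU through to exactly one VM, so
    # the GPU count is the VM count.
    vdga_vms_per_host = GPUS_PER_HOST

    print(f"vSGA: up to {vsga_vms_per_host} 3D VMs per host, "
          f"plus {host_share_mb}MB of host RAM reserved per VM")
    print(f"vDGA: up to {vdga_vms_per_host} 3D VMs per host")

With these assumed numbers, vSGA fits 32 3D-enabled desktops per host at the cost of limited API support, while vDGA fits only two. That gap is exactly what vGPU tries to close.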
How vGPU Works
vGPU attempts to combine the best of vSGA and vDGA. Like vSGA, the graphics processors and video RAM are shared among multiple virtual desktops in a vGPU environment. Like vDGA, the VMs utilize native NVIDIA drivers and pass their commands directly to the GPU. A software component, the NVIDIA vGPU Manager, is installed on each ESXi host, but its main function is to ensure that each VM has the correct graphics resources assigned to it.

vGPU utilizes profiles to allocate graphics resources to VMs. These profiles control the amount of video RAM allocated to each VM, the maximum number of monitors, and, ultimately, the maximum number of users per GRID card. Profiles are assigned per physical GPU on the card: every VM on a given physical GPU must use the same profile, but because a card contains multiple GPUs, it's possible to have multiple profiles running on a single card.

Figure 1. For an NVIDIA GRID K1 card, up to eight users are supported per physical GPU, depending on vGPU profiles. Graphic courtesy of NVIDIA.

For example, an NVIDIA GRID K1 card has four GPUs, and each GPU has 4GB of video RAM available (see Figure 1). The low-end profile grants 512MB of video RAM to each virtual desktop, which works out to eight users per GPU, or up to 32 total users on the K1 card. The high-end profile grants all 4GB of a GPU's video RAM to a single virtual desktop, which limits the K1 card to four users. It's also possible to mix and match profiles, so you could have two high-end users and 16 low-end users on the same GRID K1 card.
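
That arithmetic is easy to sketch in code. The following Python snippet models the rule that each physical GPU runs a single profile while the card as a whole can mix them; the profile names are illustrative labels, not NVIDIA's actual profile names, and the only real numbers are the GRID K1 figures quoted above.

    # Users-per-card math for a GRID K1: four physical GPUs with
    # 4GB (4096MB) of video RAM each. Profile names are made up.
    GPUS_PER_CARD = 4
    VRAM_PER_GPU_MB = 4096

    PROFILE_VRAM_MB = {
        "low-end": 512,    # 512MB per desktop -> 8 desktops per GPU
        "high-end": 4096,  # whole GPU per desktop -> 1 per GPU
    }

    def users_per_gpu(profile):
        """Desktops one physical GPU can host under a given profile."""
        return VRAM_PER_GPU_MB // PROFILE_VRAM_MB[profile]

    def users_per_card(gpu_profiles):
        """Total desktops on a card, one profile per physical GPU."""
        assert len(gpu_profiles) <= GPUS_PER_CARD, "only four GPUs per K1"
        return sum(users_per_gpu(p) for p in gpu_profiles)

    # All four GPUs on the low-end profile: 4 x 8 = 32 users.
    print(users_per_card(["low-end"] * 4))

    # Mix and match: two GPUs dedicated to high-end users and two on
    # the low-end profile -> 2 high-end + 16 low-end = 18 users.
    print(users_per_card(["high-end", "high-end", "low-end", "low-end"]))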

vGPU Requirements
There are a few requirements that need to be met in order to utilize vGPU. The biggest is hardware: an NVIDIA GRID card is required, installed in a server that has been certified by NVIDIA (NVIDIA publishes a list of certified servers on its Web site). Each server vendor offers different configurations and options, so you'll need to check with your vendor to determine the best hardware configuration for your environment. The other requirements for vGPU:

  • vSphere 6.0
  • Horizon 6.1
  • A Windows virtual machine with virtual hardware version 11 (ESXi 6.0)

That covers the basics of what vGPU is and the system requirements for utilizing it in your VDI environment. In the next installment, we'll go over the steps for installing and configuring vGPU.

About the Author

Sean Massey is a systems administrator from Appleton, Wisc. He blogs about VDI, Windows PowerShell, and automation at http://seanmassey.net, and is active on Twitter as @seanpmassey.
