E-Handbook: How to use GPU for rendering graphics in VDI deployments

High-quality graphics are critical to VDI deployments

Before high-definition TV came out, I'd never really watched a football game and thought, "I have no idea what's going on because this image is just so darn unclear."

As time went on, however, I had to make sure I was in front of an HD television every Sunday during the season. Anything less just wouldn't cut it. Now, HD feels absolutely necessary for almost any type of program. It might seem silly on the surface, but I have come to expect a certain image quality when I watch TV, and I will not compromise.

This same idea carries over to desktop and application virtualization. In the past, many users didn't really need to offload graphics processing from their devices to a graphics processing unit (GPU) on a server, because they worked with simple apps such as Microsoft Word. The CPU on their devices could handle the job. Server-side GPUs were reserved for heavy hitters such as users of computer-aided design applications.

Now, even simple apps, including Word, can require a GPU for rendering graphics, because the CPU alone can't keep up. And it's not just apps: Windows 10 pushes graphics processing to its limits, and users expect their web browsers to display large images in high quality and play crystal-clear video. Graphics processing is nearly universal now, so VDI shops had better know what to do.

Use this three-part handbook to sharpen your view of how to deploy GPU-enabled virtual applications and more.
