Ten steps for smooth desktop virtualization deployment

Pulling off a successful desktop virtualization deployment means first understanding which virtualization method is best for your client. This 10-step plan addresses common mistakes made in desktop virtualization deployments and offers guidelines for taking the correct approach.

Desktop virtualization can reduce endpoint computing requirements and consolidate desktop processing and management in the data center. But despite these significant benefits, the technology is still in its infancy, and desktop virtualization deployments can be problematic without careful planning. The following guidelines will help you plan and manage successful desktop virtualization deployments.

Choose the right virtualization approach

There are two basic methods for creating a centralized thin-client environment for customers: virtual desktop infrastructure (VDI) and remote access terminal services. VDI, the most common approach, runs the entire desktop environment on a server as a virtual machine. Remote access terminal services host applications on a central server and present those applications to desktop users, who can connect from any location -- a technique known as application virtualization. The most popular of these remote access offerings are Citrix Systems' Presentation Server (now called XenApp) and Microsoft's Terminal Services for Windows Server 2008.

Solutions providers must understand their customers' needs and business problems before recommending a virtualization approach. Going to customers with a preconceived solution and trying to force-fit that approach will lead to failed deployments.

"That's the reason that implementations have not progressed," said Barb Goldworm, president and chief analyst of Focus Consulting, a research and analysis firm in Boulder, Colo. "And it may not be one solution for the entire infrastructure."

Assess the network before deployment

The heavy lifting of desktop virtualization falls to customers' networks, which is where data is exchanged between the servers that process it and the endpoints that display it. Before deploying desktop virtualization, solutions providers need to examine the network and verify adequate bandwidth. (Gigabit Ethernet, or 1 GbE, is typically considered essential for virtual desktop deployments.) Also, there should be enough bandwidth available to support the peak load of every virtual desktop user. Otherwise, desktop performance will suffer.
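As a rough illustration of that bandwidth check, the short Python sketch below compares the aggregate peak demand of all virtual desktop sessions against a 1 GbE link. The user count, per-session peak bandwidth and 30% headroom target are illustrative assumptions, not vendor guidance; real figures should come from the customer's own monitoring data.

# Rough bandwidth sanity check for a virtual desktop rollout.
# All per-user figures are assumptions for illustration; substitute measured values.

def aggregate_peak_demand_mbps(users: int, peak_kbps_per_user: float) -> float:
    """Aggregate demand in Mbps if every user hits peak simultaneously."""
    return users * peak_kbps_per_user / 1000.0

link_capacity_mbps = 1000.0   # 1 GbE uplink (assumed)
users = 250                   # concurrent virtual desktop users (assumed)
peak_kbps_per_user = 500.0    # assumed per-session peak; measure rather than guess

demand = aggregate_peak_demand_mbps(users, peak_kbps_per_user)
if link_capacity_mbps - demand < 0.3 * link_capacity_mbps:   # keep ~30% headroom (assumed target)
    print(f"Peak demand {demand:.0f} Mbps leaves too little headroom -- upgrade or segment the network")
else:
    print(f"Peak demand {demand:.0f} Mbps fits comfortably on a {link_capacity_mbps:.0f} Mbps link")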

In addition, network monitoring tools can help gather and report details of the network's behavior. Solutions providers should look for stability or disruption patterns that might suggest other problems. Until the network demonstrates the performance and stability needed to support virtual desktop users, a desktop virtualization deployment should not move forward. Network upgrades may be required to improve performance and resolve operational problems before rolling out a desktop virtualization project.

Develop a desktop virtualization server strategy

Servers are the heart of desktop virtualization, so it's absolutely critical that CPU, memory and I/O resources are available to support the peak processing demands of virtual desktop users. Solutions providers should use server monitoring tools to gather and report data about the system's resource utilization.

Desktop virtualization servers are potential points of failure. If a server fails, all of the virtual desktops it hosts will go down. Solutions providers must consider this possibility before deployment. One simple tactic is to spread the virtual desktops across multiple servers, so a fault in one server won't disrupt all users. A more advanced tactic is to deploy a server cluster for virtual desktops. A cluster spreads the processing workload across all of its servers and can shift the load to surviving servers in case of a fault, minimizing downtime for users.
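A back-of-the-envelope sizing sketch, in Python, shows how this thinking translates into host counts. The per-desktop memory, per-host capacity and CPU ceiling below are assumptions for illustration; in practice they should be derived from the monitoring data described above.

import math

desktops = 500                # virtual desktops to host (assumed)
ram_per_desktop_gb = 2        # assumed average working set per desktop
ram_per_host_gb = 128         # usable memory per server after hypervisor overhead (assumed)
cpu_limit_per_host = 60       # assumed CPU-bound ceiling of desktops per host

# Each host is limited by whichever resource runs out first.
per_host_limit = min(ram_per_host_gb // ram_per_desktop_gb, cpu_limit_per_host)
hosts_needed = math.ceil(desktops / per_host_limit)
hosts_with_failover = hosts_needed + 1   # N+1: keep capacity to absorb one host failure

print(f"{per_host_limit} desktops per host -> {hosts_needed} hosts required, "
      f"{hosts_with_failover} with N+1 failover headroom")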

Upgrade storage to support desktop virtualization

Storage is often overlooked when deploying desktop virtualization. Desktop virtualization -- especially the VDI approach -- puts every PC's data on servers, so the data center SAN needs enough additional storage to host all of these virtual desktops.

"If I had 1,000 users with 20 GB hard drives, and I move that to the SAN on a VDI … I'd have to go buy 20 TB of expensive SAN storage," Goldworm said.

Such a storage purchase could dramatically change the cost of the deployment. Solutions providers must estimate their customers' storage requirements and integrate this additional storage with existing SAN environments before deploying desktop virtualization.

There are ways to cut these potentially project-stopping costs, and data deduplication is a particularly good strategy. Most virtual desktops use many of the same files, and data deduplication stores only one copy of those files and allows all the virtual desktops to access them. It can also store a single "gold image" of the operating system and applications users need and simply load them into memory at boot-up.
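The arithmetic behind Goldworm's example, plus a rough view of what deduplication might recover, can be sketched in a few lines of Python. The deduplication ratio is purely an assumption for illustration; actual savings depend on how similar the desktop images really are.

users = 1000
gb_per_desktop = 20
raw_tb = users * gb_per_desktop / 1000     # 20 TB of raw SAN capacity, as in the example above

assumed_dedup_ratio = 0.7                  # assume 70% of desktop data is duplicated (illustrative)
deduped_tb = raw_tb * (1 - assumed_dedup_ratio)

print(f"Raw requirement: {raw_tb:.0f} TB")
print(f"With ~{assumed_dedup_ratio:.0%} deduplication: roughly {deduped_tb:.0f} TB")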

Address protocol interoperability and graphics acceleration hardware

Desktop virtualization can employ Microsoft's Remote Desktop Protocol (RDP), the Independent Computing Architecture (ICA) used by Citrix Systems and other protocols that exchange client-server data. Solutions providers must make sure their customers' desktop endpoints are compatible with the virtualization protocols in use. (Most thin-client endpoints are compatible with both protocols.)
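One quick way to confirm that endpoints can reach the virtualization hosts over the expected protocols is a simple port check, sketched below in Python. The hostnames are placeholders; the port numbers are the well-known defaults for RDP (TCP 3389) and Citrix ICA (TCP 1494) and should be adjusted if the customer uses nonstandard ports or gateways.

import socket

CHECKS = [
    ("vdi-broker.example.local", 3389, "RDP"),   # placeholder hostname
    ("xenapp01.example.local", 1494, "ICA"),     # placeholder hostname
]

for host, port, proto in CHECKS:
    try:
        # Attempt a TCP connection from the endpoint network to the host.
        with socket.create_connection((host, port), timeout=3):
            print(f"{proto} reachable at {host}:{port}")
    except OSError as err:
        print(f"{proto} NOT reachable at {host}:{port}: {err}")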

Current protocols were designed with graphics support in mind, but high-end graphics and visualization software (such as CAD) can still strain software-based protocols. Some virtual desktops may benefit from the TERA chipset from Teradici Corp. or other hardware-based virtual graphics processors. A server-side chip, or host processor, integrates with existing CPUs to encode the desktop environment for transmission across the IP network, while a client-side adapter, or portal processor, decodes the content for display.

Deploy desktop virtualization in phases

Even with the most comprehensive monitoring and planning, it's not always easy to predict the success of a desktop virtualization deployment. Many solutions providers prepare mockup deployments in lab environments using equipment similar to their customers' and even simulate workloads to prepare for real deployments.

"Get a little experience with desktop virtualization before rolling it out to the client," said independent technology consultant Alex Zaltsman. "I suggest a pilot rollout with a segment of users that are not mission critical."

With pilot rollouts, solutions providers can make adjustments, train users, demonstrate value, and get management buy-in before deploying desktop virtualization company-wide.

Be aggressive with endpoint security

Users have less control over the content and management of virtual desktops than they do over traditional PCs. But solutions providers still need to perform due diligence when deploying desktop virtualization and secure each endpoint aggressively.

"You need to take some time to really lock that machine down, so that the user can only operate in the virtual desktop environment," said Brien Posey, an independent technology consultant in Rock Hill, S.C.

Strong endpoint security makes it less likely that users will affect their local operating systems or make other unexpected changes to the endpoint.
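As one illustration of the kind of lockdown Posey describes, the Python sketch below sets the registry values behind two common Windows user restrictions: disabling Task Manager and hiding Control Panel. In practice these settings are usually pushed through Group Policy rather than a script; this is only a sketch of the policy itself, it requires Windows and appropriate rights, and the exact restrictions applied should follow the customer's own security baseline.

import winreg

# Registry equivalents of two common Group Policy user restrictions.
POLICIES = [
    (r"Software\Microsoft\Windows\CurrentVersion\Policies\System", "DisableTaskMgr", 1),
    (r"Software\Microsoft\Windows\CurrentVersion\Policies\Explorer", "NoControlPanel", 1),
]

for subkey, value_name, value in POLICIES:
    # Create (or open) the per-user policy key and set the restriction.
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, subkey) as key:
        winreg.SetValueEx(key, value_name, 0, winreg.REG_DWORD, value)
        print(f"Set {subkey}\\{value_name} = {value}")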

Take strong central security measures

In addition to locking down each endpoint, it's important for solutions providers to address access control and ensure that each user is configured with least-privilege roles and policies. If the customer does not use group rights or least-privilege policies, solutions providers can sometimes take the opportunity to implement those security changes before deploying desktop virtualization. It doesn't make much sense for a client to strengthen its infrastructure through desktop virtualization if every user is just going to have access to everything on the network.
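A small, purely hypothetical audit sketch in Python shows the idea: compare each user's group memberships against the groups their role is supposed to grant and flag anything extra. The role map and user records here are stand-ins; in a real engagement this data would come from the customer's directory service.

# Hypothetical role-to-group policy; real data would come from the directory service.
ROLE_GROUPS = {
    "finance":   {"finance-users", "vdi-users"},
    "sales":     {"sales-users", "vdi-users"},
    "help_desk": {"help-desk", "vdi-users", "remote-support"},
}

directory = {
    "asmith": {"role": "finance", "groups": {"finance-users", "vdi-users", "domain-admins"}},
    "bjones": {"role": "sales",   "groups": {"sales-users", "vdi-users"}},
}

for user, record in directory.items():
    excess = record["groups"] - ROLE_GROUPS[record["role"]]
    if excess:
        print(f"{user}: exceeds least privilege, extra groups: {sorted(excess)}")
    else:
        print(f"{user}: matches least-privilege policy")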

Implement and test backup services

An important weakness in many corporations is the vulnerability of user data. Many users back up their data infrequently or inconsistently, which can result in data loss when disaster strikes or the endpoint PC fails. By comparison, data centers back up servers with great regularity and often take disaster recovery (DR) into consideration by sending backups to remote storage or backup data center locations.

When customers back up their virtual desktop servers and the corresponding SAN content, they preserve the data used by every endpoint, bringing effectively all desktop data under centralized protection. When deploying desktop virtualization, solutions providers should include virtual desktop servers and endpoint data storage in the overall backup and DR plan. It is also a good opportunity to review customers' existing backup and DR strategies and identify potential future projects.

Train customers after deploying desktop virtualization

The move to desktop virtualization will change the way customers use and manage their endpoints, so training is often part of the deployment cycle. End users don't need much training; the desktop experience should remain almost unchanged. But users should know about changes to their rights and controls over their desktops. Administrators and help desk personnel may require more training to understand the virtual desktop environment and the levels of controls that are available. Training should start during the pilot program and expand throughout the desktop virtualization deployment.

Next Steps

Learn more about desktop virtualization in our Hot Spot Tutorial for solutions providers.
