Archive for category Desktop Virtualization / VDI

Windows 8 Developer Preview: Virtualization Options (VMware Workstation)

For those of us developers who are itching to get our hands (and fingers) on the recently released Windows 8 Developer Preview, you might hit an unexpected snag.  If you’re like me, the first thing you tried after downloading the bits was to create a new virtual machine in your favorite desktop virtualization platform (most likely Microsoft Windows Virtual PC or VMware Workstation).  However, after creating the VM, attaching the ISO, and booting, I encountered the following error:

VMware Workstation internal monitor error

vcpu-0:NOT_IMPLEMENTED vmcore/vmm/intr/apic.c:1903


Unfortunately, it looks like the VMware Workstation 7.x platform (and, reportedly, Virtual PC, though I haven’t tested it myself) does not yet support Windows 8.  Perhaps I should have realized that a lot of the under-the-covers boot and CPU optimizations would require an architectural shift to support the required CPU instructions.

Options That Should Work

Though this might not be ideal for all users, there are several options to get the Developer Preview of Windows 8 running in a virtual machine:

  • Use Microsoft’s Hyper-V: If you have a Windows Server 2008 or 2008 R2 installation (or the stand-alone Hyper-V Server), you should be able to spin up a new Windows 8 VM quickly and easily.  It’s not desktop virtualization, but if you have a spare machine that supports Hyper-V’s CPU requirements, you should be all set.
  • Wait for the release of VMware Workstation 8.  While I haven’t yet tried it myself, there are reports of people having success with the beta of the upcoming release of VMware’s Workstation product.  A beta virtualization stack with a Developer Preview OS – How’s that for living on the edge?  It looks like the product is officially available from VMware now and you can request a VMware Workstation 8 Evaluation online (registration required).
    • Update: I downloaded a 30-day evaluation version of VMware Workstation 8, and the Windows 8 Developer Preview installed fine, with one minor catch: Don’t use the VMware "Easy Install" option, as it’s based on the automatic install procedures for Windows 7.  Other than that, I’m up and running!
  • VirtualBox apparently supports the Windows 8 Developer Preview (again, I haven’t yet tried it myself).  The application is available for free download.  The Windows 7 Hacker site has a walkthrough titled Install Windows 8 Developer Preview on VirtualBox.

Dual-Boot / Clean Install

Of course, you could skip virtualization altogether and install Windows 8 directly on your hardware.  That would give the best overall performance and the best experience with the new Metro UI.  You could install the Windows 8 Dev Preview alongside your current OS (though you might need to repartition), or you can just pop a spare hard drive in your computer to avoid any messy boot complications.  In general, this approach has worked great for me in the past.

Another option is to Boot to VHD.  That’s a significantly more complicated process, but the blog post Installing Windows 8 on Bare Metal with VHD-Boot should help.
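For the curious, the core of the VHD-boot procedure looks roughly like the following transcript.  The file path, size, and description string are only examples of mine, not prescribed values; see the linked post for the complete, authoritative steps.

```
rem From an elevated command prompt: create and attach the VHD
diskpart
DISKPART> create vdisk file="C:\VHDs\Win8.vhd" maximum=40960 type=expandable
DISKPART> attach vdisk

rem After installing Windows 8 into the attached VHD, add a boot entry:
bcdedit /copy {current} /d "Windows 8 Developer Preview (VHD)"
rem Substitute the GUID returned by the previous command below:
bcdedit /set {guid} device vhd=[C:]\VHDs\Win8.vhd
bcdedit /set {guid} osdevice vhd=[C:]\VHDs\Win8.vhd
```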

A Note About the Developer Preview

While we’re all itching to try the new UI and functionality in Windows 8, there are a couple of things to keep in mind.  First, this version is not yet a “beta”.  It’s an early release that is designed to get developers up and running.  The official build number is Build 8102 M3.  Even if you’re like me and you’re willing to live on the bleeding edge with software, you probably don’t want to install this build as your primary OS.  Furthermore, Microsoft has mentioned that several features are not included in this build (though I haven’t yet run into anything that’s a showstopper for me).

On the brighter side, this build does not require product activation.  Coupled with the easy accessibility of the download from the Windows 8 Developer Preview site, that effectively means everyone will have easy access to this preview release.  Downloads are also available for MSDN Subscribers.

For More Information…

In case you missed it, Microsoft’s BUILD Conference keynotes are available at the BUILD Conference Web Site.  The Day 1 Keynote covered dozens of really exciting features, presented by Steven Sinofsky and several other Microsoft Program Managers.  I’m just getting started with my testing/development, and I’ll try to post more here once I have something of value.

Update (09/19/2011)

Based on the number of hits to this article, it seems this is a fairly important topic.  On the Building Windows 8 blog, Microsoft has recently published a post titled, Running Windows 8 Developer Preview in a virtual environment.  It includes more details on the pros and cons of running Windows 8 using virtualization and provides the following summary:


Environments that should work:

  • Hyper-V in Windows 8 Developer Preview
  • Hyper-V in Windows Server 2008 R2
  • VMware Workstation 8.0 for Windows
  • VirtualBox 4.1.2 for Windows


Environments that will not work:

  • Microsoft Virtual PC (all versions)
  • Microsoft Virtual Server (all versions)
  • Windows 7 XP Mode
  • VMware Workstation 7.x or older

For now, I’m happily running Windows 8 test VMs on an evaluation version of VMware Workstation 8.0 and in Hyper-V on Windows Server 2008 R2 SP1.  Next stop: Running on some fairly recent hardware.

VDI: Virtuality vs. Reality

The idea of virtualizing desktops (often referred to as Virtual Desktop Infrastructure, or VDI) would appear to be gaining traction and mindshare.  Indeed, companies like VMware and a host of other smaller virtualization technology providers are spending large amounts of their budgets on promoting and enabling VDI.  However, many IT pros (myself included) are skeptical.

A recent article written by Christina Torode, Users eye VDI but may wait for client hypervisors, is now available on TechTarget’s SearchWinIT site.  I was interviewed for the article and got a chance to provide some of my input.  Here’s an excerpt of some of my comments quoted in the article:

Cheaper alternatives to desktop virtualization

For independent consultant Anil Desai, VDI presents a dilemma. It promises to address security problems such as lost laptops and give IT better control over remote workforces. But he doesn’t see virtual desktop technology as the best way to solve these and other business problems.

He said there are more cost-effective ways to reduce security risks and gain control over user devices with existing technologies. There is the ability in Windows to restrict access to the USB drive or to improve manageability with remote management tools that let IT cut physical visits to desktops and use the Remote Desktop Protocol, just as VDI uses.

Another example is the alternative of Windows Server 2008 Terminal Services for resource, hardware and management consolidation versus using VDI. Terminal Services in Windows Server 2008 lets IT run a single application in a virtual environment, in turn centralizing application management, he said.

Then there’s the overall cost for a virtual desktop infrastructure versus buying desktops. "When you see how much infrastructure, power and server resources go into a VDI solution versus getting desktops that have come down so much in price, I just don’t see the justification for that kind of investment," Desai said.

Desai said he is backing the concept of a client hypervisor and is waiting to see what the big three — VMware, Microsoft and Citrix — will do in this area. "It can reduce potential application conflicts and speed up deployments on many operating system platforms," he said.

Overall, it will be interesting to see what happens here – will VDI be just another over-hyped technology that never makes significant inroads into corporate IT?  Or is this a real technology that will start replacing full desktops?

The Case Against Desktop Virtualization

Virtual Strategy Magazine has recently published my article, The Case Against Desktop Virtualization.  From the introduction to the article:

Ladies and gentlemen of the jury: You are being called upon to partake in one of the most important duties of an IT professional.  You will be asked to objectively evaluate claims and determine whether a relatively new development in virtualization technology – desktop virtualization – is a valid and useful solution for your environment.  You have already heard many strong arguments for desktop virtualization from much of the industry.  You will now hear from the other side: A discussion of how you can gain many of the benefits of virtualization without moving desktop computing to the confines of the data center.

OK, all drama aside, I should be clear about the point of this article.  My goal is not to convince you that desktop virtualization is not a good idea.  Rather, I’d like to provide some counter-point to a lot of the hype that we have been hearing lately.  Specifically, I’ll point out how many of the problems that desktop virtualization is designed to solve can be addressed in other ways.  The goal for you, the reader, is to determine which of these is the best way to solve these problems.  Order in the court!

Perhaps it’s a bit too dramatic, but I think it presents a good case, overall.  Feel free to leave your pleas and judgments here.

Desktop Virtualization: The Next IT Fad?

In the past, I wrote a couple of articles related to Virtual Desktop Infrastructure (VDI) (for the articles, see the VDI/Desktop Management Category).  The main idea is that, by running desktops within VMs, you can solve a lot of IT management challenges.  I’m not sold on the idea, and it appears that I’m not alone.  Hannah Drake, Editor, asks Client-side virtual desktop technology: Unnecessary?  The article quotes some interesting points from an IT consulting company.  I added my $.02 in the comments section.  Feel free to comment here, as well: Is VDI a fad, or is it a real solution for current IT problems?

Microsoft’s Virtualization Options (Upcoming Webcast)

There seems to be a lot of confusion out there related to different methods of virtualization.  In short, it’s not all about running multiple operating systems on the same system at the same time.  You can also virtualize and isolate specific programs (for example, within a Java Virtual Machine).  There are also other approaches.  Microsoft refers to its Terminal Services feature as “presentation virtualization.”  Most of us are quite familiar with using the Remote Desktop Protocol (RDP) to remotely manage a computer or to remotely run applications.  But with Terminal Services, applications actually execute on the server.  What if you want them to run on the client (where CPU, memory, disk, and network resources are arguably far cheaper)?

Microsoft SoftGrid (formerly Softricity) is designed to do just that.  An upcoming webcast will help explain the approach of deploying applications on-demand: TechNet Webcast: Introducing SoftGrid to the World (Level 200)

Which to use, Microsoft SoftGrid or Terminal Services? Both of the fictional companies in our webcast, Contoso and Fabrikam, are considering application virtualization, and they have heard of both Terminal Services and SoftGrid. But which do they choose? In this session, we look at these solutions, provide details on how they differ, and explain when to use them. We also cover how to install, configure, and use SoftGrid.

Better yet, the technologies can successfully be used together.  Unfortunately, one of the drawbacks of SoftGrid is that it requires an Enterprise-level license for organizations that wish to deploy it.  There are hints that this will soon change to make SoftGrid a lot more accessible to the masses (I’d consider using it for my home office).

Of course, there’s also an option not to virtualize at all.  If you’re trying to consolidate, for example, Microsoft SQL Server machines, there’s probably a better way to consolidate your databases.  The bottom line is that there are a lot of different options for obtaining the benefits of virtualization.

Desktop Virtualization: Evaluating Approaches and Solutions

This article was first published on

Visualize the unglamorous task of crawling behind a dusty desktop computer to check for an unplugged cable. This is in response to a report that a user’s computer is “broken”. You have only the soothing sounds of an employee complaining to a friend about how IT is taking forever to solve the problem. Finish the job quickly, as you’ll soon be off to troubleshooting an application compatibility problem on an executive’s notebook computer. Assuming you haven’t left the IT industry altogether after reading this paragraph, I think you’ll agree that there are many compelling reasons for addressing desktop and application management issues.

In the first Tip in this series on desktop virtualization, I defined the goals and problems that we were trying to solve. I provided an overview of the various approaches (and noted that there’s often some disagreement over specific terminology). Here, I’ll cover the pros and cons of specific approaches, along with applications you might want to consider.

Presentation Virtualization Solutions

In presentation virtualization, user input and application output are managed using a network connection. Applications run on the server, and screen updates are sent to a thin client or desktop computer. Some solutions can virtualize individual applications, can work securely over Internet connections, and can be integrated with a variety of network-level access methods.

  • Benefits: Scalability is a big one: up to hundreds of simultaneous application sessions can be created on a single server. Applications management can be simplified since deployment to desktops is no longer a requirement. Access to applications can be managed centrally, and data may be stored securely on back-end systems.
  • Drawbacks: Applications must be compatible with the virtualization solution. While utilities are available for assisting in this area, they’re far from foolproof. Additionally, all users will be using the same OS version on the server side. Reliability is also a concern, especially when running business-critical client applications, due to the number of sessions that must be maintained. When running on slow connections, slow application responses can hurt the end-user experience.
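To make the scalability point above concrete, here is a toy back-of-envelope sizing calculation.  The per-session and overhead numbers are purely illustrative assumptions on my part, not vendor sizing guidance.

```python
# Back-of-envelope sizing for a presentation-virtualization host.
# All numbers are illustrative assumptions, not vendor guidance.

def max_sessions(server_ram_gb, os_overhead_gb=4, ram_per_session_mb=150):
    """Estimate how many concurrent sessions a host's memory can support."""
    usable_mb = (server_ram_gb - os_overhead_gb) * 1024
    return usable_mb // ram_per_session_mb

print(max_sessions(32))   # sessions on a 32 GB host
print(max_sessions(64))   # sessions on a 64 GB host
```

In practice, CPU, disk I/O, and the mix of applications matter at least as much as memory, so treat any such estimate as a starting point for pilot testing.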

Products and Solutions:

Application and OS Virtualization Solutions

Realizing that the primary purpose of desktop computers is to allow users to run applications, some vendors have focused on using application deployment and management solutions. The goal here is to allow many different applications to coexist on a single operating system that runs on users’ hardware.

  • Benefits: Users can run their operating systems and applications locally, leading to better performance and support for disconnected scenarios. IT departments can avoid application compatibility and deployment issues and can more easily keep track of licensing. Overall scalability is often far higher than that of virtualizing entire desktop operating systems.
  • Drawbacks: Applications may need to be modified (or at least tested) when running with these solutions. The base OS is shared among all applications and application environments, so all applications must run on the same basic platform.

Products and Solutions:

VM-Based Virtualization Solutions

There’s no reason that the benefits of server virtualization can’t be extended to desktop machines. VM-based virtualization involves the creation of VMs (either on-demand or permanent) for users on data center hardware. Users access their individual VMs using a thin client device or a remote desktop solution. On the server side, a “connection broker” layer is able to ensure that the right VMs are available and that users connect to their own systems.

  • Benefits: All OS’s and user data are stored within the data center (presumably on fault-tolerant, high performance hardware). This enables centralized management and increases average utilization on all of the systems that an IT department supports. Security risks are decreased, as are costs related to managing client-side computers.
  • Drawbacks: Entire operating systems are running for each user. This can limit scalability and increase costs related to storage. Additionally, users require a network connection in order to access their computers. Finally, server-side hardware resources can be far more costly than their desktop counterparts.
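As a rough illustration of the “connection broker” role mentioned above, here is a minimal sketch.  The class and VM names are hypothetical; real brokers also manage VM power state, load balancing, and session reconnection.

```python
# Minimal sketch of a VDI "connection broker": it maps users to their
# assigned VMs and hands out a pooled VM when no dedicated one exists.

class ConnectionBroker:
    def __init__(self, pool):
        self.pool = list(pool)    # unassigned, pre-provisioned VMs
        self.assignments = {}     # user -> VM name

    def connect(self, user):
        """Return the VM a user should be directed to."""
        if user not in self.assignments:
            if not self.pool:
                raise RuntimeError("no VMs available in pool")
            self.assignments[user] = self.pool.pop(0)
        return self.assignments[user]

broker = ConnectionBroker(["vdi-vm-01", "vdi-vm-02"])
print(broker.connect("anil"))    # vdi-vm-01 (newly assigned)
print(broker.connect("anil"))    # vdi-vm-01 (same VM on reconnect)
print(broker.connect("guest"))   # vdi-vm-02
```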

Products and Solutions:


It is important to note that these solutions are not exclusive of each other. For example, you could choose to virtualize a desktop OS and then use application virtualization products to deploy and manage applications. Realistically, most organizations will find all of these options to be suitable for simplifying some aspect of overall desktop operations. This area is evolving rapidly (in terms of both real technology and hype), so be sure to thoroughly research options before deploying them. Overall, knowledge is power, so keep these options in mind the next time you spend 30 minutes repairing a mouse-related problem!

Desktop Virtualization: Goals and Options

This article was first published on

Quick: Name a task that’s less enjoyable than managing client operating systems and applications! I have a feeling that if you’re a seasoned IT pro, you had to think for a few seconds (and, I’ll bet that many of you came up with some very creative responses). Clearly, the challenges of keeping end-users’ systems up-to-date can be a thankless and never-ending ordeal. Vendors have heard your cries, and various solutions are available. At the risk of sounding like a flight attendant, I do understand that you have a choice of virtualization approaches. In this series of Tips, I’ll describe details related to the pros and cons of desktop application and OS virtualization. Let’s start by defining the problem.

Desktop Management Challenges

There are many reasons that desktop computers can be more painful to manage than their server-side counterparts. Some important issues include:

  • Analyzing Administration: Desktop and notebook computers are often located in the most remote areas of your organization (or outside of it altogether). Performing systems administration tasks can sometimes require physical access to these machines. And, even with remote management tools, the client-side machine has to be online and connected to the network. The result is significant time and effort requirements for keeping systems optimally configured.
  • Mitigating Mobile Mayhem: Traveling and remote users can be (quite literally) a moving target: It seems that as soon as you’ve deployed a system for their use, changes are required. While some users can’t avoid working offline, there’s a subset of the user population that might need to access their OS and applications from multiple locations. Managing multiple pieces of hardware or shared desktop machines can be time-consuming and tedious.
  • Dealing with Distributed Data: Security and regulatory compliance requirements necessitate the management of data. It’s far easier to secure information and prevent data loss or theft when everything’s stored in the data center. While stolen laptops can be costly, it’s far cheaper than dealing with stolen data.
  • Application Anarchy: Deploying and managing desktop applications can be a struggle. While deployment tools can simplify the process, issues like managing application compatibility problems can lead to a large number of support desk calls. Other issues include tracking license usage, ensuring that systems remain patched, and providing the right applications on the right computer at the right time.
  • Bumbling Backups: Ensuring that data remains protected on desktop and notebook computers can be problematic. Even with the use of backup agents, there’s room for error. And, getting users to consistently save their important files to network shares can seem futile.
  • Hardware Headaches: Managing desktop and notebook hardware can be time-consuming. Add in the costs of technology refreshes and verifying hardware system requirements, and the issue can quickly float to the top of an IT department’s list of costs.

From a technical standpoint, the issue is that applications are tightly tied to their operating systems. And the operating systems, in turn, are tightly tied to hardware. Solving these problems can help alleviate some of the pain.

Choosing a Virtualization Approach

There are several different approaches to addressing desktop-related challenges. One caveat is that the terminology can be inconsistent. I’ve taken a shot at categorizing the different approaches, but vendors’ descriptions do differ. Here’s a breakdown:

  • Presentation Virtualization: Some users require access to only one or a few applications (think about call center and point-of-sale users). The main idea behind presentation virtualization is that all applications are installed and executed on a specialized server that can then redirect video, keyboard, and mouse signals between a small client application or a thin client device. Since applications are installed centrally, deployment and management is less of an issue.
  • OS and Application Virtualization: For some portion of the user population, such as traveling employees or “power-users”, there’s a real need to run an operating system directly on individual computers. In these scenarios, it’s still desirable to simplify the deployment and management of applications. Application virtualization solutions provide a way to either compartmentalize or stream programs to the computers that need them. The process is quick, safe, and can happen with little IT involvement.
  • VM-Based Virtualization: Also known as Virtual Desktop Infrastructure (VDI), among other names, the idea here is to allow users to run their own desktop OS’s – except that they are physically stored in the data center. Typically, the operating system and applications are run within a dedicated virtual machine which is assigned to a particular user. Employees use either a thin client computer or a remote desktop connection to access their environments.

In addition to these options, there’s an implicit fourth choice: “None of the above.” As I described in my Tips, “VDI Benefits without VDI”, you can reduce problems to some extent by utilizing IT best practices. You can also use a combination of these approaches (for example, VM-based virtualization with application virtualization) to meet different needs in the same environment.

Looking for Solutions

In this Tip, I presented some of the problems that desktop virtualization attempts to address. It’s important to understand your pain points before you start looking for a remedy. Then, I described three high-level approaches for solving common problems. In the next part of this series, I’ll present information about the pros and cons of each approach, along with specific products to consider.

VDI Benefits without VDI: Desktop Management

This article was first published on

Quick: Think of the five systems administration tasks you most enjoy doing! If you’re like most techies, desktop management probably didn’t make the list. It’s probably right up there with washing the car or mowing the lawn (a whole different type of administration challenge). Caring for and feeding client-side computers can be a painful and never-ending process. Therefore, it’s no surprise that Virtual Desktop Infrastructure (VDI) technology is capturing the eyes and ears of IT staff.

But does VDI provide a unique solution? Or, can you get the same benefits through other practices and approaches? (If you’ve read the title of this Tip, there’s a good chance you can guess where I’m going with this.) Over the years, a variety of solutions for managing desktop and notebook computers have become commonplace. In this article, I’ll outline some problems and solutions. The goal is not to discredit VDI, but to look at options for achieving the same goals.

Deployment and Provisioning

  • Problem: Rolling out new desktop computers can be time-consuming and labor-intensive. Using VDI, provisioning is much faster since standard base images can be quickly deployed within the data center. Users can then access the images from any computer or thin client.
  • Alternative Solution(s): Automated operating system deployment tools are available from OS vendors and from third-parties. Some use an image-based approach in which organizations can create libraries of supported configurations and then deploy them to physical or virtual machines. When combined with network boot features, the process can be completely automated. Additionally, there are server-based options such as Microsoft SoftGrid for automatically installing applications as they are requested.
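As a conceptual illustration of the image-based approach, the following toy sketch “deploys” a golden image from a library by copying it per machine.  The file names and the bare copy step are stand-ins of my own; real deployment suites layer on sysprep-style generalization, driver injection, and network boot.

```python
# Toy sketch of image-based provisioning: pick a "golden image" from a
# library directory and stamp out a per-machine copy.
import os
import shutil
import tempfile

def provision(image_library, image_name, target_dir, machine_name):
    """Copy a golden image file to a per-machine file, simulating deployment."""
    src = os.path.join(image_library, image_name)
    dst = os.path.join(target_dir, machine_name + ".img")
    shutil.copyfile(src, dst)
    return dst

# Demo with throwaway temp directories standing in for the image library.
lib = tempfile.mkdtemp()
out = tempfile.mkdtemp()
with open(os.path.join(lib, "win7-base.img"), "w") as f:
    f.write("golden image contents")
path = provision(lib, "win7-base.img", out, "DESKTOP-042")
print(os.path.basename(path))   # DESKTOP-042.img
```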

Desktop Support and Remote Management

  • Problem: Managing and troubleshooting desktop systems can be costly and time-consuming in standard IT environments, as physical access to client machines is often required. With VDI implementations, all client operating systems, applications, and configuration settings are stored centrally within VMs within the data center. This reduces the need to visit client desktops or to have physical access to portable devices such as notebook computers.
  • Alternative Solution(s): While VDI can sometimes simplify support operations, IT departments still need to manage individual operating system images and application installations. Remote management tools can reduce the need for physical access to a computer for troubleshooting purposes. Some solutions use the same protocols (such as the Remote Desktop Protocol, RDP) that VDI or other approaches would use. Products and services also allow for troubleshooting computers over the Internet or behind remote office firewalls. That can help you support Mom, who might not be authorized to access a VM image in your corporate data center.

Resource Optimization / Hardware Consolidation

  • Problem: Desktop hardware is often under-utilized and hardware maintenance can be a significant cost and management burden. By combining many desktop computers on server hardware, VDI can be used to increase overall system resource utilization. Additionally, client computers have minimal system requirements, making them more cost effective to maintain over time.
  • Alternative Solution(s): VDI takes the “server consolidation” approach and applies it to desktop computers. Standard client computers are minimally utilized, from a resource standpoint. Desktop hardware, however, tends to be far cheaper than data center equipment. And, with VDI, client-side devices are still required, although they are “thin”. When data center costs related to power, cooling, storage, and redundancy are factored in, it can be hard to beat the cost of a mid-range desktop computer. Through the use of application virtualization and solutions such as Citrix and Microsoft Terminal Services, organizations can increase the effective lifecycle of desktop hardware. Windows Server 2008’s version of Terminal Services provides the ability to run single applications (rather than entire desktops) in a virtualized environment, thereby providing the benefits of centralized application management with scalability. There are potential compatibility issues, but they may be offset by the ability to support many more users per server.
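To see why the desktop can be hard to beat on price, here is a trivial per-seat comparison.  Every number below is a made-up placeholder, so substitute quotes from your own environment before drawing any conclusions.

```python
# Rough per-seat cost comparison, VDI vs. standard desktops.
# All figures are hypothetical placeholders, not real pricing.

def vdi_cost_per_seat(server_cost, storage_cost, thin_client,
                      power_cooling, seats_per_server):
    """Amortize shared data center costs across seats, plus the thin client."""
    shared = server_cost + storage_cost + power_cooling
    return shared / seats_per_server + thin_client

def desktop_cost_per_seat(desktop_price):
    """A standard desktop is simply its purchase price per seat."""
    return desktop_price

print(vdi_cost_per_seat(12000, 8000, 300, 4000, 40))  # hypothetical VDI seat
print(desktop_cost_per_seat(600))                     # hypothetical desktop seat
```

Even this simplistic model shows how quickly shared infrastructure costs can outweigh a commodity desktop; a real analysis would also factor in management labor, licensing, and hardware refresh cycles on both sides.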

Supporting Mobile Users and Outsourcing

  • Problem: Maintaining security for remote sites, traveling users, and non-company staff can be a significant challenge when allowing the use of standard desktop or notebook computers. VDI helps minimize data-related risks by physically storing information within the data center. Even if client devices are lost or stolen, information should remain secure and protected.
  • Alternative Solution(s): For some types of remote users, it might make sense to provide isolated desktop environments via VDI. However, these users would require network access to the VMs themselves. Multi-factor authentication (using, for example, biometric devices) and encrypted connections (such as VPNs) can help protect network access from standard desktop computers. Network Access Control (NAC) is a technology that can help prevent insecure machines from connecting to the network. And, carefully managed security permissions can prevent unauthorized access to resources. All of these best practices apply equally whether or not VDI is being used. Finally, there’s no substitute for implementing and following rigid security policies, regardless of the technical approach that is used.

Managing Performance

  • Problem: Desktop operating systems and applications can never seem to have enough resources to perform adequately, leading to shorter upgrade cycles. Using VDI to place desktop VMs on the server, systems administrators can monitor and allocate system resources based on the resource needs of client computers.
  • Alternative Solution(s): In theory, VDI implementations can take advantage of highly-scalable server-side hardware, and it’s usually easier to reconfigure CPU, memory, disk and networking settings for a VM than it is to perform a hardware upgrade on a desktop computer. The drawback with the VDI approach is that applications or services that consume too many resources could potentially hurt the performance of other systems on that same server. Load-balancing and portability can help alleviate this, but administrators can also use other techniques such as server-based computing to centrally host specific resource-intensive applications.

Workload Portability

  • Problem: Operating systems and applications are tied to the desktop hardware on which they’re running. This makes it difficult to move configurations during upgrades, reorganizations, or job reassignments. With VDI, the process of moving or copying a workload is simple since the entire system configuration is encapsulated in a hardware-independent virtual machine.
  • Alternative Solution(s): When entire desktop configurations need to be moved or copied, the VDI approach makes the process easy since it’s based on virtual machines. When using standard desktop computers, however, the same imaging and conversion tools can be used to move an OS along with its applications to another computer. As these hardware-independent images can be deployed to both physical and virtual machines, this also provides IT departments with a seamless way to use VDI and standard desktop computers in the same environment.


Ask not whether VDI is a solution to your desktop management problems, but rather whether it is the best solution to these challenges. VDI offers benefits related to quick deployments, workload portability, centralized management, and support for remote access. Few of these benefits are unique to VDI, though, so keep in mind the alternatives.

VDI Benefits without VDI: Managing Security

This article was first published on

What do leaky faucets, fragmented file systems and failed hard disks all have in common? We want to fix them! As IT professionals, most of us pride ourselves on our problem-solving abilities. As soon as we hear about an issue, we want to find the solution. Every once in a while a technology offers new solutions to problems you may not have recognized. VDI raises and addresses some important issues related to IT management. But, is VDI the only solution to those problems?

Whether or not you agree that VDI technology will make inroads into replacing traditional desktop computers, all of the recent press on the technology helps highlight the typical pain that’s being seen in IT departments. From security to supportability to regulatory compliance, there’s clearly a need for improvements in IT management. For many environments, however, it’s possible to find solutions by using other approaches and practices.

For the record, I certainly don’t oppose the use of virtualization for desktop environments, and I think it will most likely find a useful role in many environments. However, in order to justify the costs and technology investments, it’s worth understanding other options. The point of this article is that VDI is not required in order to solve many IT-related security problems. Let’s look at some problems and alternatives.

Securing Desktop Data

  • Problem: Data stored on corporate desktop and notebook computers is vulnerable to theft or unauthorized access. By using VDI to physically store all of this data on virtual machine images in the data center, chances of data compromise are reduced. The reason is that sensitive data is never actually stored on a desktop or portable computer. If the system is lost or stolen, organizations don’t have to worry about losing information since it is not stored on the local hard disk.
  • Alternative Solution(s): Securing data is a common challenge in all IT environments, and many solutions are available. Sensitive information, in general, should be stored in protected network locations. File servers should adhere to security standards to prevent unauthorized access or data loss. In this scenario, the most important data is already secured within the data center. For protecting local copies of information, there are several hardware- and software-based solutions that can be used to encrypt the contents of desktop and notebook hard disks. An example is Windows Vista’s BitLocker feature. Even with VDI, you would still need to protect local copies of VMs for traveling users.
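To make the alternative concrete, here’s a minimal sketch of the kind of audit script an IT department could run to flag sensitive-looking files that ended up on a local drive instead of a protected file server. The filename patterns are hypothetical placeholders, not a real policy:

```python
import os
import fnmatch

# Hypothetical patterns an IT department might classify as sensitive;
# a real policy would define its own list.
SENSITIVE_PATTERNS = ["*payroll*.xls*", "*customer*.csv", "*.pfx"]

def find_sensitive_files(root):
    """Walk a local drive and flag files that look like sensitive data,
    which policy says should live on a protected file server instead."""
    flagged = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if any(fnmatch.fnmatch(name.lower(), p) for p in SENSITIVE_PATTERNS):
                flagged.append(os.path.join(dirpath, name))
    return flagged
```

A script like this doesn’t replace disk encryption such as BitLocker, but it addresses the same root concern: knowing where sensitive data actually lives.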

Data Protection

  • Problem: Backing up and restoring important data on client machines takes significant time and effort. When using VDI, all of the contents of the desktop and notebook computers are actually stored in the data center (usually on dedicated storage arrays or network-based storage devices). Since all of the data is stored centrally, systems administrators can easily make backups of entire computer configurations (including the operating system, installed applications, data, and configuration settings). They no longer have to rely on network-based backup agents that require the computer to be powered on and accessible in order for the data to be copied.
  • Alternative Solution(s): Hardware failures or accidental data modifications on client-side computers are potential problems, but there are many backup-related solutions. I already mentioned the importance of storing critical files on data center servers. By using automated restore tools, users can quickly be restored to service, even after a complete hardware failure. While VDI might seem to help in this area, when backing up entire VMs and virtual hard disks, you’re actually protecting a lot of unnecessary information. For example, each virtual hard disk that is backed up will include the entire operating system and all of the installed program files. These types of files could be much more easily restored using installation media or by reverting to an image-based backup. Users should understand the importance of storing information in network environments. File synchronization (such as the Windows Offline Files feature) can be used to automatically support traveling users.
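The file-synchronization idea above (what Windows Offline Files does for traveling users) can be sketched in a few lines. This is a simplified one-way sync under the assumption that the server share is mounted as a local path; the function name is my own, not a real API:

```python
import os
import shutil

def sync_to_server(local_dir, server_dir):
    """One-way sync: copy any file that is new or newer on the client
    up to its counterpart under the (mounted) server share."""
    copied = []
    for dirpath, _dirnames, filenames in os.walk(local_dir):
        rel = os.path.relpath(dirpath, local_dir)
        target_dir = os.path.join(server_dir, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(target_dir, name)
            # copy2 preserves timestamps, so unchanged files are skipped
            # on the next pass.
            if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
                shutil.copy2(src, dst)
                copied.append(dst)
    return copied
```

Notice that only user data crosses the wire. Compare that with backing up an entire virtual hard disk, where the OS and program files come along for the ride.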

Managing System Updates

  • Problem: Systems administrators spend a lot of time keeping systems up-to-date with security updates and related patches. Part of the challenge is in dealing with remote machines that must be connected to the network and be properly configured in order to be maintained. With VDI, guest OS images are located in the data center and can be accessed by systems administrators whether or not the VM is being used.
  • Alternative Solution(s): The VDI approach still requires each user to have access to a single operating system. The OS itself must be secured, patched, and periodically maintained with other types of updates. Most vendors have tools for automatically deploying updates to large numbers of computers. These same methods can be used with or without VDI. In addition, features such as Network Access Control (NAC) can help ensure that only secure computers are able to access the network.


VDI approaches can help increase security in many different situations. But VDI is not the only option for meeting these needs. IT automation tools and practices can help address data protection, client-side data security, and keeping networked systems free of malware and other infections. When deciding how and when to deploy VDI, keep in mind the alternative approaches.