
Getting to grips with VDI

Virtual Desktop Infrastructure (VDI) is being promoted as the solution to all that ails the desktop. The truth is somewhat different.

It can reduce costs and it will certainly increase management control. But to get a working VDI implementation, a lot needs to be done in advance; auditing and careful planning are the keys to success. As for ROI, tell your customers they can plan on recouping the initial investment, but not on making big savings during the first few months.

Start at the beginning: what is VDI? The answer is remote desktops. If you think we’ve been here before, you are right. Terminal Services and virtual desktops have been around since the late 1990s and both have had varying degrees of success. What VDI aims to do is bring together both these technologies, providing a flexible desktop for the user with the separation from the underlying hardware that virtualisation provides.

If you look at the penetration of Terminal Services and virtualisation, both have tended to be adopted more by large businesses than small ones, due to a lack of understanding of the benefits and a failure to identify clearly what the solutions could do. VDI is an opportunity to redress that problem.

All businesses, small and large, suffer from the problem of desktop management. Applying patches, upgrading software and adding new hardware as applications demand more memory or disk create a constant drain on resources. On top of this is the ever-present risk of data loss, because much data is not held on core servers but instead sits out on the desktop.

One of the major advantages of VDI is the ability to improve security, especially protection from certain types of malware and virus attacks. The last year has seen a significant rise in the number of attacks that seek to infect a computer at the moment it is booted. This allows a malware attacker to take control before the anti-virus software loads.

The only way to get rid of this type of infection is to remove the computer from the network, format the hard disk, completely reinstall the operating system and applications and then restore the data: a costly job.

Vendors have tended to highlight the potential power and cooling savings. For many small businesses this might look attractive, but the reality is that long before you start to see any savings, you will need to have made investments in replacing the client devices, upgrading the servers and in deploying a VDI solution. Such expenditure can mean that the power and cooling savings take years to be realised. When positioning VDI, therefore, concentrate on management, data protection and security. Do not get drawn into the power and cooling argument when talking about cost savings.

 

The basics of VDI
Client Device: the device used to access the virtual desktop. It can be a PC, thin client, laptop or any device capable of making a connection to the network.

Connection Broker: the software that takes the user request and determines what will happen next. With VDI this means comparing user logon credentials to provision a virtual desktop and applications.

Application Virtualisation: where each application is virtualised and provided on demand to users. You patch and maintain a master image of the software and all users work off a copy of it, ensuring a single patch is applied to all users.

Virtual Desktop: the virtual machine that the user accesses. It can be customised by the user or completely locked down by the IT department.

Blade PC: a specialised version of VDI. Each user is allocated a blade PC in a rack and whenever they connect they are directed to that blade PC. It is used on City trading desks to give traders very high performance and to give the data centre and support staff flexibility.

Hypervisor: the very small software layer that sits on the server and manages resource allocation to the virtual machines.

 

Who’s who in VDI
VMware has the most complete offering of any vendor and is widely credited with creating the term VDI. From hypervisor to client, management infrastructure to application virtualisation, VMware has a complete solutions stack and is the only vendor with a mobile offering.

Citrix is the latest entrant into the VDI space. Years of experience in thin clients and the purchase of XenSource give Citrix a complete stack to rival VMware.

Despite being well established in the thin client space and having virtual desktop and server tools, Microsoft has only parts of a VDI solution and currently relies on Citrix and Quest Software to provide the broker and other services.

Symantec purchased Altiris two years ago, becoming a major player in application virtualisation, and is currently building a VDI solution; it is not known whether it will include a hypervisor.

Quest Software is the leading third-party broker for VDI and has tools for tasks including software preparation and virtualisation. Its products are heavily used by Microsoft and Parallels.

Best known for its virtual PC software for the Mac, Parallels is now working to extend into VDI, especially in the education market, using Quest for the broker and other services.

Only two vendors can offer a full software stack at the moment: VMware and Citrix. All the others require you to bring together pieces from multiple vendors, which will increase the challenges and complexity of the final solution.

 

Preparing for VDI
As with all things in the IT sector, what looks simple is, in reality, far from it.

The starting point for VDI is not the desktop but the data centre. Terminal Services performance has improved to around 25 users per server core; with VDI, do not expect more than 10 users per core, which means a quad-core, quad-processor server will support a theoretical 160 users. Memory is a bigger issue, however. Expect an average of 1GB per user, depending on applications, which means you need 160GB of RAM to support those 160 users.
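A minimal sizing sketch, in Python, using the rule-of-thumb figures above (10 users per core, roughly 1GB of RAM per user). The function name and defaults are illustrative only, not a vendor sizing tool.

# Rough VDI host sizing using the rules of thumb from the text above.
def vdi_server_capacity(processors, cores_per_processor,
                        users_per_core=10, ram_per_user_gb=1):
    """Return the theoretical user count and RAM needed for one host."""
    total_cores = processors * cores_per_processor
    max_users = total_cores * users_per_core
    ram_needed_gb = max_users * ram_per_user_gb
    return max_users, ram_needed_gb

# The quad-core, quad-processor server described in the text:
users, ram_gb = vdi_server_capacity(processors=4, cores_per_processor=4)
print(f"Theoretical users: {users}, RAM required: {ram_gb}GB")
# Theoretical users: 160, RAM required: 160GB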

Running VDI on a single server is also strongly discouraged. Using two servers allows for load balancing, a key lesson from Terminal Services. If you can add more servers, even lower-specified ones, performance will improve.

You also need to think about storage. Pulling all the user desktops back into the data centre right from the start is not going to happen. Do the basic maths: the average user today probably has a hard disk of around 100GB. If we assume each user has filled only 60GB of space and the business has 100 users, that amounts to around 6TB of disk space needed to bring all of that data back to the data centre. A lot of large businesses would baulk at finding 6TB of disk space, so it is highly likely that small businesses will have a real problem.

Most of that space will consist of operating system, applications and data, with no easy way to separate them. A lot of the data will be duplicated, such as multiple copies of PowerPoint presentations or reports that have been saved locally. If you have already begun to work with collaboration tools such as SharePoint, you will have seen how many copies of documents exist on user computers when users try to bulk upload to document libraries. Because you don't know how much unique data each user has, you need to bring back the desktops slowly, deduplicating the content and, if at all possible, ignoring the operating system and application directories.
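A rough sketch of that storage estimate and a simple content-hash pass to spot duplicate files before migration. The 60GB-per-user figure comes from the example above; the staging path and the hashing approach are illustrative, not a specific dedup product.

import hashlib
from collections import defaultdict
from pathlib import Path

# Back-of-the-envelope estimate from the text: 100 users x 60GB each.
USERS = 100
AVG_USED_GB = 60
print(f"Raw estimate: {USERS * AVG_USED_GB / 1000:.1f}TB before deduplication")

def find_duplicates(root):
    """Group files under 'root' by content hash; any group larger than one is duplicated data."""
    by_hash = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            # Reads whole files; fine for a sketch, chunk the hashing for very large files.
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

# Hypothetical staging area for one copied-off user profile, skipping OS and application folders.
duplicates = find_duplicates("/migration/staging/user01/Documents")
reclaimable_bytes = sum(paths[0].stat().st_size * (len(paths) - 1)
                        for paths in duplicates.values())
print(f"Space reclaimable from duplicates: {reclaimable_bytes / 1e9:.2f}GB")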

As you begin to size the servers and storage, do not forget to address the backup needs and ensure that the support system is capable of coping with this increase in machines and data.

Desktop issues

Many of the issues that will prevent a successful VDI installation start at the desktop. You need to understand what is in use, where it is in use and what challenges it poses.

Audit the user environment.
This is something that should already be in place for licence management and good management practice.

Identify core and critical software.
By knowing what software every user has access to, you can reduce the complexity of your VDI implementation. Core software and critical software are two very different things. Core software such as office applications can be replaced with some business interruption, but its loss will not bring the business to a halt. A user who relies on AutoCAD for the design and development of the products the business sells has a critical piece of software.
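A small sketch of how the audit output might be tagged once it has been collected, assuming the inventory is available as a simple per-user mapping. The category lists, user names and package names are hypothetical examples, not a definitive classification.

# Tag an audited software inventory as core, critical or other.
CORE = {"Microsoft Office", "Adobe Reader", "7-Zip"}   # replaceable with some interruption
CRITICAL = {"AutoCAD", "Sage Accounts"}                # stops the business if unavailable

def classify(inventory):
    """Map each user's packages to core / critical / other."""
    report = {}
    for user, packages in inventory.items():
        report[user] = {
            pkg: ("critical" if pkg in CRITICAL else
                  "core" if pkg in CORE else
                  "other")
            for pkg in packages
        }
    return report

audit = {
    "design01": ["AutoCAD", "Microsoft Office"],
    "sales02": ["Microsoft Office", "Adobe Reader"],
}
for user, packages in classify(audit).items():
    print(user, packages)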

Identify software requirements.
(See the ‘Limitations of VDI’ boxout.) Once you start running applications inside virtual machines, you are making a compromise on drivers. The biggest problem is where software packages depend on very high-performance graphics cards. This should not affect the majority of users, but you need to identify clearly the subset who are affected, as you may need to develop a mixed VDI and local computing approach.

Identify locally attached hardware.
Not all hardware will work well over VDI. Some scanners, PDAs, smartphones and other peripherals that use synchronisation services or two-way audio may not work well in a VDI environment. Each of these needs to be checked in your trial environment before deployment. Some problems may be solved by a new driver or by replacing hardware; others may present a more serious obstacle to VDI deployment.

 

Planning your VMs
The ultimate goal is to have as few master VMs as possible. This allows you to deploy a patch to a master VM knowing that the next time it is deployed to a user, it will be up to date. Instead of touching hundreds of computers, you touch just one. More importantly, because you can test the patch against the master VM, you avoid having to deal with user problems caused when a patch unexpectedly affects other applications.

The core part of a master VM is the operating system. Here you install all the key components that users require – and only those components. This reduces the security footprint of the OS. For example, under Windows Vista, not every user will require the local Internet Information Services, Telnet client or Tablet PC options. Now you can have a VM with those turned off for the majority of users and another VM where they are on for a smaller, more select group of users.

If the applications are required by all users, then put them in the core VM. If not, make them virtual packages that can be applied on demand through application virtualisation.

This gives you a standard master VM that every user gets, which can be patched on demand, while each software package is managed separately. The only downside to including software inside the master VM arises when users want to run different versions of core software packages; those other versions then have to be deployed through application virtualisation.
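A minimal sketch of that decision rule: applications needed by every user go into the master image, everything else becomes an application-virtualisation package. The user counts and application names are illustrative assumptions.

# Decide what belongs in the master VM versus a virtual package.
TOTAL_USERS = 100

app_usage = {                 # application -> number of users who need it
    "Microsoft Office": 100,
    "Adobe Reader": 100,
    "AutoCAD": 8,
    "Sage Accounts": 12,
}

master_image = [app for app, users in app_usage.items() if users == TOTAL_USERS]
virtual_packages = [app for app, users in app_usage.items() if users < TOTAL_USERS]

print("Install in the master VM:", master_image)
print("Deliver via application virtualisation:", virtual_packages)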

 

Create a pilot
Whichever vendor you choose for this, you need to create a pilot. This must include a selection of user machines, servers and access to storage.

Start by installing the basic OS and applications into a VM. Do not do any patching at all. Deploy the VM and prove that you can connect to it. Then apply the patches to the VM and see how the changes are reflected on the user machines. Include a PC with very limited hardware, such as a six-year-old laptop with just 512MB of RAM and a 30GB hard disk: as nothing is running on the local machine, it should have no problems running Windows Vista and Office Ultimate.

You will immediately see two things. The first is the ability to patch once and deploy to all users. The second is an end to the constant upgrading of local machines to support new operating systems and applications. Looking forward to Windows 7, this is an important consideration.

 

The real VDI savings
Any deployment will create an initial expenditure in the data centre both for hardware and software. This will be offset over time by savings in help desk and software support. It will also be offset when newer operating systems and applications are deployed as there will be no need to upgrade local hardware.

Over the normal lifecycle of hardware, VDI also offers a potential reduction in costs by moving away from PCs towards thin clients. This cost reduction will also bring with it lower power and cooling costs but this is not and should not be the key decision factor when planning VDI.

What customers will need with VDI is support in virtualising the data centre, building the VMs and ongoing management. Virtualisation also opens up the secondary support market for off-line and off-site backups.
 


 