Simplicity is the most important consideration when designing a backup system. Look for software that runs across all the operating systems you are backing up and has the right agents and plug-ins for the applications your customer uses.
Managing multiple operating systems can be a challenge, as every OS has its own built-in backup utilities. If you want to write your own scripts or backup utilities, use the built-in commands. What these commands don't always handle well is open files that are locked by the operating system or by applications, although on Vista and Windows Server you can take advantage of the Volume Shadow Copy Service.
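As a minimal sketch of that problem, the copy loop below collects files it cannot read (locked or permission-denied) instead of aborting the whole job, so a second pass, via a Volume Shadow Copy snapshot on Windows, can pick them up. The function name and structure are illustrative, not from any particular backup product:

```python
import shutil
from pathlib import Path

def backup_tree(src: Path, dst: Path) -> list:
    """Copy src into dst, collecting files that were locked or unreadable."""
    skipped = []
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        target.parent.mkdir(parents=True, exist_ok=True)
        try:
            shutil.copy2(f, target)  # copy data and timestamps
        except (PermissionError, OSError):
            # Locked by the OS or an application: note it for a retry
            # via a shadow-copy snapshot rather than failing the job.
            skipped.append(f)
    return skipped
```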
It’s critical to define a backup process that adapts as technologies and business needs change. Without a viable process, you cannot possibly be sure that backups are accurate or fit for purpose. The most important part of any backup process is the restore. Any backup solution should include partial or sample restores on a regular basis to check that backups are effective. Remember, if you cannot restore then you have no backup.
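A sample-restore check can be as simple as picking a few files at random and comparing checksums between the live copy and the backup. This is a sketch of the idea, assuming a file-level backup laid out as a mirror of the source tree:

```python
import hashlib
import random
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_sample_restore(source: Path, backup: Path, sample_size: int = 5) -> bool:
    """Pick a few files at random and confirm the backup copy matches the live one."""
    files = [f for f in source.rglob("*") if f.is_file()]
    for f in random.sample(files, min(sample_size, len(files))):
        restored = backup / f.relative_to(source)
        if not restored.exists() or sha256(restored) != sha256(f):
            return False  # a failed sample restore means you have no backup
    return True
```

Run something like this on a schedule and alert on any `False`; a check that never runs is as bad as no check at all.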
Another key element is defining the backup strategy and matching it to the environment. Each type of end-user – home, mobile, office-based, or in a branch office – will need to be catered for. Different applications impose different needs. Unstructured data already stored on the network is no less important than databases, but you may back it up on a different timescale.
Whenever it goes wrong, blaming the technology is the easy way out. After all, it’s inanimate, voiceless and every user knows deep inside that computers are flaky. The reality, sadly, is that people and process are the more common failure points and if the process is so dependent on one particular technology then it’s a poor process indeed.
Tape is the longest-serving backup technology but the problem has always been the reliability of the physical media. Heat, electromagnetic fields, humidity – these have all taken their toll on tape libraries, so long-term storage is best done at tape vaulting sites where these conditions are controlled.
Capacity is another problem for tape. While disk capacities have increased massively, tape has been slow to catch up. That is because of the need for stability in the media creation process and the cost of reengineering tape manufacturing plants. There are also standards to be maintained.
As newer versions of tape appear and existing tapes reach the end of their life, you have to migrate onto a new generation of tape. Sometimes this is irrelevant; by the time a tape needs to be migrated, a risk assessment may show the data on it is no longer worth keeping. The challenge of migration is no different from the move from paper records to microfiche, or from microfiche to computer database; it’s an exercise in time and money. Rightly or wrongly, this is a commercial decision.
Changing tape standards can be a complete nightmare. Moving between versions of the same standard is not too bad, but moving from one standard to a completely different one can create massive problems. You’ll need to hang on to old tape drives and ensure they can still be used in case you need to get the data back.
Another big problem with tape is underestimating the amount of data to back up and how long it will take. Tape drives are often criticized for being slow, but in practice the bottleneck is usually network access or the tape management software.
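A quick way to sanity-check a backup window is to size the job against the slowest link in the chain rather than the drive's rated speed. The figures and the flat software-overhead factor below are illustrative assumptions, not vendor numbers:

```python
def backup_window_hours(data_gb: float,
                        network_mb_s: float,
                        drive_mb_s: float,
                        software_overhead: float = 1.2) -> float:
    """Estimate elapsed backup time; the slowest link sets the pace."""
    effective_mb_s = min(network_mb_s, drive_mb_s)
    seconds = (data_gb * 1024 / effective_mb_s) * software_overhead
    return seconds / 3600
```

For example, 1TB over a link that sustains 40MB/s takes over seven hours regardless of how fast the tape drive is, which is why the drive so often gets the blame unfairly.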
D2D2T (disk-to-disk-to-tape) is a newer approach that takes advantage of the ever lower cost of large-capacity drives. It works by backing up data first to hard drives rather than straight to tape. This is quite different from tiered storage: it is a backup to disk, not data moved solely on the basis of when it was last accessed. Eventually, the data is moved off to physical tape for archiving.
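That flow can be sketched as a staging step followed by an age-based migration to the tape tier. The `.img` naming and 14-day retention window here are illustrative assumptions, and the tape tier is represented as just another directory:

```python
import shutil
import time
from pathlib import Path

def stage_then_archive(staging: Path, tape_out: Path, keep_days: int = 14) -> list:
    """Move backup images older than keep_days from fast disk to the tape tier."""
    cutoff = time.time() - keep_days * 86400
    moved = []
    for image in staging.glob("*.img"):
        if image.stat().st_mtime < cutoff:
            # Recent images stay on disk for fast restores;
            # older ones migrate to tape for archiving.
            shutil.move(str(image), str(tape_out / image.name))
            moved.append(image.name)
    return moved
```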
Using D2D2T has a number of important advantages, such as speed and data management. As the data is moved from disk to disk, the only bottleneck is the network, and there is no reason why this has to happen on the same site: you can back up between offices or to your own managed location.
A Virtual Tape Library (VTL) is a block of storage that appears to the system as a set of physical tapes. Some come with just disks, others with built-in tape jukeboxes. The idea is that data is backed up to disk, deduplicated and then stored as tape images. These images can be restored quickly or moved to physical tape. Compared with traditional tape, deduplication can save in excess of 80% of storage capacity.
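The savings come from storing each distinct block of data only once, however many backup images contain it. This is a toy sketch of content-hash deduplication, not how any particular VTL implements it; the 4KB chunk size is an arbitrary choice:

```python
import hashlib

def dedup(streams: list, chunk_size: int = 4096):
    """Store each distinct chunk once; repeated chunks across backups share storage."""
    store = {}   # chunk hash -> chunk bytes
    raw = 0
    for data in streams:
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            raw += len(chunk)
            store[hashlib.sha256(chunk).hexdigest()] = chunk
    stored = sum(len(c) for c in store.values())
    return stored, 1 - stored / raw  # (bytes actually kept, fraction saved)
```

Because successive full backups of the same systems are mostly identical, the saved fraction climbs quickly as more backup generations land in the library.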
Optical discs will work for some small businesses. Magneto-Optical (MO) systems, which write data to high-capacity optical platters, have long been favoured by those concerned with archival and legal issues. This is because, unlike other technologies, MO started out as a Write Once Read Many (WORM) technology: anything written to the media could not be deleted or changed. That distinction has blurred with the introduction of rewritable (RW) media and WORM for tape, but MO is still heavily used in enterprises, and you can offer an MO archival service to small businesses that need it for compliance.
Backing up to CD or DVD, directly on the hardware or to a networked jukebox, has severe limitations. Capacity is limited, even with dual-layer DVD, and there are real risks because the media is so easily lost; just look at the debacle at Her Majesty’s Revenue and Customs. Like tape, there are issues with lifespan and with moving media between devices. Don’t assume that all devices support the same format: they don’t, which means you might have a backup you can only restore by changing the optical drive in the machine. However, dual-layer DVD drives and discs do make this a possible option for users on the road who need to back up away from the office.
Flash memory has soared in capacity over the last few years – 8GB, 16GB and 32GB drives are available – but that does not make it a sensible backup medium, even for individual machines. Biometric features push up the price and, although there are ways to secure flash drives, including U3, the risk of data loss on these devices is still high. The NAND cells inside flash drives also endure only a limited number of write cycles, so the more a drive is used, the more likely it is to fail; its lifespan is shorter than that of tape.
Solid State Drives are not yet affordable for most businesses and you don’t need their performance for backup in most situations.
External drives are a different matter. These are a very good backup media but need to be fully encrypted and users need to be educated about taking care of them. While the price of a 500GB portable drive has fallen below £100, the value of the data is likely to be many times more. Always check the business insurance policy before recommending them for backup.
With the advent of broadband, online backup is becoming increasingly useful as offsite storage for the office and especially for the mobile user. Overnight, during dinner, or even over breakfast, users can go online as usual and leave their computers to back up. Should they lose their laptop or have it stolen, they can restore data and even applications and be working again almost immediately.
Storage as a Service is the next level up from online backup. Instead of just outsourcing backup, you outsource the entire storage architecture and move the data to a hosting facility, keeping a local cache in the office, so it’s vital to allow for enough bandwidth. There is a risk here. If you are going to provide this kind of service to your customers, you can either resell an existing service with ironclad service level agreements, or build a data centre solution with multiple carriers, multiple racks and continuous data protection. That way, any failure at your end is not a failure for your customers. Many high-end online backup services, such as Iron Mountain and EVault, offer Storage as a Service as well.