IT's dirty little secret

By Heidi Biggar

When it comes to backing up and recovering data, system administrators have some big issues. Most don't know whether they are backing up everything that they should be. Surprisingly few have tested their restore capabilities. Many struggle with managing mixed backup environments. And still others can't get their backups done within the allotted window.

For years, system administrators have admittedly had their heads in the sand when it comes to these and other backup-and-recovery issues. Why? For the simple reason that they have had no easy, or cost-effective, way of addressing them.

While backup-and-recovery products do a good job of backing up specified data, they have generally lacked intuitive reporting capabilities, have been difficult to manage (particularly in large multi-platform environments), and have done little in the way of helping system administrators balance backup jobs or plan for future capacity requirements.

Software vendors are aware of these issues and are developing a variety of new reporting capabilities, as well as improved management tools and backup techniques.

Identifying the problems

With the help of industry consultants, InfoStor identified four key backup-and-recovery issues (see figure on p. 18). We then posted the four choices on our Website and asked end users to tell us which issue caused them the most problems.

Not surprisingly, 36% of the respondents said that not knowing whether they were adequately backing up their data was the biggest problem. About 25% each cited the difficulty of managing multiple backup platforms and doubts about their ability to restore data, while 15% checked off backup window issues.

We then took these results to a cross-section of backup/recovery software vendors and asked them to comment on each issue and to share product developments in these areas.

Is my data safe?

This really boils down to two questions: Am I backing up all the data I need to? And am I sure? Unfortunately, the answers are often buried in lengthy backup logs. While backup applications typically include reporting tools to help sort through pages of information, by almost all accounts, even vendors', these tools are inadequate.
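To see why the answers get buried, consider what digging through raw logs actually involves. The sketch below parses a made-up log format (real products such as NetWorker and NetBackup each use their own layouts) and tallies job outcomes; the host names, fields, and statuses are all illustrative assumptions, not any vendor's actual format.

```python
import re
from collections import Counter

# Hypothetical log format for illustration only -- real backup
# products each emit their own, typically far more verbose, layouts.
SAMPLE_LOG = """\
2002-07-30 01:02 client=web01 job=full status=SUCCESS bytes=12884901888
2002-07-30 01:14 client=db01 job=incr status=FAILED error=open_file
2002-07-30 01:20 client=mail01 job=incr status=SUCCESS bytes=524288000
2002-07-30 01:31 client=web02 job=full status=MISSED
"""

LINE = re.compile(r"client=(\S+) job=(\S+) status=(\S+)")

def summarize(log_text):
    """Tally job outcomes and list clients that need attention."""
    statuses = Counter()
    problems = []
    for line in log_text.splitlines():
        m = LINE.search(line)
        if not m:
            continue
        client, job, status = m.groups()
        statuses[status] += 1
        if status != "SUCCESS":
            problems.append((client, job, status))
    return statuses, problems

statuses, problems = summarize(SAMPLE_LOG)
print(dict(statuses))   # {'SUCCESS': 2, 'FAILED': 1, 'MISSED': 1}
print(problems)         # [('db01', 'incr', 'FAILED'), ('web02', 'full', 'MISSED')]
```

Even this toy version makes the point: the raw information is there, but someone has to aggregate it before a question like "did everything succeed last night?" has a quick answer.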

"It's one of the things administrators bring us in for," says Stephen Foskett, a senior consultant at GlassHouse Technologies, a consulting and services firm in Framingham, MA. "It's the dirty little secret that people don't want to talk about. They know that something isn't being backed up, but they just let it slide."

Why? "Because [until recently] there haven't been any solutions, so why bother talking about it?" explains Curtis Preston, principal consultant at The Storage Group, a consulting and integration firm in Oceanside, CA. "Either they don't have the tools they need to determine if they are backing up everything that needs to be backed up, or they can't extract the information they need with the tools they do have," he adds.

To get this level of granularity, Preston suggests using tools such as Bocada's BackupReport. The software is designed to filter report data and to help administrators load-balance backups to avoid potential backup spikes, make sure they have the capacity for future backups, and determine how much data they are actually backing up. "These are very common management questions, which are difficult to get answers to with most backup products," says Preston.

"My backup applications are heavily into metrics, but they don't generate any reports," says Chris Jacobs, the service manager for backup at a large Internet technology firm. The firm backs up about 200TB of data monthly for approximately 1,400 clients. About 80% of these jobs are done using Legato's NetWorker and the rest with Veritas' BackupExec.

"With [Bocada's] BackupReport, I can now see in minutes what could have taken me a week with NetWorker," says Jacobs, "and I couldn't do it at all with BackupExec." In addition to management improvements, Jacobs says she has also seen an improvement in capacity usage and backup performance. She expects the installation, which cost about $200,000, to quickly pay for itself.

While GlassHouse's Foskett says he would recommend products such as BackupReport to his clients, he believes they are generally unnecessary if administrators regularly revisit policies and go through schedules and "include" lists. But the reality is that administrators don't have the time or the resources to do this, he says.

"In most cases, what administrators have will do the job," says Foskett, "but the fact is that administrators aren't [using the tools they have] because the tools are tedious and they've got other fires to fight."


Fujitsu Softek's Windowless Backup and Recovery solution backs up a point-in-time replica (shown here as Z) of source volume E, using leading open-systems backup products (e.g., Legato NetWorker and Veritas NetBackup). The replica can be made in two ways: 1) using the snapshot capability in Softek Virtualization (via Softek storage domain server [SDS]) or 2) with Softek TDMF replication software. Which method is used depends on the environment: SAN, NAS, or direct-attached. The replica persists only long enough for the backup to complete, and only the changes made to the source volume during the backup are actually copied from E to Z. This minimizes bandwidth requirements.
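The general principle behind this kind of replica, copying aside only the blocks that change while the backup runs, is copy-on-write. The toy model below illustrates the idea in the abstract: backup reads come from a frozen point-in-time view while the application keeps writing to the live volume. It is a conceptual sketch only, and makes no claim about how Softek or any other vendor actually implements it.

```python
# Toy copy-on-write snapshot: the backup reads a frozen view (Z) of the
# source volume (E) while the application keeps writing to E. Only
# blocks overwritten during the backup are copied aside, which is why
# bandwidth and space requirements stay small. Conceptual sketch only.

class SnapshotVolume:
    def __init__(self, source_blocks):
        self.source = source_blocks   # live volume E (list of blocks)
        self.cow = {}                 # old blocks preserved on first write

    def write(self, index, data):
        """Application write to E during the backup window."""
        if index not in self.cow:     # preserve the old block exactly once
            self.cow[index] = self.source[index]
        self.source[index] = data

    def snapshot_read(self, index):
        """Backup read from the frozen point-in-time view Z."""
        return self.cow.get(index, self.source[index])

vol = SnapshotVolume(["a0", "b0", "c0", "d0"])
vol.write(1, "b1")                    # app changes block 1 mid-backup
backup = [vol.snapshot_read(i) for i in range(4)]
print(backup)        # ['a0', 'b0', 'c0', 'd0'] -- consistent frozen view
print(vol.source)    # ['a0', 'b1', 'c0', 'd0'] -- live volume moved on
print(len(vol.cow))  # 1 -- only the changed block was copied aside
```

Note that only one block was actually duplicated, even though the backup saw a complete, consistent image of the volume.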

GlassHouse has developed its own software for gathering data from backup logs. The company uses this tool to supplement Tivoli Storage Manager's reporting capabilities. Among other functions, the software looks for files that may not have been backed up because they were open during the scheduled backup process, according to Foskett.

Bocada's BackupReport supports all leading backup applications, operating systems, and storage environments/devices (e.g., storage area network, network-attached storage, tape, and disk). A variety of reporting tools are also available from storage resource management (SRM) vendors, and more products are expected from leading backup vendors.

For example, Atempo will add Excel-based reporting and trend analysis tools to its Time Navigator software later this year. CommVault will broaden its LiveVault software's reporting capability, adding 30 more reports, next month. Computer Associates is implementing phased changes to BrightStor ARCserve and BrightStor Enterprise Backup, starting with a more intuitive error-reporting capability. And Tivoli will replace Tivoli Decision Support with a product that has a more unified way of gathering information.

Managing multiple backup applications

If your environment involves multiple backup applications, you're not alone. The reality is that many companies, particularly larger ones, run more than one backup application.

According to a Robert W. Baird & Co. (www.rwbaird.com) survey of 86 mid-sized and large companies, 45% had multiple backup vendors and, of those, 32% had plans to consolidate applications this year.

The survey found that companies running multiple backup applications tended to be larger companies with complex operating environments. For example, 71% of the large companies (greater than $3 billion in annual revenue) had multiple backup vendors, while only 36% of companies with less than $1 billion in revenue had more than one backup vendor.

So, what do you do if you have multiple backup applications? Do you consolidate to one vendor? Not necessarily. What may make sense in some cases is to have a dedicated backup administration group, says GlassHouse's Foskett. "If you've got one or two dedicated backup administrators-rather than two or three part-time system administrators-then maybe you can keep two backup platforms."

By having dedicated backup administrators, Foskett says companies can avoid falling into the dangerous, and often costly, pitfall of hiring too many non-specialized administrators. "A given administrator may be able to manage a huge NetBackup environment if he has to, but he can only manage [a small part] of that environment if he also has to manage a NetWorker environment. The more backup platforms you've got, the more administrators you need," he says.

But whether you're running multiple backup applications or just one, have hundreds of servers or just a few, or are trying to gather information about remote backup operations or just local backups, odds are you're having problems managing them. What's needed is a single tool that can monitor, manage, and report on these environments.

Vendors realize this and are attacking the issue from a variety of directions. For example, Bocada has developed a tool that enables administrators to assess backup operations in mixed-vendor environments. Computer Associates has a storage portal that, although it isn't a reporting tool, enables users to manage mixed storage environments using a single Web interface. CA's plan, like those of other leading backup vendors, is to integrate the various storage management functions, including backup, into a single tool that, among other things, is capable of automatically collecting data from backup applications and presenting that data to administrators in a single view.

Other vendors such as Atempo and Reliaty (formerly Workstation Solutions) are developing new ways of using the Network Data Management Protocol (NDMP) to facilitate backup and recovery of data in environments with mixed operating systems, applications, and storage devices. This capability is built into Reliaty Backup and will be available from Atempo as a stand-alone product (Universal NDMP Server Engine) next month.

Adhering to the backup window

While vendors have made great strides in this area, end users continue to have problems backing up data in allotted backup windows.

"It's a big issue," says GlassHouse's Foskett. "A lot of people are complaining that their backup windows aren't being met." What can you do? In many cases, the problem can be eased by adjusting policies, by better balancing backup loads to avoid flooding the network, and/or by implementing a dedicated backup network (LAN or SAN), according to Foskett.

Other options include replicating or taking snapshots of data, using more or faster tape drives, or using hierarchical storage management or HSM-like products to migrate unchanging data to archives (which means you're backing up less data). Analysts recommend staying away from serverless technologies for now (see "Serverless backup not ready for prime time," InfoStor, May 2002, p. 1).

One of the hottest trends is the concept of "zero-impact" or "windowless" backup. The idea is to take a snapshot of your disk and then back up the snapshot to disk or tape. However, while this is an effective means of eliminating the backup window, it is a complex, and often costly, option, and one that can tie administrators to particular hardware or software platforms. It can also lead to incomplete backups if applications are in use.

"It's very difficult to take a snapshot of the right data at the right time and to make sure that the data is in a 'quiet' state so that what you're backing up is useful," explains Foskett. "And it can be very expensive."

Fujitsu Softek recently announced a windowless backup capability (see figure on this page) that leverages its existing virtualization and replication technologies and supports a variety of backup applications (e.g., NetBackup and NetWorker), operating systems (NT/2000 and Solaris), and storage architectures (SAN, NAS, or DAS). Additional support is expected over the next few months.

How does this product differ from existing options? In at least five ways, says John McArthur, group vice president of International Data Corp.'s storage research program: "It keeps track of changed data only, has a rapid restore capability, is cost-effective and host-independent, and you don't have to double up your storage resources."

It's all about restore

If the data hasn't been backed up, it can't be restored. It's as simple as that. So, the first step to making sure you can recover your data is ensuring that everything that should be backed up has actually been backed up. The next step is to periodically check backup systems and processes (see chart).
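That first step, confirming that everything that should be protected actually appeared in last night's run, amounts to a simple set comparison. The minimal sketch below assumes you can export two lists: an inventory of systems that should be backed up and the clients that showed up in the backup reports. All host names are invented for illustration.

```python
# Minimal coverage check: compare the systems that *should* be
# protected against the clients that actually appeared in the backup
# run. Host names are made up; in practice both sets would come from
# an asset inventory and the backup application's logs or reports.

inventory = {"web01", "web02", "db01", "mail01", "file01"}  # should be protected
backed_up = {"web01", "db01", "mail01"}                     # seen in last night's run

not_covered = sorted(inventory - backed_up)  # systems the backup app never saw
unknown = sorted(backed_up - inventory)      # backed-up clients nobody tracks

print("Not backed up:", not_covered)   # ['file01', 'web02']
print("Unknown clients:", unknown)     # []
```

Run routinely, a check like this catches the "inadequate coverage" failure mode, systems the backup application simply doesn't know about, before a restore request exposes it.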

"If you do all this, then there's no reason you should be worried about restoring your data," says Foskett. "If you're worried about it, it's because you're worried the data isn't there."

If the speed, not the viability, of the restore is an issue, there are a few things you can consider, including using snapshot and replication techniques to make point-in-time copies of data available on disk, putting frequently accessed or vital data on low-cost disk rather than tape, using faster or more tape drives, and running multiple streams of data to/from the tape drive/library.


"Up to 60% of backups do not properly execute in typical network environments."-The Enterprise Storage Group

Common causes of failure

  • Inadequate coverage: Backup application is unaware of certain systems
  • Poor success rate: Backup application not doing what it is supposed to
  • Inadequate policy: Administrator erroneously tells backup application to ignore certain systems

Available products*

  • Bocada BackupReport
  • Veritas NetBackup DataCenter Advanced Reporter
  • Tivoli Decision Support
  • Tek-Tools Storage Profiler Backup

*This list is a sampling of currently available reporting tools.

Eight key questions end users should ask

By Ira Goodman

If you're like most systems administrators, you probably haven't given much thought to how much information is on your network, how much of that information is being used at any given time, or how much of it is worth saving and for how long.

Finding the right backup solution requires foresight and planning. If you ask yourself the following questions, you'll be well on your way to putting the right solutions in place.

  • How big is your full backup? How much data changes on a daily basis (%)?
    Backup volumes vary dramatically from company to company and department to department. If you've got 320GB or more to back up, then even a seven-slot library with 40GB DLT 8000 drives won't be able to store a full backup. However, a similarly configured library with 100GB LTO drives could handle the load as well as several days' worth of incremental backups (assuming each incremental backup is 10% to 20% of the base backup). On the other end of the spectrum, a small company with 10 systems, each with 18GB hard drives, would be adequately served by locally attached tape drives and no library. This question will force you to decide whether you want to do full backups each night or if incremental, or differential, backups are adequate. The difference between the two can mean 1TB versus 100GB backups daily. The amount of data versus the number of devices you have will not only tell you if you can do full backups during the night or over a weekend, but also how frequently you can back up-every day, every other day, or every week.
  • What is your expected data growth?
    On average, data growth is about 50% to 60% a year, although 100% is not uncommon. People generally underestimate their growth rates. So, if you expect 10% growth, plan for at least 30%. When forecasting, consider the type of files you'll be storing. For example, if you expect a large volume of video files, set your expectations higher.

  • What is your backup window?
    This is simply a question of dividing your backup volume by the allotted backup window (i.e., the time available for backup). Establish how many hours are available and the amount of data that is to be backed up within that window to determine the type and number of drives you'll need.
  • Is automatic twinning (i.e., the automatic creation of duplicate tape copies) required?
    Automatic twinning has pros and cons. It allows you to automatically create an extra copy of your data to keep off-site, which can be a big security bonus, but it doubles your capacity and hardware requirements. You'll need to weigh the cost of these additional resources against the value of having an automatic copy.
  • How long will you need to retain backups? Are there any specific legal requirements?
    Storing backup media is always an issue. Depending on the volume of data and the number of devices, eight hours of backup a week could translate into up to 10 pounds of media. So, you'll need to weigh your media choice against its storage capabilities and your retention needs. Legal requirements vary by industry. For example, most financial companies are required by law to keep client data for seven years. That's a lot of tape to store. Due to the bulk, you'll probably want to store the data off-site (see question #6).

  • Is automatic off-site tape movement required?
    Will you want to move tapes to a vault? If so, you'll need to plan for this. Your operations staff will need to be trained on how to eject tapes from the library (if there is only one copy), etc. You'll also need to consider the implications for data restores: it could take a day or more to restore data from a tape stored off-site, versus near-instantaneous restore from on-site disk or tape. Your decision should be consistent with enterprise policies.
  • Are there other devices in use in your enterprise that might influence your decision to use one type of media over another?
    If your company has been using a certain type of media for a long time, you may want to keep using the same equipment and media-or at least equipment/media that is compatible with your installed infrastructure. If you decide to invest in new technology, you'll need to keep some old equipment around for some time. Also, you may want to standardize on the same technology throughout your enterprise. It's much easier to restore data if the same technologies are used in all departments.
  • What is your budget?
    Do you have a big enough hardware budget to support your backup process? This really says it all! Every aspect of your backup requirements has to be considered in light of your budget.

Ira Goodman is software services manager at Syncsort Inc. (www.syncsort.com) in Woodcliff Lake, NJ.

This article was originally published on August 01, 2002