Debunking the on-demand storage utility

Posted on May 01, 2005


A contrarian’s view of why the storage utility model’s promise of on-demand provisioning may be contraindicated for your corporate health.

By Dick Benton

The industry definition of the storage utility model focuses on its ability to provide storage on demand to an increasingly data-hungry business community. Industry pundits often liken the storage utility to electric or water utilities, which supply a commodity on demand to their customers: If you need more water, just turn on the tap; if you need more electricity, just plug in an additional device.

The storage utility model’s central premise is that storage can be provided in the same manner as a utility. This promises the elimination of delays in provisioning that typically occur when additional storage needs to be purchased and installed. Eliminating delays has the potential to shorten time to market, one of the keys to competitive advantage. However, there is one key difference between the utility model for water and electricity and the storage utility model, and that is the concept of persistence: Water and electricity are supplied and usually consumed immediately.

Storage provisioned to a business unit consumer is persistent; it is not simply consumed, but is a resource that stays around, often for lengthy periods. With water and electricity, you have to take positive steps to make the utility deliverable stick around: fill your sink with water, or perhaps charge a capacitor. The default is to consume the utility service as it is provided. With storage, the default is not only to consume the storage, but also to retain it, and probably many copies of the data on it, for considerable periods of time. Given storage’s quality of persistence, are the storage utility model and the concept of storage on demand good for corporate health? It seems that the arguments for storage on demand do not really stand up under examination.

Let’s look at some of the issues involved:

  • Reduction in time to market;
  • Automatic add-on costs for every gigabyte; and
  • Biting the bullet of data destruction.

Reduction in time to market

If you subscribe to the point of view that time to market is key to business success, then the ability to provision storage on demand appears critical. But the same thinking applies equally to application software, application servers, and the connectivity of servers to both users and the storage environment. In this picture, it is difficult to see the provisioning of storage as the lone item on the critical path.

So, is availability of data so mission critical as to justify the expense and complexity of the type of capacity planning required to support an on-demand model? If an organization consumes so much storage that the storage vendor will happily provide additional storage on-site and bill only as used, then the costs of capacity planning disappear, but the much larger costs of storage administration, data retention, data validation, and the complex demands of data destruction remain.

Automatic add-on costs

Business units often dismiss the significance of the cost of storage as hardware costs drop. But while hardware costs have decreased, the cost of administration has increased because a gigabyte just isn’t a gigabyte. For every gigabyte of production storage, there can be 30 to 50 times that amount in secondary storage, and this secondary storage tends to be on removable media that requires management and administration. Let’s look at a very simplistic example of how a gigabyte of production storage, required for a new application, can generate 25GB of secondary storage. (Of course, an individual organization’s ratios will differ from this example, depending on policies governing backup, archiving, and disaster recovery.)

Base:
1GB of production storage for the application’s structured data

Plus:
1GB for a development copy
1GB for a test copy
1GB for a QA copy
5GB for daily backups
4GB for weekly backups
12GB for monthly archiving
1GB for disaster recovery

Total secondary storage: 25GB

So for every gigabyte of production storage, about 25GB can be consumed in secondary storage that requires scheduling, compliance checks, off-site arrangements (including transport, filing, and recovery schemas), regular validation of recovery capabilities, and proof of archiving compliance.
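To make the arithmetic explicit, here is a minimal sketch in Python (not from the original article) that reproduces the multiplier above. The policy names and multiplier values are assumptions matching the simplistic example; an organization would substitute its own backup, archiving, and disaster-recovery policies.

```python
# Illustrative sketch only: estimate secondary storage generated by new
# production storage under assumed retention policies. The policy names
# and multipliers below are hypothetical, chosen to match the example.

SECONDARY_POLICY_MULTIPLIERS = {
    "development copy": 1,
    "test copy": 1,
    "qa copy": 1,
    "daily backups": 5,       # e.g., five rotating daily backups
    "weekly backups": 4,      # e.g., four rotating weekly backups
    "monthly archives": 12,   # e.g., twelve months of archives retained
    "disaster recovery": 1,
}

def secondary_storage_gb(production_gb: float) -> float:
    """Estimate secondary storage consumed for a given amount of production storage."""
    return production_gb * sum(SECONDARY_POLICY_MULTIPLIERS.values())

if __name__ == "__main__":
    production = 1.0  # 1GB of new production storage for the application
    secondary = secondary_storage_gb(production)
    print(f"{production:.0f}GB production -> {secondary:.0f}GB secondary "
          f"({secondary / production:.0f}x multiplier)")
    # Prints: 1GB production -> 25GB secondary (25x multiplier)
```

Changing any one policy parameter (say, retaining 24 monthly archives instead of 12) moves the multiplier immediately, which is exactly why provisioning decisions ripple far beyond the gigabyte the business unit asked for.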

In this scenario, the question is: Should an organization allow this type of incremental cost to be imposed by a member of a business unit making a demand for storage? One could argue that the case for on-demand storage should be subordinate to a change request, based on a provisioning policy that protects the organization from data accumulation while optimizing time to market.

Biting the bullet of data destruction

From the examples above showing the effects of unchecked data growth, it becomes clear that at some point it is necessary to bite the bullet and develop policies and procedures to guide data destruction. To return to our water utility example, if we treated our usage of the water utility the way we treat our use of the storage utility, we would save every drop of water that came out of the taps after every conceivable household use. Our basements would gradually fill with buckets of “grey” water from many different uses. We would keep these because every month or two the local police would come around to make sure we could show them the particular buckets holding a specific class of water; or worse, we would have to prove the water was still actually in the bucket. At some point in time it just isn’t feasible to continue down this path; some buckets must be thrown out to make room for the daily addition of more buckets as household use increases with the growing family.

Even though the above scenario is somewhat tongue in cheek, it parallels many organizations’ approach to the use of storage. Inevitably, the cost of maintaining data, weighed against the value it provides and the risk of not having it, reaches a point where certain data has to be let go. Policy and procedure with provable compliance metrics must govern this process. How expensive the process is will depend on how much data was allowed to accumulate in the first place.
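The article prescribes no particular mechanism, but a data-destruction policy of this kind can be pictured as a simple check of data assets against retention limits. The sketch below is purely illustrative; all class names, retention periods, and data sets are hypothetical.

```python
# Illustrative sketch only: flag data sets whose age exceeds an assumed
# retention period so they can be reviewed for destruction, leaving an
# auditable trail. All names and retention values are hypothetical.

from datetime import date, timedelta

RETENTION_POLICY_DAYS = {
    "daily_backup": 7,
    "weekly_backup": 28,
    "monthly_archive": 365,
}

def destruction_candidates(data_sets, today=None):
    """Return data sets older than the retention period for their class."""
    today = today or date.today()
    candidates = []
    for name, data_class, created in data_sets:
        limit = RETENTION_POLICY_DAYS.get(data_class)
        if limit is not None and (today - created) > timedelta(days=limit):
            candidates.append((name, data_class, created))
    return candidates

if __name__ == "__main__":
    inventory = [
        ("payroll-2004-12", "monthly_archive", date(2004, 12, 31)),
        ("orders-daily-042", "daily_backup", date(2005, 4, 20)),
    ]
    for name, data_class, created in destruction_candidates(inventory, today=date(2005, 5, 1)):
        print(f"Review for destruction: {name} ({data_class}, created {created})")
```

The point of such a check is not the code itself but the discipline it represents: destruction happens on a defined schedule, with evidence, rather than when the basement finally fills up with buckets.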

The utility model has its place, for example, in large organizations with the clout to prevail upon vendors to install extra capacity that is billed for only as it is used.

For many companies, however, the utility model may be just that: an interesting model to use when planning a more pragmatic provisioning strategy.

The issues discussed in this article present a rationale for instituting a disciplined process governing storage requests, one based on needs that are cost-justified rather than on the unjustified, ad hoc requirements that the on-demand utility model encourages.


Dick Benton is a storage business practices manager at GlassHouse Technologies (www.glasshouse.com) in Framingham, MA.

