Consultants offer ROI/TCO advice

Posted on March 01, 2005


By Heidi Biggar

All vendors make ROI/TCO claims. But how helpful, or accurate, are those claims in determining or justifying the real benefits of implementing various storage technologies?

Rather than placing too much stock in vendors’ ROI/TCO claims, storage consultants say a better option may be to determine the actual unit cost of the storage services users are demanding.

“We try to stay away from TCO, because its meaning has been twisted over the past decade,” says David Bregman, general manager, western region, at GlassHouse Technologies, an independent provider of storage consulting and services. “The key is to understand the per-megabyte or per-gigabyte unit cost of the storage services.”

While this may sound simple enough, in practice it often isn’t. The problem, according to Bregman, is that many organizations have no idea what storage resources they have or how they are using them. Without that information they can’t calculate the unit cost of those resources, which makes it difficult to justify budgeted items.
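To make the idea concrete, here is a minimal sketch of the unit-cost calculation in Python. Every figure is hypothetical, and the cost categories are assumptions loosely drawn from the GlassHouse template discussed below:

    # Hypothetical annual costs for a storage environment. Every figure is
    # invented for illustration; real inputs come from an inventory of the
    # actual environment.
    annual_costs = {
        "hardware_acquisition": 200_000,   # amortized purchase cost
        "maintenance": 150_000,
        "power_cooling_space": 100_000,    # environmental factors
        "staff": 800_000,                  # often the most underestimated item
        "software_tools": 250_000,
    }

    usable_capacity_gb = 500_000           # 500TB of usable capacity

    total_annual_cost = sum(annual_costs.values())
    cost_per_gb = total_annual_cost / usable_capacity_gb

    print(f"Total annual cost: ${total_annual_cost:,}")
    print(f"Unit cost: ${cost_per_gb:.2f} per GB per year")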

“The goal of one of our customers was to understand what it was costing them to deliver storage so they could not only improve how they were delivering storage services to users, but also be able to justify new projects,” adds Bregman.

Once organizations take inventory of their existing storage assets, Bregman says they can then begin to understand and classify the various types of data in their organization, build tiers of storage that align with the specific business objectives of the various data pools, and tie all this back to an actual per-gigabyte cost.

For example, rather than engage in battle with upper management over the e-mail management problem, administrators should arm themselves with the appropriate tools to defend their position, according to Bregman. “Once you have the metrics (e.g., this particular group is using x storage capacity, which is costing y), you can change behavior,” he explains.
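As a rough illustration of that kind of metric, the following sketch charges each group for its capacity at the hypothetical $3.00/GB unit cost from the previous example; the group names and capacities are invented:

    # Hypothetical per-group capacity consumption in GB. A real report would
    # be driven by data from a storage resource management tool.
    group_usage_gb = {
        "e-mail": 120_000,
        "engineering": 80_000,
        "finance": 15_000,
    }

    cost_per_gb = 3.00  # unit cost ($/GB/year) from the previous sketch

    for group, gb in sorted(group_usage_gb.items(), key=lambda kv: -kv[1]):
        print(f"{group:12s} {gb:>9,} GB  ->  ${gb * cost_per_gb:>12,.2f}/year")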

GlassHouse has developed a template to help organizations build an overall cost model. Key components include the following; a sketch that rolls several of them into a tiered, per-gigabyte cost model appears after the list:

Inventory your storage environment: How much and what type of storage resources are being used and for what applications? Once storage resources are inventoried, administrators can look at provisioning different tiers of storage, which have different associated costs and service levels attached to them. For example, mission-critical data can reside on high-end Fibre Channel arrays, with less mission-critical data on SATA arrays.

Determine actual storage/server usage: Are storage resources being used appropriately and efficiently? Are administrators mirroring data unnecessarily? Is storage/server capacity underutilized? As examples, Bregman says one GlassHouse customer, which had a lot of legacy equipment, was mirroring everything, including data that resided on RAID-5 arrays; another customer was throwing capacity at servers even though utilization was only about 50% because it was easier than trying to “manage” the situation.

Understand business processes, objectives: How is storage provisioned in the organization? How are vendors dealt with? The goal is for organizations to become more proactive and less reactive. By working with vendors in advance of a project, IT organizations can improve their leverage with vendors and negotiate better deals.

Calculate acquisition and operational costs: Although important, hardware acquisition costs are just the tip of the iceberg, accounting for only about 8% of total storage costs, while operational costs account for the remaining 92%, claims Bregman.

Factor in depreciation schedules, or the cost of ownership over time: Administrators can change the depreciation schedule by “cascading” equipment or getting one more year of use out of equipment before retiring it.

Calculate maintenance costs: GlassHouse recommends working with vendors upfront to ensure maintenance costs don’t increase after hardware/software terms expire.

Include environmental factors: This includes the cost of power, cooling, floor space, and so on.

Calculate people costs associated with implementing, managing, and monitoring the storage environment: According to GlassHouse, this is the most underestimated component.

Arm yourself with appropriate tools: Make sure you have the tools needed to monitor and manage the storage environment.

Examine backup and archive policies: Chances are you’ll find you are overprotecting some data. Check your backup policies and ensure they match the requirements of the data (e.g., regulatory requirements).

Determine your risk factor: Assess the cost of doing something versus the risk of not doing it. For example, think twice about setting a default retention period of three weeks for all projects that come online, says Bregman. You’ll save money in storage costs, but you could ultimately pay a huge penalty “when you can’t produce the company financials from three years ago,” should you be asked to do so.
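Pulling several of these components together, here is a rough sketch of the kind of tiered, per-gigabyte cost model the template points toward. The tier definitions, depreciation terms, and dollar amounts are all hypothetical; the 92/8 operations-to-acquisition ratio is Bregman's estimate from above:

    # Hypothetical tiered cost model. Acquisition cost is straight-line
    # depreciated; per Bregman's estimate, operational costs (maintenance,
    # environment, people, tools) dwarf acquisition at roughly a 92/8 ratio.
    tiers = {
        # tier name: (acquisition cost $, usable GB, depreciation years)
        "tier1_fibre_channel": (400_000, 50_000, 3),
        "tier2_sata": (150_000, 100_000, 4),
    }

    # If acquisition is ~8% of total cost, operations are ~92%, i.e.
    # operations ~ acquisition * (92 / 8) over the same period.
    OPS_TO_ACQ_RATIO = 92 / 8

    for tier, (acq_cost, gb, years) in tiers.items():
        annual_acq = acq_cost / years               # straight-line depreciation
        annual_ops = annual_acq * OPS_TO_ACQ_RATIO  # implied operational cost
        per_gb = (annual_acq + annual_ops) / gb
        print(f"{tier:20s} ${per_gb:6.2f}/GB/year "
              f"(acq ${annual_acq:,.0f} + ops ${annual_ops:,.0f})")

        # "Cascading" equipment stretches depreciation by a year, lowering
        # the annual acquisition charge before the gear is retired.
        cascaded_acq = acq_cost / (years + 1)
        print(f"{'':20s} cascaded one extra year: acq ${cascaded_acq:,.0f}/year")

The point of such a model is comparative: once each tier has a defensible per-gigabyte figure, data-placement and chargeback decisions can be argued in dollars rather than anecdotes.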

