How does storage fit into utility computing?

Posted on April 01, 2004


By Graeme Thickins

Though the concept goes by many names (which can lead to confusion), utility computing is an enticingly simple notion: Buy only the amount of computing/storage you need, when you need it—like flipping the light switch or turning on the water faucet—and respond effortlessly to changing business needs.

Several of the leading IT vendors began launching marketing blitzes more than a year ago. IBM has its "On-Demand Computing" initiative; Hewlett-Packard's is called "Adaptive Enterprise"; Sun's vision is encompassed in its "N1" architecture; and Veritas has its storage-as-a-utility vision and "Utility Computing Now" campaign. And many other vendors are applying their own spin on the concept. Get used to it: The barrage is just beginning.

As you might guess, however, getting to the utopian notion of utility computing will not be so simple. Utility computing is just beginning to go from concept to reality, and there's an increasing realization by both vendors and users that education is necessary. In one recent end-user survey, more than half of the respondents didn't know what utility computing was. "The on-demand utility model isn't fully baked yet," says Jamie Gruener, a senior analyst at the Yankee Group consulting firm.

Although the utility computing movement is clearly under way, analysts note that it's a long-term trend that's more of a journey than a destination. Nevertheless, analysts—including those focused on storage—say that it's a trend that IT managers should not ignore.

Why should storage professionals care about utility computing? "Because sooner or later, they'll be mandated to deal with the financial implications of their storage budget," says Gruener, adding that CIOs will be asking how storage is being used. "Storage is a data-center service that people take advantage of—so, how do you charge back the business units?" Some initial stages of chargeback are starting to happen, Gruener notes, and a Yankee Group survey of Global 2000 firms found that more than 20% of them claim to do some form of chargeback. "But there are many definitions of the term," Gruener explains. "Today, it's mostly asking for a budgetary contribution up-front. Actually invoicing departments monthly is a political issue—it's hard to do." Take heed: Utility computing has cultural implications, too.

"There will be a relentless drumbeat around business efficiency," says Clod Barrera, director of technical strategy for IBM's Storage Systems Group. And that, he says, is what's driving the whole on-demand trend. "It's a business notion, not a technical one."

The first thing to understand about utility computing is that it's largely about virtualizing resources—hardware, software, network, and storage. In other words, resources are not fixed or dedicated to specific applications or business units. Other key pieces of the puzzle are service level agreements (SLAs), policy-based management, standards, and the issue of outsourcing versus building an internal utility.

In some respects, storage professionals may be in front of the trend toward utility computing. "In a utility, the IT infrastructure is virtualized, or consolidated into a logical pool," says Mike Fisch, director of storage and networking at The Clipper Group consulting firm. So, if you've implemented a storage area network (SAN), "you've already taken a big step toward a storage utility," he notes in a recent white paper, Shining the Light on Utility Computing—a Business Perspective.
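
If that pooling idea seems abstract, the sketch below shows one way to picture it: a shared pool from which capacity is provisioned to applications on request rather than tied to dedicated arrays. The class, names, and figures are hypothetical illustrations only, not any vendor's product or API.

```python
# Hypothetical sketch of a pooled (virtualized) storage utility.
# Names and numbers are illustrative; they do not reflect any vendor's API.

class StoragePool:
    """A shared pool of capacity carved out to applications on demand."""

    def __init__(self, total_gb):
        self.total_gb = total_gb
        self.allocations = {}          # application name -> GB allocated

    def free_gb(self):
        return self.total_gb - sum(self.allocations.values())

    def provision(self, app, gb):
        """Allocate capacity from the shared pool instead of a dedicated box."""
        if gb > self.free_gb():
            raise RuntimeError(f"Pool exhausted: {self.free_gb()} GB free, {gb} GB requested")
        self.allocations[app] = self.allocations.get(app, 0) + gb

    def reclaim(self, app, gb):
        """Return unused capacity to the pool for other applications."""
        self.allocations[app] = max(0, self.allocations.get(app, 0) - gb)

# A SAN-backed pool serving two business applications from one set of resources.
pool = StoragePool(total_gb=10_000)
pool.provision("order-entry", 2_000)
pool.provision("data-warehouse", 5_000)
print(f"{pool.free_gb()} GB remain unallocated")   # 3000 GB remain unallocated
```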

"The notion of business policies being driven down into the IT infrastructure is key to utility computing," says IBM's Barrera. "All requirements—application, security, access control, etc.—have to be aligned with business requirements." The ultimate objective of such policy-based management, he says, is for it to be self-managing—though this is admittedly a longer-term vision. "That's what we mean by 'autonomic computing'—machines that are self-configuring, self-healing, self-optimizing, self-protecting."

There's no question that the utility computing mandate is largely a concept that's flowing down from senior management with the primary objectives being to lower costs and improve efficiencies.

The Yankee Group's Gruener points out that storage professionals can begin by educating their end users about how much of the enterprise's storage resources they're using and what that's really worth. "This is the year for storage managers to start educating the business units," says Gruener. The point is to begin readying them for the coming realities of chargeback, which is central to the concept of utility computing. In contrast to traditional IT approaches, says Clipper Group's Fisch, "utility computing makes users accountable for their consumption. Utility computing provisions resources in measured tiers of service, rather than as discrete boxes or systems." He adds, "The providers are responsible for delivering the services in measured SLAs."
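
To make the chargeback idea concrete, the sketch below meters each business unit's consumption and computes a monthly charge at its tier's rate. The tiers, rates, and usage figures are hypothetical, not taken from any of the products discussed here.

```python
# Hypothetical monthly storage chargeback: each business unit is billed for the
# capacity it actually consumed, at the rate of the service tier it subscribes to.
# Rates and usage figures are invented for illustration.

RATE_PER_GB_MONTH = {"gold": 0.90, "silver": 0.45, "bronze": 0.20}   # dollars

usage = [
    # (business unit, tier, GB consumed this month)
    ("finance",   "gold",   1_200),
    ("marketing", "silver",   800),
    ("archive",   "bronze", 5_000),
]

for unit, tier, gb in usage:
    charge = gb * RATE_PER_GB_MONTH[tier]
    print(f"{unit:10s} {tier:7s} {gb:6d} GB  ${charge:,.2f}")
```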

Fisch says a service level might include characteristics such as amount of capacity in megabytes, performance in MBps or I/Os per second, availability in percentage of uptime, recoverability in time to restore, and cost per unit of capacity.
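
Expressed as data, such a service level might look like the hypothetical tier below; the field names and figures are illustrative only, not a schema from Fisch or any vendor.

```python
# Hypothetical definition of a storage service tier, using the characteristics
# Fisch lists: capacity, throughput, I/O rate, availability, recoverability, cost.
from dataclasses import dataclass

@dataclass
class ServiceTier:
    name: str
    capacity_gb: int          # amount of capacity provisioned
    throughput_mbps: int      # performance in MB per second
    iops: int                 # performance in I/Os per second
    uptime_pct: float         # availability, percentage of uptime
    restore_hours: int        # recoverability, time to restore
    cost_per_gb_month: float  # cost per unit of capacity

gold = ServiceTier("gold", capacity_gb=1_000, throughput_mbps=200, iops=10_000,
                   uptime_pct=99.99, restore_hours=2, cost_per_gb_month=0.90)

# A provider reports against the SLA by comparing measured values to the tier.
measured_uptime = 99.95
if measured_uptime < gold.uptime_pct:
    print(f"SLA miss: {measured_uptime}% uptime vs. {gold.uptime_pct}% committed")
```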

Analysts and vendors alike agree that SLAs are a core underpinning of the utility computing concept and a central theme of the education that must take place within organizations—business units and IT—to make utility computing work.

There's no question that utility computing is real today. Many providers are selling outsourced services with much success—from IBM Global Services and HP's Utility Data Center on down, including several successful independent and regional suppliers. The phenomenon spans much more than multi-billion dollar deals such as American Express outsourcing its entire computing infrastructure to IBM in an on-demand, pay-as-you-go model. Outsourcing has become more strategic in recent years, with companies selectively outsourcing certain IT services to reduce costs and maximize internal efficiencies.

Where the concept is especially getting traction is in data protection and business continuity. "We believe data-protection services will emerge as the first widely adopted utility computing service," says Frank Brick, CEO of Arsenal Digital Solutions. Arsenal partners with telecommunications companies and hosting and network service providers to sell its data-protection services to IT departments.

Another example of a services provider that offers utility computing is Managed Storage International. And smaller players are offering specialized, storage-centric outsourced services in such areas as compliance and e-mail management.

"Outsourcing will be an economic and competitive mandate for SMBs [small and medium-sized businesses] that want to take advantage of the benefits of utility computing," says Tom Kieffer, CEO of Agiliti Inc., a regional services provider in St. Paul, MN. "Some large enterprises will choose to do it themselves, but they may ultimately buy utility services later once the pre-

requisite operational processes and application architectures—such as Web services—are in place." Agiliti recently announced a services partnership with Compellent, whereby Agiliti will provide near real-time off-site replication for users of Compellent's Storage Center system (see "Start-up spins a 'compelling' story," p. 16).

"For SMBs, it's simply not cost-effective or feasible to build an internal utility infrastructure," argues Frank Brick, Arsenal Digital Solution's CEO. "That would be akin to home owners deciding to get gas, electricity, and phone service on their own, without using the existing utility network." Brick says that outsourced utility computing services will provide a "democratizing effect" for SMBs that otherwise would not be able to afford and manage the infra- structure for storage networks.

There's no question the "internal utility" model will increasingly be pushed by the large IT vendors, including IBM, HP, and Sun, as the utility computing trend heats up. Software vendors such as Computer Associates and Veritas also position their offerings as enablers of this model. At the same time, these big players also espouse the outsourcing approach, either directly (e.g., IBM Global Services) or by marketing their utility computing solutions to providers that host utility services.

For example, Veritas is actively marketing to service providers. "Many of our customers are interested in evolving their IT infrastructure to support utility computing, and a lot of firms will look to service providers to enable this transformation," says Troy Toman, Veritas' senior director of product management. "Veritas is not trying to deliver the utility; we're an enabler. So it's critical for us to work with both service providers and 'internal utility' suppliers."

Veritas has been on an acquisition binge for more than a year to build its utility computing offerings, buying Precise, Jareva, and, early this year, Ejasent. With the Ejasent acquisition, Veritas gained a chargeback tool called MicroMeasure.

The Yankee Group's Gruener says that many companies are already doing chargeback using storage resource management (SRM) software. (For more information, see "Leveraging storage resource management software," InfoStor, January 2004, p. 18.)

Analysts and vendors agree that the path to utility computing and to the longer-term goal of automated policy-based management begins with process. One vendor that's building its whole business around that concept is Invio Software, which offers a suite of solutions to support on-demand storage. "Regardless of whether it's outsourced services or an internal utility, the ability to better manage and continually improve the processes used to handle a company's data is a key element of a utility computing initiative," says Chris Hyrne, vice president of marketing at Invio.

Perhaps the best thing to remember about the promise of utility computing is captured by The Clipper Group's Mike Fisch: "Through centralized management, virtualization of heterogeneous resources, and automation, utility computing will dramatically raise administrator productivity, lower management costs, and automate many of the repetitive, boring tasks."

Graeme Thickins (graeme@thickins.com) is a storage and IT industry veteran who writes frequently about storage networking, utility computing, and other IT trends. He's based in Minneapolis and southern California.


Standards for utility computing

In addition to the Storage Networking Industry Association's SMI-S standard, which has strong implications for utility computing, other standards in this developing space include one being pushed by outsourcing giant EDS, along with Computer Associates and other vendors. The Data Center Markup Language (DCML) is a standard designed to enable data-center automation, utility computing, and systems management solutions, as well as information exchange. Last month, the DCML organization announced the formation of several technical working groups, which will lead to further definitions of the DCML specification.

A much larger initiative, representing about 100 industry players, comes from the Distributed Management Task Force (DMTF). The DMTF recently announced the formation of a Utility Computing Working Group, which will create interoperable and common object models for utility computing services within the DMTF's Common Information Model (CIM). The workgroup is co-chaired by representatives from IBM and Veritas. Other large vendors participating in the DMTF's efforts include Cisco, EMC, Hewlett-Packard, Oracle, and Sun.


Recent utility computing announcements

Last month, IBM announced its TotalStorage Open Software storage initiative. This family of products delivers a set of storage automation and virtualization capabilities designed to help users build on-demand storage environments that optimize storage utilization, improve application availability, and maximize personnel productivity so that IT departments can be more responsive to changing business requirements.

Veritas says its CommandCentral software facilitates the shift to utility computing by providing operational and business-level management capabilities and a single service for requesting, deploying, reporting, and charging for IT services, thus helping organizations create a shared services, or utility computing, model. The product set includes a portal that Veritas says allows IT to become more transparent, measurable, and aligned with business objectives.

EMC's StorageScope, an SRM monitoring and reporting application within EMC's ControlCenter suite, provides integrated asset and utilization reports across multi-vendor storage infrastructures. With StorageScope, users can improve asset utilization and performance to provide higher service levels. The software allows storage to be re-allocated more efficiently, facilitates chargeback, provides planning for future capacity requirements, and tracks IT assets.

