Storage-focused implementers can learn a lot from non-storage IT disciplines.
By Bob Rogers
Depending on whom you talk to, information lifecycle management (ILM) is nothing more than "tiered storage," a new term for hierarchical storage management (HSM), a new way of classifying content for corporate or regulatory governance, or something else entirely. The Storage Networking Industry Association (SNIA) defines ILM as:
"The policies, processes, practices, and tools used to align the business value of information with the most appropriate and cost-effective IT infrastructure from the time information is conceived through its final disposition. Information is aligned with business processes through management policies and service levels associated with applications, metadata, information, and data."
Notice that the word "storage" appears nowhere in SNIA's definition. The definition fits nicely into several other disciplines, including information assurance and security, enterprise architecture, performance and capacity planning, service-level management, IT finance, records management, and business process management. What is primarily driving ILM adoption today, however, is fear of prosecution.
Most current users of, or potential candidates for, ILM solutions are companies implementing compliance programs for corporate and regulatory governance. The records management community has been operating in this area for a very long time, and its emphasis is on using these tools for data classification and retention to ensure that government and judicial demands for records can be met.
The convergence of ILM data-classification techniques and compliance is serendipitous. What most people in the industry were hoping for from ILM, however, was an improvement in information management. Keeping pace with the management of information assets has become one of the major challenges facing IT. The techniques storage administrators used in the 1990s simply do not scale to meet the demands of today's data centers and the growth in information.
The principles of ILM are alive and well, just not in the way that most of the storage vendors had imagined. For example, the widespread adoption of practices based on the IT Infrastructure Library (ITIL) shows there is a major need for accountability in service management and delivery. The ITIL methodology focuses on many of the same areas as ILM, albeit from a more operational perspective. The service management component of ITIL addresses issues of availability, capacity, and performance at a high level of detail. These are the same attributes that ILM principles use to differentiate data. The systems management folks are leading the effort to produce the "enterprise core values" for business process, workflow, and application service management. One might ask whether the level of detail is sufficient for the storage folks (the answer is probably "no"), but it would be wrong to assume no one in the enterprise has embraced the necessity of understanding and defining the service requirements of the enterprise portfolio.
If ITIL efforts are exposing service management issues and concerns in so many data centers, then the next question is: How does that information influence the management of data throughout its lifecycle? What tools and techniques can be employed to fulfill the ILM role?
Server and storage virtualization, de-duplication, encryption, disk-to-disk backup, and continuous data protection (CDP) are just a few examples of technologies that aim to reduce the data-management burden for storage administrators. Applied indiscriminately, each is little more than a stopgap measure against the rising tide of data. When enterprise stakeholders collaborate to study business processes, workflows, and applications, and to structure them in an identifiable way, however, those same technologies become powerful components of an "in-house ILM" solution.
For example, a large insurance company uses virtualized servers to isolate critical business processes, virtual tape to increase availability by mirroring virtual volumes to a disaster-recovery location before they are written to physical tape, and expiry-based tape stacking to maximize tape usage with minimal operator interaction. Its method of classifying data (by virtual server alignment) may not be the most elegant, but it has yielded significant benefits in conserving hardware resources and improving service to users. No special "ILM-ized" software or hardware components were involved, just plain-old storage products implemented after some analysis and planning.
Most of the potential for ILM comes from having analyzed the enterprise environment to understand who deserves what resources. As the SNIA definition of ILM suggests, it is only when the "pondscum" applications have been culled from the "breadwinners" that you can start achieving significant improvements in efficiency. Classifying information (by whatever method), applying service-level objectives, and understanding the value of the information are labor-intensive activities and rarely simple tasks.
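To make the classification step concrete, here is a minimal sketch of mapping applications to service-level objectives by business value, echoing the article's "breadwinner"/"pondscum" distinction. All class names, tier labels, and thresholds are illustrative assumptions, not part of any real ILM product or the SNIA framework; real programs weigh many more factors than revenue impact.

```python
# Hypothetical sketch: classify applications by business value and map each
# class to a storage service-level objective. Names and thresholds are
# illustrative only.

from dataclasses import dataclass

@dataclass
class ServiceLevel:
    tier: str            # e.g., "mirrored SAN", "SATA array", "tape archive"
    availability: float  # target uptime fraction
    rpo_minutes: int     # recovery point objective

# Illustrative mapping from business-value class to a service level
SLO_BY_CLASS = {
    "breadwinner": ServiceLevel("mirrored SAN", 0.9999, 5),
    "supporting":  ServiceLevel("SATA array",   0.999,  60),
    "pondscum":    ServiceLevel("tape archive", 0.99,   1440),
}

def classify(annual_revenue_impact: float) -> str:
    """Crude single-factor classification; a real program would use many."""
    if annual_revenue_impact >= 1_000_000:
        return "breadwinner"
    if annual_revenue_impact >= 50_000:
        return "supporting"
    return "pondscum"

def slo_for(app_name: str, annual_revenue_impact: float) -> ServiceLevel:
    """Look up the service level an application's class entitles it to."""
    return SLO_BY_CLASS[classify(annual_revenue_impact)]
```

For instance, `slo_for("claims-processing", 5_000_000)` (a hypothetical application) would land on the mirrored-SAN tier. The hard part, as the paragraph above notes, is not this lookup but agreeing on the classification inputs in the first place.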
As previously stated, the storage-focused ILM folks aren't alone. ITIL proponents have made significant progress in enterprise data centers, and their objectives are virtually identical. In addition, last year the IT Governance Institute (ITGI) and the Information Systems Audit and Control Association (ISACA) announced the "Val IT" initiative. Val IT is a framework (based on COBIT, the Control Objectives for Information and Related Technology) that focuses on the evaluation and selection of IT investments and their value to the business.
There are three major areas of emphasis in Val IT: value governance, portfolio management, and investment management. Val IT's portfolio management processes are particularly relevant to ILM because they describe key management practices such as identifying resource requirements, performing gap analyses, and monitoring and adjusting portfolio priorities.
A trend is clearly developing. The early ILM adopters focused on compliance, the ITIL community is establishing best practices for operations, and Val IT proponents are working down the financial-management path. Executive-level commitment for projects of this scope is not optional. The project management responsibilities may be beyond most storage administrators, since all the required up-front analysis is business-focused and goes well beyond issues of data placement, retention, and availability. The back end of ILM, moving data from disk to disk or from disk to tape, is actually the easiest part of an ILM solution.
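The "easy" back end can be sketched in a few lines: a rule that picks a target tier from access recency and retention class, plus a pass that lists the moves needed. The tier names, thresholds, and the regulatory-hold rule are hypothetical assumptions for illustration, not taken from any product.

```python
# Hypothetical sketch of the ILM "back end": an age- and class-based rule
# deciding where a dataset should live. Tiers and thresholds are illustrative.

def target_tier(days_since_access: int, record_class: str) -> str:
    """Pick a storage tier from access recency and retention class."""
    if record_class == "regulatory-hold":
        return "WORM archive"      # retained regardless of access pattern
    if days_since_access <= 30:
        return "primary disk"
    if days_since_access <= 365:
        return "secondary disk"
    return "tape"

def migration_plan(datasets):
    """Yield (name, tier) moves for datasets not already on their target.

    Each dataset is a (name, days_since_access, record_class, current_tier)
    tuple.
    """
    for name, days, cls, current in datasets:
        tier = target_tier(days, cls)
        if tier != current:
            yield (name, tier)
```

Note how trivial the mechanics are compared with the questions the rule's inputs beg: who decides what counts as "regulatory-hold," and who signs off on the 30-day and 365-day thresholds? That is exactly the business-focused up-front analysis the paragraph above describes.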
The promise of ILM is that the technology will eventually make provisioning new business processes, or changes to applications, so simple and automated that some IT administration jobs will be in jeopardy; that day, however, is years away. Today, most of the value of ILM lies in analyzing what goes on in IT: thoroughly understanding application requirements, availability considerations, and performance and capacity requirements. There is no silver bullet, and as disciplines beyond storage management have discovered, understanding the business from a service management, value, and governance perspective is not just a desirable goal but a prerequisite for the health of the enterprise.
Bob "Mister" Rogers was one of the founders of the Storage Networking Industry Association's (SNIA) ILM Initiative (ILMI). ILMI is defining a reference architecture for ILM, including data classification, market and product segmentation, and requirements and use cases to drive ILM-related work in the SMI-S management standard. This article was submitted on behalf of SNIA's Data Management Forum. Rogers is also the CTO and founder of Application Matrix LLC.