Tips on enterprise storage management

Posted on February 01, 2002


IT managers should focus on heterogeneity, centralization, snapshots/replication, and storage virtualization.

By Gary Wright

It's no secret that the data-storage needs of the typical Fortune 1000 company are doubling, and in some cases tripling, every year. This explosive growth is fueled in large part by the Internet.

Dealing with this massive growth has become a mission-critical focus of IT departments: Failure to manage storage capacity, accessibility, security, and performance would be a death knell to most IT organizations.

However, more capacity means more complexity. As an organization grows, it typically adds different storage devices and assimilates new networks through acquisition. Implementing intranets, e-commerce, CRM, or other services over the Internet creates additional levels of complexity.

The keys to managing rapid growth of data complexity are

  • Ability to implement and manage heterogeneous environments;
  • Greater intelligence in the storage network;
  • Centralization of storage management;
  • Increased data accessibility; and
  • Efficiency in storage utilization.

As an organization grows, creation of heterogeneous environments is almost unavoidable. Whether it's because of acquisitions, changes in policy or management, or other reasons, growth companies find themselves dealing with many different data-storage devices and systems that must be integrated. Few organizations can afford to scrap their existing investment in servers, storage devices, and software to accommodate changing technologies and evolving products.

Some IT departments are currently implementing homogeneous storage area networks (SANs), with only one operating system and one type of storage array. However, most of these organizations plan to tie together these SAN "islands" or implement more-elaborate networked systems in the future.


For these more-complex, heterogeneous networked environments to be successful, the industry must evolve toward an open SAN environment, and the storage industry is not quite there yet.

With the increasing complexity and heterogeneity of storage environments come the questions of how IT personnel can manage multiple operating systems, software, and hardware devices and make them all work together. According to Carolyn DiCenzo, chief analyst at Gartner Inc., storage hardware budgets are growing faster than budgets to hire trained IT people because "it's cheaper to buy more software than hire trained people, and it is increasingly difficult to find qualified people to manage the storage. Many IT organizations lack the skills and knowledge, and skills are hard to find."

Rising complexity and a shortage of skilled people translate into a need for increasingly intelligent storage management software. Major enterprise storage vendors offer their own storage management software, and all of it depends on being deployed across a network. Networked storage systems enable several critical capabilities:

  • Monitoring of storage devices remotely;
  • Remote upgrade, maintenance, and reallocation of storage resources;
  • Implementation of a choice of different backup schemes; and
  • Centralization of management control.

Taken together, these capabilities offload much of the management of the storage systems from IT staff and offer significant improvements in terms of data availability, security, and the time and cost of administering systems.
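
As a rough illustration of the monitoring and central-control pieces, the sketch below polls capacity figures for several arrays from a single console and flags any that are nearly full. It is a minimal Python sketch only; the array names, capacities, and the poll_capacity() stub are hypothetical stand-ins for whatever management interface a given vendor actually provides.

    # Toy "central console": poll capacity from several storage arrays and
    # flag any that are nearly full. All device data here is hypothetical.
    from dataclasses import dataclass

    @dataclass
    class ArrayStatus:
        name: str
        total_gb: int
        used_gb: int

        @property
        def utilization(self) -> float:
            return self.used_gb / self.total_gb

    def poll_capacity() -> list[ArrayStatus]:
        # Stand-in for a network poll of each array's management interface.
        return [
            ArrayStatus("array-ny-01", total_gb=2048, used_gb=1900),
            ArrayStatus("array-ldn-01", total_gb=4096, used_gb=1200),
        ]

    def report(threshold: float = 0.85) -> None:
        for status in poll_capacity():
            flag = "NEEDS ATTENTION" if status.utilization >= threshold else "ok"
            print(f"{status.name}: {status.utilization:.0%} used ({flag})")

    if __name__ == "__main__":
        report()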

Centralized management

When storage management becomes centralized, users can benefit from increased data accessibility. Centralization significantly reduces the complexity of the task, allowing fewer people to manage more data. When storage usage is centrally monitored, it is also possible to reallocate storage remotely from the central console based on actual usage, rather than on predictive algorithms. Heavy users can be allocated appropriate storage capacity instead of being forced to rely on backup and archiving to make room for more data. Central control allows for a complete, storage-centric view of enterprise storage.
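
A minimal sketch of that idea, assuming hypothetical volume names, sizes, and thresholds: capacity is shifted from lightly used volumes to heavily used ones based on what monitoring has actually observed, the kind of decision an administrator could trigger from a central console.

    # Usage-driven reallocation plan: shrink a lightly used volume and grow a
    # heavily used one. Names, sizes, and thresholds are illustrative only.
    def plan_reallocation(volumes, low_water=0.30, high_water=0.90, chunk_gb=50):
        """Return a list of (from_volume, to_volume, gb) moves."""
        donors = [v for v in volumes if v["used_gb"] / v["size_gb"] < low_water]
        takers = [v for v in volumes if v["used_gb"] / v["size_gb"] > high_water]
        moves = []
        for taker in takers:
            for donor in donors:
                spare = donor["size_gb"] - donor["used_gb"]
                if spare > chunk_gb:
                    donor["size_gb"] -= chunk_gb   # shrink the donor volume
                    taker["size_gb"] += chunk_gb   # grow the busy volume
                    moves.append((donor["name"], taker["name"], chunk_gb))
                    break
        return moves

    volumes = [
        {"name": "finance", "size_gb": 500, "used_gb": 480},
        {"name": "archive", "size_gb": 800, "used_gb": 120},
    ]
    print(plan_reallocation(volumes))   # [('archive', 'finance', 50)]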

EarthLink, an Internet service provider, managed its explosive growth (from zero to three million members in five years) in part by centralizing its data-storage management. As EarthLink rapidly added servers and storage devices to keep pace with users' demands for service, the company found itself trying to manage many isolated storage islands and many single points of failure. The EarthLink LAN Services Group decided to move from a server-centric model to a storage-centric model and implemented a new, more-intelligent storage architecture that could be monitored and controlled from a single console. One of the greatest short-term benefits of the change, according to Greg Friedman, director of EarthLink's LAN services, was the ability to add or remove LAN servers without downtime. He reports that a process that used to take several hours has been reduced to about 15 minutes.

E-commerce over the Internet continues to be a major driving force behind the demand for 24x7 data availability. It is no longer acceptable to take servers offline for backup, maintenance, or changes. According to the META Group, companies that are the most dependent on automated systems (e.g., banking and telecommunications) accrue an average of nearly $3 million in losses for every hour of downtime. Businesses like hospitality or travel, less dependent on IT infrastructure, suffer revenue losses of $330,000 to $636,000 per hour of downtime.
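
The arithmetic behind such figures is straightforward: annual downtime cost is hours of downtime per year multiplied by the cost of an hour of downtime. The availability levels in the short sketch below are illustrative assumptions, not figures from the META Group.

    # Back-of-the-envelope downtime cost: hours down per year x cost per hour.
    HOURS_PER_YEAR = 365 * 24  # 8,760

    def annual_downtime_cost(availability: float, cost_per_hour: float) -> float:
        downtime_hours = HOURS_PER_YEAR * (1.0 - availability)
        return downtime_hours * cost_per_hour

    for availability in (0.999, 0.9999):
        cost = annual_downtime_cost(availability, cost_per_hour=3_000_000)
        print(f"{availability:.2%} availability -> ${cost:,.0f} per year")
    # 99.90% availability -> $26,280,000 per year
    # 99.99% availability -> $2,628,000 per year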

Snapshots and replication

Businesses that operate around the clock need to be able to capture "snapshots" of data at any given moment in time. The organization's management can process or analyze these snapshots, or "clones" (mirrors), without bringing down the network for data backup, because backup is done from the clone instead of from the primary volume. This process is best accomplished in a SAN with centralized management tools that can be used to create the clones, using agents to retrieve point-in-time data.
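
The sketch below captures the shape of that clone-then-backup flow. The SnapshotManager class and its methods are hypothetical stand-ins for whatever point-in-time copy facility a particular SAN vendor provides; the point is only that the backup reads from the clone, never from the primary volume, so the application stays online.

    # Conceptual clone-then-backup flow; real products clone at the block level.
    import shutil
    import time
    from pathlib import Path

    class SnapshotManager:
        def __init__(self, primary: Path, snapshot_dir: Path):
            self.primary = primary
            self.snapshot_dir = snapshot_dir

        def create_clone(self) -> Path:
            # Freeze a point-in-time image of the primary data.
            clone = self.snapshot_dir / f"clone-{int(time.time())}"
            shutil.copytree(self.primary, clone)
            return clone

        def backup_from_clone(self, clone: Path, archive: Path) -> Path:
            # Backup reads the clone, so the primary volume keeps serving I/O.
            return Path(shutil.make_archive(str(archive), "gztar", root_dir=clone))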

The South Financial Group, which offers mortgage, brokerage, and investment services, is an example of a business where data cloning is essential, not only because of business requirements, but also because the weather demands it. It operates 108 branch offices in some of the most hurricane-vulnerable states: North and South Carolina and Florida. Hart Raley, vice president of client services for the company, explains that data replication and data protection were a top priority: "With Fibre Channel technology, we designed a completely redundant infrastructure to improve performance, protect against system malfunction, and optimize our backup capabilities."

With data-replication software at the heart of a SAN, the South Financial Group implemented a cloning and snapshot scheme that minimizes downtime during backup and data migrations, while fully protecting the integrity and timeliness of the data. The firm tested its data-protection scheme by placing a sample database in Lexington, SC, replicating the database to Greenville, NC, breaking the connection, and running the application off the Greenville site. "All data was up-to-date and correct," reports Raley.

Storage virtualization

Another key to managing the increasingly complex storage environment is more-efficient use of storage. Storage hardware already claims at least half of the IT hardware budget. Just as it makes sense to have fewer people managing more data storage, it makes sense to run fewer systems that accommodate more data. Today, adding disks to deliver more storage to a server means manually reallocating that storage, which is time-consuming and labor-intensive and limits data accessibility while the work is under way. Automating the process is less disruptive, makes better use of available capacity, and reduces both the time and the staff required.

The storage industry is currently optimizing storage utilization through a process called virtualization. This concept separates the representation of data-storage capacity from the physical devices, allowing a pooled view of total storage capacity rather than a device-by-device view. Automated management and programming tools simplify storage management and allocation, making the process transparent and dynamic. Virtualization increases storage utilization and availability while significantly reducing the space required for storage (some estimates put the space gains at 40% or more).
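
The pooling idea can be sketched in a few lines: several physical devices are presented as one capacity pool, and virtual volumes are carved out of whichever devices have free space. The device names and sizes below are illustrative only, not any vendor's actual interface.

    # Minimal storage-pool sketch: one pooled capacity figure, with allocation
    # spread across physical devices transparently.
    class StoragePool:
        def __init__(self, devices: dict[str, int]):
            self.free = dict(devices)   # device name -> free capacity in GB

        @property
        def total_free_gb(self) -> int:
            # The administrator sees one pooled number, not a per-device view.
            return sum(self.free.values())

        def allocate(self, volume_name: str, size_gb: int) -> dict[str, int]:
            """Spread a virtual volume across devices with free capacity."""
            if size_gb > self.total_free_gb:
                raise ValueError("pool exhausted")
            layout, remaining = {}, size_gb
            for device, free in self.free.items():
                take = min(free, remaining)
                if take:
                    layout[device] = take
                    self.free[device] -= take
                    remaining -= take
                if remaining == 0:
                    break
            return layout   # which devices back the virtual volume, and by how much

    pool = StoragePool({"array-a": 300, "array-b": 500})
    print(pool.allocate("crm-data", 450))   # {'array-a': 300, 'array-b': 150}
    print(pool.total_free_gb)               # 350 GB left in the pool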

That being said, virtualization has not yet been fully realized, although about a dozen vendors have delivered some of the required tools. In the near future, IT managers will be able to deploy virtualization to manage data storage at the highest strategic level.

In the longer term, tools and technologies will evolve to make the management of SANs increasingly transparent and easy. For instance, some vendors envision self-healing systems that detect and repair data-pathway failures before human technicians even become aware of a problem. The tremendous pressures imposed by the globalization of business will drive the industry toward technologies that come as close as possible to the goal of delivering data to users 24x7.


Gary Wright is director of enterprise storage marketing at Compaq Computer (www.compaq.com) in Colorado Springs, CO.

