Analysts share 2002-03 storage perspectives

Posted on February 01, 2003


By Heidi Biggar

If one thing stands out from a recent InfoStor survey, it's the general lack of consensus among industry analysts and consultants over the relative importance of end-user storage concerns and the priority end users should assign to various storage technologies.

The fact that consultants are so widely divergent over these issues is not surprising. The breadth of storage options available to end users, the increasing complexity of users' storage environments, and a tough economic climate have made it difficult—if not impossible—for analysts to make blanket recommendations about storage issues and technologies. Still, their cumulative feedback provides insight about end users' storage requirements.


InfoStor polled analysts and consultants from the Aberdeen Group, Cambridge Computer, The Data Mobility Group (DMG), The Enterprise Storage Group (ESG), The Evaluator Group, GlassHouse Technologies, the Storage Consulting Group, and The Yankee Group. All questions were related to storage issues and trends in 2002 and forecasts for 2003.

Poll respondents were first asked to rank six storage issues in terms of their relative importance among end users (see table). Three of the eight consultants said "ensuring application availability" was the leading end-user concern, while five ranked "having adequate capacity/bandwidth" as least important.

Both Cambridge Computer and The Evaluator Group ranked "having adequate capacity/bandwidth" and "ensuring application availability" as the top two end-user storage issues. Cambridge Computer put capacity/bandwidth (with the emphasis on capacity) above application availability. The Evaluator Group, instead, ranked application availability above capacity/bandwidth, but noted that the two were generally closely related and typically perceived to be the same issue among end users.

"Application availability is the first order of business [without which the other issues don't matter], but the reality is that problems with application availability are generally tied to issues with capacity," says Randy Kerns, a partner at The Evaluator Group.

According to Jacob Farmer, chief technology officer at Cambridge Computer, while application availability is very important for certain applications like e-mail, capacity is more important. As for bandwidth, Farmer says that, although it's important, it's a separate issue from capacity and one that often can be sacrificed for capacity in an either/or situation.

At the other end of the spectrum, some firms said that having adequate capacity/bandwidth was not nearly as important as application availability, data recovery, data backup, and cost issues. The point here isn't that capacity/bandwidth is a non-issue—after all, capacity demands show no signs of abating—but that, compared to other storage concerns, it's more routinely addressed and easily handled. "Having enough capacity/bandwidth is an obvious one, but without it, a company's survival is in question," says Steve Kenniston, an analyst with the Enterprise Storage Group.

Interestingly, seven out of eight consultants ranked "backing up data" and "data recovery" higher than "having adequate capacity/bandwidth." This finding appears to drive home the point that it is the process of ensuring that data is properly backed up and recoverable that poses the greater challenge to IT administrators, not the task of acquiring and provisioning storage capacity to business applications. However, that does not necessarily mean that backup and recovery is more important than application availability.

Half of the respondents gave disaster recovery—the ability to restore data in the event of a failure—a rating of "3" or lower in terms of its priority among end users, citing the economy, cost, corporate visibility (some storage concerns are simply more visible to upper management), and the lessening impact of September 11 as primary influencers.

Jamie Gruener, a senior analyst with The Yankee Group, said that while disaster recovery is still a strong issue among end users, it isn't as strong as it was a year ago. The reason, explains Gruener, is that the companies that could implement new disaster-recovery procedures did so last year, and those that didn't won't be in a position to do so over the next 12 months.

And the majority of analysts and consultants appeared to agree on that point. Five out of eight said that the sagging economy has caused a fundamental shift in the way IT administrators make storage decisions, causing them to rethink storage purchases and implementation plans.

"A year ago, it was all about cost-cutting—in particular, reducing headcount," explains Richard Scannell, vice president of corporate development and strategy at GlassHouse Technologies. "Today, there is a feeling that companies have to start looking beyond the 'low-hanging fruit' and get to the real causes of high costs, and that is operational efficiency."

"We can't put the genie back in the bottle," says ESG's Steve Kenniston. "End users now recognize that they have to be smarter about what they buy, how they use it, and certainly how they manage it."

And it is this realization that has led some end users to prioritize capacity and application availability issues over backup, explains Kerns. It's all about making decisions that provide the maximum economic gain.

Scannell says that end users who improve their backup processes will not only see an improvement in their ability to restore data, but also lower overall storage costs. The two—"lowering storage-related costs" and "improving the backup process"—are not unrelated, he explains. In fact, in typical storage environments backup can account for a very large portion of storage costs on a total-cost-of-ownership (TCO) basis, he says.

Putting further pressure on IT administrators is a growing list of regulations from external agencies (e.g., the Securities and Exchange Commission) that dictate the type of data an organization must store, on what type of media it must be stored, and for how long. These types of regulations not only affect the type of storage devices end users implement and where they implement them, but also the priority assigned to certain storage applications (e.g., backup and recovery, SAN security, etc.) going forward.

What to implement, what not to

In addition to prioritizing storage issues, InfoStor asked survey respondents to rank 10 storage technologies, of varying maturity, in terms of their importance to end users (see table).

The consultants ranked the following technologies on a scale from 1 to 5: virtualization, Fibre Channel SANs, IP SANs, standards-based management, disk-based backup, SAN extension, SAN security, SAN-NAS convergence, policy-based management, and storage resource management (SRM). A "5" indicated highest priority, a "1" lowest priority.

The results, though largely divergent, were interesting. Not surprisingly, the majority of respondents said that Fibre Channel SANs should be given the highest priority. This finding reflects not only the growing trend among end users to move storage resources out from behind the server and onto a dedicated network, but also the relative maturity of Fibre Channel compared to the other storage technologies on the list.

"There's a lot of pent-up demand for SANs, and I think we'll see a new wave of deployments in the second half of 2003 as Cisco's presence in the market increases," says The Yankee Group's Gruener.

Last year, Cisco entered the storage networking market with the announcement of its planned acquisition of multi-protocol switch/director vendor Andiamo. Last month, the company announced a partnership with IBM, giving IBM free rein to resell Cisco's MDS switches and directors (see "IBM, HP first to resell Cisco storage switches," p. 8). With IBM's backing, Cisco hopes to capture a significant share of the switch/director market and to drive market adoption of SANs—both IP and Fibre Channel.


"Cisco has a large customer base of networking users who have made Cisco switching architectures a standard...[and that] may have been waiting on the fence to jump into storage networking," says John Webster, senior analyst and founder of the Data Mobility Group.

If Cisco customers take the leap, then Cisco may have a competitive advantage over rivals Brocade, McData, and Inrange and could steer the future course of SAN adoption, but for now it's a roll of the dice, says Webster.

Despite Cisco's looming presence in the market, survey respondents were much less bullish about IP SANs in the coming year than they were about Fibre Channel SANs.

With the exception of the Enterprise Storage Group, which ranked both IP SANs and Fibre Channel SANs as a high priority for IT organizations, and Cambridge Computer, which gave IP SANs a higher implementation priority than Fibre Channel SANs, survey respondents said implementing an IP SAN just wasn't a top priority, at least not yet.

"End users who already have Fibre Channel SANs will add to them, and those that don't will look to iSCSI to fill the void," says ESG's Kenniston. Cambridge Computer also recommends both technologies: Fibre Channel SANs because they lower the costs of raw storage and IP SANs because they solve the problem at the right price.

Cambridge Computer's Farmer believes the timing is right for IP SANs. "Many iSCSI solutions will cost less up-front than direct-attached storage and have much more favorable TCO and ROI," he says.

The other survey respondents gave IP SANs a ranking of "3" or below. The Aberdeen Group gave it a "3," but said that 2003 would likely be a beachhead year for the technology. The Yankee Group, which also ranked IP SANs a "3," said it considered the technology to be in the "early adopter" stage and recommended implementing it on a pilot-testing basis, by technology-savvy administrators.

Likewise, the Data Mobility Group, which said that IP SANs weren't yet ready for prime time, recommended a "test-and-try" approach to the technology. The Evaluator Group gave IP SANs a "1," saying that hype had clouded the reality of IP SANs.

As for other 2002 hot topics such as virtualization and disk-based backup, results were just as mixed. While consultants didn't agree on the exact priority virtualization deserved (ranking it as both most and least important), they did agree that the term was widely abused and that, despite the hype, the technology will be a key enabler going forward.

"Virtualization will be a core part of infrastructure and device management [and will be embedded] in intelligent storage arrays and networking gear," says Yankee's Gruener.

The Evaluator Group's Kerns says that virtualization "is needed to enable the maximum economic gains for storage management [policy-based or otherwise]."

"End users want to manage 2x the data but with no more bodies," explains ESG's Kenniston. "To make that happen, they will need to virtualize things."

Aberdeen Group analyst Dan Tanner said virtualization is badly in need of standardization. As for ongoing debates that pit in-band technologies against out-of-band approaches, he said "they're stupid."

Disk-based backup also got mixed feedback. Four firms said that disk-to-disk backup should be a high priority for end users and that it should be used as an alternative to, or in some cases a replacement for, slower and less-efficient tape solutions.

The Enterprise Storage Group, meanwhile, gave disk-based backup a "3," saying that it is something that end users will want, but that the market was still relatively immature—a point that GlassHouse agreed with. The Storage Consulting Group gave disk-based backup a "1," noting that the technology was "irrelevant," not because disk-based backup is inherently good or bad, but because the concept of "backup" is antiquated.

"Business continuity, not backup, is what drives IT today," says Richard Lee, founder of the Storage Consulting Group. "Synchronous and asynchronous mirroring/copying is what provides availability and resiliency, not backup to tape or disk. Backup is a 50-year-old technology. It's time to move on."

As for the remaining technologies, 50% of survey respondents said SAN-NAS convergence was a moderately high priority, if only because it would forever rid end users of having to choose between the two.

Standards-based management and policy-based management software each received three "high" priority votes, emphasizing the need for better and easier management.

The Yankee Group, for example, said policy-based management was a "must-have," as did the Data Mobility Group, because of its immediate ROI benefits. The Evaluator Group noted that, while standards are important, end users should only care that vendors supply solutions that work and will be supported for a long time.

Lastly, SRM, SAN extension, and SAN security technologies all received varying ratings, from most important to least important. More than one survey respondent downgraded the importance of SRM products. The Data Mobility Group, for example, said the acronym had become meaningless, while the Enterprise Storage Group noted that SRM features would likely be integrated into other management software suites.

Consultants appeared to agree that in the right environments, SAN security and SAN extension technologies were an asset, but most agreed that the SAN security market was too immature to determine a real value for this type of product.

In general, survey respondents said they expected SAN extension products to play a key role in IT organizations' business continuity efforts. Specific market targets include mid-sized and large companies looking to manage multiple SANs or consolidate backup and management.

