File virtualization combats unstructured data growth

Posted on December 01, 2007


By Kevin Komiega

Managing storage growth is difficult on the whole, but unstructured, file-based data in particular is beginning to tax the management capabilities of current storage technologies, creating a need for more advanced ways to manage and move information in the data center.

New research from the Taneja Group consulting firm indicates that the growth of file-based data is reaching critical mass. According to the report, “Next Generation File Management and Controls Market Overview,” 73% of users surveyed said 60% or more of their data is unstructured; 53% indicated that they had 11TB or more of unstructured data in their environment; and 62% said that data is growing at a rate of 16% to 75% per year.

“The operational pain points around unstructured data are getting to a boiling point. The rapid growth of all of that data is making it difficult to manage and back up,” says Steve Norall, author of the report and a senior analyst and consultant at the Taneja Group.


The Taneja Group’s research is based on a survey of 238 IT users across a range of industries and was designed in part to determine the current state of unstructured data in their environments. The survey also queried users about how they plan to manage file data going forward. The results show that file virtualization technologies could be the key to managing the sprawl of unstructured data, and more and more users are evaluating or implementing file virtualization technologies as the market matures.

Streamlining storage management

As manager of CORE Systems for Interwoven, a content management solutions provider, Raymond Lockley manages all of the storage, servers, and connectivity that support Interwoven’s core business applications.

“The way we manage storage is less of a science than I would like it to be. We have pieces of storage all over the place and don’t have a great sense of how much we really have. That’s what we’re hoping file virtualization will help us with,” says Lockley.

About two years ago, Lockley and his team began a migration project, moving data from older Network Appliance filers to newer systems with upgraded drives and management capabilities. The migration presented some interesting challenges.

“There was no way for us to move the data from the old filers to the new systems without messing up a whole lot of people. We started trying to find a way to create a virtual space to complete the migration in a seamless way without having to take down the storage for all of our users,” says Lockley.

After considering a number of products, Interwoven began evaluating Attune Systems’ Maestro file virtualization software. “We were one of their first beta test sites. We are a conservative company, technologically speaking, and we roll things out in phases.

“The file virtualization solution had to be seamless from an end-user perspective and could not interfere with our environment or performance, and it had to be easy to administer,” says Lockley.

Attune’s Maestro File Manager allows users to migrate data, retire or add servers, consolidate data, implement storage tiering, and balance capacity and performance across Windows-based file servers and NAS devices.
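The age-based tiering such products automate can be sketched in a few lines. The sketch below is hypothetical and generic — the `tier_down` helper, directory layout, and 90-day threshold are illustrative assumptions, not Attune's actual mechanism:

```python
# Hedged sketch of policy-driven storage tiering: files not accessed
# within an age limit are moved to a cheaper tier. In a real file
# virtualization layer, clients would keep seeing the original paths.
import os
import shutil
import time

def tier_down(primary_dir, archive_dir, age_limit_days=90):
    """Move files not accessed within age_limit_days to the archive tier."""
    cutoff = time.time() - age_limit_days * 86400
    moved = []
    for name in os.listdir(primary_dir):
        src = os.path.join(primary_dir, name)
        # Only plain files whose last-access time is older than the cutoff
        if os.path.isfile(src) and os.path.getatime(src) < cutoff:
            shutil.move(src, os.path.join(archive_dir, name))
            moved.append(name)
    return moved
```

A real product would run such policies continuously and transparently; the point here is only that "tiering" reduces to scanning metadata and relocating data behind an unchanged namespace.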

According to Lockley, there is a distinct need for file virtualization in most corporate storage environments. “You have a complex hierarchy of storage, and it would be impossible to give every department its own file server. The more you try to do that, the more of a management nightmare it becomes and the more the cost escalates,” he says.


Interwoven is currently in the process of using the information gathered in Maestro to improve storage utilization and streamline its internal storage management processes. Looking to the future, Interwoven plans to expand its use of file virtualization to additional data centers and implement global namespace technology across its infrastructure.


“The ability to create a global namespace in a rational, virtual way is going to be critical to our future growth,” says Lockley. He also advises his peers to start looking at file virtualization as soon as possible: “With the amount of money you can save on storage (anywhere from a 3x to 10x return on investment), it would be foolish not to do it.”

Higher performance/consolidation

Another company looking to stem the tide of file growth with virtualization is Northwest Geomatics, a provider of geospatial data. The company flies mapping aircraft on a regular basis, and the planes collect and send upwards of 5TB of data per day back to the company’s data center.

“Once the data hits here there has to be a storage infrastructure to house that data and one that allows us to process it into final products. The processing is quite intensive, and the data piles up,” says John Welter, vice president of technology at Northwest Geomatics.

Another aspect of the company’s business is electronic delivery and e-commerce services. “Some customers may not want to manage the data we provide internally so we have set up an infrastructure to house that data and serve it back to them. It’s all managed and indexed. Basically, we’re a storage solutions provider for the geospatial industry,” Welter explains.

As such, Northwest Geomatics has seen its storage needs explode, growing from about 20TB in 2002 to almost 460TB today. To accommodate the growth, the company migrated all of its storage behind a Pantera clustered NAS system from ONStor and is now layering the cluster with file virtualization technology from F5 Networks (which bought file virtualization specialist Acopia Networks).

F5’s Acopia virtualization technology decouples file access from physical file location, enabling non-disruptive data migration. The company’s ARX products integrate into existing NAS, Windows, Unix, and Linux environments to automate management tasks and data movement.
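The decoupling described here comes down to a mapping layer between client-visible paths and physical locations: migrate the data out-of-band, update the map, and clients never notice. A minimal sketch, with entirely hypothetical names (this is not F5/Acopia code):

```python
# Core file-virtualization idea: clients address logical paths; a mapping
# layer resolves them to physical shares. Remapping after an out-of-band
# copy is what makes data migration non-disruptive.

class VirtualNamespace:
    def __init__(self):
        self.mounts = {}  # logical path prefix -> physical share

    def add_mount(self, logical_prefix, physical_share):
        self.mounts[logical_prefix] = physical_share

    def resolve(self, logical_path):
        """Translate a client-visible path to its current physical location."""
        # Longest-prefix match, so nested mounts win over their parents
        for prefix in sorted(self.mounts, key=len, reverse=True):
            if logical_path.startswith(prefix):
                return self.mounts[prefix] + logical_path[len(prefix):]
        raise FileNotFoundError(logical_path)

    def migrate(self, logical_prefix, new_physical_share):
        # Data is assumed copied separately; the remap alone is what
        # clients see -- which is to say, nothing changes for them.
        self.mounts[logical_prefix] = new_physical_share


ns = VirtualNamespace()
ns.add_mount("/projects", "//filer-old/vol1")
ns.resolve("/projects/maps/tile01.dat")  # -> //filer-old/vol1/maps/tile01.dat

ns.migrate("/projects", "//filer-new/vol7")
ns.resolve("/projects/maps/tile01.dat")  # -> //filer-new/vol7/maps/tile01.dat
```

The client path `/projects/maps/tile01.dat` is identical before and after the migration, which is exactly the property that let sites like Interwoven and Northwest Geomatics move data without downtime.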

“The amount of data we’re handling on a day-to-day basis is growing rapidly and managing it is a nightmare. We’re using file-level virtualization to make it all look like one volume,” says Welter.

Welter hopes to gain two primary benefits with the Acopia technology: higher performance and consolidation. “With Acopia we get rid of all [the management issues] with a single pool of storage where all of our projects are living at one time. That helps us maximize the utilization of our storage space,” says Welter.

The single most important consideration during Northwest Geomatics’ file virtualization rollout was the switchover to the new technology.

“The obvious question to consider is what happens if it doesn’t work? The Acopia product is a box that sits in front of all of our NAS systems, and when you’re looking at that much data undergoing a migration, we probably wouldn’t go ahead with the project if the shift were an issue,” says Welter.

Welter is also concerned with throughput and latency, but is confident that the Acopia product will handle the volume of data required.

“They have a road map that should be able to deliver the performance we need two years from now. The latency is low, and there’s not a lot of overhead,” he says.

Welter’s next project is going to require 1PB of storage. He’s planning to rely heavily on file virtualization to get the job done. “Acopia is going to play a big role in that project. I couldn’t comprehend managing a project like that without some sort of intelligence in front of the storage,” says Welter.


Unstructured data growth does not discriminate. Organizations large and small are fighting the same uphill battle to regain control of file storage.

Consolidation is key

Charles Collins, network administrator, leads the infrastructure department for Camden Property Trust, a real estate investment trust that develops and manages apartments across the lower half of the US.

Collins deals with a mixture of file types and applications, including SQL Server databases, Exchange Server data, and various other forms of unstructured data. He is responsible for storage and backups for the company.

“We’re spread out across the Sun Belt with several regional offices across different cities. We needed to figure out a way to get the backup equipment out of those offices and consolidate it back at the corporate data center,” says Collins. “That’s what made us start looking at file virtualization technology.”

Given that Camden Property Trust is an all-Microsoft shop, its file virtualization product had to play well with Windows. Collins opted for Brocade’s StorageX product for file virtualization and remote-site file replication.

“Our Exchange environment is growing by leaps and bounds. The flat files are also increasing exponentially,” says Collins. “We needed a file virtualization and replication product that was the right fit and would not require a huge learning curve to implement and manage.”

Brocade’s StorageX is a suite of applications designed to logically aggregate distributed file data across heterogeneous environments and across CIFS- and NFS-based file systems while providing policies to automate data-management functions. The StorageX Global Namespace unifies heterogeneous file data by pooling multiple file systems into a single, logical file system.
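The pooling idea behind a global namespace can be illustrated with a small union-directory sketch. Everything below — the class, share names, and file lists — is a hypothetical illustration of the concept, not Brocade's actual design:

```python
# Hedged sketch of a global namespace: several physical shares are pooled
# so clients see one merged logical directory, and the namespace layer
# remembers which backend actually holds each file.

class GlobalNamespace:
    def __init__(self, backends):
        # backends: dict mapping physical share -> list of its files
        self.backends = backends

    def listdir(self):
        """Present the union of all backend listings as one logical view."""
        seen = {}
        for share, files in self.backends.items():
            for f in files:
                seen.setdefault(f, share)  # first share wins on collisions
        return sorted(seen)

    def locate(self, filename):
        """Find which physical share serves a logically named file."""
        for share, files in self.backends.items():
            if filename in files:
                return share
        return None


ns = GlobalNamespace({
    "//dallas-fs1/users": ["alice.doc", "budget.xls"],
    "//houston-fs2/users": ["bob.doc", "budget.xls"],
})
ns.listdir()          # one merged view across both shares
ns.locate("bob.doc")  # -> //houston-fs2/users
```

In this toy version, consolidating regional offices means only editing the `backends` map; users browsing the merged listing are unaffected, which is the property Collins is relying on.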

Camden Property Trust is consolidating the IT infrastructure at its seven regional office locations down to two or three. “StorageX put us in a position to do that,” says Collins.

The Taneja Group’s Norall says there is more than one way to tackle unstructured data growth. In addition to file virtualization, WAN optimization, wide area file services (WAFS), clustered storage, distributed file systems, and network file management solutions are all viable technology options.


