Storage vendors 'hype' virtues of virtualization


The relatively slow adoption rate of storage area networks (SANs) among end users does not appear to have affected the feverish pace of virtualization announcements from storage vendors this year. Already one of this year's buzzwords, virtualization is expected to transition from the realm of hype to a real-world "must-have" over the next six months or so.

"Users want virtualization but, unfortunately, most of them don't know it yet," says Lisa Beldean, technical director at ITIS Services, a consulting and integration firm in South Norwalk, CT. She says users are SAN-aware, but that they haven't made the virtualization leap for several reasons.

"What scares them off [from virtualization] is that they have either locked themselves into a costly homogeneous SAN solution to ensure full support and management capabilities or they pieced together a SAN management nightmare," Beldean explains.

What they don't realize, she says, is that virtualization is "what a SAN is supposed to be: heterogeneous choices, leveraging existing investments, consolidating, and sharing resources in a simplified management environment."

"SANs only solve what is essentially a wiring problem," says John Webster, a senior analyst with Nashua, NH-based Illuminata, a research and consulting firm. "By adding virtualization software, the virtualization engine, the deep benefits of SANs can be realized."

Recently, DataCore and StorageTek made virtualization announcements, adding to the variety of virtualization toolsets available (see "SANs rely on storage virtualization," InfoStor, January 2001, p. 20).

DataCore extends virtualization

Last month, three-year-old start-up DataCore Software released the fourth generation of its SANsymphony software, further extending the company's technology lead in the nascent storage virtualization market, analysts say. SANsymphony simplifies data management by consolidating storage resources into a virtual network storage pool.

New to release 4.0 is Asynchronous Internet Mirroring (AIM), which essentially brings network storage pooling and remote replication services, common at the enterprise level, to departmental organizations.

Storage administrators now have a single way of replicating data to off-site locations over existing Fibre Channel or IP local-, metropolitan-, or wide area networks, explains Augie Gonzalez, director of product marketing at DataCore. The most notable benefits, says Gonzalez, are more affordable disaster recovery and simplified storage administration.

DataCore has also added tertiary support to its synchronous network mirroring capability, enabling storage administrators to send mirrored data to two locations simultaneously. Destinations can be changed dynamically via a drag-and-drop graphical user interface.

Responding to user demand for increased flexibility and fault-tolerance, DataCore has also established alternate path support for mixed-operating environments (Solaris and Windows NT/2000), dual-port array extension, enhanced Microsoft Cluster support, and on-the-fly RAID protection.

As for alternative virtualization products, DataCore sees Compaq's VersaStor and Veritas Software's SANPoint as direct competitors, although the company claims that neither enhances I/O performance. Gonzalez also claims SANsymphony is inherently less expensive and easier to manage than a comparably configured VersaStor implementation. "There are [simply] fewer nodes to manage," he says. SANsymphony starts at $30,000 per configured virtualization node, plus the cost of requisite host bus adapters and/or switches.

"It is also here and now," adds Dan Tanner, senior analyst, storage and storage management, at the Aberdeen Group, a Boston-based research and consulting firm. "And equally important, DataCore is proving that in-band is not necessarily slower and less scalable than out-of-band techniques."

Proponents of out-of-band virtualization techniques cite problems with performance and difficulties with scalability as potentially significant stumbling blocks for in-band approaches, especially in larger, more complex SAN environments. In-band models pass data and control information along the same, rather than separate, paths (see "Storage virtualization: What, how, and why," InfoStor, March 2001, p. 58).

DataCore discounts such statements, claiming that its SANsymphony software actually improves performance, via caching, in many implementations and scales to enterprise levels. Advanced caching, says Gonzalez, boosts I/O performance to back-end devices without significantly driving up system costs. "It minimizes the need to access the back-end storage devices and allows the SANsymphony appliance to satisfy most I/O requests almost immediately from local cache," he adds.
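The in-band/out-of-band distinction, and the caching argument Gonzalez makes, can be sketched in a few lines of Python. This is a toy illustration with invented class and method names, not DataCore's or any vendor's implementation: an in-band appliance sits in the data path and sees every I/O, so it can serve repeat reads from its own cache, while an out-of-band metadata server only hands out logical-to-physical mappings and never touches the data itself.

```python
class BackEndDevice:
    """Stand-in for a physical disk array."""
    def __init__(self):
        self.blocks = {}
        self.io_count = 0                       # tracks back-end I/O operations

    def read(self, addr):
        self.io_count += 1
        return self.blocks.get(addr, b"\x00")

    def write(self, addr, data):
        self.io_count += 1
        self.blocks[addr] = data


class InBandAppliance:
    """In-band: data and control share one path, so the appliance can cache."""
    def __init__(self, device):
        self.device = device
        self.cache = {}

    def read(self, logical_addr):
        if logical_addr in self.cache:          # cache hit: back-end untouched
            return self.cache[logical_addr]
        data = self.device.read(logical_addr)
        self.cache[logical_addr] = data
        return data

    def write(self, logical_addr, data):
        self.cache[logical_addr] = data
        self.device.write(logical_addr, data)   # write-through for simplicity


class OutOfBandMetadataServer:
    """Out-of-band: control path only; hosts resolve mappings, then read/write
    the back-end devices directly, with no appliance in the data path."""
    def __init__(self, mapping):
        self.mapping = mapping                  # logical addr -> (device, physical addr)

    def resolve(self, logical_addr):
        return self.mapping[logical_addr]


dev = BackEndDevice()
appliance = InBandAppliance(dev)
appliance.write(7, b"payload")
appliance.read(7)                               # served from the appliance cache
print(dev.io_count)                             # 1: only the write reached the device
```

The out-of-band server avoids becoming a bottleneck precisely because data never flows through it, which is the scalability argument its proponents make; the in-band appliance, by contrast, can absorb repeat reads in cache, which is the performance argument DataCore makes.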

The next release of SANsymphony, planned for August, will feature IP support. The company also says it expects to announce several OEM agreements, on par with its Fujitsu Softek arrangement, over the next several months. Fujitsu Softek's recently announced storage virtualization strategy is based on DataCore's SANsymphony technology.

STK to enter market

StorageTek says it will begin shipping a disk version of its SAN virtualization appliance, the StorageNet 6000 (SN6000), this fall. The device, which is currently in beta testing, is designed to support mixed-storage platforms, although initial products will be certified in StorageTek 9176 environments only. Open-storage support is expected to follow later this year, though no timetable has been provided.

An early leader in the virtual storage arena with its Shared Virtual Array (SVA) and Virtual Storage Manager (VSM) technologies, which allow for virtualization at the device level, StorageTek has been comparatively slow in bringing higher-level storage-networking-specific virtualization products to market.

"They're very conservative in their approach, and I think for a very good reason," says Illuminata's Webster. "[Other players] in this camp have serious issues [e.g., security, availability, and data integrity] to overcome in the minds of users. Other SAN virtualization strategies tend to be difficult to implement and have limited scalability," he says.

Given the steady, but slow, SAN adoption rate, StorageTek says its StorageNet rollout has been on pace with user demand. "When you consider that the majority of customers still haven't implemented SANs, let alone intelligent managed/virtualized SANs, I think our timing is spot on market needs and expectations," says Robert Nieboer, senior manager, global industry analyst relations, at StorageTek.

The SN6000, or storage domain manager, is essentially a 64-port switch with dedicated processor capability and embedded software support (Virtual Transport Manager). The SN6000 not only simplifies the manageability of increasingly complex storage environments, but also consolidates storage resources, according to Nieboer.

"When we say that the SN6000 supports tape, we mean that it virtualizes the tape environment at the SAN level," Nieboer says. "It looks at whole pools of tape and, eventually, disk and then doles out those resources to individual servers and applications. VSM and SVA, on the other hand, which are device-level virtualization tools, are designed to optimize the capacity of the disks and tape cartridges themselves."

StorageTek says that the fundamental difference between its SN6000 strategy and other vendors' approaches to SAN virtualization lies in how and where the technology is implemented-that is, at the storage, host, or network level. By implementing virtualization at the network level, STK claims to avoid some of the performance and scalability shortcomings common with other implementations.

The StorageNet 6000 disk model will feature STK's Virtual Volume Manager software. A box that supports both disk and tape is also planned.


Lisa Beldean
ITIS Services

User tips: Virtualization checklist

According to Lisa Beldean, technical director at ITIS Services, a consulting and integration firm in South Norwalk, CT, there are several things users should consider when choosing virtualization tools. They include:

Performance-Some virtualization tools operate in the data path (in-band), and others do not (out-of-band). Beldean says that bottlenecks generally are not an issue for in-band products if they are sized properly, since data is typically cached or pure pass-through I/O, requiring little CPU involvement. The most important thing for performance is a lot of RAM and high-speed buses, she says.

Operating systems-The operating system an appliance runs on can be a factor, says Beldean. Veritas' ServPoint Appliance, for example, runs on Solaris and offers SAN and NAS capabilities from the same storage pool, as well as volume management, RAID, and replication tools.

Legacy investments-Customers' existing storage investments may sway the decision. For example, it may not make sense for a company to invest in an appliance if it has already invested heavily in some other type of tool. In those situations, a complementary visualization and management tool, such as Veritas' SANPoint Control, may be best, explains Beldean. "From a central console, it scans the storage network, maps all components, and links to other vendors' tools to manage the SAN components."

Similarly, some vendors' appliances allow legacy devices such as SSA and JBOD arrays to be included in the SAN storage pool without the use of routers. "Just think what you can do with the gigabytes of old storage that many users have sitting on their data-center floor," says Beldean. "It's not the best-performing disk technology, but it can be made available for HSM applications, additional mirrors for backups, and test and development use. The possibilities are endless."

Feature comparison-Beldean says it is important to look at feature sets of each vendor's products. Many virtualization tools, she says, offer data-copy functions, RAID, and integrated backup support. "Depending on a user's requirements, this can add significant cost savings over purchasing multiple individual tools."

If high availability and scalability are key concerns, users need to make sure the virtualization tools they buy currently support this, explains Beldean. "Different tools do it in different ways," she says.

This article was originally published on June 01, 2001