Virtual storage and virtual backup have been around for years. But it’s only in the past year or two that they have begun to catch up with the huge amount of virtualization that has swept across the server landscape over the past decade. As a sign of its growing maturity, software-defined storage (another term for storage virtualization) has opened the door to a new wave of hyper-convergence in which storage, server and networking architectures come together.

But with all these new toys at their disposal, what should storage managers and administrators be doing in the face of a rising tide of virtualization?

Keep It Simple, Storage

In one sense, virtualization adds another layer of complexity over the existing infrastructure. That’s fine if the end result is greater simplicity in terms of provisioning and automation.

“The number one thing is to keep the overall solution simple enough to manage and grow over time,” said Dave Demlow, Vice President of Product Management and Support at Scale Computing.

He gave the example of a virtual storage expansion that requires you to check hardware compatibility lists from multiple vendors, provision the storage in one vendor’s console, connect to it somewhere else, and then call multiple vendors for support or upgrades. If that kind of sprawl creeps in when you implement virtual storage, he said, you are making things far too hard on yourself and introducing complexity that will ultimately lead to issues and finger-pointing.

Evaluate Feasibility In Day-To-Day Operations

Simplicity is also the message trumpeted by VMware’s Director of Product Marketing for Storage and Availability, Alberto Farronato. Adding in virtualization and hyper-convergence should make the storage environment more flexible and easier to manage. But that may not always be the case. The only way to know is to evaluate each potential solution in your day-to-day operations.

“Moving to a hyper-converged architecture should not make your datacenter more complex than it already is by adding more new tools and interfaces to learn and manage,” said Farronato. “Choose a solution that provides the necessary level of integration with your hypervisor and other applications.”

Stay VM Centric To Manage Growth

Chris McCall, Senior Vice President of Marketing at NexGen Storage, advises anyone managing data in a virtual infrastructure to keep it VM-centric.

“You have to be VM-centric, because it’s the only way to scale your ability to manage growth,” said McCall.
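To see what a VM-centric view looks like in practice, here is a minimal sketch using pyVmomi, the open-source Python SDK for vSphere. It reports committed capacity per VM and per datastore rather than per LUN or volume; the vCenter address and credentials shown are placeholders, not anything from the vendors quoted here.

import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

def report_vm_storage(host, user, password):
    # For illustration only: skip certificate verification against a lab vCenter.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    si = SmartConnect(host=host, user=user, pwd=password, sslContext=ctx)
    try:
        content = si.RetrieveContent()
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.VirtualMachine], True)
        for vm in view.view:
            # perDatastoreUsage reports capacity per VM, which is the VM-centric view.
            for usage in (vm.storage.perDatastoreUsage if vm.storage else []):
                print("%s on %s: %.1f GB committed" % (
                    vm.name, usage.datastore.name, usage.committed / 1024.0 ** 3))
    finally:
        Disconnect(si)

report_vm_storage("vcenter.example.com", "administrator@vsphere.local", "password")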

Adopt VVOLs

VMware vSphere Virtual Volumes (VVOLs) is a new integration framework that enables VM-centric operations on external storage systems. It extends the control plane to external storage through vSphere Storage Policy-Based Management (SPBM), which lets you achieve tight alignment between managing applications and the storage that they run on.

Scott Davis, CTO of virtual storage performance vendor Infinio, pointed to VVOLs as a technology well worth adopting for virtualized storage. VVOLs, he said, enable mainstream arrays to provide granular data services, such as backup and snapshots, on a per-application basis.

“Combining the emergence of VVOLs with the trend of trading in traditional batch backups in favor of snapshot-based data protection is a game-changer,” said Davis. “The capabilities that snapshot-based data protection provides (such as having the application and data in their native format, being able to quickly recover a particular point in time and having much shorter backup windows) become accessible to individual applications.”

McCall concurred. “Every VMware administrator should be paying close attention to VMware VVOLs, as it will be the single biggest way to reduce time spent dealing with low-value management and integration issues associated with virtual storage and backup,” he said.
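For readers who want to experiment with the snapshot-based protection Davis describes, the following is a rough sketch using pyVmomi. It takes a quiesced snapshot of a single VM; the vCenter address, credentials and VM name are placeholders, and a real backup product would also copy the snapshot data off the array.

import ssl
from datetime import datetime
from pyVim.connect import SmartConnect, Disconnect
from pyVim.task import WaitForTask
from pyVmomi import vim

def snapshot_vm(host, user, password, vm_name):
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    si = SmartConnect(host=host, user=user, pwd=password, sslContext=ctx)
    try:
        content = si.RetrieveContent()
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.VirtualMachine], True)
        vm = next(v for v in view.view if v.name == vm_name)
        # quiesce=True asks VMware Tools to flush the guest file system so the
        # snapshot captures the application and its data in a consistent state.
        task = vm.CreateSnapshot_Task(
            name="protect-" + datetime.now().strftime("%Y%m%d-%H%M%S"),
            description="Point-in-time copy for recovery testing",
            memory=False,
            quiesce=True)
        WaitForTask(task)
    finally:
        Disconnect(si)

snapshot_vm("vcenter.example.com", "administrator@vsphere.local", "password", "app-vm-01")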

Take Your Time

Yes, there is abundant potential in virtualizing storage. But only fools rush in, as they say. Davis advised users to proceed cautiously.

“There is a lot of hype and many vendors are motivated to sell you the maximum amount of new, replacement equipment,” he said. “It’s not necessary to make a complete shift to these new technologies immediately; the best solutions are the ones that enable you to leverage and retain your existing investments while gaining new capabilities that tangibly benefit your business.”

Get It Right The First Time

Regardless of which technology or approach is selected, Ian McChord, product director of backup and DR vendor Datto, strongly urged enterprises to do their homework when determining which virtualization platform to run their storage environments on. If you get it wrong and have to change, the switch can add dramatically to overall costs.

“Implementing virtual solutions isn’t too difficult, but switching them is laborious, so make sure you think about your needs now and in the future before pulling the trigger on a solution,” said McChord.

Don’t Ignore Overhead

Farronato also cautioned users, this time with regard to hyper-converged storage technologies that are often implemented as a virtual storage appliance, either one per cluster or one per host. His point is that adding appliances can sometimes allow overhead to creep in, which inhibits the gains available from storage virtualization.

“This introduces its own installation and management overhead, in addition to the fact that these storage appliances are likely to consume CPU and memory resources,” said Farronato. “This means less compute resources available to the actual workloads.”
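One way to put a number on that overhead is to total up the resources configured for the controller VMs on each host. The sketch below does this with pyVmomi, assuming, purely for illustration, that the appliance VMs share a “vsa-” naming prefix; substitute whatever naming or tagging convention identifies them in your environment.

import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

APPLIANCE_PREFIX = "vsa-"  # hypothetical naming convention for storage appliance VMs

def appliance_overhead(host, user, password):
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    si = SmartConnect(host=host, user=user, pwd=password, sslContext=ctx)
    try:
        content = si.RetrieveContent()
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.HostSystem], True)
        for esx in view.view:
            # Sum the vCPUs and memory configured for the controller VMs on each host.
            appliances = [v for v in esx.vm
                          if v.config and v.name.startswith(APPLIANCE_PREFIX)]
            vcpus = sum(v.config.hardware.numCPU for v in appliances)
            mem_gb = sum(v.config.hardware.memoryMB for v in appliances) / 1024.0
            print("%s: %d vCPUs, %.1f GB RAM configured for storage appliances" % (
                esx.name, vcpus, mem_gb))
    finally:
        Disconnect(si)

appliance_overhead("vcenter.example.com", "administrator@vsphere.local", "password")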

Go All Out

Bernie Spang, Vice President of Software Defined Infrastructure at IBM, said the best way to deal with this onrush of virtualization technologies and approaches is to employ software-defined storage techniques to manage data holistically through a single integrated software system rather than a collection of rigid, hardware-specific piece parts.

“The virtual storage system has to be highly automated so data can be placed where it’s needed and so different kinds of data can be combined for extracting insights when needed,” said Spang. “With software-defined storage, the end user doesn’t have to know where or how the data is kept. They request it, and it comes to them.”

Pay Attention To Data Management

In addition, Spang believes that storage managers should make data management on clouds a priority. It may be fine to throw a lot of data onto the cloud, but only if you remain the master of it in terms of access, mobility and restorability.

“As more data moves to the cloud, nations and territories worldwide are adopting data management laws to protect personal data held by both government and private companies,” said Spang.

Verify Complete Restores

Let’s end with a tip specific to virtual backup. It can sometimes get lost among all the lofty talk of software-defined architectures and hyper-convergence. But it remains a core element of storage operations. And it is becoming more virtualized than ever.

The final tip concerns testing restores with a new virtual backup tool, something that is typically well understood. What some might miss, though, is taking the restore all the way through to see how well you would actually fare in the face of a disaster.

“Include in your testing and restoring whether you can bring the data back and use it,” said Greg Schulz, an analyst with StorageIO Group. “Find out whether permissions and security will still be in place, even when you restore to an alternate destination.”
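As a starting point for that kind of check, a short script can walk the restored file tree and compare ownership and permission bits against the source. The sketch below uses only the Python standard library; the paths are placeholders, and on Windows you would compare ACLs instead of POSIX modes.

import os
import stat

def compare_permissions(source_root, restore_root):
    # Walk the original data set and check each file's counterpart in the restore.
    mismatches = []
    for dirpath, _dirnames, filenames in os.walk(source_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(restore_root, os.path.relpath(src, source_root))
            if not os.path.exists(dst):
                mismatches.append((src, "missing from restore"))
                continue
            s, d = os.stat(src), os.stat(dst)
            if stat.S_IMODE(s.st_mode) != stat.S_IMODE(d.st_mode):
                mismatches.append((src, "permission bits changed"))
            if (s.st_uid, s.st_gid) != (d.st_uid, d.st_gid):
                mismatches.append((src, "owner or group changed"))
    return mismatches

for path, problem in compare_permissions("/data/original", "/mnt/restore-test"):
    print(path, problem)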
