The Missing Links in Software-Defined Storage

Posted on January 24, 2017 By Drew Robb


In anthropology, the missing link is a hypothetical extinct creature lying halfway along the evolutionary line between modern human beings and their ancestors. Applying that general idea to software-defined storage (SDS), we appear to have a missing link between the current reality of SDS and its ultimate vision.

One Vision?

Let’s start with the envisioned goal of SDS. Unfortunately, the interpretation of the end game varies from person to person and company to company. Kate Davis, manager, HPE Storage marketing, believes the goal is to have a consolidated approach to meeting application demands from software-defined storage to dedicated storage systems, including the use of SDS in hyper-converged (HC) products. By federating the underlying primary and secondary storage technologies, users can shift between SDS/HC and traditional arrays without having to re-architect their apps and processes.

But not everyone shares that exact concept. Paul LaPorte, director of products, Metalogix, believes SDS is really about enabling flexibility. Content and storage devices change. Regulations impacting content change, and organizations grow and merge. That’s why a rigid storage architecture eventually breaks down. SDS aims to establish a flexible storage environment that adjusts with the internal and external changes. It also delivers the opportunity for the automated content caretaking that is needed to free administrators to focus on other emerging challenges, he said.

Another way to look at SDS is in terms of the business as a whole. Business value and utilization rate are the vision, according to Tibi Popp, CTO, Archive360. Each unit of data has a different business value and utilization rate. This should be directly correlated to the cost of storing each data unit. Given that storage vendors are providing different storage types at different price levels, it is important to build software-defined storage that understands the value of the data and stores it on the most cost effective medium.

“Another critical feature that customers should look for moving forward is that SDS should be able to create a predictable analysis of the cost of storing data based on the value of the data,” said Popp.
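The tiering idea Popp describes can be sketched in code. The following is a purely illustrative sketch, not any vendor's actual API: the tier names, cost figures, and thresholds are all hypothetical assumptions chosen to show how a policy might map a data unit's business value and utilization rate to the cheapest suitable medium, and how that choice yields a predictable cost estimate.

```python
# Hypothetical sketch of value-based tiering (not a real SDS API).
# Each data unit carries a business value and an access rate; the policy
# maps it to the cheapest tier that still fits its profile.

# Assumed per-GB monthly costs for three illustrative tiers.
TIER_COST_PER_GB = {"flash": 0.25, "disk": 0.05, "archive": 0.01}

def choose_tier(business_value, accesses_per_month):
    """Pick a tier from business value (0-1) and access frequency.

    The thresholds here are invented for illustration; a real policy
    would be tuned to the organization's data and SLAs.
    """
    if business_value > 0.7 or accesses_per_month > 100:
        return "flash"    # hot or high-value data
    if business_value > 0.3 or accesses_per_month > 10:
        return "disk"     # warm data
    return "archive"      # cold, low-value data

def predicted_monthly_cost(size_gb, tier):
    """The predictable cost analysis Popp suggests SDS should offer."""
    return size_gb * TIER_COST_PER_GB[tier]
```

For example, a 100 GB data set that is rarely touched and of low business value would land on the archive tier, with a predicted monthly cost of 100 × $0.01 = $1.00, while the same data set under heavy access would be promoted to flash at a correspondingly higher, but still predictable, cost.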

But the implications could be even deeper — and perhaps a little grim from a storage perspective. Mario Blandini, vice president of marketing, SwiftStack, stated that the end game for SDS is that storage goes away as its own distinct segment of the IT infrastructure market.

“Storage came about when applications expanded beyond mainframe compute to distributed x86 computing,” said Blandini. “With SDS and hybrid cloud, storage is no longer defined as a big box full of hard drives; it is the management of data existing across private and public data centers, all in a single namespace.”

Meanwhile, trends such as digitization, the Internet of Things (IoT), big data analytics and the cloud are causing an upheaval across many industry verticals. Even traditionally conservative spheres such as power generation have realized they have to eliminate silos and move closer to the consumer mindset of "instant everything." As a result, IT must become more agile, it must become more efficient, and it must be ready for a broad spectrum of changes coming in the future — from hardware innovations to application design to cloud strategies.

Traditional infrastructure, however, is not well suited to addressing these needs, so organizations are turning to a software-defined data center (SDDC) architecture. Because storage is a traditional pain point and a major source of IT spend, SDS is the natural next step now that server virtualization is standard practice. A complete software-defined infrastructure, therefore, includes virtualized compute, storage and networking along with a common management platform that provides a unified operating environment capable of spanning from on-premises data centers to public clouds.

“By shifting to this type of modern software-defined environment, IT departments are increasingly becoming an internal service provider rather than a cost center,” said Lee Caswell, vice president of products, storage and availability, VMware. “Software-defined storage and compute, which we also commonly call a hyper-converged environment, allows IT to take a holistic view of infrastructure and concentrate on business objectives, rather than mere technical imperatives.”

