How Data Backup is Changing and Why It Matters

Posted on January 08, 2014 By Christine Taylor


Traditional data backup keeps forging ahead. In some environments it works well enough, and there is little impetus or budget to replace it. But in other environments traditional backup is increasingly inadequate for new data protection challenges, including cloud, big data, mobile, and virtualized infrastructures. These environments require approaches optimized for the new challenges they present.

This article will discuss newer backup options for these fast-growing environments as we head into 2014.

Cloud Backup

Most backup vendors add some level of cloud functionality to their backup applications, even if it's as simple as a cloud target. However, given extreme data growth and the need for fast backup and recovery, cloud backup requires more robust solutions.

Cloud gateway vendors EVault, Panzura, and Riverbed offer additional capabilities. EVault is a traditional backup vendor that has successfully retooled for the cloud, adding functionality around dedupe, acceleration, and restore. Panzura's global file system enables high-performance access to cloud storage, and Riverbed built Whitewater to add high-speed local caching to WAN acceleration for faster data transport to the cloud.
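None of these vendors publish the internals of their dedupe pipelines, but the core idea behind the deduplication they advertise — store each unique block of data once and keep a list of block fingerprints to rebuild the original stream — can be sketched in a few lines. This is a simplified, fixed-block illustration (the function names and 4 KB chunk size are illustrative, not anything from a vendor product):

```python
import hashlib

def dedupe_chunks(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks, storing each unique chunk once,
    keyed by its SHA-256 digest. Returns the chunk store plus the ordered
    list of digests (the 'recipe') needed to reconstruct the stream."""
    store = {}
    recipe = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # identical chunks are stored only once
        recipe.append(digest)
    return store, recipe

def restore(store, recipe) -> bytes:
    """Reassemble the original stream by looking up each digest in order."""
    return b"".join(store[d] for d in recipe)
```

On highly repetitive data (VM images, nightly fulls), only the unique chunks cross the wire or land in the cloud target, which is why dedupe pairs naturally with WAN acceleration. Production systems typically use variable-size, content-defined chunking rather than the fixed blocks shown here.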

CTERA and TwinStrata are entrants in the emerging Hybrid Cloud Storage (HCS) category, which represents a close integration between on-premises and cloud storage functionality. HCS is not limited to backup and recovery. For example, Microsoft StorSimple's HCS platform with Microsoft Azure targets primary storage environments. In the backup category, CTERA enables data accessibility and sharing across multiple locations by integrating on-premises and cloud storage. TwinStrata offers dynamic caching and supports a very wide set of cloud host partners.

EMC is actively pursuing cloud integration for its data protection offerings in EMC Data Protection Suite. EMC’s hardware and software portfolio size allows it to sell many cloud-based backup configurations and consulting services to service providers and corporate IT.  

On the other hand, Symantec just retired Backup Exec Cloud, a cloud backup product included with Backup Exec 2012. Symantec cited Backup Exec Cloud's lack of file sharing and mobile backup access – in other words, it was a simple cloud backup target and not exactly an innovative approach to cloud-based data protection and collaboration. (To be sure, Symantec NetBackup is seeing double-digit sales growth.)

We don’t usually think “tape” along with the cloud, but tape libraries are proving to be enormously useful in cloud-based data centers. For example, redIT (no relation to the online forum of the same name) is a virtual data center provider using Quantum Scalar libraries in conjunction with disk. Big tape libraries from Spectra Logic, IBM, and Oracle also complement disk storage in cloud hosting environments.

Big Data

Big Data is a big marketing term with fluid definitions. Certainly it means large volumes of data, but more specifically it refers to large data sets with current business value. Big data backup must offer extremely high scalability and performance, high availability, and innovative approaches to protecting very large data sets.

CommVault Simpana OnePass saves time at the scanning stage with an object-level baseline backup. Parallel file systems, such as IBM's General Parallel File System connected to high-performance storage hardware, can also speed up big data backup. Sepaton built its reputation on protecting big data environments with fast backup and restore, dedupe, high scalability, and replication.

EMC is deeply involved with big data and offers multiple protection solutions. One popular solution is snapshots and replication using NetWorker Snapshot Management and mirrored configurations on VNX arrays. Symantec is also actively developing products and services for big data in its NetBackup Global Enterprise Data Protection Platform.

Spectra Logic in particular has made a big push for adding tape libraries into big data environments. Large-volume libraries act as big online storage for less active big data backup, and active archiving software archives big data to tape and can immediately restore it onto high-speed disk for analytics runs. This protects big data without long backup windows.

Virtualization

Backing up virtualized networks is a growing challenge as more and more companies deploy their production environments on virtual machines. Many customers apply physical backup applications to their virtual environments, but multiple VMs on a single physical server produce high levels of I/O. Backing up hundreds or thousands of VMs quickly becomes untenable. It's no surprise that fast-growing virtualization fuels major backup sales.



