Enterprises are using the public cloud for storage more than ever before. They are relying on services such as Amazon Web Services (AWS) Simple Storage Service (S3) and Microsoft Azure Storage as repositories for everything from backups and overflow storage to disaster recovery (DR) and mission-critical application data.
A recent IDG Research survey found that 59 percent of organizations are either currently investing in or planning to invest in more public cloud over the next 12 months. For those using the public cloud for storage, the following tips may help to ease selection and deployment, reduce costs and improve overall service levels.
1. Gradually Does It
Be sensible about moving to the public cloud. Top management may have ordered you to dump everything into the cloud in their haste to cut storage costs, get out of the in-house IT business and maybe even trim staff. But temper their enthusiasm with some due diligence. "Crawl, walk, run" very much applies to the public cloud: there is a learning curve that takes time to master.
“Adopt increasing levels of services as your teams get up to speed, and understand how to leverage APIs and automate everything through code,” said Mark Bloom, director of product marketing, compliance and security at Sumo Logic, a cloud-native machine data analytics service.
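As a minimal sketch of what automating storage through code can look like, the snippet below uses the AWS SDK for Python (boto3) to provision a versioned S3 bucket. The bucket name and region are illustrative placeholders, not recommendations.

```python
import boto3

# Illustrative sketch: provision storage through code rather than a console.
# Bucket name and region are placeholders.
s3 = boto3.client("s3", region_name="us-east-1")

# Create a bucket and enable versioning so overwritten or deleted
# objects can still be recovered, a sensible first step for a pilot.
s3.create_bucket(Bucket="example-corp-cloud-pilot")
s3.put_bucket_versioning(
    Bucket="example-corp-cloud-pilot",
    VersioningConfiguration={"Status": "Enabled"},
)
```

Scripts like this can later be folded into configuration management or infrastructure-as-code tooling as teams get up to speed.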
2. Start With Backup And DR
One of the most effective use cases for cloud storage services like Amazon S3 and Microsoft Azure Storage is off-site backup storage. In this scenario, backup copies are retained in the cloud in case disaster strikes the main data center.
However, simply throwing backups into the cloud does not absolve IT of further responsibility. It is up to IT to determine how easily files and entire systems can be recovered in the event of an outage, data loss incident or natural disaster. That means finding out what guarantees the provider offers, what additional costs are involved should you need to recover your data, and how long recovery will take.
“In order to properly assess recovery time objectives (RTOs), organizations will need to be aware of the data retrieval, recovery and migration capabilities of the service provider,” said Goran Garevski, vice president of engineering at Comtrade Software, a provider of IT infrastructure management solutions for data protection, system, network and application performance.
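Retrieval options shape RTO directly. On AWS, for instance, objects archived to Amazon Glacier must be restored before they can be read, and the retrieval tier you pick trades cost against recovery speed. A hedged boto3 sketch, with a hypothetical bucket and key:

```python
import boto3

s3 = boto3.client("s3")

# Objects in the Glacier storage class must be restored before reading.
# The retrieval tier trades cost against recovery time:
# "Expedited" (minutes), "Standard" (hours), "Bulk" (cheapest, slowest).
# Bucket and key are illustrative placeholders.
s3.restore_object(
    Bucket="example-corp-backups",
    Key="db/nightly/backup.dump",
    RestoreRequest={
        "Days": 7,  # how long the restored copy remains available
        "GlacierJobParameters": {"Tier": "Standard"},
    },
)
```

Running a timed test restore like this against a representative data set is a simple way to validate a provider's stated recovery figures.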
3. Know Your Vendor
One public cloud is not the same as another. And within the major providers, there is a wide range of flavors to choose from. It pays to know your vendor and what services they offer. Azure has Blob Storage, Queue Storage, File Storage and Data Lake Storage, to name a few. Amazon has S3 as well as Amazon Elastic File System (EFS), Amazon Glacier, Amazon Elastic Block Store (EBS) and AWS Import/Export Snowball. Do your homework so you know what is available and how the various service levels affect pricing.
4. Choose Carefully
Resist the temptation to deposit your data in the first service you come across or the one that seems cheapest. It pays to do your homework and find the right fit. Once you know the available services, flavors and associated costs, take more time to assess your own data sets and their storage requirements.
Amazon, Azure and other public cloud storage vendors have a broad selection of storage services, and selecting the best-suited option for each data set can help when it comes to optimizing storage costs. In addition, taking advantage of built-in features for data management, monitoring and audits can reduce ongoing operational effort around storage.
“A precursor to selection of an optimum mix of services is clearly identifying the storage requirements for the different data sets in the organization,” said Deepak Mohan, research director, IDC. “This classification, combined with experimentation with various storage services (which is a cost-effective task in public cloud) will help organizations choose the best mix of storage services for their needs.”
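On Amazon S3, for example, the storage class is chosen per object, so different data sets can sit in different cost tiers within the same bucket, which makes the kind of experimentation Mohan describes cheap to run. A brief boto3 sketch, with placeholder bucket, key and file names:

```python
import boto3

s3 = boto3.client("s3")

# The storage class is set per object, so each data set can land in
# the tier that matches its access pattern. Names are placeholders.
with open("q1-summary.pdf", "rb") as report:
    s3.put_object(
        Bucket="example-corp-data",
        Key="reports/q1-summary.pdf",
        Body=report,
        StorageClass="STANDARD_IA",  # infrequently accessed data
    )
```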
5. Use Appropriate Storage Tiers
Among the many flavors of cloud storage are a variety of storage tiers, including tiers for hot data, infrequently accessed (cool) data and archive data.
“An important aspect of managing storage costs is tiering your data based on attributes like frequency of access and retention period,” said Sriprasad Bhat, senior program manager, Azure Storage.
Azure, for example, provides a service for cool data: data that is infrequently accessed but requires latency and performance similar to hot data. Known as Cool Blob Storage, it offers low-cost storage for cool object data such as backups, media content, scientific data, and compliance and archival data. Depending on the region, costs for this service can be as low as $0.01 per GB, and Microsoft recently added new regions where it is available, including much of the United States and parts of Germany, Australia and Brazil.
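As an illustration, the azure-storage-blob SDK for Python can move an existing block blob between the Hot and Cool tiers with a single call. A sketch, assuming a storage account connection string and a blob that already exists (both placeholders here):

```python
from azure.storage.blob import BlobServiceClient, StandardBlobTier

# Connect using the storage account's connection string (placeholder).
service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="backups", blob="nightly.tar.gz")

# Move an infrequently accessed blob to the Cool tier to cut
# per-GB storage costs while keeping hot-like latency.
blob.set_standard_blob_tier(StandardBlobTier.COOL)
```

Azure's blob lifecycle management policies can apply the same kind of transition automatically as data ages.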