8. Long-Term View

When it comes to big data projections, storage managers need to plan for growth, and plan correctly. Most, though, don't look far enough ahead: they are used to thinking only one, two, or three years out. That's not nearly far enough.

“Think 5, 10, even 20 years ahead,” said Barbagallo. “Make sure you pick a solution that can evolve with your needs and that does not lock you into proprietary hardware.”

9. Don’t Rely Only On Disk

Gartner says we have created more data in the past two years than in the entire previous history of humankind. Yet storage architectures are not evolving fast enough to keep up with that demand.

According to Kryder’s law, the areal density of magnetic storage doubles roughly every thirteen months.

“If the storage density changes are in line with Kryder’s law, by 2020 a two platter 2.5 inch drive would have capacity of 40 TB and cost $40,” said Senthil Rajamanickam, FSI Strategy and Operations Manager at Infogix.

That’s impressive on its own, but it won’t be enough to cope with all big data. SSD, tape and the cloud will all be needed to keep up with big data growth.
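As a back-of-the-envelope check on that 2020 projection, here is a minimal sketch in Python. The 2014 baseline of a 1 TB two-platter 2.5-inch drive is an illustrative assumption, not a figure from Rajamanickam:

```python
# Back-of-the-envelope projection under Kryder's law: areal density
# doubles roughly every 13 months. The 2014 baseline of a 1 TB
# two-platter 2.5-inch drive is an illustrative assumption.

start_year = 2014
start_capacity_tb = 1.0           # assumed baseline, not from the article
doubling_period_months = 13       # Kryder's law

months = (2020 - start_year) * 12
doublings = months / doubling_period_months           # ~5.5 doublings
projected_tb = start_capacity_tb * 2 ** doublings

print(f"Doublings by 2020: {doublings:.1f}")          # 5.5
print(f"Projected capacity: {projected_tb:.0f} TB")   # ~46 TB
```

That lands in the same ballpark as the 40 TB figure quoted above; the exact number depends on the baseline capacity and start date you assume.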

10. Dark Data

Operational data that is not being used is known as dark data. Gartner describes it as “information assets that organizations collect, process and store in the course of their regular business activity, but generally fail to use for other purposes.”

And there is an awful lot of it around.

“Preventing dark data in a big data environment requires data controls to review/monitor instream data during ingestion and capturing metrics to build an inventory of a big data environment,” said Rajamanickam.
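Below is a minimal sketch of what such ingestion-time controls might look like: as records stream in, metrics and metadata are captured so every dataset lands in an inventory instead of going dark. The record fields, metrics, and inventory layout are illustrative assumptions, not any particular product's API:

```python
import hashlib
import json
import time

# Sketch of ingestion-time data controls: capture metrics as records
# stream in and build an inventory of the big data environment.
# All field names and the inventory layout are illustrative.

inventory = {}  # dataset name -> metadata collected at ingestion

def ingest(dataset, records):
    entry = inventory.setdefault(dataset, {
        "first_seen": time.time(),
        "record_count": 0,
        "bytes": 0,
        "fields": set(),
    })
    for record in records:
        raw = json.dumps(record, sort_keys=True).encode()
        entry["record_count"] += 1
        entry["bytes"] += len(raw)
        entry["fields"].update(record.keys())
        entry["last_seen"] = time.time()
        entry["last_checksum"] = hashlib.sha256(raw).hexdigest()

ingest("clickstream", [{"user": "a1", "page": "/home"},
                       {"user": "b2", "page": "/cart", "ref": "ad"}])
print(inventory["clickstream"]["record_count"])     # 2
print(sorted(inventory["clickstream"]["fields"]))   # ['page', 'ref', 'user']
```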

11. Capacity Plus Velocity

Most discussion about big data focuses on having enough capacity. But the velocity of the data can be just as much of an issue, so it must be considered before you architect your storage design.

“Supporting event streams that are highly real time is a much different storage demand than dealing with constantly growing log data,” said Rajamanickam.
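To see how the two demands differ, here is a rough sizing sketch that translates velocity into concrete storage requirements. The event rates and record sizes are illustrative assumptions:

```python
# Rough sizing sketch: velocity as a storage requirement.
# The rates and sizes below are illustrative assumptions.

def ingest_mb_per_sec(events_per_sec, bytes_per_event):
    return events_per_sec * bytes_per_event / 1e6

# A real-time event stream: sustained write throughput is the constraint.
stream = ingest_mb_per_sec(events_per_sec=200_000, bytes_per_event=500)

# Constantly growing log data: capacity added per day is the constraint.
log_mb_per_day = ingest_mb_per_sec(5_000, 1_000) * 86_400

print(f"Event stream: {stream:.0f} MB/s sustained writes")    # 100 MB/s
print(f"Log growth:   {log_mb_per_day / 1e6:.1f} TB per day") # ~0.4 TB/day
```

The first workload needs storage that can absorb high sustained write rates with low latency; the second mostly needs capacity that scales cheaply.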

12. All Cloud Or Part Cloud?

Some will attempt to deal with big data by keeping it all in house. Others may prefer to move everything into the cloud, managing the data carefully to control costs. Most, though, are likely to land somewhere in the middle.

“A hybrid cloud approach allows you to continue to operate your system on premises in your data centers and in parallel move some operations to the cloud,” said Jeff Tabor, Senior Director of Product Management and Marketing at Avere Systems. “If storage is your main problem, a first step is to use a storage gateway to move older data to the cloud. If compute is your main challenge, cloud bursting technology lets you leave your data in place in your on-premises data center and begin to process the data in the public compute cloud.”
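For the storage-gateway first step Tabor describes, the core decision is typically age-based tiering. Below is a minimal sketch of that logic; the 90-day cutoff and the file records are illustrative assumptions, not any particular gateway's defaults:

```python
from datetime import datetime, timedelta

# Sketch of age-based tiering, as a storage gateway might apply it:
# files untouched for longer than a cutoff move to cloud object
# storage; recently accessed files stay on premises.
# The 90-day cutoff and file records are illustrative assumptions.

CUTOFF = timedelta(days=90)

def tier_for(last_accessed, now):
    return "cloud" if now - last_accessed > CUTOFF else "on-premises"

now = datetime(2016, 6, 1)
files = {
    "q1_sensor_archive.parquet": datetime(2016, 1, 10),
    "active_orders.db": datetime(2016, 5, 30),
}
for name, accessed in files.items():
    print(f"{name}: {tier_for(accessed, now)}")
# q1_sensor_archive.parquet: cloud
# active_orders.db: on-premises
```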