Flash arrays will play a role in a slim majority of Global 500 enterprise storage deployments over the next year, but some companies may be in for a rude awakening if those arrays fall short of the lofty performance figures touted by vendors, according to new research conducted by Gatepoint Research and commissioned by Load DynamiX.

Load DynamiX, a San Jose, Calif.-based storage performance validation company, released the results of its inaugural Storage Performance Validation Strategy survey (PDF) of 115 storage engineers and architects, revealing that 54 percent of respondents said they were planning to add flash arrays to their IT infrastructures. That’s good news for flash vendors, but not necessarily a boon for businesses seeking to supercharge their applications in a cost-effective manner.

Noting the high price of flash storage relative to traditional HDD-based arrays, the company claimed that inline deduplication and compression are essential to keep costs in check. The trouble with flipping the switch on such features is that they carry a performance penalty that may not be fully accounted for during testing.
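A quick back-of-the-envelope calculation shows why data reduction is central to the flash cost argument. The Python sketch below uses illustrative per-gigabyte prices and reduction ratios (none of these figures come from the survey or from Load DynamiX) to compute the effective cost per usable gigabyte:

```python
# Back-of-the-envelope cost per usable GB after inline data reduction.
# All prices and ratios are illustrative assumptions, not survey figures.

def effective_cost_per_gb(raw_cost_per_gb, reduction_ratio):
    """Return cost per usable GB given a usable:raw reduction ratio.

    A ratio of 4.0 means 4 GB of logical data fit in 1 GB of physical flash.
    """
    return raw_cost_per_gb / reduction_ratio

FLASH_RAW = 2.50  # assumed $/GB for raw flash capacity (illustrative)
HDD_RAW = 0.30    # assumed $/GB for raw HDD capacity (illustrative)

for ratio in (1.0, 2.0, 4.0, 6.0):
    flash_eff = effective_cost_per_gb(FLASH_RAW, ratio)
    print(f"{ratio:.0f}:1 reduction -> flash ${flash_eff:.2f}/GB usable "
          f"vs. HDD ${HDD_RAW:.2f}/GB raw")
```

At an assumed 4:1 reduction ratio, the effective flash price falls to a fraction of its raw cost; that is precisely the gap that evaporates if deduplication and compression underperform on real production data.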

“Accurate workload modeling and performance validation will be essential for proper investment and deployment decisions,” Load DynamiX advised in a statement.

Organizations also plan to implement other storage technologies over the next year, the survey found. These include public cloud (28 percent), converged storage systems (26 percent), software-defined storage (SDS) solutions (23 percent) and object-based storage (21 percent).

Most companies (65 percent) perform at least some tests in pre-production labs, but their methods may be falling short. “Unfortunately, most sites were using freeware tools that can’t emulate the scale or accurate workload conditions of the actual production environment,” stated Load DynamiX.

Workload modeling is a challenge for most organizations: only 36 percent of respondents reported having a grasp of the I/O profiles of their production workloads, leaving the majority at risk of overprovisioning or of performance that falls short. Nearly half (49 percent) said they were evaluating newer storage technologies (SDS, cloud, object and virtualized storage), further complicating efforts to accurately test for their impact on the data center.
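To make the idea of an I/O profile concrete, the Python sketch below replays a crude synthetic mix of reads and writes against a local file and reports latency figures. Every value in the profile (read percentage, block-size mix, operation count) is a hypothetical placeholder; a production-grade validation tool would also have to capture queue depth, concurrency, data compressibility and temporal access patterns at far greater scale:

```python
import os
import random
import time

# Hypothetical I/O profile; every value here is a placeholder,
# not a measurement from any real production workload.
PROFILE = {
    "read_pct": 0.70,                    # 70% reads, 30% writes
    "block_sizes": [4096, 8192, 65536],  # bytes
    "block_weights": [0.6, 0.3, 0.1],    # share of ops at each size
    "file_size": 256 * 1024 * 1024,      # 256 MiB scratch file
    "ops": 5000,
}

def replay(profile, path="scratch.bin"):
    # Pre-allocate the scratch file so reads have somewhere to land.
    with open(path, "wb") as f:
        f.truncate(profile["file_size"])

    fd = os.open(path, os.O_RDWR)
    latencies = []
    try:
        for _ in range(profile["ops"]):
            size = random.choices(profile["block_sizes"],
                                  weights=profile["block_weights"])[0]
            offset = random.randrange(profile["file_size"] - size)
            is_read = random.random() < profile["read_pct"]
            # os.urandom yields incompressible data, so writes also
            # exercise (and defeat) inline compression -- the kind of
            # detail simplistic test tools often miss. Payload is built
            # outside the timed region to keep measurements honest.
            payload = None if is_read else os.urandom(size)
            start = time.perf_counter()
            os.lseek(fd, offset, os.SEEK_SET)
            if is_read:
                os.read(fd, size)
            else:
                os.write(fd, payload)
            latencies.append(time.perf_counter() - start)
    finally:
        os.close(fd)

    latencies.sort()
    mean_us = sum(latencies) / len(latencies) * 1e6
    p99_us = latencies[int(0.99 * len(latencies))] * 1e6
    print(f"{len(latencies)} ops: mean {mean_us:.0f} us, p99 {p99_us:.0f} us")

if __name__ == "__main__":
    replay(PROFILE)
```

Note that without direct I/O the operating system's page cache absorbs most of this traffic, so the numbers flatter the storage; that shortcoming is itself a small illustration of why naive test harnesses fail to predict production behavior.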

“Most disruptive to storage performance predictability is moving to a Software Defined Storage approach, such as those based on Ceph or OpenStack,” observed the company.

The survey also provides insight into how IT organizations are prioritizing their storage projects this year. The top three priorities are implementing new data backup and recovery solutions (58 percent), improving availability (51 percent) and evaluating new storage technologies (49 percent), the study noted.