Storage Performance Validation Strategy
Study reveals that IT organizations understand the value of pre-deployment storage performance testing but have only nascent knowledge of their production workloads.
SAN JOSE, Calif. – June 2, 2015 – Load DynamiX, the leader in storage infrastructure performance validation, announced the results of its inaugural IT end-user Storage Performance Validation Strategy survey. The study surveyed over 100 storage engineers and architects to gain an understanding of storage priorities, deployment strategies, and investments in new technologies such as object storage, software-defined storage (SDS), cloud, and virtualized storage systems.
Several notable data points came from this survey:
- 65% of all IT organizations surveyed said that they are doing some form of testing in pre-production lab environments. These range from organizations using rudimentary freeware tools such as Iometer to more sophisticated sites using Load DynamiX performance validation solutions. Unfortunately, most sites were using freeware tools that cannot emulate the scale or the workload conditions of the actual production environment.
- Only 36% of the respondents understand the I/O profiles of their production workloads. The remaining 64% cannot accurately evaluate new storage systems or align purchasing and deployment decisions with performance requirements. To eliminate both over-provisioning, which results in wasted spending, and under-provisioning, which results in unhappy users, realistic workload I/O profiles of production application environments are essential. When combined with Load DynamiX load generation appliances, a comprehensive understanding of storage performance can be obtained before cutting over to live deployment.
- 49% of respondents are evaluating new technologies such as object, SDS, cloud, and virtualized storage environments. These innovative technologies are making it more difficult than ever to predict the performance of applications on new storage architectures. Most disruptive to storage performance predictability is the move to a software-defined storage approach, such as those based on Ceph or OpenStack.
- 54% of those surveyed plan to add flash storage arrays to their storage estate within the next year. Given that flash storage can cost 3x to 5x more than traditional spinning media, it is essential to determine which workloads should move to flash and which should remain on traditional HDD or hybrid storage arrays. To make flash storage affordable, inline deduplication and compression are essential. Unfortunately, enabling such features can have a dramatic impact on performance and, hence, on accurate investment analysis of flash arrays. Accurate workload modeling and performance validation will be essential for proper investment and deployment decisions.
“Load DynamiX is becoming the de facto standard in empowering IT architects and engineers to fully assess the effects that new storage technologies will have on their key business applications,” said Len Rosenthal, vice president of marketing for Load DynamiX. “With this survey, we are taking the pulse of the industry and attempting to stay in lock-step with our customers and their infrastructure performance validation needs.”
The 2015 Storage Performance Validation Strategy survey was independently conducted by Gatepoint Research in April 2015. For a copy of the data gathered, see Storage Performance Validation Strategy – Summary Results (PDF).
About Load DynamiX
As the leader in storage infrastructure performance validation, Load DynamiX empowers storage professionals with the insight needed to make intelligent decisions regarding networked storage. By accurately characterizing and emulating real-world application behavior, Load DynamiX optimizes the overall performance, availability, and cost of storage infrastructure. The combination of advanced workload analytics and modeling software with extreme load-generating appliances gives IT professionals the ability to cost-effectively validate and stress today’s most complex physical, virtual, and cloud infrastructure to its limits.