
Street Circuit vs. Road Circuit - Lowering Latency Drives Workload Mixing

by VIOLIN SYSTEMS on August 16, 2016

As the Chief Operating Officer (COO) of my company, I lead diverse teams that remind me of the challenges businesses face when managing multiple and mixed workloads within their data centers. Just as a Formula 1 driver must race on multiple circuits with different mixes of straights and turns using the same car, a company's IT infrastructure must enable many departments with multiple projects to complete their tasks. Substitute workloads for departments, IOPS for projects, and latency for tasks, and the connection becomes clear.

The key reason companies are deploying all-flash enterprise storage is the expectation that their preferred solution will run multiple and mixed workloads simultaneously. Examples include big data analytics, online transaction processing, databases, applications, server virtualization, and private cloud.

Choosing an all-flash storage solution that favors higher IOPS at the expense of lower latency leads to unpleasant surprises (I can't mix different workloads?) and disappointing outcomes (I can't consolidate multiple workloads?). Unfortunately, this happens far too often.



Low Latency Wins on Race Tracks and in Data Centers

In last week's blog, my colleague Susan used the analogy of auto racing to describe why consistent performance wins in Formula 1 racing and in all-flash storage. I'm going to build on her analogy to show why lower latency matters when mixing many workloads of different types on the same all-flash storage system.

Formula 1 racing is a mixed-workload environment where drivers race on different courses under diverse racing conditions during a season. The races take place on different types of circuits, including closed city streets, combinations of public roads and permanent track, and purpose-built racing facilities.

Formula 1 racing is also a multiple-workload environment, with many drivers competing to cross the finish line first in each race. Each driver is one of many workloads on the course, and each needs to run their race with minimal interference from the others.

The bottom line is this: low latency wins championships in environments with mixed and multiple workloads, whether they are Formula 1 races or enterprise data centers.


Reducing Latency Beats Increasing IOPS

The confusion between the benefits of lower latency and higher IOPS is understandable, especially with all-flash storage. Databases, applications, and users all need their tasks to complete sooner (low latency), regardless of how many other things are happening at the same time (high IOPS), yet this is seldom the experience in enterprise data centers. Ultimately, doubling performance means cutting latency in half, not doubling IOPS.

Things become clear when considering how latency and IOPS actually affect all-flash storage (the short sketch after this list works through the arithmetic):

  • Lowering latency allows each storage operation to complete sooner.
  • Increasing IOPS allows more storage operations to work at once.
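For readers who want the arithmetic behind those two bullets, here is a minimal Python sketch based on Little's Law (delivered IOPS = outstanding operations ÷ latency). The queue depth and latency figures are illustrative assumptions, not measurements from any particular storage system:

    # Little's Law applied to storage (illustrative numbers only):
    #   delivered_iops = outstanding_ops / latency_seconds

    def delivered_iops(outstanding_ops: int, latency_ms: float) -> float:
        """IOPS an application actually sees for a given queue depth and latency."""
        return outstanding_ops / (latency_ms / 1000.0)

    queue_depth = 32  # assume the application keeps 32 requests in flight

    baseline = delivered_iops(queue_depth, latency_ms=1.0)      # 32,000 IOPS
    half_latency = delivered_iops(queue_depth, latency_ms=0.5)  # 64,000 IOPS

    print(f"1.0 ms latency -> {baseline:,.0f} IOPS; each operation takes 1.0 ms")
    print(f"0.5 ms latency -> {half_latency:,.0f} IOPS; each operation takes 0.5 ms")

    # Halving latency doubles the IOPS the application receives AND every
    # operation finishes sooner. Raising the array's advertised IOPS ceiling
    # alone changes neither number, because the application still waits the
    # full latency on every request.

Notice that the only lever that makes an individual operation finish sooner is latency; the IOPS the application sees simply falls out of how many operations it keeps in flight and how quickly each one completes.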

Since many all-flash storage systems can be configured to provide more IOPS than enterprise data centers need, IOPS specifications are no longer an effective predictor of real-world performance. Let's revisit the Formula 1 analogy to explore why.

Formula 1 teams focus on the amount of time their drivers require to complete races because finishing sooner wins championships. In other words, the entire team strives to minimize their driver's race times (lower latency). Racing more cars (higher IOPS) doesn't help any individual driver win more races, but it does increase everyone's race times (higher latency) due to more traffic.


"The goal is to remove performance off the table as an issue by creating architectures with enough head room so that the storage system will never be the bottleneck.

We will get to a point where you won't have to worry about storage performance testing because the storage systems will be such speed demons that it will handle anything you throw at it."

"Storage Performance: Important Things to Consider" by Tony Asaro, The INI Group, LLC


Want to Drive a Race Car?

It's not Formula 1, but they're real race cars and real tracks. Click here to register for your chance to win a High Performance Racing and Driving class at a Skip Barber Racing School, or choose from other prizes to help you indulge your inner adrenaline junkie.

Let's prepare for our next race. Join Said Ouissal next week as he digs into how the low latency and high speed of all-flash storage drive data center performance.
