The only manual thing we like is on a Ferrari
A common tactic to alleviate storage performance issues (check out the other 10) is manually distributing data based on knowledge of how it is used. Sounds like a dirty job.
But let’s illustrate the point. In the database world, a classic example is creating datafiles on fast storage to hold the active, current-quarter partitions of an accounting or sales table, while older quarters are archived to slower storage.
Other examples include placing transaction logs on alternating groups of spindles and using SSDs or PCIe flash cards for temporary tablespaces.
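To make the partition-placement tactic concrete, here is a minimal sketch in Oracle-style SQL. All names (tablespaces, file paths, table and column names, dates) are hypothetical and purely illustrative; real deployments would size and lay out the datafiles very differently.

```sql
-- Hypothetical tablespaces: one backed by fast flash, one by slow archive disk.
CREATE TABLESPACE fast_ts DATAFILE '/flash/fast01.dbf'    SIZE 10G;
CREATE TABLESPACE slow_ts DATAFILE '/archive/slow01.dbf'  SIZE 100G;

-- Range-partitioned sales table: only the active quarter lives on flash;
-- older quarters are pinned to the slower tier.
CREATE TABLE sales (
    sale_id   NUMBER,
    sale_date DATE,
    amount    NUMBER(12,2)
)
PARTITION BY RANGE (sale_date) (
    PARTITION q1 VALUES LESS THAN (DATE '2018-04-01') TABLESPACE slow_ts,
    PARTITION q2 VALUES LESS THAN (DATE '2018-07-01') TABLESPACE slow_ts,
    PARTITION q3 VALUES LESS THAN (DATE '2018-10-01') TABLESPACE slow_ts,
    PARTITION q4 VALUES LESS THAN (DATE '2019-01-01') TABLESPACE fast_ts
);
```

Note the hidden cost this illustrates: every quarter, someone must move the newly cold partition off flash and re-point the active one, which is exactly the ongoing manual overhead described below.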
Anything manual brings complexity, overhead, and the risk of error. Manual distribution carries not only the upfront complexity but also the ongoing burden of constant monitoring to ensure the configuration remains optimal as access patterns shift.
We prefer the easy route. When all file types can share ultra-fast, low-latency flash storage from Violin Systems, why choose complexity over simplicity?