In case you have not seen it, LTO-6 has been announced, and it looks like the native cartridge capacity is only 2.5 TB. The LTO Consortium states that the cartridge will be able to store 6.25 TB of compressed data. The only way I know of achieving that ratio is to increase the compression buffer size and the size of the data you are compressing against, which is often called the compression dictionary. Backup data, as we know, is often highly compressible -- deduplication is the extreme case of a large compression buffer and dictionary -- so backup applications can achieve significant compression. However, most archival applications cannot.
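The effect of buffer and dictionary size is easy to see with a small experiment. This is a minimal sketch using Python's zlib, where the `wbits` parameter sets the history window (the dictionary the compressor matches against); the data set here is synthetic and purely illustrative: a 16 KiB incompressible block repeated eight times, so the repeats are visible only to a window large enough to reach back to the previous copy.

```python
import random
import zlib

# Synthetic, illustrative data: one 16 KiB pseudo-random (incompressible)
# block repeated 8 times. The repeats are 16 KiB apart, so only a
# compression window of at least that size can exploit them.
random.seed(0)
block = bytes(random.getrandbits(8) for _ in range(16384))
data = block * 8  # 128 KiB total

def ratio(wbits):
    """Compression ratio of `data` with a 2**wbits-byte history window."""
    c = zlib.compressobj(9, zlib.DEFLATED, wbits)
    out = c.compress(data) + c.flush()
    return len(data) / len(out)

# A 512-byte window cannot see the repeats; a 32 KiB window can.
print(f"window  512 B : {ratio(9):.2f}:1")
print(f"window 32 KiB : {ratio(15):.2f}:1")
```

With the small window the ratio stays near 1:1, while the large window finds the long-range repeats and compresses well; drive vendors quoting higher compressed capacities are, in effect, betting on larger windows and on data that contains such redundancy.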
Here is the problem I see: From what I have heard, the biggest growth in the tape market is for archival applications, not backup, and many archival data types are either not compressible, because they have already been pre-compressed (e.g., video, music, pictures, and images from MR or CT scans), or have limited compressibility, such as the output of simulations run by various engineering applications (e.g., car crash testing, aircraft and boat design) and data from oil exploration and discovery. Some simulation data has a degree of compressibility, but from what I have seen and tested in the past, it is limited to at most 30 percent to 40 percent, even with huge compression buffers.
I would really like to see the data and understand why the LTO Consortium decided to approach the problem the way it has. What went into the decision process? What data sets were examined to support the decision and the claim? What market spaces were considered? Given that archive is the long-term growth market for tape, I am confused as to why the density gain was achieved not through media and head density but through compression.
Labels: tape, backup, LTO, tape backup
posted by: Henry Newman