Data Integrity: Changing the Discussion

Posted on November 30, 2011 By Henry Newman

A number of my customers continue to ask questions about data reliability for archives. We all know that archives cannot be 100 percent reliable forever, but how reliable can they be? The answer is that no one knows. How can anyone figure out what the reliability is, given all the hardware and software involved in the archive? At least the people I talk with do not even try, as there is no common basis for discussing the topic.

I think reliability must be discussed in terms of a count of 9s. Does your data have 99.99999999 percent reliability, also called 10 9s, or does it have 15 9s? No one can really tell, as there is no common way to discuss the problem. The 9 count could be calculated from the media reliability, but even that information is not used as the basis for a common discussion. What I am thinking is that there must be some standard way to discuss reliability so that the community and others can have thoughtful discussions, and vendors can sell well-defined products. Vendors are not required to discuss data integrity, even at the media level for an archive, because the user community does not demand it. It is time for a change in the thought process for those of us responsible for large archives of data.
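
To make the arithmetic concrete, here is a minimal sketch (my own illustration in Python, not anything a vendor publishes) of converting between a reliability probability and a count of 9s:

```python
import math

def nines(reliability: float) -> float:
    """Express a reliability probability (e.g. 0.9999999999) as a count of 9s."""
    return -math.log10(1.0 - reliability)

def reliability_from_nines(n: float) -> float:
    """Inverse: the probability of not losing data, given n nines."""
    return 1.0 - 10.0 ** (-n)

print(nines(0.9999999999))         # ~10 nines
print(reliability_from_nines(15))  # 0.999999999999999
```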

I know it is possible to calculate the level of media data integrity, as my team has done it. I think that if we ask for this, we will begin a bigger discussion about the integrity of data across the whole data path, including all the checksums that have not been updated in 20+ years and that are no longer robust enough for the amount of data and the speed of the channel. It is time for a new discussion.
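
As a rough illustration of what such a media-level calculation looks like, here is a back-of-the-envelope sketch. The numbers are hypothetical and mine alone, not from any datasheet or from my team's actual model: I assume an unrecoverable bit error rate of 1 bit in 10^15 read, and a single 1 TB dataset read back from the archive.

```python
import math

# Illustrative, hypothetical numbers only -- not from any vendor datasheet.
UBER = 1e-15                 # assumed unrecoverable bit error rate per bit read
dataset_bits = 1e12 * 8      # one 1 TB dataset read back from the archive

expected_errors = UBER * dataset_bits   # expected unrecoverable errors per full read
p_clean = math.exp(-expected_errors)    # ~(1 - UBER)**bits, probability of a clean read
nines = -math.log10(1.0 - p_clean)

print(f"expected unrecoverable errors: {expected_errors:.4f}")
print(f"P(clean 1 TB read): {p_clean:.5f}  (~{nines:.1f} nines)")
```

Under these assumed numbers, raw media alone delivers only about two 9s for a single terabyte read back once, nowhere near 10 or 15 9s, which is exactly why the rest of the data path, and the checksums protecting it, need to be part of the discussion.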

