By Russ Fellows, Evaluator Group
October 12, 2010 -- The ideas of liberty and security are often at odds with each other. In government and personal life, balancing these two ideals is a constant challenge. Within the world of information technology, providing freedom from any one vendor (i.e., liberty) is often at odds with protecting data using integrated, proven solutions. Maintaining this balance has been very difficult in the world of IT, until now.
A standards group known as OASIS recently announced approval of its Key Management Interoperability Protocol as a draft standard. The protocol is known as OASIS KMIP and is the culmination of several years of work by many vendors, including Brocade, Cisco, EMC/RSA, HP, IBM, Oracle, NetApp, Symantec, Thales and others.
The emergence of yet another standard is, by itself, little cause for celebration. What matters is whether products that conform to the standard actually reach end users. The first wave of these products is set to emerge in the coming weeks. One of the first vendors to ship a product compatible with the OASIS KMIP standard is IBM, with its Tivoli Key Lifecycle Manager (TKLM). This is IBM’s second generation of key management products; it is backward compatible with earlier IBM key management efforts and will also interoperate with other OASIS KMIP v1.0 compliant key management products.
Why encrypt data?
When data centers were physical entities that could be protected by gates, guards and other physical means, encryption was the purview of the lunatic fringe. However, with the increasing importance of location transparency, physical security alone has become impractical. Add to that the rise of cloud and other IT-as-a-Service alternatives, and the need for data security becomes clear; indeed, security is often cited as the number one impediment to IT organizations adopting cloud or IT-as-a-Service. Encryption is the only practical method for securing information when physical control is impossible.
Key management for storage
When data is encrypted, it is readable, and hence available, only if the key needed to decrypt it is also available. Using encryption to protect stored information, so-called “data at rest,” is fundamentally more difficult than encrypting data sent over a communication link, known as “data in-flight.” When data is being transmitted, if a key is lost or compromised, the data can simply be retransmitted, often with few consequences.
However, if data or a key is compromised for data being stored, the problem is significantly more complex. When encryption is used to protect data at rest, the availability of information depends completely on two separate items: both the encrypted data and its key are required in order to access the information. What complicates the issue further is that the encryption key should never be stored with the data it has encrypted. Systems engineers understand that these requirements significantly complicate information storage and security.
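The dependency between the stored ciphertext and the separately held key can be illustrated with a short sketch. The XOR "cipher" below is a deliberately simplified toy built from SHA-256 for illustration only; real data-at-rest products use vetted algorithms such as AES. The point is simply that the bytes written to disk are useless without the key held by the key manager.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher built from SHA-256 -- illustration only,
    NOT real encryption; production systems use AES or similar."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        stream.extend(block)
        counter += 1
    return bytes(x ^ y for x, y in zip(data, stream))

record = b"account=12345;balance=9200"
key = secrets.token_bytes(32)            # generated and held by the key manager

ciphertext = keystream_xor(key, record)  # what lands on disk or tape
assert ciphertext != record              # stored bytes are unreadable alone
assert keystream_xor(key, ciphertext) == record  # key + data = information
```

Lose the key and the record is gone forever; store the key alongside the data and the encryption is pointless. That is the balancing act key management exists to solve.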
A brief history
The management of keys was recognized early on as the critical issue for encryption. Many of the advances in the practical use of cryptography had to do with generating, protecting and exchanging keys safely and securely. Advances such as the Diffie-Hellman key exchange in the 1970s led to further work by a group of scientists who would later lend their initials to a set of protocols and a company known as RSA. All of these researchers understood that the management of encryption keys is what enables encryption to be used as a practical and safe method for protecting critical information.
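The Diffie-Hellman exchange mentioned above can be sketched in a few lines. Each party keeps a private exponent secret and publishes only a derived public value, yet both sides arrive at the same shared secret. The modulus below is a small toy prime for readability; real deployments use primes of 2048 bits or more.

```python
import secrets

# Toy Diffie-Hellman parameters -- illustrative only, NOT secure.
p = 4294967291   # small prime modulus (real systems use 2048+ bit primes)
g = 5            # generator

# Each party picks a private exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1   # Alice's private value, never transmitted
b = secrets.randbelow(p - 2) + 1   # Bob's private value, never transmitted
A = pow(g, a, p)                   # Alice's public value, sent in the clear
B = pow(g, b, p)                   # Bob's public value, sent in the clear

# Each side combines its own private value with the other's public value.
shared_alice = pow(B, a, p)        # (g^b)^a mod p
shared_bob = pow(A, b, p)          # (g^a)^b mod p
assert shared_alice == shared_bob  # both derive the same shared secret
```

An eavesdropper who sees only p, g, A and B cannot feasibly recover the shared secret at realistic key sizes, which is what made secure key exchange over open networks practical.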
Until now, most IT departments have been extremely cautious about implementing wide-scale encryption and key management, for two reasons. First, encryption and key management can add significant complexity to ensuring data availability. Second, IT shops were concerned about vendor lock-in. Both concerns were valid.
Thus, unless specifically mandated to encrypt their data, most organizations have resisted doing so, and for good reason. That is, until now. With an open standard available, companies can deploy a solution that is not proprietary and that does not tie the availability of their data to any one vendor or product.
What does this mean?
With the availability of a key management standard, it will now be possible for IT departments to access their encryption keys and deliver them when necessary without being locked into one vendor’s solution. When the goal is to retain data for years or decades, it is critical that open, standards-based techniques are used to ensure that keys and data remain independent of any vendor.
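The value of that independence can be sketched in code. The interface below is a hypothetical abstraction loosely modeled on KMIP-style operations (Create, Get); the names and the in-memory backend are illustrative inventions, not the actual KMIP wire protocol or any vendor's API. The point is that an application written against a standard interface can swap one conformant key server for another without changing its own code.

```python
import secrets
from abc import ABC, abstractmethod

class KeyManager(ABC):
    """Hypothetical vendor-neutral interface, loosely modeled on
    KMIP-style operations. Names are illustrative, not the real
    KMIP protocol."""

    @abstractmethod
    def create_key(self, algorithm: str, length_bits: int) -> str:
        """Generate a key on the server; return its unique identifier."""

    @abstractmethod
    def get_key(self, uid: str) -> bytes:
        """Retrieve key material by identifier."""

class InMemoryKeyManager(KeyManager):
    """Stand-in backend; any conformant server could replace it."""

    def __init__(self) -> None:
        self._keys: dict[str, bytes] = {}
        self._next = 0

    def create_key(self, algorithm: str, length_bits: int) -> str:
        uid = f"key-{self._next}"
        self._next += 1
        self._keys[uid] = secrets.token_bytes(length_bits // 8)
        return uid

    def get_key(self, uid: str) -> bytes:
        return self._keys[uid]

# The application is coded against KeyManager, not a vendor product,
# so the backend can change while the application code does not.
km: KeyManager = InMemoryKeyManager()
uid = km.create_key("AES", 256)
assert len(km.get_key(uid)) == 32   # 256-bit key retrieved by identifier
```

This is the lock-in problem in miniature: before KMIP, each vendor's key server exposed a proprietary version of this interface, so the application and the keys were bound to it.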
The IT world is looking for other vendors to follow the ratification of the OASIS KMIP standard with compliant products. As vendors commit to the idea of encryption key interchange, companies will be able to free their encrypted data from vendor lock-in, while ensuring sensitive data is protected.
Russ Fellows is a senior analyst with the Evaluator Group research, education and consulting firm, which provides unbiased product analysis and comparisons.