A Key Management Infrastructure
This package provides a NIST SP 800-57 compliant Key Management Infrastructure (KMI).
To get started do:
$ python bootstrap.py   # Must be Python 2.5 or higher
$ ./bin/buildout        # Depends on successful compilation of M2Crypto
$ ./bin/runserver       # or ./bin/paster serve server.ini
The server will come up on port 8080. You can create a new key encrypting key using:
$ wget https://localhost:8080/new -O kek.dat --ca-certificate sample.pem
or, if you want a more convenient tool:
$ ./bin/testclient https://localhost:8080/new -n > kek.dat
The data encryption key can now be retrieved by posting the KEK to another URL:
$ wget https://localhost:8080/key --header 'Content-Type: text/plain' \
    --post-file kek.dat -O datakey.dat --ca-certificate sample.pem
or
$ ./bin/testclient https://localhost:8080/new -g kek.dat > datakey.dat
Note: To be compliant, the server must, of course, use an encrypted communication channel. The --ca-certificate option tells wget to trust the sample self-signed certificate included in the keas.kmi distribution; you’ll want to generate a new SSL certificate for production use.
Key Management Infrastructure
This package provides a NIST SP 800-57 compliant key management infrastructure. Part of this infrastructure is a key management facility that provides several services related to keys. All keys are stored in a specified storage directory.
>>> import tempfile
>>> storage_dir = tempfile.mkdtemp()

>>> from keas.kmi import facility
>>> keys = facility.KeyManagementFacility(storage_dir)
>>> keys
<KeyManagementFacility (0)>

>>> from zope.interface import verify
>>> from keas.kmi import interfaces
>>> verify.verifyObject(interfaces.IKeyManagementFacility, keys)
True
One of the services the facility provides is the generation of new keys.
>>> verify.verifyObject(interfaces.IKeyGenerationService, keys)
True
The algorithm to generate a new pair of keys is somewhat involved. The following features are required:
The key local to the data cannot be directly used as the encrypting key.
The encrypting key must be stored using a cipher that is at least as strong as the key itself.
The computer storing the data cannot also store the key.
This suggests the following algorithm to generate and store a new encrypting key (a code sketch follows the list):
Create the key encrypting key (private and public).
Create the encryption key.
Use the public key encrypting key to encrypt both the encryption keys.
Discard the public key encrypting key. It is important that this key is never stored anywhere.
Store the encrypted encryption key in the key management facility.
Return the private key encrypting key.
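Purely for illustration, here is a minimal sketch of these steps using PyCrypto primitives (the library used since version 3.0.0). The generate_key helper and its storage mapping are hypothetical and not part of the keas.kmi API; the facility’s own generate() method, used next, does this work for you.

# Illustrative sketch only -- not the actual keas.kmi implementation.
from hashlib import md5
import os
from Crypto.PublicKey import RSA
from Crypto.Cipher import PKCS1_OAEP

def generate_key(storage):
    # 1. Create the key encrypting key (private and public).
    kek = RSA.generate(2048)
    # 2. Create the encryption key (256 random bits, suitable for AES 256).
    encryption_key = os.urandom(32)
    # 3. Encrypt the encryption key with the public key encrypting key.
    encrypted = PKCS1_OAEP.new(kek.publickey()).encrypt(encryption_key)
    # 4. The public key is simply never written anywhere; only the private
    #    key encrypting key leaves this function.
    # 5. Store the encrypted encryption key, indexed by a hash of the KEK.
    private_pem = kek.exportKey('PEM')
    storage[md5(private_pem).hexdigest()] = encrypted
    # 6. Return the private key encrypting key.
    return private_pem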
Let’s now use the key generation service’s API to generate a key.
>>> key = keys.generate()
>>> print key
-----BEGIN RSA PRIVATE KEY-----
...
-----END RSA PRIVATE KEY-----
By default the system uses the AES 256 cipher, because public commentary suggests that the AES 192 or AES 256 ciphers sufficiently fulfill the PCI, HIPAA, and NIST key strength requirements.
You can now use this key encrypting key to extract the encryption keys:
>>> try:
...     from hashlib import md5
... except ImportError:
...     from md5 import md5

>>> hash_key = md5(key).hexdigest()

>>> len(keys.get(hash_key))
256
Our key management facility also supports the encryption service, which allows you to encrypt and decrypt a string given the key encrypting key.
>>> verify.verifyObject(interfaces.IEncryptionService, keys)
True
Let’s now encrypt some data:
>>> encrypted = keys.encrypt(key, 'Stephan Richter')
>>> len(encrypted)
16
We can also decrypt the data.
>>> keys.decrypt(key, encrypted)
'Stephan Richter'
And that’s pretty much all there is to it. Most of the complicated crypto-related work happens under the hood, transparent to the user.
One final note: once the data encrypting key is looked up and decrypted, it is cached, since constantly decrypting the DEK is expensive.
>>> hash_key in keys._KeyManagementFacility__dek_cache
True
A timeout (in seconds) controls when a key must be looked up:
>>> keys.timeout
3600
Let’s now force a reload by setting the timeout to zero:
>>> keys.timeout = 0
The cache is a dictionary of key encrypting key to a 2-tuple that contains the date/time the key has been fetched and the unencrypted DEK.
>>> firstTime = keys._KeyManagementFacility__dek_cache[hash_key][0]

>>> keys.decrypt(key, encrypted)
'Stephan Richter'

>>> secondTime = keys._KeyManagementFacility__dek_cache[hash_key][0]
>>> firstTime < secondTime
True
The Local Key Management Facility
However, using the master key management facility’s encryption service is expensive, since each encryption and decryption call requires a network round trip. Fortunately, we can
communicate encryption keys across multiple devices, and
keep encryption keys in memory.
The only requirement is that the key transfer happens over an encrypted communication channel. In this implementation the communication protocol is HTTP, so a sufficiently strong SSL connection is appropriate.
Let’s now instantiate the local key management facility:
>>> localKeys = facility.LocalKeyManagementFacility('http://localhost/keys')
>>> localKeys
<LocalKeyManagementFacility 'http://localhost/keys'>
The argument to the constructor is the URL to the master key management facility. The local facility will use a small REST API to communicate with the server.
For the purpose of this test, we are going to install a network component that only simulates the requests:
>>> from keas.kmi import testing
>>> testing.setupRestApi(localKeys, keys)
As with the master facility, the local facility provides the IEncryptionService interface:
>>> verify.verifyObject(interfaces.IEncryptionService, localKeys)
True
So en- and decryption is very easy to do:
>>> encrypted = localKeys.encrypt(key, 'Stephan Richter')
>>> len(encrypted)
16

>>> localKeys.decrypt(key, encrypted)
'Stephan Richter'
Instead of forwarding the en- and decryption request to the master facility, the local facility merely fetches the encryption key pair and executes the operation locally. This approach has the following advantages:
There is no general network latency for any en- and decryption call.
The expensive task of en- and decrypting a message is delegated to multiple servers, allowing better scaling.
Fetched keys can be cached locally, reducing network calls to an occasional refresh.
In this implementation, we do cache the keys in a private attribute:
>>> key in localKeys._LocalKeyManagementFacility__cache
True
A timeout (in seconds) controls when a key must be refetched:
>>> localKeys.timeout
3600
Let’s now force a reload by setting the timeout to zero:
>>> localKeys.timeout = 0
The cache is a dictionary of key encrypting key to a 3-tuple that contains the date/time the key has been fetched, the encryption (public) key, and the decryption (private) key.
>>> firstTime = localKeys._LocalKeyManagementFacility__cache[key][0]

>>> localKeys.decrypt(key, encrypted)
'Stephan Richter'

>>> secondTime = localKeys._LocalKeyManagementFacility__cache[key][0]
>>> firstTime < secondTime
True
The local facility also provides the IKeyGenerationService interface:
>>> verify.verifyObject(interfaces.IKeyGenerationService, localKeys)
True
The local method call is identical to the master one:
>>> key2 = localKeys.generate()
>>> print key2
-----BEGIN RSA PRIVATE KEY-----
...
-----END RSA PRIVATE KEY-----
The operation is forwarded to the master server, so that the key is available there as well:
>>> hash = md5(key2)

>>> hash.hexdigest() in keys
True
The REST API
The REST API of the master key management facility defines the communication with the local facility. When a new encryption key pair is created, we simply make a POST call to the following URL:
http://server:port/new
The request should have no body and the response is simply the key encrypting key.
So let’s have a look at the call:
>>> from keas.kmi import rest
>>> from webob import Request

>>> request = Request({})
>>> key3 = rest.create_key(keys, request).body
>>> print key3
-----BEGIN RSA PRIVATE KEY-----
...
-----END RSA PRIVATE KEY-----
The key is available in the facility of course:
>>> hash = md5(key3)
>>> hash.hexdigest() in keys
True
We can now fetch the encryption key pair using a POST call to this URL:
http://server:port/key
The request sends the key encrypting key in its body. The response is the encryption key string:
>>> request = Request({})
>>> request.body = key3

>>> encKey = rest.get_key(keys, request)
>>> len(encKey.body)
128
If you try to request a nonexistent key, you get a 404 error:
>>> request.body = 'xxyz'
>>> print rest.get_key(keys, request)
Key not found
A GET request to the root shows us a server status page:
>>> print rest.get_status(keys, Request({}))
200 OK
Content-Type: text/plain
Content-Length: 25
<BLANKLINE>
KMS server holding 3 keys
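For completeness, a remote client can exercise the same three URLs with nothing but the standard library. The sketch below is hypothetical and not part of keas.kmi (the bundled testclient script offers a similar convenience); the server address is assumed to be the one from the introduction, and SSL certificate handling is omitted for brevity.

# Hypothetical client sketch; the URL is an assumption, and certificate
# verification is not configured here (see the wget examples above).
import urllib2

BASE_URL = 'https://localhost:8080'

# POST with an empty body to /new to obtain a new key encrypting key (KEK).
kek = urllib2.urlopen(BASE_URL + '/new', data='').read()

# POST the KEK to /key to fetch the matching encryption key.
request = urllib2.Request(BASE_URL + '/key', data=kek,
                          headers={'Content-Type': 'text/plain'})
encryption_key = urllib2.urlopen(request).read()

# A plain GET on the root returns the status page.
print urllib2.urlopen(BASE_URL + '/').read()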
The Testing Key Management Facility
The testing facility manages only a single, constant key. This allows you to install a testing facility globally without storing the keys in the database, and still reuse a ZODB over multiple sessions.
>>> storage_dir = tempfile.mkdtemp()
>>> testingKeys = testing.TestingKeyManagementFacility(storage_dir)
Of course, the key generation service is supported:
>>> verify.verifyObject(interfaces.IKeyGenerationService, testingKeys)
True
However, you will always receive the same key:
>>> def getKeySegment(key):
...     return key.split('\n')[1]

>>> getKeySegment(testingKeys.generate())
'MIIBOAIBAAJBAL+VS9lDsS9XOaeJppfK9lhxKMRFdcg50MR3aJEQK9rvDEqNwBS9'
>>> getKeySegment(testingKeys.generate())
'MIIBOAIBAAJBAL+VS9lDsS9XOaeJppfK9lhxKMRFdcg50MR3aJEQK9rvDEqNwBS9'

>>> storage_dir = tempfile.mkdtemp()
>>> testingKeys = testing.TestingKeyManagementFacility(storage_dir)
>>> getKeySegment(testingKeys.generate())
'MIIBOAIBAAJBAL+VS9lDsS9XOaeJppfK9lhxKMRFdcg50MR3aJEQK9rvDEqNwBS9'
All other methods remain the same:
>>> key = testingKeys.generate()
>>> testingKeys.getEncryptionKey(key)
'_\xc4\x04\xbe5B\x7f\xaf\xd6\x92\xbd\xa0\xcf\x156\x1d\x88=p9{\xaa...'
We can also safely en- and decrypt:
>>> encrypted = testingKeys.encrypt(key, 'Stephan Richter')
>>> testingKeys.decrypt(key, encrypted)
'Stephan Richter'
Key Holder
The key holder is a simple class designed to store a key in RAM:
>>> from keas.kmi import keyholder
>>> holder = keyholder.KeyHolder(__file__)

>>> verify.verifyObject(interfaces.IKeyHolder, holder)
True
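In a real application you would typically register such a holder as the global IKeyHolder utility, so that code like the encrypted persistent objects described in the next section can find it. A minimal sketch, where 'kek.dat' stands in for a key file previously produced by the key generation service:

# Hypothetical registration; 'kek.dat' is an assumed path to a KEK file.
from zope.component import provideUtility
from keas.kmi import keyholder, interfaces

holder = keyholder.KeyHolder('kek.dat')
provideUtility(holder, interfaces.IKeyHolder)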
Encrypted Persistent Objects
This package provides an EncryptedPersistent class that takes care of data encryption in the storage. Usage is pretty simple: instead of subclassing persistent.Persistent, subclass keas.kmi.persistent.EncryptedPersistent:
>>> from keas.kmi.persistent import EncryptedPersistent
>>> class UserPrivateData(EncryptedPersistent):
...     def __init__(self, name, ssn):
...         self.name = name
...         self.ssn = ssn
...     def __repr__(self):
...         return '<UserPrivateData %s %s>' % (self.name, self.ssn)

>>> userdata = UserPrivateData('Stephan Richter', '123456789')
>>> userdata
<UserPrivateData Stephan Richter 123456789>
The key used for encryption and decryption comes from an IKeyHolder utility that you’re supposed to provide in your application.
>>> from keas.kmi.testing import TestingKeyHolder
>>> from zope.component import provideUtility
>>> provideUtility(TestingKeyHolder())
None of the raw data appears in the pickle:
>>> import cPickle as pickle
>>> pickled_data = pickle.dumps(userdata)
>>> 'Stephan' in pickled_data
False
>>> '123456789' in pickled_data
False
We can successfully load it:
>>> pickle.loads(pickled_data)
<UserPrivateData Stephan Richter 123456789>
Every persistent object is stored separately. Only the objects that inherit from EncryptedPersistent will be encrypted.
>>> import persistent.dict
>>> users = persistent.dict.PersistentDict()
>>> users['stephan'] = UserPrivateData('Stephan Richter', '123456789')
>>> users['mgedmin'] = UserPrivateData('Marius Gedminas', '987654321')

>>> pickled_data = pickle.dumps(users)
>>> 'stephan' in pickled_data
True
>>> '123456789' in pickled_data
False
Persistent References
Enough pickling; we really should make sure our magic does not interfere with ZODB keeping track of persistent object references.
First, let’s make our EncryptedPersistent objects have some references to other (encrypted and unencrypted) persistent objects:
>>> users['stephan'].__parent__ = users
>>> users['mgedmin'].__parent__ = users

>>> users['stephan'].friend = users['mgedmin']
>>> users['mgedmin'].friend = users['stephan']
Now let’s create a database:
>>> import ZODB.DB
>>> import ZODB.MappingStorage
>>> db = ZODB.DB(ZODB.MappingStorage.MappingStorage())
>>> conn = db.open()
>>> conn.root()['users'] = users
>>> import transaction
>>> transaction.commit()
And we can open a second connection (while carefully keeping the first one open, to ensure it’s not reused and we actually load the pickles rather than receiving persistent objects from a cache) and load the whole object graph:
>>> conn2 = db.open()
>>> users2 = conn2.root()['users']
>>> users2['stephan']
<UserPrivateData Stephan Richter 123456789>
>>> users2['mgedmin']
<UserPrivateData Marius Gedminas 987654321>
All the object references between persistent and encrypted persistent objects are preserved correctly:
>>> users2['stephan'].friend
<UserPrivateData Marius Gedminas 987654321>
>>> users2['mgedmin'].friend
<UserPrivateData Stephan Richter 123456789>

>>> users2['stephan'].__parent__ is users2
True
>>> users2['mgedmin'].__parent__ is users2
True
>>> users2['stephan'].friend is users2['mgedmin']
True
>>> users2['mgedmin'].friend is users2['stephan']
True
Data conversion
If you used to have simple persistent objects, and now want to convert them to EncryptedPersistent, think again. This is not secure. You already have unencrypted bits on your disk platters, and the only way to get rid of them is to physically destroy the disk.
But if you have a testing-only database with fake data, and would like to continue using it with a small conversion step, you can use the convert_object_to_encrypted() function.
>>> from keas.kmi.persistent import convert_object_to_encrypted
Here’s the old class definition that we’ll store:
>>> from persistent import Persistent
>>> class Password(Persistent):
...     def __init__(self, password):
...         self.password = password

>>> db = ZODB.DB(ZODB.MappingStorage.MappingStorage())
>>> conn = db.open()
>>> conn.root()['pwd'] = Password('xyzzy')
>>> transaction.commit()
And now we redefine the class:
>>> class Password(EncryptedPersistent):
...     def __init__(self, password):
...         self.password = password
Once again we have to use a different connection object (while keeping the first one alive) to avoid stepping on a ZODB cache:
>>> conn2 = db.open()
>>> pwd = conn2.root()['pwd']
If you try to use Password objects loaded from the database, you’ll get an error:
>>> pwd.password
Traceback (most recent call last):
...
ValueError: need more than 1 value to unpack
But we can apply the conversion step:
>>> convert_object_to_encrypted(pwd)
>>> pwd.password
'xyzzy'
The converted state is stored in the DB:
>>> transaction.commit()
>>> conn3 = db.open()
>>> pwd = conn3.root()['pwd']
>>> pwd.password
'xyzzy'
CHANGES
3.0.0 (2014-01-06)
Switched from M2Crypto to PyCrypto, since M2Crypto is not maintained anymore.
Switched from deprecated repoze.bfg to pyramid.
NOTE: While I found code online to make the switch from M2Crypto to PyCrypto backwards compatible, I have not tested that functionality. Please try this on your data and let me know if you have issues.
NOTE 2: PyCrypto does not allow 512-bit RSA keys, so I increased the key size to 2048 bits. Old 512-bit keys should still work, but new ones will always be larger now.
2.1.0 (2010-10-07)
Added a cache for unencrypted DEKs in the key management facility, like it was already done for the local key management facility. This increases encryption and decryption performance by an order of magnitude from roughly 2ms to 0.2ms.
2.0.0 (2010-09-29)
Refactored REST server to be a simple repoze.bfg application.
The encrypted data encrypting keys (DEKs) are now stored in a directory instead of the ZODB. This increases transparency in the data store and makes backups easier.
Added caching to directory-based facility, so we do not need to read files all the time.
1.1.1 (2010-08-27)
Fixed deprecation warnings about md5 and zope.testing.doctest.
1.1.0 (2010-08-25)
Feature: Updated code to work with Bluebream 1.0b3.
1.0.0 (2009-07-24)
Feature: Update to the latest package versions.
0.3.1 (2008-09-11)
Relax M2Crypto version requirements to 0.18 or newer.
0.3.0 (2008-09-04)
A simple KeyHolder utility is available in keas.kmi.keyholder.
0.2.0 (2008-09-04)
Sample server shows how to enable SSL
Front page now shows the number of stored keys instead of a ComponentLookupError message.
Command-line client for testing a remote Key Management Server
Bugfix: LocalKeyManagementFacility was broken (AttributeError: ‘RESTClient’ object has no attribute ‘POST’)
0.1.0 (2008-09-03)
Initial Release
Key Generation Service
Encryption Service (Master and Local)
REST API for key communication between encryption services
Encrypted Persistent Storage