February 2015 | SafeLogic

Archive for February, 2015

8 Feb 2015

On Encryption Keys (and Anthem) – Part 2 of 2

The Anthem breach encouraged me to wrap up this blog series and talk about key management in a genuine security context. When the Anthem breach first became public, it looked as if patient records were accessed because of a lack of data encryption. Then Anthem stated the real reason for the breach: they only encrypt data in flight to/from the database(s) and rely on user credentials for access to data in the database. Why didn't they encrypt the data in the database? Well, per Health Insurance Portability and Accountability Act (HIPAA) requirements, they don't have to, as long as they protect the data via other means. Like elevated credentials.

That worked well, didn’t it?

They were compliant, but obviously not secure. To add more security to compliance programs like HIPAA, there have been some cries for enterprises to implement encryption. So how do you encrypt data properly? Well, it all depends on your environment, the sensitivity of the data, the threat models, and any tangible requirements for regulatory compliance. Here are some general guidelines:

  • Use validated encryption.
  • Use strong, well-generated keys.
  • Manage the keys properly.

Use validated encryption. Federal Information Processing Standard (FIPS) 140 is the gold standard. The Advanced Encryption Standard (AES) is one of the FIPS-approved algorithms for data encryption, and it is a better encryption algorithm than what Joe the Computer Science Intern presented in his thesis project. It just is. Plus, part of the FIPS 140 process involves strenuous black box testing of the algorithms to ensure they’re implemented properly. This is crucial for interoperability, and proper implementation of the AES standard also provides a measure of confidence that there aren’t leaks, faults, etc. Always look for the FIPS 140 certificate for your encryption solution.
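To make the black-box testing idea concrete, here is a minimal known-answer test in Python. The test vector is the well-known SHA-256 digest of "abc" published in FIPS 180; FIPS validation labs run large batteries of vectors like this against each algorithm implementation. This is an illustrative sketch, not the actual CAVP harness.

```python
import hashlib

# A known-answer test: hash a fixed input and compare against the
# published expected digest. If the implementation deviates from the
# standard in any way, the digest won't match.
msg = b"abc"
expected = ("ba7816bf8f01cfea414140de5dae2223"
            "b00361a396177a9cb410ff61f20015ad")  # SHA-256("abc") per FIPS 180

assert hashlib.sha256(msg).hexdigest() == expected
```

A validated module has passed thousands of such vectors across every approved algorithm and mode it claims.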

Use well-generated keys. A password-based key (PBK) is crap. Here a key is derived from a password by running it through a message digest function. PBKs are crap because most passwords are crap: they're subject to brute-force attack and just should not be used. Password-Based Key Derivation Function v2 (PBKDF2) makes password-based keys a bit stronger by conditioning the digest with a random element (called a salt) and many hash iterations to slow down brute-force attempts. But the threat is still there.

Keys should be as unpredictable and "random" as possible. Unfortunately, in software environments it's difficult to obtain truly random data because computers are designed to function predictably (if I do X, then Y happens). But let's say you can get provably random data from your mobile device or your appliance. Use that to feed a conditioning algorithm and/or pseudorandom number generator. Then use that output for your key.

Use strong keys. The strength of a key depends on how it's generated (see above) and how long the key is. For example, the AES algorithm can accommodate key sizes of 128, 192, or 256 bits. Consider using a key size that correlates to the overall sensitivity of your data. In Suite B, 256-bit keys can be used to protect classified data at the Top Secret level. Is your data tantamount to what the government would consider Top Secret?

Also consider the environment. Constrained and embedded environments (think wearables) may not have the processing power to handle bulk encryption with 256-bit keys. Or maybe data is ephemeral and wiped after a few seconds and therefore doesn’t need “top secret level” encryption. Or maybe there’s just not enough space for a 256-bit key.

Use a key that is strong enough to protect the data within the constraints of the environment and one that can counter the threats to that environment.

Manage your keys properly. You wouldn’t leave the key to your front door taped to the door itself. Hopefully you don’t put it under the doormat either. What would be the point of the lock? The same applies to information security. Don’t encrypt your data with a strong, properly generated data encryption key (DEK) then leave that key under the doormat.

Consider a key vault and use key encryption keys (KEK) to encrypt the data encryption keys. Access to this key vault or key manager should also be suitably locked down and tightly controlled (again, many different ways to do this). Otherwise you might as well just not encrypt your data.
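The KEK/DEK pattern can be sketched in a few lines. A caveat up front: real systems should wrap keys with AES key wrap (NIST SP 800-38F) or an authenticated mode like AES-GCM; since Python's stdlib has no AES, this sketch substitutes an HMAC-SHA256 counter-mode keystream purely to illustrate the envelope structure, and it omits the integrity check a production wrap must have.

```python
import hashlib
import hmac
import secrets

def _keystream(kek: bytes, nonce: bytes, length: int) -> bytes:
    """HMAC-SHA256 counter-mode keystream -- a stdlib stand-in for AES."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(kek, nonce + counter.to_bytes(4, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def wrap_key(kek: bytes, dek: bytes) -> tuple[bytes, bytes]:
    """Encrypt the DEK under the KEK; only the wrapped form is stored."""
    nonce = secrets.token_bytes(16)
    ks = _keystream(kek, nonce, len(dek))
    return nonce, bytes(a ^ b for a, b in zip(dek, ks))

def unwrap_key(kek: bytes, nonce: bytes, wrapped: bytes) -> bytes:
    """Recover the DEK; requires access to the KEK in the vault."""
    ks = _keystream(kek, nonce, len(wrapped))
    return bytes(a ^ b for a, b in zip(wrapped, ks))

kek = secrets.token_bytes(32)   # key encryption key, held in the vault
dek = secrets.token_bytes(32)   # data encryption key, used for bulk data

nonce, wrapped = wrap_key(kek, dek)
assert unwrap_key(kek, nonce, wrapped) == dek   # round-trips correctly
assert wrapped != dek                           # stored form is useless alone
```

The payoff of this structure: the database only ever stores `wrapped`, so compromising storage without compromising the vault yields nothing usable.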

While we’re at it: rotate your keys, especially your KEKs. Key rotation essentially means “key replacement” … and it’s a good idea in case a key or system is compromised. When you replace a key, be sure to overwrite the old key material with Fs or 0s to reduce any chance of recovery.

Store those DEKs encrypted with KEKs and protect those KEKs with tools and processes. And remember to balance security with usability: rotating your KEK every 2 seconds might be secure, but is your system usable?

Anthem wanted the data to be useful, which is why it wasn’t encrypted at the database. But that usability came at a high cost. The good news is that it is possible to encrypt data and have it be usable.

 


Encryption is a critical, necessary piece of a system’s overall security posture. But it’s not the sole answer. In Anthem’s case, records were accessed via those “elevated user credentials” … which means that malicious hackers were able to get into the authentication server and raise the privilege levels of user credentials (usernames/passwords) that they either knew or gleaned from the auth server. So in this case, it’s irrelevant whether the breached data was encrypted; the hackers had authenticated, authorized access to it.

So what’s the answer?

When this was first reported, I tweeted this:

[Embedded tweet: “Encryption Keys, Part 1: What Are Keys, Exactly?”]

Defense in depth means providing security controls to address all aspects of the system: people, process, and technology. Technology is the most difficult pillar to lock down because there are so many layers and threats, hence so many product categories: firewalls, IDS/IPS, anti-APT appliances, SIEM, 2FA, AV, smart cards, cloud gateways, etc.

Encryption is a fundamental element of security for data at rest and data in motion (control plane and data plane). Even the strongest encryption with proper key management won’t protect data accessed by a user holding valid credentials, because the data has to be usable. However, encrypted data and tight management of keys provide a critical, necessary piece of a robust security posture.

I hope this provides some guidance on how to think about encryption and key management in your organization.

 

- Ray

3 Feb 2015

Privacy, Liberty & Encryption

David Cameron

It is unfortunate that, in the aftermath of the Charlie Hebdo murders and hate crimes in France, rallying cries for freedom of speech were twisted to interpret “free” speech as the opposite of “private” speech. A few weeks ago, British Prime Minister David Cameron spoke out, saying that “we must not allow terrorists safe space to communicate with each other” and going on to suggest that there should be no means of communication which the government cannot read. I’m in no way sympathetic to extremists or rebels who leverage privacy to plan nefarious and destructive acts, but I am certainly sympathetic to all of the innocent, law-abiding citizens whose civil rights would be trampled by such a policy.

It was just a few short months ago that certain US government officials cried foul when Apple solidified their encryption capabilities to the point that consumer data could not be deciphered, even under federal subpoena. As Matthew Green wrote on Slate.com at the time, “Designing backdoors is easy. The challenge is in designing backdoors that only the right people can get through. In order to maintain its access to your phone, Apple would need a backdoor that allowed them to execute legitimate law enforcement requests, while locking hackers and well-resourced foreign intelligence services out.” For this reason, among myriad others, Apple relieved themselves of the headache and built the ‘Secure Enclave’ instead. Individual iPhones encrypt extended data using a unique key, mathematically derived by combining the user’s passcode with a set of secret numbers that are built into the phone. Tim Cook himself couldn’t decrypt it without the user’s passcode and physical access to the device. By extension, Apple is now rid of thousands of subpoena requests and pressure from a variety of global governments.
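The idea of entangling a passcode with a per-device secret can be sketched in a few lines. To be clear, this is a simplified illustrative model, not Apple's actual construction (the Secure Enclave uses dedicated hardware and its own key-derivation design); the device UID and iteration count below are assumptions for the sketch.

```python
import hashlib
import secrets

# Hypothetical per-device secret fused into hardware at manufacture;
# in a real device it never leaves the chip.
device_uid = secrets.token_bytes(32)

def derive_key(passcode: str, uid: bytes) -> bytes:
    # Entangle the passcode with the device secret, so the key can only
    # be derived on this physical device AND with the right passcode.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 200_000)

key = derive_key("123456", device_uid)
assert key == derive_key("123456", device_uid)   # same inputs, same key
assert key != derive_key("654321", device_uid)   # wrong passcode, wrong key
```

The consequence is exactly what the post describes: without both the physical device (the UID) and the passcode, nobody, vendor included, can reconstruct the key.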

Despite the claims that law enforcement’s hands would be tied by this development in time sensitive situations such as kidnapping cases, Bruce Schneier asserted in a CNN editorial that “of the 3,576 major offenses for which warrants were granted for communications interception in 2013, exactly one involved kidnapping.”  So much for that theoretical importance of maintaining access to user phones.  More importantly, Schneier points out that phone data “can no longer be accessed by criminals, governments, or rogue employees. Access to it can no longer be demanded by totalitarian governments.”

There is another complication. Even if the FBI and other US law enforcement agencies were the absolute pinnacle of tech-fueled crime-fighting, and the removal of communication intercepts truly shackled their efforts… at least this closes the door to other, more suspect governments. Apple, Samsung, and the rest can’t really play international favorites, after all. If they were able and willing to provide backdoor access to the USA, they would face the same obligations to North Korea as well.

Apple washed their hands of the encryption problem by abdicating their role as middleman and gatekeeper, and the internet didn’t break. Law enforcement and other agencies seem to still be solving crimes, even without their former favorite toy. Possibly most important, the ship has sailed before another government could flex its muscles, the way Iran banned WhatsApp and India forced Blackberry to provide a law enforcement backdoor. The UK has long been a supporter of citizens’ rights and privacy. Thankfully, Apple ended this conversation long before the Prime Minister’s kneejerk reaction, wishing out loud for a technology-driven vaccination against terrorism. We can only hope that other phone manufacturers follow suit quickly.

I sympathize with the victims in France.  I understand the sentiments of David Cameron.  But now, more than ever, it is crucial that we protect our liberty by protecting our privacy.  If we are forced to sacrifice our rights, we have already lost the war.

- Walt