There has been a ton of chatter about the recent fines levied by the U.S. Department of Health and Human Services Office for Civil Rights (OCR), and for good reason. Money talks.
The Department of Health and Human Services (HHS) assessed a record $4.8 million in fines against New York and Presbyterian Hospital and Columbia University after the two submitted a joint breach report dating back to September 27, 2010. And to resolve a HIPAA breach case from more than two years prior, Concentra Health Services and QCA Health Plan, Inc. agreed in April 2014 to pay a combined $1,975,220. That’s right, nearly seven million bucks combined. These must have been just ridiculously egregious breaches, you say.
Well, not exactly.
In the first case, which produced the highest HIPAA-related fines yet, Patrick Ouellette reports that the electronic protected health information (ePHI) of 6,800 patients was exposed to internet search engines after an application developer deactivated a personally-owned server. My guess is that the dev didn’t do a comprehensive wipe on his testing machine, so when he started his next project… ouch.
In the second case, a laptop was stolen from an employee’s car outside a physical therapy center in Missouri. It contained the ePHI of 148 patients… and the laptop had not been properly encrypted. That missing encryption was the key ingredient in becoming a major example set by the OCR.
Although HIPAA regulations draw no distinction between health information that is more sensitive (oncology lab test results, Centers for Disease Control type stuff, etc.) and clearly less sensitive (patient progress reports while rehabbing a torn meniscus, for example), we can say with reasonable confidence that this small local physical therapy center’s data was likely in the latter category. But like I said – HIPAA makes no distinction. The relatively small pool of potentially affected patients made no difference, either. The OCR’s investigation yielded evidence of persistent compliance violations and a general policy of ignoring the regulations. That is the recipe for trouble, and the financial repercussions are clearly major. Encryption is a baseline for medical data security. It should be considered a non-negotiable starting point, but certain institutions continue to drag their heels.
Let me paint you a picture. To seasoned criminals planning a heist specifically to harvest patient data, encryption is a deterrent, but it does not make a system impregnable. Virtually all of these incidents, though, are inadvertent – a lost tablet here, a stolen laptop there. Encryption is extremely effective in these scenarios, keeping an equipment loss from escalating into a full-blown breach. In short, it keeps a petty theft from becoming a data breach. It ensures that the local juvenile delinquent who puts a brick through a window just to grab a four-year-old PC to sell for drug money will be doing exactly that – and nothing more. The laptop will be a brick itself shortly thereafter, and you can be confident that the smash-and-grab will not expose patient data in plain text, and will not come with a two million dollar price tag. The ePHI is safely obfuscated, and your biggest problem will be deploying a new laptop to your employee.
Simple, right? Then why is this still like pulling teeth for some providers?
If the financial penalties aren’t a strong enough motivator, litigation is on the table as well. California’s Attorney General Kamala Harris has made no secret of her interest in the topic, offering medical identity theft prevention tips to the public this fall. This winter, Harris filed suit on behalf of California against Kaiser over a 2011 breach in which an unencrypted external hard drive was sold at a thrift store, containing a plain-text database of Kaiser employees’ Social Security numbers, dates of birth, and addresses… and, oh yeah, their family members’ information too. Worse, Kaiser only alerted about 10,000 of the 30,000+ affected employees. Not pretty.
So now you’ve got the OCR looking to fine you, the Attorney General suing you, and we’re not done yet. Just like in the enterprise environment, you can’t even rest once you’ve trained employees and given them some tools. You still need to safeguard against disgruntled or malicious employees. “You won’t give me a new laptop? Fine. I’ll just ‘lose’ this old one.” Or worse, “You won’t give me that 5% raise? Fine. I’ll just ‘lose’ my unencrypted device and we’ll see how much you’ll pay.” Scary stuff.
The stance of the OCR is clear, and it is straightforward. “Covered entities and business associates must understand that mobile device security is their obligation,” said Susan McAndrew, OCR’s since-retired deputy director of health information privacy. “Our message to these organizations is simple: encryption is your best defense against these incidents.”
It should be a no-brainer, but we continue to see companies holding out. Iron Mountain’s Director of Product Management for Healthcare IT, Michael Leonard, commented recently on this. “From our perspective, it is – I’ll say ‘puzzling’ – that organizations don’t encrypt more of the content even within their four walls.”
I’m not sure ‘puzzling’ is strong enough. Idiotic, maybe? Concentra Health Services and QCA Health Plan, Inc. were forced to cough up more than $13,000 per patient whose record was exposed, and that may be just the tip of the proverbial iceberg. HHS Chief Regional Civil Rights Counsel Jerome Meites predicted an increase over the $10 million in fines assessed by the agency in the last twelve months. “Knowing what’s in the pipeline, I suspect that that number will be low compared to what’s coming up.” That’s ominous, and it should be a wake-up call to anyone who thinks they can simply fly under the radar.
The reality is that encryption should be automatic. It should be offered in every software solution deployed to healthcare providers at every level. To help reinforce the transition, SafeLogic provides FIPS 140-2 validated encryption for these solutions. Remember, in the eyes of the federal government, only cryptographic modules validated by the CMVP are considered acceptable, and that assessment extends to the healthcare industry as well. HIPAA does not yet explicitly require FIPS 140-2 validated encryption, but customer requests already do, and the writing is on the wall for future versions of the standard.
For more information on the role of validated encryption in HIPAA regulations, please download SafeLogic’s whitepaper.