
Archive for June, 2014

24 Jun 2014

Pro Tip: Encrypt Medical Data

There has been a ton of chatter about the recent fines levied by the U.S. Department of Health and Human Services Office for Civil Rights (OCR), and for good reason.  Money talks.

The Department of Health and Human Services (HHS) assessed a record $4.8 million in fines against New York and Presbyterian Hospital and Columbia University, after they submitted a joint breach report dating back to September 27, 2010.  And in April 2014, Concentra Health Services and QCA Health Plan, Inc. agreed to pay a combined $1,975,220 to resolve HIPAA breach cases from more than two years prior.  That’s right, nearly seven million bucks combined.  These must have been just ridiculously egregious breaches, you say.

Well, not exactly.

In the first case, resulting in the highest HIPAA-related fines yet, Patrick Ouellette reports that the electronic protected health information (ePHI) of 6,800 patients was exposed to internet search engines, related somehow to an application developer’s deactivation of a personally-owned server.  My guess is that the dev didn’t do a comprehensive wipe on his testing machine, so when he started his next project… ouch.

In the second case, laptops were stolen – one from Concentra’s physical therapy center in Missouri, and one from a QCA employee’s car, the latter containing the ePHI of just 148 patients.  Neither laptop had been properly encrypted, and that was the key ingredient in becoming a major example set by the OCR.

Although HIPAA regulations draw no distinction between health information that is more sensitive (oncology lab test results, Centers for Disease Control type stuff, etc.) and clearly less sensitive (patient progress reports while rehabbing a torn meniscus, for example), we can say with reasonable confidence that the physical therapy center’s data was likely in the latter category.  But like I said – HIPAA makes no distinction.  The relatively small pool of potentially affected patients made no difference, either.  The OCR’s investigation yielded evidence of persistent compliance violations and a general policy of ignoring the regulations.  That is the recipe for trouble, and the financial repercussions are clearly major.  Encryption is a baseline for medical data security.  It should be considered a non-negotiable starting point, but certain institutions continue to drag their heels.

Let me paint you a picture.  To seasoned criminals planning a heist specifically to harvest patient data, encryption is a deterrent, but does not make a system impregnable.  By comparison, virtually all of these incidents are inadvertent – a lost tablet here, a stolen laptop there.  Encryption is extremely effective in these scenarios, keeping an equipment loss from escalating into a full-blown breach.  In short, it keeps a petty theft from becoming a hack.  It ensures that the local juvenile delinquent who puts a brick through a window just to grab a four-year-old PC to sell for drug money will be doing that – and nothing more.  The laptop will be a brick itself shortly thereafter, and you can be confident that the smash-and-grab will not expose patient data in plain text, and will not yield a two million dollar price tag.  ePHI is safely obfuscated, and your biggest problem will be deploying a new laptop to your employee.
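To make that concrete, here is a minimal sketch of encrypting a record at rest with AES-256-GCM, using Python’s third-party cryptography package.  The record and file name are hypothetical, and a real deployment would lean on full-disk encryption plus a managed key store rather than a key generated beside the data – the point is simply how little code stands between a stolen laptop and a non-event.

```python
# Minimal sketch (not SafeLogic's module): encrypting a patient record
# at rest with AES-256-GCM via the third-party 'cryptography' package
# (pip install cryptography).  Record contents and file name are
# hypothetical, for illustration only.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

record = b"Patient 1047: post-op progress notes, torn meniscus rehab"

key = AESGCM.generate_key(bit_length=256)  # in practice: fetch from a key
aesgcm = AESGCM(key)                       # manager, never store beside the data

nonce = os.urandom(12)                     # must be unique per encryption
ciphertext = aesgcm.encrypt(nonce, record, None)

# The disk holds only nonce + ciphertext; a thief with the laptop but
# without the key recovers nothing but noise.
with open("patient_record.enc", "wb") as f:
    f.write(nonce + ciphertext)
```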

Simple, right?  Then why is this still like pulling teeth for some providers?

Kamala Harris

If the financial penalties aren’t a strong enough motivator, litigation is on the table as well.  California’s Attorney General Kamala Harris has made no secret of her interest in the topic, offering medical identity theft prevention tips to the public last fall.  This past winter, Harris filed suit on behalf of California against Kaiser concerning a 2011 breach in which an unencrypted external hard drive was sold at a thrift store, containing a plain-text database of Kaiser employees – Social Security numbers, dates of birth, addresses… and oh yeah, their family members’ information too.  Worse, Kaiser only alerted about 10,000 of the 30,000+ affected employees.  Not pretty.

So now you’ve got the OCR looking to fine you, the Attorney General suing you, and we’re not done yet.  Just like in the enterprise environment, you can’t even rest once you’ve trained employees and given them some tools.  You still need to safeguard against disgruntled or malicious employees.  “You won’t give me a new laptop?  Fine.  I’ll just ‘lose’ this old one.” Or worse, “You won’t give me that 5% raise? Fine. I’ll just ‘lose’ my unencrypted device and we’ll see how much you’ll pay.” Scary stuff.

Susan McAndrew

The stance of the OCR is clear, and it is straightforward.  “Covered entities and business associates must understand that mobile device security is their obligation,” said Susan McAndrew, OCR’s since-retired deputy director of health information privacy.  “Our message to these organizations is simple: encryption is your best defense against these incidents.”

Michael Leonard

It should be a no-brainer, but we continue to see companies holding out.  Iron Mountain’s Director of Product Management for Healthcare IT, Michael Leonard, commented recently on this.  “From our perspective, it is – I’ll say ‘puzzling’ – that organizations don’t encrypt more of the content even within their four walls.”

I’m not sure ‘puzzling’ is strong enough.  Idiotic, maybe?  Concentra Health Services and QCA Health Plan, Inc. were forced to cough up a combined $1,975,220 – more than $13,000 for each of the 148 records on that stolen laptop – and that may be just the tip of the proverbial iceberg.  HHS Chief Regional Civil Rights Counsel Jerome Meites predicted that fines will climb well past the roughly $10 million the agency has assessed in the last twelve months: “Knowing what’s in the pipeline, I suspect that that number will be low compared to what’s coming up.”  That’s ominous, and it should be a wake-up call to anyone who thinks that they can simply fly under the radar.

The reality is that encryption should be automatic.  It should be offered in every software solution deployed to healthcare providers at every level.  To help reinforce the transition, SafeLogic provides FIPS 140-2 validated encryption for these solutions.  Remember, in the eyes of the federal government, only cryptographic modules validated by the CMVP (Cryptographic Module Validation Program) are considered acceptable, and that assessment has extended to the healthcare industry as well.  HIPAA does not yet explicitly require FIPS 140-2 validated encryption, but customer requests already do, and the writing is on the wall for future versions of the standard.
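As a rough illustration of what “using a validated module” can look like at the code level, here is a sketch that asks OpenSSL to enter FIPS mode at startup.  It assumes an OpenSSL 1.0.x build that includes the FIPS Object Module – library name and availability vary by platform, and this is not SafeLogic’s API, just the stock OpenSSL entry points.

```python
# Minimal sketch: force OpenSSL (1.0.x + FIPS Object Module) into FIPS
# mode before any cryptography runs.  Assumes such a build is installed
# on this machine; library name varies by platform.
import ctypes
import ctypes.util

libcrypto = ctypes.CDLL(ctypes.util.find_library("crypto"))

# FIPS_mode_set(1) runs the module's integrity self-tests and returns 1
# on success, 0 if the validated module is missing or the tests fail.
if libcrypto.FIPS_mode_set(1) != 1:
    raise RuntimeError("FIPS 140-2 validated module is not available")

print("FIPS mode active:", bool(libcrypto.FIPS_mode()))
```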

For more information on the role of validated encryption in HIPAA regulations, please download SafeLogic’s whitepaper.

– Walt

18 Jun 2014

Tizen, Connected Cars and Buggy Whips

Two weeks ago, I had the privilege of giving a presentation at the 2014 Tizen Developer Conference.

The first thing that you should know is that this was a fantastic event.  Most of us will hear “user group” or “developer conference” and reminisce about our own early experiences, the coffee-and-donuts geek meetups, complete with a folding chair for each wannabe Wozniak.  This was much more.  With a variety of speakers tackling an equally diverse set of topics over a three-day stretch, and a significant investment of time, money and energy from Intel and Samsung, I highly recommend attending in 2015 if possible.  It was a very smooth and well-coordinated conference, for speakers, attendees and exhibitors alike.

The second thing that you should know is that my session rocked.  ‘Security-Centric Development for IoT and Wearables’ was one of the few talks that had a specific focus on data protection.  My hope is that I was able to influence attendees to consider security as a non-negotiable aspect of their development efforts, and maybe next year we will see more like-minded sessions on the agenda.  At the very least, I had fun launching SafeLogic footballs into the audience and nobody got a concussion.

To be honest, I was blown away by the ideas bouncing among the audience.  There were developers from seemingly every corner of technology, all with a vision of success based on the same operating system.  It was inspiring to see how many different folks saw potential in the same place.  Since the conference, it has felt like everywhere I look, there is another potential application for Tizen, another opportunity to join the Internet of Things and another chance to connect.  The scary part is that it all has to be secured.  Remember, IoT is only as strong as the weakest link.

One session at the Tizen Developer Conference included a discussion of the connected car collaboration efforts of the Linux Foundation, IBM, Intel and Local Motors.  It made me think of the article I had just read on CNN, aptly titled ‘Your car is a giant computer – and it can be hacked’.  Scary stuff, and spot on.

The Toyota Prius has solidified its place in the garage of everyday Americans based upon efficiency, not horsepower, and has been immortalized as the test mule for Google’s self-driving car project.  Tesla’s fully electric Model S was the top-selling full-size luxury sedan in 2013… not bad for a vehicle designed by tech geeks.  Google has pushed the envelope even further now, internally developing prototypes for an all-new self-driving vehicle that incorporates features of both.  The landscape is clearly changing – and quickly.

Steering wheels are the next buggy whip, and data security will be more important to safe transportation than seatbelts.  Driver error will be replaced by the threat of compromised communications.  Could you imagine arriving at your destination, only to find yourself at a location chosen by a malicious hacker?  Or having your vehicle overridden and driven into a wall, off a cliff, or into a lake?  There is serious potential in self-driven cars, but even more serious potential for disaster.

The Tizen platform is not uniquely vulnerable to these threats.  All of IoT inherently is.  A smart toaster in your kitchen has to be as secure as your car, even though it isn’t 3000 pounds of metal going 70 miles per hour.  Until developers begin treating all devices with the same level of respect, I encourage all of us to tread carefully.  Hackers relish the challenge of creating mischief as much as they value the results, so assume that you may be a target.  We all are.

If you are a developer in IoT, please check out CryptoCompact.  We have begun our pilot program, so consider it an open invitation to integrate military-grade encryption within your project.  We’re all in this together, so let’s stay safe.

– Ray

12 Jun 2014

SafeLogic Doesn’t Sell to the NSA

It’s not that we don’t appreciate the work of the National Security Agency here at SafeLogic.  Really, it’s quite impressive.  We certainly are thankful for the work of Homeland Security and the DoD.  And we absolutely, unequivocally, 100% support the men and women who have served in our national military.  We are red-blooded American patriots, who believe in life, liberty, and the pursuit of happiness.  And that is precisely why we do not work with the NSA.

Several significant events have come to light that call their ethics into question, and I’m not even talking about Snowden, Wiebe or any of the other whistleblowers.

Many are still reeling from the revelations surrounding the reported ten million dollar bribe that RSA Security took from the NSA, in exchange for making Dual EC DRBG the default algorithm in RSA BSAFE, the most popular proprietary encryption module in the business.  Had it been publicly known at the time, the transaction would have raised eyebrows and prompted hard questions.  Instead, it remained in the shadows for years, until Dual EC DRBG was exposed as a backdoor allowing the NSA to decrypt information at will and the connection was made.  It was a betrayal by both RSA and the NSA, and disappointing to say the least.
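For the curious, the backdoor is simple enough to sketch.  In the standard description of Dual EC DRBG, P and Q are fixed public curve points, s_i is the secret internal state, and x(·) takes a point’s x-coordinate (output truncation is omitted here for clarity):

```latex
% Dual EC DRBG state update and published output:
\[
  s_{i+1} = x(s_i P), \qquad r_i = x(s_i Q)
\]
% If the fixed points were generated so that P = dQ for a secret
% scalar d, then from one output r_i an attacker recovers the point
% R = s_i Q (up to sign) and computes:
\[
  dR = d(s_i Q) = s_i (dQ) = s_i P
  \quad\Longrightarrow\quad
  x(dR) = s_{i+1}
\]
% One observed output yields the next internal state, and with it
% every subsequent "random" number the generator produces.
```

Whoever generated the constants could hold such a d – and the provenance of P and Q was never publicly explained.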

More recent is the allegation that the NSA had knowledge of the Heartbleed bug and exploited the vulnerability almost from its creation, for approximately two years before the flaw was publicly identified.

Ignore the political debate about whether or not the NSA has a right to, or ought to, spy on Americans in order to ensure our safety.  They found the bug and didn’t tell anyone!  Even their own denial contained an implicit admission: “Reports that NSA or any other part of the government were aware of the so-called Heartbleed vulnerability before 2014 are wrong.”

OK… so by their own phrasing, they could have been exploiting Heartbleed since January 1st.  Best case scenario, the NSA took advantage of Heartbleed for only about 90 days before its public disclosure in April.  Should we feel any less betrayed?

Look, I understand that it’s not in the NSA’s job description to fix problems created by private industry.  Heartbleed was certainly a black eye for the volunteer team that shipped the TLS heartbeat extension in OpenSSL’s 2012 update.  No argument there.  But the NSA motto itself reads “Defending Our Nation. Securing The Future.”  Doesn’t that include sounding the alarms when an estimated two-thirds of the world’s web servers are at risk?  Doesn’t defending our nation include defending our intellectual property?  The assumption is that “Securing The Future” refers to the future of the American way of life, which is tightly aligned with our capitalist free market economy.  But that apparently was vulnerable for two years!  So no… I don’t think the NSA lived up to its mantra.

SafeLogic’s allegiance is to our customers, and our customers’ loyalty is to their own customers.  At the end of the day, our success is measured by whether we did everything possible to ensure the security of the end users’ information.  Since our inception in 2012, the answer has been a resounding “Yes!” every single day.  Any partnership, association, or agreement with the NSA would undermine that singular goal.

That’s why we don’t sell our encryption to the National Security Agency.

– Walt