Blog | SafeLogic

Blog

11 Jul 2014

Glass for the Masses

Wearables and the surrounding culture are evolving to the next generation right before our very eyes, and Google is firmly in the vanguard with the notorious Glass.

In the first months of 2014 alone:

San Diego traffic court heard a case against a Glass-wearing driver

Homeland Security interrogated Ohio man wearing Glass in a movie theater

VSP, the #1 vision insurance provider, announced eligibility of Glass for subsidies

Google added sunglasses and prescription frames to Glass lineup

Do you see a pattern?  I sure do.  Growing pains, and lots more to come.

Google is making a strategic effort to make Glass more accessible, but they have fallen short, yielding a not-quite positive reputation for their early adopters.  Perhaps any press is good press for Google, but I think it says something when “Glasshole” had been an entry in the Urban Dictionary for nearly a year before the device was even available for public sale.  Wearables are clearly poised for mainstream domination, but the public is just as clearly not ready to accept it yet.

The issue is a lack of hands-on experience by the masses.  As Keith Barrett pointed out in his blog, by slashing the price, Google could put the Glass into the hands of millions.  It would no longer be a novelty toy for the elite nerds who want to demonstrate their status.  The average American would become the advocate, knocking down barriers, removing stigma, and encouraging everyone to see the positive applications for the technology.  The everyman is a very powerful demographic, and it’s the only one that can combat the current notoriety of the Glass.

So let’s talk about actual, productive ways to integrate Glass into our normal lives.

Why are we not rolling out law enforcement apps for Glass that include real-time database reference for license plates and facial recognition?  That would be so much more productive than ignoring the topic until traffic cops pull over a blogger looking for publicity.

Why are we not deploying Glass in movie theaters to offer subtitles for deaf or non-English speakers?  That seems like a better option than calling in federal agents to investigate a potential bootlegger.

If we have subsidies to burn with insurance companies, why are we not developing Glass apps to help teachers in the classroom?  Imagine if a teacher could quantitatively measure the attention span of a room of first graders while engaging with them.  How about apps for health inspectors while in a commercial kitchen?  Or taxi drivers?  Or race car drivers?

The potential of Wearables, and specifically heads-up displays and augmented vision such as Glass, is vast.  I just hope that we can begin to truly embrace it as a culture soon.

– Walt

3 Jul 2014

Happy Independence Day!

Wow.  It feels like just yesterday that I blogged about the importance of our freedom and opportunity, and how thankful I am to be thriving in the USA.  That was a year ago.  In ‘SafeLogic Time’, where we try to compress a month’s work into a week, and a year into a month, this feels more like a decade!

Much has happened since the summer of ’13.  I encourage you to go back and re-read some of our blog posts to recap.  It’s really pretty interesting to hark back to the challenges that we were facing last year and how we have solved some, while others are very much still threatening.  We will be selecting certain posts as suggested reading for what the Twitterverse likes to call #ThrowbackThursday… although I know that Walt really enjoyed X-Men: Days of Future Past, so that might be contributing to the retro theme too.

Some things have definitely remained the same.  SafeLogic still pursues innovation in security and encryption, prioritizing the safety, privacy and liberty of our customers, and our customers’ customers.  I’m still thankful and proud to be an American, and I’m still planning to grill, watch fireworks and put away a few cocktails.

In a landscape strewn with failed companies, startups deeply in the red, and ousted executives, I’m excited for Independence Day.  I have a lot of pride as I continue to lead this company, as SafeLogic continues to operate independently, at a profit, and with no venture debt.  It’s the clearest, most direct way that I can say definitively that we will be here when you need us.  Next month, next year, or whenever you’re ready.

Happy Independence Day!

– Ray

24 Jun 2014

Pro Tip: Encrypt Medical Data

There has been a ton of chatter about the recent fines levied by the U.S. Department of Health and Human Services Office for Civil Rights (OCR), and for good reason.  Money talks.

The Department of Health and Human Services (HHS) assessed a record $4.8 million in fines against New York-Presbyterian Hospital and Columbia University, after they submitted a joint breach report dating back to September 27, 2010.  And to resolve a HIPAA breach case from over two years prior, Concentra Health Services and QCA Health Plan, Inc. agreed to pay a combined $1,975,220.00 in April 2014.  That’s right, nearly seven million bucks combined.  These must have been just ridiculously egregious breaches, you say.

Well, not exactly.

In the first case, resulting in the highest HIPAA-related fines yet, Patrick Ouellette reports that the electronic personal health information (ePHI) of 6800 patients was exposed to internet search engines, related somehow to an application developer’s deactivation of a personally-owned server.  My guess is that the dev didn’t do a comprehensive wipe on his testing machine, so when he started his next project… ouch.

In the second case, a laptop was stolen from an employee’s car outside of a physical therapy center in Missouri.  It contained ePHI of 148 patients… and the laptop had not been properly encrypted.  This was the key ingredient to becoming a major example set by the Office of Civil Rights (OCR).

Although HIPAA regulations draw no distinction between health information that is more sensitive (oncology lab test results, Centers for Disease Control type stuff, etc) and clearly less sensitive (patient progress reports while rehabbing a torn meniscus, for example), we can say with reasonable confidence that this small local physical therapy center’s data was likely in the latter category.  But like I said – HIPAA makes no distinction.  The relatively small pool of potentially affected patients made no difference, either.  The OCR’s investigation yielded evidence of perpetual compliance violations and a general policy of ignoring the regulations.  That is the recipe for trouble, and the financial repercussions are clearly major.  Encryption is a baseline for medical data security.  It should be considered a non-negotiable starting point, but certain institutions continue to drag their heels.

Let me paint you a picture.  To seasoned criminals planning a heist specifically to harvest patient data, encryption is a deterrent, but does not make a system impregnable.  By comparison, virtually all of these incidents are inadvertent – a lost tablet here, a stolen laptop there.  Encryption is extremely effective in these scenarios, to keep the equipment loss from escalating to a full-blown breach.  In short, it keeps a hack from becoming a hacker.  It ensures that the local juvenile delinquent who puts a brick through a window just to grab a four year old PC to sell for drug money will be doing that – and nothing more.  The laptop will be a brick itself shortly thereafter, and you can be confident that the smash-and-grab will not expose patient data in plain text, and will not yield a two million dollar price tag.  ePHI is safely obfuscated, and your biggest problem will be deploying a new laptop to your employee.
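To make the “ePHI is safely obfuscated” point concrete: full-disk encryption tools typically derive the actual encryption key from the user’s passphrase, so a thief holding the hardware but not the passphrase holds only random-looking bytes.  Here is a minimal sketch of that derivation step using Python’s standard library; the function name and parameters are illustrative, not any specific product’s implementation:

```python
import hashlib
import secrets

def derive_disk_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Derive a 256-bit key from a passphrase using PBKDF2-HMAC-SHA256.

    Disk encryption tools do something similar before handing the derived
    key to a cipher such as AES-XTS.  The salt is stored with the
    ciphertext; without the passphrase, neither is useful to a thief.
    """
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode("utf-8"), salt, iterations)

salt = secrets.token_bytes(16)   # random, stored alongside the encrypted volume
key = derive_disk_key("correct horse battery staple", salt)
assert len(key) == 32            # 256-bit key, ready for the cipher layer
```

The high iteration count is the point: it makes brute-forcing the passphrase from a stolen drive expensive, which is exactly what turns a smash-and-grab into a hardware loss instead of a breach.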

Simple, right?  Then why is this still like pulling teeth for some providers?


If the financial penalties aren’t a strong enough motivator, litigation is on the table as well.  California’s Attorney General Kamala Harris has made no secret of her interest in the topic, offering medical identity theft prevention tips to the public this fall.  This winter, Harris filed suit on behalf of California against Kaiser concerning a 2011 breach in which an unencrypted external hard drive was sold at a thrift store, containing a plain text database of Kaiser employees, along with Social Security numbers, dates of birth, addresses… and oh yeah, their family members’ information too.  Worse, Kaiser only alerted about 10,000 of the 30,000+ affected employees.  Not pretty.

So now you’ve got the OCR looking to fine you, the Attorney General suing you, and we’re not done yet.  Just like in the enterprise environment, you can’t even rest once you’ve trained employees and given them some tools.  You still need to safeguard against disgruntled or malicious employees.  “You won’t give me a new laptop?  Fine.  I’ll just ‘lose’ this old one.”  Or worse, “You won’t give me that 5% raise?  Fine.  I’ll just ‘lose’ my unencrypted device and we’ll see how much you’ll pay.”  Scary stuff.


The stance of the OCR is clear, and it is straightforward.  “Covered entities and business associates must understand that mobile device security is their obligation,” said Susan McAndrew, OCR’s since-retired deputy director of health information privacy. “Our message to these organizations is simple: encryption is your best defense against these incidents.”


It should be a no-brainer, but we continue to see companies holding out.  Iron Mountain’s Director of Product Management for Healthcare IT, Michael Leonard, commented recently on this.  “From our perspective, it is – I’ll say ‘puzzling’ – that organizations don’t encrypt more of the content even within their four walls.”

I’m not sure ‘puzzling’ is strong enough.  Idiotic, maybe?  Concentra Health Services and QCA Health Plan, Inc. were forced to cough up more than $13k per patient whose record was exposed, and that may just be the tip of the proverbial iceberg.  HHS Chief Regional Civil Rights Counsel Jerome Meites predicted an increase in fines from the $10M assessed in the last twelve months by the agency.  “Knowing what’s in the pipeline, I suspect that that number will be low compared to what’s coming up.”  That’s ominous, and should be a wake-up call to anyone who thinks that they can simply fly under the radar.

The reality is that encryption should be automatic.  It should be offered in every software solution deployed to healthcare providers at every level.  To help reinforce the transition, SafeLogic provides FIPS 140-2 validated encryption for these solutions.  Remember, in the eyes of the federal government, only cryptographic modules certified by the CMVP are considered acceptable.  This assessment has extended to the healthcare industry as well.  HIPAA does not yet explicitly require FIPS 140-2 validated encryption, but customer requests already do, and the writing is on the wall for future versions of the standard.
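Part of what a FIPS 140-2 validation buys you is that the module must prove itself at startup: it runs power-up known-answer tests before it will perform any cryptographic operation.  A simplified sketch of the idea follows; a real validated module runs many such tests across all of its approved algorithms, while this one checks only SHA-256 against the published NIST test vector:

```python
import hashlib
import hmac

# Published NIST test vector: SHA-256("abc")
_KAT_INPUT = b"abc"
_KAT_EXPECTED = bytes.fromhex(
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"
)

def power_up_self_test() -> bool:
    """Run a known-answer test before allowing the module to operate.

    FIPS 140-2 modules perform checks like this at power-up; if any
    answer is wrong, the module must enter an error state and refuse
    to provide cryptographic services.
    """
    digest = hashlib.sha256(_KAT_INPUT).digest()
    return hmac.compare_digest(digest, _KAT_EXPECTED)

if not power_up_self_test():
    raise RuntimeError("Cryptographic self-test failed; module disabled")
```

This is the kind of discipline that separates “we use encryption” from “our encryption is validated”: the module continuously demonstrates that its algorithms still produce correct answers.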

For more information on the role of validated encryption in HIPAA regulations, please download SafeLogic’s whitepaper.

– Walt

18 Jun 2014

Tizen, Connected Cars and Buggy Whips

Two weeks ago, I had the privilege of giving a presentation at the 2014 Tizen Developer Conference.

The first thing that you should know is that this was a fantastic event.  Most of us will hear “user group” or “developer conference” and reminisce about our own early experiences, the coffee-and-donuts geek meetups, complete with a folding chair for each wannabe Wozniak.  This was much more.  With a variety of speakers tackling an equally diverse set of topics over a three day stretch, and a significant investment of time, money and energy from Intel and Samsung, I highly recommend attending in 2015 if possible.  It was a very smooth and well-coordinated conference, for speakers, attendees and exhibitors alike.

The second thing that you should know is that my session rocked.  ‘Security-Centric Development for IoT and Wearables’ was one of the few talks that had a specific focus on data protection.  My hope is that I was able to influence attendees to consider security as a non-negotiable aspect of their development efforts, and maybe next year we will see more like-minded sessions on the agenda.  At the very least, I had fun launching SafeLogic footballs into the audience and nobody got a concussion.

To be honest, I was blown away by the ideas bouncing among the audience.  There were developers from seemingly every corner of technology, all with a vision of success based on the same operating system.  It was inspiring to see how many different folks saw potential in the same place.  Since the conference, it has felt like everywhere I look, there is another potential application for Tizen, another opportunity to join the Internet of Things and another chance to connect.  The scary part is that it all has to be secured.  Remember, IoT is only as strong as the weakest link.

One session at the Tizen Developer Conference included a discussion of the connected car collaboration efforts of the Linux Foundation, IBM, Intel and Local Motors.  It made me think of the article I had just read on CNN, aptly titled ‘Your car is a giant computer – and it can be hacked’.  Scary stuff, and spot on.

The Toyota Prius has solidified its place in the garage of everyday Americans based upon efficiency, not horsepower, and has been immortalized as the test mule for Google’s self-driving car project.  Tesla’s fully electric Model S was the top selling full-sized luxury sedan in 2013… not bad for a vehicle designed by tech geeks.  Google has pushed the envelope even further now, internally developing prototypes for an all-new self-driving vehicle that incorporates features of both.  The landscape is clearly changing – and quickly.

Steering wheels are the next buggy whip, and data security will be more important to safe transportation than seatbelts.  Driver error will be replaced by the threat of compromised communications.  Could you imagine arriving at your destination, only to find yourself at a location chosen by a malicious hacker?  Or having your vehicle overridden and driven into a wall, off a cliff, or into a lake?  There is serious potential in self-driven cars, but even more serious potential for disaster.

The Tizen platform is not uniquely vulnerable to these threats.  All of IoT inherently is.  A smart toaster in your kitchen has to be as secure as your car, even though it isn’t 3000 pounds of metal going 70 miles per hour.  Until developers begin treating all devices with the same level of respect, I encourage all of us to tread carefully.  Hackers relish the challenge of creating mischief as much as they value the results, so assume that you may be a target.  We all are.

If you are a developer in IoT, please check out CryptoCompact.  We have begun our pilot program, so consider it an open invitation to integrate military-grade encryption within your project.  We’re all in this together, so let’s stay safe.

– Ray

12 Jun 2014

SafeLogic Doesn’t Sell to the NSA

It’s not that we don’t appreciate the work of the National Security Agency here at SafeLogic.  Really, it’s quite impressive.  We certainly are thankful for the work of Homeland Security and the DoD.  And we absolutely, unequivocally, 100% support the men and women who have served in our national military.  We are red-blooded American patriots, who believe in life, liberty, and the pursuit of happiness.  And that is precisely why we do not work with the NSA.

Several significant events have come to light that call their ethics into question, and I’m not even talking about Snowden, Wiebe or any of the other whistleblowers.

Many are still reeling from the revelations surrounding the ten million dollar bribe that RSA Security took from the NSA, in exchange for making Dual EC DRBG the default algorithm in RSA BSAFE, the most popular proprietary encryption module in the business.  Had this transaction been public knowledge at the time, it would have raised eyebrows and prompted hard questions.  Instead, it remained in the shadows for years before Dual EC DRBG was exposed as a backdoor for the NSA to decrypt information at will and the connection was made.  It was a betrayal from both RSA and NSA, and disappointing to say the least.

More recent is the allegation that the NSA had knowledge of the Heartbleed bug, and leveraged the vulnerability since its creation, for approximately two years prior to the public identification of the flaw.

Ignore the political debate about whether or not the NSA has a right to, or ought to, spy on Americans in order to ensure our safety.  They found the bug and didn’t tell anyone!  Even within their own denial was an implicit admission.  “Reports that NSA or any other part of the government were aware of the so-called Heartbleed vulnerability before 2014 are wrong.”

Ok… so they admit to exploiting Heartbleed since January?  Best case scenario, the NSA took advantage of Heartbleed for only about 90 days.  Should we feel any less betrayed?

Look, I understand that it’s not in the NSA’s job description to fix problems created by private industry.  Heartbleed was certainly a black eye for the team of volunteers that rolled out the 2012 update to OpenSSL protocols.  No argument there.  But the NSA motto itself reads Defending Our Nation.  Securing The Future.  Doesn’t that include sounding the alarms when an estimated two-thirds of the world’s internet activity is at risk?  Doesn’t defending our nation include defending our intellectual property?  The assumption is that Securing the Future refers to the future of the American way of life, which is tightly aligned with our capitalist free market economy.  But that apparently was vulnerable for two years!  So no… I don’t think the NSA lived up to its mantra.

SafeLogic’s allegiance is to our customers, and our customers’ loyalty is to their own customers.  At the end of the day, our success is measured on whether we did everything possible to ensure the security of the end users’ information.  Since our inception in 2012, the answer has been a resounding “Yes!” every single day.  Any partnership, association, or agreement with the NSA would undermine that singular goal.

That’s why we don’t sell our encryption to the National Security Agency.

28 May 2014

The Upside of the Heartbleed Bug

Heartbleed was huge.  Massive.  A giant, gaping hole that could be exploited in several ways and somehow went unnoticed for over two years.  It was an embarrassment, a black eye for the OpenSSL Foundation and really all who use OpenSSL for encryption… which is the majority of the Internet, and most of the world’s internal sites and apps as well.
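For readers who want the mechanics: the TLS heartbeat extension lets a client send a payload plus a length field, and the vulnerable code echoed back that many bytes while trusting the attacker-supplied length instead of the payload’s real size.  The real flaw was in OpenSSL’s C code; this toy Python model (names illustrative) shows how the missing bounds check leaks adjacent memory:

```python
def heartbeat_response(memory: bytes, payload_offset: int, claimed_len: int) -> bytes:
    """Toy model of the Heartbleed flaw.

    The vulnerable handler echoed back `claimed_len` bytes starting at
    the request payload, without checking the claim against the number
    of bytes the client actually sent.
    """
    return memory[payload_offset : payload_offset + claimed_len]

# Simulated process memory: a 5-byte payload followed by unrelated secrets.
memory = b"HELLO" + b"secret-key=hunter2;"

# Honest request: echo back the 5 bytes that were actually sent.
assert heartbeat_response(memory, 0, 5) == b"HELLO"

# Malicious request: claim the payload was far longer than it was.
leaked = heartbeat_response(memory, 0, 64)
assert b"hunter2" in leaked  # adjacent memory spills into the reply
```

The fix was equally simple in concept: reject any heartbeat whose claimed length exceeds the bytes actually received.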

The first confirmed data losses due to the Heartbleed Bug came on April 14th, when the Canada Revenue Agency lost 900 social insurance numbers (the Canadian equivalent of Social Security numbers) in six hours to a determined college student.  Bad?  Yes.  But destructive at the worldwide level that we believed possible?  Not even close.

So here’s my point.  Heartbleed had a big, fat, silver lining.  In the span of a few days, millions of administrators reset their private keys and reissued their SSL certificates.  We have confirmed very little actual harm caused by the vulnerability, and we have documented millions of websites and apps applying patches, updating their software, resetting their private keys and reissuing certificates.  If only we could inspire this type of prophylactic activity on a regular basis.  It’s like pulling teeth to get users to reset passwords, but one well-publicized breach and folks are clamoring for it.  Many consumers are being proactive and using tools to specifically avoid unpatched websites.  These are steps in the right direction.

Don’t get me wrong.  I won’t be wishing for another Heartbleed.  We have our hands full as it is with the eBays and Targets of the world.  But I’m absolutely certain that there will be another bug… probably worse/bigger/more widespread/more exploited/etc than Heartbleed, and it will be exposed in the fairly near future.  Such is life in this industry.  The ‘next big thing’ always includes the raised stakes inherent in our bigger Big Data, our faster connectivity, and our multiplying endpoints.  Luckily, we are making leaps forward every time we are faced with these threats, and we have very very very smart folks on our side.

My bigger concern is that we will become jaded and tune out the dangers.  Target and eBay dropped the ball on their crisis responses, but banks and credit card companies responded swiftly and effectively.  Anecdotally, I have talked to a lot of people who were prompt to reset personal passwords and treat their identity protection with the proper level of respect and attention that it deserves.  The strong performance of site administrators and product architects worldwide in their response to Heartbleed has shown me that we have many reasons to be optimistic.  Here at SafeLogic, we had patches rolling out within hours of the announcement, and we were not alone.  As we approach the tipping point toward the Internet of Things, our vigilance must remain strong, and the industry’s unified response to Heartbleed has actually helped me sleep better at night.

21 May 2014

IoT: The Internet of Toilets?!

I recently read a humorous but forward-thinking post on Wired, exploring the potential use cases for an internet-connected toilet, complete with various sensors and capabilities.  The writer, Giles Crouch, nailed a few awesome scenarios, such as pregnancy detection, stool analysis, and hangover cures.  Yes, I’m a sucker for technology and I already want an iToilet, Giles… but only if they build it with security in mind.  The alternative brings to mind the 1937 Donald Duck cartoon, ‘Modern Inventions’.  You know how it ends… one disaster after another.


For example, early pregnancy detection is brilliant!  Until you leave your pregnant wife home while on a business trip, and some criminal genius figures out that he can scan the neighborhood for homes in which the only urine collected belongs to a pregnant woman.  That would be valuable information for someone with ill intentions and should be encrypted and guarded like your better half herself.  [Note: The same hormone levels could indicate testicular cancer in a man as well, but it would be a statistical long shot.  Not enough to discourage a criminal from playing the odds.]

The automatic stool sample is an excellent feature.  It’s the hypochondriac’s dream.  Every sample submitted would be analyzed and advisories would be offered regularly.  Well, as regularly as the patient, at least.  The rate of car accidents may rise, as Mr. John Doe rushes home at lunchtime to make sure his contribution wouldn’t be wasted on the traditional ‘dumb’ toilet at the office.  But potentially more dangerous, when humans take medical advice from a machine, you better be sure that the machine can’t be hacked.

“Mr. Doe, your sample shows a few deficiencies.  Please drink one quart of Drano to rebalance your system.”
Hey, if my iToilet told me, it must be accurate.  Drano… whodathunkit.
That’s a mistake you can’t make twice.

Further, if that smart toilet is connected to both your calendar and your doctor’s appointment book, just imagine the sh!t show (pun intended) if this was intercepted in plain text by a malicious third party.  You might spend all day in the waiting room of a doctor that does not have you on the calendar, while your house is raided because your door lock app was compromised as well.

Ah, yes.  The future holds a great deal of creature comforts in automation… if we can just get the security dialed in first.

Now without further ado (or toilet jokes), here’s the one and only Donald Duck in ‘Modern Inventions’.  Cheers!

 

14 May 2014

The Real Truth About Wearables

I keep reading about Wearable tech’s ‘Dirty Little Secret’… the fact that most Wearable devices are shelved within three months of initial use.

Does this shock you?  No?  Good.  Me neither.
And I’m not worried about it.


If you’re reading this post, you’re no stranger to the phenomenon of the Consumerization of IT, or CoIT.  (It almost looks naked without the hashtag!  #CoIT.  That’s better.)  It’s also referred to as the ITization of Consumers, which doesn’t have the same ring to it, but is actually more accurate when describing the shift towards more sophisticated and savvy users.  Today’s enterprise employees don’t need a designated geek to configure and deploy a piece of equipment.  In fact, they usually prefer to set it up themselves, since nobody knows their needs and preferences better.  Some blame the millennials, but that’s just not the full picture.  This trend was manifesting as Shadow IT since before the millennials went to prom.

I bring up CoIT because it is the embodiment of today’s tech culture.  Everyone wants to use the newest, hottest devices, and they prove it every day, with or without IT’s help or blessing.  Everyone wants to be an early adopter now.  Everyone wants to try the latest and greatest, which is absolutely stellar.  Not every device is going to be a hit, but we are okay with that.  At this point, a wearable device with strong universal adoption would be the exception to the rule.  So in this period of ‘fail fast’ versions, who better to beta test new wearables and subject them to real world conditions than us?

The same research that presents the three month interval of abandonment also puts forward an estimate that over 10% of adult Americans have purchased at least one of these devices.  If we included Bluetooth devices, you better believe that number would skyrocket.  Subtract the population that is – sorry, I’ll just say it – too damn old to mess with these new-fangled doohickeys, and we are approaching an impressive market penetration for wearables without any delusions that it is a matured technology.  As a culture, we have demonstrated our appetite for wearables by continuing to buy and try them.  There is a certain sense of pride associated with being an Explorer, Pilot, or Kickstarter participant.

Bottom line – I’m not surprised by, or discouraged by, this report.  Wearables are still nascent, like a recent graduate backpacking through Europe, searching for motivation and identity in an existential haze.  We should embrace it as it is formed, molding it to our vision.  We shouldn’t push it away and complain that it is undeveloped.  We need to try every device that we can get our hands on.  We need to speak up and give strong feedback.  Offer opinions publicly, so that others can echo or debate, in the plain view of the innovators who will give us exceptional, can’t-live-without-them wearables one day soon.

And of course, don’t forget to demand strong security in every piece of technology that we carry on our bodies.  Don’t forget how crucial it is to protect ourselves, and that includes our personal data.

We can make a difference in wearables.  Try, test, and critique.  Rinse and repeat.

6 May 2014

Securing the Internet of Things

Today’s blog entry is from our partners at Weaved.

Weaved is a cloud services company that provides nearly 4 million IoT device connections per month over the Internet.  We published a joint press release in April, announcing the partnership between SafeLogic and Weaved, and describing how we are working together to make the IoT secure.

 

The Internet of Things holds tremendous promise for driving the next wave of economic growth for Internet-connected devices and applications.  Our smartphones have become the remote control for our lives and give us access to the Internet and our networked devices 24/7.  It’s easy to see that soon nearly every industrial and consumer electronics product will require some kind of app control as a standard feature.  Unfortunately, the Internet remains a publicly-accessible and insecure environment for devices, and every network is only as secure as its weakest link.

Right now, IoT devices are notorious for being that weakest link.  They have earned this reputation by ignoring security best practices and focusing only on local connectivity.  As a result, malicious tools have been developed, like search engines on the public internet that scan and search for open ports on devices.  So for mass market consumer adoption of IoT, device makers must really step up their efforts to apply some well established security best-practices and win back public trust.

At SafeLogic and Weaved, we believe that a common sense approach to security in IoT must include:

1.  No Port Forwarding and No Open Ports on Devices

Port forwarding allows remote computers on the Internet to connect to a specific device within a private local-area network (LAN).  It’s an open door to your LAN from the outside and there is a surprisingly large installed base of devices that use this technique.  Weaved has developed a proprietary method of addressing and securely accessing any TCP service (Port) over the Internet without the use of port forwarding.  With Weaved’s technology, ports can even be shut down and appear as invisible to malicious “port-sniffers” and search engines.
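Weaved’s specific method is proprietary, but the general pattern behind “no open ports” is worth sketching: the device makes an outbound connection to a relay, so nothing on the LAN ever listens for inbound traffic and port scanners find nothing to probe.  A minimal stand-in using Python sockets on localhost (the relay and device roles here are hypothetical illustrations, not Weaved’s service):

```python
import socket
import threading

def run_relay(ready: threading.Event, result: list, port_holder: list) -> None:
    """A stand-in cloud relay.  It is the only party that listens;
    the device on the LAN never opens an inbound port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))                 # ephemeral port on localhost
    port_holder.append(srv.getsockname()[1])
    srv.listen(1)
    ready.set()
    conn, _ = srv.accept()                     # the device dials in to us
    data = b""
    while True:
        chunk = conn.recv(1024)
        if not chunk:
            break
        data += chunk
    result.append(data)
    conn.close()
    srv.close()

def device_dials_out(relay_port: int) -> None:
    """The IoT device initiates an outbound connection, so its firewall
    stays closed to inbound traffic and port-sniffers see nothing."""
    sock = socket.create_connection(("127.0.0.1", relay_port))
    sock.sendall(b"device-hello")
    sock.close()

ready, result, port_holder = threading.Event(), [], []
relay = threading.Thread(target=run_relay, args=(ready, result, port_holder))
relay.start()
ready.wait()
device_dials_out(port_holder[0])
relay.join()
assert result == [b"device-hello"]
```

The design choice is the direction of the arrow: because the connection originates inside the LAN, no firewall rule or port-forward is ever needed, and there is no open door for the scanners described above to find.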

2.  Trusted and Validated Encryption End-to-End

A lot of IoT devices today are storing or sending data across the Internet with weak encryption or even in the clear.  Even trusted companies like Skype have been criticized for allowing unencrypted media in their data path.  Weaved’s cloud services are already using unique, encrypted session keys per connection.   Going forward, Weaved and SafeLogic will collaborate to bring SafeLogic’s trusted and verified encryption engines to the platform for applications that demand that level of security.
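The “unique session key per connection” idea can be sketched in a few lines: derive each connection’s key from a long-lived master secret plus a fresh per-session identifier, so no two sessions ever share key material.  This is a simplified HKDF-style sketch for illustration, not Weaved’s or SafeLogic’s actual implementation:

```python
import hashlib
import hmac
import secrets

def new_session_key(master_secret: bytes, session_id: bytes) -> bytes:
    """Derive a per-connection key (simplified HKDF-style expand step).

    A fresh session_id per connection means no two sessions share a key,
    so capturing one session's traffic reveals nothing about another's.
    """
    return hmac.new(master_secret, b"session-key|" + session_id, hashlib.sha256).digest()

master = secrets.token_bytes(32)
key_a = new_session_key(master, secrets.token_bytes(16))   # connection A
key_b = new_session_key(master, secrets.token_bytes(16))   # connection B
assert key_a != key_b          # every connection gets its own key
assert len(key_a) == 32        # 256-bit session keys
```

Contrast this with a device that ships one hard-coded key for its lifetime: there, a single compromised session exposes everything the device has ever sent.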

These are just a couple of measures needed to protect your local network from being compromised.  There’s much more to cover on this topic, so expect to hear more from Weaved and SafeLogic in the near future, as we define and deploy our joint roadmap and services.

2 May 2014

Warning: Plan Your Validation Carefully

I’m always interested in the comments of engineers who recently completed a FIPS 140-2 evaluation.  It’s like the entire team had a meeting and played ‘Not It’, sealing the poor bastard’s fate for the last year-and-a-half or so.  Seems fair, right?


 

It really isn’t their fault.  Maybe they contributed to a NIST evaluation early in their career, and they made the mistake of putting it on their resumé.  Maybe they were cocky and volunteered, figuring that it couldn’t be ‘that hard’ or ‘that time consuming’.  Or maybe they simply had the misfortune of being late that day.  Regardless, they became responsible for a process that doesn’t always make logical sense to an engineer, where seemingly small early decisions have major ramifications for the entire lifespan of the product in question.

In some cases, veteran engineers with a pedigree in cryptography still get aggravated and befuddled by the inner workings at the CMVP.  The inspiration for this blog entry came from our friends at Oracle.  Darren Moffat, a Senior Principal Software Engineer based in the UK, vented about his experience in a post titled ‘Is FIPS 140-2 actively harmful to software?‘.

Before we go any further, the answer is no.  FIPS 140-2 is definitely not harmful.

Darren’s frustration centers on the establishment of validation boundaries.

Why does the FIPS 140-2 boundary matter?  Well unlike in Common Criteria with flaw remediation in the FIPS 140-2 validation world you can’t make any changes to the compiled binaries that make up the boundary without potentially invalidating the existing valiation. Which means having to go through some or all of the process again and importantly this cost real money and a significant amount of elapsed time.

He’s absolutely on point.  The boundary is a crucial strategic decision for every validation, and as a vendor pursuing a FIPS certificate, you want to set it carefully.  There’s no sense validating and locking in features that will require future updates.  From a user standpoint, this is exactly as it should be.  By ensuring that any changes within the boundary require re-testing, buyers can be confident that a product’s encryption module has been fully vetted in its current form.

Moffat goes on, asserting that engineers ought to be able to issue patches and bug fixes without invalidating the FIPS certificate.  I agree completely!  This is precisely why SafeLogic’s CryptoComply family of validated cryptographic modules maintains a tight boundary.  The core crypto libraries are tested and validated, then left intact while the rest of the vendor’s product can be updated as needed.  Users know that the encryption within is of the highest quality, and there are no negative side effects of active updates from the provider.  This is a win-win, and it all stems from establishing the correct boundary for the CMVP.
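One common way to keep that boundary tight can be sketched as a facade: product code talks only to a thin, stable interface, and the validated module behind it is never modified, so feature work and patches never touch the validated binaries.  The class names below are hypothetical illustrations, not CryptoComply’s actual API:

```python
import hashlib

class ValidatedCryptoModule:
    """Stand-in for the code inside the FIPS boundary: the tested,
    frozen crypto library that is never modified after validation."""
    VERSION = "1.0-validated"

    def sha256(self, data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

class CryptoFacade:
    """Stable API outside the boundary.  Application code calls this,
    so product features and patches evolve here without touching
    (or invalidating) the validated module below."""
    def __init__(self) -> None:
        self._module = ValidatedCryptoModule()

    def fingerprint(self, data: bytes) -> str:
        return self._module.sha256(data).hex()

facade = CryptoFacade()
# NIST SHA-256("abc") test vector begins with ba7816bf...
assert facade.fingerprint(b"abc").startswith("ba7816bf")
```

The design payoff is exactly the win-win described above: the vendor ships updates at will, and the untouched module inside the boundary keeps its certificate intact.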

I don’t agree with Moffat that most customers don’t care about FIPS 140-2, and I don’t agree that customers who care are only checking the box and don’t worry whether the certificate is still valid.  Oracle’s commitment to updating and patching their software is fantastic, but it should not come at that cost.  They invested a great deal in getting that certificate, and it should not be pushed aside so easily.  Earning a FIPS 140-2 validation requires time, money and commitment.  (Significantly less of all three if you use SafeLogic’s RapidCert, but still enough to be relevant.)  If done correctly, there should not be a choice between validated crypto and properly updated software.  This is a toxic ultimatum for both the provider and the user, and it should be avoided.

To share your thoughts and stories from the trenches, tweet at us @SafeLogic!