Blog | SafeLogic

30 Sep 2015

Recap: CTIA 2015

Ever notice that when you go to Las Vegas, the flight there is always faster than when you’re coming home?  Or worse, if you’re driving back to California, you start to lose the will to live somewhere between Stateline and Baker on the I-15 South.  It doesn’t matter if you won or lost, the journey home is usually brutal.

This was different.  As the SafeLogic team parted ways on the last day of Super Mobility, there was electricity in the air.  Sure, we were tired; it was a long week.  We were excited, too, and for good reason.  With some time to catch up and reflect, here are my thoughts on CTIA’s flagship conference.

1) The Good

CTIA remains one of the best places to network.  We got to spend quality time with delegates from customers, partners, analyst firms… some planned meetings, some spontaneous.  The Sands Expo at the Venetian and Palazzo resorts is a strong draw, especially for the west coast folks, so there were a lot of people in attendance that we wanted to see.  That was great.

Our CEO, Ray Potter, was featured as a speaker at the 151 Advisors’ App-Solutely Enterprise seminar, delivering a solo talk on mobile security to set the tone before joining the panel discussion on the same topics.  It was a lively session, to say the least, with five opinionated panelists and frankly not enough time for everyone to weigh in.  Luckily, the discussion was carried forward down the hallway and into cocktail hour.

Ray was also invited to speak at Wireless U., a co-located event coordinated by CTIA and the NCSL (National Conference of State Legislatures).  So Friday morning, Ray presented to a room full of State Senators and Representatives.  While not directly fueling SafeLogic’s efforts, the feedback and questions from the group were invaluable.  Attendees were insightful and curious, despite a candidly acknowledged, often refreshingly self-deprecating lack of expertise in technology.  The fact that these state legislators were investing their time to better understand the hurdles facing their constituents and our nation as a whole was very encouraging as well.

2) The Bad

The downside is that the event is becoming increasingly saturated with cellular retailers and accessory vendors.  Somewhere along the line, the complementary industry of rhinestone-encrusted cases and external battery packs became a primary draw for the CTIA exhibit hall instead.  Add in the hands-free Segways, and you have the definition of “mobile” stretched a bit thin.

The MobileCon section of years past was essentially disbanded, incorporated into the rest of the exhibit floor.  I preferred the designated area of enterprise-focused software vendors who could focus their message on the enablement and empowerment of mobile workers.  Now, it is much more of a free-for-all (which is tougher for my marketing peers).  Gone along with it was the App-Solutely Enterprise stage, which was central to the exhibit floor in 2014.  Plenty of folks complained about the noise and bustle inherent to locating the stage directly on the main floor, but it was preferable to this year’s isolated ballroom on the second floor.

3) The [Not-So] Ugly

Maybe it’s Vegas, maybe it’s because CTIA is starting to converge with CES as a consumer-driven show, or maybe it’s just the contrast with the more buttoned-up atmosphere found at both security conferences and government-related events… either way, Super Mobility really knew how to throw a party!

Between the rave music punctuating attendee hangovers with serious subwoofers in the hallways and the efforts of the exhibitors, you definitely knew that this was an event to be explored.  We counted not one, but two BMW i8s, as well as several non-electric supercars (Ferrari and Audi convertibles spring to mind), countless hired guns working the booths (not just babes, but Booth Bros as well), and a ridiculously talented balloon artist.

If the goal was to make a splash, then mission definitely accomplished.

The Bottom Line

While CTIA’s national conference is not what it used to be, it remains a strong destination for mobile security folks.  Would it be better if it were spun back out to a stand-alone event that caters better to enterprise technology?  Yeah, probably.  But until they do that, you’ll still see plenty of SafeLogic at Super Mobility.

Share your thoughts on the conference with us on Twitter!



23 Sep 2015

Changing Seasons

Credit: Jean-Pol GRANDMONT

Happy Autumnal Equinox, everyone!  Yes, it’s the first day of fall for the northern hemisphere (and by proxy, the first day of spring for everyone down under) and I’m back blogging.  Football is back and playoff baseball is nearly here. (Go Dodgers!) Leaves are turning, pumpkins are growing, and there’s a lot to catch up on.

It’s been a long, hot, El Niño summer here in San Diego, where I’m based. While I spent some time at the beach like every San Diegan, the bulk of my time was devoted to working with the awesome SafeLogic team, reviewing and polishing key details of great things to come.  While I cannot yet reveal what’s in store, I will say this – we’ve worked hard to align each piece of the puzzle to benefit our existing and future customers alike.  Our goal is to demonstrate our unwavering commitment to disruption on behalf of our clients.  The current model of FIPS 140-2 certification is broken, and we are doing our best to insulate our customers and keep blazing new trails.

So why do you care?

Well, if you want to have a validation completed by the end of the calendar year, you should definitely reach out ASAP.  Along with official announcements in this space, we will be rolling out new blog posts pertaining to specific verticals and solutions, as well as recaps and commentary related to this season’s industry events.  Let’s just say it’s going to be a busy Q4. Stay tuned!




27 Mar 2015

Security on the Road

Traveling isn’t easy. I’ve been hitting the road more often lately, and even beyond the normal complications (Did I remember to turn off the thermostat? Did I lock the door?), security concerns rear their ugly head the minute you walk out the door.  Here are a few thoughts on my own best practices for travel security.

Your phone and laptop should always have a password lock enabled, but even if you insist on skipping that precaution at home, please do yourself a favor and enable it on the road. I can’t count how many times I’ve heard the horror stories of leaving a device in a taxi. (Or Uber. Or Lyft. Pick your poison.)

This is just hilarious. No, it’s not me.

If you’re flying, TSA poses a hurdle as soon as you hit the airport. I always remind myself to be 100% vigilant at the luggage x-ray machine and metal detector… not because I think I need to stop the next hijacking plot, but because anytime my phone, keys, passport, laptop and everything else are exposed and out of my immediate control, I need to be on my game. If you have traveled with me before, you noticed that I’m completely willing to be ‘That Guy’ who holds up the line. Why? Because there’s not a chance in hell that I’m walking through the body scanner before my personal items have been gobbled up by the conveyor belt to the x-ray machine. No, I don’t trust the TSA agents or anyone else to ensure that my laptop makes it through. Especially when the next three people in line have identical MacBooks to mine. Maybe I should add a SafeLogic sticker to differentiate it on the road. Or I should register for TSA PreCheck, so I can leave it in my bag.  Note to self.

Once you’ve made it to the gate, whether you’re at the airport, train station, or friendly local HyperLoop stop, the dilemma inevitably arrives before your boarding call.

Free, open WiFi. Do you connect or not?

I’ve asked that question of a lot of smart people that I respect, and the answers vary. Sometimes the folks that I expect to be most paranoid admit that they use every Starbucks hotspot they can find, without hesitation. Others eschew any connection that has not been provided and approved by their employer, lest they inadvertently cause a data breach. It’s about the liability. Me? I take precautions, but I’m usually more worried about the weirdo sitting next to me trying to eyeball my screen than about getting singled out and sniffed among the thousands of connected devices on the network.

I’m forced to be more accepting of dodgy WiFi locations if I’m traveling abroad for pleasure, though. When I’m on vacation outside of the States, I usually just remove my SIM card. It protects me from unwanted phone calls while I’m relaxing. More importantly, it protects me from unwanted roaming charges. Nobody likes a 5-figure mobile bill when they get home. It does require me to leverage WiFi when offered at the corner boulangerie or pub so I can plan my next destination, but it’s usually well worth the trade-off. (Pro tip: load a local map in your maps app while you are connected… then even without WiFi, your GPS beacon will appear and give you a fighting chance to navigate accurately.)

But I digress. Once you arrive at your location, plastic is your lifeline. Better hope your credit or debit card doesn’t get stolen, forgotten, eaten by a rogue ATM (yes, that actually happened!) or, possibly more aggravating, disabled by a fraudulent-use flag. The founders of Final give a great example in their origin story and built a product with the potential to save us from similar issues in the future. In the meantime, make a solid contingency plan for when your go-to card is unavailable. (No, panhandling is not a viable contingency plan.)

Technology can be your friend with the sheer volume of traveling documents, too. I like to use the Apple Passbook for my airline boarding pass whenever possible. Removing the paper slip from circulation means one less thing I need to keep safe. This is true for your itinerary, train tickets, directions, and many other items. The only catch is knowing whether your app of choice is secure.  Naturally, I gravitate towards solutions from trustworthy sources, especially those that I know have prioritized data security with strong encryption.  SafeLogic customers, if I have the option!

Centralize and travel light. I’ve even eschewed the use of a wallet, choosing to carry the bare minimum – ID, cash, debit card and credit card – in a specialized case for my phone. Thanks, Speck. Just one more thing that I no longer have to keep safe.

Lastly, you must cover your tracks like a trained assassin.

• Used the WiFi at your AirBNB flat? Disavow the network on your devices.
• Used a smartlock system like Kēvo to access your rental? Delete delete delete!
• Used the Bluetooth connection to play Pandora or Spotify tunes in your rental car? Make sure to remove your phone from the ‘paired devices’ list on the vehicle console. (I’m looking at you, Kevin Chiu who paired his Samsung Galaxy S5 with that blue Toyota Camry in San Jose before I rented it!)

If you consider the repercussions of every byte you receive and packet you send, plan for worst-case scenarios that could leave you stranded, and memorize at least one phone number to call collect from a pay phone, you’re in good shape. Or at least hopefully better than you were 10 years ago.


8 Feb 2015

On Encryption Keys (and Anthem) – Part 2 of 2

The Anthem breach encouraged me to wrap up this blog series and talk about key management in a genuine security context. When the Anthem breach first became public, it looked as if patient records were accessed because of a lack of data encryption. Then Anthem stated the real reason for the breach: they only encrypt data in flight to/from the database(s) and rely on user credentials for access to data in the database. Why didn’t they encrypt the data in the database? Well, per Health Insurance Portability and Accountability Act (HIPAA) requirements, they don’t have to as long as they provide protection of the data via other means. Like elevated credentials.

That worked well, didn’t it?

They were compliant, but obviously not secure. To add more security to compliance programs like HIPAA, there have been some cries for enterprises to implement encryption. So how do you encrypt data properly? Well, it all depends on your environment, the sensitivity of the data, the threat models, and any tangible requirements for regulatory compliance. Here are some general guidelines:

  • Use validated encryption.
  • Use strong, well-generated keys.
  • Manage the keys properly.

Use validated encryption. Federal Information Processing Standard (FIPS) 140 is the gold standard. The Advanced Encryption Standard (AES) is one of the FIPS-approved algorithms for data encryption, and it is a better encryption algorithm than what Joe the Computer Science Intern presented in his thesis project. It just is. Plus, part of the FIPS 140 process involves strenuous black box testing of the algorithms to ensure they’re implemented properly. This is crucial for interoperability, and proper implementation of the AES standard also provides a measure of confidence that there aren’t leaks, faults, etc. Always look for the FIPS 140 certificate for your encryption solution.
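
As a concrete (and hedged) illustration, here is a minimal sketch of AES-256-GCM authenticated encryption in Python using the cryptography package. Whether the OpenSSL it links against is actually FIPS 140 validated depends on your build and your vendor’s certificate; the function names and sample data below are mine, not a prescribed API.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_record(key, plaintext, aad=b""):
        # 96-bit nonce, unique per message; store it alongside the ciphertext
        nonce = os.urandom(12)
        return nonce, AESGCM(key).encrypt(nonce, plaintext, aad)

    def decrypt_record(key, nonce, ciphertext, aad=b""):
        return AESGCM(key).decrypt(nonce, ciphertext, aad)

    key = AESGCM.generate_key(bit_length=256)   # 256-bit AES key
    nonce, ct = encrypt_record(key, b"member record 1234")
    assert decrypt_record(key, nonce, ct) == b"member record 1234"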

Use well-generated keys. A password-based key (PBK) is crap. Here, a key is derived from a password by hashing it with a message digest function. PBKs are crap because most passwords are crap. They’re subject to brute-force attack and just should not be used. Password-Based Key Derivation Function v2 (PBKDF2) makes password-based keys a bit stronger by salting the password with random data and iterating the digest to slow down brute force. But the threat is still there.
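
If you must derive a key from a password, here is a minimal sketch using PBKDF2 from Python’s standard library (hashlib.pbkdf2_hmac); the password, salt size, and iteration count are illustrative values, and the result is still only as strong as the password behind it.

    import hashlib
    import os

    password = b"correct horse battery staple"   # hypothetical user password
    salt = os.urandom(16)                         # random salt, stored with the ciphertext
    # many iterations of HMAC-SHA-256 slow down brute force, but don't eliminate it
    key = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000, dklen=32)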

Keys should be as unpredictable and “random” as possible. Unfortunately in software environments it’s difficult to obtain truly random data because computers are designed to function predictably (if I do X, then Y happens). But let’s say you can get provable random data from your mobile device or your appliance. Use that to feed a conditioning algorithm and/or pseudorandom number generator. Then use that output for your key.
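
In Python, for example, that usually just means drawing key material from the operating system’s CSPRNG, which already conditions kernel entropy for you; a short sketch:

    import secrets

    # secrets (or os.urandom) draws from the kernel's conditioned entropy pool
    dek = secrets.token_bytes(32)   # 256-bit data encryption key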

Use strong keys. The strength of a key depends on how it’s generated (see above) and how long the key is. For example, the AES algorithm can accommodate key sizes of 128, 192, or 256 bits. Consider using a key size that correlates to the overall sensitivity of your data. In Suite B, 256-bit keys can be used to protect classified data at the Top Secret level. Is your data tantamount to what the government would consider Top Secret?

Also consider the environment. Constrained and embedded environments (think wearables) may not have the processing power to handle bulk encryption with 256-bit keys. Or maybe data is ephemeral and wiped after a few seconds and therefore doesn’t need “top secret level” encryption. Or maybe there’s just not enough space for a 256-bit key.

Use a key that is strong enough to protect the data within the constraints of the environment and one that can counter the threats to that environment.

Manage your keys properly. You wouldn’t leave the key to your front door taped to the door itself. Hopefully you don’t put it under the doormat either. What would be the point of the lock? The same applies to information security. Don’t encrypt your data with a strong, properly generated data encryption key (DEK) then leave that key under the doormat.

Consider a key vault and use key encryption keys (KEK) to encrypt the data encryption keys. Access to this key vault or key manager should also be suitably locked down and tightly controlled (again, many different ways to do this). Otherwise you might as well just not encrypt your data.
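
Here is a minimal sketch of that KEK/DEK pattern in Python with the cryptography package; in real deployments the KEK stays inside a key manager or HSM, so holding it in a local variable here is purely illustrative, as are the variable names and sample data.

    import os
    import secrets
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    kek = AESGCM.generate_key(bit_length=256)   # would live in the key manager / HSM
    dek = secrets.token_bytes(32)               # per-record or per-table data encryption key

    # Encrypt the sensitive field with the DEK.
    data_nonce = os.urandom(12)
    ciphertext = AESGCM(dek).encrypt(data_nonce, b"ssn=123-45-6789", None)

    # Wrap the DEK with the KEK; only the wrapped DEK is stored next to the data.
    wrap_nonce = os.urandom(12)
    wrapped_dek = AESGCM(kek).encrypt(wrap_nonce, dek, None)

    # To read the data: unwrap the DEK with the KEK, then decrypt.
    recovered_dek = AESGCM(kek).decrypt(wrap_nonce, wrapped_dek, None)
    assert AESGCM(recovered_dek).decrypt(data_nonce, ciphertext, None) == b"ssn=123-45-6789"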

While we’re at it: rotate your keys, especially your KEKs. Key rotation essentially means “key replacement” … and it’s a good idea in case the key or system is compromised. When you replace a key, be sure to overwrite the old key material with 0xFF or 0x00 bytes to reduce any chance of recovery.

Store those DEKs encrypted with KEKs and protect those KEKs with tools and processes. And remember to balance security with usability: rotating your KEK every 2 seconds might be secure, but is your system usable?
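
One reason the KEK/DEK split helps usability: rotating a KEK only means rewrapping the (small) DEKs, not re-encrypting the bulk data. A hedged sketch, with illustrative names, continuing the pattern above:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def rotate_kek(old_kek, new_kek, wrapped_deks):
        """wrapped_deks: list of (nonce, wrapped_dek) pairs; returns rewrapped pairs."""
        rewrapped = []
        for nonce, wrapped in wrapped_deks:
            dek = AESGCM(old_kek).decrypt(nonce, wrapped, None)   # unwrap with the retiring KEK
            new_nonce = os.urandom(12)
            rewrapped.append((new_nonce, AESGCM(new_kek).encrypt(new_nonce, dek, None)))
        return rewrapped

    # Reliably overwriting old key bytes is hard in a garbage-collected language;
    # do the actual zeroization inside your key manager or HSM.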

Anthem wanted the data to be useful, which is why it wasn’t encrypted at the database. But that usability came at a high cost. The good news is that it is possible to encrypt data and have it be usable.


Encryption is a critical, necessary piece of a system’s overall security posture. But it’s not the sole answer. In Anthem’s case, records were accessed via those “elevated user credentials” … which means that malicious hackers were able to get into the authentication server and raise the privilege levels of user credentials (usernames/passwords) that they either knew or gleaned from the auth server. So in this case, it’s irrelevant whether the breached data was encrypted; the hackers had authenticated and authorized access to it.

So what’s the answer?

When this was first reported, I tweeted the answer: defense in depth.

Defense in depth means providing security controls to address all aspects of the system: people, process, and technology. Technology is the most difficult pillar to lock down because there are so many layers and threats, hence so many products such as firewalls, IDP, APT, IDS, SIEM, 2FA, AV, smart cards, cloud gateways, etc.

Encryption is a fundamental element for security of data at rest and data in motion (control plane and data plane). Even the strongest encryption with proper key management won’t protect data that is accessed by an authorized user, because it has to be usable. However, encrypted data and tight management of keys provides a critical, necessary piece to a robust security posture.

I hope this provides some guidance on how to think about encryption and key management in your organization.



3 Feb 2015

Privacy, Liberty & Encryption

David Cameron

It is unfortunate that, in the aftermath of the Charlie Hebdo murders and hate crimes in France, rallying cries for freedom of speech were twisted to interpret “free” speech as the opposite of “private” speech.  A few weeks ago, British Prime Minister David Cameron spoke out with the radical assertion that “we must not allow terrorists safe space to communicate with each other,” going on to suggest that there should be no means of communication which the government cannot read.  I’m in no way sympathetic to extremists or rebels who leverage privacy to plan nefarious and destructive acts, but I am certainly sympathetic to all of the innocent, law-abiding citizens whose civil rights would be trampled by such a policy.

It was just a few short months ago that certain US government officials cried foul when Apple solidified their encryption capabilities to the point that consumer data could not be deciphered, even under federal subpoena.  As Matthew Green wrote at the time, “Designing backdoors is easy. The challenge is in designing backdoors that only the right people can get through. In order to maintain its access to your phone, Apple would need a backdoor that allowed them to execute legitimate law enforcement requests, while locking hackers and well-resourced foreign intelligence services out.”  For this, among a myriad of other reasons, Apple relieved themselves of the headache and built the ‘Secure Enclave’ instead.  Individual iPhones encrypt extended data using a unique key, mathematically derived by combining the user’s passcode with a set of secret numbers that are built into the phone.  Tim Cook himself couldn’t decrypt it without the user’s passcode and physical access to the device.  By extension, Apple is now rid of thousands of subpoena requests and pressure from a variety of global governments.
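
To make the idea concrete, here is a deliberately simplified sketch (my own illustration, not Apple’s actual design) of deriving a key from a passcode mixed with a device-bound secret, so that neither alone can recreate it:

    import hashlib

    DEVICE_SECRET = b"\x00" * 32   # placeholder; imagine a secret fused into the silicon

    def derive_device_key(passcode):
        # Stretch the passcode and bind it to the device secret; without both
        # inputs (and the physical device), the key cannot be reproduced.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_SECRET,
                                   1_000_000, dklen=32)

    key = derive_device_key("4951")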

Despite the claims that law enforcement’s hands would be tied by this development in time sensitive situations such as kidnapping cases, Bruce Schneier asserted in a CNN editorial that “of the 3,576 major offenses for which warrants were granted for communications interception in 2013, exactly one involved kidnapping.”  So much for that theoretical importance of maintaining access to user phones.  More importantly, Schneier points out that phone data “can no longer be accessed by criminals, governments, or rogue employees. Access to it can no longer be demanded by totalitarian governments.”

This is another complication.  Even if the FBI and other US law enforcement agencies were the absolute pinnacle of tech-fueled crime-fighting and the removal of communication intercepts truly shackled their efforts… at least it closes the door to other, more suspect governments.  Apple, Samsung and others can’t really play international favorites, after all.  If they were able to, and willing to, provide backdoor access to the USA, they would have obligations to North Korea as well.

Apple washed their hands of the encryption problem by abdicating their role as a middle man and gatekeeper, and the internet didn’t break.  Law enforcement and other agencies seem to still be solving crimes, even without their former favorite toy.  Possibly most important, the ship has sailed before other governments could flex their muscles.  Just like Iran banned WhatsApp.  Just like India forced Blackberry to provide a law enforcement backdoor.  The UK has long been a supporter of citizens’ rights and privacy.  Thankfully, Apple ended this conversation long before the Prime Minister’s kneejerk reaction, wishing out loud for a technology-driven vaccination against terrorism.  We can only hope that other phone manufacturers follow suit quickly.

I sympathize with the victims in France.  I understand the sentiments of David Cameron.  But now, more than ever, it is crucial that we protect our liberty by protecting our privacy.  If we are forced to sacrifice our rights, we have already lost the war.


24 Jan 2015

On Encryption Keys – Part 1 – What Is a Key?

Last week I met with a customer to help solve, among other things, some challenges around key management and key lifecycles. I thought I’d kick off a blog series on keys: what they are, their generation, use, recommended strength, etc.

First, let’s briefly address what a key is: a key is what protects your data. It’s a (hopefully!) secret parameter fed into an encryption algorithm to obfuscate data in a way that only someone with the same key can decrypt the data and read it as intended.*

Here’s how I explained it to my 10-year-old daughter:

Think about the door to our house. When the door is locked, only someone with a key can get inside. (OK, it sounds more like authorization, but stay with me.) When inserted and turned, the key lifts the pins that release the locking mechanism and unlocks the door. That key is the only key that can lock and unlock our door.

While quite elementary in my mind, it’s a relatively good example of the value and importance of the key lifecycle, which I briefly discussed with my daughter after she asked the following questions:

  • What if someone copies the key?
  • What if our neighbors lose our spare key?
  • How do we know if someone else used our key?
  • Does someone else’s key work in our lock?

All are relevant questions in relation to cryptography as well. Over the next couple of weeks, we’ll talk about how keys should be generated, ideal key sizes, and general key management issues and best practices.

Fair warning: there is no single, correct answer. We’ll use this series to address dependencies and variables such as environments, data sensitivity, and threat models.

*This is known as symmetric encryption, where one key encrypts and decrypts data. In asymmetric encryption a public key is used to encrypt data and only its associated private key can decrypt the data.
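
For anyone who wants to see the footnote’s distinction in code, here is a brief sketch in Python using the cryptography package; the messages, key sizes, and padding choice are just illustrative.

    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Symmetric: the same shared key both locks and unlocks.
    shared_key = Fernet.generate_key()
    token = Fernet(shared_key).encrypt(b"meet at noon")
    assert Fernet(shared_key).decrypt(token) == b"meet at noon"

    # Asymmetric: anyone can lock with the public key; only the private key unlocks.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    ciphertext = private_key.public_key().encrypt(b"meet at noon", oaep)
    assert private_key.decrypt(ciphertext, oaep) == b"meet at noon"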



5 Jan 2015

My Worry and Optimism for Cybersecurity in 2015


Let’s face it – 2014 was pretty bad from an information security perspective, and I believe we will see a rise in the frequency, severity, and publicity of malicious hacks and breaches in 2015.

I’m worried that as a community, hell, as a society, we won’t see enough progress in this uphill battle of infosec. I’m not blaming anyone or pointing fingers. Security is hard because every organization is different: different people, different policies, different network topologies, different vendors, different missions, etc. (and that is why there is no silver bullet for security). In general, I’m worried about some SMBs that lack the resources to set up a proactive security posture. I’m concerned about some large enterprises that will continue to lag and not fully embrace security.

But… I’m optimistic. Security is at the tip of everyone’s tongue now. It’s “cool” … and cool is good.

SMBs have options for cloud productivity and storage solutions with security built in – at the very least, better security than what they could do themselves. Larger organizations can integrate many different solutions to enable their security posture.

Security is about defense-in-depth, which is to say having security at all layers, from policy and training to two-factor auth and encryption. Aggregate organizational differences can be met with the right technologies in the right place.

I’m optimistic because there are so many good and talented people working very hard to stay ahead of the bad guys. There are new technologies and new ways of thinking. There are VCs willing to fund such companies. There is more adoption and acceptance of security in the marketplace. There are companies with an assigned CISO to keep their business focused on security and out of the news.

So how do we make 2015 better to ease my worrying and reinforce my optimism?

Everyone: keep evangelizing. We have to keep talking about security and encouraging it. We need to think about security in new and emerging markets like wearables and IoT. I think after all the news in 2014, stakeholders are starting to get it. Perhaps we need better / tighter regulations. We need to talk about what’s real, what’s viable, and what’s manageable.

Product vendors: build security into your lifecycle. It’s doable. Microsoft initiated the Security Development Lifecycle with impressive if not astounding results. Cisco is doing it, along with many others. Security is a process. Bake it in to your software development. It’s good for you and your customers.

Buyers: check for the right encryption. Not all encryption is equal. Is your vendor using homegrown encryption written by Joe the Intern? Or is it standards-based? Just because a vendor says they implement AES doesn’t mean they do it correctly. Encryption needs to be implemented correctly to be trustworthy and interoperable. Look for FIPS 140 validation on your preferred vendor’s encryption library or ask for the certificate number.

All businesses: Assess the value of your data and where it resides. Then deploy the right products. Security is a process. Organizational security starts with security risk management, which guides the organization in protecting its assets. Before selecting security controls, the organization must know what data it needs to protect, the value of that data, and the lifecycle of that data. Whether protecting credit card numbers, user files, intellectual property, internal emails, provocative Mardi Gras photos, product roadmaps, money… all of that needs to be protected in an organized and actionable way.

Over time, we’ll explore more in each of these areas. In the meantime, this worrier is optimistic that we will stay focused, deliver, and do our best to make 2015 better.



22 Dec 2014

The Sony Hack Just Does Not Matter

Several times this year we’ve heard about hacks and compromised systems (more so than I can remember in recent history), and I have to say I’m truly amazed at all the press on the Sony hack. But why is this garnering so much attention?

Simply put, its effects are felt by a wider audience.

  • Sony cares because of loss of revenue and tarnished reputation.
  • Movie stakeholders (the producers, actors, etc.) care because it could impact them financially. I have never read the relevant agreements for this industry, but I’m sure there is a force majeure clause that will now be subject to an unprecedented interpretation and set a great deal of legal precedent going forward.
  • Theater owners / workers care because of supposed threats against their establishment, loss of revenue, and the inconvenience of replacing a movie in their lineup.
  • Consumers care because they can’t see a movie with some very funny comedians.

Banks or retailers get hacked and it makes the news for a couple of days and fades. Maybe it’s not serious enough? The Home Depot, Target, and Staples attacks don’t really take anything away from the consumer. They can still shop at those places, albeit with new credit card numbers. So they don’t really feel the effects. An entertainment company is hacked and it’s an act of war… er, “cyber-vandalism.” So much so that the President has weighed in and vowed a response. I guess compromising a retailer is just a nuisance.

Finally, there is a breach that consumers actually care about. The JPMorgan breach doesn’t directly affect the average family. We are, sadly, getting accustomed to being issued new credit cards and putting band-aids on breaches in that industry. We can tolerate the Fortune 50 losing money, but don’t mess with our entertainment. That is intrinsically American.

Perhaps I should rethink this title, as now attackers may have found an avenue that will encourage even more attacks. And let’s face it: we have thoughts of actual war dancing through our heads. This isn’t script kiddies and folks just looking to make a quick buck. These are hackers with nukes.

At SafeLogic we’ve done a fair bit of evangelizing this year, trying to get makers of IoT devices and health wearables to build security in as opposed to treating it as a cost center and a reactive initiative. So with that in mind, let’s think about this:

If halting the release of a movie gets this much attention and buzz, what happens if critical infrastructure is compromised? What if people can’t get water? Or they get only contaminated water? What if the power grid is blacked out? What happens when connected “things” are compromised? These are the absolute scariest scenarios, the effects of which are far more impactful than what you’ve been reading about this week. These effects are real.

Let’s not discover what happens in these “what if” scenarios. We need awareness and we need plans and we need action. I’m hoping that everyone takes the Sony hacks to heart and thinks about what truly matters… Especially this time of year.

Oh, and encrypt your data with SafeLogic’s validated and widely-deployed encryption solutions.


27 Oct 2014

Exposing the Risks of Data-Driven Healthcare

This is a guest post from blogger Jared Hill as a special contribution to SafeLogic.

The recent Heartbleed and POODLE vulnerabilities exposed some of the major dangers of living in a digitized world. With the entire healthcare system becoming increasingly reliant upon digital organizational systems, a patient’s most private information — prescriptions, records, communications, you name it — might be vulnerable to hacks. While many hoped doctor-patient confidentiality and legal privacy rights would be easily applicable across the board, this guarantee simply cannot be made in the digital realm.

Illegally obtained medical records promise huge sums of money on the black market, more so than customer or banking information, or even risqué photos of famous celebrities. Certain kinds of personal information are very valuable for those wanting to pose as someone else in order to obtain medical care. Although there are dozens of cybersecurity-related legislative proposals before Congress and amendments made to pre-existing legislation, notably the Health Insurance Portability and Accountability Act (HIPAA), there is still much work to be done to safeguard patients against data hacking.

The Heartbleed “mishap” incited widespread privacy and identity panic, particularly from those within the healthcare sector, but also among other professionals who are now culpable for such data leaks. It has suddenly become glaringly obvious that thousands of servers are vulnerable to attacks from outside intruders, and it’s also clear that older Secure Sockets Layer (SSL) implementations are not as safe as experts believed.  POODLE has illustrated the dangers of misconfiguration and untrusted networks in its own way.

The real question, then, is by what means can healthcare companies safeguard themselves against the next threat?  Some are confident that new federal programs like FedRAMP will be helpful towards that end. One health IT expert was optimistic recently, saying, “Ideally, the FedRAMP regulations will adequately address common security concerns, such as multi-tenancy and shared resource pooling, and provide a standard set of regulations that would ensure secure cloud usage in the Healthcare industry.”  That would be a major step in the right direction.

Whether FedRAMP or the amendments made to HIPAA will increase patient privacy and data security remains to be seen; they may simply not be strong enough.  Devices are emerging that have the ability to record DNA, heartbeat patterns, and a myriad of other integral and unique personal characteristics. Instead of solely responding to current issues and security breaches, startups and tech companies need to have a conversation now regarding exactly how users will be protected from technology that won’t arrive for another decade.

Rohit Sethi, vice president of security consulting firm Security Compass, said, “Maybe down the road our heartbeat, for example, becomes the main way we prove our identities.  And if we didn’t protect it 10 years ago, we don’t have a way of correcting it. So we have to treat it as serious now because we can’t predict the future.”

Sethi has a point, and a frightening one at that. Sethi cites startups (responsible for creating many of the latest apps and storage systems) as a particularly worrisome area. While established companies have spent years understanding security breaches, startups are often run by young, motivated techies who are concerned about the innovation of the product first, and user security as a distant second.

Sethi predicted that, unless strong regulations are implemented and upheld, everything from medical information to our DNA fingerprints could all become subject to theft and misuse. “You can get a credit card reissued,” Sethi said, “but you can’t reset your heartbeat or your DNA.”

15 Oct 2014

Putting a Muzzle on POODLE

You may have seen the news about POODLE recently.  The good news is that it’s not as severe as Heartbleed, which affected server-side SSL implementations and had repercussions across most web traffic. The bad news is that it’s still seriously nasty.

POODLE is an acronym for Padding Oracle On Downgraded Legacy Encryption and essentially allows an attacker to decrypt SSL v3.0 browser sessions. This man-in-the-middle attack has one major constraint: the attacker has to be able to intercept the victim’s traffic, typically by being on the same wireless network.

That renders POODLE irrelevant because everyone locks down their wireless networks, right? Oh yeah, except those customer-friendly coffee shops with public wifi. In places like Palo Alto, you can bet there is a *lot* of interesting information going over the air there. Or at conferences, where diligent employees handle pressing business and aggressive stock traders log in to their accounts to buy the keynote speaker’s stock (or short it if the presentation lacks luster).  The threat is real – session hijacking and identity theft are just the tip of the iceberg.

It’s worth noting that this is a protocol-specific vulnerability and not tied to vendor implementation (such as Heartbleed with OpenSSL and the default Dual_EC_DRBG fiasco with RSA). That makes it a mixed bag. The issue affects a wide variety of browsers and servers (Twitter, for example, scrambled to disable SSLv3 altogether), but users do have some control.  SSLv3 can also be disabled on the client side in some browser configurations, so check whether your current settings are vulnerable and install any patches for your browser as they become available.
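
If you happen to be making TLS connections from your own code rather than a browser, the same idea applies; here is a small Python sketch (the URL is a placeholder) that refuses SSLv3 on the client side. Recent Python versions already exclude SSLv3 by default, so the explicit flag is only for emphasis.

    import ssl
    import urllib.request

    ctx = ssl.create_default_context()   # modern defaults already exclude SSLv3
    ctx.options |= ssl.OP_NO_SSLv3       # make the refusal explicit anyway
    urllib.request.urlopen("https://example.com/", context=ctx)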

Some browser vendors have already made moves to patch against this threat and permanently disable SSLv3.  Meanwhile, others have dubbed the server-side vulnerability “Poodlebleed” and offer a diagnostic tool to check whether a given server still accepts SSLv3 connections.

From a government and compliance perspective, Federal agencies should be using TLS 1.1 according to Special Publication 800-52 Rev 1. TLS 1.1 is not susceptible to POODLE. FIPS 140 validations and SafeLogic customers are not affected.

If you’re interested in a deep dive, I recommend this fantastic technical post by Daniel Franke, which also provides a great history of SSL and its challenges.