Conversations Archives | SafeLogic


29 Dec 2016

Happy New Year from SafeLogic

Well, it’s that time of year. You know, the annual, happy-go-lucky, turn-the-page-on-the-calendar, celebrate-the-new-year, use-too-many-hyphens blog post.

I’ve been reflecting on the beginnings of SafeLogic – how we got here, where we’ve been, and where we are headed next. Most of those reflections have been pleasant, but certainly not all. There’s no need to put lipstick on it. The nearly two years that I went without salary weren’t exactly “fun” and I’m glad that’s in the past. Or the times I felt like an inadequate leader because it felt like we weren’t living up to the ridiculously overblown expectations of Silicon Valley society. Or the times we invested in new ideas only to find failure (which is not a bad word, by the way).

I’m still thankful for all of those things, because they put SafeLogic on a path that almost leaves me (yes, even me!) speechless. Those sacrifices were made with the future in mind, and we are now reaping the benefits. We’ve had so many positives this year that bullet points hardly seem to do justice to the significant effort behind them, but here are some quick highlights:

– We added a dozen new customers and strengthened relationships with existing customers.

– Revenue doubled from last year. (That’s good, right?)

– The number of support tickets decreased by over 50%, signaling that the growth of our self-serve knowledge base is paying off.

– Average Time to Resolution on those support tickets is a fraction of what it was last year, a testament to the increased effectiveness of our technical team.

– 100% of support contracts were renewed. Always a good sign of customer satisfaction!

– Strategic additions to the team fueled these successes, which of course will then make it possible for more expansion. A very positive cycle.

On a personal note, I left the corporate world nearly 12 years ago to work for myself and at this very instant, I’m the happiest I’ve ever been. This is a journey that I could not undertake alone, and this team is the real deal. We have great products that customers want and need, and we help them solve real problems in innovative ways. Internally, we’ve grown and matured to the point that we are able to handle roadmap items and customer requests much more aggressively and proactively (and in some ways, automatically, which is extra cool).

So does all this reflection mean that we’re hitting pause because the CEO is happy? Oh hell no. We are just hitting our stride! Being content is nice, but never complacent. 2017 will be the year of more business innovation, of more new capabilities, of more milestones. Of, well, more.

This leads me to the mushy part:

Thank you, SafeLogic customers. Thank you for believing in us, and we promise not to let you down as we continue to grow. Thank you, SafeLogic team. Your hard work and commitment are appreciated more than I can express. Thank you, SafeLogic partners, friends, and allies for your support, advice, and contributions.

Here’s to a stellar 2016 and to keeping the momentum going in 2017!


18 Aug 2016

Encryption Concerns in the UK


This is a guest post from Amazing Support’s David Share as a special contribution to SafeLogic.

In the early days of 2015, the British Prime Minister at the time, David Cameron, put forth an idea to ban all forms of encryption in the United Kingdom (UK) built into software, especially messaging applications. This proposal to ban encryption followed Paris’ Charlie Hebdo massacre, in which the attackers were thought to have been communicating with each other using apps similar to WhatsApp and iMessage. Were this ban to be realized, a backdoor would have to be created into any and all apps, whether web or mobile-based, that utilise end-to-end encryption.

Encryption has become a battleground as of late. Government bodies, and those who fear that apps are being utilised for the propagation of terrorism, seem firmly entrenched in the idea of creating backdoors in these apps. Technology companies like Apple, and those trying to preserve what they perceive as the last vestiges of civil rights and privacy, are fighting to maintain encryption’s independence. Needless to say, both sides have their pros and cons.

Creating a backdoor, according to proponents like Cameron and current British Prime Minister Theresa May, would ensure that law enforcement and government agencies are able to monitor and act upon those who would cause harm to the UK. Using the Charlie Hebdo massacre as an example of how a ban on encryption could have helped, the argument does make a certain sense.

However, tech companies and cryptography experts fear that the creation of a backdoor does not ensure it could only be used by the “good guys”. To them, a backdoor is a legitimate vulnerability that could be exploited just as easily by foreign spies and corrupt police, among others. Businesses are concerned that it may portend the end of ecommerce as we currently know it, since almost all credit card transactions online are done through encrypted channels. If that encryption had a backdoor, it could create a sense of distrust among the consumer base and scare off business. Finally, there is the matter of privacy. If the encryption walls did fall by government command, users would be left terribly exposed and would have to worry endlessly about whether what they say online could be misconstrued as dangerous or, worse, an act of terror.

UK Prime Minister Theresa May

The proposal has been legitimised under Theresa May’s leadership and is now known as the Investigatory Powers Bill (IPB). According to May, the bill does not state that tech companies are forced to create backdoors in their encryption. However, it does require companies to provide decrypted messages upon the presentation of a warrant. This is a problem in and of itself, as messages from apps that utilise end-to-end encryption cannot be accessed by anyone without the proper password or code, and that includes the software publisher. So to comply with the IPB and present a decrypted message, some sort of backdoor will be needed. Through the use of sly wording, May and the IPB are effectively forcing tech companies to create backdoors after all, lest they face a potential ban from operating within the confines of the UK.

Already known as the Snooper’s Charter, the IPB will test the limits of how much privacy tech companies and citizens are willing to relinquish. If the IPB ever becomes law, the government or any law enforcement agency need only find cause to issue a warrant to gain access to any citizen’s message history. May and her supporters insist that they will only do this to people who may pose a risk to the safety of the nation, but who is deemed a threat can take on many meanings. Opponents of the IPB are afraid that this could and would lead to breaches of privacy laws, even going so far as to say that it would go against portions of the European Convention on Human Rights. Those challenging the bill are asking Britons whether they want to join the ranks of countries such as China and Russia, which closely monitor and even dictate what sites can be browsed, what data can be accessed and what messages can be sent.

It seems that May and the current government are selling the IPB under the guise of improving national security. However, they have failed to answer opponents’ concerns about the negative effects of the bill – the potential invasion of privacy and the creation of a new attack vector for malicious hackers. May says that the bill does not infringe on the rights and privacy of citizens, but experts on the matter believe otherwise. More frighteningly, May and her party have yet to come up with a rational solution to the security problems that the creation of a backdoor poses.

If Britons want to stand up and make their voices heard, they should do it sooner rather than later. The bill has already made it to the House of Lords and passed its second reading, and is now headed to the committee stage on the 5th of September. As it stands, without strong opposition from within the House or from the people, the IPB will almost surely be passed and become law.

30 Dec 2015

Bring on 2016!

Ahh, the year-end crunch time is here. Closing and reconciling the books. Working with our customers to get in (or delay, when strategic, of course) last minute invoices and accruals. Making sure contracts are executed before the calendar flips over. Catching up. Projecting out. Forward planning. Requisite CEO year-end blog posts like this one. Check it off the list, Marketing Team!

To say that our 2015 was dynamic at SafeLogic is an understatement. As I’m recapping and reviewing our goals for 2015, I see areas where we “crushed it” (in the Silicon Valley lexicon), areas for improvement (yes, it’s a nice way to say that we dropped the ball on a few initiatives and no, I’m not too proud to admit it), and areas for new growth and development. I’m glad this year is behind us, because I’m just so damn ready for 2016.

SafeLogic’s 2016 campaign will be about growth, balance, and clarity. Almost like the plans of current Presidential candidates, but without the lunacy and grandstanding, and a lot less spend on TV commercials (sorry, Marketing Team). So how will these elements unfold?

Well, we added some very high profile customers to our wall this year, and we’ll grow our share in the market. We’ll increase our team and improve our infrastructure to support those new clients. We will balance delivery, professional development, budgets, customer requirements, and every other moving part that defines a software company. We’ll move quickly but carefully. We’ll work on the right things for our customers and for the industry, while having clear communication internally and externally.  We’ll have a lot of fun while delivering on very serious business-driven goals.

It’s going to be an exciting time. We’re launching some of our Skunk Works projects this year, and we’ve got new projects bidding to be added to the docket. It isn’t always easy to bring innovative and progressive new ideas to a field that is historically stagnant, challenging, and sometimes nonsensical (I’m talking to you, encryption, and you, regulatory compliance). But it’s what we do. And while I think we always have room for improvement, I think we do it pretty damn well, so expect more of the same next year, in higher dosages and more frequently.

I’m thrilled about the new year. We have the right priorities, the right team, the right solutions, and the right processes in place at SafeLogic. Now will someone please turn the calendar over to January? We’re ready to rock!


23 Sep 2015

Changing Seasons

Credit: Jean-Pol GRANDMONT

Happy Autumnal Equinox, everyone!  Yes, it’s the first day of fall for the northern hemisphere (and by proxy, the first day of spring for everyone down under) and I’m back blogging.  Football is back and playoff baseball is nearly here. (Go Dodgers!) Leaves are turning, pumpkins are growing, and there’s a lot to catch up on.

It’s been a long, hot, El Niño summer here in San Diego, where I’m based. While I spent some time at the beach like every San Diegan, the big chunk of time was devoted to working with the awesome SafeLogic team, reviewing and polishing key details of great things to come.  While I cannot yet reveal what’s in store, I will say this – we’ve worked hard to align each piece of the puzzle to best benefit our existing and future customers alike.  Our goal is to display our unwavering commitment to disruption on behalf of our clients.  The current model of FIPS 140-2 certification is broken and we are doing our best to insulate our customers and keep blazing new trails.

So why do you care?

Well, if you want to have a validation completed by the end of the calendar year, you should definitely reach out asap.  Along with official announcements in this space, we will be rolling out some new blog posts pertaining to specific verticals and solutions, as well as recaps and commentary related to this season’s industry events.  It’s going to be a busy Q4, let’s just say that. Stay tuned!


27 Mar 2015

Security on the Road

Travelling isn’t easy. I’ve been hitting the road more often lately, and even beyond the normal complications (Did I remember to turn off the thermostat? Did I lock the door?), security concerns rear their ugly heads the minute you walk out the door. Here are a few thoughts on my own best practices for travel security.

Your phone and laptop should always have a password lock enabled, but even if you insist on skipping that precaution at home, please do yourself a favor and enable it on the road. I can’t count how many times I’ve heard the horror stories of leaving a device in a taxi. (or Uber. Or Lyft. Pick your poison.)

This is just hilarious. No, it’s not me.

If you’re flying, TSA poses a hurdle as soon as you hit the airport. I always remind myself to be 100% vigilant at the luggage x-ray machine and metal detector… not because I think I need to stop the next hijacking plot, but because anytime my phone, keys, passport, laptop and everything else are exposed and out of my immediate control, I need to be on my game. If you have travelled with me before, you noticed that I’m completely willing to be ‘That Guy’ who holds up the line. Why? Because there’s not a chance in hell that I’m walking through the body scanner before my personal items have been gobbled up by the conveyor belt to the x-ray machine. No, I don’t trust the TSA agents or anyone else to ensure that my laptop makes it through. Especially when the next three people in line have identical MacBooks to mine. Maybe I should add a SafeLogic sticker to differentiate it on the road. Or I should register for TSA Pre, so I can leave it in my bag.  Note to self.

Once you’ve made it to the gate, whether you’re at the airport, train station, or friendly local HyperLoop stop, the dilemma inevitably arrives before your boarding call.

Free, open WiFi. Do you connect or not?

I’ve asked that question of a lot of smart people that I respect, and the answers vary. Sometimes the folks that I expect to be most paranoid admit that they use every Starbucks hotspot that they can find, without hesitation. Others eschew any connection that has not been provided and approved by their employer, lest they inadvertently cause a data breach. It’s about the liability. Me? I take precautions, but I’m usually more worried about the weirdo sitting next to me trying to eyeball my screen than about getting singled out and sniffed among the thousands of connected devices on the network.

I’m forced to be more accepting of dodgy WiFi locations if I’m traveling abroad for pleasure, though. When I’m on vacation outside of the States, I usually just remove my SIM card. It protects me from unwanted phone calls while I’m relaxing. More importantly, it protects me from unwanted roaming charges. Nobody likes a 5-figure mobile bill when they get home. It does require me to leverage WiFi when offered at the corner boulangerie or pub so I can plan my next destination, but it’s usually well worth the trade-off. (Pro tip: load a local map in your phone’s map app while you are connected… then even without WiFi, your GPS beacon will appear and give you a fighting chance to navigate accurately.)

But I digress. Once you arrive at your location, plastic is your lifeline. Better hope your credit or debit card doesn’t get stolen, forgotten, eaten by a rogue ATM (yes, that actually happened!), or, possibly more aggravating, disabled by a fraudulent-use flag. The founders of Final give a great example in their origin story and built a product with potential to save us from similar issues in the future. In the meantime, make a solid contingency plan for when your go-to card is unavailable. (No, panhandling is not a viable contingency plan.)

Technology can be your friend with the sheer volume of traveling documents, too. I like to use the Apple Passbook for my airline boarding pass whenever possible. Removing the paper slip from circulation means one less thing I need to keep safe. This is true for your itinerary, train tickets, directions, and many other items. The only catch is knowing whether your app of choice is secure.  Naturally, I gravitate towards solutions from trustworthy sources, especially those that I know have prioritized data security with strong encryption.  SafeLogic customers, if I have the option!

Centralize and travel light. I’ve even eschewed the use of a wallet, choosing to carry the bare minimum – ID, cash, debit card and credit card – in a specialized case for my phone. Thanks, Speck. Just one more thing that I no longer have to keep safe.

Lastly, you must cover your tracks like a trained assassin.

• Used the WiFi at your AirBNB flat? Disavow the network on your devices.
• Used a smartlock system like Kēvo to access your rental? Delete delete delete!
• Used the Bluetooth connection to play Pandora or Spotify tunes in your rental car? Make sure to remove your phone from the ‘paired devices’ list on the vehicle console. (I’m looking at you, Kevin Chiu who paired his Samsung Galaxy S5 with that blue Toyota Camry in San Jose before I rented it!)

If you consider the repercussions of every byte you receive and packet you send, plan for worst-case scenarios that could leave you stranded, and memorize at least one phone number to call collect from a pay phone, you’re in good shape. Or at least hopefully better than you were 10 years ago.


8 Feb 2015

On Encryption Keys (and Anthem) – Part 2 of 2

The Anthem breach encouraged me to wrap up this blog series and talk about key management in a genuine security context. When the Anthem breach was first made public, it looked as if patient records were accessed because of a lack of data encryption. Then Anthem stated the real reason for the breach: they only encrypt data in flight to/from the database(s) and rely on user credentials for access to data in the database. Why didn’t they encrypt the data in the database? Well, per Health Insurance Portability and Accountability Act (HIPAA) requirements, they don’t have to, as long as they protect the data via other means. Like elevated credentials.

That worked well, didn’t it?

They were compliant, but obviously not secure. To add more security to compliance programs like HIPAA, there have been some cries for enterprises to implement encryption. So how do you encrypt data properly? Well, it all depends on your environment, the sensitivity of the data, the threat models, and any tangible requirements for regulatory compliance. Here are some general guidelines:

  • Use validated encryption.
  • Use strong, well-generated keys.
  • Manage the keys properly.

Use validated encryption. Federal Information Processing Standard (FIPS) 140 is the gold standard. The Advanced Encryption Standard (AES) is one of the FIPS-approved algorithms for data encryption, and it is a better encryption algorithm than what Joe the Computer Science Intern presented in his thesis project. It just is. Plus, part of the FIPS 140 process involves strenuous black box testing of the algorithms to ensure they’re implemented properly. This is crucial for interoperability, and proper implementation of the AES standard also provides a measure of confidence that there aren’t leaks, faults, etc. Always look for the FIPS 140 certificate for your encryption solution.
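To make that concrete, here’s a minimal sketch of authenticated encryption with AES-GCM, using the open-source Python `cryptography` package purely as an illustration (my choice of library, not a statement about any particular validated module; the record contents and metadata are made up):

```python
# Minimal AES-GCM round trip. Illustrative only; in production, use a
# FIPS 140 validated module rather than whatever happens to be installed.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit data encryption key
nonce = os.urandom(12)                      # 96-bit nonce; never reuse with the same key
ciphertext = AESGCM(key).encrypt(nonce, b"patient record #42", b"record-metadata")
plaintext = AESGCM(key).decrypt(nonce, ciphertext, b"record-metadata")
assert plaintext == b"patient record #42"
```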

Use well-generated keys. A password-based key (PBK) is crap. Here a key is derived from a password after it’s hashed with a message digest function. PBKs are crap because most passwords are crap. They’re subject to brute-force attack and just should not be used. Password-Based Key Derivation Function v2 (PBKDF2) makes password-based keys a bit stronger by conditioning the digest with random elements (called salt) to decrease the threat of brute force. But the threat is still there.
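For reference, this is roughly what salting and stretching look like with PBKDF2, again sketched with the Python `cryptography` package; the iteration count is an assumption you’d tune for your environment, and the derived key is still only as strong as the password behind it:

```python
# PBKDF2: derive a key from a password with a random salt and a work factor.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

salt = os.urandom(16)                 # random salt, stored alongside the ciphertext
kdf = PBKDF2HMAC(
    algorithm=hashes.SHA256(),
    length=32,                        # derive a 256-bit key
    salt=salt,
    iterations=600_000,               # slows brute force, but a weak password is still weak
)
key = kdf.derive(b"correct horse battery staple")
```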

Keys should be as unpredictable and “random” as possible. Unfortunately in software environments it’s difficult to obtain truly random data because computers are designed to function predictably (if I do X, then Y happens). But let’s say you can get provable random data from your mobile device or your appliance. Use that to feed a conditioning algorithm and/or pseudorandom number generator. Then use that output for your key.
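In most software environments, “use that output for your key” boils down to asking the operating system’s CSPRNG for the bytes rather than inventing your own generator. A one-line sketch:

```python
# Draw a 256-bit data encryption key from the OS CSPRNG.
import secrets

dek = secrets.token_bytes(32)
```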

Use strong keys. The strength of a key depends on how it’s generated (see above) and how long the key is. For example, the AES algorithm can accommodate key sizes of 128-bits, 192-bits, or 256-bits. Consider using a key size that correlates to the overall sensitivity of your data. In Suite B, 256-bit keys can be used to protect classified data at the Top Secret level. Is your data tantamount to what the government would consider Top Secret?

Also consider the environment. Constrained and embedded environments (think wearables) may not have the processing power to handle bulk encryption with 256-bit keys. Or maybe data is ephemeral and wiped after a few seconds and therefore doesn’t need “top secret level” encryption. Or maybe there’s just not enough space for a 256-bit key.

Use a key that is strong enough to protect the data within the constraints of the environment and one that can counter the threats to that environment.

Manage your keys properly. You wouldn’t leave the key to your front door taped to the door itself. Hopefully you don’t put it under the doormat either. What would be the point of the lock? The same applies to information security. Don’t encrypt your data with a strong, properly generated data encryption key (DEK) then leave that key under the doormat.

Consider a key vault and use key encryption keys (KEK) to encrypt the data encryption keys. Access to this key vault or key manager should also be suitably locked down and tightly controlled (again, many different ways to do this). Otherwise you might as well just not encrypt your data.
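As a bare-bones illustration of the KEK/DEK split, here’s a sketch using the AES key wrap construction from the Python `cryptography` package; the “vault” is hand-waved as a plain dict, where in practice it would be an HSM or a hardened key manager:

```python
# Wrap a data encryption key (DEK) under a key encryption key (KEK).
import os
from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

kek = os.urandom(32)        # key encryption key, held in the vault / HSM
dek = os.urandom(32)        # data encryption key used for bulk encryption

vault = {"record-store-dek": aes_key_wrap(kek, dek)}   # only the wrapped DEK is stored

# At decryption time, unwrap the DEK with the KEK, then use it on the data.
recovered_dek = aes_key_unwrap(kek, vault["record-store-dek"])
assert recovered_dek == dek
```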

While we’re at it: rotate your keys, especially your KEKs. Key rotation essentially means “key replacement” … and it’s a good idea in case the key or system is compromised. When you replace a key, be sure to overwrite the old key material with Fs or 0s to reduce any chance of traceability.

Store those DEKs encrypted with KEKs and protect those KEKs with tools and processes. And remember to balance security with usability: rotating your KEK every 2 seconds might be secure, but is your system usable?
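Rotation then amounts to unwrapping each DEK with the old KEK and re-wrapping it with the new one; the bulk data never has to be re-encrypted. A sketch that builds on the snippet above (same imports and `vault` dict):

```python
def rotate_kek(old_kek: bytes, new_kek: bytes, vault: dict) -> dict:
    """Re-wrap every stored DEK under a new KEK; bulk data stays untouched."""
    return {
        name: aes_key_wrap(new_kek, aes_key_unwrap(old_kek, wrapped))
        for name, wrapped in vault.items()
    }

new_kek = os.urandom(32)
vault = rotate_kek(kek, new_kek, vault)   # then retire and overwrite the old KEK
```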

Anthem wanted the data to be useful, which is why it wasn’t encrypted at the database. But that usability came at a high cost. The good news is that it is possible to encrypt data and have it be usable.

 


Encryption is a critical, necessary piece of a system’s overall security posture. But it’s not the sole answer. In Anthem’s case, records were accessed via those “elevated user credentials” … which means that malicious hackers were able to get into the authentication server and raise the privilege levels of user credentials (usernames/passwords) that they either knew or gleaned from the auth server. So in this case, it’s irrelevant whether the breached data was encrypted; the hackers had authenticated and authorized access to it.

So what’s the answer?

When this was first reported, I tweeted my answer, and it still holds:

Defense in depth means providing security controls to address all aspects of the system: people, process, and technology. Technology is the most difficult pillar to lock down because there are so many layers and threats, hence so many products such as firewalls, IDP, APT, IDS, SIEM, 2FA, AV, smart cards, cloud gateways, etc.

Encryption is a fundamental element for security of data at rest and data in motion (control plane and data plane). Even the strongest encryption with proper key management won’t protect data that is accessed by an authorized user, because it has to be usable. However, encrypted data and tight management of keys provides a critical, necessary piece to a robust security posture.

I hope this provides some guidance on how to think about encryption and key management in your organization.

 


3 Feb 2015

Privacy, Liberty & Encryption

David Cameron

It is unfortunate that, in the aftermath of the Charlie Hebdo murders and hate crimes in France, rallying cries for freedom of speech were twisted to interpret “free” speech as the opposite of “private” speech. A few weeks ago, British Prime Minister David Cameron spoke out, radically saying that “we must not allow terrorists safe space to communicate with each other,” going on to suggest that there should be no means of communication which the government cannot read. I’m in no way sympathetic to extremists or rebels who leverage privacy to plan nefarious and destructive acts, but I am certainly sympathetic to all of the innocent, law-abiding citizens whose civil rights would be trampled by such a policy.

It was just a few short months ago that certain US government officials cried foul when Apple solidified their encryption capabilities to the point that consumer data could not be deciphered, even under federal subpoena.  As Matthew Green wrote on Slate.com at the time, “Designing backdoors is easy. The challenge is in designing backdoors that only the right people can get through. In order to maintain its access to your phone, Apple would need a backdoor that allowed them to execute legitimate law enforcement requests, while locking hackers and well-resourced foreign intelligence services out.”  For this, among a myriad of other reasons, Apple relieved themselves of the headache and built the ‘Secure Enclave’ instead.  Individual iPhones encrypt extended data using a unique key, mathematically derived by combining their passcode with a set of secret numbers that are built into the phone.  Tim Cook himself couldn’t decrypt it without the user’s passcode and physical access to the device.  By extension, Apple is now rid of thousands of subpoena requests and pressure from a variety of global governments.

Despite the claims that law enforcement’s hands would be tied by this development in time sensitive situations such as kidnapping cases, Bruce Schneier asserted in a CNN editorial that “of the 3,576 major offenses for which warrants were granted for communications interception in 2013, exactly one involved kidnapping.”  So much for that theoretical importance of maintaining access to user phones.  More importantly, Schneier points out that phone data “can no longer be accessed by criminals, governments, or rogue employees. Access to it can no longer be demanded by totalitarian governments.”

This is another complication.  Even if the FBI and other US law enforcement agencies were the absolute pinnacle of tech-fueled crime-fighting and the removal of communication intercepts truly shackled their efforts… at least it closes the door to other, more suspect governments.  Apple, Samsung and others can’t really play international favorites, after all.  If they were able to, and willing to, provide backdoor access to the USA, they would have obligations to North Korea as well.

Apple washed their hands of the encryption problem by abdicating their role as middleman and gatekeeper, and the internet didn’t break. Law enforcement and other agencies seem to still be solving crimes, even without their former favorite toy. Possibly most important, the ship has sailed before another government flexes its muscles. Just like Iran banned WhatsApp. Just like India forced Blackberry to provide a law enforcement backdoor. The UK has long been a supporter of citizens’ rights and privacy. Thankfully, Apple ended this conversation long before the Prime Minister’s kneejerk reaction, wishing out loud for a technology-driven vaccination from terrorism. We can only hope that other phone manufacturers follow suit quickly.

I sympathize with the victims in France.  I understand the sentiments of David Cameron.  But now, more than ever, it is crucial that we protect our liberty by protecting our privacy.  If we are forced to sacrifice our rights, we have already lost the war.


24 Jan 2015

On Encryption Keys – Part 1 – What Is a Key?

Last week I met with a customer to help solve, among other things, some challenges around key management and key lifecycles. I thought I’d kick off a blog series on keys: what they are, their generation, use, recommended strength, etc.

First, let’s briefly address what a key is: a key is what protects your data. It’s a (hopefully!) secret parameter fed into an encryption algorithm to obfuscate data in a way that only someone with the same key can decrypt the data and read it as intended.*

Here’s how I explained it to my 10-year-old daughter:

Think about the door to our house. When the door is locked, only someone with a key can get inside. (Okay, that sounds more like authorization, but stay with me.) When inserted and turned, the key hits the pins that trigger the locking mechanism and unlocks the door. That key is the only key that can lock and unlock our door.

While quite elementary in my mind, it’s a relatively good example of the value and importance of the key lifecycle, which I briefly discussed with my daughter after she asked the following questions:

  • What if someone copies the key?
  • What if our neighbors lose our spare key?
  • How do we know if someone else used our key?
  • Does someone else’s key work in our lock?

All are relevant questions in relation to cryptography as well. Over the next couple of weeks, we’ll talk about how keys should be generated, ideal key sizes, and general key management issues and best practices.

Fair warning: there is no single, correct answer. We’ll use this series to address dependencies and variables such as environments, data sensitivity, and threat models.

*This is known as symmetric encryption, where one key encrypts and decrypts data. In asymmetric encryption a public key is used to encrypt data and only its associated private key can decrypt the data.
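To make the footnote concrete, here’s a tiny sketch of both modes using the Python `cryptography` package; the algorithms and key sizes are illustrative assumptions, not recommendations:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Symmetric: one secret key both locks and unlocks the data.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ct = AESGCM(key).encrypt(nonce, b"hello", None)
assert AESGCM(key).decrypt(nonce, ct, None) == b"hello"

# Asymmetric: anyone with the public key can encrypt,
# but only the private-key holder can decrypt.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ct2 = private_key.public_key().encrypt(b"hello", oaep)
assert private_key.decrypt(ct2, oaep) == b"hello"
```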

 


5 Jan 2015

My Worry and Optimism for Cybersecurity in 2015


Let’s face it – 2014 was pretty bad from an information security perspective, and I believe we will see a rise in the frequency, severity, and publicity of malicious hacks and breaches in 2015.

I’m worried that as a community, hell, as a society, we won’t see enough progress in this uphill battle of infosec. I’m not blaming anyone or pointing fingers. Security is hard because every organization is different: different people, different policies, different network topologies, different vendors, different missions, etc. (and that is why there is no silver bullet for security). In general, I’m worried about some SMBs that lack the resources to set up a proactive security posture. I’m concerned about some large enterprises that will continue to lag and not fully embrace security.

But… I’m optimistic. Security is at the tip of everyone’s tongue now. It’s “cool” … and cool is good.

SMBs have options for cloud productivity and storage solutions with security built in – at the very least, better security than what they could do themselves. Larger organizations can integrate many different solutions to enable their security posture.

Security is about defense-in-depth, which is to say having security at all layers, from policy and training to two-factor auth and encryption. Aggregate organizational differences can be met with the right technologies in the right place.

I’m optimistic because there are so many good and talented people working very hard to stay ahead of the bad guys. There are new technologies and new ways of thinking. There are VCs willing to fund such companies. There is more adoption and acceptance of security in the marketplace. There are companies with an assigned CISO to keep their business focused on security and out of the news.

So how do we make 2015 better to ease my worrying and reinforce my optimism?

Everyone: keep evangelizing. We have to keep talking about security and encouraging it. We need to think about security in new and emerging markets like wearables and IoT. I think after all the news in 2014, stakeholders are starting to get it. Perhaps we need better / tighter regulations. We need to talk about what’s real, what’s viable, and what’s manageable.

Product vendors: build security into your lifecycle. It’s doable. Microsoft initiated the Security Development Lifecycle with impressive if not astounding results. Cisco is doing it, along with many others. Security is a process. Bake it in to your software development. It’s good for you and your customers.

Buyers: check for the right encryption. Not all encryption is equal. Is your vendor using homegrown encryption written by Joe the Intern? Or is it standards-based? Just because a vendor says they implement AES doesn’t mean they do it correctly. Encryption needs to be implemented correctly to be trustworthy and interoperable. Look for FIPS 140 validation on your preferred vendor’s encryption library or ask for the certificate number.

All businesses: Assess the value of your data and where it resides. Then deploy the right products. Security is a process. Organizational security starts with security risk management, which guides the organization in protecting its assets. Before selecting security controls, the organization must know what data it needs to protect, the value of that data, and the lifecycle of that data. Whether protecting credit card numbers, user files, intellectual property, internal emails, provocative Mardi Gras photos, product roadmaps, money… all of that needs to be protected in an organized and actionable way.


Over time, we’ll explore more in each of these areas. In the meantime, this worrier is optimistic that we will stay focused, deliver, and do our best to make 2015 better.

 


22 Dec 2014

The Sony Hack Just Does Not Matter

Several times this year we’ve heard about hacks and compromised systems (more so than I can remember in recent history), and I have to say I’m truly amazed at all the press on the Sony hack. But why is this garnering so much attention?

Simply put, its effects are felt by a wider audience.

  • Sony cares because of loss of revenue and tarnished reputation.
  • Movie stakeholders (the producers, actors, etc.) care because it could impact them financially. I have never read the relevant agreements for this industry, but I’m sure there is a force majeure clause that will now be subject to an unprecedented interpretation and will set a great deal of legal precedent going forward.
  • Theater owners / workers care because of supposed threats against their establishment, loss of revenue, and the inconvenience of replacing a movie in their lineup.
  • Consumers care because they can’t see a movie with some very funny comedians.

Banks or retailers get hacked and it makes the news for a couple of days and fades. Maybe it’s not serious enough? The Home Depot, Target, and Staples attacks don’t really take anything away from the consumer. They can still shop at those places, albeit with new credit card numbers. So they don’t really feel the effects. An entertainment company is hacked, though, and it’s an act of war – sorry, cyber-vandalism. So much so that the President has weighed in and vowed a response. I guess compromising a retailer is just a nuisance.

Finally, there is a breach that consumers actually care about. The JPMorgan breach doesn’t directly affect the average family. We are, sadly, getting accustomed to being issued new credit cards and putting band-aids on breaches in that industry. We can tolerate the Fortune 50 losing money, but don’t mess with our entertainment. That is intrinsically American.

Perhaps I should rethink this title, as now attackers may have found an avenue that will encourage even more attacks. And let’s face it: we have thoughts of actual war dancing through our heads. This isn’t script kiddies and folks just looking to make a quick buck. These are hackers with nukes.

At SafeLogic we’ve done a fair bit of evangelizing this year, trying to get makers of IoT devices and health wearables to build security in as opposed to treating it as a cost center and a reactive initiative. So with that in mind, let’s think about this:

If halting the release of a movie gets this much attention and buzz, what happens if critical infrastructure is compromised? What if people can’t get water? Or they get only contaminated water? What if the power grid is blacked out? What happens when connected “things” are compromised? These are the absolute scariest scenarios, the effects of which are far more impactful than what you’ve been reading about this week. These effects are real.

Let’s not discover what happens in these “what if” scenarios. We need awareness and we need plans and we need action. I’m hoping that everyone takes the Sony hacks to heart and thinks about what truly matters… Especially this time of year.

Oh, and encrypt your data with SafeLogic’s validated and widely-deployed encryption solutions.
