August 2013 | SafeLogic

Archive for August, 2013

27 Aug 2013

A Look Inside API Technologies’ Use Case

At the beginning of this month, SafeLogic COO Wes Higaki discussed the obstacle course known as UC APL certification.  He presented the strategy of leveraging CryptoComply with RapidCert and pushing both the FIPS 140-2 validation and the JITC validation in parallel for maximum efficiency.  Higaki also foreshadowed the release of a case study that illustrates the strategy in action.

Today, I am proud to announce that SafeLogic has published our case study with API Technologies.

It explains how CryptoComply and RapidCert were deployed within the ION Networks product line.  Not only did this allow API Technologies to pursue their UC APL certificate on an accelerated timeline, but SafeLogic also brought FIPS 140-2 validated encryption to their products virtually immediately.

The API Technologies Case Study is available within our Exclusive Content area, alongside case studies featuring Symantec and Juniper, whitepapers on HIPAA and FISMA, and complete chapters from FIPS 140 Demystified: An Introductory Guide for Vendors, the book written by SafeLogic founders Ray Potter and Wes Higaki.  I highly encourage you to register for access and read up.

If you are looking at FIPS validation with an eye towards the UC APL or another certification, get in touch with us immediately.  Time is extremely valuable in these situations, so let’s get started.  You needed SafeLogic six months ago.


22 Aug 2013

MIT and the Effectiveness of Brute Force Attacks

Researchers have asserted that current cryptographic systems are not as secure as we have believed.


That’s a daunting statement.

When you hear that an MIT professor is publishing a paper that attacks the fundamental premises of your career, it’s only natural to get the mental equivalent of biting into a sour lemon.

Luckily, the headlines are mostly sensationalistic and the research, while interesting, is no threat to the security industry, let alone the way of life in the developed world.  Let’s dig into this and figure out what exactly Professor Muriel Médard and her team are trying to say.

The concept in question is the level of uniformity in compressed source data.  Information-theoretic analyses assume the highest level of entropy and uniformity, even if the source fails to quite reach that level.  Médard says that it is this reliance on Shannon entropy that creates the issue.  Shannon’s 1948 paper was focused on communication, and advanced the idea that data traffic as a whole would average out any imperfections in the uniformity of individual pieces of data.  That is a fair assumption for communication, but not the ideal approach for cryptography.

Average uniformity is not the goal of encryption.  Rather, it is the familiar principle of the weakest link that explains the conceptual error.  When encrypted data is under fire from a codebreaker, we do not worry about the 99.99% of the data that is properly encrypted.  It is the weakest link, the portion that did not reach the highest level of uniformity and entropy, that is vulnerable and puts the entire data cache at risk.

“We thought we’d establish that the basic premise that everyone was using was fair and reasonable, and it turns out that it’s not,” says Ken Duffy, one of the researchers at the National University of Ireland (NUI) Maynooth who worked alongside Médard.

Essentially, these slight deviations in the uniformity of the data open the door for a brute force attacker to test a series of assumptions.  For example, an assumption that a password was in English, or even was based on an actual word, could accelerate the codebreaking process.  “It’s still exponentially hard, but it’s exponentially easier than we thought,” Duffy says.
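To make that intuition concrete, here is a minimal, illustrative Python sketch.  It is not taken from the MIT paper, and the distribution and numbers are invented purely for demonstration; it simply compares the Shannon entropy of a perfectly uniform source with that of a slightly skewed one, then measures how many guesses an attacker needs on average when candidates are tried in order of decreasing probability.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def expected_guesses(probs):
    """Expected number of guesses ("guesswork") for an attacker who
    tries candidates in order of decreasing probability."""
    ordered = sorted(probs, reverse=True)
    return sum(rank * p for rank, p in enumerate(ordered, start=1))

N = 2 ** 16  # hypothetical secret drawn from 65,536 candidates

# Ideal source: perfectly uniform, as the classical analysis assumes.
uniform = [1.0 / N] * N

# Imperfect source: a small "dictionary-like" subset is far more likely,
# e.g. passwords based on common English words.
common, rare = 1000, N - 1000
skewed = [0.9 / common] * common + [0.1 / rare] * rare

for name, dist in (("uniform", uniform), ("skewed", skewed)):
    print(f"{name:8s} entropy = {shannon_entropy(dist):5.2f} bits, "
          f"expected guesses = {expected_guesses(dist):8.1f}")
```

In this toy example, the skewed source still looks respectable if you only measure its entropy, yet an attacker who exploits its structure needs roughly a tenth of the guesses.  Scale the same effect up to real key and password spaces and you arrive at Duffy’s point: still exponentially hard, just less hard than the uniform assumption suggested.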

The good news?  (Yes, there is still good news.)

We are still talking about theoretical gains here, and real-world security remains very much intact.  Brute force attacks have always had a projected success window, but it was so astronomical that it was considered effectively moot.  This paper is simply saying that the window is slightly less astronomical, but likely still effectively moot.

As Matthieu Bloch of Georgia Tech states, “My guess is that [the paper] will show that some [algorithms] are slightly less secure than we had hoped, but usually in the process, we’ll also figure out a way of patching them.”

That’s a great attitude, Bloch!  Now let’s clear up a few misconceptions that this news has created.

1) We are suddenly vulnerable.

You really believe this?  I can guarantee that hackers recognized this anomaly long before MIT announced it to the world.  It’s not a skeleton key for the world’s data; it’s just something we can improve.

2) Shannon Entropy is useless and we’ve been wasting our time since 1948.

No, not exactly.  Shannon had the right idea when it came to data traffic.  The theory has just been misapplied to encryption.

3) Our entire system of cryptography is now in question.

Definitely not.  In fact, it is research like this that proves more than ever that it is crucial to stick with cryptography that has been properly tested, validated, and integrated.  Encryption is not something that should be improvised or cobbled together.

So stay tuned for more news from MIT, and we’ll keep you updated in this space.  If you’re using low-level, unvalidated encryption, please only do so with the understanding that it is no impediment to a motivated hacker.  And if you need encryption at the highest levels, you’re already at the right place.  Don’t hesitate to reach out.


14 Aug 2013

The Inaugural International Cryptographic Module Conference

I just got word that I have been invited to speak at the first ever International Cryptographic Module Conference.  What an honor!

The creation of this specialized industry event was long overdue, and I’m proud to be a part of the inaugural edition.  The conference is scheduled to be held September 24th to 26th in Gaithersburg, Maryland and includes stellar topics that address the unique challenges faced by those who produce, use, and test cryptographic modules that conform with standards such as FIPS 140-2.  This is right up our alley.

My session is entitled “FIPS and FUD” and I will be discussing the confusion surrounding vendor claims about FIPS validation.  While the CMVP was established to eliminate the inconsistencies and create distinctions, many buyers are stuck in a landscape of fear, uncertainty and doubt.  My goal is to clarify these topics and help end users identify what they really need in a cryptographic solution. And believe it or not, we might even have a little fun during the session.

Stay tuned for more details… and see you in Maryland!



8 Aug 2013

The UC APL Obstacle Course

Here at SafeLogic, many of our customers are focused on opportunities to sell to the US Federal Government.  The large, long-term contracts provide an incentive to jump through hoops and meet additional procurement requirements, a maze of complicated, expensive, and time-consuming barriers to entry.  But once you have qualified, you are in a great position to capture those deals, not to mention opportunities in the private sector.  In some cases, there are multiple layers of requirements that must be met.  One of the most stringent is the placement of a product on the UC APL.

The UC APL, the United States Department of Defense Unified Capabilities Approved Product List, is administered by DISA, the Defense Information Systems Agency.  It was created to centralize the available solutions for the Department of Defense (DoD) and to provide a standardized method for approval.  Before the UC APL, different agencies and branches had to create redundant approval processes.  The establishment of the UC APL was a major positive step for both the DoD and the product vendors, streamlining the process and creating a single resource for qualified solutions.

The advantage of being listed on the UC APL does not begin and end with the DoD.  Federal agencies in other branches prioritize solutions that appear on the UC APL, as they know and respect the vetting process exercised by DISA and the DoD.  In fact, just as FIPS 140-2 has become an internationally recognized standard, a listing on the UC APL carries comparable weight beyond its original audience.

In order to qualify for the UC APL, products must be rigorously tested by JITC, the Joint Interoperability Test Command.  This testing has two elements – Information Assurance (IA) and Interoperability (IO).  While IO testing has strict requirements in its own right, IA is typically the more challenging of the two.  IA testing adheres to the 2,000+ page Unified Capabilities Requirements document, and the product must meet a series of Security Technical Implementation Guides (STIGs), which outline best practices.  On top of that, Information Assurance demands that if the product includes encryption (and it nearly always does), a FIPS 140-2 validation exists.  Common Criteria sometimes comes into play as well.  Don’t worry, we can help there, too.

UC APL testing is truly a marathon, not a sprint, and it has many obstacles.  Ok, maybe it’s more like a mud run.  It can definitely get messy.  There is good news, though.  We have identified one way to accelerate the process.  As it turns out, the JITC parameters allow products to initiate testing upon their addition to the NIST (National Institute of Standards and Technology) CMVP (Cryptographic Module Validation Program) In Process list for FIPS 140-2.  In layman’s terms, this means that products that still need to earn their FIPS 140-2 certificate can go through both the JITC and the NIST CMVP validations simultaneously.  This is huge.  It’s like getting your bachelor’s degree while you’re still in high school.  Completing both certificates in parallel saves significant time and means that you can begin realizing revenue months faster.

CryptoComply and RapidCert, exclusively from SafeLogic, take full advantage of this opportunity.  Even for the folks who are aware of this rule, initiating a FIPS 140-2 validation is difficult.  The inertia is high during the first phase, documentation, which can take a great deal of time if you’re not leveraging RapidCert.  [For more detailed discussion of the accelerated timelines offered by CryptoComply and RapidCert, refer to my blog posts here and here.]  When a JITC certificate is part of the strategy, the opportunity is even bigger.

We will be posting a case study in the near future that illustrates this very strategy.  By implementing CryptoComply and adding the RapidCert option, a customer was able to take a product that had no FIPS 140-2 validation and within a month be in progress with their JITC certificate.  In the past, they would have been looking at approximately a year to get to the same stage.

So don’t hesitate to talk to SafeLogic.  There’s a lot at stake here, so let’s strategize and make things happen.