Blog | SafeLogic


27 Aug 2014

Vegas is Scary

Vegas is scary. Well, not the city itself.  I love Las Vegas!  (And I’ll be there again soon for CTIA’s Super Mobility Week. Ping me to meet up.)  The hackers that descended upon the desert oasis for Black Hat and DEFCON are the scary ones.  Their bag of tricks, more specifically.

I was on a mission to find and pick the brains of the most interesting attendees.  I came away somewhat traumatized, since I knew just enough to be truly disturbed by how many vulnerabilities were discussed.  Here are just a few, with links to more commentary by PC Mag. Max Eddy and Fahmida Rashid both did a stellar job and should be followed on Twitter.

Nest is Cracked

Saw it, wrote about it, followed Yier Jin on Twitter (and he followed me back. Very cool.)  Bottom line – Internet of Things devices should not be a doorway into your entire home network.  Consumers should consider setting up a quarantine, at least until these manufacturers figure it out.

Side note: what the hell, Nest? You’re part of Google now. You’re commonly considered some of the best and brightest. Shouldn’t you be setting a better example for the IoT vendors to come?

Airport Security Scanners Are Vulnerable

I’m not sure this is a great classic hack, per se, but it’s definitely a candidate for the Darwin Awards.  Who are the geniuses that are hardwiring login credentials into TSA-issue airport security scanners?  And to make it better, connecting them to the public internet?  Billy Rios, director of threat intelligence at Qualys, successfully identified two such systems.  He located 6,000 connected scanners, two of which were at airports.  PC Mag reported that one has been decommissioned since.  I want to know where this last rogue system is located… and I’m considering not flying until it is removed.

Satcom Links Become Slot Machines

IOActive’s Ruben Santamarta was able to hack the satellite communications systems used in airliners, cruise ships and other remote deployments.  Again, using hardcoded credentials and backdoors, Santamarta proved that several methods of alternate communications are vulnerable.  Making matters worse, the use cases when these devices are in play are exactly the situations that you don’t want to be hacked.  If you’re hitting SOS on a plane or a boat, the last thing you want to see is a Black Hat video slot machine!

Google Glass Steals Passwords

Ok, that one looks like clickbait. In a way, it is. Qinggang Yue demonstrated that an iPhone or even a traditional camcorder would still do the trick, but the popular wearable poster child is the sneakiest.  He was stealing Android users’ PIN codes at an alarming rate – even 100% of attempts from 44 meters away, albeit with a camcorder on the fourth floor of the building to achieve an advantageous angle.  The upshot? Randomized keypads can’t become ubiquitous fast enough. They will negate the advantage of most PIN-stealing techniques, including this voyeur strategy. Without a direct and clear angle, Yue’s model has to make assumptions about the location of each button.  By randomizing the location, users will not be able to rely on muscle memory to unlock their phone, access the ATM, enter their front door, etc., but hackers will have to work much, much harder.
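The randomized-keypad idea is simple enough to sketch. Here is a minimal illustration (the function name and grid layout are my own, not from any shipping product):

```python
import random

def randomized_keypad():
    """Return the digits 0-9 laid out in a freshly shuffled PIN-pad grid.

    Because the layout changes on every unlock, an observer who can only
    infer *where* a finger landed (the basis of camera-angle attacks like
    Yue's) learns nothing about *which* digit was actually pressed.
    """
    digits = list("0123456789")
    random.shuffle(digits)
    # Three rows of three, plus a final row for the tenth digit
    return [digits[0:3], digits[3:6], digits[6:9], [digits[9]]]
```

The trade-off is exactly the one described above: users lose muscle memory, so each unlock is a little slower, but a recorded finger position no longer maps to a fixed digit.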

Photo by Ryan Clarke


Bonus Story – The Puzzle Mastermind Behind DEFCON’s Hackable Badges

Ryan Clarke aka LostboY aka LosT has a really cool gig. Wired’s Kim Zetter has the story, and while it’s not about a vulnerability, impending danger or security, I highly recommend taking a couple minutes to read it. Clarke designs seven badge types each year: attendees (humans), goons (conference volunteers), vendors, speakers, contest leaders, the press, and the Uber badge. Players have to collect each of them to decipher part of a math-based challenge. The lanyards holding the badges also contain puzzles. This level of creativity and craftsmanship is not commonplace, and it makes you want to attend DEFCON just to get one of these sophisticated works of art. And it makes me want to watch a movie like The Game again, just to get that thrill. Well done, LostboY, well done.



7 Aug 2014

Nest: Hacked or Just Jailbroken?

It is here, somewhere in the middle of the desert, among the inexplicably massive resort hotels that have risen from the sand over the years, that the experts have gathered.  First it’s Black Hat, then it will be ITexpo.  Right now is the lull between the storms.

Not much of a lull, though, to be honest.  After Yier Jin, a researcher and assistant professor at the University of Central Florida (go Knights!), blew the doors off the poster child for the Internet of Things at Black Hat, the hype machine has grabbed hold of the discussion and we’re in full swing.

One camp points to the discovered vulnerability in the Nest thermostat as proof positive of our future destruction.  The other takes it with a grain of salt, reassured by Nest Labs’ assertion that the unauthorized control requires physical access and should be considered a ‘jailbreak’, not a true hack.

I would fall somewhere in between the two schools of thought.  The latter doesn’t take the hack seriously enough, while the former is just a bit too convincing as Chicken Little.  But let’s take a closer look at the situation.

Sean Michael Kerner’s article at eWeek quotes Nest Labs’ statement.  “It doesn’t compromise the security of our servers or the connections to them and to the best of our knowledge, no devices have been accessed and compromised remotely.”

Jin, the researcher, didn’t claim to hack Nest’s servers or control any remote devices… what he did say is that he could theoretically interfere with future firmware updates, rendering a particular thermostat helpless to potential bugs, hacks and loopholes that will doubtless be discovered later.  In addition, Jin points out that by forcing his way onto the device, he would have access to network credentials.  Now we’re talking about a clear and present threat.

So perhaps the bigger problem here is not the hack of the thermostat – it’s that the network credentials are accessible from the device.  As Seth Rosenblatt points out at CNET, Black Hat has pivoted this year to a true discussion of security, leaving the topic of privacy for another time.  Jin clearly uncovered a distinct security issue, and I’m excited to see how the industry responds.  In the meantime, we’ll see what ITexpo brings to town.

In the immortal words of Hunter S. Thompson, “Buy the ticket, take the ride.”  IoT is here, and we are all along for the ride.  Let’s make the most of it.  Drop me a note if you’re here in Las Vegas for the conferences; I’d love to hear your opinions.


11 Jul 2014

Glass for the Masses

Wearables and the surrounding culture are evolving to the next generation right before our very eyes, and Google is firmly in the vanguard with the notorious Glass.

In the first months of 2014 alone:

San Diego traffic court heard a case against a Glass-wearing driver

Homeland Security interrogated an Ohio man wearing Glass in a movie theater

VSP, the #1 vision insurance provider, announced eligibility of Glass for subsidies

Google added sunglasses and prescription frames to the Glass lineup

Do you see a pattern?  I sure do.  Growing pains, and lots more to come.

Google is making a strategic effort to make Glass more accessible, but they have fallen short, yielding a not-quite positive reputation for their early adopters.  Perhaps any press is good press for Google, but I think it says something when “Glasshole” has been an entry in the Urban Dictionary for nearly a year before the device was even available for public sale.  Wearables are clearly poised for mainstream domination, but the public is just as clearly not ready to accept it yet.

The issue is a lack of hands-on experience by the masses.  As Keith Barrett pointed out in his blog, by slashing the price, Google could put the Glass into the hands of millions.  It would no longer be a novelty toy for the elite nerds who want to demonstrate their status.  The average American would become the advocate, knocking down barriers, removing stigma, and encouraging everyone to see the positive applications for the technology.  The everyman is a very powerful demographic, and it’s the only one that can combat the current notoriety of the Glass.

So let’s talk about actual, productive ways to integrate Glass into our normal lives.

Why are we not rolling out law enforcement apps for Glass that include real-time database reference for license plates and facial recognition?  That would be so much more productive than ignoring the topic until traffic cops pull over a blogger looking for publicity.

Why are we not deploying Glass in movie theaters to offer subtitles for deaf or non-English speakers?  That seems like a better option than calling in federal agents to investigate a potential bootlegger.

If we have subsidies to burn with insurance companies, why are we not developing Glass apps to help teachers in the classroom?  Imagine if a teacher could quantitatively measure the attention span of a room of first graders while engaging with them.  How about apps for health inspectors while in a commercial kitchen?  Or taxi drivers?  Or race car drivers?

The potential of Wearables, and specifically heads-up displays and augmented vision such as Glass, is vast.  I just hope that we can begin to truly embrace it as a culture soon.


3 Jul 2014

Happy Independence Day!

Wow.  It feels like just yesterday that I blogged about the importance of our freedom and opportunity, and how thankful I am to be thriving in the USA.  That was a year ago.  In ‘SafeLogic Time’, where we try to compress a month’s work into a week, and a year into a month, this feels more like a decade!

Much has happened since the summer of ’13.  I encourage you to go back and re-read some of our blog posts to recap.  It’s really pretty interesting to harken back to the challenges that we were facing last year and how we have solved some, while others are very much still threatening.  We will be selecting certain posts as suggested reading for what the Twitterverse likes to call #ThrowbackThursday… although I know that Walt really enjoyed X-Men: Days of Future Past, so that might be contributing to the retro theme too.

Some things have definitely remained the same.  SafeLogic still pursues innovation in security and encryption, prioritizing the safety, privacy and liberty of our customers, and our customers’ customers.  I’m still thankful and proud to be an American, and I’m still planning to grill, watch fireworks and put away a few cocktails.

In a landscape strewn with failed companies, startups deeply in the red, and ousted executives, I’m excited for Independence Day.  I have a lot of pride as I continue to lead this company, as SafeLogic continues to operate independently, at a profit, and with no venture debt.  It’s the clearest, most direct way that I can say definitively that we will be here when you need us.  Next month, next year, or whenever you’re ready.

Happy Independence Day!


24 Jun 2014

Pro Tip: Encrypt Medical Data

There has been a ton of chatter about the recent fines levied by the U.S. Department of Health and Human Services Office for Civil Rights (OCR), and for good reason.  Money talks.

The Department of Health and Human Services (HHS) assessed a record $4.8 million in fines against New York and Presbyterian Hospital and Columbia University, after they submitted a joint breach report that dates back to September 27, 2010.  And to resolve a HIPAA breach case from over two years prior, Concentra Health Services and QCA Health Plan, Inc. agreed to pay a combined $1,975,220.00 in April 2014.  That’s right, nearly seven million bucks combined.  These must have been just ridiculously egregious breaches, you say.

Well, not exactly.

In the first case, resulting in the highest HIPAA-related fines yet, Patrick Ouellette reports that the electronic personal health information (ePHI) of 6,800 patients was exposed to internet search engines, related somehow to an application developer’s deactivation of a personally-owned server.  My guess is that the dev didn’t do a comprehensive wipe on his testing machine, so when he started his next project… ouch.

In the second case, a laptop was stolen from an employee’s car outside of a physical therapy center in Missouri.  It contained ePHI of 148 patients… and the laptop had not been properly encrypted.  This was the key ingredient to becoming a major example set by the Office of Civil Rights (OCR).

Although HIPAA regulations draw no distinction between health information that is more sensitive (oncology lab test results, Centers for Disease Control type stuff, etc) and clearly less sensitive (patient progress reports while rehabbing a torn meniscus, for example), we can say with reasonable confidence that this small local physical therapy center’s data was likely in the latter category.  But like I said – HIPAA makes no distinction.  The relatively small pool of potentially affected patients made no difference, either.  The OCR’s investigation yielded evidence of perpetual compliance violations and a general policy of ignoring the regulations.  That is the recipe for trouble, and the financial repercussions are clearly major.  Encryption is a baseline for medical data security.  It should be considered a non-negotiable starting point, but certain institutions continue to drag their heels.

Let me paint you a picture.  To seasoned criminals planning a heist specifically to harvest patient data, encryption is a deterrent, but does not make a system impregnable.  By comparison, virtually all of these incidents are inadvertent – a lost tablet here, a stolen laptop there.  Encryption is extremely effective in these scenarios, to keep the equipment loss from escalating to a full-blown breach.  In short, it keeps a hack from becoming a hacker.  It ensures that the local juvenile delinquent who puts a brick through a window just to grab a four-year-old PC to sell for drug money will be doing that – and nothing more.  The laptop will be a brick itself shortly thereafter, and you can be confident that the smash-and-grab will not expose patient data in plain text, and will not yield a two million dollar price tag.  ePHI is safely obfuscated, and your biggest problem will be deploying a new laptop to your employee.
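To make the point concrete, here is a toy sketch of encryption at rest using the Python cryptography package’s Fernet recipe (an illustration only – the record, its contents, and the key handling are invented for this example, and a real deployment would use a validated module with proper key management):

```python
from cryptography.fernet import Fernet

# In practice the key lives with IT (or in a TPM / key server),
# never on the same disk as the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"Jane Doe, DOB 1970-01-01, progress: rehabbing torn meniscus"
token = cipher.encrypt(record)  # this is what actually sits on the laptop

assert record not in token              # no plaintext on disk
assert cipher.decrypt(token) == record  # readable only with the key
```

Steal the laptop without the key and `token` is just noise; that is the whole difference between losing a machine and reporting a breach.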

Simple, right?  Then why is this still like pulling teeth for some providers?

Kamala Harris


If the financial penalties aren’t a strong enough motivator, litigation is on the table as well.  California’s Attorney General Kamala Harris has made no secret of her interest in the topic, offering medical identity theft prevention tips to the public this fall.  This winter, Harris filed suit on behalf of California against Kaiser concerning a 2011 breach in which an unencrypted external hard drive was sold at a thrift store, containing a plain text database of Kaiser employees, along with Social Security numbers, dates of birth, addresses… and oh yeah, their family members’ information too.  Worse, Kaiser only alerted about 10,000 of the 30,000+ affected employees.  Not pretty.

So now you’ve got the OCR looking to fine you, the Attorney General suing you, and we’re not done yet.  Just like in the enterprise environment, you can’t even rest once you’ve trained employees and given them some tools.  You still need to safeguard against disgruntled or malicious employees.  “You won’t give me a new laptop?  Fine.  I’ll just ‘lose’ this old one.” Or worse, “You won’t give me that 5% raise? Fine. I’ll just ‘lose’ my unencrypted device and we’ll see how much you’ll pay.” Scary stuff.

Susan McAndrew


The stance of the OCR is clear, and it is straightforward.  “Covered entities and business associates must understand that mobile device security is their obligation,” said Susan McAndrew, OCR’s since-retired deputy director of health information privacy. “Our message to these organizations is simple: encryption is your best defense against these incidents.”

Michael Leonard


It should be a no-brainer, but we continue to see companies holding out.  Iron Mountain’s Director of Product Management for Healthcare IT, Michael Leonard, commented recently on this.  “From our perspective, it is – I’ll say ‘puzzling’ – that organizations don’t encrypt more of the content even within their four walls.”

I’m not sure ‘puzzling’ is strong enough.  Idiotic, maybe?  Concentra Health Services and QCA Health Plan, Inc. were forced to cough up more than $13k per patient whose record was exposed, and that may just be the tip of the proverbial iceberg.  HHS Chief Regional Civil Rights Counsel Jerome Meites predicted an increase in fines from the $10M assessed in the last twelve months by the agency.  “Knowing what’s in the pipeline, I suspect that that number will be low compared to what’s coming up.”  That’s ominous, and should be a wake-up call to anyone who thinks that they can simply fly under the radar.
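That per-patient figure checks out against the numbers above (treating the combined April 2014 settlement the way the article does, spread across the 148 exposed records):

```python
# Combined settlement paid by Concentra and QCA (April 2014)
settlement = 1_975_220.00
records_exposed = 148  # patients on the stolen, unencrypted laptop

per_record = settlement / records_exposed
print(f"${per_record:,.0f} per exposed record")  # → $13,346 per exposed record
```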

The reality is that encryption should be automatic.  It should be offered in every software solution deployed to healthcare providers at every level.  To help reinforce the transition, SafeLogic provides FIPS 140-2 validated encryption for these solutions.  Remember, in the eyes of the federal government, only cryptographic modules certified by the CMVP are considered acceptable.  This assessment has extended to the healthcare industry as well.  HIPAA does not yet explicitly require FIPS 140-2 encryption, but customer requests already do, and the writing is on the wall for future versions of the standard.

For more information on the role of validated encryption in HIPAA regulations, please download SafeLogic’s whitepaper.


18 Jun 2014

Tizen, Connected Cars and Buggy Whips

Two weeks ago, I had the privilege of giving a presentation at the 2014 Tizen Developer Conference.

The first thing that you should know is that this was a fantastic event.  Most of us will hear “user group” or “developer conference” and reminisce about our own early experiences, the coffee-and-donuts geek meetups, complete with a folding chair for each wannabe Wozniak.  This was much more.  With a variety of speakers tackling an equally diverse set of topics over a three day stretch, and a significant investment of time, money and energy from Intel and Samsung, I highly recommend attending in 2015 if possible.  It was a very smooth and well-coordinated conference, for speakers, attendees and exhibitors alike.

The second thing that you should know is that my session rocked.  ‘Security-Centric Development for IoT and Wearables’ was one of the few talks that had a specific focus on data protection.  My hope is that I was able to influence attendees to consider security as a non-negotiable aspect of their development efforts, and maybe next year we will see more like-minded sessions on the agenda.  At the very least, I had fun launching SafeLogic footballs into the audience and nobody got a concussion.

To be honest, I was blown away by the ideas bouncing among the audience.  There were developers from seemingly every corner of technology, all with a vision of success based on the same operating system.  It was inspiring to see how many different folks saw potential in the same place.  Since the conference, it has felt like everywhere I look, there is another potential application for Tizen, another opportunity to join the Internet of Things and another chance to connect.  The scary part is that it all has to be secured.  Remember, IoT is only as strong as the weakest link.

One session at the Tizen Developer Conference included a discussion of the connected car collaboration efforts of the Linux Foundation, IBM, Intel and Local Motors.  It made me think of the article I had just read on CNN, aptly titled ‘Your car is a giant computer – and it can be hacked’.  Scary stuff, and spot on.

The Toyota Prius has solidified its place in the garage of everyday Americans based upon efficiency, not horsepower, and has been immortalized as the test mule for Google’s self-driving car project.  Tesla’s fully electric Model S was the top-selling full-sized luxury sedan in 2013… not bad for a vehicle designed by tech geeks.  Google has pushed the envelope even further now, internally developing prototypes for an all-new self-driving vehicle that incorporates features of both.  The landscape is clearly changing – and quickly.

Steering wheels are the next buggy whip, and data security will be more important to safe transportation than seatbelts.  Driver error will be replaced by the threat of compromised communications.  Could you imagine arriving at your destination, only to find yourself at a location chosen by a malicious hacker?  Or having your vehicle overridden and driven into a wall, off a cliff, or into a lake?  There is serious potential in self-driven cars, but even more serious potential for disaster.

The Tizen platform is not uniquely vulnerable to these threats.  All of IoT inherently is.  A smart toaster in your kitchen has to be as secure as your car, even though it isn’t 3000 pounds of metal going 70 miles per hour.  Until developers begin treating all devices with the same level of respect, I encourage all of us to tread carefully.  Hackers relish the challenge of creating mischief as much as they value the results, so assume that you may be a target.  We all are.

If you are a developer in IoT, please check out CryptoCompact.  We have begun our pilot program, so consider it an open invitation to integrate military-grade encryption within your project.  We’re all in this together, so let’s stay safe.


12 Jun 2014

SafeLogic Doesn’t Sell to the NSA

It’s not that we don’t appreciate the work of the National Security Agency here at SafeLogic.  Really, it’s quite impressive.  We certainly are thankful for the work of Homeland Security and the DoD.  And we absolutely, unequivocally, 100% support the men and women who have served in our national military.  We are red-blooded American patriots, who believe in life, liberty, and the pursuit of happiness.  And that is precisely why we do not work with the NSA.

Several significant events have come to light that call their ethics into question, and I’m not even talking about Snowden, Wiebe or any of the other whistleblowers.

Many are still reeling from the revelations surrounding the ten million dollar bribe that RSA Security took from the NSA, in exchange for making Dual EC DRBG the default algorithm in RSA BSAFE, the most popular proprietary encryption module in the business.  Had this transaction been known publicly at the time, it would have raised eyebrows and prompted questions.  Instead, it remained in the shadows for years before Dual EC DRBG was exposed as a backdoor for the NSA to decrypt information at will and the connection was made.  It was a betrayal from both RSA and NSA, and disappointing to say the least.

More recent is the allegation that the NSA had knowledge of the Heartbleed bug, and leveraged the vulnerability since its creation, for approximately two years prior to the public identification of the flaw.

Ignore the political debate about whether or not the NSA has a right to, or ought to, spy on Americans in order to ensure our safety.  They found the bug and didn’t tell anyone!  Even within their own denial was an implicit admission.  “Reports that NSA or any other part of the government were aware of the so-called Heartbleed vulnerability before 2014 are wrong.”

Ok… so they admit they knew about Heartbleed in January?  Best case scenario, the NSA took advantage of Heartbleed for only about 90 days.  Should we feel any less betrayed?

Look, I understand that it’s not in the NSA’s job description to fix problems created by private industry.  Heartbleed was certainly a black eye for the team of volunteers that rolled out the 2012 update to OpenSSL protocols.  No argument there.  But the NSA motto itself reads “Defending Our Nation. Securing The Future.”  Doesn’t that include sounding the alarms when an estimated two-thirds of the world’s internet activity is at risk?  Doesn’t defending our nation include defending our intellectual property?  The assumption is that Securing the Future refers to the future of the American way of life, which is tightly aligned with our capitalist free market economy.  But that apparently was vulnerable for two years!  So no… I don’t think the NSA lived up to its mantra.

SafeLogic’s allegiance is to our customers, and our customers’ loyalty is to their own customers.  At the end of the day, our success is measured on whether we did everything possible to ensure the security of the end users’ information.  Since our inception in 2012, the answer has been a resounding “Yes!” every single day.  Any partnership, association, or agreement with the NSA would undermine that singular goal.

That’s why we don’t sell our encryption to the National Security Agency.

28 May 2014

The Upside of the Heartbleed Bug

Heartbleed was huge.  Massive.  A giant, gaping hole that could be exploited in several ways and somehow went unnoticed for over two years.  It was an embarrassment, a black eye for the OpenSSL Foundation and really all who use OpenSSL for encryption… which is the majority of the Internet, and most of the world’s internal sites and apps as well.

The first confirmed data losses due to the Heartbleed Bug were on April 14th, when the Canada Revenue Agency lost 900 social insurance numbers (the equivalent of a Social Security Number) in six hours to a determined college student.  Bad?  Yes.  But destructive at the worldwide level that we believed possible?  Not even close.

So here’s my point.  Heartbleed had a big, fat, silver lining.  In the span of a few days, millions of administrators reset their private keys and reissued their SSL certificates.  We have confirmed very little actual harm caused by the vulnerability, and we have documented millions of websites and apps applying patches, updating their software, resetting their private keys and reissuing certificates.  If only we could inspire this type of prophylactic activity on a regular basis.  It’s like pulling teeth to get users to reset passwords, but one well-publicized breach and folks are clamoring for it.  Many consumers are being proactive and using tools to specifically avoid unpatched websites.  These are steps in the right direction.

Don’t get me wrong.  I won’t be wishing for another Heartbleed.  We have our hands full as it is with the eBays and Targets of the world.  But I’m absolutely certain that there will be another bug… probably worse/bigger/more widespread/more exploited/etc than Heartbleed, and it will be exposed in the fairly near future.  Such is life in this industry.  The ‘next big thing’ always includes the raised stakes inherent in our bigger Big Data, our faster connectivity, and our multiplying endpoints.  Luckily, we are making leaps forward every time we are faced with these threats, and we have very very very smart folks on our side.

My bigger concern had been that we would become jaded and tune out the dangers.  Target and eBay dropped the ball on their crisis responses, but banks and credit card companies responded swiftly and effectively.  Anecdotally, I have talked to a lot of people who were prompt to reset personal passwords and treat their identity protection with the proper level of respect and attention that it deserves.  The strong performance of site administrators and product architects worldwide in their response to Heartbleed has shown me that we have many reasons to be optimistic.  Here at SafeLogic, we had patches rolling out within hours of the announcement, and we were not alone.  As we approach the tipping point toward the Internet of Things, our vigilance must remain strong, and the industry’s unified response to Heartbleed has actually helped me sleep better at night.

21 May 2014

IoT: The Internet of Toilets?!

I recently read a humorous but forward-thinking post on Wired, exploring the potential use cases for an internet-connected toilet, complete with various sensors and capabilities.  The writer, Giles Crouch, nailed a few awesome scenarios, such as pregnancy detection, stool analysis, and hangover cures.  Yes, I’m a sucker for technology and I already want an iToilet, Giles… but only if they build it with security in mind.  The alternative brings to mind the 1937 Donald Duck cartoon, ‘Modern Inventions’.  You know how it ends… one disaster after another.


For example, early pregnancy detection is brilliant!  Until you leave your pregnant wife home while on a business trip, and some criminal genius figures out that he can scan the neighborhood for homes in which the only urine collected belongs to a pregnant woman.  That would be valuable information for someone with ill intentions and should be encrypted and guarded like your better half herself.  [Note: The same hormone levels could indicate testicular cancer in a man as well, but it would be a statistical long shot.  Not enough to discourage a criminal from playing the odds.]

The automatic stool sample is an excellent feature.  It’s the hypochondriac’s dream.  Every sample submitted would be analyzed and advisories would be offered regularly.  Well, as regularly as the patient, at least.  The rate of car accidents may rise, as Mr. John Doe rushes home at lunchtime to make sure his contribution wouldn’t be wasted on the traditional ‘dumb’ toilet at the office.  But potentially more dangerous, when humans take medical advice from a machine, you better be sure that the machine can’t be hacked.

“Mr. Doe, your sample shows a few deficiencies.  Please drink one quart of Drano to rebalance your system.”
Hey, if my iToilet told me, it must be accurate.  Drano… whodathunkit.
That’s a mistake you can’t make twice.

Further, if that smart toilet is connected to both your calendar and your doctor’s appointment book, just imagine the sh!t show (pun intended) if this was intercepted in plain text by a malicious third party.  You might spend all day in the waiting room of a doctor that does not have you on the calendar, while your house is raided because your door lock app was compromised as well.

Ah, yes.  The future holds a great deal of creature comforts in automation… if we can just get the security dialed in first.

Now without further ado (or toilet jokes), here’s the one and only Donald Duck in ‘Modern Inventions’.  Cheers!


14 May 2014

The Real Truth About Wearables

I keep reading about Wearable tech’s ‘Dirty Little Secret’… the fact that most Wearable devices are shelved within three months of initial use.

Does this shock you?  No?  Good.  Me neither.
And I’m not worried about it.


If you’re reading this post, you’re no stranger to the phenomenon of the Consumerization of IT, or CoIT.  (It almost looks naked without the hashtag!  #CoIT.  That’s better.)  It’s also referred to as the ITization of Consumers, which doesn’t have the same ring to it, but is actually more accurate when describing the shift towards more sophisticated and savvy users.  Today’s enterprise employees don’t need a designated geek to configure and deploy a piece of equipment.  In fact, they usually prefer to set it up themselves, since nobody knows their needs and preferences better.  Some blame the millennials, but that’s just not the full picture.  This trend had been manifesting as Shadow IT since before the millennials went to prom.

I bring up CoIT because it is the embodiment of today’s tech culture.  Everyone wants to use the newest, hottest devices, and they prove it every day, with or without IT’s help or blessing.  Everyone wants to be an early adopter now.  Everyone wants to try the latest and greatest, which is absolutely stellar.  Not every device is going to be a hit, but we are okay with that.  At this point, a wearable device with strong universal adoption would be the exception to the rule.  So in this period of ‘fail fast’ versions, who better to beta test new wearables and subject them to real world conditions than us?

The same research that presents the three-month interval of abandonment also puts forward an estimate that over 10% of adult Americans have purchased at least one of these devices.  If we included Bluetooth devices, you better believe that number would skyrocket.  Subtract the population that is – sorry, I’ll just say it – too damn old to mess with these new-fangled doohickeys, and we are approaching an impressive market penetration for wearables without any delusions that it is a mature technology.  As a culture, we have demonstrated our appetite for wearables by continuing to buy and try them.  There is a certain sense of pride associated with being an Explorer, Pilot, or Kickstarter participant.

Bottom line – I’m not surprised by, or discouraged by, this report.  Wearables are still nascent, like a recent graduate backpacking through Europe, searching for motivation and identity in an existential haze.  We should embrace it as it is formed, molding it to our vision.  We shouldn’t push it away and complain that it is undeveloped.  We need to try every device that we can get our hands on.  We need to speak up and give strong feedback.  Offer opinions publicly, so that others can echo or debate, in the plain view of the innovators who will give us exceptional, can’t-live-without-them wearables one day soon.

And of course, don’t forget to demand strong security in every piece of technology that we carry on our bodies.  Don’t forget how crucial it is to protect ourselves, and that includes our personal data.

We can make a difference in wearables.  Try, test, and critique.  Rinse and repeat.