If my iPhone isn't secure anymore, how can I expect my smart home to be safe? By forcing Apple to cripple its security systems, the FBI ensures we'll never be able to fully trust that any technology we invite into our homes doesn't come with a second set of eyes. Or a third. Or a fourth...
The smart home is the exciting future that everyone wants. A safer house with lower energy bills, a smaller environmental footprint and helpful tips for my health? Yes, please! And it's finally starting to arrive. Companies are putting out quality products, from intelligent thermostats to automatic window shades. My own company makes a smart detector that tracks fire, gases and air quality.
Unfortunately, some people are being taken advantage of by companies that choose to cut corners. These predatory companies are introducing their devices without basic privacy and security features. Why? Usually to save a buck. Does it make sense to spend the additional resources to encrypt a technology when most customers won't know the difference? To some, the answer is, unfortunately, "no."
The results are being splashed across newspaper headlines lately as smart baby monitors, connected toys and other intimate technologies get hacked by tinkerers as well as criminals. When I can hear a strange man's voice coming from the device monitoring my baby, telling me that he's been watching, that's terrifying. The culprits aren't even just the makers of the cheapest devices. It's hard for consumers to ensure that smart products will have the basic encryption tools needed to prevent this kind of personal violation. And now it's getting harder.
That's why it's so shocking to hear that American opinion is split in the Apple vs. FBI debate over whether to fundamentally dismantle encryption standards. Should we allow unfettered surveillance of our children's bedrooms? The answer should be a resounding "no."
On the one hand, security is a math problem. Either the math adds up and keeps us secure or it doesn't. No matter how much we all want to believe we can build a secret key to a door that only good guys can go through, that's just not possible. The bad guys use the same door to get in, one way or another. That's because math is an all-or-nothing system. A backdoor is a backdoor.
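That all-or-nothing point can be made concrete with a toy sketch (a simple XOR cipher, purely illustrative and not real cryptography; all names here are invented for the example): decryption depends only on possessing the key, so the math has no way to tell a "good guy" holding a backdoor key from anyone else holding it.

```python
# Toy sketch, NOT real cryptography: a one-time-pad-style XOR cipher.
# The point: the math checks only the key, never the intentions of
# whoever holds it. A backdoor key opens the door for everyone.
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the key; the same operation
    both encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # anyone holding this can read the message

ciphertext = xor_cipher(message, key)
# Applying the cipher again with the same key recovers the plaintext,
# no matter who supplies the key.
assert xor_cipher(ciphertext, key) == message
```

The takeaway is the last line: possession of the key is the only thing the math ever verifies, which is why a "lawful access only" key is a contradiction in terms.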
Security is also a social problem. As soon as people aren't managing their own security, say with a thumbprint and a PIN on an encrypted phone, the questions become: who is actually managing my security, and are they susceptible to a data breach? Breaches usually happen in one of two ways: tricking someone into handing over private information, or breaking into a computer directly. Information is only as secure as the least secure part of the system, and these systems usually have a lot of moving parts.
Recently we've seen security problems reach an unprecedented scale: some 40 million accounts compromised at a US retailer, 80 million records taken from the second-largest US health insurer, and 76 million accounts exposed at the US's largest bank. We should be working to prevent further violations, not encouraging them.
The argument between Apple and the FBI is not about the horrible tragedy in San Bernardino, but about something much bigger. Otherwise, the Feds would have stated exactly what they're looking for. They already have unfettered access to metadata, phone records, social media accounts, email history and everything else they have legal permission to use. But the expectation that nothing is beyond their reach and that nothing is truly private, from thoughts to locations, is dangerous. Even more dangerous is the fundamental misunderstanding of how this technology works: the backdoor for lawful pursuit and the backdoor for repressive regimes and criminal enterprise are one and the same. This confusion endangers us all. When FBI Director James Comey says that "there will be international implications [to weakening encryption] but we're not sure of the scope," it's crucial to lay out exactly what those devastating consequences could be.
My fear is that if security is crippled, we will lose an amazing future. There's no future for self-driving cars if we can't secure them. There's no digital healthcare. And there's definitely no smart home.
Security is not at odds with liberty. It can enhance our freedoms: of expression, of privacy, from fear and from being hacked. I believe we can get to a better future through innovation, but right now the entire tech industry needs to stand together. Other smart home and IoT companies need to do better on security. And the U.S. government needs to stop advertising to the world that securing its citizens is no longer a priority. Consumers must be able to trust their devices. Breaking security is a huge step in the wrong direction, one that will have resounding consequences for the economy and for our freedoms if it succeeds.