
Sure, encrypt your email – while your shiny IoT toothbrush spies on you

Harvard's internet arm frets about gizmo security

By Kieren McCarthy, 2 Feb 2016

Analysis The increasingly noisy debate over encryption is nothing to worry about, eggheads at Harvard have announced today: it's your toothbrush you need to worry about.

In a 37-page paper titled Don't Panic: Making Progress on the 'Going Dark' Debate [PDF], a team from the Berkman Center has summarized discussions between themselves, security experts, and a number of unnamed people from the US intelligence community.

The goal of the discussions was to bridge the gap that has opened up between law enforcement and politicians – who have been asking for backdoors in software products and access to encrypted information – and tech companies and security bods, who have been arguing that strong encryption is critical for our digital future.

The end result is a very readable summary of the current situation with respect to encryption and why the FBI feels end-to-end encryption is a danger. Ultimately though, beyond producing a useful article for Wikipedia, the paper boils down to two things:

  1. The Feds shouldn't worry too much about encryption because it's not in tech companies' financial interests to provide it, and
  2. Whatever evidence is lost to the end-to-end encryption of, say, text messages will be more than made up for by the expansion of Internet of Things products that have horrible security.

The first point: "First, many companies' business models rely on access to user data. Second, products are increasingly being offered as services, and architectures have become more centralized through cloud computing and data centers."

In other words, because strong end-to-end encryption isn't in companies' interests, they won't roll it out for everything. Which means eavesdroppers will still, somewhere along the line, be able to get at sensitive material such as encryption keys: law enforcement can obtain a court order (or otherwise pressure the corporation) to hand over the necessary information, or to cough up the knowhow needed to successfully wiretap internet-connected gadgets.
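To make that key-custody point concrete, here's a minimal Python sketch – not from the Berkman paper, and with purely hypothetical names – using the third-party cryptography package's Fernet recipe. It only illustrates who holds the key: when the provider generates and stores it, the provider can decrypt user data on demand (and can be compelled to); when the key is generated and kept on the user's device, the provider holds only ciphertext. A real end-to-end scheme would also involve key exchange between devices, which is omitted here.

```python
# Hypothetical sketch contrasting provider-held keys with end-to-end encryption.
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

# --- Provider-held keys: the ad/cloud-friendly architecture ---
provider_key = Fernet.generate_key()            # generated and stored server-side
stored_blob = Fernet(provider_key).encrypt(b"meet at noon")
# Because the provider holds the key, it can decrypt at will -- or under court order.
print(Fernet(provider_key).decrypt(stored_blob))

# --- End-to-end encryption: the key never leaves the user's device ---
device_key = Fernet.generate_key()              # generated and kept on the handset
ciphertext = Fernet(device_key).encrypt(b"meet at noon")
# The provider stores or relays `ciphertext`, but without `device_key`
# it has nothing meaningful to hand over.
```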

The paper notes two additional elements in favor of this argument: one, fully secure encryption is technically complex and can impose a performance hit on low-end devices; and two, the ecosystem of electronic devices is so broad that it is a pain to introduce a system that will provide trustworthy end-to-end encryption across all of it.

We can see you

As to the second, scarier point: the internet of things super-surveillance net.

The paper has this to say: "The Internet of Things promises a new frontier for networking objects, machines, and environments in ways that we are just beginning to understand. When, say, a television has a microphone and a network connection, and is reprogrammable by its vendor, it could be used to listen in to one side of a telephone conversation taking place in its room – no matter how encrypted the telephone service itself might be. These forces are on a trajectory towards a future with more opportunities for surveillance."

The paper uses recent examples, including the Samsung TV, the listening Barbie dolls, and Amazon's Echo. It also makes reference to an interesting case back in 2001, when the FBI tried to get a car company to use its roadside assistance service to record conversations in a vehicle. The company took the matter to the US Court of Appeals, which shot the FBI's case down but, according to the Berkman Center, not on surveillance grounds. By extension, the paper suggests your car could be made to act as a bug against you, so long as the car's security features aren't impaired in the process.

For some reason, however, the paper doesn't then point to the high-profile recent cases of cars being hacked.

It's not just cars though: "Appliances and products ranging from televisions and toasters to bed sheets, light bulbs, cameras, toothbrushes, door locks, cars, watches and other wearables are being packed with sensors and wireless connectivity."

The argument is that this wealth of devices is going to provide intelligence services with the ability to track and listen in to people far beyond what they can do now. Hence: let's not worry about encryption – your kitchens and bathrooms are being bugged anyway.

FBI just wants your phone data, not your smart toaster's analytics

While the paper makes some interesting observations and provides a useful reference for the ongoing encryption debate, it does seem a little desperate to arrive at a conclusion, and so stretches the IoT analogy to, and a little beyond, its logical breaking point.

The fact is that the FBI doesn't want encryption on phones because phones provide crucial intelligence on specific individuals: a phone is a single device, it contains a wealth of information, and that information can be used to build a prosecution case.

It is hard to imagine an FBI agent picking up a phone at the scene of a crime, realizing it is running iOS 8, calling HQ, and saying: "It's all encrypted – wait! Quick, get me a feed of all the baby monitors in a two-mile radius."

The situation is different, of course, for the security services, which tend to want to track things in the background and keep tabs on people. Plus they have vastly greater resources and the authority to do things like hack toothbrushes. It's also the case that people are going to be less wary – or aware – of devices that they don't own, or that serve different functions. So in that sense, it is possible that the targets of an investigation may reveal more through such devices than they would over a mobile phone.

Even so, rather than working with one limited set of companies – telcos, for example – achieving equivalent surveillance through IoT products means either hacking the devices or building relationships with a significantly larger number of manufacturers.

As to the user-data/advertising argument for why encryption won't be used very broadly: that also appears to rest on the assumption that mass surveillance is more useful than targeted surveillance.

Wrong metrics

There is a market for specific devices and apps that provide a high level of security, and the prices for them are coming down. When they fall far enough, a large slice of the general population will value its privacy sufficiently to pay for them – and the people using them for nefarious purposes will become harder to pinpoint.

Just because some companies prefer to monetize user data and so offer consumers a lower price doesn't mean there won't be a large and expanding market for companies that do it the other way around.

What would be interesting to see is the comparative cost and availability over time of products and services that provide high levels of security, and the degree of use of those technologies.

It seems highly unlikely that the number of people carrying out criminal or illegal acts would increase with the general increase in use of technology: that would suggest technological advances drive people to perform criminal acts.

And so you are looking to keep eyes on roughly the same number of people. The difference is: they have better and cheaper tools for skirting surveillance. And no amount of hacked toothbrushes is going to compensate for that. ®
