
Leaky-by-design location services show outsourced security won't ever work

Google and Facebook can't – or won't – anticipate misuses of data that shouldn't exist

We’re leaking location data everywhere, and it's time to fix it by design.

An example: if you go on safari in Africa, you'll be asked to turn off your smartphone's location tracking capabilities.

The reason is that most people have no idea that every photo they take with their phone embeds location data in the exchangeable image file format (EXIF) metadata that describes each snap. But poachers know. And when a stream of snaps depicting something remarkable hits the web, they read that EXIF and know exactly where to start their hunt for a valuable beast.
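For the curious, here's roughly what that looks like in practice - a minimal Python sketch, using the Pillow imaging library, that pulls the GPS coordinates out of a snap's EXIF block. The filename is hypothetical; the tag numbers come from the EXIF standard.

```python
from PIL import Image  # pip install Pillow

GPS_IFD = 0x8825  # standard EXIF pointer to the GPS sub-directory

def dms_to_decimal(dms, ref):
    """Convert EXIF degrees/minutes/seconds rationals to a decimal coordinate."""
    degrees, minutes, seconds = (float(v) for v in dms)
    decimal = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative
    return -decimal if ref in ("S", "W") else decimal

def photo_location(path):
    """Return (lat, lon) if the image carries GPS EXIF data, else None."""
    gps = Image.open(path).getexif().get_ifd(GPS_IFD)
    if 2 not in gps or 4 not in gps:  # tags 2 and 4 hold latitude/longitude
        return None
    lat = dms_to_decimal(gps[2], gps.get(1, "N"))  # tag 1: N/S reference
    lon = dms_to_decimal(gps[4], gps.get(3, "E"))  # tag 3: E/W reference
    return lat, lon

print(photo_location("safari_snap.jpg"))
```

Twenty lines, and every geotagged photo gives up exactly where it was taken.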

Hence the request for a visit to your phone's Settings before safaris.

It gets worse. As reported in El Reg, a little bit of code published to GitHub a fortnight ago showed how any app granted access to the photos on your smartphone (hint: that's quite a few of them) can simply walk through your database of images and generate an accurate map of your movements. In many cases that record of movements goes back years.
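The proof-of-concept itself isn't reproduced here, but the shape of the attack is simple enough to sketch. Assuming read access to a photo folder (the path below is illustrative) and the photo_location() helper from the snippet above, a few lines turn a picture library into a chronological travel log:

```python
import pathlib
from PIL import Image

EXIF_IFD = 0x8769          # pointer to the main EXIF sub-directory
DATETIME_ORIGINAL = 36867  # EXIF tag: when the photo was taken

def movement_trail(folder):
    """Harvest (timestamp, lat, lon) from every geotagged JPEG in a folder."""
    trail = []
    for path in pathlib.Path(folder).expanduser().glob("**/*.jpg"):
        loc = photo_location(path)
        if loc is None:
            continue  # no GPS data in this one
        exif = Image.open(path).getexif().get_ifd(EXIF_IFD)
        taken = exif.get(DATETIME_ORIGINAL, "unknown")
        trail.append((taken, *loc))
    return sorted(trail)  # a time-ordered record of where you've been

for taken, lat, lon in movement_trail("~/Pictures"):
    print(taken, lat, lon)
```

Feed those points into any mapping library and the "accurate map of your movements" draws itself.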

Every geek I’ve told about this had the same reaction: a facepalm. Of course our photos keep a record of our movements. Of course any app that has access to our photos can produce a map of our movements. Two unrelated features collide, generating a kind of retrospective self-surveillance of which the NSA would be proud.

First things first. Close that security hole on iOS 11 by going to Settings >> Privacy >> Location Services >> Camera. That screen lets you turn off the automatic location tagging of images. On Android YMMV, but try Settings >> Apps >> Camera >> App Permissions >> Location. Next, review all the apps that have access to the Camera Roll, and revoke the lot - they'll ask for permission next time, and then you can make your own call about whether their need is commensurate with the risk.

We need much more finely-grained access controls for our image archives. Apps should be able to get write access easily, but read access probably needs to be far more restrictive: conditional and time-limited.

What about all the other places you've posted your photos - Facebook, Flickr, Picasa? Can't someone simply wander through those images, stalking you via a breadcrumb trail of EXIF location data? Of course they can.
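If you'd rather not trust each service's upload pipeline to do the scrubbing for you, strip the metadata yourself before posting. A minimal sketch, again with Pillow - re-saving only the pixel data drops the EXIF block, GPS breadcrumbs and all (filenames are illustrative):

```python
from PIL import Image  # pip install Pillow

def scrub_exif(src, dst):
    """Re-save an image with its pixels only - no EXIF, no GPS trail."""
    img = Image.open(src)
    clean = Image.new(img.mode, img.size)
    clean.putdata(list(img.getdata()))  # copy pixel data, nothing else
    clean.save(dst)

scrub_exif("holiday.jpg", "holiday_clean.jpg")
```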

If that weren’t bad enough, yesterday software engineer Rob Heaton published a short essay showing how to use WhatsApp to track the waking and sleeping patterns of almost anyone, anywhere, using nothing but their phone number. That little bit of personal data - activity - leaks out of WhatsApp all the time, and can be used to map your active hours just as surely as any fitness tracker.
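Heaton's method boiled down to watching the online indicator on WhatsApp Web and noting when it flips. In outline - with is_online() as a purely hypothetical stand-in for however you observe that indicator, since WhatsApp offers no such API - the logger is trivial:

```python
import time
from datetime import datetime

def log_presence(is_online, interval=10):
    """Log a timestamp every time the target flips between online and offline."""
    last = None
    while True:
        state = is_online()  # hypothetical presence check, not a real WhatsApp API
        if state != last:
            print(f"{datetime.now().isoformat()} {'online' if state else 'offline'}")
            last = state
        time.sleep(interval)  # poll every few seconds
```

Bin those transitions by hour of day, and the longest daily gap in activity is, with unsettling reliability, when the target sleeps.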

This huge and growing hole in privacy flags another, larger issue that can no longer be avoided: we cannot simply outsource our security to others. Neither Google nor Apple is wholly capable of anticipating all of the misuse cases that end with our data being weaponised against us. You can’t fault them for trying - it’s just that the problem is too big for any one company (even companies worth north of half a trillion dollars) to handle.

Whether they're even interested in plugging this hole is another matter altogether.

This is not the kind of security issue that can be patched. This is a problem of design, or rather, a lack of design thinking with respect to the security and privacy of the individual.

We urgently need a reset, rethink, and redesign, grounded in an ethics and methodology of individual privacy, integrity, and security. We need to do this for ourselves - in partnership with the device manufacturers, software architects, carriers, and app makers. Everyone has to be involved in a comprehensive assessment of devices so intelligent and so flexible that they are being put to uses beyond the imagination of any single actor.

That process will not be easy: Privacy is the enemy of utility, and security is the enemy of seamlessness. Something that “just works” is almost always something that “just leaks” private information.

No one wants a world that’s both connected and hostile. No one wants to worry about every photo snapped - or every word uttered around Siri or Alexa or Google Assistant. But unless we change our ways, the future looks like an infinite series of facepalms. Forever. ®
