The explosion of IoT devices and the impending smart home revolution mark an exciting next stage in the development of the internet. These devices promise to make our lives easier and to weave technology more seamlessly into them, and their number is expected to grow dramatically in the coming years. There is, however, a temptation in the tech industry to rush new products to market before fully considering potential complications. It is because of this that we see major data breaches, hacks of sensitive information, and standoffs between law enforcement and tech companies over the release of potentially incriminating data.
In the wake of March’s Westminster attack, UK Home Secretary Amber Rudd appeared on television and called for a ban on end-to-end encryption, suggesting that apps like WhatsApp ‘provide a secret place for terrorists to communicate with each other.’ Her stance rests on a fundamental misunderstanding of how both technology and encryption work. WhatsApp reportedly has a backdoor that could allow law enforcement agencies access to messaging records in the most extreme cases. Otherwise, the encryption works to safeguard messaging records against hackers and to give users sanctuary from unprecedented levels of government snooping.
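To make the principle concrete, here is a deliberately toy sketch of why end-to-end encryption frustrates interception: only the two endpoints hold the key, so anything a relaying server or eavesdropper sees in transit is unreadable ciphertext. The XOR one-time pad below is purely illustrative - real apps like WhatsApp layer the far more sophisticated Signal protocol - and the function names are mine, not any real API.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Toy one-time-pad encryption: XOR each byte with a key byte.
    Illustrative only - not a real-world cipher."""
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # held only by the two endpoints

ciphertext = encrypt(message, key)  # this is all the server ever relays
recovered = decrypt(ciphertext, key)  # only the key-holder can do this
assert recovered == message
```

The point of the sketch is the asymmetry of knowledge: without the key, the ciphertext is noise, which is exactly why a ban on the technique (rather than a lawful-access mechanism) is the blunt instrument critics say it is.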
Last year, Amazon reportedly denied Arkansas police access to voice recordings taken from a murder suspect’s Amazon Echo. Importantly, the police were reportedly less interested in the actual questions or demands put to the device - likely to be nothing explicit - and more interested in the possibility that the device had been inadvertently activated, in the hope that it had recorded something incriminating.
Amazon saw its denial as a matter of principle: ‘Amazon will not release customer information without a valid and binding legal demand properly served on us. Amazon objects to overbroad or otherwise inappropriate demands as a matter of course,’ Amazon spokesperson Kinley Pearsall told Fortune. This stance is sensible: it is legally unclear how much of the information gleaned would be admissible, given that it is potentially an illegal recording (made without the defendant’s consent) and that a defendant could challenge it as hearsay. Handing it over, then, seems unnecessary. And given the inherently invasive nature of home assistants, without effective encryption and security they will struggle to win adoption.
The prevailing opinion among those working in tech seems to be that encryption should be everywhere and should be the default option for connected products. Top cryptographer Bruce Schneier said: ‘Encryption works best if it is ubiquitous and automatic. It should be enabled for everything by default, not a feature you only turn on when you’re doing something you consider worth protecting.’
A very interesting TechCrunch piece explored how smart home devices warp the traditional boundaries between private and public spaces. The physical perimeter is one thing, but the ‘vocal’ perimeter extends past the walls and windows of a property. Because Amazon’s Echo currently has no way of distinguishing between users through biometric vocal identification, the device could feasibly be operated by anyone, whether inside or outside the property.
Couple this with all of Echo’s many capabilities - from unlocking the front door, to starting the car, to reading out personal emails - and the full scale of the IoT’s security problem becomes apparent. WiFi faced a similar open-access problem and countered it with encryption and passwords, but demanding authentication before every voice command would fundamentally damage Echo’s usability. The obvious answer is to incorporate some form of biometric vocal identification into Echo. This is not easy, though, and the systems that attempt it today remain error-prone. The issue is only complicated further when we take into account the myriad other voice-controlled devices that could share a home.
An example of how this lack of distinction could become irritating (rather than dangerous) is Burger King’s new ‘Connected Whopper’ ad. The 15-second commercial ends with a Burger King employee saying ‘Hey Google, what is a Whopper?’, triggering Google Home devices within listening range and having them do the advertising themselves. One such ad may be funny, but once other companies jump on the bandwagon, this kind of marketing will quickly wear thin.
Encryption is important, as a matter of both privacy and security. As devices proliferate and adoption picks up, it is more important than ever that lawmakers and users understand, at least at a basic level, how encryption works. Very few people would welcome a device into their home that can be tapped by the police with little more than a warrant. Equally few would be comfortable with their front door being operable through an open window on a warm day. We need encryption, and if it denies law enforcement agencies access to potentially useless information, then so be it.