Technology is continually changing the way we live and work, yet also proving the law of unintended consequences. Case in point: Internet-connected cameras on baby monitors were designed to reassure parents — but instead put children at risk of being watched by the wrong people.
Welcome to the Internet of Things — and to its privacy problem.
With the Internet of Things, or IoT, consumer and industrial products ranging from appliances such as refrigerators to medical devices such as insulin pumps are connected to the Internet. The goals are efficiency and convenience — so no wonder the number of IoT devices is expected to increase to 80 billion by 2025, up from around 11 billion now, according to IDC estimates.
Yet IoT manufacturers still need to make meaningful progress on device privacy, not to mention on other kinds of IoT vulnerabilities, such as the large fall 2016 denial-of-service attack launched from compromised IoT devices. This is even more important as these products become capable of gathering, using, and storing considerable amounts of user data.
“It’s not only the originally intended purpose you have to think about,” says Aaron Weller, Cybersecurity and Privacy Managing Director at PwC. “If a device can collect seven kinds of data, and only uses two, what happens to all the other data being collected?”
This is true of even the most mundane IoT device. For example, Weller’s home uses “smart” lightbulbs that can be operated via an app. Because the communication is one-way, with the app sending commands but revealing little about what the device itself records, privacy questions arise: “The app that I use to control the bulb may not expose all of the functionality that the device is capable of. For example, does it know what the time and date is, and does it track what time I turn the lights on in the evening?” asks Weller. “If I don’t turn them on for three days in winter, has it stored that information somewhere, possibly in the cloud, and could someone work out that I was out of town?”
Indeed, “the over-collection and over-retention of data are issues we’ve been dealing with in the privacy field for a number of years — but now it’s about thinking how we apply these principles to new technologies and how people will use them into the future,” Weller adds.
Because the IoT is still relatively new, some device manufacturers have not necessarily been concerned about the “too much data” issue in their products. They may have been more interested in adding features: “When new IoT products are released, they’re basically being tested on the entire population: tech companies put things out and then observe how people use them to determine which features become popular and which are just gimmicks,” says Weller.
Nor are there yet specific U.S. regulations or laws around IoT device privacy — even though the U.S. Federal Trade Commission has identified the risks of a lack of IoT privacy standards and the EU’s General Data Protection Regulation (GDPR) specifies stringent rules around IoT device privacy. “When the technology is moving faster than the legal framework, it’s hard to say ‘you must do these particular things’,” Weller says.
So for now, IoT privacy comes down to manufacturer self-regulation. Standards are being developed by industry bodies such as the Online Trust Alliance (OTA), which has published privacy parameters for wearable devices and home security systems, for example.
So what will these manufacturers do? This is where the concept of “privacy by design” comes into play. In this approach, manufacturers assess a product’s potential privacy risks during the design phase, then address those issues during product development and manufacturing. The concept is a key feature of the GDPR, under which the most privacy-friendly default settings, such as those that minimize the collection, retention, and sharing of personal information, must be designed and built into new products, devices, and business processes when the regulation takes effect in 2018.
In general, a manufacturer’s privacy review of an IoT device should cover every step of the device’s planned operating life and retirement, including how to handle any associated cloud services, how much data really needs to be gathered, and what should be done with any retained data. For instance, these considerations could address the secondary market for long-lived items such as refrigerators or cars: if these are sold by their original owners, will data remain in the device and be somehow accessible to the purchaser?
A real-life example of privacy by design is Blue Maestro’s Pacif-i, a pacifier that connects to a smartphone to record a baby’s temperature and the effect of any medication taken. Pacif-i uses an encrypted connection between the pacifier and the smartphone, and all data is stored locally on the phone.
“[Privacy] was something that we thought about from the start,” says Richard Hancock, CEO of Blue Maestro. “We’re at the fairly low end of the spectrum because our device does not hold any data about the child, and you could question how valuable information about temperature really is to anyone else, but we wanted parents to feel that no one else could see that data without them having control over it.”
Many manufacturers are likely to adopt privacy by design features, as Blue Maestro did, especially if they want to sell IoT products in Europe under the GDPR rules. Some of them could even use enhanced privacy as a selling point.
In many cases, manufacturers may be forced to implement privacy measures as the IoT privacy problem becomes more prominent, if they expect to stay competitive in the IoT space. “There’s more awareness than there was, and some of the leading device manufacturers are very much focused on this,” says Weller. “It could be a competitive differentiator for them.”