Data Privacy (or the Lack Thereof) in the Internet of Things


Posted by John Linkous

At this year's Consumer Electronics Show (CES) in Las Vegas, new technologies ran the gamut from incremental changes to existing technologies to full-blown new market segments (here's looking at you, drones). While technologies such as drones and connected cars have significant implications for geo-positioning privacy and even kinetic threats, an even bigger threat comes from the impact they may have on our data privacy.

Watching Your Watch

One of the most prominent technologies featured at this year's event was fitness bands, a subset of "smartwatch" technology. These wearable devices track a user's physical movements, as well as health metrics such as heart rate, breathing patterns, and posture. All of this data is sent to our good friend, the cloud. The intent of these devices is generally benign; they are there to help users maintain healthier lifestyles or assist healthcare professionals in monitoring at-risk patients. They are a part of the broad spectrum that is the Internet of Things (IoT). There's certainly nothing wrong with this in concept, until you realize the many ways that data privacy can be breached by these technologies—both intentionally and inadvertently.
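To make that data flow concrete, here's a rough sketch (in Python) of the kind of telemetry record a fitness band might assemble and ship to its vendor's cloud. The field names and endpoint are hypothetical, invented for illustration only, but they give a sense of how much sensitive detail leaves your wrist.

```python
import json
from datetime import datetime, timezone

# Hypothetical example of the telemetry a fitness band might upload.
# The field names and endpoint below are illustrative only, not any
# vendor's actual API.
UPLOAD_ENDPOINT = "https://api.example-wearable.com/v1/telemetry"

payload = {
    "device_id": "FB-1234567890",          # ties every record to one device, and so to one owner
    "recorded_at": datetime.now(timezone.utc).isoformat(),
    "heart_rate_bpm": 72,
    "breathing_rate_rpm": 14,
    "posture": "seated",
    "steps_last_hour": 412,
    "gps": {"lat": 36.1147, "lon": -115.1728},  # location data often rides along
}

print(json.dumps(payload, indent=2))
# On a real device, this JSON would be POSTed to UPLOAD_ENDPOINT,
# which is where the privacy questions in this post begin.
```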

Shared Data Blues

In terms of data privacy, the problems with these technologies are vast. Let's start with a basic premise: product functionality and performance will always trump security. IoT products are based on a hodgepodge of off-the-shelf hardware and software, coupled with proprietary code and chipsets. While many IoT devices use trusted operating system kernels such as Linux and its robust TCP/IP stack, they augment this with other technologies, including newer protocols such as INSTEON and Z-Wave that haven't had nearly the level of security vetting of other, more trusted protocols.

The likelihood of IoT vendors integrating security testing into their products is relatively small. Moreover, the concept of access control is completely lost on many IoT devices. Data is stored in cleartext, and there are no "users": physical access to the device means you have access to the data it contains. That may not be a big deal when you're talking about a desktop computer, or a device that utilizes encryption or passcode protection (such as a mobile phone). But when the device contains information more private than just your contact list, the risk is significantly higher.
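What would it look like to not store that data in cleartext? Below is a minimal sketch, assuming a device that derives an encryption key from a user passcode and uses the widely available Python cryptography package; the record format, passcode, and parameters are illustrative assumptions, not any vendor's actual design.

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def key_from_passcode(passcode: str, salt: bytes) -> bytes:
    """Derive a symmetric key from a user passcode (illustrative parameters)."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passcode.encode()))


# A hypothetical health record the device would otherwise store in cleartext.
record = b'{"heart_rate_bpm": 72, "breathing_rate_rpm": 14, "posture": "seated"}'

salt = os.urandom(16)                        # stored alongside the ciphertext
fernet = Fernet(key_from_passcode("1234", salt))

ciphertext = fernet.encrypt(record)          # what actually lands on flash storage
print(fernet.decrypt(ciphertext))            # recoverable only with the passcode
```

The point isn't the specific library; it's that without a secret the user controls, whatever sits on the device's storage is readable by anyone who picks it up.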

Unanticipated data sharing is another issue. Many IoT technologies like to federate, and the problem arises when the information they federate is private in nature. For fitness bands, this might involve automatically transferring users' health metrics to a centralized server within, say, a hospital environment. For home automation, this might involve sharing information about when scheduled events are to occur, indicating when homeowners may be away. The reality is that neither the relative security of federated endpoints, nor the security implications of threats such as man-in-the-middle attacks, is of great concern to most vendors.
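As a sketch of what taking man-in-the-middle threats seriously could look like, here is a hypothetical fitness-band sync routine in Python that only federates metrics over TLS and only trusts a specific CA certificate. The URL, payload, and certificate path are assumptions for illustration, not a real hospital API.

```python
import requests

# Hypothetical federation endpoint; the URL, payload, and certificate
# path are assumptions for illustration, not any vendor's real API.
FEDERATION_URL = "https://hospital.example.org/api/patient-metrics"

metrics = {
    "patient_ref": "anon-7f3a",
    "heart_rate_bpm": 72,
    "recorded_at": "2015-01-12T09:30:00Z",
}

# verify= points at the CA certificate the device expects the server to
# present; if a man-in-the-middle offers any other certificate, the TLS
# handshake fails and no health data is sent.
response = requests.post(
    FEDERATION_URL,
    json=metrics,
    verify="/etc/device/trusted_hospital_ca.pem",
    timeout=10,
)
response.raise_for_status()
```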

Unfortunately, the technical issues related to the IoT devices mentioned above are the lesser concern for data privacy. The greater issue is the one thing many technology vendors seek: maximum monetization. If you have a product that captures users' personal information and then sends it to the cloud (presumably so the user can review the data at a later time), that information becomes fertile ground, both for the vendors who want to get revenue out of the statistical data users provide, and for the marketers who want to chop the data up and package it for even more product and service vendors. Some regulatory controls, such as HIPAA, are in place to protect the privacy of select user data. But these mandates are generally limited to specific types of personal data (such as health care-related information), and only apply when protected information is stored along with personally identifiable data. All a vendor or marketer needs to do is find a way around the specific definitions used in these laws, and then they can use the data in myriad ways. That may violate the spirit of a law, but not the letter of it.
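To see how little "finding a way around the definitions" can take, consider this deliberately crude, purely illustrative Python pass that drops the fields a narrow legal definition might treat as personally identifiable while keeping everything a marketer actually wants. The field names are hypothetical.

```python
# Illustrative only: a crude "de-identification" pass that strips the
# fields a narrow legal definition might cover, while leaving behind
# data that is still commercially valuable (and often re-identifiable).
DIRECT_IDENTIFIERS = {"name", "email", "street_address", "ssn"}


def strip_direct_identifiers(record: dict) -> dict:
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}


user_record = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "zip_code": "89109",
    "birth_year": 1978,
    "resting_heart_rate": 58,
    "nightly_sleep_hours": 5.1,
}

print(strip_direct_identifiers(user_record))
# {'zip_code': '89109', 'birth_year': 1978, 'resting_heart_rate': 58,
#  'nightly_sleep_hours': 5.1}  -- still a rich, marketable health profile.
```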

Ultimately, users must determine how much of their personal data they wish to share, and with whom. The problem comes when technology—and by extension, the developers and manufacturers of those technologies—makes that decision for us. At that point, privacy becomes obsolete.

Contributors

John Linkous, Technology Advisor

Topics: privacy, critical infrastructure, data security

Blogs posted to the RSAConference.com website are intended for educational purposes only and do not replace independent professional judgment. Statements of fact and opinions expressed are those of the blog author individually and, unless expressly stated to the contrary, are not the opinion or position of RSA Conference™, or any other co-sponsors. RSA Conference does not endorse or approve, and assumes no responsibility for, the content, accuracy or completeness of the information presented in this blog.

