The Oxford Dictionary defines privacy as "the state or condition of being free from being observed or disturbed by other people."


It seems basic enough that people would expect to have the right to be left alone. One problem, though: technological innovation in the areas of artificial intelligence and data analysis has led to some pretty astounding advances in the art of threatening privacy. And while we've had years of debate and refinement around this topic, there's still a lot we have to learn about our relationship with privacy, and how to adequately protect it.


A quick glance at the privacy headlines to start 2020 clearly illustrates what a complex, ongoing challenge privacy presents. From legislation to surveillance to the delicate balance between privacy and user experience, there's a lot happening on the frontlines of this critical topic, and the implications for the future shape of privacy are enormous.


Let's start with probably the biggest privacy news of the new year: the California Consumer Privacy Act (CCPA), sweeping legislation that took effect Jan. 1 and seeks to make privacy practices much more transparent and to give consumers far more power over their data. Businesses have been preparing for months to ensure they're in compliance, but even as they work out what compliance looks like, the weeks-old law is already attracting controversy.


The legal community is debating the CCPA's constitutionality and whether it could face challenges in court. Meanwhile, experts such as Eric Goldman, a law professor at Santa Clara University who shared his views in a recent piece for The Hill, argue that a single federal law would be more effective, and far less confusing for businesses, than a patchwork of state privacy laws.


While the CCPA is big news in the United States, our evolving relationship with surveillance may be the biggest global privacy story of 2020. As AI and analytics are increasingly paired with images and video to create powerful surveillance engines, humans face a future in which they're watched everywhere they go. Nowhere is this on clearer display than in China, where cameras backed by facial recognition technology are everywhere.


But even in a notoriously restrictive country, dissent is forming. NPR recently published a fascinating piece about how activism is starting to bubble up in reaction to widespread surveillance by government and private industry alike, and the resulting threats to the personal data being collected.


China is far from alone. Similar concerns about surveillance exist in the United States, such as in upstate New York, where civil rights advocates are up in arms over a school district's use of facial recognition technology to spot guns and sex offenders, arguing that it threatens student privacy.


It's an issue that figures to grow as AI-infused surveillance technologies mature and spread. Along those lines, the Chicago Tribune recently published a detailed piece about Clearview AI, a company that has quietly been providing innovative facial recognition technology to hundreds of law enforcement agencies all over the country. This raises the real possibility that what Chinese citizens are facing may not be far off for Americans.


For now, however, ground zero for privacy battles in the United States is users and their personal relationships with technology. In particular, privacy concerns again are swirling around iPhones®.


On the user front, recent reports have suggested that new security and privacy features in iOS 13 may be undermining iPhone functionality. It's a topic that reflects the constant struggle to balance convenience and privacy, with consumers asking tech companies both to unlock the world for them and to adequately protect them from that world's threats.


Meanwhile, the issue of whether iPhones should have backdoor access has resurfaced in the wake of the mass shooting at a naval base in Florida last month. The FBI wants to get into two phones that belonged to the shooter, a Saudi national who was killed in the attack, but, just as it did several years ago following a terrorist attack in San Bernardino, CA, Apple® is refusing to create a backdoor on the grounds that doing so would threaten the privacy of all iPhone users.


Elsewhere on the user-centric front, Google has said it will end support for third-party cookies, meaning it will become much tougher for advertisers to track consumers and offer them targeted ads. But an Inc. report suggested that less-ethical advertisers may turn to tools that are even greater privacy threats. No one's quite sure how this move will play out, but it sure looks like a win for privacy, at least in the short term.


Clearly, the privacy waters are, at best, muddy, which makes 2020 a potentially critical year in the continued evolution of our relationship with privacy in the internet age. Whether we tip toward more expansive and agreed-upon privacy protections or we continue to allow the degradation of any assumption of privacy, we could very well look back on this year as the one when the battle was won … or lost.