How Do Google, Microsoft and Adobe Think About Privacy?

Posted by Tony Kontzer

You may have noticed that privacy has become sort of a big deal of late. From Apple's battle with the FBI to ongoing work on developing an international privacy regulation, privacy issues have taken center stage, right next to the plethora of security challenges companies face.

At the RSA Conference on Tuesday, a panel of privacy executives from Microsoft, Google and Adobe took on the topic, and they made one thing abundantly clear: Privacy is an often-foggy topic that presents numerous challenges and requires some uncomfortable tradeoffs.

One of the primary challenges is exactly how to define privacy and the role it plays within an organization. The answers are different for every entity, depending on factors such as industry, product type, customer profile, and compliance requirements.

"It's about what's appropriate collection, appropriate protection and appropriate use," said Brendon Lynch, chief privacy officer at Microsoft, describing the prism through which the company views privacy. "Sometimes privacy and security work hand-in-hand, sometimes they're treated as separate processes."

Google obviously faces more wide-ranging privacy issues than most companies because of the vast amount of data it's constantly collecting about its millions upon millions of users. For that reason, the company has established a stringent privacy process around its products.

"Every launch at Google undergoes a privacy review," said Keith Enright, legal director of privacy for the search giant.

Additionally, Google has established separate privacy teams to focus on legal and engineering issues, respectively. Doing so enables them to dive deeper into the implications for both aspects of the business, he said.

Naturally, enforcing privacy is not always a popular activity. When a privacy executive has to tell marketers that something they're doing presents an unacceptable privacy risk, tensions invariably surface. As a result, enforcing privacy is a very delicate operation.

Enright said that such tension is important to maintaining a meaningful privacy dialogue, and thus should be encouraged. But he also said it's in a privacy team's best interest to choose its words wisely.

"If we have a situation where we have concerns, we don't say you have a privacy problem," said Enright. Instead, he trains his team to remind employees to keep the user in mind, and to remain true to the concept of trustworthy computing.

Sometimes, however, privacy laws and regulations conflict with an organization's objectives. Occasionally that necessitates a tweak of those objectives, but more often than not, it reflects the fact that most privacy regulations are hopelessly out of date.

A prime example of this is the European Data Protection Directive, adopted in 1995—just before cell phones and the Internet took off—and amazingly still in effect. European lawmakers are now more than three years into hammering out the details of the General Data Protection Regulation (GDPR) that will replace the aging directive.

But that's easier said than done, the panelists agreed, as there are perceived weaknesses with the 200-page document that must be addressed. One problem: the document was prepared entirely by government officials, without significant input from industry.

"It's written by people who don't run businesses," said MeMe Rasmussen, VP and chief privacy officer of Adobe Systems, noting that the document reflects the agendas of dozens of nations, leaving much open for interpretation. "The dust is not going to settle for a few years."

Digging deeper into the GDPR, there's one particularly problematic thread: the right to be forgotten. It's a controversial concept that traces back to the case of a Spanish man who wanted Google to stop linking to a 1998 newspaper notice detailing the forced sale of his home over unpaid debts. The man, Mario Costeja González, argued that he had long since settled the debt, and thus the page was unfairly casting a shadow over his life.

Ultimately, the case went before the Court of Justice of the European Union, which ruled in 2014 that Google must comply with the Data Protection Directive and remove links to the page from its search results. On the first day it began accepting removal requests, Google received 12,000 of them. The company had hoped the right to be forgotten would not be included in the GDPR.

"It's not one of the outcomes we were pleased with," Enright said, surprising no one in the room. "We had to invest in a process that was minimally invasive of other rights when complying."

At which point Rasmussen offered a humorous reminder that every company's privacy considerations are different.

Said Rasmussen: "I'm very thankful we don't have a search engine."




