Privacy 2022: Perspectives from the Top

Posted on in Presentations

Risk and complexity have accelerated ferociously for any organization using data and managing privacy today. This panel – all seasoned professionals from the pinnacle of the privacy field – will explore the legislative landscape, the risks that exist and are emerging, and the operational realities of managing privacy in a complex digital environment.

Video Transcript

>> ANNOUNCER:  Please welcome panel moderator, Dominique Shelton Leipzig.


     >> DOMINIQUE SHELTON LEIPZIG:  Well, great.  First of all, we're so thrilled to be here.  I'm so thrilled to be able to have this conversation about privacy from the top with Kalinda Raina who is the Chief Privacy Officer of LinkedIn, and Jane Horvath, the Chief Privacy Officer of Apple, and Keith Enright, the Chief Privacy Officer of Google.


     So, we're going to get right into this, because I know there are a lot of questions on all of your minds.  A big question on my mind is the fact that the issue of privacy has almost, it seems, overnight gone from a legal issue to a business issue.  It seems like almost every day I pick up the news, on the front page there is something there dealing with data and privacy.  I even saw Jane featured in O Magazine recently, so that is how popular this topic has become.  And I was just curious, and probably many in the audience are curious, how has this changed the way that you do your work and what you do?


     So, maybe start with Kalinda.


     >> KALINDA RAINA:  Thanks so much for that, Dominique.  Yeah, it is amazing to see how much privacy is pervading the discussion these days.  I think for those of us who’ve been in the profession a long time - Keith, Jane, and I - we're pretty excited to see the billboards on 101 advertising privacy.  So, it has become something that, finally, companies are beginning to integrate into the way they do business.


     I think what that has meant for those of us in this profession is, it’s becoming a lot easier, quite honestly, to move forward some of the things that we have been trying for years to develop within companies.  Developing a culture of privacy, where everybody in the organization sees it as part of their job, is really one of the things I have found most exciting since GDPR came into play: everyone at the company really has a familiarity and an understanding and a passion around protecting the privacy of our members as part of how we ensure trust as a company. 


     >> DOMINIQUE SHELTON LEIPZIG:  So important.  Jane, you've been at this for a long time.  Tell us what it has meant to you to see this go from behind the scenes to in front.


     >> JANE HORVATH:  So, I think it's interesting.  Privacy is front‑page news right now, but from the perspective of my career, I felt like I've been on the front lines of privacy for a very long time, particularly at a lot of the important moments.  From an Apple perspective, I knew the moment I walked in the door that Apple was very different with respect to privacy.  It was something that Steve Jobs had talked about publicly - about asking users for permission, asking over and over again. 


So, privacy was something - of course it was more behind the scenes at Apple at the time - but it was very much a cross-functional effort from the beginning.  No one owns privacy at Apple.  It is cross‑functional.  We all work together.  One of the secret sauces at Apple is a team of privacy engineers who report in to the SVP of software; they are embedded in every product, along with privacy policy and legal people, to build privacy in from the start.


     So, privacy is now front‑page news, but for us it almost feels like an afterthought at this point.  It's exciting.  I mean, right now there is a pending omnibus privacy law.  Exciting.  And GDPR, as Kalinda mentioned.  But from a cultural standpoint at Apple, it just fits so well, and because of the executive commitment to privacy, it has been part of every product and service that touches personal data for a long time. 


     >> DOMINIQUE SHELTON LEIPZIG:  That's a good point.  You mentioned the pending bipartisan federal bill that hit the news last Friday.  Again, this sort of timely moment for privacy.


     Keith, tell me what that has done and what has that meant for you in your role at Google.


     >> KEITH ENRIGHT:  Sure.  So, in some respects, my experience has been really similar to Jane's.  I joined Google just over 11 years ago and had the privilege of working with Jane there at the front end of my time.


     Because it is Google - because privacy, security, keeping users safe, is so fundamental to the mission of the business and to our long-term success - my experience of the issue has always been front and center in terms of executive attention.  I would say that Sundar, my CEO, is probably as sophisticated on this issue as the chief privacy officers of most Fortune 50 companies.  It's on the agenda of every significant leadership conversation and every product conversation inside the company.  So, internally, things actually haven't changed all that much.


     I would say, externally, it's been incredibly exciting.  Several of us - all of us, actually - have served on the board of the IAPP, the leading trade association in our space.  And watching the growth and maturity of the profession extend over time has been incredible, and it leaves me with a sentiment - I'm paraphrasing something one of my former colleagues said: “it's the same fight, but it’s a bigger army.”


I think there are so many of us that are committed to keeping users safe online, delivering the benefits of technology to as many people around the world as we can in a safe and private way.  And there is just a higher level of sophistication and a much stronger talent pipeline than I think we've ever had before, and I see that trend line continuing. 


     >> DOMINIQUE SHELTON LEIPZIG:  Great.  Speaking of trends, and as privacy has sort of risen in prominence, I guess, there is this other trend that is out there of putting other types of laws, like antitrust, consumer protection, content, into sort of this sphere of privacy.  And whether that is appropriate or not.


     There are trends going on in Europe with the Digital Markets Act that is being proposed.  And this idea of this tension that might exist between making encryption and other things available, and also keeping data private.  I'm just really curious to know what you think about the appropriateness, or lack thereof, of incorporating antitrust and other legal principles into the privacy discussion.


     Let me start with Keith, and I'm going to take it this direction this time.


     >> KEITH ENRIGHT:  Sure.  Starting with the issue of antitrust, because I do think this is timely, there are a number of bills being considered in the US Congress right now that we should have grave concerns with.  The audience here is especially sophisticated on issues of security.  One of the issue areas that we see in legislation that’s under consideration would require sort of open sharing of information with third parties at a level of parity that might not make sense and could actually introduce new privacy and security risks.


     One thing to highlight, again, which I think this audience understands better than probably any other audience in the world, is we are looking at hundreds of millions of signals every day in our effort to try to protect users.  And we successfully prevent hundreds of millions of attacks and exploits against our users.  We keep more users safe every day than any other company in the world.


     The challenge that we face with some of the proposed legislation is restrictions on our ability to act upon that information without potentially opening ourselves up to legal claims by third parties who are suggesting that, in the effort of acting quickly, we could mistreat or unfairly treat some third party.  We need to preserve flexibility to be able to protect our users leveraging observed intelligence as efficiently and effectively as we can.


     I would also say on the issue of regulating platforms and providing shared access to data, it is incredibly important that we recognize not all parties are operating with the same sophistication, and not everyone who is seeking access to sensitive user data intends to operate on it with the same good intentions.  We need to be able to preserve the discretion to operate platforms in a way that is reasonably designed to optimize for the security and privacy of the users on the platform.


     Now, all that having been said, while I do have grave concerns about the competition legislation that’s currently being proposed, I would say, more generally, we are in desperate need of federal privacy legislation.  I think you're going to find broad-based support across industry.  We have long supported it, and we actually think smart, thoughtful regulation that protects user rights and allows for continued data‑driven innovation is essential.  And, like Jane, I'm increasingly optimistic that maybe we’re finally on the cusp of it. 


     >> DOMINIQUE SHELTON LEIPZIG:  Yeah.  Jane, I would be interested in your thoughts.  It's exciting to think that, if the discussion starts moving away from these tangential antitrust and other theories and we get to a federal privacy law, that might change the discussion.  Curious what you think about the appropriateness, or lack thereof, of this recent injection of antitrust and consumer protection into the privacy discussion.


     >> JANE HORVATH:  Like Keith, we are absolutely in favor of competition - of fair competition - and we do not want to be anti‑competitive.  However, the bills that we're looking at right now, whether pending or actually enacted, raise some concerns from both a privacy and a security standpoint.


     iOS has one way for third parties to get on the platform, and that is through the app store.  The app store has really strong privacy protections: if a developer wants to get on the app store, they have to agree to the developer guidelines.


     The developer guidelines actually have a number of privacy requirements that we've put in looking at, literally, what is the risk to our customers of an app calling this API or using this API balanced with the desire to have innovation on the platform.  A really good example of that is the health app.  The health app on the phone, you can upload all of your medical records, if you want.  That data stays on device.  If it's backed up, it's backed up in an end‑to‑end encrypted format.  But we knew when we were developing the health app that it could lead to a lot of really important innovations.


So, apps should be able to read and write to the health app.  So, there are APIs for both.  But if a developer wants to call one of those APIs, they may, pursuant to the app store guidelines, only do so for purposes of health and fitness.  There is no “and with the consent of the user.”  It is literally constrained.  An app can use that data for those purposes only.


     So, if we're looking at the competition bills right now, it looks like we would no longer be able to hold apps to that level of privacy.  It needs to be reasonably tailored, and I would imagine we would be in a defensive position arguing that it was reasonably tailored.  They would probably argue that we could have loosened the restrictions a lot more, which would have allowed more access to the data for other, third-party purposes.  So, that’s a small example.


     On the security side, we're looking at bills requiring side loading on the iOS platform.  That means that users, who have traditionally gotten their apps from the app store - apps that have gone through app review to determine whether they comply with the guidelines, and to look for security vulnerabilities - could install apps that bypass that review entirely.  Is it a flashlight app that is also calling all kinds of other APIs that it has no need for?  Those are the kinds of things that app review looks for; and as a result of app review, we have 98% less malware than the other platform.


     So, these are the things that we're looking at in balancing the competition legislation.  We really feel like we have designed a device from the ground up with privacy and security in mind.  We hear often, “well, Apple, you'll just figure it out.”  We were told that about five years ago in connection with encryption: build a backdoor, and you can figure out how to protect that backdoor; we know it's going to be safe.  But if you open up the platform to side loading, that is, in some sense, a backdoor into the platform and access to all of the software. 


     >> DOMINIQUE SHELTON LEIPZIG:  Thanks, Jane.  I mean, these are such complicated issues.


     >> KEITH ENRIGHT:  Dominique, just one quick reaction to this.


     While Jane and I agree on most things, I do think it's important to call out that on Android, as on a laptop, you can install applications from the internet.  We do think you need to balance the benefits of allowing platforms to compete on privacy, and on this issue specifically - allowing competing app stores - we think the benefits are self‑evident. 


And I would challenge this statistic that is frequently put out about 98% less malware.  We haven't seen peer-reviewed, recognized research to support that statistic.  But as a general matter, again, we agree on more than we disagree on; we can respectfully disagree on this one - allowing healthy competition in terms of platforms and app stores still seems like the right way to proceed. 


     >> DOMINIQUE SHELTON LEIPZIG:  It's interesting, because the business models are also driving a lot of this, and when you look at antitrust and these concepts coming in, Kalinda, they are not just in Europe.  I was just reminded that last September, Senator Klobuchar had hearings on the interrelationship between antitrust and privacy, and there were some concerns around that.  And Chair Khan was at the IAPP conference talking about these relationships.


     Curious about your thoughts about the appropriateness and what you've just heard from Jane and Keith.


     >> KALINDA RAINA:   Well, antitrust is a really interesting issue in the privacy space right now.  And as you can hear from Google and Apple, it is a very difficult issue for us in the privacy space to manage.


     What we are seeing is that data is one of the most valuable assets that companies have today.  And so, where antitrust in the past might have been about how much ownership you have of a certain market, a lot of it is beginning to focus on how much data you have, and what control you have as a result of that data.


     One of the things that I think is interesting about the interplay between privacy and antitrust is, for those of us who are privacy advocates, we're constantly looking for ways within organizations to minimize the amount of data collected, to deidentify data, and to figure out, through privacy-enhancing technologies, new ways in which we can use data in a less privacy-impactful way.


And one of the things that some of the antitrust legislation could potentially do is put companies in a position where, if they do that and deidentify data, they’d have to share it with others to level the playing field.  So, there’s this “tug and pull” over how much data needs to be shared across the big five in tech to ensure that we have a fair playing field.


At the same time, it disincentivizes some of the things those of us in the privacy space have been pushing companies to work towards - deidentifying data and using more privacy-enhancing technologies - by potentially putting you in a position where you’d have to port that data over to others.


     I think Keith makes a really good point that, when you're looking at the Digital Markets Act, and even the DSA - the Digital Services Act - they're making assumptions about the sophistication of the companies: what they can do, what their capability is to work across platforms.  That isn't always true, and it makes these laws really hard to put into action. 


     >> DOMINIQUE SHELTON LEIPZIG:  I'm glad that you mentioned the volume of data.  So, let's talk about that.  I was looking at some statistics that I thought were sort of impactful and helpful to take a step back and think about.


     We are generating, globally, 2.5 quintillion bytes of data per day - a number with 18 zeroes behind it.  Even as we’ve been having this conversation, as I understand it, 127 new devices are being connected every second.  Every day, about 11 million new connected devices come online.


     This is only going to continue, and data as an asset, because of its sheer volume, is going to continue to be a push and pull.  I'm just curious, from your standpoint: how has the sheer volume of data impacted what you do?


     Maybe I'll start in the middle this time with Jane, and then go out.


     >> JANE HORVATH:  Sure.  So, as far as data goes, there is a range of data with different levels of risk.  Data that is not tied to an identity is a whole lot less risky than personal information.  One of the things that we've worked deeply on at Apple is trying to use privacy-enhancing technologies to minimize the amount of personal information that we collect. 


     A good example is Apple Maps.  When you use Apple Maps, it generates a random device number - just a random number - that will serve as your identity while you're using Apple Maps for the session.  If I am navigating from San Francisco to L.A., my device will create a number, and it may create several numbers on the way down to L.A., because the idea is we don't want to be able to string together all your endpoints and reidentify you.


When your device is communicating with Apple servers, those servers see you as a random number, and that random number changes every time you use Apple Maps, or if you're using it for a long time.  So, that's one way to minimize.  You cite the amount of data; I think you really have to look and see whether it is identifiable data that could potentially be compromised, or whether there are better things to do.  If you don't need to use data, encrypt it.  Don't have access to it. 
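The rotating-identifier pattern Jane describes can be sketched as follows.  This is a toy illustration, not Apple's actual implementation; the class name and rotation interval are invented for the example:

```python
import secrets

class MapsSession:
    """Toy sketch of a rotating, random session identifier.

    The server only ever sees the current random token, and the token is
    regenerated periodically, so a long trip is split across unlinkable
    pseudonyms and separate sessions share nothing.
    """

    ROTATE_AFTER = 50  # requests before rotating mid-session (illustrative)

    def __init__(self):
        self._requests = 0
        self._token = secrets.token_hex(16)  # fresh random ID per session

    def token(self):
        # Rotate the identifier after enough requests, so even one long
        # session cannot be strung together end to end.
        if self._requests >= self.ROTATE_AFTER:
            self._token = secrets.token_hex(16)
            self._requests = 0
        self._requests += 1
        return self._token

s1, s2 = MapsSession(), MapsSession()
assert s1.token() != s2.token()  # separate sessions are unlinkable
```

The point of the design is that nothing stable survives across sessions for a server to join on, which is a stronger guarantee than simply promising not to join the data.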


iMessage is a good example: when we released it, we didn't need access to the communications, so we released it end‑to‑end encrypted.  So, I think that is the real imperative here - look at all data processing, and when you're designing products from the ground up, minimize the amount of personal information you're collecting. 


     >> DOMINIQUE SHELTON LEIPZIG:  That's really an important point about the types of data and ways to make it less impactful as far as individually identifying.


     Keith, how is the sheer volume of data that you talked about, how is that impacting what you do?


     >> KEITH ENRIGHT:  Sure.  I would open with the very provocative proposition that processing data, even at that astonishing scale, can be good.  If we are making people's lives better, if we are delivering value to our users and our customers around the world, and we're doing it all responsibly, we need to challenge this assumption that processing data is intrinsically negative or hostile or harmful.  Because I don't believe that is the case if we're doing it in a responsible and thoughtful manner.


     With that, I agree with what Jane was saying.  There is an incredible amount to be excited about in terms of the innovation that technology is delivering.  We are grappling with, how do we protect the fundamental rights and freedoms of users while we are delivering them increasingly powerful and interconnected services that allow them to live their lives better, do their jobs better, be more efficient, effective, deal with things like a global pandemic.


     One of the things we did was community mobility reporting.  In the early days of the pandemic, we worked together with our technical teams to figure out how we could use anonymous data to help the public health sector understand how people's location patterns had changed.  Not looking at individual users in a way that identified anyone, but we could help public health authorities understand how traffic to hospitals was changing, which was incredibly useful to them as they were planning how to adapt public health systems to support people during the early challenges of the pandemic.
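The aggregate-and-suppress idea behind that kind of reporting can be sketched like this.  It is a simplified illustration (the actual Community Mobility Reports additionally used differential-privacy noise), and all names are hypothetical:

```python
from collections import Counter

def mobility_trend(visits, min_count=10):
    """Aggregate per-category visit counts and drop any bucket that is
    too small to publish, so trends never describe a handful of people.
    (Simplified sketch; the real reports also added noise before release.)
    """
    counts = Counter(category for _user, category in visits)
    return {cat: n for cat, n in counts.items() if n >= min_count}

visits = ([(f"user{i}", "hospital") for i in range(25)]
          + [(f"user{i}", "transit") for i in range(3)])
trend = mobility_trend(visits)
assert trend == {"hospital": 25}  # "transit" suppressed: only 3 visits
```

Publishing only thresholded aggregates is what lets the trend ("hospital traffic is up") survive while any individual trace does not.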


     We also had a historic cooperative effort with our friends at Apple on exposure notifications.  Again, we knew that there was something we could do, and we desperately wanted to help.  Initially, the inquiries that came in to us wanted to use location data to support traditional contact tracing efforts.  There were all kinds of problems with that.  Our location data is permission-based, so we don't have an exhaustive repository of location data, because a lot of people choose not to turn it on.  Location data is also not as accurate as you would want it to be for contact-tracing purposes.  It might say you are on the wrong side of the street, or on the wrong floor.  And even those seemingly small inaccuracies could really undermine contact tracing.


Instead, there was innovation around using proximity between devices, with an on-device processing model, minimizing any data shared with any central server; so that, without enabling unwarranted surveillance, you could provide useful intelligence to people about whether they had been exposed to something that could put them at a health risk.  So, I think those kinds of innovations are enabled by the collection, storage, and processing of data at scale.  And I'm really excited about the direction of technology, because I think we are going to be able to do this in even more exciting ways that are even more protective of user privacy than anything we've seen to date. 
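The proximity-based, on-device model Keith describes can be sketched in miniature.  This is a toy version of the idea (the real Exposure Notifications protocol derives its rolling IDs cryptographically from daily keys rather than generating them independently), with illustrative names:

```python
import secrets

def new_rolling_id():
    """Each device broadcasts short-lived random identifiers over
    Bluetooth; the IDs say nothing about who or where you are."""
    return secrets.token_hex(8)

def exposed(observed_ids, reported_ids):
    """On-device matching: compare the random IDs this phone overheard
    against IDs voluntarily published by users who tested positive.
    No location data and no central register of contacts is required."""
    return bool(set(observed_ids) & set(reported_ids))

# Simulated encounter: phone A overhears one of phone B's rolling IDs.
b_ids = [new_rolling_id() for _ in range(3)]
overheard = [b_ids[0], new_rolling_id()]

# Later, B tests positive and publishes its recent IDs; A checks locally.
assert exposed(overheard, b_ids) is True
assert exposed([new_rolling_id()], b_ids) is False
```

Because the match happens on the phone against a public list, the server never learns who met whom, which is the surveillance-avoiding property Keith is pointing at.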


     >> DOMINIQUE SHELTON LEIPZIG:  Great.  Hearing from you, hearing from Jane, we're hearing about innovation being a way to grapple with the volume.  Kalinda, how has this impacted your work and what you do in terms of the volume of data that’s there?


     >> KALINDA RAINA:   Yeah.  Well, I was just reflecting, as Jane and Keith were talking, that when I started out in data privacy in the early 2000s, the amount of data that people were providing to the internet was tiny.  We didn't have phones that we walked around with, or tablets, and buying things online was considered a bit risky.  I remember one time, when I was working at a law firm in Craigieburn, I decided to buy something on eBay.  What I got was not what I thought I was going to get.  I remember everyone kind of laughing, like, “Oh my God, why would you buy something on eBay?  What a chance to take, putting your credit card online.”


     Our world has completely shifted in the past 20 years in terms of the amount of data that is being collected and created all the time.  And that has really shifted the privacy profession for me.  Early in my career, at companies like Nintendo, where I was Chief Privacy Officer before there were smartphones, the amount of data we were collecting all came through our online site.


     Today, what we are collecting, as you said, is a tremendous amount, and that has brought about this need to think about the governance of data within organizations.  This is where, in our company, I interact with our security teams the most: thinking about how we annotate this data.  How do we keep track of it?  How do we put policies on it?  How do we make sure it's deleted in an automatic fashion?  How do we put access controls in place?  And that is a lot of where our profession in privacy, and your profession in security, is going - trying to manage not only the risk of this data, but the sheer volume of it within organizations, to make sure that it is protected and used in the ways that we've committed to using it. 
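The tag-and-auto-delete governance Kalinda mentions can be sketched as records annotated with a data category and purged against per-category retention windows.  The categories and windows below are made up for illustration:

```python
from datetime import datetime, timedelta, timezone

# Illustrative per-category retention windows (values are invented).
RETENTION = {
    "security_logs": timedelta(days=180),
    "search_history": timedelta(days=30),
}

def purge_expired(records, now):
    """Keep only records still inside their category's retention window.
    Records with no known policy are dropped too, on the principle that
    untagged data should not be retained by default."""
    return [r for r in records
            if r["category"] in RETENTION
            and now - r["created"] <= RETENTION[r["category"]]]

now = datetime(2022, 6, 1, tzinfo=timezone.utc)
records = [
    {"category": "security_logs", "created": now - timedelta(days=10)},
    {"category": "search_history", "created": now - timedelta(days=90)},
    {"category": "untagged", "created": now},
]
kept = purge_expired(records, now)
assert [r["category"] for r in kept] == ["security_logs"]
```

The annotation step is what makes the rest automatic: once every record carries a category, deletion and access control become policy lookups rather than case-by-case decisions.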


     >> DOMINIQUE SHELTON LEIPZIG:  Kalinda, I love how you talked about governance, because this sort of brings us into another question I wanted to think about here with you and the audience; and that's the role of the security team and the engineers in this process of helping actualize the governance and actualize some of the privacy policies and privacy procedures that you are innovating. 


Jane, I want to start with you, because you talked about having engineers embedded with your product teams, and that’s very exciting.  Let's maybe talk about that.


     >> JANE HORVATH:  Sure.  I'm talking to a room full of engineers, and I actually have a computer science degree - a very old one - but I think, doing privacy, if you don't have a fundamental understanding of databases and servers, there’s a lot of technical understanding you need to have, or you need to have someone in the room who understands it as well.  Because this isn't like every other area of law, in which you can walk in and opine on a widget or something.  Data, databases - it all involves a really deep understanding of computer science.


     Working with the privacy engineers to figure out how to minimize data: sometimes a lawyer can say, “Oh, that's absolutely legal, you can do it.”  But the engineers will say, “Well, wait a minute, you don't need all that.  Instead of collecting something from every user, why don't you sample?”  They can think technically and talk technically, and I think they push the envelope.
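The sampling idea Jane describes - collect a metric from a small random fraction of users rather than from everyone - might look something like this on the client side; the 1% rate is an arbitrary example:

```python
import random

def should_report(sample_rate=0.01, rng=random):
    """Client-side sampling: each device independently decides, with a
    small probability, whether to report the metric at all, so the vast
    majority of users send nothing."""
    return rng.random() < sample_rate

rng = random.Random(42)          # fixed seed so the sketch is repeatable
population = 100_000
reports = sum(should_report(0.01, rng) for _ in range(population))
estimate = reports / 0.01        # scale the sample back up

assert reports < 2_000           # only ~1% of users reported anything
```

The population-level statistic is still recoverable from the sample, but the service never holds the metric for the other 99% of users in the first place.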


     On the security side: vendors, vendor security, vendor audits.  We've got contracts in place, but who is looking to see whether vendors are actually, in fact, doing what they say in the contract?  So, the best friends a privacy person has in a company are the security and privacy engineers. 


     >> DOMINIQUE SHELTON LEIPZIG:  That's a really important point about the interrelationship between the professionals working as a team to tackle this issue - the technical and the policy folks.


     Keith, how are you seeing this, and what is the role of engineers and the security team in what you do?


     >> KEITH ENRIGHT:  Sure.  The privacy engineering function at Google is perhaps the most fundamental.  When I think about our product strategy, the way things are evolving, it is about more than meeting the requirements of changing laws; though, of course, that is a baseline that we need to target.  Our mission is to keep users safe online while they enjoy the benefits of our product and service suite.


     The engineering function, it's interesting.  If you look at the way the law has evolved over the last five years, you look at something like the requirement to have a data protection officer under the GDPR, I remember when folks first saw this provision in the law.  They're like, what is this unicorn?  Like, where are we going to find someone that is sufficiently technically sophisticated to understand the issues and be actively engaged in our launch review process on a day‑to‑day basis, and be an appropriate overseer for escalations if we think that maybe we are not calibrated appropriately on an issue of data protection?  While, at the same time, there is an explicit requirement that this person be an expert in European data protection law.


     Ultimately, we, as I think many other companies of similar scale and complexity, decided yes, we're going to have a data protection officer; and yes, that person will be remarkable and will have an interesting background, but it's going to take a village to actually satisfy the requirements.  Not only as they were manifested in the GDPR, but as we see them evolving, as we follow the trend lines in the law now.


     So, when I look at our privacy organization, I stand alongside our chief information security officer.  I oversee, primarily, our legal and compliance functions for privacy.  And then we have a massive, sprawling organization that includes privacy engineers, UX specialists - folks from across domains and disciplines who all have this incredibly rich and deep understanding of how privacy affects our product strategy and the way we want our products to present to users.  And that all rolls up into a very sophisticated governance function that goes all the way up to the office of our chief executive.


I think seeing privacy engineering as one absolutely essential function in what is really an overall governance and accountability framework - that’s where we're headed.  That’s the next stage of maturity that companies are moving towards as we figure out how to comply with a massively complex and byzantine set of requirements that are coming into effect around the world, but do so in a way that we're not creating products that are ultimately unusable, or less usable, for the users that we're ultimately here to serve.  We need to reconcile those two things.  And doing it is really about organizational flexibility more so than any particular discipline. 


     >> DOMINIQUE SHELTON LEIPZIG:  I love what you talked about there in terms of adapting and working closely, not just to do the minimum of what the law requires, but also looking ahead to what will work for users.  When you look at the 150-plus countries that now have data protection laws around the globe, this teamwork and interrelationship seems so crucial.  I'm really interested to hear your take on this, Kalinda, because of the huge role that LinkedIn plays.


     >> KALINDA RAINA:   Yeah.  I really like what Keith is saying about flexibility, because privacy is definitely a growing field.  One of the things we've found on LinkedIn is that in the past four years, since GDPR was passed, we've seen a growth in the privacy profession that is quite extraordinary - people mentioning privacy in their profiles and actually saying that this is something they're doing.  And part of that growth has been in the engineering space, which I've been a huge advocate of since my days at Apple, having seen privacy engineers at work: their ability to come in, as Jane was saying, and see the potential to use different technologies or techniques to make any product's privacy better.


     I think there’s also a tremendous amount of exciting work happening in privacy-enhancing technologies right now, and in what they make possible: using data in ways that don't impact people's privacy in the same way.  I think that is a very exciting space.  And one of the things that we have done over time is really build out a privacy organization that spans not only engineering - so we have engineers who are focused on privacy-enhancing technologies and working on product design as well - but also this data-governance component.  And then, finally, the end state of what our members are experiencing.  How are they interacting with settings?  How are they interacting with the platform in a way that is privacy protective?


So, it really does take multiple teams across an organization.  We're one of the very few horizontal teams that cover the entire company, thinking about issues very holistically, from the beginning of the member's experience all the way through to how data is used on the back end, and how it might be used or shared when you're dealing with suppliers.  All of those pieces come together.  And one of the things that we may see happen, if we do get federal legislation here in the U.S., is a requirement of privacy by design.  We've already seen it in the GDPR, but we might see it here in the U.S. as well, which I think would be a real opportunity for growth in the profession from an engineering perspective - folks who have both the privacy stance and the security stance to build that into products so that we really ensure we're building with trust. 


     >> DOMINIQUE SHELTON LEIPZIG:  I love what you said there, and it lines up with the trends.  Because looking at the bipartisan draft bill that just came out on Friday, there’s a whole section on algorithms and on fairness and testing and auditing.  So, it looks like, if this gets off the ground, that sort of partnership you are already modeling will be baked in.


     I want to give everyone a heads‑up that I'm going to ask a few more questions, but we're going to leave the last ten minutes for questions.  So, in about four minutes, we'll open up two mics on either aisle, and if you're interested, feel free to start queuing up.  Because if you're sitting in your seat and you have questions, as security professionals, we want to make sure we get those answered.  But on to one more question that I had.


     We are seeing so much from corporations talking about ESG, environmental, social, and governance, and the role it is playing for public companies.  But I think it's having an impact on privately held companies, as well.


     Do you see privacy playing a role in this ESG discussion, and if so, what is that role?  And maybe start with Kalinda, and then I'll go back down this way.


     >> KALINDA RAINA:  Well, it's funny you ask that, because I was just having a conversation with someone yesterday who said it should be ESPG, the “P” for privacy.  I actually think this is a shift we have seen over the past ten years.  It's been a slow development.  For those of us who have been working in the profession behind the scenes, we've been trying really hard to grow the awareness of privacy, the understanding of privacy, in our organizations.  And now we're beginning to see the public show an interest, as well.  And I think the really exciting shift that has been happening is people's awareness and judgment of companies based on how they're handling data, how they're thinking about these issues, how they are treating privacy; and, quite honestly, all the things that go into trust: security, safety, privacy, how you are approaching the environment, how you're approaching diversity.


     And I really do think that we've only had the first 20 years of a truly commercialized internet.  In the next 20 years, the things that are possible are only going to be possible with people's trust, and people's trust in the organizations that are doing those things.  And so, I really think that we are going to see a greater awareness.  And speaking of different companies, not just the ones here on the stage, but others thinking about trust and how you build that into your products, how you ensure that to your users: I think that is going to continue to grow.


     >> DOMINIQUE SHELTON LEIPZIG:  Very interesting.  Keith, I'm curious, how are you seeing the ESG discussion and its interrelationship with privacy, and how is that playing out on your end?


     >> KEITH ENRIGHT:  Sure.  So, I would say at the outset: today, the role of technology in protecting democratic values and democratic institutions around the world, and the responsibility of technology providers in that conversation, has never been more front and center.  I think we are all seeing that manifest every day in different contexts.  Privacy, I think, is one facet of that.  We've talked about competition policy.  You could also talk about content regulation.  These are extraordinarily important and timely issues, and the role of technology and platforms in that conversation is one important aspect that we need to be really thoughtful about.


     For privacy specifically, I am seeing it come front and center in many ESG conversations, and I'm seeing that in different contexts.  I'm seeing that in my work at Google, I'm seeing that in my board service.  There is growing interest in the investment community to understand how you are thinking about privacy, not only as you define your company's role in society and your own sense of social responsibility, but also in the way you're thinking about it as a risk vector for your company; and ensuring, per everything we've said, that you are building appropriate governance and accountability systems into your company so that, as both an opportunity and a risk, you're appropriately anticipating, engaging with, preparing for, and addressing privacy in your product strategy, in your corporate vision, and as you plot the way forward.


So, this feels like an inevitability to me.  I don't think it's a flash in the pan.  Given the trend lines we've talked about around data collection and data processing, and the increasing centrality of technology as the thing that mediates people's interactions with their lives and with the world, this is something we all need to responsibly engage with.  And I think it is going to be a really interesting time for practitioners in the privacy and the security space, because we're going to be essential elements in that conversation.


     >> DOMINIQUE SHELTON LEIPZIG:  A hundred percent.  I do see the centrality of the technology bringing the ESG question forward.  And when you mention investors, we certainly see, on our end, questionnaires coming from BlackRock and Vanguard about the state of privacy for an enterprise.


     I'm just curious, Jane, have you seen ESG impact privacy in your day‑to‑day?


     >> JANE HORVATH:   Yeah.  Actually, privacy is an ESG value at Apple.  Privacy is a corporate value at Apple.  And so, all of our execs are held to account for privacy as part of our ESG values.  So, it has, in fact, come to fruition at Apple. 


     >> DOMINIQUE SHELTON LEIPZIG:  Wow.  Well, with that, I want to pause here and see if we've got time for questions.  And I do see a spotlight over here.  A gentleman here in the yellow shirt, would you please come forward?


     >> ALEX OZDEMIR:  Yes.  My name is Alex Ozdemir.  I'm a PhD student at Stanford, working on cryptography.  I actually want to build on a theme that’s come up during today’s discussion: tradeoffs between the functionality of a product and the level of privacy guarantees you're providing.  Something that was latent in some of Kalinda's answers is the existence of privacy-enhancing technologies that can help us eliminate or mitigate that tradeoff.


My question for all of you is whether you see a role for your organizations in funding long‑term research into privacy-enhancing technologies that can make those tradeoffs better, both for business and for consumers' privacy?


     >> KALINDA RAINA:  Keith, you go first, and I'll go second.


     >> KEITH ENRIGHT:  I would say absolutely, unequivocally, yes.  One example that I will give, one I sort of mentioned before, is when we were working on exposure notification technology.  Jane and I spent many nights on video conferences with regulators and policymakers and technologists around the world, because there were a few challenges we had to untangle.  One was a presumption that privacy was in tension with the utility of the technology.  There was a very strongly held belief by many thoughtful, highly qualified people that if you optimized for privacy, your exposure notification platform wouldn't work as well; that you would impair functionality if you did this privately.  And we fought that very vigorously.


We felt strongly throughout that not only could the two things be reconciled, not only could we do this in a private way, but that privacy was essential to the success of the technology.  Because if people didn't trust the implementation, they wouldn't turn on exposure notifications.  And scale was essential to that technology delivering on its promise.


Some of the underlying technologies that allowed us to make compelling arguments there, and the things that allowed us to innovate, are the direct results of the third‑party research and work that people had done in this field.  This is where the greatest promise lies.  We should be leaning in extraordinarily hard, because we need the help of the academic community and research community to help us come up with good creative solutions to these really hard long‑term problems.


     >> KALINDA RAINA:  I think you can see we are all very enthusiastic about what you’re suggesting.


     >> JANE HORVATH:  Yes.


     >> KALINDA RAINA:  And it is something that is a part, I imagine, of Apple and Google, as well as LinkedIn, in the way that we operate.  And, given all of us being so close to universities, this is something that we've done at LinkedIn, where we've actually partnered with universities to share some of our teachings and learnings.


     >> ALEX OZDEMIR:  I'm glad to hear it.  We're very excited about continuing to collaborate, as well. 


     >> DOMINIQUE SHELTON LEIPZIG:  Exciting.  Over on this side of the room, let's hear from you. 


     >> MONICA HATHAWAY:  Yeah, hi.  Monica Hathaway.  I wanted to lean in a little bit on the trust idea.  I think that's so critical.  And wanted to hear some ideas from you guys on thought-starters for how we rebuild trust.  I think there's been a lot of talk about privacy, and using that as a value proposition to sell, but then we're also seeing that trust eroded.  The recent headlines from DuckDuckGo, having that data sharing partnership with Microsoft, that’s such a slap in the face to people who use DuckDuckGo as a way to circumvent Google.  Sorry.


     It seems like we're taking one step forward and then two steps back when companies put privacy forward as a core value, then turn around and abuse that trust, and we find out that it's not what they said it was.


     >> KALINDA RAINA:  I'll just respond.  I think one of the challenges here is educating the public on what’s actually happening with data, and everyone getting a little more sophisticated, both within companies and externally, as well.


     But I would say that I think companies are beginning to really focus, not just the large companies, but the smaller companies, as well.  Those of us on the stage are all subject to GDPR; we've been thinking about and doing this work because we are international companies subject to international laws.  But if we are to get comprehensive federal legislation here in the U.S., which all of us are very much excited about, quite honestly, I think it could make a real shift in how smaller companies and some of the others are held accountable, and actually have to show that they are truly doing what they say they are doing.


     >> JANE HORVATH:  Yeah.  That’s actually a separate compliance obligation under GDPR: you have to comply with the law, but you also have to have the accountability systems to show that you're complying with the law.  So, that comes naturally, and companies should be accountable for the statements they make.


     >> KEITH ENRIGHT:  Yeah.  Echoing that and adding one layer of complexity.  I would say first, strongly agree: companies should be held to account, not only by regulators and law enforcement, but by users.  If a company is building a brand around privacy and making promises that privacy is their mission, that is what they're doing, then I encourage users to become as sophisticated as they can be and work with companies that are being honest with them.


     Now, one of the interesting challenges here is that there is a temptation, I think, among less sophisticated policymakers to put the onus back on users, to suggest transparency is the solution to all of this: just explain everything that you are doing, and then force users to make decisions about every single data-processing activity.  That’s not the solution, for a whole bunch of reasons.  Companies should be held to account.  We should be responsible for making reasonable decisions.  And we are inevitably going to have to make decisions about where we draw the lines and what the defaults are, and we need to do that in a way that optimizes for user trust.  And we need to operate in a legal and regulatory environment that gives us the security and the flexibility to be able to do that in a responsible way.


But it is very, very difficult, because, especially as consent is increasingly held up as the gold standard, consent needs to be informed, and informed consent requires that the user process a ton of information to make a decision.  And it puts companies in a very difficult position.  If you have a complicated product suite, how do we determine exactly what the right level of information is?  How do we deliver it as efficiently and effectively as possible?


     I talked about the fact that we have privacy UX researchers that are specifically focused on this problem every day.  How do we communicate as efficiently and effectively with the users as we can to allow them to manipulate settings to put them into a state that they actually want to be in and that they really understand?  I think that’s another one of the big challenges for tomorrow, as systems become more complicated and people use more and more digital products.  We need to figure out some way to get that balance right. 


     >> MONICA HATHAWAY:  Looking forward to what is to come for that. 


     >> DOMINIQUE SHELTON LEIPZIG:  Okay.  On this side of the room, I think we have time for one more question, possibly two. 


     >> Audience:  Hi.  My name is Shreib, coming from Norway.  My question is for Keith.  You claim that privacy is one of your fundamentals, while Google has been pretty notorious for falling afoul of GDPR in the past several years.  And you just got another fine a couple weeks ago for violating the right to be forgotten.  Please help me understand.  It's been over four years.  What is the blocker?  What is actually the struggle you are having to comply with GDPR?


     And a follow‑up question.  Everybody talks about the need for federal legislation in the U.S.  What’s the problem?  Are the public representatives having a hard time defining these values?  And what is big tech doing to actually help introduce such regulation?


     >> DOMINIQUE SHELTON LEIPZIG:  So, we have 36 seconds.  On the federal legislation, I will just say the main sticking point appears to be the issue of whether mandatory arbitration is appropriate and can be included in the bill.  Senator Cantwell has one opinion, and Senator Wicker has another.  I just want to point that out.


     >> KEITH ENRIGHT:  On the GDPR question and the right to be forgotten question, two things.  One, I would hold up our compliance with the right to be forgotten as the high watermark for compliance by a multinational with this type of requirement.  We received the adverse ruling in the CJEU case that was later codified in law in the GDPR.  We have incredibly positive relationships with most regulators across Europe.  We have developed incredibly high-functioning processes, and we are processing many of these removal requests.  There are inevitably going to be instances where we analytically disagree with the outcome.  And we've been engaging very constructively with regulators on thousands of occasions to drive outcomes that appropriately balance rights and freedoms under European law and comply.


     When you're operating at the scale we're at, you are inevitably going to have some instances where a court or an individual disagrees with the outcome.  We encourage the courts to scrutinize that.  That’s how we get smarter.  That’s how we get better.  That’s how the process improves: by being educated when they think we've gotten it wrong.  So, I look forward to more of that dialogue with regulators.


     I would also say, as to GDPR generally, our GDPR compliance program is widely recognized as one of the most sophisticated, rigorous, and high-performing in the world.  Again, at the scale we are at, regulators are inevitably going to look at Google when they are thinking about how to move their policy agenda globally.  We are going to be a primary enforcement target, along with some of my colleagues on the stage.  Again, we welcome that.  This is how we get better, this is how the law improves, and this is how we learn how to best protect our users and comply with the law around the world.  So, we welcome the enforcement, and I'm quite proud of and pleased with the way it’s played out to date.


     >> DOMINIQUE SHELTON LEIPZIG:  Thank you.  And I think we're going to end there.  We are at time.  I want to thank the audience for all of your questions, and thank the panelists for this invigorating discussion about privacy from the top.

Dominique Shelton Leipzig


Partner, Cybersecurity & Data Privacy Leader, Global Data Innovation and Ad Tech Privacy & Data Management practices, Mayer Brown

Keith Enright


Chief Privacy Officer, Google

Jane Horvath


Chief Privacy Officer, Apple Inc.

Kalinda Raina


VP, Chief Privacy Officer, LinkedIn



