Podcast Transcript
Introduction:
You're listening to the RSA Conference podcast, where the world talks security.
Kacy Zurkus:
Hello listeners, and welcome to this edition of our RSAC 365 podcast series. Thank you so much for tuning in. We have a great podcast lined up for you today on the topic of exploring the idea of privacy as it relates to informed consent and consumer data with my guest, Anne Zimmerman. Anne is a lawyer and the founder of modernbioethics.com, where she blogs and engages in bioethics projects. She serves as editor-in-chief of Columbia University's peer-reviewed bioethics journal, Voices in Bioethics, chair of the New York City Bar Association's Bioethical Issues Committee, and co-chair and co-founder of the Bioethics Forum at the Collaborative for Palliative Care. I am Kacy Zurkus, content strategist at RSA Conference, and here at RSA Conference, we host podcasts twice a month, and I encourage you to subscribe, rate, and review us on your preferred podcast app so you can be notified when new tracks are posted. And now I would love to dive into today's topic with Anne. Anne, thank you so much for joining us.
Anne Zimmerman:
Thank you for having me.
Kacy Zurkus:
I'm thrilled to have you here with us today, and I feel like this is a very important topic on privacy. In a recent blog post on Modern Bioethics, you wrote, quote, informed consent is more valuable in traditional medical care or medical research than in engagement with big data. Yet consent is the operational tool behind widespread data collection and the prevailing framework for big data. Consent opens the floodgates to permissible data uses and leads to vulnerability to both non-consensual and unexpected uses. Can you talk about this very real issue of what actual informed consent is as it relates to clinical care, and what it is not as it relates to big data?
Anne Zimmerman:
Sure. Informed consent really sounds empowering, but its ability to empower depends on the situation. In medical care and clinical research, it supports autonomy and decisions affecting a person's own body. Even then, informed consent is not without problems. It's not a perfect system, and it needs the support of other requirements and other laws. But consent is one of the tools that can really prevent unwanted treatment or delineate where the permission ends. So it's sort of a guarantee plus an acknowledgement of known side effects and risks. If I were to say, operate on my left arm only, and I've signed the consent, and they operate on the right arm, that's a huge problem, and malpractice law would step in and cover that issue. So consent really helps draw a line, and it's a way of agreeing with the doctor about what is to be done.
Anne Zimmerman:
So in the data realm, informed consent isn't doing the same job of drawing a line. The data world is bigger than the OR, and data lasts forever. Informed consent at the point of data collection is really for convenience, for the use of electronic medical records, and for data sharing with insurers. But the data then goes toward building databases of anything from symptoms to demographics. Consent may serve to protect the entity collecting or selling the data from liability, in a way, and it's not really relevant to the data once it's de-identified, because you can share and use that data without consent. So people automatically kind of have to assume the risk of re-identification, and really, once de-identified, the data can sometimes take on a life of its own, with fewer restrictions than people realize. So informed consent in data is more like letting go of data at the time of collection without that much control over its use.
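[To make the re-identification risk Anne describes concrete, here is a minimal, hypothetical sketch of a classic linkage attack, in which "de-identified" records are matched against a public list through quasi-identifiers such as ZIP code, birth date, and sex. Every name, record, and field name below is invented for illustration.]

    # Hypothetical "de-identified" health records: direct identifiers removed,
    # but quasi-identifiers remain.
    deidentified_health = [
        {"zip": "10027", "birth_date": "1975-03-02", "sex": "F", "diagnosis": "type 2 diabetes"},
        {"zip": "10027", "birth_date": "1988-11-19", "sex": "M", "diagnosis": "asthma"},
    ]

    # A public dataset that carries names alongside the same quasi-identifiers
    # (for example, a voter roll or a purchased marketing list).
    public_records = [
        {"name": "Jane Doe", "zip": "10027", "birth_date": "1975-03-02", "sex": "F"},
    ]

    QUASI_IDENTIFIERS = ("zip", "birth_date", "sex")

    def reidentify(health_rows, public_rows):
        """Return (name, diagnosis) pairs where the quasi-identifiers match uniquely."""
        matches = []
        for h in health_rows:
            key = tuple(h[q] for q in QUASI_IDENTIFIERS)
            candidates = [p for p in public_rows
                          if tuple(p[q] for q in QUASI_IDENTIFIERS) == key]
            if len(candidates) == 1:  # a unique match re-identifies the record
                matches.append((candidates[0]["name"], h["diagnosis"]))
        return matches

    print(reidentify(deidentified_health, public_records))
    # [('Jane Doe', 'type 2 diabetes')] -- and consent was never part of the join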
Anne Zimmerman:
And it's really unlikely, especially in the medical care community, that someone signing off on data use, collection, or storage, who is also really busy seeking medical diagnosis or treatment, is concentrating on the data aspect. They're really signing off in exchange for care. So consent to share the data is just added to the balance of who assumes which risk, and the person seeking healthcare isn't getting something additional that they want in exchange for that data, but they are assuming risks like leaks or hacking. Beyond healthcare, informed consent is also problematic as a primary framework for consumer data. Under the newer laws, like the GDPR, consent is just one way to legally use data. There are other legitimate purposes for which companies can use data without permission, but it shifts the burden and responsibility to people. So as a framework, some of the burden gets shifted from the company or the government to people who might not really feel that they have a true choice. So while there have been small steps toward empowerment, it's not clear that people really feel empowered at all over their data.
Kacy Zurkus:
Yeah. And you know, speaking for myself, I can say I certainly don't feel empowered over my data. And I also sort of feel at this point the cat is out of the bag and we've given that informed consent for use across the board without really understanding what that means. So Anne, one thing I wanted to talk to you about also is I know that you're really passionate about this facade we have of informed consent being voluntary. Can you talk to our listeners about the charade of consent as a quid pro quo or a condition of participation?
Anne Zimmerman:
Sure. To me, consent is not really voluntary. It is always in exchange for something: access, or services, or participation. I don't run around giving out my Social Security number or my email address, my height, my weight, my Fitbit data, or even my Google search history. It's not in my interest to give those things freely and voluntarily. But basically, I could be barred from things for not giving it. I could be barred from something as important as diagnostic radiology or medical care in general, or from using some website or an app, if I don't consent to sharing a certain amount of personal data. So if I were worried about the possibility of it being hacked or leaked, I can't really act on that worry. I can't protect myself. So to me, it's a rule that acts more like "either consent or go home," and that really is often the rule.
Anne Zimmerman:
It's not a true test of voluntariness. Most people would not forgo a necessity like medical care for the sake of privacy. So when we are told to forgo privacy for medical care, we really just do it, and we hope that the data is being kept secure. The same is really true for websites. You might click on permissions, but it's not because you want them to have your data; it's just because you want access to the app, the search engine, the website, or whatever it is. And in an online purchase, the same is true. The data is kind of icing on the cake for the company. It's a way of creating a database, reselling data, and carrying out marketing strategies. So basically, I give you both money and data, and you give me the item I'm buying. The data is sort of seen as in exchange for the privilege of using the convenient platform, but it's an additional cost to me and an additional benefit for the company.
Anne Zimmerman:
So even something like the search engine will get that search history and be able to work with the data. And sometimes you choose to give a certain amount of personal data, like, for example, if there's an exercise app or something that is fitness-related. The data I might want to type in and share is sort of personal and is in the health realm, but it's not in the hospital situation. I might want to type in my height, my weight, my blood pressure, and it might give me something like a personalized target heart rate for a workout. But I don't have the ability to stop that data from being de-identified, shared, and aggregated. So I find that it's voluntary in that I could choose not to use the app at all, but it isn't discretionary once I want to use the app. And I think that's a big limitation on how we define voluntariness.
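[A hypothetical sketch of the kind of computation such a fitness app might run on self-reported data. The formula choice (the Karvonen target-heart-rate formula) and all field names are assumptions for illustration, as is the downstream sharing described in the comments.]

    # Karvonen formula: one common way a fitness app could turn self-reported
    # personal data into a personalized workout target.
    def target_heart_rate(age, resting_hr, intensity=0.70):
        """Target heart rate at a given workout intensity (0.0 to 1.0)."""
        max_hr = 220 - age              # rule-of-thumb estimate of maximum HR
        reserve = max_hr - resting_hr   # heart-rate reserve
        return resting_hr + intensity * reserve

    # The personalization requires the personal inputs...
    profile = {"age": 40, "resting_hr": 62, "height_cm": 170, "weight_kg": 68}
    print(round(target_heart_rate(profile["age"], profile["resting_hr"])))  # ~145 bpm

    # ...and once 'profile' is submitted, the user typically has no further say
    # in whether it is de-identified, aggregated, and shared downstream.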
Kacy Zurkus:
Yeah. And I think it's an important distinction, right? To the point that words matter, that difference between voluntary and discretionary is really critical when it comes to the collection and use of data. So if consent is inadequate protection for sensitive health data, what do you consider adequate measures that need to be taken to protect our health data?
Anne Zimmerman:
Part of what we need is a much bigger picture. We are not just trying to protect our data alone; we're protecting societal goods. One of those societal goods is privacy, especially in the sense of freedom from government intrusion or from certain other intrusions. And then we are also kind of protecting the ability to not be defined or constrained by our data in a way that can seem dehumanizing. So we have reasons that are broader societal protections that we want. I think what's happening is that we're starting to fall into sort of a surveillance state, and we're becoming very comfortable with that. Generally, people have not consented to the totality of data combined from multiple sources, but because we can be described quite accurately, genetically, medically, and even psychologically, by our data, there are important steps to take, like shoring up limits on profiling by government entities and on subpoena power and admissible evidence. Health details can be derived from things like cameras that estimate more than you think, like mood and weight, and facial recognition technology can create a faceprint that can be used to detect genetic information.
Anne Zimmerman:
So we're not ever going to require a search warrant for something that exists in the public realm, but we need to evaluate constitutional protections for times when our information or our data could be used against us; otherwise we could really start to tolerate a lot of privacy violations against bystanders. Law enforcement and government agencies can purchase data, some identifiable, some de-identified, just like companies do, and that's kind of both good and bad. And I just think we want to look out for our interests in preventing political and social profiling without any special cause. And then another step that is more at the point of collection: I think we should stop collecting excess data about children for school medical forms. Schools should be limited to the few data points that they need to know.
Anne Zimmerman:
So FERPA kind of covers who at the school can see it and that type of thing, but it doesn't really limit collection. And something like a less medicalized parenting style could end up leading to charges of abuse or neglect, because you're putting information out there. It could be leaked or hacked, but it also could be subject to subpoena power. So parents are sort of forced to trust cybersecurity when they might not want to, and the risks really go to the child, whether it's an embarrassing condition or something that subjects them later somehow to cyberbullying or even to future extortion. These things, it's not really voluntary. The school medical form isn't quite voluntary, because not everyone's going to say, oh, I'll homeschool so I don't have to fill out the form, right? So children's data is all over the internet. And I think it's important to see it's not exactly about privacy; it's about who has power over the data.
Anne Zimmerman:
We see a similar problem with foreign adversaries. This is an issue that affects both economic advantage and national security. We have a huge customer base with known health problems and identified vulnerabilities, and adversaries can kind of create a system of dependence and use it as a bargaining chip. It'd be great if a foreign company or country cured a given disease and we could benefit from that. But what if the price is a national security compromise? Data really is power, and we should care about who holds that power. Data could even be used for something terrible criminally, like genocide. So the idea that people consented to share the data just becomes so beside the point and ancillary. We don't know how to pick up the pieces in the future if the data really is abused and in the wrong hands. So I think it might be a better approach to really disincentivize some of the data collection and to devalue data.
Anne Zimmerman:
So that companies and hackers don't even want it. Sometimes the data might naturally become no longer relevant; credit cards, for example, could become obsolete, and then stealing a credit card number would be meaningless. But we can control other things in a similar way, like if we were to regulate pharmaceutical advertising differently. Sensitive and de-identified health data wouldn't be as valuable in the corporate world, but it could continue to be used for medical research.
Anne Zimmerman:
So a hacker might not want it as much, and similarly, a company legitimately using it for marketing might not want it as much. Discrimination laws also help a lot with how data could be used against someone, but we might need to fine-tune some of those laws. For example, something like genetic data from a faceprint isn't necessarily covered by some of the genetic discrimination laws. We do have a lot of disability laws, but how they work when everything's out on the table is somewhat unknown. If we get to a point where every time you go to an interview, or you go to look for insurance, the company has your entire genetic code, I'm not sure that right now the law is strong enough to kind of deal with that.
Kacy Zurkus:
You've certainly given us a lot to think about in so many different realms of the way that data is used, collected, and can potentially be abused. I want to go back to the statement that you made about consent becoming almost ancillary and beside the point, right? So as a follow-up to that: if consent is really irrelevant to most of these issues that you pointed out, the ones that matter most to people in society, vulnerability to foreign adversaries, ransomware, hacking, cybercrimes, you know, these are all non-consensual things. When is it that consent actually does matter as it relates to our data?
Anne Zimmerman:
I think consent is a really good minimum in the data sphere, but it is the ability to withhold consent that would make the consent meaningful. GDPR and CCPA kind of try to do some of that, to say this use is okay with me, this use is not okay with me, and I can click a few things; that gives a little bit of control. I think the problem is the insincerity. It's not realistic to think that most people click through and deliberately agree to certain uses. I think people see things like a banner about cookies simply as a distraction they need to exit out of their screen, and the easiest way is to accept. So in the hospital arena, people give the okay to use their data for medical research purposes.
Anne Zimmerman:
I think that is the best use of consent, because the data is being used for something that they really approve of. They aren't necessarily aware that, once de-identified, that data would be aggregated and sold, but in general, I think consent does operate as a good springboard. Data use laws, though, should regulate companies rather than placing too much responsibility on the individual consenting.
Kacy Zurkus:
So one sound argument around consumer data and privacy that is gaining traction is this idea of fairness. The very people whose data is being used for profit are not compensated for the use of their data. It's something you wrote about in another piece, Barcode Me. And you said, quote, whether I am barcoded or represented by a QR code or even a shape or algorithm, there are always ways to compensate me. I prefer Venmo, but PayPal is fine too. I love that. But can you talk a little bit more about the idea of privacy in the big data landscape in the context of our personal data?
Anne Zimmerman:
Yeah. I would hate to see privacy used as an excuse for not being able to trace the data back to me for payment. If someone's going to profit from my data, I think it might as well be me. As an added note, some people really want to be traceable to ensure that they will be alerted to medical research that affects them in particular or has to do with their own disease or condition. So privacy and confidentiality are not everyone's first priority, and that's not even the priority of a lot of privacy laws. I mean, laws like HIPAA are about enabling data sharing and making things easier for electronic medical records and insurers.
Anne Zimmerman:
So I think that we just need to recognize that privacy isn't the only thing that matters. I'd say there's an argument that some data really is used in research that benefits everyone or is sort of a social good, as some medical research is. But I suspect many people would like the cash as well, especially if a commercial purpose is involved down the line. So there are lots of ideas about how to compensate, maybe using blockchain or other technology, but I think things like a healthcare token or a special currency that compensates for data and has to be used in healthcare, or somehow incentivizes health, could also work instead of cash. The problem really is that right now people receive nothing for their data.
Kacy Zurkus:
So that leads me to think about whether the issue is one of privacy, or, going back to that question of fairness, that it's not fair that others are benefiting financially from the data collected from individuals, really unknowingly, or because they have to agree to something out of convenience. So are we talking about an issue of privacy or of fairness in compensation?
Anne Zimmerman:
I don't think we should see the two as mutually exclusive. To me, privacy is sort of an artificial excuse. If someone says, I can't trace this back to you to pay you, I sense that they don't want to pay me. Data is worth billions. The global market for big data is about $130 billion now and is expected to grow to about $240 billion by 2026. So people who trade and work in the aggregated data world are getting a windfall. I think that we can make efforts to preserve privacy and still have some fairness.
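[For scale, a back-of-the-envelope calculation of the annual growth rate implied by those figures; the four-year horizon is an assumption, taking "now" to mean roughly 2022.]

    # Implied compound annual growth from ~$130B "now" to ~$240B by 2026.
    years = 4  # assumption: "now" is roughly 2022
    cagr = (240 / 130) ** (1 / years) - 1
    print(f"{cagr:.1%}")  # ~16.6% per year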
Kacy Zurkus:
I like that. Yeah. Yeah. I mean a fair world would be a wonderful thing, right? You know, through the eyes of a child. When things aren't fair, it's very upsetting yet through the eyes of an adult, we often say, well, the world's not fair. I appreciate that perspective on how to maybe make things a little more fair. Anne, thank you so much for joining us today. Listeners, thank you for tuning in. To find products and solutions related to privacy, we invite you to visit rsaconference.com/marketplace. Here you'll find an entire ecosystem of cybersecurity vendors and service providers who can assist you with your specific needs. Please keep the conversation going on your social channels, using the hashtag RSAC, and be sure to visit rsaconference.com for new content posted year-round. Anne, thanks again for joining us.
Anne Zimmerman:
Thanks so much for having me, Kacy.