We are in a crisis of trust and truth, one of our own making. Every day brings new examples and uncertainties: vaccines, elections, “fake news”, and on it goes. The slope of discord is getting exponentially steeper and the price higher across society. What role do government, the private sector, and each of us play in creating information disorder? Join Hugh Thompson and esteemed guests from the Aspen Institute’s Commission on Information Disorder to delve into one of the most important problems of our time. Don’t miss this compelling close to RSAC 2022.
Video Transcript
>> ANNOUNCER: Please welcome Program Committee Chair, RSA Conference, Hugh Thompson.
>> HUGH THOMPSON: Hey, everybody. Great to see you. Welcome to the last session of RSA Conference 2022. How was the conference? Good? And give it up for DJ Shiftee. That is impressive. That is impressive.
Well, thank you all for being here. Thanks for such an amazing week, such a great, I don't know, just community feel and fellowship over this week. So, I really, really appreciate it.
And we have a fantastic show for you today. Our panelists will be out in just a little while, and I'll introduce them, of course. But I wanted to just set the stage a little bit for what we're going to talk about. I think it's one of the most important issues of our time, and it's information disorder. How do we know that the information that we're getting, the information that we're acting on, is it real? Who wrote it? Is it based in fact? It's a challenge that all of us face every day.
I'll give you an example from this morning. So, I woke up and I looked at my phone, which is the first thing I do every time I wake up. And there's an alert on there. How many people here use Ring? The doorbell with the video thing. Okay – wow, applause for Ring. Okay, that's surprising. Okay, but – I like it. I like it. I also like it.
So, there's an alert that comes up, and there's like this community feed in there that tells you about things that are happening in your neighborhood. And the alert was, "Possible Mountain Lion?" Question mark?
So, I'm like, wow. All right, we've got five little kids, and you know. So, I immediately opened the thing up, and there's this incredibly grainy footage of something moving at a far distance away, right? And I look and there's already thirty-six comments on it. And the first comment was, “It's a mountain lion,” right? Like confirming that this is a mountain lion.
The second comment was, “It looks like a squirrel.”
Third comment, “I think it's actually a small child,” right? And this goes on and on and on.
And then finally, a veterinarian – or a claimed veterinarian, I can't validate this – in the neighborhood, apparently, says, judging by the gait of this, you know, creature or this shadow or this person, that it's actually a raccoon, right? And then suddenly everybody calmed down and you know, they think it's not a mountain lion.
Now, I bring this up because this is the kind of stuff we've got to deal with now every day. Is the information that's fed to us based in fact? Is it real?
And I wanted to start off by sharing a personal security – and, I guess, misinformation – story with you. So, as many of you who have been coming to this conference for many years know, I've got five young kids. And I was surprised by my wife, who's amazing, about a month ago, the day before my birthday. And she said, I've got something great for us to do tomorrow. It's going to be amazing. I got us seats for the San Francisco Giants baseball game.
Any San Francisco Giants fans in here? A fan, two fans, okay. All right, this is not good – we're actually in the city.
So, I'm like, oh, wow, that's fantastic. And my first question was, who's going to watch the kids? And then her response was slightly alarming, which is, no, they're coming with us, right?
And bear in mind, all five of these kids are under twelve, right? So, you know, I'm thinking in the back of my mind, you know, these games are pretty long. I don't know how the kids are going to hold up.
And so, regardless, the next day, we pile into the minivan, get everybody packed in. We drive down to San Francisco. And I don't know if you've ever seen the Giants' stadium, but here's a photo. And this is the family as we're kind of getting ready to go. We've got our Giants gear on, you know. We're ready, ready to take it in.
Now, when my wife said she got good seats, you know, the definition of good is variable, right? I think it really depends on the person. And so, we walk in and show the usher, you know, the seat numbers, and they keep walking us down, lower and lower and lower and lower.
And then, finally, we get to our seats, and it is right behind home plate, right? Yeah, right? Now – now, those would be good seats if it was me and my wife.
Let's go to the next picture.
What I did not know and recognize immediately was that these seats are shown and highlighted every single time a batter comes up to bat. And if you are not familiar with the game of baseball, this happens very often. This is like 60% of the game, right?
And so, you see me here just trying to get the kids into it. Like, look, guys, look. He's throwing the ball and he might hit it, and you know.
Now, let's go to the next photo.
Things started to get a little bit unruly. And in fact, you know, they were yelling out things to the – advice, actually, to the batter. And we were really close, so, you know, at one point, some guy actually turned around. And so, I'm just trying to, you know, shh, shh, guys, you know, you've got to – okay. Let's get to the next picture.
By the seventh inning, I'm trying to think about, how do I contain this for the rest of the game? And so, you know, my daughter, Zoe, who's one of our six-year-old twins, she's like giving me a hug. I'm like, okay, great, you know, we can just relax for the rest.
And then something interesting started to happen. Let's go to the next one.
It began with a peace sign, which is not bad in and of itself, you know, not incredibly distracting, and you know, just good intentioned, right? Let's go to the next one.
So, she had been studying these batters as well as the coaches and was now doing signs of, you know, like what the folks in the outfield should be doing, right? So, she's doing this. Let's go to the next one.
And this is when security came. If you're not familiar with this gesture in baseball, it is the umpire, the guy that's standing behind the plate, judging that the ball that's thrown is a strike, right? So, she's now sort of casting judgment on the ball for the audience.
And you'll notice in this photo that it's kind of credible in the sense that the ball on the lower left is just passing the plate, right? So, she's trying to be accurate in her call.
And so, security shows up, I want to say forty-five seconds after this, and said, look, we were hoping that this would subside over the last five minutes, but we've gotten complaints from the network that people are getting both distracted and confused, as this was broadcast on national television, right?
And so, let's go to the next slide.
We then removed the three younger children, so the two six-year-olds and the four-year-old. My wife's, like, taking them back somewhere. I'm stressed out. You can probably see me here texting my wife on the phone.
And interestingly, concerningly, as soon as they left, the big kids took over, right? And you can see them in the front, and they're waving, and one of them is doing a hat sign. And then we had to just get out of there.
So, this – this was our trip.
But I bring this up mainly as emotional therapy, but also to tell you that it is so confusing today to distill what's right and what isn't, what's a legitimate sign, what's a legitimate signal, and what isn't.
We have an incredible group of guests today, an unbelievable group. And let me first tell you how this group was formed.
So, it was formed by Aspen Digital. And it was formed through the funding of Craig Newmark. Many of you know him from Craigslist. Craig from Craigslist. Anybody ever use Craigslist? I believe Craig is here.
And it was Craig Newmark Philanthropies that funded them to go out and study this problem of information disorder. And they brought together a very eclectic group of folks as part of the commission, and we have the three co-chairs here with us today.
And I'd like to introduce our first guest. He is the Founding Director of the Cybersecurity and Infrastructure Security Agency, CISA, Senior Newmark Fellow in Cybersecurity Policy for Aspen Digital, and cofounder of the Krebs Stamos Group. Please welcome Chris Krebs.
Chris.
>> CHRIS KREBS: Hugh.
>> HUGH THOMPSON: Good to see you back. Have a seat, have a seat. All right, thanks for coming.
>> CHRIS KREBS: Thanks for having me. It's been a couple years since I've been up here.
>> HUGH THOMPSON: Yeah, no, it's good to see you, man.
So, first, a couple of questions. So many people today get their news from social media, right? I mean, it's just a reality, right? And it's an unfortunate one.
And what I had heard is that when you were Director of CISA, with all the might and information capabilities that CISA has, even you were informed of major news through social media. This is what I've heard, man.
>> CHRIS KREBS: Are you referring to my termination by tweet? Is that what it was?
>> HUGH THOMPSON: I am, I am, I am, I am.
>> CHRIS KREBS: Oddly enough, like the three –
>> HUGH THOMPSON: And that was not fake news, apparently, yeah.
>> CHRIS KREBS: The three seminal moments, I feel like, of my time at CISA all surrounded information delivered to me via Twitter.
>> HUGH THOMPSON: Oh, boy. This – I've got to take a drink for this. Okay.
>> CHRIS KREBS: So, the first one was in May of 2017. And I got a call from Kirstjen Nielsen, who was the Chief of Staff at DHS at the time. And she's pulling into the White House. She calls me up and she is like, hey, there's something going on in the world. Can you go and check it out?
I'm like, I have no idea what you're talking about.
I pull up Twitter – WannaCry. Okay. Then, a month later, I'm at the ear, nose, and throat doctor and two of my kids are about to get tubes. And it's early in the morning. I'm just kind of flipping through Twitter. And I'm like, huh, a railway in Ukraine is offline and a bank in Russia is offline. What's going on? NotPetya.
>> HUGH THOMPSON: NotPetya.
>> CHRIS KREBS: The last day of my employment at CISA.
I walk into the house and somebody texts me and says, hey, you need to check out Twitter. I think you just got fired.
>> HUGH THOMPSON: Unbelievable, that was –
>> CHRIS KREBS: And lo and behold, I pull up Twitter. It’s 7:07 PM, November 17, 2020 – not like that's seared into my brain.
>> HUGH THOMPSON: Right, no, it’s a vague memory.
>> CHRIS KREBS: And I'm sitting there in the kitchen and I'm like – I turn to my wife and I go, I just got fired by – by tweet.
>> HUGH THOMPSON: This is crazy.
>> CHRIS KREBS: But like the best part about it was, like, all of my kids – five kids, like you – are around the island, and they just all kind of snap and look at me and then start tearing around the house going, Daddy got fired, Daddy got fired.
>> HUGH THOMPSON: Oh my god. The support – just the support you were looking for.
>> CHRIS KREBS: I mean, it was like a pressure release valve went off. But yeah, five kids, you know that, running around the house screaming.
>> HUGH THOMPSON: I know it very well. Well, Chris, first let me say, I can't thank you enough for your service. The work that you did at CISA and the foundation that you led was incredible, incredible.
>> CHRIS KREBS: Thank you.
We were at a dinner last night and I introduced myself as, "Hi, I'm Chris Krebs. I was Jen Easterly before it was cool."
>> HUGH THOMPSON: That’s right, you did say that.
>> CHRIS KREBS: And you know, that’s – she's done a fantastic job with CISA and really proud to see where the agency's going.
>> HUGH THOMPSON: And I think, because I was at that same dinner, I think the response from someone else in the crowd was just admiration of your hair. That's what they really said. I'm not sure where the connection is, but, you know, again, amazing work that you did there.
And now you have been involved in this Commission on Information Disorder, understanding, you know, what's at the root. What is information disorder, as you guys define it?
>> CHRIS KREBS: Yeah, so, kind of the context for the commission was, I felt as I was coming out of the job that there were two strategic areas kind of left undone. You know, in 2020, if I'd had more time in the job, there were two things I really wanted to tackle.
First was a more strategic response to ransomware. I didn't feel that the government was doing enough, and we could have had a more strategic approach across, you know, reducing vulnerabilities, engaging in the cryptocurrency market, and then taking the fight to the adversary. And we all saw where that went last year.
But the second thing was a more strategic response to disinformation. We had kind of taken an ad-hoc approach to the 2018 and 2020 elections. It wasn't really clear who had leadership. So, that was one of the things I wanted to do.
And then the team at Aspen approached me and said, hey, you know, we're going to put this group together. Craig's behind it and funding it and we want to go study this problem, give some real actionable, near-term and mid-term recommendations.
And like, it took me all of five seconds to decide.
So, we got a great group together. We launched it in April of 2020 – 2021, rather. And you know, first, we had to kind of define the problem, because everybody had approached us like, oh, it's disinformation – a term which seems to have kind of lost any meaning. So, as we were doing the initial research, one of the kind of foundational documents was a report by Professor Claire Wardle from 2016-17, and she defines information disorder as a framework – and it's not like we don't love frameworks here in the infosec community.
So, the idea here is you want to kind of bucketize the types of activities and risks that we're facing. So, information disorder, at its core, is three problems: disinformation, which is false information that is spread for malicious purposes; misinformation, which is false information spread, you know, inadvertently, without bad intentions; and then malinformation, which is real information spread for harm. Hack-and-leak campaigns are a great example of malinformation.
And then within all of that, there are a set of factors that are driving this seeming implosion: the increase of national news on TV and the ability to share low-quality information; the erosion of local media; the erosion of trust in science and evidence and expertise; computational amplification, so bots that boost – boost the signal. So, there's a series of factors.
So, we wanted to look at these issues and really get to the root cause of why we are where we are and what are some of those recommendations. And I think we came out with – with a really solid group of recommendations across increasing trust, increasing transparency, and reducing harms. And it's great to see that some of the recommendations have been picked up. Some haven't. And you know, it's a rocky road and you know, we're not going to fix this problem tomorrow.
>> HUGH THOMPSON: It was an amazing report. And one of the things that struck me right away were the folks that were involved in this commission. It was incredibly diverse and eclectic. Like, you had people from every walk of life, including a prince, if I remember correctly – Prince Harry.
>> CHRIS KREBS: Yes, so, we had – we had some folks that have been on this stage before, Alex Stamos, my business partner, Herb Lin from Stanford. We had a former Congressman in Will Hurd. Some really thoughtful academics and researchers, Safiya Noble from UCLA, Kate Starbird from the University of Washington. She's brilliant in this space. Jamil Jaffer, Deb Roy, OG Twitter team. And then yes, we had royalty. We had royalty, Prince Harry. And I tell you what, I just couldn't do it. I couldn't call him Prince Harry.
>> HUGH THOMPSON: You couldn't do it? Why? Why, that’s –
>> CHRIS KREBS: Because we, you know –
>> HUGH THOMPSON: I'm from the Bahamas. We're in the Commonwealth. How dare you not –
All right, fine. Anyway –
>> CHRIS KREBS: We made that statement two hundred-some-odd years ago, so.
>> HUGH THOMPSON: Okay, all right.
So, well, let me ask you this. How have you seen this disinformation, misinformation, malinformation combined with other types of things? Like, you know, you look at what's going on in Ukraine, for example.
>> CHRIS KREBS: So, tactically, you know, I would have expected the Russians to come into Ukraine and take out any sort of telecommunications, the ability to command and control and engage with lines of communication. And they just didn't. And we'll look at that for a while and try to figure it out.
But what that did was it opened up space for the Ukrainians to completely dominate the information space. You know, you think about the early days – the Ghost of Kyiv, right, the fighter pilot that became an ace.
The Babushka de la Muerte, right, the grandmother that walked up to the Russian soldiers and said, sunflower seeds, put them in your pocket, you're going to die here.
Those sorts of stories allowed them to really seize the narrative and win immediate global sympathy and support while the Russians were backtracking.
And at the same time, you saw the US government declassify on a rapid basis, in a matter of hours sometimes, to expose potential disinformation campaigns and operations.
>> HUGH THOMPSON: Which was incredible. I mean, what an amazing progression.
>> CHRIS KREBS: Yeah, I mean, it went from months and months, when I started, to twenty-seven hours at one point in the Trump administration, now a couple hours. It's just – it's a brilliant counterintelligence and counter-disinformation tool.
>> HUGH THOMPSON: Right, you know, another thing that I just found fascinating in the report – there were some, to me, very unexpected conclusions that made sense once you went through it, about the populations that are disproportionately affected by disinformation.
And I'd love to welcome into the conversation a fellow co-chair of yours, Rashad Robinson. And Rashad is incredibly accomplished – president of Color of Change, a leading racial justice organization with more than 7 million members. Ladies and gentlemen, please welcome Rashad Robinson.
>> CHRIS KREBS: Hey, buddy.
>> HUGH THOMPSON: Rashad, how's it going? Have a seat, have a seat. Thanks so much for being here.
>> RASHAD ROBINSON: It's great to be here. Hello, everyone.
>> HUGH THOMPSON: Tell me, how did you become involved in this commission? What drew you to it?
>> RASHAD ROBINSON: Well, I lead a racial justice organization, so every single day, we are working to build power for Black Americans.
Color of Change was founded in the aftermath of a flood – Hurricane Katrina – caused by bad decision makers and turned into a life-altering disaster by bad decision makers. And many people in this room probably remember those images of Black people on their roofs, begging for the government to do something and left to die.
And the thing about Katrina that really animates our work, and animates why I would get involved in something like this, is that it illustrated things that people already knew, right – geographic segregation, generational poverty, the decisions that have been made and the ways in which structural racism undergirds them, from what has happened to our climate to a whole bunch of other systems that have been harmed.
But at the heart of it, no one was nervous about disappointing Black people – not government, not corporations, not media.
So, part of what we have to do at Color of Change every single day is fight for the truth, to build power around the truth. It's a – it’s a fight that Black Americans have had to have in this country since being brought here, the truth about our very humanity, the truth about us as being fully human.
And so, what we are dealing with right now with information disorder, which impacts all other issues, is – is a fight for the truth, is to make sure that the truth actually has a fighting chance, whether it impacts public policy, whether it impacts civic life, whether it impacts all the ways in which we get to live in society.
And so, this was an opportunity to sort of bring the work that we're doing every single day to a place where people are trying to find consensus about the fundamental questions of how we move our society forward, how we live together inside of a multi-racial democracy. And if we just think about this country – we know that this impacts the world, but if we just think about the United States – a land of ancestral strangers, people who do not come from the same place and are trying to form a way forward together, we need the truth. We need a shared truth.
And so, an opportunity to be part of this commission and to do this work was why I jumped at the chance and a way to make sure that the work that we could do could be part of a greater whole and something that could hopefully be useful at sort of advancing a way forward.
>> HUGH THOMPSON: Thanks so much for dedicating your time, not just to this commission, but to Color of Change. I mean, you've made a huge difference in the world.
And I'm curious, as you went through the process of studying this problem, and with all the resources that the commission had, is there anything that surprised you? Is there anything that stood out to you?
>> RASHAD ROBINSON: Well, there's a couple of things that constantly surprised me about this that I think were animated through the work together. And I have to just also say what a pleasure it was to work with Chris and Katie – folks who I did not know and who come from different backgrounds and perspectives, but were focused on us trying to figure out something that we could all put our names on collectively together.
And what I'll say is a couple of things. One was how some people who I know believe in risk assessments and security assessments and food inspectors somehow don't believe in those same principles when it comes to our tech platforms – somehow don't believe in accountability and regulation and sort of rules when it comes to the technology that is supposed to take us into the future, and would allow it to sort of drag us into the past.
The second thing is, you know, this way in which the internet and the risk of information disorder, the risk of untruth, has shifted, right? We used to be in this place of goodwill being exploited, you know. Like, you know, for me personally, of course, you know, if a foreign prince showed up with some resources, I might be sort of interested.
>> HUGH THOMPSON: See, he called him a prince.
>> RASHAD ROBINSON: A foreign prince to – no, I'm not talking about that prince. I'm just talking about the emails we may get sometimes, you know, the emails you may get with a prince that has money if you just help them out.
The exploitation of goodwill, the exploitation of charities that might not exist – that was sort of the way in which people really were harmed on the internet at one point, right?
>> HUGH THOMPSON: Absolutely.
>> RASHAD ROBINSON: And now so much of this is about the exploitation of bad will. How – how the levers are being used to exploit the very worst of us and how much race is a part of that, right?
And so, as we think about even some of the big debates that we are facing in this country – whether or not we even teach Black history, American history, in our schools – we have to think about these questions as security questions, right? If Russia kind of knew more about Black people than some of the security experts inside of the big tech platforms, then that is something that we should all be concerned about.
And so, some of the recommendations in here – the ones particularly about diversity inside of tech platforms – should also be questions about diversity for a room like this, right? This is not just a question of diversity or inclusion as charity or the moral thing to do; this is about diversity as the strategic thing to do. Because if we don't have the right set of experiences and the right set of eyes on the very questions about how we keep ourselves safe – if the folks that want to destabilize us know that racism exists – then we have to recognize that racial justice isn't charity, racial justice is strategy. And we have to actually think strategically about how to deal with all of the fault lines, all of the ways in which we will be exploited as a society, if we don't actually deal with the very fundamental flaws that have sort of existed from the very beginning.
>> HUGH THOMPSON: You know, reading – reading this report, it just made so much sense to me. Like, the clarity with which it was written, the conclusions are just self-evident once you actually go through the facts. And I wonder, you know, you're talking to the world security community here, right? This is where the world comes together to talk about security.
>> RASHAD ROBINSON: I feel very safe.
>> HUGH THOMPSON: Yeah, you should feel safe. You should feel safe. Yes, this is the right room to be around to feel safe.
What recommendations would you give? I know you have some recommendations inside of the – inside of the report, but what can each of us do to try and improve the situation?
>> RASHAD ROBINSON: Well, I think each of us in our sort of own positions in this work, I think, do have to recognize the role of race and racial inequality in the spaces that we're in. And while it might sometimes lose us some friends in the short-term, in the long-term, if the north star is safety, if the north star is actually to make our society work for all of us, then we have to face the fundamental flaws and the challenges in our society, even if it may make us uncomfortable in the short-term. We – heroes don't play both sides.
>> HUGH THOMPSON: I like that.
>> RASHAD ROBINSON: It might make us rich, but it will not make us heroes.
And so, what I actually would implore this group to sort of recognize is that our society's only becoming more diverse. The challenge of us trying to find a way to make decisions collectively and move forward collectively will only become more and more difficult. And if we allow companies to simply regulate themselves, then we will put ourselves at the whim of an incentive structure that we simply cannot win – an incentive structure that will continually put us on the losing end of, you know, safety, integrity, and security over profits. And that, I think, is one thing that we all have to recognize.
And so, each of us – each of you is in a position, each of you is in a place, where you can impact the circles you're in or the roles you play.
And then the thing that I think was most important – and really very much connected to this report, and one of the things we called out very early on – was the need for leadership. And that means leadership across many sectors. This means leadership across accrediting institutions, places where you may give authority to someone to have power. And if someone is doing things that actually put us on the losing end of the truth, how do we remove that accreditation? What in our society are we doing? How are we holding media companies accountable? How are we holding tech companies accountable?
And then, of course, the role of government. Self-regulated companies are unregulated companies, and unregulated companies will always do harm to us in some way, shape, or form. And so, we have to sort of think about leadership, and leadership requires courage.
And so, for all of us in this room, we're going to look back at this moment, five, ten, fifteen years from now, and all of us will get to write, right now, the story of the character we want to be and how we move ourselves forward as a country and how we move ourselves forward as a member of the world community.
And I want everyone to ask themselves, what character do you want to be in that story? And that, I think, is an opportunity that we all have, and a particular opportunity you all have, given your expertise, your position, and the ways in which you get to move in the very spaces that you're inside of.
>> HUGH THOMPSON: Rashad, thanks so much for your contributions into this piece. Thanks so much for the contributions of your organization. You're really making a difference.
And I want to introduce the third co-chair of this group. This is someone many of you probably feel like you know very well. She was, for fifteen years, the co-anchor of NBC's Today Show. I grew up watching her. She was the first woman to solo anchor a network evening newscast, serving as the anchor and managing editor of the CBS Evening News. And in 2017, she founded Katie Couric Media.
Without any further delay, I would like to introduce the amazing Katie Couric.
>> KATIE COURIC: Hello.
>> HUGH THOMPSON: Thanks so much for being here.
>> KATIE COURIC: You're welcome.
>> HUGH THOMPSON: Thanks so much for being a part of this.
>> KATIE COURIC: Hi, everyone. Hi. It's like The Tonight Show. It’s very exciting. And I hate when people say they grew up watching me, Hugh. You know how old that makes me feel?
>> HUGH THOMPSON: I was very, very young. I was like a baby. I was a baby.
>> KATIE COURIC: No, you have to say you were in college or something, at least. Don't say you were younger.
>> HUGH THOMPSON: Apologies, yes, okay. No more time stamping.
Katie, thank you for being here.
>> KATIE COURIC: You're welcome.
>> HUGH THOMPSON: Really appreciate it. Thank you for staying here, especially after that introduction.
But it was amazing to have your voice as a part of this study, as a part of this research. And, okay, I'm not going to say I grew up watching you. So, I've watched you many times on television. And when I saw you on The Today Show, on the evening news, I listened to what you said and I just – my instinct was to accept it as truth and fact. And it's because you were representing truth. You were a part of one of the big networks. You had an army of fact checkers behind you. This is – this is how I thought about things. This is how I thought about the world.
Things have changed quite substantially now, as people are just inundated with information. But I do remember that even early on in your career, you explained the internet – the concept of the internet – to the world. I really didn't know that you were such an expert in this area very early on, you know, in the internet's formation. And I'm told that we've actually found a clip where you discussed this. Would you like to see the clip? Would you like to see the clip?
>> KATIE COURIC: I was an early adopter.
>> HUGH THOMPSON: Let's roll the clip.
>> BRYANT GUMBEL: That little mark with the A and then the ring around it?
>> ELIZABETH VARGAS: At?
>> BRYANT GUMBEL: See, that's what I said. Katie said she thought it was "about."
>> KATIE COURIC: Yeah.
>> ELIZABETH VARGAS: Oh.
>> BRYANT GUMBEL: But I have never heard it said.
>> KATIE COURIC: Or around.
>> BRYANT GUMBEL: I had always seen the mark but I have never heard it said. And then it sounded stupid when I said it, "violence at NBC."
>> KATIE COURIC: Well, it wouldn't be "around" or "about."
>> ELIZABETH VARGAS: We had a big fight in the lunchroom the other week.
>> BRYANT GUMBEL: See, there it is, violence@nbc.ge.com, I mean.
>> KATIE COURIC: Well, Alison should know.
What do you say to that, Alison?
>> BRYANT GUMBEL: What is the internet, anyway?
>> KATIE COURIC: Internet is that massive computer network, the one that's becoming really big now.
>> BRYANT GUMBEL: What do you mean? How does one not – what, do you write to it, like mail?
>> KATIE COURIC: No, a lot of people use it and communicate – I guess they can communicate with NBC writers and producers. Alison, can you explain what internet is?
>> OFFSCREEN: It's a giant computer network made up of – started from –
>> BRYANT GUMBEL: Oh, I thought you were going to tell us what this was. Look in the dictionary.
>> ELIZABETH VARGAS: It's a computer billboard.
>> OFFSCREEN: It’s not in it. It’s – it’s a computer billboard but it’s nationwide. It's several universities and everything all joined together in this.
>> ELIZABETH VARGAS: Right.
>> BRYANT GUMBEL: And others can access it.
>> ELIZABETH VARGAS: Right.
>> OFFSCREEN: And it's getting bigger and bigger all the time.
>> KATIE COURIC: Thank you.
>> HUGH THOMPSON: Katie Couric, ladies and gentlemen.
>> KATIE COURIC: I was like this – I was this savant, clearly. I mean, people see that and I always say, hey, back in 1994 – not this crowd, because I'm sure you all did – but for a lot of people, it was a relatively new concept back then. And we see, obviously, it's taken over, as you and Chris were talking about. The vast majority of people – I think it's like 90-something percent – get their news online. Now half get it through social media. So, it has – it has changed every aspect of our society in ways that clearly Bryant Gumbel and Elizabeth Vargas and I never envisioned back in 1994.
>> HUGH THOMPSON: No, but what I love about that clip though, actually, is you were a proxy for America, in a way, right? They were discovering it at the same time as you. And I love how the producer offstage says, it's like a giant billboard network.
>> KATIE COURIC: Right?
>> HUGH THOMPSON: Let me just clarify with my internet expertise.
>> KATIE COURIC: And this is when we had computers that looked like little mini fridges, you know, with the dial up, and we could actually communicate with people inside NBC through top lines. And the whole idea, you know, this was when Al Gore was calling it the information superhighway and the world wide web and – but boy, have things changed.
>> HUGH THOMPSON: Things have changed. And I wanted to get your opinion on this. I mean, you're one of the preeminent journalists of our time, right? And you have –
>> KATIE COURIC: Oh, quit it some more, here.
>> HUGH THOMPSON: You are, I mean, you actually are. Empirically, you actually are.
And if I think about, you know, like earlier, I was talking about my mountain lion problem this morning and not understanding if there was really a mountain lion on the loose, and do I have to enact our family mountain lion protocol, and you know, all of that stuff.
You have to deal with the same thing every day, right? I mean, you are also looking at social media. You're also having to make judgments of, is this real, is this not, is there something behind this? Is there – just talk to me about how the landscape has changed.
>> KATIE COURIC: Well, you know, when I got into the business, back in 1979 –
>> HUGH THOMPSON: Which was not that long ago.
>> KATIE COURIC: It was, you know, it was a much more limited landscape. There were three major networks. People would watch, you know, Walter Cronkite back in the day. But really, my generation, it was Peter Jennings and Dan Rather and Tom Brokaw.
And you know, there were major newspapers. There were many local newspapers. 2,200 local newspapers have folded since 2005. And we can talk about that in a moment. I know Chris mentioned that.
But there was a very limited pool of information. You know, I think the iPhone didn't come out till I believe 2007 or 2008. So, when you think about that. And that, I think, is when everything really exploded, because suddenly, you had everything you could possibly want in the palm of your hand.
But you know, when I was doing interviews at The Today Show or even CBS, they would be very deeply researched. Oftentimes, if I was doing an interview on The Today Show, I would overprepare by calling someone from the RAND Corporation or calling someone from a think tank like Brookings to get some background. And I was also mindful that when I got some information, I would consider, well, is that person biased? Is that somebody who looks at the economy like Paul Krugman, or somebody who looks at it from a different perspective? And kind of weighing all of that.
And so, it was much, much easier, I think, to find credible sources and to research, because there wasn't this plethora, this velocity of information.
One thing I really appreciated about being on this commission, though, is something that you mentioned – the diversity of voices. And you know, I really have enjoyed getting to know Chris and Rashad.
But you know, I sort of had the feeling those were the good old days. And they really weren't – or they were in some ways, because there was not so much disinformation, malinformation, whatever. But the voices that were controlling the media were white men. And so, there was no representation of marginalized communities. Women were just integrating broadcast news. Read my book; you can read all about the fun that was.
>> HUGH THOMPSON: It's an excellent book. Also, you need to go to Katie Couric's website and subscribe to her newsletter.
>> KATIE COURIC: Oh, thank you.
>> HUGH THOMPSON: It’s amazing. I love the newsletter.
>> KATIE COURIC: Anyway, so – shameless plugs.
So, it's interesting, because when I look at the big picture of the changing media landscape, there are a lot of positive things, you know, the democratization of media, the fact that people – people’s voices can be heard that previously were suppressed or oppressed.
And yet, it has just been taken over by, you know, bad actors and trolls and people who are just misleading people. And, you know, obviously, it's a serious enough problem that we all were anxious to be a part of this commission and to try to do something about it.
You know, my daughter, Carrie, worked for Reuters, and they had a partnership with Facebook. She was correcting misinformation for about two years during the pandemic. And the things that she would tell me that people were saying – you know, whether it was that the vaccine caused autism, or that a bunch of ships photographed off Long Beach, California had been sent by Donald Trump to rescue Americans. And you know, Carrie would have to call the shipping company, the Mayor of Long Beach. She would have to do all this extensive reporting and then rate the veracity of the claim.
But as they say, lies run around the world before the truth gets a chance to tie its shoes. So, by the time Carrie would even review and rate and say this is completely false or unreliable, it had been shared by millions of people. So, it is a really – it's a conundrum. It's a big challenge. And as Barack Obama says, the biggest threat to our democracy.
>> HUGH THOMPSON: It's such a privilege to have the three of you up here together. And – and I want to ask this question, because you pose this in the report, which is, what can be done? Like, if you think about the scenario you just laid out, fine, we're going to fact check it, but actually, even if that took ten minutes, the thing has propagated all over the world. And I would love Chris, Rashad, Katie, to get your views on systemically, what can be done here?
>> KATIE COURIC: Well, I mean, I think we can maybe divide it up, because there are a lot of different recommendations. But Chris, do you want to talk about sort of the super spreader aspect of it? Or what do you want to take on?
>> CHRIS KREBS: So, that all kind of rolls up into the platforms themselves and how information is delivered, how it's distributed.
So, I'm constantly reminded of the book "Neuromancer" – William Gibson, in the '80s, right – where he coins the term cyberspace. And one of the things he talks about is, he describes it as unthinkable complexity. And that's kind of where we are right now, I think, with the social media platforms, with the way information's distributed, the velocity and volume of information.
So, we have probably gotten past a point – and we are certainly not advocating for any regulation of content itself, necessarily. It's the mechanism. It's the optimization. It's the incentive structures.
And so, I go back to about 2002-2003 and think about Enron. Enron really highlighted a failure in transparency and accountability in corporate America. So, as a result, Congress passed Sarbanes–Oxley, which had this Christmas tree of reporting requirements and transparency requirements.
>> HUGH THOMPSON: I like how you described it, a Christmas tree of reporting. Many people had to live through this.
>> CHRIS KREBS: But I think that's where we kind of are right now, right? As Katie and Rashad have both talked about, there's not a consistent set of disclosure requirements. What does content moderation policy look like? What does super-spreader reach in content and accounts look like?
You know, what about advertisement and how advertisement is structured, including things like microtargeting? You can talk about the redlining that happens across Facebook. So, there are all sorts of different just basic disclosure and transparency requirements; that is the first step. And not a single one of them touches the First Amendment.
>> KATIE COURIC: Because, I think, academics have tried to get the information to study this, to do the research on it, so they can actually understand how these social media platforms work, how algorithms work.
And yet, they're denied access to that. So, it's like, you know, the man behind the curtain. It's like the Oz of social media. And if you don't understand how the systems are operating, you can't come up with the solutions to fix them.
>> RASHAD ROBINSON: Yeah, and it's not just the denial of transparency; it's the immunity around the business model, right? There's one thing around, you know, freedom of speech and what we get to put on the platform. There is, though, the amplification, and there's the paid advertising, which are very different from just freedom of speech.
I've been in conversations with the leaders of the platforms, you know, been in a meeting with Mark Zuckerberg. We were trying to get Facebook to put in a policy around suppression around the census – efforts that were targeting specific communities. And we knew that there were narrowly targeted suppression attempts to actually reduce the sort of acceptance of taking – of engaging in – the census, which has all sorts of political consequences.
And you know, their answers to the questions were almost these freedom-of-speech sort of pushbacks that had nothing to do with the fact that if you post something on Facebook, you actually have to put money behind it if you even want all of the people that follow you to see the content. It is about amplification. It's about the business model. It's about choices that Facebook has made so that you are more likely to see conflict content than you are to even see the content of your friends and your family. These are business model choices that are not actually transparent.
In fact, we learn more about what's happening inside of these companies from whistleblowers than we do from disclosures that should actually have to happen if these companies want to operate at this level and scale.
And so, one of the key recommendations was around a narrow reform of Section 230 of the Communications Decency Act, which actually sort of deals with the business model but doesn't deal with the First Amendment. And as a civil rights leader, I need the First Amendment to be able to raise my voice, to push back. And I recognize that there will be people I vehemently disagree with that also need the First Amendment to disagree.
And we should recognize that when companies that have a profit incentive come in and sort of decide that they're going to put their hands on the scale of untruths, put their hands on the scale of pulling us apart, there should be some accountability for the consequences that causes to society.
>> HUGH THOMPSON: Let me ask you all a question, yeah.
Given that this is the current state – and you have great recommendations about how Congress can act, how others can act to provide this transparency – tomorrow we're all going to go and get back on Instagram, we're going to go on Twitter. What can we do, if anything, to help immunize ourselves?
Even if we know that this is happening, this massive, rampant amplification without verification and checking. Like, I'd be curious, like, what do you guys individually do?
>> KATIE COURIC: You know, what did Sy Syms say? An educated consumer is our best customer? I mean, I think – did anyone know what Syms is? Anyway, it's a discount store. Forget it.
But anyway, you know, I think that people need to be very – they need to be critical consumers. They need to consider the source. They need to be as discriminating about the content they're reading as they are about, you know, what they would feed their kids. And the problem is, it puts a lot of onus on the consumer, but we need to teach media literacy in schools.
You know, I remember seeing that a well-known actress posted something, and it looked like it was from the American Medical Association. It was some bogus claim, and I was like, is that right? Who is saying that? And I Googled the organization, and it was this antiabortion, very, you know, extremely conservative organization that wasn't even a medical organization.
But when you think about deepfakes – you know, you could make Nancy Pelosi look drunk when she wasn't, or have something coming out of Barack Obama's mouth that he never said – it is really, really hard for consumers. So, I think we need to start educating, even in elementary schools, about being an educated consumer and understanding when the source for this material is not accurate.
You know, now when I'm preparing for interviews, because I do a lot on my social media channels, my team will tell me something, and I'll be like, where did you get that? Who wrote that? Where did that come from? I never had to quite, you know, be as discerning about every piece of information I get. But you just have to now, because people have agendas, and illegitimate news organizations pop up constantly.
>> RASHAD ROBINSON: And this is the tension, though, that we have to sort of deal with as a society. I fundamentally believe we should try to make ourselves as educated as possible. But just like certain communities can't outrun food deserts and somehow find fruit when there is, like, manufactured inequality in their communities and there are no supermarkets that provide fruit, there are ways in which certain communities are targeted and exploited – because inequality is not unfortunate like a car accident; it's manufactured. It's not unfortunate, it's unjust. And communities that have been under attack have always been the most resilient communities, have always had to find a way out of no way.
And at the same time, we have to recognize that there is a deep incentive structure, a profit incentive, at continuing to target these communities, continuing to sort of drive up inequality, because it will be good for business and it will be good for sort of keeping the sort of fault lines that can be exploited in place.
And so, that is also why each of us in our own place with our own power have to recognize that when it comes to actually moving public policy or holding even the tech platforms accountable, we will lose in the back rooms if people are not lined up at the front door.
And all of us have different levels of power, and you all are people that can be lined up at the front door or inside the rooms to actually make real change happen.
And I hope that none of you sort of lose that level of power, privilege, and responsibility that each of you have.
>> KATIE COURIC: But Rashad, what exactly are you asking these folks to do?
>> RASHAD ROBINSON: I'm asking them – I’m asking them to speak up. I’m asking them –
>> KATIE COURIC: I'm sorry, Hugh, I'm taking your job.
>> HUGH THOMPSON: Katie Couric, Katie Couric, ladies and gentlemen.
>> RASHAD ROBINSON: Katie Couric is asking me a question in public. I love this.
>> HUGH THOMPSON: Amazing.
>> RASHAD ROBINSON: I think that we have to hold the line between real solutions and fake solutions. And I've been in a number of places with people from the platforms who will start the conversation off acknowledging the problem and then talk about it as a demand-side problem – to put it back on the communities that are most targeted and exploited, and say, well, isn't it a demand problem that these communities are clicking on things or liking things that have been sort of served up in this way?
And all of that, I think, is taking us away from the real issue. It's like blaming communities that have been targeted for voter suppression for having lower voter turnout. It's blaming communities that have been targeted with junk food and junk food marketing and saying, oh, wow, now the obesity rates are higher in their communities. It's blaming communities that have, you know, pollution in their communities, drilling or fracking in their communities, and saying, well, now the asthma rates are higher in their communities; those parents should be more responsible.
We see this over and over. And what I guess I'm saying is that we have hundreds of years of this happening, you know, since the very foundation. And we know that we can't outrun inequality. We actually have to deal with it head on and try to dismantle it, and it doesn't mean that we don't work. It doesn't mean that we don't try to learn as much as possible. I'm not taking away in any way the hard work and effort that people have to do. Communities have always done that, and of course they know that.
But this is an opportunity for leadership – leadership from those that are in places of power who could actually do things to hold that line between what we can actually make possible and real, and what is being put on the table to make us believe the problem will simply go away if some people just try harder not to be on the losing end of inequality.
>> KATIE COURIC: I've loved everything you said, but I still don't know what concrete steps people, the average person, can take.
>> RASHAD ROBINSON: Well, I think what the average person can do is popularize this with their elected officials, raising their voice around the legislation that's before Congress right now. I think these are not average people, though. These are cybersecurity experts that have jobs inside –
>> KATIE COURIC: Okay, what can cybersecurity experts do?
>> HUGH THOMPSON: This is Katie Couric. This is like classic Katie Couric.
>> RASHAD ROBINSON: Well, cybersecurity – cybersecurity experts inside of major corporations can actually sort of deal with the terms of service inside of their companies, to make sure that they're actually being enforced, even when powerful people sort of violate them. Time and time again, we have won policy fights at companies around their terms of service, only to have enforcing those policies run afoul of the profit incentives.
We can actually have companies stand with us in real ways when we are fighting for legislation and change inside of Congress. And, you know, there are so many ways in which the long-term impacts of this problem will hit corporate America; even though there are short-term opportunities in kind of having disinformation, the long-term impacts will be harder.
I think that for folks that are experts and are researchers on this issue, helping to sort of elevate the harm of these issues in ways that more people can understand, because we need more people sort of understanding these issues and talking about them and connecting them to the other harms that exist in society.
And so, I think in each and every person's sort of individual position in this room, as I sort of looked at the type of people that were coming to this conference and the sort of various roles that they have in society, there's a real opportunity for each of you to play a leadership role in recognizing that these problems and the problems that will continue to persist in our society require each and every one of us to sort of use the place that we're in and then work out from there.
>> KATIE COURIC: And have a sense of urgency about it. I'm afraid, with so many issues that we're facing in our country right now – if you look at gun violence, for example, which is a big issue for me, and the struggle to come up with sensible gun laws that can reduce gun violence in this country.
And then, obviously, you have Roe v. Wade being overturned. You have the situation in Ukraine. And I think sometimes an issue like this that is sort of cerebral and not super tangible in terms of the harms, right, you have to really kind of connect the dots. I think as a result, people don't see this as the important issue and the critical issue it is.
>> RASHAD ROBINSON: But you look at information disorder, and then you look at Black people trying to shop in a supermarket in Buffalo, or kids trying to go to school and be safe, and then we think about sort of the flood of information disorder and the role it plays. We watched content of the Buffalo shooting being pulled off of Twitch, then living on Facebook for nine hours. We were calling Facebook to try to get them to pull it down, and they said it didn't violate their terms of service.
And after they have made all of these commitments and told us that they're doing all of these new things, you have to ask: well, who pulls the switch inside before nine hours of the thing being up? Who even cares about it?
And then we recognize that because there are no consequences at scale – because with the current set of accountability, the fines are like a night out on the town for probably most of us – they don't actually feel incentivized to do anything.
And then we have to think that we do need more whistleblowers, we do need more people raising their voices, we do need more people making sure that those in power, in Congress, in the White House, will recognize that there will be consequences if they don't do anything – and there will be rewards, there will be people who will, you know, reward them and hold them up, and there will be leadership opportunities for those that do stand up and champion these issues in ways that will help to save us all.
>> HUGH THOMPSON: Well, let me ask –
>> KATIE COURIC: Rashad is very passionate.
>> HUGH THOMPSON: He is, he is, he is.
And has made a difference. And so, I've got a final question for each of you, and I'll go down the row.
These folks have been here for the last four days talking about the most grave security issues in society. This is their last moment, their last impression at the conference. You've come out with this report. You've also raised huge awareness to this topic. Has there been progress? Give us some hope. Give us a ray of hope. What has happened since?
>> CHRIS KREBS: Since the report, there's been legislation introduced that's consistent with the themes of the report, including around researcher access – public-sector and private research into the platforms.
So, Nate Persily, a professor at Stanford, drafted some legislation and shared it with members of Congress. Entirely consistent. We endorsed it. So, there is action.
But look, here's my takeaway: the government is not going to save us here. Look at what we saw a couple of months or so ago with the DHS Disinformation Governance Board – these are the sorts of things that can get weaponized. It ultimately comes down to us, in our communities, to engage.
And I do think that COVID had a radicalizing effect on us, because we withdrew from our communities, and the diversity in our communities has an inoculation effect. If you go to a kid's soccer game and you were to say something like, oh, I don't know, JFK Jr. is still alive, the other parents would look at you like you're crazy. But the echo chambers and the filter bubbles allow for that sort of information to propagate.
So, we have to pull back, reengage in our communities, encourage sponsors to support local media. Hey, local media in Arizona and Georgia were the main venues of information around the 2020 election – in covering the fraud claims in Arizona and the things that were going down, and the claims that were being made in Atlanta around the Georgia election.
So, really, it's about us and who we support. And do we care? Do we really care about that? That's the question I keep asking. If we want to be a democracy, then we've got to act like it. It takes work. You can't just let it happen.
>> HUGH THOMPSON: Rashad.
>> RASHAD ROBINSON: Look, we have more people willing to take action on these issues than ever before. We have more people connecting these issues to the real harms than ever before – connecting information disorder to what happened in Buffalo in ways that are much quicker. And that, I think, is the pathway to having more people invested in helping us solve these problems.
But I think Chris is absolutely right, is that we need to build stronger connections of people offline. We need more places where people are coming together. And that doesn't happen –
>> HUGH THOMPSON: Such as RSA Conference.
>> RASHAD ROBINSON: Such as RSA Conference. And then there are two places in the report that we haven't talked about as much. One is that there are communities around this country that are trying to engage in truth, reconciliation, and healing – trying to have conversations about a shared story, where people of different races and backgrounds are coming together to be inside of a story about where we've been and where we're going, and doing it in the midst of these fights to actually ban books and ban education. And I think the more we are having those types of discussions – about how we come together and talk about hard histories, talk about hard challenges, and actually celebrate the sort of multi-racial progress that we have made together – the better off we'll be.
And so, those are some of the glimmers of hope that I think I hope to see. And I hope that more of us will be invested in that hard work. And I say hard work because that – it is hard work, but I do think not doing it will make things a lot harder for all of us in the end.
>> HUGH THOMPSON: And a final word to the legendary, encouraging, uplifting Katie Couric.
>> KATIE COURIC: I know, I'm not very optimistic about this, though. I’m so sorry.
>> HUGH THOMPSON: I was hanging my hat on this.
>> KATIE COURIC: You know, listen. You know, it's interesting when you think about half – half of the GOP candidates that are running in races are perpetuating the Big Lie, that Joe Biden was not legitimately elected as President of the United States. It's not hurting them in the polls.
And so, I'm very concerned about what is happening, and some of the things that are happening at the state level in terms of legislative changes – they're actually going to result in a rigged election in 2024, giving election officials way too much power.
And so, I guess the – some of the positive things are that we have elevated this. Barack Obama gave a lengthy speech at Stanford about this.
>> HUGH THOMPSON: Excellent.
>> KATIE COURIC: And cited our research. And, you know, and I think he's shining a light on it.
I think that cable news operations are starting to realize that perhaps they went too far. CNN has gotten rid of the "Breaking News" banner – you know, because it was never breaking news, but they said it was breaking news. They are starting to have different points of view represented.
You know, I think what happened in cable is commentators took over and it became like talk radio on TV. And that's fine, as long as people know this is commentary. But I think people miss having, you know, reporting of the facts and letting them kind of come to a conclusion with careful analysis, instead of diatribes by –
And you know, listen, I guess – I don't know what time it is, but at five o'clock today here, eight o'clock tonight Eastern, the January 6th hearings are going to be airing on prime time television, on every network except for Fox News, because they don't want their viewers to see it.
I mean, if that doesn't show you how screwed up our country is right now, I don't know what does.
Other than that, Mrs. Lincoln.
>> HUGH THOMPSON: But seriously, you know, Katie, Rashad, Chris, I can't thank you enough, not just for your time here and you coming to RSA Conference, but just your work and dedication in this area. I know for each of you, there are infinite things you could be spending your time on, and I know you poured your heart and soul into this.
>> KATIE COURIC: I wanted to be on the Hugh Thompson Show.
>> HUGH THOMPSON: You're here, Katie. You made it.
>> KATIE COURIC: Add that to my resume.
>> HUGH THOMPSON: The pinnacle of an unbelievable journalistic career.
Thank you all so much for being here. Thank you for being a part of RSA Conference. Thank you all for being a part of RSA Conference 2022. Can't thank you enough for just the fellowship, the community, everything this week. And we can't wait to see you next year.
>> KATIE COURIC: Say goodnight, Hugh.
>> HUGH THOMPSON: Goodnight, and we'll see you in April.