Real World Stories of Incident Response and Threat Intelligence


Incident responders and threat intelligence analysts will pit their skills directly against threat actors for control of networks. In this session, three leading figures in the incident response and threat intelligence industry share their experiences from responding to incidents in the last 12 months.


Video Transcript

   >> ANNOUNCER:  Please welcome panel moderator Lily Hay Newman.


   >> LILY HAY NEWMAN:  Hi. Good morning. I'm Lily Hay Newman. I'm a reporter with WIRED and I'm really excited to be here today and be on stage with three really stellar contributors to incident response and threat intelligence.


   Next to me is Lesley Carhart from Dragos. Next to them is Katie Nickels from Red Canary. And down there is Wendi – hi, Wendi Whitmore, from Palo Alto Networks and Unit 42. And they are all going to introduce themselves in more detail but the goal of this session is really to kind of draw back the curtain on incident response and threat intelligence and get an idea of what goes on in their daily lives and hear the stories – some stories from like on the ground, you know, what's really happening, because I think we all talk about this a lot in the abstract but it is great to hear something more personal and hear their insights.


   So, anyway, it’s interesting to, you know, see the vantage points that you all have on these like really broad topics. So, I wanted to let all of you talk a little more about where exactly you sit and kind of what's the special thing that you get to see in your daily work. So Lesley, go ahead.


   >> LESLEY CARHART:  Hello. Nice to see you all. Wow, this is a great showing for a relatively early morning at RSA. My name is Lesley Carhart. I work for a company called Dragos. I'm an incident responder there. I have been there for about six years. I have been doing OT incident response in the industrial space for about fifteen years. So I work with critical infrastructure systems. So power, water, manufacturing, aviation, trains, all the things that we don't think about as computers traditionally. And that's the space where I respond to crises. Katie?


   >> KATIE NICKELS:  Awesome. I'm Katie Nickels. I'm Director of Intelligence Operations at Red Canary. Also a certified instructor for the SANS Institute, teaching the cyber threat intel course. Over my career, that’s close to fifteen years – I was counting it up – similar to Lesley's, I started out in the Department of Defense, working in different security operations centers, threat intel teams embedded there, worked for different contractors. I worked for the MITRE ATT&CK team, so thinking about all the different techniques that adversaries are using in intrusions. And now I have visibility into a lot of different environments, seeing lots of different types of intrusions and incidents and helping different organizations respond to those.


   >> WENDI WHITMORE:  Awesome. Good morning, everyone. As Lesley and Katie mentioned after Lily, it is just fantastic to be here with you this morning. And I see this big crowd.


   So, I'm Wendi. I lead Unit 42, and Unit 42 is comprised of incident response consulting, proactive services, threat intelligence, and our threat hunters on our managed defense team.


   So, I started my career in the Air Force as a special agent investigating computer crimes. Have worked at a variety of companies in the industry but always specializing in incident response and threat intelligence. And our teams are really at the intersection, I think, of both the work that Katie and Lesley do in terms of responding to breaches, but also doing the analytics on the background to really determine, you know, from the strategic lens, who is responsible for these and what does that mean we need to do next in order to protect against them.


   >> LILY HAY NEWMAN:  So, I think we should – as we were preparing for this, I was realizing that this could be like six hours, you know. There is so much that we could talk about and that we want to cover.


   >> SPEAKER:  It won't be though.


   >> LILY HAY NEWMAN:  Yeah. I was thinking that, you know, maybe we should start by just diving into what it’s like when, you know, you start – you all start hearing about a new incident that's developing. And, you know, I thought we could start with talking about 3CX and that sort of supply chain attack situation that has sort of – you know, first it seemed like okay, this is a supply chain attack. Then it came out that really this is two supply chain attacks intersecting. And what's the sort of first day? You know, how are you wrapping your minds around this? Where are you hearing things from? Maybe kind of take us through it from your different perspectives that you just kind of laid out.


   I don’t know, Katie, do you want to start?


   >> KATIE NICKELS:  Sure, absolutely. It’s always interesting because sometimes these incidents start with chatter amongst friends. That’s how, you know, my colleague first got a tipoff for 3CX. Hey, I heard there’s a supply chain thing. And anyone who has tracked big incidents like this, you know that anything you hear in about the first twenty-four hours, be really skeptical. So, I was very skeptical. I’m like, okay, supply chain. Cute. Let's see.


   And then super grateful to folks at CrowdStrike who popped up a Reddit post really quickly. Thank you to them because the reality is supply chain compromises are so hard to detect. Right? We all have these cognitive biases. And I don’t know about you but if I'm looking at a trusted piece of software and I see something spawning from that, I’m like, oh, it is probably fine. Which a lot of people did a week before CrowdStrike released that Reddit post.


   So, it is a little chaotic in the beginning. Right? I think, you know, it is particularly tough in the beginning to be careful about what you know and what you don't know. Right? At this point we know that CrowdStrike reported a supply chain compromise at 3CX. We don’t know how bad it is. And from there, it’s about kind of scoping it, figuring out, do organizations actually use this, right. It’s tougher from a vendor perspective – do all of our customers, does anyone use it? If you are a single org, hopefully a little easier.


   So, I'm curious, Wendi, like how your team, you know, with your visibility, responded to that?


   >> WENDI WHITMORE:  Yeah. So, I think the first thing I often think is like uh-oh, okay, there goes my weekend, right? I'm sure like many of you do, you know, most of these events it seems like are a Friday evening, a Saturday morning, at least by the time that our teams get called in and start responding at any kind of scale.


   Your point about just not knowing the information and data sets that are valid in that first twenty-four hours and how dynamic those can become is absolutely critical.


   You know, the team I'm at now, it’s interesting because not only are we, as Unit 42, responding on behalf of clients, but then we’re also leading our internal, what we call rapid response, across the company. And so we have an obligation to make sure that as soon as we understand what the latest threat intelligence is and what a new attack vector is, then all of our products are detecting this and are able to protect our clients.


   So, that adds another element, I think, of chaos – in a good way because that's certainly, you know, our first and foremost obligation, to make sure that we can protect – that our products are protecting clients throughout the world. But that said, it is very challenging to manage, okay, how many clients are actually going to be impacted by this? Is this the thing? Or is it actually maybe not a thing? And it might take us twelve to twenty-four or even forty-eight hours or longer to figure that out. And then what type of action plans do we need to put in place on the back end?


   >> LILY HAY NEWMAN:  What's the first day like for you, Lesley?


   >> LESLEY CARHART:  I work in a slightly different environment where the consequences are often much more severe, life, safety, the environment. If the facility is catching on fire, that's very serious stuff that could happen immediately. And sometimes triage has to happen before we have a full view of everything that's going on.


   But I loved what Katie said about skepticism. If you are somebody who is thinking about getting into the incident response space, if you are thinking about doing that in the future, you haven't done it yet, something that you will learn that you don't expect is that sometimes you have to be the skeptic. You have to be the one doing the reality check for people who are panicking and think things are much worse than they potentially are. They could really be that bad, but oftentimes, you are drawing people back because in those first twenty-four hours, as both Wendi and Katie said, we just don't know for sure. And typically, things tend to be less unusual than they seem to be. Things tend to follow trends, attackers tend to follow the same TTPs to some degree.


   It's not always new and novel. And so, what we do a lot of the times in those first twenty-four hours is think about why it might not be as crazy and severe as it seems to be. Of course, in this case it was pretty gosh darn severe.


   >> KATIE NICKELS:  Great point. I think with 3CX, I would say it was not on a SolarWinds level. There were definitely a lot of orgs compromised but based on our visibility, we saw a lot of people got kind of the initial malicious DLLs but there wasn't really follow-on activity. Right? I think we saw maybe in a handful of environments, some stealer activity. And you know, our hypothesis was, because I think it was a Wednesday morning this broke, not a Friday afternoon, thank goodness for once. Thanks, CrowdStrike.


   But I think it is sort of a lesson in collaboration and the power of actually sharing publicly because, we hypothesized, right, CrowdStrike really early on shared that GitHub was being used for infrastructure. We know it’s from GitHub, you all took that infrastructure down quickly. We hypothesize that that actually might have stopped a lot of environments from having later on intrusion chain phases. So, I think a lot of orgs actually got saved by GitHub in that case.


   A nice example of how sharing and taking down infrastructure can stop these things from being a lot worse.


   >> LILY HAY NEWMAN:  Yeah, that's really interesting. I'm thinking also about how you all balance – because skepticism is really important in my job too. And so, you know, that really resonates. How do you balance that with the need to react quickly and take things seriously so that that action can happen to prevent? And I'm thinking about Log4j as an example of something that, you know, where the – how bad it was going to be or what the impacts were going to be like. The script was kind of not yet written and the question was how quickly could everyone come together and respond. And it seemed like because of the sense of urgency and the sort of dire concern, that some of the worst, you know, potential outcomes were avoided in that case.


   So, I don't know, maybe just tell us a little more about – it's hard to verbalize your intuition but how you, you know, find that balance in the early days.


   >> LESLEY CARHART:  Some intuition is really just a lot of experience and seeing a lot of things. I know that's not the answer that everybody wants to hear who is getting into the field. Yes, you have to see a lot of cases to have those good gut feelings as an investigator. But two important skills for being an incident responder: one is being a good investigator. So, understanding the scientific method, not jumping to conclusions, needing solid evidence. And then also having very good risk management skills. What is the likelihood of this causing a catastrophic consequence? What if I'm wrong? So, you have to actually make those risk decisions while doing really good investigative work, understanding that you can't jump to conclusions, looking for evidence, and corroborating evidence. So, those are essential skills to being a good incident responder.


   >> KATIE NICKELS:  Plus one on the evidence. As an intel analyst, I encourage everyone to think about what assessment do you make, right, what conclusion do you draw based on what evidence. And that’s particularly important early on. For example, we assess that this box is infected via the 3CX supply chain compromise, based on the fact we saw a malicious DLL file reported by CrowdStrike.


   Like the nuance between oh my gosh, this is infected, and this is infected based on this evidence, that's so crucial at the beginning when you don't know a lot. How do you know what you know? So, I always encourage analysts and incident responders to kind of get in the habit of saying I think or I assess this based on this. It can really clear up some of that initial confusion.


   >> WENDI WHITMORE:  I think, to double down on what you both said about skepticism. So, I want to say three words, skepticism, curiosity, and calm.


   The skepticism coming in from, you know, hey, okay, we have an initial dataset, applying that investigative mindset to say is this really what we think it is? But I'm going to – we want people who not only want to prove an allegation but disprove it in the same level of degree, right? That's going to allow us to then really go through those critical decision making skills to determine how much of a thing it is.


   On the curiosity side, Lesley has done a great job of pointing out a couple of tactics or traits that we might look for in future responders or future analysts. And I think more so than what degree you have, having a curious mind combined with some of that skepticism is really a key skill that we have on the team of saying, you know, hey, okay, I see that this is what this – the data is showing me right now. But something doesn't quite add up so I'm going to really dig deep into that and make sure.


   And then the third part, the calmness. Like, that's such a critical skill. And you guys both highlighted in terms of hey, we're getting the first call. Oftentimes it is a chaotic situation. My team members and I, we were just called in to one of these incidents on Friday night. And the initial conversation, it was a major corporation. Their CISO was on the line and they had information that pointed to potentially one of our firewalls. And we're very concerned about traffic that was coming from there or that looked like it was coming from there. And when we got on the phone, tensions were very high. Like hey, we need answers on this immediately.


   And so it took, you know, not only a lot of technical skills to be able to work through the situation, identify hey, this actually wasn't the case, but that calm manner in which we responded initially just started to like, you know, tamp down the amount of chaos and frustration on the call and say, okay, like, you know, the client very much saw that okay, you guys are interested in getting answers as quickly as we are. You’ve got a lot of good data. We can work through this. But there very much is like this almost – you know, the human element, I should say, you know, of these conversations where you are a little bit like a therapist at times.


   >> KATIE NICKELS:  Security therapy. On our resumes, yeah.


   I love that point on calm. And I think one thing I always go back to is like panic is not a necessary part of the incident response cycle. There is a difference between panicking and having a sense of urgency, and I think trying to have that calm urgency, that is a tough balance but you can achieve it. I think it's so much more productive than running around with your hair on fire.


   >> LESLEY CARHART:  That feeling that your parents gave you when you were a kid that everything was going to be okay. You have to be able to exude that to the people who you are doing incident response for. That's challenging. That’s another skill that you learn over time.


   And I will tell you this. So, real talk for people who are, again, new to this field. The first day of incident response for an investigator it is scary. It is scary. You’re never sure if you’re going to find that initial piece of evidence you really need to catch the adversary. Once you start finding threads to pull on, then it becomes really engaging and interesting and you feel a lot better, but it’s always a little scary the first day. But we have to work on our internal Zen and being calm about dealing with these intense crises that can have really serious consequences.


   >> LILY HAY NEWMAN:  Yeah, the security therapy is really working. I feel very calm.


   I want to ask you in that vein though about incident fatigue, breach fatigue, burnout, because, you know, that's something, as you are all talking, that I'm thinking about. There is a difference between a calm and an assurance and sort of being jaded or, you know, just being kind of tapped out, you know. And so, do you find with yourselves or among your colleagues, and you know, the camaraderie that you have on your teams, that you all sort of talk about this or is it something that everyone is kind of dealing with on their own?


   >> KATIE NICKELS:  I think on healthy teams, they are talking about it. And I think that’s awesome. I manage a team of threat hunters and they have an on call rotation. I was chatting with them recently and someone pointed out, like, when you’re on call and you get called multiple nights in a row, that's exhausting.


   I was like, you know what? I hear that. So, amongst themselves, they decided, hey, instead of a five-day rotation, let's do a three-day rotation. I think having that healthy conversation of – even between employee and manager, hey, I am starting to feel really tired; can we make some changes? Thinking about on call rotations, how often you cycle people out I think is a really, really healthy aspect of a team.


   >> WENDI WHITMORE:  I think it is a real thing for sure. You know, there is no doubt that that comes into play. Like you said, I think there are more conversations about it today than there have been in the past.


   You know, as a leader, I will look at, like, if we've got a major incident where you have got a large team of people on it, you are not only looking at, okay, like, how many days can the engagement manager stay on site without just totally losing it or needing to go back and see their family, get some sleep, like just decompress for a few minutes. So, trying to be strategic about how you might rotate people out in that regard.


   But also just realizing like, that can have a play then in terms of the reaction to it. If the team is super burned out and tired, it’s like, okay, I'm just not going to be at my best in terms of what the calm energy that I am going to bring to the client. So, how do we make sure that we're just keeping track of each other and making sure like hey, have you guys spent enough time with your family or just away from work? Who needs the weekend off versus who is like really raring to go?


   On the flipside of that though, I think one of the coolest things about our job is that – just the mission impact, right. I don't have to talk to team members at any level of our teams about like, why we're doing this or why it’s important. These are people that wake up that are super excited about whether it is national security, whether it is protecting against cyber criminals. You know, they really feel like, hey, we're making a difference in the work that we do. And I don't think everyone has the luxury of saying that. So, I think that is something that we're really fortunate for.


   >> LESLEY CARHART:  I don't know if I'm allowed to do this but I’m going to ask the audience to raise your hand if you are responsible for maintaining an incident response plan for an organization or updating it, creating it?


   Okay, so, I’m going to give you all some homework. Go on your incident response plan and make sure that the things that Wendi and that Katie just talked about in terms of rotations during an incident are in your plan. I see them missing all the time in peoples’ incident response plans.


   Incidents are high stress. Sometimes they go on for weeks, especially if you are doing internal incident response for your organization. You need to have a plan for how you hand those off. What if somebody gets sick? What if the person you really rely on in your document is too unwell to keep going? You need to plan for all of those situations like rotation and handoff and the health and wellbeing of your responders during a long-term incident, and that needs to be in your plan.


   >> LILY HAY NEWMAN:  That's great homework, actually. A lot of people raised their hand, which really speaks to how many people in the audience and here in this community are shouldering this burden and kind of having this constant vigilance.


   I'm also wondering about these handoffs within your teams and how you kind of coordinate all of this given that more incidents are coming down the pike all the time. Do you feel like you have the ability to do that step back and kind of not just do the postmortem but have some distance and then revisit incidents and see what you can learn with, you know, that time away and perspective? Or does it feel like there is just always this grind where it is difficult to, you know, have the luxury of that reflection?


   >> KATIE NICKELS:  I wish we had time to do deep after actions for every incident. I think it is so tough because there is always something new coming in.


   But what I’ve found is that when you don't take the time to reflect, to do that after action, you find you make the same mistakes incident after incident after incident. And so, I would say, if you have the luxury, if it's even a half day, an hour, a half hour, something after a major incident, pause, breathe, take a quick moment to have an after action discussion with the team. What went well? What didn't go so well? What can we do better next time? And then the real challenge is following up on those items. We didn't have this log visibility? Let’s improve that. Otherwise, six months later, you find you still have the same gap.


   So, it’s not easy but I think that again it’s if you don't take that time, you are going to waste more time later because you are making the same mistakes.


   >> WENDI WHITMORE:  Yeah, I think – so, the way we try to organize it is you have a team of people that are responding to breaches. They may be going on site, some of them may be staying back. But then we have a team similar to Katie’s where you have analysts who are really looking at the situation more strategically, so not having a lens into only this one breach but looking across the board of all of the work, not only the investigations we’ve done but a lot of the other telemetry sources we are tracking. And then the goal with that team is that they are really able to start seeing patterns at a larger scale and say hey, wait a minute, okay, this looks like it could be tied to this and/or maybe something that we investigated four months ago and we see something else. Now we actually think they are related.


   So, having, you know, an organization where you have got some overlapping capabilities like that is really helpful. But you know, that said, like the after action reports post incident, those are so critical and that's something that we can probably all get better at because you then do get like mired into the next situation. And oftentimes that next situation might be more exciting because hey, it is new datasets now and new information and we want to go jump on, you know, in this case, your investigation.


   >> KATIE NICKELS:  One quick example on why that is so important, documenting what happened in an incident, we have seen in most ransomware intrusions, there is some kind of exfiltration of data. And now there are intrusions that are just extortion, right? They steal the data. They say pay us or we're going to post it on the dark web.


   One of the challenges we’ve seen actually recently with a couple customers is that adversaries will email them and say hey, we are whatever ransomware group; here is some data that we have stolen from you, right? Pay us or we're going to post it to the dark web.


   That data, if you don't know where it might have come from, if you haven't captured it from a previous intrusion – maybe a previous extortion or ransomware incident you had a year ago – if you don't know if that data was stolen, you don't know for this new incident whether it is old data or new data. I see this really commonly with those extortion intrusions where people don't take the time to actually think about what was stolen – what exact files, at what times. And in the future, adversaries often try to re-extort. So, that’s an example of why it is so important to kind of dot those I’s, cross those T’s. Those details really matter, including for future incidents.


   >> LESLEY CARHART:  If you work for a security company like we do, you probably – may have the luxury of having an intelligence team that's doing those types of reviews of big campaigns to understand how things are working in the broader cybersecurity and attack space.


   If you work for an organization doing cybersecurity for itself, you are the one who is going to have to answer the questions that Katie just talked about. Is there a persistent adversary who is continually attacking you? Is one intrusion related to the previous intrusion? You’re going to have to answer those questions.


   We don't see the full picture, necessarily, as providers, as consultants over time in your organization, but you do. So, you don't necessarily have that luxury of the big intelligence team that sees tons of attacks against a ton of different organizations. You should be getting that intelligence from somewhere. But you need to look at the intrusions that are happening in your environment long-term to see if they're related.


   >> LILY HAY NEWMAN:  Yeah, I think all of this kind of helps me at least get a sort of deeper picture of, you know, both, I think that distinction you are drawing is really helpful both on the security firm side and, you know, firms that are resourced enough to have that dual insight versus companies that are trying to do it all for themselves. You know, I think it's important to really highlight that and talk about that.


   One thing that I really wanted to ask all of you, going into some, you know, some more of the threats everyone is facing right now, is I'm curious about what stands out in your mind or is important to you as something that the community either isn't talking about enough or, you know, is talking about but doesn't realize that there is a bit of an iceberg – you know, we're talking about something but it is even bigger below the surface because victims really don't want to talk about that thing.


   It could be a type of incident or it could be a technique or something that they feel is really embarrassing to have happen to them. I'm curious, you know, what's the dirty laundry kind of?


   >> LESLEY CARHART:  A lot of intrusions are coming from very simple places. The fundamentals are really, really hard, everyone. Asset inventories, knowing what computers you have, knowing your perimeter, knowing if you have exposed hosts, knowing that your network is properly segmented or not. Those are really challenging things to do, especially if you have a big network. And adversaries, for the most part, they are taking the path of least resistance. They will choose easy targets, ones that they will be efficient and effective at compromising. And so many incidents that we respond to, even in the critical infrastructure space, are still coming from simple mistakes in security hygiene.


   And of course, people don't want to talk about that. They want to talk about whiz bang cool things they saw at RSA. That's neat. That’s fun, right? But we do need to make sure that we’ve updated our network maps in the last ten years, please.


   >> LILY HAY NEWMAN:  So much homework from Lesley. It is good stuff.


   >> KATIE NICKELS:  Good homework.


   >> LILY HAY NEWMAN:  Katie?


   >> KATIE NICKELS:  Yeah, I have a similar answer in that the thing that springs to my mind is cloud misconfigurations. I think we're all embracing cloud as a great, cheap way to store data. But it is, I think, sometimes easier to misconfigure a cloud environment than configure it properly. And thinking about the basics, it is so important to configure it properly – set a password, and have a second factor of authentication. So many environments are just left wide open. Right? Ports just open to the Internet. No password at all or a weak password that can be easily guessed.


   And you know, I think of that for Lily's question because it is embarrassing. If you set a default password, “password,” on your S3 bucket, an adversary just guesses that, logs in, steals a whole bunch of sensitive data. That's embarrassing. Even organizations that have spent, you know, millions of dollars on security tools, the best in the industry, if you don't configure your cloud environment correctly, data can walk out the door.


   So, I think that's one that I’m hoping we will start to see a little more reporting on what's going on in cloud environments because I think that's a huge risk that maybe we are underestimating right now.


   >> WENDI WHITMORE:  I could not agree with both of you more. 100%. I think one other area I might add to that is that I do think there are still more ransoms being paid than are widely communicated. We obviously all understand why you wouldn’t go out shouting about that. But I do hear it discussed more in kind of the dark rooms between CISOs, you know, in trusted environments. And so, I think just the awareness that that's still going on, and being able to ask your peers about hey, what's your plan for this? Like, have you spoken to your CEO? Is there – you know, what are the scenarios in which we would actually consider doing this? Doing that proactively. If that's the case, having a relationship with the CFO who is likely going to be responsible for potentially assisting in that transaction and making sure that occurs, do we have a way to potentially pay some sort of cryptocurrency? Or is our policy that we are never going to consider that, regardless of the scenario?


   So, I think having those – being aware that you all should be having those conversations. I think many organizations are. But just talking a little bit more openly about it so that we can learn from one another I think is helpful.


   >> LILY HAY NEWMAN:  Talking a little bit more about communication, you know, as an incident is unfolding, and, you know, after the fact, both publicly, particularly, but also internally within an organization, how – what would you say about – we often hear that well, we can't really talk about something or get into specifics because, you know, we don't want to give attackers, you know, an upper hand or give anyone an edge. When is that really true? And where it is really, you know, valuable to wait on sharing information and, you know, when would we want to encourage more transparency?


   >> KATIE NICKELS:  It depends. There is my intel answer. It has been how many minutes before that?


   >> LILY HAY NEWMAN:  Very diplomatic.


   >> KATIE NICKELS:  I know, I know. I mean, I think 3CX is a great example. CrowdStrike could have chosen to not say anything and kept that private, but in revealing that, I think they actually probably shut down that operation pretty quickly.


   So, I think there are so many considerations, right? Intelligence gained and lost. If we reveal this information publicly or in a private community, how would that change the way the adversaries behave? I think that if there is an active intrusion, you know, you don’t want to publish, hey, we know how you’re moving laterally in the environment. But there is a balance there.


   I think that sometimes organizations don't want to accept any risk but I think what we have found even is if it is privately, in trusted sharing communities, that sometimes giving other folks a heads up on what's happening, not the victim details, not the environment, not what user clicked the phishing email, but information about the threat actor, the TTPs, that's what others care about. I would encourage this community to kind of lean forward a little bit. What can you share even if it is private, right? There are good reasons not to share publicly but I think we see again and again that something that is targeting you is probably hitting other organizations, so it sounds cheesy, but sharing is caring in my opinion.


   >> WENDI WHITMORE:  I think there are a couple of elements. Katie is hitting a lot on the, you know, what can we be sharing to protect each other. 100% agree with that. I think, though, there's also the question of what organizations are sharing publicly during a breach.


   In that scenario, organizations are graded more on public sentiment and public opinion than necessarily on their response itself. So, we have seen some great examples of organizations who have been transparent about it and have actually gathered even more positive client sentiment and following because people feel like they have been transparent.


   There are cases, though, unfortunately, where that's come back to bite other organizations because of regulations or other types of potential legal workstreams down the road.


   So, I think that's still such a tricky question to answer, right? As Katie said, it absolutely depends on the situation. But the more we can encourage sharing to protect the greater good, that's absolutely better.


   >> LESLEY CARHART:  Get involved in your ISAC if you have one in your industry vertical. That is a private sharing group for organizations in a similar vertical to you. And they exist by the grace of you participating in them and maintaining them and making them healthy. Some of them are in much better condition than others because they are a community effort.


   So, get involved in the one that's applicable to your organization or business and try to keep it healthy and running. When you can share depends, of course, but that's a potential way to share some critical information with similar organizations who might be targeted by the same adversaries and campaigns.


   >> LILY HAY NEWMAN:  Maybe on this point, we can talk a little more about public-private collaboration and global incident response. We had all talked about this a bit in terms of ransomware, so we could speak to that, but any part of it that stands out to you. What do you think is going well? Do you see improvement in this type of collaboration and sharing? Do we need to be leaning forward even more, like Katie was saying? Where do you think that's at?


   >> WENDI WHITMORE:  I think it is definitely on a really positive trajectory. You know, I have an opportunity to be, not on the government side, right, but partnered with a lot of government agencies, international law enforcement, intelligence agencies. And I think a couple of things. One, in the wake of SolarWinds and then certainly with Log4j, I do believe that public and private partnerships started stimulating a lot more real-time sharing. With Log4j, you saw CISOs leveraging GitHub and Twitter and Slack to share information pretty rapidly – leveraging existing technologies that are widely used by organizations throughout the world. That was a big shift.


   Now I think with Russia-Ukraine, you not only see those public-private partnerships continue, but from my lens – and Katie, I would definitely be interested in your perspective because I know you’re at the intersection of a lot of these intelligence crossroads – I see more sharing between us and our traditional competitors than I have ever seen before, and pretty rapidly. People giving us a phone call: hey, we just saw this indicator, or we hear you guys are working with this other team, can you look into this? I'm seeing a lot more of that than I ever have in the last twenty years. So, I'm really excited and optimistic about that.


   >> KATIE NICKELS:  I think so. I think, you know, the theme of this year is stronger together. One of the wonderful things that I see is analysts and researchers sharing even if, you know, Palo Alto and Red Canary are competitors – like, I don’t care. I'm going to share threat information with Wendi and her team because it makes us all better. I think the public-private thing is tricky. CyberUK was last week, and it’s interesting to compare and contrast different countries' approaches.


   I think that, in the U.S., right, realistically, the U.S. government has had a tough time with the private sector. So often in cyber policy circles, information sharing is kind of the thoughts and prayers of cyber policy, because what's the solution? We need more information sharing. I think there have been so many efforts, and the intentions are great. What I have found with information sharing is that it comes down to personal relationships, which is challenging because it is not always scalable, right? Public-private, you know, maybe officially government can't share anything, but there are amazing FBI agents, even folks from Cyber Command, from NSA, who will share with researchers at private companies. You know, we’ve got to raise the roof for one of those agencies in the audience.


   But I think that the real challenge is building on the personal relationships that work so well in information sharing. How do you institutionalize that? I don’t think it’s a solved problem. I think there are a lot of people working on it. I think JCDC from CISA has promise. And I think we should look to things that work really well – like the NCSC in the UK, which has a great relationship with the private sector.


   But I'd also say for any government folks – and I used to work in government – push forward. I know, for example, NSA has the Cybersecurity Collaboration Center, trying to push an organization that has been super-secret squirrel to share more. What can you declassify?


   So, I think the more people who move between the private sector and government, the more they can push the U.S. government to maybe do more. And in critical infrastructure, this is huge.


   >> LESLEY CARHART:  Bureaucracy is hard. And one of the interesting topics to me – Wendi and I were both in the Air Force, and you worked for the government – is that government bureaucracy and military bureaucracy can be very challenging to move fast and break things in. There’s a lot of potential in the Reserve and Guard space, too, for helping with incidents and intelligence sharing, things like that, but that's a very slow-moving machine. Some states are doing interesting things with their National Guards in terms of helping with incident response. Others have done nothing.


   So, getting the engine to spin up and move has been challenging, but I also see positive direction in the critical infrastructure space. We see critical infrastructure utilities getting help from more government organizations, which is so desperately needed – especially municipal utilities like water and sewage that have next to no resources.


   >> LILY HAY NEWMAN:  One other question I want to ask, because I know we were all talking about this, that it is maybe a little bit of an underdiscussed area, is when we're thinking about bringing in the next generation of incident responders and, you know, threat intel analysts, what do we need in that next generation and how are you all approaching, you know, providing support and mentorship and sort of bringing the right people in?


   >> LESLEY CARHART:  Oh, geez.


   >> LILY HAY NEWMAN:  I know this is a big Lesley question.


   >> LESLEY CARHART:  All of us started a while ago. I'm not going to guess how long ago we started. Two years, yes. We're all twenty. Definitely.


   We didn't have a great support structure when we got into this field. And so, I personally feel like I need to overcompensate for the next generation, so I do a lot of mentorship. I run an online conference to teach young people how to get into cybersecurity. We're all doing a lot of stuff. Katie's teaching. We're doing a ton of different things to build pipelines and bring people in because, from my perspective, nobody helped us and we desperately need good people. We need people who don’t necessarily look like us. We need people from all over the world being a part of this effort because we have so many problems to solve. Katie?


   >> KATIE NICKELS:  Yeah. Plus one to everything. I would just encourage people to think about bringing in junior employees not as a risk but as an opportunity. I think it's tough because you bring someone in and you train them up and then they go elsewhere. And that can be tough. But I think we need more of that – and just the recognition that that's okay, because we made our industry stronger.


   So, I would encourage everyone to try to reach out to someone, again, maybe who looks a little bit different. Is there someone you could bring along, right? Because we all started somewhere and like who can we keep the door open for behind us?


   >> WENDI WHITMORE:  Absolutely agree. I think I would add to it too. We need a bigger pipeline of the best and brightest students and best and brightest minds out there. And a challenge that I see, especially at high school levels but even before that, junior high, grade school, is that there are a lot of great students out there that just simply don't even know that cybersecurity exists and what it is as a career field.


   My own nieces didn't know what it was because, you know, high schools oftentimes don't have computer security – I mean, they often don't have computer science programs to begin with, let alone cybersecurity teams. And so, the more we can increase awareness, have competitive opportunities – like teams that these younger kids can compete on and be aware of – and just really raise awareness at that level, the more we can do to better the future for us.


   >> LILY HAY NEWMAN:  So, we have time to take one or maybe two questions. I think there are some mics around. And to this point about bringing people in, we just thought it would be nice to take questions. But just make sure we can hear the question mark at the end of your question.


   >> SPEAKER:  I love that.


   >> LESLEY CARHART:  Nobody has any questions about incident response? You all have it down.


   >> SPEAKER:  There. You can walk up to the mic.


   >> LESLEY CARHART:  The microphones are stationed strategically around this room.


   >> LILY HAY NEWMAN:  We got a few.


   >> KATIE NICKELS:  Should we start over here?


   >> AUDIENCE:  Okay, so my question is, how much is too much sharing? Because this is a problem that I have, right? When we have an incident, it is really scary to share that we had an incident. So, how much is too much sharing?


   >> WENDI WHITMORE:  Yeah, I'm going to take Katie's default answer of it depends, right? So, it depends who the sharing is with and what the situation is.


   We were talking about this backstage. Like, there is a certain amount of sharing, sometimes even within an organization, that needs to be limited – maybe not every employee needs to know, maybe people outside the security team who aren’t responding. There may be legal reasons that you need to confine that.


   I think the bigger challenge – so I will pick one element out of that, and you guys can pick more – would be the sharing and in what timeframe. Specific to the first twenty-four to forty-eight to seventy-two hours of an incident, one of the biggest challenges that I see is organizations who share too much too soon, and then the information changes because evidence is dynamic. We don't have all the data yet.


   So, one thing we encourage organizations to do is be very careful about not sharing information that will need to be walked back at some point.


   >> KATIE NICKELS:  I think going back to that – what is your assessment, what's your evidence? That's kind of a key way you can do that. We assess; it’s early. I think one thing that is always going to be too much is calling out specific victims, right? We work for vendors. Never call out that this specific company was breached, or that this specific person clicked the phishing email. That's always going to be too much in my opinion – protecting that information.


   But other than that, yeah, it depends. I always recommend starting smaller with the trust community, an ISAC, or peers, and then expanding because I think that can mitigate some of the risk a little bit, starting with a smaller sharing group.


   >> AUDIENCE:  Good morning. I do have a question. You were talking about being skeptical during the first twenty-four hours. But with the rise of ransomware, do you really think that waiting twenty-four hours, being skeptical, is good for your customers?


   >> KATIE NICKELS:  Yeah. The question being is it really good to be skeptical for twenty-four hours? Is that good for customers?


   I think, you know, what I would say is being skeptical doesn't mean you are not doing anything. Right? If you think there might be ransomware, you are going to hunt for lateral movement, you’re going to look at how they got in. You are going to do all those things. It is just more of a mindset of, if someone comes out with a blog post or a tweet saying, oh my gosh, it’s Russia, you’re like, um, let’s stay focused on scoping the incident. Let’s not trust every random open source thing. So, I think you can be skeptical and still take action, if that makes sense.


   >> LESLEY CARHART:  From a scientific method perspective, when you’re doing good science, you’re always trying to disprove your hypothesis. That’s what we’re doing in incident response too. We're still doing science. We're still doing an investigation. We are still looking for evidence. We are still trying to corroborate it. But we are trying to disprove our hypotheses instead of prove them because there can be confirmation bias there.


   >> WENDI WHITMORE:  I think one thing too, just specific to ransomware: it works in an organization's favor to buy more time. So, especially in those first twenty-four, forty-eight, seventy-two hours, the more information that you can glean from the potential attacker – engaging with them to buy your organization time to then make a decision based on the data that unfolds over that time – the significantly better off you will be in terms of lessening the operational impact. If you do decide that, hey, potentially paying the ransom is the option we have here, you then use that time to negotiate that as well.


   So, you know, attackers tend to always use time as a pressure valve on victims and that's something that organizations can use very effectively against them.


   >> AUDIENCE:  Thank you.


   >> LILY HAY NEWMAN:  One other question that I would ask, because we didn't quite get to this: in that vein, how do you all approach – or what are some of the differences in dealing with – incidents that are criminal in nature versus government-backed actors? In the first twenty-four hours, or the sort of timeline that we've talked about, what are the differences there as you are trying to sort out either what is going on or who is the actor, or, if you have a sense, trying to start that process?


   >> LESLEY CARHART:  I'm not sure I'm going to agree with my peers on this one but I’ll give my opinion here.


   >> LILY HAY NEWMAN:  We love it.


   >> LESLEY CARHART:  First of all, I am cautious about any attribution to countries. That's a dangerous thing to do unless you have boots-on-the-ground intelligence. There are false flag operations. There’s confusion about subcontractors and criminal organizations that are involved with countries. So, you attribute criminal actors and state-based actors in similar ways. You look at what they can do, what their capabilities are, how they operate.


   So, in some ways, you do things very much the same for both of them. You are responding in the same way, and that's also because their capabilities are starting to become fairly equivalent in a lot of ways. There are more resources available to criminal actors. They have made a ton of money on things like ransomware and BEC. And they can do things that were traditionally reserved capabilities for state adversary groups.


   >> KATIE NICKELS:  Yeah, I will just say, attribution can matter but when you are starting a response, I don't think that focusing on what country it is is the best approach. Thinking about can we identify the malware, the patterns, how they might be moving laterally. I think that's much more important than the country, although that sometimes matters too later on.


   >> WENDI WHITMORE:  Yeah. Completely agree. I think at the onset of an investigation, it's the same approach. Where it could change would be the outcome or what the next steps are. You know, if you have got data that you need to relay potentially to an intelligence organization, if it’s national security related, there may be some different handling and steps there. But initially, you approach every investigation in a very similar manner.


   >> KATIE NICKELS:  Someone over there who has been patiently waiting for a question.


  >> LILY HAY NEWMAN:  Oh. Oh, sorry. Hi.


   >> KATIE NICKELS:  It’s okay.


   >> AUDIENCE:  Hi. I'm going to roll it back to the earlier question about that whole twenty-four to seventy-two hour window and when do you start sharing and when do you let anybody know. And it is interesting – the take that I am getting from you guys is kind of keep some things close to the vest until you are really, really sure.


   But on Monday, I went to this Modern Bank Heists session where the U.S. Secret Service’s Matt O’Neill said that's the crucial time to reach out to the Secret Service or FBI or anybody, because of things like huge cash-out transactions and transaction fraud. In the first twenty-four hours, if they can stop it, then they can claw back most of the money, or much more of it than if they wait. And the longer you wait, the more money an organization is going to lose.


   So, could you kind of discuss that friction between keeping it close to the vest and letting law enforcement know as soon as possible to limit financial payment?


   >> KATIE NICKELS:  I’ll add some homework: there should be an incident response plan, right? Of course, go back, update that. Think about establishing those relationships early so you have a Secret Service agent or FBI agent where you can say, hey, we saw this thing today. If there is a weird wallet address, we think this might be unusual – we're not totally sure yet, but I want to give you an early heads up.


   I think it is all about establishing that trust before the incident and then being able to say with a level of confidence we are not totally sure but here is an early tip so they can take action.


   >> WENDI WHITMORE:  It sounds to me like he was referring to a very specific type of transaction. In cases, right, that are financially motivated – maybe wire fraud, BEC related – in those cases, yes. They are talking specifically about the measures that the U.S. government, working in coordination with other governments abroad, can leverage to claw that money back. So, that's absolutely critical in those cases.


   But Katie's point is super valid. Have the relationships in advance. Have a cell phone number of someone who is going to answer your call on the other end and be able to say, oh hey, yeah, this is something we do need to get involved in – or maybe it's not the case, right?


   >> LILY HAY NEWMAN:  Well, I feel like we all got a lot of fascinating insights from the three of you and a lot of good homework. So, everyone get on it. Thank you all for your time.


   >> WENDI WHITMORE:  Thank you.

Lily Hay Newman


Senior Writer, WIRED Magazine

Lesley Carhart


Technical Director, Incident Response, Dragos, Inc.

Katie Nickels


Certified Instructor, SANS Institute, Director of Intelligence, Red Canary

Wendi Whitmore


Senior Vice President, Palo Alto Networks
