Podcast Transcript
Introduction:
You're listening to the RSA Conference podcast, where the world talks security.
Kacy Zurkus:
Hello listeners, and welcome to this edition of our RSAC 365 podcast series. Thanks so much for tuning in.
Kacy Zurkus:
I'm your host, Kacy Zurkus, content strategist with RSA Conference. And today I am joined by our guest, Safi Mojidi, who will be talking about why DevSecOps approaches are critical to securing virtual healthcare applications, particularly for the LGBTQ population.
Kacy Zurkus:
Before we get started, I want to remind our listeners that here at RSAC we host podcasts twice a month, and I encourage you to subscribe, rate, and review us on your preferred podcast app so that you can be notified when new tracks are hosted. And now I'd like to ask Safi to take a moment to introduce himself before we dive into today's topic.
Kacy Zurkus:
Safi, over to you.
Safi Mojidi:
Hi Kacy, thanks for having me. Well, I am currently a doctoral candidate and cybersecurity specialist with about 15 years of experience in cybersecurity. I have led the design, implementation, and execution of many enterprise cloud security programs.
Safi Mojidi:
I've led work at NASA, Department of Justice, Department of Defense, Slack, and Salesforce. And currently I am the head of information security at a healthcare startup called FOLX Health where we provide telehealth care that really centers on the queer and trans community, but is certainly available for anyone regardless of how you identify.
Safi Mojidi:
I've also founded a nonprofit called Hacking the Workforce to really increase the visibility and retention of Black LGBTQ professionals in leadership positions within cybersecurity, but more broadly tech. I am also a two-sport Hall of Famer in basketball and semi-pro tackle football.
Safi Mojidi:
In my free time I love working out, roller skating, and traveling with friends and family. So happy to be here.
Kacy Zurkus:
And we are so happy to have you here with us today. So let's go ahead and jump in with the first question, which will hopefully set the stage for the connection to DevSecOps and software integrity. Can you maybe share with our listeners, what is your perspective on the ways in which virtual health providers create favorable conditions for underrepresented individuals to access healthcare?
Safi Mojidi:
Yeah, that's a great question, Kacy. I think we are at a very interesting time, both in healthcare and the opportunities for access that technology can provide. I think at the start of 2020, the COVID-19 health crisis really ushered in a perfect opportunity to fundamentally shift how healthcare is provided and accessed.
Safi Mojidi:
Around 97% of all primary care physicians were essentially forced to leverage telehealth, and that sector grew by almost 92%. It's a market that's poised to grow to almost a $400 billion industry in the next five years. So moving forward, health systems will really need to focus on their ability to continuously and accurately evaluate the patient experiences that they're providing their members, as well as on increasing operational efficiency and clinical outcomes.
Safi Mojidi:
But I really think as we look to the future of healthcare, it's important to ensure that telehealth doesn't now make it open season on underrepresented individuals while they seek access to care.
Safi Mojidi:
What I mean by that is the question is no longer when providers will finally embrace telehealth, but how successfully they'll be at implementing the necessary security features to continue to scale its use. And I really think DevSecOps is one of the most important practices healthcare companies can implement to reduce risk, increase software integrity, and lower the likelihood of a data breach or major security incident.
Safi Mojidi:
As we see more interconnection in data exchange between healthcare organizations, the underlying platforms will continue to see quality or security issues that could really lead to serious breaches or system outages. Both those things can really erode the trust of a patient and harm the organization's reputation.
Safi Mojidi:
So employing strong DevSecOps principles is necessary not only to prevent the things I just mentioned, but also to ensure that software delivery is consistent, that we can reduce the number of incidents, and that we really maintain a positive customer experience for all patients.
Safi Mojidi:
The last thing I'll mention is that part of the importance of DevSecOps is really figuring out a way to abstract humans out of the process. The sheer magnitude of HIPAA, state, and federal rules and regulations is difficult to implement and certainly more difficult to manage manually. So it's really important that development happens with iterative, automated approaches to deploying software that is secure and able to scale.
Kacy Zurkus:
Okay, that's a lot. And I want to try and maybe break some of that down in some follow-up questions.
Safi Mojidi:
Sure.
Kacy Zurkus:
You talk about access and data breaches, so with the increased use of virtual health apps, there's clearly this influx of digital health records, which as you point out opens the door to cyber threats. So then can you talk about some of the risks to that personal health information? And obviously it's more desirable because it's more lucrative for the attacker to access personal health information than just personally identifiable information, right?
Safi Mojidi:
Absolutely. I think the industry has made some great strides over the last let's say 15 years in making sure that stakeholders are connecting with the information the right way and accessing it the right way. But mistakes happen; there's no 100% foolproof method to deploy completely secure applications, particularly mobile applications.
Safi Mojidi:
But I think it's really become evident just how nefarious actors are looking to interact with healthcare industry systems. So within the past few years, identity theft has become a huge issue, ransomware, and then from a nation state perspective, we're starting to see more and more cybercrime and cyber espionage that healthcare companies are now becoming victim to, because they do hold such valuable information about so many different people.
Safi Mojidi:
One of the things that we would like to try to see is effectively controlling how people are interacting with healthcare applications and systems. So we want to make sure we're providing people with the tools so that they can protect themselves, and I'll go into some of those a little bit later. But a data breach of a healthcare company will almost certainly include a patient's demographic information, but also their health and financial information. So medical records, credit card information. Sometimes it can include diagnosis, treatments, lab test results, prescription information, and definitely other health insurance related types of information.
Safi Mojidi:
When a patient's medical information is exposed, the effects of that can be catastrophic on someone's life. For example, exposed medical information puts those individuals at greater risk for insurance fraud or identity theft. And when we're talking about marginalized communities, this can really have deleterious effects on finances, mental health, and ultimately quality of life.
Safi Mojidi:
So for members of the LGBTQ community, this can quickly become a safety issue in that information about their gender identity or sexual orientation is no longer confidential. So really talking about significantly impacting people's lives if we are not good custodians of their data and or effectively implementing the necessary controls to prevent some of those threats from actually being realized.
Safi Mojidi:
And I want to just also say this isn't just an issue for patients, this is absolutely also a business issue. According to research done by IBM, the cost of the average data breach is about $4.2 million. There was a startup called myNurse, they actually had to shut their doors two months after a data breach because they effectively couldn't recover from the amount of penalties and the loss of reputation that came along with their data breach.
Safi Mojidi:
So these types of incidents can expose personal data, and now as a patient, I'm worried about how can I be assured that moving forward this company, this healthcare company, this physician's office will really be a good custodian of my data? How can I overcome the trauma of dealing with my personal health information being made public and/or being held for ransom?
Safi Mojidi:
So I've come to a doctor's appointment and I'm not able to have my visit because my physician isn't able to access my medical records, because they've been held for ransom. So those breaches really erode relationships between patients and their clinical teams. In certain communities, virtual health or telehealth was the one way that made access to qualified medical care a possibility. So now patients may be less willing to seek medical attention, share potentially pertinent health information with the medical staff, and ultimately that leads to unfavorable health outcomes for that individual.
Kacy Zurkus:
Absolutely, and this is not directly related, but it got me thinking of it as I was listening to you. I read last week that Infosecurity Magazine had reported that sex [inaudible 00:10:32] cases in the UK doubled in 2021 compared to 2020. And it is the ability of malicious actors to use this private information, even if it's not sexting, but private information that no one wants publicly shared, that then becomes a vulnerability for those who are already vulnerable.
Kacy Zurkus:
So definitely cognizant of the threat. So what then are some DevSecOps approaches that these app developers or virtual health providers can take in order to secure digital health?
Safi Mojidi:
That's a good question. I think there are definitely some approaches that remain true regardless of the technology stack being used. So for example, minimize storage of sensitive data. I always tell developers and engineers, if you don't need it, don't store it. If it's not germane to figuring out whether your code works or whether the application is functional, certainly don't store that information locally on your machine, and purge it from the places it may happen to transit through.
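For illustration only, here is a minimal Python sketch of that "don't store it if you don't need it" idea, redacting sensitive fields before a record is logged or cached; the field names are hypothetical, not anything specific to FOLX Health:

```python
# Hypothetical example: strip sensitive fields from a record before it is
# logged or cached, so data you don't need never gets stored.
SENSITIVE_FIELDS = {"ssn", "dob", "diagnosis", "insurance_id"}  # assumed field names

def redact(record: dict) -> dict:
    """Return a copy of the record with sensitive fields masked."""
    return {
        key: "[REDACTED]" if key in SENSITIVE_FIELDS else value
        for key, value in record.items()
    }

if __name__ == "__main__":
    visit = {"name": "A. Patient", "ssn": "000-00-0000", "visit_reason": "follow-up"}
    print(redact(visit))  # {'name': 'A. Patient', 'ssn': '[REDACTED]', 'visit_reason': 'follow-up'}
```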
Safi Mojidi:
So another good one, because APIs are the thing of the future, is really securing the backend and those connections from malicious attackers or malformed APIs. So what we're really talking about here is API authentication and really securing their transport mechanism. So really understanding thoroughly what APIs are doing, what access they have within your environment, is that access overly permissive?
Safi Mojidi:
What areas are you able to lock down an API and still maintain functionality? Additionally, ensuring that, as stewards of technology, we are creating an opportunity for high-level authentication or complex passwords. And in certain situations, even multifactor authentication.
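As a rough sketch of that kind of API authentication, here is a hedged example of issuing and verifying a short-lived, narrowly scoped bearer token, assuming a Python service and the PyJWT library; the claim names and scopes are illustrative, not a prescribed scheme:

```python
# Hedged sketch: issue and verify a short-lived, narrowly scoped bearer token.
# Assumes the PyJWT package (pip install PyJWT); claim names are illustrative.
import time
import jwt

SECRET = "replace-with-a-key-from-a-secrets-manager"  # never hard-code in real code

def issue_token(subject: str, scope: str) -> str:
    """Give the client only the scope it needs, for only 15 minutes."""
    claims = {"sub": subject, "scope": scope, "exp": int(time.time()) + 900}
    return jwt.encode(claims, SECRET, algorithm="HS256")

def verify_token(token: str, required_scope: str) -> dict:
    """Reject expired or tampered tokens, and tokens with the wrong scope."""
    claims = jwt.decode(token, SECRET, algorithms=["HS256"])  # raises on bad signature or expiry
    if claims.get("scope") != required_scope:
        raise PermissionError("token scope does not permit this operation")
    return claims

token = issue_token("patient-123", scope="records:read")
print(verify_token(token, required_scope="records:read")["sub"])
```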
Safi Mojidi:
I think we do ourselves a disservice, the healthcare industry or the cybersecurity industry, if we're not enforcing the need for strong, complex passwords. So at least 12 characters, including one uppercase letter, one lowercase letter, a special character, a number, potentially a space. And then on the flip side of that, start implementing or recommending that people, when they're signing up for accounts, use passphrases, things that are easy for them to remember but still meet the criteria of a complex password with those components I just mentioned.
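A minimal Python sketch of the policy just described; the sample passphrase is made up:

```python
# Minimal sketch of the policy described above: 12+ characters with at least one
# uppercase letter, one lowercase letter, one digit, and one special character.
# Spaces are allowed, so a memorable passphrase can satisfy the same rules.
import string

def meets_policy(password: str) -> bool:
    return (
        len(password) >= 12
        and any(c.isupper() for c in password)
        and any(c.islower() for c in password)
        and any(c.isdigit() for c in password)
        and any(c in string.punctuation for c in password)
    )

print(meets_policy("Purple tractor beams 99!"))  # True  (a made-up passphrase)
print(meets_policy("password123"))               # False (no uppercase or special character)
```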
Safi Mojidi:
Other things, I think, are easier to miss if you don't have someone in-house who can speak to encryption, but just really making sure you're using industry best practice, modern encryption methods.
Safi Mojidi:
So really making sure everything's encrypted in transit to protect against privacy leaks or data theft, but then also you can't forget about implementing file-level, database, and source code encryption.
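For the at-rest piece, a minimal sketch of field- or file-level encryption, assuming Python and the `cryptography` package; in practice the key would come from a KMS or secrets manager rather than being generated inline:

```python
# Minimal sketch of field- or file-level encryption at rest using the
# `cryptography` package (pip install cryptography). In production the key
# would come from a KMS or secrets manager, not be generated inline.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # stand-in for a managed key
cipher = Fernet(key)

record = b"lab result: A1C 5.4"      # made-up example value
encrypted = cipher.encrypt(record)   # store this ciphertext, never the plaintext
decrypted = cipher.decrypt(encrypted)

assert decrypted == record
print("round trip OK, ciphertext length:", len(encrypted))
```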
Safi Mojidi:
Last but certainly not least, I will say application developers should really make sure that they are prioritizing penetration testing activities, both from an internal and external perspective. So internally, hopefully your tests are automated, you have a CI/CD pipeline, and you have gates in that CI/CD pipeline that will run quality control checks but also run vulnerability checks and assessments prior to code being deployed.
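As one example of such a gate, here is a hedged Python sketch that fails a pipeline stage when a vulnerability scan report contains high- or critical-severity findings; the JSON report format is an assumption, since real scanners each have their own output:

```python
# Hypothetical CI/CD gate: fail the pipeline stage if a vulnerability scan report
# contains high- or critical-severity findings. The JSON shape is an assumption.
import json
import sys

BLOCKING_SEVERITIES = {"HIGH", "CRITICAL"}

def gate(report_path: str) -> int:
    with open(report_path) as fh:
        findings = json.load(fh)  # assumed shape: [{"id": ..., "severity": ...}, ...]
    blockers = [f for f in findings if f.get("severity", "").upper() in BLOCKING_SEVERITIES]
    for finding in blockers:
        print(f"BLOCKING: {finding.get('id')} ({finding.get('severity')})")
    return 1 if blockers else 0  # non-zero exit code stops the deploy stage

if __name__ == "__main__":
    sys.exit(gate(sys.argv[1]))
```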
Safi Mojidi:
I think that's crucial, because you certainly want to identify vulnerabilities or flaws before they make their way into production. They're a lot easier to fix when they're still in development and certainly cost a lot less to fix. But I also think some people may overlook the importance of also having external penetration tests conducted at least annually. External penetration tests in my experience have been a good way to really identify public-facing security risks and vulnerabilities.
Safi Mojidi:
So as an application developer, you're not necessarily trying to brute force your way into an application; that might not be one of the things that you check for during your deployment checks. But that is something that, for example, would be covered if you were to conduct an external penetration test: really doing some enumeration, making sure that there aren't any vulnerable aspects of your code pipeline that are publicly available, or vulnerabilities or threats that are in the wild and have been confirmed as able to be compromised.
Safi Mojidi:
So doing your homework, doing some open source intelligence around how you select the vendors that you work with, the tools that you implement, and certainly the methods you use to deploy secure code into your production environment.
Kacy Zurkus:
And I think your point about the penetration test is so important, and it's really... I don't know what needs to change, whether it's the narrative or perception or what it is, but I feel like there is this fear from developers that if we go through this external pen test and they do discover a vulnerability, and then it is disclosed, that is going to reflect badly on us.
Kacy Zurkus:
And it seems to me that it's the exact opposite. Like, Hey, kudos to you, you found it before a malicious actor did.
Safi Mojidi:
Yes, so you hit on a few very important things there. The first of which is, I like to tell people you need to build your cybersecurity culture before it builds itself. And the example you just described is exactly what that is.
Safi Mojidi:
So most people are worried about job security. Everyone wants to do a great job. Everyone wants everyone to know that they're doing the right thing or they believe that they're doing the right thing. And so having an outsider come in to essentially audit your work is never a good feeling if it's proposed that way. But if we can shift the mindset and the culture into becoming less reactive, because that's what an incident causes all of us to do is to react.
Safi Mojidi:
So in the event of a breach, now we're going through our incident response plan and capabilities if we have one. So one of the things I like to tell people is no one's spying on you and it's not a mechanism to test your aptitude at your job, this is a business decision that needs to be made in order to reduce our risk as an organization.
Safi Mojidi:
So like you are a developer, your role is to develop code. We're not asking you to conduct security tests because that's not your wheelhouse, that's not your specialty. But if we don't have a mechanism in-house to make sure that happens, the world is full of consultants. You hire consultants when you don't have capabilities internally, and that's not a knock on anyone. It's just a thing, like cybersecurity professionals are hard to find. They're hard to keep. And the way you overcome that challenge is to hire a qualified consultant who you know is reputable and has worked with the type of environment you're asking them to conduct a penetration test on.
Safi Mojidi:
And so first I definitely say, it's about changing the narrative, as you mentioned, but leaders from the top down really being intentional about how they're communicating the importance of cybersecurity, the importance of understanding our cyber risk, the importance of understanding our risk posture, and what our level of acceptable risk is in an organization.
Safi Mojidi:
So at the end of the day, if a CEO or CIO says cybersecurity is important but we just don't have it in our budget, that is a risk that they are choosing to accept by not identifying a consultant or having an external penetration test conducted.
Safi Mojidi:
So it's not one that I would recommend, however, there are always the business requirements, there are always the economics that come into play. And one of the things that I like to tell people is that it's a lot cheaper to go ahead and pay a penetration tester now than, particularly with healthcare applications, to pay for the cost of a breach later.
Safi Mojidi:
The global average for the cost of a breach is over $4.2 million. And so healthcare, as I mentioned earlier, is the industry with the highest breach cost. So we can say on average, about 80,000 people are affected by a breach, and that costs about $4.2 million. I guarantee you it's a lot cheaper to go ahead and get a pen test than it is to deal with a breach.
Safi Mojidi:
I know security people are expensive, but we're not that expensive.
Kacy Zurkus:
So one thing I also wanted to touch on in this conversation is of course the beloved SBOM. There's been a lot of conversation around the software bill of materials. In 2017, the Health Care Industry Cybersecurity Task Force wrote a report on improving healthcare industry cybersecurity. And in it they wrote, these six imperatives that they identified in the report reflect a shared understanding that for the healthcare industry, cybersecurity issues are at their heart patient safety issues, which you've talked about.
Kacy Zurkus:
As healthcare becomes increasingly dependent on information technology, our ability to protect our systems will have an ever greater impact on the health of the patients we serve. While much of what we recommend will require hard work, difficult decisions and commitment of resources, we will be encouraged and unified by our shared values as healthcare industry professionals and our commitment to providing safe high-quality care. So profound.
Kacy Zurkus:
The report goes on to talk about the criticality of securing mobile devices and applications, and references the NIST special publication 1800. But that was five years ago. So what are other frameworks or guidelines that developers need to be mindful of in order to ensure software integrity?
Safi Mojidi:
Absolutely. Yeah, I remember being really excited about that report when it came out. I think both industries, information security and healthcare, realize that technology is coming and maybe we are not as prepared as we should be. I think there are a few organizations. The National Cybersecurity Center of Excellence, NCCoE, which is a part of NIST, was created a few years ago. It's a collaborative hub where essentially industry organizations, government agencies, and academic institutions work together to address business problems that are now becoming cybersecurity challenges for organizations.
Safi Mojidi:
So they provide easily adaptable example cybersecurity solutions that use the best standards, best practices, as well as commercially available technology to solve this problem. I think another good one is the Mobile Security Project created by OWASP. So this is more of a centralized resource that essentially gives developers and security teams the resources that they need to build and maintain secure mobile applications. And of course that's open source.
Safi Mojidi:
The goal there is really to classify mobile security risks and then help provide controls to reduce either their impact or their likelihood of exploitation. And I think the last thing for folks to start really being mindful of is privacy laws; they're coming. They're here in some states, but it's just a matter of time before there is some federal guidance that speaks specifically to privacy for technology.
Safi Mojidi:
Right now, the healthcare industry really needs to make sure that we're at least adhering to current laws. So there are laws currently in place that really deal with unauthorized access, malware, and viruses; that's a law in all 50 states. There are certainly denial of service laws, in I think around 25 states currently. Right now there are ransomware laws in a few states, spyware laws, and phishing laws.
Safi Mojidi:
So there are a lot of things that as we're developing applications and we're requesting information from patients or customers, there are still ways that we can be responsible in how we're communicating. And then certainly because of some of those identified risks that I just mentioned, we certainly want to make sure we're building in security controls that will prevent as much as possible those things from being exploited.
Kacy Zurkus:
So speaking of preventing things from being exploited, given the sensitive information that could be accessed if these healthcare provider applications were breached, is there a greater burden or duty of care for those who are knowingly developing applications that will be used by the LGBTQ population?
Safi Mojidi:
Absolutely. I think at the end of the day there's the technical aspect of controlling access to data in an application you're building, and ways to ensure that those technical controls are in place is absolutely compliance testing your application.
Safi Mojidi:
It's the first line of defense to ensure, first of all, that you're in compliance with HIPAA regulations, security rules, privacy rules, but it definitely helps you also identify potential security concerns in your application.
Safi Mojidi:
And in order to effectively manage or remediate those issues, it's definitely major to ensure you have the tools that can help you maintain and manage, and also create guardrails for developers. I think it's really crucial to reduce risk, but also helps developers be more productive and can hopefully abstract them away from some of the more manual repetitive tasks that can be automated. And now they can essentially pull back and take a look at overall, how are people interacting with their software?
Safi Mojidi:
That's where UX designers and developers come into play, where they're really taking a look at how the functionality of an application could better serve X population. And if we're talking about specifically the LGBTQ population, there are certain considerations that need to be made. This is a group of people who traditionally have not had the best relationships with healthcare providers or culturally competent healthcare providers who understand the ways to communicate, understand the importance of asking for pronouns, understand there shouldn't be stigma attached to gender identity or sexual orientation.
Safi Mojidi:
And telehealth is really unlocking doors for access to these people. But on the flip side, we're operating in an environment, tech specifically, where there is not a ton of diversity.
Safi Mojidi:
So the last thing that I would mention in terms of healthcare providers and the greater burden of [inaudible 00:26:29] care that they need to pay in order to serve this population is to really take a look at your development teams, your operations teams, and your security teams: are those groups of people diverse? Like overall, if your entire organization is diverse, great. For the most part, the healthcare industry is a pretty diverse melting pot of people from all walks of life, age, race, et cetera.
Safi Mojidi:
But if you don't have enough differing opinions and views on how to achieve goals, resolve problems, and really future-proof your application from ending up on the five o'clock news, you're exposed. So I think it is really important to make sure that those teams are made up of very different people, a diverse set of engineers or technical leads, so that there really is more insight into what the broader population needs out of your application or the services that you're providing.
Kacy Zurkus:
I love it. Safi, it has been so great to have you here. I love listening to you talk because you just have so much depth of knowledge about so many different things.
Safi Mojidi:
Thank you.
Kacy Zurkus:
Before we wrap up, do you have any parting words of your wisdom to share with our listeners?
Safi Mojidi:
From the healthcare perspective, continuing to be flexible and agile as technology facilitates modernization of our applications and systems. I think there is a perception that healthcare applications are just gargantuan, huge monolithic systems that cannot be secured unless they are locked in a room without internet access.
Safi Mojidi:
And that's not the way of the future. And I really implore the healthcare industry to try to do as much modernization as possible, and at the same time, to bring cybersecurity professionals along on the journey to help you really ensure that the modernization is happening in a way that is efficient, effective, but also secure.
Safi Mojidi:
And then I guess the last part would be: just like it's vital for members of the LGBTQ population or other underrepresented minorities in the US to see themselves reflected in the healthcare workforce or in their ability to access qualified medical care, I think it's equally important to ensure that that same level of diversity continues to be talked about and elevated in cybersecurity and, more broadly, tech.
Safi Mojidi:
I think without that diversification of thought, lived experiences, mindsets, perceptions, all those things come into play when we're developing applications, when we're securing systems, whether or not we consciously think about it. I think without that, we're all being left more susceptible to cyber crime.
Safi Mojidi:
And so I would just say as we continue to transform into a digital first society, a lack of diversity of thought will also increase our susceptibility to a continuing growing list of cybersecurity threats. So if we continue to rely on the same people, groups that ask the same types of questions, come from the same background, it's going to remain very difficult to continue to combat the numerous persistent threats to security.
Kacy Zurkus:
Absolutely. Safi, it has been great to have you. Thank you so much for joining us. I hope you'll join us again for a future event.
Kacy Zurkus:
Listeners, thank you so much for tuning in. To find products and solutions related to DevSecOps and software integrity, we invite you to visit RSAConference.com/marketplace.
Kacy Zurkus:
Here you'll find an entire ecosystem of cyber security vendors and service providers who can assist with your specific needs. Please keep the conversation going on your social channels, using the hashtag RSAC, and be sure to visit RSAConference.com for new content posted year-round.
Participants
Safi Mojidi
Head of InfoSec, FOLX Health