Given that hundreds of thousands of Americans now work in cybersecurity, yet cyberattacks continue to plague companies and other organizations across the United States, it’s only natural to ask about the current state of professional development within the field.
Sadly, and perhaps predictably, it is mediocre and getting worse.
The biggest reasons are that in-house training is relatively scarce and that the shortfall of cyber pros continues to grow. The explanation? Typically, overworked veterans are too busy to help train newcomers, and not nearly enough people want to enter the field, notwithstanding the allure of generous salaries.
This conundrum isn’t merely a matter of opinion. According to the recently published 2022 Cybersecurity Skills Gap research report produced by Fortinet, a multinational corporation based in Silicon Valley that produces a wide array of cybersecurity solutions, 80 percent of organizations suffered one or more breaches that they said could be attributed to a lack of cybersecurity skills or awareness.
The upshot: Cyber professional development needs a serious refresh. The overall IT market moves very fast—and security requirements even faster—because serious hackers are constantly changing their attack methodologies, regularly forcing cyber pros to scramble to keep up. Nonetheless, things are fixable. Needed are better training and support and more flexibility and creativity in hiring.
Consider the relative dearth of training today. Cyber employees typically receive a day or two of it when they are hired and a mild brushup once a year thereafter. This isn’t nearly enough amid constant, rapid change. Many employees forget much of what they learned within a few months, never mind readily absorbing new technological developments. This is why the Advanced Computing Systems Association recommends that companies host cybersecurity training every four to six months, preferably with the help of interactive examples and videos.
The cyber hiring shortfall is an even bigger problem. According to CyberSeek, a top resource on the U.S. cybersecurity job market, there are now 600,000 unfilled cyber jobs in the United States, up from 465,000 18 months ago. Globally, the situation is far worse. There are 2.7 million unfilled cyber jobs worldwide, according to Cybersecurity Ventures, compared to roughly 1 million a decade ago.
Compounding matters, seasoned cyber pros regularly move to a new job in three years or less. Studies show this is less a matter of pay and more about employees feeling isolated and overwhelmed. Predictably, workplace niceties tend to take a back seat when security staff are chronically busy and understaffed.
Given the chronic risk of a cyber breach, it may seem hard to fault the high priority placed by big companies and organizations on in-depth security knowledge. If a standard software or web developer makes a mistake, they may need to push back a deadline and irritate a client, but that’s about it. On the other hand, if an employee’s task is to develop and implement security systems, an error could lead to severe data loss and potential financial losses.
Consequently, most companies still prefer that hires hold a bachelor’s degree in computer science or computer engineering. They want cybersecurity certifications, as well as experience in select specialties, such as intrusion detection, secure software development, and network monitoring.
Nonetheless, persistent and constantly growing shortfalls in the cybersecurity workforce cry out for compromise. In fact, cybersecurity has long been a field that embraces some people with non-traditional backgrounds. Many young cyber pros lack degrees in computer science or related fields; many got at least part of their training at one of the scores of cyber boot camps nationwide or at a community college. What most of them needed (and had) was at least some digital security experience, critical thinking skills, and a passion for learning.
Indeed, would-be cyber pros are increasingly learning core IT skills through community college programs, often online, and cyber boot camps, and they manage to glean some real-world experience, often through an apprenticeship arranged by a boot camp or a community college. Some companies then welcome them in an entry-level IT role, such as a help desk technician or a network administrator. More of this is needed.
Among the biggest boot camp providers is publicly traded 2U, which runs more than 200 boot camps, full- or part-time, in eight disciplines, including cybersecurity, at scores of universities, including Columbia and Northwestern. Another major boot camp purveyor, ThriveDX, offers training at more than 50 universities, including the University at Buffalo, the University of Miami, and Kansas State University.
Some universities are also becoming more aggressive in providing cybersecurity training. For example, East Tennessee State University and health insurer BlueCross BlueShield of Tennessee (BCBST) have teamed up to put students through a 27-month intensive bachelor’s degree program that includes paid internships at BCBST. In addition, the Cyber Talent Hub—a new platform that will allow companies to offer customized online training on cyber technologies for free at select universities—is about to be unveiled. Colleges initially involved will include New York University, the University of Chicago, and the University of Michigan.
Much more needs to be done. But as more positive developments come to fruition, the cyber skills shortage may start to resolve itself. Then it will become clear that there’s not really a people shortage, merely a knowledge shortfall.