As politicians fight over the right approach to addressing cyber threats, the consensus seems to be that more training and education are needed, particularly in the area of critical infrastructure.  Many proclaim, somewhat accurately, that numerous cybersecurity weaknesses are the result of users clicking on links or opening files that they should not.  Logically, then, those same users should be subjected to more training and drills to reinforce the right practices.  In fact, anecdotal and more comprehensive evidence seems to indicate that training and awareness programs do lead to more responsible behavior.  However, such vigilance may be short-lived or, at least, limited to following a few basic rules.  As hackers’ methods change, it becomes increasingly difficult to keep large populations of users, who don’t have security as their primary responsibility, sufficiently vigilant.  Typically, security works best when it is integrated into tasks people do every day, like putting on a seatbelt before driving.  Combating the latest hacker tricks will need to rest with those who make a living worrying about security as a product vendor, integrator, or member of the company’s information security department.

And that is where a similar debate has been going on.  Just how do we effectively train and educate this workforce to create an effective cadre of “cyber warriors”?  Countless organizations have jumped in to help, and the federal government, through its National Initiative for Cybersecurity Education, is attempting to define the requisite skills, training, and education needed.  But here is where the need for competent cybersecurity professionals smacks head-on into the perennial problems of our education and training infrastructure.  Because security is about combatting a sentient adversary that is constantly changing tactics, educational and training programs that focus on presenting a lot of information and then testing the students on what they’ve retained hardly seem appropriate.  Hands-on exercises that present real-life scenarios are certainly more useful, but they are expensive to produce, frequently contrived, and limited to scenarios that can be replicated in a classroom.  Additionally, educational programs, in particular, are often more about helping employers identify the best candidates than they are about ensuring students grasp a concept or truly learn something.  The Khan Academy was founded partly on the principle that our educational system was failing students by not providing sufficient opportunity to truly master a concept or area of study.  Instead, the system proceeded with the goal of presenting course material at a fixed pace, and if a student didn’t catch on quickly enough, he or she was likely a candidate for some remedial learning program that was not targeted at his or her needs.

So what does this have to do with creating an effective cybersecurity workforce?  Well, for one, technical roles frequently found in the science, technology, engineering, and math (STEM) fields tend to demand mastery of a concept rather than mere exposure to a subject.  Where liberal arts subjects often exist to provide one of many possible ways to greater enlightenment, STEM subjects frequently contain the necessary ingredients for completing a task successfully.  An electrical engineer can’t design an electrical circuit without a strong understanding of the basic principles of electricity, and a programmer’s code won’t compile if he’s using the wrong syntax.  Similarly, a network security engineer can’t effectively protect a network without a strong understanding of protocols like Transmission Control Protocol/Internet Protocol (TCP/IP).  He can try to fake his way through fancy point-and-click user interfaces, but he’s likely to miss a lot.  Likewise, a security engineer entrusted to protect a coal-fired power plant is doing his employer a disservice if he can’t explain what a boiler does or the role that programmable logic controllers (PLC) and distributed control systems (DCS) play in keeping the plant running.

So how does an employer ensure such competencies?  One way would be to give tests incorporating real-life scenarios.  That could help, but the logistics can be tricky when done as part of the hiring process.  What employers want instead is to be able to trust that all these training and education programs actually produce competent candidates.  Ideally, students would not pass a course until they had demonstrated proficiency, rather than simply obtained a passing score on a multiple-choice exam.  Unfortunately, in adult education and training programs, the students are often footing the bill, and if the standard were mastery of all concepts taught, there would be a lot more failing grades and a lot fewer students willing to pay.  Similarly, many employers paying for training are hesitant to pay what it takes for employees to truly understand.  Quite often, training is viewed as a perk, and so more spending cannot always be justified.  There would appear to be a disconnect between the course material and one’s job responsibilities.  Clearly, employers are not seeing the extra value generated, or training budgets would not be the first things cut when belts tighten.  Perhaps the benefits of training and education are inherently hard to quantify and usually manifest themselves in ways that are not easy to see but are nonetheless valuable.  However, it could also be that training and educational providers are simply doing what they’ve been doing for hundreds of years and hoping that no one notices their failures, and that students are successful in spite of the education they’ve received rather than because of it.