How will generative AI change cybersecurity teams?

A new report finds that about half of cybersecurity pros expect artificial intelligence to replace some roles and skills. Hiring, meanwhile, is increasingly focused on problem-solving rather than technical skills.

About half of cybersecurity "practitioners and decision-makers" anticipate that generative artificial intelligence (GenAI) will remove the need for cyber professionals to have certain technical skills or to fill certain roles, according to a survey by the cybersecurity membership organization ISC2.

Forty-nine percent of cyber professionals working in the government sector said they believe generative AI will lead to certain cybersecurity skills becoming obsolete, and 48% said they think generative AI could replace certain cybersecurity roles, according to additional data ISC2 shared with Government Technology. A slightly higher portion of the overall respondent group — which spans many industries — echoed those views.

But it's still unclear whether generative AI really will replace some skills, or which skills those would be. That ambiguity may be why many hiring managers across industries are prioritizing recruits with soft skills that will remain relevant regardless of how the emerging technology evolves.

Per the report, 59% of hiring managers don't know what skills are needed to be successful in an "AI-driven world." As such, many said they're currently prioritizing finding candidates skilled at problem-solving, teamwork and communication over those with technical skills like cloud computing security and risk assessment.

And even as AI's role in organizations grows, only 23% of government hiring managers worldwide (and 24% of hiring managers across industries) said they were actively looking for recruits with AI and machine learning skills. That may indicate these hiring managers are focused on immediate needs rather than ones that may take a few years to bear fruit.

In contrast with hiring managers' priorities, non-hiring managers — that is, professionals who don't influence the final decision on whether to bring on a candidate — were more likely to value AI and machine learning skills: 40% of non-hiring managers in government and 37% across industries said those skills are important for cyber professionals seeking to be hired or promoted.

Some cyber teams are already using generative AI, for instance to make information easier to access or to speed up "tedious" tasks. Here, governments in the global survey departed sharply from other industries: just 26% of government respondents said generative AI was built into their cybersecurity teams' tools, compared with 45% of respondents across industries. U.S. states, however, may be more in line with the wider trend: the latest Deloitte-NASCIO Cybersecurity Study found 41% of state CISOs using GenAI to support their security work and 43% looking to do so within the next 12 months.

Even as cyber teams see ways generative AI can help their own work, many worry that its use by other departments will introduce new data privacy and security risks. More than two-thirds of government respondents told ISC2 that their organizations need more regulations around how to use the technology safely, slightly higher than the portion of overall respondents saying the same.

Including cyber teams in shaping a state's generative AI strategy can help, and the recent Deloitte-NASCIO study found many state CISOs are doing just that: 88% were involved in developing their state's GenAI strategy, and 96% in developing their state's GenAI security policy. That outstrips the global average ISC2 found: only 60% of respondents across industries and countries said their cyber teams had a part in creating GenAI regulations and guidelines.
