Artificial intelligence is big, but are companies hiring for AI roles too fast?

Since the release of OpenAI’s ChatGPT, the hype around generative artificial intelligence has had companies scrambling to hire talent that knows how to implement and leverage the rapidly evolving technology.

According to Indeed’s latest U.S. Jobs and Hiring Trends Report released in November, the number of AI jobs jumped 20 times from the beginning of 2023 to the end of October.

Job openings related to artificial intelligence skyrocketed in 2023. (image)

As part of that trend, the Chief AI Officer (CAIO) appears to be a hot new role in 2024, and the number of people holding it is rapidly increasing.

According to Glassdoor, 122 people with the title of Head of AI or Vice President of AI joined Glassdoor forums last year, up from 19 in 2022. In government alone, more than 400 federal departments and agencies are looking for a CAIO, and that number is expected to explode this year.


But what does this new role entail and who should fill it? Should the head of AI be an engineer, a lawyer, or have some other background?

The answers to these questions will vary by industry and company, but leaders need to get specific answers to them before investing in an AI-only position in the C-suite, says Asha Palmer, senior vice president of compliance at edtech firm Skillsoft.


Asha Palmer, Senior Vice President of Compliance at Skillsoft, says that while companies may not need a dedicated CAIO, they do need some form of oversight of their use of AI. (image)

“I don’t think there’s a one-size-fits-all answer,” Palmer told FOX Business. “I think companies have to [ask], ‘What is our use case? How do we use it? Where do we want to use it? How can we accelerate our business?’ These are all questions businesses should start asking themselves, and they’re probably already doing so.”

Next, companies should ask themselves what risks they face from using AI, and decide who is best positioned, within or outside the organization, to balance the opportunities and risks associated with the technology, Palmer says.


Rather than having a dedicated chief AI officer, Skillsoft has an AI governance committee, a cross-functional team that provides oversight with representatives from various departments, including legal, engineering, compliance, and customer success.

Whether a company chooses to hire a CAIO or establish a committee, Palmer says there needs to be an organizational focus on AI with a high level of oversight and visibility.


Asha Palmer says that if companies don’t already have an AI oversight policy in place, they should put one in place now. (image)

She said boards need to question their companies’ use of AI and the governance around it, because the technology is here to stay and will be used increasingly, accelerating efficiency and effectiveness. Moreover, companies that have not yet asked such questions should start now to stay ahead of the curve, because regulation surrounding AI is definitely coming.


Palmer said regulations are usually put in place after most companies already have some infrastructure in place, and act as an enforcement mechanism to bring everyone else along.

“I encourage companies that want to be good corporate citizens to start this journey without regulation and really develop a sustainable and strategic approach around AI. Again, this includes not only accelerators, but also guardrails and brakes.”


