Sitting under the lights of a broadcast studio stage in New York City, surrounded by cameras and aquamarine-green LED displays of city skylines and futuristic graphics, the tall, bespectacled CEO of a Swedish fintech company announced that, around December 2023, his company had stopped hiring. It had been a year since they’d hired any new employees at all, and they weren’t planning on adding new hires to replace their natural attrition rate of 20 percent per year. The plan was, instead, to let AI fill the gaps. Then he explained, “I am of the opinion that AI can already do all of the jobs that we as humans do.”
He had already experimented with delivering an update on company financials to the market not in person, but through an AI-generated version of himself. It was an attempt to level with workers who feel replaceable by AI, and the numbers show that a lot of people feel that way. BCG’s AI at Work research, published in June, found that 41 percent of respondents fear their job is likely to disappear within the next 10 years. In his eyes, this move said, “Look, a lot of jobs are going to be threatened … So let’s replace our [read: CEO] jobs first, and AI will become more popular than if it replaces other jobs. We are trying to play into that narrative.”
Those words didn’t age well.
In May, just five months after that broadcast appearance, news broke that the same fintech company had begun rehiring human customer service representatives to restore some of the 700 positions that AI initiatives had replaced. That same CEO admitted that the reliance on agentic AI had gone too far, leading to a drop in quality, a diminished market value, and a reevaluation that “really investing in the quality of the human support is the way of the future for us.”
Currently, we’re faced with a major disconnect between how we perceive AI in the workplace and what the actual lived experiences, and data points, show. It’s easy to find narratives that frame generative and agentic AI simply as a threat to jobs. The fear is warranted: Data shows that, yes, agentic and generative AI will eliminate jobs. But it’s also estimated to create more than it displaces.
A recent global study on the future of jobs estimates that around 170 million jobs, the equivalent of 14 percent of today’s total employment, will be created due to advances in and adoption of AI between 2025 and 2030, while 92 million jobs, or 8 percent, are calculated to be displaced. That tallies out to net growth of roughly 7 percent, or around 78 million new jobs worldwide.
On the flip side of the fear around AI—namely, agentic AI—there’s an overconfidence in the technology: a rush to invest in and implement it without due diligence. That leads to broken workflows, inefficiencies, lost revenue, security risks, and—as demonstrated by that Swedish CEO—understaffing. At best, it leads to cognitive dissonance that isn’t doing anyone any favors. One study published in June monitored experienced developers contributing to open source projects. Some of the developers used AI assistants; some didn’t. The result? Those using AI were 19 percent slower. The real twist? They thought they were 20 percent faster.
That same AI at Work study from BCG found another gap: 77 percent of employees think AI agents will be important in the next three to five years, but only 33 percent have a proper understanding of what they are. The data is telling us something clear: Upskilling is the way forward for both employers and employees.
“Companies cannot simply roll out gen AI tools and expect transformation,” Sylvain Duranton, global leader of BCG X and a coauthor of the report, said. “Our research shows the real returns come when businesses invest in upskilling their people, redesign how work gets done, and align leadership around AI strategy.”
When employees are more familiar with AI agents, they see them as a valuable tool rather than a threat. And when employers deeply understand their own work and how agentic AI can support it, that work can be broken down into bundles of tasks assigned to different roles, creating a roadmap for where AI has a chance to save time and improve the output.
When that understanding exists, then employees can be reskilled to use AI effectively within a company structure and individual team workflows. BCG has outlined a “recipe” for training and upskilling employees in the AI at Work report, which recommends investing in at least five hours of in-person sessions and offering team members ongoing access to coaching. The goal is to have people feel like they are completing tasks 20 percent faster, and actually be completing tasks 20 percent faster.
“Companies that reshape their workflows and invest in people are seeing superior results,” said Vinciane Beauchene, global lead on Human x AI at BCG and a report coauthor. “But that transformation must be accompanied by a clear people strategy.”
This kind of structure will mark the moment when organizations are ready to hire for, or upskill into, the positions that use AI. There will be a need for linguistics editors, quality control managers, agentic managers, AI editors, prompt engineers—some of the roles that agentic AI at work will create. Professionals may not be writing code or managing internal records anymore, but they will need to be able to monitor and guide the AI that is.
This reality is an important stake in the ground: a reminder that AI will not replace employees, but employees who don’t know how to work with AI may get replaced. With that in mind, it’s critical that both employers and employees commit to understanding, learning, and working with AI.
This isn’t the first time the job market has faced this type of shift. Currently, for every 10 professionals, at least one has a job title that didn’t exist in 2000.
It’s a good reminder that how AI impacts work—and the job market—isn’t a runaway train. Its effects on each organization, each job, and each task are in the hands of the people being affected. But that also means the responsibility falls on each individual: to think critically, to upskill, and to adopt an attitude of continuous learning. Or, at least, that’s probably what the first social media managers and data scientists from the early aughts would advise.