Much of the discussion about generative artificial intelligence (AI) in the workplace has focused on concerns such as job loss or gains in productivity. However, Chris Benner, Professor of Sociology, argues that lessons from previous waves of technological change, along with current research on AI, can guide how innovation is used to improve work now and in the future.
Benner points out that industries like food service, agriculture, and personal services face challenges in understanding how technology can be used positively and how workers can be involved in implementing AI. In an interview, he explained that rapid job loss within any single occupational category is unlikely. “Most new technologies, like in previous rounds of rapid technological change, are changing tasks, not complete jobs, allowing job activities to shift over time. And technologies don’t determine outcomes on their own, but institutional choices, business models, policies, governance, and power relations do,” Benner said.
He emphasized that who shapes AI use in the workforce is crucial. “The key question is who gets to shape AI use in the workforce. The same AI tools can produce very different outcomes depending on who is involved in decisions about design, deployment, and governance. For example, we see instances where AI can either deskill work, intensify surveillance, and hollow out jobs, or it can augment workers, reduce drudgery, and improve job quality.”
Benner identified immediate risks associated with AI adoption: “The most immediate risks aren’t mass layoffs but algorithmic management and electronic monitoring; increased work intensity and loss of autonomy; racialized and gendered bias in scheduling, evaluation and discipline. These dynamics echo earlier waves of automation but AI scales and obscures managerial power in new ways.”
Asked about patterns across sectors where AI is being implemented differently, particularly in service and blue-collar jobs, Benner said automation will likely shift tasks rather than eliminate entire roles. He noted that large language models are already affecting professional occupations involving writing or analysis, since those workers have more flexibility to adapt their responsibilities.
He also pointed to opportunities that are being missed: “We are overlooking AI’s potential to make invisible skills visible… Some of the most economically and socially important work — childcare, early education, elder care… cannot easily be automated.” He added: “Generative AI could help here by helping us better understand communication… supporting training… making tacit knowledge more visible without replacing human judgment.”
On the broader implications for social benefits such as health insurance and retirement plans, which are often tied directly to employment, Benner said: “AI underscores the need to rethink social supports tied to employment… If AI produces broad productivity gains from that collective inheritance [of shared knowledge], it raises a basic question of fairness: Why don’t we treat some of those gains as a shared social return? This opens the door to ideas like an AI universal dividend… universal access to healthcare and lifelong learning.”
When asked how society can ensure AI is used positively at work rather than harmfully, Benner said: “We need to focus on worker-centered innovation… using AI to support training… better scheduling… reducing administrative burden so workers can focus on relational or creative work.”
He added that achieving these outcomes requires more than just market-driven adoption: “To achieve these types of outcomes requires workers’ voices… industry standards… public-interest governance.” He stressed updating labor standards for algorithmic management as well as investing in community-based learning infrastructure.
Benner concluded: “AI will not determine the future of work on its own. The real question is whether we treat this as another extractive technological transition — or as an opportunity to rebuild institutions… around work in ways that center dignity, equity, and learning.”