In recent months, several NEA Higher Ed unions have negotiated contracts that reflect the potential of artificial intelligence (AI) to upend their work. More will certainly follow, predicts Deborah Williams, lead negotiator for the Johnson County Community College faculty union, in Kansas.
“We’re at a new frontier in technology, which invites conversations of a variety of kinds, including bargaining,” Williams says. Whether the goal is a contract that addresses the use of AI by faculty or its use by management, these emerging topics can and should be addressed in collective bargaining agreements, she says.
“The more we can anticipate and develop language around these things, the better,” Williams adds.
Protecting human work
As faculty and staff unions begin to tackle AI at the bargaining table, their goals often include the following:
• Ensuring the human connection remains at the heart of teaching.
• Preventing management from using AI to do the job of union members.
• Ensuring employee evaluations are done by human managers (not by AI).
• Protecting union members’ intellectual property.
Fueled by a fear that AI could be used to replace union members’ jobs, the bargaining team representing City University of New York (CUNY) faculty and staff had two goals. The first? “To say AI won’t replace faculty,” says James Davis, Professional Staff Congress (PSC) president. “We wanted the university to make a commitment to human instruction for every single class that’s on the schedule, … regardless of modality.”
The human connection between teacher and student is “central to everything we should be doing,” Davis notes, especially as many CUNY students come from underserved communities.
The PSC bargaining team achieved its first goal, winning contract language that clearly says all course instructors will be human.
The second goal? PSC hopes to protect members who work outside of classrooms. “Do [administrators] want to put our professional staff out of work? I think so!” Davis says. This proved to be a line that PSC couldn’t get past in 2025, but, Davis promises, “It is an area we will return to.”
At Rowan College, in New Jersey’s Burlington County, protecting union members is also the priority, says union president William Whitfield. Faculty and counselors can imagine a day when chatbots advise students, or large language models compare student papers to rubrics “and spit out grades,” Whitfield says. But would this be good for students? Or educators?
“We don’t want to eliminate [AI] entirely because we know it’s fundamental to workplaces,” he notes. “The question is: How do we maintain our academic integrity while preparing our students to use these tools?”
The union and college settled on language that protects “individual duties and whole positions,” Whitfield notes. Specifically, it says:
“The College agrees that AI in all forms shall not cause the replacement, displacement or reduction of any Unit Member’s base workload. Unit member preference for teaching, course design/deployment, and directed study opportunities will continue to receive priority consideration. There will be no reduction in the number of unit members based solely upon the College’s use of AI.”
Strengthening union voices
As AI develops, it’s important to get a foothold in your contract now, advises Jason Eggerman, president of the faculty union at the Community Colleges of Spokane, in Washington.
“The college had this idea that we should wait until [the landscape] is no longer changing so rapidly,” but that will be never, he says.
In Spokane, both sides agreed to a contract noting, “AI is rapidly evolving,” and that it is the intention of the college and union that “future use of AI be done in a thoughtful and measured way, with due concern for student welfare and success.”
Eggerman points to a line requiring both sides to “engage in discussions surrounding AI going forward,” and says it has already paid off.
Recently, Spokane staff opted not to turn on new AI features in their online learning management system, saying it hadn’t been “discussed,” as required. “[The contract] is already protecting us from administrators unilaterally turning stuff on,” Eggerman explains.
Looking outside academia
As bargaining teams navigate this new territory, many are looking off-campus for inspiration. One Florida faculty member points to the Writers Guild of America, which went on strike for five months in 2023 to get a contract ensuring AI will be a tool for members and not a means to replace them.
Similarly, more than three dozen NewsGuild contracts include AI-related provisions. At the New Republic, the contract states that generative AI may be used by union members “as a complementary tool, … but it may not be used as a primary tool for the creation of [work].”
It also states AI shall not result in layoffs or reduced pay for union workers or be used to fill vacant positions.
Among K–12 educator unions, the St. Paul Federation of Educators (SPFE) led the way in 2025 with a contract that says school officials “shall not eliminate bargaining unit members as an immediate and foreseeable result of adopting or implementing generative AI technologies.”
It also states:
“In no event shall an educator be disciplined, involuntarily transferred, or receive an adverse employment action or evaluation solely on the basis of AI-generated data, metrics, or analytics.”
“Our district didn’t want to talk about this at all. They were like, ‘Let’s look at this in five years,’” says SPFE President Leah VanDassor. “Five years? No. You have to get them to look at it now.”