Why generative AI is not a simple win for university teaching
Universities are integrating generative AI into teaching to enhance creativity and efficiency. For design educators, whose work is closely tied to experimentation and originality, AI is quickly becoming embedded in daily academic routines. What remains unclear is whether this integration supports long-term professional well-being.
A new study published in Sustainability shows that generative AI use boosts innovation and self-efficacy while also intensifying anxiety and occupational stress among university design teachers.
Titled "Generative Artificial Intelligence Becomes a Colleague: Dual Pathways of Empowerment and Depletion in University Design Teachers' Work Behaviors," the study is based on survey data from 436 university design teachers in mainland China. It challenges the assumption that generative AI functions solely as a neutral or supportive tool. Instead, the authors argue that AI increasingly operates as a workplace actor, altering teachers' access to psychological resources and reshaping how they engage with their jobs.
Generative AI as a source of empowerment
According to the study, generative AI can meaningfully enhance teachers' sense of professional capability when it is perceived as supportive rather than threatening. For design educators, whose work depends heavily on creativity, experimentation, and visual communication, AI tools can reduce routine burdens and open new possibilities for instruction.
The research shows that higher levels of generative AI use are associated with increased teaching self-efficacy. Teachers who regularly integrate AI into their work report greater confidence in their ability to design lessons, solve instructional problems, and support student learning. This heightened self-efficacy, in turn, strengthens teaching-related well-being, reflected in greater satisfaction and emotional engagement with academic work.
Teachers who experience increased self-efficacy and well-being are more likely to engage in innovative work behaviors, such as experimenting with new teaching methods, redesigning curricula, and exploring interdisciplinary approaches. At the same time, these teachers show lower levels of work withdrawal, including reduced tendencies to disengage, procrastinate, or emotionally distance themselves from their roles.
In the study's framing, generative AI contributes to the accumulation of valuable personal resources, including confidence, energy, and creative capacity. When these resources expand, teachers are better positioned to invest effort in innovation rather than self-protection.
Importantly, the authors point out that these positive outcomes are not automatic. They depend on whether teachers perceive AI as aligned with their professional identity and pedagogical goals. When AI is seen as a collaborator that enhances rather than replaces human expertise, it can strengthen teachers' sense of agency and purpose.
The hidden cost of AI-driven teaching
The study also documents a parallel and less visible process: the depletion of psychological resources associated with generative AI use. As AI becomes more deeply integrated into teaching practice, many educators experience rising levels of anxiety and occupational stress that undermine engagement.
The findings show a clear link between increased AI use and heightened AI-related anxiety. Teachers report concerns about skill obsolescence, loss of professional distinctiveness, and the pressure to continuously adapt to rapidly evolving technologies. These anxieties are not limited to technical competence but extend to broader fears about role displacement and diminished academic autonomy.
In addition to anxiety, generative AI use is linked to elevated occupational stress. The study suggests that rather than reducing workload, AI adoption can introduce new demands, including the need to monitor outputs, ensure academic integrity, and meet rising expectations for productivity and innovation. For some educators, AI becomes another source of performance pressure rather than relief.
These negative psychological states feed into a resource-depleting pathway. Anxiety and stress weaken teachers' capacity to engage creatively with their work, suppressing innovative behavior and increasing withdrawal tendencies. Teachers experiencing higher levels of depletion are more likely to limit effort, avoid experimentation, and disengage from professional development activities.
Put simply, generative AI is not inherently beneficial or harmful. Its impact depends on how it interacts with teachers' existing resources and vulnerabilities. Without supportive conditions, the same tools that enable innovation can also accelerate burnout.
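The article does not describe the authors' statistical model, but the dual-pathway logic can be sketched in a few lines of Python. The sketch below uses simulated, standardized scores and simple OLS regressions via statsmodels; every variable name and coefficient is hypothetical and only mirrors the direction of the reported relationships, with AI use feeding an empowerment path through self-efficacy and a depletion path through anxiety.

    # Illustrative sketch only: simulated data and simple OLS regressions,
    # not the study's actual measures, model, or estimates.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 436  # matches the survey's sample size; everything else is hypothetical

    # Hypothetical standardized scores for the constructs named in the article.
    ai_use = rng.normal(size=n)
    self_efficacy = 0.4 * ai_use + rng.normal(size=n)   # empowerment pathway
    anxiety = 0.3 * ai_use + rng.normal(size=n)         # depletion pathway
    innovation = 0.5 * self_efficacy - 0.2 * anxiety + rng.normal(size=n)
    withdrawal = -0.3 * self_efficacy + 0.4 * anxiety + rng.normal(size=n)

    df = pd.DataFrame(dict(ai_use=ai_use, self_efficacy=self_efficacy,
                           anxiety=anxiety, innovation=innovation,
                           withdrawal=withdrawal))

    # Mediation-style paths estimated as separate regressions:
    # AI use -> self-efficacy / anxiety, then both -> work behaviors.
    to_mediators = [smf.ols("self_efficacy ~ ai_use", df).fit(),
                    smf.ols("anxiety ~ ai_use", df).fit()]
    to_outcomes = [smf.ols("innovation ~ self_efficacy + anxiety + ai_use", df).fit(),
                   smf.ols("withdrawal ~ self_efficacy + anxiety + ai_use", df).fit()]

    for model in to_mediators + to_outcomes:
        print(model.params)  # opposite-signed paths show the dual-pathway pattern

Studies of this kind typically test such hypotheses with structural equation modeling on the survey responses; the point of the sketch is simply that the same predictor can push work behaviors in opposite directions through different mediators.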
Organizational context shapes AI outcomes
The study also analyzes contextual factors that influence how generative AI affects teachers. The authors identify two organizational variables that significantly moderate AI's impact: perceived organizational support and psychological contract breach.
Perceived organizational support refers to the extent to which teachers believe their institution values their contributions and cares about their well-being. The study finds that high levels of perceived support amplify the positive effects of generative AI. When teachers feel supported by their university, the relationship between AI use and self-efficacy becomes stronger, reinforcing empowerment and innovation.
Supportive institutions are more likely to provide training, clear guidelines, and reassurance about the role of AI in academic work. This reduces uncertainty and helps teachers integrate AI into their practice without fearing negative consequences. In such environments, AI is more easily framed as a shared resource rather than an imposed threat.
On the other hand, psychological contract breach intensifies the negative effects of AI use. Psychological contracts are the implicit expectations employees hold about their relationship with their organization, including fairness, stability, and mutual respect. When teachers perceive that their institution has violated these expectations, AI-related stress and anxiety increase sharply.
The study shows that under conditions of psychological contract breach, generative AI exacerbates resource depletion. Teachers who already feel undervalued or insecure interpret AI adoption as further evidence that their institution prioritizes efficiency over human expertise. This perception accelerates withdrawal behaviors and undermines trust.
These findings highlight the role of governance and institutional responsibility in shaping AI outcomes. Technology alone does not determine whether AI empowers or depletes teachers. Organizational signals, policies, and support structures play a decisive role in mediating its effects.
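Moderation of this kind is commonly tested as an interaction term in a regression: the coefficient on the product of AI use and the moderator indicates how much the moderator strengthens or weakens the focal relationship. The sketch below, again with simulated data and hypothetical variable names rather than the study's measures, shows that form for both moderators named in the article.

    # Illustrative sketch only: moderation expressed as an interaction term,
    # with simulated data and hypothetical variable names, not the study's model.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 436
    ai_use = rng.normal(size=n)
    org_support = rng.normal(size=n)       # perceived organizational support
    contract_breach = rng.normal(size=n)   # psychological contract breach

    # Simulate the pattern the article describes: support amplifies the
    # AI-use -> self-efficacy link, breach amplifies the AI-use -> anxiety link.
    self_efficacy = (0.3 * ai_use + 0.2 * org_support
                     + 0.25 * ai_use * org_support + rng.normal(size=n))
    anxiety = (0.3 * ai_use + 0.2 * contract_breach
               + 0.25 * ai_use * contract_breach + rng.normal(size=n))

    df = pd.DataFrame(dict(ai_use=ai_use, org_support=org_support,
                           contract_breach=contract_breach,
                           self_efficacy=self_efficacy, anxiety=anxiety))

    # "*" in the formula adds both main effects and their product; the product's
    # coefficient captures the moderation effect.
    m_support = smf.ols("self_efficacy ~ ai_use * org_support", df).fit()
    m_breach = smf.ols("anxiety ~ ai_use * contract_breach", df).fit()
    print(m_support.params["ai_use:org_support"],
          m_breach.params["ai_use:contract_breach"])

A positive interaction coefficient in the first model would correspond to support amplifying empowerment; a positive one in the second would correspond to breach intensifying depletion.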
Implications for higher education policy
While many institutions focus on technical capability and innovation metrics, the research suggests that neglecting the psychological dimension of AI integration may undermine long-term educational quality.
Design education, in particular, occupies a sensitive position. Creativity-driven disciplines rely heavily on educators' intrinsic motivation, professional identity, and emotional investment. Policies that treat generative AI as a cost-saving substitute for human expertise risk eroding these foundations.
The authors argue that universities should move beyond tool-centric AI strategies and adopt a resource-oriented approach. This includes investing in professional development that strengthens teachers' confidence in using AI, establishing clear boundaries around AI's role in assessment and instruction, and openly addressing concerns about job security and academic values.
Equally important is the need to reinforce organizational trust. Transparent communication, participatory decision-making, and visible institutional support can buffer against the stress and anxiety associated with AI adoption. Without these measures, AI risks becoming a catalyst for disengagement rather than innovation.
First published in: Devdiscourse