Queen Mary Academy

Teaching with Generative AI: Balancing Innovation and Environmental Responsibility

Professor Cédric John raises the question: how can universities model responsible AI use while integrating these technologies into teaching and assessment?


The Hidden Cost of Generative AI

Generative AI has rapidly become a fixture in higher education. Since ChatGPT’s release in late 2022, educators and students have embraced large language models (LLMs) as tools for coding, writing, and exploring ideas. The technology’s accessibility through natural language makes it an equaliser for students with diverse linguistic backgrounds, and it supports curiosity-driven, independent learning. Yet beneath this pedagogical promise lies a significant environmental footprint that often goes unexamined.

Each AI query runs on energy-intensive data centres that rely on vast arrays of Graphics Processing Units (GPUs). These facilities occupy land, consume substantial amounts of water for cooling, and draw power that still comes largely from fossil-fuel sources. Training and operating a single model can emit thousands of tonnes of CO₂. While an individual prompt may seem trivial, the cumulative effect of millions of daily queries is substantial. In educational contexts, this raises a difficult question: how can universities model responsible AI use while integrating these technologies into teaching and assessment?

As an educator who has actively allowed AI in coursework, I have seen the enthusiasm with which students adopt these tools. But I also believe that embedding sustainability into digital practice must be part of any institutional AI literacy. The challenge is not simply technical; it is also ethical and pedagogical.

Embedding Environmental Awareness into AI Literacy

In my opinion, the first step is to acknowledge that AI literacy should include environmental literacy. Queen Mary’s new Critical AI literacy course for staff offers an ideal opportunity to start this conversation. Educators need to understand not only how to use AI effectively but also when its use is justified. Not every query or assignment benefits from automation.

Practical changes can help. Encourage students to write and refine their own text before turning to AI for feedback, rather than relying on it to generate full drafts. Use smaller, privacy-preserving, or local AI models for teaching demonstrations where possible; models deployed on premises by universities are likely to be less power-hungry than those running in large data centres. Discuss openly with students the resource implications of “free” online tools and the broader sustainability context of digital learning. As researchers, we should prioritise energy-efficient modelling and transparency in reporting computational cost. I acknowledge that this is not always easy, as the true environmental cost of AI can be difficult to pin down.

The goal of embedding this approach within teaching practice is to shift AI from being a convenience to being a conscious choice. It invites educators and students alike to reflect on proportionality: when is AI use pedagogically valuable enough to justify its energy cost? AI is not the devil, but neither is it always a silver bullet, and it does come at a cost.

Towards Sustainable Digital Pedagogy

By foregrounding the environmental dimension of generative AI, universities can align digital transformation with sustainability goals. This alignment supports the UN Sustainable Development Goals and strengthens our responsibility as educators preparing students for a climate-constrained world.

In my view, the path forward is not to reject AI, but to use it wisely, just as the calculator reshaped mathematics without replacing it. AI can expand access, enhance feedback, and deepen learning, but only if we remain aware of its material footprint. By integrating environmental considerations into AI literacy, Queen Mary educators can lead by example, modelling for students how innovation and sustainability can coexist.

The conversation about AI in education is often framed around ethics and academic integrity. It is time to add a third pillar: environmental responsibility. Each interaction with an AI system represents not just a cognitive act but a physical one, with real-world consequences. Our teaching can help make those consequences visible and shape a more sustainable culture of digital learning. And our research should focus on making energy production more environmentally friendly and AI less power-hungry.

Professor Cédric John

Head of Data Science for the Environment and Sustainability, Digital Environments Research Institute, Queen Mary University of London

https://www.qmul.ac.uk/deri/deri-people/deri-staff-/profile/cedric-m-john.html

