Digital Education Studio

AI Literacy and Inclusion - Bridging the Gap

Identifying and enabling medical and dental students to overcome traditional barriers to engaging with new digital technology, including AI

An interview with Dr Nehete, Faith Sarima Chinda and Tahmid Yameen Ahmad

Artificial intelligence and digital technologies are rapidly becoming embedded in both higher education and clinical practice, from lecture capture and automated transcription to diagnostic imaging tools and data-driven decision systems. These are no longer distant possibilities; they are increasingly part of students’ day-to-day academic and clinical environments, and they are shaping expectations of how knowledge, assessment, skills and professional practice unfold.

At the same time, the 10 Year Health Plan for England: Fit for the Future signals a healthcare future shaped by digital transformation, automation, and AI-supported decision making, placing explicit expectations on the digital readiness, adaptability, and critical judgement of incoming professionals.

In a recent conversation, the Digital Education Studio caught up with Dr Nehete, Clinical Reader in Restorative Dentistry, Faith Sarima Chinda, a final-year dental student, and Tahmid Yameen Ahmad, a fourth-year medical student, to discuss a new Learner Interns Project on AI literacy and inclusion, sponsored and funded by Queen Mary Academy. The initiative aims to understand how medical and dental students currently engage with AI and digital tools, and to identify both the enablers that promote engagement and the practical barriers that may be limiting confidence, access, or effective use.

What barriers to engaging with AI and digital technology concern you most in medical and dental education?

Dr Nehete

The barriers I see are often invisible rather than purely technical. We might have excellent facilities, but if a student hasn’t had much exposure to technology before, their confidence can be very low. Think of Vygotsky’s zone of proximal development: if something hasn’t been part of your everyday life, it is harder to extend into more complex systems. Socio-economic background, shared devices, and prior access all shape the scaffolding students need.

I often find that we assume availability equals accessibility, but confidence, familiarity, and belief in one’s ability to engage are just as important as hardware or software.

This project comes from the awareness that healthcare is moving very quickly towards digital proficiency and AI-supported workflows and diagnostics. If we do not address these confidence and access gaps now, some of our graduates will enter the workplace already at a disadvantage. The incredible work done within our widening participation agenda can truly bear fruit if we are cognisant of these factors that impact employability.

Can you describe a moment where you felt unsure or unsupported using digital or AI tools in your studies?

Faith

In dentistry, the shift from analogue to digital has been huge. When I did early work experience, the practice I observed still used traditional X-ray darkrooms, but in dental school everything is digital. That contrast really shows how quickly the profession has evolved and how suddenly expectations can change for students. The speed of change can feel overwhelming, especially when there is not always formal guidance on how to use new tools. A lot of the time we are expected to figure things out for ourselves.

I am excited by AI and digital platforms, and I can clearly see their benefits for both learning and patient care, but the uncertainty comes from not always knowing what is reliable, what is appropriate, and what is ethically sound. Technology itself is not the problem — the challenge is the lack of structured teaching around it. Having clearer direction on when, why, and how to use these tools would make a significant difference to confidence.

What solutions did you explore when facing those uncertainties?

Tahmid

My approach started mostly with self-teaching. I began university during the pandemic and relied heavily on handwritten notes, while some of my peers were already using digital tools and AI. Seeing how they were covering material faster and in different ways pushed me to start experimenting myself. Over time I began using AI to transcribe lectures, create flash cards, and interpret exam feedback. One of the most helpful things for me was taking general cohort feedback and asking AI to explain it in the context of my own performance. That helped me connect theory with clinical practice and improve more deliberately.

But I also realised that self-experimenting is not equal for everyone, and it can lead to over-reliance. Without guidance, it is easy either to avoid these tools completely or to depend on them too heavily. What we really need is structured support that shows us not only how to use AI efficiently, but also how to question it and maintain strong independent critical thinking.

What would meaningful engagement with AI look like for students in clinical training?

Dr Nehete

For me, meaningful engagement starts with willingness and belief. Students need to feel that they can try technologies without fear of failure. If we normalise experimentation and remove the stigma around not knowing and lack of experience, skill development follows much more naturally. When that foundation is there, students build confidence alongside competence.

Faith

For me, it is about critical understanding. We need to know what AI can do and what it cannot do and be able to interpret its outputs rather than simply accept them. It is about being informed and reflective users instead of passive consumers, with clear ethical guidance supporting those decisions.

Tahmid

For me, it is balance. AI should support our reasoning, not replace it. In both education and clinical practice, I see it as a partnership rather than something we depend on completely. Used thoughtfully, it can increase efficiency and exposure, but our professional judgement still has to lead.

Dr Nehete

The project’s intended outcomes move beyond awareness toward practical, measurable change. Through focus groups with medical and dental students across all years, surveys, and mixed-method research, the team aims to identify where confidence drops, where skills are being self-taught, and where structured support is missing. The purpose is to build an accurate picture of current student lived experience that can inform scaffolding for realistic, evidence-based educational responses aligned with emerging clinical workforce expectations.

These insights are expected to shape curriculum design so that digital and AI capabilities are embedded explicitly rather than left to informal experimentation or individual initiative. In parallel, the project recognises the importance of staff development delivered in authentic clinical and teaching contexts, ensuring educators feel as confident and prepared as the students they support.

Recurring priorities include strengthening foundational digital fluency, clarifying what counts as AI versus general digital tools, improving feedback literacy, and establishing ethical guardrails so that students learn to question outputs critically rather than accept them passively. The anticipated impact is a more coherent pathway from basic digital competence to specialised clinical application: from knowledge bases, communication platforms and note-taking tools to imaging, simulation, and AI-assisted diagnostics.

In this sense, AI literacy functions both as an inclusion measure and an employability strategy, equipping future clinicians to engage critically and confidently in increasingly technology-rich workplaces.

Students and staff are invited to participate in upcoming focus groups or to express interest in contributing to the AI Literacy and Inclusion project by contacting Dr Nehete. If you teach medical or dental students, or support their learning in any capacity, your perspective is particularly valuable. Educators who would like to share insights, facilitate student participation, or explore collaboration opportunities are encouraged to reach out directly to Dr Nehete to help shape the next stages of the project.

This project is part of the Learner Interns Programme, established by Professor Stephanie Marshall, Vice Principal (Education), to support the University in gaining student feedback on the entire student experience. The Programme is designed to ensure engagement of the student voice in developing strategic learning projects which deliver the Queen Mary 2030 Strategy and the Active Curriculum for Excellence.
