How can we develop authentic online assessments that support our students’ learning and skill development? In July, our Digital Education Community of Practice was privileged to hear from the Blizard Institute’s Dr Joanna Riddoch-Contreras about her experience of designing such assessments.
Joanna designed and launched the BSc Neuroscience programme at QM in 2015. Since then, she and her colleagues in the Blizard Institute have been working to improve assessments across the programme, in which students learn about varied topics including neuroanatomy, pharmacology, physiology and translational neuroscience. Their focus in curriculum design has been on ensuring constructive alignment, helping students move forward in their learning journey and develop skills, and incorporating formative assessment and opportunities for students to use feedback. Feedback is a key component of the programme, delivered through both formal and informal avenues, such as class discussions and social learning activities, self-assessment and reflection, observations and advisor check-ins.
Since 2020, the majority of assessments have moved online, and Joanna has been mindful of ensuring academic integrity; since 2023, she has been piloting Cadmus to support assessment literacy and academic integrity.
Joanna shared three examples of authentic online assessment, implemented with varying degrees of success in the second-year module Cell and Molecular Neuroscience, a multidisciplinary module studied by Neuroscience, Biochemistry and Pharmacology students. Her aims in including authentic assessment in this module were to motivate and engage students with key concepts and ideas, develop their problem-solving and critical thinking skills, and provide meaningful real-world examples.
The first example was a data interpretation assessment, which replaced an essay. Joanna and the team wanted to make sure students understood and could interpret outputs from their ‘dry laboratory’/MATLAB sessions, and could integrate their understanding of theory and key concepts from lectures to interpret scientific results such as graphs and figures. Students were asked to interpret and explain scientific images, including an image they would not have seen before. This was an open-book assessment, and the team worked on the assumption that students might work together; Joanna noted that the nature of the questions made it difficult to identify collusion where students had correctly answered the data interpretation questions.
In the second example, students wrote a ‘News and Views’ article summarising one of three research papers, similar to the summary articles featured in research journals. This assessment was designed to help students develop their skills in reading research papers, identifying key findings, and communicating research to others. The team provided a range of scaffolding resources to help students prepare their article, and students could submit a draft for feedback to help them improve their final submission. Approximately half of the cohort takes the opportunity to submit a draft each year, and Joanna also shares some general feedback and suggestions with all students via the QMplus discussion forum and in class discussions.
The final example Joanna shared was a lab practical assessment. In the lab, students measure and record the conduction velocity of a nerve. In previous years, they have then completed questions in the lab which test their understanding of the factors that might influence the conduction velocity and other aspects of their recording. In the new version of the assessment, students completed the questions online after the lab session. The team quickly realised that the types of questions asked did not work well as an online assessment, as they did not allow markers to discriminate between different levels of understanding or to ensure that students had not colluded. They ultimately decided to move the assessment back into the lab session, and Joanna noted that moving assessments online requires the careful consideration and redesign of questions.
Student feedback about the assessments has been largely positive. Students reported that the assessments really made them think about scientific skills and reading the literature, and that they tested their knowledge and understanding well, rather than simply their ability to remember and repeat facts. However, students were concerned about the potential for collusion in some of the assessments, which the team has taken on board.
Students were particularly positive about the ‘News and Views’ assessment – they liked being able to submit their drafts and found that the assessment made them really read and engage with their research paper. They also found Cadmus’ prompting on paraphrasing helpful for developing their writing skills and appreciated being able to have their notes and the scaffolding resources within Cadmus to easily refer to as they wrote. Joanna noted that with so many students, the team have to be strategic about the amount of feedback provided – they take the approach of highlighting three good things and three things for improvement; having a clear and descriptive rubric alongside the more personalised feedback provides extra information for students about what they have done well and what they can improve.
Joanna concluded with three tips for designing online assessments:
- Structure and clear instructions are key to any assessment
- Be creative
- Be positive and encourage students to be active learners
Find out more
Find out more about using Cadmus for assessment
Check out Queen Mary Academy’s Assessment Toolkit and case studies