Digital Education Studio

What is shaping assessment in HE?

Jo Elliott, Reader in Learning Design; Jorge Freire, Johnny Lee, Thomas Hinks, Senior Learning Designers, Digital Education Studio

Introduction

Assessment in Higher Education is a complex area and discussions about assessment often result in as many questions as they do answers! Over the last twelve months, universities have been grappling with the impacts of rapidly developing generative AI tools and what these mean for our assessment practices and for academic integrity. Other conversations of note focus on the purpose of assessment and its role in students’ learning and development, and the degree to which our assessment practices are fair and inclusive of all our students. And, of course, we also need to consider the logistics and student and staff workloads associated with different types of assessment.

In this article, the DES Learning Design team summarise and share perspectives from some recent events and articles about assessment and the different ways in which we might approach it.

Social Justice and Authenticity in Assessment Practice

Jan McArthur is the Head of Educational Research at Lancaster University. In her talk for #TELresearchers (#HEresearchers) in January, she shared the story of her career to date and the events and lessons that led her to develop some of her most influential ideas, around Assessment for Social Justice and authentic assessment.

Assessment for Social Justice

Jan highlighted that our assessment systems are, on their surface, concerned with procedural fairness but in practice are deeply unfair. This unfairness arises, partly, through the selective way different hardships are handled. For example, we recognise that illness can negatively affect performance, but we don’t recognise the struggles of lower-income students who must work throughout their time at university. Jan’s suggestions for socially just assessment include moving the focus away from the procedures that organise assessment towards the outcomes of assessment, properly accounting for student differences and allowing students to have a say in assessment processes.

Authentic Assessment

Jan’s critique of authentic assessment stems from the focus on ‘real world’ activities and the conflation of real world with the world of work. For Jan, the term ‘real world’ suggests that students do not experience this real world during their studies and disempowers them from working towards change by limiting them to working within current workplace paradigms. She suggests that assessment should engage students in activities that are socially important, not just those that are important to the world of work, and should include questioning and discussion of why these activities are important.

Changing Assessment in an Age of Artificial Intelligence

AI continues to disrupt assessment in Higher Education. In January, Professor David Boud, Alfred Deakin Professor and Foundation Director of the Centre for Research in Assessment and Digital Learning (CRADLE) at Deakin University, Australia, led a webinar and discussion titled ‘Changing Assessment in an Age of Artificial Intelligence’. Dave posed the question: ‘what should change and what should stay the same in HE assessment practice, so assessment meets its multiple goals and obligations?’ Practical and research-informed, the webinar offered useful insights on developing assessment practices that meet integrity and assurance goals, as well as on developing feedback literacy and evaluative judgement.

One possible practice suggested by Dave was the use of interactive oral assessment at key points throughout the course. This suggestion prompted much discussion and debate amongst participants on the merits, challenges and scalability of these types of assessment.

The use of interactive oral assessments is further explored in ‘“Tell me what you learned”: interactive oral assessments and assurance of learning in the age of generative AI’. In the article, educators at the University of Sydney shared their experience of shifting from traditional written assignments to oral assessments, in which students present individually or in groups and answer follow-up questions to demonstrate their grasp of concepts. The educators highlighted that the presentation segment provides an excellent opportunity to teach students to use digital and AI tools responsibly for research and presentations, the type of task for which many of us might use AI. When AI tools are allowed but their output must be verified against credible sources and unit content, students learn to discern and incorporate accurate information. This enhances their digital fluency, a key QM graduate attribute.

The subsequent Q&A session allows educators to authenticate students’ mastery of the material they have just delivered and to probe for deeper understanding, ensuring academic integrity. Assessors pose questions from a tailored question bank created to align with the module's learning objectives and the specific content each student presents. For example, students might be asked to expand on an aspect of their presentation, clarify any ambiguities, or discuss the concepts in relation to other course materials. The Q&A sessions can be recorded for moderation and review.

The article reported that students valued the interactive oral assessments and their emphasis on communication and critical thinking. The workload for examiners was found to be similar to that for written assessments, although there were fewer student appeals of marks for the interactive oral assessments which might decrease workload over time.

Conclusion

There is growing consensus that assessment practices should support and extend learning – assessment for or as learning – as well as verifying what students know and can do. Authentic assessments, which consider the ways and contexts in which students will use knowledge and skills, play an important role. Yet, as Jan McArthur argues so eloquently (e.g., McArthur, 2022), authenticity shouldn’t be restricted to professional development and the world of work; we also need to think about how we prepare students to live and thrive in an increasingly complex and interconnected world. QMUL’s graduate attributes, encompassing things like communication, collaboration and teamwork, innovation and problem-solving, and digital fluency, highlight the capabilities needed to be ‘active global citizens.’

The rise and wide availability of generative AI challenges many of our existing assessment practices. However, as the University of Sydney educators highlighted, it also offers an opportunity. Our students will almost certainly use generative AI and many other digital technologies in work and in life, and we need to help them develop the skills to navigate and use these technologies effectively and ethically. Incorporating or allowing their use in assessment, with appropriate caveats and boundaries, can help students develop their information and digital literacy and ethical judgement. Including a range of assessment types, such as interactive oral assessment, reflective work and iterative or portfolio-type assessment, can help us maintain the integrity of our assessment processes while also making them fairer, more inclusive and more sustainable.
