ACSME Panel: What is the future of assessment?


Dr Joanna Tai

Deakin University

Very early in the pandemic, the team at CRADLE (the Centre for Research in Assessment and Digital Learning) was very much in demand for advice about online assessment – and particularly about concerns around academic integrity. One of the first things we worked on was this guide on Digital Assessment. Looking back at it now, many of the solutions required considerable academic effort to get going, which seemed acceptable at the time because we assumed things would blow over quickly. The question remains: how can we ensure our assessment does all the things it needs to within real-world constraints?

The shift to emergency remote teaching also presented opportunities to understand both student and staff experiences. In March 2020, CRADLE joined an international collaboration involving partners from Canada, the USA, Belgium, the Philippines, Singapore, and Australia. Together, we sought to investigate how teaching and learning changed as a result of the pandemic, stay-at-home orders, and the significant shift to online learning. In my wholly Australian experience as a student and academic, most courses already used a university LMS to share resources with students – but would you believe it, at some overseas institutions, some lecturers and students had never had to use an LMS before 2020!

Our first paper has been published, titled A multi-institutional assessment of changes in higher education teaching and learning in the face of COVID-19. The key finding is that everyone found switching teaching and assessment methods in the face of the pandemic difficult: no matter how much prior experience teachers had, they felt the switch impacted their ability to teach students successfully. A number of further papers from the collaboration are still in the works.

I also led a research project funded by the National Centre for Student Equity in Higher Education (NCSEHE) on Re-imagining Exams 2020 – 2021, with the goal of understanding how we can make assessment more inclusive for students from diverse backgrounds. We focussed particularly on students with access plans, and those who came from a rural, regional or remote area, or from a low socio-economic status background. In this work, the significant shift to take-home/open-book exams with extended timeframes for submission meant that many students were less stressed about their assessment – and felt that in some cases, these types of tasks were more aligned with what they would do in their future careers. However, other students still wished they could simply take a multiple-choice exam to demonstrate their knowledge. Overall, we concluded that there is no one perfect solution for inclusive assessment: what matters is that we discuss assessment design with all stakeholders to figure out what range of assessments might work best for a diverse student group. There is a range of resources already available at our website linked above, and we hope the full report will be out soon.