Bonnie McBain

ACSME Panel: What is the future of assessment?

Dr Bonnie McBain

The University of Newcastle

At the University of Newcastle we have recently revised our Bachelor of Science through a highly collaborative curriculum design process (called CCD) starting in 2017. One significant output of that redesign was six new core courses. These courses were largely developed to build graduate employability through a focus on scaffolding students’

  • transferable skills and
  • ability to work collaboratively with other disciplines in problem solving.

The first-year core courses have now been implemented three times, whereas the third-year core courses are currently being taught for the first time at scale. All the courses are blended, with information transfer in carefully curated online materials, leaving face-to-face time in workshops for highly interactive collaborative learning. Significant time is dedicated to undertaking assessments within the workshops, either doing activities to complete team tasks or feedback sessions to assist progress. In Newcastle we have been teaching online since the middle of the year in response to COVID-19 (I write this during 2021) and, of course, semester one of 2020 was also taught in that same national situation.

The assessments in the core courses were very much designed by working our way backwards from two questions: 1) what is it that students will be doing in the workforce? and 2) what skills will students need to be able to demonstrate to get into the workforce? i.e., assessments as portfolio pieces. We then designed assessments that could demonstrate these before constructively aligning our learning outcomes and curriculum.

The experience, expertise and evidence from CCD indicated that graduates would be working to solve problems in teams with varying types of expertise once they enter the workforce. Therefore, much of our curriculum scaffolds the critical transferable skills students need to take part in problem solving in multi-, inter- and then transdisciplinary teams across first to third year.

To provide feedback to students about their level of competence and preparation for the workforce, the assessment of this work needs to evaluate both

  • the output (the methodology, evidence, findings and recommendations students produce) and
  • the outcomes (the skills and experience that students gain from practically applying increasingly sophisticated science problem-solving methodologies over time).

The projects that students can do are highly diverse and deliberately become more open-ended as their problem-solving skills increase. Student teams are encouraged to think creatively to build innovative responses to their project brief. We present the curriculum as a safe place to ‘fail’ by assuring students that if they disprove a courageous hypothesis but implement high-quality science to test it and document it well, they will do well in these courses.

The highly diverse, open-ended projects mean that no two student project reports are ever the same. Not only does this make them fascinating to mark, it also has benefits for academic integrity. With no precedent, there is nothing to copy from. Each learning experience is individual and customised to the needs of each team’s research context and expertise. We work closely with student groups through the ‘doing’ part of the project, and have high familiarity with each project and each student’s involvement.

The results from implementing this type of assessment are encouraging. Student engagement and perception of relevance is high. Even in second year, multiple students have already reported that using these assessments as portfolio pieces has helped them to successfully find work. We needed little redesign of the assessment tasks in response to COVID-19, although we do continue to refine our approach given that the courses are new and we have learned a lot in the early days of implementation. I’m looking forward to continuing this conversation and sharing more about the good, the bad and the ugly in our panel session.