PGR Assessment: peer-assessed vivas
I ran a module in the OxICFM CDT with an innovative assessment, and want to archive it here in case other people find it useful. The files I wrote (an assessment brief and guidance for asking questions in a viva) can be downloaded under the “Documentation” subheading below.
Context
OxICFM is a centre for doctoral training funded by UKRI and industrial partners. Its ambition is to produce graduates who can articulate and solve challenging problems in chemical manufacturing. The cohort (around 12 students) completes two terms of taught work before starting their substantive research projects.
This module is the bridge between the taught and research programmes, and asks students to produce a proposal for how to start their research project. The largest elements within this are (i) a small literature review and (ii) a description of the work to be attempted in the first year of research.
Intention
The intention of this module was to help students make a successful start to their research. It addressed this in a few ways:
giving students a formal prompt to talk with their supervisors and groups about their projects;
helping students think through the key decision points in their projects;
prompting students to consider what they might do if things do not go to plan.
Underpinning all of this was an ambition for students to start developing how they communicate the rationale and strategy of their work. In particular, the major skills aim behind this assessment was to build students’ familiarity with the viva format.
The central innovation of this assessment related to assessment literacy. Each viva was conducted by one academic and one senior student: the senior student gained some perspective on what it is like to be “on the other side of the table”, and this environment brought questions about assessment format and expectations to the surface.
Design
The most important part of the design, from day one, was to establish that the student examiner could not assess the viva. The narrow distinction between student as examiner (asking questions) and student as assessor (judging whether the performance was a pass) is perhaps the clearest way of expressing the line we drew to satisfy internal QA procedures. The viva was set up as pass/fail, and that judgment rested solely with the academic examiner.
The year-on-year refinement of the assessment was mostly about giving clearer guidance on what should be in the document, and specifying what could (but need not) be included.
Documentation
The key student-facing documents were intended to give clear guidance to students writing proposals, and to students asking questions as examiners in the viva.
DOWNLOADS
2023-24 Assessment Brief (.pdf) A fairly clear outline of what each student was expected to submit, and how the presentation of ideas would be assessed in written, spoken, viva, and infographic formats.
Viva guidance (.pdf) Indicative guidance on how a student examiner might construct fair and useful questions in a viva.
Some Practicalities
The main difficulty was organising the vivas. The co-supervision model adopted by OxICFM meant that academics with relevant expertise were sometimes in high demand. Student availability was also hard to coordinate: some students were always away (e.g. on research visits elsewhere), and some students who had left the taught programme became so immersed in their research that it was hard to get them to commit to a viva. This assessment would have failed completely without the work done by the Programme Manager to arrange the vivas.
COVID was also a big barrier to the viva format for several years. The first cohort was assessed without a viva (i.e. a paper assessment), and vivas during social distancing were conducted online.
Fallback plans were in place for students facing emergencies or needing reasonable adjustments; the usual fallback was an academics-only viva (without a student examiner), which allowed more flexible scheduling.
Reflections
I think the assessment did what it needed to do from an assessment literacy perspective. Students got to think about their projects, and they also got to consider the characteristic viva assessment in a slightly more expansive way than is normal. I don’t think every student loved this assessment, though most clearly relished the chance to think about their research projects after the taught course. However, I do think every student was given a meaningful chance to think about how to make their end-of-degree viva go well.
In the vivas themselves, I was always struck by how good the student examiners were. They had read the document carefully, and brought well-prepared questions to the discussion. They also often shared really helpful practical insights about things like room access or people to talk with. The vivas I was part of were sincerely supportive spaces most of the time.
OxICFM has now recruited its last cohort, so this assessment will not run again. I hope it is repurposed and adapted by others, though, and that this post helps to share one considered approach to PGR assessment.