Interactive Oral Assessment
An Interactive Oral (IO) Assessment is an “efficient and effective form of authentic assessment that promotes skill development and employability, enhances overall student engagement and a personalised approach to learning and teaching, and preserves academic integrity”. An IO is innovative in that it is a two-way conversation using a work-based or professional scenario to stimulate a free-flowing discussion. It facilitates the exploration of a student’s deep and higher-order understanding of a topic.
Acknowledgement: “This project is funded by Quality and Qualifications Ireland (QQI) under grant number [DCU20231]”
An Interactive Oral (IO) assessment is a two-way, free-flowing, unscripted conversation based on a real-world scenario. Interactive Oral assessments are usually synchronous, delivered either in person or online, although they can also be run asynchronously. They have been used across a wide range of disciplines and programme cohorts. The approach is designed to be a curious conversation in which the assessors' prompts, or genuine natural questions, allow students to showcase their learning in a professionally aligned environment.
An Interactive Oral is not a traditional oral exam. It differs from an oral exam or viva, where a question-and-answer format is used under strict exam conditions. Interactive Orals use natural conversation prompts rather than questions and answers, and the assessment conversation is tailored around an authentic scenario such as a workplace environment.
The assessment is not conducted under strict exam conditions but is instead a genuine, unscripted conversation with normal communication cues. The authenticity of the scenario also allows students to bring resources to aid the conversation, such as spreadsheets or presentation slides. Ideally, the Interactive Oral assessment is well scaffolded within the module assessment strategy. Overall, it is a much more engaging assessment type than traditional oral assessment approaches.
An Interactive Oral assessment is authentic as it is based on a real-world scenario. It is a personalised approach to learning and teaching since the conversation prompts are unscripted and based on the student’s responses and previous work within the module. This personalisation preserves academic integrity. The authentic scenario around which an IO is based promotes skill development and employability. It also promotes student engagement and facilitates higher-order thinking.
A Fireside Chat with Danielle Logan-Fleming and Monica Ward
The video below is a fireside chat with Danielle Logan-Fleming and Monica Ward on Interactive Oral Assessments. It gives a brief overview of Interactive Oral Assessments and addresses frequently asked questions (FAQs). This recording was created as part of the QQI-funded ‘Rethinking Assessment’ project.
IO across different disciplines
Initial Teacher Education
Engineering Professional Development
Veterinary
Nursing
The Designing an Interactive Oral section offers advice for academics planning to design and facilitate an Interactive Oral assessment within their module. The Interactive Oral assessment process can be planned over 10 weeks, from week 1 to week 10 of a semester, as described in the process diagram below. The recommendation is to allow at least 3 to 4 weeks for designing an Interactive Oral.
When planning to implement an Interactive Oral, academics should consider the method as part of an overall scaffolded assessment design and reflect on the following points:
- The Interactive Oral should be part of an integrated and scaffolded module assessment design.
- The module learning outcomes will inform the full assessment design; it should therefore be clear where and how the Interactive Oral aligns with the learning outcomes.
- This approach requires a pre-designed rubric or marking guide and a pre-recorded example, which should be based on a scenario or text/topic NOT used for the actual IO assessment. The mock scenario can be recorded around any topic; its main aim is to show what an Interactive Oral might look like in practice and how it might be graded against the rubric.
- The rubric and example recording are used as a teaching and dialogue tool with students as partners.
In an Interactive Oral assessment, the scenario should frame the two-way free-flowing conversation, and in many ways make it easier and less stressful to engage in a natural conversation. Some examples of scenarios include:
- Students sharing examples of what they are working on at an open day with potential new students.
- Students explaining a current economic or social issue to a friend or family member at a BBQ.
- Students being interviewed by a radio presenter on some aspect of their work, e.g. a new business idea or a book they reviewed.
- Students having conversations with peers about work they presented at a conference.
- Students discussing a project design with a potential client or funder.
- Students joining a new work-based multidisciplinary team and explaining what their discipline expertise will bring to the project.
The students in the above scenarios can assume different roles in the Interactive Oral assessment depending on the scenario. For example, they could act as a project manager holding a client meeting or as a new recruit at a company of their choice. There are opportunities for personalisation and student choice in scenario development that can be further harnessed to explore different professional contexts.
Some important practical steps to consider once the Interactive Oral (IO) assessment has been designed and is ready to implement:
- To start with, input the rubric into DCU's Loop (Moodle/other VLEs), ensuring all file submission settings are turned off: students are not submitting an assignment, they are attending one.
- If the Interactive Oral assessment is online (using the official DCU Zoom platform), set up the Loop Scheduler in advance. You can find more on how to set up Loop Scheduler below.
- Set up the Zoom meeting in advance with the recording function and waiting room pre-set. During the assessment, remember to pause the recording after each oral assessment, and resume just before the next one. The separate recordings are recommended for easy navigation, moderation and internal/external feedback.
- Where possible, do not schedule more than 20 IOs in one day or across consecutive days; the recommendation is a maximum of 20 IOs per day at approximately 15 minutes per IO (i.e., 10 minutes for the IO assessment and 5 minutes to mark). This will vary when the IO is run in pairs or groups. An IO of 10 minutes roughly equates to a 4,000-word submission. (A rough scheduling sketch follows this list.)
- Ensure that all assessors schedule sufficient breaks.
- Keep the Loop schedule ‘hidden’ until you are ready for students to select their time.
- Show students where the scheduling tool is on Loop.
- Remind students to bring their student card to the IO with them, if this is a requirement. With smaller student cohorts, the Zoom recording with the student's camera on might suffice for you to be assured of the student's identity.
- Remind students that they will be in the Zoom ‘waiting room’ until you give them access to the IO.
- Ensure students are aware that the IO is recorded and will be used as a quality assurance and moderation tool.
- Remind students that the IO will NOT be in front of the full class; it is simply between the student(s) and the facilitator (academic/assessor(s)).
- Use the rubric and the recorded example IO video to facilitate class discussion around the marking criteria and assessment format. The mock recording also serves as preparation for the IO. Check the rubric/marking guide to ensure it works the way you expect. If appropriate, work with students to amend or clarify the marking guide/criteria (i.e., partner with students). This could also be an ideal opportunity to co-design the rubric with students.
- Ensure that you have a working webcam, microphone, and two screens available (one is for direct marking purposes) and that any tutors marking for you have the same setup. If two screens are not available, paper can be used for marking/viewing the rubric, and grades can be transferred later.
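The timing guidance above (roughly 15 minutes per IO, made up of 10 minutes of conversation and 5 minutes of marking, with a maximum of 20 IOs per day and regular breaks) can be turned into a simple slot plan. The following is a minimal, illustrative Python sketch rather than part of the official guidance: the start time, break frequency and break length are assumptions chosen for illustration.

```python
from datetime import datetime, timedelta

# Illustrative parameters based on the guidance above: ~15 minutes per IO
# (10 minutes of conversation + 5 minutes to mark) and a maximum of 20 IOs
# per day. The break pattern and start time below are assumptions, not
# prescribed values. (The guidance also notes a 10-minute IO roughly
# equates to a 4,000-word written submission.)
IO_MINUTES = 10          # conversation time per student
MARKING_MINUTES = 5      # marking time immediately after each IO
MAX_IOS_PER_DAY = 20
BREAK_EVERY = 5          # take a break after every 5 IOs (assumption)
BREAK_MINUTES = 15       # length of each break (assumption)
DAY_START = datetime(2024, 1, 1, 9, 0)   # the date is a placeholder

def build_schedule(num_students: int):
    """Return a list of (start, end) IO slots for one day."""
    slots = []
    current = DAY_START
    for i in range(min(num_students, MAX_IOS_PER_DAY)):
        end = current + timedelta(minutes=IO_MINUTES)
        slots.append((current, end))
        # leave marking time before the next student joins
        current = end + timedelta(minutes=MARKING_MINUTES)
        if (i + 1) % BREAK_EVERY == 0:
            current += timedelta(minutes=BREAK_MINUTES)
    return slots

if __name__ == "__main__":
    for n, (start, end) in enumerate(build_schedule(12), start=1):
        print(f"IO {n:02d}: {start:%H:%M} - {end:%H:%M}")
```

Running the sketch for a cohort of 12 prints twelve slots starting at 09:00, with a 15-minute break after every fifth IO; adjusting the constants adapts the plan to other cohort sizes, slot lengths or break patterns.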
Listed below are a few practical tips and points of guidance to help the assessor(s) navigate an Interactive Oral.
- Time management is essential during an IO. Using a timer to manage time effectively is highly recommended.
- When planning an IO online, where possible use two screens, one for the actual IO conversation, and the other for the rubric/marking sheet. If needed, print out the rubric and make notes on the side for each IO.
- Make sure to state that you will be/are recording the IO when the student joins the session.
- Resume or start recording before each IO, and then pause or stop at the end of each IO as noted above.
- Request that the student turn on their webcam and show their picture ID before the IO starts, to verify their identity.
- Interactive Oral assessment depends on active listening. Assessors are encouraged to listen actively and proactively engage to assure themselves of the student’s learning. Since Interactive Oral assessments are unscripted by design, each IO will look and feel a little different.
- The assessors need to familiarise themselves with the assessment scenario. They should assume the role set out in the scenario and remember to prompt using natural conversation cues to keep the conversation flowing.
- Interactive Oral assessments are all about taking the time to connect with the students and assure yourself of their learning. It is more quality one-to-one time than many students will ever have had with their lecturers; that in itself is a valuable aspect of this assessment approach.
- Have a moderation process. If the IO spans multiple days and/or assessors, make sure to moderate the marks between days and between markers.
- If possible, provide feedback to students in the form of a class debrief focusing on a general summary (synchronously or asynchronously).
- Send each student individual marks and feedback via a marking guide/rubric.
- If possible, allow space and opportunity for student feedback on the process.
Tools that support Interactive Oral Assessments
Sample Rubrics used in Interactive Oral Assessment
Scheduler user guide
Interactive Oral recorded examples & associated rubrics
Dr Dervila Cooke - French Literature and associated Rubric
Dr Marina Efthymiou - Sustainable aviation and associated Rubric
Dr Monica Ward - Collaboration & Innovation and associated Rubric
Dr Tara & Dr Niamh - Literacy and associated Rubric
Frequently Asked Questions (FAQ)
A viva is an oral interview in which students are asked questions in defence of their work. An IO is a fluid, genuine conversation designed to simulate a real-life scenario. An IO centres on the assessor getting into role and probing the student, in a curious way, in relation to a pre-designed scenario (as outlined in the assessment brief).
In an Interactive Oral assessment, the purpose of paired or group IOs can be to build students’ confidence in working with peers, and to provide scalability for academics with large student cohorts. Even in team IOs, each student is normally marked individually according to their ability in relation to the criteria outlined in the rubric or marking guide. However, there may be times when the module learning outcomes explicitly include teamwork, and then team marks may be appropriate. It is ultimately the prerogative of the lecturer to award marks as they deem appropriate, i.e., individual, paired, or group marks.
Yes, recording Interactive Oral assessments is GDPR compliant when using DCU-approved centralised platforms such as Zoom.
No, all students are treated equally and fairly. Each Interactive Oral assessment is a unique, free-flowing conversation with its own individual dynamic related to the specific student(s) involved in the assessment.
A minimum of approximately 3 to 4 weeks is required to develop the rubric and exemplar. Academics may also need to consider the availability of colleagues to act as participants in a mock IO when recording the exemplar. Being part of the IO Community of Practice may help speed up the process through collaboration and sharing.
Yes. Griffith University, which pioneered the approach, regularly conducts IOs with cohort sizes of 700+. You can find some examples of IOs at scale in Griffith University’s SWAY portal.
Having a well-defined rubric along with clear pre-moderation and post-moderation processes can ensure consistency across all markers. It is recommended to take some time to evaluate the first couple of IOs, discuss the marking process within the assessor team, and clarify any issues or discrepancies. During pre-moderation, it is also recommended to outline possible conversation prompts.
Interactive Oral assessments are meant to elicit responses that highlight student learning and higher-order thinking. The focus of the conversation is thus on assessing learning outcomes that can be conveyed within relevant contexts. Unless the rubric specifically looks for fluency and communication competency, how the learning is conveyed is not marked. Some rubrics have a discipline-specific vocabulary criterion; in such cases, students will be expected to use terminology appropriate to the discipline. However, it is important to make students aware that the assessors are not marking or looking for fluency in most cases.
Another option would be an asynchronous IO, where the IO is conducted in a written manner with the same unscripted, free-flowing conversational tone that is the hallmark of Interactive Oral assessments. Students are also allowed all support necessary, including an interpreter or extra time (as would be the case with other assessment types).
Interactive Oral assessments provide an opportunity to co-design/co-construct rubrics and facilitate conversations around expectations and discrepancies. It allows for a feedforward process and there are opportunities for sharing the IO recording with the student along with feedback. This kind of feedback can be highly valuable in terms of personalisation and accuracy. Academics can address any student concerns with relative ease since the rubric and expectations are clarified in advance.
No, an IO can be asynchronous depending on needs and circumstances. For example, in Australia, IOs have been successfully implemented with incarcerated populations. The IO is conducted in a written manner with the same unscripted, free-flowing conversational tone that is typical of Interactive Oral assessments.
Since each IO is personalised and unscripted, students sharing conversation prompts will not be a problem. If needed, the academic can design different variables and pick a couple of them at random for each IO, for example, representing a different company with a different context, or a newcomer to the field who might need more clarification and the use of layman's vocabulary.
Students will find it difficult to use generative AI (Gen AI) in the IO context. The unscripted and personalised nature of the assessment leaves little room for academic misconduct. In most cases, the assessment is also synchronous, which leaves little time for scripting context-specific answers. Moreover, the IO is ideally well scaffolded, so the assessor can refer back to the student's own work during the conversation, something Gen AI will not be able to assist with.
There has been a significantly positive response from the students. Students have reported feeling less stressed as compared to traditional exams. Students have also noted clarity around expectations due to the exemplar and the rubric shared in advance. More importantly, the students can see the value of an authentic assessment approach due to its real-world application. Some students have even reported enjoying the experience.
Academics are aided by a well-structured process that supports the design and implementation of IOs. There is some work involved in designing a clear rubric and deciding on the scenario for the assessment and the sample recording. However, academics have found the approach enjoyable and a learning experience in terms of getting to know the students and witnessing their learning process first-hand.