Assessment + Feedback

This page focuses on strategies for guiding student learning through assessment design and feedback.

Assessment + feedback may differ for a Dual Delivery subject. For guidance specific to Dual Delivery, please see BEL+T’s Guidance for Dual Delivery page.

The move to an online learning environment does not necessarily require a radical rethinking of Assessment, but it can prompt valuable and creative responses—like considering how to ensure learning outcomes have been met remotely, and reviewing the schedule for activities. Designing assessment to guide learning remains the central aim.

Of course, Assessment is an area in which institutional processes and requirements influence what, how and when students undertake assessable activities and receive feedback for learning. The nature of remote assessment (including temporary changes to invigilated exams) has led the University to encourage subject coordinators to consider adopting different assessment approaches in this context, when these are approved within governance processes. This might include moving away from end-of-semester, high-stakes assessment tasks to more regular, low-stakes assessment activities. These tasks may be considered “authentic” in that they relate directly to significant concepts and learning activities.

The multiple paths of the Assessment element in the DIAgram indicate the importance of aligned assessment actions in students’ learning experiences. As with the other elements, Assessment is not a stand-alone activity, and in the DIAgram all of these overlap, influence and inform each other. You will find details, examples and tool guidance via the Delivery + Interaction + Assessment links.

The sections below have been informed by Learning Environments’ advice, UoM policy, a review of relevant scholarship and BEL+T’s ongoing work within the Faculty. For more on assessment-related learning tools, visit our Learning Tools page in the Canvas section of the Teaching Toolbox.

Additionally, BEL+T has produced a set of Tactics for Assessment Related Expectations in alignment with the End of Subject Survey (ESS) questions that relate to assessment and feedback. This guidance presents student commentary, tactics used by subject coordinators and things to consider for their application. Click below to access the guides (staff login required):

  • Assessment Design

    Assessment design is one of the key tasks of pre-semester planning. This includes reviewing intended learning outcomes (or ILOs) developed for each subject and published in the University of Melbourne Handbook (MCSHE’s guidance on writing ILOs is available here). Assessment tasks should be devised as a means for students to evidence the degree to which they have achieved each ILO, and subsequent assessment procedures are the search for this evidence. Through their engagement with these activities “students develop and demonstrate the ability to judge the quality of their own work and the work of others against agreed standards” (Boud et al, 2009).

    Rubrics should be provided to students at the beginning of each assessment task to guide their learning. Assessment rubrics provide qualitative language (ideally active verbs) for coding and recording evidence of learning and the quality of learning, thus demystifying the assessment procedures by linking the ILOs to grade bands and clarifying the intention of an assessment task for learning. The act of developing a rubric is also a way of making implicit dimensions of assessment explicit to teachers to assist marking and moderation. This transparency becomes all the more critical in contexts of remote assessment, and particularly in design-based tasks (Jones, 2020).

    Digital rubrics are available via Canvas and can be designed and implemented according to subject requirements. Instructions on how to set up Canvas Rubrics for assessments can be found via this link, and guidance on how to manage rubrics in courses can be found here. Keep in mind that the characteristics of certain subjects may not suit the default matrix system in Canvas. In these circumstances, subjects tend to use PDF files of assessment rubrics, which tutors can fill out digitally (i.e., with tablets or PDF-editing software) or by filling out printed copies and sharing scanned copies with students. Digital documents can be attached to assessment feedback via Canvas SpeedGrader. Note that if grades have been hidden from students, feedback and any attachments will not be accessible to students until grades are published.

    When providing submission instructions and details consider the following:

    • Cluster instructions into small digestible chunks (e.g. according to modes of submission).
    • Checklists of requirements are a clear way of communicating expectations to students.
    • Instructions should be clear and explicit about what is expected of students, and by what date and time.
    • Consider including the hyperlink to the Canvas submission location.

    Ideally, online assessment tasks are chunked into smaller, low-stakes tasks with lower weightings. This may require re-structuring high-stakes assessments into several smaller weighted tasks. When adjusting previous assessment designs consider the following:

    • Identify why adjustments need to be made (i.e., access to facilities, like examination halls or fabrication labs, or access to materials and hardware for printing, model making, etc.).
    • Refer to the subject ILOs.
    • Review past assessment tasks and assessment data gathered from student marks to identify potential areas requiring revision.
    • Consider the online platforms and tools available, and suggest or require the most appropriate ones.
  • Desk Crits / Work-In-Progress Feedback Sessions

    One of the most fundamental modes of interaction and assessment in our Faculty, and in design-based subjects in particular, is the “desk crit”, in which students and tutors engage in live informal discussion about iterative work-in-progress. Translating such an informal and tacit mode of engagement online requires more structure and coordination. Tutors and students without experience in this context will need to familiarise themselves with the rhythm and logistics of creating submission boxes, uploading, viewing digital work, annotating work and providing feedback.

    Relying primarily on asynchronous modes for submission and feedback will encourage both tutors and students to be clearer about their progress and areas for improvement (see Jones, 2020). Feedback provided asynchronously can be followed by videoconference sessions to discuss it. You can also opt for “silent” sessions on whiteboard platforms like Bullclip, Miro, Mural, and Conceptboard. Try to avoid creating a “revolving door” of individual discussions with students, as this encourages the remaining students to adopt a passive or disengaged mode of participation. If individual or small group meetings are necessary, you can create breakout rooms for the remaining students and provide them with a collaborative task or discussion prompt.


    1. Nano Langenheim has shared a video (UoM login required) about the hardware and software she uses for interactive annotations.
    2. James Helal has shared a video (UoM login required) of the hardware he uses for sketching/annotating.
    3. Check out Peter Raisbeck’s blog posts on running studio sessions on Zoom: Part 1 and Part 2.
  • Student Presentations / Design Reviews

    Student presentations and subsequent feedback can be conducted synchronously, asynchronously, or by mixing the two modes. Synchronous interaction can be especially valuable for modes of feedback that are more conversational. Equally, taking the asynchronous approach of allowing students to upload pre-recorded presentations can significantly reduce their anxieties—not to mention limit technical challenges.

    For asynchronous approaches, you can direct students to specific audio/video screencast software that you may already use yourself for delivery of content and feedback. They will also benefit from tips for delivering pre-recorded presentations effectively. Be sure to provide specific instructions for how students can upload video files to Echo360 on Canvas and share these with their instructor. Teachers can then group uploaded videos in Echo360 and provide any external reviewers with a link. For synchronous presentations, consider requesting submission of slides and even scripts as a backup in case of technology failure. For more guidance, visit Learning Environments’ assessment advice on asynchronous presentations and synchronous presentations, as well as their page dedicated to how to assess and provide feedback on online presentations.

    Design studio reviews contain their own complexities when operating in virtual or blended environments. The Directors of the MSD and Bachelor of Design, in consultation with BEL+T and Pathway/Program Coordinators, have developed a set of guidelines for conducting reviews online.

    Planning Assessment in the Design Studio

    After further input from studio tutors, BEL+T created a flowchart that illustrates the various steps and key decisions involved in coordinating online design reviews.

    Coordinating Online Design Reviews Flowchart

    Rather than merely translating the format of conventional design reviews online, the ideal approach mixes the benefit of asynchronous and synchronous modes of interaction and assessment to support learning. This experience may forever change the model for final design reviews for the benefit of all.

  • Exams and End-of-Semester Assessments

    The Central Exams Unit and Learning Environments will prepare Canvas exam shells for scheduled online exams. Please note that there is a range of support and resources available to assist subject coordinators.

    The Learning Environments website offers guides and an overview of different assessment options. Options include LMS Assignments, LMS Quizzes, and external tools (Gradescope and Cadmus).

    Please check the live workshops and webinars offered by Learning Environments for targeted sessions. Drop-in sessions are also offered to assist with the setup of exams, and pre-recorded webinars are available.

    Additional resources to support digital exams are available on the Digital Assessment website. The Central Exams Unit has also prepared a Guide for Semester 2 Exams – Subject Coordinator (intranet access required).

    When writing exams, please be sure to consider copyright issues. Melbourne's CSHE has also produced guidance on moving from closed-book to open-book exams. The Exam Content Checklist can assist Subject Coordinators to confirm that the subject exam is ready.

    For any technical inquiries, please submit a request to the Learning Environments team via ServiceNow.

  • Formative Assessment (during Semester)

    The goal of formative assessment is "to monitor student learning to provide ongoing feedback that can be used by instructors to improve their teaching and by students to improve their learning" (Eberly Center, 2020). This makes formative assessment a “low-stakes” endeavour as opposed to “high-stakes” summative assessment tasks like end-of-semester exams. Be sure to visit the Learning Environments’ page on during-semester digital assessment options for University-wide guidance.

    Formative online quizzes are an effective, low-stakes mechanism for gauging student progress. Quizzes can be managed through Canvas (an online guide for teaching staff is available here). The University also provides full licences for the Microsoft Office 365 suite, which contains useful e-learning tools for students in their assessment tasks. For example, Microsoft Sway is a quick and easy way for students to design blogs or newsletters.

  • Online Feedback

    Online learning and teaching offers both students and tutors various digital avenues for feedback. Feedback plays a critical role in guiding learning by identifying areas of development and progress. Good feedback supports students in examining not only the outcome (i.e. grades) but also the process, thus facilitating reflection on their own methods and approaches to learning (Griffin, 2018). Modes and methods of providing meaningful feedback are particularly important in an online learning environment, where opportunities for delivering informal feedback are limited.

    Written feedback is a useful way of highlighting to students how their work aligns to specific criteria in assessment rubrics. Individual feedback can be tailored and delivered directly through Canvas or Gradescope, or general comments can be shared with the entire subject cohort. Such general comments on cohort-wide progress can help frame widespread challenges and achievements, whilst also being an efficient way to deliver feedback (see Poyatos-Matas & Allan, 2005).

    Recorded audio/video comments can also provide students with detailed formative feedback in relation to specific criteria. Screencast feedback brings together audio/video with the ability to narrate annotations of student work (VoiceThread, FastStone and OBS Studio are software used by ABP colleagues to record feedback). Often, annotating student work is the most appropriate means of providing directed feedback. Depending on whether the work is written or graphic, there are different cloud-based platforms suited to producing annotations. Perusall is designed for annotating written work, whereas collaborative whiteboard platforms (such as Bullclip, Miro, Mural, and Conceptboard) allow annotations on graphic work. For the latter, a tablet with an accompanying stylus is helpful. Check out this video of Mural in action from Nano Langenheim!

    Peer-to-peer feedback can serve multiple purposes: providing alternative perspectives on a student’s work, improving generic skills related to criticism and encouraging interaction between students. Tutors will need to consider the appropriate platform, whilst providing some form of scaffolding and terms of engagement to guide students (e.g. using the ladder model, reflective questions, etc.). Perusall and FeedbackFruits allow for anonymous feedback, which students have been found to prefer (Razi, 2016). For peer-to-peer feedback to be most effective, this form of interaction ideally should be an assessment requirement (e.g. via a reflective journal). ePortfolio in Canvas is an effective platform for facilitating students in developing and curating their journals and providing comments and feedback to each other (Gordon, 2017). Depending on the cohort of students involved, encouraging voluntary participation in peer-to-peer feedback can be challenging; however, students often prefer peer-to-peer feedback to be part of their assessment (Fleischmann, 2019). Self-assessment activities also offer valuable benefits to students (visit the Learning Environments page on self-assessment).

  • Special Consideration

    In the move online, the University’s policy and procedures for Special Consideration will continue to apply. These cover a wide range of circumstances that may have affected student progress. The University has put forth specific guidance regarding special consideration documentation for those affected by the COVID-19 circumstances.

  • Academic Integrity

    The sudden move to online learning in higher education has raised concerns about academic integrity (Liu, 2020). Students may feel under increased pressure to succeed and may therefore rationalise opportunities to gain academic advantage over their peers (White, 2020). Ensuring academic integrity is vital to protecting the standards of The University of Melbourne’s degrees. Below are suggestions about actions that coordinators can take to reduce the risk of students committing academic misconduct in online assessment.

    1. Use high stakes assessment tasks with caution. Research on academic integrity found that cheating was more likely on a heavily weighted, high-stakes task than almost any other kind of task (Contract Cheating and Assessment Design from Bretag et al, 2019).
    2. Substitute high-stakes assessment with smaller, scaffolded or linked tasks. These allow students to gain feedback on their progress, and for teachers to see examples of student work-in-progress (Sotiriadous and Learning Futures, 2020).
    3. Familiarise yourself with the types of academic misconduct that get detected in ABP subjects. BEL+T has developed a resource that defines nine types of academic misconduct, with guidance about the steps coordinators should take if they have suspicions about the authenticity of a student's submission.
    4. Be explicit when explaining the expectations that you have about how students will complete an assessment task. The following paragraph has been reviewed by the ABP Academic Support Office (ASO) as a message to students about collusion in open-book online exams.

    The online exam on [insert date] is an individual assessment task. Students must complete the task without any input from other individuals. While the exam is open book, students are reminded not to share study notes or summaries of the lecture content. The University of Melbourne takes academic integrity seriously and provides advice here for students about how to avoid academic misconduct. There are serious consequences for students who commit academic misconduct. Penalties can include terminating or suspending the student’s enrolment. Details are provided here.

    5. Require students to sign a plagiarism declaration when submitting assessment tasks. Coordinators can either provide students with the following text to include as part of their submission, including through Canvas submissions, or require students to use the ABP Coversheet.

    By submitting work for assessment I hereby declare that I understand the University’s policy on academic integrity and that the work submitted is original and solely my work, and that I have not been assisted by any other person (collusion) apart from where the submitted work is for a designated collaborative task, in which case the individual contributions are indicated. I also declare that I have not used any sources without proper acknowledgment (plagiarism). Where the submitted work is a computer program or code, I further declare that any copied code is declared in comments identifying the source at the start of the program or in a header file, that comments inline identify the start and end of the copied code, and that any modifications to code sources elsewhere are commented upon as to the nature of the modification.

    6. When setting up text-based assignments in Canvas, teachers have the option to generate a Turnitin Similarity Report for each student submission. Click here for details.