Souping up summative assessment
Modified from an MB Hub blog post, Souping up summative assessment. Written by Mona Maxwell, Senior Instructional Designer. Originally published March 3, 2021.
Edited by JJ Cloutier
Content review July 2023
- Preparing students for assessments
- Tip #1: Before making a whole pot of soup, sample the soup
- Summative vs formative assessments
- Examples of formative and summative assessments
- Effective feedback on formative assessments
- Instructor and learner benefits
- Tip #2: Keep flies out of the soup
- Swat the flies away
- Provide support for your students
- Eliminate a few more flies
- Soup for thought
- Looking for more help with assessment strategies?
Preparing students for assessments
Perhaps you have had an opportunity to make soup from scratch or alter premade soups to bulk up the flavour or vegetables. Regardless of your love (or distaste) for soup, join us in a soup-making analogy to help frame your thinking around two tips to increase validity and learner success with summative assessments. In this analogy, great soup is learner success.
Tip #1: Before making a whole pot of soup, sample the soup
Creating a small taster (formative assessment) or sample cup of the soup before the main pot (summative assessment) is served often results in better soup.
Summative vs formative assessments
Formative assessment – The act of monitoring student learning and providing ongoing feedback is like sampling the soup as it is in progress. Feedback – both from the instructor and from peers – can help ascertain what the soup, or an assessment, needs more or less of. We can make incremental improvements based on a formative assessment of the soup’s taste, colour, and texture, and enhance its strengths. To do this, it helps if learners know what criteria define a fantastic soup in the particular circle of people consuming it. It often helps if feedback comes from someone other than the soup’s creator.
Summative assessment – The act of evaluating what has been learned, usually for a reporting purpose in the higher education context, is like serving the soup. This is the grand finale, when we may receive feedback again, not with the intent of improving this particular pot of soup but of improving the next pot of soup.
Effective feedback on formative assessments
Feedback as to whether the soup needs more basil or less cooking time can come from learners’ own judgement, from other trusted household members (peers), or from their instructor.
Regardless of the source of the feedback, effective feedback is likely based on pre-determined criteria that may be written in a rubric, passed on orally or gleaned from past experiences. The opportunity to check in, provided by formative feedback, enables learners to improve in a low-risk environment and, ideally, gain satisfaction by applying the incremental improvements to summative assessments. Instructors can determine what steps (if any) are needed to achieve mastery.
Similarly, learners can be given a chance to self-assess against a checklist or a rubric related to what makes their research paper or laboratory report good or fantastic. Many learning management systems (LMS) offer a rubric tool to specify criteria for good, better, and fantastic, as well as self-assessment tools that provide immediate feedback to learners without any numeric evaluation. Checklist tools can verify the completion or consideration of essential components of a summative assessment.
A quiz tool can also provide formative feedback using a numeric evaluation. These numeric results do not need to influence the final mark in the course to have a positive impact on learning. One study involving 471 health sciences students attending classes both online and face-to-face reported that “…a significant positive relationship was found between final grade achieved and percentage of self-tests attempted. This relationship was significant regardless of study status (on-site or distance), course studied or total activity logged” (Thomas et al., 2017).
Instructor and learner benefits
Instructors can also benefit from the results of formative assessments. For example, instructors might proactively identify areas in need of further reinforcement if numeric results are low on a particular set of questions or there is a pattern of low achievement on a particular summative assessment. The timely data that formative assessments provide can help instructors offer effective further practice to the individuals who need it. The results can even inform changes to the course design for future offerings. If learners consistently produce a soup that is too salty, they might benefit from an adjustment to the recipe!
Given an opportunity to provide feedback to each other, learners can also benefit from rating a peer’s work against criteria on a rubric. Check whether your learning organization’s LMS has tools enabled for peer assessment.
If your next step is to decide which tool in an LMS helps instructors and learners ‘taste the soup’ (formative assessment), feel confident that the tools in your LMS can likely accommodate your particular need. Most LMSs allow the instructor to choose the following:
- Whether feedback is…
- Instantaneously released to the learner upon completion of each question of an assessment, or
- Released after the completion of an entire assessment in the form of a report
- Whether instructors can view…
- Scores on all answers for all students, or
- Anonymized scores of all students
- Whether it is desirable to…
- Have all students view the answers of all others (as in a survey, for example), or
- Keep student answers completely private.
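As a thought experiment, the three choices above can be modelled as a small configuration object. This is a hypothetical sketch in Python, not the settings interface of any real LMS; all names and option values are invented for illustration.

```python
# Hypothetical sketch (not any real LMS API): the three feedback choices
# above, modelled as one configuration object so the trade-offs are explicit.
from dataclasses import dataclass


@dataclass
class QuizFeedbackSettings:
    release: str            # "per_question" or "after_completion"
    instructor_view: str    # "all_answers" or "anonymized_scores"
    answer_visibility: str  # "shared" or "private"

    def suits_formative_use(self) -> bool:
        # Immediate feedback with private answers suits a low-risk
        # self-test: learners check themselves without peer comparison.
        return (self.release == "per_question"
                and self.answer_visibility == "private")


# Example: a formative self-test configuration.
self_test = QuizFeedbackSettings(
    release="per_question",
    instructor_view="anonymized_scores",
    answer_visibility="private",
)
print(self_test.suits_formative_use())  # True
```

Writing the options down like this can be a useful planning step before opening the quiz tool: it forces a deliberate decision on each dimension rather than accepting the LMS defaults.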
Contact an instructional designer or another staff member at your institution to explore ways that the various tools can be used to accomplish your goals.
Tip #2: Keep flies out of the soup
No one wants to be distracted by a fly in their soup! Similarly, preparing learners for summative assessments can eliminate distracting elements that would otherwise prevent a valid assessment of the soup itself.
Communicate openly about the intent of assessments, both formative and summative. Giving the soup a name and a description honours the reality that the diverse array of learners in your course will not necessarily have experienced that soup before!
Swat the flies away
Here are a few suggestions that you can follow to ensure that external factors (flies) do not interfere with your assessments.
First, provide specific information about the format of the summative assessment in the course syllabus or course information module online.
- Will it be timed?
- What materials, if any, are allowable for use during the summative assessment?
Provide support for your students
Your campus may have many untapped supports for learners. Reach out, and you may be surprised at what already exists. You may not need to re-invent the supports yourself!
- Provide links or advice on where students can find specific support for skills that may be assumed, such as research skills and writing skills.
- For project work, ensure that students have access to appropriate presentation software, or provide presentation ideas; this is especially helpful in an online environment.
- Encouraging students to share work among themselves – and providing a discussion forum or other avenue for facilitating that sharing – can yield a soup bowl full of great ideas!
Staff at MB Hub have learned much about presentation software, and the pros and cons of each one, through our students and through personal experience with our children!
Eliminate a few more flies
Specific to online summative assessments, one of the distracting flies can be the LMS itself. When possible, students should be relieved of any concerns about the format of the tool they will be using for summative assessments. Providing specific information about the quiz details and the question types that students will experience on a summative assessment can be very helpful.
Quiz instructions and descriptions (as they appear in the LMS) should be demonstrated well in advance of a summative assessment. Typical information that is seen in an LMS quiz can be found below:
- Current Time: 10:11 am
- Time Allowed: 2 hours
- Attempts: Allowed – 1; Completed – 0
Introduction: The introduction is seen on the page before a student begins the quiz. Provide reminders such as supplies allowed, whether navigation to previous questions will be possible or not, and/or how/when/whether the results of the assessment will be available to the learner.
Instructions: Instructions are also seen on the page before a learner begins the assessment. Provide helpful information such as what happens when the allowed time has expired and whether there is a pre-warning about remaining time left for the quiz.
Select start to begin the quiz. The timer will not begin until after you have pressed start and the start-up process is finished.
A sample quiz also allows a variety of question types to be explored in advance of the summative assessment. For example, will there be true-or-false questions, multiple-choice questions, or a text box for a short answer of a limited number of words?
Again, this may not require too much additional effort on the part of instructors. Your organization may have access to sample quizzes for you to import into your LMS in a matter of minutes.
Using these strategies to limit the number of flies and the probability that one will land in your learner’s soup can yield more valid results for your summative assessments.
Soup for thought
We hope that soup sampling before the serving of the soup as well as keeping those distracting flies out of your soup can provide some ‘soup for thought.’ The goal is to increase the validity of summative assessments and increase learner success.
Looking for more help with assessment strategies?
If you are looking for help with online or blended assessment strategies, staff at MB Hub are available for instructors to consult for free.
Book a one-on-one “Instructional Design Consultation” with our instructional designer to work on assessments today!
Thomas, J. A., Wadsworth, D., Jin, Y., Clarke, J., Page, R., & Thunders, M. (2017). Engagement with online self-tests as a predictor of student success. Higher Education Research & Development, 36(5), 1061–1071. https://doi.org/10.1080/07294360.2016.1263827