
Tag: Quizzes

Pushing Boundaries in Canvas #1

Occasionally we get questions from instructors wanting to do things that Canvas is ill-equipped to handle. This series celebrates that spirit of innovation and provides the best answers we can come up with to approximate the desired effect.

Quiz Feedback on randomized questions

Creating a practice quiz before a big exam that automatically provides immediate feedback* can be an effective formative assessment that is convenient for both student and instructor. One instructor had a brilliant idea to repurpose last quarter’s exams as practice quizzes. There were just a few complications: there was a mix of essay questions and auto-graded questions, the questions were randomized, and the instructor didn’t want to have to grade any of the questions manually. The desired result was that the student would take the practice exam with randomized questions and then get the answers and explanations in the feedback for each question. Unfortunately, Canvas quizzes don’t provide feedback text for ungraded questions, so students would be unable to compare their essay responses to the exemplars unless the instructor graded each question individually, even if it was simply to assign a 0 out of 0 possible.

Here are some of the solutions we came up with, none of which quite satisfied the instructor, but they may inspire you to try something:

  • Set the quiz to show “one question at a time” without backtracking. The next item will be a text-only “question” with the exemplar answer. When students go back to review the quiz, they can compare their answer to the example. These can be mixed in with auto-graded questions with feedback comments, as long as the quiz is set to allow students to see their responses and the correct answers. This option requires the questions to be in a set order and not randomized.
  • Keep the quiz randomized and add a number key to each essay question so students can look up the answer and feedback on an answer key that becomes accessible only once they have submitted the quiz. This can be done by setting up requirements in the module where the quiz resides. Auto-graded questions can still provide feedback within the quiz results.
  • Separate the practice quiz into two quizzes: one with randomized auto-graded questions and one with essay questions in a set order with an answer key released after it is submitted.

*These instructions apply to New Quizzes. For feedback on Classic Quizzes, see the instructions for individual question types. Here’s a useful resource for deciphering the answer and feedback settings in New Quizzes.

If you have a better solution that meets all of the conditions, we would love to hear it. You can post your answer in the comments or send it to elearning@everettcc.edu. But your situation might not be as complicated, so one of these solutions, or something similar, might work just fine for you. Also, if you think Canvas can work better for instructors and students with a bit of a tweak, you are always welcome to suggest something. That’s how Canvas evolves.

And if you can’t get Canvas to do something that you think it should, and you want to brainstorm options, don’t hesitate to contact us. That’s why we’re here.


Revisiting Online Quizzes

The Teaching and Tools workshop series included two seminars with the tongue-in-cheek title “Beat the Cheat.” The first session was a broader exploration of the general premise of exams as an assessment tool (spoiler alert – Derek is an occasional skeptic), and the second session explored some of the Canvas features that allow for “security” measures when online quizzes are offered.

Feel free to take a listen to the podcast versions here:

Part One podcast

Part Two podcast

You can also access the transcripts here:

Beat the Cheat part one transcript

Beat the Cheat part two transcript

And the handouts from the in-person workshops are available as well!

Beat the Cheat part one handout

Beat the Cheat part two handout

 


Incorporating Metacognition Practices Mid-Quarter


At a recent conference on departmental support of evidence-based teaching practices in Biology (PULSE), I picked up two metacognition techniques to bring into my classrooms. They seemed so powerful and, honestly, so easy to implement that I tried them the following week.

This first idea stems from work that Ricky Dooley (a new colleague in Biology) developed with Scott Freeman and others at the University of Washington. In my majors’ Biology class, I give weekly quizzes over the past week’s material: standard in-class quizzes, mostly multiple choice (taken with iClickers), with a short-answer question here and there. Student performance was mixed, and when we went over the correct answers, many students had “ah-ha” moments when ideas began to click.

Of course, these ah-ha moments came a few moments too late to help on that particular quiz. What I’ve begun doing is flipping that around. First, I’ve moved the quiz completely onto Canvas. And rather than the usual 10 questions/10 points, the quizzes are now 20 questions, still worth 10 points. The first question is the usual question I would ask (although I’ve added more short-answer questions, reflecting the questions I will ask on the exams). This first question, like all of the odd-numbered questions, is worth zero points, so there’s no risk to the student in doing their best from memory (and no reason to cheat). The second question, like all of the even-numbered questions, is the same question, followed by how I would answer it. It then asks the students whether they think they got it right, wrong, or somewhere in between. If they didn’t get it right, I ask them to 1) explain why they got it wrong, 2) state what the right answer is, and 3) explain why the right answer is correct. This question is worth 1 point, and I grade it based upon how well they’ve reflected on their work.

Sometimes, within their summary explanations, students will still not fully understand the material. Here, it’s very easy for me to jump in (while grading) and help them individually. An additional benefit is that these quizzes, with the addition of more short-answer questions, more closely resemble the question types I have on my midterms.
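The odd/even pairing described above can be sketched as a simple transformation. This is a hypothetical illustration only: the `build_paired_quiz` function and the prompt wording are mine, not part of Canvas or the instructor's actual setup.

```python
def build_paired_quiz(questions):
    """Expand a list of question prompts into (prompt, points) pairs.

    Each source question becomes two quiz items: an ungraded attempt
    from memory (0 points), followed by the same question plus a model
    answer and a reflection prompt worth 1 point, graded on the quality
    of the student's self-assessment.
    """
    items = []
    for q in questions:
        # Odd-numbered item: attempt from memory, nothing at stake.
        items.append((q, 0))
        # Even-numbered item: same question with a reflection prompt.
        items.append((q + " — compare your answer to the model answer "
                          "and explain any differences.", 1))
    return items

quiz = build_paired_quiz(["Define osmosis.", "Contrast mitosis and meiosis."])
# 2 source questions -> 4 quiz items, total worth = number of questions
assert len(quiz) == 4
assert sum(points for _, points in quiz) == 2
```

With the instructor's 10 source questions, this scheme yields the 20 items worth 10 points described above.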

The first time I did this (in the 5th week of this quarter), my last question asked the students their opinion on this new style of testing. With the exception of the one student who was already doing exceptionally well, feedback was very positive. They appreciated the ability to correct themselves and felt that they better understood the material. Their explanations seemed genuine to me, so I’m hopeful that they’ll perform better on our midterms.

The second idea I implemented I borrowed from another biology colleague, Hillary Kemp. This one I’ve done with my non-majors Cellular Biology course, one that is typically tough for many students as they begin their path toward an allied health degree. Exam performance on my short-answer questions is always spotty (lots of higher-order Bloom’s Taxonomy questions). Usually I would go over the correct answers with the class, in the hope that they’d do better on the final. Now, rather than go over those answers, I give them their marked-up short-answer sections back and let them correct their answers for partial credit. I stress that in their corrections I’m looking for them to explain why they got it wrong and why the correct answer is correct. This is worth just enough to eliminate the need to curve the exam (essentially, they’re working to “earn” the curved points).

In my large class (n=48), results were mixed. Many students clearly explained why they got it wrong and showed they understood why the correct answer is correct. However, others just put down correct answers or, worse, Googled the answer and put down technically correct answers well above the level of our course. Again, I awarded points based upon their explanations rather than the correctness of their answers. I think this exam reflection is helping those students who genuinely want to do well in the class, as opposed to those who are maybe not too sure about this degree path. I’m hopeful that performance on our comprehensive final will show improvement because of this reflection exercise.

This post was generously contributed by Jeff Fennell, who teaches in the Biology department at Everett Community College.
