
Category: Tools

Tools and technologies to enhance teaching and learning activities

More on Student-Generated Exam Questions

Who writes “better” exam questions? In an earlier blog post, I talked about student-generated exam questions. Shriram Krishnamurthi, a professor of computer science at Brown University, called the approach an example of “Contributed Student Pedagogy.” Let’s add Benjamin Wiles, the chief data officer at Clemson University, to the conversation. Are you familiar with “Self-Determination Theory”? According to the website Verywell Mind, “In psychology, self-determination is an important concept that refers to each person’s ability to make choices and manage their own life. Self-determination allows people to feel that they have control over their choices and lives. It also has an impact on motivation—people feel more motivated to take action when they feel that what they do will have an effect on the outcome.” Will student-generated exam questions foster engagement? Will greater student learning take place?

Wiles has used student-generated exam questions in some of his math courses. He knows that some students may be a bit hesitant at first, but he believes this is an excellent way to build community and engagement from the very first day of the term. In fact, he asks students to help write the syllabus with him.

In a study at the University of Michigan, published in 2015 in the Journal of Dental Education, the rigor of exam questions generated by students was compared to that of questions generated by instructors. The students were members of a first-year dentistry course. Here’s where it gets interesting: three experts did a blind review of the exam questions, using Bloom’s Taxonomy as a scale, and discovered that almost half of the students’ questions assessed high-level cognitive skills, while only 16 percent of the instructors’ questions did. That gave me pause!

The study also interviewed students about this exercise (not the instructors, though!). More than three-quarters of the students found the exercise helpful in making connections between the dentistry course and other courses they were taking! “This exercise forced me to evaluate questions and review why they were right and wrong,” one student wrote.

Wouldn’t this be an interesting exercise to try in one of your classes this fall? Remember that students don’t automatically write great questions. As with the students in Wiles’s classes, it may take some coaxing and coaching, and an introduction to Bloom’s Taxonomy, but the outcomes will help students make more and better connections, both in your class and in other courses.

Comments closed

Quick Tip #3 – Giving Them a Say

Many of you know that I am a fan of James Lang, the author of such books as Small Teaching and co-author with Flower Darby of Small Teaching Online. [Small Teaching: how minor modifications to our teaching can have a major impact on student learning.]

I was reviewing an article Lang wrote several years ago [Small Changes in Teaching: Giving Them a Say] in which he described something every classroom instructor has experienced: civil attention. Civil attention is a phrase coined by Jay Howard in his book, Discussion in the College Classroom: Getting Your Students Engaged and Participating in Person and Online. In an article in the Chronicle of Higher Education, Lang writes, “Citing research that dates back to the 1970s, Howard writes that ‘in the vast majority of college classrooms, we expect college students to pay civil attention. Actually paying attention is optional.’ Students pay ‘civil attention’ when they face the front of the room, eyes open, taking notes and occasionally making eye contact with us. But we all know — from our own experiences in boring faculty meetings or conference talks — that looking like you’re paying attention doesn’t mean you are.”

Back to Small Teaching. Lang recognized that when students are passive learners (paying only civil attention), much less learning takes place, and when they are actively engaged, much more learning takes place. I think we all know this. In Giving Them a Say, Lang’s main point is that we should give students a measure of control to improve learning. Here are the three ways he recommends:

Student-generated exam questions. 

Students expect our exams to look a certain way, but what if we gave them a longer list of questions to choose from? Lang suggests creating more questions than you actually want students to answer, or even having students write their own questions. Asking students to write exam questions also makes a great group activity!

Open assessments. 

Lang also writes, “I have been intrigued in recent years with assessment systems in which students are offered a wide range of possible assignments and get to choose which ones to complete to earn the grade they desire. I profiled the work of one teacher who uses such a system, John Boyer, in my book Cheating Lessons: Learning From Academic Dishonesty. Bonni Stachowiak, host of the Teaching in Higher Ed podcast, spoke about her own use of open assessments in a recent episode.”

Class constitutions. 

Are you familiar with the concept of “a class constitution”? I found this idea intriguing. Cathy Davidson, in How a Class Becomes a Community: Theory, Method, Examples, asks, “Why does a class need community rules?” Perhaps you are not quite ready to go as far as Davidson does, allowing students to set most of the operating rules, but what about a Day 1 class discussion about things like participation, cell phone use in class, and late policies? Invite students into the decision-making process as part of your community building.

While many of us who are “old school” instructors may balk at the idea of giving students more control in the class, trying just one of these items might prove to be the first step toward a much more successful course with improved learning outcomes.

Comments closed

Student Feedback Loops

A few years ago, Edutopia, an excellent resource for faculty teaching at all levels, shared an article by Taylor Meredith on student feedback loops. Meredith writes, “A feedback loop is a process of checking for and affirming understanding that is specific, non-evaluative, manageable, and focused on a learning target.”

What Are Student Feedback Loops?

This process aims to move learning forward through feedback. Ideally, this feedback loop would happen frequently, in all subject areas. Meredith offers these steps as a way to start the process:

1. Begin With an Aim

An aim is a learning target or essential question that is unpacked from the standards, a part of a learning progression that is clearly communicated to the students at the beginning of each lesson.

2. Feedback Exchange

Feedback should be specific, non-evaluative, manageable, and focused on the aim. If the aim for the day is that writers should structure reasons to develop a compelling argument in a research-based essay, all feedback exchanged should be focused on that aim.

3. Revision and Application

In order for feedback to be effective, students must be given time to revise and apply their new understandings or ideas. Susan Brookhart and Connie Moss, authors of Advancing Formative Assessment in Every Classroom: A Guide for Instructional Leaders, speak of the Golden Second Opportunity, that moment when feedback is grasped and applied. When a student takes the feedback, makes changes to his or her work, and as a result moves a step closer to meeting the desired learning of the day’s aim, then the loop has started. It is authentic, purposeful learning. The teacher begins the process, but the student owns it.

4. Reflection

Closing the loop is time to reflect on the aim. Did students meet the desired learning of the day’s aim? Could they move to a different level of proficiency? Could they ask for more feedback? Are there any other areas to revise?

In student feedback loops, students are the ones who drive this process. The teacher supports the students by clearly defining a structure for feedback, modeling effective feedback, highlighting critical student feedback, and participating when necessary.

That’s Meredith’s approach to giving students feedback. Now let’s look at the feedback process as a way to improve your teaching. Consider formative assessments such as the Minute Paper as a way to solicit comments from students about a class lesson or a just-completed project. Spend several minutes at the end of class and have students do a quick write (anonymously) to find out whether they had difficulties, felt the directions weren’t clear, or were able to correctly summarize the topic. Using formative assessments to hear what students have to say about your class can improve your dialogue with students and help them develop a sense of belonging. If you collect student feedback, though, make sure you respond, and that your response comes quickly. Tell students what you learned (for example: “I heard you say that the directions on the project weren’t clear, and I will make sure to check in with you about directions before the next project is assigned.”)

Learn more about using formative assessments in your classes by reading this Edutopia article, 7 Smart, Fast Ways to Do Formative Assessment.

Resources:

Visible Learning for Teachers

Comments closed

Pushing Boundaries in Canvas #1

Occasionally we get questions from instructors wanting to do things that Canvas is ill-equipped to handle. This series celebrates that spirit of innovation and provides the best answers we can come up with to approximate the desired effect.

Quiz Feedback on randomized questions

Creating a practice quiz before a big exam that automatically provides immediate feedback* can be an effective formative assessment that is convenient for both student and instructor. One instructor had a brilliant idea to repurpose last quarter’s exams as practice quizzes. There were just a few complications: there was a mix of essay questions and auto-graded questions, the questions were randomized, and the instructor didn’t want to have to grade any of the questions manually. The desired result was that the student would take the practice exam with randomized questions and then get the answers and explanations in the feedback for each question. Unfortunately, Canvas quizzes don’t provide feedback text for ungraded questions, so students would be unable to compare their essay responses to the exemplars unless the instructor graded each question individually, even if it was simply to assign a 0 out of 0 possible.

Here are some of the solutions we came up with, none of which quite satisfied the instructor, but they may inspire you to try something:

  • Set the quiz to show “one question at a time” without backtracking. The next item will be a text-only “question” with the exemplar answer. When students go back to review the quiz, they can compare their answer to the example. These can be mixed in with auto-graded questions with feedback comments, as long as the quiz is set to allow students to see their responses and the correct answers. This option requires the questions to be in a set order, not randomized.
  • Keep the quiz randomized and add a number key to each essay question so students can look up the answer and feedback on an answer key that becomes accessible only once they have submitted the quiz. This can be done by setting up requirements in the module where the quiz resides. Auto-graded questions can still provide feedback within the quiz results.
  • Separate the practice quiz into two quizzes: one with randomized auto-graded questions and one with essay questions in a set order with an answer key released after it is submitted.

*Those are instructions for New Quizzes. For feedback on Classic Quizzes, see the instructions for individual question types. Here’s a useful resource for deciphering the answer and feedback settings in New Quizzes.

If you have a better solution that meets all of the conditions, we would love to hear it. You can post your answer in the comments or send it to elearning@everettcc.edu. But your situation might not be as complicated, so one of these solutions, or something similar, might work just fine for you. Also, if you think Canvas can work better for instructors and students with a bit of a tweak, you are always welcome to suggest something. That’s how Canvas evolves.

And if you can’t get Canvas to do something that you think it should, and you want to brainstorm options, don’t hesitate to contact us. That’s why we’re here.

Comments closed

Canvas Quirks #1

Is Canvas not behaving like you think it should? Is there an obnoxious feature that you can’t seem to turn off or change the settings for? You may have discovered a Canvas Quirk – a persistent feature of Canvas that gets under our skins or foils some of our best design ideas. We’ll be collecting examples of these and publishing them in this occasional column, along with workarounds or thoughts about how to turn what seem like bugs into features.

Future entries might be more elaborate, but here are three quick ones to get us started:

What’s the quirk?

Instructors can’t use the same Zoom link for meetings that happen at different times within the week, such as a Tuesday morning session and a Thursday afternoon session covering the same material. Canvas creates a unique link each time you create a new Zoom meeting and cannot create recurring sessions that meet at multiple times.

What is Canvas thinking? 

This is a security feature. Creating a unique Zoom link address for each session reduces the possibility of students sharing a recurring link that might be picked up by a Zoombomber. It also reduces the risk of FERPA violations. For this same reason, Canvas does not give you the option to use your personal room ID for recurring sessions.

What’s the workaround? 

See option 2 below. 


What’s the quirk? 

Recurring Zoom meetings crowd the ToDo list and overwhelm students. Instructors are unable to turn off the ToDo list or adjust any ToDo-related settings, unlike the Announcements at the top of the Home Page.

What is Canvas thinking? 

Canvas seems to have 3 operating principles here:

  • The ToDo list will show everything due within the next 7 days
  • The ToDo list would like at least 6 items and will reach further than a week to get them.
  • If there is only one type of event, like a recurring Zoom meeting, it will show all of them – even if that is 20 items.
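For the programmatically inclined, these observed rules can be sketched in code. To be clear, this is only a model of the behavior we have seen, not Canvas’s actual logic, and the event dictionaries and field names here are invented for illustration:

```python
from datetime import datetime, timedelta

def todo_items(events, now, window_days=7, min_items=6):
    """Approximate the observed ToDo behavior: show everything due within
    the next `window_days`; if that yields fewer than `min_items`, reach
    further ahead; if only one type of event exists (e.g. a recurring Zoom
    meeting), show all of them, however many there are."""
    upcoming = sorted((e for e in events if e["due"] >= now),
                      key=lambda e: e["due"])
    if len({e["type"] for e in upcoming}) == 1:
        return upcoming  # a single event type floods the list
    cutoff = now + timedelta(days=window_days)
    in_window = [e for e in upcoming if e["due"] <= cutoff]
    return in_window if len(in_window) >= min_items else upcoming[:min_items]

# Twenty recurring Zoom meetings and nothing else: all twenty appear.
now = datetime(2024, 1, 1)
zooms = [{"type": "zoom", "due": now + timedelta(days=i)} for i in range(20)]
print(len(todo_items(zooms, now)))  # prints 20
```

Notice how, in this model, adding even one item of another type (a quiz, a discussion) flips the list back to the seven-day window, which is exactly what workaround 1 below exploits.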

What’s the workaround? 

  1. If you started by setting up your Zoom meetings, adding other items with due dates in that week, such as discussions, assignments, and quizzes, should trigger the first two principles, reducing your list to a less overwhelming six items (or however many items are due that week). You can even add pages to your ToDo list if you want to remind students to do readings by a certain date.
  2. However, if a recurring Zoom meeting is the only thing you are assigning to students on Canvas, you can create a single meeting and then use the generated link as your recurring meeting. Publish the link prominently on your home page along with your Zoom session schedule, or on a dedicated Zoom session page with additional login, help, and troubleshooting information. The Zoom link does not expire after the official date of the meeting. If you still want individual meetings on your ToDo list, you can also add calendar events every week and include the Zoom link in the event description. You might be tempted to simplify this workaround by using your personal meeting link for all course meetings, but this could open the door to FERPA violations or Zoombombing if students from different sections or previous classes figured out that the link stays constant, so it isn’t recommended.

What’s the quirk?

There is a practical limit to how long answers to matching questions can be in Canvas Quizzes. Long answers simply run off the screen so that only part of the answer is visible, and even less is visible on a smaller screen. The answers don’t wrap.

What is Canvas thinking? 

Text-wrapping the answers would be confusing because the answers can’t be formatted with bullets to separate multiple-line answers. The answers are highlighted as the student scrolls through them, but even that might be confusing.

What’s the workaround? 

The right side of a matching question won’t wrap, but the left side will, so always put your (longer) description or example on the left and the (shorter) matching term on the right when creating matching questions. The short terms will then fit nicely in the dropdown answer menu. This works even if you have multiple examples of the same term or concept and you want students to use some of the same answers for multiple prompts or questions.

If you want a bit more flexibility when creating quiz questions, I also like the multiple dropdown question. It can do more than just short answers.

I hope you found these useful. If you have discovered a Canvas Quirk or have a better workaround than the ones suggested here, please send them to eLearning@everettcc.edu and let us know if we can give you credit.

Comments closed

Annotate Videos for Use in Your Courses

In the world of online, blended, hybrid, and “flipped” courses, video is one of the things that separates an average learning experience from an exceptional one. Unfortunately, the majority of videos–70%? 80%? quite possibly more, based on personal observations over the past several years–in these types of courses follow the same general model: a narrated slide presentation or screencast, sometimes a recording of a live lecture or webinar-style presentation. Video, in these cases, is really just a substitute for conducting an in-person lecture. And, like it or not, that isn’t something most students consider particularly exciting.

But course videos don’t have to be warmed-over lectures. Video has some real strengths when it comes to things like establishing social presence (a topic I’ve written about previously on this blog), demonstrating actions or phenomena that need to be observed visually to be fully understood, or leveraging the potential of multimedia learning to draw out and illustrate connections between concepts. Of course, making videos that do those things can be both difficult and time-consuming–and if there’s one thing that most faculty don’t have a lot of, it’s time.

So what can you do if you want to create more than a voiceover of your lecture slides but don’t have the time or skills needed to do so? One potential answer is video annotation. The idea is simple: take an existing video that someone else has produced and generously uploaded for public viewing, then add annotations that guide students through it, thereby connecting it to the specific topics or activities in your course. Carefully annotated, a video becomes something more than a lecture intended to convey information. Instead, at its best it functions as an exercise in modeling for students a thought process or a line of inquiry. A well-annotated video can establish a kind of dialogue between the video’s content and the larger conceptual or theoretical structures of the course or program of study. It can invite students to explore or understand a topic more deeply, and to connect that topic to others they have studied, rather than passively viewing it simply for its informational content.

Video annotations can work in a number of different ways, depending on the specific tool being used to create and display them, but the general idea is to attach notes, usually in the form of text (although some tools support images and multimedia), to specific spans of time within the video. Once the annotations have been created, many annotation tools allow the viewer either to watch the full video and see the annotations displayed in a synchronized fashion as it plays or, alternatively, to jump immediately to specific portions of the video based on a selected annotation.
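Under the hood, most of these tools boil down to the same simple structure: notes anchored to time spans, looked up against the current playback position. Here is a minimal sketch of that idea; the class and field names are my own invention for illustration, not any particular tool’s format:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    start: float  # seconds into the video
    end: float
    text: str

def annotations_at(annotations, t):
    """Return the annotations whose time span covers playback time t."""
    return [a for a in annotations if a.start <= t < a.end]

notes = [
    Annotation(0.0, 45.0, "Note how the speaker frames the central question."),
    Annotation(45.0, 120.0, "Compare this example to the case study from week 3."),
]

# At 60 seconds in, the second note is "live" and would be displayed.
print([a.text for a in annotations_at(notes, 60.0)])
```

Displaying notes in sync as the video plays is just this lookup repeated as the playhead moves; jumping from a selected annotation to its spot in the video is the same mapping run in reverse (seek to the annotation’s `start`).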

Some video annotation tools can also facilitate individual or group annotation by students, opening up a variety of possibilities for student projects or, potentially, for moving course discussions out of traditional threaded forums or message boards and into a world where comments are attached directly to the object of study itself.

Used thoughtfully and creatively, annotated videos provide numerous ways to move beyond the conventional video lecture in an online or hybrid course. In a companion post, coming soon, I’ll highlight a couple of free tools that you can start using right away to experiment with video annotation in your own courses. Stay tuned!

Comments closed

When Is Your Computer Unlike Mine? When We Consider Technology Opportunity Gaps

Regular readers of this blog will know that EvCC, like many community colleges across the country, continues to engage with knotty, challenging questions of equity in higher education. Conversations about equity have been central to Guided Pathways efforts at the college (and long before), and they’re also part of our work at the Center for Transformative Teaching. A few months ago, I wrote on this blog about my initial investigation of potential equity gaps in online course enrollments, and I’ve continued to think about this problem since then.

Equity in online, hybrid/blended, and technology-enhanced learning environments is in many ways a classic manifestation of the digital divide — inequalities in “access to, use of, or impact of information and communication technologies” (Wikipedia). The heart of the problem, in my mind, lies in the final part of that definition: the impact of technologies on the people using them. While we tend to be pretty good about asking important questions related to students’ access to technologies, all too often we overlook an even more significant question. Once we’ve ensured all students have access to learning technologies (for instance, through low-cost laptop rentals — a service we provide to students here at EvCC) what are we doing to ensure that the use of those technologies is providing the same advantages to all students? Are we inadvertently perpetuating inequities by assuming that the beneficial effects of educational technologies are evenly distributed and available to all?

Comments closed

Bring Them or Ban Them? Laptops (and Mobile Devices) in the Classroom

In the list of perennial ‘controversies’ at the intersection of teaching and technology, the lowly laptop computer has always played something of an outsized role. I’m old enough to remember a time when the laptop’s extreme portability was breathlessly heralded as something that would revolutionize how and where learning would take place. (“It only weighs eight pounds; ten if you include the charger! Now students can read, conduct research, or write papers anywhere and everywhere! The era of ubiquitous learning has arrived!”) I also remember some of the dire predictions that were lobbed back in response. (“Students will be endlessly distracted! They will use their computers as intellectual crutches instead of learning to think and do for themselves! The end of deep, focused learning has arrived! Besides, what’s wrong with going to the computer lab — or using a typewriter, for that matter?! “)

Comments closed

What Words Appear Most Frequently in Our Course Learning Objectives?

Here’s a fun way to spend a few minutes, assuming you’re the kind of person who enjoys looking at things like course learning objectives. (Is there anyone who doesn’t?)

This is a word cloud representation of the current course learning objectives for most of EvCC’s courses. It was generated using Voyant Tools, an online text analysis platform that can do all sorts of neat and sophisticated things with large quantities of text.

By default, the word cloud displays the most common words appearing in the collected course learning outcomes across all departments and divisions. You can move the Terms slider to display fewer or more words. If you’d like to look at the outcomes for a single course, click the Scale button, select the “Documents” option, and then choose the specific course you’re interested in.

I find this visualization interesting to think about in relation to Bloom’s Taxonomy of Educational Objectives (a nice web version can be found here). By removing a lot of the domain- and subject-specific words that often appear in learning objectives, the word cloud view illuminates some of the broader categories of learning our courses identify as essential to a student’s progress through a given course and program of study. Looking at these categories in terms of their position along Bloom’s spectrum of lower-to-higher-order thinking strikes me as a productive and potentially revealing exercise: what should we make of the prominence of words like “demonstrate,” “describe,” and “identify” and the diminutive size of “analyze” and “create”?
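If you would like to experiment with this kind of analysis yourself, the core of what Voyant does for a word cloud is just counting words after filtering out common “stop words.” Here is a rough sketch in Python; the sample objectives and the abbreviated stop-word list are invented for illustration:

```python
from collections import Counter
import re

# A tiny illustrative stop-word list; Voyant ships a much longer one.
STOP_WORDS = {"the", "a", "an", "and", "of", "to", "in", "will", "be", "able"}

objectives = [
    "Demonstrate the ability to analyze primary sources.",
    "Describe the major phases of cell division.",
    "Identify and describe key historical events.",
]

words = re.findall(r"[a-z]+", " ".join(objectives).lower())
counts = Counter(w for w in words if w not in STOP_WORDS)
print(counts.most_common(3))
```

A word cloud is then just this frequency table rendered with font size proportional to count; the sizing of “demonstrate” versus “analyze” in the cloud above reflects exactly these kinds of tallies across the full set of objectives.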

Comments closed

Incorporating Metacognition Practices Mid-Quarter


At a recent conference on departmental support of evidence-based teaching practices in Biology (PULSE), I picked up two Metacognition techniques to bring into my classrooms. These seemed so powerful and, honestly, easy to implement, that I did it the following week.

This first idea stems from work that Ricky Dooley (a new colleague in Biology) developed with Scott Freeman and others at the University of Washington. In my majors’ Biology class, I have weekly quizzes over the past week’s material: standard in-class quizzes, mostly multiple choice (taken with iClickers), with a short-answer question here and there. Student performance was mixed, and when we went over the correct answers, many students had “ah-ha” moments when ideas began to click.

Of course, these ah-ha moments were a few moments too late to help on that particular quiz. What I’ve begun doing is flipping that around. First off, I’ve moved this quiz completely onto Canvas. And rather than the usual 10 questions/10 points, the quizzes now have 20 questions, still worth 10 points. The first question is the usual question I would ask (although I’ve added more short-answer questions, reflecting the questions I will ask on the exams). This first question (and every odd-numbered question) is worth zero points, so there’s no risk to students in doing their best from memory (and no reason to cheat). The second question (and every even-numbered question) repeats the same question, followed by how I would answer it. This question then asks students whether they think they got it right, wrong, or somewhere in between. If they didn’t get it right, I ask them to 1) explain why they got it wrong, 2) state what the right answer is, and 3) explain why the right answer is correct. This question is worth 1 point, and I grade it based on how they’ve reflected on their work. Sometimes, within their summary explanations, students will still not fully understand the material. Here, it’s very easy for me to jump in (while grading) and help them individually. An additional benefit is that these quizzes, with the addition of more short-answer questions, more closely resemble the question types I have on my midterms.
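The paired-question pattern described above is mechanical enough to generate from a plain list of questions and model answers. Here is a sketch of that idea; the data structure is purely illustrative (not the Canvas quiz format), and the sample question is invented:

```python
def build_reflection_quiz(questions):
    """Expand each (prompt, model_answer) pair into the two-question pattern:
    an ungraded first attempt, then a one-point self-assessment that shows
    the model answer and asks the student to explain any error."""
    items = []
    for prompt, model_answer in questions:
        items.append({"prompt": prompt, "points": 0})  # odd: risk-free attempt
        items.append({  # even: reflection, graded on the quality of reflection
            "prompt": (f"Here is how I would answer: {model_answer} "
                       "Did you get it right? If not, explain why your answer "
                       "was wrong and why the correct answer is correct."),
            "points": 1,
        })
    return items

quiz = build_reflection_quiz([
    ("What organelle carries out cellular respiration?", "The mitochondrion."),
])
print(len(quiz), quiz[0]["points"], quiz[1]["points"])  # prints: 2 0 1
```

Ten content questions expand into the twenty-question, ten-point structure described above, with all of the graded weight sitting on the reflection halves.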

The first time I did this (in the 5th week of this quarter), my last question asked the students their opinion of this new style of testing. With the exception of the one student who was already doing exceptionally well, feedback was very positive. They appreciated the ability to correct themselves and feel that they better understand the material. Their explanations seemed genuine to me, so I’m hopeful that they’ll perform better on our midterms.

The second idea I implemented I borrowed from another biology colleague, Hillary Kemp. This one I’ve used in my non-majors Cellular Biology course, one that is typically tough for many students as they begin their path toward an allied health degree. Exam performance on my short-answer questions is always spotty (lots of higher-order Bloom’s Taxonomy questions). Usually I would go over the correct answers with the class, in the hope that students would do better on the final. Now, rather than go over those answers, I give them their marked-up short-answer sections back and let them correct their answers for partial credit. I stress that in their corrections I’m looking for them to explain why they got it wrong and why the correct answer is correct. This is worth just enough to eliminate the need to curve the exam (essentially, they’re working to “earn” the curved points). In my large class (n=48), results were mixed. Many students clearly explained why they got it wrong and showed they understood why the correct answer is correct. However, others just put down correct answers or, worse, Googled the answer and put down technically correct answers well above the level of our course. Again, I awarded points based on their explanations rather than the correctness of their answers. I think this exam reflection is helping those students who genuinely want to do well in the class, as opposed to those who are maybe not too sure about this degree path. I’m hopeful that performance on our comprehensive final will show improvement because of this reflection exercise.

This post was generously contributed by Jeff Fennell, who teaches in the Biology department at Everett Community College.

Comments closed