
The Center for Transformative Teaching Posts

Bring Them or Ban Them? Laptops (and Mobile Devices) in the Classroom

In the list of perennial ‘controversies’ at the intersection of teaching and technology, the lowly laptop computer has always played something of an outsized role. I’m old enough to remember a time when the laptop’s extreme portability was breathlessly heralded as something that would revolutionize how and where learning would take place. (“It only weighs eight pounds; ten if you include the charger! Now students can read, conduct research, or write papers anywhere and everywhere! The era of ubiquitous learning has arrived!”) I also remember some of the dire predictions that were lobbed back in response. (“Students will be endlessly distracted! They will use their computers as intellectual crutches instead of learning to think and do for themselves! The end of deep, focused learning has arrived! Besides, what’s wrong with going to the computer lab — or using a typewriter, for that matter?!”)


Revisiting Online Quizzes

The Teaching and Tools workshop series included two seminars with the tongue-in-cheek title “Beat the Cheat.” The first session was a broader exploration of the general premise of exams as an assessment tool (spoiler alert – Derek is an occasional skeptic), and the second session explored some of the Canvas features that allow for “security” measures when online quizzes are offered.

Feel free to take a listen to the podcast versions here:

Part One podcast

Part Two podcast

You can also access the transcripts here:

Beat the Cheat part one transcript

Beat the Cheat part two transcript

And the handouts from the in-person workshops are available as well!

Beat the Cheat part one handout

Beat the Cheat part two handout



Succeed in College by … Sleeping More?

Man and dog sleeping on couch

Sleeping by Andrew Roberts licensed under Creative Commons CC-BY-2.0

Do you ever talk to your students about what they can do to be successful in your class and in college more generally? When you have that conversation, what are the essential factors that you discuss?

Is sleep one of them? If not, maybe it should be.

I recently finished reading Matthew Walker’s Why We Sleep: Unlocking the Power of Sleep and Dreams, a book that is considerably more substantive than its vaguely pop-sci title makes it sound. Walker, a respected sleep researcher, directs the Sleep and Neuroimaging Lab at the University of California – Berkeley, and his book offers a very readable synthesis of what scientists have learned about sleep’s essential role in human health, psychological well-being, and–as it now turns out–learning.


Rethinking the Syllabus

After a late-2017 hiatus here on the CTT blog, I thought the first post of 2018 should touch on something many of us might be thinking about as winter quarter classes begin at EvCC today: the course syllabus.

Useful information about constructing a course syllabus can be found almost everywhere these days: here, here, here — I could keep this up for a long time, but won’t since you get the idea (and know perfectly well how to perform your own internet searches). But over on the Chronicle of Higher Education’s ChronicleVitae blog, Kevin Gannon last fall posted a series of musings that go beyond the general “how-to” approach you’ll find in most syllabus guides, tutorials, and similar resources. Instead, he invites us to ask what a syllabus is for, why it matters, and what we can do as teachers to bring the present-day syllabus back into the realm of “good pedagogy.”

Here’s a taste, from the first of the series, “What Is a Syllabus Really For, Anyway?”:

The key role of this document is spelled out clearly in The Course Syllabus: A Learning-Centered Approach: ‘The syllabus provides the first opportunity faculty have to encourage and guide students to take responsibility for their learning…When reading a learning-centered syllabus, students learn what is required to achieve the course objectives, and they learn what processes will support their academic success.’ In short, students need to know what they need to do to succeed in your course, and how they’re being empowered to do it.

But the syllabus has evolved (hideously mutated?) from a course guide to its present-day incarnation as a lengthy compendium of policies and procedural statements where the course material almost feels like an afterthought.

So how do we reclaim the syllabus for its rightful purpose? The first step is to ask, What is a syllabus for, anyway? If we can’t answer that question concisely and unambiguously, then there are conversations that need to happen.

Read the full article here, then see Parts 2 and 3 of the DIY Syllabus series, “What Goes Into a Syllabus?” and “How to Move Beyond the Transactional.”

How do you approach your syllabus? Have your views on “what a syllabus is really for” changed in recent years? How do you keep your syllabus fresh, engaging, and useful as a teaching instrument?



What Words Appear Most Frequently in Our Course Learning Objectives?

Here’s a fun way to spend a few minutes, assuming you’re the kind of person who enjoys looking at things like course learning objectives. (Is there anyone who doesn’t?)

This is a word cloud representation of the current course learning objectives for most of EvCC’s courses. It was generated using Voyant Tools, an online text analysis platform that can do all sorts of neat and sophisticated things with large quantities of text.

By default, the word cloud displays the most common words appearing in the collected course learning outcomes across all departments and divisions. You can move the Terms slider to display fewer or more words. If you’d like to look at the outcomes for a single course, click the Scale button, select the “Documents” option, and then choose the specific course you’re interested in.

I find this visualization interesting to think about in relation to Bloom’s Taxonomy of Educational Objectives (a nice web version can be found here). By removing a lot of the domain- and subject-specific words that often appear in learning objectives, the word cloud view illuminates some of the broader categories of learning our courses identify as essential to a student’s progress through a given course and program of study. Looking at these categories in terms of their position along Bloom’s spectrum of lower-to-higher-order thinking strikes me as a productive and potentially revealing exercise: what should we make of the prominence of words like “demonstrate,” “describe,” and “identify” and the diminutive size of “analyze” and “create”?
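Incidentally, the term counting that drives a word cloud like this is easy to reproduce yourself. Here’s a minimal Python sketch; the sample outcomes and stopword list are invented for illustration, and Voyant’s actual processing is considerably more sophisticated:

```python
from collections import Counter
import re

# Invented sample outcomes for illustration; a real run would use the
# full list of course learning objectives exported from the catalog.
outcomes = [
    "Demonstrate the ability to describe and identify major cell structures.",
    "Analyze primary sources and demonstrate effective written communication.",
    "Identify and describe key historical developments.",
]

# Common words to exclude, much as a word-cloud tool does by default.
STOPWORDS = {"the", "and", "to", "of", "a", "an", "in", "key", "major"}

def term_frequencies(texts):
    """Count how often each non-stopword term appears across all texts."""
    words = []
    for text in texts:
        words.extend(w for w in re.findall(r"[a-z]+", text.lower())
                     if w not in STOPWORDS)
    return Counter(words)

freqs = term_frequencies(outcomes)
print(freqs.most_common(3))
```

The `most_common` list is exactly what a word cloud sizes its terms by, so a quick script like this is one way to check whether “demonstrate” really does outweigh “analyze” in your own outcome language.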


A First Look at Equity in eLearning

As EvCC has continued its Guided Pathways efforts over the past year, equity has been frequently discussed as essential to helping students make informed decisions about their education and future careers. In a post on the Guided Pathways blog last spring, Samantha Reed discussed some of the ways that increased awareness of equity considerations can help programs identify gaps in outcomes, thereby creating openings for change that will help us “make sure our institution serves all our students equitably.” More recently, Director of Institutional Research Sean Gehrke has been posting on using data to identify equity gaps. Equity was also a topic of discussion at the summer meeting of our system’s eLearning Council, where we noted as a clear priority the need for more research on “equity gaps in applying technology to learning” and “structural barriers to access to technology-mediated instruction.”

Prompted by some of these ongoing conversations, I decided to do a little initial investigating of my own to see where there might be obvious equity gaps in the context of eLearning at EvCC. The real work of examining equity is difficult and potentially requires multiple types of data in order to get meaningful analytical purchase on its many dimensions. So as a somewhat easier starting point, I posed a fairly simple question: “Are there significant differences between student populations in face-to-face and online courses at EvCC?” Granted, that’s probably a diversity question rather than an equity question–but it creates necessary space for considering those more challenging equity issues in online learning. Once we have a better sense of who might be missing from online courses, we can take up the questions of why they’re missing and how their absence may be symptomatic of systemic inequities.

To answer my question, I turned to our institutional Enrollment, Headcounts, and FTE Tableau dashboard (thank you, Sean!) and started crunching some numbers.
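The basic comparison here is simple enough to sketch in a few lines of Python. The enrollment records and group names below are invented placeholders, not EvCC data; the real numbers would come from the dashboard:

```python
from collections import Counter

# Invented enrollment records for illustration: (modality, demographic group).
records = [
    ("face-to-face", "GroupA"), ("face-to-face", "GroupB"),
    ("face-to-face", "GroupA"), ("face-to-face", "GroupC"),
    ("online", "GroupA"), ("online", "GroupA"),
    ("online", "GroupA"), ("online", "GroupB"),
]

def group_shares(records, modality):
    """Return each group's share of enrollment within one modality."""
    counts = Counter(g for m, g in records if m == modality)
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items()}

f2f = group_shares(records, "face-to-face")
online = group_shares(records, "online")

# Flag groups whose online share trails their face-to-face share --
# a first, rough signal of who might be missing from online courses.
for group in f2f:
    gap = f2f[group] - online.get(group, 0.0)
    if gap > 0:
        print(f"{group}: {gap:.0%} lower share online")
```

A gap flagged this way is only a starting point, of course; it tells us who is underrepresented online, not why.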


Incorporating Metacognition Practices Mid-Quarter


At a recent conference on departmental support of evidence-based teaching practices in Biology (PULSE), I picked up two metacognition techniques to bring into my classrooms. They seemed so powerful and, honestly, so easy to implement that I tried them the following week.

This first idea stems from work that Ricky Dooley (a new colleague in Biology) developed with Scott Freeman and others at the University of Washington. In my majors’ Biology class, I give weekly quizzes over the past week’s material: standard in-class quizzes, mostly multiple choice (taken with iClickers), with a short-answer question here and there. Student performance was mixed, and when we went over the correct answers, many students had “ah-ha” moments when ideas began to click.

Of course, these ah-ha moments came a few moments too late to help on that particular quiz. What I’ve begun doing is flipping that around. First, I’ve moved the quiz completely onto Canvas. And rather than the usual 10 questions/10 points, the quizzes are now 20 questions, still worth 10 points. The first question is the usual question I would ask (although I’ve added more short-answer questions, reflecting the kinds of questions I ask on exams). This first question (and every odd-numbered question) is worth zero points, so there’s no risk to students in doing their best from memory (and no reason to cheat).

The second question (and every even-numbered question) repeats the same question, followed by how I would answer it, and then asks students whether they think they got it right, wrong, or somewhere in between. If they didn’t get it right, I ask them to 1) explain why they got it wrong, 2) state the right answer, and 3) explain why the right answer is correct. This question is worth 1 point, and I grade it based on how they’ve reflected on their work. Sometimes, within their summary explanations, students still don’t fully understand the material; here, it’s very easy for me to jump in (while grading) and help them individually. An additional benefit is that these quizzes, with the addition of more short-answer questions, more closely resemble the question types I have on my midterms.
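The paired structure described above can be sketched in a few lines of Python. The question text, model answers, and function name below are my own placeholders, not the actual quiz content:

```python
# Sketch of the paired-question quiz structure: each content question
# appears twice -- first as a zero-point honest attempt, then as a
# one-point self-assessment with the instructor's model answer attached.

def build_reflection_quiz(questions):
    """Expand base questions into (prompt, points) pairs: attempt, then reflection."""
    items = []
    for question, model_answer in questions:
        # Odd-numbered item: first attempt from memory, worth zero points.
        items.append((question, 0))
        # Even-numbered item: same question plus the model answer, asking
        # the student to assess and explain their own attempt, worth 1 point.
        items.append((f"{question}\nModel answer: {model_answer}\n"
                      "Did you get it right? If not, explain why you got it "
                      "wrong, give the right answer, and say why it is correct.", 1))
    return items

quiz = build_reflection_quiz([
    ("What organelle produces ATP?", "The mitochondrion"),
    ("Where does transcription occur?", "The nucleus"),
])
print(len(quiz), sum(points for _, points in quiz))  # 4 items, 2 points total
```

With the full 10 base questions, this yields the 20-item, 10-point quiz described above.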

The first time I did this (in the 5th week of this quarter), my last question asked the students their opinion of this new style of testing. With the exception of the one student who was already doing exceptionally well, feedback was very positive. They appreciated the ability to correct themselves and felt that they better understood the material. Their explanations seemed genuine to me, so I’m hopeful that they’ll perform better on our midterms.

The second idea I implemented I borrowed from another biology colleague, Hillary Kemp. I’ve done this with my non-majors Cellular Biology course, one that is typically tough for many students as they begin their path toward an allied health degree. Exam performance on my short-answer questions is always spotty (lots of higher-order Bloom’s Taxonomy questions). Usually I would go over the correct answers with the class, in the hope that they’d do better on the final. Now, rather than go over those answers, I give them their marked-up short-answer sections back and let them correct their answers for partial credit. I stress that in their corrections I’m looking for them to explain why they got it wrong, and why the correct answer is correct. This is worth just enough to eliminate the need to curve the exam (essentially, they’re working to “earn” the curved points). In my large class (n=48), results were mixed. Many students clearly explained why they got it wrong and understood why the correct answer is correct. However, others just put down correct answers or, worse, Googled the answer and put down technically correct answers well above the level of our course. Again, I awarded points based on their explanations rather than the correctness of their answers. I think this exam reflection is helping the students who genuinely want to do well in class, as opposed to those who are maybe not too sure about this degree path. I’m hopeful that performance on our comprehensive final will show improvement because of this reflection exercise.

This post was generously contributed by Jeff Fennell, who teaches in the Biology department at Everett Community College.


Friday Fun with Core Learning Outcomes

As our college has been gearing up for its accreditation site visit, which happens next week, I’ve been thinking quite a bit about our seven Core Learning Outcomes (CLOs). Naturally, they play a fairly large role in the self-evaluation report that we’re submitting to the accreditors, so that’s one reason they’ve been on my mind. But I’ve also been thinking about connections among those college-wide outcomes, and how those connections inform EvCC’s ongoing Guided Pathways work in various ways.

Noodling about on that topic recently, I found myself wanting to visualize how many CLO connections there actually are among the courses we offer. In particular, I wondered whether there might be certain clusters of courses that all support the same CLOs, even if the courses themselves are part of different departments, programs, or degree/certificate pathways. Here’s the visualization I came up with:

To get a sense of how courses in different divisions are connected to one another via shared CLOs, click on the name of a division you’re interested in. This will highlight all courses within that division. You can also click and drag any of the nodes representing an individual CLO; the node will lock in place wherever you release it, which can make it a bit easier to see which courses are clustered around specific outcomes. Hover your cursor over an individual course to reveal the specific CLOs it introduces. (A larger view is also available.)
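Under the hood, a network like this is just a bipartite graph: courses on one side, CLOs on the other, with an edge wherever a course introduces an outcome. Here’s a minimal Python sketch of that structure; the course numbers and CLO names are invented for illustration, not EvCC’s actual seven CLOs:

```python
from collections import defaultdict

# Invented course-to-CLO mapping for illustration; the real data would be
# the catalog's record of which CLOs each course introduces.
course_clos = {
    "ENGL 101": {"Communicate", "Think Critically"},
    "BIOL 160": {"Think Critically", "Use Information Responsibly"},
    "HIST 110": {"Communicate", "Use Information Responsibly"},
}

# Invert the mapping: one edge per (course, CLO) pair, grouped by CLO.
clo_to_courses = defaultdict(set)
for course, clos in course_clos.items():
    for clo in clos:
        clo_to_courses[clo].add(course)

# Courses "cluster" around a CLO when they all introduce it, even across
# departments -- the pattern the visualization makes visible.
for clo, courses in sorted(clo_to_courses.items()):
    if len(courses) > 1:
        print(f"{clo}: {sorted(courses)}")
```

The force-directed layout in the visualization is essentially doing the same grouping, just spatially instead of as a printed list.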

Do you see any interesting patterns in the network? Are there any groupings of courses you might not expect? How might visualizations like this help us see new connections or patterns that could help us approach our Guided Pathways efforts–particularly the process of developing pathways and program maps–with fresh ideas or insight into possible points of intersection across our college’s academic divisions?


Why you should appear (at least sometimes) in your course videos

First, a shameless plug: EvCC instructor Joe Graber and I will be teaming up to offer a one-hour workshop on October 3 on using the EvCC lightboard, built by a team of engineering faculty, to create engaging and effective instructional videos. If you haven’t already done so, mark your calendar!

With videos on my mind recently, and with this being a time of the year when many faculty are creating new videos to share with their students, I thought it might be useful to address a couple of the myths, misperceptions, and generalizations about instructional videos that I encounter most frequently.

Students don’t need to see me in videos. All they need to see are my slides and the information I’m presenting. (Besides, I hate being on camera!)


Storyline helps you tell stories with data

How frequently in your teaching do you use simple data to help students understand an important concept or trend, or to create opportunities for students to incorporate data into their own critical thinking around a particular subject or topic? Chances are you use data of some kind fairly frequently, even in disciplines that aren’t known for being particularly data-heavy. (As an example, in literature courses I taught I would frequently talk to students about, say, trends in literacy rates during the period we were studying, or shifts in newspaper circulation and public library memberships. In other words, I would share data that helped contextualize what we were reading in contemporary social, cultural, and economic conditions.)

All too often, when we use data in classes we treat it as something that is fairly static: a printed handout, an image on a slide, or a graph we draw on the whiteboard. There’s nothing wrong with that, exactly, but I often find myself wanting to give students a better entry point into data — and, more importantly, to help students understand the story the data can help us tell. “Teaching with data” is a broad category that can mean many things, but I take as one of its fundamental components a desire to teach students how to think with data and to construct meaning from it. So I was very excited to see that the Knight Lab recently released a tool for creating simple annotated charts. It’s called Storyline, and while its features are minimal I think it has great teaching potential.

Storyline is a web-based tool, and it’s so easy to use that if you know how to make a spreadsheet you can certainly make a Storyline. At the moment, Storyline makes it possible to generate a time-series line chart (essentially, a chart that shows a data variable over time) with up to 800 data points.

Unlike a static chart, Storyline allows you to attach brief textual annotations to individual data points. Here’s what it looks like in action:

The annotations are displayed in sequential order beneath the chart. Interaction with the chart can take two forms: clicking an annotated data point (those shown as circles on the chart) or clicking an annotation bubble beneath the chart. Go ahead — give it a try in the example above. And then keep reading to find out how to create your own…
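Since Storyline builds its chart from spreadsheet data, preparing your input is mostly a matter of laying out one row per data point with annotation text on the notable ones. Here’s a sketch of that shape in Python; the column names below are my own for illustration, so check Storyline’s documentation for the exact headers it expects:

```python
import csv
import io

# One row per data point in the time series; annotation text is attached
# only to the points you want to call out on the chart.
rows = [
    {"date": "2017-01", "value": "120", "annotation": ""},
    {"date": "2017-02", "value": "135", "annotation": "Library card drive begins"},
    {"date": "2017-03", "value": "180", "annotation": ""},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["date", "value", "annotation"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

A spreadsheet in this shape (a date column, a value column, and sparse annotations) is the kind of input an annotated time-series chart needs, and it stays well under Storyline’s 800-data-point limit for classroom-sized examples.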
