
Category: Tools

Tools and technologies to enhance teaching and learning activities

When Is Your Computer Unlike Mine? When We Consider Technology Opportunity Gaps

Regular readers of this blog will know that EvCC, like many community colleges across the country, continues to engage with knotty, challenging questions of equity in higher education. Conversations about equity have been central to Guided Pathways efforts at the college (and long before), and they’re also part of our work at the Center for Transformative Teaching. A few months ago, I wrote on this blog about my initial investigation of potential equity gaps in online course enrollments, and I’ve continued to think about this problem since then.

Equity in online, hybrid/blended, and technology-enhanced learning environments is in many ways a classic manifestation of the digital divide — inequalities in “access to, use of, or impact of information and communication technologies” (Wikipedia). The heart of the problem, in my mind, lies in the final part of that definition: the impact of technologies on the people using them. While we tend to be pretty good about asking important questions related to students’ access to technologies, all too often we overlook an even more significant question. Once we’ve ensured all students have access to learning technologies (for instance, through low-cost laptop rentals — a service we provide to students here at EvCC) what are we doing to ensure that the use of those technologies is providing the same advantages to all students? Are we inadvertently perpetuating inequities by assuming that the beneficial effects of educational technologies are evenly distributed and available to all?


Bring Them or Ban Them? Laptops (and Mobile Devices) in the Classroom

In the list of perennial ‘controversies’ at the intersection of teaching and technology, the lowly laptop computer has always played something of an outsized role. I’m old enough to remember a time when the laptop’s extreme portability was breathlessly heralded as something that would revolutionize how and where learning would take place. (“It only weighs eight pounds; ten if you include the charger! Now students can read, conduct research, or write papers anywhere and everywhere! The era of ubiquitous learning has arrived!”) I also remember some of the dire predictions that were lobbed back in response. (“Students will be endlessly distracted! They will use their computers as intellectual crutches instead of learning to think and do for themselves! The end of deep, focused learning has arrived! Besides, what’s wrong with going to the computer lab — or using a typewriter, for that matter?! “)


What Words Appear Most Frequently in Our Course Learning Objectives?

Here’s a fun way to spend a few minutes, assuming you’re the kind of person who enjoys looking at things like course learning objectives. (Is there anyone who doesn’t?)

This is a word cloud representation of the current course learning objectives for most of EvCC’s courses. It was generated using Voyant Tools, an online text analysis platform that can do all sorts of neat and sophisticated things with large quantities of text.

By default, the word cloud displays the most common words appearing in the collected course learning outcomes across all departments and divisions. You can move the Terms slider to display fewer or more words. If you’d like to look at the outcomes for a single course, click the Scale button, select the “Documents” option, and then choose the specific course you’re interested in.
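Voyant does all of this in the browser, but if you’d like to tinker with a similar word-frequency view locally, here’s a minimal Python sketch using the open-source wordcloud package. The input file name is hypothetical; any plain-text dump of the collected outcomes would do.

```python
# A rough local approximation of the Voyant word cloud, using the
# open-source "wordcloud" package (pip install wordcloud).
# Assumes clo_text.txt holds the collected learning outcomes as plain
# text; the filename is a hypothetical stand-in.
from collections import Counter

from wordcloud import WordCloud, STOPWORDS

with open("clo_text.txt", encoding="utf-8") as f:
    text = f.read().lower()

# Print the 20 most frequent non-stopword terms...
words = [w for w in text.split() if w.isalpha() and w not in STOPWORDS]
for term, count in Counter(words).most_common(20):
    print(f"{term}: {count}")

# ...and render a word cloud image similar to the Voyant view.
cloud = WordCloud(width=1200, height=600, stopwords=STOPWORDS).generate(text)
cloud.to_file("clo_wordcloud.png")
```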

I find this visualization interesting to think about in relation to Bloom’s Taxonomy of Educational Objectives (a nice web version can be found here). By removing a lot of the domain- and subject-specific words that often appear in learning objectives, the word cloud view illuminates some of the broader categories of learning our courses identify as essential to a student’s progress through a given course and program of study. Looking at these categories in terms of their position along Bloom’s spectrum of lower-to-higher-order thinking strikes me as a productive and potentially revealing exercise: what should we make of the prominence of words like “demonstrate,” “describe,” and “identify” and the diminutive size of “analyze” and “create”?


Incorporating Metacognition Practices Mid-Quarter


At a recent conference on departmental support of evidence-based teaching practices in biology (PULSE), I picked up two metacognition techniques to bring into my classrooms. These seemed so powerful and, honestly, so easy to implement that I put them into practice the following week.

The first idea stems from work that Ricky Dooley (a new colleague in Biology) developed with Scott Freeman and others at the University of Washington. In my majors’ Biology class, I give weekly quizzes over the previous week’s material: standard in-class quizzes, mostly multiple choice (taken with iClickers), with a short-answer question here and there. Student performance was mixed, and when we went over the correct answers, many students had “ah-ha” moments when ideas began to click.

Of course, those ah-ha moments came a few moments too late to help on that particular quiz, so I’ve begun flipping the process around. First, I’ve moved the quiz entirely onto Canvas. And rather than the usual 10 questions worth 10 points, each quiz now has 20 questions, still worth 10 points. The first question is the usual question I would ask (although I’ve added more short-answer questions, reflecting the questions I will ask on the exams). This first question, like all of the odd-numbered questions, is worth zero points, so there’s no risk to students in doing their best from memory (and no reason to cheat). The second question, like all of the even-numbered questions, repeats the same question, followed by how I would answer it. It then asks students whether they think they got it right, wrong, or somewhere in between. If they didn’t get it right, I ask them to 1) explain why they got it wrong, 2) state what the right answer is, and 3) explain why that answer is correct. This question is worth 1 point, and I grade it based on how well they’ve reflected on their work.

Sometimes, even in their summary explanations, students still don’t fully understand the material. Here, it’s very easy for me to jump in (while grading) and help them individually. An additional benefit is that these quizzes, with more short-answer questions, now more closely resemble the question types on my midterms.
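For anyone who wants to see the structure at a glance, here’s a minimal sketch of the pairing logic in Python. It isn’t Jeff’s actual Canvas quiz, and the sample question is invented, but it shows how ten base questions become twenty paired items worth ten points.

```python
# A sketch of the paired-question structure described above; not the
# actual Canvas quiz, just the pairing logic. Each base question becomes
# two quiz items: a zero-point attempt and a one-point reflection.
def build_metacognitive_quiz(base_questions):
    """base_questions: list of (question_text, model_answer) tuples."""
    items = []
    for text, model_answer in base_questions:
        # Odd-numbered item: the question itself, worth zero points,
        # so students can answer from memory with no grade pressure.
        items.append({"text": text, "points": 0})
        # Even-numbered item: the same question plus the instructor's
        # answer, asking students to diagnose their own response.
        reflection = (
            f"{text}\n\nHere is how I would answer: {model_answer}\n\n"
            "Did you get it right, wrong, or somewhere in between? "
            "If you got it wrong, explain (1) why you got it wrong, "
            "(2) what the right answer is, and (3) why it is correct."
        )
        items.append({"text": reflection, "points": 1})
    return items

# Hypothetical sample question, for illustration only.
quiz = build_metacognitive_quiz([
    ("What does the electron transport chain produce?",
     "A proton gradient that drives ATP synthase."),
])
print(f"{len(quiz)} items, {sum(q['points'] for q in quiz)} points possible")
```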

The first time I did this (in the fifth week of the quarter), my last question asked students for their opinion of this new style of testing. With the exception of one student who was already doing exceptionally well, feedback was very positive: they appreciated the ability to correct themselves and felt they better understood the material. Their explanations seemed genuine to me, so I’m hopeful they’ll perform better on our midterms.

The second idea, which I borrowed from another biology colleague, Hillary Kemp, I’ve implemented in my non-majors Cellular Biology course, one that is typically tough for many students as they begin their path toward an allied health degree. Exam performance on my short-answer questions is always spotty (lots of higher-order Bloom’s Taxonomy questions). Usually I would go over the correct answers with the class in the hopes that they’d do better on the final. Now, rather than go over those answers, I hand back their marked-up short-answer sections and let them correct their answers for partial credit. I stress that in their corrections I’m looking for them to explain why they got a question wrong and why the correct answer is correct. This is worth just enough to eliminate the need to curve the exam (essentially, they’re working to “earn” the curved points).

In my large class (n=48), results were mixed. Many students clearly explained why they got a question wrong and understood why the correct answer is correct. Others, however, just put down correct answers or, worse, Googled the answer and put down technically correct answers well above the level of our course. Again, I awarded points based on the explanations rather than the correctness of the answers. I think this exam reflection is helping the students who genuinely want to do well in the class, as opposed to those who are perhaps unsure about this degree path. I’m hopeful that performance on our comprehensive final will show improvement because of this reflection exercise.

This post was generously contributed by Jeff Fennell, who teaches in the Biology department at Everett Community College.


Friday Fun with Core Learning Outcomes

As our college has been gearing up for its accreditation site visit, which happens next week, I’ve been thinking quite a bit about our seven Core Learning Outcomes (CLOs). Naturally, they play a fairly large role in the self-evaluation report that we’re submitting to the accreditors, so that’s one reason they’ve been on my mind. But I’ve also been thinking about connections among those college-wide outcomes, and about the various ways those connections inform EvCC’s ongoing Guided Pathways work.

Noodling about on that topic recently, I found myself wanting to visualize how many CLO connections there actually are among the courses we offer. In particular, I wondered whether there might be certain clusters of courses that all support the same CLOs, even if the courses themselves are part of different departments, programs, or degree/certificate pathways. Here’s the visualization I came up with:

To get a sense of how courses in different divisions are connected to one another via shared CLOs, click on the name of a division you’re interested in. This will highlight all courses within that division. You can also click and drag any of the nodes representing an individual CLO; the node will lock in place wherever you release it, which can make it a bit easier to see which courses are clustered around specific outcomes. Hover your cursor over an individual course to reveal the specific CLOs it introduces. (A larger view is also available.)
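For the curious, here’s a minimal sketch of how a network like this can be assembled in Python with networkx. The course/CLO pairings below are invented for illustration; the real visualization draws on the college’s curriculum data.

```python
# A minimal sketch of a course-to-CLO network; the course/CLO pairs
# shown here are made up for illustration.
import networkx as nx

course_clos = {
    "ENGL 101": ["Communicate", "Think Critically"],
    "MATH 141": ["Reason Quantitatively", "Think Critically"],
    "BIOL 160": ["Reason Quantitatively", "Use Information Resources"],
}

G = nx.Graph()
for course, clos in course_clos.items():
    G.add_node(course, kind="course")
    for clo in clos:
        G.add_node(clo, kind="clo")
        G.add_edge(course, clo)

# Courses sharing a CLO end up two hops apart; CLO nodes with high
# degree are the outcomes the most courses cluster around.
clo_degree = {n: G.degree(n) for n, d in G.nodes(data=True) if d["kind"] == "clo"}
print(sorted(clo_degree.items(), key=lambda kv: -kv[1]))
```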

Do you see any interesting patterns in the network? Are there any groupings of courses you might not expect? How might visualizations like this help us see new connections or patterns that could help us approach our Guided Pathways efforts–particularly the process of developing pathways and program maps–with fresh ideas or insight into possible points of intersection across our college’s academic divisions?


Why you should appear (at least sometimes) in your course videos

First, a shameless plug: EvCC instructor Joe Graber and I will be teaming up on October 3 to offer a one-hour workshop on using the EvCC lightboard, built by a team of engineering faculty, to create engaging and effective instructional videos. If you haven’t already done so, mark your calendar!

With videos on my mind recently, and with this being a time of the year when many faculty are creating new videos to share with their students, I thought it might be useful to address a couple of the myths, misperceptions, and generalizations about instructional videos that I encounter most frequently.

Students don’t need to see me in videos. All they need to see are my slides and the information I’m presenting. (Besides, I hate being on camera!)


Storyline helps you tell stories with data

How frequently in your teaching do you use simple data to help students understand an important concept or trend, or to create opportunities for students to incorporate data into their own critical thinking around a particular subject or topic? Chances are you use data of some kind fairly frequently, even in disciplines that aren’t known for being particularly data-heavy. (As an example, in literature courses I taught I would frequently talk to students about, say, trends in literacy rates during the period we were studying, or shifts in newspaper circulation and public library memberships. In other words, I would share data that helped contextualize what we were reading in contemporary social, cultural, and economic conditions.)

All too often, when we use data in classes we treat it as something that is fairly static: a printed handout, an image on a slide, or a graph we draw on the whiteboard. There’s nothing wrong with that, exactly, but I often find myself wanting to give students a better entry point into data — and, more importantly, to help students understand the story the data can help us tell. “Teaching with data” is a broad category that can mean many things, but I take as one of its fundamental components a desire to teach students how to think with data and to construct meaning from it. So I was very excited to see that the Knight Lab recently released a tool for creating simple annotated charts. It’s called Storyline, and while its features are minimal I think it has great teaching potential.

Storyline is a web-based tool, and it’s so easy to use that if you know how to make a spreadsheet you can certainly make a Storyline. At the moment, Storyline makes it possible to generate a time-series line chart (essentially, a chart that shows a data variable over time) with up to 800 data points.

Unlike a static chart, Storyline allows you to attach brief textual annotations to individual data points. Here’s what it looks like in action:

The annotations are displayed in sequential order beneath the chart. Interaction with the chart can take two forms: clicking an annotated data point (those shown as circles on the chart) or clicking an annotation bubble beneath the chart. Go ahead — give it a try in the example above. And then keep reading to find out how to create your own…
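Storyline handles all of this for you from a spreadsheet, but if you’d like a feel for what an annotated time-series chart involves, here’s a rough approximation in Python with matplotlib. The data and annotation are invented, and this isn’t Storyline’s own data format, just the same idea sketched by hand.

```python
# A rough matplotlib approximation of an annotated time-series chart,
# in the spirit of Storyline. Data points and labels are invented.
import matplotlib.pyplot as plt

years = [1990, 1995, 2000, 2005, 2010, 2015]
values = [52, 58, 63, 61, 70, 74]

fig, ax = plt.subplots()
ax.plot(years, values, marker="o")

# Attach a brief textual annotation to an individual data point,
# the way Storyline pairs each annotation with a circled point.
ax.annotate("Dip follows policy change",
            xy=(2005, 61), xytext=(2001, 66),
            arrowprops={"arrowstyle": "->"})

ax.set_xlabel("Year")
ax.set_ylabel("Value")
plt.show()
```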


See the EvCC lightboard in action

Last Thursday’s Opening Week session on “Cool things faculty are doing in the classroom,” facilitated by my colleague Peg, was great fun–and a good chance for me to find out more about some of the thoughtful and innovative work EvCC faculty are doing. I learned something from every presenter, and as a result my notebook is now brimming with new ideas for future workshops, conversations, and potential blog posts.

For now, though, I’ll mention just one of the cool things from the session: Joe Graber’s demonstration of the lightboard he and some of his EvCC engineering colleagues have constructed over the past year and are now using to create videos for their courses. What’s a lightboard, you ask? It’s essentially a transparent, edge-illuminated chalkboard you can use to create videos that show you and what you’re writing at the same time. If that’s hard to envision, take a look at this demonstration video that Joe has created to show off some of the lightboard’s uses and capabilities:

This is DIY educational technology at its best!

Joe will be hosting an informal demonstration at 2:30 p.m. on Tuesday, September 19, in Whitehorse 109 if you want to stop by to take a quick look. Later this fall, we’ll also be offering a workshop on creating videos using the lightboard, combining a discussion of best practices in planning and structuring lightboard videos with an opportunity to visit the lightboard studio and give it a try yourself.

[Update 9/20/2017 — Joe and I will be facilitating a workshop on October 3, at noon, in Whitehorse 105. We’ll discuss recommendations for creating effective videos using the lightboard, then spend some time putting it through its paces. Light snacks will be provided, but bring your lunch — and your curiosity! For complete details, see our schedule of upcoming workshops.]


Panopto’s long tail

Last week I posted briefly about exploring some simple data showing how many EvCC courses use Canvas. This time around I’m turning my attention to Panopto, our video content management platform. Extracting useful information out of Panopto is a bit harder, so I figured I’d start with something simple: the total number of video hours viewed by (anonymized) course.
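For anyone curious how such a tally can be explored once the data is in hand, here’s a simplified Python sketch. The CSV file and column names are hypothetical stand-ins for whatever export or API query actually produces the data.

```python
# A sketch of how the "long tail" view might be assembled, assuming the
# viewing data has been exported to a CSV with (hypothetical) columns
# course_id and hours_viewed.
import pandas as pd
import matplotlib.pyplot as plt

views = pd.read_csv("panopto_hours_by_course.csv")
views = views.sort_values("hours_viewed", ascending=False).reset_index(drop=True)

# A handful of courses account for most viewing; the rest form the tail.
top10_share = views["hours_viewed"].head(10).sum() / views["hours_viewed"].sum()
print(f"Top 10 courses account for {top10_share:.0%} of all hours viewed")

views["hours_viewed"].plot(kind="bar")
plt.xticks([])
plt.xlabel("Courses (anonymized, ranked)")
plt.ylabel("Hours viewed")
plt.show()
```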

Let’s take a look:


How many EvCC courses use Canvas?

I’m asked on a fairly regular basis how many courses at EvCC use the campus learning management system, Canvas, in some capacity. There are many reasons for this question–ranging from general curiosity to specific ideas the questioner may have about, say, the most effective methods for communicating with students–but until fairly recently I couldn’t provide a very reliable answer. That’s partly because we automatically create an empty Canvas course (what we sometimes call a “shell”) for every course at the college, meaning we can’t simply assume that the existence of a course in Canvas indicates active use by the faculty member teaching it. The difficulty in pinning down exactly how many courses use Canvas is also due, in part, to the many other purposes for which faculty, staff, and students use Canvas: clubs and student organizations; departmental or program-based groups; faculty and staff programs; and so on.

Unsatisfied with only being able to say that “many” or “the majority” of courses at the college use Canvas in some way, I set out last fall to develop a more reliable measure of Canvas use and its change, if any, over the past few years. I’m happy to say the results are in. By combining course information from our student management system with data from the Canvas API, we can quickly identify the subset of Canvas shells that correspond to courses students take for credit at the college. Then, within that subset, we look only at those courses that have been published and that have at least 3 students enrolled. (I won’t bore you with the details of why that is necessary, but in general it helps filter out a variety of unusual cases that might otherwise provide a false sense of the rate of Canvas use.)
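To make the filtering concrete, here’s a simplified sketch of that logic in Python. The two CSV extracts and their column names are hypothetical; the real pipeline pulls from our student management system and the Canvas API.

```python
# A simplified sketch of the filtering logic described above. Assume two
# hypothetical extracts: credit_courses.csv from the student management
# system, and canvas_shells.csv derived from the Canvas API, with each
# shell's SIS course id, workflow state, and enrollment count.
import pandas as pd

credit = pd.read_csv("credit_courses.csv")   # column: sis_course_id
shells = pd.read_csv("canvas_shells.csv")    # sis_course_id, workflow_state, student_count

# Keep only shells that correspond to credit-bearing courses...
merged = shells.merge(credit, on="sis_course_id", how="inner")

# ...then only those that are published ("available" in Canvas terms)
# and have at least 3 students enrolled.
active = merged[(merged["workflow_state"] == "available") &
                (merged["student_count"] >= 3)]

rate = len(active) / len(credit)
print(f"{rate:.0%} of credit courses show active Canvas use")
```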

This yields a reasonably good approximation of actual Canvas use for credit-bearing courses at EvCC:

As this chart shows, 83% of courses at the college used Canvas in the spring of 2017, up from about 68% when we first moved to Canvas in 2013.

Obviously, this doesn’t tell us anything at all about how Canvas is being used, or why, or whether it benefits students or faculty. There are other data that could help us begin to investigate all of those more nuanced and complex questions–and I hope to write about some of that here in the future–but these numbers alone don’t tell any of those stories. Still, it’s interesting to observe the adoption of this particular platform on our campus over time.
