
Tag: Visualization

What Words Appear Most Frequently in Our Course Learning Objectives?

Here’s a fun way to spend a few minutes, assuming you’re the kind of person who enjoys looking at things like course learning objectives. (Is there anyone who doesn’t?)

This is a word cloud representation of the current course learning objectives for most of EvCC’s courses. It was generated using Voyant Tools, an online text analysis platform that can do all sorts of neat and sophisticated things with large quantities of text.

By default, the word cloud displays the most common words appearing in the collected course learning outcomes across all departments and divisions. You can move the Terms slider to display fewer or more words. If you’d like to look at the outcomes for a single course, click the Scale button, select the “Documents” option, and then choose the specific course you’re interested in.

I find this visualization interesting to think about in relation to Bloom’s Taxonomy of Educational Objectives (a nice web version can be found here). By removing a lot of the domain- and subject-specific words that often appear in learning objectives, the word cloud view illuminates some of the broader categories of learning our courses identify as essential to a student’s progress through a given course and program of study. Looking at these categories in terms of their position along Bloom’s spectrum of lower-to-higher-order thinking strikes me as a productive and potentially revealing exercise: what should we make of the prominence of words like “demonstrate,” “describe,” and “identify” and the diminutive size of “analyze” and “create”?
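If you’re curious what Voyant is doing behind the scenes, the frequency counts that drive a word cloud are simple to reproduce. Here’s a minimal Python sketch using a handful of made-up objectives and a deliberately tiny stop-word list (Voyant’s actual stop list is much larger, and the real corpus is our full set of course outcomes):

```python
from collections import Counter
import re

# Hypothetical sample objectives; the real input is EvCC's full
# collection of course learning outcomes.
objectives = [
    "Demonstrate an understanding of basic statistical concepts.",
    "Identify and describe the major components of a cell.",
    "Analyze primary sources and demonstrate critical reading skills.",
]

# A partial stop-word list, standing in for Voyant's default list.
STOP_WORDS = {"a", "an", "and", "of", "the", "to"}

def word_frequencies(texts):
    """Count word occurrences across all texts, ignoring case and stop words."""
    words = re.findall(r"[a-z]+", " ".join(texts).lower())
    return Counter(w for w in words if w not in STOP_WORDS)

freqs = word_frequencies(objectives)
print(freqs.most_common(3))
```

The words that end up largest in the cloud are simply the ones with the highest counts here; “demonstrate” wins in this toy sample just as it does in our real outcomes.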


A First Look at Equity in eLearning

As EvCC has continued its Guided Pathways efforts over the past year, equity has been frequently discussed as essential to helping students make informed decisions about their education and future careers. In a post on the Guided Pathways blog last spring, Samantha Reed discussed some of the ways that increased awareness of equity considerations can help programs identify gaps in outcomes, thereby creating openings for change that will help us “make sure our institution serves all our students equitably.” More recently, Director of Institutional Research Sean Gehrke has been posting on using data to identify equity gaps. Equity was also a topic of discussion at the summer meeting of our system’s eLearning Council, where we noted as a clear priority the need for more research on “equity gaps in applying technology to learning” and “structural barriers to access to technology-mediated instruction.”

Prompted by some of these ongoing conversations, I decided to do a little initial investigating of my own to see where there might be obvious equity gaps in the context of eLearning at EvCC. The real work of examining equity is difficult and potentially requires multiple types of data in order to get meaningful analytical purchase on its many dimensions. So as a somewhat easier starting point, I posed a fairly simple question: “Are there significant differences between student populations in face-to-face and online courses at EvCC?” Granted, that’s probably a diversity question rather than an equity question–but it creates necessary space for considering those more challenging equity issues in online learning. Once we have a better sense of who might be missing from online courses, we can take up the questions of why they’re missing and how their absence may be symptomatic of systemic inequities.

To answer my question, I turned to our institutional Enrollment, Headcounts, and FTE Tableau dashboard (thank you, Sean!) and started crunching some numbers.
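The basic arithmetic here is straightforward: for each demographic group, compare its share of face-to-face enrollment with its share of online enrollment. A sketch of that comparison, using entirely hypothetical headcounts (the real figures come from the Tableau dashboard):

```python
# Hypothetical headcounts by modality and demographic group; group names
# and numbers are placeholders, not actual EvCC data.
enrollment = {
    "face_to_face": {"group_a": 900, "group_b": 300},
    "online":       {"group_a": 700, "group_b": 100},
}

def group_share(modality, group):
    """Return a group's proportion of total enrollment in one modality."""
    counts = enrollment[modality]
    return counts[group] / sum(counts.values())

# A positive gap means the group is underrepresented online relative
# to face-to-face courses.
gap = group_share("face_to_face", "group_b") - group_share("online", "group_b")
print(f"group_b share gap (F2F minus online): {gap:.1%}")
```

A gap like this doesn’t explain anything by itself, but it flags where the more difficult “why” questions are worth asking.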


Friday Fun with Core Learning Outcomes

As our college has been gearing up for its accreditation site visit, which happens next week, I’ve been thinking quite a bit about our seven Core Learning Outcomes (CLOs). Naturally, they play a fairly large role in the self-evaluation report that we’re submitting to the accreditors, so that’s one reason they’ve been on my mind. But I’ve also been thinking about connections among those college-wide outcomes, and how those connections inform EvCC’s ongoing Guided Pathways work in various ways.

Noodling about on that topic recently, I found myself wanting to visualize how many CLO connections there actually are among the courses we offer. In particular, I wondered whether there might be certain clusters of courses that all support the same CLOs, even if the courses themselves are part of different departments, programs, or degree/certificate pathways. Here’s the visualization I came up with:

To get a sense of how courses in different divisions are connected to one another via shared CLOs, click on the name of a division you’re interested in. This will highlight all courses within that division. You can also click and drag any of the nodes representing an individual CLO; it will lock in place wherever you release it, which can make it a bit easier to see which courses are clustered around specific outcomes. Hover your cursor over an individual course to reveal the specific CLOs it introduces. (A larger view is also available.)
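Under the hood, a network like this is just a two-sided mapping between courses and CLOs. A minimal Python sketch, with hypothetical course codes and placeholder CLO names standing in for our actual seven outcomes:

```python
from collections import defaultdict

# Hypothetical course-to-CLO mapping; the real data comes from EvCC's
# curriculum records, and these CLO names are illustrative placeholders.
course_clos = {
    "ENGL&101": {"Communicate", "Think Critically"},
    "MATH&146": {"Reason Quantitatively", "Think Critically"},
    "BIOL&160": {"Reason Quantitatively", "Think Critically"},
    "ART 101":  {"Communicate"},
}

# Invert the mapping: for each CLO, which courses introduce it?
clo_courses = defaultdict(set)
for course, clos in course_clos.items():
    for clo in clos:
        clo_courses[clo].add(course)

# Courses from different departments cluster when they share a CLO.
print(sorted(clo_courses["Think Critically"]))
```

Each CLO node in the visualization is exactly one of these inverted sets: the cluster of courses pulled toward it are the courses that introduce that outcome, regardless of department or division.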

Do you see any interesting patterns in the network? Are there any groupings of courses you might not expect? How might visualizations like this help us see new connections or patterns that could help us approach our Guided Pathways efforts–particularly the process of developing pathways and program maps–with fresh ideas or insight into possible points of intersection across our college’s academic divisions?


Storyline helps you tell stories with data

How frequently in your teaching do you use simple data to help students understand an important concept or trend, or to create opportunities for students to incorporate data into their own critical thinking around a particular subject or topic? Chances are you use data of some kind fairly frequently, even in disciplines that aren’t known for being particularly data-heavy. (As an example, in literature courses I taught I would frequently talk to students about, say, trends in literacy rates during the period we were studying, or shifts in newspaper circulation and public library memberships. In other words, I would share data that helped contextualize what we were reading in contemporary social, cultural, and economic conditions.)

All too often, when we use data in classes we treat it as something that is fairly static: a printed handout, an image on a slide, or a graph we draw on the whiteboard. There’s nothing wrong with that, exactly, but I often find myself wanting to give students a better entry point into data — and, more importantly, to help students understand the story the data can help us tell. “Teaching with data” is a broad category that can mean many things, but I take as one of its fundamental components a desire to teach students how to think with data and to construct meaning from it. So I was very excited to see that the Knight Lab recently released a tool for creating simple annotated charts. It’s called Storyline, and while its features are minimal I think it has great teaching potential.

Storyline is a web-based tool, and it’s so easy to use that if you know how to make a spreadsheet you can certainly make a Storyline. At the moment, Storyline makes it possible to generate a time-series line chart (essentially, a chart that shows a data variable over time) with up to 800 data points.
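Because the input is essentially a spreadsheet, prepping data for Storyline amounts to producing a simple two-column table of dates and values. A quick sketch (the column names here are my own; Storyline’s web wizard lets you tell it which columns to use):

```python
import csv
import io

# Hypothetical time series (year, value) -- say, yearly literacy rates
# for a period covered in a literature course. All figures invented.
series = [(1850, 78.9), (1860, 80.1), (1870, 88.5)]

# Write a two-column CSV of the kind a spreadsheet tool would export.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["year", "literacy_rate"])
writer.writerows(series)
csv_text = buf.getvalue()
print(csv_text)
```

From there it’s just a matter of uploading the file (or pointing Storyline at the sheet) and writing your annotations.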

Unlike a static chart, Storyline allows you to attach brief textual annotations to individual data points. Here’s what it looks like in action:

The annotations are displayed in sequential order beneath the chart. Interaction with the chart can take two forms: clicking an annotated data point (those shown as circles on the chart) or clicking an annotation bubble beneath the chart. Go ahead — give it a try in the example above. And then keep reading to find out how to create your own…


Panopto’s long tail

Last week I posted briefly about exploring some simple data showing how many EvCC courses use Canvas. This time around I’m turning my attention to Panopto, our video content management platform. Extracting useful information out of Panopto is a bit harder, so I figured I’d start with something simple: the total number of video hours viewed by (anonymized) course.
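The “long tail” shape shows up as soon as you rank courses by viewing hours: a few courses account for a large share of all viewing, followed by many courses with minimal use. A sketch with invented, anonymized numbers:

```python
# Hypothetical per-course viewing hours pulled from Panopto reporting;
# course labels are anonymized placeholders and all figures are invented.
hours_by_course = {
    "C001": 412.5,
    "C002": 96.0,
    "C003": 14.2,
    "C004": 3.1,
    "C005": 0.5,
}

# Sort descending to expose the long tail: a short head of heavy-use
# courses, then a long run of courses with very little viewing.
ranked = sorted(hours_by_course.items(), key=lambda kv: kv[1], reverse=True)
top_share = ranked[0][1] / sum(hours_by_course.values())
print(ranked[:2], f"top course = {top_share:.0%} of all hours")
```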

Let’s take a look:


How many EvCC courses use Canvas?

I’m asked on a fairly regular basis how many courses at EvCC use the campus learning management system, Canvas, in some capacity. There are many reasons for this question–ranging from general curiosity to specific ideas the questioner may have about, say, the most effective methods for communicating with students–but until fairly recently I couldn’t provide a very reliable answer. That’s partly due to the fact that we automatically create an empty Canvas course (what we sometimes call a “shell”) for every course at the college, meaning we can’t automatically assume the existence of a course in Canvas indicates active use by the faculty member teaching that course. The difficulty in pinning down exactly how many courses use Canvas is also due, in part, to the many other purposes for which faculty, staff, and students use Canvas: clubs and student organizations; departmental or program-based groups; faculty and staff programs; and so on.

Unsatisfied with only being able to say that “many” or “the majority” of courses at the college use Canvas in some way, I set out last fall to develop a more reliable measure of Canvas use and its change, if any, over the past few years. I’m happy to say the results are in. By combining course information from our student management system with data from the Canvas API, we can quickly identify the subset of Canvas shells that correspond to courses students take for credit at the college. Then, within that subset, we look only at those courses that have been published and that have at least 3 students enrolled. (I won’t bore you with the details of why that is necessary, but in general it helps filter out a variety of unusual cases that might otherwise provide a false sense of the rate of Canvas use.)
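Once the course records and Canvas data are joined, the filtering step is simple. Here’s a sketch of the logic; the field names are illustrative, not the Canvas API’s actual schema, and the records are invented:

```python
# Hypothetical records joined from the student management system and the
# Canvas API; field names and values are illustrative only.
shells = [
    {"id": 1, "credit_course": True,  "published": True,  "enrolled": 24},
    {"id": 2, "credit_course": True,  "published": True,  "enrolled": 2},
    {"id": 3, "credit_course": True,  "published": False, "enrolled": 30},
    {"id": 4, "credit_course": False, "published": True,  "enrolled": 15},
]

def active_credit_courses(shells, min_students=3):
    """Keep only published, credit-bearing shells with at least min_students."""
    return [
        s for s in shells
        if s["credit_course"] and s["published"] and s["enrolled"] >= min_students
    ]

active = active_credit_courses(shells)
rate = len(active) / sum(1 for s in shells if s["credit_course"])
print(f"{rate:.0%} of credit courses actively use Canvas")
```

The unpublished shell, the near-empty shell, and the non-credit group all drop out, which is exactly the kind of filtering that keeps the usage rate honest.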

This yields a reasonably good approximation of actual Canvas use for credit-bearing courses at EvCC:

As this chart shows, 83% of courses at the college used Canvas in the spring of 2017, up from about 68% when we first moved to Canvas in 2013.

Obviously, this doesn’t tell us anything at all about how Canvas is being used, or why, or whether it benefits students or faculty. There are other data that could help us begin to investigate all of those more nuanced and complex questions–and I hope to write about some of that here in the future–but these numbers alone don’t tell any of those stories. Still, it’s interesting to observe the adoption of this particular platform on our campus over time.
