Where Students Get Stuck Learning Technology, and How to See Them Through

We tend to think of students as “digital natives” who just “get” technology, but in my experience, college students are often good with cellphones, yet not necessarily with the software, computers and equipment that are the tools of the trade.

No matter how carefully crafted the modules, or how clear the path to learning objectives is made, sometimes students just get lost. Try to think of these moments as gifts – opportunities to dig into the mental processes you use – and a chance to help integrate the concepts you’re teaching with the technology.

In part one, I talked about identifying clear learning goals and designing modules based on concepts to keep technology from subsuming the course. To keep the focus on concepts, turn course goals into active learning exercises and lesson plans with a process called Decoding the Disciplines.

Define the Bottleneck

First, ask yourself: “What do students struggle with in my course?” The places where students struggle show where the concepts aren’t clear. Try to be as specific and detailed as possible. I’m going to use teaching basic usability concepts from a Web Design course to illustrate this process.

Vague: Students are unable to get past their opinion in determining whether an interface is usable or not.

Useful: Students are unable to create specific tasks and scenarios to collect evidence of usability problems, unable to identify the appropriate audience for a website, and unable to use the collected data to strategize appropriate solutions and future actions.

Sometimes bottlenecks are emotional, not conceptual. When students come in with preconceived ideas of the field, the topic, or their ability, the goal is to motivate and encourage students to embrace learning something new. My classes frequently include students who think web design is “easy” and doesn’t require attention to detail; students who think they “aren’t good with computers”; and students who are disappointed to learn web technologies are not equivalent to Harry Potter magic.

Uncover the mental operations that students must master to get past the bottleneck

Ask someone who doesn’t understand your field to interview you in-depth about your thought processes for this action. The idea is to fully describe the mental steps you take as an expert.

What strategies did you use? How did you break down the problem? What approach did you take? What do you need to know in order to make that leap?

I asked Joan Middendorf from CITL to interview me about how I teach usability, and how, as an expert, I might approach completing a usability study (the students’ final assessment for that module). What we discovered in unpacking this task is that I was asking students for two different mental actions (analyzing a website for usability issues, and evaluating data collected from user interactions); however, I was testing whether students understood both of these mental leaps with only one final report.

Devise strategies to help students model each mental operation

As I did, you’ll likely discover that part of your thought process isn’t coming across in your teaching. Use the mental operations uncovered in your interview to inform and direct your lesson plans, and make sure each operation is represented in the lessons. This is especially important when teaching technology, where the details can change with each software upgrade.

Model each mental operation by itself by breaking the operation down into components.

Use analogies to show them which “mental muscles” to use.

While using an example, meta-explain exactly where you focus and point out where the critical thinking takes place.

Example bottleneck: Students aren’t making the connection that usability testing isn’t about their own opinion; it’s about collecting data on users and the users’ opinions.

In the web course, I began by finding an analogy to explain my process: fitness trackers, for example, collect data by taking measurements to build a quantifiable picture of your health. “Like with a Fitbit,” I say in class, “we may have an idea of how far we walk each day, but we don’t really know. Let’s use our study to collect data and take measurements instead of making assumptions.” I can then discuss how to use our analysis of a site to write better user tasks, and how to better evaluate the user data — two of the issues I’d discovered in the interview.

The Decoding the Disciplines process for course design.

Create opportunities for students to practice essential mental operations and receive feedback

At least for me, this next step meant building in more, and more varied, practice. It’s important that students understand the reason — we don’t want to create more work without a clear goal in mind. I begin each class by repeating where we are in the process and the final goal — like pointing out “you are here” on a map or checking the compass.

Originally I had four main tasks, plus lecture and discussion, for students to learn about usability. I thought this was reasonable, but in rethinking this module I realized I was not breaking the steps — initial analysis, writing tasks and running a study, and subsequent evaluation — into component skills.

Read book

Take a quiz — assessment (medium)

Perform usability test (outside of class)

Write report — assessment (large)

To fix this, I divided the previously singular task of “complete a usability study” into three separate practice sessions in class, and split the final assessment into two parts. I also added a few smaller checks at the start and in the middle to better scaffold these mental steps.

Read book

Survey in class — assessment (small)

Take a quiz — assessment (medium)

Perform an evaluation of a site in class — assessment (small)

Write an evaluation of a website — assessment (large)

Write tasks/scenarios and perform a usability test with a partner — assessment (small)

Perform a usability test (outside of class)

Write a report — assessment (large)

Assess student mastery

Comparing the module milestones, you’ll notice I added more checks. Smaller assessments don’t have to mean a lot more grading — I usually use self-grading online quizzes, or grade these as “participation points,” for example as pass/fail or using a checkmark scale.

The largest change was to break the final assignment into two parts to better match the two mental actions taking place — a usability analysis of a website by the student, and a usability report evaluating user data. I’m still in the process of evaluating this change in terms of student success, but responses from students in end-of-each-class surveys show less confusion and more confidence.

Course design is your lifeboat

Instead of constantly adjusting and revising your course in response to turbulent technology, anchor your course around concepts. The examples and skills tutorials can be swapped out, but the concepts and assessments will mostly remain the same. More importantly, now that you can notice where students struggle, you can help them succeed and not get lost at sea — and just maybe you’ll keep your head above water too.

About EducationShift

EducationShift aims to move journalism education forward with coverage of innovation in the classroom as journalism and communications schools around the globe are coping with massive technological change. The project includes a website, bi-weekly Twitter chats at #EdShift, mixers and workshops, and webinars for educators.
Amanda Bright: Education Curator
Mark Glaser: Executive Editor
Design: Vega Project

MediaShift received a grant from the Knight Foundation to revamp its EducationShift section to focus on change in journalism education.

