more questions than answers

This is the follow-up to my post last summer that outlined my anticipation of a research-based, video-based professional development that I was to facilitate with the math department at my school. We coined the name of the PD “The Video Club.” Through the extended arm of Math for America, I was lent an HD camera, an omnidirectional microphone, and a tripod. The goal: analyze, interpret, and get better at responding to student thinking.

Here are some takeaways from the experience:

The entire department felt the PD was worthwhile and would love to do it again. Everyone was highly engaged throughout, and over 80% of the group felt that this PD helped them gain key insights into student thinking. This was a relief. I believe in and value the work; I just hoped they would also find it worthwhile – which they did. Plus, I’ve never headed an initiative like this before, so I’m glad it was a source of growth…and not utter failure.

The department felt that the experience made us more conscious of the words and terms we use with students. For example, the use of the word “cancel” was recognized as something to rethink. We discovered that many students infer, for instance, that the common factors in the numerator and denominator of a fraction “cancel” out, when in fact they form a factor of 1. “Cancel” made our students feel as if the factors just disappeared. This is subtle and a direct result of how we use the word in class. On a related note, a colleague mentioned that students’ mathematical abilities are a reflection of our teaching and that she witnessed her own shortcomings embedded in their thinking. Interesting.

The experience helped us develop better questions – or at least to habitually reassess the quality of the questions we ask our students. Questions should anticipate and clarify student thinking all the while pushing kids to make connections. There were instances where we spent all of 15 minutes debating a 7-second student discussion. This deliberate focus on the details of student thinking allowed us to craft questions that addressed very specific areas of student understanding.

We realized that the more analysis we did of student thinking via the video club, the more we valued the process of analyzing student thinking. This led us to create more opportunities for our students to discuss mathematics during class so that, in turn, we could analyze their thinking. This may have been the result of the tangible improvements in our planning and teaching that we made after each of the sessions.

Many teachers are mandated to analyze student work. It hit me early in the year that recording student discussions around a task was actually an elevated form of this. We weren’t interpreting written work to get at student thinking. Instead, we were watching and listening to them explain their thoughts, which is a much more sophisticated way of understanding student thinking.

This seems somewhat counterintuitive, but I learned a great deal about mathematics. Specifically, I learned more about how mathematical relationships and ideas are viewed through the eyes of my students. For example, I explored why it is so common for students to reference the Pythagorean Theorem when they see a triangle labeled with sides a, b, and c – no matter what the problem is asking. (Our dependency on those arbitrary letters may have something to do with it.) This type of perspective taking has proved to be incredibly powerful when it comes to developing impactful learning opportunities for my kids.

I came to embrace the openness of each session. I prepared prompts and questions beforehand, but insights from the team really led the way. Over time, instead of being a “facilitator,” I was just another member of the group who helped push the conversation forward. I learned that the uncertainty involved with this work is a good thing.

When I initially dove into this project, I was concerned about the amount of prep time required – especially since we were dealing with video. Anyone who has dealt with video knows that the editing process can be discouraging and straight-up unbearable. I was elated to find out that, from beginning to end, the process requires no editing. Sustainability!

Lastly, this experience afforded me an opportunity to lead my colleagues. I was empowered. I’ve taken on other leadership roles in the building, but for reasons that I cannot seem to pinpoint, this one felt different. It may stem from my own personal belief about how this work provides exceptional hands-on improvement for teachers – and how rare that is.

I’m enthused to continue this work next year. MfA has been an invaluable partner and I’m pleased to know that I have their continued support!

I approach a group of students discussing a problem in my class. I listen. I watch. I interpret their thinking. I sense a misconception. I ask a question to clarify what and how they are thinking. Hopefully, in the end, they reach a higher level of understanding of the problem and I reach a higher level of understanding of their comprehension.

Just like other teachers, I often do this sort of complex analysis of my students in under 10 seconds. I’ve been trained to.

That said, what if I could improve this skill I have learned over the course of my career? What if I could somehow train myself to be more attuned to student thinking?

That brings me to my next project. I’m partnering with MfA this year to bring some exciting, new PD to my school. It involves using video to record student discussion and interaction around a specific task (with no focus on the teacher). Afterwards, a group of teachers gathers to watch the video, brainstorm about critical moments that occurred, interpret student thinking, and formulate questions that could be asked to clarify the thought process of the students.

It’s all based on the research of Elizabeth A. van Es and Miriam Gamoran Sherin. Here’s a follow-up article they wrote on selecting clips, and an overview of their work.

The idea is to slow down student thinking to the point where deep analysis can happen. My hope is that teachers at my school, along with myself, are able to use this process to improve our abilities to interpret student thinking and how we address it during our lessons.

Here are some of the challenges I foresee.

Introducing it to teachers. You can only introduce something once and first impressions have impacts that can last until June. I must make it good.

Teachers accepting the idea that interpreting student thinking often contains loads of uncertainty, and that this is okay. Not everything needs a final answer.

Developing engaging prompts for the group when the conversation is lagging. This may depend on the quality of my preparation beforehand.

I don’t see overall engagement being an issue, but you never know.

Being able to record and edit video clips in a timely manner. Luckily, at least in the beginning, MfA will be helping with this. But how sustainable is this type of PD in the long term?

Here are a few other unrelated thoughts.

How will teacher analysis differ if the focus is on student understanding versus misunderstanding, if at all? Does this impact “next steps” after the session?

Speaking of next steps, how will those look?

Can I channel teachers to certain moments in the clip based on my preparation beforehand? Would this be useful?

I may facilitate the initial sessions, but I want to gain perspective from my colleagues about how a student may be thinking. My MfA experiences have been scintillating in this regard. There were things mentioned that I would have never thought of.

This PD involves using video in the classroom. When most teachers think of video, they think of the teacher being recorded as s/he teaches with best practices as the center of attention. This is not that. It should be interesting to see this dynamic play out.

Each session I’ve attended with MfA has focused on one group of students discussing a task. How would the session change if we examined multiple groups of students from different classes – all discussing the same task? How would this affect the analysis?

This type of PD hinges on teachers understanding the content, in my case math. That notwithstanding, is there a way to run something similar that focuses on student discussion, but has a more interdisciplinary approach? Perhaps CRE/advisory?

To help me collect data, I’ve been using a tool for the last couple of months. It’s called Quick Key and it’s used to quickly and easily collect responses from multiple choice questions.

For a long, long time, my school utilized the Apperson Datalink scanner to aid in scoring multiple choice portions of exams. It not only scores exams quickly and efficiently, but its accompanying software provides insightful data analysis that I use to modify my teaching. On the downside, these machines are pricey (almost $1000) and require you to purchase their unique scanning sheets that work only with their machine. Each department in my school had a machine.

Because of my push towards standards-based grading, I find myself giving smaller, bite-size assessments that target fewer concepts. Consequently, I am assessing more frequently and I need the scanning machine at least once a week. The machine was constantly changing hands and I was always running around the building trying to track it down.

I decided that I didn’t want to be a slave to the scanner – and its arbitrary sheets. It’s not sustainable. Especially when we have mobile technology that can perform the same task and provide similar results.

Enter Quick Key.

Quick Key has allowed me to score MC items and analyze my students’ responses in a much more convenient and cost-effective way. Like, free. Hello. You simply set up your classes, print out sheets, and start scanning with your mobile device. (You don’t even need to have wifi or cellular data when scanning.) The interface is pretty clean and easy to use. Plus, it was created and designed by a teacher. Props there too.

Data is synced between my phone and the web, which allows me to download CSV files to use with my standards-based grading spreadsheets.

My SBG tracking spreadsheet

That is the big Quick Key buy-in for me: exporting data for use with SBG. As I have mentioned before, SBG has completely changed my teaching and my approach to student learning. At some point, I hope to write in-depth about the specifics of this process and the structure I use.

Though the Quick Key data analysis isn’t as rigorous as what I would get from Datalink, it suffices for my purposes. I sort of wish Quick Key would improve the analysis they provide, but for now, if I need more detailed analytics, it usually requires only a simple formula that I can quickly insert.
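For anyone curious what “a simple formula” might look like outside a spreadsheet, here is a minimal sketch in Python that computes the percent correct per question from an exported CSV of scored responses. The column names and the 1/0 scoring format are assumptions for illustration, not Quick Key’s actual export schema.

```python
# Hypothetical sketch: per-question percent-correct from a CSV export.
# Assumes one row per student and one column per question, scored 1/0.
import csv
import io

SAMPLE_CSV = """student,Q1,Q2,Q3
Alice,1,0,1
Bob,1,1,0
Cara,1,0,0
"""

def percent_correct(csv_text):
    """Return {question: percent of students who answered correctly}."""
    reader = csv.DictReader(io.StringIO(csv_text))
    totals, counts = {}, {}
    for row in reader:
        for col, val in row.items():
            if col == "student":
                continue  # skip the name column, keep only question scores
            totals[col] = totals.get(col, 0) + int(val)
            counts[col] = counts.get(col, 0) + 1
    return {q: 100 * totals[q] / counts[q] for q in totals}

print(percent_correct(SAMPLE_CSV))  # e.g. Q1 → 100.0, Q2 → 33.3…
```

The same per-question tally is what an `AVERAGE` over a question’s column would give you in a spreadsheet; the point is only that the raw CSV makes this kind of quick analysis trivial to bolt on.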

Sample data analysis from Quick Key

Sample data analysis from Datalink

Through all this, I don’t overlook the obvious: MC questions provide minimal insight into what students actually know, especially in math. That being said, my students’ graduation exams still require them to answer a relatively large number of MC items. For that reason alone I feel somewhat obligated to use MC questions on unit exams. Also, when assessing student knowledge via MC questions, I do my best to design them as hinge questions. TMC14 (specifically Nik Doran) formally introduced me to the idea of a hinge question: an MC question consciously engineered so that each answer choice categorizes and targets a specific student misconception. In this way, students’ responses to MC questions, though less powerful than short-response questions, can give me an intuitive understanding of student abilities.

Quick Key recently introduced a Pro plan ($30/year) that now places limitations on those who sign up for free accounts. The free plan still offers plenty for the average teacher.