Designing Effective Reviews

Helping Students Give Helpful Feedback

Teacher Development Series #2


This module explores the qualities of effective reviews. Good review prompts help reviewers provide feedback that writers can use to make high-quality revisions.

The module identifies some of the choices that instructors can make while designing review tasks in order to generate helpful feedback. It will discuss the qualities of effective review prompts, design choices, and frameworks for helping structure open-ended feedback.

Part 1

Feedback Recap and Teaching Review


The first installment in this series discussed the importance of feedback and revision to writing improvement. It summarized the long body of research showing that, to help writers improve, our goal as writing teachers should be to help students learn to provide better feedback. To that end, let’s begin with a review of what feedback is and what it isn’t. Feedback differs from advice or simple response; it is:

Formative — helping writers get better at a task or increasing their understanding.

Timely — happening at a moment when it’s possible to learn and change (e.g., revise).

Our goal in designing reviews, and in developing students as reviewers, should be to make sure writers get feedback that helps them understand:

What they accomplished (descriptive feedback).

What they were asked to accomplish (goal-referenced feedback).

What they must do next (goal-directed feedback).

Our first module also proposed a feedback-centric project sequence utilizing the same amount of time but making room for more review and revision:

Common timeline for writing projects

Feedback-centric timeline for writing projects

Improvement happens more effectively in feedback-rich writing sequences like this because students are given more opportunities to practice:

As reviewers, with more opportunities to give, and to get better at giving, feedback.

As writers, identifying high-quality, helpful feedback and using it to make better revisions.

We know from the research literature that students are capable of providing high-quality feedback, but we also know that they are not innately skilled at providing it. There are three specific things teachers can do to help students give better feedback:

Model effective feedback, demonstrating and discussing good examples.

Provide ample opportunities to practice giving feedback.

Construct effective review prompts.

It’s not enough to simply tell students to give helpful feedback — they must be taught, explicitly, what helpful feedback looks like and coached through the process of giving it.

This module is meant to help teachers in feedback-rich classrooms design reviews and coach students to be great reviewers. Section 2 will consider the design of criteria-driven reviews and offer suggestions on how to craft prompts that will guide students toward giving criteria-focused feedback. Section 3 will look at the qualities of helpful open-ended comments and offer a heuristic that serves as a good model for students learning to write them. Section 4 will look at a teacher who designed a review-centric curriculum to help students learn and practice a high-stakes genre, and Section 5 will look at an example review activity.

Part 2

The Qualities of Effective Reviews

Reviews that give writers helpful, revision-driving feedback rarely happen without coaching, especially with novice reviewers. Teachers in feedback-rich classrooms must give as much attention to designing reviews as they do to designing writing prompts.


Review prompts shape how reviewers talk to writers, influencing the details reviewers notice and ignore. Prompts are not just words instructors use, but also the various forms of response they choose to help reviewers read a draft carefully and respond to it thoughtfully.

Unhelpful feedback is often the result of reviewer insecurities, which can stem from many factors.

When designing a review, there are three important factors we can take into account that will help overcome these obstacles and result in better feedback: we can consider the cognitive load of our reviews, start with pedagogical goals and design reviews backwards, and be detailed and specific in how we prompt students.

Design Consideration #1: Consider the cognitive load of a review

One mistake we often make is giving students too much to do. Asking reviewers to read too much text and address too many questions can often mean that they don’t have time to respond thoughtfully. Module 1 discussed the issue of time and feedback loops, but some specific strategies for reducing cognitive load on reviewers include:

Review smaller texts - consider smaller, focused reviews as a text develops, rather than asking reviewers to digest and respond to a large text. In a sequence like this, writers get feedback early, on small pieces, helping make sure that the larger draft they’re building toward is on the right track, with the added benefit of making plagiarism much harder (since you can watch as a text evolves from its earliest kernels to a full draft).

Multiple reviews of the same text - divide reviews to conquer cognitive load. Design smaller, swifter reviews that are focused on specific, granular goals. This will let reviewers focus carefully on discrete moves.

Design Consideration #2: Start with learning goals and design backwards

Knowing what you need from reviews can help you design them to yield better data. Think about what kind of feedback you want writers to get from a review as well as the data you need in order to see student progress or evaluate your pedagogy.

Try thinking like a survey designer. Prepare questions that will get reviewers to notice, record, and comment on the features of drafts that will drive revision planning. Well-designed surveys lead participants to make judgements that lead to valuable feedback.

Quantitative data: Checklists, ratings, and scales can focus reviewer attention on specifics. Results from these can give writers and instructors a sense of reviewer perceptions of writers' overall alignment with criteria.

Qualitative data: Open-ended comments are valuable for writers because reviewer feedback can drive revision. They can also provide insight into the abilities of reviewers; instructors can assess the helpfulness of reviewer feedback, and reviewers can reflect on feedback they’ve given and how it has or hasn’t improved.

If you want reviewers to offer a good comment about the effectiveness of evidence in the draft, what should reviewers notice? How should they evaluate those features? What keywords will help them articulate their insights to writers?

Designing with the end in mind allows you to create specific, focused review prompts that will help reviewers make easy, accurate judgements and help them give the directed, helpful feedback that writers need.

Design Consideration #3: Be detailed and specific in how you prompt students

Good review prompts start good conversations. They give reviewers specific, structured language for how to talk about writing and generally ensure that writers will get the high-quality feedback they need to make productive revisions.

One of the best ways to prepare prompts is to reference the goals of the writing being reviewed. These goals should, ideally, be explained before writing starts. Explaining learning goals and objectives can lead to better first drafts, but clearly defining the metrics for successful writing will also give reviewers a framework for giving feedback.

Questions 1, 2, and 3 in this example guide reviewers through the assignment criteria and direct them to make specific observations about evidence and the introduction. By the time they get to question 4, reviewers are ready to make detailed comments about how compelling and well-supported the draft is based on their previous observations.

Question 4 in this example asks reviewers to make open-ended comments. Comments are the most common form of feedback, but they aren’t as intuitive to answer as survey questions. Section 3 looks in detail at designing effective comment prompts and coaching students toward helpful feedback.

Part 3

Coaching for Helpful Comments

Open-ended comments are the most common form of feedback on writing, but learning how to make helpful comments is difficult.

Unhelpful comments like “this is good” or “I like it” are meant to preserve relationships, but they don’t drive revision. One way to help reviewers give better feedback is by providing them a pattern for how to talk constructively about writing.


The describe - evaluate - suggest framework asks reviewers to do the following:

Describe - say what you see as a reader.

Evaluate - explain how the text meets or doesn’t meet criteria established in the prompt.

Suggest - offer concrete advice for improvement.


The descriptive move helps students give feedback in a non-judgemental way. Psycholinguist Frank Smith proposes that hearing an idea reflected back to them can help writers know that they are meeting readers’ expectations:

“Yes, the reader gets what I was trying to say.”

The describe step can easily be overlooked if reviewers jump too quickly into evaluating and suggesting, but describing is essential for quality feedback. Spending more time on describing will help students give more thoughtful evaluations and may lead to more insightful suggestions.

To reinforce the pattern, include it in your prompts. Use specific language to guide reviewers as they write open-ended comments, especially when they are inexperienced:

As you leave comments, be sure to use the describe-evaluate-suggest model: describe what you see happening, evaluate the effectiveness using the criteria we’ve established, and suggest changes the writer might make.

Once students have had substantial practice and have gotten better at providing feedback, prompts can become reminders:

Be sure to use the describe-evaluate-suggest model when making comments.

Feedback as genre

In feedback-rich classrooms, review comments are a genre that needs to be taught as explicitly and intentionally as the genres writers are using to convey their messages. Priming students to make criteria-focused comments that will drive revision requires:

Thinking about the pace and demands of reviews.

Using a variety of questions to help reviewers notice the key features of drafts.

Repeating key words and goals so that reviewers incorporate them in their comments.

Offering patterns for commenting.

Modeling effective comments and highlighting the qualities of helpful feedback.

Just for a moment, think about how different your class would be if course grades were based entirely on the comments reviewers gave to writers. How would your pedagogy need to change to help reviewers succeed? In Section 4, you’ll see how one teacher revamped his classroom to make reviewing as important as writing, without changing grades at all.

Part 4

A Review-Driven Pedagogy

Here is a specific example of a learning situation where reviewing is just as important as writing, if not more so. In order for students to perform well on a high-stakes writing evaluation, their instructor prepares them to think like evaluators and to use the language of the evaluators in their peer reviews.

Michael Schanhals is a veteran teacher with more than 20 years of experience, having taught a wide variety of writing-intensive courses at all grade levels from K-12, including Advanced Placement courses in literature, language, and composition.

Michael was tasked with helping improve scores on the writing portion of the ACT college readiness test at his school. He designed a feedback-rich curriculum to help writers “understand the game”: students not only practice writing essays, but they review one another’s practice essays using the ACT rubrics and scoring guides. Through peer review, students learn the language ACT evaluators use when scoring.


In Michael’s curriculum, review is just as important as the practice essays, if not more important. It is through review that students learn the expectations of the ACT - they gain familiarity with the criteria and how they are applied, and see how other writers make (or fail to make) effective moves to address them.

Practicing the Moves

Michael engages students in weekly writing and review exercises. Specifically, he designed five review exercises to help students respond to classmate practice essays using the criteria of the ACT evaluators:

Explaining Paragraphs - writers are expected to produce 5 paragraphs in 30 minutes, so students deconstruct and practice the moves of effective paragraphs.


It’s through review that students come to “understand the game” and demonstrate whether they've learned particular moves. Michael can run students through as many rounds of practice of a particular move as needed until he’s confident they’re proficient and ready to move on.

Section 5 will look in detail at the design of one of Michael's practice reviews and how he designed it to help students practice and internalize the ACT test criteria.

Part 5

A Criteria-Driven Review

As introduced in Section 4, Michael Schanhals' ACT writing test practice curriculum comprises a series of reviews designed to introduce students to the criteria of the writing test as they review one another's practice essays. This section looks at one of those reviews in detail.

The “Writing Protocol” Review

The review that Michael’s students practice most often is his “writing protocol” review, which is based on a set of traits commonly found in high-scoring essays. He identified this set of traits by boiling down the published readiness standards and drawing on his own experience teaching toward the ACT.

The "writing protocol" review gives writers feedback on how effectively they responded to the protocol, and it gives reviewers practice using the protocol to give feedback. The review uses a mix of response types intended to yield different types of data, all of which guide writers, reviewers, and Michael as their practice coach.

Writing Protocol Review

Checklist: check each of the statements below that you find to be true about this essay (in Eli Review, these are called trait identification sets):

the question is answered with a well-qualified thesis/answer statement

there are more than four paragraphs

each paragraph includes evidence in the form of an anecdote or personal experience

each paragraph explains the evidence toward an answer/thesis/conclusion

Likert Scale: How much do you agree with this statement: "I personally buy the argument and am persuaded by its reasoning"? (options from Strongly Agree -> Strongly Disagree)

Comments: Please describe where the writer could better meet the requirements of the protocol and offer specific revision suggestions that show the writer how to address each instance where they did not meet the protocol.

By providing detailed, explicit prompts based on criteria previously introduced in class discussion, reviewers know exactly how to respond to practice essays. This means writers are far more likely to get feedback addressing the criteria and offering helpful suggestions on how to improve in the next practice. Reviewers also get experience using the exact language of the ACT evaluators, reinforcing their understanding of the criteria and preparing them to identify those moves in their own writing.

Data-Driven Writing Coaching

The feedback student reviewers generate in criteria-driven reviews, like Michael's Writing Protocol Review, not only gives writers and reviewers structure for their work, but also gives instructors insight into where actual learning is (or isn't) happening.


From their responses to the practice reviews, Michael's students generate data that helps him determine:

If essays address the criteria - Data from the checklists can illustrate the criteria where the class, or even individual writers, aren’t showing signs of proficiency. If one criterion in particular isn’t being observed, he knows where he can most effectively offer additional instruction and practice.

If reviewers grasp the criteria - If the scores on the rating scales skew too far in one direction, or if comments aren’t explicitly addressing criteria, he knows that students may need more practice with the rubric or models for effective comments.

Models of effective essays and feedback - Qualitative responses can point to essays that effectively address criteria, and feedback that writers classify as helpful can help Michael identify his best reviewers. Writers and reviewers can be models for discussion, demonstrating authentic examples of good work.

Criteria-driven reviews, then, are how Michael measures whether or not learning is taking place, and they show him where he can effectively intervene as a coach to help his students reach their goal.

Questions for Further Discussion

How do you talk about learning objectives with your students? In assignment descriptions and when setting review criteria? At other times?

As a writer/reviewer yourself, have you experienced a review that asked too much of you? How did it affect the feedback you gave as a result? Do you think about this when you assign students review tasks?

Think about some of the best feedback you have received on your writing. What are some of the qualities of effective reviews you’ve participated in?

What types of prompts or questions have produced the best reviewer feedback in your classrooms?

Next Steps and Additional Materials

Explore additional entries in this four-part Teacher Development Series to take these ideas further, or find examples of Eli Review in action and how to use it effectively.

These materials are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. This means you are free to copy and redistribute them in any medium or format and to remix, transform, and build upon the material, as long as you provide proper attribution to the authors (Eli Review, or Michael McLeod, Bill Hart-Davidson, and Jeff Grabill). Commercial use of this content is prohibited.