Archive

Notice how movie theaters have jumped on the rewards bandwagon? Yes, we earn points on our rewards cards toward a free popcorn or soda. I’m all about the rewards, but we now have a desk drawer full of cards.

If you’ve missed one of the following got-to-see episodes, check it out after you watch this one.

DO use comprehensive assessments, not random samples.

DON’T assess to assess. Assessment is not the end goal.

DO use diagnostic assessments.

DON’T assess what you won’t teach.

DO analyze data with others (drop your defenses).

DON’T assess what you can’t teach.

DO steal from others.

DON’T assess what you must confess (data is dangerous).

DO analyze both data deficits and mastery.

DON’T assess what you haven’t taught.

DO use instructional resources with embedded assessments.

DON’T use instructional resources which don’t teach to data.

DO let diagnostic data do the talking.

DON’T assume what students do and do not know.

DO use objective data.

DON’T trust teacher judgment alone.

Now, sit back in your plushy seat and enjoy the flick. In Episode 5 we are taking a look at the following:

DO think of assessment as instruction. DON’T trust all assessment results. DO make students and parents your assessment partners. DON’T go beyond the scope of your assessments.

Wait ’til you download the featured assessment and matrix. It’s worth the wait.

DO think of assessment as instruction.

So often teachers view assessments as extraneous got-to’s, not as integral instructional components. I’ve heard, “I got into teaching to teach, not to assess” more times than I can count.

I kindly suggest that we should re-orient our thinking. No teacher would want to use an instructional resource that provided inaccurate information. No teacher would want to hand out a worksheet that her students had already completed. No teacher would want to waste time teaching something that her students already had mastered. Yet, teachers do so all the time when they have not assessed what students know and what they don’t know.

Diagnostic and formative assessments inform our instruction. No one would trust a doctor who would write a prescription without a diagnosis. Diagnosis is part of the exam. The same is true for teaching. Assessment is an integral component of instruction.

If they know it, they will show it; if they don’t they won’t. So don’t blow it; make ’em show it.

DON’T trust all assessment results.

Even the best of doctors will suggest a second opinion. This is sound advice for teacher diagnosticians as well. Sometimes it makes sense to use an alternative assessment to double-check what students know and what they don’t know, especially when the results seem inconsistent with other data.

When I was in fifth grade, I was pulled out of class to be tested for the gifted program. The assessment consisted of a timed test of orally delivered questions. After the second or third question, I hit upon a strategy to give me more think time. After each oral question, I asked, “What?” I got the question again and had twice as long to answer the question. I don’t remember if I qualified for the program, but I do remember being referred to the audiologist for hearing loss.

When in doubt, double-check with a different assessment.

DO make students and parents your assessment partners.

Test data shouldn’t be secret. Both students and parents need to know what is already known and what needs to be known. Most elementary teachers share some form of data at student-parent-teacher conferences, but secondary teachers rarely do so.

My suggestion is to share both diagnostic and formative assessment data on a regular basis with students and parents. Both are encouraged and motivated by progress. Share progress monitoring matrices with your partners.

DON’T go beyond the scope of your assessments.

Good assessments are limited assessments. They test specific concepts and skills, not general ones. Teachers over-reach when they try to make assessments walk on all fours; in other words, when they make assessments prescribe generalizations or treatments beyond the scope of their application.

For example, a student who fails to correctly punctuate an MLA citation on a unit test may not need further instruction in what and what not to cite. Or a student who does not know when and when not to drop the final silent e when adding a suffix may not need to practice reading silent final e sound-spellings (the former is a spelling skill; the latter is a phonics skill).

Effective assessment-based instruction sticks to the limits of the assessment and does not generalize.

Glad you dropped by to watch Episode 5? Before you refill that unlimited-refills popcorn on your way out, better grab your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 6. This one could sell out! Also get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 99% score on Rotten Tomatoes! Here’s the preview: DO understand that not all assessment data is helpful. DON’T rely solely on teacher observation for assessment. DO review; mastery is not permanent. DON’T just assess Common Core State Standards.

I’ve been using a silly movie theme to weave together a series of articles for my Do’s and Don’ts of ELA and Reading Assessments series. So far I’ve offered these suggestions over the trailer and first three episodes:

DO use comprehensive assessments, not random samples.

DON’T assess to assess. Assessment is not the end goal.

DO use diagnostic assessments.

DON’T assess what you won’t teach.

DO analyze data with others (drop your defenses).

DON’T assess what you can’t teach.

DO steal from others.

DON’T assess what you must confess (data is dangerous).

DO analyze both data deficits and mastery.

DON’T assess what you haven’t taught.

DO use instructional resources with embedded assessments.

DON’T use instructional resources which don’t teach to data.

Permit me to tell a brief anecdote. As a junior in high school, I got my license on my sixteenth birthday. At last, I could take my girlfriend out on a real date! Where to go? The movies, of course. Just one problem.

Friday night was guys’ night. My group of buddies and I always got together on Friday night. When Richard called me up after school to tell me that he would pick me up at 7:00, I quickly lied and told him that I was sick. Of course, I had already called my girlfriend to ask her to go to the movies.

We were munching on popcorn, half-way through the movie, when an obnoxiously loud group of guys entered the theater. Yes… my friends. I slumped down in my seat and told my girlfriend that I needed to see all the credits before leaving. When I assumed my friends had left the theater for their next Friday night adventure, my girlfriend and I slowly made our way up to the lobby.

Richard was the first friend to greet me. Let’s just say I paid dearly for that lie.

This article’s focus?

DO let diagnostic data do the talking. DON’T assume what students do and do not know. DO use objective data. DON’T trust teacher judgment alone.

The FREE assessment download at the end of this article includes a recording matrix and two great lessons… all to convince you to check out my assessment-based ELA and reading program resources at Pennington Publishing.

DO let diagnostic data do the talking.

One of the first lessons new teachers learn is how to answer this student or parent question: “Why did you give me (him or her) a ___ on this essay, test, project, etc.?”

Rather than a snotty retort, the more effective response is to reference the data. Data is objective. Changing the subjective nature of the question into an objective answer is a good teacher self-defense mechanism and gets to the heart of the issue.

Diagnostic data is especially helpful in answering why students are having difficulties in a class. Additionally, the data in and of itself offers a prescription for treatment. Going home from the doctor with a “This should go away by itself in a few weeks” or a “Just not sure what the problem is, but it doesn’t seem too serious” is frustrating. Patients want a prescription to fix the issue. Parents and students can get that prescription with assessment-based instructional resources.

One other application for both new and veteran teachers to note: A teacher approaches her principal with this request: “I need $$$$ to purchase Pennington Publishing’s Grammar, Mechanics, Spelling, and Vocabulary BUNDLE. Our program adoption does not provide the resources I need to teach the CCSS standards.”

Answer: “Not at this time.”

Instead, let diagnostic data do the talking.

“Look at the diagnostic data on this matrix for my students. They need the resources to teach to these deficits.”

Answer: “Yes (or Maybe)”

DON’T assume what students do and do not know.

We teachers are certainly not free of presuppositions and bias. As a result, we assume what has yet to be proven. In other words, we beg the question regarding what our students know and don’t know.

“He must be smart, but just lazy.” “His older sister was one of my best students.” “They’re in an honors class; of course they know their parts of speech.” “I have to teach everything as if none of my students knows anything; I assume they are all tabula rasa (blank slates).” “You all had Ms. Peters last year, so we don’t have to teach you the structure of an argumentative essay.”

Effective diagnostic assessments eliminate the assumptions. Regarding diagnostic assessments, I always advise teachers: “If they know it, they can show it; if they don’t, they won’t.”

DO use objective data.

Not all diagnostic assessments are created equal. By design, a random sample assessment is subjective, no matter the form of sampling. Those of you who remember your college statistics class will agree.

Teachers need objective data, not data which suggests problem areas. Teachers need to know the specifics to be able to inform their instruction. For this application, objective means comprehensive.

The “objective” PARCC, SBAC, or state-constructed CCSS tests may indicate relative student weaknesses in mechanics; however, teachers want to know exactly which comma rules have and have not been mastered. Teachers need that form of objective data.

DON’T trust teacher judgment alone.

After years of teaching, veteran teachers learn to rely on their judgment (as they should). After a few more years of teaching, good teachers learn to distrust their own judgment at points. Experienced teachers look for the counter-intuitive in these complex subjects of study that we call students. What makes them tick? Kids keep our business interesting.

Diagnostic and formative assessments bring out our own errors in judgment and help us experiment to find solutions for what our students need to succeed. Assessments point out discrepancies and point to alternative means of instruction.

For example, a student may score high in reading comprehension on an un-timed standards-based assessment. Also, she was in Ms. McGuire’s highest reading group last year. Most teachers would assume that she has no reading problems and should be assigned to an advanced literacy group.

Yet, her diagnostic spelling assessment demonstrates plenty of gaps in spelling patterns. A wise teacher would suspend her initial judgment and do a bit more digging. If that teacher gave the Vowel Sounds Phonics Assessment (our FREE download at the end of this article), the student might demonstrate some relative weaknesses. She may be an excellent sight-word reader who does fine with stories, but one who will fall apart reading an expository article or her science textbook.

Like my dad always told me… Measure twice and cut once.

Thanks for watching Episode 4. Make sure to buy your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 5 before you sneak out of the theater with your girlfriend or boyfriend. Also get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 94% score on Rotten Tomatoes! Here’s the preview: DO treat assessment as instruction. DON’T trust all assessment results. DO make students and parents your assessment partners. DON’T go beyond the scope of your assessments.

The thing about movie sequels is that we feel a compulsive necessity to see the next and the next because we’ve seen the first. I’d be interested to know what percentage of movie-goers who saw all three Lord of the Rings movies watched the Hobbit prequels. My guess would be a rather high percentage.

If my theory is correct, I’d also hazard to guess that the critic reviews would not substantially alter that percentage.

Of course my hope is that I’ve hooked you on this article series and the FREE downloads 🙂 of assessments, recording matrices, audio files, and activities in order to entice you to check out my corresponding assessment-based products at Pennington Publishing.

In my Do’s and Don’ts of ELA and Reading Assessments series, I’ve offered these bits of advice so far:

DO use comprehensive assessments, not random samples.

DON’T assess to assess. Assessment is not the end goal.

DO use diagnostic assessments.

DON’T assess what you won’t teach.

DO analyze data with others (drop your defenses).

DON’T assess what you can’t teach.

DO steal from others.

DON’T assess what you must confess (data is dangerous).

But, wait… there’s more!

DO analyze both data deficits and mastery.

Kids are fixer-uppers, waiting to be fixed and flipped.

Teachers are fixers. In some sense we view our students as “as is” houses or fixer-uppers, waiting for us to determine what needs repair and updating so that we can flip them in market-ready condition to the next teacher.

Teachers should use diagnostic assessments in this way. Almost all students need to catch up while they keep up with grade-level instruction.

However, we miss some of the value of diagnostic assessments when we don’t analyze data to build upon the strengths of individual students. For example, teachers are frequently concerned about the student who has high reading fluency rates, but poor comprehension. Yes, some students are able to read quickly with minimal miscues, but understand and retain little of what they have read. Just weird, right?

Looking only at the diagnostic deficit (lack of comprehension) might lead the teacher to assume that the student is a sight word reader in need of extensive decoding practice to shore up this reading foundation. However, if we look at the relative strength (fluency), we might prescribe a different treatment to build upon that strength. It may certainly be true that the student might have some decoding deficits, but if the student is able to recognize the words, it makes sense to use that ability to teach the student how to internally monitor text with self-questioning strategies.

Teachers love to see progress in their students. Our profession enables us to see a student go from A to B throughout the year with us as the relevant variable. Assessment data does provide us with extrinsic rewards and a self-pat-on-the-back. I love our profession!

But we have to use real data to achieve that self-satisfaction. Otherwise, we are only fooling ourselves. As the new school year begins, countless teachers will administer entry baseline assessments, designed to demonstrate student ignorance. These assessments test what students should know by the end of the year, not what they are expected to know at the beginning of the year. Often the same assessment is administered at the end of the year to determine summative progress and assess a teacher’s program effectiveness.

Resist the temptation to artificially produce a feel-good assessment program such as that. Such a baseline test affords no diagnostic data; it does not inform your instruction. It makes students feel stupid and wastes class time. The year-end summative assessment is too far removed from the baseline to measure the effects of the variables (teacher and program) upon achievement with any degree of accuracy.

Test only what has been taught to see what they’ve retained and forgotten.

DO use instructional resources with embedded assessments.

In my work as an ELA teacher and reading specialist at the elementary, middle school, high school, and community college levels, I’ve found that most teachers use three types of assessments: 1. They give a few entry-level assessments, but do little if anything with the data. 2. They give unit tests once a month, but do not re-teach or re-test. 3. They give some form of end-of-year or term summative test (the final) with little or no review or re-teaching of the test results.

As you, no doubt, can tell, I don’t see the value in any of the above approaches to assessment. It’s not that these tests are useless; it’s that they tend to be reductive. Teachers give these instead of the tests they should be using to inform their instruction. Diagnostic assessments (as detailed in the previous section) are essential to plan and inform instruction. Also, what’s missing in their assessment plan? Formative assessments.

My take is that the best method of on-going formative assessment is with embedded assessments. I use embedded assessments to mean quick checks for understanding that are included in each lesson. Both the teacher and student need to know whether the skill or concept is understood following instruction, guided practice, and independent practice. For example, in the FREE diagnostic assessment (with audio file), recording matrix, and lessons download at the end of this article, the lesson samples from my Differentiated Spelling Instruction programs are spelling pattern worksheets. These are remedial worksheets which students would complete if the Diagnostic Spelling Assessment indicated specific spelling pattern deficits. Each worksheet ends with a writing application, which demonstrates whether the student has or has not mastered the practiced spelling pattern. These are embedded assessments, which the teacher can use to determine if additional instruction is unnecessary or required.

Use instructional materials which teach and test.

DON’T use instructional resources which don’t teach to data.

The converse of the previous section is also important to bullet point. To put things simply: Why would a teacher choose to use an instructional resource (a worksheet, a game, software, a lecture, a class discussion, an article, anything) which is not testable in some way? Of course, the assessment need not include pencil and paper; informed teacher observation can certainly include assessment of learning.

Let’s use one example to demonstrate an instructional resource which does not teach to data and how that same resource can teach to data: independent reading. This one will step on a few toes.

During traditional sustained silent reading (SSR), the instructional resource may or may not be teaching. We don’t know. If the student is reading well at an appropriate challenge level, the student is certainly benefiting from vocabulary acquisition. If the student is daydreaming or pretending to read, SSR is producing no instructional benefit. Following is an alternative use of this instructional resource:

Practice with Assessment: Read for 10 minutes, annotating the text. Then do a re-tell with your assigned partner for 1 minute, using the SCRIP Comprehension Strategies Bookmarks as self-questioning prompts. Partners are to complete the re-tell checklist. Repeat after 10 more minutes. The teacher randomly calls on a few readers to repeat their re-tells, along with their partners’ additions, to the entire class. If the checklists and teacher observation of the oral re-tells indicate that the students are missing, say, cause-effect relationships in their reading, the teacher should prepare and present a think-aloud lesson, emphasizing this reading strategy with practice. This practice uses data and informs the teacher’s instruction. Plus, it provides students with a purpose for instruction and holds them accountable for learning.

Thanks for watching Episode 3. Make sure to purchase your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 4 before you walk out of the theater. This episode will sell out fast! Also get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 98% score on Rotten Tomatoes! Here’s the preview: DO let diagnostic data do the talking. DON’T assume what students do and do not know. DO use objective data. DON’T trust teacher judgment alone.

You know how it is with movie sequels; the sequel rarely lives up to the promise of the original movie. However, there are exceptions and you’re reading one 🙂

In my Do’s and Don’ts of ELA and Reading Assessments series, I began with a trailer to introduce the articles, in which I argued, “Do use comprehensive assessments, not random samples.” I followed with the first episode, in which I elaborated on the following: “DON’T assess to assess. Assessment is not the end goal. DO use diagnostic assessments. DON’T assess what you won’t teach.” Both the trailer and first episode provide some of my 15 FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. Take a look at these later, but you’ve got to read this article first and grab the FREE download.

As an ELA teacher and reading specialist, I believe in the power of ELA and reading assessments. However, as with many educational practices, appropriate use is often coupled with misuse (or even abuse); hence, the Do’s and Don’ts of ELA and Reading Assessments.

DO analyze data with others (drop your defenses).

We teachers love our independence, but it sometimes comes with a cost to our students.

My eighth-grade ELA colleague in the classroom next door has the reputation of being a fine teacher. She serves as our department chair and we’ve taught together for a dozen years. I can tell you all about her two kids and husband. Of course, I spell her once in a while for a bathroom break, but I’ve never seen her teach; nor has she seen me teach. I’ve found this scenario to be quite typical. Our classrooms are our castles. We let down the drawbridges a few times a year for administrative walk-throughs or evaluations, but rarely more than that.

Our department meetings are all business: budget, supply status, pleas to keep the workroom clean, schedules, and novel rotations. We also meet twice per month for grade-level team meetings. Again, more business with some curricular planning and the usual complaint-sharing about students, parents, the district, and administrators. Administrators want us to have common assessments, mainly to ensure consistent instruction. We do, but we get around that requirement by adding on our own assessments and making these the ones that matter. We never analyze student data, except the Common Core annual assessment (and that data is aggregated by grade-level subject, not by individual teacher). Of course, that data is out of date (months old) and so general as to be of minimal use.

At the beginning of the school year I sing the same old song: “Can’t we set aside time at each meeting to look at each other’s student work and learn from each other?” I mean assignments, essays, and unit tests… the stuff that we are now teaching. Everyone agrees we should, but we never have enough time. Why not?

We’re afraid.

What if she finds out that I’m just a mediocre teacher? What if he finds out that I have no clue about how to teach grammar? What if they discover that I really don’t differentiate instruction, though I have a reputation for doing so? Would I be able to or willing to change how I teach? My colleagues aren’t my bosses.

It’s time we take some risks and let the assessment data do the talking. None of us is as good or bad as we think. Everyone has something to contribute and something to learn. We need different perspectives on analyzing data; looking solely at your own data without comparison to others’ data may lead to inaccurate judgments and faulty instruction.

Let’s drop our defenses and let our colleagues into our professional lives. Data analysis as a community of professional educators can produce satisfying results and helps us grow as professionals.

DON’T assess what you can’t teach.

When teachers sit down and brainstorm what baseline assessments to give at the start of the school year, someone invariably suggests a reading comprehension test and a writing sample. I chime in with a mechanics test. Here’s why my suggestion makes sense and my colleague’s does not.

A mechanics test is teachable: 9 comma rules, 7 capitalization rules, and 16 rules for italics, underlining, quotation marks, and the like. A reading comprehension test and a writing sample are not. Check out my article, Don’t Teach Reading Comprehension, when you have time. Suffice it to say that the latter two tests will not yield the same kind of specific data as, say, that mechanics test. Want to download that mechanics test and progress monitoring matrix? The FREE download is at the end of the article; you can teach to this assessment.

Bottom line? You don’t have time to assess for the sake of assessing. Refuse to assess what will not yield teachable data.

DO steal from others.

Teacher-constructed assessments provide the best tools. Work with colleagues to create diagnostic and formative assessments to measure student achievement, plus quick follow-up assessments designed to re-assess once you re-teach what individual students did not master the first time.

Steal exercises, activities, and worksheets from colleagues that will re-teach. No better compliment can be paid to a fellow teacher than “Would you mind making me a copy of that?”

DON’T assess what you must confess (data is dangerous).

I would add an important cautionary note to sharing assessment data. First, students do have a right to privacy. Be careful to keep data analysis in-house. On my recording matrices I suggest using student identification numbers when posting results in the classroom. Second, ill-informed parents and administrators will sometimes misuse data to make judgments about the teacher rather than the student. Lack of mastered concepts and skills could be used to accuse previous or present teachers of educational malpractice. Some administrators will cite quantitative data on evaluations to comment on lack of progress.

Teachers should be judicious and careful in publicizing data. Most parents and administrators will welcome the information, understand it in its proper context, and recognize the level of your professionalism. Set some department or team-level guidelines for data sharing and test the waters before sharing everything.

To clarify, it’s not the data that is dangerous; it’s the misuse that needs to be avoided.

That’s it for now. Some of you will jump up into the aisle to head to the lobby upon seeing “The End.” Others will relax and let the theater clear out before walking out. Make sure to purchase your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 2 and get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. An 87% score on Rotten Tomatoes! Here’s the preview: DO analyze both data deficits and mastery. DON’T assess what you haven’t taught. DO use instructional resources with embedded assessments. DON’T use instructional resources which don’t teach to data.

Many movie theaters are now opting to sell you specific seats for a show time, rather than the traditional first come first served model. Although you have to pay a premium for this advanced purchase option, I think it’s worth every penny. Here’s why: If you time it right, you can show up to your assigned seat right before the start of the movie and skip the annoying previews (usually known as trailers for some reason). According to an editor on Reddit, these trailers (including commercials and warnings to “Please silence your cell phone”) average 15-20 minutes.

ELA and Reading Assessment Do’s and Don’ts: The Movie Trailer

In my Do’s and Don’ts of ELA and Reading Assessments series, I began with a trailer to introduce the articles. This preview, Do use comprehensive assessments, not random samples, focused on why teachers want quick, whole-class, comprehensive assessments which produce specific data regarding what students know and what they don’t know about a subject, and why normed tests and achievement tests, such as the PARCC, SBAC, and other state CCSS tests, don’t provide that data. As an enticement to read the articles (and check out my Pennington Publishing programs to teach to the assessments), I provided two assessments which meet those criteria: 1. the Alphabetic Awareness Assessment and 2. the Sight Syllables (Greek and Latin prefix and suffix) Assessment. Additionally, the respective downloads include the answers, corresponding matrices, administrative audio files, and ready-to-teach lessons.

But first, let’s take a look at the first three-part episode in the Do’s and Don’ts of ELA and Reading Assessments series: DON’T assess to assess. Assessment is not the end goal. DO use diagnostic assessments. DON’T assess what you won’t teach. Wait ’til you see the FREE download at the end of this article. Plus, a bonus!

DON’T assess to assess. Assessment is not the end goal.

A number of years ago, our seventh and eighth-grade ELA department gathered over several days in the summer to plan a diagnostic assessment and curricular map to teach the CCSS grammar, usage, and mechanics standards L. 1, 2, and 3. I was especially pleased with the diagnostic assessment, which covered the K-6 standards, and felt that the team was finally ready to help students catch up while they keep up with grade-level standards.

By the end of the first two weeks of instruction, every ELA teacher had dutifully administered, corrected, and recorded the results of the assessment on our progress monitoring matrix. I began developing worksheets to target the diagnostic deficits and formative assessments to determine whether students had mastered these skills and concepts. I placed copies of the worksheets in our “share binder.” My students were excited to see their progress in mastering their deficits while we concurrently worked on grade-level instruction.

At our monthly team meeting, I brought my progress monitoring matrix to brag on my students. “That’s great, Mark.” “Nice work. I don’t know how you do it.” No one else had done anything with the diagnostic data.

Somehow I got up enough courage to ask, “Why did you all administer, correct, and record the diagnostic assessment if you don’t plan on using the data to inform your instruction?”

Responses included, “The principal wants us to give diagnostic assessments.” “The test did give me a feel for what my class did and did not know.” “It shows the students that they don’t know everything.” “It confirms my belief that previous teachers have not done a good job teaching, so I have to teach everything.”

Class time is too valuable to waste. Assessment is not an end in and of itself.

DO use diagnostic assessments.

Let’s face it; we all bring biases into the classroom. We assume that Student A is a fluent reader because she is in an honors class. Of course, Student B must be brilliant just like her older brother. Student C is a teacher’s kid, so she’ll be a solid writer. My assumptions have failed me countless times as I’m sure have yours.

Another piece of baggage teachers carry is generalization. We teach individuals who are in classes. We all talk about a class as if it’s one organism. “That class is a behavioral nightmare.” “That class is so mean to each other.” “It takes me twice as long to teach anything to that class.” “This class had Ms. McGuire last year. She’s our staff Grammar Nazi, so at least the kids will know their parts of speech.” We lump together individuals when we deal with groups. It’s an occupational hazard.

To learn what students know and don’t know, so that we can teach both the class and individual, we have to remove ourselves as variables to eliminate bias and generalizations. Diagnostic assessments do the trick. Wait ’til you download the FREE diagnostic assessment at the end of this article; it transformed my teaching and has been downloaded thousands of times over the years by teachers to inform their instruction.

Additionally, diagnostic assessments force us to teach efficiently. When we learn that half the class has mastered adverbs and half has not, we are forced to figure out how to avoid re-teaching what some students already know (wasting their time) while helping the kids who need to learn. As an aside, many teachers avoid diagnostic assessments because the results require differentiated or individualized instruction. Naivete is bliss. Diagnostic assessments are amazing guilt-producers.

Be an objective teacher, willing to let diagnostic data guide your instruction. Teaching is an art, but it is also a science.

DON’T assess what you won’t teach.

Many teachers begin the school year with a battery of diagnostic assessments. The results look great on paper and do impress administrators and colleagues; however, the only data that is really impressive is the data that you will specifically use to drive instruction. Gathering baseline data is a waste of time if you won’t teach to that data.

I suggest taking a hard look at the diagnostic assessments you gave last year. If you didn’t use the data, don’t do the assessment. Now, this doesn’t mean that you can’t layer on that diagnostic assessment in the spring if you are willing (and have time) to teach to the data. Diagnosis is not restricted to the fall. Teachers begin the school year with high expectations. Don’t bite off more than you can chew at once.

Additionally, more and more teachers are looking critically at the American tradition of unit-ending tests. Specifically, teachers are using unit tests as formative assessments to guide their re-teaching. Rather than a personal pat on the back (if students scored at an 85% average) or a woe-is-me-I’m-a-horrible-teacher-or-my-students-are-just-so-dumb-or-the-test-was-just-too-hard response (if students scored at a 58% average), unit tests can serve an instructional purpose.

Now I know that teachers will be thinking, “We have to cover all these standards; we don’t have time to re-teach.” I’ll address this concern with a simplistic question that more than once has re-prioritized my own teaching. It really is an either-or question: Is teaching or learning more important?

For those who answer “learning,” don’t add to your admirable burden by assessing what you won’t teach.

That’s it for now. The credits are rolling, but keep reading because the end of the credits may have a few surprises. Purchase your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 2 and get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 92% score on Rotten Tomatoes! Here’s the preview: DO analyze data with others (drop your defenses). DON’T assess what you can’t teach. DO steal from others. DON’T assess what you must confess (data is dangerous).