Does the Classroom Prepare Students for the Real World or Just More Testing?

Growing up, the multiple-choice exam was decidedly the preferred manner of testing during my twelve short years in the public school system. Memorize just enough to be able to discern the most correct answer among three to five options.

Sure, sounds super practical. But are you really retaining any of that knowledge once the test is over?

Of course during those years, I learned enough about test taking to acquire the necessary skills to make it through the SATs, ACTs and six years of college. Hurray! But if someone were to ask me if I were using any of that learned information today, what could I say?

School should be preparing our youth for life in the real world, right? Yet my students, while so concerned about what will be on the coming test, have very little idea how they can apply the information throughout their lives. So, what are we actually teaching our students?

Instead of being able to explain what they can DO with the information they have learned (e.g., order food at a restaurant using what they learned in language class, determine correct change while shopping using what they learned in math class), students often only cite the technical name of a skill. They are being trained to take exams that require them to ingest information and then spew it out without real understanding of its importance. For me, this feels problematic.

As a relatively new teacher, I was introduced early on to alternative forms of testing that may help transform school-learned information from technical facts into applicable knowledge. Despite the research-based support for methods of testing that avoid pencil, paper, and rote memorization, my years of teaching have shown me just how ingrained the expectation of regurgitated information is in our understanding of evaluation.

The Alternative Option

First, let’s introduce the alternative. In foreign language instruction, one option is called the Integrated Performance Assessment (IPA). Pioneered by education researchers in Pittsburgh, the IPA incorporates three related tasks that cover all three modes of communication (interpretive, interpersonal, and presentational) in a simulated real-life scenario.

While the multiple-choice exam more or less teaches a student to memorize and rehearse information, an IPA challenges them to apply the information in scenarios that they will likely encounter outside of the classroom.

In order to see its benefits and disadvantages, I gave my students both an IPA and a multiple-choice exam on the very same chapter and compared the results. What a difference! Having tracked my students’ overall success rates on both, I saw they were proving much more capable at completing the IPA tasks. I was surprised by how much better a student could write an email in the past tense relative to his or her ability to plug and chug the past tense into a paragraph.

More importantly, a survey distributed after each form of exam showed a significant difference in perception of the information learned. While the written test elicited responses referring to grammatical components such as the “past tense” and “conjugation” (as well as a handful of negative comments such as “I don’t like Spanish”), the IPA survey results suggested the students had thought about the content differently.

The comments were more specific. “[I learned] how to send an email in Spanish,” “I learned how to read articles,” and “I learned how to say I hope, I would like or I plan to travel.” And there was not a single, “This is stupid!” The positive change in perspective left me feeling inspired. So, why haven’t we switched to this type of testing in all classes?

The Reality of a Test-Taking Culture

Move ahead to the spring semester where millions of students across the country sit down to complete standardized state exams. Do those exams ask the students to complete interconnected tasks that mirror real life application? Do they show how math, reading and science can be applied to solve a potentially real problem? How about the SATs or the ACTs? I think we all know the answer here.

Outside of my classroom, on the tests that really count, my students are still being given pencil-and-paper, often multiple-choice exams that do not reflect the manner in which they are being taught or evaluated in the classroom. This led me to think: are my IPAs still applicable after all?

Though my Integrated Performance Assessments best evaluate whether my students have truly grasped language learning and how information is applicable to their lives, the IPAs may be ignoring another large part of these students’ real world experience. Without testing practice in their regular classes leading up to these standardized tests, they will surely be doomed when test time comes.

Though I continue to write and create IPAs for my students as often as possible, I do incorporate some multiple-choice tests for the very reason stated above. I still fully believe it is a disservice not to provide the vocabulary for finding a bathroom in another country after one too many sticks of street meat (an imperative, as many of you travelers may know), but I also believe that test-taking skills are an absolute must in our current system.

Food For Thought

IPA pros: Mirrors real life and offers students meaningful application of material. *Bonus if the students can verbalize this application.*

IPA cons: Takes about a week to prep and two and a half weeks to complete. In comparison, a multiple-choice test takes no longer than a day or two of class time.

It also does not mirror the test design that students have become accustomed to, and thus does not contribute to test-taking prep. *Not so bonus if the test is not taken seriously for that very reason.*

Though I still find testing problematic, the potential benefits of Integrated Performance Assessments may not outweigh the cons. Considering the reality of our testing culture, a large systemic change may need to take place in order to legitimize these assessments.

I’ll add that I think multiple choice is an important format in some ways, just not the exclusive tool we should use to evaluate competency. Isolating the best choice from a short list is a valuable skill in and of itself, but on the whole, my point stands.