What do we mean by “information literacy,” anyway?

(Sidenote—our school has had no Internet for the past two days. This is exam week, aka the throes of “end of year” activities. It’s been fascinating to see the way that technology has snaked its way into so much of our daily lives. The next time someone describes a library as “quaint, old-fashioned, or book-filled,” think about your life without Internet, let alone without computers.

Things I cannot do:

- Check out or return books
- Update overdue lists
- Run usage statistics from library databases and catalogs for the end-of-year report
- Place my MISBO order for 2016
- Access collaborative files on Google Drive
- Check the time Amazon is supposed to deliver a box of books for our administrator (needed today)
- Download summer reading books on OverDrive
- Write this blog post on WordPress or get the screenshots I need
- Waste time Googling things that pop into my head

Things I can do:

- Put class assignments with my feedback into their relevant folders
- Remove extra icons from the library computer desktops
- Dust
- Reshelve books
- Sort book donations
- Look at print books (but I’m still not brave enough to read during the day)
- Write this post in Microsoft Word and plan to post it later

Libraries are technology hubs! But enough of my digression…)

In 2010, a forward-thinking administrator added digital citizenship/information literacy units to grades 6, 9, and 10. I usually prefer teaching within specific classes, but he gave me a lot of leeway in terms of the types of units I could create, so I feel like it’s a pretty creative and engaging series of lessons. At the end of my days with the 10th grade, I give them the TRAILS information literacy assessment. For five years, they’ve rocked the 9th grade assessment, so this year we changed it up to see how they’d do with the 12th grade assessment.

Most of you are probably familiar with TRAILS, the information literacy assessment out of Kent State University. It was designed with AASL’s Standards for the 21st-Century Learner and the Common Core standards in mind. Students answer multiple-choice questions on research and technology use in the following categories: develop topics; identify potential sources; develop, use, and revise search strategies; evaluate sources and information; and use information responsibly, ethically, and legally. Plus, it’s free! Though I think it can be tough to identify digital literacy skills in an abstract test like this, it’s helpful for me to see trends and to be able to use the results to advocate for more time spent learning specific tasks.

For example, as adults, I want my students to be able to identify bias in the media, in contexts as varied as evaluating the current electoral debates and browsing sponsored content on Instagram. It’s an important life skill. And thus it’s helpful for me to know that it’s one my students aren’t getting. (Yet, Carol Dweck!) Just over half chose one of the other options. It seems so straightforward to me that I probably haven’t done enough to address it directly during my time with them. (Luckily, I shared this with an 8th grade teacher, and we’re adapting the Visible Thinking Headlines routine that we do to show both biased and unbiased options. Advocacy goal met.)

In general, there’s a world of difference between the 9th and 12th grade sets; compare a topic selection question from the 9th assessment with one from the 12th. Because I recognize that there’s a tough balance in getting teens, whom I don’t teach on a daily basis and do not grade, to take something seriously without stressing about it, I told them that this was for me to grade myself. I also showed them the way I receive their responses as a class rather than as individuals, which fed class competitive spirit without individual pressure. The other way I tried to keep their interest was by telling them that there were three questions where I felt that more than one answer was entirely valid. This is one of them. Well, actually, I think the second answer is correct, but I can see some justification for the first.

Curious about the other ones? My kids didn’t particularly answer this “correctly,” and I’m okay with that. I’m saying that I value this assessment at the same time that I’m questioning some of its questions, and that can be a reflective tool for both me and my students. They loved guessing the ones where I struggled to find the “correct” answer, and I was happy to have them initiating conversations about information literacy.

To be fair, we don’t purchase any math databases, nor do we require any math research projects. But if a student came to me with this open-ended assignment, the first place I’d suggest looking is the table of contents in her math book, followed closely by a Google search on the topic. I probably wouldn’t search through a math magazine, but guess what, neither would any of my students.

When I think PowerPoints, I think images. Our teachers are pretty strict about using slides to illustrate discussion points rather than as a teleprompter. But even so, for information for a health class, which in our school would be a basic elective, any of these would be fine places to get information. The first site for diabetes (not exactly nutrition, but related) that comes up with the .gov limiter is the National Institutes of Health (http://www.niddk.nih.gov/health-information/health-topics/Diabetes/your-guide-diabetes/Pages/index.aspx). The Diabetes Unit of Massachusetts General Hospital, number two on U.S. News and World Report’s “Best Hospitals for Adult Diabetes,” has a comprehensive and trustworthy page (http://www.massgeneral.org/diabetes/). And just last October, Newsweek published an article about the link between sleep deprivation and diabetes (http://www.newsweek.com/study-lack-sleep-linked-risk-factors-stroke-diabetes-and-heart-disease-386492). Of course I’d trust the science database, but if you’re trying to find information to share with classmates who may have less background in a topic, I wouldn’t limit myself.

Returning to the idea of how seriously the students take this: it was helpful for me to see that almost half of my students got this wrong because they selected the library catalog. I never would have believed it otherwise. It’s a terrible answer; even if we took it to the next step, I have no books in my collection that would answer this! Their responses show that I’m likely still favoring library resources over some others. Yet again, for the millionth time in the last month, I’m inspired and challenged by Nora Murphy’s Source Illiteracy presentation.

And saving the best for last, I learned that the tree octopus has been around long enough that it’s new again! Unlike a few years ago, this year’s sophomores weren’t familiar with it, and I loved the statement, “It’s just so cool I want it to be true!” Guess that can go back into my information literacy lessons in the younger grades, huzzah!

Please share your thoughts below. Am I off base in questioning some of the questions? Do you use TRAILS with your students, and what do you do with the results? In what grades have you tested students, and how has your teaching changed based on what you’ve learned?

I did the TRAILS 9 pre- and post-test with our 9th graders this year, at the beginning and end of the year. I found it to be great for analyzing trends and seeing where their skills improved (and where they stayed the same), as a road map for how to strengthen our info lit program. I agree, though, that the questions are not quite perfect.