This month, I'll be trying out some PKM workshops such as "Get Organized", "Stay Current" and "Managing Privacy Online". I'm hoping they will be more popular than some of our database-specific workshops and it will be a good opportunity to sneak in some info-lit skills as well.

"Behavior vs. belief and changing culture" by Meredith Farkas, August 10, 2012: It's pretty clear from the comments on my recent posts that many of us have a sense that the sort of information literacy instruction we're providing is not having the impact we'd like.

"The information and materials on this page will help program chairs, program coordinators, and assessment coordinators develop their program assessment plans, implement them, and report on them. Assessment Plans and Reports are created at the program level."
Includes templates, examples, summaries, and more about creating department/program assessment activities.

One of their assignments is to interview a researcher in their field. This year, since the students had a nice mix of majors from across the curriculum, we used reports from the interviews as an opportunity to analyze how research traditions vary from one discipline to another and how these experts’ processes differ from those of non-experts.

One thing that many students remarked on as they reported on their interviews: the activities that define research are enormously varied from one discipline to another. The process a researcher goes through to examine the historical context in which Shakespeare wrote one of his history plays is a world apart from what a researcher does to develop a new vaccine or what an ethnographer does when studying an isolated culture in Brazil.

The scientists all had co-authors; the social scientists worked on a mix of solo and collaborative projects; and the humanists all performed solo acts. And yet, it became clear that all of them were working within an ongoing conversation. None of them was doing work that didn’t draw on and respond to the work of others.

Every researcher described some strategies for keeping up with new developments in their area of expertise, all of which involved some scanning of new publications and some personal contact with individuals exploring the same territory.

For most, presenting research at conferences was a common part of bringing their research to completion. For all, writing up results for publication was an important final step, and they seemed acutely aware of the pecking order for publication venues in their field.

One thing the students all gained through these interviews was an appreciation that research is not a matter of finding answers in other people’s publications. Every scholar interviewed described how they had asked a question that nobody had asked before, a question they couldn’t answer themselves until they had completed the research. It struck me that so much of what undergraduates experience as “research” is very nearly the opposite, a process of uncovering answers others have already arrived at.

I’m also thinking about what these interviews said collectively about how real research is conducted. It makes me a little crazy when students abandon a truly interesting question because they can’t find sources to quote that provide the answer, or when they change their topic based on what they can find easily. Or (shudder) when they say they've written their paper, but need help finding five sources to cite. Clearly, they are not learning how to do research; they aren't even learning what research is.
What I would really, really like is to figure out how to give every student the experience of not worrying so much about getting the right answers, but learning how to ask a really good question. The kind they won't find answered in the library.

"I teach a course in the spring called Information Fluency... It's an upper division undergraduate course pitched to students who are planning to go to graduate school, giving them a chance to learn more about the way the literature of their field works as well as generally how to use library and internet tools for research."

National Survey of Student Engagement
“First-year students were asked in NSSE about the frequency with which they ‘worked on a paper or project that required integrating ideas or information from various sources,’ a component of information literacy. UNCW first-year students reported a frequency that was statistically significantly below that reported by our selected peers, significantly below that reported by national master's universities, and significantly below that reported by all NSSE 2007 institutional participants. This information led to the development of a rubric-based assessment plan for information literacy to be implemented with the comprehensive assessment of Basic Studies beginning Fall 2009.”

Assessment Checklist
1. What are our research questions? (What are we trying to discover about student skills, knowledge, abilities, etc.; and what evidence do we have already?)
2. What is the expected level of performance?
3. When in the students' career do we assess this outcome? (entry, end of sophomore year, senior, etc.)
4. In which course(s) or venue?
5. What student work/artifacts are collected?
6. How is the student work evaluated? (criteria/rubric)
7. Who evaluates the student work?
8. Who analyzes the results?
9. Where do recommendations for action go?
10. Who takes action? (And how do we ensure changes are evidence-based and data-driven?)
11. How is the process documented?
12. Where is the documentation kept?
13. What is the timetable/schedule for determining which outcomes are assessed when?

Developed by the General Education Assessment Committee for designing assessment of a learning outcome.