Abstract

This article describes a new tool for teaching information literacy skills to college undergraduates. BiblioBouts is a game built on premises of educational gaming that came out of research published in the October 2008 issue of D-Lib Magazine. BiblioBouts seeks to satisfy student requests for a gaming experience directly integrated into their current coursework. Alpha testing of the game focuses on answering three main questions: Is BiblioBouts an effective approach for teaching undergraduate students information literacy skills? Do students want to play this game? What improvements does BiblioBouts need? Student response has been encouraging on all fronts. Contact information is provided so that interested parties can learn more.

Background

In the October 2008 issue of D-Lib Magazine, our research and development (R&D) team described the development, testing, and evaluation of the web-based board game Defense of Hidgeon (Markey et al., 2008b). The game's objective was to teach incoming undergraduates information literacy skills, specifically, the General-to-Specific Model for conducting library research (Kirk, 1974). Hidgeon served as a test bed to guide our game design efforts. As a result, we gathered valuable data about how students would prefer to interact with educational games. Above all, students did not want game play that was separate from or unrelated to their coursework. They wanted a game that was integrated into, and that enhanced the workflow of, the courses they were already taking. Game players told us they would play information literacy games that helped them complete their assigned coursework and earned them course credit. They preferred games they could play on their own, but they would collaborate with their peers when the collaboration furthered what they wanted to accomplish. Our R&D team concluded that a major upgrade to Hidgeon could not accommodate students' needs, so we drew on our experience with the research tasks that went into designing Hidgeon and built a new information literacy game from scratch, using students' needs as design principles for the new game.

Envisioning a New Information Literacy Game

We envisioned a new information literacy game that would be a pervasive accompaniment to the online tools students use for resource discovery and management. The game's central idea was that it would guide students through the research process on a topic assigned in a class, culminating in an actual bibliography to be used to write a specific paper. Like scaffolding for a building, the game would support the student directly in a class assignment, while also teaching research skills generally. The new game would consist of a series of narrowly focused and successive mini-games. Introducing students to a limited number of new research skills at a time, mini-game play would give students tools to compare their performance with their peers, repeated opportunities for skills practice, and incentives for matching the performance of the majority of their peers. Mini-game results would be incremental and cumulative, culminating in the building of a shared digital library of resources bearing game players' ratings, keywords, and categorizations that they would use to select the best resources for their assignments.

Not only would this collaborative research game teach library research skills to students in a manner relevant to their immediate course work, but it would also give instructors entirely new information on exactly how students conducted research. The game would track student research as it happened, allowing instructors to see the successive stages of the student research process as well as the final result.

Far from being a sea change, our new design was similar to an early idea for Hidgeon's design that we were unable to pursue due to the resources required to develop it (Markey et al., 2008a, 6). What was needed to breathe life into this new design was an online citation management system (CMS) into which students could save the citations and digital full-texts they found as a result of searching the web and library databases and share them with others. The CMS would serve as a conduit, feeding citations and full-texts to students during mini-game play. Searching for a CMS, the R&D team inquired about the suitability of the open-source Zotero CMS. The Zotero development team assured us that their plans for implementing shared citation management would synchronize with our game development and deployment plans, and thus, we relied on Zotero for the CMS component of the new game.

Supporting the development, deployment, and evaluation of the information literacy game are National Leadership Grant funds from the Institute of Museum and Library Services (IMLS). Game development, deployment, and evaluation are iterative, taking place multiple times as the game evolves to meet the needs of both game players and instructors. Game development details are documented in the R&D team's interim reports to IMLS (BiblioBouts Project, 2010b). The reasons why we chose games over other approaches to teach students information literacy skills are detailed in our earlier D-Lib Magazine article (Markey et al. 2008b). The purpose of this article is to describe the design and development of the new game, its deployment in academic classes, evaluation findings, and the improvements that the R&D team will make to the game in preparation for a second round of deployment and evaluation.

An Overview of the BiblioBouts Information Literacy Game

The BiblioBouts information literacy game is an accompaniment to course assignments that require undergraduate students to find, read, and synthesize published literature about academic topics and then write a paper or annotated bibliography, or author a multimedia project. Because game play is phased, occurs over several weeks, and requires a significant time and learning investment on the part of students, instructors must incorporate BiblioBouts into their syllabuses, combining game-play with a course assignment and giving students course credit for their participation in the game. This requirement is in keeping with the R&D team's previous experience with deploying games in academic settings, that is, students are much more favorably inclined to play games that are integrated into their coursework and count toward their grades (Markey et al. 2008b).

BiblioBouts deployment is not limited to academic classes. Librarians who teach for-credit, half- or full-semester bibliographic instruction classes are also encouraged to incorporate BiblioBouts into their syllabuses and course assignments.

Students who play BiblioBouts get hands-on experience conducting library research through a series of mini-games or bouts. During the game, they use the same online tools that their instructors use to conduct their research. They get to see the sources (that is, both citations and full-texts) that other players find, where they found them, and discover sources they would not have found on their own. At the end of the game, they have a high-quality bibliography that they can use to write their research paper, and they can immediately use their new research skills in other college coursework.

BiblioBouts consists of a series of narrowly focused and successive bouts or mini-games that introduce students to a few new research skills, concepts, and tools at a time. Table 1 lists the game's bouts, describes them, and summarizes the information literacy skills, concepts, and tools that students encounter during BiblioBouts game play. Click on bout names to link to a video demonstrating game play.

Table 1 (excerpt): Players choose the best sources that address a specific research question, using aboutness, disciplinarity, audience, relevance, and credibility to choose the best sources.

Instructors are encouraged to partner with librarians who can help them introduce students to the library research concepts, skills, and tools they will encounter during the various phases of the game. Table 2 describes how instructors and librarians partner to deploy BiblioBouts in the classroom.

Table 2 (excerpt): Before the Best Bibliography bout begins, instructors develop subcategories that sources cover that are likely themes and subthemes for students' papers, and develop 2 or 3 research questions that can be answered by contributed sources.

The BiblioBouts Design, Development, Deployment, and Evaluation Cycle

In addition to the University of Michigan (U-M), the BiblioBouts R&D team has partnered with librarians and faculty at these four institutions to deploy and evaluate the game: (1) Chicago State University (CSU), (2) Saginaw Valley State University (SVSU), (3) Troy University-Montgomery Campus (TUMC), and (4) University of Baltimore (UB). The design and development of BiblioBouts is an iterative process, taking place in three cycles over several years, so that faculty, students, and librarians can be involved in design and development through game deployment and evaluation (Table 3).

Table 3. BiblioBouts' Cycles

1. Alpha release
   Design and development: Winter through summer 2009
   Deployment and evaluation: Fall 2009, winter and spring 2010

2. Beta release
   Design and development: Winter through fall 2010
   Deployment and evaluation: Winter through fall 2011

3. Operational release
   Final development and release: Winter through summer 2012

The team completed alpha-system design and development in summer 2009. In fall 2009, game deployment and evaluation took place in three classes: one undergraduate class of 8 students at SVSU, and one graduate class of 15 students and one undergraduate class of 27 students, both at U-M. In winter 2010, 7 classes totaling about 300 students are playing BiblioBouts: two UB classes, two TUMC classes (5 sections total), and three U-M classes. The alpha release of BiblioBouts features basic functionality for all 5 bouts and an administrative interface for instructors to create and profile games and to monitor student participation, both as a whole and individually. Currently, the R&D team is analyzing evaluation data from fall 2009 classes and refining BiblioBouts based on the findings.

Evaluating the Alpha Version of BiblioBouts

Because of the team's near-term goals for the beta-system design and development, these three research questions are most important right now:

Is BiblioBouts an effective approach for teaching undergraduate students information literacy skills?

Do students want to play this game?

What improvements does BiblioBouts need?

Answers to these three questions come from an analysis of pre- and post-game questionnaires, focus group interviews (FGIs), and game diaries that students completed after playing the alpha-release of BiblioBouts in fall 2009.

1. Is BiblioBouts an Effective Approach?

To answer this research question, we analyzed pre- and post-game questionnaires completed by 27 of the 100 students in an undergraduate "Introduction to Information Studies" course who volunteered to play BiblioBouts for extra credit. Asking students their perceptions about various library-research tasks, the questionnaires featured 5-point scales (i.e., 2 = very {challenging, well, confident}, 1 = somewhat {challenging, well, confident}, 0 = neutral, -1 = somewhat {unchallenging, poorly, unconfident}, -2 = very {unchallenging, poorly, unconfident}). Overall, students' ratings usually ranged from 1 to -1. Few students chose the "very" ratings at either end of the scale.
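As a concrete illustration of how such a 5-point scale is analyzed, the sketch below maps scale labels (following the "challenging" wording above) to their numeric values and computes a mean pre-/post-game rating change for one task. The response data are invented for illustration; they are not the project's actual dataset.

```python
# Map the questionnaire's 5-point scale labels to numeric values.
SCALE = {
    "very challenging": 2,
    "somewhat challenging": 1,
    "neutral": 0,
    "somewhat unchallenging": -1,
    "very unchallenging": -2,
}

def mean_rating(responses):
    """Average the numeric values of a list of scale labels."""
    values = [SCALE[r] for r in responses]
    return sum(values) / len(values)

# Illustrative (invented) pre- and post-game responses for one task.
pre = ["somewhat challenging", "very challenging", "neutral", "somewhat challenging"]
post = ["neutral", "somewhat challenging", "somewhat unchallenging", "neutral"]

# A negative change means the task felt less challenging after game play.
change = mean_rating(post) - mean_rating(pre)
print(round(change, 2))
```

With these invented responses, the mean drops a full point; the article's actual findings report drops of a little over half a point on average.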

When students rated how challenging they thought it would be to perform various library-research tasks, their post-game ratings dropped a little over a half point. Thus, playing the game made students feel that various research tasks presented less of a challenge to them.

Students' ratings stayed the same (at neutral or close to 0.0) for two tasks: selecting a topic, and keeping track of the citations, full-text articles, web sites, etc.

On the one hand, since the game has instructors selecting topics instead of students, students' stable before-and-after ratings for topic selection are in keeping with game play that gives them no practice in this regard. On the other hand, BiblioBouts gives students multiple opportunities to use Zotero to keep track of citations, full-texts, web sites, etc.; thus, the almost imperceptible ratings change was unexpected. In FGIs, students acknowledged that setting up Zotero, synchronizing it with BiblioBouts, and learning how to use it was difficult. Here are comments from these same U-M undergraduate students about Zotero:

"Like once I got used to [Zotero], the learning curve was kind of steep for me. I don't know if it was I just wasn't getting it or ... I had a lot of like technical problems. I don't know if it was me or what. But starting off I just ... I don't know ... it wasn't working very well for me. But like once you actually get it going, it's like clearly a really useful thing ... I probably would [not] have been able to get all of the information that I had, like I wouldn't have been able to collect it all like that and Zotero made it possible."

"I think a little more experience with Zotero itself would have helped. I thought it was a pretty simple concept and useful. So maybe experience with it in an earlier class would have been fine."

"That was my first time using Zotero as well and it wasn't too hard to figure out because I had prior experience with software programming ... but I agree it was a little abstract too in the sense that a lot of people hadn't used it before."

Students' "challenging" ratings for all 8 other tasks decreased by between about one-quarter of a point and 1 point. Tasks that became especially less challenging were:

Where to find good information after I exhaust Google, Wikipedia, and the web

Judging sources with regard to their scholarly nature

Judging citations with regard to their relevance to my selected topic

Judging sources with regard to their trustworthiness for my selected topic

BiblioBouts gave students ample opportunities to learn about new sources of information: librarians visited class to demonstrate use of scholarly databases and make database recommendations, and students searched these databases during the Donor and Closer bouts. In the Rating & Tagging bout, students rated a quota of 18 sources, requiring them to make repeated judgments about the relevance, trustworthiness, and scholarliness of sources submitted by their classmates.

Pre- and post-game questionnaires featured additional questions that asked students to rate how well they thought they would be able to perform various library-research tasks and how confident they would be performing them. On average, post-questionnaire ratings increased between two-thirds and three-quarters of a point for the "how well" and "confidence" questions, respectively. Thus, playing the game made students feel that they would be better at and more confident about performing various research tasks than they felt before playing the game.

With regard to how well and how confident students were "keeping track of the citations, full-text articles, web sites, etc.," rating increases were positive but small: about one-quarter of a point. To explain why, the R&D team will probe students in future FGIs and perform a failure analysis of the citations and full-texts that students submitted in the Closer bout to determine whether they did indeed submit both. Additionally, our experiences in winter 2010 classes troubleshooting students' problems using Zotero lead us to suspect that many students cannot distinguish citations bearing abstracts from full-texts. Such a finding calls for the librarian's class visit to include instruction on identifying and using citations, abstracts, and full-texts, and for encouraging students to watch the game's videos that show how to add full-texts to Zotero.

Registering an increase of 1 point or more were these two skills:

Where to find good information after I exhaust Google, Wikipedia, and the web

Using BiblioBouts' structured library-research process to conduct library research for my coursework

In FGIs, students acknowledged that learning about library databases from playing BiblioBouts and from the librarian's visit was valuable:

"I mean I just think that like usually I use Google just, you know, and like comparing the results I would get from Google to these databases is a huge difference and like I really realize that now, that the material that I was getting was not that reliable and not that scholarly and to be writing research papers and stuff I need to be using like databases and stuff like that."

"I think that I was really able to see a difference between the academic quality of sources from a database and from Google and nowadays, we don't really go to the library and get a hard copy of a book. So the electronic equivalent of that kind of quality could be found in a database rather than just on a Google search."

"I think the biggest help was when the librarian came in and showed us ... how to use the different databases because I think that there is just so many that I wasn't really sure [where] to go. I mean ProQuest was the main one that I used at first because it was just so general but she showed us specific like business type websites that we could go to. And I liked that."

Here are FGI comments that support how students learn a structured research process by playing BiblioBouts:

"[Playing BiblioBouts] reinforced how I would go through my research and make it more methodical ... it solidified my methods of doing research, it solidified the approach of doing research and it also would give me a platform tailored to those methods. Why shouldn't you have a system that teaches you those methods and to go through? There's no reason not to. It only makes sense."

"[Playing BiblioBouts] was helpful to me because sometimes I just know I need like a quote for this or that and I look for something to support what I am saying. But with this, I was able to actually like see the process I should go through and be able to get more information to back up my research and make sure it was credible information instead of just like grabbing something that looked like it went along with what I was trying to say. So I think it made my papers stronger in that aspect."

2. Do Students Want to Play BiblioBouts?

Pre- and post-game questionnaires asked students to rate themselves with respect to playing an online game to learn about conducting library research. Questionnaires featured 5-point scales (i.e., 2 = very high, 1 = somewhat high, 0 = neutral, -1 = somewhat low, -2 = very low) and enumerated these factors:

Motivation

Interest

Perseverance playing the game

Desire to win

Desire to receive the highest grade for game play

Desire to have fun playing the game

Desire to learn something about library research

Overall, pre-game ratings averaged 0.25, a positive quarter-point past neutral. The few students who chose "very high" or "high" ratings did so for the "highest grade" and "learn something" factors only.

Post-game responses were more positive, averaging 0.60 overall. This time, the few students choosing "very high" or "high" ratings did so for all but the "fun" factor. After playing the game, students were positive about playing the game, but the intensity of their positivity was not breaking any sound barriers. Students' post-game ratings for two factors deserve mention: motivation and perseverance. Most students described their motivation to play BiblioBouts as "very high" (25%) or "somewhat high" (56%). A large percentage of students (44%) reserved the "very high" response for the perseverance factor only. The Rating & Tagging bout especially put the motivation and perseverance factors to the test, requiring students to evaluate 18 sources. Many students commented during FGIs that this quota was too high, telling us how time consuming it was and suggesting that the quota be reduced and the interface simplified. Despite their complaints, most players met the quota and acknowledged their own hard work by rating their motivation and perseverance quite high.

The incentives for playing BiblioBouts were different for each of the three classes. To qualify for the incentives, students had to meet quotas for each of the five bouts. SVSU students received course credit connected to a related writing assignment, U-M undergraduate students received extra credit, and U-M graduate students chose between playing BiblioBouts (14 students) and writing a paper (1 student). All three approaches were effective in terms of recruiting and retaining players.

Also contributing to retention could have been the course-related benefits students said they received. The benefits listed include getting a head start on one's research, finding relevant sources from classmates' submissions, becoming a more confident researcher, and becoming better equipped to use Zotero for citation management in the future. Here are representative comments about game-play benefits:

"You have to do the first step before you can get to the second because if you want to do the second and you didn't do the first, sorry you're out of luck. You can't do it. I thought that was good incentive too to kind of motivate you to get going [to do your research long before it was due]. Yeah [several agree]." (SVSU undergraduate student)

"In the Sorter [bout], I would like kind of like click 'extra' information just so I could try and open articles that were relevant to me and then I ended up sorting all of them so I could see articles that would be helpful to me." (U-M undergraduate student)

"I think it could potentially be really, really helpful to somebody because you are introduced to things you might not have access to finding but somebody else found it and so you could use it so it could really help." (SVSU undergraduate student)

"It made me feel more confident in the research I'll do in the future." (U-M undergraduate student)

"Using Zotero ... because it can help you cite things so, so much easier." (U-M undergraduate student)

"I got to see what others were finding. When I write a paper, I don't know what others are finding ... It was reassuring to see that people are using the same journals as I am using, and oh, I've never thought about using this journal, and that was really helpful." (U-M graduate student)

3. What Improvements Does BiblioBouts Need?

Every data-collection method yielded suggestions for improvements to BiblioBouts; students, faculty, and project liaisons from participating institutions all made suggestions. Because all asked for a simplification of the game set-up process, this is our first order of business in the design and development of the beta version of BiblioBouts. Several design improvements will enhance the competitive aspects of the game. We hope they will motivate students to exceed quotas by steering students toward actions that increase their score and give them even more practice with research tasks.

Here are the most important design improvements that the R&D team will introduce to the beta version of BiblioBouts:

Improving the game's attractiveness with a new banner, interface, navigation bar, and leader board.

Adding feedback to scoring. Because game players want to know immediately when and why they score points, the game's new banner will include a personal scorecard that reports their total score and the effect of just-completed game play on this score.

Streamlining BiblioBouts by eliminating the Sorter bout entirely and incorporating its objectives into both the Rating & Tagging and Best Bibliography bouts. Because students told us they gained little from Sorter, we will redesign the Rating & Tagging bout to elicit from players the "big ideas" that sources describe. The Best Bibliography bout that follows will then elicit the "big ideas" players intend to describe in their papers and expect them to choose, for their best bibliographies, sources bearing the same "big ideas" that players assigned during the Rating & Tagging bout.

Streamlining the Rating & Tagging interface to be less time-consuming, less repetitive, and more useful for game players. In particular, we will add feedback so students can find out how their classmates are rating and tagging sources.

Awarding virtual badges to players based on the nature of their game play. Because game players want acknowledgment for game accomplishments, BiblioBouts will award badges that recognize positive game play such as winning individual bouts, being the first to play a bout, exceeding quotas by the largest margin, adding the most comments to their relevance and credibility ratings, and choosing the most-chosen sources for their best bibliographies. BiblioBouts will also award badges to players for their less-than-positive game play such as being the lowest or highest rater, closing the most sources that get eliminated from the game, and closing on sources that other players rate the lowest. Badges will be displayed on players' homepages and will accompany rating comments that fellow players see during and after the Best Bibliography bout. Thus, badges will bring attention to activities that are likely to make students successful researchers of scholarly information.

Because players want to use other players' closed sources to write their papers, we are adding a database of all closed sources that game players can search, retrieve, and download after the BiblioBouts game ends.
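Badge-awarding logic of the kind described above could be sketched as follows. The badge names, the PlayerStats fields, and the thresholds are hypothetical illustrations, not the beta version's actual rules; only the 18-source rating quota comes from the article.

```python
from dataclasses import dataclass

@dataclass
class PlayerStats:
    """Hypothetical per-player play statistics tracked by the game."""
    bouts_won: int
    first_to_play: bool
    quota: int            # rating quota set for the game (e.g., 18 sources)
    sources_rated: int
    comments_added: int

def award_badges(stats: PlayerStats) -> list:
    """Return the badges a player has earned from their play statistics."""
    badges = []
    if stats.bouts_won > 0:
        badges.append("Bout Winner")           # won at least one bout
    if stats.first_to_play:
        badges.append("Early Bird")            # first player to start a bout
    if stats.sources_rated > stats.quota:
        badges.append("Overachiever")          # exceeded the rating quota
    if stats.comments_added >= 10:
        badges.append("Top Commenter")         # added many rating comments
    return badges

# Example: a player who won a bout, rated 22 of an 18-source quota,
# and left 12 rating comments.
stats = PlayerStats(bouts_won=1, first_to_play=False,
                    quota=18, sources_rated=22, comments_added=12)
print(award_badges(stats))
```

Keeping the rules as simple predicates over play statistics makes it easy to add new badges, including the "less-than-positive" ones the article mentions, without touching the rest of the scoring code.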

The Next Steps for BiblioBouts

About 300 students in 7 classes at TUMC, UB, and U-M are playing BiblioBouts in winter-spring 2010. The R&D team and partner librarians at TUMC and UB have begun evaluation activities so that we can confirm and expand on findings from the first round of game play in fall 2009. The R&D team has also begun its design and development activity for the beta version of BiblioBouts beginning with a simplified set-up process and focusing on design improvements that enhance the competitive aspects of the game.

The R&D team is recruiting instructors and librarians who want to deploy the beta version of BiblioBouts in their classes in winter and fall 2011. The Instructor FAQ (BiblioBouts Project, 2010a) describes courses that are a good match for BiblioBouts, how instructors can prepare for the game, and the types of class discussions they might want to lead to increase students' understanding of the information literacy tasks students experience during BiblioBouts game play. To play the BiblioBouts demonstration game, readers are encouraged to email "info@bibliobouts.org" for a username and password.

Conclusion

BiblioBouts solves the problem of teaching students information literacy skills, concepts, and tools in a unique way. Students complete their assignment and play BiblioBouts at the same time: the game is a pervasive accompaniment to the online tools students use for resource discovery and management, giving them opportunities to practice various information literacy skills over and over again. Playing BiblioBouts puts students in situations in which they leverage their research efforts: each student finds readings, assesses their usefulness for their assignment, and chooses the best readings. But because game play is collaborative, the cumulative effect for student learning and achievement is greater than any one student could have accomplished on his or her own.

Findings from the evaluation of the alpha version of BiblioBouts are promising. Students feel that they would be better at and more confident about performing various research tasks than they felt before playing the game. Students rate their motivation and perseverance at playing the game at high and very high levels. They cite many game-play benefits such as getting a head start on one's research, finding relevant sources from classmates' submissions, becoming a more confident researcher, and becoming more experienced using Zotero for citation management in the future. When the R&D team asked students whether they preferred a game over other approaches to learning about using library resources, this U-M undergraduate student's remark summed up the hopes and dreams that we designers had for BiblioBouts:

"I think it's good because you're not realizing at the time that you're learning about research. Like you might not want to think, 'Oh, I want to go learn about library research today.' You're playing the game and you're learning about it without doing that."

Acknowledgment

The R&D team acknowledges the Institute of Museum and Library Services (IMLS) for its support of the BiblioBouts Project. Our gratitude extends to Professor Sean Takats and the Zotero programmers at George Mason University's Center for History and New Media, and project liaisons Anita Dey at SVSU, Catherine Johnson at UB, Alyssa Martin at TUMC, and Gabrielle Toth at CSU.

About the Authors

Karen Markey is a professor in the School of Information at the University of Michigan. Her research has been supported by the Council on Library Resources, Delmas Foundation, Department of Education, Institute of Museum and Library Services, National Science Foundation, and OCLC. She is the author of four books, more than a dozen major research reports, and several dozen journal articles and conference proceedings papers. As the principal investigator of both the Storygame and BiblioBouts Projects, she focuses on game design and evaluation and overall project management.

Fritz Swanson earned an MFA in fiction and non-fiction from the University of Michigan. He has been on the faculty of the University of Michigan English Writing Program since 2001, and teaches introductory and advanced courses in academic argument, composition and creative writing. His fiction and essays have appeared in The Mid-American Review, McSweeney's, Best American Fantasy and Esopus. Fritz has drawn on his experience as an avid gamer and undergraduate instructor to generate vision statements for the design of the games in both the Storygame and BiblioBouts Projects and to ensure that game elements are designed with an eye toward their pedagogical value in new-student classrooms.

Chris Leeder is a second-year doctoral student in the School of Information at the University of Michigan. As a Graduate Student Research Assistant on the BiblioBouts Project, Chris coordinates user testing, focus groups, user surveys, and data analysis. He also produces and updates user documentation and instructional texts for the project. Prior to coming to Michigan, he earned an MSIS degree in Information Studies at the University at Albany, State University of New York. In the BiblioBouts Project, he puts to work his experience conducting user studies, developing multimedia products, and managing projects.

Gregory R. Peters, Jr. is President and Owner of Cyber Data Solutions. He has been working on web-based solutions for a variety of research projects since 1992, including the TULIP Project and the National Science Foundation-funded Science of Collaboratories Project. A member of the programming team for the NSF-funded University of Michigan Digital Library Project (1994-1998), he pioneered online systems that federate web-based search and retrieval of digitized journal articles. He holds a master's degree in software engineering from the University of Michigan. He is the principal programmer-architect of the BiblioBouts Project, responsible for game development and enhancement tasks.

Brian J. Jennings graduated in spring 2010 from the University of Michigan with a major in Computer Science and Engineering. He has been programming since 2002 and has considerable knowledge of networking and creating computer graphics. A veteran of the Storygame Project, Brian brings programming, graphics design, and gaming experience to the BiblioBouts Project.

Beth St. Jean is a doctoral candidate at the University of Michigan's School of Information. She holds a bachelor's degree in mathematics from Smith College and a master's degree in information with a specialty in library and information services from Michigan. Beth is a research assistant for the Storygame, BiblioBouts, and MacArthur-funded Credibility 2.0 Projects. She has put her expertise to work building the game's scoring algorithm. Her dissertation work consists of a longitudinal investigation into how people diagnosed with a chronic serious health condition (type 2 diabetes) look for and make use of information related to their condition.

Victor Rosenberg is an Associate Professor in the School of Information at the University of Michigan and serves as a co-principal investigator of the BiblioBouts Project. Formerly, he was chairman and CEO of Personal Bibliographic Software, and the developer of ProCite. He is the author of numerous papers, films, and software packages. He has over 25 years of teaching and research experience at the U-M, where he teaches courses in information policy, entrepreneurship, and the information industry. He taught at the University of California at Berkeley after earning his doctorate in Library Science from the University of Chicago and master's degree in Information Science from Lehigh University.

Soo Young Rieh is an associate professor in the School of Information at the University of Michigan. As a co-principal investigator of the BiblioBouts Project, she is interested in investigating students' credibility and relevance judgments as well as information literacy skills and concepts. Rieh received her Ph.D. from Rutgers University.

Geoffrey V. Carter is an Assistant Professor in the Department of Rhetoric and Professional Writing at Saginaw Valley State University. He teaches courses that explore the academic turn from print-based literacy to digital media-based electracy, and he recently co-edited the journal, Enculturation, that explores the intersection between "Video and Participatory Culture(s)." His work also appears in journals like Pre/Text, Kairos, and Computers and Composition On-Line.

Averill Packard is a Reference Librarian at the Melvin J. Zahnow Library, Saginaw Valley State University. In addition to her collection development duties for the Law, Political Science, History, Philosophy, and Geography Departments, she is responsible for library research instruction for English 111 classes and liaison areas, delivering classroom instruction and PowerPoint presentations and creating web-based LibGuides and Adobe Captivate online tutorials. Previous career-related experiences include school media specialist, online database trainer, intranet webmaster, public library reference librarian, youth service coordinator, and communications specialist.

Robert L. Frost is an associate professor in the School of Information at the University of Michigan. He holds degrees from Grinnell College and the University of Wisconsin and has taught undergraduates for over 20 years in courses ranging from French history to technology management. Undergraduate students in Bob's class played the game, evaluated its effectiveness, and suggested improvements.

Loyd Mbabu is a Senior Associate Librarian at the University of Michigan Library and a lecturer in the School of Information, University of Michigan. He holds degrees from Ohio University, University of Toronto, and University of Missouri - Columbia.

Andrew Calvetti is a User Support Specialist in the School of Information at the University of Michigan. He is also a master's student working toward a master's degree in Information with the School Library Media endorsement. Certified to teach history at the secondary school level, Andrew has taught at schools in Detroit, Ypsilanti, and Ann Arbor. He brings to the BiblioBouts Project his extensive experience and knowledge helping high school students learn technology and diagnose technology-related problems.