During July 2011...

To borrow and slightly change a line from Ben Stiller in Meet the Parents, I’d say this workshop was “strong… to very strong.” The Preservation Institute is taught by top specialists from the Library of Congress, which brought out some of its big guns for this week-long workshop. The sessions were held in the James Madison Building, with tours of the Jefferson and Adams Buildings as well.
The institute is geared toward federal institutions, but there were non-federal libraries there as well (University of Cincinnati and Cincinnati Public). The first day was great. Diane van der Reyden, Director of Preservation at the Library of Congress, discussed preservation management. Among her more interesting points: the Library of Congress is still working on digitization solutions and has no definitive method. What!? That makes me feel a lot better, as we at ZSR are still trying to improve digitization and incorporate best practices as we go. Diane said the LOC doesn’t particularly like “machine-dependent” formats, because without a specific machine, access in the future might be difficult. They have, however, come up with some ingenious ways around this problem. IRENE is a machine that reads broken or damaged vinyl, acetate discs, or wax cylinders and converts them into an image, which can then be converted into sound.
We also had presentations on pest control, the environment, and disaster recovery from Nancy Lev-Alexander and Ben Bahlmann of the Preventive Conservation section. Along the way, we got a tour of the LOC stacks (crowded with book trucks and books on the floor) and a balcony view of the circular LOC Reading Room. The Library of Congress has the same problems ZSR has: leaky pipes and a faulty HVAC system. Day one finished with a hands-on disaster recovery exercise with Alan Haley, Senior Rare Books Conservator, and Andrew Robb, coordinator of the LOC’s Emergency Response Team.

Day two featured a group of presentations on collections care, library binding, fundraising, and exhibits. We got a behind-the-scenes look at the Collections Care lab and various enclosures, such as one for a piggy bank from Lexington, NC. A new Civil War exhibit called The Last Full Measure: Civil War Photographs (http://myloc.gov/exhibitions/civilwarphotographs/pages/default.aspx) features hundreds of ambrotypes and tintypes of soldiers. The exhibit is very poignant and has the feel of a shrine. An exhibit specialist described mounting The Last Full Measure and showed us samples of book cradles and document mounts we could make ourselves.

Day three saw us tackling digitization. We began with a presentation from the managers of the LOC remote storage facility at Fort Meade, Maryland. They have about six high-bay facilities, each larger than ZSR’s. We followed with a visit to the LOC Internet Archive project, where 10 Scribe book scanners churn out digital copies of pre-1922 titles from the general collections. So far, they’ve scanned over 93,000 titles. It was an amazing operation to see. They confirmed that standards for resolution are in flux, something we’ve begun to realize at ZSR. They’ve abandoned uncompressed TIFF files (a heresy, they admitted, but they insist it is a good heresy) in favor of RAW files, which they convert to the JPEG 2000 format. In the afternoon, we had presentations about rare book conservation and visited their lab, where we saw samples of supported (case-bound) and unsupported (Ethiopian) binding styles, leaf casting, and numerous other unique treatments.
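For a sense of why uncompressed TIFF masters become a storage problem at this scale, here is a back-of-the-envelope sketch in Python. The page size, resolution, pages per title, and 10:1 compression ratio are all illustrative guesses on my part, not LOC figures:

```python
# Back-of-the-envelope storage math behind dropping uncompressed TIFF.
# Page size, resolution, pages per title, and the 10:1 compression
# ratio are illustrative guesses, not LOC figures.
width_in, height_in, dpi = 6, 9, 400       # one scanned page
channels, bytes_per_sample = 3, 1          # 24-bit RGB

pixels = (width_in * dpi) * (height_in * dpi)
tiff_bytes = pixels * channels * bytes_per_sample   # uncompressed master
jp2_bytes = tiff_bytes // 10                        # ~10:1 JPEG 2000

pages = 93_000 * 300                                # titles scanned x pages per title
tiff_tb = pages * tiff_bytes / 1e12                 # terabytes, uncompressed
jp2_tb = pages * jp2_bytes / 1e12                   # terabytes, compressed
print(round(tiff_tb), "TB vs", round(jp2_tb), "TB")
```

Even with generous guesses, the gap is hundreds of terabytes, which makes the “good heresy” easy to understand.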

Day four focused on paper and photographic conservation. We heard a presentation from Susan Peckham, Paper Conservator in the Conservation Division. Susan discussed paper history, paper composition, sizing, inks, and printing methods. She also described problems with paper-based collections such as foxing, light damage, ink burn-through, and the “inherent vice” of the wood-pulp papers manufactured from 1850 to 1900. We also heard from Dana Hemmenway, Senior Photographic Conservator. Dana described the structure of photographs and gave a history of photo types (daguerreotypes, ambrotypes, tintypes, and paper photographic prints). She had examples of paper photo prints as well: salted paper prints, albumen prints, collodion, and silver gelatin prints. We also discussed their hazards and inherent problems over time. In the afternoon, both Susan and Dana held problem-solving sessions in the paper lab.
During the paper session, I was at a station with a 1780 letter written to George Washington. I was amazed at its great condition (which had been prolonged by a Library of Congress repair made a hundred years ago called “silking”).

The final day I heard a talk on digital preservation from Leslie Johnston, Chief of Repository Development at LOC. Leslie defined digital preservation as “the broad range of activities meant to extend the usable life of machine-readable computer files.” She was quick to point out the difference between digitization, an activity primarily focused on access, and digital preservation. She made the pitch to replicate files in geographically dispersed locations, and on different storage media and systems, in order to give these files the best chance to survive. Leslie was all business. They use JHOVE for format validation in the scanning and preservation of files. Leslie also outlined how the University of Maryland is spearheading an area called “digital forensics” at its center, which ensures the security of digital materials as they are transmitted. Matthew Barton from the Packard Campus of the National Audio-Visual Conservation Center in Culpeper, VA spoke on audio and sound preservation. Matthew’s presentation was delightful in describing their attempts to conserve audio recordings on wax cylinders, lacquer discs, wire recordings, tape recordings, and digital audio tape.
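Leslie’s advice to replicate files in dispersed locations only works if you can tell when a copy has silently changed, which is where checksum-based fixity checking comes in. Here is a minimal sketch in Python; the replica names and file contents are invented for illustration:

```python
import hashlib

def fixity_digest(data: bytes, algorithm: str = "sha256") -> str:
    """Checksum used to verify that a stored file has not changed."""
    h = hashlib.new(algorithm)
    h.update(data)
    return h.hexdigest()

def verify_replicas(replicas: dict, expected: str) -> list:
    """Names of replicas whose checksum no longer matches the master's."""
    return [name for name, data in replicas.items()
            if fixity_digest(data) != expected]

# A master file and two geographically dispersed copies; the copy at
# "site-b" differs slightly, simulating silent corruption.
master = b"scanned page, TIFF bytes ..."
replicas = {"site-a": master, "site-b": b"scanned page, TIFF bytes .,."}
print(verify_replicas(replicas, fixity_digest(master)))
```

Run on a schedule against every copy, this is the basic loop that lets a repository repair a corrupted replica from a good one.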
Amy Gailick, also from the Packard Campus, presented on moving image preservation and care. She described the film types and the very advanced cold storage area at the Packard Campus in Culpeper, VA. They have storage areas for some film materials that are held at 25 degrees.

In the afternoon, I met the closest thing to a preservation rock star there is: Dr. Fenella France. Dr. France uses hyperspectral imaging to examine documents and highlight early versions or corrections. She has isolated a change Thomas Jefferson made to the rough draft of the Declaration of Independence, among many other discoveries, such as Abraham Lincoln’s thumbprint on the Gettysburg Address. We also got a tour of the mass deacidification unit. This unit takes acidic books and ‘washes’ them in an alkaline solution that neutralizes the acidic materials in the paper. They were washing comic books the day we were there! As we were leaving the lab, we paused in the hallway by a bunch of old card catalog units. I glanced down and recognized a familiar last name… WOW!

One thing I was in awe of was the beauty of the Jefferson Building. Every inch is covered with hand-painted niches, tiled vaults, coffers with gold leaf, stained glass, sculpture and even WPA paintings.
What were the take-aways? I was able to have some of the staff read the ZSR Disaster Plan and COOP Plan and give constructive feedback. A discussion with the Internet Archive manager confirmed that the standards for digitization are in flux (something we suspected). Digital preservation will become more important for all institutions as born-digital material accumulates. I learned some easy-to-do techniques for preserving photographs and paper materials and got some great ideas for exhibits. And… I’m glad I don’t live in Washington!

This weekend, I had the pleasure of giving a talk at the American Association of Law Librarians in Philadelphia, PA. Some of you may remember that my last trip to Philly resulted in theft of my phone :( so I exercised my best big-city behavior this time and kept my phone in my pocket – except for a few pictures.

What amazed me about AALL was how it is a highly focused ALA. The vendor hall, as you might expect, is focused very much on law librarians but I did get a chance to connect with a few scanner vendors to talk about their work with ILL and E-Reserves software. I also managed to run into a number of our colleagues from WFU and a few people that I have met at other conferences!

On Sunday I shared the stage with Andrew Pace from OCLC and Roy Balleste from the St. Thomas Law Library. It was interesting to hear from both Andrew, who discussed OCLC services as they related to cloud computing, and Roy, whose library has adopted the OCLC Web-Scale product. There was considerable interest in the audience and I was reminded how important continuations were to law libraries when the first question focused on this issue.

On a side note, I had a chance to attend the Voyager Law Users Group meeting while there and got some interesting information about the new mass data change features in Voyager 8, and heard about where Voyager libraries think they are headed (ILS-wise) in the coming years. Too much detail for this post, but if you are interested, stop by!

Today Lynn, Carolyn, Tim, Steve, Susan, Leslie, Jean-Paul, Kevin and Erik got together to attend the ASERL webinar on discovery services. We heard from Wally Grotophorst at George Mason University and Marshall Breeding at Vanderbilt.

Wally talked about the George Mason University experience with Aquabrowser. He discussed some approaches to cross-data indexing, including just-in-time solutions (e.g., Metalib, Deep Web), hybrid systems (Primo, Encore, EDS, OPAC), and just-in-case solutions (e.g., Summon).

He provided an overview of different perspectives on “just in case” solutions and pointed out that these systems can lead users to approach the system from a perspective that assumes “If we don’t have it, you probably don’t need it.” Another interesting (adapted) quote was: “The value of Summon is inversely proportional to the sophistication of your researcher.”

Wally did a great job of looking at the user experience in products like Summon and comparing how libraries are finding ways to bring the benefits of JIC search systems while not losing the value of their catalog-based discovery layers (e.g. Villanova).

Marshall approached the issue by talking about different types of search methods: database-specific, federated, and discovery products (defined as any system designed to locally index a wide variety of data). He reviewed approaches and data models for centrally indexed discovery products (both local and web-scale) and touched on some of the changes in the ILS that the growth of e-books is likely to bring (e.g., a decreased role for circulation and discovery). Marshall suggested that next-generation ILSs may include tighter integration between back-office systems and discovery layers. This is something the industry is already seeing with OCLC’s Web-Scale management system.

On Wednesday morning, five of the members of ZSR Library’s leadership group (Lynn, Wanda, Lauren C., Mary Beth and Susan) left Winston-Salem and headed to Beaufort, NC for the first ever ZSR Library Leadership Retreat. Although we all work closely together and meet regularly through Administrative Council and other in-library venues, we had never had an opportunity for extended, uninterrupted time to reflect, explore and plan. Those of you who have participated in retreats yourselves know that it is a different experience when you have the chance to remove the inevitable interruptions that happen when meeting in the library or even on campus. It also provides an environment that is conducive to developing new appreciations of your fellow retreat participants because you have the time for deeper conversations that bring new understandings of each other!

Wednesday evening, after a day of travel and settling in (and a lovely harbor tour of Beaufort), we started the retreat by comparing the strengths we had identified prior to the retreat. No, we didn’t just ask around to pinpoint our strengths! Lynn provided each of us with the book StrengthsFinder 2.0 by Tom Rath. Rath proposes that it is a waste of energy to try to correct someone’s weaknesses, and that there is more potential for growth in developing people’s strengths. He identified 34 of the most common talents and developed an instrument to help people identify and describe their strongest 5 talents from this list (along with action ideas and tips for how to work with people who have each type of talent). Each of us came to the session with our list of 5 talents. We compared where we had overlap and uniqueness, and which talents our small leadership group is missing. I’ll share my 5 with you, and leave it to the others to share theirs :-)

My top five are:

Achiever (these people have a great deal of stamina and work hard, like to be busy and productive)

Maximizer (these people focus on people’s strengths to stimulate personal and group excellence)

Arranger (these people organize but also have a flexibility that complements this ability. They like to figure out how all the pieces and resources can be arranged for maximum productivity).

Focus (these people can take a direction, follow through, and make the corrections necessary to stay on track. They prioritize, then act).

Thursday was a full day of sessions that ranged from “Year in Review” to “Planning for our Future.” Friday morning was a final session to wrap up and identify action items. You can see from the picture above that we settled in for our discussions (in spite of the perfect weather that was like a siren calling to us) and took turns as “scribe” (MB is the scribe in this picture) so that all the discussion and brainstorming were captured. We are compiling all of the information, ideas, blue-skying, and prioritization into a single document that will help us move forward, share with the library at large, and maximize the efforts we made over the three days.

Retreats aren’t all about work; the team-building aspect of a retreat is almost as important. Lynn built in activities that gave us the time to interact in a more social way by touring Beaufort, dining at wonderful Beaufort restaurants, and even getting in a few “summer Olympic” workouts with kayaks and bicycles! I did my best to capture the essence of the experience with my camera, and the results are available on my Flickr site in my “ZSR Leadership Retreat Beaufort 2011” set.

This afternoon, I participated in the OCLC Resource Sharing User Group Meeting, which was facilitated by OCLC and Atlas staff. The webinar was a follow-up to the User Group Meeting at ALA, but several new tools were introduced. One of the most relevant is the Lender String Report. This assessment tool is used to evaluate lender selection in the lending string (a lending string comprises five lending libraries, which are manually selected by ILL staff or automatically selected by OCLC if the patron submits a Direct Request). The Lender String Report, in conjunction with the Reciprocity Report, analyzes the number of times a lending library says “yes” or “no” to a request. It also calculates the average fill time for that institution. We can use the information to promote or demote libraries in our custom holdings, which will [hopefully] allow for expedient arrival, processing, and delivery of requested materials.
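The arithmetic behind such a report is simple enough to sketch. This hypothetical Python example computes a lender’s fill rate and average fill time from a list of past requests (the lender symbols and numbers are made up, not real OCLC data):

```python
from statistics import mean

# One row per past request: (lender symbol, filled?, days to fill or None).
# Symbols and numbers are invented for illustration.
requests = [
    ("ABC", True, 4), ("ABC", True, 6), ("ABC", False, None),
    ("XYZ", True, 10), ("XYZ", False, None), ("XYZ", False, None),
]

def lender_stats(records, symbol):
    """Return (fill rate, average days to fill) for one lender."""
    rows = [r for r in records if r[0] == symbol]
    filled = [r for r in rows if r[1]]
    fill_rate = len(filled) / len(rows)
    avg_days = mean(r[2] for r in filled) if filled else None
    return fill_rate, avg_days

rate, days = lender_stats(requests, "ABC")  # fills 2 of 3, ~5 days on average
```

A lender with a high fill rate and a short average fill time would be promoted in custom holdings; one that usually says “no” would be demoted.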

Another feature introduced was OCLC’s new Article Exchange, which is a cloud-based document delivery service. Article Exchange does not require special hardware or proprietary software, and file size is theoretically not an issue. Lending libraries upload material to the cloud, which generates a unique TinyURL and password; staff are able to copy and paste the URL and password into an email or into the borrowing notes field on OCLC as a means of delivery. Once viewed, the article is live in the exchange cloud for five days. If the article is never viewed, it has a 30-day “cloud life.” According to OCLC staff, Article Exchange can probably be incorporated into ILLiad as an Addon. There will be no additional charge for this feature. For those who are interested, the URL is: http://experimental.worldcat.org/AE.
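The stated retention rules (live for five days once viewed, thirty days if never viewed) are easy to express in code. A minimal sketch, assuming those two rules are the whole policy:

```python
from datetime import datetime, timedelta

VIEWED_LIFE = timedelta(days=5)     # once the article has been viewed
UNVIEWED_LIFE = timedelta(days=30)  # if it is never viewed

def is_expired(uploaded_at, first_viewed_at, now):
    """Apply the two stated retention rules to one shared article."""
    if first_viewed_at is not None:
        return now > first_viewed_at + VIEWED_LIFE
    return now > uploaded_at + UNVIEWED_LIFE

up = datetime(2011, 7, 1)
# Never viewed: still live 19 days after upload.
assert not is_expired(up, None, datetime(2011, 7, 20))
# Viewed on July 3: expired by July 10, five days past the view.
assert is_expired(up, datetime(2011, 7, 3), datetime(2011, 7, 10))
```

The practical upshot for ILL staff is that the borrowing library needs to retrieve the file promptly once the link is sent.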

OCLC is also working to migrate Resource Sharing off the First Search platform. They are working with several libraries, who have volunteered to participate in the testing phase of this process. At this time, they project several major developments for ILL operations, including an expanded “buy-it” option, incorporating a variable aging of requests (which would depend on a lending library’s projected time frame to fill a request), and the creation of a dynamic lending string. By May of 2012, they hope to incorporate patron notifications, as well as the ability to alter staff roles and permissions, in the new platform.

I’m a bit late in writing up my report about the 2011 ALA in New Orleans, because I’ve been trying to find the best way to explain a statement that profoundly affected my thinking about cataloging. I heard it at the MARC Formats Interest Group session, which I chaired and moderated. The topic of the session was “Will RDA Be the Death of MARC?” and the speakers were Karen Coyle and Diane Hillmann, two very well-known cataloging experts.

Coyle spoke first, delivering a devastating critique of the MARC formats. She argued that MARC is about to collapse due to its own strange construction, and that we cannot redeem MARC, but we can save its data. MARC was great in its day, a very well developed code for books when it was designed. But as other material formats were added, such as serials, AV materials, etc., additions were piled on top of the initial structure. And as MARC was required to capture more data, its structure became increasingly elaborate and illogical. Structural limitations of the MARC formats required strange work-arounds, and different aspects of MARC records are governed by different rules (AACR2, the technical requirements of the MARC format itself, the requirements of ILSs, etc.). The cobbled-together nature of MARC has led to oddities such as publication dates and language information being recorded in both the (machine-readable) fixed fields of the record and the (human-readable) textual fields. Coyle further pointed out the oddity of the 245 title field in the MARC record, which can jumble together various types of data: the title of a work, the language, the general material designation, etc. This data is difficult to parse for machine processing. Although RDA needs further work, it is inching toward addressing these sorts of problems by allowing for the granular recording of data. However, for RDA to fully capture this granular data, we will need a record format other than MARC. To help develop a new post-MARC format, Coyle has begun a research project to break down and analyze MARC fields into their granular components. She began by looking at the 007/008 fields, finding that they have 160 different data elements, with a total of 1,530 different possible values. This data can be used to develop separate identifiers for each value, which could be encoded in a MARC-replacement format.
Coyle is still working on breaking down all of the MARC fields.
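To see why a field like the 245 is hard for machines, here is a toy example. The string below is not real MARC (the binary format uses delimiter bytes, not literal dollar signs), but it shows how title, general material designation, and statement of responsibility end up jumbled into one field and must be teased apart by subfield code:

```python
# Not real MARC: "$" stands in for the subfield delimiter here.
# $a = title, $h = general material designation, $c = statement of responsibility.
field_245 = "$aThe war of the worlds$h[sound recording] /$cH.G. Wells."

def parse_subfields(field):
    """Split a flattened variable field into a {code: value} dict."""
    parts = [p for p in field.split("$") if p]
    return {p[0]: p[1:].strip(" /") for p in parts}

subfields = parse_subfields(field_245)
# Three different kinds of data were living in one field:
# subfields["a"] (a title), subfields["h"] (a carrier type),
# subfields["c"] (a related-agent statement).
```

Even this cleaned-up version leaves ISBD punctuation embedded in the values, which is exactly the kind of entanglement Coyle was describing.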

After Karen Coyle, Diane Hillmann of Metadata Management Associates spoke about the developing RDA vocabularies, and it was a statement during her presentation that really struck me. The RDA vocabularies define a set of metadata elements and value vocabularies that can be used by both humans and machines. That is, they provide a link between the way humans think about and read cataloging data and the way computers process cataloging data. The RDA vocabularies can assist in mapping RDA to other vocabularies, including the data vocabularies of record schemas other than the MARC formats. Also, when RDA does not provide enough detailed entity relationships for particular specialized cataloging communities, the RDA vocabularies can be extended to detail more subproperties and relationships. The use of RDA vocabulary extensions means that RDA can grow, and not just from the top-down. The description of highly detailed relationships between bibliographic entities (such as making clear that a short story was adapted as a radio play script) will increase the searching power of our patrons, by allowing data to be linked across records. Hillmann argued that the record has created a tyranny of thinking in cataloging, and that our data should be thought of as statements, not records. That phrase, “our data should be thought of as statements, not records,” struck me as incredibly powerful, and the most succinct version of why we need to eventually move to RDA. It truly was a “wow” moment for me. Near the end of her presentation, Hillmann essentially summed up the thrust of her talk, when she said that we need to expand our ideas of what machines can and should be doing for us in cataloging.
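Hillmann’s “statements, not records” idea can be illustrated in a few lines of Python. Each fact stands alone as a subject-predicate-object statement, so a query can follow links across what MARC would treat as separate records (the identifiers and predicate names here are invented for illustration):

```python
# Bibliographic facts as subject-predicate-object statements rather than
# self-contained records. Identifiers and predicate names are invented.
statements = [
    ("work:short-story-1", "hasTitle", "The Lottery"),
    ("work:radio-play-1", "hasTitle", "The Lottery (radio script)"),
    ("work:radio-play-1", "isAdaptationOf", "work:short-story-1"),
]

def objects(stmts, subject, predicate):
    """All objects asserted for a given subject and predicate."""
    return [o for s, p, o in stmts if s == subject and p == predicate]

# Follow the adaptation link across what MARC would treat as two records.
source = objects(statements, "work:radio-play-1", "isAdaptationOf")[0]
source_title = objects(statements, source, "hasTitle")[0]
```

Because the adaptation relationship is its own statement rather than a note buried in one record, a search for the short story can surface the radio play, which is the searching power Hillmann was pointing to.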

The other session I went to that is really worth sharing with everybody was the RDA Update Forum. Representatives from the Library of Congress and the two other national libraries, as well as the chair of the PCC (Program for Cooperative Cataloging), discussed the results of the RDA test by the national libraries. The national libraries have requested that a number of problems in the RDA rules be addressed over the next eighteen months or so. LC and the other national libraries have decided to put off implementing RDA until January 2013 at the earliest, but all indications were that they plan to adopt RDA eventually. As RDA is revised, the national libraries are working to move to a new record format (aka schema or carrier) to replace the MARC formats. They are pursuing a fairly aggressive agenda, intending, by September 30 of this year, to develop a plan with a timeline for transitioning past MARC. The national libraries plan to identify the stakeholders in such a transition and want to reach out to the semantic web community. They plan for this to be a truly international effort that extends well beyond the library community as it is traditionally defined. They plan to set up communication channels, including a listserv, to share development plans and solicit feedback. They hope to have a new format developed within two years, but the process of migrating their data to the new format will take at least several more years after that. Needless to say, if the library world is going to move to a post-MARC format, it will create huge changes. Catalogs and ILS systems will have to be completely re-worked, and that’s just for starters. If some people are uncomfortable with the thought of moving to RDA, the idea of moving away from MARC will be truly unsettling. I for one think it’s an exciting time to be a cataloger.

I felt I had a very productive conference at ALA Annual this year. Once again, the conversations with vendors were the best part. I stayed very busy and came home exhausted.

I’m currently on two ALCTS committees: the Acquisitions Section Technology Committee and the ALCTS Task Force on Transforming Collections. The Transforming Collections meeting was covered on the American Libraries ALA Membership Blog [http://americanlibrariesmagazine.org/ala-members-blog/new-transforming-collections-task-force-outcomes]. As the report says, the task force “is interested in reexamining how we define collections and approach collection management in the future.” I also attended (and served as a volunteer at) the “ALCTS 101” session Friday night. Rather than have a representative from each Section speak to the whole group, the meeting was set up as “speed networking”; each Section had a table, and participants would choose a table, sit and talk with the Section rep and others there for 5 minutes, then move to another table. I met lots of people and thought the speed networking worked very well.

Most of the regular sessions I attended were somewhat disappointing. I found that the descriptions often didn’t quite match the content. For example, I really looked forward to a session called “Implementing and Managing Webscale Discovery Services: Implications for E-Resources Librarians,” but it ended up being just another “here’s the decision process we went through, and this is the decision we made” presentation.

One session that started out disappointing but got better was called “Getting on Track with Tenure.” This panel discussion started as a discussion of pros/cons of faculty status for librarians, but did include (as advertised) some tips for the research/publication/advancement process. Some examples:

create a coherent research record (what is your area of specialty?);

don’t let Service get in the way of Research;

identify your best setting (time/place/atmosphere) for research, and build it into your schedule;

have benchmarks, map out where you want to be in ___ years;

own your research, but be open to criticism;

read more of the research published in your field;

don’t compare yourself to others (expectations can be very different);

don’t wait!

The Publisher-Vendor-Library Relations Interest Group was the last session I attended, though I wish it had been the first. It was a very good, frank discussion of specific challenges that all players in the e-book market are facing. The three panelists were from YBP, Project MUSE, and the Univ. of North Texas Library. Michael Zeoli (YBP) said that only 20% of approval titles are available simultaneously in print & as e-books, even fewer are available for demand-driven acquisition, and fewer still for consortial purchase. He spoke about the proliferation of e-book platforms, and how even when publishers do make e-books available through aggregators (EBL, ebrary, etc.), they often sell different titles through different aggregators. He also showed data that backlist e-book titles are seeing high use.

Melanie Schaffner, from Project MUSE, described some of the challenges of their venture to offer e-books from various university presses on the Project MUSE platform: getting timely & consistent metadata from publishers; dividing content into subject-based collections (granularity of subject areas, what number of titles in a collection is appropriate, etc.); how to price a subject collection; and institutional customization (most existing Project MUSE customers subscribe to all their journal content, so the platform isn’t currently set up for customized collections).

Beth Avery (Univ. of N. Texas) presented 41 “theses” (she said she was tempted to nail them to the door) of problems with the e-book market. The overarching idea was that now, while the simultaneous print & electronic availability is at 20%, is a great opportunity; we should work with suppliers now to shape the market, not wait until we get to 85-100% saturation. A few other highlights:

What is the unit of transaction? Our users deal in articles/chapters, but publishers/vendors/libraries deal in journals/books;

archiving – How long will publishers keep an e-book in inventory? How compatible will file formats be in 10, 20, 50, 100 years?;

How do we assess the vendor we’re working with? What do we measure? Should university accounting & legal offices fit into the assessment of a vendor? How?

I also attended a few presentations sponsored by vendors. EBSCO sponsored a luncheon specifically targeting e-resource management, but it was essentially a (too) long sales pitch. But other presentations on specific new products were well worth the time. I got a lot of information about the forthcoming E-books on Project MUSE product, and I’m fairly excited to see it roll out. It won’t be perfect, but I think the vision they have is looking in the right direction; for example, while initially the e-books must be bought in collections, by 2013 they hope to be able to offer purchase of individual titles through book vendors like YBP. Commenting on the many issues that must still be worked out, the Director of Project MUSE remarked, “this is 1998 all over again.” I also attended a session on Thomson’s upcoming Book Citation Index on the Web of Science. This will be analogous to their familiar journal citation products, but users will be able to search citations from books, journals, and conference proceedings simultaneously. They expect to have over 30,000 books indexed by the Dec. 2011 launch, and add about 10,000 per year. I do expect it will be a valuable product, but my thought upon leaving the presentation was, “Well, that’s a few more databases we’ll have to cancel to cover the cost…”

My time in the exhibits was more piecemeal than last year, so I didn’t have as much time to wander and explore new products. But I did make a list ahead of time of the vendors I wanted to be sure to talk to, and I felt that my time there was well spent (I ran into two BYU colleagues who were there with an Exhibits Only registration, something I might consider in the future). I had some good conversations with reps at the EBSCO, Overdrive, 3M, Springer, and Palgrave Macmillan booths about their respective e-book platforms and purchase models. I tried to explain why the single-user/unlimited-user dichotomy does not serve us well, and urged all of them to explore other models. I received the usual push-back from Springer when I brought up single-title purchasing (they claimed they wouldn’t be able to make money that way – hmph!). But Steve O’Dell from EBSCO told me that they are developing an e-book option where a library could buy a single “copy” of an e-book, but then lease more “copies” short-term if they knew, for example, that a class was going to need to use it. I also spoke with Drew Watson, product manager for EBL; it’s nice to deal with a company that is still small enough that I could tell Drew “this is a question that came up” and have him make a note and say “I should be able to fix that.”

This is my last ALA Conference post, and it could have just as well been called Connecting and Working.

This was the best ALA conference for me since Chicago, or maybe ever. Part of it was the chance to be involved with things I care a lot about. But a larger part was the people. I love conferences and look forward to them for the chance to see friends that only cross paths every six months. I try to catch up with folks for lunch, over dinner, in meetings, and at happy hours. I had some great conversations and some left me ready for Midwinter already.

Several of these conversations generated projects that I think will be fun to do, have direct benefits for my job, and could be useful for the field.

One of these projects will be based on web design and has been brewing since before Leif was born. Both of us are now in a place to work on the project, so we were able to hammer out a few next actions to get the ball rolling.

Another came up over coffee when I was talking about reframing how to think about reference based on experiences I’ve had as an interdisciplinary liaison. My co-conspirator agreed and had been thinking of something similar, so we’re going to do a project around it.

Several of us talked about pooling perspectives to do a piece on effective webinar design. We can tackle it from the angles of best practices, instructional design, and as practitioners who have both given and participated in webinars.

So all in all, a lot of good projects will come out of this relatively short time.

This is something I mulled over when I saw tweets from really good programs. I worried I wasn’t getting enough out of the conference for Wake Forest. But then I realized what I was getting out of this particular conference were projects that will inform the work I do at Wake just as much as if I were to sit in on a session and learn about what others have done. And the meetings that spurred these projects wouldn’t have happened if I weren’t at ALA.

The hardest thing about coming back from ALA is having a week of catch up to do, and THEN to do the ALA work. I’m going to do everything in my power not to drop the ball on these projects, though, since I feel really good about the potential outcomes.

ALA was an incredible conference for me this year, from the events I did attend to the meetings I participated in to the conversations and the long term projects that will come out of it. But for now, I have to get through this inbox so I can act on all these new ideas!

As a member of ACRL’s Anthropology and Sociology Section (ANSS), I attended several ANSS-sponsored events in New Orleans. The ANSS social was held at Lucy’s Retired Surfers Bar and Restaurant, where I met and dined with other anthropology and sociology librarians. I also went to the Anthropology Librarians Discussion Group, where topics such as institutional repositories, open access projects, communications technology for anthropology (i.e. social networks, blogs, etc.), and ordering e-books were discussed. Many of the librarians in attendance said that while e-books are popular with the students they serve, faculty requests for e-books are not forthcoming. Possible reasons given included that some e-books don’t contain graphics and that there is little e-book publishing in the discipline. I have really benefited from being a member of ANSS: I have learned more about the discipline of anthropology, its resources and issues, and networked with other anthropology librarians.

This ALA, I also began my two-year appointment as a member of the ANSS Subject and Bibliographic Access Committee. Each month a committee member answers a different question on cataloging issues (e.g. subject headings, name authorities, etc.) and policies, and a list of new LC subject headings in the social sciences is posted. For October, I will be writing on social tagging, its use, and whether it enhances searching in library catalogs. I am very excited to be on this committee because it is directly related to my primary work responsibility, cataloging, and because it will enhance my work and knowledge as the liaison to Wake’s anthropology department. My fellow committee members seemed excited that I am now a part of the committee as well.

RDA, Cataloging and Classification Research, Value of Grey Literature, and 21st Century Scholarly Communication were some of the other topical ALA sessions that I attended.

“Will RDA Kill MARC?” was the title of a panel discussion put together by ZSR’s Steve Kelley. Panel speakers Karen Coyle and Dianne Hill made some very philosophical and thought-provoking statements about the benefits to catalogers and the library world of abandoning MARC and embracing RDA.

Coyle stated, “RDA is a savior and an opportunity to save library data. We can’t redeem MARC, but we can rescue its content.” She pointed out that MARC contains mixed data, administrative (e.g. OCLC record number) and nonadministrative (e.g. bibliographic fields), and that the rules in MARC are not coordinated with cataloging rules. Some MARC data appears in more than one place, which demonstrates librarians’ ingenuity in getting around MARC’s inflexible structure by making nonrepeatable fields repeatable. The goal for library data should be data independent of its structure: we should be able to code once and display many times.

Regarding library data, Hill believes, “We need to stop trying to control it all!” We should let others do what they want with our data; they will not be compromising its integrity.

At the Cataloging and Classification Research Interest Group, UNC SILS professor Jane Greenberg discussed research blitzing as a way to share and motivate cataloging research. Her UNC SILS students meet in a social setting, and each gives a five-minute presentation on their research. Afterwards, students are able to dialogue with their peers about the research currently being conducted in the SILS program.

Because I give a lecture on grey literature in LIB210, I chose to attend the “Grey Literature in the Digital Age” session with speakers Richard Huffine, Director of the Libraries Program at the U.S. Geological Survey and Wayne Strickland of the National Technical Information Service, an agency of the Department of Commerce.

Grey literature is information produced by government, academia, business, and industry, but it is not controlled by commercial publishing interests, and publishing is not the primary activity of the producing organization.

Today findability is no longer the driving challenge; reputation is the key. Is the information trustworthy, citable, peer reviewed? Is access persistent? The digital age has thrown the definition of “published” into chaos. Will what’s available today be here tomorrow? Examples of grey literature whose persistent access is questionable include pre-prints, blogs, preliminary research results (open files), project web sites (schedules), IRs, and data archives. Findability relies on cited references in journal articles, IRs, authors’ CVs, and the few good aggregators that seek it out.

Copyright of grey literature can be even more complex. Some creators want their materials used. Some sources, like materials from the U.S. federal government, are inherently in the public domain. If the status is unknown, copyright should be assumed. Both authors and the organizations for whom they work can claim copyright of works. Creative Commons licenses are being used in some domains.

Grey literature has its place, and it’s here to stay. It may not stand alone, but it can contribute substantially to understanding scientific challenges. Every source should be considered in the exploration of an issue; in some domains, the best source of information may be grey. Some grey literature goes through review as stringent as (or more stringent than) commercially published content. Its value will always be a mixed bag, and there are risks involved in citing it. Libraries have to be involved in identifying and defining its value. Social tagging could be used to help people assess the validity of grey literature.

Mr. Huffine stated that HathiTrust is considering open membership. LC is a partner in HathiTrust and is trying to get other federal agencies included. He said the USGS wants to get involved, but it also wants to raise the quality of images, dpi, etc. Many of their maps are multi-page foldouts, and line widths on maps are very important to geologists.

The final session I attended was a panel discussion on 21st century scholarly communication. One of the panelists discussed the role of subject liaison librarians in this area. She recommended “keeping your ear to the ground”–know your faculty, their interests and projects, tenure/promotion process at one’s university, open access policies of faculty’s professional associations and organizations. It is also important to know what one’s individual library is trying to accomplish in the area of scholarly communication. One continuing challenge pointed out by Marty Brennan of UCLA’s Copyright Office is convincing scholars that the virtue of OA publishing should outweigh their need to submit to the highest impact journal in their field. The last speaker was a grad student who talked about starting a transdisciplinary OA journal and the difficulties encountered in finding good reviewers and in receiving good scholarly paper submissions.

ALA in New Orleans was fabulous! Informative sessions, delicious food, and a great time exploring the city by bicycle and hanging out with colleagues.

Much of my ALA experience in New Orleans can be summarized in a list of “firsts”:

- first time at an ALA Annual conference (Midwinter was my first ALA anything!);
- first time presenting at a poster session (which I did twice!);
- first time attending a full ACRL Scholarly Communications Committee meeting (long (4 hours) but fascinating);
- first time the ACRL Scholarly Communication 101 Road Show program presenters met in person (for a very productive day-long curriculum planning retreat);
- first time touching an alligator (just a baby one…);
- first time eating a beignet (YUM);
- first time in NOLA (amazing).

As I continue to be involved with ALA and ACRL, I know that many of these firsts will be followed by seconds, thirds and so on, but happily they all combined this year for an overwhelmingly positive conference experience!

The most important reason I attended this ALA (and no, it wasn’t food-related, Facebook photos notwithstanding) was to complete my participation in the Emerging Leaders (EL) program. On Friday, our EL class had a morning leadership training and assessment session, followed by a lunch and “graduation” ceremony, then concluded the program with our afternoon poster session. The posters were outcomes of our six months’ project work, and I thought that all 16 teams did a great job designing interesting, engaging posters. As you might expect, there were a lot of QR codes around (our team used one), as well as Mardi Gras beads, buttons and other goodies to entice people to visit teams’ posters and learn more about our projects.

My EL project team (Team L) was tasked this past winter and spring with completing a webinar series feasibility study for the ALA Learning Round Table (LearnRT) board, which is launching a new Webinar Learning Series later this year. We began our research by identifying many webinar and e-learning series currently and previously in place, and then selecting 16 to assess on a number of criteria that included cost, timing, theme, platform, and promotion. After analyzing the collected data, we contacted organizers of 10 webinar series asking them to complete a detailed webinar assessment survey. The aim of the survey was to gain insight into elements of webinar series production that are less easily quantified or not publicly available (e.g., best practices, planning timelines, number of registrants or attendees, etc.). From the data we gathered and from that provided by the survey respondents (50% response rate), we drafted a report for the LearnRT board that outlines best practices, issues and implications for success, and recommendations for the new webinar series. We submitted our report to the LearnRT board prior to ALA, and several board members came to our poster session to thank us in person for doing a great job. If interested, you can read our report at our team’s ALA Connect page (and certainly ask me questions!).

As mentioned above, the EL poster session was just one of the two sessions at ALA where I presented our EL project poster. On Sunday afternoon, we rolled out our poster again at the annual LearnRT training showcase. Although we had lots of traffic and interest at Friday’s EL poster session, including Lauren P. and Steve, the folks we chatted with on Sunday were a more targeted audience, asking in-depth questions about our research and findings. I was very glad that LearnRT invited us to participate in the showcase, and I look forward to the launch of the Webinar Learning Series as an outgrowth of our EL work.

Much of the rest of my ALA was spent at various scholarly communication-related presentations and meetings. The Road Show planning retreat was held on Thursday, the day before the conference started, so by the time Saturday morning rolled around, I’d already had two full days of activities. I did not slow down much over the weekend, but fortunately didn’t have to dash between the convention center and hotels too often. I attended sessions covering library-university press partnerships, the Berlin Declaration on Open Access and the Berlin9 conference (to be held in the US for the first time this fall), and two on how best to advance scholarly communication and open access conversations on campuses. I was *thrilled* to attend one session on these issues sponsored by someone other than SPARC, the ACRL SC Committee, or the ALCTS SC Interest Group: on Monday morning, the ACRL Women and Gender Studies Section organized a panel addressing “21st Century Scholarly Communication: Conversation for Change.” I was very impressed by one panelist who unabashedly owned the failure of an open access journal she launched as a graduate student; I believe that success and failure must both be acknowledged, as both provide learning opportunities.

I’ve returned from ALA where I “emerged” officially, ate too much phenomenal food and became smitten with NOLA, but most importantly with lots of interesting ideas bouncing around my brain, not the least of which are some programming ideas for Open Access Week in October. Stay tuned!