Because we employed lightning talks and approached the topic broadly, discussions were more exploratory than definitive. As befits a discipline in the throes of formation, a certain element of confusion and chaos attended the day. But my overall sense is that a group of smart, experienced people worked hard to share information about disparate efforts, and to integrate them into a regional and even national conversation. More work, more focus, and still broader participation are needed, but we built on good work already underway.

One strand of conversation very much surprised me. The discussion around deselection and drawdown of duplicative print collections repeatedly turned toward discovery and digital content. This seems ironic in some respects, as shared print initiatives tend to focus first on titles that have never circulated. Why would we be concerned about the discoverability of content that has remained untouched for 10-20 years? On the surface, deselection and discovery seem like mutually exclusive categories. Is there really a need to enhance discovery of something that is being withdrawn or moved to storage for lack of use?

Well, perhaps there is. As a group, we surfaced several arguments in favor of what I'll call 'remedial discovery.'

Self-fulfilling prophecy: One reason that titles don't circulate may be that they are never found. Users are not always skilled searchers, and even the best cataloging records have a limited number of access points. Cataloging errors may also play a role (e.g., a misspelling in a primary access field).

Shared print collections limit physical browsing: Paradoxically, the decision to rely on copies that are not held locally in open stacks increases the desire for some form of virtual browsing or enhanced discovery. A user may want to know more about a book before requesting it from another library or from a remote storage facility. The further away the books are, the more desirable virtual browsing appears.

Record enhancements have not been universally applied: Many OPACs have taken a page from Amazon to include cover scans, flap copy, and tables of contents. But these enhancements have not been adopted by all libraries, and may not even be available for older titles--those most likely to surface as withdrawal or storage candidates. In some cases, older titles may simply never have had the same level of exposure as newer titles.

Discovery layers are just coming into their own: Discovery tools have improved dramatically in the past few years. The more content that is indexed in those tools, the better the chances a user will find resources that may have been overlooked in the past. Here again, older materials have not benefited from these newer techniques. Perhaps they need another chance, with better tools.

It's an interesting take, and perhaps worth some experimentation. We've come at this topic from other angles previously, in posts on 'patron-driven re-acquisition' and 'curating a discovery environment.' All of this needs to be thought through more carefully, but maybe we ought to consider two simultaneous courses of action once unused titles have been identified.

Continue to draw down highly-redundant print collections in the context of shared print archiving and secure digital collections.

Enhance the remaining records for optimum discoverability. Give them a second chance to benefit from newer discovery tools.

The second of these is somewhat counter-intuitive, since it involves additional investment in a resource that has already cost far more than it has yielded. Some titles may not benefit from the additional work. But it may be worth testing on a small scale. Not only would it level the playing field for older titles, it would provide additional convenience to users examining content remotely. Specific enhancements might include:

Devise a virtual browse function, similar to the Hathi page turner or Amazon 'Look Inside the Book'

No doubt there are other ideas. Some of them will require a good deal of work and investment. There are definitely some trade-offs here, and perhaps the approach must be selective to be affordable. But it's intriguing to think about creating better forms of discovery and access for material that is going offsite or will be held by another library. Lack of browsability is one of faculty's main objections to removing print from central campus stacks. Connecting deselection and enhanced discovery may be one way to answer that.

Tuesday, November 1, 2011

In the December 2008 issue of Against the Grain, I introduced a new concept: "The Disapproval Plan: Rules-Based Weeding and Storage Decisions" [pdf]. The article's title was only somewhat tongue-in-cheek. As I tried to demonstrate, selection and deselection represent the same function, performed at different points in a book's lifecycle. At both points, titles are accepted or rejected. Approval plan profiles assure consistent and customizable treatment of newly published titles. Disapproval plan profiles assure consistent and customizable treatment of older titles that have not circulated much. The goal, for most libraries, is to create--and maintain--an active collection, relevant to the current and future needs of its users.

Approval plans support content acquisition decisions; disapproval plans support storage, weeding, and shared print decisions. Both approval plans and disapproval plans safeguard collection integrity while providing an efficient, reliable alternative to title-by-title scrutiny. Both approval plans and disapproval plans have limitations; both are most appropriately deployed to handle mainstream titles. Subject experts, whether librarians or faculty, remain essential for specialized materials and judgment calls, but rules-based or profile-driven approaches can relieve them of the need to make many obvious and repetitious decisions.

With the benefit of 3 more years of thinking about collection use, deselection, and shared management of print monograph collections, it's become clear that my initial sketch of the disapproval plan concept can be drastically simplified and refined. The original concept focused on commercial alternatives to locally-owned print, such as Google Books, eBook aggregators, print-on-demand, and the used book market. In retrospect, this approach over-emphasized 're-obtainability' and understated the fundamental importance of archival commitments and operating in the context of the 'collective collection.'

Since then, the emergence of the Hathi Trust digital archive (now containing 5.1 million full-text book titles), and shared print archiving initiatives such as those developed by WEST, ASERL, ReCAP, CRL and the CIC have changed the picture, creating new safeguards and expanding deselection options. The work of Constance Malpas, Lizanne Payne, Paul Courant, OhioLINK/OCLC and others has provided new data and insight on print monographs overlap, storage capacity, and costs.

One key fact has not changed, though. The need remains for an automated tool that assembles relevant deselection metadata, and develops rules to operate against that metadata: a deselection decision-support tool. Now is the time to adapt the disapproval plan to new realities, and to incorporate both archival values and service values into the model. The November 2011 release looks like this:

A 'disapproval plan' is a set of library-defined rules that must accomplish four tasks:

Define the deselection universe. What pool of titles is eligible for deselection consideration? Data elements and distinctions might include:

Low-use or no-use titles: These can be identified from circulation, direct borrowing, in-house uses (if captured), and ILL data.

Titles owned more than x years: Titles should be given a chance to circulate. Most libraries won't consider withdrawing a title owned for less than 5-10 years. Publication date provides a rough approximation, but leaves recently purchased older imprints in the pool. Acquisition date or accession date is much more reliable.

Titles widely held elsewhere (see below).

Titles that will be kept regardless of use: Works by faculty authors, notable alumni, or Nobel Prize winners, or titles cited in authoritative bibliographies, may need to be exempted.

Specific editions or translations: A conservative approach would suggest that matching be conducted with FRBR work families turned off. This retrieves only those holdings that reflect the edition in hand.
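The universe-definition rules above lend themselves to a simple programmatic filter. As a rough sketch (the field names and thresholds here are hypothetical illustrations, not drawn from any actual ILS extract or vendor tool), the first task might look like:

```python
from datetime import date

def is_deselection_candidate(title, today=date(2011, 11, 1),
                             min_years_owned=10, max_total_uses=0):
    """Apply universe-definition rules to one title record (a dict).

    Hypothetical fields: 'accession_date' (date), 'circs', 'ill_loans',
    'in_house_uses' (ints), 'exempt' (bool, e.g. a faculty author).
    """
    if title.get("exempt", False):          # kept regardless of use
        return False
    years_owned = (today - title["accession_date"]).days / 365.25
    if years_owned < min_years_owned:       # give it a chance to circulate
        return False
    uses = title["circs"] + title["ill_loans"] + title["in_house_uses"]
    return uses <= max_total_uses           # low-use or no-use threshold
```

Note that accession date, rather than publication date, drives the age test, for the reason given above.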

Assure that withdrawal candidates remain secure. Once the eligible low-use universe has been defined, the archival security of this content must be gauged. An individual library operating within the academic community must help assure that nothing is lost. While it may not be necessary that a title be held locally, we must satisfy ourselves that it has been secured somewhere. Deselection metadata and rules in support of collection integrity might include:

Presence of a print copy safely archived in a trusted repository

Presence in Hathi Trust digital archive

Presence in Hathi Trust print archive [under development]

Explicit retention commitment for 4-6 copies nationally [MARC 583]

Number and distribution of other print holdings nationally, globally, or regionally (shared print archiving)

Assure that withdrawal candidates remain accessible. Once archiving has been assured in both print and digital form, accessibility comes to the fore. In general, archival copies should not leave the facility where they are secured. Instead, 'service copies' are needed. Deselection profiles need to incorporate this factor, which identifies where usable copies of the content exist and in what form. Here the salient data points are:

Presence of a service copy in a regional service center

Availability of alternate editions; i.e., same content, different vehicle

Availability of commercial eBook versions/PDA records

Availability of a print on demand edition

Re-purchasable on the used book market

Enable data-driven decisions to store, withdraw, or retain/curate. Just as approval profiles generate books, notifications, or exclusions when the profile is applied, so the disapproval plan can support storage, withdrawal, or retention decisions, depending on local needs. Low-use titles that are scarcely held elsewhere might become candidates for preservation or digitization, enabling any library to contribute to the collective collection.
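Pulling the four tasks together, the final decision could be sketched as a function of the security and accessibility checks. This is illustrative only; the thresholds (such as requiring four explicit retention commitments, per the 4-6 copy range above) are my assumptions, not established rules:

```python
def disposition(retention_commitments, in_hathi_digital,
                archived_print_copy, regional_service_copy,
                min_commitments=4):
    """Map security and accessibility checks to an action for a
    low-use title already in the deselection universe."""
    secured = (retention_commitments >= min_commitments or
               (in_hathi_digital and archived_print_copy))
    if not secured:
        # Scarcely held: candidate for local retention, curation,
        # preservation, or digitization.
        return "retain/curate"
    if regional_service_copy:
        return "withdraw"   # secure elsewhere, and a usable copy exists
    return "store"          # secure, but keep a local service copy offsite
```

A real profile would weigh many more data points, but the shape of the rule flow is the same: security first, then accessibility, then disposition.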

As with selection, deselection work can be done by the library without outside assistance. Reports generated from the ILS or open-source collection evaluation tools such as the GIST Gifts & Deselection Manager can be employed. But we suggest there is also room for a vendor-assisted model for deselection and disapproval plans, which is why we founded Sustainable Collection Services (SCS). SCS offers a full-service model for deselection, much like approval vendors do for selection. SCS advises on data extracts; normalizes and validates library-supplied bibliographic, circulation, item, and holdings data; and compares low-use titles to WorldCat, Hathi Trust, and other external sources.

We now offer in a mediated form the ability to interact with preliminary results and model different combinations of factors: use, time in collection, consortial partner print holdings, existence of Hathi Trust digital version, number of WorldCat holdings. FRBR on, FRBR off. Our tools support full-library analysis or focus on specific subjects or locations. Early in 2012, SCS plans to release a Web version of our service that will enable unmediated interaction with deselection metadata. The disapproval plan may yet live and thrive!

A final note: 'disapproval' has negative connotations, but selectivity always implies a mix of acceptance and rejection. Approval profiles drive at least as much rejection as acceptance. For English-language books, a large approval plan might supply 20,000 new titles out of 60,000 candidates. The remaining 40,000 new books have in effect been 'disapproved.'

Lack of use over time constitutes disapproval of (or at least indifference toward) that title by patrons. They have chosen other books (or more likely, other [electronic] resources) to use instead. Disapproval in the form of withdrawal or removal to storage by the library simply reflects the preferences of users, and the security and accessibility of the content elsewhere.

Can we 'neutralize' this term by basing disapproval strictly on data? Probably not. Too bad, because it's a very useful term. These are not bad books. Deselection does not connote disapproval of their content. They are simply not relevant or no longer relevant to a specific user community. Selectors face the same decision when first choosing which titles will enter the collection--and with far less data at their disposal. At deselection, there is a track record.

Monday, October 17, 2011

Last week, our small world of print monographs management approached its own instance of 'harmonic convergence'. A number of unrelated events coincided to illuminate the changing role of print collections in the academic community. Heat, light, and even the threat of fire emanated from several angles, and the humble task of library weeding reached the edges of the mainstream media. Overall, despite their sometimes unexpected origins, these developments advanced a timely and necessary discussion. Consider the forces aligned on this topic:

Hathi Trust Constitutional Convention: An excellent summary from Feral Librarian includes the good news that "HathiTrust will establish a distributed print monograph archiving program among HathiTrust member libraries." This supplements Hathi's pioneering digital archive, which now includes secure digital versions of "over half of the collective holdings of ARL libraries." John Wilkin's opening remarks to the Convention [pdf] emphasized the power of large-scale collaboration toward "an increasingly comprehensive shared collection." The addition of a print archive will assure that no content is lost. Check the Twitter hashtag #htcc for more.

Cultural critics?

6 Reasons We're In Another Book-Burning Period in History: Meanwhile, incomprehensibly, Cracked.com, the web version of a Mad magazine imitator, served up a provocative post by S. Peter Davis. The piece is a strange mix of inflammatory and informed, and occasionally even humorous. It taps into the outrage that can be engendered by discarding and destroying books, no matter how strong the rationale. As of today, the article has been viewed 665,668 times--far beyond the audience reached by librarians wrestling with these issues. Davis sees that weeding may be necessary; he objects to the destruction of withdrawn volumes.

'Hard Choices: Do Libraries Really Destroy Books?': An NPR blog called Monkey See brought the Cracked entry to a still wider audience, but with a more balanced perspective and some actual reporting. The author spoke with Betsy Simpson, President of the Association for Library Collections and Technical Services (ALCTS). Ms. Simpson articulated the academic library perspective fully and well, noting the library's "mission to preserve the cultural record" in the face of the simultaneous need for [user] "space to interact and collaborate" and the existence of many unused copies of the same titles.

ARL Membership meeting: The Chronicle's 'Wired Campus' blog reported that the agenda for the Association of Research Libraries' Directors annual meeting in Washington, DC included presentations on the Digital Public Library of America, an update on HathiTrust, and on the changing nature of research. In a session on 'Rebalancing the Investment in Collections', H. Thomas Hickerson, the Vice Provost and University Librarian at University of Calgary, noted that 'the comprehensive and well-crafted collection is no longer an end in itself.' Ed van Gemert, deputy director of libraries at UW-Madison, noted: "We simply can't afford to do work separately that could be done collectively."

Not weeding has consequences, too

All in all, not a bad week for the concepts of weeding and collaboration: a healthy mix of attention, controversy, innovation, misunderstanding, good sense, and passion. (More like dissonant convergence, I guess.) Perhaps these events are most usefully seen as a challenge to all of us engaged in deselection, shared print management, and digital archiving. The challenge is to continue to clarify our own thinking, refine our message, and get better at making our case not only to ourselves but to the wider community. Weeding is necessary. Weeding is responsible. Weeding can be made safe. Collection security will be assured collaboratively.

This is the main reason that we at Sustainable Collection Services have adopted an approach to monographs deselection that relies on data. Taken together, the number of copies held globally or within a specific region, circulation and in-house use over time, and the existence of a secure digital copy can create a safe and suitable list of withdrawal candidates. Disposition options are numerous, and each involves different trade-offs and costs. But, as Betsy Simpson and even Cracked.com indicated, the underlying problem is real. We need to explain why some books may need to be removed, and what is to be done with them. We need to make the case.

Here are my suggestions for talking points on the rationale for weeding. I'll add supporting data points in another post. First, let's see if this high-level message works. Thoughts?

Monday, October 10, 2011

Recently, in my thinking about servicing shared print monograph collections, the concept of logistics has loomed large. In order to satisfy user expectations for delivery of books from regional repositories, these centers will have to operate as fast and efficiently as Amazon, UPS, or FedEx. In addition to assuring the persistence of the scholarly record, the core competencies of regional print repositories will become inventory control and speed of fulfillment. Digitization and print-on-demand will follow in time. In this deeply unromantic view of library services, shared print service centers will become a key link in the long-tail supply chain.

In a technical definition of logistics drawn from Wikipedia, service centers must achieve 'high due-date reliability, short delivery times, low inventory level and high capacity utilization.' To my mind, this frames the management challenges around shared print collections very accurately. From Paul Courant and others, we are learning just how expensive it is to manage low-use print material. We need to lower those costs while assuring continued and convenient access. To do so, libraries need to embrace logistics.

Always ahead of the curve, Lorcan Dempsey introduced the phrase 'library logistics' in November 2004, and amplified its meaning in subsequent posts. To excerpt from a couple of his attempts at definition:

Logistics is about moving information, materials, and services through a network cost-effectively. Resource sharing is supported by a library logistics apparatus. [...] Increasingly, as libraries look at shared solutions for off-site storage, [...], digitization and archiving, they run into logistics and supply-chain management questions.

There are a number of interesting permutations to this idea, especially in relation to shared print monograph collections. Here are some steps I believe we need to take to begin to benefit from a very different approach to managing tangible collections.

Separate 'archive copies' from 'service copies.' This distinction is underdeveloped in discussions of shared print. I suspect this is largely because those efforts are at present focused on journals. With journals, a single copy can often support both archiving and distribution, because article scanning and document delivery are well-developed systems, and because articles are shorter than books. This allows a print volume to be cared for, but also for its contents to be disseminated. In August, the WEST group, which has wrestled with these issues thoroughly, codified its recommended practices in its "Access Guidelines for WEST Archives" [pdf]. For delivery methods, it recommends, in order of preference:

Loan the physical issue or volume to the borrowing library for building use only.

This makes excellent sense for journals. For now, though, this approach is not well suited to books. The length of texts and copyright issues make scanning, photocopying, and building-only use much less practical.

Monographs will require a different service model, at least for the foreseeable future. It will most often require the delivery of a copy to a user. This model introduces both risk (of loss or damage) and significant logistical challenges. The risk is easily mitigated: designate a shared copy or copies 'archival' and prohibit their use, except for subsequent non-destructive digitization. For books, one or more dark archive copies, supplemented (where possible) with a secure digitally-archived version in HathiTrust, would satisfy preservation and security needs. This step is foundational; no service layer can be built until the archiving imperative has been met.

The logistical challenges can then be met with an active, well-managed inventory of 'service copies.' Since most titles in offsite storage have seen little or no user demand, the number of service copies needed in a region may be quite small. This would depend on the size of the user base and the number of libraries relying on the regional facility. This is where inventory management techniques developed in other contexts could begin to contribute to a radically different service environment.

Raise the bar for regional library service centers. Distribution and supply-chain management are highly evolved in other industries and sectors. Servicing of shared print collections could benefit enormously from the expertise of large-scale book distributors like Ingram, Follett, or Baker & Taylor. The library world in general could learn from logistics experts at UPS, Amazon, or other companies. Service from regional library centers should be built to include 24-hour delivery direct-to-user, email order confirmation and tracking capability, real-time display of availability, and perhaps even the option to purchase via partner relationships.

One important question is how many such centers are needed. Amazon services the entire country from a handful of warehouses. UPS positions its distribution centers near airports and highway interchanges to enable rapid delivery. In short, once we separate servicing shared monograph collections from archiving them, we can manage the service copy inventory according to best practices drawn from other industries. A very few such centers, optimized for 24-hour delivery and long-tail inventory management, might serve the whole continent. Over time, these would be obvious nodes for print-on-demand production as well. While service copy distribution centers would cost money to establish and run, much of that cost could be offset by space and time savings in participating libraries. Cost considerations, too, suggest that a very few high-volume centers would make the most sense.

Some library systems operate along these lines already, using a combination of storage facilities and courier services. As a small-scale example, the Tri-Universities Group (TUG) in Ontario provides delivery 'by midafternoon the following weekday' to its three member libraries from a shared storage facility in Guelph. The Orbis Cascade Alliance provides 48-hour delivery within its two-state membership of 36 libraries. The University of Missouri System Libraries Depository (UMLD) processes up to 100 book requests per day, with requested items leaving the facility within 24 hours of placement. Many similar initiatives exist and are constantly being improved. But the benefits relating to greater scale, automation, and shared costs remain intriguing.

Provide 24-hour delivery directly to any authenticated user in the region. Users will be much more inclined to accept that large portions of the print collection are stored elsewhere if it does not affect their personal workflow. For service copies stored in shared print repositories, 24-hour direct-to-user delivery would mostly eliminate this concern. While the shipping cost may be higher than at present, this would make offsite storage more palatable, and potentially almost invisible. Delivery direct-to-user would eliminate circulation desk mediation, holds, and staging for pickup by the user. A higher volume of transactions would lower the per-transaction cost, so this again suggests consolidation of supply in fewer, larger regional centers. Delivery and return could be standardized, and bid to local couriers, UPS, DHL, or FedEx.

Manage inventory like a book distributor. Booksellers are experts at determining which titles are moving and which are not. Although these are likely to be long-tail operations (many titles, very few copies of each), it remains true that faster-moving titles require more copies on hand, and slower-moving titles fewer copies. Because archiving is handled separately, there is much more latitude to manage service copies based on their activity level. Titles that do not circulate might become candidates for print-on-demand or digital delivery only. Or, they might simply be retained in the warehouse, where the $0.86 per year estimated cost of ownership is amortized across all the libraries in the group.
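The inventory logic here is the standard one from distribution: copies on hand should cover peak concurrent demand. A back-of-the-envelope sizing rule (the turnaround time and safety factor below are illustrative assumptions, not operational figures from any facility) might look like:

```python
import math

def service_copies_needed(annual_requests, turnaround_days=14,
                          safety_factor=1.5):
    """Rough service-copy target for one title in a regional center.

    Assumes each loan ties up a copy for `turnaround_days` (shipping,
    use, return). Copies needed is roughly the peak number of
    concurrent loans, padded by a safety factor; zero-demand titles
    need at most one service copy.
    """
    if annual_requests == 0:
        return 1
    concurrent = annual_requests * turnaround_days / 365
    return max(1, math.ceil(concurrent * safety_factor))
```

Under these assumptions, a title requested weekly would warrant three regional service copies, while the vast long tail of once-a-year (or never) titles would warrant one.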

In the end, to adapt another phrase from Lorcan, broad adoption of library logistics could confer the benefits of the 'network effect' on low-use monographs, driving costs out of the system while improving service to users.

Wednesday, September 28, 2011

For those of us interested in print collection use, the Kent Study has long served as a touchstone. The book's actual title is Use of Library Materials: The University of Pittsburgh Study, by Allen Kent [WorldCat record]. Published in 1979 by Marcel Dekker, it focuses on the 36,892 books acquired by the University of Pittsburgh's Hillman Library in 1969. Of particular interest is Chapter II, "Circulation and In-House Use of Books", which follows those 36,892 books through 6 years of circulation and non-circulation. In a finding that has resonated for more than 30 years, the authors determined that nearly 40% of these books did not circulate within the first six years on the shelves, and predicted that the chances of them circulating after that were a mere 1 in 50.

Automation in the Kent Study era

The high rate of non-use occurred at a time when print was dominant, when ILL and even awareness of other library holdings was in its infancy; in short, during a best-case period for print use. Those are not today's conditions. While the Kent findings remain interesting, they are based on 40-year-old data and 40-year-old circumstances. Tectonic shifts in electronic availability, resource sharing, and user behavior have long cried out for a newer, broader circulation study.

This month's release of the OhioLINK-OCLC Collection and Circulation Analysis Project 2011 [news release and links] provides that new study, along with the ability to interact with the massive data set on which it is based.

Authored by Ed O'Neill (OCLC), Julia Gammon (Univ of Akron), and the OhioLINK Collection Building Task Force, this is a major contribution to emerging conversations around shared print retention and cooperative collection development. Unlike Kent, which focused on a single library, this work encompasses 89 libraries and tracks collection overlap and circulation activity across 30 million items--nearly 100 times the scope of the earlier work. The circulation activity is drawn from the Spring of 2007 and 2008, and so reflects current user behavior. In short, this is a much more robust foundation on which to base decisions about the number of print copies needed, for older and newer titles alike.

The big headline is this one:

"The most fascinating result of the study was a test of the “80/20” rule. Librarians have long espoused the belief that 80% of a library’s circulation is driven by approximately 20% of the collection. The analysis of a year’s statewide circulation statistics would indicate that 80% of the circulation is driven by just 6% of the collection."

This is a pretty shocking number, even to someone who has been looking closely at circulation data for several years now. It suggests that there may be very few titles for which the community needs many copies. We'll drill deeper into the report and underlying data in subsequent posts.
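The calculation behind that headline is easy to reproduce on any library's own circulation extract. Given a list of per-title circulation counts, the smallest share of titles accounting for 80% of total use is:

```python
def share_driving(circs, target=0.80):
    """Smallest fraction of titles (ranked by use, highest first)
    whose combined circulation reaches `target` of total circulation.

    circs: per-title circulation counts, one entry per title.
    """
    ranked = sorted(circs, reverse=True)
    total = sum(ranked)
    running = 0
    for i, count in enumerate(ranked, start=1):
        running += count
        if running >= target * total:
            return i / len(ranked)
    return 1.0
```

For example, in a toy collection of 100 titles where one title accounts for 90 of 100 total loans, a single title (1% of the collection) drives 80% of circulation.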

In some respects, this OhioLINK-OCLC work can and should be seen as a companion piece to Constance Malpas's excellent Cloud-Sourcing Research Collections: Managing Print in the Mass-Digitized Library Environment [pdf]. Her study, also conducted through the OCLC Office of Research, focuses on collections at NYU, Columbia, Princeton, and the ReCAP storage facility, and quantifies the degree of overlap and redundancy among print books that also reside in secure digital form in HathiTrust. That degree of overlap, when considered in conjunction with the usage data from this new report, speaks volumes (heh-heh) about the potential for carefully managed drawdown of print monographs. There is an enormous opportunity here to dramatically reduce overhead costs related to print collections--and to release space for other uses.

An under-reported finding from Malpas's work suggests the scale:

"...we estimate that the median space savings that could be achieved at an ARL library if a robust shared print offer were in place today to be approximately 36,000 linear feet or the equivalent of more than 45,000 ASF [assignable square feet]...

"In economic terms, the total annual cost avoidance -- assuming all of these books are currently managed on-site -- exceeds $2 million per library."

Certainly caveats and fine print exist. But even if we reduced these estimates by 50%, the case for deselection, shared print, and action is strong. Even as we librarians struggle to come to terms with this, our Provosts, academic Vice Presidents, and CFOs will not be blind to these possibilities. As a community, it behooves us to face--even embrace--this situation, control the drawdown, and reap the benefits for our libraries and parent institutions. We have new data, robust data. Let's use it.

Tuesday, September 20, 2011

In the previous post we looked at provisional estimates of the sale value of withdrawn library books in Business and Education. For convenience, those tables are replicated here, with comments following:

Business Titles                                   Count    % of Total
SCS Withdrawal Candidates                         8,551    100.0%
Unmatchable: no ISBN                              4,107     48.0%
SCS Withdrawal Candidates w/ ISBNs                4,444     51.9%
'Culls': no copies wanted                           838      9.8%
Unknowns: bad matches                               673      7.8%
Titles w/ possible list price $.01-$2.50            976     11.4%
Titles w/ possible list price $10+                1,124     13.1%

Education Titles                                  Count    % of Total
SCS Withdrawal Candidates                         5,796    100.0%
Unmatchable: no ISBN                              3,026     52.2%
SCS Withdrawal Candidates w/ ISBNs                2,770     47.8%
'Culls': no copies wanted                           584     10.0%
Unknowns: bad matches                               314      5.4%
Titles w/ possible list price $.01-$2.50            804     13.8%
Titles w/ possible list price $10+                  579     10.0%

Consignment Selling

Alibris, like Better World Books, many local book dealers, and other companies, offers a consignment program, through which titles can be sent in batch to its warehouse. Alibris lists the titles in its online database, manages sales transactions, and ships books to customers, sharing the proceeds with the library. Consignment is attractive because it allows the books to be moved immediately out of the library, and minimizes the amount of library staff time involved in the sale process. It also gives withdrawal candidates another chance at being purchased and used. And it may return some financial benefit to the library.

Each vendor's plan probably works a little differently, and each will have its supporters. The key is that any book sale be handled in batch, with as much work occurring outside the library as possible. Many vendors can provide this service. For the sake of example, Alibris's plan works along these lines:

$1/title fee to list in their database

70% of sale price is retained by the library/30% to Alibris

Batches of titles can be shipped to the Alibris warehouse (library pays shipping)

Minimum of 1,000 titles required

Costs and Benefits

These terms suggest that titles likely to sell for $2.50 or less are probably not worth pursuing, since it would cost as much to list and ship each volume as the library might realize in revenue--and of course not all titles would sell. At the higher end of the price spectrum, the situation looks a little more interesting. Titles that might sell for $10 would break down like this:

Sale price: $10

Alibris commission ($3)

Listing fee ($1)

Shipping/handling ($1)

That leaves a profit of roughly $5 per volume sold. In the combined Business and Education samples above, there are 1,703 such titles. If all of them sold for $10, the library would realize roughly $8,500 in profit. If half of them sold for $10, the library would realize roughly $4,250. [Please note that these are just convenient estimates. Some titles might sell for more, some for less. This is intended simply as a way to think through the possibilities.] From a financial standpoint, these 1,703 titles are the obvious sweet spot--the place where withdrawals would most likely yield deposits.
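The breakdown above reduces to a one-line formula: net = sale price × (1 − commission) − listing fee − shipping. Here is a minimal sketch in Python, using the illustrative consignment terms from this post rather than quoted Alibris rates:

```python
# Per-volume consignment economics, using the illustrative terms above:
# 30% commission, $1 listing fee, ~$1 shipping/handling per volume.

def net_per_volume(sale_price, commission=0.30, listing_fee=1.00, shipping=1.00):
    """Net return to the library on a single volume that actually sells."""
    return sale_price * (1 - commission) - listing_fee - shipping

ten_dollar_titles = 1_124 + 579   # 1,703 likely $10+ titles, Business + Education
profit = net_per_volume(10.00)    # $5.00 per volume sold

print(f"Net per $10 sale: ${profit:.2f}")
print(f"If all 1,703 sell:  ${ten_dollar_titles * profit:,.0f}")
print(f"If half sell:       ${ten_dollar_titles / 2 * profit:,.0f}")
```

Run as written, this reproduces the post's ballpark figures: about $8,500 if every $10+ title sells, and roughly half that at a 50% sell-through.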

Of course, it is highly unlikely that any library's entire list of titles would ever sell. Any sales dollars realized would have to be amortized across the cost of listing and shipping all units. In this example, the base problem is what to do with 14,347 withdrawn items. Listing and shipping 14,347 items would cost the library at least $20,000, yet only 1,703 of them are likely to sell for $10 or more. It therefore makes little economic sense to list and ship all 14,347 items. Listing and shipping only the 7,214 items with ISBNs would cost closer to $10,000, and might help sell additional units, but probably at a lower overall margin per volume. Each library's strategy here must be based on whether its goals are primarily financial or primarily aimed at giving as many titles as possible another chance.
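The trade-off can be sketched as a quick order-of-magnitude calculation. The $1 listing fee is from the illustrative terms above; the $0.50 per-item shipping share is a guess (bulk shipping likely costs well under $1 per volume), so treat the totals as rough:

```python
# Rough cost of listing at different levels of selectivity.
# $1 listing fee per title (illustrative terms); $0.50/item shipping
# share is an assumption, so totals are order-of-magnitude only.

def listing_cost(items, listing_fee=1.00, shipping_share=0.50):
    return items * (listing_fee + shipping_share)

all_withdrawn = 8_551 + 5_796   # 14,347 combined withdrawal candidates
with_isbn     = 4_444 + 2_770   #  7,214 titles matchable by ISBN
likely_10plus = 1_124 + 579     #  1,703 titles with a possible $10+ price

print(f"List everything:   ~${listing_cost(all_withdrawn):,.0f}")   # $20K+
print(f"List ISBN titles:  ~${listing_cost(with_isbn):,.0f}")       # ~$10K
print(f"List $10+ only:    ~${listing_cost(likely_10plus):,.0f}")
```

Under these assumptions, listing everything runs past $20,000 while listing only ISBN-matched titles lands near $10,000, consistent with the figures above.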

A Selective Approach
Clearly, this option is not appropriate for every library's situation, and probably not for all withdrawal candidates from any institution. Consignment sale makes most sense for the likely high-value items, where the library will realize a clear return after listing and shipping fees. In these examples, items with a likely sale price of $10 or more represent 10%-13% of the total number. The likely list price may make these candidates worth listing even if only 50% of them ultimately sell. Each library should analyze its situation carefully before proceeding.

Below the $10 threshold, the picture is much less clear. The likely lower-value books represent 87%-90% of the entire list. Those sold at $2.50 or less would represent a loss to the library, even before considering their original purchase price or the library labor involved in preparing them for shipment. Titles without ISBNs, and bad matches, are of unknown value, but would still incur listing, shipping, and handling costs. Titles likely to sell in the range of $2.51-$9.99 have some potential to produce revenue that exceeds costs, but the overall yield is likely to be low, since every listed title incurs costs while not every title sells. From a purely financial standpoint, this range is marginal, though it might look better if the low end were raised to $5 or $7.

The selective approach also leaves the question of what to do with the remaining books -- those that fall below the likely $10 threshold. In this instance, 12,644 books are in this category. Some libraries may find selling these unused books at a loss more tolerable than throwing them out. Some may prefer to dial the range back to $7 or $5. Some may prefer any chance at a sale, regardless of price. But that should be a conscious choice.
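One way to see why the sub-$2.50 band is a guaranteed loss, while the $10+ band clears costs even at a 50% sell-through, is a tiny expected-net model. The counts come from the combined Business/Education tables; the sell-through rates and flat average prices are assumptions for illustration only:

```python
# Hypothetical expected-net model. A per-item listing/shipping cost is
# paid on every listed title; sale proceeds arrive only on titles that
# sell. Sell-through rates and average prices below are assumptions.

def expected_net(titles, avg_price, sell_rate,
                 commission=0.30, listing_fee=1.00, shipping=1.00):
    per_item_cost = listing_fee + shipping        # paid on every listed title
    net_per_sale = avg_price * (1 - commission)   # received only when a copy sells
    return titles * (sell_rate * net_per_sale - per_item_cost)

# 1,703 titles at $10+ clear costs even if only half sell:
print(expected_net(1_703, avg_price=10.00, sell_rate=0.5))   # positive
# 1,780 titles at $2.50 or less lose money even if every copy sells:
print(expected_net(1_780, avg_price=2.50, sell_rate=1.0))    # negative
```

The second case is the structural problem: at $2.50, the library's 70% share ($1.75) never covers the $2 per-item cost, regardless of sell-through.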

Tuesday, September 13, 2011

One practical (and sometimes emotional) concern with removing unused books from library shelves is what will happen to them. As individual libraries begin to withdraw tens of thousands of volumes—and assuming that sufficient copies are retained to protect and serve the needs of researchers—what is to be done with them? As discussed in an earlier post on ‘Disposition Options’, there are several alternatives worth considering, including donation or sale. Today let’s focus on the costs and benefits of selling withdrawn items.

It is with some misgiving that I raise this option, as the value of this material is already questionable. Library users have weighed in by not using these titles, despite as much as a decade of opportunity. Book sales are labor-intensive. In her excellent 2005 article ‘Library Book Sales: A Cost-Benefit Analysis’, Audrey Fenner concludes that none of the prevailing sale methods (annual, ongoing, or online) is cost-effective. Storing uncirculated inventory for sale also requires space, a resource that is likely in short supply in any library engaged in significant weeding. Staff time, another scarce resource, is required to manage both physical items and transactions with buyers.

Given these inauspicious conditions, it seems obvious that selling withdrawn items will not always (or even often) make sense. If books are to be sold by the library, maximum efficiency must be the byword. This involves working in batches, relying on third-party support, and minimal fussing over individual items. If the withdrawal criteria have been set carefully, candidate titles will be those with the least demonstrated value to users. The question is whether these withdrawal candidates may have value elsewhere—and whether that value is sufficient to warrant the effort of offering them to others.

During ALA in New Orleans, SCS consulted with Shelly Stuard, Director of Library Services at Alibris, a major seller of used and out-of-print books. We wondered if Alibris could help us estimate the potential market value of a list of library withdrawal candidates. It turned out that the company already had a method for doing so, basically matching library-supplied ISBNs against an internal ‘saleability’ database. This database uses historical information to identify ‘keeps’ (where additional copies are needed) and ‘culls’ (where sufficient copies already exist in the Alibris network). The match also captures the range of historical sale prices for each title.

There are strong caveats to be borne in mind. Any such data is inherently dynamic. As with mutual funds, past performance is no guarantee of future results. Suggested prices and ‘keep/cull’ status can change at any time. As more copies withdrawn from libraries enter the market, overall prices are likely to decline. But as an overall indicator of value on the used book market, this provides a useful window, and may be as good as any data available.

Here are the results from two lists, one in Business, the other in Education.

Business Titles                                Count   % of Total
SCS Withdrawal Candidates                      8,551       100.0%
Unmatchable: no ISBN                           4,107        48.0%
SCS Withdrawal Candidates w/ ISBNs             4,444        51.9%
‘Culls’: no copies wanted                        838         9.8%
Unknowns: bad matches                            673         7.8%
Titles w/ possible list price $.01-$2.50         976        11.4%
Titles w/ possible list price $10+             1,124        13.1%

Education Titles                               Count   % of Total
SCS Withdrawal Candidates                      5,796       100.0%
Unmatchable: no ISBN                           3,026        52.2%
SCS Withdrawal Candidates w/ ISBNs             2,770        47.8%
‘Culls’: no copies wanted                        584        10.0%
Unknowns: bad matches                            314         5.4%
Titles w/ possible list price $.01-$2.50         804        13.8%
Titles w/ possible list price $10+               579        10.0%

Clearly there is some potential revenue here, but there are also associated costs. We'll analyze these results more fully in the next post.