The parents’ efforts to rescue these libraries are laudable, but lack vision and ambition. They are merely trying to retain a terrible status quo. A room of books is not the kind of library where primary literacy skills are learned. The school superintendent, John Deasy, has it basically right: primary literacy skills are learned in the classroom. Critical reading, identifying high-quality information, web-research techniques, and specific sources for particular subject matters are skills that can be learned only if they are incorporated in every class, every day.

At every level in our society, the response to this terrible economic crisis has been one of incremental retrenchment instead of visionary reinvention. The phrase “don’t let a crisis go to waste” may have acquired a bad reputation, but it applies in this case. California is the birthplace of information technology, and its schools and their infrastructure should reflect this.

School districts should put out requests for proposals to supply every student with an e-book reader, tablet, or notebook computer that has access to a digital library of books and other resources. Big-name enterprises, such as Amazon, Apple, Barnes & Noble, and Google, would be eager to capture this young demographic. Some philanthropic organizations might be willing to pitch in by buying the rights to some books and putting them in the public domain. A slice of public library funds should be allocated to this digital library.

Traditional school libraries are inadequate. It is time to shelve twentieth century infrastructure and fund the tools students need in the twenty-first century.

Tuesday, September 13, 2011

This week, the Networked Digital Library of Theses and Dissertations (NDLTD) holds its annual international conference in Cape Town, South Africa. Founded by Prof. Edward Fox of Virginia Tech, NDLTD is dedicated to making theses and dissertations available on the web. NDLTD is an organization where library and academic-computing professionals coordinate their activities and support each other as they develop programs to improve the quality of multimedia theses and the repositories that hold them.

The good news is that universities from across the globe are adopting electronic-theses mandates at an astonishing rate. Right now, over two million theses are available with a few mouse clicks. Check out the VTLS Visualizer or the SCIRUS ETD Search. By making their research available online, universities increase its impact. This is especially important for developing nations, which are in dire need of thinkers who solve local problems and contribute to global knowledge. That makes the location of this year’s NDLTD conference crucially important, from both a practical and a symbolic point of view.

The bad news is that thesis repositories are underfunded. Often, a thesis repository is thought of as just an affordable digital service with a fast payoff in research visibility. In fact, it is much more: it is a paradigm shift for the business of university libraries. Paper-era libraries collect information from around the world to be consumed by their communities. This paradigm is largely obsolete and must be turned upside down. As discussed in a previous blog post, “The Fourth Branch Library”, digital-era libraries should focus on the information produced by their communities: collect it, manage it, and make it widely available. Setting up an electronic thesis repository, helping students and faculty develop best practices, and guiding universities through policy issues are exactly the kinds of activities at the core of the digital-library mission.

Repositories should be funded at a level commensurate with their importance to the future of libraries. We need to redouble our efforts to get out of PDF and into structured text, to enable full-text search, to improve reference linking, and to connect scientific formulas and equations to appropriate software for manipulation. We must capture all data underlying thesis research and make it available in raw form as well as through interactive visualizations. We must standardize when appropriate and allow maximum flexibility when feasible. A lot of work is ahead.

I congratulate the organizers of ETD 2011 for putting together a fantastic program. I hope the attendees of ETD 2011 will be inspired to build the foundations for the library of the future.

Sunday, September 4, 2011

The stinging critique of scholarly publishers by George Monbiot in The Guardian and on his blog describes the symptoms accurately, but misses the diagnosis of the problem. As commercial enterprises, publishers have a duty to their shareholders and to their employees to extract as much value as possible out of the information they own. If you think they should not own the scholarly record, blame the academics who signed over copyright. If you think site licenses for scholarly journals are too expensive, blame universities for continuing to buy into the system. Scholarly publishers are neither evil nor dishonest. They are capitalists exploiting a market they have created with the eager participation of academia. Academics and librarians have been whining about the cost of scholarly journals for the last twenty years. One more yammering op-ed piece, or a thousand, will not change a dysfunctional scholarly-information market. Only economically meaningful actions can do that. Change the market, and the capitalists will follow.

By making buying decisions on behalf of a community, libraries eliminate competition between journals and create a distorted market. (See my previous blog post “What if Libraries were the Problem?”) The last twenty years were a chaotic period that included inflating and bursting economic bubbles, the worst financial crisis since the Great Depression, several wars, and unprecedented technological advances in the delivery of information. In line with normal expectations under these conditions, most publishers should have faced an existential crisis. Amazingly, most scholarly publishers thrived. Is it just a coincidence that their main revenue source is libraries?

Researchers need access to scholarly research. This legitimate need is conflated with the necessity of buying site licenses. A site license merely extends a rigid paper-era business model that ignores the unlimited flexibility of digital information. As digital-music consumers, students and faculty will not even buy an album of ten songs if they are interested in only one or two. Yet, on behalf of this same community, their library subscribes to bundles of journals and joins consortia to buy even bigger bundles. Pay-per-view systems are expensive and painfully slow, particularly when handled through interlibrary loan. This information-delivery system is out of step with current expectations. The recording industry serves as an example of what happens in these circumstances.

It’s time to face the music. (I could not resist.) For an author, the selection of an appropriate journal and/or publisher is crucially important. For a reader, citations and peer recommendations trump journals’ tables of contents, and book reviews trump publishers’ catalogs. I call on publishers to partner with Apple, Amazon, Thomson Reuters (Web of Knowledge), EBSCO, and others to develop convenient and affordable gateways that provide access to any scholarly article or book, from any publisher, whether open or paid access. Such an initiative might eat into site-license revenue, but it just might prevent the system from collapsing and provide a platform for sustainable reader-pays or hybrid models. Publishers have already hedged their bets with sincere, but timid, open-access initiatives. This is just one additional hedge, just in case...

In fact, I suspect many publishers have mixed feelings about site licenses. They generate high revenue, but they also come with high fixed costs. An extensive sales staff keeps track of thousands of libraries and conducts endless negotiations. Middlemen take a bite out of most proceeds. Every special deal must pass through an internal approval process, taking executives’ time and energy. There are serious technical complications in controlling access to journals covered by site licenses, because publishers must cede authentication processes to libraries and because they have no direct relationship with their readership. Publishers are caught in a vicious circle of increasing costs, more difficult negotiations, more cancellations, and increasing prices. I suspect they want a better system, one in which they can offer more services to more users. Yet, they find it impossible to abandon their only significant business model, even one in danger of collapsing under its own weight.

Change will happen only if universities take economically meaningful actions. Stop buying site licenses, let students and faculty decide their personal information requirements, subsidize them where appropriate, and let the free market run its course. (See my previous blog post “Libraries: Paper Tigers in a Digital World”.) In future blog posts, I intend to discuss methods of subsidizing information that are more effective than buying site licenses, and gradual approaches to get us there. Just as a thought experiment, consider the following: Cancel all site licenses, and use the savings to lower student tuition and raise faculty salaries. How long would it take for alternative distribution channels to develop? How would prices evolve? How popular would open access be?

In a web-connected world, the role of libraries as intermediaries between information providers and readers is obsolete. As discussed in “The Fourth Branch Library”, libraries should increase their focus on collecting, managing, and broadcasting the information their communities generate. They should not be concerned with the information their communities consume.

About Me

Eric is a technology consultant specializing in the strategic application of new technologies in academic computing and library services.

Prior to this, Eric was the Director of Library Information Technology and a Senior Research Associate and Lecturer in Applied Mathematics at the California Institute of Technology. Eric holds a computer science degree from the K.U. Leuven (Belgium) and a Ph.D. in Mathematics from the Courant Institute, New York University. He is the author of papers in scientific computing and library technology, and of the graduate textbook “Concurrent Scientific Computing”, published by Springer-Verlag.

He chaired the OpenURL standardization committee, which developed the OpenURL ANSI standard. He is on the Board of Directors of the Networked Digital Library of Theses and Dissertations (NDLTD).