Thursday, September 30, 2010

Tim O'Reilly, Founder and CEO of O'Reilly Media, didn't mean to become a publisher; he just fell into it. He started out doing technical documentation by accident: a friend who was a programmer was asked to write a manual but didn't know how. O'Reilly had written a book about Frank Herbert, and agreed to help. After consulting for a while, he noticed that all his clients wanted manuals for the same software, so he started to retain his rights so he could sell the same manual over and over again. He never wanted to be a publisher; the job he wanted to do was to "spread the knowledge of innovators".

That single-minded attention to "doing a job" has set him apart from many of his colleagues in publishing, because it has led his company into efforts that focus on spreading knowledge rather than selling books. He recognized that by publishing books, he was taking an oral culture around technology and translating it into a written culture.

Early on, O'Reilly expanded into conferences. He found that despite the strong sales of "Programming Perl", no one was talking about or spreading the word about Perl. He organized a "Party for Perl" and a lot of people showed up. The resulting conference business was built on the realization that his best-selling books were being written by innovators who had no place to get to know each other and spread their innovations.

Another direction that grew naturally out of O'Reilly's focus on spreading knowledge was a digital distribution business. O'Reilly had taken the trouble to figure out how to do some of the messy bits in selling digital versions of their books, so when other publishers saw what they had, they were happy to have O'Reilly help them do the same.

O'Reilly was interviewed in front of a room full of New York publishers on Wednesday as part of a series of interviews produced by the Publishing Point group. He combined an enthusiasm for the changes sweeping publishing with a missionary's faith in its practicality. Yet many of his themes were deeply at odds with the conventional wisdom of traditional publishers, and he claims not to be a publisher at all from a philosophical point of view. O'Reilly wants to see more innovation in pricing, and thinks that people will buy more books, and spend more, if ebooks are priced more moderately. Publishers trying to preserve the pricing of print books in the shift to ebooks are not following a winning strategy, according to O'Reilly.

O'Reilly's notions of what's NOT important for publishing were surprising to hear. They're shaped by his formative experiences in publishing. He claims to fight the notion that publishing is about quality. His first books sold even though they didn't have an index or an ISBN, or even a spine. They didn't have pretty formatting, but they had good information, and they paid attention to the things that mattered for the job they were meant to do. In a book about programming, the code samples needn't look good, but they do need to be correct, without extra spaces that break syntax. O'Reilly was selling books for $5, and people would call up from Europe asking him to overnight a copy.

He's also skeptical of the notion that an important role of publishers is curation.

In the old days, we had a long period where it was fairly clear what were the hard things that publishers did. To be quite honest, it was NOT curating the content and finding great stuff. I think that's certainly part of what a publisher does, you know, winnow through the chaff and find something really great. It's still part of what you have to do. But I think it's seductive to think 'we're really good at that'.

O'Reilly learned not to overvalue curation the hard way. At the very start of the internet, O'Reilly developed a website called GNN (for Global Network Navigator) which selected the very best websites from the internet and organized them into categories. "We had a publisher's mindset. We said: we're going to winnow through all these emerging world-wide web sites, and we're going to pick out the ones that are the best." You've probably not heard of GNN; that's because Yahoo came along and categorized all the websites it could find, not just the good ones. Google then came along and made Yahoo's categorization irrelevant by indexing all the pages on the web without even bothering to categorize the web sites.

According to O'Reilly, "Manual curation is going to get it wrong a lot." He mentioned that Frank Herbert's bestseller "Dune" was turned down by over 50 publishers until it was finally published by Chilton, a publisher of automobile repair manuals, of all things!

Michael Healy, the interviewer for the day, wanted to clarify what he heard O'Reilly saying. "With the notion of curation having been disrupted, if you're a general trade publisher and you're relying on curation as your value-add, are you screwed, or have I missed something?"

"It's certainly true that a lot of my thinking is biased by the fact that I publish stuff that people need rather than the stuff that they just want, so I'm not sure how deep my insight goes into the problems of general trade publishing."

Audience member Bill Glass expressed concern about the low prices for ebooks. He pointed to the room (in the Random House building overlooking Broadway), and asked "In the ebook world of the future, will we still be able to afford THIS?"

O'Reilly's answer: "To me, you gotta care about something more than preserving your business. Because, obviously companies have made these types of transitions, and this is just my personal response. We're all gonna die one day anyway and we'll lose all our stuff. So don't worry too much about it. Just do something that lights you up, and lights up your customers, and lights up the world, and scale to that. Because what's going to happen is people who are lit up by the future are going to be pursuing that future, and the people who hold onto the past are going to hold on too long."

Tuesday, September 28, 2010

As the library staff collected across the street at their designated meeting spot, they heard a loud noise and saw a smoking manhole cover pop in front of them. Later, they found that an underground explosion had lifted and broken a concrete floor of the library, buckling walls and blowing doors off their hinges.

The latest news on the Morristown library is that the local power company, whose lines were probably the cause of the explosion, can't figure out how to get a new connection wired, causing further delay to the reopening.

After writing this piece, it occurred to me that there ought to be an "Emergency Response Digital Public Library" that could be deployed wherever and whenever needed. A library that experiences a loss by fire, flood or other disaster such as Morristown's has enough work on its hands cleaning up the mess and re-establishing service without also having to provide for all of its community's library needs. A large number of libraries on the gulf coast simply closed in the wake of Hurricane Katrina.

Certainly many publishers would be willing to provide free licenses to content in such situations: not only would it generate good will, it would also expose audiences to all the good stuff they have to offer. In fact, aggregators such as EBSCO and ProQuest have made content available this way; but to my knowledge, no library organization has organized and promoted the content for patrons of libraries that have experienced disaster loss. The Health Library for Disasters is a good example of what's possible, though with a much narrower focus.

An additional benefit of establishing an emergency response digital public library would be that it could serve as a laboratory for libraries as they make their own transitions into digital information. Extensive "instrumentation" and careful compilation of statistics and usage patterns could be used to improve non-emergency digital libraries.

Too often, people are willing to donate money to relief efforts when a disaster occurs, but the infrastructure needed to absorb that aid doesn't exist. Modest expenditures on preparation before a disaster occurs can greatly increase the effectiveness of that aid. It seems to me that a broad-based digital library would be a useful part of that preparation.

Thursday, September 16, 2010

Libraries like to work together. They also love to form structures to shape this cooperation. There are all sorts of library associations, consortia, coalitions, collectives, cooperatives and federations. One of the big international library conferences is that of the International Federation of Library Associations and Institutions. There is even an International Coalition of Library Consortia. There's not yet been much effort to federate the library consortia coalitions.

When I suggested last month that libraries ought to form an ebook acquisition collective to buy up ebook rights and make the ebooks available on an open-access basis, some readers misunderstood what I was suggesting. They thought that I was suggesting something similar to the purchasing consortia that many libraries have formed to aggregate their buying power and obtain lower prices and standard terms from publishers. Indeed, a recent report (pdf, 3.4 MB) from the Chief Officers of State Library Agencies (COSLA) has proposed formation of a national ebook buying pool. But this report doesn't contemplate making the ebooks open-access; it assumes the buying pools would negotiate deals with existing ebook providers.

To make it clear what I'm suggesting, I'm going to refer to "ebook asset acquisition" instead of just "ebook acquisition". The asset to be acquired is nothing less than the right to distribute an ebook to anyone, anywhere, without charge. Here's the way I think of this:

The mission of libraries should be to provide access to books to anyone in their community without charge.

To fulfill that mission, libraries should be trying to acquire all the rights needed to give unfettered access to ebooks to anyone in their community without charge.

Community access to ebooks is in many ways incompatible with business models in which restricted access to ebooks is sold to consumers. It doesn't make a lot of sense to pretend that ebooks have the same usage characteristics as print books.

Rational publishers should be willing to sell off ebook rights outright if the price exceeds the present value of the ebook's expected revenue stream.
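
To make the arithmetic concrete, here's a minimal sketch of the present-value test a publisher might apply. All the figures (a five-year revenue tail, an 8% discount rate) are assumptions for illustration, not drawn from any real title.

```python
def present_value(cash_flows, discount_rate):
    """Discount a stream of expected yearly revenues back to today."""
    return sum(cf / (1 + discount_rate) ** (year + 1)
               for year, cf in enumerate(cash_flows))

# Assumed ebook revenue over the next five years, declining as the title ages.
expected_revenue = [4000, 3000, 2000, 1200, 800]
pv = present_value(expected_revenue, discount_rate=0.08)

# A rational publisher should accept any outright offer above this value.
print(f"Present value of expected revenue: ${pv:,.2f}")
```

On these assumed numbers, an offer of roughly $9,300 or more would leave the publisher better off than holding the rights.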

The amount of money libraries currently spend on books of all types would be sufficient to acquire many works outright, particularly those sold primarily to libraries.

The way for libraries to meet the publishers' price for these ebook assets is to organize into a collective that aggregates the total demand and acquires ebook rights.

Therefore, an ebook asset acquisition collective ought to exist. Why is this not happening?

To some extent, it IS happening.

Frances Pinter, the publisher at Bloomsbury Academic, gave a keynote at the O'Reilly Tools of Change for Publishing conference this past February on her proposal for an "International Library Coalition for Open Books".

Pinter sees a funding model through the eyes of a publisher (which I'm not), but the idea is pretty much the same. A mechanism that enables open access to scholarly monographs while keeping the enterprise of publishing economically viable would be of great benefit to society at large.

After the previous post, I got feedback that publishers would be very hesitant to sell off ebook rights; publishers view their intellectual property rights as "part of the company", and divesting of these rights would be a "liquidation strategy". I was surprised at this view. I would have thought most publishers are more interested in the production and launch of books than in servicing them in perpetuity. From a business point of view, books are like the loans made by banks: you put out money up front, and make that money back as a stream of payments.

The banking industry has advanced far beyond the view that holding assets is their core business. Banks that originate the loans don't service them any more, they sell off the loans so they can make new loans. And, no that's not what led to the banking crisis. Publishers of all stripes should likewise be willing to cash in their book assets and use the proceeds to produce new ones.

Another type of feedback I got was that the logistics of buying and selling ebook rights could be difficult. For example, how would a publisher plan print runs and promotion? If a print run had just been paid for, the publisher would be loath to sell off ebook rights. If a book was due to be published in February and announced in a Spring catalogue that goes to press in December, when might it be sold to a library collective, and how quickly would the collective be able to decide?

There are a couple of answers. First, smaller print runs and more print-on-demand minimize the risk of printing too many books; this is happening anyway. Second, I assume that publishers can build the print risk into the price they ask. If printing is really only 25% of first-copy cost, well, that's only 25%; supply-chain markup is as much as 50%.
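
The arithmetic in that second answer can be made concrete with a toy cost breakdown. Every figure here is assumed for illustration only: a $30 retail price, a 50% supply-chain share, and a $12 first-copy cost of which printing is 25%.

```python
retail_price = 30.00
supply_chain_share = 0.50    # up to half of retail goes to the channel
publisher_receipts = retail_price * (1 - supply_chain_share)

first_copy_cost = 12.00      # editorial, design, typesetting, printing
printing_share = 0.25        # printing as a fraction of first-copy cost
print_exposure = first_copy_cost * printing_share

# The unrecovered print cost is small enough to fold into the asking
# price for the ebook rights rather than block a sale outright.
print(f"Publisher receives per copy: ${publisher_receipts:.2f}")
print(f"Print exposure per copy:     ${print_exposure:.2f}")
```

On these assumed numbers, the print exposure is a few dollars per copy against the fifteen that the supply chain takes; even doubling it barely moves the asking price.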

I'm a pretty strong believer in markets. I imagine that a library collective would help to create a more efficient market for publishing assets, which are deployed awkwardly in the current system. Nothing would force publishers to accept a disadvantageous price for an asset. The library collective would weigh the price and quality of each asset before deciding, presumably using automated systems, which ones to acquire collectively. Building internet-based markets is a reasonably well understood technology, and librarians are familiar with automated purchasing systems and with collaborative selection. It shouldn't be too hard to put an expiration date on a publisher's offer to sell; we've all heard of eBay.

I expect that there will be a variety of product life-cycles for ebooks. The typical midlist book today spends some time on bookshelves before being returned to the publisher or set out on the discount racks. Its life may be extended on Amazon; used bookstores enter the channel without contributing any revenue back to the publisher. The typical library-acquired title might have a "toll-access" run of a year or so; the publisher could capture the most eager purchasers through DRM-protected delivery systems. Most libraries would wait for the book to be acquired by the collective at a discount and made open-access. This market segmentation is similar to the hardcover/paperback model, which adds efficiency to the current market.

Some publishers might try to sell ebooks even before the book is published; they might even succeed. Some books won't get bought at any price, let alone at the cost to produce them; others would probably be acquired even at an obscene profit margin for the publisher. Some things never change.

The biggest uncertainty with a system that allows libraries to collectively acquire ebook rights (including the rights to give them away!) is the size of the revenue lost to free riders. The free riders would be of two types: libraries that don't participate in the system, and book buyers not associated with libraries. The share of sales made by university presses outside of libraries varies from press to press, but some presses indicate that as much as 70% of their sales occur outside of libraries. This share has increased over the last decade or so, thanks to new sales channels such as Amazon and tightening library budgets. One possible solution to this issue would be to open the collective to consumers; I will write more about this in the future.

Then there's the issue of libraries that would choose not to participate in funding the acquisition collective but would still benefit from the ebooks liberated by the collective. If an ebook rights acquisition collective comes to pass, we'd really find out how much libraries like to work together!

Monday, September 13, 2010

My essay on food replicators for Library Journal, published back in 2010, is posted.

With all the business models in my salad, I started to think about how dinner would be different if vegetables were somehow digital products. All the hours of my youth wasted on Star Trek reruns began to flash before my eyes. What business models might evolve to make the food replicators of the future work? My brain gears started turning...

I learned about purslane from a radio segment on WNYC. Apparently it's really high in omega-3 fatty acids (the good kind) and vitamins. After years of pulling it out of the lawn I thought it would be fun to put it in a salad, and it turned out to be quite good.

For the salad shown, I paired it with mango, carrots, cabbage and a bit of the red amaranth which has aggressively self-seeded from last year's crop, making itself into another sort of weed. Red amaranth leaves work well in recipes designed for spinach, and it's really quite delicious. You can eat the young leaves raw. Last year I harvested the amaranth seed, which can add a poppy-seed like texture to cookies or pound cake.

A couple of weeks ago, I told a friend about purslane. She's well known locally for her organic garden, which supplies her with all her family's vegetables year round, and she periodically gives tours of her garden as a form of evangelism. She was aghast at the idea of eating purslane, though; even organic gardeners hate weeds!

Real Trekkies will be aware that the "replicators" aboard 24th century starships such as the one captained by Jean-Luc Picard weren't restricted to replicating food; they were more like 3-d copying machines. It's not clear if this would make the intellectual property rights to the food any different. If it were possible to copy an avocado, would it be possible to copyright it? If you can copyright a 2-d photo, why would a 3-d photo be any different?

The "food synthesizers" of Jim Kirk's 23rd century worked more on the lines of the food replicators in my essay. Unlike replicators, which understood voice commands ("tea, Earl Grey, hot") food synthesizers used program tapes or cards inserted into a slot. Clearly the food patterns used more petabytes than could be transmitted over the giga-ultra-wifi in use at that time.

Sunday, September 5, 2010

When I was little, some Swedish was spoken in my house. At some point, I realized that our bathroom words were different from those used by my Ohio playmates. In my house, we didn't do "poop" or "poo" and definitely not "crap" and most certainly not "shit". We did "bice". It may be a complete coincidence, but I have never dined at the Italian restaurant named "Bicé".

I know what you're thinking. Ok, that's number two, so what did you call number one? No, not "pee" or "wee", and definitely not "piss". But I remember exactly when my mom explained to us that the word we used was not the one used by most English speakers. It was when my mom read us the book The Story about Ping by Marjorie Flack. My little sister roared with laughter, because "ping" was our word for urine.

I have no idea whether "ping" is a widely used bathroom word, here, in Sweden or anywhere; I don't go around talking much about ping. But I can tell you that Ping, Apple's new iTunes feature, is a piss-poor excuse for a social network.

I was stunned that Apple had not implemented the obvious functionality. When you listen to a song, you should be able to push a comment about it to your followers. In Ping, you can't. iTunes knows the songs I've rated most highly; inexplicably, these are not the songs it suggests for my profile. Ping seems interested only in things I've bought recently in the store. It even appears inept at using my iTunes-store-sanctioned activity in my profile. It's hard to believe that something so poorly executed could get released by Apple.

After thinking about it for a while, I realized what had happened. Imagine a system that can tell people what songs you have on your computer, and can connect you directly with the people interested in those songs. You want to share information about the songs and connect to people. Does that sound vaguely familiar? Do you remember Napster? The only difference between a well implemented Ping and the legally challenged Napster is a way to push files around.

I think that Apple showed Ping to some music publishers, who flipped out at the possibility that it would be used for file sharing and forced Apple to cripple Ping. Or maybe Apple saw the file-sharing potential itself and worried that Ping could kill off its music-based revenue stream. Or could it possibly be that Apple is being incredibly devious, expecting that someone, somewhere will see how to add file sharing to Ping, making Ping hugely popular while some elusive third party assumes all the Napster liability? We shall see.

The only thing I'm sure of is that Apple isn't likely to discuss what "Ping" really means – outside of its own bathroom.