Month: June 2011

Those of us who maintain a critical distance from the miracle that the eBook is supposed to be are often accused of being nostalgic romantics or Luddites for our attachment to the printed form, even if we readily accept that eBooks will have an important subsidiary role to play in the future world of books and reading. By this reckoning, the whole of Europe would also appear to be Luddite, as it too seems to be resisting the hysteria that the American media is whipping up regarding eBooks. It was rather interesting for me, then, to come across the following essay written by Simone de Beauvoir in 1947 comparing the United States and Europe, in which she first expresses her admiration for the energy and industry of the Americans, manifested by all of the bridges, radio towers, and skyscrapers being built in the USA at the time, but then states that “even among the American intellectuals there is a very definite tendency to confuse the opaque fullness of the object with real human riches. The intellectuals, too, seem to interest themselves exclusively in the result, rather than in the movement of the spirit which engendered it … [The American] does not hope, like the artisan of other times, to confer an eternal value on the object he fashions; he is not concerned with quality, for such a concern implies not the measurement of time but the confident surrender of it; the American wants only to “get on” with his work so that the result be not out of date before completion.” [Philosophical Writings, U of I Press, Urbana, 2004, pp. 310–11]

Though de Beauvoir wrote this before Kindles, Nooks, et al. were even a science fiction concept, her analysis would seem to apply to much of the thinking that the more vocal advocates of eBooks manifest. Especially notable — in the rush to embrace the seeming transcendence of this new, sexy technology — is the distinct lack of consideration of the importance that the physical nature of print books has had, and continues to have, on the building up and maintenance of a reading culture; this is especially evident in the role played by bookstores, used bookstores and, in their current form, libraries. If eBooks do succeed in the way that their proponents often predict (and as Amazon, the maker of Kindles, actually plots to do), then these institutions would likely be swept away. We must then ask ourselves whether book culture – for both writers and readers – will be better off if we too are wholly cast into that amorphous digital abyss and have to compete with all the distractions that the digital world offers (video games, music, films, tweets, Facebook, etc.). And, in line with what de Beauvoir was writing, we must ask whether this desire not to be “out of date before completion” will cost us the possibility of the printed word, with its efficient analog technology (the printed book and the bookstore), serving as an oasis from the great mass of digital and electronic media that increasingly defines our living environment and all of the impatient clicking, clicking, clicking that we must engage in to navigate it.

On February 24, 2011, Amit Singhal and Matt Cutts posted “Finding more high quality sites in search” on the official Google blog. They described what some have since termed the “Farmer algorithm,” which will change how Google ranks sites and may affect up to 11.8% of queries. Singhal and Cutts say: “This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”

Even to a casual web surfer it’s obvious that there are a lot of low-value pages on the internet, as well as pages which are “just not very useful.” First of all, there’s spam – search engine blekko.com has a spam clock that illustrates the relentless appearance of a million new pages of spam each hour. Then of course there are the pages produced by content farms, which are of dubious value. According to Wikipedia, “In the context of the World Wide Web, the term content farm is used to describe a company that employs large numbers of often freelance writers to generate large amounts of textual content which is specifically designed to satisfy algorithms for maximal retrieval by automated search engines. Their main goal is to generate advertising revenue through attracting reader page views.” By definition, this implies that their main goal is not to produce what web users need and want most.

Google’s decision to target content farms didn’t come out of the blue, and many at these prolific, low-value sites knew they were creating a problem for themselves. On February 8, a couple of weeks before Google’s announcement of its new algorithm, Jason Calacanis, CEO at Mahalo, made a presentation at Signal LA. He cited the following figures on the output of some of the best-known content farms: Demand Media gives us 5,700 pieces of content per day; AOL 1,700 per day; Yahoo’s Associated Content 1,500 pieces per day; and Mahalo produces 1,100 pieces. He quoted from a January 21 post on the Google blog which said that people want stronger action on sites that produce spam and poor-quality content, and he warned fellow content farmers not to “make Google look stupid” or they “will f**k you up.” He went on to say, “eHow and Demand Media: You have awoken the giant. Don’t do this kind of stuff. We’re all running off a cliff doing this sh***y content. And we have to look in the mirror and say, ‘Is this what we want to have created for our users?’ We are polluting the internet.” He said Mahalo now plans to spend hundreds or thousands of dollars on each page to ensure better content and added, “It’s good for business if we get back to focusing on quality.”

And Calacanis wasn’t the only voice speaking up about the need for quality. Lisa Barone at Outspoken Media posted a February 10 article called “Why Google, SEOs & Users Must ‘Blekko Up.'” In it she advised those involved in Search Engine Optimization as follows: “…if content is King, then put your money where your mouth is and use real copywriters. People that don’t get paid $.50 a word. If we ban content farm sites from the Web, we’re going to be left with a lot of content holes that need to be filled. There’s an opportunity there to create content that will rank on merit and be worthwhile.”

So obviously we need better content on the internet, right? Making content better is good for those who run sites, for the search engines, and especially for those of us who use the internet. Therefore, producers and consumers of content for the internet should be glad that the National Writers Union has launched a campaign to “raise the pay scale for online content writers.” A recent post on the Writers Union site says, “The new world of electronic journalism makes it much easier to get published. Yet, it’s also made it much harder for writers to make a living. Websites keep their finances secret. They encourage writers to submit content with pleas of poverty or calls to civic commitment. Writers are understandably confused. These sites are making money and preaching the civic good, but they are unwilling to pay fair wages.” As a member of the National Writers Union and a reader of online content, I’m glad to see this campaign begin.

Facebook recently told us that over the New Year’s weekend alone, its users uploaded 750 million photos. If you had any doubts about the importance of social media in people’s lives, this little statistic says a lot. But in its “unfriend coal” campaign, Greenpeace continues to put pressure on Facebook to cut its dependence on coal-powered electricity for its data centers. This is in reaction to the social media giant’s decision to use electricity from PacifiCorp, an energy company which generates two-thirds of its power from coal. Facebook says its data center is extremely energy-efficient and that it wants to focus on how much energy it uses rather than on how the energy is produced. But Greenpeace blogger Jodie Van Horn explained why this isn’t good enough for Greenpeace: “A highly efficient data center powered by coal destroys the planet, it just does so more slowly than one lacking in state-of-the-art efficiencies.”

Greenpeace’s Facebook campaign is a high-profile way of pointing to the fact that what’s on the web isn’t free of environmental impact, even if you can’t see digital media polluting the air or ripping down rainforests. Vast server farms do exist and use huge amounts of energy, most of it from nonrenewable sources. In addition, Greenpeace says that by 2020, at current growth rates, data centers and telecommunication networks will consume as much energy as France, Germany, Canada and Brazil combined.

Social media and the web are obviously here to stay, but Greenpeace reminds us that we must pay attention to the environmental consequences of the digital media technologies we use, particularly as they continue to grow.

In his recent Library Journal article “Bottoming Out?” Michael Kelley examines budget issues faced by libraries in the United States. Most important is the fact that governments are looking for ways to slash spending, and libraries are soft targets. Kelley notes that 72% of libraries that responded to a 2010 Library Journal survey said their budget had been cut, and 43% of them experienced staff cuts. (Budget problems plague libraries in Britain, too, and passionate library supporters are fighting government cutbacks that could result in the loss of 400 libraries. See the Guardian for more on this.)

Every state expects greater budget shortfalls this year, and newly elected congresspeople who represent Tea Party interests are particularly likely to want to cut federal funding. The American Library Association’s Washington office director, Emily Sheketoff, was quoted in a short report on the ALA in Publishers Weekly on the short-sightedness of this approach: “Eliminating the debt is all well and good, but you can’t eliminate it by eliminating the democratic institutions that have made our country great, and number one is the library.” This article also described ways that the trend toward digital publishing could take its toll on public libraries.

Understanding how the proliferation of e-books could cause problems for our libraries requires knowing why libraries are able to lend books to patrons without violating the author’s copyright. In “First Sale Rights” (December 2010 issue of Computers in Libraries) Samuel Liston does a good job of explaining the first sale doctrine. In the United States at least, the first sale doctrine makes it possible for libraries to buy books and then do with them whatever they please, including loan them out. Though in other countries there are other legal justifications for lending libraries, in the U.S. it’s the first sale doctrine, a concept which was codified in the Copyright Act of 1976.

Liston gives a careful history of recent challenges to first sale rights, particularly the case of Vernor v. Autodesk, Inc., in which the Ninth Circuit court ruled that Timothy Vernor, who bought a used copy of Autodesk’s AutoCAD software, complete with license, at a garage sale, didn’t have the right to sell it on eBay. The American Library Association filed a brief in support of Mr. Vernor, fearing that the software industry’s approach to licensing could be taken up by other copyright holders, such as book publishers, but the Ninth Circuit ruled that Mr. Vernor’s right to sell the software wasn’t guaranteed. Liston says that, although we are a long way from the dominance of e-books, the precedent set by the Ninth Circuit ruling could mean that libraries will lose the legal right to lend e-books unless they negotiate better terms with wholesalers or publishers. These considerations could also affect print-on-demand books, which are also expected to play a larger role in the future.

The onward rush of technological change seems inevitable, and budgets are being rigorously cut. As citizens, we should be watchful so that these trends don’t further undermine a valuable and much-beloved institution, our public library system.

As I search the web, I find a wide range of opinions about the future of publishing, reading and writing. At one extreme, there are those who have a neo-Luddite disdain for any form of digital technology, who claim that printed books are perfect the way they are and should never change. At the other extreme, there’s Patrick Tucker at The Futurist, magazine of the World Future Society, who thinks that reading and writing may soon have a marginal role in our lives. In his 2009 article, “The Dawn of the Postliterate Age,” Tucker claims that by 2050 written language will be “functionally obsolete.” He supports that claim by noting that “between 1982 and 2007, reading declined by nearly 20% for the overall U.S. population and 30% for young adults aged 18–24[.]” We love to look at images, Tucker says, and as we have more and more ways to look at images, we will want to read less and less.

Tucker quotes Nicholas Carr’s 2008 Atlantic Monthly article, “Is Google Making Us Stupid?”: “Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory…. The deep reading that used to come naturally has become a struggle.… My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.” Tucker sees the continued importance of the written word to people who communicate via text message as the last gasp of a Golden Age of the technology of writing.

Less extreme points of view on the future of reading and the effects of digital publishing appeared in a New York Times article called “Does the Brain Like e-Books?” Five people gave their thoughtful opinions (an English professor, a professor of child development, a computer scientist, a professor of informatics, and an author of a book on the brain). Of the five participants, only the English professor seems unreservedly enthusiastic about digital publishing.