REVIEW OF: Ray E. Metz and Gail Junion-Metz. Using the World Wide Web and Creating Home Pages: A How-To-Do-It Manual. New York: Neal-Schuman, 1996.

by Bradford Eden

The explosion of interest in the Internet and the World Wide Web has created a quandary for many libraries concerning knowledge of and access to the Internet for patrons, as well as for librarians themselves who have no experience with the Internet or perhaps with computers as a whole. This book is another volume in the popular Neal-Schuman series, How-To-Do-It Manuals for Librarians, that attempts to bridge the gap for librarians and libraries exploring this new area. As the preface states, this book "is designed for library administrators, professional and support staff, and interested patrons." According to the authors, this book is "one of the first that discusses, in detail, the process of planning for and implementing a library/community Web site." (p. xi)

Chapter one is a basic history and introduction to the Web. Chapter two looks at the issues involved in providing access to the Web before planning begins. Chapter three provides basic technical information needed to connect to the Internet and budget for it properly. Upgrading current connections is also covered for those libraries and librarians that are already connected to the Internet. Chapter four covers training issues for staff and patrons. Chapter five outlines the planning process for creating a Web presence on the Internet.

Chapter six explains how to design a good Web site, while chapter seven introduces basic HTML (Hypertext Markup Language) commands and tags, as well as information on HTML editors. Chapter eight describes testing your Web site and keeping it up to date. Chapter nine explains marketing procedures and sites for announcing your Web page to the public and the Internet community. Chapter ten is a glossary of Web terms, and chapter eleven is an extensive bibliography of print and net resources for each chapter. A large appendix is given of sample library Web pages from the Internet.

The layout of the book follows other books in the series. A short, one-line summary of each section within the chapters helps to guide the reader who wishes to peruse the book quickly, or find necessary information without having to read the entire book. New terms, technical jargon, and Web terminology are underlined in the text, and are then explained by sidebars in the margins. This helps to keep the flow of thought going forward in the text, without losing those readers who do not understand the terminology. Each chapter ends with short bibliographic entries that point the reader to the full bibliographic citations in chapter eleven.

The authors have followed a practical approach to this complex subject by providing just enough detail in plain, non-technical English, so that both the experienced and the beginning Internet and Web librarian can follow along and accomplish the final objective: designing and creating a Web site for a library. I found that the authors achieved this objective rather well.

Three library environments are alluded to throughout the book: the K-12 library, the college or university library, and the public library. The authors go to great lengths to include all three library environments in all discussions throughout the book, going so far as to create bullets in the text for each environment when differences between the three need to be explored. Whether each environment is covered adequately in this book will need to be decided by the librarians in those environments.

In chapter one, I found it helpful that the authors gave command explanations for a number of Internet browsers, including Netscape, Lynx, and the CERN (European Laboratory for Particle Physics) command-line browser. In chapter four, the authors provide a sample library instruction presentation in the form of six weekly brown bag lunch seminars for training patrons and staff in using the Internet. Finally, the chapter seven discussion of basic HTML commands is very practical and non-technical, and would be extremely useful to beginning library Web designers.

I would strongly recommend this book, especially to those library administrators looking to budget and plan for Internet and Web access. This is a good book for any librarian whose task it is to design and create a Web page for his or her library. This series has always produced high quality, easily understandable books on subjects of interest to the library community, and this book is a welcome addition to our knowledge base.

Dr. Brad Eden (BEDEN@MAIL.NHMCCD.CC.TX.US) is the Coordinator of Technical Services/Automated Library Services at North Harris Montgomery Community College District in Houston, Texas.

Any book that comes with a free CD-ROM, as this one does, automatically scores extra points with me--even though they are usually next to worthless and end up in the same stack as all the America Online disks, as this one will. The quick and dirty review of this book amounts to a roaring--ahhh, not bad. It's heavily Mac-centric, a bit dated as all books are these days, and all but ignores the UNIX world. On the positive side it covers PC technology fairly well, hits all the multimedia products that one could arguably call standard, and comes with a CD-ROM full of dubious materials. Overall, this is a real borderline, take-it-or-leave-it book. I wouldn't pay the $39.99 Standard Retail Price myself, but I might put out $15 or $20 for it. I doubt I'd try teaching a class from it, but I might use it for a basic-level streaming reference, which is its strong point. If you need an introduction to streaming multimedia, this one isn't bad!

The layout of the book is really cool and worth noting. The page is divided vertically in two with a thin column on the left side. This strip is used for nifty snippets that really help the reading. The writing style is generally playful and very personable. The book scores big points here. The book is also littered with high quality screen shots from Web sites and applications. Even though the book is overall very much a Mac thing, the screen shots are a good variety of PC and Mac screens. Definitely two thumbs up to the layout and writing style.

The introduction claims that chapters one through three cover the "current climate" of Web-media. Chapters one through three do go through last year's "current climate." The video and animation standards-for-now are introduced, i.e., QuickTime, Video for Windows, JPEG (Joint Photographic Experts Group) and MPEG (Motion Picture Experts Group). Streaming, Shockwave, and Java are also introduced. Transmission standards are covered lightly with no more detail than is necessary for a multimedia book. The author does get into the bandwidth problems with "Webmedia" in chapter two, but for my money it should have really been hammered home better. Some reference to the Bandwidth Conservation Society would be in order to really stress to Webmedia developers the need to watch the bandwidth consumption of their creations. The author really seems to push Windows NT Server 4.0 here. No mention of the multitude of UNIX-based servers out there, like Yahoo, which I believe runs on FreeBSD.

The best part of chapter three is a rip-roarin' discussion on the social fallout of Web-mania. Also, this chapter takes the reader on a nice little tour of some Web sites with some inside development information on each. The chapter ends with a light-hearted, but serious, talk on copyright. The discussion is one-sided, though, aiming solely at ensuring that one isn't stealing something that belongs to someone else. How to keep your stuff from getting ripped off isn't even touched. I'm not sure I blame the author for that. There doesn't seem to be any way right now to keep people from lifting copies of graphics, etc. from your page once they have viewed it.

Chapters four and five hit streaming hard and well. Being a novice streamer, I learned a thing or two here. This topic is a recurring one throughout the book, and is definitely its content strong point.

Chapter six is entitled "Web Audio." It should have simply been "RealAudio." Nothing really is covered in this chapter but RealAudio, and it deserves it. It really has no worthy competition right now. The chapter includes some good how-to's for RealAudio servers. Pages 110 and 111 have an excellent strategic approach for RealAudio work. Photocopy this one and pin it on the wall above your server! Also worth noting in this chapter is a how-to on installing the RealAudio server for BSDI (Berkeley Software Distribution, Inc. UNIX). The UNIX world is finally given some voice in this book!

Chapter seven covers other miscellaneous types of Webmedia, but mainly MIDI (Musical Instrument Digital Interface). Not much here to write home about. The author starts falling back into Mac-mode again. Do they still make Macs? Why?

Chapter eight begins with a Mac versus PC discussion. The role of CPUs (Central Processing Units) in digitizing is also discussed, and, of course, he who has the biggest and fastest wins--no surprise there. After more discussion on varieties of hardware, the author gets back to the Mac versus PC argument over which is the best platform. The author quickly writes off the question, stating that the true question is "how to make movies that look good on Windows and the Mac." A page-and-a-half answer follows. So how do you make good video on a Mac? Who cares! No one is going to be looking at it on a Mac! Okay, maybe, what, 5% of netizens will check out your site using a Mac. And how fast is that number declining? Unless you are creating a site specifically for diehard Mac fanatics that can't see the writing on the wall, I don't see the point or profit in worrying about these cross-platform issues.

Chapter nine covers software tools for Webmedia. The first half of the chapter covers software for Windows, the second half covers Macs. At least the author got the order right, but what about Silicon Graphics workstations, Sun SPARCstations, and UNIX? A whole plethora of software is mentioned for both PCs and Macs. The author wisely states at the outset that this is not intended to be a complete list.

Chapter ten gets into programming for Webmedia. The author is honest with the reader about the complexity of programming Webmedia and the need for some UNIX knowledge. This is worth a few points. Several comparisons between Web authoring and CD-ROM production are drawn. If you've never mastered a CD-ROM, as I have not, this quickly becomes annoying.

Chapter eleven covers connecting your media masterpieces to your HTML (Hypertext Markup Language) code. First, a brief and necessary rundown of HTML is given. The bulk of the chapter then covers writing references to specific flavors of media into your HTML, like RealAudio and VDOLive. The chapter concludes with a very insightful mention of the growth of third party browser plug-ins.
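As a sketch of the kind of markup the chapter describes (the file names, paths, and attribute values here are invented for illustration, not taken from the book), a RealAudio reference of that era pointed at a small metafile rather than at the audio itself, while plug-in media could be dropped in with Netscape's EMBED tag:

```html
<!-- Hypothetical example; file names are invented for illustration. -->
<!-- A RealAudio link references a small .ram metafile, which in turn -->
<!-- holds the pnm:// address of the stream on the RealAudio server.  -->
<A HREF="lecture.ram">Listen to the lecture (RealAudio)</A>

<!-- A VDOLive clip could be embedded inline for a plug-in to handle. -->
<EMBED SRC="welcome.vdo" WIDTH="160" HEIGHT="120">
```

The indirection through the metafile is what let the browser hand the stream off to the player instead of downloading the whole clip.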

Programming languages are covered in chapter twelve. I like the way several lesser languages are mentioned before attention is focused on C++ and, even more specifically, on Java. The author stumbles a bit on the question of whether Java is C++. It isn't, technically (you must buy the compilers separately, which is my argument), but the similarities are obvious. The chapter does a good job of stepping the reader through acquiring a compiler, writing an applet, referencing it in HTML, and viewing it in Netscape. VBScript is covered lightly, and JavaScript is covered a bit more.
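The referencing-it-in-HTML step the chapter walks through looked roughly like this at the time (the class file name and parameter are invented for illustration; the book's own example may differ):

```html
<!-- Hypothetical example; Ticker.class is an invented name.      -->
<!-- After compiling Ticker.java with javac, the resulting class   -->
<!-- file is referenced from the page with the APPLET tag.         -->
<APPLET CODE="Ticker.class" WIDTH="300" HEIGHT="50">
  <PARAM NAME="message" VALUE="Hello from Java">
  Your browser does not support Java applets.
</APPLET>
```

The fallback text between the tags is what a non-Java browser would display instead of the applet.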

Chapter 13 covers authoring software. FrontPage, PageMill, Netscape Gold, and Webauthor are all introduced, but HotDog is left out to my surprise. Microsoft Internet Assistant is covered in excruciating detail. This is a big sales pitch for Bill Gates. By omitting Corel PerfectOffice Suite 7 and its wonderful Web-aware applications and utilities the book shows its datedness. Of course, next year as someone is reading this, they will be able to point out that I have not mentioned Microsoft Office 97. Well, I have now, even though it's only in beta!

Chapter 14 is a worthy chapter covering remote servers--namely, how to maintain your site on one. As more Webheads build their sites on space rented on other peoples' servers, this type of knowledge becomes more important. If another edition of this work is written, I'll bet this chapter gets bigger.

Chapter 15 dredges up the old cross-platform issue again, PC development versus Mac. Someone on the net pointed out that there are four billion humans on earth and some-odd trillion cockroaches. His point was that numbers alone do not denote a higher form of life--this from a Mac fanatic, of course. My point is that when cockroaches start using Macs on the Web, I'll start worrying about cross-platform issues. Like it or not, PCs are the standard.

Chapter 16 gets into making your own Webmedia studio. The author obviously has some experience here. There is good advice for anyone seriously building a studio. Chapter 17 is one long sales pitch for Microsoft Internet Information Server. It's free as the author points out, but it requires Windows NT which isn't. Bill isn't dumb!

The appendix takes the reader through a couple of really cool Web sites. I liked this part a lot--a very good way to end the book. Overall it was okay, but probably not worth the money unless you're serious about Macs and media.

Temple Hoff (temhof@mohave.lib.az.us) is the Library Services Coordinator for Automation in the Mohave County Library District.

The strengths of this book are its breadth of coverage, brevity, and currency. It is true to its title in being intended for beginners, and it gives an overview of many of the most-used Internet services. The book does not cover selecting an Internet service provider or learning the basic features of a service. Although the lack of coverage of these topics is clearly noted, beginners may have difficulty making the leap from the information they have been given by their Internet service provider to the starting point of this book. Any Internet beginner will find useful information in this book, and readers who do not want to deal with the huge books that most publishers inflict on even new users will appreciate the succinct treatment of this one. Most readers, however, will wish for clearer writing and better editing and will be tripped up by missing information. The book could serve as a useful structuring device for a series of workshops. In that context the missing information could be filled in and the unclear parts restated.

Types of Internet services covered include e-mail, discussion groups, Gophers, news groups, FTP (File Transfer Protocol), the World Wide Web, and interactive services such as "talk" and chat groups. Each service is given a succinct treatment (the chapter on the Web is ten pages long). Brief introductory sections in each of these chapters describe the uses of the service and provide a history and/or a technical description. Each chapter then describes how to access the service and how to use the resources of the service. For example, the chapter on e-mail covers the basic mail functions (sending, replying, and forwarding), and using directory services; the chapter on Gopher discusses traversing menus and using bookmarks. Each chapter concludes with a small number of sample sites to visit and practice assignments.

General information, net culture, and search strategies are covered in additional chapters and are incorporated throughout the text. Net culture receives excellent coverage. The difficulty of conveying emotions, the use of smilies (with an illustration), standard acronyms, flaming, spamming, and the meaning of bold type are discussed. The value of FAQs (Frequently Asked Questions files), the attitudes of old timers with regard to newbies, and the acceptable length of signature files are covered.

Explanations and descriptions are often incomplete or confusing. The discussion of a shell account implies that one cannot access graphics and sound through a standard telephone line. Later in the text, we learn that with a commercial online service one has access to graphics and sound, but must use the provider's access software. Telnet is described as a way to fool a host computer--readers may wonder if they are being invited to conduct an illicit activity. An illustration showing typical Telnet logins, passwords, and logouts is badly needed. Even the sample Telnet session does not give the login and password that are needed to access the service being described. One of the example Telnet hosts gives the message "Press the Break key to begin session." This will mystify the typical new searcher, because the PC Break key does not work. There is no terminal keyboard template in the text, and no advice is given on what the problem is.

On the other hand, a good step-by-step description is given of Veronica and Archie searching, and of FTP. In each case information is presented where it is needed--the text covers the commands for each service, creating a Boolean query for Veronica, distinguishing a directory from a file in FTP, and understanding file extensions. The best illustration in the book shows which file types to FTP in text mode and which in binary.

The sample sites are one of the strengths of the book. There aren't many of them, only two Telnet sites, three for Gopher, and six for the Web. The brief text and small number of sites help keep the book from seeming overwhelming. Sites chosen are of interest to many (such as the Weather Underground), are intended specifically for new users (the HELP-NET Listserv), or lead to a broader range of materials (Yahoo). No book can keep up with the shifting content of the Internet, and some sites and links are no longer available. For example, two of the six Web links have moved, but left forwarding links. Another one has gone commercial and yields a message saying the site is forbidden.

Readers must refer to the text of each chapter to complete the practice assignments. Although the chapters are short, the information needed for the assignments is presented so inconsistently that it is often difficult to find. It is sometimes in an illustration, sometimes in an example on a separate line in the text, and sometimes embedded in the text.

Closing chapters on searching and on keeping up with Internet developments are again very brief and well done. The chapter on searching gives a good picture of the inconsistency of Internet resources and discusses search strategies and formulating a topic statement.

Although it has its good points, this book is recommended with reservations to its intended audience, new users of the Internet. It is more highly recommended as a springboard for those who are developing workshops for Internet users.

Judy E. Myers (jm@uh.edu) is the Assistant to the Director at the University of Houston Libraries. Her recent publications and presentations have been on expert systems and other computer applications for reference services.

REVIEW OF: Katie Hafner and Matthew Lyon. Where Wizards Stay Up Late: The Origins of the Internet. New York: Simon and Schuster, 1996.

by Andrew Wohrley

Where Wizards Stay Up Late records the first thirty years of the history of the Internet. What makes the book so important is that it clears up the confusion that has grown up around the origins of the Internet. Before I read this book I knew that the Internet was an attempt by the Department of Defense (DoD) to build a communications network that could control nuclear weapons during a war and that the Internet was viewed as a threat by American Telephone and Telegraph (AT&T). Both of these statements are either wrong or misleading, and Where Wizards Stay Up Late corrects many misconceptions about how the Internet came to be.

While the outline of what became the Internet was detailed first in a paper devoted to controlling the U.S. nuclear arsenal during a nuclear war, ARPANET, the predecessor to the Internet, was created as a civilian network for academics conducting computer research for the DoD's Advanced Research Projects Agency (ARPA). ARPA is a branch of the DoD created after the Sputnik scare to coordinate research for the DoD. Bob Taylor, the head of ARPA's Information Processing Techniques Office (IPTO), was unhappy because he needed three different computers in order to communicate with other computers. At that time, there was little standardization between computers and often different models of computers by the same company could not communicate with each other.

The research conducted by Paul Baran on control of nuclear weapons during a nuclear war offered a way to throw out two of those computers and let the remaining one handle all of Taylor's communications. The Defense Communications Agency (DCA), the logical home of the Internet, wanted nothing to do with it, both because it was satisfied with telephone lines handling communications, and because of bureaucratic resistance to a technology that no one in the bureaucracy understood. Taylor filled the vacuum left by the DCA and let a contract to Bolt, Beranek, and Newman (BBN) in Massachusetts to create the first Internet, called ARPANET.

Far from considering the Internet a threat, AT&T largely viewed it as a joke. One of the many stories in this book describes Paul Baran's futile efforts in the sixties to convince AT&T to build a network. AT&T would raise objections, and he would answer them. AT&T would hold classes for him where the conclusion always was, "Now do you see why it can't be done?" And Dr. Baran would always answer, "No."

The reason for the resistance to the Internet by everyone but a tiny corps of academics stems largely from the revolutionary technologies that the Internet introduced. Networked connections, packet switching, digital networks, and error correcting codes were all required to make the Internet work. These technologies either did not exist or were still in their infancy when ARPA originally looked at building a network using these ideas. Error correcting codes were critical because they prevented errors from creeping into what was a store-and-forward system; errors would turn information into a garble. Think of an audio tape recording. Now pretend that a copy is made. Will the quality be as good as the original? No, because errors will inevitably creep in. Now pretend that you copy the copy, and so on. Eventually the analog recording will copy so many errors (noise) that the tape will be indecipherable. The Internet can be viewed as a store-and-forward copying machine where packets of information are copied and moved from one node to another. On the Internet a message can be sent to the next node on the network an infinite number of times without errors creeping in.

The authors detail how Frank Heart at BBN and a small team translated the original outline of the Internet into reality. Their demands for network reliability were such that their hardware contractor could not understand why the BBN team considered 97% availability to be abysmal; the contractor had never seen even that sort of reliability before. The BBN team introduced its own innovations, like separating the Interface Message Processor (IMP) from the accessing computer, providing online network diagnostic programs, and perhaps most revolutionary, allowing up to 1,000 computers at a site to be connected to the network's local IMP, which we would now call a node.

No sooner did BBN start delivering network hosts than the computer scientists began demanding new software, and when they did not receive it, immediately started developing their own; they documented their work under the Requests For Comments (RFC) standards. From the very beginning the users started taking responsibility for organizing the Internet and shaping its development. ARPANET was only available to those researchers conducting DoD business, so other academics were frozen out of a rapidly growing area of technology.

As ARPANET expanded, problems developed that did not exist when the network was just a handful of IMPs. The BBN developers went to great lengths to ensure the reliability of their network, but troubles began to develop as information moved outside ARPANET and was sent from the terminal to the IMP and from the IMP to the terminal. The Transmission Control Protocol (TCP) was developed in response to these troubles. Instead of relying on the IMPs to move information around and ensure that it was sent correctly, TCP assumed an inherently unreliable network and made the destination hosts take responsibility for reassembling the packets of information and ensuring that no packets were lost.

The drive to expand the Internet constitutes the last half of the book; the success of that drive in creating NSFNet closes it.

What can librarians learn from the history of the Internet? The Internet, despite its origins in military research, has always been a democratic forum where ideas, working software, and technical proficiency have counted for more than corporate capitalization or market share in guiding its development. The anti-censorship attitude of the Internet pioneers has carried on to the present day, and the overturning of the Communications Decency Act is due in part to the distributed, decentralized design of the network, and the culture that has supported it. There is nothing new under the sun on the Internet. Flame wars were just as hot when Ph.D.s were reviling each other in the seventies as they are today when America Online users go at it. All of these historical antecedents bode well for librarians who wish for a network where free and unlimited access to information is the expectation, and not the exception.

What makes Where Wizards Stay Up Late enjoyable is the authors' gift for telling anecdotes about technology and people. Sometimes the line between people and machines blurs entirely, as in the published transcript of the "conversation" between a program named The Doctor, which mimicked a therapist, and a program named PARRY, which mimicked a paranoid psychotic. The conversation was as bizarre as can be imagined. Another story involves the reaction of a director of ARPA in the seventies who, while talking to a subordinate, saw a folder on the subordinate's desk with the eye-catching title, "Computer-Assisted Choreography". The director stopped the project right there.

Where Wizards Stay Up Late does disappoint in some respects. The technical fine points could have been treated in greater depth, and the book could have been longer and more detailed. Tracy Kidder's book, The Soul of a New Machine, proved that in the right hands, technical details are fascinating, not boring. The book spends a great deal of time detailing the theory behind the original packet-switching network and why three or four connections to other sites are ideal for network reliability. When the authors describe BBN's implementation of the scheme, they tersely say that the three to four connections per host model was thrown out for reasons of economy, but they do not describe the economies achieved. In addition, while the BBN and UCLA (University of California at Los Angeles) pioneers rightly get praised, the other early Internet pioneers get scant mention. There must be other pioneers whose contributions to the Internet are detailed in RFCs over the years. Even worse, for a book that has as its topic the creation of a whole new medium of communication, there are few Web links for supplemental information. The need for a new, more scholarly treatment of the Internet's origins has now been demonstrated. Perhaps someone will respond to that need soon. Where Wizards Stay Up Late will honorably fill the void until such a history becomes available.

Andrew Wohrley (WOHRLAJ@LIB.AUBURN.EDU), a contributor to the Cybrarian's Manual, is a librarian at Auburn University.