What Might Happen To Your Music After You Die (and What You Can Do About It)

Foreknowledge of the proper media for the ultimate storage of one’s scores, audio/video materials, and attendant documents is crucial, since it will often affect the choice of media for one’s current work. Basically, it comes down to two issues. First, the storage medium has to be stable over a long period of time (50-100 years, allowing that digital formats will have to be migrated to another format at least once during that period). Second, the medium has to be in a format that can be easily accessed and/or distributed. The first criterion can be satisfied in its most basic sense by merely preserving the score in a dark, humidity-controlled space, but this makes access difficult. Greater access can be obtained by digitizing the score in a format such as PDF, or doing the same with audio/video files on a CD/DVD, all of which can be more easily accessed via a local jukebox configuration or distributed over the Internet. But these strategies bring their own problems: complexity of administration and uncertain long-term viability. Operating systems change and playback software becomes obsolete, forcing the archival entity, at considerable expense, to continually ‘migrate’ the score or audio/video file to conform with the latest playback technology. As it turns out, these two criteria (long-term stability and accessibility) are usually at odds with each other, and together they account for the almost complete frustration and lack of progress among the archival communities in libraries and government entrusted with hammering out formats, protocols, and standards that can be used across the widest variety of disciplines and installed in the most libraries.

To Digitize or Not

The prospect of digitization as the long-term storage and preservation solution hangs like a merry-go-round ring just out of reach for the two principal organizations entrusted with storage protocols, namely the Society of American Archivists and the National Archives and Records Administration of the U.S. Government. Their Web sites are replete with forum papers and seminars in which they talk around the problem without suggesting solutions, owing to its sheer complexity. Before I sample some of the current discussion below, it should be stated what we mean by the material we are digitizing. First, of course, there is the artistic material itself, whether music, video, photo, graphic, text, or other documents. Second, there is what archivists call the ‘metadata’: all the information surrounding the artistic document, including its history, geographical location, storage format, dates of receipt and storage, the software and hardware used to retrieve and realize it, and authentication (a very real problem with certain types of files, such as text, which can easily be altered, compelling archivists to establish an authentication/security protocol), to name a few.

The case against digitizing:

The principal case against digitization is not in terms of the feasibility or stability of the storage medium, but rather the obsolescence of the playback medium that the file will eventually be realized on. For example, finding a late-model computer to read a 5.25-inch floppy disk — a format common only a few years ago — or the software to translate MacWrite or WordPerfect 4.0 is practically impossible. On a government and industry level, the retrieval problem is magnified: old DECtape and UNIVAC drives, which recorded massive amounts of government data, are long retired, and programs like FORTRAN II are history. The data stored by these machines in now obsolete formats are virtually inaccessible. Hardware and software manufacturers have shown more interest in discovering new technology than in preserving today’s data.

“My concept of digital preservation encompasses material that begins its life in digital form as well as material that is converted from traditional to digital formats. Recording media for digital materials are vulnerable to deterioration and catastrophic loss, and even under ideal conditions they are short lived relative to traditional format materials. Although archivists have been battling acid-based papers, thermo-fax, nitrate film, and other fragile media for decades, the threat posed by magnetic and optical media is qualitatively different. They are the first reusable media and they can deteriorate rapidly, making the time frame for decisions and actions to prevent loss a matter of years, not decades. More insidious and challenging than media deterioration is the problem of obsolescence in retrieval and playback technologies… Devices, processes, and software for recording and storing information are being replaced with new products and methods on a regular three- to five-year cycle, driven primarily by market forces. Records created in digital form in the first instance and those converted retrospectively from paper or microfilm to digital form are equally vulnerable to technological obsolescence. Another challenge is the absence of established standards, protocols, and proven methods for preserving digital information. With few exceptions, digital library research has focused on architectures and systems for information organization and retrieval, presentation and visualization, and administration of intellectual property rights (Levy and Marshall). The critical role of digital libraries and archives in ensuring the future accessibility of information with enduring value has taken a back seat to enhancing access to current and actively used materials. As a consequence, digital preservation remains largely experimental and replete with the risks associated with untested methods; and digital preservation requirements have not been factored into the architecture, resource allocation, or planning for digital libraries.”

She further states that “It seems ironic that just as libraries and archives are discovering digital conversion as a cost-effective preservation method for certain deteriorating materials, much information that begins its life in electronic form is printed on paper or microfilm for safe, secure long-term storage. Yet, high-quality acid neutral paper can last a century or longer while archival quality microfilm is projected to last 300 years or more. Paper and microfilm have the additional advantage of requiring no special hardware or software for retrieval or viewing. Perhaps this explains why in many digital conversion projects, the digital images serve as a complement to rather than a replacement for the original hard copy materials (Conway, 1994).”

The National Archives and Records Administration (NARA), the US government agency most responsible for storing the bulk of government records, takes a decidedly conservative approach to digital vs. paper/tape/microfilm storage. In fact, they have not yet implemented any significant effort in digitizing records and disseminating them over the Web, although they have recently developed highly specific and technical guidelines for doing so (go to the NARA Guidelines for Digitizing Archival Materials for Electronic Access). Even so, these guidelines do not include any source material that is time-based (music, video, film, as opposed to text and graphics).

Their Web site includes the following statement of policy: “In an era of digitization, why does NARA continue to microfilm records? Microfilm is a low-cost, reliable, long-term, standardized image storage medium. The equipment needed to view microfilm images is simple, consisting of light and magnification. The medium has a life-expectancy of hundreds of years. Digital images, on the other hand, consist of a wide variety of machine codes that require computer hardware and software to be made visible. To avoid the obsolescence of changing computer technology, digital images must be reformatted periodically. The cost of maintaining microfilm is small compared with that of digital images. Microfilm only needs shelving in a cool, dry place for a very long period of time.”

The case for digitizing on hard drive/optical media:

Two principal benefits accrue from digitizing audio/video sources, images, and paper documents. First, it is generally understood from the research I have done that non-tape digital storage media such as CD, CD-ROM, and DVD are more robust than analog or digital tape media. Hard drives, although not suitable for long-term storage, are a good choice for information and files that are regularly read and disseminated. Second, the benefits of greater accessibility and flexibility in handling the data are compelling with all non-tape digital media. Thus, although no single, large-scale program for disseminating digitized artistic material is now in place, a general tendency toward digitizing all artistic output for long-term storage seems inevitable. Even now, successful smaller-scale efforts are taking place, such as the RealAudio Composerver program under the direction of Tom Wells at Ohio State University as part of the Society of Composers, Inc., although none of these is directly concerned with establishing a permanent posthumous archive (though they conceivably could be).

Another compelling reason for digitizing is the spotty record of traditional storage methods and the instability of traditional media such as tape. In February 1995, National Public Radio reported that its vast audio recordings from the ‘70s were becoming unusable. NPR commiserated that its neighbor, the Smithsonian, had similar trouble with its audio holdings. Moreover, CBS found major degradation in its Vietnam-era video master tapes. Susan Stamberg and Walter Cronkite, lost forever? I personally have seen most of my master tapes recorded on the infamous Ampex 406/407 audiotape during the 1980s become virtually unplayable. Ralph Hodges, in “Things That May Not Last” (Stereo Review, September 1993, page 128), quoting Terry O’Kelly of BASF, bemoans “the tendency of some ferri-cobalt formulations to lose high frequencies with age. This instability, when present, seems to occur whether the tape is played frequently or not, and is not correctable. Print-through, on the other hand, can be addressed through the time-honored method of storing the afflicted cassette with Side A ‘tail out,’ meaning that you’ll have to rewind the cassette if you wish to play it from the beginning. Such storage will progressively erase the print-through that has occurred while encouraging the development of print-through in the opposite tape direction…”

“You can’t say you haven’t been warned that audio and eternity are eternally incompatible.”

Adrienne Petty, in The Wall Street Journal (October 4, 1993), says, “Contrary to popular opinion, videotape may not last for generations. In fact, it may last only 15 years, and failure to care for it may shorten its life even further.” Similar articles and horror stories about audiotape abound and must be heeded.

Similarly, in the realm of paper documents (musical scores, photos, graphic items), although historical documents have generally held up well under proper care, a new element of risk has been introduced by the ubiquitous use of computer printing and its papers and inks. A number of recent studies have shown these documents to be subject to fading, sometimes drastic, over a short period of time (as little as six months for some ink-jet printer inks). In an era when computer companies are more interested in quick and snappy printer results than in prints lasting 50-100 years, we simply have to be skeptical about any computer-printed paper document remaining in stable condition for very long. Even with microfilm, the method of choice for the Library of Congress, policy allows for a shelf life of only 100 years, provided the emulsion used is silver halide (the most common variety now in use). That may be well and good as far as it goes, but if and when the medium deteriorates and it becomes necessary to migrate the data to another microfilm, generation loss becomes an issue (as it does in photocopying or analog tape dubbing). With a digital medium, there is no generation loss.
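The absence of generation loss in digital media is easy to demonstrate: a cryptographic checksum of the original and of the copy will match exactly, no matter how many copy generations lie between them. The sketch below (a minimal example; the file names are hypothetical placeholders, not real archive items) hashes a stand-in "score" file before and after a copy and confirms the two are bit-for-bit identical.

```python
# Sketch: verifying that a digital copy is bit-for-bit identical to its source,
# illustrating the absence of generation loss in digital media.
# All file names below are hypothetical placeholders.
import hashlib
import os
import shutil
import tempfile

def sha256_of(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# A temporary file stands in for a digitized score.
workdir = tempfile.mkdtemp()
original = os.path.join(workdir, "score_master.pdf")
copy = os.path.join(workdir, "score_copy_gen10.pdf")

with open(original, "wb") as f:
    f.write(b"%PDF-1.4 example bytes standing in for a scanned score")

shutil.copyfile(original, copy)  # the "migration" step, repeatable indefinitely

# Matching digests prove the copy is identical: no generation loss.
assert sha256_of(original) == sha256_of(copy)
print("copy verified: no generation loss")
```

The same check can be rerun years later against an archived disc to confirm that the medium itself has not degraded, which is something no analog dub can offer.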

To make the case for the desirability of CD-R, CD-ROM, and DVD media, one can browse the various technical publications of the manufacturers linked from the CD-Info Company Web site. There you will also find information about the Compact Disc (CD) and Digital Versatile Disc (DVD) in all their forms (physical and data formats), especially how they are manufactured and used for electronic publishing, along with links to other Web sites with more information on these subjects. From these one gleans a consensus of a 50-70 year shelf life, longer for read-only media and somewhat shorter for writable media. Another number commonly bandied about is ‘1000 plays’ with no degradation. A superior link for DVD matters is the DVD Forum. And, for an opposing point of view promoting the new DVD+RW format, visit dvdrw.com.

Although I would not have any hesitation in adopting the CD-R format for long-term storage at this point, there may be reservations about the newer DVD format, namely:

the information being much more dense, allowing for less margin of error;

the lack of verification of company testing for long-term viability, due in part to the reluctance of individual companies to divulge proprietary specifications of this newer media; and

the dizzying plethora of DVD formats available.

To this date there is still no clear winner in DVD formats, especially in the audio world, and one would be foolish to try to choose a single format to project its viability into the future (for a comprehensive look at the current state of DVD, go to p. 144 in the Feb. 2001 issue of Electronic Musician).

As John C. Dvorak says in a recent Forbes magazine article, “Anarchy has reigned over DVD optical storage. Battling formats and technologies are confusing the marketplace. Recently a witches’ brew of a specification called DVD Multi has emerged as a way to end the feuding and halt customer confusion…”

Mr. Dvorak goes on to give a breakdown of current DVD formats:

DVD Video: Used for movies. Total capacity is 17 gigabytes if two layers on both sides of the disk are utilized, but typically only one layer of one side is used, which amounts to 4.7 gigabytes, or about one movie.

DVD-ROM: The same basic technology as DVD Video, with computer-friendly file formats. Used to store data. Should supplant CD-ROM soon.

DVD-R: Developed separately by Panasonic, Hitachi, Pioneer and Philips, this technology has standardized at 4.7 gigabytes. Fully compatible drives should ship by year-end at around $1,500 to $2,000 each. As with CD-R, the user can write only once to the disc. This is the format that was expected to be used to copy movies from DVD to DVD.

DVD-RAM: Developed by Hitachi, Toshiba and Panasonic, this makes a DVD act like a hard disk with a random read-write access. Aopen (Acer), JVC, LG, Samsung and Teac have joined this team. Products should be out by year-end. No prices have been announced. This was initially a 2.6-gigabyte drive but it, too, became a 4.7-gigabyte-per-side disc.

DVD-RW: Similar to DVD-RAM except that the technology mimics CD-RW and uses a sequential read-write access more like a phonograph than a hard disk. Developed by Pioneer. Ricoh and possibly Sony are expected to join forces. Has a read-write capacity of 4.7 gigabytes per side.

DVD+RW: A technology developed by Philips and Sony, initially designed to deliver 3 gigabytes per side, is expected to increase to 4.7 gigabytes. Sony seems to have lost interest in it while Philips announced plans to ship the device someday. No one else is taking it seriously.

DVD Audio: New audio format introduced by Panasonic that arguably doubles the fidelity of a standard CD. Should eventually replace the CD recording. Sony has gone its own way with the Super Audio CD (SACD).

HDVD: Developed by Sony and others to present high-definition TV signals from a special DVD. Nobody expects to see this for at least two years. It won’t be included in any DVD Multi specification.

The DVD Forum, a consortium of DVD technology companies, recognizes that the format chaos is costing them a bundle. Consumers are not going to invest time and money on a medium that risks being orphaned a year later. And so DVD Multi aims to deliver a truce that will draw consumers back to the retail counter.

While DVD Multi doesn’t make everyone adopt the same standard, it does intend for a DVD multiplayer/recorder to be able to read and write multiple formats. This kind of thinking years ago would have resulted in a VCR that played both VHS and Beta. It’s a fine idea that should make everyone happy–at a price. A multiplayer will require more components and redundant mechanisms such as multiple heads.

Computer and consumer (as in home theater) DVD drives using the trademarked DVD Multi logo will be required to read DVD-Video, -ROM, -Audio, -RAM, -RW and -R discs as well as standard CD-ROM and CD audio discs. In addition, the computer drive must be able to write on DVD-RAM, DVD-RW and DVD-R discs. If this device comes to market at a reasonable price, it’s what you should buy.

Although recent optical CD media have become more predictable and standardized, there is still a choice between the two dyes used, cyanine and phthalocyanine, and their implications for long-term viability. Dana J. Parker, the co-author of New Riders’ Guide to CD-ROM, CD-ROM Fundamentals, and CD-ROM Professional’s CD-Recordable Handbook, in a posting on the CD Info Company’s Web site writes, “…there are other, very important aspects to evaluating CD-R media besides estimated longevity and a preference for chicken soup or pea soup. It is not quite so cut-and-dried as ‘phthalocyanine discs last longer, so they are better.’ There are far too many CD-R users who, from long experience, swear by cyanine media as staunchly as you and others do by phthalocyanine. As it turns out, they have good reason to do so… It is true that phthalocyanine dye is less sensitive to ordinary light — incoherent, random light such as sunshine, ultra violet, incandescent, and fluorescent light normally found in the real world outside of CD recorders. That means that prolonged exposure to bright light — particularly bright UV light — will render cyanine media unreadable sooner than phthalocyanine. Phthalocyanine will probably last longer and preserve information better under these adverse, but extremely unlikely conditions. If we store information on CD-R media that is so valuable as to merit preservation for a long period of time — say 30 years or more, assuming, of course, that there will be hardware capable of playing the disc at that point in the future — are we going to leave those precious discs laying out in the light and heat? No, we are going to store them carefully in their jewel cases, away from the light, heat, and scratches that are the biggest threats to data loss. Then again, if an application does not require that the data remain readable 30 days from now, who cares if the data fades in 50 years or 100? The important thing is how reliably the disc can be written and read today.” (That is the other side of light sensitivity, and it’s a significant one: it ensures that cyanine media offers a higher likelihood of compatibility with more CD recorders.)

“Most existing CD Recorders are designed to record to cyanine media. Some CD players and CD-ROM drives will read discs recorded on cyanine media more readily and reliably than they will read discs recorded on phthalocyanine media. This compatibility is tied in with a little-discussed concept known as write strategy.”

Personally, I would strongly recommend the following for all composers concerned with their work being available in the future: acquire a CD burner and record everything you can as a documentary archive (music, papers, video, images, etc.) on CD-R. On a separate CD-R, record an index of the recordings in the most standard text format available. Do this on two separate formulations of CD (I am not recommending brands, since manufacturers change formulations too often), producing two identical collections of documents. Although the technical information and protocols of DVD burning are still in the formative stages, DVD too may be a good choice for the future, especially for video, though not quite yet for audio.
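The plain-text index recommended above can be generated automatically. The sketch below (a minimal example under my own assumptions; the folder layout and file names are hypothetical, and the checksum column is my addition, not part of the article’s recommendation) walks a folder of materials destined for CD-R and writes one line per file with its SHA-256 checksum, size, and name, so a future archivist can both locate items and verify that the discs still read correctly.

```python
# Sketch: generating a plain-text index, with checksums, for a folder of
# archival materials before burning it to CD-R. The folder and file names
# below are hypothetical placeholders.
import hashlib
import os
import tempfile

def build_index(archive_dir, index_path):
    """Write one line per file: SHA-256 checksum, size in bytes, relative path."""
    with open(index_path, "w", encoding="ascii") as index:
        for root, _dirs, files in os.walk(archive_dir):
            for name in sorted(files):
                full = os.path.join(root, name)
                rel = os.path.relpath(full, archive_dir)
                with open(full, "rb") as f:
                    digest = hashlib.sha256(f.read()).hexdigest()
                index.write(f"{digest}  {os.path.getsize(full):>10}  {rel}\n")

# Demonstration with a stand-in archive folder.
archive = tempfile.mkdtemp()
for fname in ("string_quartet_no1.pdf", "tape_piece_1984.wav"):
    with open(os.path.join(archive, fname), "wb") as f:
        f.write(fname.encode("ascii"))

# The index lives outside the archive folder, mirroring the advice to put
# the index on a separate CD-R.
index_file = os.path.join(tempfile.mkdtemp(), "INDEX.txt")
build_index(archive, index_file)
with open(index_file) as f:
    print(f.read())
```

Plain ASCII text is the deliberate choice here: as the NARA discussion earlier suggests, the less machinery a format needs for retrieval, the longer it is likely to remain readable.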

Preservation of composer-specific formats:

Many of us have produced creative works in non-standard formats: works driven by custom software, using spatialization and sound-diffusion protocols, relying on hardware instruments for sound realization, employing multimedia formats, or belonging to performance art and improvisation. Even such currently ubiquitous ‘non-standard’ formats as MAX/MSP must be considered, if history has any relevance, almost certain to be worthless as systems for recreating and performing today’s art in the posthumous future. I would even go so far as to say that the CD-ROM should be considered suspect as a viable vehicle for future realization of current data. Notwithstanding the relative ease of preserving the actual data in these formats, the problem will lie in the lack of a hardware/software/system-software combination to play or realize them on. And it is clear from my research that no arts archival organization or library will be willing or able to deal with the complexities of any but the most standard playback/realization systems. As a general guide to determining which formats will last and deserve use as preservation vehicles, I would employ a simple rule: use only those formats that are widely accepted in a multitude of educational, societal, archival, and cultural venues, and that touch many disparate academic and artistic disciplines.

NewMusicBox, a multimedia publication from New Music USA, is dedicated to the music of American composers and improvisers and their champions. NewMusicBox offers: in-depth profiles, articles, and discussions; up-to-the-minute industry news and commentary; a direct portal to our internet radio station, Counterstream; and access to an online library of more than 57,000 works by more than 6,000 composers.