<h1>Next Generation Student Resources: A Speculative Primer</h1>
<div class="field field-name-field-author field-type-node-reference field-label-hidden clearfix">
<div class="markup">by</div>
<div class="field-items">
<div class="field-item even">Susan Schreibman</div>
</div>
</div>
<div class="field field-name-field-publication-date field-type-datetime field-label-hidden"><div class="field-items"><div class="field-item even"><span class="date-display-single">2003-11-08</span></div></div></div><div class="field field-name-field-source-url field-type-link-field field-label-inline clearfix"><div class="field-label">Source URL:&nbsp;</div><div class="field-items"><div class="field-item even"></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The World Wide Web is both a source of frustration and richness for educators. It is a source of frustration in that students plagiarize from it more easily than from published texts, while they do not seem to be able to differentiate reliable from unreliable resources. Our own searches often reveal substandard source material, particularly when held in comparison with print publication. Some educators refrain from using the World Wide Web in the classroom because they feel intimidated by their students’ seemingly superior ability to navigate virtual space. Yet, it is also a resource of immense richness. Less than a decade after Mosaic, the first Web browser, was launched, users from all over the globe have access to primary materials that were previously the preserve of scholars. They have access to images of unprecedented clarity, entire novels that can be downloaded onto e-readers, and virtual libraries that would make even Alexander jealous. 
As more information is mediated through the World Wide Web, educators will need to find a balance between the suspicion that every student paper is at least partially cut and pasted from the Web, and the realization that by introducing students to the artifacts of primary research, such as seventeenth-century missionary maps of Latin America, copies of Emily Dickinson’s holograph manuscripts, or the first movies produced by Thomas Edison’s studio, their appreciation of and engagement with the arts and humanities will deepen. Facilitating access to these objects is but the first step in the engagement process. The second step is to create an environment wherein students can become contributors to this docuverse, shifting the balance of power from consumers to providers of knowledge. Currently, the practice of utilizing the World Wide Web as a pedagogic tool tends to fall into two broad categories: 1) utilizing the Web as a resource, i.e. integrating previously digitized material into teaching practice, and 2) utilizing Web technology so that students become content providers. Although these practices are interrelated, they are not mutually dependent. This article will explore these two categories, surveying current practice and speculating on how it may change in the future.</p>
<h2>The World Wide Web as a Resource</h2>
<p>For many, locating high-quality, reliable primary texts on the Web is rather like embarking on a quest. For me, the quest invariably leads to very specific resources: i.e. electronic editions. These editions tend to be lightly contextualised, can be difficult to navigate, and assume some previous knowledge of the subject. They also tend to use Standard Generalised Markup Language (SGML) or Extensible Markup Language (XML) for encoding, rather than Hypertext Markup Language (HTML). This distinction in encoding is often invisible to the user, as SGML and XML texts are ‘converted on the fly’ to HTML. In other words, the server on which the text resides processes the SGML/XML files into HTML so that by the time the user views them in her browser they look like any other Web document. The distinction, however, is important. Texts and images which contain metainformation encoded in SGML/XML allow for robust searching not possible in HTML. Thus, in <span class="booktitle">The Blake Archive</span> ( <a class="outbound" href="http://jefferson.village.virginia.edu/blake">http://jefferson.village.virginia.edu/blake</a>) it is possible to search on all images that contain representations of angels. The search can be further refined to return only those images in which angels appear with ‘dark-skinned’ children. <cite id="note_1"><span class="booktitle">The Blake Archive</span> allows searching on text or images. The image search is based on a list of metainformation terms devised by the editors that categorize Blake’s work through four main rubrics: Animals, Vegetation, Objects and Structures. The text search allows for plain text and Boolean searches, as well as ways of refining those searches, such as limiting a search to the titles of poems.</cite></p>
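<p>To make the distinction concrete, the kind of metainformation search described above can be sketched in a few lines of code. The markup below is invented for illustration (it is not the actual schema of <span class="booktitle">The Blake Archive</span>), but it shows why encoded terms permit queries that plain HTML of the period could not support.</p>

```python
import xml.etree.ElementTree as ET

# Hypothetical descriptive markup (invented for illustration): each image
# carries metainformation terms that a search engine can query directly,
# something presentational HTML could not express.
ARCHIVE = """
<archive>
  <image id="plate01">
    <component term="angel"/>
    <component term="child" characteristic="dark-skinned"/>
  </image>
  <image id="plate02">
    <component term="angel"/>
  </image>
</archive>
"""

def find_images(xml_text, *required):
    """Return the ids of images whose encoded labels include every query term."""
    root = ET.fromstring(xml_text)
    hits = []
    for image in root.findall("image"):
        labels = set()
        for component in image.findall("component"):
            labels.add(component.get("term"))
            if component.get("characteristic"):
                labels.add(component.get("characteristic"))
        if all(term in labels for term in required):
            hits.append(image.get("id"))
    return hits

print(find_images(ARCHIVE, "angel"))                  # ['plate01', 'plate02']
print(find_images(ARCHIVE, "angel", "dark-skinned"))  # ['plate01']
```

<p>Because ‘angel’ and ‘dark-skinned’ are recorded as data rather than rendered prose, the query never has to guess at meaning from surface text; this is the robustness that SGML/XML encoding adds.</p>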
<p>Many of these archives are located at and supported by humanities computing centers, such as The Institute for Advanced Technology in the Humanities at University of Virginia ( <a class="outbound" href="http://jefferson.village.virginia.edu">http://jefferson.village.virginia.edu</a>), which hosts projects such as Ben Ray’s <span class="booktitle">The Salem Witchcraft Trials, 1692-1693: A Thematic Research Archive</span>, Jerome McGann’s <span class="booktitle">The Complete Writings and Pictures of Dante Gabriel Rossetti: A Hypermedia Research Archive</span>, and Stephen Railton’s <span class="booktitle">Uncle Tom’s Cabin and American Culture</span>, or The Humanities Computing Unit at Oxford University which hosts <span class="booktitle">The Wilfred Owen Multimedia Digital Archive</span> (<a class="outbound" href="http://www.hcu.ox.ac.uk/jtap/">http://www.hcu.ox.ac.uk/jtap/</a>). Other humanities archives originate in libraries, such as the University of North Carolina at Chapel Hill’s Academic Affairs Library which is host to <span class="booktitle">Documenting the American South</span> (<a class="outbound" href="http://docsouth.unc.edu/">http://docsouth.unc.edu/</a>) featuring several thematically-based textual archives, including an extensive collection of slave narratives, as well as <span class="booktitle">The Southern Homefront, 1861-1865</span>, which charts Southern life during the Civil War. Other archives are the result of a consortium of scholars who have created standards for electronic editions in their fields, such as <span class="booktitle">The Model Editions Partnership</span> (<a class="outbound" href="http://mep.cla.sc.edu/">http://mep.cla.sc.edu/</a>) whose goal is to explore ways of creating electronic editions of historical documents which meet the standards scholars traditionally use in preparing print editions. 
<cite id="note_2">For a more comprehensive (although by no means complete) list of humanities projects encoded in SGML, see The Text Encoding Initiative’s list of projects, <a class="outbound" href="http://www.tei-c.org/Applications/index.html">http://www.tei-c.org/Applications/index.html</a></cite></p>
<p>Resources such as these are created by a subject area expert or team of experts, often working with graduate students, undergraduates, librarians, and technical staff who have the institutional support necessary for compiling, digitizing, encoding and designing what is, by digital standards, even a small archive. The creation of these electronic editions is not unlike the work scholars have traditionally engaged in when editing for print publication in that typically resources are collected from a number of locations and synthesized into a single text (or in this case collection of texts) with additional value added through apparatus. The apparatus in a hypertextual environment may differ significantly from that of print: it may take the form of metainformation generated through the encoding process itself and/or terminology applied to the text (much like index terms) that facilitates searches not available through plaintext searching. Metainformation may also be applied to images in the form of a text header so that they too may be searched through a search engine. <cite id="note_3">At present, retrieval rates for image-based searching lag far behind those for text-based searching. In recent years, however, image-based retrieval has made great strides. An excellent resource for image-based humanities computing is Matthew Kirschenbaum’s Looksee at <a class="outbound" href="http://www.glue.umd.edu/~mgk/looksee/">http://www.glue.umd.edu/~mgk/looksee/</a></cite> The archive may also contain a hybrid structure of apparatus, such as introductory essays and textual and/or contextual notes, in addition to apparatus generated from the encoding itself.</p>
<p>Other rich humanities resources are created by cultural institutions, and have given birth to a new genre of online catalogue. These catalogues digitally reproduce objects from their own holdings while providing access through electronic finding aids. Yet, to simply call these resources online catalogues belies their richness. The Library of Congress’s <span class="booktitle">American Memory Project</span> (<a class="outbound" href="http://memory.loc.gov/">http://memory.loc.gov/</a>), for example, serves as a gateway to primary source materials relating to the history and culture of the United States. The wealth of digital material here is staggering: over seven million digital items (including text, still and moving images, and sound files) from more than 100 historical collections.</p>
<p>These types of resources have come to dominate primary source material in the humanities on the World Wide Web. As mentioned previously, they tend to favour primary material that has been rigorously transcribed, encoded and digitised. This, however, has not always been the case. In the early to mid 1990s, many humanities resources were created which did not envision a specific reading audience. This is due, no doubt, to the early perception of the Web as a democratized space that could inherently overcome the audience-specificity of print publication. This idea was developed by early hypertext theorists, many of whom were also early contributors to humanities resources on the Web, who believed that, freed from the temporal and spatial restrictions of the codex, sites could be fashioned which served many masters: that audiences would be self-defining and inherently understand how to navigate the multifarious resources available to them at the click of a mouse. By the end of the decade, however, it became clear that the World Wide Web, like any other space, is one of atomization, that the implicit and unstated ideal audience envisioned by resource creator(s) was self-selected, and sites that not only found - but retained - their audiences were those in which the editors’ ideal audience found their flesh and blood equivalents.</p>
<p>The problem with many early humanities resources is that they simply port codex norms into the electronic environment. What many creators of digital resources seem to have forgotten during the early stages of developing material for the Web is that ‘form reshapes content’ (Burbules). Or perhaps site editors/designers were seduced into thinking that they were reshaping content by integrating the functionality afforded by that overused and over-praised HTML hyperlink. Some of the earliest and, indeed, most successful of these resources have fallen victim to their own success, becoming large, unwieldy structures with a preponderance of hyperlinks which send readers down tenuously associated trails in a vaguely circuitous fashion. Others have become a testament to a theoretical understanding of what could be achieved in the medium at a specific point in time, such as George Landow’s <span class="booktitle">The Victorian Web</span> or Stuart Curran’s <span class="booktitle">Frankenstein; or, the Modern Prometheus</span>. Others have caused the medium to be denigrated as a pedagogic tool because they were hastily conceived, filled with promises of valuable information which never appeared, found their way into the search engines, and, regrettably, were never taken down. Many of these sites are fragmentary, in that, freed from the spatial restrictions of the codex, the ambitions for the resource were far beyond what their editors/designers could accomplish. In the early days of hypertext, very few people realized the costs associated with these new editions: freed from print publication costs, editors never reckoned on the enormous time commitment in digitizing objects and designing the space within which they would be contained. This is not to say that there are no valuable resources conceived and implemented in Hypertext Markup Language. 
<span class="booktitle">Romantic Circles</span> (<a class="outbound" href="http://www.rc.umd.edu/">http://www.rc.umd.edu/</a>) or <span class="booktitle">Columbus and the Age of Discovery</span> ( <a class="outbound" href="http://muweb.millersville.edu/~columbus/">http://muweb.millersville.edu/~columbus/</a>), for example, are cases in point. The Perseus Digital Library (<a class="outbound" href="http://www.perseus.tufts.edu/">http://www.perseus.tufts.edu/</a>), one of the oldest and most successful humanities computing projects on the Web, began as a resource that concentrated exclusively on ancient Greek culture, but has since expanded its scope to include Roman and now Renaissance materials. Its goal, like so many humanities computing projects, is not only to make available digitized versions of humanities objects, but to study the possibilities (and limitations) of the electronic medium, as well as to serve as the foundation for work in new cultural domains (Crane). <cite id="note_4">An excellent source for locating humanities resources on the web is The National Endowment for the Humanities, a gateway featuring ‘the best of the humanities on the web’. See <a class="outbound" href="http://edsitement.neh.fed.us">http://edsitement.neh.fed.us</a></cite></p>
<p>Being able to locate high-quality, reliable Web resources is not only essential for integration into teaching practice; it is a skill that students need to acquire. Although they may be able to surf with greater ease than many of their teachers, they are far too frequently not discriminating Web users. A case in point was the junior-level student in my Survey of American Literature class who dutifully cited and footnoted the first paragraph of an article from an Internet site that sold research papers. Students need to be educated as to how to use Web resources as a ‘means and medium of interaction and work’ (Burbules), not simply as an information free-for-all. Rarely do students question the information they receive. Far too frequently they do not attempt to discern if a source is reliable. This is not surprising. Modern systems of education have provided students with a mediated informational environment, through textbooks, through school and public libraries, through reading packs provided by teachers and professors. While it often seems that anyone under the age of 25 inherently knows how to use a mouse, that ability has not provided them with the skills to evaluate the overwhelming amount of information generated by a search engine. Students do not always understand that the characters after the http:// on their Web browsers are a significant set of signs and not a meaningless string. The Internet will be a life-long learning tool for an increasing number of people worldwide; thus it is incumbent on educators to create a knowledge base so that students can navigate this decentered, destabilized informational resource.</p>
<h2>The World Wide Web as a Tool</h2>
<p>Much current pedagogic practice utilizes Web-based material as a visually and aurally enhanced textbook mirroring the power and pedagogical relationships of the codex. To utilize Web technology to teach students how to become content providers requires a conceptual shift from thinking of the technology as a machine which lends itself to automated and routine actions, such as typing a string into a search engine and clicking on the results, to thinking of the technology as a tool which lends itself to manipulation as an extension of the user (Muffoletto 93). These tools, models, or indeed games allow students to become collaborators in content creation through a framework established by the technology. Part of the reason the development of Web-based pedagogic spaces has lagged behind the creation of Web-based scholarly resources is that our thinking is still, by and large, framed by the codex. We think in pages, chapters, and paragraphs. We think in annotation and footnotes. Replicating codex norms in hypertext may not be the best use of a digital space, as Jenny Lyn Bader seems to suggest in the following passage from The New York Times:</p>
<p class="longQuotation">The reading process mourned by scholars who thought footnotes superior to endnotes - who preferred the process of interruption, mainstream re-evaluation, and jumping around - is the natural process of reading on the Web. Small children who would not normally read books with footnotes until secondary school know their way around bright blue hyperlinks. They learn early that a Web site isn’t complete without references to other sites, and that the cooler a site, the cooler its links.</p>
<p>Many collaborative Web spaces have a game-like quality, for example Jerome McGann and Joanna Drucker’s <span class="booktitle">The Ivanhoe Game</span>, and Neil Fraistat and Steven E. Jones’s <span class="booktitle">MOOzymandias</span>. <span class="booktitle">The Ivanhoe Game</span> was developed ‘to use digital tools and space to reflect critically on received aesthetic works (like novels) and on the process of critical reflection that one brings to such works’ (McGann, <span class="booktitle">Ivanhoe</span>). Players of <span class="booktitle">The Ivanhoe Game</span> not only engage with aesthetic works in performative ways, but intervene in them within an environment which puts their ‘critical and reflective operations on clear display’. In playing the game, the players, in effect, perform the novel, making critical and aesthetic decisions about the text which, in fact, create a new and evolving narrative. <span class="booktitle">The Ivanhoe Game</span> thus becomes, like <span class="booktitle">MOOzymandias</span>, a “ ‘pedagogical edition’ that students build, mutate and inhabit rather than merely read” (Fraistat). The site of <span class="booktitle">MOOzymandias</span> is a MOO (Multiuser Object-Oriented Environment), a text-based, virtual reality space that allows multiple users to connect to the same place at the same time. MOOs differ from conventional chat rooms in that they allow users to manipulate and interact with cyber objects in addition to live communication (Multiuser). <span class="booktitle">MOOzymandias</span> utilises a MOO space in which the physicality of the Villa Diodati (the Swiss country house rented by Lord Byron in the summer of 1816 where Mary Shelley’s <span class="booktitle">Frankenstein</span> was conceived) becomes a virtual environment for exploring Romantic literature, including Percy Bysshe Shelley’s ‘Ozymandias’ and Samuel Taylor Coleridge’s <span class="booktitle">Rime of the Ancient Mariner</span>. In this space students interact with one another, with teachers, as well as with virtual objects to explore the meaning and origins of the primary text(s) around which a particular MOO was developed. Students also have the ability to add to an extant MOO, or indeed, construct their own. Like <span class="booktitle">The Ivanhoe Game</span>, <span class="booktitle">MOOzymandias</span> utilises digital game playing to create a performative and critically reflective immersive digital environment which teaches students, on the one hand, about literary texts and print textuality and, on the other, about editing virtual spaces and visual literacy (Fraistat).</p>
<p>Thus, rather than view the World Wide Web as yet another contributing factor in the marginalization of the humanities, it can be seen as having the potential to revitalize the teaching of humanities disciplines by challenging existing pedagogic practice. This does not happen through the use of technology alone, but through a shift in thinking from asking how new technologies can be accommodated within existing pedagogic practice to asking how they can stimulate new learning environments (Salomon). The game-like pedagogies of <span class="booktitle">The Ivanhoe Game</span> and <span class="booktitle">MOOzymandias</span> are cases in point. They are ‘subversive technologies’ which have the ability to stimulate pedagogic changes that affect classroom culture (Salomon). Another subversive technology is the integration of the rich array of previously digitized humanities objects freely available on the World Wide Web into learning environments which facilitate the co-construction of knowledge. Many of the originals of these objects are located in archives that only admit scholars, or are in museums too distant to be visited by students. As mentioned in the first part of this article, there are already thousands of humanities artifacts freely available on the Web. These artifacts can be utilized in environments that not only teach students the basic skills of humanities research, but involve them in the excitement of the investigative nature of working with primary sources. By creating a framework which allows students to experiment with the ordering of manuscript drafts of a poem by Emily Dickinson, students can work in an environment which allows them to create a versioned edition. Furthermore, students could be asked to justify their ordering, through a scholarly introduction and apparatus. This type of learning environment introduces students to the skills of textual analysis as well as literary scholarship. 
<cite id="note_5">For a working model of such a software system, see <a class="outbound" href="http://www.mith2.umd.edu/products/ver-mach">http://www.mith2.umd.edu/products/ver-mach</a></cite> Students, in the role of scholarly editors, not only become knowledge providers, but understand the process by which the texts they are asked to read are created.</p>
<p>A learning space might be imagined in which college undergraduates create scholarly editions intended for high school seniors. The undergraduates would be asked to pay specific attention to the types of information they would have found useful several years earlier. Their edition might include a critical introduction as well as annotation in the form of text, images and sound. Furthermore, students might conduct usability studies by having several high school classes utilize their edition. Students of history may be asked to create a multi-media timeline utilizing a template that facilitates integrating text, images, sound and video into a timeline rubric. Students may be asked to construct a timeline from a particular historical/cultural perspective, then asked to re-imagine that same time period from an alternative perspective, thus teaching students how the production and interpretation of cultural texts can be re-imagined from different cultural, religious, sexual, psychological, economic and/or political perspectives. The design of these learning environments would encourage constructivist learning by stressing the active co-construction of contextualized knowledge as well as Webs of relations amongst their nodes. They would also facilitate a shift away from a teacher-centered learning environment to an interactive community of active learners (Salomon).</p>
<p>Why, then, have more of these active learning environments not been created for use with the World Wide Web? One reason may be the limitations of Web technology itself. Hypertext Markup Language is an unmalleable encoding language: it composes itself in rigidly hierarchical structures, with the hyperlink the only way out. In recent years, however, Web browsers have become more sophisticated, with a variety of plug-ins that more easily accommodate constructivist learning environments. In addition, as HTML quietly fades into the background in favour of XML (Extensible Markup Language) and its sister technologies, it will be easier for new structured environment models for collaborative learning to be developed. The combination of generic Web-based tools and customized ones, of interaction and co-collaboration, promises an exciting new era of humanities scholarship and education. If studying literature becomes as enjoyable as surfing the Web, or studying history becomes as much fun as playing a virtual reality game; if we can engage students with objects and events hundreds or thousands of years old through the language and games to which they relate, we can revitalize disciplines too many students see as ancillary to their lives in the twenty-first century. And in reimagining our disciplines for our students, we reimagine them for ourselves, creating new hypotheses for reading the past, the present, and the future, generated out of and through the same media.</p>
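<p>The ‘on the fly’ conversion described earlier, and the separation of content from presentation that XML promises, can be sketched as follows. The encoding is invented for illustration; a real project would use a TEI schema and an XSLT stylesheet, but the principle is the same: the server holds structured XML and renders HTML only at the moment of delivery.</p>

```python
import xml.etree.ElementTree as ET

# A structurally encoded text (invented example): the markup records what
# each piece IS (a title, a numbered line), not how it should look.
POEM = """
<poem>
  <title>Ozymandias</title>
  <line n="1">I met a traveller from an antique land</line>
  <line n="2">Who said: Two vast and trunkless legs of stone</line>
</poem>
"""

def to_html(xml_text):
    """Render the structural encoding as presentational HTML, as a server
    converting XML 'on the fly' would do before the page reaches the browser."""
    poem = ET.fromstring(xml_text)
    parts = ["<h1>%s</h1>" % poem.findtext("title")]
    for line in poem.findall("line"):
        parts.append('<p class="line" id="l%s">%s</p>' % (line.get("n"), line.text))
    return "\n".join(parts)

print(to_html(POEM))
```

<p>Because the poem’s structure survives in the XML source, the same text can be rendered differently for different audiences, searched by line number, or re-used in a new learning environment without re-keying, which plain HTML, having flattened structure into presentation, cannot offer.</p>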
<h2>Bibliography</h2>
<p>Bader, Jenny Lyn. ‘Ideas &amp; Trends: Old Media, Meet New Media; forget Those Old-Fashioned Footnotes. Hyperlink’. <span class="booktitle">The New York Times</span>. (16 July 2000) Section 4, p.1.</p>
<p>Burbules, Nicholas C. and Thomas A Callister, Jr. ‘Universities in Transition: The Promise and the Challenge of New Technologies’. <span class="journaltitle">Teachers College Record</span>. 102:2 (April 2000) 271-93.</p>
<p>Crane, Gregory. ‘The Perseus Project and Beyond: How Building a Digital Library Challenges the Humanities and Technology’. <span class="journaltitle">D-Lib Magazine</span>. (January 1998). <a class="outbound" href="http://www.dlib.org/dlib/january98/01crane.html">http://www.dlib.org/dlib/january98/01crane.html</a></p>
<p>Fraistat, Neil and Steven E Jones. ‘Immersive Textuality: The Editing of Virtual Spaces’. Paper proposal for ACH/ALLC conference 2001. <a class="outbound" href="http://www.nyu.edu/its/humanities/ach_allc2001/papers/fraistat/index.html">http://www.nyu.edu/its/humanities/ach_allc2001/papers/fraistat/index.html</a></p>
<p>Hardwick, Susan W. ‘Humanising the Technology Landscape through a Collaborative Pedagogy’. <span class="journaltitle">Journal of Geography in Higher Education</span>. 24:1 (March 2000) 123-129.</p>
<p>Hiltz, Starr Roxanne. <span class="booktitle">The Virtual Classroom: Learning Without Limits via Computer Networks</span>. (Norwood, New Jersey: Ablex Publishing Co, 1995).</p>
<p>McGann, Jerome. “Imaging What You Don’t Know: The Theoretical Goals of the Rossetti Archive”. <a class="outbound" href="http://jefferson.village.virginia.edu/%7Ejjm2f/chum.html">http://jefferson.village.virginia.edu/%7Ejjm2f/chum.html</a></p>
<p>McGann, Jerome and Joanna Drucker. <span class="booktitle">The Ivanhoe Game</span>. September 2000. <a class="outbound" href="http://jefferson.village.virginia.edu/~jjm2f/Igamesummaryweb.htm">http://jefferson.village.virginia.edu/~jjm2f/Igamesummaryweb.htm</a></p>
<p><span class="booktitle">Multiuser Object Oriented Environment</span>. Athena University. 1998. <a class="outbound" href="http://www.athena.edu/campus/moo.html">http://www.athena.edu/campus/moo.html</a></p>
<p>Muffoletto, Robert. ‘The Expert Teaching Machine: Unpacking the Mask’. <span class="booktitle">Computers in Education: Social, Political and Historical Perspectives</span>. Ed. Robert Muffoletto and Nancy Nelson Knufer. (Cresskill, New Jersey: Hampton Press, 1993) 91-103.</p>
<p>Salomon, Gavriel. ‘Educational Psychology and Technology: A Matter of Reciprocal Relations’. <span class="journaltitle">Teachers College Record</span>. 100:2 (Winter 1998) 222-41.</p>
<p>Whipple, W.R. ‘Collaborative Learning: Recognizing it When We See It’. <span class="booktitle">Bulletin of the American Association for Higher Education</span> 40:2 (1987) 3-7.</p>
</div></div></div><div class="field field-name-field-tags field-type-taxonomy-term-reference field-label-above"><div class="field-label">Tags:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/tags/www">WWW</a>, <a href="/tags/world-wide-web-0">world wide web</a>, <a href="/tags/blake-archive">blake archive</a>, <a href="/tags/looksee">looksee</a>, <a href="/tags/internet-research">internet research</a>, <a href="/tags/unversity-virginia">unversity of virginia</a>, <a href="/tags/online-scholarship">online scholarship</a>, <a href="/tags/humanities-search-engines">humanities search engines</a>, <a href="/tags/american-memory-project">american memory project</a>, <a href="/tags/html">html</a>, <a href="/tags/victorian-web">victorian web</a>, <a href="/tags/frankenstein">frankenstein</a>, <a href="/tags/columbus">columbus</a>, <a href="/tags/ivanhoe-game">ivanhoe game</a>, <a href="/tags/moozymandius">moozymandius</a>, <a href="/tags/xml">xml</a>, <a href="/tags"></a></div></div></div>Tue, 31 Jan 2012 16:25:05 +0000EBR Administrator893 at http://www.electronicbookreview.comBefore and After the Web: George P. Landow (interviewed by Harvey L. Molloy)http://www.electronicbookreview.com/thread/technocapitalism/uncenterable
<div class="field field-name-field-author field-type-node-reference field-label-hidden clearfix">
<div class="markup">by</div>
<div class="field-items">
<div class="field-item even">George Landow</div>
</div>
</div>
<div class="field field-name-field-publication-date field-type-datetime field-label-hidden"><div class="field-items"><div class="field-item even"><span class="date-display-single">2003-09-13</span></div></div></div><div class="field field-name-field-source-url field-type-link-field field-label-inline clearfix"><div class="field-label">Source URL:&nbsp;</div><div class="field-items"><div class="field-item even"></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p class="longQuotation">George P. Landow is Professor of English and Art History at Brown University. This interview was conducted while he was on leave from Brown University and was Shaw Professor of English and Digital Culture and Director of the University Scholars Programme at the National University of Singapore. His books on hypertext and digital culture include <span class="booktitle">Hypermedia and Literary Studies</span> (MIT, 1991), and <span class="booktitle">The Digital Word: Text-Based Computing in the Humanities</span> (MIT, 1993) both of which he edited with Paul Delany, and <span class="booktitle">Hypertext: The Convergence of Contemporary Critical Theory and Technology</span> (Johns Hopkins UP, 1992), which has appeared in various European and Asian languages and as <span class="booktitle">Hypertext in Hypertext</span> (Johns Hopkins UP, 1994), a greatly expanded electronic version with original texts by Derrida, reviews, student interventions, and works by other authors. In 1997, he published a much-expanded, completely revised version as <span class="booktitle">Hypertext 2.0</span>. He has also edited <span class="booktitle">Hyper/Text/Theory</span>. (Johns Hopkins UP, 1994).</p>
<p class="longQuotation">Harvey L. Molloy is an Assistant Professor in the University Scholars Programme at the National University of Singapore (NUS). His research interests include information design and digital arts. He has seven years experience in the design industry working as an information designer and has worked for clients in the diverse fields of telecommunications, finance, education and the arts. He is currently the Programme’s Web editor.</p>
<h2>The Web and Hypertext</h2>
<p><span class="emphasis">HM:</span> During the 90s, the Web came to dominate how we think about hypertext. What do you think about this domination?</p>
<p><span class="emphasis">GL:</span> As someone who believes that the model of networked - i.e. uncentered, nonhierarchical - digital technology offers important potential for education, educational institutions, scholarly and creative work, and society as a whole, I am fascinated and delighted by the way the Web has taken hold. As someone who came from the pre-Web hypertext community, I am saddened that people have had to settle for such an impoverished version of hypertextuality.</p>
<p>When I look back upon the history of hypertext, I realize that WWW is a kind of latter-day Hypercard in disseminating the idea and use of this kind of infotech: Like Hypercard, which came into being only after dozens of far richer systems had appeared, it appears free and extraordinarily easy to use. Of course, as soon as one tries to do anything rich and strange with either HTML or Hypercard, one begins to experience it much as boat owners tell me one experiences owning a sail boat - as a giant hole into which one pours unlimited time and money.</p>
<p>The lesson of both Hypercard and WWW seems to be that this misleadingly easy first experience leads to great success; the lesson of WWW seems to be that the networked model - this first step towards Nelson’s Docuverse - matters more than anything else.</p>
<p><span class="emphasis">HM:</span> What are the limitations of HTML and the Web?</p>
<p><span class="emphasis">GL:</span> Essentially HTML is a very basic formatting language that looks virtually identical to all the old mainframe and DOS word-processing software - IBM Script, XyWrite, and so on - to which has been added the capacity to add links and images. Adding these two features was an act of genius. Basic HTML is extraordinarily easy to use, and with decent HTML editors, such as BBEdit, Dreamweaver, and Homesite, very easy to use for large projects or sites; using Eastgate Systems' Storyspace 2.0 one can even create giant, multi-directory sites and export them into usable HTML with fairly little effort. So getting started is fairly easy today, as any 12-year-old knows.</p>
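<p>The two additions Landow credits as "an act of genius" amount to just two tags. A minimal, hedged sketch (the filenames and text are invented for illustration):</p>

```html
<!-- Basic HTML: formatted text plus the two additions Landow names,
     the link (a) and the image (img). Filenames are invented. -->
<p>
  Ruskin's drawings (<a href="ruskin-drawings.html">see the gallery</a>)
  include studies of Venetian architecture:
</p>
<img src="ruskin-venice.jpg" alt="Ruskin's study of a Venetian capital">
```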
<p>The Web today has at least three main deficiencies: First, digital textuality is essentially dynamic; HTML, like its richer predecessor and model, SGML, is suited chiefly to static texts that are created, formatted, and frozen. The very use of the term “homepage,” which derives from a very different world of print, immediately suggests both the difficulties of the technology and the way new users come to it with incorrect - and very limiting - paradigms. HTML and currently available browsers lack some key features that make maintaining any dynamic Web site very time-consuming and therefore expensive.</p>
<p>Second, and related to this last point, is the absence of two defining features of true hypertext - (1) one-to-many linking and (2) automatically generated menus of links available when one clicks on any link-anchor. The first feature, the capacity to attach multiple links to any point in the text or image, creates a vastly richer sense of hypertextuality; in fact many students who learn about hypertext first from an experience of Storyspace, Microcosm, or other systems, find they cannot translate their work into HTML because the Web is “so much flatter,” as they put it, than other forms of hypertext.</p>
<p>In my experience, the second feature, link menus automatically generated by the system, saves much more than half the time and effort required to manage a dynamic site. My sites now comprise more than 42,000 documents and images, and they grow daily. Each time a new document comes to the <span class="booktitle">Victorian Web</span>, I have to do two things: First, I have to format it, which is fairly easy since one can use existing documents as templates for the new one. Second, and much more time-consuming and prone to error, I have to add links to the new doc from as many as six other menus, each of which has to be maintained manually. When one of my contributing editors from Canada (whom, incidentally, I have never met) e-mails an essay on Hardy and Conrad’s use of Miltonic imagery and its relation to their fundamental ideas, links have to be added to the literary relations overviews for each author as well as similar documents for imagery and themes. In richer forms of hypertext, one simply adds a link to each subject heading in each author’s overview using point-and-click techniques; in HTML, one has to edit six documents manually. What a lot of work!</p>
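<p>The manual labor Landow describes disappears once menus are derived from document metadata rather than edited by hand. A hypothetical sketch of what a system with automatically generated link menus does (all names and data are invented for illustration):</p>

```python
# Hypothetical sketch: how automatically generated link menus remove
# the manual editing described above. Each document declares its
# subject headings once; every menu is derived from that metadata.

def build_menus(documents):
    """Group document titles under each subject heading they declare.

    `documents` maps a filename to a dict with a 'title' and a list
    of 'subjects' (menu headings the document should appear under).
    """
    menus = {}
    for filename, meta in documents.items():
        for subject in meta["subjects"]:
            menus.setdefault(subject, []).append((meta["title"], filename))
    return menus

# Adding one new essay updates every relevant menu at once,
# instead of requiring six documents to be edited by hand:
documents = {
    "hardy-conrad-milton.html": {
        "title": "Miltonic Imagery in Hardy and Conrad",
        "subjects": [
            "Hardy: literary relations",
            "Conrad: literary relations",
            "Imagery",
            "Themes",
        ],
    },
}
menus = build_menus(documents)
```
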
<p>A final problem exists in the instability of the Net. Ideally, one should be able to link to many other Web sites. In fact, painful experience proves that a large number of Webmasters, particularly graduate students who request links to their sites, move or shut down their sites without warning, and server names seem to change at an astonishing rate, thereby breaking links. This fact means that one of the Web’s greatest promises - a true Nelsonian Docuverse - hasn’t been fulfilled.</p>
<p><span class="emphasis">HM:</span> Do you think that the Web will continue to hold this dominant position? Do you think that future developments in markup languages - such as XML - will allow the Web to fulfill some of the visionary potential of hypertext as imagined by Bush and Nelson?</p>
<p><span class="emphasis">GL:</span> According to people close to the latest developments in XML, it will have the strengths of SGML – essentially, tags describe a text element, such as a paragraph or book title, and one decides on formatting them from a central location. It also seems as if the XLink protocols will finally give us one-to-many linking; now it’s up to Microsoft and Netscape to produce decent browsers that will support such features. If they do, the Web world could change at light speed.</p>
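<p>A hedged sketch of what one-to-many linking looks like in XLink's extended-link syntax: one arc fans out from a single anchor to several targets. The element names (everything other than the <code>xlink:*</code> attributes) are invented for illustration:</p>

```xml
<!-- One-to-many linking with an XLink extended link. Clicking the
     "source" anchor could offer a menu of both "target" resources.
     Element names and hrefs are invented; only the xlink:*
     attributes follow the XLink specification. -->
<linkset xmlns:xlink="http://www.w3.org/1999/xlink" xlink:type="extended">
  <anchor xlink:type="locator" xlink:href="hardy.html#relations"
          xlink:label="source"/>
  <anchor xlink:type="locator" xlink:href="conrad.html#relations"
          xlink:label="target"/>
  <anchor xlink:type="locator" xlink:href="milton-imagery.html"
          xlink:label="target"/>
  <go xlink:type="arc" xlink:from="source" xlink:to="target"/>
</linkset>
```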
<p><span class="emphasis">HM:</span> Is there a danger that students and researchers will forget the power of other hypertext systems due to the dominance of the Web?</p>
<p><span class="emphasis">GL:</span> No, I think the danger is that the great majority of students and researchers never even <span class="lightEmphasis">learn</span> about other systems. For someone involved in the field since 1986 or ‘87, one of the most painful (or pathetic) things about many Web-based research projects in Computer Science is seeing people duplicate research done much earlier – often on things that proved to be complete dead-ends. Oh well, it keeps them off the street.</p>
<p><span class="emphasis">HM</span>: In <span class="booktitle">Hypertext 2.0</span> you noted that “Hypertext also offers a means of experiencing the way a subject expert makes connections and formulates inquiries” (226). How does the Web fare in fulfilling this potential?</p>
<p><span class="emphasis">GL:</span> Here I think the Web does an excellent job. The ease with which one can create what are essentially links to a glossary permits beginners to read with the help of expert readers - when they wish to do so.</p>
<h2>The Web and Education</h2>
<p><span class="emphasis">HM:</span> The <span class="booktitle">Victorian Web</span> began as a Storyspace web - what was your experience in converting the <span class="booktitle">Victorian Web</span> from Storyspace to HTML? What were the effects of this change for authors and readers of the <span class="booktitle">Victorian Web</span>?</p>
<p><span class="emphasis">GL:</span> First, an enormous amount of work, which continues on a daily basis. Second, an enormously larger audience that is now around a combined 7-8 million hits/month on my two sites (in Singapore and in the US). Third, as a result of the last effect, contributors to the <span class="booktitle">Victorian Web</span>, chiefly faculty members at other institutions and a few graduate students, have increased enormously. We now have around 500 faculty authors, and in the Victorian Web Books section, which consists of HTML translations of central books in the field, we now have a dozen important books originally published by Cornell, North Carolina, Oxford, Routledge, Princeton, Texas, and Yale UP. None of this could have happened without something like the Web.</p>
<p><span class="emphasis">HM:</span> What’s interesting to me about the <span class="booktitle">Victorian Web</span> and the <span class="booktitle">Hypertext and Critical Theory Web</span> is that you don’t readily distinguish between student authors and established academic writers. Students are effectively engaged in scholarly research projects. Is hypertext unique in allowing students to become active researchers?</p>
<p><span class="emphasis">GL:</span> Two comments: first, each <span class="lightEmphasis">does</span> distinguish between undergraduate, postgraduate, and faculty contributors – at least to the extent that each byline indicates the status of the author. It does not distinguish among them to the extent that faculty and students or members of the general public comment upon one another’s work.</p>
<p>Second, as with so many other educational and cultural effects, hypertext makes vastly easier something theoretically possible earlier and occasionally practiced.</p>
<p><span class="emphasis">HM:</span> In Hypertext 2.0 you wrote that “Hypertext, by holding out the possibility of newly empowered, self-directed students, demands that we confront an entire range of questions about our conceptions of literary education” (219). What’s your evaluation of the humanities’ response to this possibility?</p>
<p><span class="emphasis">GL:</span> Qualified medium-range optimism, I guess. Many young teachers immediately saw the possibilities of the Web and other forms of hypertext. For example, using both Storyspace and HTML, Massimo Riva of Brown constructed the massive bilingual <span class="booktitle">Decameron Web</span> with contributions from students and scholars from the USA and Italy. Interestingly enough, scholars working in the fields concerned with earlier literatures - Greek and Latin, Anglo-Saxon, Old Irish, Old Norse, and so on - led the way whereas those in contemporary literature, film, and video often refused even to consider the possibilities of digital technologies. Brown’s Department of Modern Culture and Media, which for almost a decade acted as if all media ended with television and video, blocked several attempts to have an official program or major in digital culture. In my own department, the medievalists and renaissance scholars have long been immersed in computing, but I have never been able to get those in the romantic and Victorian periods, including our chair, to look at the Victorian Web, much less use it for their courses or contribute to it themselves. As soon as I went on leave to come to the National University of Singapore, my department stopped teaching my hypertext courses, even though there are quite a few people who could have kept them going. The Old Guard, the Old Fellas (which in this case includes a large number of women), don’t see what this stuff has to do with an English Department.</p>
<p>My off-the-cuff explanation is that although all modern education is based chiefly upon book technology, those working in earlier fields know the texts that they study and teach bear the marks of scribal, oral, and pre-print infotech; those who work in later fields are so inside the Gutenberg galaxy (as McLuhan called it) that they see anything else as fundamentally anticultural.</p>
<p><span class="emphasis">HM:</span> Do you think that there’s a danger that many teachers in humanities see hypertext as being about computers rather than being a means to do research?</p>
<p><span class="emphasis">GL:</span> Yup. At the very least, they should be leading their students to learn how to evaluate the quality of information. Of course, since most secondary school teachers and college instructors today themselves don’t know how to do research in traditional libraries, they can’t extend these skills to the Net.</p>
<p><span class="emphasis">HM:</span> What are some of the issues that need to be considered by Web publishers who want to create online editions of out-of-print books? How do footnotes, references and bibliographies work when a text is moved from print to hypertext?</p>
<p><span class="emphasis">GL:</span> Since not all users have broadband access to the Internet, avoid adding links to notes where possible by using the following rules: First, all substantial notes should be given titles and treated as separate documents; second, incorporate as many brief comments and notes as possible into the main text; third, for bibliographical information include a list of works cited at the foot of each individual lexia (document) and then use the MLA short form of in-text citation, which means in practice that you only use as much info in the parenthetical reference as is absolutely necessary. Thus, if you introduce quoted material with “According to Spurgeon’s ‘Christ the Lord,’” you only need a page number: “quoted text” (34). If, however, you wrote, “According to a Victorian preacher… ” you’d have to provide the necessary information in full: “quoted text” (Spurgeon, “Christ the Lord,” 34).</p>
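<p>The three rules can be seen together in a single lexia. A hedged sketch, with all content invented for illustration:</p>

```html
<!-- Sketch of one lexia following the rules above: brief notes
     folded into the main text, MLA short-form citations in
     parentheses, and a works-cited list at the foot of the
     document. All content here is invented. -->
<p>According to Spurgeon's "Christ the Lord," grace precedes works:
"quoted text" (34). Where the author goes unnamed in the sentence,
the parenthetical reference carries the full short form:
"quoted text" (Spurgeon, "Christ the Lord," 34).</p>

<h3>Works Cited</h3>
<ul>
  <li>Spurgeon, Charles. "Christ the Lord." <i>Sermons</i>.
      London, 1855.</li>
</ul>
```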
<p>Most of the preceding recommendations, you’ll notice, come straight from the best of current book publishing practice. The problem is that many print publishers, including leading academic ones, have incompetent manuscript editors or inadequate house styles. Thirty-five years ago I was told by editors of leading journals and presses (a) not to use things like “Ibid.” or “Op. Cit.,” and (b) never to use unnecessary notes, but I still come upon books like Timothy Hilton’s fine biography of John Ruskin, the second volume of which Yale University Press published last year, with pages and pages of tiny endnotes consisting of Ibid. and page numbers. A good three-quarters of the endnotes, which are not easy to use in a massive volume, are useless. The lesson here is that one can get by with incompetent manuscript preparation in print, but such poor quality in a Web doc would be a disaster for readers, quickly training them not to follow <span class="lightEmphasis">any</span> links!</p>
<p>The more interesting problems, which we face all the time in the Victorian Web Books - <a class="outbound" href="http://www.scholars.nus.edu.sg/landow/victorian/misc/books.html">http://www.scholars.nus.edu.sg/landow/victorian/misc/books.html</a> - include: (a) what to do with information created, even by the same author, since the book first appeared, (b) how does one add value with links to material not in the original book, and (c) how does one both preserve the text-as-a-book and make it function effectively as a digital text with permeable borders. Finally, can some of the solutions I’ve tried in the <span class="booktitle">Victorian Web</span> be carried out algorithmically?</p>
<h2>Power, Authority, Control, and the Web</h2>
<p><span class="emphasis">HM:</span> In your introduction to the 1994 collection of essays <span class="booktitle">Hyper/Text/Theory</span> that you edited you wryly observed that the humanities excels in “finding mice in molehills.” Do you think that by subscribing to the narrative that the utopian idealism of the early 90s has now been superseded by systems of control and the search for the e-dollar that the humanities finds a late capitalist mouse in a cyberspace molehill? Has utopianism about the Web been replaced by a proliferation of technocapitalism and cybernetic governmentality?</p>
<p><span class="emphasis">GL:</span> Although a certain cyber-utopianism has disappeared as a general characteristic of those involved in the Web, this change has happened in large part because new people with non-utopian goals have quite properly tried to earn a living with the new technology. I don’t see anything wrong in people trying to make money from doing things that other people need or want (not the same thing). At the same time a lot of people see the Web as a new virtual place of freedom. I find wonderfully encouraging the Web public’s refusal to accept channels and other attempts to turn hypertext into television. Michael Joyce’s brilliant challenge thus far has rung true: “Hypertext is the revenge of text upon television.” If it turns out that the most successful way to make money from the Net is business-to-business sales, a few consumer fields, such as music distribution, and the like - that’s fine. None of this drives out more experimental writing and the like.</p>
<p><span class="emphasis">HM:</span> While there has been a rise in cybergovernmentality, there has also been a proliferation of free Web-hosting, free email services, free egroups, free Web logs. It’s never been so easy to publish your own material. Do you think that this is significant? Does the rise of these services have implications for teaching and research?</p>
<p><span class="emphasis">GL:</span> Yes, we find ourselves in a situation of creative anarchy, and, like everyone else, I’m waiting to see how things will shake out and down. I also wonder how long services will remain “free,” or if certain aspects of Internet culture will eventually become a kind of inalienable right. It is also possible that, like broadcast TV, such free services will come at the expense of advertisements, in which case skilled reading will involve becoming blind to commercial enticements.</p>
<p>Certain obvious implications have already been realized: my students in Singapore, like those in the US, often develop their work on their own servers, rather than in (and on) University facilities. In addition, the ability to publish anything makes something like a conventional publisher, who selects, regularizes, and advertises, even more important. I don’t think the Web is the death of publishers - just the death of those who insist on remaining clueless.</p>
<p><span class="emphasis">HM:</span> In <span class="booktitle">Hypertext 2.0</span> you argued that “Like other forms of technology, those involving information have shown a double-edged effect, though in the long run - sometimes the run has been very long indeed - the result has always been to democratize information and power” (276). What are some of the dynamics at work which result in this greater democratization?</p>
<p><span class="emphasis">GL:</span> Although clearly many factors are involved, the single most important one, I believe, is the replacement of hierarchy by the uncenterable network. That makes top-down control difficult; hierarchy and lack of transparency almost unworkable; choice inevitable.</p>
<p><span class="emphasis">HM:</span> Let’s talk a little about the issue of surveillance and openness. The extent to which Web surveillance has increased surely depends on very local issues. What do you see as the impact of the Web within Singapore and throughout the entire South-East Asia region?</p>
<p><span class="emphasis">GL:</span> Key issues include (a) literacy, without which accessibility means nothing, (b) access to networked computers, and (c) access to high-speed networks. Much of the population of Singapore has more of these three capacities than most of Europe and America, and vastly more than their neighbors in the region, or countries in South America and Africa.</p>
<p>By announcing recently that Internet service providers are not legally liable for material their customers place on their webservers, the Singapore government took a giant step towards an open society. I have no idea how much Web surveillance actually happens here or throughout the world, though it seems to me that most of it takes a commercial turn, with merchandisers compiling elaborate profiles, which they then exchange with other commercial and possibly governmental entities.</p>
<p>I also don’t have a clear idea of how much surveillance is in fact possible. We all know stories of the Jet Propulsion Lab storing incredible amounts of data sent back by unmanned space vehicles because they don’t have the capacity to process it. Even given the resources of NSA and the CIA, I wonder how much they can accomplish with the vastly larger amounts of data that pour in from spy satellites, web crawlers, and the like. Singapore has only 3 million people, so the task would be easier <span class="lightEmphasis">if</span> one had access to the same resources.</p>
<p><span class="emphasis">HM:</span> How do you see copyright issues impinging on online publication and scholarship?</p>
<p><span class="emphasis">GL:</span> Back somewhere around 1987, the Annenberg/Corporation for Public Broadcasting assembled about a dozen people in Cambridge, Massachusetts, and asked us what would be needed to make hypertext fulfill its potential as an educational and cultural force. Everyone agreed that the hardware and software would take care of themselves; the one factor that we had to work for was a new conception of copyright that involved something like leasing information for a tiny expenditure - Ted Nelson’s vision, of course. Since then nothing has changed.</p>
<p>Unfortunately, too many of the judges and lawmakers who consider such issues throughout the world do not understand networked digitech. Worse, not realizing that many of their conceptions of intellectual property are print based, they assume their notions of intellectual property are universal. Of course, as many students of copyright law have pointed out, in the commercial world large corporations protect their ideas by means of secrecy, not copyright.</p>
<p><span class="emphasis">HM:</span> What are some of the new issues in hypertext? What would you need to cover if you were writing <span class="booktitle">Hypertext 3.0</span>?</p>
<p><span class="emphasis">GL:</span> The short answer is that if I knew, I’d be writing <span class="booktitle">Hypertext 3.0</span> right now. The longer one is that I’d have much more digital fiction, poetry and art to examine, and I’d expect to examine various debates over gender, textual embodiment, and other issues increasingly prominent in contemporary critical theory. Of course, I am particularly eager to see if the promise of XML will be fulfilled.</p>
</div></div></div><div class="field field-name-field-tags field-type-taxonomy-term-reference field-label-above"><div class="field-label">Tags:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/tags/internet">internet</a>, <a href="/tags/world-wide-web-0">world wide web</a>, <a href="/tags/www">WWW</a>, <a href="/tags/hypertext">hypertext</a>, <a href="/tags/hyperdcard">hypercard</a>, <a href="/tags/html">html</a>, <a href="/tags/xml">xml</a>, <a href="/tags/bbedit">BBEdit</a>, <a href="/tags/dreamweaver">Dreamweaver</a>, <a href="/tags/victorian-web">victorian web</a>, <a href="/tags/thomas-hardy">thomas hardy</a>, <a href="/tags/joseph-conrad">joseph conrad</a>, <a href="/tags/john-milton">john milton</a>, <a href="/tags/ted-nelson">Ted Nelson</a>, <a href="/tags/vannevar-bush">vannevar bush</a>, <a href="/tags/microsoft">microsoft</a>, <a href="/tags/netscape">netscape</a>, <a href="/tags/storyspace">Storyspace</a>, <a href="/tags/decameron-web">decameron web</a>, <a href="/tags/marshall-mucluhan">marshall mcluhan</a>, <a href="/tags/michael-joyce">michael joyce</a></div></div></div>
Metadiversity: On the Unavailability of Alternatives to Information
http://www.electronicbookreview.com/thread/technocapitalism/multilingual
<div class="field field-name-field-author field-type-node-reference field-label-hidden clearfix">
<div class="markup">by</div>
<div class="field-items">
<div class="field-item even">David Golumbia</div>
</div>
</div>
<div class="field field-name-field-publication-date field-type-datetime field-label-hidden"><div class="field-items"><div class="field-item even"><span class="date-display-single">2003-08-30</span></div></div></div><div class="field field-name-field-source-url field-type-link-field field-label-inline clearfix"><div class="field-label">Source URL:&nbsp;</div><div class="field-items"><div class="field-item even"></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Despite its apparent global variety, the Internet is more linguistically uniform than it is linguistically diverse. Almost all Internet traffic is conducted in one of the world’s 100 or so dominant languages, and the great majority takes place in the top 10 or so languages, with English being especially dominant due, among other reasons, to its use in the Internet’s coding infrastructure. Unwritten and nonstandardized languages, which make up the majority of the world’s approximately 6,700 languages, are hardly accounted for in the structure of Internet communication. <cite id="note_1">On the worldwide distribution of languages see Grimes, <span class="booktitle">Ethnologue</span>.</cite> The emphasis in today’s Internet development on informatic models and on structured information reveals a bias characteristic not only of modern technological practices but also of modern standardized languages. This bias may limit the Internet’s effectiveness in being deployed for non-informatic uses of language, which have themselves been significantly underplayed in Western technological development and its theory.</p>
<h2>1. Informatics</h2>
<p>Much cultural analysis of the Internet focuses on information - loosely, what is typically thought of as “content.” That is, the analytic object is what the user sees most prominently on the page, the words he or she types into a chat interface, the articles displayed and/or the aesthetic possibilities of website creation, and the means for transmitting, storing, and replicating them. <cite id="note_2">See, for example, Landow, <span class="booktitle">Hypertext</span> and <span class="booktitle">Hyper/Text/Theory</span>, Lunenfeld, <span class="booktitle">Digital Dialectic</span>, and Bolter and Grusin, <span class="booktitle">Remediation</span>, all of which problematize the informatic focus while more or less endorsing it. Lessig, <span class="booktitle">Code</span>, and Poster, <span class="booktitle">The Mode of Information</span> are the best recent attempts to think critically about the informatic infrastructure. Turkle, <span class="booktitle">Second Self</span> remains a touchstone in thinking critically about the cultural-psychological consequences of the computing environment. Also see the references in Mann, “What Is Communication.”</cite> We refer to the advent of the Internet as an “Information Revolution” and to the computing infrastructure as “Information Technology” (IT). All of this suggests that information was somehow what was in need of technological change and that the inefficient transfer of information was an obvious social problem requiring a revolution. But for the human users of the Internet, information is realized, nearly exclusively, via printed language. So in addition to being part of the computer revolution, the Internet needs also to be seen in the wider frames of human languages and language technologies, where the question of the informatic nature of language is much more highly vexed than the IT revolution would make it appear.</p>
<p>Rather than IT, when we talk about what may be socially transformative about the Internet, we focus just as often on social connection and community. So although the Internet is seen</p>
<p class="longQuotation">principally as a valuable reservoir of information, its main contribution may one day be seen as a catalyst for the formation of communities. Since communities bound by common interests existed long before computers, it is not as if we have now entered the next stage in the evolution of society (the `information age’). Rather, computer meshworks have created a bridge to a stable state of social life which existed before massification and continues to coexist alongside it. (DeLanda, <span class="booktitle">A Thousand Years of Nonlinear History</span>, 254)</p>
<p>Yet Manuel De Landa himself points out that it is standardized languages in general and most of all standardized written English as a medium for technical communication that open the possibility of the Internet itself. “English became the language of computers, both in the sense that formal computer languages that use standard words as mnemonic devices (such as Pascal or Fortran) use English as a source and in the sense that technical discussions about computers tend to be conducted in English (again, not surprisingly, since Britain and the United States played key roles in the development of the technology)” (253).</p>
<p>De Landa sees, rightly at least in a limited sense, that the Internet is becoming a place where it can be possible for “pride of the standard [to be] seen as a foreign emotion, where a continuum of neo-Englishes flourishes, protected from the hierarchical weight of `received pronunciations' and official criteria of correctness” (253-4). But the boundaries of this continuum are narrow precisely because it is neo-Englishes rather than a diversity of world languages that flourish. It is no accident of history that the programming and markup languages that structure the Internet are almost exclusively written in standardized fragments of English, especially as English has been revisioned into the sub-languages of logic and mathematics. <cite id="note_3">I discuss this at greater length in Golumbia, “Computational Object.” Also see Lyotard, <span class="booktitle">Postmodern Condition</span>.</cite> It is, rather, characteristic of these historical developments and of their constitutive relation to modern identity itself. It appears, at best, premature to suggest that systems constructed within such highly formalized, abstracted and, in an important sense, fictional structures could be responsible to the texture of human language - a texture whose variety we have scarcely begun to apprehend. <cite id="note_4">Reddy, “The Conduit Metaphor,” remains the single best articulation of the distance between the formalized communicative object and actual linguistic practice; also see Lakoff and Johnson, <span class="booktitle">Metaphors We Live By</span> and <span class="booktitle">Philosophy in the Flesh</span>, and Mann, “What Is Communication.”</cite> (Yet this texture is at the same time familiar enough that we all understand the degree to which computers continue to fail to do anything very close to producing or understanding spontaneous human language.)
For despite the appearance created in no small part by programming languages themselves, human languages need not be abstracted, one-to-one, univocally interpretable, or structured much like systems of propositional logic. In fact, these characteristics are rare across the languages we do find in human history and contemporary (but not, in this case, necessarily modern) social life. <cite id="note_5">See Golumbia, “History of `Language.’”</cite> Rather than a medium for transmitting and sharing human language, then, we must be prepared to see the proliferation of computer networks as part of an ongoing effort to shift the basis of language use toward one appropriate for an informatic economy. <cite id="note_6">As discussed in Golumbia, “Hypercapital.”</cite> It is the constitutive power of this phenomenon to which we must learn to be especially attentive.</p>
<h2>2. Hypertext</h2>
<p>There is a curious lack of fit between the phenomenon called <span class="lightEmphasis">hypertext</span> examined as an abstract or theoretical object, and hypertext as it is used on the Internet. As the term has been advanced in academic writing, hypertext refers to what might be thought of as a multidimensional intra-document linking system that helps us to “abandon conceptual systems founded upon ideas of center, margin, hierarchy, and linearity and replace them by ones of multilinearity, nodes, links, and networks” (Landow, <span class="booktitle">Hypertext</span>, 2). Taking as paradigmatic a particular kind of interactive narrative, including the works of Michael Joyce and the program Storyspace, these theories stress the ways in which “hypertext… provides an infinitely re-centerable system whose provisional point of focus depends upon the reader, who becomes a truly active reader in yet another sense” (<span class="booktitle">Hypertext</span>, 11).</p>
<p>To be sure, these distributive, informational networks do exist, but it is also fair to say that they are not the rule in terms of contemporary uses of hypertext. As the Web has matured, another and perhaps much more obvious usage of hypertext dominates, in which stability, centering, order, and logic are not necessarily resisted but may in fact be reinforced. Today’s web pages use hypertextual linking primarily to drive navigation in and among complete, stable, “sticky” application interfaces. This is what drives both standard and personalized portal pages. A personalized news page on a portal site such as Yahoo!, for example, consists of headlines in many areas of world and local news, divided into categories and subcategories that are intensely logical, that are in fact derived from a culturally-preconstructed taxonomy from which dissent is difficult to conceptualize, let alone practice. So the fact that some kinds of interesting and potentially transformative constructions are possible within a given medium should not distract us from understanding how the medium is actually being used, especially when these uses are very large-scale and very directly implicated in the production of contemporary subjectivities.</p>
<p>On our Web, HTML and hypertext are used to create rich, absorbing navigational experiences that instruct the user to stay where they are, with only occasional side glances to alternate information sources. Organizations focus workers’ daily experiences around wide-area websites, confirming exactly the identitarian structures that hypertext might be thought to resist. Every student, teacher, office worker, engineer, professor is compelled to have a relation to these stable, compelling, relentlessly logical interactive presences, in which documents are not so much intercut with each other as presented in orderly, menu-based groups.</p>
<p>In fact, it is odd that, instead of HTML, we speak of hypertext when we try to locate the salient analytic object in digital textuality. On reflection, HTML really does define what happens on the Web to an astonishingly large degree, and HTML is far more defined and linear than the word “hypertext” would suggest. HTML is typically used to structure the page, and the user’s experience of the page, so as to lead the user in a particular direction with particular goals in mind. That these goals are so often commercial and so often transaction-oriented seems to expose, to literalize, the most profound aspects of the Marxist critique of ideology in language. HTML surrounds “written” electronic language with a literal meta-language, whose goal is overt and unavoidable: to structure explicitly the page’s functions.</p>
<p>While the ability of HTML to create links between documents and parts of documents is critical to the Web, it is also merely one of a large set of programmatic features available to the web page writer, all of whose purpose is to help create structure. To some degree this is content-neutral; obviously no particular paragraph of writing is barred from being surrounded with &lt;p&gt; and &lt;/p&gt; tags. But the entire set of HTML tags is deliberately built up from a system whose purpose is to structure information for cataloging and retrieval: to mark each and every piece of linguistic source data with some kind of markup tag that allows post-processing and redelivery. In this way language is constrained within the informatic paradigm on the Internet to a surprising degree.</p>
<h2>3. Structured Information</h2>
<p>HTML (HyperText Markup Language) is typically thought of as a kind of design tool, and of course it is. But HTML is also a tool for structuring information: for applying general metadata to all the elements in a presentation set. “Structured information is information that is analyzed…. Only when information has been divided up by such an analysis, and the parts and relationships have been identified, can computers process it in meaningful ways” (DeRose, “Structured Information,” 1). HTML was in fact written originally by Tim Berners-Lee as a kind of simplified version of a language in which contents are explicitly tagged with meaningful metadata, called SGML for Standard Generalized Markup Language. <cite id="note_7">See the World Wide Web Consortium’s (W3C) web pages on HTML, e.g., <a class="outbound" href="http://www.w3.org/MarkUp">http://www.w3.org/MarkUp</a>.</cite> SGML was developed for engineering and military documentation, in which it is assumed that every piece of information needs to be indexed for rapid retrieval and cross-matching. <cite id="note_8">Robins and Webster, <span class="booktitle">Times of the Technoculture</span>, provides an excellent overview of some of the direct military interests involved in the information revolution; also see De Landa, <span class="booktitle">War</span>, and Poster, <span class="booktitle">Mode of Information</span>.</cite></p>
<p>Today HTML is used to apply structure to the general linguistic environment of the Internet. The primary structuring use of even the specific function known as hyperlinking is not that of connecting disparate documents or alternate paths through multidimensional content. Rather, linking is used for menus and other navigational elements. The big tabs at the top of the Amazon.Com page that allow the user to choose among Books, Video, and Lawn Tools are the meat of hypertext. The categories themselves are not arbitrary, but instead are generated out of much more highly-structured data environments (databases). <cite id="note_9">See Poster, <span class="booktitle">Mode of Information</span>, especially Chapter Three, “Foucault and Databases: Participatory Surveillance.”</cite> These tabs can even be thought of as a kind of exposure of the metadata environment of the website. In a commercial Web operation like Amazon.Com, this activity is inherently interactive with the user’s patterns of spending, such that the entire structure of the hypertextual experience is laid in place by explicit logical programming rules, which operate, ideally, outside the realm of conscious comprehension. You do not know why the website seems to reflect categories that occasionally grab your interest, or why it offers reviews of books you have been wondering about.</p>
<p>The inherent structuring of HTML has been built on in recent technology by the advent of increasingly powerful dynamic web page generation language standards (such as Java Server Pages, Active Server Pages, and Cold Fusion pages - each of which can be identified by noting the presence of the extensions .jsp, .asp, and .cfm respectively in web page URLs). These technologies allow the incorporation of database content directly into what look like static HTML documents. They are very literally the language out of which the Web is largely delivered, for academic journals no less than e-commerce sites. Because these meta-rules are applied within the text of the apparent display language, they further blur the distinction that allows us to think of source code as metalinguistic and web page content as ordinary language - content.</p>
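<p>A schematic sketch of this pattern (written here in Python for illustration rather than in JSP, ASP, or Cold Fusion; the “database” is a hypothetical in-memory record invented for the example) shows how a template splices stored content into an apparently static HTML page:</p>

```python
# A minimal sketch of dynamic page generation: display markup and
# meta-rules ($-placeholders) intermingle in one "document," and the
# page the reader sees is assembled from database content at delivery.
from string import Template

page_template = Template(
    "<html><body>\n"
    "<h1>$headline</h1>\n"
    "<p>Price: $$$price</p>\n"   # $$ escapes a literal dollar sign
    "</body></html>"
)

# Hypothetical row fetched from a product database.
row = {"headline": "Featured Title", "price": "19.99"}

html = page_template.substitute(row)
print(html)
```

<p>Nothing in the delivered page marks which words were written and which were generated - precisely the blurring of source code and content described above.</p>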
<p>Currently, the W3C has nearly finished the articulation of XHTML, a set of standards that allow all HTML content to be rewritten within XML-based contexts. XML stands for eXtensible Markup Language, and represents an explicit attempt to replicate the meta-linguistic tagging properties of SGML widely throughout the Internet (XML is actually a simplified form of SGML, although it has been extended beyond this original base). The standard pocketbook definition (literally) says that XML is a “meta-language that allows you to create and format your own document markups…. Thus, it is important to realize that there are no ‘correct’ tags for an XML document, except those you define yourself” (Eckstein, <span class="booktitle">XML</span>, 1).</p>
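<p>The “no correct tags except those you define yourself” property can be seen in a short sketch (Python’s standard XML library is used here for illustration; the tag vocabulary below is wholly invented for the example):</p>

```python
# An XML document whose every tag and attribute name is made up by its
# author - and which is nonetheless perfectly parseable, because XML
# standardizes the form of markup, not its vocabulary.
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<poem author='anonymous'>"
    "<stanza><line>Any tag set the author chooses</line></stanza>"
    "</poem>"
)

print(doc.tag)                        # poem
print(doc.attrib["author"])           # anonymous
print(doc.find("stanza/line").text)   # Any tag set the author chooses
```

<p>In principle any set of categories can be expressed this way; in practice, as the next paragraph notes, the vocabularies that matter are fixed by industry standards committees, not by individual authors.</p>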
<p>That is, XML is a set of standards for expressing metadata in any form chosen by the programmer. Any viable set of categories should inherently be able to be realized in an XML implementation. In practice, of course, XML documents, especially their large-scale programmatic elements, are written exclusively in English (although the standard allows content to be written in any language, and some levels of tagging are certainly written today using European languages). More importantly, XML is rarely used by individuals or even community groups to create ad-hoc data structures; to the contrary, XML is most widely used by businesses to structure content for electronic commerce, and also for more directly technological applications. In these applications a standards committee drawn from members of prominent businesses and institutions within the appropriate domain is convened. The committee issues successive standards, which dictate exactly how content issued within the industry should be marked up. The neutral standards-based web page known as XML.org promotes itself as “The XML Industry Portal,” and offers pointers to standards for using XML within social domains as widely dispersed as Data Mining, Defense Aerospace and Distributed Management. <cite id="note_10">See <a class="outbound" href="http://www.xml.org">http://www.xml.org</a>. The Oasis-Open project at <a class="outbound" href="http://www.oasis-open.org">http://www.oasis-open.org</a> is currently the locus for the promotion of Structured Information on the Internet.</cite> In fact, not surprisingly, SGML itself has survived in no small part due to its applicability in military engineering projects, where parts, features and functions are categorized to an exorbitant level.</p>
<p>In practice, then, the proliferation of XML and XML-like markup strategies suggests a remarkable degree of institutionally-controlled standardization. By incorporating display standards like XHTML into current web pages, developers can ensure the thorough categorization of every aspect of Web content. Rather than a page, the screen breaks down into more or less discrete units, served up in interaction with masses of data and statistical sampling that are by definition not available for the user to examine or understand. Instead, through such probability- and category-driven conceptions of “personality,” subjectivity itself is presented whole, pre-analyzed, organized, almost always around a central metaphorical goal, usually an economic one. <cite id="note_11">For examples see Birbeck, Duckett, and Gudmundsson, <span class="booktitle">Professional XML</span>, and Fitzgerald, <span class="booktitle">Building B2B Applications</span>.</cite> The user is free to choose whether she is interested in Sports or Finance, Hockey or Baseball, the Detroit Red Wings or the Seattle Seahawks. But she is hardly free to reassemble the page according to different logics, different filtering approaches, applying critical logic or any sort of interpretive strategy to the AP Newswire or Dow Jones news feed. This informatic goal instances itself in every aspect of the web page presentation, carving cultural-cognitive streambeds in which the water of thought almost unavoidably runs. It is not clear that our society has effective mechanisms for evaluating the repackaging of our language environment in this way, in the sense of allowing a large group of technicians and non-technicians to consider deeply its motivations and consequences.</p>
<h2>4. Metadiversity</h2>
<p><span class="lightEmphasis">Metadiversity</span> is a term that fails to mean what we need it to. The term has been introduced by information scientists and conservation biologists to indicate the need for metadata resources about biological diversity, no doubt a critical requirement. But the term <span class="lightEmphasis">metadiversity</span> suggests something else - a diversity of meta-level approaches, or even more directly, a diversity of approaches, of schemes, of general structuring patterns. Seen from the perspective of linguistic history, the linguistic environment of the Internet seems to offer not a plethora of schemes but a paucity of them, clustered around business-oriented and even military-based informatic uses. The language technology developed for the Web is primarily meant to make it easy to complete a transaction, close a deal, accept a payment; it is less clearly meant to facilitate open and full speech, let alone to foster a true diversity of approaches to language.</p>
<p>The history of language is rich with examples of structural alternatives to our current environment. These examples include phenomena found in what are today known as “polysynthetic” and other primarily oral languages. Such languages display grammatical and lexical differences from English, from European languages, and even from some modern non-European languages like the dominant languages of Asia. The languages stand in ambiguous relation to the kind of form/content split that has ground its way thoroughly into Western language practice, so much so that no less a linguist than Roy Harris can suggest that the triumph of computers represents the triumph of a “mechanistic conception of language” (<span class="booktitle">Language Machine</span>, 161). This is not some isolated ideology that can be contained within the technical study of linguistics (whose participation in the system of disciplinary boundaries is already highly problematic), though its presence in linguistics is clear and unambiguous. It extends outward in every way to the culture at large, providing models of subjectivity for a great percentage of those who provide so-called intellectual capital for international business. The ideology precisely provides form for subjectivity, suggesting to many normative individuals that existence itself is subject to binary thinking and unitary pursuit of goals.</p>
<p>In the most curious way, this ideology reveals its power through a kind of strong misreading. Just as the term <span class="lightEmphasis">metadiversity</span> is in effect encapsulated against its most direct lexical content, so the apparent homology between modern information networks and biological systems is misrendered, resulting in a highly teleological area of research known loosely as <span class="lightEmphasis">bioinformatics</span>. Thinking broadly of the effects various telematic changes have had on the development of modern consciousness, Gayatri Spivak writes that “the great narrative of Development is not dead. The cultural politics of books like <span class="booktitle">Global Village</span> and <span class="booktitle">Postmodern Condition</span> and the well-meaning raps upon raps upon the global electronic future that we often hear is to provide the narrative of development (globalization)-democratization (U.S. Mission) an alibi” (<span class="booktitle">Critique</span>, 371). The marriage of the deep biological/machine metaphor and the development narrative produces a desire to make information live, to replace and translate the units of biological information (genes) with those of an artificial, formal linguistic system, but which somehow manages always to work in accordance with the needs of transnational capital.</p>
<p>We see the marks of this deep ideology everywhere in culture, where it almost unfailingly works to support the processes of globalist, nationalist development (even where it merely comes down to the more local politics of academic disciplines) and against the claims of marginal, deterritorialized, often de-languaged minority groups. <cite id="note_12">See Grenoble and Whaley, <span class="booktitle">Endangered Languages</span>, and Skutnabb-Kangas, <span class="booktitle">Linguistic Genocide</span>.</cite> The deep metaphors at the heart of Chomsky’s writings have lately pushed closer to the surface, so that he now thinks of language in terms of “perfection” and “optimality.” “The language faculty might be unique among cognitive systems, or even in the organic world, in that it satisfies minimalist assumptions. Furthermore, the morphological parameters could be unique in character, and the computational system CHL biologically isolated” (<span class="booktitle">Minimalist Program</span>, 221). This bio-computer, unique in nature (but ubiquitous in modern thought and fiction), must be characterizable in terms of algebraic or otherwise formal rules, which take their form not from human language but from the logical abstractions on which computers are built. It is no surprise that Chomsky’s writing has lately started to use as core terms, in addition to abstract words such as <span class="lightEmphasis">Move</span> and <span class="lightEmphasis">Derivation</span>, terms which sound derived directly from programming languages. 
<span class="booktitle">The Minimalist Program</span> invokes <span class="lightEmphasis">Select</span> (226ff.), <span class="lightEmphasis">Merge</span> (226ff.), <span class="lightEmphasis">Spell-Out</span> (229ff.), and perhaps most tellingly, <span class="lightEmphasis">Crash</span> (230ff.), which happens “at LF [Logical Form], violating FI [Full Interpretation]” (230) - all terms with wide applicability and use in various domains of computer science and programming languages. (From this small historical distance, it now seems hard to construe as accident that just as the use and development of the computer really takes off at MIT, so does the theory that language should be understood primarily as the stuff that computers understand - symbols manipulated by a logical processor. <cite id="note_13">This is made clearest in Huck and Goldsmith, <span class="booktitle">Ideology and Linguistic Theory</span>, and Harris, <span class="booktitle">Linguistics Wars</span>, though it requires some interpretation of either of these works to arrive at the point I am making here. Also see Harris, <span class="booktitle">Language Machine</span>, Lyotard, <span class="booktitle">Postmodern Condition</span>, and Turkle, <span class="booktitle">Second Self</span>.</cite> It is also no accident that much of this research was directly funded by the military for the express purpose of getting machines to understand speech, presumably for intelligence purposes. <cite id="note_14">See Harris, <span class="booktitle">Linguistics Wars</span>, and De Landa, <span class="booktitle">War</span>, but also see the footnotes and endnotes of many of the early works of generative grammar in which military funding is explicitly mentioned. 
It is, for example, an odd note of linguistico-political history that Chomsky’s principal mid-sixties work, <span class="booktitle">Aspects of the Theory of Syntax</span>, “was made possible in part by support extended the Massachusetts Institute of Technology, Research Laboratory of Electronics, by the JOINT SERVICES ELECTRONICS PROGRAM (U.S. Army, U.S. Navy, and U.S. Air Force) under Contract No. DA36-039-AMC-03200(E)…” (<span class="booktitle">Aspects</span>, iv).</cite></p>
<p>Within the field now called <span class="lightEmphasis">bioinformatics</span>, misapplication of the bio-computer metaphoric cluster runs rampant, often mapped very precisely onto the direct-forward telos of capital. Most familiarly, the term refers to the collection of genetic data in computerized databases - where it already bleeds over into the ambition to read the human genome like a book, like a set of explicit and language-like instructions, again construing language explicitly as an information-transfer mechanism. <cite id="note_15">Eugene Thacker discusses this aspect of the phenomenon briefly in his “Bioinformatics.”</cite> Perhaps the genes truly are like human language - in which case they would appear full of systemic possibilities, none of which are realized in similar or equipotent or equally meaningful ways. (Or maybe genes really are informatic, in which case the reverse cautions might also apply.) What would seem plainest on a dispassionate consideration of intellectual history is that there are probably all sorts of ways of processing genetic material that will not be at all obvious or literal. This leads implacably to the conclusion that, because we seem unable to consider what we are doing prior to operating, we are no doubt even now rewriting scripts whose meanings we scarcely know.</p>
<p>Would that this were the only place in which the bio-computer ideology drives us forward. But in fact other programs, also referred to as <span class="lightEmphasis">bioinformatic</span>, grow not unfettered but with the explicit prodding of military and capitalist interests. These programs include efforts to create “living” programs, code that repairs itself, genetic algorithms, “artificial life,” and many others. <cite id="note_16">See, for example, Brown, <span class="booktitle">Bioinformatics</span>, Holland, <span class="booktitle">Adaptation in Artificial Systems</span>, and Vose, <span class="booktitle">Simple Genetic Algorithm</span>.</cite> Of course many of these programs prove, over time, to be nearly as science-fictional as they sound, but the fact that they exist as serious human propositions at all seems to me quite startling, and quite characteristic of the lack of metadiversity in our linguistic environment. In every case the motivation and the justification proceed hand-in-hand from remarkable, in-built assumptions about the inherent good in exploring basic natural phenomena via simulation and mimicry. I am not suggesting that such research is wrong, although I do hope it is less transgressive than it seems to want to appear. But it seems to me that an alternate perspective, derived from a cultural politics of the biological and linguistic-cultural environments, suggests that these research programs are profoundly ideological extensions of the public mind, rather than dispassionate considerations of possible roles for sophisticated linguistic tools in the human environment.</p>
<p>From such a perspective, in fact, what is striking about our world is not the attainments of our one linguistic society but the multiple, variant approaches to social reality encoded in the many thousands of human languages and cultures over time. As emblematic as the Internet is, it can be no more representative of the language environment than are the many linguistic technologies that have been systematically pressed out of modern awareness - and the fact that it is so heavily promoted by institutions of authority should, despite all the Internet’s attractions, give us pause. Reflecting on the natural world, it seems hard to understand how human beings could come to any conclusion other than that part of our responsibility is to preserve it, so that we might understand more deeply the many natural processes that have proven themselves to be, so far, largely beyond our ken. Instead, capital insists on the vivisection - or just outright destruction - of these biological and environmental alternatives. Less well-known is the plight of linguistic variety itself, the pressure that English, standardization, and the networked reliance on programming and markup languages exert on the existing remnants of the world’s lost languages. <cite id="note_17">See Crystal, <span class="booktitle">Language Death</span>, Grenoble and Whaley, <span class="booktitle">Endangered Languages</span>, Maffi, <span class="booktitle">Biocultural Diversity</span>, and Skutnabb-Kangas, <span class="booktitle">Linguistic Genocide</span>.</cite> These languages must not be thought of as simple “formal variants,” alternate ways of approaching the same underlying material (which a computational perspective might seem to suggest). Instead, they are true examples of metadiversity - systems or quasi-systems that encode not just methods of approaching social relations but of the history of the self, the constitution of identity and otherness. 
<cite id="note_18">Thus recent evolutionary theory has begun to point, for example, to social structuring processes as linguistically generative, perhaps more so than the putative features of Universal Grammar - see, e.g., Dunbar, <span class="booktitle">Gossip</span>, and Goody, <span class="booktitle">Social Intelligence and Interaction</span>.</cite></p>
<p>With respect to our linguistic environment, even a dispassionate and so-called scientific perspective, no less a cultural materialist one, suggests that what is most vital to us is our multiplicity of structural alternatives, the heterogeneity of social interpretations whose variance itself is part of what allows society to be flexible, accommodative, meaningful. <cite id="note_19">This is exactly what is suggested in Abram, <span class="booktitle">Spell of the Sensuous</span>, and Maffi, <span class="booktitle">Biocultural Diversity</span> - quite literally that linguistic diversity constitutes a critical feature of the natural environment and even that the environment requires linguistic diversity to sustain biodiversity.</cite> We see again and again the record of apparently significant cultural histories characterized as myth, while one central set of metaphors derived from the success of the physical sciences continues to dominate investigation of not just the body but of human culture itself, which is to say language. <cite id="note_20">See Lakoff and Johnson, <span class="booktitle">Metaphors We Live By</span>, and <span class="booktitle">Philosophy in the Flesh</span>.</cite></p>
<h2>5. Futures</h2>
<p>Perhaps the promise of the Internet lies in the marks within it, even today, of mechanisms leading toward the creation and revitalization of alternate and variable kinds of languages and language-like formations, to some degree beyond and outside of information and communication. Of course a critical part of such formations is the raw assembling of communicative groups, such as newsgroups, chat rooms, website-based communities, and other devices wherein electronic communication is fundamentally multithreaded. Previous innovations in communication have generally been structured either on broadcast (one-to-many) communications, such as print publishing, television and radio broadcasting, where a generally powerful single entity is able essentially to create many copies of its own communications and then to distribute these widely among a population literate in the given medium, or on one-to-one interactions (the chief examples are letter writing, the telegraph and telephony). The Internet does encourage various and to some extent innovative kinds of both one-to-one and broadcast communications. Even more than these, however, the promise of the Internet seems to reside in its ability to facilitate something like many-to-many communicative formations. This is to approximate something not unlike the myriad forms of small group and peer communication that are characteristic of social groups.</p>
<p>In both the one-to-one and many-to-many registers we find true arenas for linguistic innovation. One reason there has been such proliferation of language in our world (prior to the work of standardized languages like English) is that both intimate and social communication, when unconstrained by institutional pressures that are especially characteristic of broadcast communicative praxes, provide especially fertile ground for experimentation and performative adoption of linguistic and cultural strategies. <cite id="note_21">This seems to me in line, to at least some degree, with the approach toward identity and cultural politics found, for example, in Butler, “Performative Acts” and <span class="booktitle">Gender Trouble</span>, and Spivak, “Acting Bits/Identity Talk” and <span class="booktitle">Critique of Postcolonial Reason</span>.</cite> Outside modern institutionalized standards, language is often perceived less as a set of static elements and rules to be applied according to pre-existing constraints, and more as a cognitive medium for live innovation, deconstruction, creation, interaction. <cite id="note_22">See Golumbia, “History of ‘Language’,” and Harris, <span class="booktitle">Language Machine</span>.</cite> One reason for the proliferation of languages the world over may be that linguistic diversity correlates somewhat directly with a kind of local adaptiveness - providing both for certain kinds of local cultural homogeneity but also for a great deal of areal cultural diversity. <cite id="note_23">See Abram, <span class="booktitle">Spell of the Sensuous</span>, and Maffi, <span class="booktitle">Biocultural Diversity</span>. On local cultural homogeneity, see Sapir, <span class="booktitle">Language</span>. On areal diffusion and its influence on linguistic history see Dixon, <span class="booktitle">Rise and Fall of Languages</span>. 
Derrida, <span class="booktitle">Monolingualism of the Other</span>, offers some provocative reflections on the consequences of monolinguality.</cite></p>
<p>There exists a relatively clear historical line from the monolingual policies and technologies that have been advocated especially by the West to the current relative monolinguality of the Web. <cite id="note_24">On the earlier parts of this history see, for example, Ong, <span class="booktitle">Interfaces</span>, and <span class="booktitle">Orality and Literacy</span>, and, in another register, Anderson, <span class="booktitle">Imagined Communities</span>. On the consequences of the abrupt imposition of such technologies on human societies more generally, see Mander, <span class="booktitle">Absence of the Sacred</span>.</cite> At the same time many of the phenomena decried by critics of the Web - the bad spelling caused by typing emails quickly, poor editing of “fan”-created Web pages, apparently vague “emoticons” - demonstrate the power of noncanonical language to rise above the constraints on which standardization insists, usually for the purposes of social interaction, often far above or beyond meaning per se. <cite id="note_25">In addition to the social approach suggested in Dunbar, <span class="booktitle">Gossip</span>, also see the work of more recent language ideology theorists such as Kroskrity, <span class="booktitle">Regimes of Language</span>, and Schieffelin, Woolard, and Kroskrity, <span class="booktitle">Language Ideologies</span>.</cite> So does the Web’s ability to draw into interaction communities from many different language groups, including groups whose languages have not been part of the standardization process but who nevertheless wish to use the network to speak in other registers. <cite id="note_26">See Crystal, <span class="booktitle">Language and the Internet</span>.</cite> To some extent, then, what seems on the surface least political about the Web may be what is most important: providing raw bandwidth to those whose voices and languages have been pushed away by standardization. 
(However, the relative difficulty of sustaining broadcast media technologies in nonstandard languages such as low-power radio and television stations lends some caution to this view.)</p>
<p>This is not exactly to argue that we should resist technological innovation altogether (though see Mander, <span class="booktitle">Absence of the Sacred</span> and Abram, <span class="booktitle">Spell of the Sensuous</span> for surprisingly compelling statements in this direction). It is to say that, in the realm of linguistic technology, it may well be the case that the stuff of spoken language itself provides a kind of bare technological matter that can help us to restructure social life in significant ways. A more effective Internet may need to be not merely written, but verbal and visual; it may need to accommodate better the full range of human sight, sound and gesture, to allow us to push beyond the linguistic constraints print and standardization have unwittingly placed on us. It may also be interesting to see if it is possible to encourage the development of new, non-roman-script linguistic representations (such as emoticons) which lack strongly standardized underpinnings. If, in fact, some kind of change in language technology is needed to create a more flexible and diverse society (as the IT revolution seems to suggest on its face), we might look just as fruitfully to the innovations produced over tens of generations by thoughtful speakers of human languages, as we do to the more short-term innovations produced in the name of the general reduction of social language to informatic technologies.</p>
<h2>Works Cited</h2>
<p>Abram, David. <span class="booktitle">The Spell of the Sensuous: Perception and Language in a More-than-Human World</span>. New York: Pantheon Books, 1996.</p>
<p>Anderson, Benedict. <span class="booktitle">Imagined Communities: Reflections on the Origin and Spread of Nationalism</span>. Revised and Expanded Edition, London: Verso, 1991.</p>
<p>Birbeck, Mark, Jon Duckett, Oli Gauti Gudmundsson, et al. <span class="booktitle">Professional XML</span>. Chicago: Wrox Press, 2001.</p>
<p>Bolter, Jay David, and Richard Grusin. <span class="booktitle">Remediation: Understanding New Media</span>. Cambridge, MA: The MIT Press, 1999.</p>
<p>Brown, Stuart M. <span class="booktitle">Bioinformatics: A Biologist’s Guide to Biocomputing and the Internet</span>. Natick, MA: Eaton, 2000.</p>
<p>Butler, Judith. “Performative Acts and Gender Constitution: An Essay in Phenomenology and Feminist Theory.” <span class="booktitle">Theatre Journal</span> 40:4 (December 1988). 519-531.</p>
<p>—. <span class="booktitle">Gender Trouble: Feminism and the Subversion of Identity</span>. New York and London: Routledge, 1990.</p>
<p>Chomsky, Noam. <span class="booktitle">Aspects of the Theory of Syntax</span>. Cambridge, MA and London: The MIT Press, 1965.</p>
<p>—. <span class="booktitle">The Minimalist Program</span>. Cambridge, MA and London: The MIT Press, 1995.</p>
<p>Crystal, David. <span class="booktitle">Language and the Internet</span>. New York: Cambridge University Press, 2001.</p>
<p>—. <span class="booktitle">Language Death</span>. New York: Cambridge University Press, 2000.</p>
<p>De Landa, Manuel. <span class="booktitle">A Thousand Years of Nonlinear History</span>. New York: Swerve Editions/Zone Books, 1997.</p>
<p>—. <span class="booktitle">War in the Age of Intelligent Machines</span>. New York: Swerve Editions/Zone Books/MIT Press, 1991.</p>
<p>Derrida, Jacques. <span class="booktitle">Monolingualism of the Other; or, The Prosthesis of Origin</span>. Trans. Patrick Mensah. Stanford, CA: Stanford University Press, 1998.</p>
<p>DeRose, Steven J. “Structured Information: Navigation, Access, and Control.” Paper presented at the Berkeley Finding Aid Conference, Berkeley, CA, April 4-6, 1995. <a class="outbound" href="http://sunsite.berkeley.edu/FindingAids/EAD/derose.html">http://sunsite.berkeley.edu/FindingAids/EAD/derose.html</a>.</p>
<p>Dixon, R. M. W. <span class="booktitle">The Rise and Fall of Languages</span>. Cambridge and New York: Cambridge University Press, 1997.</p>
<p>Dunbar, Robin I.M. <span class="booktitle">Grooming, Gossip, and the Evolution of Language</span>. Cambridge, MA: Harvard University Press, 1996.</p>
<p>Eckstein, Robert. <span class="booktitle">XML Pocket Reference</span>. Sebastopol, CA: O’Reilly, 1999.</p>
<p>Fitzgerald, Michael. <span class="booktitle">Building B2B Applications with XML: A Resource Guide</span>. New York: John Wiley &amp; Sons, 2001.</p>
<p>Golumbia, David. “The Computational Object: A Poststructuralist Approach.” <span class="booktitle">Computers and the Humanities</span> (under review).</p>
<p>—. “Hypercapital.” <span class="booktitle">Postmodern Culture</span> 7:1 (September 1996). <a class="outbound" href="http://www.mindspring.com/~dgolumbi/docs/hycap/hypercapital.html">http://www.mindspring.com/~dgolumbi/docs/hycap/hypercapital.html</a>.</p>
<p>—. “Toward a History of ‘Language’: Ong and Derrida.” <span class="booktitle">Oxford Literary Review</span> 21 (1999). 73-90.</p>
<p>Goody, Esther N., ed. <span class="booktitle">Social Intelligence and Interaction: Expressions and Implications of the Social Bias in Human Intelligence</span>. Cambridge: Cambridge University Press, 1995.</p>
<p>Grenoble, Lenore A., and Lindsay J. Whaley, eds. <span class="booktitle">Endangered Languages: Current Issues and Future Prospects</span>. Cambridge and New York: Cambridge University Press, 1998.</p>
<p>Grimes, Barbara F., ed. <span class="booktitle">Ethnologue</span>. 14th Edition. CD-ROM. Dallas, TX: SIL International, 2000.</p>
<p>Harris, Randy Allen. <span class="booktitle">The Linguistics Wars</span>. New York and Oxford: Oxford University Press, 1993.</p>
<p>Harris, Roy. <span class="booktitle">The Language Machine</span>. Ithaca, NY: Cornell University Press, 1987.</p>
<p>Holland, John H. <span class="booktitle">Adaptation in Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence</span>. Cambridge, MA: The MIT Press, 1992.</p>
<p>Huck, Geoffrey J., and John A. Goldsmith. <span class="booktitle">Ideology and Linguistic Theory: Noam Chomsky and the Deep Structure Debates</span>. London and New York: Routledge, 1995.</p>
<p>Kroskrity, Paul V., ed. <span class="booktitle">Regimes of Language: Ideologies, Polities, and Identities</span>. Santa Fe, NM: School of American Research Press, 2000.</p>
<p>Lakoff, George. <span class="booktitle">Women, Fire, and Dangerous Things: What Categories Reveal about the Mind</span>. Chicago and London: University of Chicago Press, 1987.</p>
<p>— and Johnson, Mark. <span class="booktitle">Metaphors We Live By</span>. Chicago and London: University of Chicago Press, 1980.</p>
<p>— and —. <span class="booktitle">Philosophy in the Flesh: The Embodied Mind and Its Challenge to Western Thought</span>. New York: Basic Books, 1999.</p>
<p>Landow, George P. <span class="booktitle">Hypertext: The Convergence of Contemporary Critical Theory and Technology</span>. Baltimore, MD: Johns Hopkins University Press, 1992.</p>
<p>—, ed. <span class="booktitle">Hyper/Text/Theory</span>. Baltimore and London: Johns Hopkins University Press, 1994.</p>
<p>Lessig, Lawrence. <span class="booktitle">Code and Other Laws of Cyberspace</span>. New York: Basic Books, 1999.</p>
<p>Lunenfeld, Peter, ed. <span class="booktitle">The Digital Dialectic: New Essays on New Media</span>. Cambridge, MA: The MIT Press, 1999.</p>
<p>Lyotard, Jean-François. <span class="booktitle">The Postmodern Condition: A Report on Knowledge</span>. Trans. Geoff Bennington and Brian Massumi. Minneapolis: University of Minnesota Press, 1984.</p>
<p>Maffi, Luisa, ed. <span class="booktitle">On Biocultural Diversity: Linking Language, Knowledge and the Environment</span>. Washington, DC: Smithsonian Institute Press, 2001.</p>
<p>Mander, Jerry. <span class="booktitle">In the Absence of the Sacred: The Failure of Technology and the Survival of the Indian Nations</span>. San Francisco, CA: Sierra Club Books, 1992.</p>
<p>Mann, William. “What Is Communication? A Summary.” Posting to FUNKNET list (February 17, 2001). Archived at <a class="outbound" href="http://listserv.linguistlist.org/cgi-bin/wa?A2=ind0102&amp;L=funknet&amp;P=R391">http://listserv.linguistlist.org/cgi-bin/wa?A2=ind0102&amp;L=funknet&amp;P=R391</a>.</p>
<p>National Federation of Abstracting and Information Services (NFAIS). <span class="booktitle">Proceedings of the Symposium on Metadiversity, 1998</span>. Philadelphia, PA: NFAIS, 1998.</p>
<p>Ong, Walter J. <span class="booktitle">Interfaces of the Word: Studies in the Evolution of Consciousness and Culture</span>. Ithaca, NY: Cornell University Press, 1977.</p>
<p>—. <span class="booktitle">Orality and Literacy: The Technologizing of the Word</span>. London and New York: Routledge, 1988.</p>
<p>Poster, Mark. <span class="booktitle">The Mode of Information: Poststructuralism and Social Context</span>. Chicago: University of Chicago Press, 1990.</p>
<p>Reddy, Michael J. “The Conduit Metaphor: A Case of Frame Conflict in Our Language about Language.” In Andrew Ortony, ed., <span class="booktitle">Metaphor and Thought</span>. Cambridge: Cambridge University Press, 1979. 284-324.</p>
<p>Robins, Kevin, and Frank Webster. <span class="booktitle">Times of the Technoculture: From the Information Society to the Virtual Life</span>. London and New York: Routledge, 1999.</p>
<p>Sapir, Edward. <span class="booktitle">Language: An Introduction to the Study of Speech</span>. London: Granada, 1921 (Reprinted, 1978).</p>
<p>Schieffelin, Bambi B., Kathryn A. Woolard, and Paul V. Kroskrity, eds. <span class="booktitle">Language Ideologies: Practice and Theory</span>. Oxford: Oxford University Press, 1998.</p>
<p>Skutnabb-Kangas, Tove. <span class="booktitle">Linguistic Genocide in Education – or Worldwide Diversity and Human Rights?</span> Mahwah, NJ and London: Lawrence Erlbaum Associates, 2000.</p>
<p>Spivak, Gayatri Chakravorty. “Acting Bits/Identity Talk.” <span class="booktitle">Critical Inquiry</span> 18:4 (Summer 1992). 770-803.</p>
<p>—. <span class="booktitle">A Critique of Postcolonial Reason: Toward a History of the Vanishing Present</span>. Cambridge, MA: Harvard University Press, 1999.</p>
<p>Thacker, Eugene. “Bioinformatics: Materiality and Data between Information Theory and Genetic Research.” <span class="booktitle">CTheory</span> Article 63 (October 28, 1998).</p>
<p>Turkle, Sherry. <span class="booktitle">The Second Self: Computers and the Human Spirit</span>. New York: Simon and Schuster, 1984.</p>
<p>Vose, Michael D. <span class="booktitle">The Simple Genetic Algorithm: Foundations and Theory</span>. Cambridge, MA: The MIT Press, 1999.</p>
</div></div></div><div class="field field-name-field-tags field-type-taxonomy-term-reference field-label-above"><div class="field-label">Tags:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/tags/derrida">derrida</a>, <a href="/tags/spivak">spivak</a>, <a href="/tags/linguistics">linguistics</a>, <a href="/tags/globalization">globalization</a>, <a href="/tags/new-media">new media</a>, <a href="/tags/chomsky">Chomsky</a>, <a href="/tags/delanda">delanda</a>, <a href="/tags/bolter">bolter</a>, <a href="/tags/landow">landow</a>, <a href="/tags/lyotard">lyotard</a>, <a href="/tags/internet">internet</a>, <a href="/tags/code">code</a>, <a href="/tags/informatics">informatics</a>, <a href="/tags/html">html</a>, <a href="/tags/xml">xml</a>, <a href="/tags/hypertext">hypertext</a>, <a href="/tags/java">Java</a>, <a href="/tags/active-server">active server</a>, <a href="/tags/cold-fusion">cold fusion</a>, <a href="/tags/sgml">sgml</a>, <a href="/tags/postmodernism">postmodernism</a>, <a href="/tags/websites">websites</a>, <a href="/tags/bioinformatics">bioinformatics</a>, <a href="/tags/metadiversity">metadiversity</a>, <a href="/tags/deep-stru">deep stru</a></div></div></div>