As becomes clear from the discourse sketched above, a combination of technological, formal, and cultural factors (as well as discursive, practical and institutional ones) has brought about a certain semblance of fixity, trust and endurance, together with a number of conventions related to the preservation of the printed book. It is these conventions, or the disciplining regime Johns talks about, that have privileged certain cuts in intra-action with the book’s material becoming. With the growing use and importance of the digital medium in scholarship, one could argue that the book’s material becoming has altered. However, it is in interaction with the established disciplining regime that its development has been structured. An increasing interest in communicating and publishing humanities research in what can be seen as a less fixed and more open way has nonetheless challenged the integrity of the book, an integrity that the system surrounding it has tried so hard to develop and maintain.

Why is this disciplining regime, and the specific print-based stabilisations it promotes, being interrogated at this particular point in time? First of all, and as was made clear by the history provided above, in order to answer this question we need to keep in mind that this regime has seen a continuing power struggle over its upkeep and constituency, and as such has always been disputed. Nonetheless, changes in technology, and in particular the development of digital media, have acted as a disruptive force, especially since much of the discourse surrounding digital media, culture and technology tends to promote a narrative of openness, fluidity and change. Therefore this specific moment of disruption and remediation brings with it an increased awareness of how the semblances of fixity that were created and upheld in, and by, the printed medium are a construct, upheld to maintain certain established institutional, economic and political structures (Johns 1998). This has led to a growing awareness of the fact that these structures are formations we can rethink and perform otherwise. All of which may explain why there is currently a heightened interest in how we can intra-act with the digital medium in such a way as to explore potential alternative forms of fixity and fluidity, from blogs to multimodal publications.

The construction of what we perceive as stable knowledge objects serves certain goals, mostly to do with the establishment of authority, preservation (archiving), reputation building (stability as threshold) and commercialisation (the stable object as a reproducible product). In Writing Space: Computers, Hypertext, and the Remediation of Print (2001), Bolter conceptualises stability (as well as authority) as a value under negotiation, as well as the product of a certain writing technology: ‘it is important to remember, however, that the values of stability, monumentality and authority, are themselves not entirely stable: they have always been interpreted in terms of the contemporary technology of handwriting or printing’ (2001: 16). This acknowledgment of the relative and constructed nature of stability, and of the way we presently cut with and through media, encourages us to conduct a closer analysis of the structures underlying our knowledge and communication system and how they are set up at present: who is involved in creating a consensus on fixity and stability, and what is valued and what is not in this process?

It could therefore be argued that it is the specific cuts or forms of fixing and cutting down of scholarship that are being critiqued at the moment, while the potential of more processual research is being explored at the same time: for example, via the publication of work in progress on blogs or personal websites. The ease with which continual updates can be made has brought into question not only the stability of documents but also the need for such stable objects. Wikipedia is one of the most frequently cited examples of how the speed of correcting factual errors and the efficiency of real-time updating in a collaborative setting can win out over the perceived benefits of stable material knowledge objects. There has perhaps been a shift away from the need for fixity in scholarly research and communication towards the importance of other values such as collaboration, quality, speed and efficiency, combined with a desire for more autonomous forms of publishing. Scholars are using digital media to explore the possibilities for publishing research in more direct ways, often cutting out the traditional middlemen (publishers and libraries) that have become part of the print disciplining regime they aim to critique. Accordingly, they are raising the question: do these middlemen still serve the needs of their users, of scholars as authors and readers? For example, the desire for flexibility, speed, autonomy etc. has caused new genres of formal and informal scholarly communication to arise; a focus on openness and fluidity is seen as having the potential to expand academic scholarship to new audiences; digital forms of publishing have the potential to include informal and multimodal scholarship that hasn’t been communicated particularly extensively before; and new experimental publishing practices are assisting scholars in sharing research results and forms of publication that cannot exist in print, because of their scale, their multimodality, or even their genre.
Making the processual aspect of scholarship more visible—which includes the way we collaborate, informally communicate, review, and publish our research—and highlighting not only the successes but also the failures that come with that, has the potential to demystify the way scholarship is produced.

From blogging software and social media, to mailing lists and institutional repositories, scholars have thus increasingly moved to digital media and the Internet to publish both their informal and formal research in what they perceive as a more straightforward, direct and open way. This includes the mechanisms developed for the more formal publication of research I discussed in the previous chapter, via either green (archiving) or gold (journal publishing) open access. Nonetheless, the question remains whether these specific open forms of publishing have really produced a fundamental shift away from fixity. In this section I would therefore like to draw attention to a specific feature of openness—a feature that can in many ways be seen as one of its most contested aspects (Adema 2010: 60)—namely, the possibility to reuse, adapt, modify and remix material.[1] It is this part of the ethos or definition of openness (libre more than gratis)[2] that can be said to most actively challenge the concepts of stability, fixity, trust and authority that have accompanied the rhetoric of printed publications for so long (Johns 1998). Where more stripped-down versions of openness focus on achieving access, and on doing so in a way that the stability of a text or product need not be threatened (indeed, the open and online distribution of books might even promote their fixity and durability due to the enlarged availability of digital copies in multiple places), libre openness directly challenges the integrity of a work by enabling different versions of a work to exist simultaneously. At the same time libre forms of openness also problematise such integrity by offering readers the opportunity to remix and re-use (parts of) the content in different settings and contexts, from publications and learning materials, to translations and data mining.
Within academia this creates not only practical problems (which version to cite and preserve, who is the original author, who is responsible for the text) but also theoretical ones (what is an author, in what ways are texts ever stable, where does the authority of a text lie?). Fitzpatrick discusses the ‘repurposing’ of academic content in this regard:

What digital publishing facilitates, however, is a kind of repurposing of published material that extends beyond mere reprinting. The ability of an author to return to previously published work, to rework it, to think through it anew, is one of the gifts of digital text’s malleability—but our ability to accept and make good use of such a gift will require us to shake many of the preconceptions that we carry over from print. (2011: 2)

The ability to expand and build upon, to make modifications and create derivative works, to appropriate, change and update content within a digital environment, also has the potential to shift the focus in scholarly communication away from the product of our publishing and on to the process of researching. It is a shift that, as I discussed previously in this section, may have the ability to make us more aware of the contingency of our research and the cuts and boundaries we enact and that are enacted for us when we communicate and disseminate our findings. It is this shift away from models of print stability and towards process and fluidity (including the necessary cuts) that I want to focus on here, in order to explore some of the ways in which both the practical and theoretical problems that are posed within this development are being dealt with at this moment in time, and whether these should or can be approached differently.

To investigate these potential features of openness, the following section on Remixing Knowledge will analyse a variety of theoretical and practical explorations of fluidity, liquidity and remix, focusing specifically on scholarly research in a digital context. The aim is to examine some of the ways in which scholars within the humanities are dealing with these issues of fluidity and versioning, especially where they concern the scholarly book. This section therefore looks at theories and performative practices that have tried to problematise ideas such as authorship and stability by exploring critically concepts of the archive, selection and agency. At the same time it will offer a critique of these theories and practices and the way they still mostly adhere to fixtures and boundaries—such as authorship and copyright—that have been created within the print paradigm, thus maintaining established institutions and practices. My aim in offering such a critique is to push forward our thinking on the different kind of cuts and stabilisations that are possible within humanities research, its institutions and practices; interruptions that are perhaps both more ethical and open to difference, and which are critical of both the print paradigm and of the promises of the digital.[3] How might these alternative and affirmative cuts enable us to conceive a concept of the book built upon openness, and with that, a concept of the humanities built upon fluidity?

6.2.1 Remixing Knowledge

The ability to reuse and remix data and research to create derivative works is a practice that challenges the stability of a text, and puts into question its perceived boundaries.[4] Within a scholarly context the concept of derivative works also offers the potential to challenge the idea of authorship or, again, the authority of a certain text. The founding act of a work, that specific function of authorship described by Foucault in his seminal article ‘What is an Author?’, can be seen as becoming less important for both the interpretation and the development of a text, once it goes through the processes of adaptation and reinterpretation and the meaning given as part of the author function becomes dispersed (1977). In this section I therefore want to focus on three alternatives to authorship, authority and stability as put forward in discussions on remix; alternatives I will argue are important for knowledge production in the humanities. I will first briefly discuss the concept of modularity, before proceeding to the concept of the fluid text and, related to that, the agency of the selector or moderator; and finally, to the concept of the (networked) archive, by looking at the work of remix theorists Lev Manovich and Eduardo Navas, among others, as well as the writing of the textual critic John Bryant.

6.2.1.1 Modularity

Media theorist Lev Manovich discusses the concept of modularity extensively in his research on remix. He explores how, with the coming of software, a shift in the nature of what constitutes a cultural object has taken place, where cultural content no longer has finite boundaries. Content is no longer received by the user, in Manovich’s vision, but is traversed, constructed and managed. With the shift away from stability in a digital online environment, he argues, there are no longer senders and receivers of information in the classical sense. There are only temporary reception points in information’s path through remix. Culture, for Manovich, is therefore a product constructed by maker and consumer alike, one that is actively modularised by users to make it more adaptive (2005). In other words, culture is not modular; it is (increasingly) made modular in digital environments. However, the real remix revolution lies not in this kind of agency provoked by the possession of production tools. According to Manovich it lies in the possibility this generates to exchange information between media; what in Software Takes Command he calls the concept of ‘deep remixability’. Here, Manovich talks about a situation in which modularity is increasingly being extended to media themselves. The remixing of various media has now become possible in a common software-based environment, along with a remixing of the methodologies of these media, offering the possibility of mash-ups of text with audio and visual content, expanding the range of cultural and scholarly communication (Manovich 2008).

In his writings on remix, Manovich thus sketches a rather utopian future (one that does not take into account present copyright regimes, for instance) in which cultural forms will be deliberately made from Lego-like modular building blocks, designed to be easily copied and pasted into new objects and projects. For Manovich, these forms of standardisation function as a strategy to make culture freer and more shareable, with the aim of creating an ecology in which remix and modularity are a reality. In this respect ‘helping bits move around more easily’ is, for Manovich, a means of devising new ways of performing cultural analysis (2005). These concepts of modularisation and of recombinable data-sets offer a way of looking beyond static knowledge objects, presenting an alternative view on how we structure and control culture and data, as well as how we can analyse our ever-expanding information flows. With the help of his software-based concepts, he thus examines how remix can be an active stance by which people will be able to shape culture in the future and deal with knowledge objects in a digital context.

Within scholarly communication the concept of modularity has already proved popular when it comes to making research more efficient and coping with information overload: from triplets[5] and nano-publications[6], to forms of modular publishing, these kinds of software-inspired concepts have mostly found their way into scientific publishing. Instead of structuring scholarly research according to linear articles, for instance, Joost Kircz argues that we should have a coherent set of ‘well-defined, cognitive, textual modules’ (1998). Similarly, Jan Velterop and Barend Mons suggest moving towards nano-publications to deal with information overload, which can be seen as a move in the direction of both more modularity and the standardisation of research outcomes (Groth et al. 2010).

There are, however, problems with applying this modular database logic to cultural objects. Of course, when culture is already structured and modular, this makes reuse and repurposing much easier. However, cultural objects differ, and it is not necessarily possible or appropriate to modularise or cut up a scholarly or fictional work. Not all cultural objects are translatable into digital media objects either. Hence, too strict a focus on modularity might be detrimental to our ideas of cultural difference. Tara McPherson formulates an important critique of modularity to this end. She is mostly interested in how the digital, privileging as it does a logic of modularity and seriality, became such a dominant paradigm in contemporary culture.[7] How did these discourses from coding culture translate into the wider social world? What is the specific relationship between context and code in this historical context? How have code and culture become so intermingled? As McPherson argues, in the mid-20th century modular thinking took hold in a period that also saw the rise of identity politics and racial formations in the US, hyper-specialisation and niched production of knowledge in the university, and forms of Fordist capitalism in economic systems—all of which represent a move toward modular knowledges. However, modular thinking, she points out, tends to obscure the political, cultural and social context from which it emerged. McPherson emphasises that we need to understand the discourses and peculiar histories that have created these forms of the digital and of digital culture, which encourage forms of partitioning. We also need to be more aware that cultural and computational operating systems mutually infect one another. In this respect, McPherson wonders ‘how has computation pushed modularity in new directions, directions in dialogue with other cultural shifts and ruptures? Why does modularity emerge in our systems with such a vengeance across the 1960s?’ (2012).
She argues that these forms of modular thinking, which function via a lenticular logic, offer ‘a logic of the fragment or the chunk, a way of seeing the world as discrete modules or nodes, a mode that suppresses relation and context. As such, the lenticular also manages and controls complexity’ (McPherson 2012: 25). We therefore need to be wary of this ‘bracketing of identity’ in computational culture, McPherson warns, where it holds back complexity and difference. She favours the application of Barad’s concept of the agential cut in these contexts, using this to replace bracketing strategies (which bring modularity back). For, as McPherson states, the cut as a methodological paradigm is more fluid and mobile (2014).

The concept of modularity, as described by Manovich (where culture is made modular), does not seem able to guarantee these more fluid movements of culture and knowledge. The kind of modularity he is suggesting does not so much challenge object- and commodity-thinking as apply the same logic of stability and standardised cultural objects or works on another scale. Indeed, Manovich defines his modular Lego-blocks as ‘any well-defined part of any finished cultural object’ (2005). There is thus still the idea of a finished and bound entity (the module) at work here, only it is smaller, compartmentalised.

6.2.1.2 Fluid Environments and Liquid Publications

Where Manovich’s concept of modularity mostly focuses on criticising stability and fixity from a spatial perspective (dividing objects into smaller re-combinable blocks), within a web environment forms of temporal instability—where over time cultural objects change, adapt, get added to, re-envisioned, enhanced etc.—are also being increasingly introduced. In this respect, experiments with liquid texts and with fluid books not only stress the benefits and potential of processual scholarship, of capturing research developments over time and so forth, they also challenge the essentialist notions that underlie the perceived stability of scholarly works.

Textual scholar John Bryant theorises the concept of fluidity extensively in his book The Fluid Text: A Theory of Revision and Editing for Book and Screen (2002). Bryant’s main argument is that stability is a myth and that all works are fluid texts. As he explains, this is because fluidity is an inherent phenomenon of writing itself, where we keep on revising our words to approach our thoughts more closely, with our thoughts changing again in this process of revision. In The Fluid Text, Bryant displays (and puts into practice) a way of editing and doing textual scholarship that is based not on a final authoritative text, but on revisions. He argues that, for many readers, critics and scholars, textual scholarship is designed to do away with the otherness that surrounds a work and to establish an authoritative or definitive text. This urge for stability is part of a desire for what Bryant calls ‘authenticity, authority, exactitude, singularity, fixity in the midst of the inherent indeterminacy of language’ (2002: 2). By contrast, Bryant calls for the recognition of a multiplicity of texts, or rather ‘the fluid text’. Texts are fluid in his view because the versions flow from one to another. For this he uses the metaphor of a work as energy that flows from version to version.

In Bryant’s vision this idea of a multiplicity of texts extends from different material manifestations (drafts, proofs, editions) of a certain work to an extension of the social text (translations and adaptations). Logically this also leads to a vision of multiple authorship, where Bryant wants to give a place to what he calls ‘the collaborators’ of or on a text, to include those readers who also materially alter texts. For Bryant, with his emphasis on the revisions of a text and the differences between versions, it is essential to focus on the different intentionalities of both authors and collaborators. The digital medium offers an ideal opportunity to achieve this and to create a fluid text edition. Bryant established such an edition of Melville’s Typee—in both print and online form—showing how a combination of book format and screen can be used to effectively present a fluid textual work.[8]

For Bryant, this specific choice of a textual presentation focusing on revision is at the same time a moral choice. This is because, for him, understanding the fluidity of language enables us to better understand social change. Furthermore, constructionist intentions to pin a text down fail to acknowledge that, as Bryant puts it, ‘the past, too, is a fluid text that we revise as we desire’ (2002: 174). Finally, he argues that the idea of a fluid text encourages a new kind of critical thinking, one that is based on difference, otherness, variation and change. This is where the fixation on the idea of having a stable text to achieve easy retrieval and unified reading experiences loses out to a discourse that focuses on the energies that drive text from version to version. In Bryant’s words, ‘by masking the energies of revision, it reduces our ability to historicize our reading, and, in turn, disempowers the citizen reader from gaining a fuller experience of the necessary elements of change that drive a democratic culture’ (2002: 113).

Alongside Bryant’s edition of Melville’s Typee, another example of a practical experiment focusing upon the benefits of fluidity specifically for scholarly communication is the Liquid Publications (or LiquidPub) project.[9] As described by Casati, Giunchiglia, and Marchese (2007), this is a project that tries to bring into practice the idea of modularity as described previously. Focusing mainly on textbooks in the sciences, the aim of the project is to enable teachers to compose a customised and evolving book out of modular pre-composed content. This book will then be a ‘multi-author’ collection of materials on a given topic that can include different types of documents.

The LiquidPub project tries to cope with issues of authority and authorship in a liquid environment by making a distinction between versions and editions. Editions are solidifications of the liquid book, with stable and fixed content, which can be referred to, preserved, and made commercially available. Furthermore the project creates different roles for authors, from editors to collaborators, accompanied by an elaborate rights structure, with the possibility for authors to give away certain rights to their modular pieces whilst holding on to others. As a result, the LiquidPub project is very pragmatic, catering to the needs and demands of authors (mainly for the recognition of their moral rights), while at the same time trying to benefit from, and create efficiencies and modularity within, a fluid environment. In this way it offers authors a choice of different ways to distribute content, from completely open, to partially open, to completely closed books.

Introducing gradations of authorship such as editors and collaborators, as proposed in the work of Bryant and in the LiquidPub project, is one way to deal with multiple authorship or authorship in collaborative research or writing environments. However, as I showed in chapter 3, it does not address the problem of how to establish authority in an environment where the contributions of a single author are difficult to trace back; or where content is created by anonymous users or by avatars; or in situations where there is no human author, but where the content is machine-generated. What becomes of the role of the editor or the selector as an authoritative figure when selections can be made redundant and choices altered and undone by mass-collaborative, multi-user remixes and mash-ups? The projects mentioned above are therefore not so much posing a challenge to authorship or questioning the authorship function as it is currently established, as they are merely applying this established function to smaller compartments of text and dividing them up accordingly.

Furthermore, the concept of fluidity as described by Bryant, together with the notion of liquidity as used in the LiquidPub project, does not significantly disturb the idea of object-like thinking or stability within scholarly communication either. For Bryant, a fluid book edition is still made up of separate, different versions, while in the LiquidPub project, which focuses mostly on an ethos of speed and efficiency, a liquid book is a customised combination of different recombinable documents. In this sense both projects adhere quite closely to the concept of modularity as described by Manovich (where culture is made modular), and therefore do not reach a fluid or liquid state in which the stability and fixity of a text is fundamentally reconsidered in a continual or processual manner. There is still the idea of the object (the module); however, it is smaller; compartmentalised. Witness the way both of these projects still hinge on the idea of extracted objects: versions in the fluid text, editions in the liquid book. Bryant’s analysis, for example, is focused not so much on creating fluidity or a fluid text—however impossible this might be—but on creating a network between more or less stable versions, whilst showcasing their revision history. He thus still makes a distinction between works and versions, neither seeing them as part of one extended work, nor giving them the status of separate works. In this way he keeps a hierarchical thinking alive: ‘a version can never be revised into a different work because by its nature, revision begins with an original to which it cannot be unlinked unless through some form of amnesia we forget the continuities that link it to its parent. Put another way, a descendant is always a descendant, and no amount of material erasure can remove the chromosomal link’ (Bryant 2002: 85). Texts here are not fluid, at least not in the sense of their being process-oriented; they are networked at most.
McKenzie Wark’s terminology for his book Gamer Theory—which Wark distinctively calls a ‘networked book’—might therefore be more fitting and applicable in such cases, where a networked book, at least in its wording, positions itself as being located more in between the ideal types of stability and fluidity.[10]

A final remark concerning the way in which these two projects theorise and bring into practice the fluid or liquid book: in both projects, texts are actively made modular or fluid by outside agents, by authors and editors. There is not a lot of consideration here of the inherent fluidity or liquidity that exists as part of the text or book’s emergent materiality, in intra-action with the elements of what theorists such as Jerome McGann and D.F. McKenzie have called ‘the social text’—which, in an extended version, is what underlies Bryant’s concept of the fluid text. In the social text, human agents create fluidity through the creation of various instantiations of a text post-production. As McKenzie has put it: ‘a book is never simply a remarkable object. Like every other technology, it is invariably the product of human agency in complex and highly volatile contexts’ (1999). McKenzie, in his exploration of the social text, sought to highlight the importance of a wide variety of actors in a text’s emergence and meaning-giving, from printers to typesetters. He does so in order to argue against a narrow focus on a text’s materiality or an author’s intention. However, there is a lack of acknowledgement here of how the processual nature of the book comes about out of an interplay of agential processes of both a human and non-human nature.

Something similar can be seen in the work of Bryant, in that for him a fluid text is foremost fluid because it consists of various versions. Bryant wants to showcase material revision here, by authors, editors, or readers, among others. But this is a very specific—and humanist—understanding of the fluid text. For revision is, arguably, only one major source of textual variation or fluidity. In this sense, to provide some alternative examples, it is not the inherent emergent discursive-materiality of a text, nor the plurality of material (human or machinic) reading paths through a text, that make a text always already unstable, for Bryant. What does make a text fluid for him is the existence of multiple versions brought into play by human and authorial agents of some sort. This is related to his insistence on a hermeneutic context in which fluid texts are representations of extended and distributed forms of intentionality. As I will ask in what follows, would it not be more interesting to conceive of fluidity or the fluid text rather as a process that comes about out of the entanglement and performance of a plurality of agentic processes: material, discursive, technological, medial, human and non-human, intentional and non-intentional? From this position, a focus on how cuts and boundaries are being enacted within processual texts and books, in an inherently emergent and ongoing manner, might offer a more inclusive strategy to deal with the complexity of a book’s fluidity. This idea will be explored in more depth toward the end of this chapter when I take a closer look at Jerome McGann’s theories of textual criticism.

6.2.1.3 The Archive

As discussed in chapter 3, remix as a practice has the potential to raise questions for the idea of authorship as well as for the related concepts of authority and legitimacy. For example, do moral and ownership rights of an author extend to derivative works? And who can be held responsible for the creation of a work when authorship is increasingly difficult to establish in music mashups or in data feeds, where users receive updated information from a large variety of sources? As I touched upon previously, one of the suggestions made in discussions of remix to cope with the problem of authorship in a digital context has involved shifting the focus from the author to the selector, moderator or curator. Similarly, in cases where authorship is hard to establish or even absent, the archive could potentially establish authority. Navas examined both of these notions as potential alternatives to established forms of authority in an environment that relies on continual updates and where process is preferred to product. Navas stresses, however, that keeping a critical distance from the text is necessary to make knowledge possible and to establish authority. As authorship has been replaced by sampling—and ‘sampling allows for the death of the author’, according to Navas, as the origin of a tiny fragment of a musical composition becomes hard to trace—he argues that the critical position in remix is occupied by whoever selects the sources to be remixed. However, in mashups, this critical distance becomes increasingly difficult to uphold. As Navas puts it, ‘this shift is beyond anyone’s control, because the flow of information demands that individuals embed themselves within the actual space of critique, and use constant updating as a critical tool’ (2010).

To deal with the constantly changing present, Navas turns to history as a source of authority: to give legitimacy to fluidity retrospectively by means of the archive. The ability to search the archive gives the remix both its reliability and its market value, Navas points out. Once recorded, information becomes meta-information: information that is static, available when needed and always in the same form. Retrospectively, this recorded state, this stasis of information, is what makes theory and philosophical thinking possible. As Navas claims, ‘the archive, then, legitimates constant updates allegorically. The database becomes a delivery device of authority in potentia: when needed, call upon it to verify the reliability of accessed material; but until that time, all that is needed is to know that such archives exist’ (2010).

At the same time Navas is ambivalent about the archive as a search engine. He argues that in many ways it is a truly egalitarian space—able to answer ‘all queries possible’—but one that is easily commercialised too. What does it mean when Google harvests the data we collect and our databases are predominantly built upon social media sites? In this respect we are also witnessing an increase in the control of information flows (Navas 2010).

The importance of Navas’ theorising in this context lies in the possibilities his thinking offers for the book and the knowledge system we have created around it. First of all, he explores the archive as a way of both stabilising flow and of creating a form of authority out of fluidity and the continual updating of information. Additionally, he proposes the role of the selector, curator or moderator as an alternative to that of the author. In a way one can argue that this model of agency is already quite akin to that found in scholarly communication, where the selection of resources and the referencing of other sources, alongside collection building, are part of the research and writing process of most academics. Manovich argues for a similar potential, namely that of knowledge producers to modularise data and make it adaptable across multiple media and various platforms, mirroring scientific achievements with standardised metadata and the semantic web.

These are all interesting steps to think beyond the status quo of the book, challenging scholarly thinking to experiment with notions of process and sharing, and to let go of idealised ideas of authorship. Nonetheless, the archive as a tool poses some serious problems with respect to legitimating fluidity retrospectively and providing the necessary critical distance, as Navas positions it. For the archive as such does not provide any legitimation but is built upon the authority and the commands that constitute it. This is what Derrida calls ‘the politics of the archive’ (1996). What is kept and preserved is connected to power structures, to the interests of those who decide what to collect (and on what grounds), and to the capacity to interpret the archive and its contents when it is called upon for legitimation later on. The question of authority lies not so much with the archive as with who has access to the archive and who gets to constitute it. At the same time, although it has no real power of its own to legitimise fluidity, the archive is used as an objectified extension of the power structures that control it. Furthermore, as Derrida shows, archiving is an act of externalisation, of trying to create stable abstracts (1996: 12). A still further critique of the archive is that, rather than functioning as a legitimising device, its focus is first and foremost on objectification, commercialisation and consumption. In the archive, knowledge streams are turned into knowledge objects when we order our research into consumable bits of data. As Navas has shown, the search engine, based on the growing digital archive we are collectively building, is Google’s bread and butter. By initiating large projects like Google Books, for instance, Google aims to make the world’s archive digitally available or to digitise the ‘world’s knowledge’—or at least, that part of it that Google finds appropriate to digitise (i.e. mostly works in American and British libraries, and thus mostly English-language works). In Google’s terms, this means making the information they deem most relevant—based on the specific programming of their algorithms—freely searchable, and Google partners with many libraries worldwide to make this service available. However, most of the time only snippets of poorly digitised information are freely available, and for full-text functionality, or more contextualised information, books must be acquired via the company’s ebook store, Google Play Books (formerly Google Editions). This makes it clear how, in this environment, search is fully embedded within a commercial framework.

The interpretation of the archive is therefore a fluctuating one, and the stability it seems to offer is, arguably, relatively selective and limited. As Derrida shows, the digital offers new and different ways of archiving, and thus also provides a different vision of what it constitutes and archives (both from a producer as well as from a consumer perspective) (1996: 17). Furthermore, the archiving possibilities also determine the structure of the content that will be archived as it comes into being. The archive thus produces just as much as it records the event. In this respect the archive is highly performative: it produces information, creates knowledge, and decides how we determine what knowledge will be. And the way the archive is constructed is very much shaped by institutional and practical constraints. For example, what made the Library of Congress decide to preserve and archive all public Twitter feeds starting from the platform’s inception in 2006, and why only Twitter and not other similar social media platforms? The relationship of the archive to scholarship is a mutual one, as they determine one another. A new scholarly paradigm therefore also asks for and creates a new vision of the archive. This is why, as Derrida states, ‘the archive is never closed. It opens out of the future’ (1996: 45).[11] Therefore the archive does not stabilise or guarantee any concept.

Foucault acknowledges this fluidity of the archive, seeing it as a general system of both the formation and the transformation of statements. However, the archive also structures our way of perceiving the world, as we operate and see the world from within the archive. As Foucault states: ‘it is from within these rules that we speak’ (1969: 146). The archive can thus be seen as governing us, and this again directly opposes the idea of critical distance that Navas wants to achieve with his notion of the archive, as we can never be outside of it. Matthew Kirschenbaum argues along similar lines when he discusses the preservation of digital objects, pointing out that their preservation is ‘logically inseparable from the act of their creation’ (emphasis in the original) (2013). He explains this as follows:

The lag between creation and preservation collapses completely, since a digital object may only ever be said to be preserved if it is accessible, and each individual access creates the object anew. One can, in a very literal sense, never access the “same” electronic file twice, since each and every access constitutes a distinct instance of the file that will be addressed and stored in a unique location in computer memory. (Kirschenbaum 2013)

This means that every time we access a digital object, we duplicate it, we copy it. And this is exactly why, as part of our strategies of conservation, we also (re)create these objects anew with every access. Critical distance is impossible here when we are actively involved in the archive’s functioning. As Kirschenbaum states, ‘the act of retrieval precipitates the temporary reassembling of 0’s and 1’s into a meaningful sequence that can be decoded by software and hardware’ (2013). Here the agency of the archive, of the software and hardware, also becomes apparent. Kirschenbaum refers to Wolfgang Ernst’s notion of archaeography, which denotes forms of machinic or medial writing, or as Ernst puts it, ‘expressions of the machines themselves, functions of their very mediatic logic’ (2011: 242). At this point archives become ‘active ‘‘archaeologists’’ of knowledge’ (Ernst 2011: 239), or as Kirschenbaum puts it, ‘the archive writes itself’ (2013).

Let me reiterate that the above critique is not focused on doing away with either the archive or the creation of (open access) archives: archives play an essential role in making scholarly research accessible, preserving it, adding metadata and making it harvestable. However, I do want scholars to be aware of the structures at play behind the archive, and to question both its perceived stability and its (objective) authority and legitimacy.

6.2.2 The Limits of Fluidity and Stability

The theories and experiments described above in relation to modularity, fluid and liquid publications, new forms of authorship and the archive, offer valuable insights into some of the important problems of, as well as some of the possibilities for, knowledge production in a digital context. I will, however, argue that most of the solutions presented above for engaging with fluidity in online environments still rely on print-based answers (favouring established forms of fixity and stability). The concepts and projects I have described have not actively explored the potential of networked forms of communication to truly disrupt or rethink our conventional understandings of the autonomous human subject, the author, the text, and fixity in relation to the printed book. Although they take on the challenge of finding alternative ways of establishing authority and authorship in order to cope with an increasingly fluid environment, they still very much rely on the print-based concept of stability and on the knowledge and power systems built around it. In many ways they thus remain bound to the essentialisms of this object-oriented scholarly communication system. The concepts of the archive, of the selector or moderator, of modularity, and of fluidity and liquidity neither fundamentally challenge nor form a real critical alternative to our established notions of authorship, authority and stability in a digital context.

As I said before, my critique of these notions is not intended as a condemnation of their experimental potential. On the contrary, I support these explorations of fluidity strongly, for all the reasons I have outlined here. However, instead of focussing on reproducing print-based forms of fixity and stability in a digital context, as the concepts and projects mentioned above still end up doing, I want to examine these practices of stabilising, and the value systems on which they are based. Books are emergent entities. Instead of trying to cope with the fluidity offered by the digital medium by using the same disciplinary regime we are used to from a print context, to fix and cut down the digital medium, I want to argue that we should direct our attention more toward the cuts we make in, and as part of, our research, and toward the reasons why we make them (both in a print and a digital context) as part of our intra-active becoming with the book.

As I made clear in my introduction to this section, instead of emphasising the dualities of fixity/fluidity, closed/open, bound/unbound, and print/digital, I want to shift attention to the issue of the cut; or better said, to the performative agential processes of cutting. How can we, through the cut, take responsibility for the boundaries we enact and that are being enacted? How can we do this whilst simultaneously enabling responsiveness by promoting forms and practices of cutting that allow the book to remain emergent and processual (i.e. that do not tie it down or bind it to fixed and determined meanings, practices and institutions), and that also examine and disturb the humanist and print-based notions that continue to accompany the book?

Rather than seeing the book as either a stable or a processual entity, a focus on the agential processes that bring about book objects, on the constructions and value systems we adhere to as part of our daily scholarly practices, might be key in understanding the performative nature of the book as an ongoing effect of these agential cuts. In the next section I therefore want to return to remix theory, this time exploring it from the perspective of the cut. I want to analyse the potential of remix as part of a discourse of critical resistance against essentialism to question humanist notions such as quality, fixity and authorship/authority; notions which continue to structure humanities scholarship, and on which a great deal of the print-based academic institution continues to rest. I will argue that within a posthumanist performative framework remix can be a means to intervene in and rethink humanities knowledge production, specifically with respect to the political economy of book publishing and the commodification of scholarship into knowledge objects. I will illustrate this at the end of the next section with an analysis of two book publishing projects that have experimented with remix and reuse.