This article provides a critical account of Bogost and Montfort’s Platform Studies series, established in 2009 with their book Racing the Beam: The Atari Video Computer System, which aims to ‘promote the investigation of underlying computer systems and how they enable, constrain, shape, and support the creative work that is done on them’. The article begins with an overview of platform studies, seeking to define the term ‘platform’ within the contemporary digital media industry before outlining Montfort and Bogost’s methodological approach. It then examines the two latest books in the series: Codename Revolution by Steven E. Jones and George K. Thiruvathukal on the Nintendo Wii console; and Jimmy Maher’s The Future Was Here about the Commodore Amiga. It interrogates the extent to which these books continue the project begun by Racing the Beam, while at the same time highlighting some of the limitations of the series’ approach. Lastly, it considers how the execution of the series to date might be counterproductive for its wider goal of promoting the study of digital platforms. The article concludes by considering how future books in the series—and indeed any researchers interested in adopting a platform studies approach more broadly—might address these concerns.

The ‘Platform Studies’ series was inaugurated in 2009 by series editors Nick Montfort and Ian Bogost with the publication of Racing the Beam: The Atari Video Computer System. In that book, Montfort and Bogost laid the foundations for a theoretical framework that centred on what they argue is the “most neglected” level in analyses of digital media: the platform, or the underlying ‘computing systems and computer architecture’ of digital technologies (2009a, p. 147). Racing the Beam was a valuable demonstration of this project, setting the agenda of the series as an intervention into digital media studies. It included a short preface, later expanded on in the book’s afterword, which laid the groundwork for future books in the series. It established the series’ stated aim of seeking to “promote the investigation of underlying computer systems and how they enable, constrain, shape, and support the creative work that is done on them” (p. vii). The first half of 2012 saw the publication of another two books in the series – Codename Revolution by Steven E. Jones and George K. Thiruvathukal on the Nintendo Wii console; and Jimmy Maher’s The Future Was Here about the Commodore Amiga. With the publication of these new books, it seems apt to evaluate how the project initiated by Montfort and Bogost has evolved more than three years since its inception. But it is also worth examining the broader context in which platform studies emerged and what the series has contributed since then towards its goal of promoting the analysis of digital technologies at the level of the ‘platform’.

In this paper, I provide a critical account of Bogost and Montfort’s Platform Studies series, drawing on a close analysis of the two latest publications. I acknowledge that the books in the series to date provide invaluable and often intriguing insights into the production and development of the particular platforms they discuss, as well as the wider culture in which they are situated. I argue, however, that in the process of providing “a place for studies that focus on the platform level” (Montfort & Bogost 2009a, p. 150), the series risks reducing platform studies to a generic formula that limits, rather than expands, the approach’s contribution to studies of digital culture. Despite minor variations in their authors’ disciplinary approaches and focus, the latest books in the series closely emulate the model established by Racing the Beam. Rather than offering a space for platform studies to evolve by critiquing the underlying assumptions and politics of its approach, the series has become a ‘production line’ of texts that apply the same standardised format across each of the volumes. As such, if platform studies is to become something more than a ‘brand’, researchers of digital platforms will need to develop it beyond the archetype established by Montfort and Bogost’s series.

I begin this paper with an overview of platform studies, seeking to define the term ‘platform’ within the contemporary digital media industry before outlining Montfort and Bogost’s approach, which emerged as an ‘intervention’ into these debates. I then examine the two latest books in the series and the extent to which they continue the project begun by Racing the Beam, while at the same time highlighting some of the limitations of the series’ approach. Lastly, I consider how the execution of the series to date might be counterproductive for its wider goal of promoting the study of digital platforms. I suggest that, ironically, by providing a pre-established format for their study, it has diminished the capacity for platform studies to problematise and challenge some of the constraints of Bogost and Montfort’s approach. I conclude by considering how future books in the series – and indeed any researchers interested in adopting a platform studies approach more broadly – might address these concerns.

How to Do Things with Platforms

In their book Biomedical Platforms, Keating and Cambrosio (2003) consider the etymology of the term ‘platform’ and how it has evolved over time. They observe that ‘platform’ originally referred to a material structure, a “flat form – a plane, more or less elevated surface, be it natural (the top of a small hill) or artificial (the top of a flat building)” (2003, p. 26). ‘Platform’ embodied a purely material meaning; one that can be described as neutral, passive and apolitical. While this definition still persists in geology, its present meaning is more figurative. Today, it more commonly refers to “a set of ideas, objectives, and principles supporting a common course of action and upheld by a political party, a union, or any other organized group”; a definition which did not emerge until the 1840s in the United States, where it referred to a tribune, or speaker’s platform, for electoral speeches. At this point, the shift “from the material to the figurative meaning entailed a shift in connotation, from platforms as passive supports to platforms as springboards for future action. In this process, technical and political meanings have become inextricably linked” (p. 27).

This intertwining of the material and metaphorical meaning of the term has been further extended in the contemporary digital media era. Keating and Cambrosio note that in the technology industries the term has come to embody “a basis for change and innovation…computer platforms cut across social institutions such as firms” (p. 28). They thus emerge as a kind of organising logic around which a particular hardware or software platform is developed. As a result, “platforms, not firms, account for the dynamics of technological competition in the sector” (p. 28). As Gillespie (2010, p. 350) illustrates, ‘platform’ thus comes to take on more subtle, discursive meanings as “a progressive and egalitarian arrangement, promising to support those who stand upon it”. This rhetoric, he observes, permeates the marketing of technology giants like Google and Apple, imbuing their products with ‘progressive’ values. In the context of information technology and computing, then, platforms take on political, even ideological, connotations: they are no longer ‘passive’ and ‘transparent’ infrastructure, but are “active, generative, and opaque” (Keating & Cambrosio 2003, p. 326).

Platform studies is squarely situated within this historical trajectory and evolution of the term. It provides a framework for analysing the culture within which platforms are created, taking into account the development of their material, technical components as well as broader social and cultural concerns. As Montfort and Bogost (2009b, p. 2) note, platform studies was first introduced in a paper they gave at the 2007 Digital Arts and Cultures Conference; the book series was subsequently launched in 2009 with Racing the Beam. They frame the series as both an intervention into academic scholarship—calling for “the humanities to seriously consider the lowest level of computing systems and how these systems relate to culture and creativity” (2009a, p. vii)—and as a ‘platform’ of sorts for this to occur.

In differentiating their approach from others in digital media studies, Montfort and Bogost outline what they consider to be “the five levels of digital media” that have been the focus of scholarship over the years. These include the levels of ‘reception/operation’ (media effects and reader-response theory); the interface; form/function; code; and the platform (pp. 145-7). The authors acknowledge that “many studies of digital media and computer games span multiple levels”—such as Bolter and Grusin’s (2000) book Remediation which addresses both the interface and reception/operation levels—but insist that, predominantly, “studies often focus on one” (Montfort & Bogost 2009a, p. 146). Likewise, the platform is but one level and should not be the only level taken into consideration (p. 147).

Nonetheless, the authors’ taxonomy implies that there is a linear progression across the various levels, with the platform as the ‘base’ or most fundamental level. As the ‘lowest level’, it is the formative level, the one that shapes or determines those above it. In describing how platform studies differs from other approaches, for instance, the authors state:

Platform is the abstraction level beneath code, a level that has unfortunately received little attention and acknowledgement, but which has not yet been systematically studied. If code studies are new media’s analogue to software engineering and computer programming, platform studies are more similar to computing systems and computer architecture, connecting the fundamentals of digital media work to the cultures in which that work was done and in which coding, forms, interfaces, and eventual use are layered upon them. (2009a, p. 147; original emphasis)

In this sense, Bogost and Montfort’s approach to platform studies is entangled in the intersection between the material and figurative understandings of platforms that Keating and Cambrosio (2003) identify. It also brings in elements of the ‘new materialist’ or ‘materialist turn’ in digital media studies, whereby theorists are increasingly “prepared to tackle what goes on inside the machine” (Parikka 2012, p. 89; see also Apperley & Jayemane, 2012). Bogost and Montfort state that “a computational platform is not an alien machine, but a cultural artifact that is shaped by values and forces and which expresses views about the world” (2009a, p. 148). As Apperley writes in his review of Racing the Beam for this journal, “the core of the platform studies agenda is to consider how particular platforms…embed material limits into how computer systems may be used, whilst considering how those limits are both challenged and used creatively by programmers” (2009, p. 83). Racing the Beam was a valuable first contribution to this project, offering an account of how the material constraints of the Atari VCS ‘can be seen as providing opportunities for the creative process – not obstacles’ (Montfort & Bogost 2009a, p. 140).

The Platform Studies series thus recognises that the technical limitations and constraints of platforms are not inhibitors to creativity, but rather shape and often generate the creative labour that is produced with them. This approach resonates with Peter Krapp’s compelling argument in his recent book Noise Channels. In it, he examines how “digital culture taps reservoirs of creative expression under the conditions of networked computing” by embracing “the limitations and closures of computing culture…rather than trying to overcome them” (2011, pp. ix-x). The structure of Racing the Beam similarly demonstrates this approach through a close technical analysis of several key Atari VCS titles released over the course of its history. It concludes with the videogame industry ‘crash’ of 1983 and considers the console’s ongoing relevance and the creative work that continues to be done with it. It goes against the grain of prevailing attitudes which consider ‘old’, dated platforms as ‘obsolete’, recognising them instead as continuing to generate creative potential. As James Newman writes in a recent book on the subject,

platforms are superseded and eventually rendered obsolete as games are no longer available for them, while the games themselves slip from view as they are superseded by new, faster, “better” versions that, for their part, can only be played on current generations of hardware. When we see old games and platforms referred to at all…it is often in this comparative mode, as a baseline by which we are invited to judge the additional processing power or graphical resolution of the replacement. (2012, p. 9)

Broadly speaking, then, since the publication of Racing the Beam, platform studies continues to speak to the concerns of current theorists of digital media by challenging the progressivist assumptions of media studies and acknowledging the material limitations of computer systems as sites for creativity and contestation. The methodology established by the series editors in the foreword is, at first glance, refreshingly open in this regard. They avoid prescribing a single theoretical or critical approach, instead listing several common traits that all books will contain. These include a focus on “a single platform or a closely related family of platforms”; a rigorous technical analysis of these platforms; and an examination of their wider cultural and social importance (2009a, pp. vii-viii).

With the publication of another two books in the series, however, it has become evident that the execution of the series has limited this approach in other ways. Platform studies excels at identifying why the platform should be acknowledged in digital media studies and the wider humanities, and at promoting this cause. But demonstrating ‘an awareness’ of the platform has become the series’ mantra: the series simply provides a place for the format established by Montfort and Bogost to be recycled from one platform to another. This is illustrated by the way subsequent books in the series have chosen to conform closely to the generic format and framework established by Racing the Beam as the prototypical entry in the series, albeit with some variations. In the following sections, I examine the latest entries in the series with a particular emphasis on how they expand, to some degree, the project initiated by Montfort and Bogost. I then consider how platform studies might expand the series’ scope even further, addressing the concerns flagged at the beginning of this section by critiquing the notion of studying the platform itself.

The Wii Platform: Platform Studies for the Current Generation

The first book to follow Racing the Beam in 2012 is Steven E. Jones and George K. Thiruvathukal’s Codename Revolution: The Nintendo Wii Platform. The book can be seen as an expansion of Jones’ chapter on the Wii in his book The Meaning of Videogames, which was itself “inspired in part” by the announcement of the Platform Studies series (2008, p. 127). Jones is a ‘textual studies scholar’ whose work focuses on how “manuscripts, print technologies, and publishing all work together as a system to afford and constrain the meaning of poetic texts.” Meanwhile, Thiruvathukal comes from a more traditional computer science background: his current research is about “distributed systems and pervasive computing—a space dominated by low-powered devices” (2012, p. 6). These two approaches gel effectively to analyse how the Wii edged out its competitors by combining motion-sensor technology and a “smaller, nimbler, lower-powered” design (p. 33) with a marketing and distribution model that established the console as first and foremost a “social platform”.

The Wii was released in 2006 and is Nintendo’s fifth home console, part of the seventh ‘generation’ of home consoles overall (the others being Sony’s PlayStation 3 and Microsoft’s Xbox 360). Its most distinguishing design characteristic is its pioneering use of motion-sensor technology through the console’s Wii Remote controller. The Wii Remote is able to detect movement three-dimensionally and map the player’s hand motions – waving, pointing, swinging and jabbing the controller – into the virtual game environment, making it primarily a physical or ‘kinaesthetic’ gaming activity (see Nansen, 2009). This capability became the selling point of the console as it sought to carve out a niche from its competitors by targeting casual gamers and, crucially, those who have never played a game before. Nintendo thus promoted the Wii as a console ‘anyone can pick up and play’. Jones and Thiruvathukal contend that this is what makes the Wii a ‘social platform’: although all platforms are in a sense social systems designed for interaction among individuals, the Wii was engineered from the ground up as “the first home video game platform consciously designed as a whole…[to] promote social gameplay out in physical space” (2012, p. 4).

Earlier chapters in the book focus on the conceptualisation and mechanics of the Wii’s hardware, such as the console itself and the Wii remote controller, as well as peripherals released for the system such as the Wii Balance Board (a flat, motion-sensitive board that is used for the fitness/training software Wii Fit). The authors note that the physical appearance of the Wii diverged from other consoles in order to snare the casual and non-gamer market, with an emphasis on “quiet and efficient technologies” and “small, slim design” (Jones & Thiruvathukal, 2012, p. 30). As a result, “graphic realism in general was set aside…or it was at least relegated to a much lower priority than it usually is on other game consoles” (p. 35). Similarly, the simple, accessible design of the controller—which, as its name suggests, was modelled to look like a TV remote (p. 54)—contrasts sharply with the increasingly complex, sophisticated game controllers of consoles like the PlayStation 3 and Xbox 360, with their growing numbers of buttons and control configurations.

A focus on technical rigour and the material architecture of computing technologies is a hallmark of Platform Studies – and, depending on one’s tastes, also its most laborious trademark feature. But while there is plenty of technical analysis in Codename Revolution, particularly in the early chapters, it doesn’t get bogged down in detailed descriptions of hardware, coding and game mechanics to the same extent as Racing the Beam. This is understandable given the much more technologically sophisticated nature of the console, which as the authors note “only invokes a feeling of simplicity…The Wii, like other computing devices, is actually a relatively complex system” (p. 27, original emphasis). In contrast to the Atari VCS, which is perhaps much easier for programmers to grasp and ‘pull apart’, the Wii is a typically ‘closed’ system with its technological core concealed beneath its sleek and seemingly simplistic veneer. In this sense, the Wii is consistent with Gillespie’s observation that technologies increasingly “submerge their workings” to discourage tinkering with the machine’s material structure, so that “the ‘black box’ of technology is itself being further black boxed” (2007, p. 239). This is in contrast to old school computer systems like the Atari VCS, for instance, which have given rise to everything from glitch electronica to the development of ‘retro’ games that deliberately exploit their archaic architecture (see Krapp, 2011).

Jones and Thiruvathukal offer, however, a welcome alternative reading of the ‘open vs. closed’ debate later in the book when they examine Wii hacks and mods. They contend that the Wii demonstrates different degrees of openness through its gadgets and peripherals like the Wii Remote and sensor bar, which have been transformed through both gaming and non-gaming hacks and mods. Examples of this include Johnny Chung Lee’s projects that reverse-engineer these devices, allowing them to be used as an electronic whiteboard or 3D modelling technology (2012, p. 131). They argue that such experimentation with the console is demonstrative of a wider understanding of “openness” which takes into account not just the physical architecture of the system, but also the way in which its “networking protocols and standards” create an environment for experimentation and creative exploitation (2012, pp. 126-8). Here, as with Racing the Beam, the varying degrees of openness and closure of the system’s material components are recognised as spurring creativity in a way that is “sanctioned” by the architecture of the system, but which also pushes these boundaries (p. 133).

Perhaps the book’s most valuable technical insight comes in the final chapter, ‘After the Revolution’, when Jones and Thiruvathukal contrast the Wii with the other motion-sensitive gaming platforms that have emerged since its release: the Xbox Kinect and (to a lesser extent) the PlayStation Move. They suggest that while the Wii was the forerunner in the shift towards the ‘casual revolution’ aimed at capturing the casual gaming market, it no longer holds this privileged position as the Kinect has (to a certain extent successfully) sought to cut into this demographic. In a useful contrast, though, they contend that the Kinect differentiates itself from the Wii by focussing on ‘full immersion’ in the game environment. Unlike other kinaesthetic gaming platforms like the Wii and Move, both of which use a physical artefact (such as the Wii Remote) as the device that mediates between the player’s physical actions and the virtual game environment, the Kinect seeks to eliminate this interface between the player and game world with its motto “You Are the Controller.” They write,

[The Kinect’s] sensors and cameras work by capturing and mapping the living room the way that certain security systems do, rendering it as a field, the disturbances of which can be detected as bodies in motion. In this sense, it turns your living room into the game space in a much more literal way than the Wii’s motion-control system does. (2012, p. 165)

Although the authors do not explicitly make this theoretical connection, their reading suggests that the Wii could be conceived as illustrative of Bolter and Grusin’s (2000) notion of “hypermediacy”, in that it draws attention to the interface through which you interact with the game. As you swing and wave the Wii Remote the way you would a sword or baton, you’re constantly reminded of the materiality of the device that acts as a mediator between the physical space of the living room and the virtual world of the game on the television screen. Meanwhile, the Kinect is more closely aligned with the notion of ‘immediacy’, in that it seeks to efface these distinctions: it is designed to “turn your living room into a sublime, transcendent game space, realizing the fantasy of cyberspace or the holodeck” (Jones & Thiruvathukal, 2012, p. 164). This contrast is further reinforced by the different marketing strategies employed by Nintendo and Microsoft in promoting their hardware (pp. 163-166).

These observations are insightful and illuminating. They demonstrate the extent to which a platform studies approach can provide a close technical reading of a system’s functionality, interface design and affordances, while examining broader transformations like the increasingly competitive attempts to redefine the gaming market. But what Codename Revolution offers in informed analysis of the Wii platform and its role in bringing “the social nature of game platforms to the centre” (Jones & Thiruvathukal, 2012, p. 170), it lacks in more comprehensive theoretical engagement. In their theoretical approach, the authors rely almost solely on Juul’s book A Casual Revolution (2010): a useful study of the turn towards casual and social gaming, but by no means the only account of the videogame industry that is relevant to their discussion. For instance, the authors could engage more with arguments around the “capture” of game labour (Dyer-Witheford and de Peuter 2009; Kücklich 2005; Sotamaa 2010) in their account of Wii hacks and mods. In particular, there is scope for this kind of analysis in the context of the WiiWare digital distribution system, and the way this creative work becomes assimilated back into the games industry.

Instead, the authors of Codename Revolution closely follow the path laid out by Racing the Beam, albeit with a newer console – and one that, as of the book’s publication, had not reached the end of its mainstream commercial lifecycle. This makes for an interesting new step for Platform Studies that allows the book to provide a different set of observations related to the console’s more complex, and less malleable, architecture. But it doesn’t substantially depart from the formula developed by the first book in the series, with the close analysis of a single platform framed by a lucid discussion of its technical capabilities and constraints in constant focus.

The Commodore Amiga: Back to the Future (Via the Past)

Jimmy Maher’s book The Future Was Here is perhaps even closer to Montfort and Bogost’s inaugural book in the series, both in its subject and its structure. As mentioned, Codename Revolution has to deal with the mechanics of a ‘current generation’ console (which, as of its publication, had yet to be succeeded by the Wii U and the ‘eighth generation’ of videogame consoles). However, The Future Was Here, like Racing the Beam, has the benefit of hindsight: the Atari VCS was released in 1977, and the Commodore Amiga in 1985. There is undoubtedly something of a continuation in the narrative arc between the books: Racing the Beam ends with the videogame industry ‘crash’ of 1983 and the subsequent demise of Atari’s dominance of the early commercial videogame industry. Maher’s book naturally picks up from here historically, situating the conceptualisation and development of the Amiga as being directly influenced by the fallout of these events. Early in the book, for instance, he notes that Amiga sought to avoid being tainted by the industry’s implosion by deliberately branding the machine as a personal computer rather than just a game console (Maher, 2012, p. 17). As such, the development of the Amiga is examined as both a response to the impending threat of the game industry’s collapse and a visionary exploration of the emerging capabilities of the personal computer that would eventually become a commonplace, mundane feature of contemporary culture.

The Future Was Here is an exhaustive and comprehensive account of the Amiga platform. Similar to Racing the Beam (and in contrast to the more thematic structure of Codename Revolution), each chapter of the book is devoted to a particular piece or type of software or demo designed for the system. These include the famous Boing demo; Deluxe Paint and its successors; the platform’s operating system AmigaOS and its scripting language ARexx; and development companies like NewTek. At nine chapters and more than 300 pages, however, The Future Was Here is much longer and perhaps more hard going than previous books in the series. While the book has an overarching argument—that the Commodore Amiga, with its innovative hardware features, pioneering applications and ambitious conceptualisation, can be seen as “the world’s first true multimedia PC” (Maher, 2012, p. 5)—at times it seems to be more of a companion handbook to the system. It is replete with technical specifications, programming instructions and detailed deconstructions of various programs and applications, to the extent that the lay reader not familiar with the intricacies of computer programming—myself among them—might struggle to extract value from every page.

This attempt to reach out to enthusiasts of the system is reinforced by the accompanying website for the book (http://amiga.filfre.net), which provides a wealth of technical resources and aids such as images and video clips to accompany the explanations provided in each chapter, as well as programs that can be downloaded and run on an actual Amiga or an emulator. This is a welcome addition, both in providing a deeper level of technical analysis than even Racing the Beam managed and in breaking out of the mould of the series by bringing in something of the author’s personal expertise. The book is also designed in such a way, though, that certain sections can be skimmed by readers less interested in technical knowledge and more inclined towards the machine’s history and evolution. By balancing these different concerns, the book addresses Platform Studies’ goal of being useful both to scholars and ‘the general public’; but it also risks trying to cater to both audiences without fully engaging either.

The Future Was Here, as its title suggests, considers the Amiga as a revolutionary platform that introduced many features and applications that are by now a staple of computing systems. Maher, then, approaches the Amiga as an “important link between the pioneering early years of personal computing and the ubiquitous digital culture of today” (2012, p. 249). He argues that the Amiga can be seen as a precursor to many of today’s technological innovations, from online video-sharing, digital cameras and MP3 players to the open source movement and platforms like Linux (pp. 5-7). For instance, Maher argues that in an era before tools like YouTube and vodcasting existed, the Amiga played a crucial role in ‘democratizing’ the ‘means of cultural production’ by making editing and production software available not just to professionals or the wealthy, but to a generation of dedicated amateurs. This, Maher writes, is “the Amiga’s most exciting and lasting legacy” (p. 142). In another link with Racing the Beam’s account of the emerging videogame industry, he also interestingly notes how EA—today regarded as one of the most generic videogame development companies, producing a seemingly endless string of sequels, remakes and licensed games to tie in with blockbuster films—were once at the forefront of experimentation with the possibilities of digital art (p. 45). For all its technical jargon and lengthy deconstructions of software programming, The Future Was Here is often illuminated by insightful anecdotes that focus as much on the role that companies and communities played in the platform’s development as its technological capabilities.

At times, though, the book is overeager in drawing links between the Amiga and contemporary digital technologies, without the rigorous historical, discursive analysis that would be required of a scholarly work to make these connections. In what he describes as a ‘bold’ move, Maher claims that the Amiga was a direct precursor to the participatory culture embodied by services like YouTube and Flickr, and similarly can be seen as prefiguring the rise of open-source software through projects like Fred Fish’s library of free software and the fact that many of those who worked on the Amiga ended up as Linux users and designers (2012, p. 7). In a nod to the authors of Codename Revolution, Maher even posits that the Joyboard—an early game controller for the Amiga that consists of a flat board players stand on and use to control the game by moving their body from side to side—is a ‘forerunner’ of the Wii’s motion-sensor controls (p. 153). These arguments are made without the kind of broader theoretical debates—such as a deeper account of the formation and evolution of the open source movement, drawing on the many studies of it published in recent years—that are necessary to draw a more convincing link between these developments.

In these moments, as with Codename Revolution, it becomes clear that the focus on a single platform—through its material limitations and the culture in which it emerged—can be too confining. There is little room for the kind of deeper theoretical considerations that would lend arguments like Maher’s in The Future Was Here—about the Amiga’s foundational role in the ubiquitous computing ideology that emerged in the 1990s and continues to influence digital culture today—more credence and weight. But as I will suggest in the following section, it is partly the execution of the platform studies project through the book series that produces these constraints. The Future Was Here is an impressive book, one that clearly benefits from an enormous amount of research and attention to detail and that, despite its daunting technological fixation, is really quite accessibly written. But, like its predecessor Codename Revolution, it neither challenges nor expands platform studies to any significant extent; even as it offers a critical appraisal of the Amiga’s history, it never moves beyond a kind of hybrid technical handbook/scholarly textbook account of the platform, in much the same format that previous books in the series have offered. This is by no means to dismiss the work put into what is an accomplished analysis of the system. Rather, it is illustrative of what I contend is the problematic execution of the series itself, which needs to be addressed in order to expand its scope further than what has been accomplished so far.

Stepping Beyond the Platform

The Platform Studies series is undoubtedly highly polished and remarkably consistent across the three volumes to date, both aesthetically and structurally. The books themselves are superbly designed and laid out, and carefully edited to maintain a uniform quality and structure across the series. But this uniformity also underscores a recurring problem: the books are perhaps too consistent and undeviating, too closely conforming to the prototype established by Racing the Beam, to fully take up the challenge to digital media scholarship proposed by Montfort and Bogost’s afterword to that book.

As mentioned, Montfort and Bogost do not prescribe a particular approach for the series, leaving it open to contributions from any and all theoretical fields and research backgrounds. There are, as discussed, differences and variations between the books so far: Codename Revolution takes a somewhat different approach to its predecessor by discussing a console that is not at the end of its “lifecycle”. Further, structurally—in contrast to the other two books—it devotes each chapter to different aspects of the system rather than to specific software or “moments” in its history. Nonetheless, the three books to date share a common methodology that closely conforms to the format established by Montfort and Bogost, rendering platform studies almost as a kind of brand, or as so many variations on a theme, rather than taking it in a potentially more challenging direction.

It is almost as if, in a strange irony, in the very act of providing “a place for studies that focus on the platform level” (Montfort & Bogost, 2009a, p. 150) the series has instead reduced platform studies to a generic formula that can be applied to whatever platform is called for. One can imagine an endless production line of books—one on the Magnavox Odyssey or Sega Dreamcast, another on Java or Microsoft DOS—that are valuable in themselves, but which do not expand on the established formula of the series. Indeed, a recent article by Montfort and Mia Consalvo (2012) framing the Dreamcast as a “console of the avant-garde”—and which reads almost like a proposal for a forthcoming book in the series—suggests at least one potential future volume is likely to follow this archetype. The article is structured around a close analysis of five specific games for the console, examined within the culture of creativity in which it was situated, not unlike Racing the Beam.

There is, of course, nothing wrong with the quality or level of analysis in the three books produced thus far, and it would by no means be a great loss if several more volumes were produced that perpetuate the same framework. But there are other directions in which the platform studies project could go, and the series could do more to cultivate them. I would like to propose here a number of brief critiques of Montfort and Bogost’s platform studies model that might be considered in future research on platforms, whether through the book series or scholarship more broadly. The first concerns platform studies’ relationship with the shift towards materialism in media studies, and the focus on the hardware, software programming and material components that make up platforms. In their article “Game Studies’ Material Turn”, Apperley and Jayemane write that platform studies’ flexibility lies in the fact that

The materiality of platforms can be turned inwards to examine the individual components of a platform, and just as easily outwards to focus on the organizational structure that allows the platform to be produced. The genius of platform studies is to locate the platform as the stable object within this complex, unfolding entanglement, allowing it to perform the role of a centre around which other relationships may be traced and examined. (2012, p. 12)

So far, the books in the series have recognised this potential, but there are other ways in which they could move beyond the characteristics favoured by Bogost and Montfort to expand this relationship between the material and the social further. Apperley and Jayemane observe that, interestingly, discussions of the materiality of videogames often overlook the material traces of the object itself, such as the “feel of the console and the controller” (2012, p. 16). There is also scope for tracing where the physical components of these platforms—the microchips, processors, cables, casing and so forth—are produced, and the material labour that goes into them. As Dyer-Witheford and de Peuter somewhat dryly point out, “ultimately the components of game machines come from sources such as the mines of the Congo and end up in the electronic waste dumps of Nigeria and India” (2009, p. xviii). In contrast to the series’ overarching focus on technical hardware and programming, then, a deeper critique of the embodied materiality of platforms would situate platform studies more concretely within current research on “new materialism” in media studies (see Parikka 2012).

There is also room to critique the very project of platform studies itself, and the underlying discursive and ideological connotations that the term carries. I gestured towards this concern in my discussion of Keating and Cambrosio’s (2003) etymology of the term. As I pointed out, Gillespie goes even further in recognising how platforms have become imbued with rhetorical values by software developers and media industry giants that make them more appealing to consumers. He observes how organisations like Google have “positioned themselves as champions of freedom of expression, and ‘platform’ works here too, deftly linking the technical, figurative and political” (2010, p. 356). Given that platform studies to date has focused on the role the systems it discusses play in the ongoing battle for dominance of the gaming industry, and in introducing technological innovations into the lives of their users, a more self-reflexive approach to the concept of the platform in digital culture could yet emerge. In particular, platform studies has struggled to a certain extent to differentiate itself from this celebratory recuperation of the term and to move beyond an analysis of the “rise and fall” of the videogame industry and other creative industries, as evidenced by the theoretical shortcomings of Codename Revolution and The Future Was Here that I outlined earlier.

Platform studies is a relatively new concept, and while Montfort and Bogost may have coined the term and played an instrumental role in shaping its existence through their series, it is by no means the only path forward for analysis of the platform. Indeed, the editors express their hope that “others will choose to undertake studies that centre on platforms themselves”. With the exception of a relatively few critical takes on both the series and the concept of the platform itself as a framework for study (Apperley & Jayemane 2012; Gillespie 2010; Keating & Cambrosio 2003), the Platform Studies series remains the dominant mechanism for these discussions. As such, if the project is to evolve and adapt beyond the overly standardised, generic framework of the series, an even more radical intervention might be needed into the study of the platform “level”—both from new books in the series and from scholars of digital media more widely.

Conclusion

Platform studies is a fascinating and exciting agenda, and one that holds a great deal of promise for expanding the scope of digital media studies. To date the series has offered three excellent books that take up this cause in slightly different ways. But if Platform Studies is to fully develop its intervention in opening up the study of digital media to a new level that significantly departs from previous approaches in new media scholarship, it needs to move beyond the archetypal approach established thus far. This entails becoming more self-reflexive about what it means to focus on the platform as an object of theoretical analysis, and taking up some of the problems that I and others have raised with the approach. Codename Revolution and The Future Was Here are full of insightful observations and enlightening anecdotes about their respective platforms, but they highlight the extent to which a single, closely edited series might blunt the radical potential of the project. In this paper, I have offered a challenge to Platform Studies: to take the framework established thus far in new directions that fulfil the promise of Racing the Beam that “studying what is underlying and assumed—the platform—is rewarding in all sorts of digital media research” (Montfort & Bogost, 2009a, p. 150).

Montfort, N. and Consalvo, M. (2012). The Dreamcast, Console of the Avant-garde. Loading… The Canadian Journal of Game Studies 6(9). http://journals.sfu.ca/loading/index.php/loading/article/viewArticle/104.

I would like to thank Luke van Ryn and the anonymous reviewers of this article for their useful suggestions.

Biographical Statement

Dale Leorke is a PhD candidate in the School of Culture and Communication at the University of Melbourne, Australia. His thesis examines location-based gaming and play in public space, using case studies of games that merge mobile and location-aware devices with physical locations for playful interaction in urban space. His most recent work can be found on his research page: http://unimelb.academia.edu/DaleLeorke


DCE is an online, open access journal. It does not charge for article submission or for publication. All manuscripts submitted to DCE are double blind reviewed. Articles are published through a Creative Commons (CC) License and made available for viewing and download on a bespoke page at www.digitalcultureandeducation.com
