
- Oct 9, 2013 OnlineFirst Version of Record

- Feb 20, 2014 Version of Record

Downloaded from bod.sagepub.com at University of Hull on April 24, 2014

Article

Where Bodies End and Artefacts Begin: Tools, Machines and Interfaces

Daniel Black
Monash University, Narre Warren, Australia

Body & Society, 2014, Vol. 20(1) 31–60
© The Author(s) 2014
Reprints and permission: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/1357034X13506946
bod.sagepub.com
www.theoryculturesociety.org

Corresponding author: Daniel Black, Monash University, PO Box 1071, Narre Warren, Victoria 3085, Australia. Email: Daniel.Black@monash.edu

Abstract
Our use of artefacts has at different moments been characterised either as replacing or impoverishing our natural human capacities, or as a key part of our humanity. This article critically evaluates the conception of the natural invoked by both accounts, and highlights the degree to which engagement with material features of the environment is fundamental to all living things, the closeness of this engagement making any account that seeks to draw a clear boundary between body and artefact problematic. By doing this I seek to clarify the nature of our embodied relationship with various kinds of artefacts; moving from tools to machines to digital interfaces, I consider their differing potentials to be gathered into the body schema, and thus to change our embodied horizons of perception and action. While much research currently seeks to facilitate a more 'natural' mode of interacting with technology, I argue that such a mode of interaction does not exist outside the particularity of our relationships with specific objects. As a result, rather than trying to cater to supposedly more natural modes of action and perception, future technologies should aim to enrich our experience with new modes, inviting novel relationships that produce new kinds of sensory and other experience.

Keywords
digitality, embodiment, phenomenology, technics, technology, tools

Where Machines Begin

All the machines with which we interact serve to a greater or lesser extent to replace human action or respond to human expression or sensation. This article seeks to clarify the nature of our embodied relationship with technological artefacts, and to make some suggestions regarding the ways in which the development of ever more complex artefacts has altered, and might in future further alter, the nature of that relationship. In order to do this I will move from a consideration of the tool, to the machine, to the human–computer interface. Existing periodisations of this evolutionary movement towards increasingly autonomous technological artefacts tend towards a lapsarian narrative, in which our original self-sufficiency and direct manipulation of the environment is taken away from us by our machines, while a coming age of the 'natural user interface' (NUI) might return us to the Eden of spontaneous and direct interaction. After critically discussing the ways in which tools and machines have been contrasted on the grounds of their respective relationships with the natural, I will then critique the NUI's posited relationship with this nature, and argue for the value of artefacts not in complementing natural modes of action and perception, but rather in engaging our natural capacity to develop new ways of acting and sensing through objects.

During his development of the phonograph, Thomas Edison toyed with the idea of providing the machine with a voice chamber modelled on the human mouth, complete with teeth and tongue (Wood, 2002: 128). There is a sense of redundancy to these artificial anatomical features, but the phonograph is not the only machine to have produced an initial confusion regarding how much of the human body it should replace.
For example, Friedrich von Knauss, the inventor of the first typewriter, came to the device through a series of evolutionary steps: the first four writing machines created by Knauss, between 1753 and 1760, were actually androids, reproductions of the entire human body, which did their writing with hands and arms like a living writer (Bedini, 1964: 39). The typewriter, like the phonograph, was an attempt to replace human activity, and, as with the phonograph, there is the suggestion of an initial vagueness regarding how directly to recreate the human bodily actions upon which it was modelled, and thus how much of the human body should be replaced by the machine.

Such cases might be dismissed as resulting from a past naïvety regarding technology, evidence that, prior to our present intimacy with machines, people were less adept at identifying the point where machines should end and bodies begin. On the other hand, however, it might just as easily be argued that our contemporary intimacy with machines makes identifying this point more difficult than ever.

Take the following example. The spread of multi-touch interfaces, through digital devices such as the iPhone, has resulted in numerous research projects aimed at developing haptic feedback.
The great limitation of touch interfaces is that, while they free us from separate hardware inputs by allowing the user to simply touch the screen, they provide no tactile feedback, such as that which allows a touch typist to find the correct keys with fingertips alone.

Most attempts to create haptic feedback have employed methods such as vibrations or electrical fields to create sensory stimulation,[1] but not so in the case of a haptic feedback system being developed by the Kawasaki and Mouri Laboratory, which was demonstrated at the 2010 Automotive NEXT Industry Fair in Japan.[2] What is striking about the approach taken by this research is that it seeks to create the experience of actually exploring the shape of a solid object in space with the hand, rather than simply feeling a bump or buzz when pressing a virtual button, and to do this it employs a robotic hand, called Hiro III (Endo et al., 2011). The user looks at virtual objects on a computer screen while, hidden from view, her fingertips are locked to those of the robot with magnetic thimbles. Because the robot can realistically reproduce the movement of a human hand and fingers, it can expertly shepherd the fingers of the human operator in any anatomically possible way. By miming the inverse of a human hand's movement over its contours, the robot can enforce the manual movements that would accompany contact with a physical example of the virtual object on the screen.

So, just as Edison's mouthed phonograph would have reproduced the mouth of the speaker, the robot hand reproduces the hand of the toucher. Whereas robots are usually understood to be surrogate bodies, acting upon the world in place of a human being, Hiro III reverses this, acting on a human body in place of the world.
It is a mirror image of the hand, which creates the sensation of interacting with illusory objects situated on the other side of the mirror.

Rather than being anomalies born of a misidentification of the boundary separating body from machine, these examples can be taken to indicate the impossibility of ever fixing such a boundary. There must always be some degree of overlap and reversibility. But is it possible to give an account of this area of overlap and reversibility that does not either essentialise the body and naturalise its modes of sensation and action on the one hand, or lose the specificity of the body and embodied experience on the other? Our embodied relationship with artefacts cannot be explained adequately by drawing a hard boundary between the body proper and the environmental features with which it interacts, or sharp contrasts between how we sense and act with objects and without. At the same time, it cannot be explained adequately by representing the body as if it were just one more object interacting with other objects, and seeing our embodiment as simply (re)constituted by the technologies that surround it.

Tools as Nature

At least since the Industrial Revolution, our dependence upon artefacts has regularly been characterised as threatening to dehumanise or enslave us, turning us into extensions of or motors for the machines originally intended to serve us. More recently, however, our relationship with tools has been put forward as definitional to humanity. Tools would seem to be the artefacts with which we have the most intimate relationships, being arguably the artefacts most difficult to understand in isolation from their human users; therefore they provide a logical starting point from which to explore the fluidity of the body–artefact boundary.
Does the case of tool use undermine any sense of bodily integrity and specificity, leaving us with a belief that bodies are simply products of their technological landscape, their intimacy with and reliance upon artefacts having alienated them from their natural state?

Is tool use natural? André Leroi-Gourhan's account of the tool as 'exuded' by humans in the course of their evolution (1993: 239) suggests that it is. In the first volume of Technics and Time, Bernard Stiegler draws on Leroi-Gourhan to repudiate Rousseau's account of a prelapsarian savage man whose body is 'the only instrument he knows' (Rousseau, 2007: 23). There was no 'first man' experiencing an Edenic lifestyle prior to the influence of technology, as, according to Leroi-Gourhan, 'the tool is a criterion for humanity': the human and the tool invent each other (Stiegler, 1998: 175).

However, by making the tool definitional to humanity, Leroi-Gourhan and, by extension, Stiegler retain for it a status as something special, outside nature broadly conceived. The tool is part of human nature, but in an important sense this removes both the tool and the human from nature as a whole. While the appearance of the tool is presented as part of a larger evolutionary process of 'freeing' (dependent upon and following other physical changes such as the freeing of the hands by bipedalism), the use of tools, understood to be exclusively and quintessentially human, nevertheless represents a qualitative change in this process, a moment of rupture (Stiegler, 1998: 141–2).
The very language of 'freeing', with its suggestion of the human animal progressively liberating itself from the limitations of inherited physicality, first by reshaping its body and then by moving into the realm of culture through language and technics, suggests an underlying opposition between the natural world of biology and the development of tools and technology.[3]

As with that other cherished myth of human exceptionalism, our monopoly on language, increased knowledge of other species has forced a series of retreats from the claim that our relationship with tools sets us apart. Since Jane Goodall first documented chimpanzee tool use (Goodall, 1968),[4] new examples of animal tool use have been turned up regularly, not only among our nearest primate relatives but also among birds and crabs, octopuses and insects; a recently compiled table of documented cases runs to nearly 30 pages (Bentley-Condit and Smith, 2010: Appendix A).[5] From 'man the tool user' to 'man the tool maker', and from there to a rueful acknowledgement that many other species can fabricate as well as use tools of their own, we are now in a position where tool use is no more able to mark an absolute difference between human and animal than any other ability.

Indeed, it might be argued that any privileging of tool use, regardless of the ends it is intended to serve, will always be burdened with anthropocentric, lapsarian baggage. The categorisation of some kinds of behaviour as tool use (or tool-making) originates in attempts to isolate a certain sphere of action as uniquely and exclusively human, leaving the very category under suspicion of being simply an artificial and arbitrary grouping designed to flatter and elevate human beings.
While Leroi-Gourhan and Stiegler might reject the idea that human beings were ever on the non-tool-using side of the fence, they still subscribe to a belief in the existence of the fence, and thus maintain an implicit division between the technical world of the human and the natural world of non-human animals, who are no more than 'spray from the central jet that gushes human-ward' (Leroi-Gourhan, 1993: 58).[6]

Although the widespread use of tools among animals is now indisputable, among experts on animal behaviour there remains no universally accepted definition of what precisely constitutes tool use. While Benjamin Beck's 1980 definition of animal tool use as

    the external employment of an unattached environmental object to alter more efficiently the form, position, or condition of another object, another organism, or the user itself when the user holds or carries the tool during or just prior to use and is responsible for the proper and effective orientation of the tool ... (Beck, 1980: 10)

has been influential, Beck acknowledges contentious borderline cases (1980: 124–33), which blur the boundaries of the category, and even among those who broadly agree with this definition there are often attempts to modify or improve it so that it will better map onto particular examples.[7] Can water be a tool (Seed and Byrne, 2010: R1032)? Would a tool really stop being a tool if you attached it to a feature of the environment with a piece of string (St Amant and Horton, 2008: 1201)?

According to Bentley-Condit and Smith, '[t]he definition of tool use is problematic, often arbitrary or subjective, sometimes anthropocentric, and open to interpretation' (2010: 186). Where does tool use end and a more general engagement with the world begin?
If Beck's definition accepts an elephant spraying water on itself to cool off as tool use (which it does), then perhaps air can be understood as a tool animals use to gather oxygen and track prey by scent, or to manufacture the vibrations used in communication. Assuming a special relationship between humanity and tools as he does, Stiegler states that '[t]he being of humankind is to be outside itself' (1998: 193); however, all living organisms are to some degree 'outside themselves': constant exchange and interaction with the environment is a precondition for all life.

While breathing and talking, spraying water on oneself to keep cool and building a shelter, or using a rock as an artificial fist and operating a smartphone are clearly different activities in important ways, the differences do not ultimately arise from whether they utilise features of the environment or innate features of the living body. Rather, they depend most importantly on the degrees of agency and attention associated with their productive occurrence. Breathing occurs without conscious direction or intention; speaking and pounding with a rock occur with intention but little or no conscious direction (I, and presumably an ape, do not plot the trajectory of rock to nut, but simply harbour an intention to smash the nut and leave it to non-conscious processes to bring this about); building a shelter, operating a smartphone and spraying water (at least for me; I do not know how intuitive it is for an elephant) require both intention and a substantial degree of conscious direction.
In other words, these various activities are defined more by the modes of engagement with one's body or environment that they entail than by whether they employ features of the environment, or by those features' degree of separability from the environment.

The difficulty of clearly differentiating tool use from other kinds of animal behaviour generates a suspicion that it is a largely arbitrary category, produced more to serve the assumptions of human researchers than to account for gathered data. In 2008, Mike Hansell and Graeme D. Ruxton presented an effective critique of the focus on tool use from the perspective of researchers of animal construction behaviour. Excluding the building of structures such as nests tends to be a key criterion for success in definitions of tool use, and yet Hansell and Ruxton (2008) highlight the distortion of values this creates. For example, much was made of the recent first observation of tool use among wild gorillas (Breuer et al., 2005), and yet for some time gorillas have been known to construct sophisticated nest structures every day, and to culturally transmit the skills necessary to do so.
The documented tool use, which consisted of using a stick to test the depth of a stretch of water, would seem to be less sophisticated behaviour, and yet it was understood to challenge existing beliefs concerning the cognitive abilities of gorillas (Hansell and Ruxton, 2008: 75).

It would not be difficult to hypothesise about the cause of this privileging of more 'masculine', outwardly directed tool use, associated with hunting, pounding, fighting and exploring, as a marker of cognitive sophistication, and its resultant devaluing of equally (or more) complex material interactions associated with shelter and nesting; but, whatever its cause, it again highlights the arbitrariness of hiving off certain kinds of interaction with the material environment from a much larger and more varied totality and treating them as the gold standard of cognitive sophistication. When Leroi-Gourhan and Stiegler argue that the human appears with the tool, they are not simply placing tool use back into nature, but are perpetuating the idea that tool use is something special, something qualitatively different from other ways of existing in the material environment, and reasserting the human exceptionalism that gave rise to a focus on tool use in the first place.

The use of tools might therefore better be understood as simply one way in which living organisms interact with their environment, something that is quite widespread and part of a much larger continuum of material interaction. Human beings certainly do not have a monopoly on this kind of action, and it therefore cannot be said to define the human condition or set it apart.

At the same time, however, the use of tools does provide an example of a certain mode of action.
Rather than focusing on tools' physical properties, we can use them as exemplary of a certain kind of bodily engagement with objects in our environment.[8] Ironically, while tool use has traditionally been seen as a marker of humanity's greater cognitive prowess and capacity for conscious planning and reflection, this mode of bodily engagement is actually differentiated by its independence from conscious reflection. While a gorilla's shelter might be more sophisticated and require more thought and planning than its use of a stick to test water depth, the gorilla does not have a direct sensory relationship with the shelter equivalent to its experience of using the stick. The stick is incorporated into the sensorium, and for it to be useful the gorilla must have a direct sense of a relationship of scale between the extension of the stick and the dimensions of its own body. This sensory experience is presumably part of a continuum of degrees of engagement with the environment, but examples from this end of the continuum provide the best illustrations of the close articulation of body and object. As a result, while the category of tool use is a problematic one, research into how actions believed to fall into this category come about can tell us much about how we are able to act with and through artefacts.

Tools do not mark a point of rupture with non-human nature any more than they do with Rousseau's human nature; nevertheless, the physical properties of those objects with which humans interact have changed dramatically over time, from stone tools to the tools of the master craftsman to automated machinery and digital interfaces.
The differing modes of interaction facilitated by these various artefacts must have produced important changes in our embodied relationships with them, and thus in our field of action and sensation when using them.

Machines: The Second Fall

But how much can our use of tools really tell us about our relationship with contemporary technological artefacts, anyway? Leroi-Gourhan claims that the process of liberation, once it shifts to the use of machines, renders the body progressively more marginal as increasingly sophisticated artefacts come to replace more and more of the body's powers. In Leroi-Gourhan's periodisation, industrialisation fundamentally shifts our relationship with artefacts through the introduction of objects whose autonomy from human action means that they are no longer tools. If anything, human bodies are now tools utilised by these machines.

    With the passage to industrial motor function, the situation changed thoroughly. The purpose of operational sequences was now to fill the gaps, still very wide, in the behavior of the machine. The worker was required to perform parts of sequences measured at the rhythm of the machine, series of gestures that excluded the worker as an individual. (Leroi-Gourhan, 1993: 253)

By replacing the fine manipulations of the skilled worker with automated machinery, the Industrial Age produced crowds of workers requiring 'no more than a five-fingered claw to feed in the material or simply an index finger to push the buttons' (Leroi-Gourhan, 1993: 255). Leroi-Gourhan's vision of humanity being progressively freed from the physical particularities of our bodies through 'the pursuit of life by means other than life' (Stiegler, 1998: 17) leads him to invoke science fiction imagery of our post-industrial descendants as foetus-like blobs supinely poking at computer screens (Leroi-Gourhan, 1993: 129).
In Stiegler's writing, this sense of increasingly active and autonomous machines producing increasingly passive and ineffectual human beings is given an additional dimension by the integration of Gilbert Simondon's work, in which the industrial technical object is characterised by a kind of self-sufficiency and internal logic by virtue of 'a tendency toward the unification of parts in a whole ... whereby the technical object invents itself independently of a fabricating intention' (Stiegler, 1998: 75), and this unification takes place in parallel with a loss of unity in the human worker, who becomes isolated not only from nature but also from himself, and enclosed in piecemeal tasks (Simondon, 2012: 8).

As with Rousseau, then, Leroi-Gourhan's account does include a Fall. Where Rousseau's Fall casts humanity from a technology-free self-sufficiency into a world where the powers of the body are supplanted by tools and so atrophy, for Leroi-Gourhan the atrophy that turns hand back into claw occurs later and more slowly, as machines gradually render more and more of our manual skills redundant and demote us to a supporting role in automated industry. This theme is hardly an invention of Leroi-Gourhan, of course. For Karl Marx, the loss of an intimate relationship between worker and tool is a key part of industrialisation's dehumanising effect.
Where for Leroi-Gourhan the tool is definitional to the human being, for Marx the tool is definitional to the machine: Marx (1976: 494–5) defines the industrial machine as an entity that appropriates the human worker's tools and wields them in his stead.

These accounts of industrialisation's impoverishment of the human being's physical and intellectual powers make perfect sense from a perspective focused upon tools: Leroi-Gourhan's human being is brought into existence by her use of tools, but the creation of ever more complex tools results in the appearance of Marx's machine, which then plucks the tool from the human being's hands and continues to labour without her, leaving the human being looking on, empty-handed and dejected, from the sidelines. In the words of Marx, 'In handicrafts and manufacture, the worker makes use of a tool; in the factory, the machine makes use of him' (1976: 548).

In the original biblical narrative of the Fall, technics is a result of the Fall itself; no longer living in a state of grace, Adam and Eve and their descendants must create tools to wrest sustenance and shelter from the environment. However, as explained by Jonathan Sawday, during the Renaissance, at the dawn of modernity and mechanisation, this association between technics and the Fall could produce both a negative view of the machine as fundamentally at variance with the ideal of nature, or even of God, something that 'epitomized the moment of transgression, exile and loss ... a mark of shame' (Sawday, 2007: 4), and a positive view of the machine as a partial compensation, through God's grace, for that original punishment by which humanity was exiled from its place of origin, by which 'a partial replica of the lost paradise might be confected' (Sawday, 2007: 3).
In secular accounts of technology's influence on the natural human state there can be a similar ambivalence, but, in the absence of God, the machine itself tends to be the agent of humanity's downfall. To struggle against nature is itself the natural, prelapsarian state of humanity rather than its successor, and the arrival of the machine precipitates our descent from this into an indolent state of weakness, dehumanisation and evolutionary decline. For Rousseau, to struggle with his bare hands is the natural state of Man, and by making this unnecessary the machine robs him of his self-sufficiency and physical powers. For Marx, to wield a tool is the natural state of the autonomous worker, but the machine takes the tool from him and wields it in his place, leaving the worker as a mere 'living accessory' of the machine (1980: 134). For Leroi-Gourhan, to use the hands, with or without a tool, is humanity's natural state, and the machine renders both obsolete, putting an end to the very activities that made us human in the first place.

Whether or not any of these accounts is accepted, or whether they are simply dismissed as secular retellings of the original biblical story, the more general point to take from all this is that our relationships with tools and machines are different in important and obvious ways.
All technological artefacts are not the same, and an evolved affinity for using artefacts as direct prosthetic extensions of our bodies might not be of much use when dealing with artefacts that can exercise a degree of autonomy from human action.

What, then, is the difference between a tool and a machine? Although most people probably have an instinctive sense of what falls into which category, the key definitional qualities of each (an artefact directly employed by the hand, for the former, and an artefact designed to achieve a particular purpose, for the latter[9]) are not mutually exclusive, and, as machines have become smaller and more tightly integrated into various kinds of work and interaction, the amount of overlap between the two has increased.

However, a clear differentiation of tool and machine is crucial to both Leroi-Gourhan's and Marx's accounts of human action, and their harder distinctions depend on a clear contrast between the experience of using different kinds of artefact. The key phenomenal quality of the tool is, of course, its Heideggerian 'handiness' (Heidegger, 1996: 65). For both Leroi-Gourhan and Marx, the machine refuses this kind of direct engagement with the artefact and, through it, with the world. The machine operator's actions might be more or less harmoniously integrated with the industrial machine, but the machine nonetheless remains the object of the operator's attention and actions; the tool user contrastingly experiences the tool as simply extending her range of actions further out into the world.

    Handiness is not grasped theoretically at all, nor is it itself initially a theme for circumspection. What is peculiar to what is initially at hand is that it withdraws, so to speak, in its character of handiness in order to be really handy. (Heidegger, 1996: 65)

The term 'handy' is more useful than 'ready-to-hand'[10] because the latter explicitly sets out a relationship between the artefact and the hand (which it makes itself available to). 'Handy' blurs the distinction between the two terms by taking the name of one as the root of the adjective used to describe the other. This is appropriate because what Heidegger is describing is the quality of being like a hand. This becomes immediately apparent if we alter the above quote to make the hand its subject:

    [The hand] is not grasped theoretically at all, nor is it itself initially a theme for circumspection. What is peculiar to [the] hand is that it withdraws, so to speak, in its character of handiness in order to be really handy.

This special quality of withdrawal is most notably present not in tools but in our own bodies. The title of Drew Leder's book The Absent Body derives from this principle, whereby '[t]he body conceals itself precisely in the act of revealing what is Other' (1990: 22), and this aspect of embodiment is central to Merleau-Ponty's phenomenology:

    In so far as it sees or touches the world, my body can therefore be neither seen nor touched. What prevents its ever being an object, ever being completely constituted is that it is that by which there are objects. It is neither tangible nor visible in so far as it is that which sees and touches. (Merleau-Ponty, 2002: 105)

Insofar as this quality can be discerned in tools, then, it is there simply as a consequence of our using tools as if they were themselves part of our own bodies.

As a result, for the purposes of my argument I will define a tool as an object that can be absorbed into the body schema of its user.
As exemplified by Heidegger's hammer (Heidegger, 1996: 64–5) and Merleau-Ponty's white cane (Merleau-Ponty, 2002: 165–6, 175–6), certain objects can be integrated into our sense of the limits and capacities of our bodies so effectively that they become a part of the direct, unreflective capacity for action usually associated with our own limbs (see de Preester, 2011: 123).

A capacity to employ these objects without planning or reflection is most crucially the result of a seamless sensory connection with the object. While the tool might change our field of action, perception, or both, even the capacity for action is, at base, dependent upon a modification of our sensory relationship with the environment. While it is clear that 'the stick is no longer an object perceived by the blind man, but an instrument with which he perceives' (Merleau-Ponty, 2002: 175), the handiness of the hammer arises from a similar sensory relationship. An understanding of how much force a hammer blow requires arises from a sensing, through the body of the hammer, of the nail's resistance; the ability to focus attention on the nail, unconcerned by questions regarding the hammer's position in space, results from the kind of unconscious sensing of its spatial positioning we have with regard to our hands and feet. This sensory relationship with the tool can be explained as the tool's incorporation into the body schema, but the specificity of the sensory basis of this relationship must be explained in terms of proprioception.

The extension of our sensory reach through a tool can potentially engage any of our senses, but the extension of any sense through the tool and out into the environment, like any extension of our range of actions, requires the primary engagement of a proprioceptive relationship with the tool itself.
Proprioception, our sense of the position of our own body in space (Gallagher, 2005: 43–7), is what allows our bodies to recede from conscious awareness while we are focused on the goals of our actions; I can catch a ball because I can direct all of my concentration towards the ball rather than having to simultaneously worry about where my hand is and how it will intercept the ball. Similarly, a blind person using a cane can sensorially engage with the world through the cane because she does not need to concentrate on where the cane is – she has an immediate and pre-conscious understanding of the distance to an obstacle struck by the end of the cane because the properties of the cane itself require no conscious consideration or calculation.

Clinical research into how our brains process spatial information indicates that this capacity to incorporate objects into our perception and conception of action arises at a fundamental level. The clearest example of this is the way in which perception changes depending on whether a tool is near or far in relation to the body of the perceiver. This distinction is fundamental enough that individuals with brain lesions that interfere with one or other kind of spatial perception seemingly gain or lose the capacity to spatially process an object as it moves nearer to them or further away:

Thus . . . one may conclude that the distinction between near and far space is not simply descriptive, but that the brain has different ways for coding the position of objects placed in different location [sic] with respect to body coordinates. . . . [T]he brain constructs different maps according to far and near space.
(Berti and Frassinetti, 2000: 416)

The boundary between 'far' (extrapersonal) space and 'near' (peripersonal or cutaneous) space, discovered through primate brain research, is marked by the sphere of possible physical interaction (Gallese and Lakoff, 2005: 460; Rizzolatti and Sinigaglia, 2008: 64). That is to say, peripersonal space is 'the area of space reachable by body parts' (Gallese and Lakoff, 2005: 459), which is perceived differently precisely because it is understood in terms of the possible movements and interactions that might take place within it.

How we interact with our environment would therefore seem to be foundational to our perception, and other research suggests that tools can be integrated into this perception of space. The brain activity of Japanese macaques trained to reach for objects with small rakes demonstrates that, when using a tool for reaching, the limits of peripersonal space are extended to reflect the increased reach allowed by the tool. In other words, the tool is incorporated into the body schema of the monkey, changing basic perception of spatial and environmental relations (Maravita and Iriki, 2004).

Our most fundamental experiences of our bodies routinely incorporate the physical properties of artefacts. An experiment by Anna Berti and Francesca Frassinetti centred on a subject who, following a stroke, had an impaired perception of peripersonal space, meaning that she could not process spatial information about locations within a certain distance from her body. While she was able to use a light pen to indicate a location a certain distance away from herself as being the solution to a problem, she was unable to do the same thing with a stick.
The reason for this would seem to be that her peripersonal space expanded to encompass the stick, encroaching into the area in question and thus impairing her ability to perceive it.

What these examples from brain research suggest is that our relationship with objects and artefacts arises from innate, evolved abilities that facilitate the understanding, learning and execution of interactions with objects, and that our very sense of our own bodily boundaries and sphere of action can gather tools into itself in order to facilitate their use in a direct, unreflective manner.11 When we take this into account, the difficulty of identifying a point where body ends and artefact begins becomes even greater, as it shows that our facility with tools arises from an ability to dynamically shift this boundary.

However, in the accounts of Marx and Leroi-Gourhan, the significance of the shift from tool to machine results from the fact that automated machines refuse this kind of intimate relationship. Rather than being incorporated into the body schema, the automated machine functions independently and in a way alien to human styles of action, making it something human operators must act on rather than with. But what of our contemporary, post-industrial relationship with technological artefacts that are importantly different from the automated factory machinery to which Marx and Leroi-Gourhan were referring?

There is a contrast to be drawn between the rigidity of the machines of the Industrial Age and the flexibility of those machines considered most representative of contemporary technology, machines that are more ubiquitous, more transparent, and seek to insinuate themselves more comprehensively into human habits.
While the machines of the factory, according to Marx or Leroi-Gourhan, marginalise the human user, demoting her to an onlooker or helper at the margins of an automated system, the consumer electronics of the Information Age seek to integrate both their form and their function into the everyday habits of a human user. These devices still change the nature of work and the bodily habits of their operators, but they do so at a finer grain, and, because they are no longer tied to the specialised environment of the factory or office, the user is never or almost never free of them. Rather than the worker being required to operate in a factory that has become a space tailored to the function of machines, the information processing device has been tailored to the functioning of the human user, and tends to breach divisions between spaces of work, recreation or social interaction.

While their autonomy is greater again than that of the industrial machine, this greater autonomy allows them more flexibility and transparency; where the industrial machine's autonomy made it an intractable, inhuman entity, which left little room for human bodily actions or habits, the information processing device's autonomy allows it to pre-empt the needs and mimic the habits of its users while hiding its functioning away behind a readily manipulable exterior. These devices call for a user with more than Leroi-Gourhan's 'five-fingered claw'; their autonomy is employed to give the appearance of responding directly and transparently to the fine manipulation of a human body – in other words, to seem more like tools.

The Eden of the Interface

The word 'interface' is a technical term that became popularised as a result of the post-Second World War technicisation of society.
In a post-cybernetic age that often saw the world as composed of systems in interaction with one another, it was necessary to talk about the points where those systems came into contact. While it can be employed more broadly – and is in specific technical contexts – it is in the context of the interaction of information systems that the word 'interface' has most importantly entered popular usage. In everyday discussion, the word 'interface' alone is understood to refer to the mode of interaction between human user and machine; for example, a computer will be described as having a 'graphical user interface', or an iPad will be described as having a 'multi-touch interface'.

This usage has clearly been driven in large part by the marketing of consumer digital devices. The ease with which a user might operate a complex digital device is crucial to its appeal, and so advertising seeks to instil a belief that the device's interface is inviting – even 'intuitive', to repeat a common marketing buzzword. But this suggestion that a digital device 'has' a certain interface is out of keeping with the meaning of the term. After all, the word 'interface' literally means the point at which the exteriors of two or more entities meet; to speak of one entity independently possessing an interface makes no sense. A machine logically cannot have an interface, any more than a human being can. A human–machine interface can only exist when human and machine are interacting with one another. It appears spontaneously during interaction, and disappears when interaction ceases. Not only can an interface not be located on or in the machine, it cannot be given any stable or persistent spatial location at all.
It appears spontaneously during interaction, and does so between the surface of the machine and the surface of the human body.

Periodisations of the history of human–computer interaction, or HCI, move from the CLI, or command line interface, in which commands are typed into the machine, to the GUI, or graphical user interface, which was popularised by Apple and Microsoft, and place us at the threshold of the era of the NUI, or 'natural user interface', which is expected to replace the allegedly outmoded and restrictive conventions of the GUI. The following quote is indicative of the widespread belief in the desirability and inevitability of the NUI among those involved in interface design:

The term natural user interface is an emerging computer interaction methodology which focuses on human abilities such as touch, vision, voice, motion and higher cognitive functions such as expression, perception and recall. A natural user interface or NUI seeks to harness the power of a much wider breadth of communication modalities which leverage skills people gain through traditional physical interaction.

Much in the same way the graphical user interface (GUI) was a leap forward for computer users from command line interfaces, natural user interfaces in all of their various forms will become a common way we interact with computers. The ability for computers and human beings to interact in diverse and robust ways, tailored to the abilities and needs of an individual user, will release us from the current constraints of computing allowing for complex interaction with digital objects in our physical world. (Liu, 2010: 204)

'NUI' has now become a technological buzzword, although precisely what attributes might qualify as 'natural' in an NUI remains unclear, even among those working in the area (George and Black, 2010: 2).
In general terms, however, there is a belief that NUIs should build upon users' pre-existing knowledge of the everyday, non-digital world and hence lead to a more 'natural' and 'reality-based' interaction (König et al., 2009: 4562). The Wikipedia entry for 'natural user interface' describes it as:

a user interface that is effectively invisible, or becomes invisible with successive learned interactions, to its users. The word natural is used because most computer interfaces use artificial control devices whose operation has to be learned. An NUI relies on a user being able to quickly transition from novice to expert. . . . This can be aided by technology which allows users to carry out relatively natural motions, movements or gestures that they quickly discover control the computer application or manipulate the on-screen content.

Terms such as 'invisible', 'spontaneous' and 'innate' are often utilised in descriptions of the NUI. The interface's characterisation as natural suggests that control of the machine should come naturally to the user; rather than the user conforming to a set of conventions and principles produced by the machine, the machine is expected to conform to conventions and principles already familiar to the user from other forms of interaction.

Natural interaction is defined in terms of experience: people naturally communicate through gestures, expressions, movements, and discover the world by looking around and manipulating physical stuff. The key assumption here is that people are meant to interact with technology as they are used to interact [sic] with the real world in everyday life, as evolution and education taught them to do.
(Valli, 2007: 2)

The most obvious means of realising this idea is to make the operation of the interface mimic interaction with other features of the user's environment; as a result, the digital representations produced by the machine should behave as if they were physical features of the world upon which the user might act (Valli, 2007: 6).

Where the industrial machine, as characterised by Marx and Leroi-Gourhan, takes away our humanity through its requirement that we work to accommodate it and the larger systems to which it belongs, then, the NUI promises to reinstate the natural mode of action machines have previously taken away from us. We will interact with our environment using our 'natural' faculties of manipulation, gesture and speech, for example, although this natural mode of action will rely upon an environment that is itself a technological fabrication, a digital Eden composed of complex technological artefacts able to mimic the properties of natural, non-technological physicality.

However, the idea of making interaction more natural by mimicking physical objects and the existing gestures and movements of the user introduces the possibility of a closed circularity. Many of the physical objects with which we currently interact, and which shape the range of movements that are familiar to us, are themselves machines, which have their own interfaces with the human body. Therefore, the 'natural' modes of interaction appealed to by the NUI will in many cases simply be older forms of human–machine interaction; furthermore, the older machines responsible for those kinds of interaction are likely to have been far more rigid and constrained in the range of human interaction they made possible, as they lacked the flexibility and the simulational and representational abilities of recent computer technologies.
The 'natural' way of interacting with technology an NUI seeks to complement might actually have developed in response to the particularities of some older technological interface, which will be superseded by the NUI and thus fall from use; ironically, once this has happened, the feeling of this interaction being natural would be sustained entirely by the NUI itself. As with other kinds of technology, the user's sense of what is natural or intuitive would be created by the interaction of body and machine, rather than arising from the user's body independently of the machine.

The NUI, then, seeks to replace the alienated, machine-centric world of the Industrial Age with a new state of nature, in which we can act in a natural, spontaneous way rather than having to work within the constraints and conventions of machines. However, it would do this by creating a new nature out of machines. Rather than some romantic escape from technology, it means creating new machines that can naturalise themselves by virtue of having interfaces that seem wholly determined by our existing habits. It might therefore be considered a conservative movement, rather than a progressive one. The reason why we interact with old-fashioned telephones in the way we do is that the unavoidable physical properties of the machine gave us no other option; the reason why one might open an avenue for verbal communication using an NUI by making the 'sign of the Beast' with a pinkie finger at one's mouth and a thumb at one's ear, or activate an 'invoked computing' telecommunication system by putting a banana to the side of one's head (Price, 2012),12 is simply because these gestures are reminiscent of using an old-fashioned telephone.
Such gestural interfaces seem destined to become a trash heap for actions once mandated by obsolete machines, compelling us to continue miming our interactions with the machines of the Industrial Age long after those machines are gone.

Understandings of the term 'NUI' therefore follow the skewing of the term 'interface' itself. Rather than being something that appears spontaneously through interaction between machine and body, the NUI becomes something given by the machine to a body that is understood to be a fixed externality, an intractable set of external variables that the machine must accommodate. Ironically, this reverses Marx's characterisation of the relationship between human body and machine: the human body becomes the rigid system that the machine must modify its own behaviour in order to complement.

Any attempt to create a mode of interaction that is transparent but which ignores the degree to which our current baseline motor skills are created through previous interactions with artefacts and objects, and which precludes the development of a more skilled, acquired capacity to use new artefacts, can only impoverish our future opportunities for interaction with our material environment. After all, the handiness of Heidegger's hammer does not arise from the fact that hammering nails is a natural human activity. A violin player does not produce music from her instrument using natural gestures transplanted from some other sphere of action, nor do these gestures feel natural when a violin student first begins to play, or quickly and easily come to seem so. The gestures required to play the violin are highly unnatural, dictated largely by the physical properties of the instrument rather than the player's existing habits of movement, and yet the final level of habituated, naturalised interaction between expert player and instrument is very high.
For this to happen, the brains of violin players must actually change, enlarging the cortical representation of those digits used for fingering (Elbert et al., 1995). In fact, any mode of interaction that employs the fingers individually, such as playing musical instruments or typing, is an unnatural movement that can be learnt only through slow habituation. The human hand has evolved for grasping using all the fingers simultaneously; when we move one finger individually, we do so not by willing that one finger to move, but rather by willing the hand to grasp while simultaneously willing the other four fingers to stay still.13

Clearly, an interface that did not make use of individual finger movements would be much less sophisticated than one that did; however, we only know that complex movements of individual fingers can become part of our habituated lexicon of gestures because they have been necessitated by our use of older tools and devices, objects whose physical characteristics required human operators to modify their bodily habits and abilities to accommodate them. A 'natural' interface would be one that sought to reflect only the innate capacities of a hypothetical natural human body, rather than those capacities produced by a negotiation between body and artefact.

The touch typist's relationship with the keyboard, the blind person's relationship with the cane, or the pianist's relationship with the piano keys are not natural; the only thing that is natural is the human capacity to incorporate objects and artefacts in new ways. In each case, the human is presented with an artefact that must be absorbed into the body schema. This process of absorption might be more or less easy, more or less time-consuming, but ultimately the absorption can be almost complete, leaving almost no phenomenological seam where the two entities have been joined.
Such a process obviously cannot take place for any kind of artefact – the artefact's physical properties must fall within the field of habituated gestures possible for a human body – but the human body is highly adaptable, and there is a dizzying array of such novel gestures and habits that have arisen throughout the performance of music, the playing of sports, the piloting of vehicles, expert craftsmanship and more. The NUI threatens to end this evolution of human habit and gesture, freezing it in time by utilising only those gestures already available. At most, new gestures that can be quickly and easily mastered would be added, but no new equivalent to typing or playing a musical instrument, or even writing, could arise.

New Natures

Digital devices are highly flexible and responsive to their human operators. Rather than employing the flexibility of these devices to simulate older kinds of sensing and action, we can use them to facilitate the development of new human skills and experiences, whose richness and novelty surpass those made possible by other kinds of technological artefacts.14

The history of sensory substitution devices gives an indication of the potential to use artefacts as facilitators of new sensory relationships with the world around us. Our sensory experiences are, necessarily, produced through the interplay of our sensory apparatus and the environment that stimulates it, and digital information processing devices, in particular, can alter the nature of this interplay. The power and ubiquity of digital data rely most importantly on its technical capacity to 'encode, digitize, and transcode various things from the real world' (Thacker, 2004: 9); disparate kinds of material can be translated into digital data, which can then be worked upon and recoded as various kinds of sensory stimulation.
When this capacity is combined with devices able to sense us – and sense our sensing – in ever more sophisticated ways, the possibilities for new kinds of experience are dramatically expanded.

Technologies able to effect sensory substitution have been around for some time, but the work of the late Paul Bach-y-Rita and his colleagues, beginning at the end of the 1960s, is probably the best-known attempt to create electronic sensory substitution devices. Bach-y-Rita's work focused on tactile-vision sensory substitution (TVSS) through devices that could substitute tactile stimuli for visual stimuli in the blind. Beginning with a dentist's chair that pressed a grid of 'pixels' of varying intensity into the user's back (Bach-y-Rita, 1972), Bach-y-Rita and his colleagues ultimately created the 'tongue display unit' or 'brainport', a device that converts visual information into electrical impulses, which are then delivered to a 'lollipop' on a user's tongue (Bach-y-Rita and Kercel, 2003).15 Peter Meijer's vOICe system, which substitutes sound for visual information and can run on a mobile phone, has also been very successful (Meijer, 1992; Trivedi, 2010). Both approaches have demonstrated that, with sustained use, the brains of users can actually remap their processing of sensory information to produce an experience that is as direct and spontaneous as seeing itself. That is, the users of such devices do not receive tactile sensations or sounds which they then translate into spatial information; rather, the technology produces an 'acquired synaesthesia' (Ward and Meijer, 2010: 494), which gives them a pre-conscious, direct sensory awareness of the world around them.16

In the 1970s, Bach-y-Rita recorded how, while a test subject was using the sensory substitution chair, the camera's zoom was activated, causing the image being translated into tactile stimuli to seemingly loom forward.
Bach-y-Rita describes how the subject responded as if seeing this sudden visual change:

The change in visual angle produced by the zoom lens change produced a looming effect, and the startled subject raised his arms and threw his head backwards to avoid the 'approaching' object. It is noteworthy that, although the stimulus array was, at the time, on the subject's back, he moved backward and raised his arms in front to avoid the object, which was subjectively located in the three-dimensional space before him. (Bach-y-Rita, 1972: 99)

In other words, the subject reacted as if unexpectedly perceiving something looming up in front of him, rather than unexpectedly feeling pressure on the skin of his back. Similarly, the synaesthetic merging of auditory and visual experience can be fundamental enough that some users of sensory substitution technology, such as vOICe user Claire Cheskin, have started seeing shapes when hearing loud, unexpected noises unrelated to the vOICe system (Trivedi, 2010).

These devices do not themselves create new sensory experiences – only living bodies can do that. As with tools, they simply lend themselves readily to the human capacity to integrate artefacts and objects into a field of action and sensation in order to produce new kinds of embodied experience; the blind man's cane is itself a pre-digital sensory substitution device. While the examples above have focused on replacing lost sensory modalities, there is no reason why the same approach could not be utilised to augment existing sensory modalities, or perhaps even to create entirely new ones.

The feelSpace project, led by Peter König at the University of Osnabrück Institute of Cognitive Science, explicitly seeks to test 'the possibility of creating new senses' (Nagel et al., 2005: R14).
Rather than substituting one kind of sensory input for another, the feelSpace project seeks to create a completely novel form of sensory experience using a belt that girdles the midriff with 13 vibrators. At any given time, one of these vibrators is active, indicating the subject's orientation relative to magnetic north. Obviously, human beings have no natural awareness of magnetic north, and thus the project is able to investigate our potential to make use of an entirely new sense. Of the four test subjects initially fitted with the belt, two showed strong indications that this new sense had been incorporated into their pre-conscious sense of bodily orientation, changing their powers of navigation and their spatial awareness. To quote one of the subjects:

During the first two weeks, I had to concentrate on it; afterwards, it was intuitive. I could even imagine the arrangement of places and rooms where I sometimes stay. Interestingly, when I take off the belt at night I still feel the vibration: When I turn to the other side, the vibration is moving too – this is a fascinating feeling! (Nagel et al., 2005: R22)

Our awareness of our orientation in space arises from a combination of information from different senses, making it unclear to what extent the feelSpace belt can truly be said to constitute a new sense in itself, rather than an additional form of stimuli feeding into this existing composite awareness (see Nagel et al., 2005: R24), but even so it demonstrates our ability to seamlessly incorporate new kinds of stimuli produced by technological artefacts into our sensory experience.

To speak of any of these modes of interacting with the world as natural or unnatural has little value – they are simply instances of an interplay between human bodies and features of the environment producing sensory experience.
All sensory experience and action arises this way, regardless of the features of the environment involved, and there is no natural dimension to these things that exists within any human body in isolation from its surroundings. What is notable about the examples above is the way in which new kinds of technological artefacts can be tailored to the creation of new kinds of experience, to a degree that far exceeds that of older tools.

While such technologies can just as readily be utilised to reproduce older modes of engaging with our environment, motivated by a belief that these are more natural, to do so would be to smother the promise of these technologies and hobble the ongoing process of productive interaction between living bodies and features of their environments.

Conclusion

While human beings have never existed in a prelapsarian state of nature prior to the use of tools, this is only because the use of such objects is only one part of all animals' extensive engagement with the material environment around them. This engagement is made possible by an ability to actively sense and sensitively act upon the things around us, and our capacity to seamlessly incorporate tools into our body schemata is one of the most impressive examples of the degree to which our perception is geared to facilitate this. The handiness of a tool arises from a direct, pre-conscious sensory relationship with it, one in which the tool is absorbed into our sense of the capacities and extension of our own bodies. This extends not only the reach of our capacity to act, but also our sensory reach, through the tool and out into the world.

Where figures such as Leroi-Gourhan and Marx have contrasted the tool with the industrial machine, this contrast has hinged on a characterisation of the machine as something that precludes such a relationship.
The industrial machine, as something that can operate to some degree independently of a human body and whose operation is highly constrained by the physical particularities and limitations of its mechanism, generally cannot function as a seamless extension of the human body, acting in perfect concert with it; rather, the human worker must struggle to act in a way tailored to the inhuman habits of the machine.

However, automatism does not necessarily militate against the absorption of machines into our schemata of perception and action. The automatism of many more recent digital devices is directed towards giving them a capacity to sense our bodies, or to sense our sensing of them. This holds out the possibility of seamlessly incorporating machines into our sense of bodily action and perception in new ways.

At the same time, the discourse of 'natural user interfaces', while arising from a recognition of this potential, misunderstands our relationship with tools and other material resources. It seeks to create a technological simulation of a pre-technological Eden where our lexicon of gestures and habits arises from the body alone, uncontaminated by the restrictions imposed by the physical particularities of individual machines. However, our gestures, habits and perceptions always arise through the interaction of our bodies and material features of our environment; there is no natural, originary dimension to these things that arises purely from within human bodies in isolation. Our capacities for action and perceptual experience take different shapes and produce different kinds of experience, depending upon the material objects with which we engage.
Rather than creating new technological artefacts that cater to assumptions about what human action and perception already are, the potential of these new technologies should be exploited to produce new ways of acting on and with our material environment, and new kinds of sensory experience through which it can be explored.

Notes

1. See Paterson (2005) for a discussion of some early haptic technologies.
2. Video of the demonstration is available at http://www.diginfo.tv/v/10-0099-r-en.php.
3. And this is further suggested by the extension of this trajectory into the technological manipulation of information, given N. Katherine Hayles' (1999) and others' critique of information discourse as indulging fantasies of escape from material embodiment (see Vaccari, 2009).
4. Goodall's ground-breaking research was published several years after Leroi-Gourhan treated tool use as definitional to the human in Le Geste et la parole (1965, English trans. 1993), but many years before Stiegler did the same in La Technique et le temps: la faute d'Épiméthée (1994, English trans. 1998).
5. Exactly how many animals are credited with tool use will depend upon the definition of tool use employed (see below).
6. This problem with Leroi-Gourhan's account is discussed in more detail in Ingold (1999: 422).
7. See, for example, Bentley-Condit and Smith (2010) and St Amant and Horton (2008). Beck's (1980) work has also been revised in Shumaker et al. (2011).
8. Which is not to say, of course, that physical properties are irrelevant. Obviously certain kinds of bodily engagement are only possible with objects that possess certain kinds of physical properties (a mountain, for example, will never be used as a tool).
9. Here I am liberally paraphrasing the relevant OED definitions.
10. To favour Joan Stambaugh's translation of the term Zuhandenheit.
11.
This is not a scientific paper, nor am I a scientist; I am not in a position to falsify any of these clinical findings. At the same time, however, this research provides another account of our sensory engagement with features of our environment that, while taking a very different approach, has produced conclusions similar or complementary to my own. While these findings no more prove the validity of my conclusions than do the related conclusions of Merleau-Ponty, for example, on the subject (and Merleau-Ponty's conclusions were themselves heavily indebted to clinical brain research), they do provide an additional perspective, which can be productively integrated into a larger whole.
12. For a demonstration, see: http://www.diginfo.tv/v/11-0232-d-en.php.
13. And we often can manage this only imperfectly: it is extremely difficult to move one's middle finger with no accompanying movement of the fingers to either side of it.
14. See Froese et al. (2012) for an extended discussion of the potential value of such an approach for research into cognition.
15. The tongue having been chosen because of its density of nerve-endings and conductivity.
16. Whether this experience therefore qualifies as 'seeing' is subject to debate, the debate ultimately hinging on how the word 'seeing' is defined (see Kauffmann, 2011; see also Auvray and Myin, 2009).

References

Auvray M and Myin E (2009) Perception with compensatory devices: From sensory substitution to sensorimotor extension. Cognitive Science 33(6): 1036–1058.
Bach-y-Rita P (1972) Brain Mechanisms in Sensory Substitution. New York: Academic Press.
Bach-y-Rita P and Kercel SW (2003) Sensory substitution and the human–machine interface. Trends in Cognitive Sciences 7(12): 541–546.
Beck BB (1980) Animal Tool Behavior: The Use and Manufacture of Tools by Animals.
New York: Garland STPM.
Bedini SA (1964) The role of automata in the history of technology. Technology and Culture 5(1): 24–42.
Bentley-Condit VK and Smith EO (2010) Animal tool use: Current definitions and an updated comprehensive catalog. Behaviour 147(2): 185–221.
Berti A and Frassinetti F (2000) When far becomes near: Remapping of space by tool use. Journal of Cognitive Neuroscience 12(3): 415–420.
Breuer T, Ndoundou-Hockemba M and Fishlock V (2005) First observation of tool use in wild gorillas. PLoS Biology 3(11): e380.
de Preester H (2011) Technology and the body: The (im)possibilities of re-embodiment. Foundations of Science 16(2–3): 119–137.
Elbert T, Pantev C, Wienbruch C, Rockstroh B and Taub E (1995) Increased cortical representation of the fingers of the left hand in string players. Science 270(5234): 305–307.
Endo T, Kawasaki H, Mouri T, Ishigure Y, Shimomura H, Matsumura M and Koketsu K (2011) Five-fingered haptic interface robot: HIRO III. IEEE Transactions on Haptics 4(1): 14–27.
Froese T, Suzuki K, Ogai Y and Ikegami T (2012) Using human–computer interfaces to investigate mind-as-it-could-be from the first-person perspective. Cognitive Computation 4(3): 365–382.
Gallagher S (2005) How the Body Shapes the Mind. Oxford: Clarendon.
Gallese V and Lakoff G (2005) The brain's concepts: The role of the sensory-motor system in conceptual knowledge. Cognitive Neuropsychology 22(3/4): 455–479.
George R and Black J (2010) Objects, containers, gestures, and manipulations: Universal foundational metaphors of natural user interfaces. Paper presented at the ACM Conference on Human Factors in Computing Systems, New York.
Goodall J (1968) The behaviour of free-living chimpanzees in the Gombe Stream Reserve. Animal Behaviour Monographs 1(3): 161–311.
Hansell M and Ruxton GD (2008) Setting tool use within the context of animal construction behaviour.
Trends in Ecology & Evolution 23(2): 73–78.
Hayles NK (1999) How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago, IL: University of Chicago Press.
Heidegger M (1996) Being and Time: A Translation of Sein und Zeit, trans. Stambaugh J. Albany, NY: SUNY Press.
Ingold T (1999) Tools for the hand, language for the face: An appreciation of Leroi-Gourhan's Gesture and Speech. Studies in History and Philosophy of Science 30(4): 411–453.
Kauffmann O (2011) Brain plasticity and phenomenal consciousness. Journal of Consciousness Studies 18(7–8): 46–70.
König WA, Rädle R and Reiterer H (2009) Squidy: A zoomable design environment for natural user interfaces. Paper presented at the CHI '09 Extended Abstracts on Human Factors in Computing Systems, New York.
Leder D (1990) The Absent Body. Chicago, IL: University of Chicago Press.
Leroi-Gourhan A (1993) Gesture and Speech, trans. Berger AB. Cambridge, MA: MIT Press.
Liu W (2010) Natural user interface – next mainstream product user interface. Paper presented at the 2010 IEEE 11th International Conference on Computer-Aided Industrial Design & Conceptual Design (CAIDCD) 1: 203–205.
Maravita A and Iriki A (2004) Tools for the body (schema). Trends in Cognitive Sciences 8(2): 79–86.
Marx K (1976) Capital: A Critique of Political Economy, vol. 1, trans. Fowkes B. Harmondsworth: Penguin Books in association with New Left Review.
Marx K (1980) Marx's Grundrisse, 2nd edn, ed. McLellan D. London: Macmillan.
Meijer PBL (1992) An experimental system for auditory image representations. IEEE Transactions on Biomedical Engineering 39(2): 112–121.
Merleau-Ponty M (2002) Phenomenology of Perception, trans. Smith C. London: Routledge.
Nagel SK, Carl C, Kringe T, Martin R and König P (2005) Beyond sensory substitution – learning the sixth sense. Journal of Neural Engineering 2(4): R13–R26.
Paterson MWD (2005) Digital touch. In: Classen C (ed.) The Book of Touch.
Oxford: Berg.
Price S (2012) Alvaro Cassinelli: 'You really can use a banana as a telephone'. The Observer, 1 January.
Rizzolatti G and Sinigaglia C (2008) Mirrors in the Brain: How Our Minds Share Actions, Emotions, and Experience. Oxford: Oxford University Press.
Rousseau J-J (2007) Discourse on the Origin of Inequality. Minneapolis, MN: Filiquarian Publishing.
Sawday J (2007) Engines of the Imagination: Renaissance Culture and the Rise of the Machine. London: Routledge.
Seed A and Byrne R (2010) Animal tool-use. Current Biology 20(23): R1032–R1039.
Shumaker RW, Walkup KR and Beck BB (2011) Animal Tool Behavior: The Use and Manufacture of Tools by Animals. Baltimore, MD: Johns Hopkins University Press.
Simondon G (2012) Technical mentality. In: De Boever A, Murray A, Roffe J and Woodward A (eds) Gilbert Simondon: Being and Technology. Edinburgh: Edinburgh University Press.
St Amant R and Horton TE (2008) Revisiting the definition of animal tool use. Animal Behaviour 75(4): 1199–1208.
Stiegler B (1998) Technics and Time: The Fault of Epimetheus, trans. Beardsworth R and Collins G. Stanford, CA: Stanford University Press.
Thacker E (2004) Biomedia. Minneapolis, MN: University of Minnesota Press.
Trivedi B (2010) Sensory hijack: Rewiring brains to see with sound. New Scientist 2773: 42–45.
Vaccari A (2009) Unweaving the program: Stiegler and the hegemony of technics. Transformations 17.
Valli A (2007) Natural interaction white paper. iO Agency. Available at: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.98.9153&rep=rep1&type=pdf (accessed September 2013).
Ward J and Meijer P (2010) Visual experiences in the blind induced by an auditory sensory substitution device. Consciousness and Cognition 19(1): 492–500.
Wood G (2002) Edison's Eve: A Magical History of the Quest for Mechanical Life. New York: Alfred A.
Knopf.

Daniel Black lectures in the School of English, Communications and Performance Studies at Monash University, Australia. His research is particularly focused on our embodied relationship with technology. He is the author of Embodiment and Mechanisation (forthcoming).