Abstract - In the pursuit of developing touch sensors or tactile sensing arrays, the emphasis has been on the sensors alone. This has led to a large number of 'bench top' sensors, very few of which have actually been used in robotic systems. Those that have seen actual use have almost invariably been applied to static contact-point imaging rather than to active manipulation or exploration. Perhaps the lack of a system approach has rendered many of them unusable. In this work, we present the design of a tactile sensing system taking into account not only the parameters to be sensed but also the physical and operational constraints of the robotic system.

I. INTRODUCTION

Like humans, a robot can better understand its interaction with the environment by physically interacting with it. Without touching objects, it is difficult for a robot to know their interaction behaviors, which depend on how heavy and hard an object is when held, how its surface feels when touched, how it deforms on contact and how it moves when pushed. The availability of high performance video cameras and the significant research in the area of computer vision have made robot interaction with the environment occur mainly through visual sensing techniques [1], which, at times, can be misleading due to the lack of direct physical interaction. Surely, some information about real world objects, e.g. their shapes, can be obtained from vision cameras [2], and further detail can be obtained by moving the cameras around the object. But moving the robot around the object is not always possible, as can happen with humans as well. And even when it is possible to move around the object under observation, visual inaccuracies due to the large distance between the robot cameras and the object can make it difficult to explore or manipulate the object. Of course, such inaccuracies can be reduced by keeping the cameras close to the object, or, in other words, somewhere close to the fingers on the robot hand (e.g. the eye-in-hand configuration) [1] - but not without paying a price in terms of loss of dexterity.

Despite its important role, the lesser use of tactile sensing in robotics, as compared to other sensory modalities such as vision and audition, can partly be attributed to the complex and distributed nature of tactile sensing, and partly also to the non-availability of satisfactory tactile sensors. In the last two decades and more, many new touch sensors have been reported [3-6], exploring nearly all modes of transduction, viz. resistive/piezoresistive, tunnel effect, capacitive, optical, ultrasonic, magnetic and piezoelectric.
A range of sensors that can detect object shape, size, presence, position, forces and temperature have been reported in the reviews on tactile sensing [3-5]. Very few examples of sensors that can detect surface texture [7] or hardness/consistency [8] have been reported. Most of the reported devices are either of the scalar single-point-contact variety (for intrinsic tactile sensing) or are linear or rectangular arrays of sensing elements (for extrinsic tactile sensing). The production of new designs and improved configurations of tactile sensors continues apace, but touch sensor technology largely remains unsatisfactory for robotics: either the developed sensors are single large touch elements, too big to be used without sacrificing the dexterity of the robot, or they are slow or fragile, and in some cases the touch sensors are merely binary in nature, i.e. touch or no touch.

Clearly, the emphasis on sensor development alone has resulted in a large number of 'bench top' sensors, very few of which have actually been used in robotic systems. This is surprising, considering the long history of gripper design for manipulative tasks. We believe that the lack of a system approach has rendered many of them unusable, despite their good design. This is evident from the fact that very few works on tactile sensing have taken into account the system constraints, such as those posed by other sensors or by the robot controller. To the best of our knowledge, only [9] has reported the design of a tactile sensing system that also considers system constraints. Overall system performance is dictated not only by the isolated quality of the individual system elements, but also by how the system elements integrate to achieve a goal.
As an example, the development of tactile sensing arrays for the fingertips of a humanoid robot should also take into account system-level issues like the availability of space on the fingertip (which decides the size of the array), the nature of the signal going out of the array (analog/digital), the position of the analog sensor front end (on the same chip as the array or on a separate chip), the sources of noise (there are many motors on robots), the time response of the sensor with respect to the other sensors involved in the closed-loop control of the robot, the division of functions to be performed locally and centrally, etc. Much work needs to be done at the system level before artificial touch can be used in real world environments. This will also serve as a basis for the development of practical and economic tactile sensing systems in the future. Inclusion of tactile arrays in the control loop of the robot will help in exploring deeper issues involved both in exploration and in manipulation. This will not only advance research in robotics but will also help in understanding human interaction with the environment through touch. In this work, we present the design of a complete tactile sensing system taking into account not only the parameters to be sensed but also the physical and operational constraints of the robotic system.

The work presented in this paper has been supported, in part, by the ROBOTCUB project (IST-2004-004370), funded by the European Commission through Unit E5 - Cognitive Systems.

Although the work presented here mainly refers to tactile sensing for parts like fingertips, which require a high density of touch sensors, it can also be applied to low-density touch sensing areas such as large-area skin. This paper is organized as follows. Section II presents an overview of the robot tactile sensing system. Section III discusses the design constraints and specifications involved in acquiring the tactile data. In Section IV, the architecture of the sensor system is described, along with the solutions adopted to overcome the stringent system constraints. Conclusions are drawn in Section V.

II. TACTILE SENSING SYSTEM OF THE ROBOT

The development of a tactile sensing system requires the understanding and design of the sensor system architecture at all levels, from the sensing of the external stimulus to the action taken based on that stimulus. The functions involved are described below.

The hierarchical functional and structural block diagram of the complete tactile sensing system is shown in Fig. 1, in which the complex tactile sensing process is systematically divided into sub-processes. Such a division helps in designing the various parts of the system, to a desired level of complexity, according to the tactile sensing mechanisms involved during interaction with the environment. The levels from bottom to top depict the sensing of the signal, the perception of the real world and, ultimately, the initiation of action by the controller. The level of complexity increases from bottom to top, with the more computation-intensive processes occurring at the top. The signal flow in the functional block diagram is somewhat similar to that of the human tactile sensing system [10]. The various functional levels are described below, starting from the bottom.

Transduction of contact data constitutes the lowest level of the system. This involves measurements such as the magnitude and direction of forces, the distribution of force in space, temperature, etc. An accurate reconstruction of contact details requires a sufficient number of sensing elements placed within the available space, for example on the robot finger (as fingers are generally involved in interaction with the environment through touch). This places a constraint on the method of transduction to be used, as does the speed of response. The transduction of the touch signal can be done either by a single sensor or by an array of touch sensors, and the requirement of fast response places a constraint on the number of touch sensors on the array.

The second level involves the signal conditioning and read-out, which greatly depends on the type of transduction method used in the previous level. In Fig. 1, this level has been divided into two parts: the analog sensors front end and the digital core. Some low level computations, like simple scaling (amplification) and segregation of the data from different kinds of touch sensors (e.g.
force, temperature, etc.) can also be performed at this level by the analog sensors front end. The digital core is used for linearization, compensation (e.g. temperature compensation, if the transducer performance changes with temperature), compression of information, slip detection, texture recognition,

Fig. 1: Hierarchical functional and structural block diagram of the complete tactile sensing system.

etc. In order to maintain a better signal-to-noise ratio, it is desirable to keep the signal conditioning circuitry close to the transducer array (for example, if the transducer arrays are placed on the distal phalange of the robot fingers, the conditioning circuits can be placed on the adjacent phalange when the space constraint does not allow them to be placed on the distal phalange itself). A System on Chip (SoC) approach would be ideal in this case, and this is also the goal of the work presented here. The initial choice of transduction method and conditioning circuit is important from the system point of view, as it sets the bandwidth limits of the data accessed by the higher levels of the system.

The third level involves the transmission of the collected information to higher levels through the communication interface. The desired operating speed, noise and the number of wires put a constraint on the type of communication channel used for interaction with the higher levels. The transmission of digital data can be done either over a dedicated serial line or over the CAN bus. The CAN bus is generally the preferred choice due to its high real-time capability, fast transmission (up to 1 Mbit/s) and high transmission reliability. The high transmission reliability also makes the CAN bus preferable to wireless transmission, because of the safety issues involved during robotic interaction with the environment - even though wireless transmission would otherwise be an ideal solution, as it helps in reducing wiring problems.

The fourth level involves the multiplexing of the tactile data coming from different parts, for example from different fingers during a typical manipulation/exploration task.

Due to the large number of sensing elements, the data size also multiplies. Not all the data collected from the various parts is useful, and the useless data can therefore be rejected. This is basically the function of the fifth level in the hierarchy of the tactile sensing system shown in Fig. 1.
For example, a grasp may not involve all the fingers, and hence the data obtained from the fingers not involved in the grasp can be rejected. The same argument is valid for certain patches of the tactile array on a particular finger involved in grasping. Based on the task, a scheme for reading data only from certain predetermined tactile sensor elements can be useful. This requires addressing of all the touch sensor elements, which is the reason why the data transfer in Fig. 1 is shown as bidirectional.

The next level is sensor fusion. At this level, the signals from different kinds of sensors are collected. In the case of a humanoid robot, these signals could come from touch sensors (both extrinsic and intrinsic), from vision sensors and from audio sensors. In humans, interaction with the environment involves the statistical combination of sensory data from different sensing modalities [11], for example touch and vision, as shown in Fig. 1. Some attempts at robot control involving different sensing modalities have also been reported in the past [12, 13]. The availability of fast and efficient vision and audio sensors places a constraint on the speed with which the tactile data should be obtained, if data from different sensing modalities are involved in the robot control.

Higher level computations are done at the seventh level to obtain the model/image of the environment (the object in contact), based on the data obtained from the earlier levels (from independent sensing modalities or from the fused data of different sensing modalities). This level does not impose any major constraint on the design of the lower levels of the tactile sensing system. Dedicated computing hardware is required to perform the functions of this level and those of the earlier two levels, i.e. data selection, sensor fusion and model construction.

At the highest level, the control algorithms are implemented. For reliable control of complex tasks, tactile sensing parameters like sensor density, resolution, speed and location are particularly important.
Thus, the final design of the tactile sensor and the associated electronic circuitry is the result of many trade-offs.

Our approach to the development of the tactile sensing system is to climb the hierarchical ladder of the tactile sensing system from the hardware-intensive bottom. The work presented here is related to the three lowest levels of the functional diagram shown in Fig. 1. For the lowest level, we have developed POSFET (Piezoelectric Oxide Semiconductor Field Effect Transistor) based tactile sensing arrays, which are to be placed on the fingertips of the humanoid robot 'icub' [14], shown in Fig. 2. To perform the second and third level functions, we are developing the analog sensors front end, the digital core and the communication interface.
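The fifth-level data selection described above - reading and forwarding only the taxels relevant to the current grasp - can be illustrated with a minimal sketch. The finger names, the 5 x 5 frame layout and the function itself are hypothetical, not part of the system described in this paper:

```python
# Hypothetical sketch of the fifth-level data selection: only frames from
# the fingers taking part in the current grasp are kept for transmission.
# The finger names and 5 x 5 frame layout are illustrative.

def select_tactile_data(frames, active_fingers):
    """Drop tactile frames from fingers not involved in the grasp."""
    return {f: data for f, data in frames.items() if f in active_fingers}

frames = {
    "thumb":  [[0] * 5 for _ in range(5)],   # one 5 x 5 taxel frame per finger
    "index":  [[0] * 5 for _ in range(5)],
    "middle": [[0] * 5 for _ in range(5)],
}
selected = select_tactile_data(frames, active_fingers={"thumb", "index"})
# "middle" is rejected before transmission, reducing the data volume
```

A more realistic scheme would select individual taxel patches rather than whole fingers, which is why bidirectional addressing of all touch elements is required.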

III. TOWARDS IMPLEMENTATION

The first step towards the implementation of the tactile sensing system is to fix the system requirements. The system requirements presented below are divided into two parts: those related to the sensor and those related to the conditioning electronics.

A. Sensor Requirements

In the absence of any rigorous artificial tactile sensing theory that can help in specifying important system parameters such as sensor density, resolution, location, bandwidth, etc., one can turn to human tactile sensing for some initial cues. Following the information on the human tactile sensing system, one can formulate some basic design features of an artificial

Fig. 2: Humanoid robot 'icub', with the hand shown in the inset.

tactile system for a general robotic system intended to be used in real world environments. A few such studies have been reported in the literature [3-5], following which some design factors for artificial tactile sensing are presented below:

1) The distributed nature of receptors calls for using various kinds of miniaturized sensors arranged in a matrix. The number of elements in the array may vary with its desired physical location on the robot.
2) The spatial resolution of the array of sensors should be about 1-2 mm, which translates to approximately a 10 x 15 element grid on a fingertip-sized area.
3) In general, the sensor should demonstrate high sensitivity and a broad dynamic range. A force sensitivity range of 0.01-10 N (~1 g - 1 kg) with a dynamic range of 1000:1 would be satisfactory.
4) It should be multifunctional, i.e. in addition to the detection of forces, the touch sensor should be able to detect other interaction behaviors like hardness, temperature, etc.
5) Linearity and low hysteresis are desired. Although non-linearity can be dealt with through inverse compensation, handling high hysteresis is difficult. The output from the tactile sensor should be stable, monotonic and repeatable. It is interesting to note that human tactile sensing is hysteretic, nonlinear, time varying and slow; but perhaps the presence of a large number of these 'technologically poor' biological receptors enables the central nervous system to extract useful information.
6) The artificial tactile sensor should be fast. This is particularly true if the tactile sensor is part of the control loop. In general, for real time contact details, each touch element should have a response time of less than 1 ms, or a similar value related to the total number of elements.
7) In addition to the above factors, artificial tactile sensors should be robust and thus must be capable of withstanding harsh conditions of temperature, humidity, chemical stress, electric field, etc.
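A minimal sketch of how items 2) and 3) translate into implementation numbers. The fingertip dimensions and pitch below are illustrative assumptions (the text quotes a roughly 10 x 15 grid at 1-2 mm spacing and a 1000:1 dynamic range):

```python
import math

def taxel_grid(length_mm, width_mm, pitch_mm):
    """Number of taxels that fit on a rectangular pad at a given pitch."""
    return math.floor(length_mm / pitch_mm), math.floor(width_mm / pitch_mm)

def adc_bits_for_range(dynamic_range):
    """Minimum ADC resolution needed to resolve a given dynamic range."""
    return math.ceil(math.log2(dynamic_range))

# Assumed fingertip-sized pad of 15 mm x 20 mm at 1.5 mm pitch
rows, cols = taxel_grid(15, 20, 1.5)   # a grid of roughly 10 x 13 taxels
bits = adc_bits_for_range(1000)        # 1000:1 range needs a 10 bit ADC
```

The 10 bit figure reappears in the noise discussion of the next subsection; the grid size is only indicative, since the usable fingertip area and pitch depend on the final mechanical design.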

However, it should be noted that these characteristics, like any other design factors, are application dependent and thus should not be considered definitive.

B. Electronics Circuitry Requirements

Dimension of the chip: this depends on the availability of space on the robot. For the fingertip of the robot, the dimensions of the chip (including the sensor) should be approximately 13 mm x 15 mm.

Scheme of addressing: to reproduce the image of the contacted object, each touch element on the array needs to be addressed. This can be done by selecting rows and columns separately or by addressing individual touch elements. In our case, access to individual POSFET based touch sensors is preferred. For a 5 x 5 array, 5 address lines are needed.

Pre-charge bias arrangement: in applications using piezoelectric materials as transducers, a voltage fluctuation is observed in the output during the application of an external load. This can be reduced by the pre-charge bias technique [15].

Noise: apart from the many sources of noise on a robot, the variation in temperature can be a source of noise in our case, due to the choice of a piezoelectric polymer as the transducer material. The total noise of the system puts a constraint on the resolution of the ADC and hence on the resolution of the parameters to be measured. To get a 1000:1 dynamic range of forces, a 10 bit ADC is required.

Cross talk: in order to measure the value of the force at the addressed touch element, the read-out circuitry must be insensitive to parasitics due to the touch elements in neighbouring rows and columns. In our earlier sensor design [16], a 25% mechanical cross talk was observed. One reason for this high cross talk was the presence of a uniform metal layer on one side of the polymer.
The cross talk is expected to go down in the POSFET based tactile sensing array, as in this case the metal electrodes have been patterned to be present over the touch element only. Any electrical cross talk (change in the capacitance of the polymer due to adjacent touch elements) can be reduced by grounding the touch elements other than the one being read.

Read-out time: while interacting with the environment, the image of the contacted object can be reconstructed only if the tactile sensing array is scanned at some minimum number of frames per second. This must also take into account the read-out time of the other sensors and the bandwidth of the controller. For example, with the robot arm controller running at 100 Hz and the vision sensor read at 30 frames per second, and assuming the tactile array is scanned at 100 frames per second, the read-out time per element for a 5 x 5 array is 0.4 ms (= 1/(5 x 5 x 100)). For very dense arrays, the available read-out time is very short and at times can be unrealistic. Thus, the number of touch elements on the array depends on the read-out circuitry.

Some low level computations like temperature compensation, averaging, etc. can be performed on the chip itself. For example, if the temperature variation is high, then the voltage output of each touch element needs to be compensated (if the response due to force alone is desired).

TABLE I: SYSTEM REQUIREMENTS
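The read-out time budget above can be checked with a short sketch (the function name is illustrative):

```python
def per_element_readout_time(rows, cols, frames_per_s):
    """Time available to read one taxel when the whole rows x cols array
    must be scanned frames_per_s times per second."""
    return 1.0 / (rows * cols * frames_per_s)

# A 5 x 5 array scanned at 100 frames/s, as in the text
t = per_element_readout_time(5, 5, 100)   # 0.4 ms per taxel

# Denser arrays shrink the budget quickly, e.g. a 16 x 16 array
t_dense = per_element_readout_time(16, 16, 100)   # ~39 microseconds per taxel
```

This makes concrete why, for very dense arrays, the per-element read-out time can become unrealistic for a given read-out circuit.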

IV. SYSTEM ARCHITECTURE

The architecture of the lowest three levels of the tactile sensing system is shown in Fig. 3. The total task is divided into three parts: the development of the POSFET based tactile sensing array; the development of the dedicated electronic circuitry; and the integration of the whole system - in a single package (SiP) in the first phase and on a single chip (SoC) in the second phase.

The POSFET based tactile sensor arrays have been designed for the fingertips of the robot and are now in fabrication. The tactile sensing array, as shown in Fig. 3, comprises 5 x 5 POSFET (Piezoelectric Oxide Semiconductor Field Effect Transistor) based tactile sensors [16, 17], obtained by depositing a piezoelectric polymer (PVDF-TrFE) film on the gate area of a MOSFET. The charge generated in the piezoelectric polymer by the applied force modulates the charge in the induced channel of the MOSFET, which is then converted into a voltage by read-out circuitry that can be embedded in the chip. While the piezoelectric polymer film as the sensing element improves the sensor's time response, the tight coupling of the sensing material (PVDF-TrFE) and the electronics using MOS technology will improve the force resolution, spatial resolution and signal-to-noise ratio. As an example, with the extended gate approach used in [15], an 8 x 8 tactile sensor array was scanned in around 50 ms, thus achieving a response bandwidth of 25 Hz. With POSFET based touch sensors and the whole system on chip (SoC), which is our final goal, the bandwidth can be pushed above 100 Hz, as desired for involving touch sensing in robotic arm control. As an example, with 5 address lines for 25 (Ns) touch elements, 10 bits of data per touch element (Nd), and assuming a POSFET response time of 50 µs (Tr), a delay of 50 µs during addressing (Ta) and a delay of 50 µs during transmission of data (Tt), the scanning frequency of the entire array can be obtained by substituting the corresponding values in the following equation:

Fs = 1 / (Ns * (Ta + Tr + Tt))        (1)
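Eq. (1) can be evaluated with the values quoted in the text (Ns = 25, Nd = 10, Ta = Tr = Tt = 50 µs). The exact result is about 267 Hz; the 270 Hz and 67.5 kbit/s quoted below follow from rounding the scanning frequency up before computing the bandwidth:

```python
def scan_frequency(n_s, t_a, t_r, t_t):
    """Eq. (1): scanning frequency of the whole array, for n_s taxels with
    per-taxel addressing, response and transmission delays (in seconds)."""
    return 1.0 / (n_s * (t_a + t_r + t_t))

Ns, Nd = 25, 10                                # taxels; data bits per taxel
Fs = scan_frequency(Ns, 50e-6, 50e-6, 50e-6)   # ~267 Hz (quoted as ~270 Hz)
bandwidth = Fs * Ns * Nd                       # ~67 kbit/s on the serial link
```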

Thus, assuming the number of data lines to be equal to the number of transmitted data bits, the scanning frequency is about 270 Hz and the communication bandwidth is 67.5 kbit/s. It should be noted that with POSFET based touch elements and the SoC approach, the delay and response times are expected to be less than those assumed in the above example; in other words, the scanning rate of the array would be faster.

For the feasibility and reliability of the prototype implementation, an ad hoc System in Package (SiP) with dedicated chips - tactile sensing array, analog front end, digital core and interface, all in one package - could be the starting point. The main functional blocks of the SiP, as shown in Fig. 3, are: a) the POSFET based tactile sensing array; b) the analog sensors front end; c) the digital core and serial communication interface. To optimize the performance, a dedicated implementation of the analog sensors front end is mandatory. Thus, an Application Specific Integrated Circuit (ASIC) will be designed and manufactured for this purpose, taking into account the system requirements given in Table I.

Fig. 3: System architecture (first phase) of the tactile sensing system. The tentative location of the different chips on the robot finger is also shown.

The analog sensors front end will provide the sensors with the necessary bias voltages and currents, in order to acquire the sensor signals with minimal noise (i.e. on-chip filtering to remove out-of-band noise components). If necessary, the signals are amplified to make the noise introduced by the subsequent stages less critical. The signals are then converted to digital values. The functions of the digital core are: to address the touch sensors; to extract and compress information from the tactile sensor array; to compensate for the nonlinear and pyroelectric effects which may affect the measurement; and to drive the analog front end control signals. Moreover, the digital core manages the interface to the communication channel. The prototype tactile sensing system will be interfaced either with the CAN bus or with a dedicated digital serial line (a digital interface is almost mandatory to protect the sensor data from noise, given that the robot controller cards can be far away from the sensors). The bandwidth and connectivity (constrained by the overall size of the fingertip) with the electronics existing on the humanoid robotic hand/arm will be considered according to the system requirements.

A major breakthrough in robot tactile sensing would be the 'System on Chip' (SoC) implementation of the tactile sensing system. The presence of the analog sensor front end, digital core and communication interface along with the tactile sensor array on the same chip is expected to improve, among other things, the speed, bandwidth, signal-to-noise ratio (keeping in view the many sources of noise on a humanoid), overall sensitivity, efficiency and robustness. In addition, with the SoC approach, the problem of wiring complexity - a key robotics problem - can also be effectively dealt with. Thus, rather than an alternative solution, SoC is the requirement for the tactile sensing system.
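As an illustration of the pyroelectric compensation performed by the digital core, a minimal sketch is given below. PVDF-TrFE is pyroelectric, so a temperature change adds an offset to the force-induced output; the coefficient and reference temperature here are hypothetical placeholders, not measured values:

```python
# Hypothetical first-order pyroelectric compensation for one taxel.
# Both constants below are assumed for illustration only.
PYRO_COEFF_V_PER_K = 0.02   # assumed output drift per kelvin
T_REF = 25.0                # assumed reference temperature (deg C)

def compensate(v_out, temperature):
    """Remove the temperature-induced component from a taxel reading."""
    return v_out - PYRO_COEFF_V_PER_K * (temperature - T_REF)

v = compensate(0.50, 30.0)   # 0.50 V measured at 30 deg C -> 0.40 V
```

In the actual system the compensation would be calibrated per taxel, using the on-chip temperature sensor shown in Fig. 3.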

V. CONCLUSION

A tactile sensing system for a robot has been presented. Instead of coming up with 'yet another touch sensor' only, the need to develop a complete tactile sensing system for the robot has been argued. The system requirements have been outlined and, based on these, the system architecture has been presented. The system architecture (tactile sensing arrays + analog front end + digital core + communication interface) will be implemented with a SiP (System in Package) approach in the first phase and with a SoC (System on Chip) approach in the second phase. The SiP approach is preferred in the first phase, to study the feasibility and reliability of the system, as SiP allows simpler designs, easy design verification, processes with minimal mask steps and the use of optimized technologies for the different functions. This will also provide an opportunity to compare the SiP and SoC approaches in terms of cost and performance for this application.

ACKNOWLEDGMENT

We are thankful to Dr. Leandro Lorenzelli, FBK-IRST, Trento, Italy, for his support in the development of the tactile sensing arrays.