Brain Research Bulletin

Volume 75, Issue 6, 15 April 2008, Pages 785-795
Research report
Bio-inspired sensorization of a biomechatronic robot hand for the grasp-and-lift task

https://doi.org/10.1016/j.brainresbull.2008.01.017

Abstract

It has been concluded from numerous neurophysiological studies that humans rely on detecting discrete mechanical events that occur when grasping, lifting and replacing an object, i.e., during a prototypical manipulation task. Such events represent transitions between phases of the evolving manipulation task, such as object contact, lift-off, etc., and appear to provide critical information required for the sequential control of the task as well as for corrections and parameterization of the task. We have sensorized a biomechatronic anthropomorphic hand with the goal of detecting such mechanical transients. The developed sensors were designed specifically to provide information about task-relevant discrete events rather than to mimic their biological counterparts. To accomplish this we have developed (1) a contact sensor that can be applied to the surface of the robotic fingers and that shows a sensitivity to indentation and a spatial resolution comparable to those of the human glabrous skin, and (2) a sensitive low-noise three-axial force sensor that was embedded in the robotic fingertips and showed a frequency response covering the range observed in biological tactile sensors. We describe the design and fabrication of these sensors and their sensory properties, and show representative recordings from the sensors during grasp-and-lift tasks. We show how the combined use of the two sensors provides information about crucial mechanical events during such tasks. We discuss the importance of the sensorized hand as a test bed for low-level grasp controllers and for the development of functional sensory feedback from prosthetic devices.

Introduction

In this work we present a bio-inspired anthropomorphic artificial hand that mimics the biomechanical features and the sensory apparatus of the human hand. The rationale for this endeavor was threefold. First, such an artifact can be used as an experimental test object to explore control strategies proposed and tested in human subjects, and to develop and evaluate novel ones. In addition, autonomous robots are likely to be required to handle objects designed primarily for human manipulation and as such may need to be equipped with anthropomorphic hands. Finally, and not less importantly, artificial hands may find a place in the rehabilitation of hand amputees and of patients congenitally lacking hands.

Humans use their hands for a range of behaviors including power grips, exquisitely fine manipulation, and communicative gestures. The versatility of the human hand relies both on its structure and its control. It displays an intricate biomechanical structure with many degrees of freedom and is endowed with a large number of specialized sensory endings residing in the skin, joints and muscles. Achieving our goal – an artificial hand that mimics the human hand – thus requires not only the mechanical design and implementation of an anthropomorphic hand [8], but also the implementation of a sensory system that compares with the human sensory system. It should be noted, however, that dexterous manipulation does not emerge by simply constructing an exact or improved replica of the human hand. Even simple manipulation tasks, such as a power grasp, require sophisticated control; indeed, simple grasps engage large parts of the human brain [17]. The coordinated and graceful lifting patterns observed in adults are not achieved in humans until 8–10 years of age [18], [19]. Humans thus need almost a decade of daily practice before they master this apparently simple sensorimotor task.

The sensory system of the human hand serves a multitude of functions, e.g., nociception, thermal sensing, texture discrimination, stereognosis, and control of manipulation. Tactile sensor designers have often drawn inspiration from the human mechanoreceptors of the glabrous (hairless) skin of the hand [10], [11], [44], [59]. In most cases, however, the development of tactile sensors has focused on individual sensor components rather than on complete systems for tactile sensing, and on general functionality rather than task-specific functions. In contrast, we have taken a biomechatronic approach, that is, we have concurrently addressed mechanical and sensor design. As such, we have focused in this study on the sensory signals pertaining to grasping tasks, and specifically on the signals required for grasp stability control during a prototypical task that forms an integrated part of most kinds of manipulation, i.e., grasping, lifting, and replacing an object [28], [30].

The task of grasping-and-lifting objects and placing them back in the environment has been studied in humans in a large number of neurophysiological and behavioral studies in which subjects have lifted instrumented objects with or without simultaneous recordings of muscle activity (electromyography) and single afferent activity (microneurography) [33], [34], [35], [36], [57]. The behavioral organization of this task is characterized by both sequential and parallel coordination, i.e., it can be subdivided in sequential phases each characterized by a specific sensorimotor behavior and parallel activation of various muscle groups (Fig. 1a and b). Importantly, responses in tactile afferents mark transitions between phases [37], [58].

The strategy represented by a parallel coordination of the grip and load forces, in conjunction with a sequential organization of the specific sensorimotor conditions, offers several advantageous features [27], [30]. It allows, for instance, objects with unknown weights to be lifted, since the load phase, characterized by a parallel increase in grip and load forces, may continue until lift-off has been detected. Similarly, manipulative tasks involving objects with different frictional properties are simplified because the required adjustment can be implemented by changing a single parameter, i.e., the ratio of the grip and load forces applied at the individual digit–object interfaces [16].
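This parallel coordination lends itself to a compact sketch. The toy controller below (the function name, force units, rates, and ratios are hypothetical illustrations, not the controller described in this paper) ramps the load force and slaves the grip force to it at a friction-dependent ratio; the load phase simply continues until the lift-off event, so the same strategy handles unknown weights, and frictional changes require changing only one parameter.

```python
# Minimal sketch of parallel grip-load coordination during the load phase.
# All parameters are hypothetical placeholders, not the hand's actual values.

def load_phase(object_weight, grip_load_ratio, load_rate=1.0, dt=0.001):
    """Ramp load and grip forces in parallel until lift-off is detected.

    grip_load_ratio encodes the frictional condition at the digit-object
    interface: a more slippery surface needs more grip force per unit load.
    Lift-off occurs when the load force exceeds the object's weight, so
    objects of unknown weight can be lifted with the same strategy.
    """
    load, grip, t = 0.0, 0.0, 0.0
    while load < object_weight:          # continue until the lift-off event
        load += load_rate * dt           # vertical (load) force ramp
        grip = grip_load_ratio * load    # grip force slaved to load force
        t += dt
    return t, load, grip

# Different weights need no new parameters, only a later termination event;
# different frictions change only the grip:load ratio:
t_light, _, _ = load_phase(object_weight=0.2, grip_load_ratio=1.5)
t_heavy, _, grip_heavy = load_phase(object_weight=0.4, grip_load_ratio=1.5)
```

The heavier object is lifted by the very same control law; it merely prolongs the load phase until its lift-off is detected.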

The mechanoreceptors known to be engaged in manipulative tasks can be subdivided into exteroceptors and proprioceptors. By definition, exteroceptors are primarily affected by external stimuli and are represented by tactile mechanoreceptors residing in the skin. Proprioceptors primarily encode mechanical events related to joint positions and muscle forces; they are traditionally represented by joint mechanoreceptors, muscle spindles and Golgi tendon organs. The subdivision into exteroceptors and proprioceptors is didactically useful but somewhat artificial, given that typical exteroceptors respond to joint positions [13], [14] and muscle spindles are exquisitely sensitive to external perturbations [48]. While classical proprioceptors are likely to provide crucial information necessary for appropriate reaching and grasping behaviors, there is no evidence that either muscle spindle or Golgi tendon organ afferents play important roles during the prototypical lifting task or when the grasp is perturbed by external forces [20], [46]. We will therefore focus entirely on skin mechanoreceptors and their role in the control of object manipulation.

The study of the neurophysiological properties of human skin mechanoreceptors was made possible with the advent of microneurography in the late 1960s [55]. With this technique a fine insulated tungsten electrode is inserted percutaneously into the nerve and carefully manipulated until the uninsulated 5–20 μm tip picks up signals from a single afferent nerve fiber originating in an individual peripheral receptor. Among all types of afferents supplying the human glabrous skin, the low-threshold, fast-conducting afferents are relevant for the current study. They belong to one of two main groups: slowly adapting (SA) and fast adapting (FA) afferents (Fig. 1d). Whereas the SA afferents show a sustained response to indenting stimuli, FA afferents respond only when mechanical stimuli change. Both SA and FA afferents can be further subdivided into types I and II on the basis of their receptive field (RF) properties. By definition, the RF of an afferent corresponds to the skin area in which indenting stimuli evoke action potentials; the detailed topography of RFs can be very complex [26]. Type I afferents possess small and well-delineated RFs whereas the RFs of type II afferents are large with poorly defined borders [56]. Each afferent type innervates histologically defined structures that are in part responsible for its characteristic response properties. For instance, type I afferents, in contrast to type II afferents, innervate structures located superficially in the skin, which explains their different RF sizes. There are about 17,000 mechanoreceptive units innervating the glabrous part of the human hand and the majority (72%) are type I units [31]. The overall density increases in the proximo-distal direction and this increase can almost completely be explained by differences in the density of type I units (Fig. 1d).
Instead of a smooth density gradient, there is a slight increase in innervation density from the palm to the main parts of the fingers and an abrupt increase from the main part of the finger to the fingertip.

When the fingertips make initial contact with an object, all types of afferents discharge, although the response is most distinct in type I afferents (Fig. 1a). Similarly, all types of afferents discharge when the object is released from the grasp. While the initial contact activity provides important confirmation that the finger has made contact with an object, FA I afferents also seem to encode the frictional properties of the skin–object interface (Fig. 1e; [34]). Moreover, the initial contact provides information about the spatial relationship between the tips of the digits and the object, as well as about the local curvature of the object in contact with the digits [24]. When the object starts to move or when it is replaced on the support, the mechanical transients evoke responses in the sensitive FA II afferents (Fig. 1a); these transients may be caused by vibrations due to small lateral sliding movements between the object and the table. Importantly, the time of lift-off indirectly provides information about the object's weight. It allows subjects to adjust the forces applied to the object during the ongoing behavior, and also to parameterize subsequent manipulative actions with the same object, since the lifting drive must be appropriately adjusted to match the weight of the object to allow smooth vertical lifting movements [30].

The importance of the tactile afferents in releasing the sequence of coordinated motor behaviors characteristic of the lift-and-hold task is illustrated by the behavior of humans with digital anesthesia. With anesthesia and without visual control, the phase from contact to the initiation of the load phase is significantly prolonged, presumably because contact has not been confirmed by the tactile sensors [33]. Likewise, under digital anesthesia, subjects may fail to notice that lift-off takes place and accordingly may continue to increase the vertical force until some other sensory system indicates the lift-off [38]. Except for the initial contact phase, the load and grip forces increase and decrease in parallel during manipulative tasks (Fig. 1b). If the grip force is too low with respect to the load force, the object will slip; a parallel coordination of the grip and load forces thus ensures grasp stability [58]. The human tactile system plays an important role in adjusting the grip:load force ratio to allow a certain safety margin against slippage (Fig. 1a). As noted above, the discharge in FA I units reflects the frictional conditions at the digit–object interfaces. Adjustments to new frictional conditions are evident as a change in the grip:load force ratio within 100 ms after the initial contact (cf. Fig. 1e). This adjustment is usually sufficient for establishing an adequate safety margin. If, however, an overt or incipient slip occurs during the manipulative task, this is promptly signaled by FA I and SA I units, and the corrective action that results in an increased grip:load force ratio commences in about 70 ms [34]. Importantly, while these corrective actions ensure long-term grasp stability, in most situations the mechanisms are too slow to prevent slippage if overt slips occur at all engaged digits. Nevertheless, taking human performance as a reference, our design goal was a reaction time to slippage of ≤70 ms for the integrated system of sensors, mechanisms, and control.
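A slip-triggered correction of this kind can be sketched as follows. The sampling rate, safety fraction, upscaling factor, and function name below are illustrative assumptions, not the algorithm implemented in the hand: the loop scans the tangential and normal forces at a digit, flags an incipient slip when their ratio approaches the estimated friction limit, and upscales the grip:load ratio.

```python
# Hedged sketch of a slip-triggered grip:load ratio correction.
# Thresholds, time step and upscaling factor are illustrative only.

def correct_on_slip(samples, mu_estimate, ratio, upscale=1.4,
                    safety=0.9, dt_ms=1.0):
    """Scan (tangential, normal) force samples; on an incipient slip,
    upscale the grip:load ratio and report the detection latency in ms.

    An incipient slip is flagged when the tangential:normal force ratio
    exceeds a safety fraction of the estimated friction coefficient.
    """
    for i, (ft, fn) in enumerate(samples):
        if fn > 0 and ft / fn > safety * mu_estimate:
            latency_ms = i * dt_ms       # time from scan start to detection
            return ratio * upscale, latency_ms
    return ratio, None                   # no slip detected

# Tangential load grows while the normal (grip) force stays constant,
# so the force ratio eventually approaches the friction limit:
samples = [(0.01 * i, 1.0) for i in range(100)]   # 100 ms at 1 kHz
new_ratio, latency = correct_on_slip(samples, mu_estimate=0.5, ratio=1.2)
```

At a 1 kHz sampling rate the detection latency of such a threshold test is dominated by how quickly the force ratio drifts toward the friction limit, leaving most of the 70 ms budget for the mechanical response.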

Only limited information is available about how various features of mechanical stimuli impinging on the skin are encoded by skin afferents. In general, stronger stimuli, e.g., larger forces, indentations or more rapid changes, give rise to higher discharge rates, i.e., some stimulus features are encoded by the discharge rate in individual afferents [15], [41], [42]. In addition, larger stimuli evoke responses in more afferents, i.e., stimulus strength and spatial properties may be encoded in population codes [23]. Moreover, it has recently been proposed on the basis of microneurographic experiments that the very first spikes in ensembles of human skin afferents may encode complex mechanical events such as the direction of fingertip force and the shape of the surface contacting the fingertip [29].
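The two coding schemes mentioned above, rate coding and first-spike (rank-order) coding, can be illustrated with a toy decoder. The spike trains below are fabricated solely for illustration and carry no physiological data.

```python
# Toy illustration of rate coding vs. first-spike rank-order coding.
# Spike trains are fabricated; no physiological recordings are implied.

def mean_rate(spike_times, window):
    """Discharge rate in spikes/s over an observation window (s)."""
    return len(spike_times) / window

def first_spike_order(ensemble):
    """Return afferent labels sorted by their first-spike latency."""
    return sorted(ensemble, key=lambda name: min(ensemble[name]))

# Rate code: a stronger stimulus evokes more spikes in the same window.
weak   = [0.020, 0.060, 0.110]                       # spike times (s)
strong = [0.008, 0.020, 0.035, 0.052, 0.070, 0.090]
rates = (mean_rate(weak, 0.15), mean_rate(strong, 0.15))

# Rank-order code: two stimuli giving identical ensemble rates remain
# distinguishable by the order of the very first spikes alone.
stim_a = {"u1": [0.005, 0.04], "u2": [0.012, 0.03], "u3": [0.020, 0.05]}
stim_b = {"u1": [0.020, 0.04], "u2": [0.012, 0.03], "u3": [0.005, 0.05]}
order_a = first_spike_order(stim_a)
order_b = first_spike_order(stim_b)
```

The point of the second half is that a rank-order readout is available as soon as each afferent has fired once, long before a reliable rate estimate can be formed.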

From the short review above of the normal physiology of grasping in humans, it is possible to identify the minimum requirements of a bio-inspired tactile system that fulfills the functional requirements of its biological counterpart with respect to prototypical grasp-and-lift actions. In short, the tactile sensory system of an artificial hand must be able to detect:

  • contact and release between the fingertip and the object;

  • object lift-off and replacement on the support; and

  • slippage between the fingertip and the object.

Other functions of the physiological sensory system, such as providing information about the local shape at contact points and about overall object shape, were thus outside the primary design goals.
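As a rough illustration, each of the three required events can be cast as a threshold test on sampled fingertip force signals. All thresholds and helper names below are hypothetical placeholders, not the detection algorithms actually implemented in the hand.

```python
# Illustrative detectors for the three required events, operating on
# sampled fingertip forces. Thresholds are hypothetical placeholders.

CONTACT_THRESHOLD = 0.05    # N, minimum normal force counted as contact
TRANSIENT_THRESHOLD = 2.0   # N/s, force-rate jump marking lift-off/replace
FRICTION_SAFETY = 0.9       # fraction of friction limit counted as slip

def detect_contact(fn):
    """Contact/release between fingertip and object: normal force level."""
    return fn > CONTACT_THRESHOLD

def detect_transient(fn_prev, fn_now, dt):
    """Lift-off or replacement: an abrupt force change, analogous to the
    mechanical transients that excite FA II afferents."""
    return abs(fn_now - fn_prev) / dt > TRANSIENT_THRESHOLD

def detect_slip(ft, fn, mu_estimate):
    """Incipient slip: tangential:normal ratio near the friction limit."""
    return fn > 0 and ft / fn > FRICTION_SAFETY * mu_estimate

events = {
    "contact": detect_contact(fn=0.3),
    "transient": detect_transient(fn_prev=0.30, fn_now=0.36, dt=0.01),
    "slip": detect_slip(ft=0.42, fn=1.0, mu_estimate=0.5),
}
```

In this sketch the contact sensor would drive the first test, while the three-axial fingertip force sensor would supply the force rate and the tangential:normal ratio for the other two.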

The biomechatronic hand

We have exploited robotic and microengineering technology to design and fabricate the building blocks of an underactuated biomechatronic hand [7]. The architecture of the hand comprises an embedded actuator system, an artificial ‘proprioceptive sensory system’ (position and force sensors), an exteroceptive sensory system, and an embedded low-level control unit. The biomechatronic hand has five digits (thumb and four fingers) with cylindrical phalanges (∅ 16, 14 and 12 mm and lengths 45, 25 and 22

Results

The main issues addressed in this study concerned the performance of the sensory system, in particular whether the implemented system was able to fulfill the functional requirements of a prototypical lifting task, that is, to detect the critical mechanical transients that occur during such tasks.

Discussion

The goal of this study was to create an artificial tactile system, embedded in a biomechatronic hand, able to convey information believed to be crucial for the successful completion of a prototypical manipulation task, viz., to grasp and lift an object. It should be emphasized that the developed contact sensor and the three-axial force sensor should be regarded not as precision devices per se but rather as parts of a system able to extract qualitative or semi-quantitative information of

Acknowledgements

This work was supported by the EU (NEUROBOTICS IST-001917) and the Swedish Research Council (projects 08667 and 2005-6994).

References (59)

  • G. Canepa et al., Detection of incipient object slippage by skin-like sensing and neural network processing, IEEE Trans. Syst. Man Cybern. (1998)
  • M.C. Carrozza et al., The Cyberhand: on the design of a cybernetic prosthetic hand intended to be interfaced to the peripheral nervous system
  • M.C. Carrozza et al., The SPRING hand: development of a self-adaptive prosthesis for restoring natural grasping, Auton. Robots (2004)
  • M.R. Cutkosky et al., Manipulation control with dynamic tactile sensing
  • J. Dargahi et al., Human tactile perception as a standard for artificial tactile sensing—a review, Int. J. Med. Robot. Comput. Assist. Surg. (2004)
  • P. Dario et al., Biologically-inspired microfabricated force and position mechano-sensors
  • P. Dario et al., An integrated miniature fingertip sensor
  • B.B. Edin, Cutaneous afferents provide information about knee joint movements in humans, J. Physiol. (2001)
  • B.B. Edin et al., Finger movement responses of cutaneous mechanoreceptors in the dorsal skin of the human hand, J. Neurophysiol. (1991)
  • B.B. Edin et al., Receptor encoding of moving tactile stimuli in humans. I. Temporal pattern of discharge of individual low-threshold mechanoreceptors, J. Neurosci. (1995)
  • B.B. Edin et al., Independent control of human finger-tip forces at individual digits during precision lifting, J. Physiol. (1992)
  • H.H. Ehrsson et al., Cortical activity in precision- versus power-grip tasks: an fMRI study, J. Neurophysiol. (2000)
  • H. Forssberg et al., Development of human precision grip. I. Basic coordination of force, Exp. Brain Res. (1991)
  • H. Forssberg et al., Development of human precision grip. II. Anticipatory control of isometric forces targeted for object's weight, Exp. Brain Res. (1992)
  • C. Häger-Ross et al., Nondigital afferent input in reactive control of fingertip forces during precision grip, Exp. Brain Res. (1996)
  • R.D. Howe, Tactile sensing and control of robotic manipulation, J. Adv. Robot. (1994)
  • J.M. Hyde et al., A phase management framework for event-driven dextrous manipulation, IEEE Trans. Robot. Autom. (1998)
  • P. Jenmalm et al., Influence of object shape on responses of human tactile afferents under conditions characteristic of manipulation, Eur. J. Neurosci. (2003)
  • P. Jenmalm et al., Visual and somatosensory information about object shape control manipulative fingertip forces, J. Neurosci. (1997)