About this Book

As robots improve in efficiency and intelligence, there is a growing need to develop more efficient, accurate and powerful sensors suited to the tasks to be robotized. This has led to a great increase in the study and development of different kinds of sensor devices and perception systems over the last ten years. Applications outside industry are often more demanding in sensorics, since the environment is usually less well structured. Space and agricultural applications are examples of situations where the environment is unknown or variable. The work to be done by a robot therefore cannot be strictly programmed, and there must be interactive communication with the environment. It cannot be denied that evolution and development in robotics are closely related to advances in sensorics. The first vision and force sensors, built from discrete components, offered very low resolution and poor accuracy. However, progress in VLSI, imaging devices and other technologies has led to more efficient sensor and perception systems able to supply the necessary data to robots.



Force and Torque Sensors


Joint Force Sensing for Unified Motor Learning

Motor learning consists of using in-built sensors to learn more about one’s own motion behavior. In this paper, we present an approach for motor learning based on a perturbed parameter scheme. A technique is developed for determining the link inertias based on joint reaction data, obtained through force sensors. Due to inexactness of the model, the parameters thus estimated are likely to differ from their true values. This perturbed parameter set can be thought of as a “learned” model of the executed motion. Running the dynamics procedure with these altered parameters results in a more accurate prediction of the control torques needed for the desired motion.
A. Mukerjee
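The inertia-estimation idea in this abstract can be illustrated with a minimal sketch. Assuming a single joint obeying τ = I·α with negligible friction (an assumption for illustration; the paper's perturbed-parameter procedure covers full link dynamics), the inertia follows from a least-squares fit of sensed joint torques against accelerations:

```python
import numpy as np

# Hypothetical single-joint model: tau = I * alpha (friction ignored).
# The inertia is recovered by least squares from noisy sensed samples.
rng = np.random.default_rng(0)
true_inertia = 0.8                                       # kg*m^2, assumed
alpha = rng.uniform(-5.0, 5.0, 50)                       # joint accelerations
tau = true_inertia * alpha + rng.normal(0.0, 0.05, 50)   # noisy joint torques

# Closed-form least-squares estimate: I = (alpha . tau) / (alpha . alpha)
inertia_hat = alpha @ tau / (alpha @ alpha)
print(f"estimated inertia: {inertia_hat:.3f}")
```

In the paper's scheme such an estimate absorbs model inexactness, so it acts as a "learned" parameter rather than the physical inertia.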

Modelling the Interaction between Robot and Environment

The contact between robot and environment generates forces of interaction which need to be controlled. In addition, the motion of the robot interacting with the environment must also be controlled. The unification of these two objectives is usually titled ‘force and position control’. In the recent past this issue has been extensively addressed in the research, and several basic approaches to force control have emerged. The synthesis of controllers for force and position control has also been addressed. The paper reviews some of the relevant work in this area in particular the techniques used to generate models of the interaction between robot and environment. The paper also addresses conceptually the implementation of impedance control and presents an application of this technique as well as of the descriptor system to control of interaction forces in a dexterous multifingered hand.
A. A. Goldenberg

Pneumatic Sensors: Their Use and Performance in Force, Tactile and Position Sensing and in Shape Recognition

The application of pneumatic sensors on robots or manipulation systems is possible and convenient in some cases where the use of compressed air is particularly suitable.
A. Romiti

Tactile Sensors


Carbon Fibre Sensors

Carbon fibre sensors offer the potential of robust, highly sensitive transducers, albeit with poor repeatability and time-dependent characteristics. Dynamic response is limited to about 30 Hz.
J. B. C. Davies

Tactile Geometry for Images and Normals

The shape of a tactile sensor imposes constraints upon its performance. This paper discusses the geometries of currently available tactile sensors for robots, and compares them with man’s own tactile sensor, the finger. Possible alternative shapes are discussed along with the requirements for a versatile tactile sensor. Finally, a new tactile sensor is presented which appears to satisfy these requirements. Significantly, this sensor can be readily moulded to any desired geometry and degree of compliance.
A. Cameron, R. Daniel, H. Durrant-Whyte

A Video Speed Tactile Camera

A tactile image sensor is described that applies piezoelectric films of PVDF (polyvinylidene fluoride) to map a two-dimensional pressure distribution. The sensor consists of two layers of PVDF: one layer is used to generate acoustical vibration, the other to sense it. Driven at a fixed frequency, the configuration can locally switch between resonance and non-resonance with a change in the acoustical impedance of the surrounding media. An object is detected as a blob of locally deviating acoustical impedance. The spatial distribution of acoustical impedance is resolved by line-wise driving and column-wise sensing. Application of pressure-impedance converters makes the sensor pressure sensitive. Measurements on an experimental tactile image sensor with nine elements are discussed. These experiments have shown that a tactile image sensor with a binary response can achieve a spatial resolution of at least 25 elements per cm². The acoustical frequency is sufficiently high to allow video-speed electronic read-out of the detected amplitudes.
P. W. Verbeek, P. T. A. Klaase, A. Theil

Present and Future of Tactile Sensors

In this discussion it was pointed out that there is no clear definition of “tactile”, but the word seems to encompass qualities such as pressure, temperature, shear, heat conductivity and humidity. An intuitive anthropomorphic interpretation of tactile sensing is feeling or touching.
P. W. Verbeek

Acoustic Sensors


Acoustic Range Sensing for Robotic Control

Manufacturing systems often require frequent changes in product shape or assembly positioning. Effective use of robots in production lines requires sensors that adapt the path of the end effector to the task. In addition, variations in part details may arise in many batch-process or low-volume production runs that are not incorporated in the off-line programmed assembly procedures. It is therefore desirable to incorporate the adaptive ability to follow a surface at a precisely fixed offset distance of a few inches or centimeters in real time, tracking true surface contours while still satisfying global task goals.
J. S. Schoenwald

SONAIR Ultrasonic Range Finders

Range finders, particularly SONAIRs, constitute an important part of new robot sensory systems. We review the different measuring techniques. The attained performance depends mostly on:
  • the physical properties of the air and of the ultrasonic transducers
  • the electronics and the coupled microprocessing
  • the movement of the SONAIR as it explores its surroundings.
Some of these requirements are contradictory, so a good compromise must be found. We present our choices for our field of application: the sensory equipment of a mobile robot and of a 3D synthetic vision system. To improve performance further, we conclude the paper with considerations on ideal transducers that could be produced industrially, and we present the concept of multisensory range-finding systems using the present possibilities of optics and acoustics.
S. Monchaud
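The pulse-echo principle underlying such range finders can be sketched in a few lines. The constants below (a standard linear approximation of the speed of sound in air, and an example timing) are textbook physics, not values taken from the paper; the temperature dependence is one of the "physical properties of the air" the abstract lists:

```python
def sound_speed(temp_c):
    """Approximate speed of sound in air (m/s) as a function of temperature."""
    return 331.3 + 0.606 * temp_c

def echo_range(round_trip_s, temp_c=20.0):
    """Range from pulse-echo round-trip time: the pulse travels out and back."""
    return sound_speed(temp_c) * round_trip_s / 2.0

# A round trip of 5.85 ms at 20 C corresponds to roughly one metre.
print(f"{echo_range(5.85e-3):.3f} m")
```

Uncompensated temperature changes of a few degrees already shift the range by several millimetres per metre, which is why practical SONAIRs calibrate against ambient conditions.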

Ultrasonic Imaging for Industrial Scene Analysis

An overview of the complex field of acoustical imaging is presented. In industrial scene analysis, mostly optical systems are employed; acoustical methods are inferior to optical methods with respect to resolution and data-acquisition time. However, acoustical systems can measure the distance of objects easily, thus giving a three-dimensional representation of the field of view. This is difficult to obtain by visual methods, and for a limited class of applications acoustical imaging may be complementary or even competitive with optical imaging.
H. Urban

Adaptive Ultrasonic Range-Finder for Robotics

The Instituto de Automática Industrial has carried out research in the field of sensors since its very beginning. During the last five years the work on sensors has been, and still is, going on within the Instituto's general program on Flexible Manufacturing Integrated Systems. In this field of activities the research is focused on two different topics: assessment of cutting conditions in machining processes, and perception for industrial robots. On the first topic, related to other work performed on machine tools at the Instituto, the objectives pursued are the continuous assessment of cutting and tool-wear conditions and the prediction of tool breakage. Attaining these objectives would allow unmanned machining processes by means of adaptive control of machining parameters, dynamic compensation of the machining process, and replacement of the tool at the appropriate moment.
J. M. Martín, R. Ceres, J. No, L. Calderón

Optical Sensors


TH 7864 Area Array Charge-Coupled Device (CCD) Image Sensor with Built-In Antiblooming Device

The TH 7864 is a 2/3” format area array CCD image sensor that incorporates an antiblooming system. It delivers 576 lines of 550 pixels in the CCIR TV standard.
D. Herault, G. Boucharlat

Real Time Holes Location. A Step Forward in Bin Picking Tasks

A special purpose hardware module developed to identify and locate holes in 3D is described. The system is based on the matching of the virtual points corresponding to the centers of the holes in a stereo pair. Location of the target regions on the image has been performed using two different approaches: the Radon transform and a modified version of the Hough transform.
Since the projection of a hole in the field of view onto the image plane may be either a circle or an ellipse, depending on the angle between the optical axis of the sensor and the base plane of each part in the scene, the system has been designed to identify and detect both circular and elliptical shapes.
Disparity analysis of the stereo pair is based on the determination of the coordinates on the image plane of the virtual points corresponding to the centers of the holes. The system supplies the 3D coordinates of the centers as well as the radii of the holes detected in the scene every 20 ms.
Although the system was initially conceived to solve bin-picking applications, it can be used in a great many applications, not only in industrial environments but also in mobile robot guidance, traffic control, inspection and surveillance, among others.
We present two real-time implementations, mostly based on a pipeline architecture, which allow real-time performance and are therefore suitable for robotics applications.
Antonio B. Martínez, Vicenç Llario
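As a rough illustration of the Hough-transform stage mentioned in the abstract, a minimal fixed-radius circle detector can be written as follows. The grid resolution and angle count are arbitrary choices for this sketch; the paper's special-purpose hardware pipeline (and its ellipse handling) is far more elaborate:

```python
import math
from collections import Counter

def hough_circle_centers(edge_points, radius, n_angles=64):
    """Fixed-radius Hough transform: every edge point votes for all
    candidate centres lying at `radius` from it; true centres peak."""
    votes = Counter()
    for x, y in edge_points:
        for k in range(n_angles):
            t = 2 * math.pi * k / n_angles
            cx = round(x - radius * math.cos(t))
            cy = round(y - radius * math.sin(t))
            votes[(cx, cy)] += 1
    return votes

# Synthetic edge points on a circle of radius 10 centred at (30, 40)
pts = [(30 + 10 * math.cos(a), 40 + 10 * math.sin(a))
       for a in [2 * math.pi * i / 36 for i in range(36)]]
center, count = hough_circle_centers(pts, 10).most_common(1)[0]
print(center)  # expected peak near (30, 40)
```

Each edge point spreads its votes over a circle of candidate centres, so only the true centre accumulates votes from every point; this is what makes the method robust to the partial occlusions typical of bin picking.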

Combined 2-D and 3-D Robot Vision System

The automation of assembly and of the handling of tools or workpieces can be achieved only by integrating sensory systems. Sensors have a fundamental importance in this process. Sensors used in manufacturing systems can be classified into two groups:
  • sensors obtaining internal data from the robot (e.g. joint position, angular velocity, etc.)
  • sensors obtaining external data from the environment, to detect the presence, type, orientation, surface or other characteristics of objects in order to perform different operations with them.
P. Levi, L. Vajtá

The Calibration Problem for Stereoscopic Vision

The problem of calibrating a stereo system is extremely important in practical applications. We describe in this paper our approach for coming up with an efficient and accurate solution. We first review the pinhole camera model that is used and analyze its relationship with respect to the internal camera parameters and its position in space. We then study its behavior with respect to changes of coordinate systems.
This yields a constraint which is used in the mean-square solution of the calibration problem that we propose. Since an estimation of the uncertainty is also important, we suggest another solution based on Kalman filtering.
We show a number of experimental results and compare them with those obtained by Tsai [8]. We finish with two practical applications of our calibration technique: reconstructing 3D points and computing the epipolar geometry of a stereo system.
O. D. Faugeras, G. Toscani
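The pinhole camera model reviewed in the paper can be sketched in a few lines. The intrinsic parameters below are illustrative values, not calibration results from the paper, and lens distortion is ignored:

```python
import numpy as np

# Pinhole projection: a 3D point in camera coordinates maps to pixels via
# the intrinsic matrix K = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]].
# fx, fy, cx, cy below are assumed example values.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_3d):
    """Project a 3D point (camera frame, Z > 0) to pixel coordinates."""
    p = K @ point_3d
    return p[:2] / p[2]        # perspective division by depth

print(project(np.array([0.1, -0.05, 2.0])))  # -> [360. 220.]
```

Calibration, as addressed in the paper, is the inverse task: recovering K and the camera pose from known 3D points and their measured pixel projections.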

Non Heuristic Estimation of Object Shapes from Partial Information

This paper considers the problem of reconstructing the shapes of objects from sparse measurements such as points on the boundary of an object. In most situations, the points are the end points of a curve, called a “ray”, which does not cross the objects. For example, if the sensor is an optical device, the ray is the straight line (the optical ray) joining the camera center to the point. We show that the information provided by rays is crucial when determining the shapes of objects and describe non-heuristic reconstruction methods in 2D and 3D space.
J. D. Boissonnat, O. Monga

Parameter Estimation in Signal Processing

The filtering problem encountered in signal processing implicitly uses a model of the signal generator. The simultaneous estimation of the parameters of this model and of the signal can be solved by the partitioning approach if there exist optimal, or at least sub-optimal, filters for all possible values of the unknown parameters. We develop this approach here for the case of a Gauss-Markov signal observed through a counting Poisson process, a problem that commonly arises with sensors using radioactive tracers.
J. Aguilar
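For the Gaussian-observation special case, the underlying filtering problem reduces to a standard scalar Kalman filter. The sketch below uses assumed model constants and a Gaussian measurement model; it does not implement the paper's partitioning approach for Poisson-count observations:

```python
import numpy as np

def kalman_1d(ys, a=0.95, q=0.1, r=1.0):
    """Scalar Kalman filter for x[k+1] = a*x[k] + w, observed as y = x + v.
    a, q (process noise variance) and r (measurement noise variance) are
    assumed known here; the paper estimates such parameters jointly."""
    x_hat, p = 0.0, 1.0
    estimates = []
    for y in ys:
        x_hat, p = a * x_hat, a * a * p + q   # predict
        k = p / (p + r)                       # Kalman gain
        x_hat += k * (y - x_hat)              # measurement update
        p *= 1.0 - k
        estimates.append(x_hat)
    return estimates

# Simulate a Gauss-Markov signal and noisy observations of it.
rng = np.random.default_rng(1)
x, xs, ys = 0.0, [], []
for _ in range(200):
    x = 0.95 * x + rng.normal(0.0, 0.1 ** 0.5)
    xs.append(x)
    ys.append(x + rng.normal(0.0, 1.0))

est = kalman_1d(ys)
err_raw = float(np.mean((np.array(ys) - np.array(xs)) ** 2))
err_kf = float(np.mean((np.array(est) - np.array(xs)) ** 2))
print(err_kf < err_raw)  # filtering should beat the raw observations
```

The partitioning approach of the paper runs a bank of such filters, one per candidate parameter value, and weights their outputs by the posterior probability of each parameter.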

Other Kinds of Sensors


Dynamic Weighing in a Pick-and-Place Environment

The accurate weight measurement of food products during manufacture is of great importance, as it significantly affects the amount of ‘give-away’ necessary to comply with weight legislation. Generally, in production this weight measurement is performed on a regular basis to ensure that the product is maintained at the target weight and within the target deviation. In the case considered here, the products dealt with are chocolate confectionery, but the problems faced are applicable to many production items. The manual method of on-line weight inspection uses an operator to remove the product from the production line, weigh it on an electronic balance, record the weight and return the product to the line. Some care is required in the removal and replacement of the units so that they remain ordered in complete rows.
D. G. Whitehead, I. M. Bell, D. J. Mulvaney, A. Pugh, P. Sweeting

Sensor Fusion in Certainty Grids for Mobile Robots

A numerical representation of uncertain and incomplete sensor knowledge we call Certainty Grids has been used successfully in several of our past mobile robot control programs, and has proven itself to be a powerful and efficient unifying solution for sensor fusion, motion planning, landmark identification, and many other central problems. We had good early success with ad-hoc formulas for updating grid cells with new information. A new Bayesian statistical foundation for the operations promises further improvement. We propose to build a software framework running on processors onboard our new Uranus mobile robot that will maintain a probabilistic, geometric map of the robot’s surroundings as it moves. The “certainty grid” representation will allow this map to be incrementally updated in a uniform way from various sources including sonar, stereo vision, proximity and contact sensors. The approach can correctly model the fuzziness of each reading, while at the same time combining multiple measurements to produce sharper map features, and it can deal correctly with uncertainties in the robot’s motion. The map will be used by planning programs to choose clear paths, identify locations (by correlating maps), identify well known and insufficiently sensed terrain, and perhaps identify objects by shape. The certainty grid representation can be extended in the time dimension and used to detect and track moving objects. Even the simplest versions of the idea allow us fairly straightforwardly to program the robot for tasks that have hitherto been out of reach. We look forward to a program that can explore a region and return to its starting place, using map “snapshots” from its outbound journey to find its way back, even in the presence of disturbances of its motion and occasional changes in the terrain.
H. P. Moravec
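The Bayesian cell update that replaces the earlier ad-hoc formulas is commonly implemented in log-odds form. The sketch below is a generic occupancy update for a single grid cell with an assumed 70% "occupied" sensor model, not Moravec's exact formulation:

```python
import math

def prob_to_log_odds(p):
    return math.log(p / (1.0 - p))

def log_odds_to_prob(l):
    return 1.0 / (1.0 + math.exp(-l))

# Bayesian fusion of independent readings for one grid cell: each reading's
# occupancy likelihood is added in log-odds form, so repeated evidence
# sharpens the cell's certainty.
cell = 0.0                       # log-odds 0  <=>  probability 0.5 (unknown)
for reading in [0.7, 0.7, 0.7]:  # three sonar hits, each 70% "occupied"
    cell += prob_to_log_odds(reading)
print(f"{log_odds_to_prob(cell):.3f}")  # -> 0.927
```

Three moderately confident readings already push the cell well past any single reading, which is exactly the sharpening of map features from multiple measurements that the abstract describes.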

A Six Degrees of Freedom Positional Deviation Sensor for the Teaching of Robots

A device for physically guiding a robot manipulator through its task is described. It consists of inductive, contact-free positional deviation sensors. The sensor will be used in high performance sensory control systems. The paper describes problems concerning multi-dimensional, non-linear measurement functions and the design of the servo control system.
F. Dessen, J. G. Balchen



Force Feedback Strategies and their Application to Assembly

Force sensors have great potential for improving the autonomy and flexibility of robots in assembly systems. Although force sensors are sufficiently developed to be used reliably in industrial environments, the control aspects involving force feedback are not yet fully understood. The aim of the paper is to assess and compare some potentially useful schemes involving force control.
H. Van Brussel

Application of Laser Range Finder to Robot Vision

Sensing technology plays an important role in realizing flexible robots that can perform tasks in a variety of situations. Sensors fall into two categories: sensors for local information, such as tactile sensors, and sensors for global information, such as vision sensors. The former are used effectively by actuators while they interact with the environment, while the latter are useful for higher-level planning, providing information about the global environment. The main role of robot vision is, therefore, understanding the environment of the robot.
Y. Shirai

Visual Inspection System with Qualitative Analysis Capabilities

In every production system, inspection constitutes a very relevant task. Even now, many visual inspection tasks are carried out non-automatically. Human visual inspection is characterized by its ability to detect qualitative faults easily, even very small ones, for instance when selecting wood according to its knots and grain or to faults in its surface finish. On the other hand, a person has great difficulty in appreciating, by vision alone, quantitative parameters such as precise measurements.
J. Amat, A. Casals

Predictive and Estimation Schemes in Sensor-Controlled Telerobotics

The paper discusses the problems that arise when sensor-controlled robots in space are teleoperated from ground stations. Predictive 3D computer graphics presently seems the only way to cope successfully with transmission time delays of several seconds. Appropriate estimation schemes are outlined which include models of the delay lines, the robot, moving objects, etc., and which derive the necessary corrections from sensory data as they are sent down from the spacecraft to Earth. The space robot technology experiment ROTEX, scheduled for the next Spacelab mission D2, is taken as the basis for the problem description.
G. Hirzinger

