1991 | Book

Sensor-Based Robots: Algorithms and Architectures

Editor: C. S. George Lee

Publisher: Springer Berlin Heidelberg

Book Series: NATO ASI Series

About this book

Most industrial robots today have little or no sensory capability. Feedback is limited to information about joint positions, combined with a few interlock and timing signals. These robots can function only in an environment where the objects to be manipulated are precisely located in the proper position for the robot to grasp (i.e., in a structured environment). For many present industrial applications, this level of performance has been adequate. With the increasing demand for high-performance sensor-based robot manipulators in assembly tasks, this demand and challenge can only be met through the consideration of: 1) efficient acquisition and processing of internal/external sensory information, 2) utilization and integration of sensory information from various sensors (tactile, force, and vision) to acquire knowledge in a changing environment, 3) exploitation of inherent robotic parallel algorithms and efficient VLSI architectures for robotic computations, and finally 4) system integration into a working and functioning robotic system. This is the intent of the Workshop on Sensor-Based Robots: Algorithms and Architectures - to study the fundamental research issues and problems associated with sensor-based robot manipulators and to propose approaches and solutions from various viewpoints for improving present-day robot manipulators in the areas of sensor fusion and integration, sensory information processing, and parallel algorithms and architectures for robotic computations.

Table of Contents

Frontmatter

Sensor Fusion and Integration

Frontmatter
An Integrated Sensor System for Robots
Summary
In this paper the architecture and functions of the sensor system of an autonomous mobile system are described. The sensor system supports the operation of the planning, execution, and supervision modules necessary to operate the robot. Since a multitude of vehicle concepts is available, the sensor system is explained with the help of an autonomous mobile assembly robot being developed at the University of Karlsruhe. The vehicle contains a navigator, a docking module, and an assembly planner. Driving is done with the help of cameras and sonic sensors, in connection with a road map, under the direction of the navigator. The docking maneuver is controlled by sensors and the docking supervisor. The assembly performed by the two robot arms is prepared by the planner and controlled by a hierarchy of sensors. The robot actions are planned and controlled by several expert systems.
Ulrich Rembold, Paul Levi
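A minimal sketch (not from the chapter) of how such a supervisor-and-modules organization might look in code; the module names follow the summary, but their interfaces and placeholder decisions are purely illustrative assumptions:

    # Hypothetical sketch: a supervisor selecting among sensor-driven modules.
    class Navigator:
        def step(self, camera_image, sonar_ranges, road_map):
            """Return a driving command derived from vision, sonar, and the road map."""
            return {"steer": 0.0, "speed": 0.5}      # placeholder decision

    class DockingSupervisor:
        def step(self, proximity_readings):
            """Return fine-motion commands for the sensor-controlled docking maneuver."""
            return {"advance": min(proximity_readings)}

    class AssemblyPlanner:
        def plan(self, world_model):
            """Return an ordered list of assembly operations for the two arms."""
            return ["grasp(part_a)", "mate(part_a, part_b)"]

    class Supervisor:
        """Chooses which module is in charge, based on the current mission phase."""
        def __init__(self):
            self.modules = {"drive": Navigator(),
                            "dock": DockingSupervisor(),
                            "assemble": AssemblyPlanner()}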
Robot Tactile Perception
Abstract
In this paper we discuss some fundamental issues related to the development of an artificial tactile sensing system intended for investigating robotic active touch. The analysis of some psychological and psychophysical aspects of human tactile perception, and a system design approach aimed at effectively integrating the motor and sensory functions of the robot system, suggested organizing tactile exploratory tasks conceptually into a hierarchical structure of sensory-motor acts. Our approach is to decompose complex tactile operations into elementary sensory-motor acts, which we call “TACTILE SUBROUTINES”, each aimed at the extraction of a specific feature from the explored object. This approach simplifies robot control and allows a modular implementation of the system architecture: each function can be developed independently, and new capabilities can easily be added to the system. All tactile exploratory procedures are selected and coordinated by a high-level controller, which also integrates tactile data coming from sensors and from lower levels of the hierarchy.
Some experimental results will be presented demonstrating the feasibility and usefulness of tactile sensing in exploratory operations. A recently developed sensor, which exploits force/torque information measured directly at the tip of the robot end-effector, will also be briefly presented. Besides the position of the contact point, this sensor is able to detect the normal and tangential components of the contact force. Methods for characterizing the surfaces of manipulated objects according to their hardness, texture, and friction properties will also be discussed.
G. Buttazzo, A. Bicchi, P. Dario
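A minimal sketch (not from the chapter) of the decomposition idea: elementary sensory-motor acts, each extracting one feature, dispatched and integrated by a high-level controller. The routine names, the finger interface, and the feature estimates are illustrative assumptions only:

    # Hypothetical "tactile subroutines", each extracting a single feature.
    def probe_hardness(finger):
        """Press at a point and estimate hardness from force vs. indentation depth."""
        force, depth = finger.press()            # assumed sensor interface
        return force / max(depth, 1e-6)

    def probe_texture(finger):
        """Slide along the surface and estimate texture from force-signal variance."""
        samples = finger.slide()                 # assumed sensor interface
        mean = sum(samples) / len(samples)
        return sum((s - mean) ** 2 for s in samples) / len(samples)

    SUBROUTINES = {"hardness": probe_hardness, "texture": probe_texture}

    def explore(finger, requested_features):
        """High-level controller: run the needed subroutines and integrate the
        results into one object description."""
        return {name: SUBROUTINES[name](finger) for name in requested_features}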
Uncertainty in Robot Sensing
Abstract
This paper deals with sensing uncertainty in a robot world. Sensors typically provide signals that are both incomplete and ambiguous. Three pieces of research are described which attempt effective solutions to this common problem using three different approaches. The first uses vision to demonstrate the construction and integration of a dynamic world model for mobile robot navigation. The second provides an adaptive rule-based controller for the inverted pendulum and cart problem, and the third addresses sensory integration of vision and taction for the purpose of object recognition.
The theme of the first piece of work is that the most effective solutions are obtained when maximising the amount of representational data available. The theme of the second is that broad qualitative partitioning of a state space can avoid problems of ambiguity and noise without a decrement in performance. Indeed, the use of broad qualitative partitions is shown to lead to heuristic adaptive controllers for complex dynamic systems that offer far greater flexibility than those based on classical methodologies. The final theme is that machine learning can play a powerful role in the generation of sensor-based models.
E. Grant
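A minimal sketch (not from the chapter) of what a broad qualitative partitioning of the cart-pole state space can look like: each continuous state maps to one coarse cell, and a rule table keyed by cell stores the control action. The partition boundaries and the crude adaptation step are illustrative assumptions:

    import bisect

    # Coarse, qualitative partition of the pole state (boundaries are assumed values).
    ANGLE_BOUNDS   = [-0.10, 0.0, 0.10]   # rad: four qualitative angle regions
    ANG_VEL_BOUNDS = [-0.5, 0.5]          # rad/s: three qualitative velocity regions

    def cell(angle, ang_vel):
        """Map a continuous state to a small discrete cell index."""
        return (bisect.bisect(ANGLE_BOUNDS, angle),
                bisect.bisect(ANG_VEL_BOUNDS, ang_vel))

    rules = {}                            # one push direction per qualitative cell

    def control(angle, ang_vel):
        return rules.get(cell(angle, ang_vel), +1)   # default action: push right

    def reinforce(angle, ang_vel, better_action):
        """Crude adaptation: overwrite the rule stored for the visited cell."""
        rules[cell(angle, ang_vel)] = better_action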

Vision Algorithms and Architectures

Frontmatter
Robotic Vision Knowledge System
Abstract
This article presents a robotic vision knowledge system based on some current sensor and machine intelligence methodologies, quite a large portion of which has recently been developed by the PAMI Group at the University of Waterloo.
Andrew K. C. Wong
Algorithm for Visible Surface Pattern Generation — a Tool for 3D Object Recognition
Abstract
This paper describes algorithms developed for model-based object recognition, one of the basic problems in the area of robot vision research. The important vision-oriented functions derived through these algorithms are: (i) the generation of the 3D convex hull of an object to calculate its feasible stable positions, (ii) the determination of the pattern of the visible surfaces in the orthographic projection, and finally (iii) the extraction of characteristic features invariant to object rotation. These parameters are then used in a subsequent matching phase. The feasibility of the algorithms is demonstrated on several sample objects.
J. Majumdar, P. Levi, U. Rembold
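A minimal sketch (not the authors' algorithm) of the first step only: computing the 3D convex hull of an object's vertices and reading off candidate support faces for stable positions, here using scipy for illustration. The full stability test against the projected centre of mass is omitted:

    import numpy as np
    from scipy.spatial import ConvexHull

    def candidate_rest_faces(vertices):
        """Return the convex-hull facets and their outward unit normals; each facet
        is a candidate face the object could rest on (checking that the centre of
        mass projects inside the facet is left out of this sketch)."""
        hull = ConvexHull(vertices)
        normals = hull.equations[:, :3]                              # facet normals
        normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
        return hull.simplices, normals

    # Example: a unit cube yields triangular hull facets with six distinct normals.
    cube = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)], float)
    faces, normals = candidate_rest_faces(cube)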
Knowledge-Based Robot Workstation: Supervisor Design
Abstract
There are several problems currently inhibiting the growth of automation in industry. In particular, the growing interest in the application of robots to assembly tasks is being limited by the way such tasks are programmed and executed. Current robotic assembly systems force an exact, detailed description of the task to be executed. To perform a given assembly task, the detailed actions of the robot, as well as the successive positions and orientations of the gripper, must be specified. In addition, the work environment of the robot and the state of the objects in it must be completely controlled for the robot to successfully accomplish its task. These requirements could be reduced by the use of a variety of sensors and, in this way, allow the degree of uncertainty in the environment to be increased. Nevertheless, the use of sensors alone could also make the programming phase more difficult.
Robert B. Kelley
Robot/Vision System Calibrations in Automated Assembly
Abstract
A vision guided robot for assembly is defined to be a robot/vision system that acquires robotic destination poses (location and orientation) by visual means so that the robot’s end-effector can be positioned at the desired poses. In this paper, the robot/vision system consists of a stereo-pair of CCD array cameras mounted to the end-effector of a six-axis revolute robot arm. Automated calibration methodologies for local and global work volumes of the robot/vision system are described, including a perspective error transform calibration method for cameras. Multiple component assembly and robotic fastening has been demonstrated with the developed vision guided robot.
F. G. King, G. V. Puskorius, F. Yuan, R. C. Meier, V. Jeyabalan, L. A. Feldkamp
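A minimal sketch (not the paper's perspective error transform method) of the flavor of such a calibration: given corresponding points measured by the vision system and known in the robot frame, fit a linear frame transform by least squares so that visually acquired poses can be expressed in robot coordinates. The affine model and point counts are illustrative assumptions:

    import numpy as np

    def fit_frame_transform(vision_pts, robot_pts):
        """Least-squares fit of a 3x4 affine transform T with robot ~= T @ [vision; 1],
        from N >= 4 corresponding 3D points (N x 3 arrays)."""
        vision_h = np.hstack([vision_pts, np.ones((len(vision_pts), 1))])   # N x 4
        T, *_ = np.linalg.lstsq(vision_h, robot_pts, rcond=None)            # 4 x 3
        return T.T                                                          # 3 x 4

    def to_robot_frame(T, p_vision):
        """Map one vision-frame point into the robot frame."""
        return T @ np.append(p_vision, 1.0)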

Neural Networks, Parallel Algorithms and Control Architectures

Frontmatter
A Unified Modeling of Neural Networks Architectures
Abstract
Although neural networks can ultimately be used for many applications, their suitability for a specific application depends on the acquisition/representation, performance versus training data, response time, classification accuracy, fault tolerance, generality, adaptability, computational efficiency, size, and power requirements. In order to deal with such a multiple-spectrum consideration, there is a need for a unified examination of the theoretical foundations of neural network modeling. This can lead to more effective simulation and implementation tools. For this purpose, the paper proposes a unified modeling formulation for a wide variety of artificial neural networks (ANNs): single-layer feedback networks, competitive learning networks, multilayer feed-forward networks, as well as some probabilistic models. Existing connectionist neural networks are parameterized by a nonlinear activation function, a weight measure function, a weight updating formula, back-propagation, an iteration index (for the retrieving phase), and a recursion index (for the learning phase). Based on the formulation, new models may be derived, and one such example is discussed in the paper. The formulation also leads to a basic structure for a universal simulation tool and neurocomputer architecture.
S. Y. Kung, J. N. Hwang
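A minimal sketch (not from the paper) of the unification idea: a single retrieving-phase iteration parameterized by an activation function, so that different network models drop in as parameter choices. The specific activation functions and the fixed iteration count are illustrative assumptions:

    import numpy as np

    def retrieve(W, x0, activation, steps=10):
        """Generic retrieving-phase iteration a(k+1) = f(W a(k)).
        Different choices of `activation` (and of how W was trained)
        recover different connectionist models."""
        a = x0
        for _ in range(steps):
            a = activation(W @ a)
        return a

    # Two illustrative parameter choices:
    sigmoid = lambda u: 1.0 / (1.0 + np.exp(-u))      # graded units (feedback network)
    sign    = lambda u: np.where(u >= 0, 1.0, -1.0)   # bipolar units (Hopfield-style)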
Practical Neural Computing for Robots: Prospects for Real-Time Operation
Abstract
The sudden growth of interest in neural computing is a remarkable phenomenon that will be seen by future historians of computer science as marking the 1980s in much the same way as research into artificial intelligence (AI) was the trademark of the 1970s. There is one major difference, however: in contrast with AI, which was largely an outlet for a minority of computer scientists, neural computing unites a very broad community: physicists, statisticians, parallel processing experts, optical technologists, neurophysiologists and experimental biologists. The focus of this new paradigm is rather simple. It rests on the recognition by this diverse community that the brain ‘computes’ in a very different way from the conventional computer.
I. Aleksander
Self-Organizing Neuromorphic Architecture for Manipulator Inverse Kinematics
Abstract
We describe an efficient neuromorphic formulation to accurately solve the inverse kinematics problem for redundant manipulators, thereby enabling the development of enhanced anthropomorphic capability and dexterity. Our approach involves a dynamical learning procedure based on a novel formalism in neural network theory: the concept of “terminal” attractors, which are shown to correspond to solutions of the nonlinear neural dynamics with infinite local stability. Topographically mapped terminal attractors are then used to define a neural network whose synaptic elements can rapidly capture the inverse kinematics transformations using a priori generated examples and subsequently generalize to compute the joint-space coordinates required to achieve arbitrary end-effector configurations. Unlike prior neuromorphic implementations, this technique can also systematically exploit redundancy to optimize kinematic criteria, e.g., torque optimization, manipulability, etc., and is scalable to configurations of practical interest. Simulations on 3-DOF and 7-DOF redundant manipulators are used to validate our theoretical framework and illustrate its computational efficacy.
Jacob Barhen, Sandeep Gulati
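A minimal one-dimensional sketch (not the paper's network) of the terminal-attractor idea: a cube-root relaxation law violates the Lipschitz condition at the equilibrium, so the state reaches the attractor in finite time rather than only asymptotically. The gain, step size, and overshoot guard are illustrative numerical assumptions:

    import numpy as np

    def relax_to_terminal_attractor(x, target, gain=1.0, dt=0.01, steps=2000):
        """Integrate dx/dt = -gain * cbrt(x - target) with forward Euler.
        The cube root makes the attractor 'terminal': it is reached in finite
        time with infinite local stability, unlike an exponential decay."""
        for _ in range(steps):
            step = dt * gain * np.cbrt(x - target)
            if abs(step) >= abs(x - target):   # Euler step would overshoot: snap to it
                return target
            x -= step
        return x

    print(relax_to_terminal_attractor(1.0, 0.0))   # reaches 0.0 after finitely many steps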
Robotics Vector Processor Architecture for Real-Time Control
Abstract
This paper proposes a restructurable architecture based on a VLSI Robotics Vector Processor (RVP) chip. It is specially tailored to exploit parallelism in the low-level matrix/vector operations characteristic of the kinematics and dynamics computations required for real-time control. The RVP comprises three tightly synchronized 32-bit floating-point processors to provide adequate computational power. Besides adder and multiplier units in each processor, the RVP contains a triple register file, a dual shift network, and dual high-speed input/output channels to satisfy the storage and data-movement demands of the targeted computations. Efficiently synchronized multiple-RVP configurations, which may be viewed as Variable-Very-Long-Instruction-Word (V2LIW) architectures, can be constructed and adapted to match the computational requirements of specific robotics computations. The use of the RVP is illustrated through a detailed example of the Jacobian computation, demonstrating good speedup over conventional microprocessors even with a single RVP. The RVP has been developed to be implementable on a single VLSI chip using a 1.2 μm CMOS technology, so that a single-board multiple-RVP system may be targeted for use on a mobile robot.
David E. Orin, P. Sadayappan, Y. L. C. Ling, K. W. Olson
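A minimal sketch (not tied to the RVP instruction set) of the kind of matrix/vector workload involved: the geometric Jacobian of a revolute serial arm is built column by column from small cross products and subtractions, exactly the 3-vector operations a vector processor is designed to stream. The toy kinematic parameters in the example are assumptions:

    import numpy as np

    def geometric_jacobian(joint_axes, joint_origins, p_end):
        """6xN Jacobian of an all-revolute serial arm.  Column i is
        [ z_i x (p_end - p_i) ; z_i ], with all quantities in the base frame."""
        cols = []
        for z_i, p_i in zip(joint_axes, joint_origins):
            z_i = np.asarray(z_i, float)
            lin = np.cross(z_i, np.asarray(p_end, float) - np.asarray(p_i, float))
            cols.append(np.concatenate([lin, z_i]))
        return np.column_stack(cols)

    # Toy planar 2-link example: unit links along x, both joints rotating about z.
    J = geometric_jacobian([[0, 0, 1], [0, 0, 1]],
                           [[0, 0, 0], [1, 0, 0]],
                           p_end=[2, 0, 0])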
On The Parallel Algorithms for Robotic Computations
Abstract
The kinematics, dynamics, Jacobian, and their corresponding inverses are the six major computational tasks in the real-time control of robot manipulators. Parallel algorithms for these computations are examined and analyzed. They are characterized based on six well-defined features that have the greatest effect on the execution of parallel algorithms: type of parallelism, degree of parallelism (granularity), uniformity of operations, fundamental operations, data dependency, and communication requirement. It is found that the inverse dynamics, forward dynamics, forward kinematics, and forward Jacobian computations possess highly regular properties and are all in homogeneous linear recursive form. The inverse Jacobian is essentially the problem of solving a system of linear equations. The closed-form solution of the inverse kinematics problem is clearly non-uniform and robot-dependent, whereas the iterative solution seems uniform, and its parallel portions involve the forward kinematics, forward Jacobian, and inverse Jacobian computations. Suitable algorithms for the six basic robotics computations are selected and parallelized to make use of their common features. The characterization of the six basic robotics algorithms is tabulated for discussion, and the results can be used to design better parallel architectures, or a common architecture, for the computation of these robotics algorithms.
C. S. George Lee
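A minimal sketch (not from the paper) of the iterative inverse-kinematics structure described above: each iteration evaluates the forward kinematics, the forward Jacobian, and an inverse-Jacobian step obtained by solving a linear system. The planar 2-link arm used here is an illustrative assumption:

    import numpy as np

    def fk(q, l1=1.0, l2=1.0):
        """Forward kinematics of a planar 2-link arm (illustrative example)."""
        return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                         l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

    def jacobian(q, l1=1.0, l2=1.0):
        s1, c1 = np.sin(q[0]), np.cos(q[0])
        s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
        return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                         [ l1 * c1 + l2 * c12,  l2 * c12]])

    def ik(target, q0, iters=50):
        """Newton-style IK: forward kinematics -> Jacobian -> linear solve, each pass."""
        q = np.array(q0, float)
        for _ in range(iters):
            err = target - fk(q)
            if np.linalg.norm(err) < 1e-9:
                break
            dq, *_ = np.linalg.lstsq(jacobian(q), err, rcond=None)  # inverse-Jacobian step
            q += dq
        return q

    q = ik(np.array([1.0, 1.0]), q0=[0.3, 0.3])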
Report on the Group Discussion about Neural Networks in Robotics
Abstract
The discussion developed around the following four topics: the advancements that have led to the resurgence of neural networks (NNs), the operations that NNs are best suited for, the main applications, and the key problems that need to be solved.
Carme Torras
Backmatter
Metadata
Title
Sensor-Based Robots: Algorithms and Architectures
Editor
C. S. George Lee
Copyright Year
1991
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-75530-9
Print ISBN
978-3-642-75532-3
DOI
https://doi.org/10.1007/978-3-642-75530-9