
2008 | Book

Soft Computing Applications in Industry


About this book

Soft computing techniques play a vital role in industry. This book presents several important papers by well-known scientists from all over the globe. The application domains discussed in this book include: agroecology, bioinformatics, branched fluid-transport network layout design, dam scheduling, data analysis and exploration, detection of phishing attacks, distributed terrestrial transportation, fault detection of motors, fault diagnosis of electronic circuits, fault diagnosis of power distribution systems, flood routing, hazard sensing, health care, industrial chemical processes, knowledge management in software development, local multipoint distribution systems, missing data estimation, parameter calibration of rainfall intensity models, parameter identification for systems engineering, petroleum vessel mooring, query answering in P2P systems, real-time strategy games, robot control, satellite heat pipe design, monsoon rainfall forecasting, structural design, tool condition monitoring, vehicle routing, water network design, etc.

The soft computing techniques presented in this book are based on (or closely related to): ant-colony optimization, artificial immune systems, artificial neural networks, Bayesian models, case-based reasoning, clustering techniques, differential evolution, fuzzy classification, fuzzy neural networks, genetic algorithms, harmony search, hidden Markov models, locally weighted regression analysis, probabilistic principal component analysis, relevance vector machines, self-organizing maps, other machine learning and statistical techniques, and combinations of the above.

Table of Contents

Frontmatter
Optimization of Industrial Processes Using Improved and Modified Differential Evolution
Introduction
Optimization refers to finding one or more feasible solutions that correspond to extreme values of one or more objectives. The need for finding such optimal solutions arises mostly from the desire to design a solution for the minimum possible cost of fabrication, for the maximum possible reliability, or for similar extreme objectives. Because of these extreme properties of optimal solutions, optimization methods are of great importance in practice, particularly in engineering design, scientific experiments, and business decision-making.
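For concreteness, the classical differential evolution scheme (DE/rand/1/bin) on which the chapter's improved and modified variants build can be sketched as below. This is a generic textbook sketch, not the authors' modified algorithm; the population size and the control parameters F and CR are illustrative choices.

```python
import random

random.seed(0)  # reproducible demo

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, generations=200):
    """Minimize f over a box defined by bounds using basic DE/rand/1/bin."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals other than the target.
            a, b, c = random.sample([x for j, x in enumerate(pop) if j != i], 3)
            mutant = [a[d] + F * (b[d] - c[d]) for d in range(dim)]
            # Binomial crossover between the target and the mutant vector.
            j_rand = random.randrange(dim)
            trial = [mutant[d] if (random.random() < CR or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            # Clamp to the feasible box, then apply greedy selection.
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            if f(trial) <= f(pop[i]):
                pop[i] = trial
    return min(pop, key=f)

# Usage: minimize the 2-D sphere function, whose optimum is the origin.
best = differential_evolution(lambda x: sum(v * v for v in x), [(-5, 5), (-5, 5)])
```

The greedy one-to-one selection is what makes DE robust on the nonconvex cost surfaces typical of industrial process models.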
B. V. Babu, Rakesh Angira
Noise-Robust Tool Condition Monitoring in Micro-milling with Hidden Markov Models
Introduction
Tool condition monitoring (TCM) is crucial to the efficient operation of machining processes in which the cutting tool is subject to continuous wear. In particular, in micro-machining, the tolerances, depth of cut, and even workpiece sizes are at the micro scale. Micro-machining can overcome the shortcomings of micro-fabrication techniques (such as lithography and etching), which are limited in work materials (mostly silicon) and geometric forms (2 or 2.5 dimensions) (Byrne et al. 2003; Liu et al. 2004). One very versatile micro-machining process is micro-milling. Micro-milling has advantages over other micro-machining techniques with respect to the types of workable materials and the free-form 3D micro structures with high aspect ratios and high geometric complexity. However, in micro-milling, with the miniaturisation of the cutting tool (<1 mm in diameter) and the use of high speeds (>10,000 rpm), the tool wears quickly. It is critical to monitor tool wear in micro-machining due to the high precision required. Compared to conventional machining, the noise component in the signal used for monitoring micro-machining is usually very high and difficult to separate (Tansel et al. 1998; Zhu et al. 2007). This phenomenon makes it difficult to apply TCM in micro-machining.
K. P. Zhu, Y. S. Wong, G. S. Hong
Dynamically Self-generated Fuzzy Neural Networks with Industry Applications
Introduction
Over the last few decades, fuzzy logic has been shown to be a powerful methodology for dealing with imprecision and nonlinearity efficiently. Applications can be found in a wide range of contexts, from medicine to finance, from human factors to consumer products, from vehicle control to computational linguistics, and so on (Wang 1997; Dubois and Prade 2000; Passino and Yurkovich 1998; Jang et al. 1997; Sugeno 1985; Pedrycz 1993). However, one of the shortcomings of fuzzy logic is its lack of systematic design. To circumvent this problem, fuzzy logic is usually combined with Neural Networks (NNs) by virtue of the learning capability of NNs. NNs are networks of highly interconnected neural computing elements that have the ability to respond to input stimuli and learn to adapt to the environment. Both fuzzy systems and NNs are dynamic, parallel processing systems that estimate input-output functions (Mitra and Hayashi 2000). The merits of both fuzzy and neural systems can be integrated in Fuzzy Neural Networks (FNNs) (Lee and Lee 1974, 1975; Pal and Mitra 1999; Zanchettin and Ludermir 2003). The integration of fuzzy and neural systems thus leads to a symbiotic relationship in which fuzzy systems provide a powerful framework for expert knowledge representation, while NNs provide learning capabilities.
Meng Joo Er, Yi Zhou
Kernel Estimators in Industrial Applications
Introduction
The specification, based on experimental data, of functions that characterize an object under investigation constitutes one of the main tasks in modern science and technology. A typical example is the estimation of the density function of a random variable's distribution from a given sample. Classical procedures rely on an arbitrary assumption about the form of this function, followed by the specification of its parameters. These are called parametric methods. Their valuable advantages are theoretical and computational simplicity, as well as being commonly known and well covered in the literature. Nowadays, along with the dynamic development of computer systems, nonparametric methods, whose main feature is the absence of arbitrary assumptions about the form of the density function, are used more and more often. In a probabilistic approach, kernel estimators are becoming the principal method in this area. Although their concept is relatively simple and their interpretation transparent, their application is impossible without substantial computing power, which until recently significantly hindered theoretical, and especially practical, research.
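As a concrete illustration, a one-dimensional kernel estimator with Gaussian kernels can be sketched as follows. The bandwidth is chosen here by Silverman's rule of thumb; the sample and function names are illustrative, not taken from the chapter.

```python
import math

def gaussian_kde(sample, bandwidth=None):
    """Build a nonparametric density estimate from a 1-D sample."""
    n = len(sample)
    if bandwidth is None:
        # Silverman's rule of thumb, derived for a Gaussian reference density.
        mean = sum(sample) / n
        std = math.sqrt(sum((x - mean) ** 2 for x in sample) / n)
        bandwidth = 1.06 * std * n ** (-1 / 5)
    def density(x):
        # Average of Gaussian kernels centered at the sample points.
        return sum(math.exp(-0.5 * ((x - xi) / bandwidth) ** 2)
                   for xi in sample) / (n * bandwidth * math.sqrt(2 * math.pi))
    return density

# Usage: estimate the density of a small bimodal sample.
f_hat = gaussian_kde([4.8, 5.0, 5.1, 5.3, 9.9, 10.2])
```

No form is assumed for the underlying density; the estimate is driven entirely by the data, which is exactly the nonparametric property the text describes.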
Piotr Kulczycki
Negative Selection Algorithm with Applications in Motor Fault Detection
Introduction
Natural immune systems are complex and enormous self-defense systems with remarkable capabilities of learning, memory, and adaptation (Goldsby et al. 2003). Artificial Immune Systems (AIS), inspired by natural immune systems, are an emerging class of soft computing methods (de Castro and Timmis 2002). With their distinguishing features of pattern recognition, data analysis, and machine learning, the AIS have recently gained considerable research interest from different communities (Dasgupta 2006; Dasgupta and Attoh-Okine 1997; Garrett 2005). An important constituent of the AIS, the Negative Selection Algorithm (NSA), is based on the principles of T-cell maturation and self/nonself discrimination in biological immune systems. It was developed by Forrest et al. in 1994 for the real-time detection of computer viruses (Forrest et al. 1994). During the past decade, the NSA has been widely applied in such promising engineering areas as anomaly detection (Stibor et al. 2005), network security (Dasgupta and González 2002), aircraft fault diagnosis (Dasgupta 2004), and milling tool breakage detection (Dasgupta and Forrest 1995). In this paper, we first introduce the basic principle of the NSA. Two modified NSAs, the clonal selection algorithm-optimized NSA and the neural networks-based NSA, are then introduced. Their applications in motor fault detection are also discussed.
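The basic NSA principle, with binary strings and a Hamming-distance matching rule, can be sketched as below: detectors are random candidates censored against the self set, and any detector that later matches an observed pattern signals nonself (e.g., a fault). The string length, detector count, matching radius, and self set are illustrative assumptions, not values from the chapter.

```python
import random

def hamming(a, b):
    """Number of positions at which two equal-length binary strings differ."""
    return sum(x != y for x, y in zip(a, b))

def generate_detectors(self_set, n_detectors, length, radius, seed=0):
    """Censoring phase: keep random candidates that match no self string."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        candidate = tuple(rng.randrange(2) for _ in range(length))
        if all(hamming(candidate, s) > radius for s in self_set):
            detectors.append(candidate)
    return detectors

def is_anomalous(pattern, detectors, radius):
    """Monitoring phase: a firing detector signals a nonself pattern."""
    return any(hamming(pattern, d) <= radius for d in detectors)

# Hypothetical 8-bit "healthy" signatures and a censored detector set.
self_set = [(0, 0, 0, 0, 0, 0, 0, 0), (0, 0, 0, 0, 0, 0, 0, 1), (1, 0, 0, 0, 0, 0, 0, 0)]
detectors = generate_detectors(self_set, 30, 8, 1)
```

By construction, no detector can fire on a self pattern, so false alarms on known-healthy signatures are impossible; detection coverage of nonself depends on the detector count and radius.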
X. Z. Gao, S. J. Ovaska, X. Wang
Harmony Search Applications in Industry
Introduction
In this chapter, the recently-developed music-inspired harmony search (HS) algorithm is introduced and its various industrial applications are reviewed.
The HS algorithm (Geem et al. 2001) mimics the behavior of musicians who improvise in search of an aesthetically pleasing harmony. Similarly, the optimization process seeks a vector that is superior in terms of the objective function. This is the core analogy between improvisation and optimization in the HS algorithm.
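A minimal sketch of this improvisation loop follows, using the commonly cited components of basic HS: memory consideration, pitch adjustment, and random selection, with a worst-harmony replacement rule. The parameter values (harmony memory size, HMCR, PAR, bandwidth) are illustrative, not those of any application reviewed in the chapter.

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.01, iters=2000, seed=1):
    """Minimize f with the basic harmony search improvisation loop."""
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:
                # Memory consideration: reuse a pitch from the harmony memory.
                value = rng.choice(memory)[d]
                if rng.random() < par:
                    # Pitch adjustment: nudge the value within a small bandwidth.
                    value += rng.uniform(-bw, bw) * (hi - lo)
            else:
                # Random selection: improvise an entirely new pitch.
                value = rng.uniform(lo, hi)
            new.append(min(max(value, lo), hi))
        # Replace the worst harmony in memory if the new one is better.
        worst = max(range(hms), key=lambda i: f(memory[i]))
        if f(new) < f(memory[worst]):
            memory[worst] = new
    return min(memory, key=f)

# Usage: minimize the 2-D sphere function.
best = harmony_search(lambda x: sum(v * v for v in x), [(-10, 10)] * 2)
```

Each improvised vector mixes pitches from previously played harmonies, which is the musical analogy made computational.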
Zong Woo Geem
Soft Computing in Bioinformatics: Genomic and Proteomic Applications
The Age of Bioinformatics
Bioinformatics has been described as the science of managing, mining, and interpreting information from biological sequences and structures (Li et al. 2004). The emergence of the field has been largely attributed to the increasing amount of biomedical data created and collected, and to the availability and advancement of high-throughput experimental techniques. One recent example is the advancement of ‘lab-on-a-chip’ technology (see Figure 1), which allows experimentation to be performed more rapidly and at lower cost, whilst introducing the possibility of observing new phenomena or obtaining more detailed information from biologically active systems (Whitesides 2006). Such advances enable scientists to conduct experiments that produce large amounts of experimental data over a relatively short period of time. The need to analyse such experimental data has often necessitated a similarly high-throughput approach in order to produce rapid results, employing efficient and flexible analysis methods and, in many areas, driving the need for ever-improving data analysis techniques. It is for this reason that bioinformatics draws upon fields including, but not limited to, computer science, biology (including biochemistry), mathematics, statistics, and physics.
James Malone
Evolutionary Regression and Neural Imputations of Missing Values
Introduction
While the information age has made a large amount of data available for improved industrial process planning, occasional failures lead to missing data. The missing data may make it difficult to apply analytical models. Data imputation techniques help us fill the missing data with a reasonable prediction of what the missing values would have been.
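As a minimal illustration of the imputation idea (a simple stand-in, not the evolutionary regression or neural techniques developed in the chapter), a missing value can be predicted by least-squares regression on a correlated, fully observed attribute. The data below are invented.

```python
def regression_impute(xs, ys):
    """Fill None entries in ys using a least-squares linear fit on xs."""
    pairs = [(x, y) for x, y in zip(xs, ys) if y is not None]
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    # Ordinary least squares: slope = cov(x, y) / var(x).
    sxx = sum((x - mx) ** 2 for x, _ in pairs)
    slope = sum((x - mx) * (y - my) for x, y in pairs) / sxx
    intercept = my - slope * mx
    # Keep observed values; predict the missing ones from the fitted line.
    return [y if y is not None else intercept + slope * x for x, y in zip(xs, ys)]

# Usage: the third reading is missing and is predicted from the complete series.
filled = regression_impute([1, 2, 3, 4], [2.1, 3.9, None, 8.1])
```

Observed values are never altered; only the gaps are replaced by model predictions, so downstream analytical models can run on a complete table.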
Pawan Lingras, Ming Zhong, Satish Sharma
Glowworm Swarm Optimization Algorithm for Hazard Sensing in Ubiquitous Environments Using Heterogeneous Agent Swarms
Introduction
Ubiquitous computing based environments may be defined as human surroundings that are furnished with a network of intelligent computing devices, which could be stationary, mobile, or an assortment of both, in order to service certain human-generated or human-needed tasks. Mark Weiser introduced ubiquitous computing, in its current form, in 1988 at the Computer Science Lab at Xerox PARC and wrote some of the earliest papers on the topic (Weiser 1999). Ubiquitous computing based environments have several industrial applications, such as environmental monitoring (Kim et al. 2007), ubiquitous factory environments (Jabbar et al. 2007), and self-sensing spaces (El-Zabadani et al. 2007). Kim et al. (2007) develop a framework that uses ubiquitous sensor networks for atmospheric environment monitoring. Jabbar et al. (2007) present methods that integrate the latest technologies, such as RFID, PDA, and Wi-Fi, to transform a nuclear power plant into a ubiquitous factory environment that achieves effective data communication between local-area operators and the control room, while minimizing work duration and errors in view of safety requirements. El-Zabadani et al. (2007) propose a novel approach to mapping and sensing smart spaces in which a mobile platform equipped with on-board RFID modules identifies and locates RFID tags that are embedded in the carpet in the form of a grid.
K. N. Krishnanand, D. Ghose
Self-organization in Evolution for the Solving of Distributed Terrestrial Transportation Problems
Introduction
The method presented in this chapter has its origin in adaptive meshing, using planar honeycomb structures as a tool to dimension radio-cellular networks according to mobile traffic (Créput et al. 2000, 2005; Créput and Koukam 2006). Here, the approach has been transferred and generalized to a terrestrial transportation context. The transport mesh is a geometric structure in the plane that adapts and modifies its shape according to traffic demands. By separating the transportation network from the underlying demands, the approach can, in principle, deal with noisy or incomplete data as well as with fluctuating demand. Furthermore, it naturally allows continuous visual feedback during simulations.
Jean-Charles Créput, Abderrafiaâ Koukam
Statistical Forecasting of Indian Summer Monsoon Rainfall: An Enduring Challenge
Introduction
Forecasting All India Summer Monsoon Rainfall (AISMR), one or more seasons in advance, has been an elusive goal for hydrologists, meteorologists, and astrologers alike. In spite of advances in data collection facilities, improvements in computational capabilities, and progress in our understanding of the physics of the monsoon system, our ability to forecast AISMR has remained more or less unchanged over the past several decades. On one hand, physically based numerical prediction models that are considered a panacea for daily weather forecasting have not evolved to a stage where they can realistically predict, or even simulate, annual variations in the Indian monsoon. On the other hand, statistical models that have traditionally been used for making operational forecasts have failed in forecasting extreme monsoon rainfall years. It has been suggested that, in the future, physically based models may improve to an extent where they can produce useful forecasts. Until then, however, it would be prudent to develop statistical forecast models using state-of-the-art soft computing techniques.
Shivam Tripathi, Rao S. Govindaraju
Fault Diagnosis of Electronic Circuits Using Cellular Automata Based Pattern Classifier
Introduction
This chapter formulates fault diagnosis in electronic circuits as a pattern classification problem. The proposed pattern classification scheme employs the computing model of a special class of sparse network referred to as cellular automata (CA). A particular class of CA, referred to as multiple attractor CA (MACA), has been projected as a classifier of the faulty response patterns of a circuit. A genetic algorithm (GA) is employed to synthesize the desired CA required for diagnosis of a circuit under test (CUT). The CUT is assumed to comprise a large number of circuit components partitioned into sub-circuits referred to as modules. Introduction of the GA significantly reduces the design overhead of the MACA based classifier, which supports:
  • low memory overhead for diagnosis - a reduction of one to two orders of magnitude in memory overhead has been achieved over that required for conventional fault dictionary based diagnosis schemes;
  • excellent diagnostic resolution and low diagnostic aliasing; and
  • low-cost hardware for a generic fault diagnosis machine (FDM) with a simple, regular, modular, and cascadable CA structure that is ideally suited for very large scale integration (VLSI) implementation.
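The MACA classification idea can be illustrated with a tiny linear CA over GF(2): each step applies a rule matrix T to the state vector (mod 2), repeated application settles into an attractor cycle, and a response pattern is classified by the attractor its basin leads to. The 4-cell matrix T below is a hypothetical example for illustration, not a GA-synthesized classifier from the chapter.

```python
def maca_step(state, matrix):
    """One synchronous step of a linear CA over GF(2): next = T . state (mod 2)."""
    return tuple(sum(t * s for t, s in zip(row, state)) % 2 for row in matrix)

def attractor_of(state, matrix):
    """Iterate until a state repeats; label the basin by the minimal cycle state."""
    seen = []
    while state not in seen:
        seen.append(state)
        state = maca_step(state, matrix)
    cycle = seen[seen.index(state):]
    return min(cycle)

# Hypothetical 4-cell rule matrix; states falling into the same attractor
# basin would receive the same class (i.e., the same diagnosed module).
T = [(1, 1, 0, 0),
     (0, 1, 0, 0),
     (0, 0, 1, 1),
     (0, 0, 0, 1)]
```

Because classification reduces to repeated sparse matrix-vector products over GF(2), the classifier maps naturally onto the regular, cascadable hardware structure the chapter highlights.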
Pradipta Maji, P. Pal Chaudhuri
A Survey of Artificial Neural Network-Based Modeling in Agroecology
Introduction
Agroecological systems are difficult to model because of their high complexity and their nonlinear dynamic behavior. The evolution of such systems depends on a large number of ill-defined processes that vary in time, and whose relationships are often highly nonlinear and very often unknown. According to Schultz et al. (2000), there are two major problems when modeling agroecological processes. On the one hand, there is an absence of equipment able to capture information in an accurate way; on the other hand, there is a lack of knowledge about such systems. Researchers are thus required to build models in both data-rich and data-poor situations, by integrating different sources of data, even if this data is noisy, incomplete, and imprecise.
Jiménez Daniel, Pérez-Uribe Andrés, Satizábal Héctor, Barreto Miguel, Van Damme Patrick, Tomassini Marco
Software Development Knowledge Management Using Case-Based Reasoning
Introduction
The knowledge generated in the software development process is a precious resource for companies. Usually this knowledge is not stored, which prevents its reuse in future projects. Knowledge reuse in the domain of software development has several advantages: it improves the productivity and quality of software systems (Coulange 1997, Gamma et al. 1995, Jacobson et al. 1997, Meyer 1987, Prieto-Diaz 1993); it minimizes the loss of know-how when a member of the development team leaves the company; and it enables knowledge sharing across different development teams and projects.
Paulo Gomes, Joel Cordeiro, Pedro Gandola, Nuno Seco
Learning from Demonstration and Case-Based Planning for Real-Time Strategy Games
Introduction
Artificial Intelligence (AI) techniques have been successfully applied to several computer games. However, in the vast majority of computer games, traditional AI techniques fail to play at a human level because of the characteristics of the game. Most current commercial computer games have vast search spaces in which the AI has to make decisions in real time, thus rendering traditional search-based techniques inapplicable. For that reason, game developers must invest significant effort in hand-coding specific strategies that play at a reasonable level for each new game. One of the long-term goals of our research is to develop artificial intelligence techniques that can be directly applied to such domains, alleviating the effort required by game developers to include advanced AI in their games.
Santiago Ontañón, Kinshuk Mishra, Neha Sugandh, Ashwin Ram
A CBR System: The Core of an Ambient Intelligence Health Care Application
Introduction
This paper presents a case-based reasoning system developed to generate an efficient and proactive ambient intelligence application. Ambient Intelligence (AmI) proposes a new way for people and technology to interact, in which the technology is adapted to individuals and their context (Friedewald and Da Costa 2003). The objective of Ambient Intelligence is to develop intelligent and intuitive systems and interfaces capable of recognizing and responding to users' needs in a ubiquitous way, providing capabilities for ubiquitous computation and communication, placing people at the centre of the development, and creating technologically complex environments in medical, domestic, academic, and other fields (Susperregui et al. 2004). Multi-agent systems (Wooldridge and Jennings 1995) have become increasingly relevant for developing distributed and dynamic intelligent environments. A case-based reasoning system (Aamodt and Plaza 1994) has been embedded within a deliberative agent, allowing it to respond to events, take the initiative according to its goals, communicate with other agents, interact with users, and make use of past experiences to find the best plans to achieve its goals. The deliberative agent works with the concepts of Belief, Desire, and Intention (BDI) (Bratman 1987), and has learning and adaptation capabilities that facilitate its work in dynamic environments.
Juan M. Corchado, Javier Bajo, Yanira de Paz
Soft Query-Answering Computing in P2P Systems with Epistemically Independent Peers
Introduction to P2P Systems: A Survey
Knowledge-based systems typically deal with incomplete and uncertain knowledge. Numerous extensions have been made to logic programming and deductive databases in order to handle this incompleteness and uncertainty. These extensions can broadly be characterized as non-probabilistic or probabilistic formalisms. Approximate (uncertain) information can be considered a kind of relativization of truth values for sentences. This is the standard approach used in several many-valued logics. Examples include the smallest 4-valued logic programming based on Belnap's bilattice [1], with two additional logic values ('unknown' and 'possible', for incomplete and locally inconsistent information respectively), as well as infinitary fuzzy logics, their extensions [2], and combinations [3]. In query answering from distributed databases, especially in the web framework, we often have to deal with partial information and with query algorithms that are designed for real-time applications. Hence the obtained results are generally incomplete. That is, soft computing query answering in web database applications is naturally correlated with effective non-omniscient query agents. There is no centralized control mechanism to handle the entire information in web database applications. The complete answers to such queries are practically undecidable because an enormous amount of time is required to compute them. As a result, such answering systems cannot be used in practice. Approximate algorithms are needed to obtain reasonable but incomplete answers. In addition, these algorithms need to be parametrically incremental.
Zoran Majkić, Bhanu Prasad
Power Distribution System Fault Diagnosis Using Hybrid Algorithm of Fuzzy Classification and Artificial Immune Systems
Introduction
As a vital lifeline of modern society, maintaining adequate and reliable flows of energy, power distribution systems deliver electricity from high voltage transmission circuits to customers. Any interruption in their service may cause economic loss, damage equipment, and even endanger people's lives. When a power outage (i.e., the loss of the electricity supply to an area) occurs, it is of great importance to diagnose the fault and restore the system in a timely manner in order to maintain system availability and reliability. However, the restoration process may take from tens of minutes to hours. For safety reasons, most utilities do not restore the system until the cause of the outage has been found: linemen may need to inspect the distribution lines section by section in an attempt to find evidence (e.g., burn marks on a pole for possible lightning caused faults, dead animal bodies for possible animal activity related faults) and to ensure safety prior to re-energizing the system (e.g., no downed distribution lines). Sometimes a specialized crew needs to be further dispatched for advanced tasks such as the removal of fallen trees. Effective identification of either the cause or the location of the outage can provide valuable information to expedite the restoration procedure.
Le Xu, Mo-Yuen Chow
Detection of Phishing Attacks: A Machine Learning Approach
Introduction
Phishing is a form of identity theft that occurs when a malicious Web site impersonates a legitimate one in order to acquire sensitive information such as passwords, account details, or credit card numbers. Though there are several anti-phishing software tools and techniques for detecting potential phishing attempts in emails and detecting phishing content on websites, phishers come up with new and hybrid techniques to circumvent the available software and techniques.
Ram Basnet, Srinivas Mukkamala, Andrew H. Sung
Backmatter
Metadata
Title
Soft Computing Applications in Industry
Editor
Bhanu Prasad
Copyright Year
2008
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-540-77465-5
Print ISBN
978-3-540-77464-8
DOI
https://doi.org/10.1007/978-3-540-77465-5
