
About this Book

This volume constitutes the thoroughly refereed conference proceedings of the 26th International Conference on Industrial Engineering and Other Applications of Applied Intelligence Systems, IEA/AIE 2013, held in Amsterdam, The Netherlands, in June 2013. The 71 papers included in the proceedings were carefully reviewed and selected from 185 submissions. The papers focus on the following topics: auctions and negotiation, cognitive modeling, crowd behavior modeling, distributed systems and networks, evolutionary algorithms, knowledge representation and reasoning, pattern recognition, planning, problem solving, robotics, text mining, advances in recommender systems, business process intelligence, decision support for safety-related systems, innovations in intelligent computation and applications, intelligent image and signal processing, and machine learning methods applied to manufacturing processes and production systems.



Auctions and Negotiation

Developing Online Double Auction Mechanism for Fishery Markets

In spot markets for trading fish, single-sided auctions are used to clear the market because of their promptness and simplicity, which are important in dealing with perishable goods. However, in those auctions, sellers cannot participate in the price-making process. A standard double auction market collects bids from traders and matches buyers’ higher bids with sellers’ lower bids to find the most efficient allocation, assuming that the value of unsold items remains unchanged. Nevertheless, in the spot fish market, sellers suffer a loss when they fail to sell the fish, whose salvage value is lost due to perishability. To solve this problem, we investigate a suitable design of an online double auction for fishery markets, where bids arrive dynamically with their time limits. Our market mechanism aims at improving traders’ profitability by reducing trade failures in the face of uncertainty about incoming and leaving bids. We developed a heuristic matching rule that prioritizes traders’ bids based on their time-criticality and evaluated its performance empirically.
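The standard double-auction matching that the paper extends can be sketched in a few lines: sort buy bids descending and sell bids ascending, then pair them while the buyer's bid still meets the seller's ask. The function name and data layout below are illustrative assumptions, not the authors' time-criticality rule.

```python
def match_bids(buy_bids, sell_bids):
    """Return (buy_price, sell_price) pairs forming a feasible allocation."""
    buys = sorted(buy_bids, reverse=True)   # highest willingness-to-pay first
    sells = sorted(sell_bids)               # lowest asking price first
    matches = []
    for b, s in zip(buys, sells):
        if b >= s:                          # trade is acceptable to both sides
            matches.append((b, s))
        else:
            break                           # remaining pairs cannot trade
    return matches

# Three trades clear; the 4-vs-9 pair cannot.
print(match_bids([10, 8, 7, 4], [3, 5, 6, 9]))  # [(10, 3), (8, 5), (7, 6)]
```

An online mechanism such as the one in the paper would additionally re-run matching as bids arrive and expire, rather than clearing once.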

Kazuo Miyashita

A Mediator-Based Agent Negotiation Protocol for Utilities That Change with Time

Multiple-issue negotiation has been studied extensively because most real-world negotiations involve multiple issues that are interdependent. Our work focuses on negotiations with multiple interdependent issues in which agent utility functions are complex and nonlinear. Because the issues are interdependent, it is not appropriate to negotiate over them one by one: the decision on one issue depends on decisions about previous and subsequent issues. In the literature, several negotiation protocols have been proposed: bidding-based protocols, constraint-based protocols, secure SA-based protocols, etc. However, all have assumed that utility does not change with time, whereas in reality this may not be the case. In this paper, we focus on finding and following the “Pareto front” of a utility space that changes over time. To find and follow the Pareto front effectively, we employ an evolutionary negotiation mechanism in which a mediator takes the lead in negotiation based on a GA. The experimental results show that our approach is suitable for the case where utility changes dynamically over time.

Keisuke Hara, Mikoto Okumura, Takayuki Ito

Cognitive Modeling

Learning Parameters for a Cognitive Model on Situation Awareness

Cognitive models are very useful for establishing human-like behavior in an agent. Such human-like behavior can be essential in, for instance, serious games in which humans have to learn a certain task and are faced with either automated teammates or opponents. Tailoring these cognitive models to a certain scenario can, however, be a time-consuming task requiring a lot of domain expertise. In this paper, a cognitive model is taken as a basis, and the addition of scenario-specific information is automated to a large extent. The performance of this approach of automatically adding scenario-specific information is rigorously evaluated using a case study in the domain of fighter pilots.

Richard Koopmanschap, Mark Hoogendoorn, Jan Joris Roessingh

Fostering Social Interaction of Home-Bound Elderly People: The EasyReach System

This paper presents the EasyReach system, a tool that aims at bringing the elderly and the pre-digital-divide population closer to new technologies by creating a simplified social environment that facilitates interaction, trying to allow them to (i) easily keep in contact with friends and relatives, (ii) share their lifetime expertise, and (iii) avoid isolation. The EasyReach tool creates for the elderly a special social TV channel accessed by means of their own TV set and a specialized remote control endowed with gesture recognition and video and audio capture capabilities. A hidden personal assistant reasons on user preferences in the background, allowing better focalization on the user’s social interests.

Roberto Bisiani, Davide Merico, Stefano Pinardi, Matteo Dominoni, Amedeo Cesta, Andrea Orlandini, Riccardo Rasconi, Marco Suriano, Alessandro Umbrico, Orkunt Sabuncu, Torsten Schaub, Daniela D’Aloisi, Raffaele Nicolussi, Filomena Papa, Vassilis Bouglas, Giannis Giakas, Thanassis Kavatzikidis, Silvio Bonfiglio

Cooperative Games with Incomplete Information among Secondary Base Stations in Cognitive Radio Networks

In this paper, we propose a model for coalition formation among Secondary Base Stations (SBSs) with incomplete information in cognitive radio (CR) networks. This model allows us to analyze any situation in which players are imperfectly informed about the aspects of their environment that are relevant to their decision-making. Moreover, by using the proposed method based on game theory with incomplete information, SBSs can collaborate and self-organize into disjoint independent coalitions. The simulation results show that the proposed method yields a performance advantage in terms of the average payoff per SBS of up to 145% relative to the non-cooperative case.

Jerzy Martyna

Crowd Behavior Modeling

Ant Colony Optimisation for Planning Safe Escape Routes

An emergency requiring evacuation is a chaotic event filled with uncertainties both for the people affected and rescuers. The evacuees are often left to themselves for navigation to the escape area. The chaotic situation increases when a predefined escape route is blocked by a hazard, and there is a need to re-think which escape route is safest.

This paper addresses automatically finding the safest escape route in emergency situations in large buildings or ships with imperfect knowledge of the hazards. The proposed solution, based on Ant Colony Optimisation, suggests a near optimal escape plan for every affected person — considering both dynamic spread of hazards and congestion avoidance.

The solution can be used both on an individual basis, for instance from the personal smartphone of one of the evacuees, and from a remote location by emergency personnel trying to assist large groups.
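The route-finding idea can be illustrated with a minimal Ant Colony Optimisation sketch on a small hazard-weighted graph. The graph, parameters and names below are illustrative assumptions; the paper's dynamic hazard spread and congestion avoidance are not modeled.

```python
import random

def aco_safest_path(graph, start, goal, n_ants=10, n_iter=30,
                    evaporation=0.5, seed=0):
    """Find a low-hazard path; edge weights encode hazard exposure."""
    rng = random.Random(seed)
    pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}
    best_path, best_cost = None, float("inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            node, path, cost, visited = start, [start], 0.0, {start}
            while node != goal:
                options = [(v, w) for v, w in graph[node].items()
                           if v not in visited]
                if not options:          # dead end: discard this ant
                    cost = float("inf")
                    break
                # prefer strong pheromone and low hazard
                weights = [pheromone[(node, v)] / w for v, w in options]
                v, w = rng.choices(options, weights=weights)[0]
                path.append(v)
                visited.add(v)
                cost += w
                node = v
            if cost < best_cost:
                best_path, best_cost = path, cost
        # evaporate everywhere, then reinforce the best route found so far
        for edge in pheromone:
            pheromone[edge] *= 1.0 - evaporation
        if best_path is not None:
            for u, v in zip(best_path, best_path[1:]):
                pheromone[(u, v)] += 1.0 / best_cost
    return best_path, best_cost

# Corridor graph with hazard costs: the safest route A -> B -> D costs 2.
graph = {"A": {"B": 1, "C": 4}, "B": {"D": 1}, "C": {"D": 1}, "D": {}}
print(aco_safest_path(graph, "A", "D"))
```

In an evacuation setting, the edge weights would be updated as the hazard spreads, and the pheromone trails re-converge toward routes that remain safe.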

Morten Goodwin, Ole-Christoffer Granmo, Jaziar Radianti, Parvaneh Sarshar, Sondre Glimsdal

A Spatio-temporal Probabilistic Model of Hazard and Crowd Dynamics in Disasters for Evacuation Planning

Managing the uncertainties that arise in disasters – such as ship fires – can be extremely challenging. Previous work has typically focused either on modeling crowd behavior or on hazard dynamics, targeting fully known environments. However, when a disaster strikes, uncertainty about the nature, extent and further development of the hazard is the rule rather than the exception. Additionally, crowd and hazard dynamics are both intertwined and uncertain, making evacuation planning extremely difficult. To address this challenge, we propose a novel spatio-temporal probabilistic model that integrates crowd with hazard dynamics, using a ship fire as a proof-of-concept scenario. The model is realized as a dynamic Bayesian network (DBN), supporting distinct kinds of crowd evacuation behavior – both descriptive and normative (optimal). Descriptive modeling is based on studies of physical fire models, crowd psychology models, and corresponding flow models, while we identify optimal behavior using Ant-Based Colony Optimization (ACO). Simulation results demonstrate that the DBN model allows us to track and forecast the movement of people until they escape, as the hazard develops from time step to time step. Furthermore, the ACO provides safe paths, dynamically responding to current threats.

Ole-Christoffer Granmo, Jaziar Radianti, Morten Goodwin, Julie Dugdale, Parvaneh Sarshar, Sondre Glimsdal, Jose J. Gonzalez

Predicting Human Behavior in Crowds: Cognitive Modeling versus Neural Networks

Being able to make predictions on the behavior of crowds allows for the exploration of the effectiveness of certain measures to control crowds. Taking effective measures might be crucial to avoid severe consequences in case the crowd goes out of control. Recently, a number of simulation models have been developed for crowd behavior, and the descriptive capabilities of these models have been shown. In this paper the aim is to judge the predictive capabilities of these complex models based upon real data. To this end, techniques from the domain of computational intelligence are used to find appropriate parameter settings for the model. Furthermore, a comparison is made with an alternative approach, namely to utilize neural networks for the same purpose.

Mark Hoogendoorn

Distributed Systems and Networks

Analyzing Grid Log Data with Affinity Propagation

In this paper we present an unsupervised learning approach to detect meaningful job traffic patterns in Grid log data. Manual anomaly detection on modern Grid environments is troublesome given their increasing complexity, the distributed, dynamic topology of the network and heterogeneity of the jobs being executed. The ability to automatically detect meaningful events with little or no human intervention is therefore desirable. We evaluate our method on a set of log data collected on the Grid. Since we lack a priori knowledge of patterns that can be detected and no labelled data is available, an unsupervised learning method is followed. We cluster jobs executed on the Grid using Affinity Propagation. We try to explain discovered clusters using representative features and we label them with the help of domain experts. Finally, as a further validation step, we construct a classifier for five of the detected clusters and we use it to predict the termination status of unseen jobs.

Gabriele Modena, Maarten van Someren

Using Ising Model to Study Distributed Systems Processing Capability

Quality of service in distributed systems must be preserved throughout all stages of the life cycle. The stage in which this property is critical is capacity planning. Because capacity planning relies on estimates, the traditional approach is based on queueing models for capacity calculation. This paper proposes the use of the traditional Ising model for capacity studies.
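The Ising-model approach can be sketched with a minimal Metropolis simulation. The lattice size, temperature and the mapping of spins to busy/idle processing units are illustrative assumptions, not the paper's actual capacity model.

```python
import math
import random

def energy(grid):
    """Total energy with coupling J = 1 and periodic boundaries (each bond counted once)."""
    L = len(grid)
    return -sum(grid[i][j] * (grid[(i + 1) % L][j] + grid[i][(j + 1) % L])
                for i in range(L) for j in range(L))

def metropolis(L=10, temperature=1.5, steps=20000, seed=0):
    """Sample spin configurations of the 2D Ising model via Metropolis updates."""
    rng = random.Random(seed)
    grid = [[rng.choice([-1, 1]) for _ in range(L)] for _ in range(L)]
    for _ in range(steps):
        i, j = rng.randrange(L), rng.randrange(L)
        neighbours = (grid[(i + 1) % L][j] + grid[(i - 1) % L][j]
                      + grid[i][(j + 1) % L] + grid[i][(j - 1) % L])
        dE = 2 * grid[i][j] * neighbours   # energy change if spin (i, j) flips
        if dE <= 0 or rng.random() < math.exp(-dE / temperature):
            grid[i][j] *= -1               # accept the flip
    return grid

grid = metropolis()
print(energy(grid))   # typically well below 0: spins align at low temperature
```

In a capacity-study analogy, aligned spins would correspond to correlated load across subsystems, and temperature to the level of workload randomness.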

Facundo Caram, Araceli Proto, Hernán Merlino, Ramón García-Martínez

Evolutionary Algorithms

Computing the Consensus Permutation in Mallows Distribution by Using Genetic Algorithms

We propose the use of a genetic algorithm to solve the rank aggregation problem, which consists in, given a dataset of rankings (or permutations) of n objects, finding the ranking which best represents such a dataset. Though different probabilistic models have been proposed to tackle this problem (see e.g. [12]), the so-called Mallows model is the one that has received the most attention [1]. Exact computation of the parameters of this model is an NP-hard problem [19], which justifies the use of metaheuristic algorithms for its resolution. In particular, we propose a genetic algorithm for solving this problem and show that, in most cases (especially in the most complex ones), we obtain statistically significantly better results than those of the state-of-the-art algorithms.
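A natural fitness for a candidate consensus permutation in this setting is its total Kendall tau distance to the dataset, the quantity the Mallows model is built around. The sketch below is illustrative and omits the GA loop and the Mallows parameter estimation.

```python
from itertools import combinations

def kendall_tau(p, q):
    """Number of object pairs that rankings p and q order differently."""
    pos_p = {x: i for i, x in enumerate(p)}
    pos_q = {x: i for i, x in enumerate(q)}
    return sum(1 for a, b in combinations(p, 2)
               if (pos_p[a] - pos_p[b]) * (pos_q[a] - pos_q[b]) < 0)

def consensus_fitness(candidate, dataset):
    """Lower is better: total disagreement with all observed rankings."""
    return sum(kendall_tau(candidate, r) for r in dataset)

data = [(0, 1, 2, 3), (0, 2, 1, 3), (1, 0, 2, 3)]
print(consensus_fitness((0, 1, 2, 3), data))  # 2: one swapped pair per deviating ranking
```

A GA over permutations would minimize this fitness using order-preserving crossover and swap mutations.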

Juan A. Aledo, Jose A. Gámez, David Molina

A Hybrid Genetic Algorithm for the Maximum Satisfiability Problem

The satisfiability problem was the first problem proved to be NP-complete and has been one of the core NP-complete problems since then. It has applications in many fields, such as artificial intelligence, circuit design and VLSI testing. The maximum satisfiability problem is an optimization version of the satisfiability problem, which is a decision problem. In this study, a hybrid genetic algorithm is proposed for the maximum satisfiability problem. The proposed algorithm has three characteristics: 1. A new fitness function is designed to guide the search more effectively; 2. A local search scheme is designed, in which a restart mechanism is devised to help the local search escape from solutions near an already searched local optimum; 3. The local search scheme is hybridized with a two-layered genetic algorithm. We compared the proposed algorithm with other algorithms published in the literature, on the benchmarks offered by Gottlieb, Marchiori and Rossi [12] and by Hao, Lardeux and Saubion [18].
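The baseline MAX-SAT objective that any such fitness function builds on is simply the number of satisfied clauses. The clause encoding below is DIMACS-style and the example formula is invented; the paper's refined fitness and two-layered GA are not reproduced.

```python
def satisfied_clauses(clauses, assignment):
    """clauses: list of clauses, each a list of signed 1-based literals;
    assignment: list of booleans, assignment[i] is the value of variable i+1."""
    count = 0
    for clause in clauses:
        # a clause holds if any literal evaluates to True
        if any((lit > 0) == assignment[abs(lit) - 1] for lit in clause):
            count += 1
    return count

# (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
cnf = [[1, -2], [2, 3], [-1, -3]]
print(satisfied_clauses(cnf, [True, True, False]))  # 3: all clauses hold
```

A GA individual is then just such a boolean assignment, scored by this count (plus whatever refinements the fitness function adds).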

Lin-Yu Tseng, Yo-An Lin

Knowledge Representation and Reasoning

An Object Representation Model Based on the Mechanism of Visual Perception

In the areas of artificial intelligence and computer vision, object representation and recognition is an important topic, and many methods have been developed for it. However, analyzing and obtaining knowledge of an object’s structure at higher levels is still very difficult. We draw on the pattern recognition theories of cognitive psychology to construct a compact, abstract and symbolic model of an object’s contour based on component fitting, which is more consistent with the human cognition process. In addition, we design an algorithm to match components between different images of the same object in order to make feature extraction at higher levels possible.

Hui Wei, Ziyan Wang

Web Metadata Extraction and Semantic Indexing for Learning Objects Extraction

In this work, a new approach to automatic metadata extraction and semantic indexing for educational purposes is proposed to identify learning objects that may assist educators in preparing pedagogical materials from the Web. The model combines natural language processing techniques and machine learning methods to deal with semi-structured information on the web from which metadata are extracted. Experiments show the promise of the approach to effectively extract metadata from web resources containing educational materials.

John Atkinson, Andrea Gonzalez, Mauricio Munoz, Hernan Astudillo

Pattern Recognition

Approximately Recurring Motif Discovery Using Shift Density Estimation

Approximately Recurring Motif (ARM) discovery is the problem of finding unknown patterns that appear frequently in real-valued time series. In this paper, we propose a novel algorithm for this problem that achieves accuracy comparable with the most accurate available algorithms at a speed comparable to the fastest ones. The main idea behind the proposed algorithm is to convert ARM discovery into a density estimation problem in a single-dimensional shift space (rather than in the original time-series space). This makes the algorithm more robust to short noise bursts that can dramatically affect the performance of most available algorithms. The paper also reports the results of applying the proposed algorithm to synthetic and real-world datasets.

Yasser Mohammad, Toyoaki Nishida

An Online Anomalous Time Series Detection Algorithm for Univariate Data Streams

We address the online anomalous time series detection problem among a set of series, combining three simple distance measures. This approach, akin to control charts, makes it easy to determine when a series begins to differ from other series. Empirical evidence shows that this novel online anomalous time series detection algorithm performs very well, while being efficient in terms of time complexity, when compared to approaches previously discussed in the literature.

Huaming Huang, Kishan Mehrotra, Chilukuri K. Mohan

Assisting Web Site Navigation through Web Usage Patterns

Extracting patterns from Web usage data helps to facilitate better Web personalization and Web structure readjustment. There are a number of different approaches proposed for Web Usage Mining, such as Markov models and their variations, or models based on pattern recognition techniques such as sequence mining. This paper describes a new framework, which combines clustering of users’ sessions together with a novel algorithm, called PathSearch-BF, in order to construct smart access paths that will be presented to the Web users for assisting them during their navigation in the Web sites. Through experimental evaluation on well-known datasets, we show that the proposed methodology can achieve valuable results.

Oznur Kirmemis Alkan, Pinar Karagoz

Planning

GRASPing Examination Board Assignments for University-Entrance Exams

In the sequel, we tackle a real-world problem: forming examination boards for the university-entrance exams at the autonomous region of Asturias (Spain). We formulate the problem, propose a heuristic GRASP-like method to solve it and show some experimental results to support our proposal.

Jorge Puente-Peinador, Camino R. Vela, Inés González-Rodríguez, Juan José Palacios, Luis J. Rodríguez

A Greedy Look-Ahead Heuristic for the Container Relocation Problem

This paper addresses a classic problem in container storage and transportation, the container relocation problem. Given a layout of a container bay in which the containers must be retrieved in a predefined order, the goal is to determine the operation plan with the fewest crane operations. We develop a greedy look-ahead heuristic for this purpose. Compared with existing approaches from the literature, our heuristic improves on the best known solutions in shorter computational time.
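A common building block in relocation heuristics, usable as a lower bound or greedy score, is counting the blocking containers in a bay: containers that sit above one that must be retrieved earlier. The encoding below is an illustrative assumption, not the authors' exact rule.

```python
def blocking_count(bay):
    """bay: list of stacks (bottom first); smaller number = retrieved earlier.
    Returns the number of containers that will force at least one relocation."""
    blocking = 0
    for stack in bay:
        for height, c in enumerate(stack):
            # c is blocking if some container below it must leave earlier
            if any(below < c for below in stack[:height]):
                blocking += 1
    return blocking

# Stack [3, 1]: container 1 leaves first and sits on top, so nothing blocks.
# Stack [2, 5, 4]: both 5 and 4 sit above 2, which leaves before them.
print(blocking_count([[3, 1], [2, 5, 4]]))  # 2
```

Each blocking container requires at least one crane move beyond its retrieval, so this count lower-bounds the relocations a look-ahead search still has to pay for.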

Bo Jin, Andrew Lim, Wenbin Zhu

Integrating Planning and Scheduling in the ISS Fluid Science Laboratory Domain

This paper describes a Planning and Scheduling Service (PSS) to support the Increment Planning Process for International Space Station (ISS) payload management. The PSS is described while targeting the planning of experiments in the Fluid Science Laboratory (FSL), an ISS facility managed by the Telespazio User Support and Operation Centre (T-USOC) that has been identified as a representative case study due to its complexity. The timeline-based approach inside the PSS is evaluated against realistic planning benchmark problems.

Amedeo Cesta, Riccardo De Benedictis, Andrea Orlandini, Riccardo Rasconi, Luigi Carotenuto, Antonio Ceriello

Efficient Identification of Energy-Optimal Switching and Operating Sequences for Modular Factory Automation Systems

In order to enable energy-efficient operation of factory automation systems during non-productive (idling) phases, the energy-optimal sequence of operating modes has to be calculated. Due to modular structures and runtime constraints, the combinatorial optimization problems that have to be solved to calculate energy-minimizing schedules for today’s automation systems become extremely complex. In this paper, a novel domain-specific branch-and-bound algorithm is proposed that takes structural knowledge about the automation system into account in order to attenuate the exponential complexity. Relying on a network of automation subsystems representing the switching and energetic operating behavior of the automation system, the method uses the minimal energy demand for unrelated subsystems, which can be efficiently calculated using off-the-shelf constraint optimization techniques, as lower bounds for the energy demand of the complete automation system. Evaluations indicate that the proposed procedure outperforms complete enumeration in terms of computational time by more than 70 percent while still identifying the energy-optimal switching and operating sequences.

Sebastian Mechs, Steffen Lamparter, Jörn Peschke, Jörg P. Müller

Problem Solving

The Two-Dimensional Vector Packing Problem with Courier Cost Structure

The two-dimensional vector packing problem with courier cost structure is a practical problem faced by many manufacturers that ship products using courier service. The manufacturer must ship a number of items using standard-sized cartons, where the cost of a carton quoted by the courier is determined by a piecewise linear function of its weight. The cost function is not necessarily convex or concave. The objective is to pack all items into cartons such that the total delivery cost is minimized while observing both the weight limit and volume capacity constraints. In this study, we investigate solution methods to this problem.
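The piecewise-linear courier cost structure can be evaluated as sketched below. The brackets, base charges and rates are invented for illustration; as the abstract notes, such a function need not be convex or concave.

```python
import bisect

def carton_cost(weight, breakpoints, base, rate):
    """Cost of one carton: base[i] + rate[i] * (weight - breakpoints[i])
    in the weight bracket i that contains `weight`."""
    i = bisect.bisect_right(breakpoints, weight) - 1
    return base[i] + rate[i] * (weight - breakpoints[i])

# Brackets: [0, 10) cheap, [10, 20) with a flat-fee jump, [20, inf) discounted rate.
breakpoints = [0.0, 10.0, 20.0]
base = [5.0, 12.0, 18.0]
rate = [0.5, 0.6, 0.3]
print(carton_cost(15.0, breakpoints, base, rate))  # 12 + 0.6 * 5 = 15.0
```

Because the base charges can jump at breakpoints, a slightly heavier carton can be cheaper per unit of weight, which is exactly what makes the packing objective non-convex.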

Qian Hu, Andrew Lim, Wenbin Zhu

Penguins Search Optimization Algorithm (PeSOA)

In this paper we propose a new meta-heuristic algorithm called the Penguins Search Optimization Algorithm (PeSOA), based on the collaborative hunting strategy of penguins. In recent years, various effective methods inspired by nature and based on cooperative strategies have been proposed to solve NP-hard problems for which no polynomial-time solutions are known. The global optimization process starts with the individual search of each penguin, which must communicate its position and the number of fish found to its group. This collaboration aims to synchronize dives in order to reach a global solution (a place with large amounts of food). The global solution is chosen by electing the best group of penguins, the one that ate the most fish. After describing the behavior of penguins, we present the formulation of the algorithm before presenting various tests on popular benchmarks. Comparative studies with other meta-heuristics show that PeSOA performs better thanks to its new strategy of collaborative and progressive search of the solution space.

Youcef Gheraibia, Abdelouahab Moussaoui

A Bidirectional Building Approach for the 2D Guillotine Knapsack Packing Problem

We investigate the 2D guillotine knapsack packing problem, where the objective is to select and pack a set of rectangles into a sheet of fixed size and maximize the total profit of the packed rectangles. We combine two well-known methods, the top-down approach and the bottom-up approach, into a coherent algorithm to address this problem. Computational experiments on benchmark test sets show that our approach finds optimal solutions for almost all instances of moderate size and outperforms all existing approaches on the larger instances.

Lijun Wei, Andrew Lim, Wenbin Zhu

Increasing the Antibandwidth of Sparse Matrices by a Genetic Algorithm

The antibandwidth problem consists in finding a labeling of the vertices of a given undirected graph such that among all adjacent node pairs, the minimum difference between the node labels is maximized. In this paper, we formulate the antibandwidth problem in terms of matrices and propose an efficient genetic algorithm based heuristic approach for increasing the corresponding antibandwidth. We report computational results for a set of 30 benchmark instances. The preliminary results point out that our approach is an attractive and appropriate method to explore the solution space of this complex problem and leads to good solutions in reasonable computational times.
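The objective the GA maximizes can be computed directly from the definition above: the minimum absolute label difference over adjacent vertex pairs. The graphs and labelings below are illustrative.

```python
def antibandwidth(edges, labels):
    """Minimum |label(u) - label(v)| over all edges; the GA maximizes this."""
    return min(abs(labels[u] - labels[v]) for u, v in edges)

# Path graph 0-1-2-3: interleaving low and high labels spreads neighbours apart.
path_edges = [(0, 1), (1, 2), (2, 3)]
print(antibandwidth(path_edges, {0: 2, 1: 4, 2: 1, 3: 3}))  # 2
print(antibandwidth(path_edges, {0: 1, 1: 2, 2: 3, 3: 4}))  # 1: consecutive labels are adjacent
```

A GA individual is a permutation of labels over the vertices; this min-over-edges score is what the matrix formulation in the paper evaluates.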

Petrica C. Pop, Oliviu Matei

A New GA-Based Method for Temporal Constraint Problems

Managing numeric and symbolic temporal information is very relevant for a wide variety of applications including scheduling, planning, temporal databases, manufacturing and natural language processing. Often these applications are represented and managed with the well known constraint-based formalism called the Constraint Satisfaction Problem (CSP). We then talk about temporal CSPs where constraints represent qualitative or quantitative temporal information. Like CSPs, temporal CSPs are NP-hard problems and are traditionally solved with a backtrack search algorithm together with constraint propagation techniques. This method has however some limitations especially for large size problems. In order to overcome this difficulty in practice, we investigate the possibility of solving these problems using Genetic Algorithms (GAs). We propose a novel crossover specifically designed for solving TCSPs using GAs. In order to assess the performance of our proposed crossover over the well known heuristic based GAs, we conducted several experiments on randomly generated temporal CSP instances. In addition, we evaluated the performance of an integration of our crossover within a Parallel GA (PGA) approach. The test results clearly show that the proposed crossover outperforms the known GA methods for all the tests in terms of success rate and time needed to reach the solution. Moreover, when integrated within the PGA, our crossover is very efficient for solving very large size hard temporal CSPs.

Reza Abbasian, Malek Mouhoub

On Using the Theory of Regular Functions to Prove the ε-Optimality of the Continuous Pursuit Learning Automaton

There are various families of Learning Automata (LA) such as Fixed Structure, Variable Structure, Discretized etc. Informally, if the environment is stationary, their ε-optimality is defined as their ability to converge to the optimal action with an arbitrarily large probability, if the learning parameter is sufficiently small/large. Of these LA families, Estimator Algorithms (EAs) are certainly the fastest, and within this family, the set of Pursuit algorithms have been considered to be the pioneering schemes. The existing proofs of the ε-optimality of all the reported EAs follow the same fundamental principles. Recently, it has been reported that the previous proofs for the ε-optimality of all the reported EAs have a common flaw. In other words, people have worked with this flawed reasoning for almost three decades. The flaw lies in the condition which apparently supports the so-called “monotonicity” property of the probability of selecting the optimal action, explained in the paper. In this paper, we provide a new method to prove the ε-optimality of the Continuous Pursuit Algorithm (CPA), which was the pioneering EA. The new proof follows the same outline as the previous proofs, but instead of examining the monotonicity property of the action probabilities, it rather examines their submartingale property, and then, unlike the traditional approach, invokes the theory of Regular functions to prove the ε-optimality. We believe that the proof is both unique and pioneering, and that it can form the basis for formally demonstrating the ε-optimality of other EAs.
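The Continuous Pursuit Algorithm the abstract analyzes can be sketched compactly: keep running reward estimates for each action, and at every step move the action-probability vector a small step toward the unit vector of the currently best-estimated action. The Bernoulli environment, initialization and parameter values below are illustrative assumptions.

```python
import random

def continuous_pursuit(reward_probs, steps=5000, lam=0.01, seed=1):
    """Pursuit learning over r actions in a stationary Bernoulli environment."""
    rng = random.Random(seed)
    r = len(reward_probs)
    p = [1.0 / r] * r                     # action-probability vector
    pulls, rewards = [0] * r, [0.0] * r
    # seed the reward estimates by sampling every action a few times
    for a in range(r):
        for _ in range(20):
            pulls[a] += 1
            rewards[a] += rng.random() < reward_probs[a]
    for _ in range(steps):
        a = rng.choices(range(r), weights=p)[0]
        pulls[a] += 1
        rewards[a] += rng.random() < reward_probs[a]
        best = max(range(r), key=lambda i: rewards[i] / pulls[i])
        # pursuit update: p <- (1 - lam) * p + lam * e_best
        p = [(1 - lam) * v + (lam if i == best else 0.0)
             for i, v in enumerate(p)]
    return p

p = continuous_pursuit([0.2, 0.8, 0.4])
print(p)   # probability mass typically concentrates on action 1 (the 0.8 arm)
```

The quantity the new proof studies is precisely the probability of the optimal action under this update, shown to be a submartingale rather than monotonically increasing.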

Xuan Zhang, Ole-Christoffer Granmo, B. John Oommen, Lei Jiao

Robotics

Online Exploratory Behavior Acquisition of Mobile Robot Based on Reinforcement Learning

In this study, we propose an online active perception system that autonomously acquires exploratory behaviors suited to the embodiment of each mobile robot using online learning. We especially focus on a type of exploratory behavior that extracts object features useful for the robot’s orientation and object manipulation. The proposed system is composed of a classification system and a reinforcement learning system. While a robot is interacting with objects, the classification system classifies observed data and calculates reward values according to the cluster distance of the observed data. The reinforcement learning system, in turn, acquires exploratory behaviors useful for classification according to the reward. We validated the effectiveness of the system in a mobile robot simulation. Three differently shaped objects were placed beside the robot one by one. During this learning, the robot learned different behaviors corresponding to each object. The result showed that these were exploratory behaviors that distinguish the difference in corner angles of the objects.

Manabu Gouko, Yuichi Kobayashi, Chyon Hae Kim

Improved Sound Source Localization and Front-Back Disambiguation for Humanoid Robots with Two Ears

An improved sound source localization (SSL) method has been developed that is based on the generalized cross-correlation (GCC) method weighted by the phase transform (PHAT) for use with humanoid robots equipped with two microphones inside artificial pinnae. The conventional SSL method based on the GCC-PHAT method has two main problems when used on a humanoid robot platform: 1) diffraction of sound waves with multipath interference caused by the shape of the robot head and 2) front-back ambiguity. The diffraction problem was overcome by incorporating a new time delay factor into the GCC-PHAT method under the assumption of a spherical robot head. The ambiguity problem was overcome by utilizing the amplification effect of the pinnae for localization over the entire azimuth. Experiments conducted using a humanoid robot showed that localization errors were reduced by 9.9° on average with the improved method and that the success rate for front-back disambiguation was 32.2% better on average over the entire azimuth than with a conventional HRTF-based method.
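The delay-estimation core of the GCC-PHAT method can be sketched in a few lines: whiten the cross-power spectrum so the inverse transform peaks sharply at the inter-microphone delay. The pure-Python DFT, signal size and circular-shift test are illustrative; the paper's spherical-head time delay factor and pinna-based front-back disambiguation are omitted.

```python
import cmath
import random

def dft(x, inverse=False):
    """Naive O(N^2) discrete Fourier transform, adequate for short signals."""
    N = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[n] * cmath.exp(sign * 2j * cmath.pi * k * n / N)
               for n in range(N)) for k in range(N)]
    return [v / N for v in out] if inverse else out

def gcc_phat_delay(x, y):
    """Estimated delay (in samples, modulo N) of y relative to x."""
    X, Y = dft(x), dft(y)
    g = [Xk.conjugate() * Yk for Xk, Yk in zip(X, Y)]
    g = [v / (abs(v) + 1e-12) for v in g]        # PHAT: keep phase, drop magnitude
    cc = dft(g, inverse=True)                    # generalized cross-correlation
    return max(range(len(cc)), key=lambda n: cc[n].real)

rng = random.Random(0)
x = [rng.gauss(0, 1) for _ in range(64)]
y = x[-5:] + x[:-5]                  # y is x circularly delayed by 5 samples
print(gcc_phat_delay(x, y))          # 5
```

On a robot, the recovered delay between the two ear microphones is converted to an azimuth angle; the paper's contribution adjusts that conversion for diffraction around the head.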

Ui-Hyun Kim, Kazuhiro Nakadai, Hiroshi G. Okuno

Designing and Optimization of Omni-Directional Kick for Bipedal Robots

The paper presents the design and optimization of key-frame based kick skills for bipedal robots. The kicks, evolved via evolutionary algorithms, allow a humanoid robot to kick straight, sideways, backward and in angular directions. Experiments are conducted on the simulated model of the Nao robot used in the RoboCup Soccer 3D Simulation league. The initial sets of kicks were manually designed by human experts and passed as seed values to the optimization process. Correctness of the kick direction and the distance covered by the ball were used as the fitness criteria. The findings of the paper not only significantly improve the capability of our RoboCup Soccer 3D team but also provide insight into the design and optimization of key-frame based kicks that can be utilized by other teams participating in bipedal soccer.

Syed Ali Raza, Sajjad Haider

Aerial Service Vehicles for Industrial Inspection: Task Decomposition and Plan Execution

We propose an autonomous control system for Aerial Service Vehicles capable of performing inspection tasks in buildings and industrial plants. In this paper, we present the applicative domain, the high-level control architecture along with some empirical results. The system has been assessed on real-world and simulated scenarios representing an industrial environment.

Jonathan Cacace, Alberto Finzi, Vincenzo Lippiello, Giuseppe Loianno, Dario Sanzone

A BSO-Based Algorithm for Multi-robot and Multi-target Search

Swarm robots are used in robotic applications where it is difficult or impossible for a single robot to accomplish a task. In this paper, we study the multi-robot, multi-target search problem in an unknown environment. Our goal is to use a group of distributed cooperative mobile robots to find the position of the object emitting the strongest radio-frequency intensity in the environment. We propose a novel algorithm based on Bee Swarm Optimization (BSO) which is able to find the object automatically. Our experimental results, simulated on a set of random benchmarks, show that the algorithm outperforms state-of-the-art techniques, in particular Particle Swarm Optimization (PSO). We show that our algorithm can be 50.6% more effective for this application in comparison to PSO.

Hannaneh Najd Ataei, Koorush Ziarati, Mohammad Eghtesad

Text Mining

Economic Sentiment: Text-Based Prediction of Stock Price Movements with Machine Learning and WordNet

This paper explores the use of machine learning techniques in classifying financial news for the purpose of predicting stock price movements. The current body of literature on the subject is small, and the reported results are mixed. During the course of this paper we attempt to identify some causes for the divergent results, and devise experiments that account for weaknesses in existing research. A corpus of Thomson Reuter newswires was collected from Dow Jones’ Factiva for seven large stocks. Each article was then linked with the associated price gap of the trading day following the article’s publish date. Utilizing a sequential minimal optimization based support vector machine along with a WordNet-transformed bag-of-words representation, predictions were made in the form of long and short signals. Another variant of the system was also evaluated, wherein Latent Semantic Analysis was employed to process the input data. The signals were conditioned on a set of thresholds, meaning that trade signals were only generated when the predicted values exceeded certain threshold values. Higher thresholds were associated with higher accuracy but a lower number of trading signals. Overall the results were promising.
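The threshold conditioning of trade signals described above can be sketched directly: a predicted score only becomes a signal when it clears a threshold, and the system abstains otherwise. The function name and values are illustrative, not the authors' pipeline.

```python
def to_signal(score, threshold):
    """Map a predicted price-movement score to a trade signal, or abstain."""
    if score >= threshold:
        return "long"
    if score <= -threshold:
        return "short"
    return None                      # abstain: prediction not confident enough

scores = [0.9, 0.2, -0.7, -0.1]
print([to_signal(s, 0.5) for s in scores])  # ['long', None, 'short', None]
```

Raising the threshold trades signal volume for accuracy, which is exactly the accuracy-versus-coverage trade-off the abstract reports.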

Arne Thorvald Gierløff Hollum, Borre P. Mosch, Zoltán Szlávik
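The thresholding step described above is mechanical and can be sketched directly; the score scale and threshold value here are illustrative assumptions:

```python
def trade_signals(scores, threshold=0.6):
    """Map predicted scores in [-1, 1] to long (+1), short (-1) or
    no-trade (0). A signal fires only when |score| exceeds the threshold,
    trading coverage for accuracy as described in the abstract."""
    out = []
    for s in scores:
        if s >= threshold:
            out.append(1)
        elif s <= -threshold:
            out.append(-1)
        else:
            out.append(0)
    return out
```

Raising the threshold suppresses borderline predictions, which is why the paper observes higher accuracy but fewer trading signals.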

Feature Extraction Using Single Variable Classifiers for Binary Text Classification

The most popular approach for document representation is the bag-of-words, where terms are considered as features. In order to compute the values of these features, the term frequencies are generally scaled by a collection frequency factor to take into account the relative importance of different terms. The term frequencies can be considered as raw data about the input document. In this study, a novel framework for feature extraction is proposed for binary text classification, where feature extraction is defined as a single-variable classification problem. The term frequencies are the inputs, and the output of each classifier is used to define a triple of features for the corresponding term. The magnitude of the classifier output, which lies in the interval [0.5, 1], is an indicator of the classifier's confidence, and it is employed in document representation together with the term frequency and the collection frequency factor.

Hakan Altınçay
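A sketch of the feature triple for one term, assuming a single-variable logistic model as the per-term classifier (the abstract does not fix the classifier type, and the weights `w` and `b` below are hypothetical):

```python
import math

def term_features(tf, idf, w, b):
    """Build the three features for one term: the term frequency, the
    collection-frequency-scaled weight, and the confidence of a
    (hypothetical) single-variable logistic classifier trained on tf.
    The confidence is the classifier output folded into [0.5, 1]."""
    p = 1.0 / (1.0 + math.exp(-(w * tf + b)))  # single-variable classifier
    confidence = max(p, 1.0 - p)               # magnitude in [0.5, 1]
    return (tf, tf * idf, confidence)
```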

An Approach to Automated Learning of Conceptual Graphs from Text

Many document collections are private and accessible only by selected people. Especially in business settings, such collections need to be managed, and the use of an external taxonomic or ontological resource would be very useful. Unfortunately, domain-specific resources are very often not available, and the development of techniques that do not rely on external resources becomes essential. Automated learning of conceptual graphs from restricted collections needs to be robust with respect to missing or partial knowledge, which does not allow a full conceptual graph to be extracted and provides only sparse fragments of it. This work proposes a way to deal with these problems by applying relational clustering and generalization methods. While clustering collects similar concepts, generalization provides additional nodes that can bridge separate pieces of the graph while expressing it at a higher level of abstraction. In this process, considering relational information allows a broader perspective in the similarity assessment for clustering, and ensures more flexible and understandable descriptions of the generalized concepts. The final conceptual graph can be used for better analyzing and understanding the collection, and for performing some kinds of reasoning on it.

Fulvio Rotella, Stefano Ferilli, Fabio Leuzzi

Semi-supervised Latent Dirichlet Allocation for Multi-label Text Classification

This paper proposes a semi-supervised latent Dirichlet allocation (ssLDA) method, which differs from existing supervised topic models for multi-label classification in mainly two aspects. First, both labeled and unlabeled data are used in ssLDA to train a model, which is very important for reducing the cost of manual labeling, especially when obtaining a fully labeled dataset is difficult. Second, ssLDA provides a more flexible training scheme that allows two ways of label assignment, while existing topic-model-based methods usually focus on only one of them: (1) document-level assignment of labels to a document; (2) imposing word-level correspondences between words and labels within a document. Our experimental results indicate that ssLDA gains an advantage over other methods in implementation flexibility and can outperform them in terms of multi-label classification performance.

Youwei Lu, Shogo Okada, Katsumi Nitta

Special Session on Advances in Recommender Systems

Recommending Interest Groups to Social Media Users by Incorporating Heterogeneous Resources

Due to the advance of social media technologies, it has become easier for users to gather together and form groups online. On a popular music-sharing website, for example, users with common interests can join groups where they can share and discuss their favorite songs. However, since the number of groups grows over time, users often need effective group recommendation (also called affiliation or community recommendation) in order to meet like-minded users. In this paper, based on the matrix factorization mechanism, we investigate how to improve the accuracy of group recommendation by fusing other potentially useful information resources. Particularly, we adopt the collective factorization model to incorporate the user-item preference data, and the similarity-integrated regularization model to fuse the friendship data. Experiments on two real-world datasets (including Douban) show that the chosen models outperform others in addressing the data sparsity problem and enhancing the algorithm's accuracy. Moreover, the experimental results indicate that the user-item preference data can be more effective than the friendship data in benefiting group recommendation.

Wei Zeng, Li Chen

Recent Advances in Recommendation Systems for Software Engineering

Software engineers must contend with situations in which they are exposed to an excess of information, cannot readily express the kinds of information they need, or must make decisions where computation of the unequivocally correct answer is infeasible. Recommendation systems have the potential to assist in such cases. This paper overviews some recent developments in recommendation systems for software engineering, and points out their similarities to and differences from more typical, commercial applications of recommendation systems. The paper focuses in particular on the problem of software reuse, and speculates why the recently cancelled Google Code Search project was doomed to failure as a general purpose tool.

Robert J. Walker

WE-DECIDE: A Decision Support Environment for Groups of Users

Group recommendation technologies are becoming increasingly popular for supporting group decision processes in various domains such as interactive television, music, and tourist destinations. Existing group recommendation environments focus on specific domains and do not include the possibility of supporting different kinds of decision scenarios. The WE-DECIDE group decision support environment advances the state of the art by supporting different decision scenarios in a domain-independent fashion. In this paper we give an overview of the WE-DECIDE environment and report the results of a first user study which focused on system usability and potentials for further applications.

Martin Stettinger, Gerald Ninaus, Michael Jeran, Florian Reinfrank, Stefan Reiterer

Special Session on Business Process Intelligence

Logic-Based Incremental Process Mining in Smart Environments

Understanding what the user is doing in a Smart Environment is important not only for adapting the environment's behavior, e.g. by providing the most appropriate combination of services for the recognized situation, but also for identifying situations that could be problematic for the user. Manually building models of user processes is a complex, costly and error-prone engineering task; hence the interest in automatically learning them from examples of actual procedures. Incremental adaptation of the models, and the ability to express and learn complex conditions on the involved tasks, are also desirable. First-order logic provides a single comprehensive and powerful framework for supporting all of the above. This paper presents a first-order logic incremental method for inferring process models and shows its application to users' daily routines, for predicting their needs and comparing the actual situation with the expected one. Promising results have been obtained both in controlled experiments, which proved its efficiency and effectiveness, and on a domain-specific dataset.

Stefano Ferilli, Berardina De Carolis, Domenico Redavid

Information Mining Processes Based on Intelligent Systems

Business Intelligence offers an interdisciplinary approach (within which Information Systems falls) that, taking all available information resources and using analytical and synthesis tools able to transform information into knowledge, focuses on generating knowledge that contributes to management decision-making and the generation of strategic plans in organizations. Information Mining is the sub-discipline of information systems which supports business intelligence tools in transforming information into knowledge. It has been defined as the search for interesting patterns and important regularities in large bodies of information. We address the need to identify information mining processes to obtain knowledge from available information. Once information mining processes are defined, we may decide which data mining algorithms will support them. In this context, this paper proposes a characterization of the information mining processes related to the following business intelligence problems: discovery of behavior rules, discovery of groups, discovery of significant attributes, discovery of group-membership rules, and weighting of behavior or group-membership rules.

Ramón García-Martínez, Paola Britos, Dario Rodríguez

Customer Churn Detection System: Identifying Customers Who Wish to Leave a Merchant

Identifying customers with a higher probability of leaving a merchant (churn customers) is a challenging task for sellers. In this paper, we propose a system able to detect churner behavior and to assist merchants in delivering special offers to their churn customers. Two main goals lead our work: on the one hand, the definition of a classifier in order to perform churn analysis and, on the other hand, the definition of a framework that can be enriched with social information, supporting the merchant in performing marketing actions which can reduce the probability of losing those customers. Experimental results on an artificial and a real dataset show increased classification accuracy when random forests or decision trees are considered.

Cosimo Birtolo, Vincenzo Diessa, Diego De Chiara, Pierluigi Ritrovato

Special Session on Decision Support for Safety-related Systems

Hazard Identification of the Offshore Three-Phase Separation Process Based on Multilevel Flow Modeling and HAZOP

HAZOP studies are widely accepted in the chemical and petroleum industries as the method for conducting process hazard analysis related to the design, maintenance and operation of systems. Different tools have been developed to automate HAZOP studies. In this paper, a HAZOP reasoning method based on function-oriented modeling, Multilevel Flow Modeling (MFM), is extended with function roles. A graphical MFM editor, combined with the reasoning capabilities of the MFM Workbench developed by DTU, is applied to automate HAZOP studies. The method is proposed to support the "brain-storming" sessions of traditional HAZOP analysis. As a case study, the extended MFM-based HAZOP methodology is applied to an offshore three-phase separation process. The results show that the cause-consequence analysis in MFM can infer the cause and effect of a deviation used in HAZOP and can be used to fill the HAZOP worksheet. This is the first paper to discuss and demonstrate the potential of the role concept in MFM to supplement the integrity of HAZOP analysis.

Jing Wu, Laibin Zhang, Morten Lind, Wei Liang, Jinqiu Hu, Sten Bay Jørgensen, Gürkan Sin, Zia Ullah Khokhar

Multilevel Flow Modeling Based Decision Support System and Its Task Organization

For complex engineering systems, there is an increasing demand for safety and reliability. A decision support system (DSS) is designed to offer supervision and analysis of operational situations. A proper model representation is required for a DSS to capture process knowledge. Multilevel Flow Modeling (MFM) represents a complex system in multiple levels of means-end and part-whole decompositions, which is considered suitable for plant supervision tasks. The aim of this paper is to explore the different possible functionalities of applying MFM to DSS, where both currently available MFM reasoning techniques and less mature yet relevant MFM concepts are considered. It also offers an architecture design of task organization for MFM software tools by using the concept of agents and multiagent software technology.

Xinxin Zhang, Morten Lind, Ole Ravn

Simulation-Based Fault Propagation Analysis – Application on Hydrogen Production Plant

Production of hydrogen from water through the Cu-Cl thermochemical cycle is a relatively new technology. The main advantages of this technology over existing ones are higher efficiency, lower costs, lower environmental impact, and reduced greenhouse gas (GHG) emissions. Considering these advantages, the usage of this technology in industries such as nuclear and oil is increasing. Due to the different hazards involved in hydrogen production, the design and implementation of hydrogen plants require provisions for safety, reliability, and risk assessment. However, very little research has been done from the safety point of view. This paper introduces the fault semantic network (FSN) as a novel method for fault diagnosis and fault propagation analysis using interactions among process variables. These interactions are estimated through genetic programming (GP) and neural network (NN) modeling. The effectiveness, feasibility, and robustness of the proposed method have been demonstrated on simulated data obtained from a simulation of the hydrogen production process in AspenHysys software. The proposed method has successfully achieved reasonable detection and prediction of non-linear interaction patterns among process variables.

Amir Hossein Hosseini, Sajid Hussain, Hossam A. Gabbar

Automatic Decomposition and Allocation of Safety Integrity Levels Using a Penalty-Based Genetic Algorithm

Automotive Safety Integrity Levels (ASILs) are used in the new automotive functional safety standard, ISO 26262, as a key part of managing safety requirements throughout a top-down design process. The ASIL decomposition concept, outlined in the standard, allows the safety requirements to be divided between multiple components of the system whilst still meeting the ASILs initially allocated to system-level hazards. Existing exhaustive automatic decomposition techniques drastically reduce the effort of performing such tasks manually. However, the combinatorial nature of the problem leaves such exhaustive techniques with a scalability issue. To overcome this problem, we have developed a new technique that uses a penalty-based genetic algorithm to efficiently explore the search space and identify optimum assignments of ASILs to the system components. The technique has been applied to a hybrid braking system to evaluate its effectiveness.

David Parker, Martin Walker, Luís Silva Azevedo, Yiannis Papadopoulos, Rui Esteves Araújo

HAZOP Analysis System Compliant with Equipment Models Based on SDG

It is important to assess risk in chemical plants, and HAZOP is widely used in risk assessment to identify hazards. Automatic analysis systems have been developed to perform HAZOP effectively. In this study, a semi-automatic analysis system was developed using a Signed Directed Graph (SDG) to model how deviations propagate through equipment behavior. The versatility of the analysis is increased by propagating deviations when devices are added in accordance with the rules. Our HAZOP analysis system is applied to a chemical process, and future work for this study is outlined.

Ken Isshiki, Yoshiomi Munesawa, Atsuko Nakai, Kazuhiko Suzuki
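Deviation propagation over an SDG can be sketched as a signed graph traversal: a positive edge preserves the sign of a deviation, a negative edge inverts it. The process variables and edge signs below are illustrative, not taken from the paper:

```python
def propagate(sdg, start, deviation):
    """Propagate a deviation (+1 = high, -1 = low) from a start variable
    through a signed directed graph. `sdg` maps each variable to a list of
    (successor, edge_sign) pairs; edge sign +1 preserves the deviation
    direction, -1 inverts it. Returns the first inferred deviation per
    reachable variable."""
    result = {start: deviation}
    frontier = [start]
    while frontier:
        node = frontier.pop()
        for succ, sign in sdg.get(node, []):
            if succ not in result:
                result[succ] = result[node] * sign
                frontier.append(succ)
    return result
```

For example, a "low inflow" deviation propagating through a level-controlled vessel yields "low level" and, across a negative edge, "high" on an inversely coupled variable, which is the kind of cause-effect chain a HAZOP worksheet records.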

Special Session on Innovations in Intelligent Computation and Applications

TAIEX Forecasting Based on Fuzzy Time Series and Technical Indices Analysis of the Stock Market

This paper presents a new method for forecasting the TAIEX based on fuzzy time series and technical index analysis of the stock market. Because the proposed method uses both fuzzy time series and technical index analysis to analyze the historical training data in detail, it achieves a higher forecasting accuracy rate than existing methods. The contribution of this paper is a new fuzzy time series forecasting method based on the MACD index, combined with the stochastic line indices (KD indices), to forecast the TAIEX. It obtains a higher average forecasting accuracy rate than the existing methods for forecasting the TAIEX.

Shyi-Ming Chen, Cheng-Yi Wang
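The MACD index combined here is the standard one; a minimal sketch with the conventional 12/26/9 periods (the fuzzy time series machinery of the paper is not reproduced):

```python
def ema(series, n):
    """Exponential moving average with smoothing factor 2/(n+1), seeded
    on the first value of the series."""
    k = 2.0 / (n + 1)
    out = [series[0]]
    for x in series[1:]:
        out.append(x * k + out[-1] * (1 - k))
    return out

def macd(prices, fast=12, slow=26, signal=9):
    """Standard MACD: fast EMA minus slow EMA, plus its signal-line EMA."""
    line = [f - s for f, s in zip(ema(prices, fast), ema(prices, slow))]
    return line, ema(line, signal)
```

On a steadily rising price series the MACD line stays positive, since the fast EMA tracks the price more closely than the slow one.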

Hierarchical Gradient Diffusion Algorithm for Wireless Sensor Networks

In this paper, a hierarchical gradient diffusion algorithm is proposed to solve the transmission problem and the sensor-node loading problem by adding several relay nodes and arranging the sensor nodes' routing paths. The proposed hierarchical gradient diffusion aims to balance sensor nodes' transmission loads, extend sensor node lifetime, and reduce the data packet transmission loss rate. According to the experimental results, the proposed algorithm not only reduces power consumption by about 12% but also decreases the data loss rate by 85.5% and increases the number of active nodes by about 51.7%.

Hong-Chi Shih, Jiun-Huei Ho, Bin-Yih Liao, Jeng-Shyang Pan
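The gradient construction underlying such diffusion schemes can be sketched as a breadth-first hop-count labeling from the sink; packets then flow "downhill" toward lower levels. The topology below is illustrative, and the paper's hierarchical relay-node extensions are not reproduced:

```python
from collections import deque

def build_gradient(links, sink):
    """BFS from the sink assigns each node its hop-count 'gradient' level.
    `links` maps each node to its (bidirectional) neighbour list."""
    level = {sink: 0}
    q = deque([sink])
    while q:
        u = q.popleft()
        for v in links.get(u, []):
            if v not in level:
                level[v] = level[u] + 1
                q.append(v)
    return level

def next_hops(links, level, node):
    """Neighbours strictly closer to the sink, i.e. relay candidates.
    Having several candidates is what lets a scheme balance load."""
    return [v for v in links.get(node, []) if level[v] < level[node]]
```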

Constructing a Diet Recommendation System Based on Fuzzy Rules and Knapsack Method

Many people suffer from three chronic diseases (diabetes, hypertension, and high cholesterol), and they often use search engines to collect related information. However, most dietary information on the web is not organized in a way that is convenient for users collecting diet recommendations. In this paper, a diet recommendation system is proposed which can recommend a rational diet for users. We design a diet recommendation system which has expert knowledge of the three chronic diseases. We use Protégé to establish the ontology and OWL DL to construct the knowledge structure. The system uses fuzzy logic as a guide prior to inference. According to the patient's health information, the system infers the daily calorie requirement, and then uses the JENA inference engine and JENA rule format to build the knowledge rules. A knapsack-like algorithm is used to recommend suitable foods for users. The system was evaluated by nutritionists, who confirmed its effectiveness.

Rung-Ching Chen, Yung-Da Lin, Chia-Ming Tsai, Huiqin Jiang
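A knapsack-like food selection under a calorie budget can be sketched as standard 0/1 dynamic programming; the food names, nutrition scores and budget below are illustrative, not taken from the paper:

```python
def recommend_foods(foods, calorie_budget):
    """0/1 knapsack: pick foods maximising a nutrition score without
    exceeding the daily calorie budget. `foods` is a list of
    (name, calories, score) triples."""
    n = len(foods)
    best = [[0] * (calorie_budget + 1) for _ in range(n + 1)]
    for i, (_, cal, score) in enumerate(foods, 1):
        for c in range(calorie_budget + 1):
            best[i][c] = best[i - 1][c]
            if cal <= c and best[i - 1][c - cal] + score > best[i][c]:
                best[i][c] = best[i - 1][c - cal] + score
    # backtrack to recover the chosen foods
    chosen, c = [], calorie_budget
    for i in range(n, 0, -1):
        if best[i][c] != best[i - 1][c]:
            name, cal, _ = foods[i - 1]
            chosen.append(name)
            c -= cal
    return chosen, best[n][calorie_budget]
```

In the real system the scores would come from the fuzzy/ontology-based inference over the patient's health profile rather than being fixed constants.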

An Intelligent Stock-Selecting System Based on Decision Tree Combining Rough Sets Theory

This study presents a stock selection system that uses hybrid models to look for financially sound companies that are really worth investing in on stock markets. There are three main steps in this study: First, we utilize rough sets theory to sift out the core financial indicators affecting the ups and downs of a stock price. Second, based on these core financial indicators coupled with decision tree technology, we establish hybrid classification models and predictive rules for the ups and downs of a stock price. Third, after sifting out sound investment targets, we use the established rules to invest and calculate the rates of return. The evidence reveals that the average rates of return are far larger than those of mass investment.

Shou-Hsiung Cheng

Cloud-Based LED Light Management System and Implementation Based-on Wireless Communication

In this paper, an LED lighting management system based on wireless communication and sensor networks is proposed to improve public lighting systems (PLS). A web-based human-computer interface (HCI) is designed to remotely control, monitor and manage LED lights efficiently. The management system provides an energy-saving mode, a periodical light-checking mode and an immediate light-checking mode to reduce the energy consumption of public lighting systems and to enhance management efficiency. The system automatically sends an SMS notification to the administrator when an LED light failure occurs. The management system reduces maintenance cost and is more energy-efficient than traditional lighting systems.

Yi-Ting Chen, Yi-Syuan Song, Mong-Fong Horng, Chin-Shiuh Shieh, Bin-Yih Liao

A Study on Online Game Cheating and the Effective Defense

As online games become more and more popular and virtual assets can be converted into real money, malicious players are motivated to use illegal means such as "cheating" to gain profit or superiority in the game. Therefore, we have surveyed existing online game cheating practices and recommend defenses against such computer crimes. We improve existing online game cheating classifications by enhancing the scope, perspective, structure and comprehensiveness of all existing classification schemes. We also survey related defense methods against online game cheating, integrate all publicly known defense strategies and methods applicable to each classification category, and make suitable defense recommendations.

Albert B. Jeng, Chia Ling Lee

Credit Rating Analysis with Support Vector Machines and Artificial Bee Colony Algorithm

Recently, credit rating analysis for financial engineering has attracted much research attention. Previously, statistical and artificial intelligence methods for credit rating have been widely investigated, and hybrid models integrating several artificial intelligence methods have shown outstanding performance. This research proposes a new hybrid evolutionary algorithm that integrates the artificial bee colony (ABC) algorithm with the support vector machine (SVM) to predict corporate credit ratings. The experiment dataset is selected from the 2001-2008 Compustat credit rating database in America. The empirical results show that the ABC-SVM model has the highest classification accuracy; hence, this research suggests that the ABC-SVM model is better suited for predicting credit ratings.

Mu-Yen Chen, Chia-Chen Chen, Jia-Yu Liu

An ACO Algorithm for the 3D Bin Packing Problem in the Steel Industry

This paper proposes a new Ant Colony Optimization (ACO) algorithm for the three-dimensional (3D) bin packing problem with the guillotine cut constraint, which consists of packing a set of boxes into a 3D set of bins of variable dimensions. The algorithm is applied to a real-world problem in the steel industry: retail steel cutting, i.e., deciding how to cut blocks of steel in order to satisfy client orders. The goal is to minimize the amount of scrap metal and consequently reduce the stock of steel blocks. The proposed ACO algorithm searches for the best orderings of the boxes and is guided by a heuristic that determines the position and orientation of the boxes. It was possible to reduce the amount of scrap metal by 90% and the usage of raw material by 25%.

Miguel Espinheira Silveira, Susana Margarida Vieira, João Miguel Da Costa Sousa

Special Session on Intelligent Image and Signal Processing

Controlling Search Using an S Decreasing Constriction Factor for Solving Multi-mode Scheduling Problems

The multi-mode resource-constrained project scheduling problem (MRCPSP) is an important issue for industry and has been confirmed to be NP-hard. The particle swarm optimization (PSO) meta-heuristic is an effective and promising method, well applied to a variety of NP-hard application problems. MRCPSP involves two sub-problems: activity mode selection and activity ordering. Therefore, a discrete-version PSO and a constriction-version PSO are applied to solve these two sub-problems respectively: discrete PSO determines the activity operation mode, and constriction PSO decides the activity order. To enhance exploration and exploitation so as to improve search efficiency, an S decreasing constriction factor adjustment mechanism is proposed. To verify the performance of the proposed scheme, MRCPSP instances from PSPLIB were tested and comparisons with other state-of-the-art algorithms were conducted. The experimental results reveal that the proposed S decreasing constriction factor adjustment scheme is efficient for solving MRCPSP-type scheduling problems.

Reuy-Maw Chen, Chuin-Mu Wang
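The paper's exact S-shaped schedule is not given in the abstract; one plausible sigmoid-shaped decreasing schedule, as a sketch (bounds and steepness are illustrative assumptions):

```python
import math

def s_decreasing_factor(t, t_max, hi=0.9, lo=0.4, steepness=10.0):
    """S-shaped decreasing schedule for the constriction factor: stays
    near `hi` early in the run (favouring exploration), drops through a
    sigmoid mid-run, and settles near `lo` (favouring exploitation)."""
    x = steepness * (t / t_max - 0.5)
    return lo + (hi - lo) / (1.0 + math.exp(x))
```

The flat shoulders of the S-curve are the point of the design: unlike a linear decrease, the swarm keeps a high factor for a sustained early phase and a low factor for a sustained late phase.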

Extracting Blood Vessel in Sclera-Conjunctiva Image Based on Fuzzy C-means and Modified Cone Responses

In this paper, we present a method of sclera-conjunctiva segmentation and blood vessel extraction to assist doctors of traditional Chinese medicine in diagnosing patients. First, the color eye image is converted to a grayscale image and clustered into three classes using the fuzzy c-means algorithm. Then the Sobel operator is applied to detect the edges, and the sclera-conjunctiva region is obtained using morphological dilation, hole filling, and a connectivity algorithm. Finally, a modified cone responses algorithm is proposed to extract the blood vessels from the sclera-conjunctiva.

Jzau-Sheng Lin, Yu-Yang Huang, Yu-Yi Liao
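The three-class fuzzy c-means step can be sketched in one dimension on grayscale values; the initialization, iteration count and sample intensities below are illustrative assumptions:

```python
def fuzzy_c_means(values, c=3, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means on grayscale values. Centres are spread
    over the value range, then membership and centre updates alternate.
    Returns the sorted cluster centres."""
    lo, hi = min(values), max(values)
    centers = [lo + (hi - lo) * j / (c - 1) for j in range(c)]
    for _ in range(iters):
        # fuzzy membership of each value in each cluster (fuzzifier m)
        u = []
        for x in values:
            d = [abs(x - ck) + 1e-9 for ck in centers]
            u.append([1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1))
                                for k in range(c)) for j in range(c)])
        # centre update: membership-weighted means
        centers = [sum(u[i][j] ** m * values[i] for i in range(len(values)))
                   / sum(u[i][j] ** m for i in range(len(values)))
                   for j in range(c)]
    return sorted(centers)
```

In the paper the three resulting classes over the grayscale eye image serve as the starting point for locating the sclera-conjunctiva region.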

A Spherical Coordinate Based Fragile Watermarking Scheme for 3D Models

In this paper, a new spherical coordinate based fragile watermarking scheme is proposed. First, the three-dimensional (3D) model is translated from the Cartesian coordinate system to the spherical coordinate system. Then the quantization index modulation technique is employed to embed the watermark into one of the spherical coordinates for authentication and verification. By adapting the quantization index modulation technique together with some keys in the spherical coordinate system, the distortion is controlled by the quantization step setting. Experimental results show that a 100% embedding rate and low distortion can be achieved simultaneously by the proposed method. Moreover, the causality, convergence and vertex reordering problems are overcome.

Cheng-Chih Huang, Ya-Wen Yang, Chen-Ming Fan, Jen-Tse Wang

Palm Image Recognition Using Image Processing Techniques

In this research, we first locate the palm in the hand image and then distinguish the fingers using a triangular calculation method. For palm detection, skin color, background subtraction, hand image extraction, edge detection and histogram analysis are used. For finger identification, we first record the tips and valleys of the fingers by computing the histogram of the palm image. Next, we find the origin point, i.e., the center of gravity of the palm, using the area of the palm image with the finger parts removed. Successively, we draw the line from the origin point to the center of the palm cut line and use it as the base line, the zero-angle line. Meanwhile, we draw another line from each fingertip to the origin point, called the tip line. Finally, we calculate the angle between the base line and the tip line as the finger angle. Since the fingers have different angles, they are easily distinguished.

Wen-Yuan Chen, Yu-Ming Kuo, Chin-Ho Chung
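The angle between the base line and a tip line reduces to a difference of `atan2` values; the coordinates below are hypothetical:

```python
import math

def finger_angle(origin, base_point, tip):
    """Angle (degrees, counter-clockwise in [0, 360)) between the base
    line (origin -> palm-cut midpoint) and the tip line (origin ->
    fingertip)."""
    a_base = math.atan2(base_point[1] - origin[1], base_point[0] - origin[0])
    a_tip = math.atan2(tip[1] - origin[1], tip[0] - origin[0])
    return math.degrees(a_tip - a_base) % 360.0
```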

Liver Cell Nucleuses and Vacuoles Segmentation by Using Genetic Algorithms for the Tissue Images

This paper proposes image segmentation methods for cell nuclei and vacuoles in liver fibrosis tissue images. The novel idea is to segment the objects by extracting image features to determine the required cells in liver fibrosis images. In the segmentation phase, several image processing methods are applied to segment nucleus and vacuole objects. The run-length method makes the object regions more prominent and suppresses noise, and the morphological opening operation is performed to split connected objects. For vacuole region segmentation, the opening operation applies a mode filter to fill the dark holes in the objects and keep the regions complete. Furthermore, the proposed method uses a genetic algorithm to find the most appropriate parameters and weights for region segmentation. The experimental results show that the proposed method achieves good performance on the segmentation of cell nuclei and vacuoles.

Ching-Te Wang, Ching-Lin Wang, Yung-Kuan Chan, Meng-Hsiun Tsai, Ying-Siou Wang, Wen-Yu Cheng

Mouth Location Based on Face Mask and Projection

Facial feature detection plays an important role in biometric recognition applications, such as criminal investigation, special surveillance, photographic modes, etc. In this paper, we propose an approach to mouth feature extraction and detection. This approach involves the steps of skin-color segmentation, face feature extraction, edge detection and edge projection. In order to locate the mouth position more effectively and correctly, we adopt two phases: one phase labels the face mask using our designed skin-color filter; the other detects edges and computes the edge projection for the candidate mouth region within the face mask. The present results demonstrate that our proposed system achieves high accuracy for mouth detection and can coarsely estimate the face posture based on specified rules.

Hui-Yu Huang, Yan-Ching Lin

Environmental Background Sounds Classification Based on Properties of Feature Contours

In this paper, an approach to environmental sound recognition (ESR) using properties of feature trajectories is presented. To determine the discriminative attributes of background sounds, several audio classes have been analysed. The selected groups of sounds reflect acoustical environments that may occur in real sound acquisition situations. We propose a feature extraction scheme in which the trajectories obtained at the parameterization stage are further processed in order to improve classification accuracy. A discriminatory analysis of popular audio features for the ESR task has been performed. The obtained results show that the proposed technique gives promising classification results and can be applied in systems where a properly identified audio scene can improve other audio processing tasks.

Tomasz Maka

On the Prediction of Floor Identification Credibility in RSS-Based Positioning Techniques

The future of Location Based Services largely depends on the accuracy of positioning techniques. For indoor positioning, fingerprinting-based solutions are frequently developed, and the well-known k nearest neighbours method is often used. However, when the objective is to detect the floor on which a mobile terminal is located, only limited accuracy can be observed when the number of available signals is limited.

The primary objective of this work is to analyse whether the credibility of floor estimates can be assessed a priori. A method assigning weights to individual GSM fingerprints and estimating their reliability in terms of floor estimation is proposed. The method is validated with an extensive radio map. It has been shown that both low and high accuracy floor estimates are correctly identified. Moreover, an objective criterion is proposed to assess individual weight functions from a proposed family of functions.

Maciej Grzenda
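The underlying k-nearest-neighbours floor vote can be sketched directly; the fingerprints, RSS values and floor labels below are illustrative, and the paper's weighting scheme is not reproduced:

```python
from collections import Counter

def knn_floor(radio_map, rss, k=3):
    """Fingerprinting floor detection: find the k stored RSS fingerprints
    nearest to the observed vector (squared Euclidean distance) and take
    a majority vote on their floor labels. `radio_map` is a list of
    (rss_vector, floor) pairs."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(radio_map, key=lambda fp: dist(fp[0], rss))[:k]
    return Counter(floor for _, floor in nearest).most_common(1)[0][0]
```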

A Study of Vessel Origin Detection on Retinal Images

A parabolic model is commonly used for fovea and macula detection, and the center of the optic disc is usually taken as the vertex of the parabola-like vasculature. Since vessels grow out from the vessel origin, taking the vessel origin as the vertex can provide better fovea localization than taking the optic disc center. Recently, the vessel origin has also been used to detect vessels within the optic disc. However, there is no published research on finding the exact vessel origin position. This paper proposes a novel method to locate the position of the vessel origin. First, a retinal image is processed to obtain the vascular structure. Then, four features based on the characteristics of the vessel origin are selected, and a Bayesian classifier is applied to locate the vessel origin. The proposed method is evaluated on the publicly available DRIVE database. The experimental results show that the average Euclidean distance between the detected vessel origin and the one marked by experts is 13.3 pts, which is much better than other methods. This can further enable more accurate vessel and fovea detection.

Chun-Yuan Yu, Chen-Chung Liu, Jiunn-Lin Wu, Shyr-Shen Yu, Jing-Yu Huang
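The classification step above, four features scored by a Bayesian classifier, might look like the following naive-Bayes log-likelihood-ratio sketch. The Gaussian feature model and all statistics here are assumptions for illustration, not the paper's actual features or trained values:

```python
import math

def gaussian_logpdf(x, mean, var):
    """Log density of a univariate Gaussian."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def bayes_score(features, stats_origin, stats_background):
    """Log-likelihood ratio that a candidate point is the vessel origin.

    stats_*: per-feature (mean, variance) pairs learned from training
    images. Assumes conditionally independent features (naive Bayes);
    a positive score favours the vessel-origin class.
    """
    score = 0.0
    for x, (m1, v1), (m0, v0) in zip(features, stats_origin, stats_background):
        score += gaussian_logpdf(x, m1, v1) - gaussian_logpdf(x, m0, v0)
    return score
```

Candidate pixels would be ranked by this score and the maximum taken as the vessel origin.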

The Tires Worn Monitoring Prototype System Using Image Clustering Technology

In order to improve traffic safety, researchers have studied driver physical and mental monitoring, vehicle structure, airbags, brake systems, tires, and so on. Tires must carry the vehicle load, provide grip, drain water, and limit friction noise, and tire wear degrades all of these functions. This study designed an experimental platform that can detect the main tread depth at low tire speed by applying an image clustering technique. The proposed image clustering algorithm, FCM_sobel, measures the depth of the main tread at α = 0.5 (the influence weight of the neighbouring pixels) and a rotation cycle of 2.5 seconds per rotation. The experimental results show precision rates of 93.41% and 96.86% for the depths of main treads I and II, respectively. For the depth of main tread I, the precision rate improved by 3% compared with FCM_S1.

Shih-Yen Huang, Yi-Chung Chen, Kuen-Suan Chen, Hui-Min Shih
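A minimal 1-D sketch of fuzzy c-means with a neighbour term weighted by α, in the spirit of FCM_S1, is shown below. The actual FCM_sobel algorithm additionally incorporates Sobel edge information and is not reproduced here; this only illustrates how α = 0.5 weights the neighbouring pixels:

```python
def fcm_s1(pixels, neighbor_means, c=2, m=2.0, alpha=0.5, iters=20):
    """Minimal 1-D FCM_S1-style clustering sketch.

    pixels: grey levels x_k; neighbor_means: mean grey level around
    each pixel. alpha weights the neighbour term. Returns sorted
    cluster centres.
    """
    lo, hi = min(pixels), max(pixels)
    # initialise centres spread over the data range
    v = [lo + (hi - lo) * (i + 1) / (c + 1) for i in range(c)]
    n = len(pixels)
    for _ in range(iters):
        # membership update: u_ik proportional to
        # (d_ik^2 + alpha * dbar_ik^2) ** (-1 / (m - 1))
        u = [[0.0] * n for _ in range(c)]
        for k in range(n):
            d = [(pixels[k] - v[i]) ** 2
                 + alpha * (neighbor_means[k] - v[i]) ** 2 + 1e-12
                 for i in range(c)]
            inv = [dk ** (-1.0 / (m - 1)) for dk in d]
            s = sum(inv)
            for i in range(c):
                u[i][k] = inv[i] / s
        # centre update: v_i = sum(u^m (x + alpha*xbar)) / ((1+alpha) sum(u^m))
        for i in range(c):
            num = sum(u[i][k] ** m * (pixels[k] + alpha * neighbor_means[k])
                      for k in range(n))
            den = (1 + alpha) * sum(u[i][k] ** m for k in range(n))
            v[i] = num / den
    return sorted(v)
```

On tread images, the two clusters would roughly separate groove (dark) from tread surface (bright) pixels, from which the groove depth is measured.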

Quality Assessment Model for Critical Manufacturing Process of Crystal Oscillator

Quartz crystal is an electronic component made of quartz. Initially it was used in timepieces as the reference for timekeeping. Because the crystal is highly stable under temperature change and wears very little, crystal-based piezoelectric components such as crystals, crystal oscillators, crystal filters, and optical devices have become indispensable passive components in the telecom industry. Among the manufacturing processes, the wiring process is one of the most critical steps in packaging: if the quality of the bonding wire is poor, poor contact between the signal connectors and the gold wire on the IC, or broken gold wires, can easily occur during product transfer, sealing, and baking, as well as when the product is bonded to the substrate, leaving the whole IC-packaged product unable to function normally. This article therefore develops a model for assessing and testing process capability with a process capability index for the larger-the-better quality characteristic of the wire process. The article also provides an assessment procedure whereby the industry can effectively evaluate whether the process capabilities of their products meet the intended benchmarks.

Kuen-Suan Chen, Ching-Hsin Wang, Yun-Tsan Lin, Chun-Min Yu, Hui-Min Shih
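The abstract's specific index symbol was lost in extraction. A common capability index for a larger-the-better characteristic (such as wire-bond pull strength) is C_pl = (μ − LSL) / (3σ); the sketch below uses that form as an assumption, not as the paper's actual index:

```python
import statistics

def c_pl(samples, lower_spec):
    """Larger-the-better process capability index.

    C_pl = (mean - LSL) / (3 * sigma), estimated from sample data.
    Values well above 1 indicate the process mean sits comfortably
    above the lower specification limit.
    """
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return (mu - lower_spec) / (3 * sigma)
```

An assessment procedure of the kind the abstract describes would compare the estimated index (or a lower confidence bound on it) against a required benchmark value.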

Local Skew Estimation in Moving Business Cards

Current methods to help visually impaired persons read brief text like menus, business cards, and book covers are problematic because they assume both the user and the captured scene are stationary and they do not tell the visually impaired user if the target is captured by the camera. Further, these methods cannot estimate whether the text is locally skewed. This paper presents an intelligent system to estimate movement, thumbs, motion blur, text, and local skew in moving business card targets. Experimental results show that the proposed method can reduce time complexity, obtain high text detection rates, and achieve high local skew estimation rates.

Chun-Ming Tsai

Special Session on Machine Learning Methods Applied to Manufacturing Processes and Production Systems

Imbalanced Learning Ensembles for Defect Detection in X-Ray Images

This paper describes the detection of defects in metallic pieces through the analysis of X-ray images. The images used in this work are highly variable (several different pieces, different views, and variability introduced by the inspection process, such as the positioning of the piece). Because of this variability, a data-mining approach based on the sliding-window technique has been used. Experiments have been carried out with various window sizes, several feature selection algorithms, and different classification algorithms, with a special focus on learning from imbalanced data sets. The results show that Bagging achieved significantly better results than decision trees by themselves or combined with SMOTE or undersampling.

José Francisco Díez-Pastor, César García-Osorio, Víctor Barbero-García, Alan Blanco-Álamo
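The sliding-window step can be sketched generically as below; the window size and stride are illustrative (the paper experimented with various window sizes), and each extracted patch would then be turned into features for the classifier:

```python
def sliding_windows(image, win, step):
    """Yield (row, col, patch) windows over a 2-D image.

    image: list of equal-length rows (e.g. grey levels); win: square
    window side; step: stride between consecutive windows.
    """
    h, w = len(image), len(image[0])
    for r in range(0, h - win + 1, step):
        for c in range(0, w - win + 1, step):
            patch = [row[c:c + win] for row in image[r:r + win]]
            yield r, c, patch
```

Class imbalance arises naturally here: almost all windows of a piece are defect-free, which is why the paper focuses on imbalanced-learning ensembles.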

Improvements in Modelling of Complex Manufacturing Processes Using Classification Techniques

The improvement of certain manufacturing processes often involves the challenge of how to optimize complex and multivariable processes under industrial conditions. Moreover, many of these processes can be treated as regression or classification problems. Although their outputs are in the form of continuous variables, industrial requirements define their discretization in compliance with ISO 4288:1996 Standard. Laser polishing of steel components is an interesting example of such a problem, especially its application to finishing operations in the die and mould industry. The aim of this work is the identification of the most accurate classifier-based method for surface roughness prediction of laser polished components in compliance with the aforementioned industrial standard. Several data mining methods are tested for this task: ensembles of decision trees, classification via regression, and fine-tuned SVMs. These methods are also tested by using variants that take into account the ordinal nature of the class that has to be predicted. Finally, all these methods and variants are applied over different transformations of the dataset. The results of these methods show no significant differences in accuracy, meaning that a simple decision tree can be used for prediction purposes.

Pedro Santos, Jesús Maudes, Andrés Bustillo, Juan José Rodríguez
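The discretization step the abstract mentions, mapping a continuous roughness prediction onto ordinal classes, can be sketched with threshold cut-offs. The cut-off values below are illustrative placeholders, not the actual ISO 4288:1996 table:

```python
import bisect

def roughness_class(ra, cutoffs=(0.1, 0.2, 0.4, 0.8, 1.6)):
    """Discretise a predicted roughness value Ra (in micrometres)
    into an ordinal class index.

    cutoffs must be sorted ascending; a value above all cut-offs
    falls into the last (roughest) class.
    """
    return bisect.bisect_right(cutoffs, ra)
```

Treating these classes as ordered (rather than as unrelated labels) is exactly what the ordinal variants of the tested classifiers exploit.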

A Study on the Use of Machine Learning Methods for Incidence Prediction in High-Speed Train Tracks

This paper presents a study of the application of Computational Intelligence (CI) methods to a forecasting problem in railway maintenance. Railway maintenance is an important, long-standing problem that is critical for safe, comfortable, and economic transportation, and with the advent of high-speed lines it is even more important today. We apply forecasting procedures from statistics and CI to examine the feasibility of predicting faults one month ahead on two high-speed lines in Spain. The data are faults recorded by a measurement train that traverses the lines monthly. The results indicate that CI methods are competitive with statistical regression methods in this forecasting task, with support vector regression outperforming the other employed methods. Hence, applying CI methods is feasible for this forecasting task and useful in the planning of track maintenance.

Christoph Bergmeir, Gregorio Sáinz, Carlos Martínez Bertrand, José Manuel Benítez
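A one-month-ahead forecasting study of this kind is typically evaluated walk-forward: refit on the history available at each month and score the next observation. The sketch below shows that evaluation loop with a pluggable model; the naive last-value baseline is illustrative, not one of the paper's methods:

```python
def walk_forward_mae(series, fit_predict, window=12):
    """One-step-ahead (one-month-ahead) walk-forward evaluation.

    fit_predict(history) -> forecast for the next point; window is
    the length of history given to the model at each step. Returns
    the mean absolute error over all evaluated months.
    """
    errs = []
    for t in range(window, len(series)):
        forecast = fit_predict(series[t - window:t])
        errs.append(abs(forecast - series[t]))
    return sum(errs) / len(errs)

# naive baseline: forecast next month's fault count as last month's
print(walk_forward_mae(list(range(20)), lambda h: h[-1], window=5))
```

Any of the compared methods (statistical regression, support vector regression, other CI models) would slot in as `fit_predict`.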

Estimating the Maximum Shear Modulus with Neural Networks

The small-strain shear modulus is one of the most important geotechnical parameters for characterizing soil stiffness. As finite element analyses have shown, the in-situ stiffness of soils and rocks is much higher than was previously thought, and the stress-strain behaviour of these materials is in most cases non-linear, even at small strain levels. The common approach for obtaining the small-strain shear modulus is based on measuring seismic wave velocities. Nevertheless, for design purposes it is very useful to derive that modulus from correlations with the output parameters of in-situ tests. Here the use of neural networks seems very appropriate, as the complexity of the system makes the problem unfriendly to traditional data analysis methodologies. In this work, neural networks are used to estimate the small-strain shear modulus of sedimentary soils from basic or intermediate parameters derived from the Marchetti Dilatometer Test.

Manuel Cruz, Jorge M. Santos, Nuno Cruz
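The regression the abstract describes, dilatometer parameters in, shear modulus out, would at inference time reduce to a forward pass through a small trained network. The sketch below shows a one-hidden-layer forward pass; the architecture, activation, and all weights are assumptions, as the paper's trained model is not reproduced here:

```python
import math

def mlp_forward(x, W1, b1, W2, b2):
    """Forward pass of a one-hidden-layer network: G0 ~ f(DMT parameters).

    x: input features (e.g. Marchetti dilatometer readings);
    W1/b1: hidden-layer weights and biases (tanh activation);
    W2/b2: output weights and bias (linear output).
    """
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return sum(w * hi for w, hi in zip(W2, h)) + b2
```

In practice the weights would be fitted on pairs of in-situ test parameters and seismically measured moduli.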

