Genetic optimization of GRNN for pattern recognition without feature extraction

https://doi.org/10.1016/j.eswa.2007.04.006

Abstract

This paper describes an approach to pattern recognition using a genetic algorithm and a general regression neural network (GRNN). The designed system can be used both for 3D object recognition from 2D poses of the object and for handwritten digit recognition. The system requires no preprocessing or feature extraction stage before recognition. In a GRNN, the placement of centers has a significant effect on the performance of the network. The centers and widths of the hidden-layer neuron basis functions are coded in a chromosome, and these two critical parameters are determined by optimization with genetic algorithms. Experimental results show that the optimized GRNN provides higher recognition ability than the unoptimized GRNN.

Introduction

The genetic algorithm (GA) is an optimization and search technique based on the principles of natural selection and Darwin’s famous principle of survival of the fittest (Marco, Désidéri, & Lanteri, 1999). Genetic algorithms can determine many parameters of neural networks, such as the weights, activation functions, and hidden-layer structure. In this study, genetic algorithms are used to determine the spread value and the positions of the centers in the neural network.

The general regression neural network (GRNN), a kind of radial basis function (RBF) network, was developed by Specht (1991) and is a powerful regression tool with a dynamic network structure. Its training is extremely fast. Owing to the simplicity of the network structure and its implementation, it has been widely applied in a variety of fields, including image processing. Specht (1991) addressed the basic concept of incorporating clustering techniques into the GRNN model.

Various studies of RBF networks and genetic algorithms can be found in the literature. Barreto (2002) presents a new crossover operator that gives some control over the competing-conventions problem when a genetic algorithm configures RBF networks. de Lacerda, de Carvalho, and Ludermir (2000) discuss how RBF networks can have their parameters defined by a GA; the proposed GA is applied to a benchmark problem, a Hermite polynomial approximation. Zuo, Liu, and Ruan (2004) describe a study on voice conversion that uses a GA to train the hidden layer of an RBF network, which is expected to make the converted speech better match the target speaker’s characteristics.

In Yazıcı et al. (2006), a genetic algorithm is used to optimize RBF and GRNN networks for classifying the iris flower, thyroid disease, and Escherichia coli datasets.

In this study, our purpose is to achieve an optimal classification of patterns, without feature extraction, using a GRNN together with a genetic algorithm. Section 2 gives an overview of the GRNN structure. Section 3 presents a brief summary of genetic algorithms. Section 4 describes how the centers are selected, and Section 5 gives the simulations and results.

Section snippets

Overview of GRNN Structure

The GRNN predicts the value of one or more dependent variables, given the value of one or more independent variables. The GRNN thus takes an input vector x of length n and generates an output vector (or scalar) y′ of length m, where y′ is the prediction of the actual output y. The GRNN does this by comparing a new input pattern x with a set of p stored patterns xi (pattern nodes) for which the actual output yi is known. The predicted output y′ is the weighted average of all these associated output values yi.
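As a sketch, the weighted-average prediction described above can be written as follows. This is an illustrative Gaussian-kernel implementation, not the authors' code; the function name and the single shared spread `sigma` are assumptions:

```python
# Hypothetical sketch of GRNN prediction (after Specht, 1991), assuming
# Gaussian kernels with one shared spread sigma.
import numpy as np

def grnn_predict(x, centers, targets, sigma=0.5):
    """Predict y' for input x as the kernel-weighted average of stored targets.

    centers: (p, n) stored patterns x_i; targets: (p, m) known outputs y_i.
    """
    d2 = np.sum((centers - x) ** 2, axis=1)   # squared distance to each pattern node
    w = np.exp(-d2 / (2.0 * sigma ** 2))      # Gaussian kernel weight per node
    return w @ targets / np.sum(w)            # normalized weighted average of y_i
```

With a small spread the prediction is dominated by the nearest stored pattern; with a large spread it approaches the mean of all stored outputs, which is why the paper treats the spread as a critical parameter to optimize.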

Genetic algorithms (GA)

Genetic algorithms are powerful stochastic optimization (soft computing) techniques based on principles from evolutionary theory, and they have proven effective in multi-peak optimization problems. The algorithm starts with a set of solutions (represented by chromosomes) called a population. Solutions from one population are taken and used to form a new population in the hope that the new population will be better, or at least no worse, than the old one (Goldberg, 1989).

GRNN-GA (genetically optimized GRNN) for pattern recognition

In the algorithm, the initial population of individuals is first generated randomly from the dataset. The fitness, a measure of adaptation to the environment, is calculated for each individual. A “selection” operation that passes individuals to the next generation is then performed based on fitness, and “crossover” and “mutation” are applied to the selected individuals to generate a new population by transforming the parents’ chromosomes into the offspring’s. This procedure is repeated until a termination criterion is satisfied.
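The selection–crossover–mutation loop described above can be sketched as follows. This is illustrative only: tournament selection, one-point crossover, and Gaussian mutation are assumptions, and the fitness function is a stand-in for the GRNN recognition rate the paper optimizes over centers and spreads:

```python
# Minimal GA loop sketch: selection, crossover, mutation over real-valued
# chromosomes. The fitness callable stands in for a GRNN evaluation.
import random

def evolve(fitness, chrom_len, pop_size=20, generations=50,
           cx_rate=0.8, mut_rate=0.1, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(chrom_len)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # tournament selection of size 2: keep the fitter individual
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = pick()[:], pick()[:]
            if rng.random() < cx_rate:             # one-point crossover
                cut = rng.randrange(1, chrom_len) if chrom_len > 1 else 0
                p1 = p1[:cut] + p2[cut:]
            for i in range(chrom_len):             # Gaussian mutation per gene
                if rng.random() < mut_rate:
                    p1[i] += rng.gauss(0, 0.1)
            new_pop.append(p1)
        pop = new_pop
    return max(pop, key=fitness)                   # fittest of the final population
```

In the paper's setting the chromosome would encode the hidden-layer centers and widths, and fitness would be the recognition rate of the resulting GRNN.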

Optimization for object recognition

For the experiments we use images from the Columbia object image library (COIL-100), a database of color images of 100 objects. We selected the 10 objects from the dataset shown in Fig. 3. The objects were placed on a motorized turntable against a black background, and the turntable was rotated through 360° to vary the object pose with respect to a fixed color camera. Images of the objects were taken at pose intervals of 5°, which corresponds to 72 poses per object.
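As an illustration of the pose sampling, the following sketch splits the 72 pose angles per object into training and test views. The every-fourth-pose training interval is an assumption for illustration, not the split used in the paper:

```python
# Illustrative split of COIL-style pose angles: 72 poses per object at 5°
# intervals, training on every 4th pose (20° apart) and testing on the rest.
def pose_split(step_deg=5, train_every=4):
    poses = list(range(0, 360, step_deg))        # 72 pose angles per object
    train = poses[::train_every]                 # e.g. 18 training poses
    test = [p for p in poses if p not in train]  # remaining 54 held-out poses
    return train, test
```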

Conclusion

This paper investigates the performance of a GRNN optimized by a genetic algorithm. Different patterns were used to compare the success of the GRNN-GA and GRNN methods. The system was tested on 3D object recognition using 2D poses and on handwritten digit recognition. For object recognition, the application was carried out on 10 objects and a high recognition rate was obtained. The ability to recognize unknown objects after training with a small number of samples is the important property of this system.

References (14)

  • Barreto, A. M. S. (2002). Growing compact RBF networks using a genetic algorithm. In Seventh Brazilian Symposium on Neural Networks.
  • de Lacerda, E. G. M., et al. (2000). Evolutionary optimization of RBF networks. In Sixth Brazilian Symposium on Neural Networks.
  • Goldberg, D. E. (1989). Genetic algorithms in search, optimization and machine learning.
  • Hatanaka, T., et al. (2003). Multi-objective structure selection for radial basis function networks based on genetic algorithm. In The 2003 Congress on Evolutionary Computation CEC ’03.
  • Heimes, F., & van Heuveln, B. (1998). The normalized radial basis function neural network. In 1998 IEEE international...
  • Marco, N., et al. (1999). Multi objective optimization in CFD by genetic algorithms.
