
An approach for automating the design of convolutional neural networks

Citation: Dmitry Plotnikov et al 2018 IOP Conf. Ser.: Mater. Sci. Eng. 450 052004. DOI: 10.1088/1757-899X/450/5/052004

Published under licence by IOP Publishing Ltd

Abstract

Image recognition is now an established field of computer science. One of its main tasks is image classification, in which the objects under study are represented by images or video streams and must be assigned to their correct classes. Many effective approaches exist for this problem; one of the most popular is the artificial neural network, a method from the field of machine learning. Although neural networks cover a wide range of machine learning problems, they are also well suited to image classification. A more specialized neural-network approach to image classification applies the concept of deep learning. The best-known deep learning algorithm is the convolutional neural network (CNN), which reuses the same parts of the network, i.e. shared weights, to process different local regions of the input image. Like a standard neural network, a CNN must be fine-tuned for a particular problem, and because of its depth and complexity this tuning is usually difficult and computationally expensive. In this study, we propose an approach for creating ensembles of previously trained convolutional neural networks that increases image classification performance. The results of experiments on image classification problems are presented and discussed; they show that the proposed approach is able to outperform both a standard perceptron and a single convolutional neural network.
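The following is a minimal sketch of the idea described in the abstract: combining several previously trained CNNs into an ensemble for image classification. The abstract does not specify the combination rule or the member architectures, so the softmax-averaging scheme, the SmallCNN architecture, and the use of PyTorch below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of an ensemble of pretrained CNNs (assumptions: PyTorch,
# softmax-averaging of member outputs, toy SmallCNN architecture).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallCNN(nn.Module):
    """Illustrative CNN: two conv blocks followed by a linear classifier."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)  # for 28x28 inputs

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

def ensemble_predict(models, images: torch.Tensor) -> torch.Tensor:
    """Average the class probabilities of several trained CNNs and
    return the most probable class index for each image."""
    with torch.no_grad():
        probs = torch.stack([F.softmax(m(images), dim=1) for m in models])
    return probs.mean(dim=0).argmax(dim=1)

if __name__ == "__main__":
    # In practice each member would be trained separately or loaded from a
    # checkpoint; untrained networks stand in here to keep the sketch runnable.
    ensemble = [SmallCNN().eval() for _ in range(3)]
    batch = torch.randn(4, 1, 28, 28)          # four fake 28x28 grayscale images
    print(ensemble_predict(ensemble, batch))   # tensor of 4 predicted class ids
```

Averaging probabilities is only one possible fusion rule; majority voting or weighted combinations of the member networks would fit the same ensemble structure.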


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
