Neurocomputing

Volume 138, 22 August 2014, Pages 3–13

A brain-inspired spiking neural network model with temporal encoding and learning

https://doi.org/10.1016/j.neucom.2013.06.052

Abstract

Neural coding and learning are important components of a cognitive memory system: they process sensory inputs and distinguish different patterns to allow for higher-level brain functions such as memory storage and retrieval. Benefiting from its biological relevance, this paper presents a spiking neural network of leaky integrate-and-fire (LIF) neurons for pattern recognition. A biologically plausible supervised synaptic learning rule is used so that neurons can efficiently make a decision. The whole system contains encoding, learning and readout parts. Utilizing temporal coding and learning, networks of spiking neurons can effectively and efficiently perform various classification tasks. The network can classify complex patterns of activities stored in a vector, as well as real-world stimuli. Our approach is also benchmarked on the nonlinearly separable Iris dataset. The proposed approach achieves good generalization, with a classification accuracy of 99.63% for training and 92.55% for testing. In addition, the trained networks demonstrate that temporal coding is a viable means for fast neural information processing.

Introduction

The great computational power of biological systems has drawn increasing attention from researchers. Although the detailed information processing involved in memory is still unclear, observed biological processes have inspired many computational models operating at power efficiencies close to those of biological systems. Pattern recognition is the ability to identify objects in the environment, and several conventional methods are used to implement it, such as the maximum entropy classifier, the naive Bayes classifier, decision trees, and support vector machines. Since pattern recognition is a necessary first step in all cognitive processes, including memory, it is worthwhile to approach it with brain-inspired models, which could potentially provide great computational power.

To approach biological neural networks, artificial neural networks (ANNs) were developed as simplified approximations in terms of structure and function. Since the early McCulloch–Pitts neuron of the 1940s and the perceptron of the 1950s [1], referred to as the first-generation neuron models, ANNs have been evolving towards more neurally realistic models. Different from the first-generation neurons, in which a step-function threshold is used, the second-generation neurons use continuous activation functions (such as a sigmoid or radial basis function) for output determination [2]. The first two generations are referred to as traditional neuron models. Studies on biological systems reveal that neurons communicate with each other through action potentials (pulses or spikes). As the third-generation neuron model, spiking neurons raise the level of biological realism by utilizing spikes. Spiking neurons dealing with precisely timed spikes improve on the traditional neural models in both accuracy and computational power [3]. There are several kinds of spiking neuron models, such as the integrate-and-fire (IF) model [4], the resonate-and-fire model [5], the Hodgkin–Huxley model [6], and the Izhikevich model [7]. Since the IF model is simple and computationally effective [8], it is the most widely used spiking neuron model [9], [10], [11], [12], [13], [14], [15], despite the existence of more biologically realistic models.
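For concreteness, the following minimal sketch (in Python, with hypothetical parameter values; not the paper's implementation) simulates a leaky integrate-and-fire neuron that integrates weighted input spikes, decays towards rest, and fires and resets when the membrane potential crosses a threshold.

import numpy as np

def simulate_lif(spike_times, weights, t_max=100.0, dt=0.1,
                 tau_m=10.0, v_thresh=1.0, v_rest=0.0):
    """Leaky integrate-and-fire neuron driven by input spike trains (ms)."""
    v = v_rest
    out_spikes = []
    for step in range(int(t_max / dt)):
        t = step * dt
        v += dt * (v_rest - v) / tau_m           # leak towards rest
        for times, w in zip(spike_times, weights):
            # delta-shaped current for each presynaptic spike in this bin
            v += w * np.sum((times >= t) & (times < t + dt))
        if v >= v_thresh:                        # threshold crossing
            out_spikes.append(t)
            v = v_rest                           # reset after the spike
    return out_spikes

# Three afferents firing at fixed latencies (ms); hypothetical weights.
inputs = [np.array([10.0, 40.0]), np.array([12.0]), np.array([15.0, 60.0])]
print(simulate_lif(inputs, weights=[0.5, 0.4, 0.6]))  # one spike near 15 ms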

Encoding is the first step in creating a memory; it considers how information is represented in the brain. Although the details remain unclear, there are strong reasons to believe that encoding information in pulses is optimal for transmission [16]. The inputs to a spiking neuron are discrete spike times. Rate coding and temporal coding are two basic and widely studied schemes for encoding information in these spikes. In rate coding the average firing rate within a time window is considered, while in temporal coding the precise timings of spikes are considered [17]. Neurons in the retina [18], [19], the lateral geniculate nucleus (LGN) [20] and the visual cortex [21], as well as in many other sensory systems, are observed to respond to stimuli precisely on a millisecond timescale [22]. Temporal patterns can carry more information than rate-based patterns [23], [24], [25]. A simple example of temporal encoding is spike latency coding. The capability of encoding information in the timing of single spikes to compute and learn on realistic data is demonstrated in [26]. Since this coding utilizes only single spikes to transfer information, it could potentially be beneficial for efficient pulse-stream very large scale integration (VLSI) implementations.
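As a toy illustration of spike latency coding (a sketch, not necessarily the paper's encoder), a normalized stimulus intensity can be mapped linearly to a single spike time, with stronger stimuli firing earlier:

import numpy as np

def latency_encode(values, t_window=100.0):
    """Map intensities in [0, 1] to single-spike latencies: stronger
    inputs fire earlier, weaker inputs fire later."""
    values = np.clip(np.asarray(values, dtype=float), 0.0, 1.0)
    return t_window * (1.0 - values)

print(latency_encode([0.9, 0.5, 0.1]))   # -> [10. 50. 90.] (ms)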

Many algorithms for spiking neural networks (SNNs) have been proposed. Based on arithmetic calculations, SpikeProp [9], [26] was proposed for training SNNs, similar in concept to the backpropagation (BP) algorithm developed for traditional neural networks [27]. Others use bio-inspired algorithms, such as spike-timing-dependent plasticity (STDP) [28], [29], [30], [31], spike-driven synaptic plasticity [13], and the tempotron rule [14]. Although arithmetic calculations can easily reveal why and how networks can be trained, arithmetic-based rules are not a good choice for building networks with biologically plausible performance. STDP has been found to learn distinct patterns in an unsupervised way [12]; it characterizes synaptic changes solely in terms of the temporal contiguity of presynaptic spikes and postsynaptic potentials or spikes. Spike-driven synaptic plasticity [13] uses rate coding; its learning process is supervised and stochastic, with a teacher signal steering the output neuron towards a desired firing rate. Unlike spike-driven synaptic plasticity, the tempotron rule [14] efficiently learns spiking patterns in which information is embedded in precisely timed spikes.
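To make the tempotron's error-driven update concrete, here is a minimal sketch following the form of the rule in [14]; the kernel constants, learning rate and pattern sizes are hypothetical choices, not those of the paper. On a misclassified pattern, each weight moves in proportion to its PSP kernel evaluated at the time of the maximal membrane potential.

import numpy as np

TAU, TAU_S = 15.0, 3.75   # membrane and synaptic time constants (ms)
V0 = 2.12                 # normalizes the kernel peak to ~1 for TAU/TAU_S = 4

def kernel(s):
    """Double-exponential PSP kernel; zero for s < 0."""
    s = np.asarray(s, dtype=float)
    k = V0 * (np.exp(-s / TAU) - np.exp(-s / TAU_S))
    return np.where(s >= 0.0, k, 0.0)

def tempotron_update(weights, latencies, label, lr=0.05, theta=1.0,
                     t_grid=np.arange(0.0, 100.0, 0.5)):
    """One tempotron step on one pattern (one spike per afferent)."""
    # Membrane potential on a time grid; the tempotron applies no reset here.
    psps = kernel(t_grid[None, :] - latencies[:, None])   # (afferents, T)
    v = weights @ psps
    i_max = int(np.argmax(v))
    fired = v[i_max] >= theta
    if fired != bool(label):                    # error: adjust the weights
        sign = 1.0 if label else -1.0
        weights = weights + sign * lr * kernel(t_grid[i_max] - latencies)
    return weights, fired

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.05, size=10)              # hypothetical initial weights
pattern = rng.uniform(0.0, 80.0, size=10)       # one spike latency per afferent
for _ in range(100):                            # train the neuron to fire on it
    w, fired = tempotron_update(w, pattern, label=1)
print("fires on target pattern:", fired)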

Although SNNs show promise in approaching the performance of living brains due to their more faithful similarity to biological neural networks, a big challenge in dealing with SNNs is reading data into and out of them, which requires proper encoding and decoding methods [32]. Some existing SNNs for pattern recognition (as in [13], [33]) are based on rate coding. Different from these SNNs, we focus on temporal coding, which could potentially carry the same information using fewer spikes than rate coding. This could greatly increase computing speed.
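A back-of-the-envelope comparison illustrates the efficiency argument; the numbers below are illustrative only. To signal one analog value, a rate code needs enough spikes in a window to estimate a firing rate, while a latency code needs a single spike:

value = 0.7                          # normalized stimulus intensity in [0, 1]
# Rate code: read the value out as a firing rate over a 1 s window (100 Hz max).
rate_spikes = int(value * 100)       # ~70 spikes to convey one value
# Latency code: the same value carried by the timing of a single spike.
latency_ms = 100.0 * (1.0 - value)   # one spike, at 30 ms
print(rate_spikes, "spikes vs. 1 spike at", latency_ms, "ms")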

In this paper, we build a bio-inspired SNN model containing encoding, learning and readout parts. Neural coding and learning are the main considerations in this paper, since they are important components of a cognitive memory system: they process sensory inputs and distinguish different patterns to allow for higher-level brain functions such as memory storage and retrieval [34]. Inspired by the local receptive fields of biological neurons, each encoding neuron integrates information from its receptive field and represents the encoded information through the precise timing of spikes. The timescale of the spikes is on the millisecond level, which is consistent with biological experimental observations. The readout part uses a simple binary representation, as proposed in this paper, to denote the fired or non-fired state of each output neuron. Through the encoding and readout, SNNs can be applied effectively to real data.
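The binary readout can be pictured as follows (a sketch; the mapping from code words to classes is hypothetical): each output neuron contributes one bit indicating whether it fired within the decision window.

def readout(fired):
    """Concatenate fired (1) / non-fired (0) output states into a code word."""
    return ''.join('1' if f else '0' for f in fired)

# With three output neurons, e.g. '100', '010', '001' could each label a class.
print(readout([True, False, False]))   # -> '100'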

The main contribution of this paper lies in the approach to designing SNNs for pattern recognition. Pattern recognition helps to identify and sort information for further processing in brain systems. An incoming pattern is recognized by attending to its similarity to previously learned patterns, which are acquired through weight modification. Recognition memory is thus formed and stored in synaptic strengths. Inspired by biology, spiking neurons are employed for computation in this paper. This paper extends our preliminary work [35] by adding more comparative and analytic studies. The system contains encoding, learning and readout parts. We demonstrate that, utilizing temporal coding and learning, networks of spiking neurons can effectively and efficiently perform various classification tasks. In addition, the results demonstrate that temporal coding is a viable means for fast neural information processing and learning on real-world data.

The rest of this paper is organized as follows. Section 2 presents the architecture of the spiking neural network. Section 3 describes the temporal learning rule used in our approach; the relationship between this rule and the well-studied STDP is also introduced. Section 4 shows the ability of the network to learn different patterns of neural activities (discrete-valued vectors). Section 5 shows the SNN learning continuous input variables, where we use the well-known Iris dataset to benchmark our approach against several existing methods. In Section 6, we demonstrate the ability of our spiking network to learn real-world stimuli (images). Finally, we present discussions in Section 7, followed by conclusions in the last section.

Section snippets

The spiking neural network

In this section, we describe the whole system architecture of spiking neurons for obtaining recognition memory. The system comprises three functional parts: the encoding part, the learning part and the readout part (see Fig. 1). A stimulus consists of several components, which are partially connected to encoding neurons to generate encoded spiking information. The encoding neurons are fully connected to the learning neurons.

Each part plays a different functional role in the system: the encoding…
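The connectivity just described (partial from stimulus components to encoding neurons, full from encoding neurons to learning neurons) can be sketched as follows; the layer sizes and receptive-field size are hypothetical:

import numpy as np

rng = np.random.default_rng(0)
n_components, n_encoding, n_learning = 64, 16, 3   # hypothetical sizes

# Partial connectivity: each encoding neuron samples a random subset of
# the stimulus components, i.e. its receptive field.
receptive_fields = [rng.choice(n_components, size=8, replace=False)
                    for _ in range(n_encoding)]

# Full connectivity: every encoding neuron projects to every learning neuron.
w_learn = rng.normal(0.0, 0.1, size=(n_learning, n_encoding))
print(len(receptive_fields), w_learn.shape)        # 16 fields, (3, 16) weights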

Temporal learning rule

Temporal learning rules aim to deal with information encoded in precise spike timing. One of the most commonly studied rules is spike-timing-dependent plasticity (STDP), which has emerged in recent years as the experimentally most studied form of synaptic plasticity (see [28], [29], [30], [31], [36] for reviews). According to STDP, plasticity depends on the intervals between pre- and postsynaptic spikes. The basic mechanisms of plasticity found in STDP are long-term potentiation (LTP) and…
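The dependence of LTP and LTD on the pre–post spike interval is commonly modeled with an exponential learning window; the sketch below uses illustrative amplitudes and time constants, not values from the paper.

import numpy as np

def stdp_window(dt, a_plus=0.1, a_minus=0.12, tau_plus=20.0, tau_minus=20.0):
    """Weight change for dt = t_post - t_pre (ms): LTP for pre-before-post
    (dt > 0), LTD for post-before-pre (dt < 0)."""
    dt = np.asarray(dt, dtype=float)
    return np.where(dt >= 0.0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))

print(stdp_window([-10.0, 10.0]))   # LTD (negative) then LTP (positive)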

Learning patterns of neural activities

Many ways of encoding memory patterns in neural networks have been studied. The memory patterns encoded in synaptic weights can be binary vectors; they can also be drawn from a distribution with several discrete activity values or from a continuous distribution [34]. In the Hopfield network [46], memory patterns are expressed through the activities of neurons, where the states of the neurons take binary values (+1 for an active neuron and −1 for an inactive neuron). In…
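For illustration, such memory patterns can be generated as follows; the pattern counts, dimensions and activity levels are arbitrary choices:

import numpy as np

rng = np.random.default_rng(1)

# Binary patterns as in a Hopfield network: +1 active, -1 inactive.
binary_patterns = rng.choice([-1, 1], size=(5, 100))

# Patterns drawn from a few discrete activity levels, or a continuous range.
graded_patterns = rng.choice([0.0, 0.25, 0.5, 0.75, 1.0], size=(5, 100))
continuous_patterns = rng.uniform(0.0, 1.0, size=(5, 100))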

Learning patterns of continuous input variables

In this section, we conduct experiments with our spiking neural network on classifying patterns with continuous variables. We use the Iris dataset to benchmark our approach against several existing methods.
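Before learning, each continuous Iris feature must be converted into spike times. A common scheme in the SNN literature is population encoding with overlapping Gaussian receptive fields; the sketch below illustrates the idea, though it is not necessarily the exact encoder used in this paper, and the field count and value range are illustrative.

import numpy as np

def population_encode(x, n_fields=8, x_min=0.0, x_max=8.0, t_window=100.0):
    """Encode one continuous variable into n_fields spike latencies via
    overlapping Gaussian receptive fields."""
    centers = np.linspace(x_min, x_max, n_fields)
    sigma = (x_max - x_min) / (1.5 * (n_fields - 1))
    act = np.exp(-0.5 * ((x - centers) / sigma) ** 2)   # activations in (0, 1]
    return t_window * (1.0 - act)     # strong activation -> early spike

print(np.round(population_encode(5.1), 1))   # 8 latencies for one feature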

Learning real-world stimuli

The previous sections show that the SNN has the ability to learn different patterns of activities and continuous input variables, and that it can separate different vector patterns successfully. To move a step forward, we apply the tempotron to learn real-world stimuli (images).

In this section, we show the system architecture to perform visual pattern recognition. The image is composed of n pixels. The pixels are partially connected to encoding neurons to generate spiking information.

We…
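A minimal sketch of this image encoding (the receptive-field layout and all sizes are hypothetical): each encoding neuron pools the pixels in its receptive field and emits a single spike whose latency decreases with the pooled intensity.

import numpy as np

def encode_image(img, fields, t_window=100.0):
    """Pool each receptive field and map mean pixel intensity to one spike
    latency (bright field -> early spike)."""
    flat = np.asarray(img, dtype=float).ravel() / 255.0
    return np.array([t_window * (1.0 - flat[f].mean()) for f in fields])

rng = np.random.default_rng(2)
img = rng.integers(0, 256, size=(16, 16))               # toy grayscale image
fields = [rng.choice(img.size, size=16, replace=False)  # hypothetical
          for _ in range(8)]                            # receptive fields
print(np.round(encode_image(img, fields), 1))           # 8 spike latencies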

Discussion

In this section, we discuss several considerations regarding biological relevance that benefit the processing in our approach, and present future directions.

Conclusion

This paper presents an architecture of spiking neurons to approach pattern recognition in various classification tasks, such as recognition of neural activities, continuous input variables and real-world stimuli. The recognition memory is formed through weight modification during the learning process. A new pattern can be recognized by matching it against what the network has learnt. Since the temporal encoding and learning in this paper are believed to be inherited from biological neural systems,…

Acknowledgments

This work was supported by Agency for Science, Technology, and Research (A*STAR), Singapore under SERC Grant 092 157 0130.

References (57)

  • H. Adeli et al.

    Machine learning – neural networks, genetic algorithms, and fuzzy sets

    (1995)
  • W. Maass

    Lower bounds for the computational power of networks of spiking neurons

    Neural Comput.

    (1996)
  • S. Ghosh-Dastidar et al.

Improved spiking neural networks for EEG classification and epilepsy and seizure detection

    Integr. Comput.-Aided Eng.

    (2007)
  • W. Gerstner et al.

Spiking Neuron Models: Single Neurons, Populations, Plasticity

    (2002)
  • A. Hodgkin et al.

    A quantitative description of membrane current and its application to conduction and excitation in nerve

    J. Physiol.

    (1952)
  • E.M. Izhikevich

    Simple model of spiking neurons

IEEE Trans. Neural Netw.

    (2003)
  • E.M. Izhikevich

    Which model to use for cortical spiking neurons?

    IEEE Trans. Neural Netw.

    (2004)
  • J.J. Wade et al.

    SWAT: a spiking neural network training algorithm for classification problems

    IEEE Trans. Neural Netw.

    (2010)
  • F. Ponulak et al.

Supervised learning in spiking neural networks with ReSuMe: sequence learning, classification, and spike shifting

    Neural Comput.

    (2010)
  • T. Masquelier et al.

Competitive STDP-based spike pattern learning

    Neural Comput.

    (2009)
  • J.M. Brader et al.

    Learning real-world stimuli in a neural network with spike-driven synaptic dynamics

    Neural Comput.

    (2007)
  • R. Gütig et al.

The tempotron: a neuron that learns spike timing-based decisions

    Nat. Neurosci.

    (2006)
  • X. Pan et al.

    The spatiotemporal learning rule and its efficiency in separating spatiotemporal patterns

    Biol. Cybern.

    (2005)
  • V.J. Uzzell et al.

    Precision of spike trains in primate retinal ganglion cells

    J. Neurophysiol.

    (2004)
  • T. Gollisch et al.

    Rapid neural coding in the retina with relative spike latencies

    Science

    (2008)
  • P. Reinagel et al.

    Temporal coding of visual information in the thalamus

    J. Neurosci.

    (2000)
  • W. Bair et al.

    Temporal precision of spike trains in extrastriate cortex of the behaving macaque monkey

    Neural Comput.

    (1996)
  • D.A. Butts et al.

    Temporal precision in the neural code and the timescales of natural vision

    Nature

    (2007)

    Qiang Yu received his B.Eng. degree in Electrical Engineering and Automation from Harbin Institute of Technology, Harbin, China, in 2010. He is a Ph.D. student in Department of Electrical and Computer Engineering, National University of Singapore. His research interests include learning algorithms in spiking neural networks, neural encoding and cognitive computation. He is currently working on neural circuit theories for cognitive computing.

Huajin Tang received the B.Eng. degree from Zhejiang University, Hangzhou, China, the M.Eng. degree from Shanghai Jiao Tong University, Shanghai, China, and the Ph.D. degree in Electrical and Computer Engineering from the National University of Singapore, Singapore, in 1998, 2001, and 2005, respectively. He was a System Engineer with STMicroelectronics, Singapore, from 2004 to 2006. From 2006 to 2008, he was a Postdoctoral Fellow with the Queensland Brain Institute, University of Queensland, Australia. He is currently a Research Scientist leading the cognitive computing group at the Institute for Infocomm Research, Singapore. He has published one monograph (Springer-Verlag, 2007) and over 20 international journal papers. His current research interests include neural computation, machine learning, neuromorphic cognitive systems, and neuro-cognitive robots. He serves as an Associate Editor of IEEE Transactions on Neural Networks and Learning Systems.

Kay Chen Tan is an Associate Professor in the Department of Electrical and Computer Engineering at the National University of Singapore. He received the B.Eng. degree with First Class Honors in Electronics and Electrical Engineering, and the Ph.D. degree, from the University of Glasgow, Scotland, in 1994 and 1997, respectively. He is actively pursuing research in computational and artificial intelligence, with applications to multi-objective optimization, scheduling, automation, data mining, and games. He has published over 100 journal papers and over 100 papers in conference proceedings, and has co-authored 5 books.

    He is currently the Editor-in-Chief of IEEE Computational Intelligence Magazine (CIM). He also serves as an Associate Editor/Editorial Board member of over 15 international journals, such as IEEE Transactions on Evolutionary Computation, IEEE Transactions on Systems, Man and Cybernetics: Part B Cybernetics, IEEE Transactions on Computational Intelligence and AI in Games, Evolutionary Computation (MIT Press), European Journal of Operational Research, Journal of Scheduling, and International Journal of Systems Science. Dr Tan is the awardee of the 2012 IEEE Computational Intelligence Society (CIS) Outstanding Early Career Award for his contributions to evolutionary computation in multi-objective optimization.

    Haoyong Yu received the B.S. and M.S. degrees in Mechanical Engineering from Shanghai Jiao Tong University, Shanghai, China, in 1988 and 1991, respectively. He received the Ph.D. degree in Mechanical Engineering from Massachusetts Institute of Technology, Cambridge, Massachusetts, US, in 2002. He is currently an Assistant Professor with the Department of Bioengineering and Principal Investigator of the Singapore Institute of Neurotechnology (SiNAPSE) at National University of Singapore. His areas of research include medical robotics, rehabilitation engineering and assistive technologies, system dynamics and control. He is a member of IEEE.
