
About this Book

Handbook of Neuroevolution Through Erlang presents both the theory behind, and the methodology of, developing a neuroevolutionary-based computational intelligence system using Erlang. With a foreword written by Joe Armstrong, this handbook offers an extensive tutorial for creating a state-of-the-art Topology and Weight Evolving Artificial Neural Network (TWEANN) platform. In a step-by-step format, the reader is guided from a single simulated neuron to a complete system, and by following these steps will be able to use this novel technology to build a TWEANN system that can be applied to Artificial Life simulation and Forex trading. Erlang’s architecture perfectly matches that of evolutionary and neurocomputational systems: it is a concurrent, message-passing language that allows developers to make full use of multi-core and multi-CPU systems. Handbook of Neuroevolution Through Erlang explains how to leverage Erlang’s features in the field of machine learning, and covers the system’s real-world applications, ranging from algorithmic financial trading to artificial life and robotics.

Table of Contents

Frontmatter

Chapter 1. Introduction: Applications & Motivations

Abstract
This chapter discusses the numerous reasons why one might wish to study neuroevolution. I cover a number of applications of such a system, giving examples and scenarios of a neuroevolutionary system being applied within a variety of fields. A discussion then follows on where all of this research is heading, and what the next step within this field might be. Finally, a whirlwind introduction to the book is given, with a short summary of what is covered in each chapter.
Gene I. Sher

Foundations

Frontmatter

Chapter 2. Introduction to Neural Networks

Abstract
In this chapter we discuss how biological neurons process information, and the difference between the spatiotemporal processing of frequency-encoded information conducted by a biological neuron and the amplitude- and frequency-encoded signals processed by artificial neural networks. We discuss the various types of artificial neural networks that exist, their architectures and topologies, and how to give such neural networks plasticity, which allows the neurons to adapt and change as they process presynaptic signals.
Gene I. Sher

Chapter 3. Introduction to Evolutionary Computation

Abstract
In this chapter we discuss biological evolution, and the way it has evolved the organisms and structures that we see around us today. We then extract the essentials of this natural stochastic search method, and discuss how one could implement the same, or an even more efficient version, in software. Once the standard evolutionary algorithm methods are introduced (genetic algorithms, genetic programming, evolutionary strategies, and evolutionary programming), we also discuss the slightly lesser-known memetic algorithm (hybrid algorithm) approaches, and how they compare to the already discussed methods. Finally, we discuss the equivalency between all of these methods, and the fact that they are just different sides of the same coin.
Gene I. Sher

Chapter 4. Introduction to Neuroevolutionary Methods

Abstract
Neuroevolution is the machine learning approach that combines neural networks and evolutionary computation. Before a neural network can do something useful, before it can learn or be applied to some problem, its topology, the synaptic weights, and the other parameters of every neuron in the network must be set to just the right values to produce the final functional system. Both the topology and the synaptic weights can be set through the evolutionary process. In this chapter we discuss what neuroevolution is, what Topology and Weight Evolving Artificial Neural Network (TWEANN) systems are, and how they function. We also discuss how this highly advanced approach to computational intelligence can be implemented, and some of the problems to which the evolved neural network based agents can be applied.
Gene I. Sher

Chapter 5. The Unintentional Neural Network Programming Language

Abstract
The programming language Erlang has a perfect 1:1 mapping to the problem domain of developing neural network based computational intelligence systems. Erlang was created for developing distributed, process-based, message-passing, robust, fault-tolerant, concurrent systems. These are exactly the features that a programming language created specifically for developing neural network based systems would have. In this chapter I make the case for why Erlang is such a fitting choice for the development of distributed computational intelligence systems, and how the features of this programming language map to the features needed by a neural network programming language. I briefly discuss my reasons for considering Erlang to be, though unintentionally so, the quintessential neural network programming language.
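The process-per-neuron mapping the chapter describes can be sketched in a few lines: each neuron is an Erlang process that accumulates incoming signals and forwards its output as a message. This is an illustrative sketch, not the book's implementation; the module and message shapes here are assumptions, and signals are assumed to arrive in weight order (a real system would tag them with the sender's pid).

```erlang
-module(neuron_proc).
-export([start/2, loop/3]).

%% One Erlang process per neuron: the process waits for signals from
%% its presynaptic processes, and once it has one signal per weight it
%% computes tanh(dot(Inputs, Weights)) and forwards the result to its
%% postsynaptic processes.
start(Weights, OutputPids) ->
    spawn(?MODULE, loop, [Weights, OutputPids, []]).

loop(Weights, OutputPids, Acc) ->
    receive
        {signal, Val} when length(Acc) + 1 =:= length(Weights) ->
            Inputs = lists:reverse([Val | Acc]),
            Dot = lists:sum([I * W || {I, W} <- lists:zip(Inputs, Weights)]),
            Out = math:tanh(Dot),
            [Pid ! {signal, Out} || Pid <- OutputPids],
            loop(Weights, OutputPids, []);
        {signal, Val} ->
            loop(Weights, OutputPids, [Val | Acc])
    end.
```

Because each neuron is an independent process, a network of thousands of neurons is scheduled across all available cores by the Erlang VM with no extra effort from the developer.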
Gene I. Sher

Neuroevolution: Taking the First Step

Frontmatter

Chapter 6. Developing a Feed Forward Neural Network

Abstract
In this chapter we discuss how a single artificial neuron processes signals, and how to simulate it. We then develop a single artificial neuron and test its functionality. Having discussed and developed a single neuron, we decide on the NN architecture we will implement, and then develop a genotype constructor and a mapper from genotype to phenotype. Finally, we ensure that our simple NN system works by attaching a simple sensor and actuator to the NN and testing its sense-think-act ability.
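The signal processing performed by the single neuron can be sketched as a pure function: a dot product of the input vector with the synaptic weight vector, plus a bias, passed through the hyperbolic tangent. The module and function names are illustrative, not the book's.

```erlang
-module(simple_neuron).
-export([output/3]).

%% output/3 computes the neuron's output signal:
%% tanh(dot(Inputs, Weights) + Bias).
output(Inputs, Weights, Bias) ->
    Dot = lists:sum([I * W || {I, W} <- lists:zip(Inputs, Weights)]),
    math:tanh(Dot + Bias).
```

For example, `simple_neuron:output([1.0, -1.0], [0.5, 0.5], 0.0)` yields a zero dot product, and thus an output of 0.0.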
Gene I. Sher

Chapter 7. Adding the “Stochastic Hill-Climber” Learning Algorithm

Abstract
In this chapter we discuss the functionality of an optimization method called the Stochastic Hill Climber, and its variant, the Stochastic Hill Climber With Random Restarts. We then implement this optimization algorithm, allowing the exoself process to train and optimize the neural network it oversees. Afterwards, we implement a new problem interfacing method through the use of public and private scapes, which are simulated environments, not necessarily physical. We apply the new system to the XOR emulation problem, testing its performance on it. Finally, looking ahead to the need to test and benchmark our neuroevolutionary system as we add new features to it, we create the benchmarker process, which summons the trainer and the NN it trains, applying it to some specified problem X number of times. Once the benchmarker has accumulated the resulting statistics, it calculates the averages and the associated standard deviations for the important performance parameters of the benchmark.
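The core of the stochastic hill climber can be sketched as follows: perturb the weight vector, keep the perturbed version only if it scores higher, and repeat. This is a minimal sketch under assumed names; the book's exoself-driven implementation is richer (per-weight perturbation probability, random restarts).

```erlang
-module(shc).
-export([optimize/4]).

%% Perturb every weight by a random offset, keep the perturbed vector
%% only if the caller-supplied Fitness fun scores it higher, and repeat
%% for N attempts. Rand is a zero-arity fun returning a float in [0,1),
%% passed in so the search can be made deterministic when needed.
optimize(Weights, _Fitness, 0, _Rand) ->
    Weights;
optimize(Weights, Fitness, N, Rand) ->
    Perturbed = [W + (Rand() - 0.5) || W <- Weights],
    case Fitness(Perturbed) > Fitness(Weights) of
        true  -> optimize(Perturbed, Fitness, N - 1, Rand);
        false -> optimize(Weights, Fitness, N - 1, Rand)
    end.
```

Since a worse vector is never accepted, the fitness of the result is guaranteed to be at least that of the starting weights; random restarts are what let the method escape local optima.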
Gene I. Sher

Chapter 8. Developing a Simple Neuroevolutionary Platform

Abstract
In this chapter, we take our first step towards neuroevolution. Having developed a NN system capable of having its synaptic weights optimized, we combine it with an evolutionary algorithm. We create the population_monitor, a process that spawns a population of NN systems, monitors their performance, applies a selection algorithm to the NNs in the population, and generates mutant offspring from the fit NNs while removing the unfit ones. In this chapter we also add topological mutation operators to our neuroevolutionary system, which allow the population_monitor to evolve the NNs by adding new neural elements to their topologies. By the end of this chapter, our system becomes a fully fledged Topology and Weight Evolving Artificial Neural Network system.
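The generational loop at the heart of such a monitor can be sketched as: score every genotype, keep the fitter half, and refill the population with mutated copies of the survivors. This toy version assumes caller-supplied Fitness and Mutate funs; the real population_monitor uses far richer selection and topological mutation.

```erlang
-module(pop_sketch).
-export([evolve/4]).

%% Score every genotype, keep the fitter half, refill the population
%% with mutated copies of the survivors, and recurse for the given
%% number of generations.
evolve(Population, _Fitness, _Mutate, 0) ->
    Population;
evolve(Population, Fitness, Mutate, Generations) ->
    Scored = lists:reverse(lists:keysort(1, [{Fitness(G), G} || G <- Population])),
    Survivors = [G || {_F, G} <- lists:sublist(Scored, length(Population) div 2)],
    Offspring = [Mutate(G) || G <- Survivors],
    evolve(Survivors ++ Offspring, Fitness, Mutate, Generations - 1).
```

Because the best genotype always survives into the next generation, the population's best fitness can never decrease, a simple form of elitism.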
Gene I. Sher

Chapter 9. Testing the Neuroevolutionary System

Abstract
In this chapter we test the newly created basic neuroevolutionary system, first by testing each of its mutation operators, and then by applying the whole system to the XOR mimicking problem. Though the XOR problem test runs to completion and without errors, a more detailed, manual analysis of the evolved topologies and genotypes of the fit agents reveals a number of bugs. The origins of the bugs are then analyzed, and the errors are fixed. Afterwards, the updated neuroevolutionary system is successfully re-tested.
Gene I. Sher

A Case Study

Frontmatter

Chapter 10. DXNN: A Case Study

Abstract
This chapter presents a case study of a memetic algorithm based TWEANN system that I developed in Erlang, called DXNN. Here we discuss how DXNN functions, how it is implemented, and the various implementation choices I made while building it, and why. We also discuss its various features, which we will eventually need to add to the system we’re building together. Our system has a much cleaner and more decoupled implementation, and by the time we’ve reached the last chapter it will supersede DXNN in every way.
Gene I. Sher

Advanced Neuroevolution: Creating the Cutting Edge

Frontmatter

Chapter 11. Decoupling & Modularizing Our Neuroevolutionary Platform

Abstract
In this chapter we modify the implementation of our TWEANN system, decoupling all its parts from one another. The plasticity functions, the activation functions, the evolutionary loops, and the mutation operators become independent, each called and referenced through its own module and function name. This allows our system to be crowd-sourced, letting anyone modify and add new activation functions, mutation operators, and other features without having to modify or augment any other part of the TWEANN. This effectively makes our system more scalable, and easier to augment, advance, and improve in the future.
Gene I. Sher

Chapter 12. Keeping Track of Important Population and Evolutionary Stats

Abstract
To keep track of the performance of a neuroevolutionary system, it is essential for that system to be able to accumulate various statistics with regard to its fitness, population dynamics, and other changing features throughout the evolutionary run. In this chapter we add to the population_monitor of our TWEANN system the ability to compose a trace: a list of tuples, where each tuple is calculated every 500 (by default) evaluations and contains the various statistics about the population achieved during those evaluations, tracing the population’s path through its evolutionary history.
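The kind of per-interval statistics such a trace tuple might hold, the average and standard deviation of the fitness scores seen in those 500 evaluations, can be computed as below. The module name is illustrative, and this uses the population standard deviation.

```erlang
-module(trace_stats).
-export([avg/1, std/1]).

%% Average of a list of fitness scores.
avg(L) -> lists:sum(L) / length(L).

%% Population standard deviation of a list of fitness scores.
std(L) ->
    Avg = avg(L),
    math:sqrt(lists:sum([(X - Avg) * (X - Avg) || X <- L]) / length(L)).
```

A trace entry could then be a tuple such as `{EvalIndex, avg(Fs), std(Fs), lists:max(Fs)}`, appended to the trace every 500 evaluations.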
Gene I. Sher

Chapter 13. The Benchmarker

Abstract
In this chapter we add the benchmarker process, which can sequentially spawn population_monitors and apply them to some specified problem or simulation. We also extend the database to include the experiment record, which the benchmarker uses to deposit the traces of the population’s evolutionary statistics, and to recover from crashes and continue with the specified experiment. The benchmarker can compose experiments by performing multiple evolutionary runs, and then produce statistical data and GNUplot-ready files of the various evolutionary dynamics and averages calculated within the experiment.
Gene I. Sher

Chapter 14. Creating the Two Slightly More Complex Benchmarks

Abstract
To test the performance of a neuroevolutionary system after adding a new feature, or in general when trying to assess its abilities, it is important to have standardized benchmarking problems. In this chapter we create two such benchmarks: the pole balancing benchmark (single and double pole, with and without damping), and the T-Maze navigation benchmark, one of the problems used to assess the performance of recurrent and plasticity-enabled neural network based systems.
Gene I. Sher

Chapter 15. Neural Plasticity

Abstract
In this chapter we add plasticity to our direct-encoded NN system. We implement numerous plasticity encoding approaches, and develop numerous plasticity learning rules, among which are variations of the Hebbian learning rule, Oja’s rule, and neuromodulation. Once plasticity has been added, we again test our TWEANN system on the T-Maze navigation benchmark.
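The two classic rules named above can be sketched as single weight updates; the module and function names are illustrative. Plain Hebbian learning strengthens a weight in proportion to correlated pre- and postsynaptic activity, which lets weights grow without bound; Oja's rule adds a decay term that keeps the weight vector bounded.

```erlang
-module(plasticity_rules).
-export([hebbian/4, oja/4]).

%% H is the learning rate, I the presynaptic signal,
%% O the postsynaptic output, W the current synaptic weight.

%% Plain Hebbian rule: dW = H * I * O.
hebbian(H, I, O, W) -> W + H * I * O.

%% Oja's rule: dW = H * O * (I - O * W); the -O*W term normalizes.
oja(H, I, O, W) -> W + H * O * (I - O * W).
```

In a plastic NN, a function like one of these is applied to every synaptic weight after each signal is processed, so the network keeps changing during its lifetime, not only between generations.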
Gene I. Sher

Chapter 16. Substrate Encoding

Abstract
In this chapter we augment our TWEANN to also evolve indirectly encoded NN based systems. We discuss, architect, and implement substrate encoding, which allows the evolved NN based systems to become sensitive to geometrical regularities in their sensory signals. We extend our existing genotype encoding method, giving it the ability to encode both neural and substrate based NNs. We then extend the exoself to map the extended genotype to the extended phenotype capable of supporting substrate encoded NN systems. Finally, we modify the genome mutator module to support new, substrate-specific mutation operators, and test the system on our previously developed benchmarking problems.
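The essence of substrate encoding can be sketched as follows: the evolved NN does not hold the substrate's synaptic weights directly; instead it is queried with the coordinates of every connected pair of neurodes, and its output becomes that synapse's weight. In this hypothetical sketch, WeightFun stands in for the evolved NN.

```erlang
-module(substrate_sketch).
-export([weights/3]).

%% For every pair of neurode coordinates (From, To), query WeightFun
%% (the stand-in for the evolved NN) and record the resulting synaptic
%% weight as a {From, To, Weight} tuple.
weights(FromCoords, ToCoords, WeightFun) ->
    [{From, To, WeightFun(From, To)} || From <- FromCoords, To <- ToCoords].
```

Because the weights are a function of geometry, a small evolved NN can paint a regular connectivity pattern across an arbitrarily large substrate, which is what makes the system sensitive to geometrical regularities in its sensory signals.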
Gene I. Sher

Chapter 17. Substrate Plasticity

Abstract
In this chapter we develop a method for the substrate to possess plasticity, and thus have the synaptic weights of its neurodes change through experience. We first discuss the ABC and Iterative substrate learning rules popularized by the HyperNEAT neuroevolutionary system. We then implement the said learning rules within our own system, through just a few minor modifications to our existing architecture.
Gene I. Sher

Applications

Frontmatter

Chapter 18. Artificial Life

Abstract
In this chapter we apply our neuroevolutionary system to an ALife simulation. We create new sensors and actuators for the NN based agents interfacing with their simulated-environment avatars, discuss the construction and implementation of the 2d ALife environment called Flatland, interface our system to it, and then observe the resulting evolving behaviors of the simulated organisms.
We now come full circle. We started this book with a discussion of the evolution of intelligent organisms inhabiting a simulated (or real) world. In this chapter we convert that discussion into reality. We will create Flatland, a 2d world inhabited by 2d organisms: artificial life. Our neuroevolutionary system will evolve the brains of these 2d organisms, the avatars within the simulated environment controlled by our NN based agents. Through their avatars, our NN based agents will explore the barren flatland, compete with other flatlanders for food, and prey on each other, while trying to survive and consume the simulated food within the 2d world.
Gene I. Sher

Chapter 19. Evolving Currency Trading Agents

Abstract
The application of neural networks to financial analysis in general, and currency trading in particular, has been explored for a number of years. The most commonly used NN training algorithm in this application is backpropagation [2,3,4,5]. The application of TWEANN systems to the same field is only now starting to emerge, and is showing a significant amount of potential. In this chapter we create a Forex simulator, and then use our neuroevolutionary system to evolve automated currency trading agents. For this application we utilize not only the standard sliding window approach when feeding sensory signals to the neurally encoded agents, but also a sliding chart window, where we feed the evolved substrate encoded agents the actual candlestick price charts, and then compare the performance of the two approaches. As of this writing, and to this author’s knowledge, the use of geometrical-pattern-sensitive NN based agents in the analysis of financial instrument charts has not yet been explored in any other paper. Thus in this chapter we pioneer this approach, and explore its performance and properties.
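The sliding window sensor mentioned above can be sketched as follows: given a price list and a window size, produce every consecutive window, oldest first, each of which would be fed to the NN as one sensory input vector. The module name is illustrative, not the book's.

```erlang
-module(sliding_window).
-export([windows/2]).

%% Return every consecutive sub-list of length Size from Prices,
%% oldest first; each sub-list is one sensory input vector.
windows(Prices, Size) when length(Prices) >= Size ->
    [lists:sublist(Prices, I, Size)
     || I <- lists:seq(1, length(Prices) - Size + 1)];
windows(_Prices, _Size) ->
    [].
```

The sliding chart window works analogously, but the input is a two-dimensional slice of the candlestick chart rather than a flat price vector, which is what lets the substrate encoded agents exploit the chart's geometry.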
Gene I. Sher

Promises Kept

Frontmatter

Chapter 20. Conclusion

Abstract
Last words, future work, and motivation for future research within this field.
Gene I. Sher

Backmatter
