DOI: 10.1145/3071178.3071292

Neuroevolution on the edge of chaos

Published: 01 July 2017

Abstract

Echo state networks are a special type of recurrent neural network. Recent papers have stated that echo state networks maximize their computational performance at the transition between order and chaos, the so-called edge of chaos. This work confirms that statement with a comprehensive set of experiments. Furthermore, echo state networks are compared to networks evolved via neuroevolution. The evolved networks outperform the echo state networks; however, the evolution consumes significant computational resources. It is demonstrated that echo state networks with local connections combine the best of both worlds: the simplicity of random echo state networks and the performance of evolved networks. Finally, it is shown that evolution tends to stay close to the ordered side of the edge of chaos.
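
For readers unfamiliar with the model, the sketch below illustrates a standard echo state network: a fixed random recurrent reservoir whose weight matrix is rescaled to a chosen spectral radius (values near 1 are the usual heuristic for operating close to the order/chaos transition), with only a linear readout being trained. This is a minimal, generic Python illustration, not the authors' code; the reservoir size, spectral radius, and the toy delayed-recall task are arbitrary choices made for demonstration.

    import numpy as np

    rng = np.random.default_rng(0)

    n_inputs, n_reservoir = 1, 200
    spectral_radius = 0.95  # assumption: just below the critical point, i.e. the ordered side

    # Fixed random input and reservoir weights; only the readout is trained later.
    W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))
    W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
    # Rescale the reservoir so its largest eigenvalue magnitude equals spectral_radius.
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

    def run_reservoir(inputs):
        """Drive the reservoir with an input sequence and collect its states."""
        x = np.zeros(n_reservoir)
        states = []
        for u in inputs:
            x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
            states.append(x.copy())
        return np.array(states)

    def train_readout(states, targets, ridge=1e-6):
        """Fit the linear readout with ridge regression (standard ESN training)."""
        return np.linalg.solve(states.T @ states + ridge * np.eye(n_reservoir),
                               states.T @ targets)

    # Toy usage: recall the input signal from five steps in the past.
    u = rng.uniform(-1, 1, 1000)
    delay = 5
    X, y = run_reservoir(u)[delay:], u[:-delay]
    W_out = train_readout(X, y)
    print("training MSE:", np.mean((X @ W_out - y) ** 2))

Only W_out is learned; the reservoir stays random, which is what makes echo state networks cheap compared with fully evolved or gradient-trained recurrent networks.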

Supplementary Material

ZIP File (p465-matzner.zip)
Supplemental material.


Published In

GECCO '17: Proceedings of the Genetic and Evolutionary Computation Conference
July 2017
1427 pages
ISBN:9781450349208
DOI:10.1145/3071178


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. echo state networks
  2. edge of chaos
  3. neuroevolution
  4. phase transition
  5. recurrent neural networks

Qualifiers

  • Research-article


Conference

GECCO '17

Acceptance Rates

GECCO '17 paper acceptance rate: 178 of 462 submissions (39%).
Overall acceptance rate: 1,669 of 4,410 submissions (38%).
