
Neural networks

Published: 19 November 2016

Abstract

This paper presents a comprehensive overview of the modelling, simulation and implementation of neural networks, taking into account the two aims that have emerged in this area: improving our understanding of the behaviour of the nervous system, and drawing inspiration from it to build systems that exploit the advantages provided by nature to perform certain relevant tasks. The development and evolution of different topics related to neural networks (simulators, implementations and real-world applications) is described, showing that the field has reached maturity and consolidation, as proven by its competitiveness in solving real-world problems. The paper also shows how, over time, artificial neural networks have contributed fundamental concepts to the birth and development of other disciplines, such as Computational Neuroscience, Neuro-engineering, Computational Intelligence and Machine Learning. A better understanding of the human brain is considered one of the challenges of this century and, as this paper goes on to describe, several important national and multinational projects and initiatives are marking the way forward in neural-network research.

References

[1]
J. Sjöberg, Q. Zhang, L. Ljung, A. Benveniste, B. Delyon, P.-Y. Glorennec, H. Hjalmarsson, A. Juditsky, Nonlinear black-box modeling in system identification: a unified overview, Automatica, 31 (1995) 1691-1724.
[2]
S. Geisser, Predictive Inference, Chapman and Hall, 1993.
[3]
R. Kohavi, R, A study of cross-validation and bootstrap for accuracy estimation and model selection, IJCAI, 14 (1995) 1137-1145.
[4]
H. He, E.A. Garcia, Learning from imbalanced data, IEEE Trans. Knowl. Data Eng., 21 (2009) 1263-1284.
[5]
M. Frasca, A. Bertoni, M. Re, G. Valentini, A neural network algorithm for semi-supervised node label learning from unbalanced data, Neural Netw., 43 (2013) 84-98.
[6]
Z. Ghahramani, M.I. Jordan, Dept. of Brain & Cognitive Sciences, MIT Center for Biological and Computational Learning. Technical Report 108, 16 pages. MIT, Cambridge, MA 02139, 1994. {http://mlg.eng.cam.ac.uk/zoubin/papers/review.pdf}.
[7]
R. Kumar, T. Chen, M. Hardt, D. Beymer, K. Brannon, T. Syeda-Mahmood, Multiple Kernel Completion and its application to cardiac disease discrimination.¿Biomedical Imaging (ISBI), 2013 IEEE 10th International Symposium on, IEEE, 2013.
[8]
V. Mayer-Schönberger, C. Kenneth, Big data: A revolution that will transform how we live, work, and think, Houghton Mifflin Harcourt, 2013.
[9]
J. Bornholt, R. Lopez, D.M. Carmean, L. Ceze, G. Seelig, K. Strauss, A DNA-based archival storage system, in: Proceedings of the Twenty-First International Conference on Architectural Support for Programming Languages and Operating Systems, ACM, 2016, pp. 637-649.
[10]
K. Hornik, Multilayer feedforward networks are universal approximators, Neural Netw., 2 (1989) 359-366.
[11]
M.I. Jordan, D.E. Rumelhart, Forward models: Supervised learning with a distal teacher, Cognit. Sci., 16 (1992) 307-354.
[12]
Z. Ghahramani, Unsupervised learning, in: Advanced Lectures on Machine Learning. Lecture Notes in Artificial Intelligence, 3176, Springer Verlag, Berlin, 2004.
[13]
R.S. Sutton, A.G. Barto, Reinforcement Learning. An Introduction, MIT Press, 1998.
[14]
M.I. Rabinovich, M.K. Muezzinoglu, Nonlinear dynamics of the brain: emotion and cognition, Phys.-Uspekhi, 53 (2010) 357-372.
[15]
W.S. McCullough, W.H. Pitts, A logical calculus of the ideas immanent in nervous activity, Bull. Math. Biophys., 5 (1943) 115-133.
[16]
D.O. Hebb, The organization of behavior: a neuropsychological theory, Psychology Press, 2005.
[17]
W. Gerstner, W.M. Kistler, Mathematical formulations of hebbian learning, Biol. Cybern., 87 (2002) 404-415.
[18]
A.L. Hodgkin, A.F. Huxley, A quantitative description of membrane current and its applications to conduction and excitation in nerve, J. Physiol., 117 (1952) 500-544.
[19]
A.M. Uttley, A Theory of the Mechanism of Learning Based on the Computation of Conditional Probabilities, Gauthier-Villars, Paris, 1956.
[20]
W.K. Taylor, Electrical Simulation Of Some Nervous System Functional Activities Information Theory, 3, Butterworths, 1956, pp. 314-328.
[21]
F. Rosenblatt, The Perceptron: a probabilistic model for information storage and organization in the brain, Psichol. Rev., 65 (1958) 386-408.
[22]
B. Widrow and M.E. Hoff, Jr., Adaptive switching circuits, IRE WESCOM Convention Record, pp. 96-104.
[23]
R. FitzHugh, Impulses and physiological states in theoretical models of nerve membrane, Biophys. J., 1 (1961).
[24]
J. Nagumo, S. Arimoto, S. Yoshizawa, An active pulse transmission line simulating nerve axon, Proc. IRE, 50 (1962) 2061-2070.
[25]
M. Ruth, Matthias, B. Hannon, Fitzhugh-Nagumo Neuron Model, Modeling Dynamic Biological Systems, Springer, New York, 1997.
[26]
M.L. Minsky, Computation: Finite and Infinite Machines, Prentice-Hall, Englewood Cliffs, N.J., 1967.
[27]
M.L. Minsky, S.A. Papert, Perceptrons, MIT Press, Cambridge, MA, 1969.
[28]
J.A. Anderson, A simple neural network generating an interactive memory, Math. Biosci., 14 (1972) 197-220.
[29]
T. Kohonen, Correlation matrix memories, IEEE Trans. Comput., C-21 (1972) 353-359.
[30]
Nakano, Association: a model of associative memory, IEEE Trans. Syst., Man Cynbern. (1972) 380-388.
[31]
J. Nagumo, S. Sato, On a response characteristic of a mathematical neuron model, Kybernetik, 10 (1972) 155-164.
[32]
E.R. Caianiello, Outline of a theory of thought-processes and thinking machines, J. Theor. Biol., 1 (1961) 204-235.
[33]
W.A. Little, The existence of persistent states in the brain, Math. Biosci., 19 (1974) 101-120.
[34]
D.J. Willshaw, C. von der Malsburg, How patterned neural connections can be set up by self-organization, Proc. R. Soc. Lond. Ser. B, 194 (1976) 431-445.
[35]
S.I. Amari, Topographic organization of nerve fields, Bull. Math. Biol., 42 (1980) 339-364.
[36]
T. Kohonen, Self-organized formation of topologically correct feature maps, Biol. Cybern., 43 (1982) 59-69.
[37]
A.C.C. Coolen, R. Kühn, P. Sollich, Theory of Neural Information Processing Systems, Oxford University Press, 2005.
[38]
J.J. Hopfield, Neural networks and physical systems with emergent collective computational abilities, Proc. Natl. Acad. Sci. USA, 79 (1982) 2554-2558.
[39]
J.M. Zurada, I. Cloete, E. van der Poel, Generalized Hopfield networks for associative memories with multi-valued stable states, Neurocomputing, 13 (1996) 135-149.
[40]
E. Oja, A simplified neural model as a principal component analyzer, J. Math. Biol., 15 (1982) 267-273.
[41]
E. Oja, Principal components, minor components and linear neural networks, Neural Netw., 5 (1992) 927-936.
[42]
J.L. Hindmarsh, R.M. Rose, A model of the nerve impulse using three coupled first-order differential equations, Proc. R. Soc. Lond., B221 (1984) 87-102.
[43]
J.L. Hindmarsh, P. Cornelius, The development of the Hindmarsh-Rose model for bursting, World Science Publication, Hackensack, NJ, 2005.
[44]
D.H. Ackley, G.E. Hinton, T.J. Sejnowski, A learning algorithm for Boltzmann Machines, Cognit. Sci., 9 (1985) 147-169.
[45]
S. Kirkpatrick, C.D. Gelatt, M.P. Vecchi, Optimization by simulated annealing, Sci. New Ser., 220 (1983) 671-680.
[46]
V. ¿erný, Thermodynamical approach to the traveling salesman problem: An efficient simulation algorithm, Journal. Optim. theory Appl., 45 (1985) 41-51.
[47]
J. Herault, C. Jutten, B. Anns, Detection de grandeurs primitives dans un message composite par une architecture de calcul neuromimetique un aprentissage non service, Procedures of GRETSI, Nice, France, 1985.
[48]
C. Jutten, J. Herault, Blind separation of sources, part I: an adaptive algorithm based on neuromimetic architecture, Signal Process., 24 (1991) 1-10.
[49]
P. Comon, Independent component analysis: a new concept, Signal Process., 36 (1994) 287-314.
[50]
A. Hyvarinen, E. Oja, Independent component analysis: algorithms and applications, Neural Netw., 13 (2000) 411-430.
[51]
D.E. Rumelhart, G.E. Hinton, R.J. Williams, Learning representations of back-propagation errors, Nature, 323 (1986) 533-536.
[52]
A.E. Bryson, W. Denham, S.E. Dreyfus, Optimal programming problems with inequality constraints, AIAA J., 1 (1963) 2544-2550.
[53]
J. Schmidhuber, Deep learning in neural networks: an overview, Neural Netw., 61 (2015) 85-117.
[54]
S. Grossberg, Adaptive pattern classification and universal recoding, I: Parallel development and coding of neural feature detectors & II: Feedback, expectation, olfaction, and illusions, Biol. Cybern., 23 (1976) 187-202.
[55]
G.A. Carpenter, S. Grossberg, A massively parallel architecture for a self-organizing neural pattern recognition machine, Computer Vision, Graph., Image Process., 37 (1987) 54-115.
[56]
G.A. Carpenter, S. Grossberg, D.B. Rosen, Fuzzy ART: Fast stable learning and categorization of analog patterns by an adaptive resonance system, Neural Netw., 4 (1991) 759-771.
[57]
R. Linsker, Self-organization in a perceptual network, Computer, 21 (1989) 105-117.
[58]
D.S. Broomhead, D. Lowe, Multivariable functional interpolation and adaptive networks, Complex Syst., 2 (1988) 321-355.
[59]
L. Chua, L. Yang, Cellular neural networks - theory, IEEE Trans. Circ. Syst., 35 (1988) 257-1272.
[60]
L. Chua, L. Yang, Cellular neural networks - applications, IEEE Trans. Circ. Syst., 35 (1988) 1273-1290.
[61]
M. Anguita, F. Pelayo, F.J. Fernandez, A. Prieto, A low-power CMOS implementation of programmable CNN's with embedded photosensors, Circuits and Systems I: Fundamental Theory and Applications, IEEE Trans., 44.2 (1997) 149-153.
[62]
C.A. Mead, Analog VLSI and Neural Systems, Addison-Wesley, Reading, MA, 1989.
[63]
Y.H. Pao, Y. Takefji, Functional-link net computing, IEEE Comput. Journal., 25 (1992) 76-79.
[64]
K. Aihara, T. Takabe, M. Toyoda, M, Chaotic neural networks, Phys. Lett. A, 144 (1990) 333-340.
[65]
E.R.I.K. De Schutter, J.M. Bower, An active membrane model of the cerebellar Purkinje cell. I. Simulation of current clamps in slice, J. Neurophysiol., 71 (1994) 375-400.
[66]
E.R.I.K. De Schutter, J.M. Bower, An active membrane model of the cerebellar Purkinje cell. II. Simulation of synaptic responses, J. Neurophysiol., 71 (1994) 400-419.
[67]
A.J. Bell, T.J. Sejnowski, An Information-maximizatium approach to blind separation and blind deconvolution, Neural Comput., 6 (1995) 1129-1159.
[68]
D.J.D. MacKy, A practical Bayesian framework for backpropagation networks, Neural Comput., 4 (1992) 448-472.
[69]
C.M. Bishop, Neural Networks for Pattern Recognition, Oxford Universituy Press, Oxford, 1995.
[70]
B.D. Ripley, Pattern Recognition and Neural Networks, Cambridge University Press, Cambridge, 1996.
[71]
C. Bielza, P. Larrañaga, Bayesian networks in neuroscience: a survey, Front. Comput. Neurosci., 8 (2014) 131-153.
[72]
C.K.I. Willian, C.E. Ramunsen, Gaussian processes for regression, in: Advanced in Neural Information Processing Systems, 8, MIT Press, 1996, pp. 514-520.
[73]
M. Seeger, Gaussian processes for machine learning, Int. J. Neural Syst., 14 (2004) 69-106.
[74]
K. Murphy, Machine Learning: A Probabilistic Perspective, The MIT Press, 2012.
[75]
F.M. l Schleif, M. Biehl, A. Vellido, Advances in machine learning and computational intelligence, Neurocomputing, 72 (2009) 1377-1378.
[76]
S.I. Amari, Natural gradient works efficiently in learning, Neural Comput., 10 (1998) 251-276.
[77]
V. Vapnik, Statistical Learning Theory, Wiley, New York, 1998.
[78]
V. Vapnik, The Nature¿of Statistical Learning Theory, Springer Science & Business Media, 2013.
[79]
B. Schölkopfand, A. Smola, Learning with Kernels, MIT Press, 2002.
[80]
J. Shawe-Taylor, N. Cristianini, Kernel Methods for Pattern Analysis, Cambridge University Press, Cambridge, 2004.
[81]
J.H. Chiang, Choquet fuzzy integral-based hierarchical networks for decision analysis, Fuzzy Syst. IEEE Trans., 7 (1999) 63-71.
[82]
S. Haykin, Neural Networks. A Comprensive foundation, Prentice Hall, New Jersey, 1994.
[83]
F.L. Luo, R. Unbehauen, Applied neural networks for signal processing, Cambridge University Press, 1998.
[84]
C. Feng, R. Plamondon, On the stability analysis of delayed neural networks systems, Neural Netw., 14 (2001) 1181-1188.
[85]
C.H. Feng, R. Plamondon, Stability analysis of bidirectional associative memory networks with time delays, IEEE Trans. Neural Netw., 14 (2003) 1560-1565.
[86]
K. Gopalsamy, Stability of artificial neural networks with impulses, Appl. Math. Comput., 154 (2004) 783-813.
[87]
L. Wu, Z. Feng, W.X. Zheng, Exponential stability analysis for delayed neural networks with switching parameters: average dwell time approach, Neural Netw. IEEE Trans., 21 (2010) 1396-1407.
[88]
X. Zhang, L. Wu, S. Cui, An improved integral inequality to stability analysis of genetic regulatory networks with interval time-varying delays, IEEE/ACM Trans. Comput. Biol. Bioinforma. (TCBB), 12 (2015) 398-409.
[89]
M. Cottrell, J.C. Fort, G. Pages, Theoretical aspects of the SOM algorithm, Neurocomputing, 21 (1998) 119-138.
[90]
S. Bermejo, J. Cabestany, The effect of finite sample size on on-line K-means, Neurocomputing, 48 (2002) 511-539.
[91]
M.C. Fu, Optimization for simulation: Theory vs. practice, Informs J. Comput., 14 (2002) 192-215.
[92]
M. Gevrey, L. Dimopoulos, S. Lek, Review and comparison of methods to study the contribution of variables in artificial neural network models, Ecol. Model., 160 (2003) 249-264.
[93]
J. Ilonen, J.K. Kamarainen, J. Lampinen, Differential evolution training algorithm for feed-forward neural networks, Neural Process. Lett., 17 (2003) 93-105.
[94]
A. Abraham, Meta learning evolutionary artificial neural networks, Neurocomputing, 56 (2004) 1-38.
[95]
P.J. Zufiria, On the discrete-time dynamics of the basic Hebbian neural-network node, IEEE Trans. Neural Netw., 13 (2002) 1342-1352.
[96]
M. Forti, P. Nistri, Global convergence of neural networks with discontinuous neuron activations. Circuits and systems I: fundamental theory and applications, IEEE Trans., 50 (2003) 1421-1435.
[97]
M. Forti, P. Nistri, D. Papini, Global exponential stability and global convergence in finite time of delayed neural networks with infinite gain, Neural Netw. IEEE Trans., 16 (2005) 1449-1463.
[98]
W. Lu, T. Chen, Dynamical behaviors of Cohen-Grossberg neural networks with discontinuous activation functions, Neural Netw., 18 (2005) 231-242.
[99]
L. Duan, L. Huang, Z. Guo, Stability and almost periodicity for delayed high-order Hopfield neural networks with discontinuous activations, Nonlinear Dyn., 77 (2014) 1469-1484.
[100]
T. Kim, T. Adali, Fully complex multi-layer perceptron network for nonlinear signal processing, J. VLSI signal Process. Syst. Signal Image Video Technol., 32 (2002) 29-43.
[101]
T. Nitta, On the inherent property of the decision boundary in complex-valued neural networks, Neurocomputing, 50 (2003) 291-303.
[102]
I. Aizenberg, C. Moraga, Multilayer feedforward neural network based on multi-valued neurons (MLMVN) and a backpropagation learning algorithm, Soft Comput., 11 (2007) 169-183.
[103]
R. Savitha, S. Suresh, N. Sundararajan, A fully complex-valued radial basis function network and its learning algorithm, Int. J. Neural Syst., 19 (2009) 253-267.
[104]
M.F. Amin, K. Murase, Single-layered complex-valued neural network for real-valued classification problems, Neurocomputing, 72 (2009) 945-955.
[105]
T. Xiong, Y. Bao, Z. Hu, R. Chiong, R, Forecasting interval time series using a fully complex-valued RBF neural network with DPSO and PSO algorithms, Inf. Sci., 305 (2015) 77-92.
[106]
Hirose, IEEE Press, John Wiley, 2013.
[107]
H. Leung, S. Haykin, The complex backpropagation algorithm, Signal Process. IEEE Trans., 39 (1991) 2101-2104.
[108]
G.-B. Huang, L. Chen, C.-K. Siew, Universal approximation using incremental constructive feedforward networks with random hidden nodes, IEEE Trans. Neural Netw., 17 (2006) 879-892.
[109]
G.B. Huang, L. Chen, Neurocomputing, 70 (2007) 3056-3062.
[110]
Y. Bengio, Learning Deep Architectures for AI, Found. Trends Mach. Learn., 2 (2009) 1-27.
[111]
G. Hinton, L. Deng, D. Yu, G. Dahl, A. Mohamed, N. Jaitly, A. Senior, V. Vanhoucke, P. Nguyen, T. Sainath, B. Kingsbury, Deep neural networks for acoustic modelling in speech recognition, IEEE Signal Process. Mag., 29 (2012) 82-97.
[112]
L. Deng, G. Hinton, B. Kingsbury,¿New types of deep neural network learning for speech recognition and related applications: An overview,¿Acoustics, Speech and Signal Processing (ICASSP), IEEE International Conference on (ICASSP), 2013, pp. 8599-8603.
[113]
H. Lee, R. Grosse, R. Ranganath, A.Y. Ng, Convolutional deep belief networks for scalable unsupervised learning of hierarchical representations, in: Proceedings of the 26th Annual International Conference on Machine Learning, ACM, 2009, 609-616.
[114]
D. Ciresan, U. Meier, J. Schmidhuber, Multi-column deep neural networks for image classification, in: Computer Vision and Pattern Recognition (CVPR), 2012 IEEE Conference on, IEEE, 2012, pp. 3642-3649.
[115]
R. Collobert, J. Weston, L. Bottou, M. Karlen, K. Kavukcuoglu, P. Kuksa, Natural language processing (almost) from scratch, J. Mach. Learn. Res., 12 (2011) 2493-2537.
[116]
Y. Bengio, A. Courville, P. Vincent, Representation learning: a review and new perspectives, Pattern Anal. Mach. Intell. IEEE Trans., 35 (2013) 1798-1828.
[117]
G. Hinton, S. Osindero, Y.-W. Teh, A fast learning algorithm for deep belief nets, Neural Comput., 18 (2006) 1527-1554.
[118]
P. Zhou, C. Liu, Q. Liu, L. Dai, H. Jiang, A cluster-based multiple deep neural networks method for large vocabulary continuous speech recognition, in: Acoustics, Speech and Signal Processing (ICASSP), IEEE International Conference on, IEEE, 2013, pp. 6650-6654.
[119]
B. Chandra, R.K. Sharma, Fast learning in deep neural networks, Neurocomputing, 171 (2016) 1205-1215.
[120]
B.A. Olshausen, Emergence of simple-cell receptive field properties by learning a sparse code for natural images, Nature, 381 (1996) 607-609.
[121]
W. Maass, Networks of spiking neurons: The third generation of neural network models, Neural Netw., 10 (1997) 1659-1671.
[122]
S. Ghosh-Dastidar, Spiking Neural Networks, Int. J. Neural Syst., 19 (2009) 295-308.
[123]
W. Maass, T. Natschläger, H. Markram, Real-time computing without stable states: a new framework for neural computation based on perturbations, Neural Comput., 14 (2002) 2531-2560.
[124]
W. Maass, H. Markram, On the computational power of recurrent circuits of spiking neurons, J. Comput. Syst. Sci., 69 (2004) 593-616.
[125]
W. Maass, Liquid computing. In Computation and Logic in the Real World, Springer, Berlin Heidelberg, 2007.
[126]
E.M. Izhikevich, Simple model of spiking neurons, IEEE Trans. Neural Netw., 14 (2003) 1569-1572.
[127]
R. Brette, W. Gerstner, Adaptive exponential integrate-and-fire model as an effective description of neuronal activity, J. Neurophysiol., 94 (2005) 3637-3642.
[128]
R. Naud, N. Marcille, C. Clopath, W. Gerstner, Firing patterns in the adaptive exponential integrate-and-fire model, Biol. Cybern., 99 (2008) 335-347.
[129]
A.N. Kolmogorov, On the representation of continuous functions of many variables by superposition of continuous functions of one variable and addition, Am. Math. Soc. Transl., 28 (1963) 55-59.
[130]
J.D.Schaffer, D. Whitley, L.J. Eshelman, Combinations of genetic algorithms and neural networks: A survey of the state of the art, Combinations of Genetic Algorithms and Neural Networks, COGANN-92. International Workshop on, IEEE, 1992.
[131]
D. Whitley, Genetic algorithms and neural networks, Genet. algorithms Eng. Comput. Sci., 3 (1995) 203-216.
[132]
D. Heinke, F.H. Hamker, Comparing neural networks: a benchmark on growing neural gas, growing cell structures, and fuzzy ARTMAP, Neural Netw., IEEE Trans., 9 (1998) 1279-1291.
[133]
M. Lehtokangas, Modelling with constructive backpropagation, Neural Netw., 12 (1999) 707-716.
[134]
R. Zhang, Y. Lan, G.B. Huang, Z.,B. Xu, Z. B, Universal approximation of extreme learning machine with adaptive growth of hidden nodes, Neural Netw. Learn. Syst. IEEE Trans., 23 (2012) 365-371.
[135]
R. Reed, R, Pruning algorithms-a survey, Neural Netw. IEEE Trans., 4 (1993) 740-747.
[136]
B.E. Segee, M.J. Carter, IJCNN-91-Seattle International Joint Conference on Fault tolerance of pruned multilayer networks. In Neural Networks, IEEE, vol. 2, 1991, pp. 447-452.
[137]
Y. Le Cun, J.S. Denker, S.A. Solla, Optimal brain damage, NIPs, 89 (1989).
[138]
M. Yoan, A. Sorjamaa, P. Bas, O. Simula, C. Jutten, A. Lendasse, OP-ELM: optimally pruned extreme learning machine, Neural Netw. IEEE Trans., 21 (2010) 158-162.
[139]
P.L. Narasimha, W.H. Delashmit, M.T. Manry, J. Li, F. Maldonado, An integrated growing-pruning method for feedforward network training, Neurocomputing, 71 (2008) 2831-2847.
[140]
M.M. Islam, M.A. Sattar, M.F. Amin, X. Yao, K. Murase, A new adaptive merging and growing algorithm for designing artificial neural networks, systems, man, and cybernetics, Part B: cybernetics, IEEE Trans., 39 (2009) 705-722.
[141]
M. Bortman, M. Aladjem, A growing and pruning method for radial basis function networks, Neural Netw., IEEE Trans., 20 (2009) 1039-1045.
[142]
S. Haykin, Neural Networks and Learning Machines, Pearson, 2009.
[143]
W. Gerstner, R. Brette, Adaptive exponential integrate-and-fire model, Scholarpedia, 4 (2009) 8427.
[144]
E. Claverol, A. Brown, J. Chad, Discrete simulation of large aggregates of neurons, Neurocomputing, 47 (2002) 277-297.
[145]
M. Mattia, P. del Giudice, Efficient event-driven simulation of large networks of spiking neurons and dynamical synapses. Neural Computation, vol. 12 (200), pp. 2305-2329.
[146]
J. Reutimann, M. Giugliano, S. Fusi, Event-driven simulation of spiking neurons with stochastic dynamics, Neural Comput., 15 (2003) 811-830.
[147]
E. Ros, R. Carrillo, E.M. Ortigosa, B. Barbour, R. Agís, Event-Driven Simulation Scheme For Spiking Neural Networks Using Lookup Tables To Characterize Neuronal Dynamics, Neural Comput., 18 (2006) 2959-2993.
[148]
F. Naveros, N.R. Luque, J.A. Garrido, R.R. Carrillo, M. Anguita, E. Ros, A spiking neural simulator integrating event-driven and time-driven computation schemes using parallel CPU-GPU co-processing, IEEE Trans. Neural Netw., 26 (2014) 1567-1574.
[149]
M. Rudolph, A. Destexhe, A. How much can we trust neural simulation strategies?, Neurocomputing, 70 (2007) 1966-1969.
[150]
R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J.M. Bower, M. Diesmann, A. Morrison, P.H. Goodman, F.C. Harris, J.,M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A.P. Davison, S.E. Boustani, A. Destexhe, Simulation of networks of spiking neurons: a review of tools and strategies, J. Comput. Neurosci., 23 (2007) 349-398.
[151]
P. Hammarlund, Ö. Ekeberg, Large neural network simulations on multiple hardware platforms, J. Comput. Neurosci., 5 (1998) 443-459.
[152]
M. Hereld, R.L. Stevens, J. Teller, W. van Drongelen, Large neural simulations on large parallel computers, Int. J. Bioelectromagn., 7 (2005) 44-46.
[153]
U. Seiffert, Artificial neural networks on massively parallel computer hardware, Neurocomputing, 57 (2004) 135-150.
[154]
H. de Garis, C. Shou, B. Goertzel, L. Ruiting, A world survey of artificial brain projects, Part I: Large-scale brain simulations, Neurocomputing, 74 (2010) 3-29.
[155]
A.P. Davison, D. Brüderle, J. Eppler, J. Kremkow, E. Muller, D.A. Pecevski, L. Perrinet, P. Yge, PyNN: a common interface for neuronal network simulators, Front. Neuroinform., 2 (2008) 11.
[156]
D.F. Goodman, R. Brette, The Brian simulator, Front. Neurosci., 3 (2009) 192-197.
[157]
M. Stimberg, D.F.M. Goodman, V. Benichoux, R. Brette, Equation-oriented specification of neural models for simulations, Front. Neuroinform., 8 (2014) 1-14.
[158]
R. Blaško, Soft Computing Applications, Developed by ECANSE, in: The State of the Art in Computational Intelligence Advances, 5, Springer-Verlag, Berlin, 2000, pp. 233-237.
[159]
R.C. O'Reilly, Y. Munakata, Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain, MIT Press, 2000.
[160]
J.M. Bower, D. Beeman, The Book of GENESIS: Exploring Realistic Neural Models with the General Neural Simulation System, Springer, New York, 1998.
[161]
J.D. Johnsen, P. Nielsen, H.J. Luhmann, G. Northoff, R. Kötter, Multi-level network modelling of cortical dynamics built on the GENESIS environment, Neurocomputing, 44-46 (2002) 863-868.
[162]
O. Rochel, D. Martinez, An event-driven framework for the simulation of networks of spiking neurons, Proceedings of the 11th European Symposium on Artificial Neural Networks (ESANN) 2003, 295-300.
[163]
C.E. Wilson, P.H. Goodman, F.C. Harris, Implementation of a biologically realistic parallel neocortical-neural network simulator, in:¿Proceedings of the Tenth SIAM on Conference on Parallel Process. Sci. Comp. (PPSC), 2001.
[164]
J.B. Maciokas, P.H. Goodman, J. Kenyon, M. Toledo-Rodriguez, H. Markram, Accurate dynamical models of interneuronal GABaergic channel physiologies, Neurocomputing, 65 (2005) 5-14.
[165]
C. Eliasmith, How to Build a Brain: A Neural Architecture for Biological Cognition, Oxford University Press, 2013.
[166]
C. Eliasmith, T.C. Stewart, X. Choo, T. Bekolay, T. DeWolf, Y. Tang, D. Rasmussen, A large-scale model of the functioning brain, Science, 338 (2012) 1202-1205.
[167]
T.C. Stewart, B. Tripp, C. Eliasmith, Python scripting in the Nengo simulator, Front. Neuroinform., 3 (2009).
[168]
M. Diesmann, M.O. Gewaltig, NEST: An environment for neural systems simulations, in Forschung und wisschenschaftliches Rechnen, Beitr. zum Heinz-Billing-Preis, 58 (2001) 43-70.
[169]
M.L. Hines, N.T. Carnevale, N. T. NEURON: a tool for neuroscientists, Neuroscientist, 7 (2001) 123-135.
[170]
N.T. Carnevale, M.L. Hines, The NEURON Book, Cambridge University Press, Cambridge, UK, 2006.
[171]
M.L. Hines, N.T. Carnevale, Discrete event simulation in the NEURON environment, Neurocomputing, 58-60 (2004) 1117-1122.
[172]
M. Migliore, C. Cannia, W.W. Lytton, H. Markram, M.L. Hines, Parallel network simulations with NEURON, J. Comput. Neurosci., 21 (2006) 119-129.
[173]
N. Zell, R. Mache, G. Hübner, M. Mamier, M. Vogt, K.U. Schmalzl, K. Herrmann, SNNS (Stuttgart Neural Network Simulator): In Neural Network Simulation Environments, Springer, US, 1994.
[174]
A. Delorme, J. Gautrais, R. van Rullen, S. Thorpe, SpikeNET: a simulator for modelling large networks of integrate and fire neurons, Neurocomputing, 26-27 (1999) 989-996.
[175]
S.J. Thorpe, R. Guyonneau, N. Guilbaud, J.M. Allegraud, R. VanRullen, SpikeNet: real-time visual processing with one spike per neuron, Neurocomputing, 58 (2004) 857-864.
[176]
J.F. Vibert, N. Azmy, Neuro-bio-clusters: a tool for interacting biological neural networks simulation, in: Artificial Neural Networks, Elsevier S. P. North-Holland Pub, 1991, pp. 551-556.
[177]
J.F. Vibert, F. Alvarez, E.K. Kosmidis, XNBC V9: A user friendly simulation and analysis tool for neurobiologists, Neurocomputing, 38-40 (2001) 1715-1723.
[178]
B. Ermentrout. Simulating, analyzing, and animating dynamical systems: A guide to XPPAUT for researchers and students, SIAM, vol. 14, 2002.
[179]
K.H. Pettersen, H. Lindén, A.M. Dale, G.T. Einevoll, Extracellular spikes and CSD, in: Handbook of Neural Activity Measurement, Cambridge University Press, 2012, pp. 92-135.
[180]
U. Bernardet, M. Blanchard, P. FMJ Verschure, IQR: a distributed system for real-time real-world neuronal simulation, Neurocomputing, 44-46 (2002) 1043-1048.
[181]
H. Cornelis, E. de Schutter, NeuroSpaces: separating modelling and simulation, Neurocomputing, 52 (2003) 227-231.
[182]
F.K. Skinner, J.B. Liu, NNET: linking small- and large-scale network models, Neurocomputing, 52 (2003) 381-387.
[183]
M. Sousa, P. Aguiar, Building, simulating and visualizing large spiking neural networks with NeuralSyns, Neurocomputing, 123 (2014) 372-380.
[184]
M. Mulas, P. Massobrio, NEUVISION: a novel simulation environment to model spontaneous and stimulus-evoked activity of large-scale neuronal networks, Neurocomputing, 122 (2013) 441-457.
[185]
E. Schikuta, NeuroWeb: An Internet-based'neural'network'simulator, in: Proc. of the 14th IEEE International Conference on Tools with Artificial Intelligence,¿Washington, IEEE Computer Society, 2002, pp. 407-412.
[186]
C. Bergmeir, J.M. Benitez, Neural networks in R using the stuttgart neural network simulator: RSNNS, J. Stat. Softw., 46 (2012) 1-26.
[187]
M. Djurfeldt, A. Sandberg, O. Ekeberg, A. Lansner, SEE¿a framework for simulation of biologically detailed and artificial neural networks and systems, Neurocomputing, 26-27 (1999) 997-1003.
[188]
K.M.L. Menne, A. Folkers, T. Malina, U.G. Hofmann, Test of spike-sorting algorithms on the basis of simulated network data, Neurocomputing, 44 (2002) 1119-1126.
[189]
D. Hansel, G. Mato, C. Meunier, L. Neltner, On numerical simulations of integrate-and-fire neural networks, Neural Comput., 10 (1998) 467-483.
[190]
M. Resta, An agent-based simulator driven by variants of self-organizing maps, Neurocomputing, 147 (2015) 207-224.
[191]
K.G. Spiliotis, C.I. Siettos, A timestepper-based approach for the coarse-grained analysis of microscopic neuronal simulators on networks: Bifurcation and rare-events micro-to macro-computations, Neurocomputing, 74 (2011) 3576-3589.
[192]
I. Ziv, Da Baxter, Jh Byrne, Simulator for neural networks and action potentials: description and application, J. Neurophysiol., 71 (1994) 294-308.
[193]
M. Sanchez-Montanez, Strategies for the optimization of large scale networks of integrate and fire neurons, in: Connectionist Models of Neurons, Learning Processes, and Artificial Intelligence, 2084, Lecture Notes in Computer Science, 2001, pp. 117-125.
[194]
J.M. Nageswaran, N. Dutt, J.L. Krichmar, A. Nicolau, A.V. Veidenbaum, A configurable simulation environment for the efficient simulation of large-scale spiking neural networks on graphics processors, Neural Netw., 22 (2009) 791-800.
[195]
H.E. Plesser, J. Eppler, A. Morrison, Abigail; et al. Efficient parallel simulation of large-scale neuronal networks on clusters of multiprocessor computers, Lect. Notes Comput. Sci., 4641 (2007) 672-681.
[196]
P. Pacheco, M. Camperi, T. Uchino, PARALLEL NEUROSYS: a system for the simulation of very large networks of biologically accurate neurons on parallel computers, Neurocomputing, 32 (2000) 1095-1102.
[197]
A. d'Acierno, Back-propagation learning algorithm and parallel computers: the CLEPSYDRA mapping scheme, Neurocomputing, 31 (2000) 67-85.
[198]
V. Kumar, S. Shekhar, M.B. Amin, A scalable parallel formulation of the back-propagation algorithm for hypercubes and related architectures, IEEE Trans. Parallel Distrib. Syst., 5 (1994) 1073-1090.
[199]
L.M. Patnaik, R.N. Rao, Parallel implementation of neocognitron on star topology: theoretical and experimental evaluation, Neurocomputing, 41 (2001) 109-124.
[200]
J. Ortega, I. Rojas, A.F. Díaz, A. Prieto, Parallel coarse grain computing of boltzmann machines, Neural Process. Lett., 7 (1998) 169-184.
[201]
C. Chen, T.M. Taha, Spiking neural networks on high performance computer clusters, Proc. SPIE, Opt. Photon-. Inf. Process., 8134 (2011) 813406.
[202]
H. Markram, The Blue Brain project, Nat. Rev. Neurosci., 7 (2006) 153-160.
[203]
R. Fontaine, F. Belanger, N. Viscogliosi, H. Semmaoui, M.A. Tetrault, J.B. Michaud, C. Pepin, J. Cadorette, R. Lecomte, The hardware and signal processing architecture of LabPET (TM), a small animal APD-based digital PET scanner, IEEE Trans. Nucl. Sci., 56 (2009) 3-9.
[204]
The Blue Brain Project. 2011; Available from: {http://bluebrain.epfl.ch/}. The Blue Brain Project. EPFL.
[205]
H. Markrama, K. Meierb, T. Lippertc, S. Grillnerd, R. Frackowiake, S. Dehaenef, A. Knollg, H. Sompolinskyh, K. Verstrekeni, J. DeFelipe j, S. Grantk, J.-P. Changeuxl, A. Sariam, Introducing the Human Brain Project, Procedia Comput. Sci., 7 (2011) 39-42.
[206]
J. Soto, J.M. Moreno, J. Cabestany, A self-adaptive hardware architecture with fault tolerance capabilities, Neurocomputing, 121 (2013) 25-31.
[207]
J. Misra, I. Saha, Artificial neural networks in hardware: a survey, Neurocomputing, 74 (2010) 239-255.
[208]
L. Reyneri, On the performance of pulsed and spiking neurons, Analog. Integr. Circ. Signal Process., 30 (2002) 101-119.
[209]
K. Goser, U. Ramacher, Mikroelektronische Realisierung von künstlichen neuronalen Netzen/Microelectronic Realizations of artificial neural networks, Informationstechnik, 34 (1992) 241-247.
[210]
M. Glesner, W. Poechmueller, Neurocomputers: An Overview of Neural Networks in VLSI, Chapman and Hall, London, 1994.
[211]
A. Prieto, A. Andreou, Microelectronics for bio-inspired systems, Analog. Integr. Circ. Signal Process., 30 (2002) 87-90.
[212]
J. Lachmair, E. Merényi, M. Porrmann, U. Rückert, A reconfigurable neuroprocessor for self-organizing feature maps, Neurocomputing, 112 (2013) 189-199.
[213]
M.L. Rossmann, A. Bühlmeier, G. Manteuffel, K. Goser, Dynamic Hebbian learning strategies for VLSI-systems, Neurocomputing, 28 (1999) 157-164.
[214]
G. Indiveri, B. Linares-Barranco, T.J. Hamilton, A. van Schaik, R. Etienne-Cummings, T. Delbruck, S.-C. Liu, P. Dudek, P. Häfliger, S. Renaud, J. Schemmel, G. Cauwenberghs, J. Arthur, K. Hynna, F. Folowosele, S. Saighi, T. Serrano-Gotarredona, J. Wijekoon, Y. Wang, K. Boahen, Neuromorphic silicon neuron circuits, Front. Neurosci., 5 (2011) 73.
[215]
K. Zaghloul, K. Boahen, A silicon retina that reproduces signals in the optic nerve, J. Neural Eng., 3 (2006) 257-267.
[216]
M. Mahowald, The silicon retina, in: An Analog VLSI System for Stereoscopic Vision, Springer, 1994.
[217]
M. Anguita, F.J. Pelayo, A. Prieto, J. Ortega, Analog CMOS implementation of a cellular neural network with programmable cloning templates, IEEE Trans. Circuits Syst., 40 (1993).
[218]
T. Delbrück, B. Linares-Barranco, E. Culurciello, C. Posch, Activity-driven, event-based vision sensors, in: Proceedings of the 2010 IEEE International Symposium on Circuits and Systems (ISCAS), 2010, pp. 2426-2429.
[219]
K. Boahen, Neurogrid: emulating a million neurons in the cortex, in: Grand Challenges in Neural Computation, 2006, 6702.
[220]
C. Johansson, A. Lansner, Towards cortex sized artificial neural systems, Neural Netw., 20 (2007) 48-61.
[221]
J. Lazzaro, J. Wawrzynek, M. Mahowald, M. Sivilotti, D. Gillespie, Silicon auditory processors as computer peripherals, IEEE Trans. Neural Netw., 4 (1993) 523-528.
[222]
F. Corradi, D. Zambrano, M. Raglianti, G. Passetti, C. Laschi, G. Indiveri, Towards a neuromorphic vestibular system, IEEE Trans. Biomed. Circ. Syst., 8 (2014) 669-680.
[223]
S.M. Fakhraie, H. Farshbaf, K.C. Smith, Scalable closed-boundary analog neural networks, IEEE Trans. Neural Netw., 15 (2004) 492-504.
[224]
M. Verleysen, P. Thissen, J.L. Voz, J. Madrenas, An analog processor architecture for a neural network classifier, IEEE Micro, 14 (1994) 16-28.
[225]
U. Lotric, P. Bulic, Applicability of approximate multipliers in hardware neural networks, Neurocomputing, 96 (2012) 57-65.
[226]
J.L. Bernier, J. Ortega, I. Rojas, A. Prieto, Improving the tolerance of multilayer perceptrons by minimizing the statistical sensitivity to weight deviations, Neurocomputing, 31 (2000) 87-103.
[227]
J.L. Bernier, J. Ortega, E. Ros, I. Rojas, A. Prieto, A quantitative study of fault tolerance, noise immunity and generalization ability of MLPs, Neural Comput., 12 (2000) 2941-2964.
[228]
C. Johansson, A. Lansner, Implementing plastic weights in neural networks using low precision arithmetic, Neurocomputing, 72 (2009) 968-972.
[229]
A.K. Fidjeland, M.P. Shanahan, Accelerated simulation of spiking neural networks using GPUs, IJCNN, Barcelona, Spain, 2010.
[230]
J.M. Nageswaran, N. Dutt, J.L. Krichmar, A. Nicolau, A.V. Veidenbaum, Efficient simulation of large-scale spiking neural networks using CUDA graphics processors, in: Proc. IJCNN, Atlanta, GA, USA, June 2009.
[231]
R. Brette, D.F. Goodman, Simulating spiking neural networks on GPU, Network, 23 (2012) 167-182.
[232]
A. Ahmadi, H. Soleimani, A GPU based simulation of multilayer spiking neural networks, ICEE, Tehran, Iran, 2011.
[233]
Y. Lu, D.W. Li, Z.H. Xu, Y.G. Xi, Convergence analysis and digital implementation of a discrete-time neural network for model predictive control, IEEE Trans. Ind. Electron., 61 (2014) 7035-7045.
[234]
J. Moreno, M.E. Ortuzar, J.W. Dixon, Energy-management system for a hybrid electric vehicle, using ultracapacitors and neural networks, IEEE Trans. Ind. Electron., 53 (2006) 614-623.
[235]
E. Ros, E.M. Ortigosa, R. Agís, R. Carrillo, M. Arnold, Real-time computing platform for spiking neurons (RT-spike), IEEE Trans. Neural Netw., 17 (2006) 1050-1063.
[236]
A. Strey, N. Avellana, A new concept for parallel neurocomputer architectures, in: Proceedings of EuroPar'96, Lyon, France, 1996, pp. 470-477.
[237]
{http://www.artificialbrains.com/darpa-synapse-program}.
[238]
P. Merolla, J. Arthur, F. Akopyan, N. Imam, R. Manohar, D. Modha, A digital neurosynaptic core using embedded crossbar memory with 45 pJ per spike in 45 nm, in: Proc. Custom Integr. Circuits Conf., 2011.
[239]
R. Preissl, T.M. Wong, P. Datta, M. Flickner, R. Singh, S.K. Esser, W.P. Risk, H.D. Simon, D.S. Modha, Compass: a scalable simulator for an architecture for Cognitive Computing, in: SC '12 Proceedings of the International Conference on High Performance Computing, Networking, Storage and Analysis, Article No. 54, IEEE Computer Society Press, Los Alamitos, CA, USA, 2012.
[240]
{http://www.research.ibm.com/articles/brain-chip.shtml}.
[241]
S.B. Furber, D.R. Lester, L.A. Plana, J.D. Garside, E. Painkras, S. Temple, A.D. Brown, Overview of the SpiNNaker system architecture, IEEE Trans. Comput., 62 (2013) 2454-2467.
[242]
S.B. Furber, F. Galluppi, S. Temple, L.A. Plana, The SpiNNaker project: a massively-parallel computer architecture for neural simulations, Proc. IEEE, 102 (2014).
[243]
{http://apt.cs.manchester.ac.uk/projects/SpiNNaker/architecture/}.
[244]
A. Rast, F. Galluppi, S. Davies, L. Plana, C. Patterson, T. Sharp, D. Lester, S. Furber, Concurrent heterogeneous neural model simulation on real-time neuromimetic hardware, Neural Netw., 24 (2011) 961-978.
[245]
J. Schemmel, D. Brüderle, A. Grübl, M. Hock, K. Meier, S. Millner, A wafer-scale neuromorphic hardware system for large-scale neural modelling, in: Proc. IEEE Int. Symp. Circuits Syst., 2010, pp. 1947-1950.
[246]
{http://www.uni-heidelberg.de/presse/news2013/pm20130128_hbp_en.html}.
[247]
B.V. Benjamin, P. Gao, E. McQuinn, S. Choudhary, A.R. Chandrasekaran, J.-M. Bussat, R. Alvarez-Icaza, J.V. Arthur, P.A. Merolla, K. Boahen, Neurogrid: a mixed-analog-digital multichip system for large-scale neural simulations, Proc. IEEE, 102 (2014) 699-716.
[248]
R. Silver, K. Boahen, S. Grillner, N. Kopell, K.L. Olsen, Neurotech for neuroscience: unifying concepts, organizing principles, emerging tools, J. Neurosci., 27 (2007) 11807-11819.
[249]
F. Yang, M. Paindavoine, Implementation of an RBF neural network on embedded systems: real-time face tracking and identity verification, IEEE Trans. Neural Netw., 14 (2003) 1162-1175.
[250]
L. Reyneri, Implementation issues of neuro-fuzzy hardware: going toward HW/SW codesign, IEEE Trans. Neural Netw., 14 (2003) 176-194.
[251]
B. Guo, D.H. Wang, Y. Shen, Z. Liu, Hardware-software partitioning of real-time operating systems using Hopfield neural networks, Neurocomputing, 69 (2006) 2379-2384.
[252]
J. Zhu, P. Sutton, FPGA implementations of neural networks-a survey of a decade of progress, Field-Program. Log. Appl., 2778 (2003) 1062-1066.
[253]
L.P. Maguire, T.M. McGinnity, B. Glackin, A. Ghani, A. Belatreche, J. Harkin, Challenges for large-scale implementations of spiking neural networks on FPGAs, Neurocomputing, 71 (2007) 13-29.
[254]
M. Atencia, H. Boumeridja, G. Joya, F. Garcia-Lagos, F. Sandoval, FPGA implementation of a systems identification module based upon Hopfield networks, Neurocomputing, 70 (2007) 2828-2835.
[255]
W. Gerstner, W.M. Kistler, Spiking Neuron Models: Single Neurons, Populations, Plasticity, Cambridge University Press, Cambridge, UK, 2002.
[256]
F.J. Pelayo, E. Ros, X. Arreguit, A. Prieto, VLSI neural model using spikes, Analog. Integr. Circ. Signal Process., 13 (1997) 111-121.
[257]
E. Chicca, F. Stefanini, C. Bartolozzi, G. Indiveri, Neuromorphic electronic circuits for building autonomous cognitive systems, Proc. IEEE, 102 (2014) 1367-1388.
[258]
M.R. Azghadi, N. Iannella, S.F. Al-Sarawi, G. Indiveri, Spike-based synaptic plasticity in silicon: design, implementation, application, and challenges, Proc. IEEE, 102 (2014) 717-737.
[259]
M. Schaefer, T. Schoenauer, C. Wolff, Simulation of spiking neural networks - architectures and implementations, Neurocomputing, 48 (2002) 647-679.
[260]
B. Linares-Barranco, E. Sánchez-Sinencio, A. Rodríguez-Vázquez, J.L. Huertas, CMOS implementation of FitzHugh-Nagumo neuron model, IEEE J. Solid-State Circ., 26 (1991) 956-965.
[261]
L.O. Chua, Memristor-the missing circuit element, IEEE Trans. Circuit Theory, CT-18 (1971) 507-519.
[262]
D.B. Strukov, G.S. Snider, D.R. Stewart, R.S. Williams, The missing memristor found, Nature, 453 (2008) 80-83.
[263]
A. Thomas, Memristor-based neural networks, J. Phys. D: Appl. Phys., 46 (2013) 093001.
[264]
S. Li, F. Zeng, C. Chen, H. Liu, G. Tang, S. Gao, C. Song, Y. Lin, F. Pan, D. Guo, Synaptic plasticity and learning behaviours mimicked through Ag interface movement in an Ag/conducting polymer/Ta memristive system, J. Mater. Chem. C, 1 (2013) 5292-5298.
[265]
G. Indiveri, B. Linares-Barranco, R. Legenstein, G. Deligeorgis, T. Prodromakis, Integration of nanoscale memristor synapses in neuromorphic computing architectures, Nanotechnology, 24 (2013) 384010.
[266]
L. Chua, V. Sbitnev, H. Kim, Hodgkin-Huxley axon is made of memristors, Int. J. Bifurc. Chaos, 22 (2012).
[267]
Y.V. Pershin, M. Di Ventra, Experimental demonstration of associative memory with memristive neural networks, Neural Netw., 23 (2010) 881-886.
[268]
M. Itoh, L.O. Chua, Memristor cellular automata and memristor discrete-time cellular neural networks, Int. J. Bifurc. Chaos, 19 (2009) 3605-3656.
[269]
L. Duan, L. Huang, Periodicity and dissipativity for memristor-based mixed time-varying delayed neural networks via differential inclusions, Neural Netw., 57 (2014) 12-22.
[270]
Artificial Brains. DARPA SyNAPSE Program. {http://www.artificialbrains.com/darpa-synapse-program#memristor-chip}.
[271]
Y.S. Abu-Mostafa, D. Psaltis, Optical neural computers, Sci. Am., 255 (1987) 88-95.
[272]
E. Lange, Y. Nitta, K. Kyuma, Optical neural chips, IEEE Micro, 14 (1994) 29-41.
[273]
A.K. Datta, S.K. Sen, S. Bandyopadhyay, Optical computing techniques, IETE Tech. Rev., 12 (1995) 93-105.
[274]
F.T.S. Yu, C.M. Uang, Optical neural networks, in: Encyclopedia of Optical Engineering, 1:1, CRC Press, New York, NY, USA, 2003, pp. 1763-1777.
[275]
P.E.X. Silveira, Optoelectronic neural networks, in: Encyclopedia of Optical Engineering, 1:1, CRC Press, New York, NY, USA, 2003, pp. 1887-1902.
[276]
A. Serrano-Heredia, C.M. Hinojosa, R. Ponce, et al., Opto-digital implementation of a neural network using a joint transform correlator based on a Hopfield inner product model for character recognition, in: Conference on Optical Information Systems, Proceedings of the Society of Photo-Optical Instrumentation Engineers (SPIE), San Diego, CA, Aug 04-05, vol. 5202, 2003, pp. 365-372.
[277]
V.P. Shmerko, S.N. Yanushkevich, Computing paradigms for predictable nanoelectronics, J. Comput. Theor. Nanosci., 7 (2010) 303-324.
[278]
M. Schuld, I. Sinayskiy, F. Petruccione, The quest for a quantum neural network, Quantum Inf. Process., 13 (2014) 2567-2586.
[279]
U.P. Wild, A. Renn, Molecular computing - a review. 1. Data and image storage, J. Mol. Electron., 7 (1991) 1-20.
[280]
M. Conrad, The lure of molecular computing, IEEE Spectr., 23 (1986) 55-60.
[281]
J.C. Chen, R.D. Chen, Toward an evolvable neuromolecular hardware: a hardware design for a multilevel artificial brain with digital circuits, Neurocomputing, 45 (2002) 9-34.
[282]
F. Alibart, S. Pleutin, D. Guérin, C. Novembre, S. Lenfant, K. Lmimouni, C. Gamrat, D. Vuillaume, An organic nanoparticle transistor behaving as a biological spiking synapse, Adv. Funct. Mater., 20 (2009) 330-337.
[283]
K.L. Wang, Issues of nanoelectronics: a possible roadmap, J. Nanosci. Nanotechnol., 2 (2002) 235-266.
[284]
L. Nunes de Castro, Fundamentals of natural computing: an overview, Phys. Life Rev., 4 (2007) 1-36.
[285]
D. Hammerstrom, A survey of bio-inspired and other alternative architectures, Nanotechnology (2010).
[286]
Applications of Neural Networks, Springer, 1995.
[287]
M. Tkáč, R. Verner, Artificial neural networks in business: two decades of research, Appl. Soft Comput., 38 (2016) 788-804.
[288]
Y. Bentz, D. Merunka, Neural networks and the multinomial logit for brand choice modelling: a hybrid approach, J. Forecast., 19 (2000) 177-200.
[289]
P. Berkhin, A survey of data mining techniques, in: J. Kogan, C. Nicholas, M. Teboulle (Eds.), Grouping Multidimensional Data: Recent Advances in Clustering, Springer, pp. 25-71.
[290]
L.J. Lancashire, C. Lemetre, G.R. Ball, An introduction to artificial neural networks in bioinformatics: application to complex microarray and mass spectrometry datasets in cancer studies, Briefings Bioinform. (2009).
[291]
W. Zhao, R. Chellappa, P.J. Phillips, A. Rosenfeld, Face recognition: a literature survey, ACM Comput. Surv. (CSUR), 35 (2003) 399-458.
[292]
E. Hjelmås, B.K. Low, Face detection: A survey, Comput. Vis. Image Underst., 83 (2001) 236-274.
[293]
J. Li, W. Hao, X. Zhang, Learning kernel subspace for face recognition, Neurocomputing, 151 (2015) 1187-1197.
[294]
M. Cannon, J.J.E. Slotine, Space-frequency localized basis function networks for nonlinear system estimation and control, Neurocomputing, 9 (1995) 293-342.
[295]
Y.X. Zhao, X. Du, G.L. Xia, L.G. Wu, A novel algorithm for wavelet neural networks with application to enhanced PID controller design, Neurocomputing, 158 (2015) 257-267.
[296]
M.R. Warnes, J. Glassey, G.A. Montague, B. Kara, Application of radial basis function and feedforward artificial neural networks to the Escherichia coli fermentation process, Neurocomputing, 20 (1998) 67-82.
[297]
M.A. Bezerra, R.E. Santelli, E.P. Oliveira, L.S. Villar, L.A. Escaleira, Response surface methodology (RSM) as a tool for optimization in analytical chemistry, Talanta, 76 (2008) 965-977.
[298]
M. Ibnkahla, Applications of neural networks to digital communications - a survey, Signal Process., 80 (2000) 1185-1215.
[299]
A. Guisan, N.E. Zimmermann, Predictive habitat distribution models in ecology, Ecol. Model., 135 (2000) 147-186.
[300]
W.Z. Lu, H.Y. Fan, S.M. Lo, Application of evolutionary neural network method in predicting pollutant levels in downtown area of Hong Kong, Neurocomputing, 51 (2003) 387-400.
[301]
J.M. Gutierrez-Villalobos, J. Rodriguez-Resendiz, E.A. Rivas-Araiza, V.H. Mucino, A review of parameter estimators and controllers for induction motors based on artificial neural networks, Neurocomputing, 118 (2013) 87-100.
[302]
L. Qi, H.B. Shi, Adaptive position tracking control of permanent magnet synchronous motor based on RBF fast terminal sliding mode control, Neurocomputing, 115 (2013) 23-30.
[303]
S.A. Kalogirou, Applications of artificial neural-networks for energy systems, Appl. Energy, 67 (2000) 17-35.
[304]
C. Booth, J.R. McDonald, The use of artificial neural networks for condition monitoring of electrical power transformers, Neurocomputing, 23 (1998) 97-109.
[305]
P.K. Wong, Z.X. Yang, C.M. Vong, J.H. Zhong, Real-time fault diagnosis for gas turbine generator systems using extreme learning machine, Neurocomputing, 128 (2014) 249-257.
[306]
H.S. Hippert, C.E. Pedreira, R.C. Souza, Neural networks for short-term load forecasting: a review and evaluation, IEEE Trans. Power Syst., 16 (2001) 44-55.
[307]
R.R. Trippi, E. Turban, Neural Networks in Finance and Investing: Using Artificial Intelligence to Improve Real World Performance, McGraw-Hill, Inc, 1992.
[308]
J.R. Coakley, C.E. Brown, Artificial neural networks in accounting and finance: modeling issues, Int. J. Intell. Syst. Acc. Financ. Manag., 9 (2000) 119-144.
[309]
I. Kaastra, M. Boyd, Designing a neural network for forecasting financial and economic time series, Neurocomputing, 10 (1996) 215-236.
[310]
G. Wang, J. Hao, J. Ma, H. Jiang, A comparative assessment of ensemble learning for credit scoring, Expert. Syst. Appl., 38 (2011) 223-230.
[311]
S. Khemakhem, Y. Boujelbène, Credit risk prediction: a comparative study between discriminant analysis and the neural network approach, J. Acc. Manag. Inf. Syst., 14 (2015) 60-78.
[312]
H.M. Zhong, C.Y. Miao, Z.Q. Shen, Y.H. Feng, Comparing the learning effectiveness of BP, ELM, I-ELM, and SVM for corporate credit ratings, Neurocomputing, 128 (2014) 285-295.
[313]
E.I. Altman, G. Marco, F. Varetto, Corporate distress diagnosis: comparisons using linear discriminant analysis and neural networks (the Italian experience), J. Bank. Financ., 18 (1994) 505-529.
[314]
M.G. Reese, Application of a time-delay neural network to promoter annotation in the Drosophila melanogaster genome, Comput. Chem., 26 (2001) 51-56.
[315]
M. Liu, Y.D. He, J.X. Wang, H.P. Lee, Y.C. Liang, Hybrid intelligent algorithm and its application in geological hazard risk assessment, Neurocomputing, 149 (2015) 847-853.
[316]
D. Lu, P. Mausel, E. Brondizio, E. Moran, Change detection techniques, Int. J. Remote. Sens., 25 (2004) 2365-2407.
[317]
S. Mukkamala, G. Janoski, A. Sung, Intrusion detection using neural networks and support vector machines, in: Proceedings of the 2002 International Joint Conference on Neural Networks, IEEE Neural Network Soc., 2002, pp. 1702-1707.
[318]
S.C. Lee, D.V. Heinbuch, Training a neural-network based intrusion detector to recognize novel attacks, IEEE Trans. Syst. Man Cybern., 31 (2001) 294-299.
[319]
E. De la Hoz, E. De La Hoz, A. Ortiz, J. Ortega, B. Prieto, PCA filtering and probabilistic SOM for network intrusion detection, Neurocomputing, 164 (2015) 71-81.
[320]
Z.L. Sun, H. Wang, W.S. Lau, G. Seet, D.W. Wang, Application of BW-ELM model on traffic sign recognition, Neurocomputing, 128 (2014) 153-159.
[321]
R.W. Swiniarski, L. Hargis, Rough sets as a front end of neural-networks texture classifiers, Neurocomputing, 36 (2001) 85-102.
[322]
R. Plamondon, S.N. Srihari, Online and off-line handwriting recognition: a comprehensive survey, IEEE Trans. Pattern Anal. Mach. Intell., 22 (2000) 63-84.
[323]
A.K.S. Jardine, D.M. Lin, D. Banjevic, A review on machinery diagnostics and prognostics implementing condition-based maintenance, Mech. Syst. Signal Process., 20 (2006) 1483-1510.
[324]
H.H. Bafroui, A. Ohadi, Application of wavelet energy and Shannon entropy for feature extraction in gearbox fault detection under varying speed conditions, Neurocomputing, 133 (2014) 437-445.
[325]
J.A. Noble, D. Boukerroui, Ultrasound image segmentation: a survey, IEEE Trans. Med. Imaging, 25 (2006) 987-1010.
[326]
D.J. Hemanth, C.K.S. Vijila, A.I. Selvakumar, J. Anitha, Performance improved iteration-free artificial neural networks for abnormal magnetic resonance brain image classification, Neurocomputing, 130 (2014) 98-107.
[327]
J. Khan, J.S. Wei, M. Ringner, L.H. Saal, M. Ladanyi, F. Westermann, F. Berthold, M. Schwab, C.R. Antonescu, C. Peterson, P.S. Meltzer, Classification and diagnostic prediction of cancers using gene expression profiling and artificial neural networks, Nat. Med., 7 (2001) 673-679.
[328]
J.D. Wulfkuhle, L.A. Liotta, E.F. Petricoin, Proteomic applications for the early detection of cancer, Nat. Rev. Cancer, 3 (2003) 267-275.
[329]
A. Statnikov, C.F. Aliferis, I. Tsamardinos, D. Hardin, A comprehensive evaluation of multicategory classification methods for microarray gene expression cancer diagnosis, Bioinformatics, 21 (2005) 631-643.
[330]
J.C. Lindon, E. Holmes, J.K. Nicholson, Pattern recognition methods and applications in biomedical magnetic resonance, Prog. Nucl. Magn. Reson. Spectrosc., 39 (2001) 40.
[331]
L.A. Berrueta, R.M. Alonso-Salces, K. Héberger, Supervised pattern recognition in food analysis, J. Chromatogr. A, 1158 (2007) 196-214.
[332]
A. Afantitis, G. Melagraki, P.A. Koutentis, H. Sarimveis, G. Kollias, Ligand-based virtual screening procedure for the prediction and the identification of novel β-amyloid aggregation inhibitors using Kohonen maps and Counterpropagation Artificial Neural Networks, Eur. J. Med. Chem., 46 (2011) 497-508.
[333]
M.M. Ardestani, M. Moazen, Z.X. Chen, J. Zhang, Z.M. Jin, A real-time topography of maximum contact pressure distribution at medial tibiofemoral knee implant during gait: application to knee rehabilitation, Neurocomputing, 154 (2015) 174-188.
[334]
M.A. Lopez-Gordo, F. Pelayo, A. Prieto, E. Fernandez, An auditory brain-computer interface with accuracy prediction, Int. J. Neural Syst., 22 (2012) 1250009.
[335]
I.A. Basheer, M. Hajmeer, Artificial neural networks: fundamentals, computing, design, and application, J. Microbiol. Methods, 43 (2000) 3-31.
[336]
L. Carro-Calvo, S. Salcedo-Sanz, J. Luterbacher, Neural computation in paleoclimatology: general methodology and a case study, Neurocomputing, 113 (2013) 262-268.
[337]
C. Ambroise, G. Seze, F. Badran, S. Thiria, Hierarchical clustering of self-organizing maps for cloud classification, Neurocomputing, 30 (2000) 47-52.
[338]
P. Li, L. Dong, H. Xiao, M. Xu, A cloud image detection method based on SVM vector machine, Neurocomputing, 169 (2015) 34-42.
[339]
P.M. Ferreira, E.A. Faria, A.E. Ruano, Neural network models in greenhouse air temperature prediction, Neurocomputing, 43 (2002) 51-75.
[340]
T.G. Barbounis, J.B. Theocharis, A locally recurrent fuzzy neural network with application to the wind speed prediction using spatial correlation, Neurocomputing, 70 (2007) 1525-1542.
[341]
T.G. Barbounis, J.B. Theocharis, Locally recurrent neural networks for long-term wind speed and power prediction, Neurocomputing, 69 (2006) 466-496.
[342]
M. Frasca, A. Bertoni, G. Valentini, UNIPred: Unbalance-aware Network Integration and Prediction of protein functions, J. Comput. Biol., 22 (2015) 1057-1074.
[343]
K. Sachs, O. Perez, D. Pe'er, D. Lauffenburger, G. Nolan, Causal protein-signaling network derived from multiparameter single-cell data, Science, 308 (2005) 523-529.
[344]
H.R. Maier, G.C. Dandy, Neural networks for the prediction and forecasting of water resources variables: a review of modelling issues and applications, Environ. Model. Softw., 15 (2000) 101-124.
[345]
J.W. Labadie, Optimal operation of multireservoir systems: state-of-the-art review, J. Water Resour. Plan. Manag., 130 (2004) 93-111.
[346]
B. Bhattacharya, D.P. Solomatine, Neural networks and M5 model trees in modelling water level-discharge relationship, Neurocomputing, 63 (2005) 381-396.
[347]
T. Hill, L. Marquez, M. O'Connor, W. Remus, Artificial neural network models for forecasting and decision making, Int. J. Forecast., 10 (1994) 5-15.
[348]
D. West, S. Dellana, J. Qian, Neural network ensemble strategies for financial decision applications, Comput. Oper. Res., 32 (2005) 2543-2559.
[349]
B. Malakooti, Y.Q. Zhou, Feedforward artificial neural networks for solving discrete multiple criteria decision making problems, Manag. Sci., 40 (1994) 1542-1561.
[350]
L. Chen, S. Lin, An interactive neural network-based approach for solving multiple criteria decision-making problems, Decis. Support. Syst., 36 (2003) 137-146.
[351]
D. Floreano, F. Mondada, Evolutionary neurocontrollers for autonomous mobile robots, Neural Netw., 11 (1998) 1461-1478.
[352]
M.J. Mahmoodabadi, M. Taherkhorsand, A. Bagheri, Optimal robust sliding mode tracking control of a biped robot based on ingenious multi-objective PSO, Neurocomputing, 124 (2014) 194-209.
[353]
H.N. Nguyen, J. Zhou, H.J. Kang, A calibration method for enhancing robot accuracy through integration of an extended Kalman filter algorithm and an artificial neural network, Neurocomputing, 151 (2015) 996-1005.
[354]
F.-L. Luo, R. Unbehauen, Applied Neural Networks for Signal Processing, Cambridge University Press, 1998.
[355]
F. Lotte, M. Congedo, A. Lecuyer, F. Lamarche, B. Arnaldi, A review of classification algorithms for EEG-based brain-computer interfaces, J. Neural Eng., 4 (2007).
[356]
Y. Wu, Y.B. Ge, A novel method for motor imagery EEG adaptive classification based biomimetic pattern recognition, Neurocomputing, 116 (2013) 280-290.
[357]
E. Oja, The nonlinear PCA learning rule in independent component analysis, Neurocomputing, 17 (1997) 25-45.
[358]
C.G. Puntonet, A. Prieto, Neural net approach for blind separation of sources based on geometric properties, Neurocomputing, 18 (1998) 141-164.
[359]
A. Prieto, C.G. Puntonet, B. Prieto, A neural learning algorithm for blind separation of sources based on geometric properties, Signal Process., 64 (1998) 315-331.
[360]
A. Cichocki, J. Karhunen, W. Kasprzak, R. Vigario, Neural networks for blind separation with unknown number of sources, Neurocomputing, 24 (1999) 55-93.
[361]
D. Simon, H. Elsherief, Navigation satellite selection using neural networks, Neurocomputing, 7 (1995) 247-258.
[362]
P. Cheeseman, J. Kelly, M. Self, J. Stutz, W. Taylor, D. Freeman, AutoClass: a Bayesian classification system, in: Readings in Knowledge Acquisition and Learning, Morgan Kaufmann Publishers, 1993.
[363]
T. Sagara, M. Hagiwara, Natural language neural network and its application to question-answering system, Neurocomputing, 142 (2014) 201-208.
[364]
S.M. Siniscalchi, T. Svendsen, C.H. Lee, An artificial neural network approach to automatic speech processing, Neurocomputing, 140 (2014) 326-338.
[365]
L. Gajecki, Architectures of neural networks applied for LVCSR language modelling, Neurocomputing, 133 (2014) 46-53.
[366]
K. Lo, F. Hahne, R.R. Brinkman, R. Gottardo, flowClust: a Bioconductor package for automated gating of flow cytometry data, BMC Bioinform., 10 (2009) 145.
[367]
G.M. Foody, A. Mathur, A relative evaluation of multiclass image classification by support vector machines, IEEE Trans. Geosci. Remote. Sens., 42 (2004) 1335-1343.
[368]
A.K. Jain, R.P.W. Duin, J.C. Mao, Statistical pattern recognition: a review, IEEE Trans. Pattern Anal. Mach. Intell., 22 (2000) 4-37.
[369]
E.R. Hruschka, N.F. Ebecken, Extracting rules from multilayer perceptrons in classification problems: a clustering-based approach, Neurocomputing, 70 (2006) 384-397.
[370]
A. Nazemi, M. Dehghan, A neural network method for solving support vector classification problems, Neurocomputing, 152 (2015) 369-376.
[371]
G.B. Huang, X. Ding, H. Zhou, Optimization method based extreme learning machine for classification, Neurocomputing, 74 (2010) 155-163.
[372]
Y. Lan, Y.C. Soh, G.B. Huang, Constructive hidden nodes selection of extreme learning machine for regression, Neurocomputing, 73 (2010) 3191-3199.
[373]
T.Y. Kim, K.J. Oh, C.H. Kim, J.D. Don, Artificial neural networks for non-stationary time series, Neurocomputing, 61 (2004) 439-447.
[374]
T. Kuremoto, S. Kimura, K. Kobayashi, M. Obayashi, Time series forecasting using a deep belief network with restricted Boltzmann machines, Neurocomputing, 137 (2014) 47-56.
[375]
C. Alippi, V. Piuri, Experimental neural networks for prediction and identification, IEEE Trans. Instrum. Meas., 45 (1996) 670-676.
[376]
D.K. Wedding, K.J. Cios, Time series forecasting by combining RBF networks, certainty factors, and the Box-Jenkins model, Neurocomputing, 10 (1996) 149-168.
[377]
L.J. Herrera, H. Pomares, I. Rojas, A. Guillén, A. Prieto, O. Valenzuela, Recursive prediction for long term time series forecasting using advanced models, Neurocomputing, 70 (2007) 2870-2880.
[378]
A.K. Jain, M.N. Murty, P.J. Flynn, Data clustering: a review, ACM Comput. Surv. (CSUR), 31 (1999) 264-323.
[379]
G.P. Zhang, Neural networks for classification: a survey, IEEE Trans. Syst. Man Cybern. C, 30 (2000) 451-462.
[380]
R. Xu, D. Wunsch, Survey of clustering algorithms, IEEE Trans. Neural Netw., 16 (2005) 645-678.
[381]
D.J. Hemanth, C.K.S. Vijila, A.I. Selvakumar, J. Anitha, An adaptive filtering approach for electrocardiogram (ECG) signal noise reduction using neural networks, Neurocomputing, 117 (2013) 206-213.
[382]
P.S. Churchland, C. Koch, T.J. Sejnowski, What is computational neuroscience?, in: Computational Neuroscience, MIT Press, 1993, pp. 46-55.
[383]
T.J. Sejnowski, Computational neuroscience, in: International Encyclopedia of the Social & Behavioral Sciences, Elsevier, 2015.
[384]
M. Akay (Ed.), Handbook of Neural Engineering, Wiley-IEEE Press, 2007.
[385]
D.J. DiLorenzo, J.D. Bronzino (Eds.), Neuroengineering, CRC Press, 2008.
[386]
C.E. Schmidt, J.B. Leach, Neural tissue engineering: strategies for repair and regeneration, Annu. Rev. Biomed. Eng., 5 (2003) 293-347.
[387]
J.R. Wolpaw, N. Birbaumer, D.J. McFarland, G. Pfurtschellere, T.M. Vaughana, Brain-computer interfaces for communication and control, Clin. Neurophysiol., 113 (2002) 767-791.
[388]
M.A. Lopez, A. Prieto, F. Pelayo, C. Morillas, Use of Phase in Brain-Computer Interfaces based on Steady-State Visual Evoked Potentials, Neural Process. Lett., 32 (2010) 1-9.
[389]
E. Cattin, S. Roccella, N. Vitiello, I. Sardellitti, P.K. Artemiadis, P. Vacalebri, F. Vecchi, M.C. Carrozza, K.J. Kyriakopoulos, P. Dario, Design and development of a novel robotic platform for neuro-robotics applications: the neurobotics arm (NEURARM), Adv. Robot., 22 (2008) 3-37.
[390]
N. Burgess, J.G. Donnett, J. O'Keefe, Using a Mobile Robot to Test a Model of the Rat Hippocampus, Connect. Sci., 10 (1998) 291-300.
[391]
N.R. Luque, J.A. Garrido, R.R. Carrillo, E. D'Angelo, E. Ros, Fast convergence of learning requires plasticity between inferior olive and deep cerebellar nuclei in a manipulation task: a closed-loop robotic simulation, Front. Comput. Neurosci. (2014) 1-16.
[392]
Handbook of Natural Computing, Springer, 2012.
[393]
W. Pedrycz, Computational Intelligence: An Introduction, CRC Press, 1997.
[394]
L.A. Zadeh, Fuzzy sets, Inf. Control., 8 (1965) 338-353.
[395]
A. Prieto, M. Atencia, F. Sandoval, Advances in artificial neural networks and machine learning, Neurocomputing, 121 (2013) 1-4.
[396]
G.M. Shepherd, J.S. Mirsky, M.D. Healy, M.S. Singer, E. Skoufos, M.S. Hines, P.M. Nadkarni, P.L. Miller, The Human Brain Project: neuroinformatics tools for integrating, searching and modelling multidisciplinary neuroscience data, Trends Neurosci., 21 (1998) 460-468.
[397]
Brain Mapping by Integrated Neurotechnologies for Disease Studies. Official website: {http://brainminds.jp/en/}. Last modification: 15.07.15.
[398]
Website: {https://www.science.org.au/publications/inspiring-smarter-brain-research-australia}. Last modification: 24.02.14.
[399]
Brainnetome Project. Official website: {http://www.brainnetome.org/en/brainnetomeproject.html}. Last modification: 22.07.15.
[400]
J. TianZi, Brainnetome and related projects, Sci. Sin. Vitae, 57 (2014) 462-466.
[401]
Norwegian University of Science and Technology. {http://www.ntnu.edu/kavli/research/norbrain}.
[402]
University of Oslo. {http://www.med.uio.no/imb/english/research/about/infrastructure/norbrain/}.
[403]
SpikeFORCE Project in Information Society Technologies World. Website: {http://www.ist-world.org/ProjectDetails.aspx?ProjectId=5e284098967d4471961edde067abd27a}.
[404]
Sensemaker Project in Information Society Technologies World. Website: {http://www.ist-world.org/ProjectDetails.aspx?ProjectId=e9a2613ab2d64ef7b8ea8ab113f11976}.
[405]
The FACETS project. Website: {http://facets.kip.uni-heidelberg.de/}.
[406]
The SENSOPAC Project. Website: {http://www.sensopac.org/}.
[407]
The BrainScaleS Project. Website: {http://brainscales.kip.uni-heidelberg.de/}.
[408]
The Blue Brain Project. Website: {http://bluebrain.epfl.ch/}.
[409]
The REALNET Project. Website: {http://www.realnet-fp7.eu/}.
[410]
The Human Brain Project. A Report to the European Commission. The HBP-PS Consortium, Lausanne, April 2012. {https://goo.gl/3G6HMd}.
[411]
Human Brain Project. Official website: {https://www.humanbrainproject.eu/}.
[412]
The Neurorobotics platform (HBP). Website: {http://neurorobotics.net/the-human-brain-project/}.
[413]
BRAIN 2025, A Scientific Vision, National Institutes of Health, 2014. {http://braininitiative.nih.gov/2025/BRAIN2025.pdf}.
[414]
E.R. Kandel, H. Markram, P.M. Matthews, R. Yuste, C. Koch, Neuroscience thinks big (and collaboratively), Nat. Rev. Neurosci., 14 (2013) 659-664.
[415]
Allen Institute for Brain Science. Official website: {http://alleninstitute.org/}.
[416]
Human Brain Project. Press Officer. What People are saying. {https://www.humanbrainproject.eu/es/media}.
[417]
A. Roy, {http://www.neuroinf.org/pipermail/comp-neuro/2014-June/004822.html}.
[418]
B. Meyerson, Top 10 emerging technologies of 2015, The World Economic Forum (2015), {https://agenda.weforum.org/2015/03/top-10-emerging-technologies-of-2015-2/}.
[419]
L.A. Barroso, U. Hölzle, The case for energy-proportional computing, Computer, 40 (2007) 33-37.
[420]
M. Costandi, How to build a brain, Sciencefocus.com, 2012.
[421]
The Blue Brain Project. EPFL. {http://bluebrain.epfl.ch/}.

Published In

Neurocomputing  Volume 214, Issue C
November 2016
1063 pages

Publisher

Elsevier Science Publishers B. V.

Netherlands

Author Tags

  1. Applications of neural networks
  2. Artificial neural networks
  3. Brain Initiative
  4. Human Brain Project
  5. Learning algorithms
  6. Neural hardware
  7. Neural modelling
  8. Neural networks
  9. Neural simulators
