
Local coupled feedforward neural network

Published: 01 January 2010

Abstract

This paper presents the local coupled feedforward neural network (LCFNN). Its connection structure is the same as that of a multilayer perceptron with one hidden layer. In the LCFNN, each hidden node is assigned an address in the input space, and each input activates only the hidden nodes near it; for each input, only those activated hidden nodes take part in the forward and backward propagation processes. Theoretical analysis and simulation results show that this network possesses the universal approximation property and can solve the learning problem of feedforward neural networks. In addition, its local coupling makes knowledge accumulation possible.
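The mechanism described in the abstract can be illustrated with a minimal sketch. This is a hypothetical implementation, not the paper's algorithm: the class name, the k-nearest-address gating rule, and all hyperparameters are assumptions made for illustration. It shows the core idea that each hidden node carries an address in input space and only the nodes nearest an input participate in the forward and backward passes.

```python
import numpy as np

class LocalCoupledFFNN:
    """Illustrative sketch (not the paper's implementation): an MLP with one
    hidden layer where each hidden node has an address in input space, and
    only the k hidden nodes nearest an input take part in the forward and
    backward passes."""

    def __init__(self, n_in, n_hidden, n_out, k=4, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.addr = rng.uniform(0.0, 1.0, size=(n_hidden, n_in))  # node addresses
        self.W1 = rng.normal(0.0, 0.5, size=(n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, size=(n_out, n_hidden))
        self.b2 = np.zeros(n_out)
        self.k, self.lr = k, lr

    def _active(self, x):
        # Indices of the k hidden nodes whose addresses lie closest to x.
        d = np.linalg.norm(self.addr - x, axis=1)
        return np.argsort(d)[: self.k]

    def forward(self, x):
        idx = self._active(x)
        # Only the activated hidden nodes fire; all others stay silent.
        h = np.tanh(self.W1[idx] @ x + self.b1[idx])
        y = self.W2[:, idx] @ h + self.b2
        return y, h, idx

    def train_step(self, x, t):
        # One gradient step (squared error); updates touch only active nodes.
        y, h, idx = self.forward(x)
        e = y - t
        dh = (self.W2[:, idx].T @ e) * (1.0 - h ** 2)
        self.W2[:, idx] -= self.lr * np.outer(e, h)
        self.b2 -= self.lr * e
        self.W1[idx] -= self.lr * np.outer(dh, x)
        self.b1[idx] -= self.lr * dh
        return float(0.5 * e @ e)
```

Because gradients are confined to the few nodes near each input, training on one region of input space leaves the weights of distant nodes untouched, which is one plausible reading of how local coupling supports knowledge accumulation.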



Published In

Neural Networks  Volume 23, Issue 1
January 2010
156 pages

Publisher

Elsevier Science Ltd.

United Kingdom


Author Tags

  1. BP
  2. Feedforward
  3. LCFNN
  4. MLP
  5. Neural networks

