Volume 6, Issue 3, May 1994
Publisher: MIT Press, 55 Hayward St., Cambridge, MA, United States
ISSN: 0899-7667
article
Review:

In recent years there has been significant interest in adapting techniques from statistical physics, in particular mean field theory, to provide deterministic heuristic algorithms for obtaining approximate solutions to optimization problems. Although ...

article
Object recognition and sensitive periods: A computational analysis of visual imprinting

Using neural and behavioral constraints from a relatively simple biological visual system, we evaluate the mechanism and behavioral implications of a model of invariant object recognition. Evidence from a variety of methods suggests that a localized ...

article
Computing stereo disparity and motion with known binocular cell properties

Many models for stereo disparity computation have been proposed, but few can be said to be truly biological. There is also a rich literature devoted to physiological studies of stereopsis. Cells sensitive to binocular disparity have been found in the ...
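The basic computation at issue can be illustrated with a toy sketch (our own, not the paper's model): recover a known horizontal disparity between left and right one-dimensional "retinal" signals by maximizing their cross-correlation, the quantity disparity-tuned binocular cells are commonly modelled as encoding.

```python
import math

# Left "retinal" signal: a sum of two sinusoids (an arbitrary test pattern).
left = [math.sin(0.3 * i) + 0.5 * math.sin(0.7 * i) for i in range(200)]

# Right signal is the left one shifted by a known disparity.
true_disparity = 4
right = [left[i + true_disparity] for i in range(len(left) - true_disparity)]

def correlation(d):
    # Correlate the left signal, shifted by candidate disparity d,
    # against the right signal over a common support.
    return sum(left[i + d] * right[i] for i in range(len(right) - 10))

# The candidate disparity with the highest correlation recovers the shift.
best = max(range(10), key=correlation)
```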

article
Integration and differentiation in dynamic recurrent neural networks

Dynamic neural networks with recurrent connections were trained by backpropagation to generate the differential or the leaky integral of a nonrepeating frequency-modulated sinusoidal signal. The trained networks performed these operations on arbitrary ...
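The two target operations can be computed directly; this sketch (our illustration of the task, not the trained network) produces the teacher signals such a recurrent network would be trained to reproduce, with modulation depth, rates, and time constant as our own choices.

```python
import math

dt, tau = 0.01, 0.1
ts = [k * dt for k in range(1000)]

# Frequency-modulated sinusoidal input signal.
x = [math.sin(2.0 * math.pi * (1.0 + 0.5 * math.sin(0.5 * t)) * t) for t in ts]

# Leaky integral: y' = -y / tau + x, forward-Euler discretised.
y = [0.0]
for k in range(len(x) - 1):
    y.append(y[-1] + dt * (-y[-1] / tau + x[k]))

# Differentiation as a first difference.
dx = [(x[k + 1] - x[k]) / dt for k in range(len(x) - 1)]
```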

article
A convergence result for learning in recurrent neural networks

We give a rigorous analysis of the convergence properties of a backpropagation algorithm for recurrent networks containing either output or hidden layer recurrence. The conditions permit data generated by stochastic processes with considerable ...


article
Topology learning solved by extended objects: A neural network model

It is shown that local, extended objects of a metrical topological space shape the receptive fields of competitive neurons to local filters. Self-organized topology learning is then solved with the help of Hebbian learning together with extended objects ...

article
Dynamics of discrete time, continuous state Hopfield networks

The dynamics of discrete time, continuous state Hopfield networks is driven by an energy function. In this paper, we use this tool to prove under mild hypotheses that any trajectory converges to a fixed point for the sequential iteration, and to a cycle ...
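A minimal sketch of the sequential iteration analysed here (our illustration, not the paper's construction): a discrete-time, continuous-state Hopfield network with symmetric weights, zero self-connections, and a sigmoid activation, whose energy function never increases under unit-by-unit updates.

```python
import math
import random

random.seed(0)
n = 5
W = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        W[i][j] = W[j][i] = random.uniform(-1.0, 1.0)  # symmetric, W[i][i] = 0
b = [random.uniform(-0.5, 0.5) for _ in range(n)]

def g(u):
    return 1.0 / (1.0 + math.exp(-u))                  # sigmoid activation

def energy(x):
    quad = -0.5 * sum(W[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
    lin = -sum(b[i] * x[i] for i in range(n))
    # Integral of g^{-1}, which for the sigmoid is a binary-entropy term.
    ent = sum(xi * math.log(xi) + (1 - xi) * math.log(1 - xi) for xi in x)
    return quad + lin + ent

x = [g(random.uniform(-1.0, 1.0)) for _ in range(n)]
energies = [energy(x)]
for _ in range(20):
    for i in range(n):                                 # sequential iteration
        x[i] = g(sum(W[i][j] * x[j] for j in range(n)) + b[i])
    energies.append(energy(x))
```

Each sequential update exactly minimizes the energy over the updated coordinate, so the recorded energies form a non-increasing sequence, which is the tool the convergence proof builds on.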

article
Alopex: A correlation-based learning algorithm for feedforward and recurrent neural networks

We present a learning algorithm for neural networks, called Alopex. Instead of error gradient, Alopex uses local correlations between changes in individual weights and changes in the global error measure. The algorithm does not make any assumptions ...
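A hedged sketch of the Alopex idea summarised above: every weight takes a fixed-size step whose direction is biased by the correlation between its previous step and the previous change in the global error. The quadratic error, step size, and fixed temperature below are our illustrative choices, not the paper's experiments.

```python
import math
import random

random.seed(1)
target = [0.3, -0.7, 1.2]

def error(w):            # any scalar error measure works; no gradients are used
    return sum((wi - ti) ** 2 for wi, ti in zip(w, target))

w = [0.0, 0.0, 0.0]
step, T = 0.01, 1e-4     # step size delta and "temperature"
prev_dw = [random.choice([-step, step]) for _ in w]
prev_e = error(w)
errors = [prev_e]

for _ in range(5000):
    e = error(w)
    de = e - prev_e      # change in the global error measure
    dw = []
    for i in range(len(w)):
        c = prev_dw[i] * de                      # weight-move / error-move correlation
        arg = max(-50.0, min(50.0, c / T))       # clamp to avoid overflow
        p = 1.0 / (1.0 + math.exp(arg))          # prob. of a +step move
        dw.append(step if random.random() < p else -step)
    w = [wi + di for wi, di in zip(w, dw)]
    prev_dw, prev_e = dw, e
    errors.append(error(w))
```

When a weight's last move and the error moved together (positive correlation), the probability of stepping in the same direction again drops; with a small temperature the walk is strongly biased downhill.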

article
Duality between learning machines: A bridge between supervised and unsupervised learning

We exhibit a duality between two perceptrons that allows us to compare the theoretical analysis of supervised and unsupervised learning tasks. The first perceptron has one output and is asked to learn a classification of p patterns. The second (dual) ...

article
Finding the embedding dimension and variable dependencies in time series

We present a general method, the δ-test, which establishes functional dependencies given a sequence of measurements. The approach is based on calculating conditional probabilities from vector component distances. Imposing the requirement of continuity ...
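A toy rendition of the dependency test described above (our construction, not the paper's exact procedure): estimate the conditional probability that two outputs are close given that their embedding vectors are close, and compare embedding dimensions on a series that genuinely needs two lags, the Hénon map.

```python
# Generate a Hénon series: x[t] = 1 - 1.4 x[t-1]^2 + 0.3 x[t-2].
xs = [0.1, 0.1]
for _ in range(700):
    xs.append(1.0 - 1.4 * xs[-1] ** 2 + 0.3 * xs[-2])
xs = xs[100:]                        # drop the transient

def cond_prob(series, d, delta, eps):
    # Estimate P(|y_i - y_j| < eps  |  ||x_i - x_j||_inf < delta) over pairs,
    # where x_t = (series[t-d], ..., series[t-1]) and y_t = series[t].
    pts = [(series[t - d:t], series[t]) for t in range(d, len(series))]
    near = close = 0
    for i in range(len(pts)):
        xi, yi = pts[i]
        for j in range(i + 1, len(pts)):
            xj, yj = pts[j]
            if max(abs(a - b) for a, b in zip(xi, xj)) < delta:
                near += 1
                if abs(yi - yj) < eps:
                    close += 1
    return close / near if near else 0.0

p1 = cond_prob(xs, 1, 0.05, 0.25)    # one lag misses the second dependency
p2 = cond_prob(xs, 2, 0.05, 0.25)    # two lags make the map a function of x
```

With one lag the map is not a function of the embedding vector, so the conditional probability stays well below one; with two lags continuity is satisfied and it saturates.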

article
Comparison of some neural network and scattered data approximations: The inverse manipulator kinematics example

This paper compares the application of five different methods for the approximation of the inverse kinematics of a manipulator arm from a number of joint angle/Cartesian coordinate training pairs. The first method is a standard feedforward neural ...
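A toy baseline for this task (our setup, not one of the paper's five methods): learn the inverse kinematics of a 2-link planar arm from joint angle / Cartesian training pairs with a nearest-neighbour lookup, then check the answer by running it back through the forward kinematics.

```python
import math
import random

random.seed(2)
L1, L2 = 1.0, 0.8                              # link lengths (our choice)

def forward(t1, t2):
    # Forward kinematics of the 2-link planar arm.
    x = L1 * math.cos(t1) + L2 * math.cos(t1 + t2)
    y = L1 * math.sin(t1) + L2 * math.sin(t1 + t2)
    return x, y

# Training pairs restricted to one elbow branch so the inverse is single-valued.
train = []
for _ in range(2000):
    t1 = random.uniform(0.0, math.pi / 2)
    t2 = random.uniform(0.2, math.pi - 0.2)
    train.append((forward(t1, t2), (t1, t2)))

def inverse_nn(x, y):
    # Return the joint angles of the training pair closest in Cartesian space.
    _, angles = min(train, key=lambda p: (p[0][0] - x) ** 2 + (p[0][1] - y) ** 2)
    return angles

xq, yq = forward(0.7, 1.1)                     # query inside the sampled workspace
t1, t2 = inverse_nn(xq, yq)
xr, yr = forward(t1, t2)
residual = math.hypot(xr - xq, yr - yq)        # Cartesian error after round trip
```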

article
Functionally equivalent feedforward neural networks

For a feedforward perceptron type architecture with a single hidden layer but with a quite general activation function, we characterize the relation between pairs of weight vectors determining networks with the same input-output function.
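The kind of weight-space symmetry behind such a characterisation can be checked numerically (our toy check, not the paper's proof): for an odd activation such as tanh, negating one hidden unit's incoming and outgoing weights leaves the network's input-output function unchanged, as does permuting hidden units.

```python
import math
import random

random.seed(3)
H, D = 4, 3                                     # hidden units, inputs

W = [[random.uniform(-1, 1) for _ in range(D)] for _ in range(H)]
v = [random.uniform(-1, 1) for _ in range(H)]

def net(W, v, x):
    # Single-hidden-layer perceptron with tanh activations.
    return sum(v[h] * math.tanh(sum(W[h][d] * x[d] for d in range(D)))
               for h in range(H))

# Sign-flip on hidden unit 0: tanh(-u) = -tanh(u) cancels the output flip.
W_flip = [[-w for w in W[0]]] + [row[:] for row in W[1:]]
v_flip = [-v[0]] + v[1:]

x = [0.2, -0.5, 0.9]
diff = abs(net(W, v, x) - net(W_flip, v_flip, x))
```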
