
Basic structure of a neural network

The general feedback affects the function and logic of each node (or its 'weight'), that is, the way the node computes: it is information turning into logic. By modifying the node's threshold, the control feedback can change an OR gate into an AND gate, for example.

Each network node is a transmission node but also a computation node: a logic gate, a little operator or Turing machine. Each node is both information and function, or logic.

The control feedback that performs weight adjustment or backpropagation can be an equation, an algorithm or even (ideally) a human operator.

TECHNO-LOGICAL FORMS

The neural network idea is the composition of four technological forms: 1) scansion, that is, the discretisation or digitisation of analogue inputs since the age of radio, TV, etc.; 2) the logic gate, which can be encoded into valves, transistors or microchips; 3) the feedback loop, the basic concept of cybernetics; 4) the network, here inspired by biological neurons. The neural network is a cybernetic network in which a general feedback loop affects each individual node. In this sense it is the most adaptive architecture, whose topology opens onto a true combinatory art that goes beyond the linear logic expected from a classic Turing machine (see the topologies of the Recurrent Neural Network and the Long Short-Term Memory network).

The input layer of a neural network is sometimes called the retina (since the first Perceptron) even when it does not process visual data.

The neural network is constructed as a network in which information is pipelined and distilled into higher forms of abstraction. It is in fact an arborescent network that sustains a hierarchical cone of increasingly abstract features.

[Diagram labels: scansion, logic gate, feedback loop, network; control feedback (adaptation, learning, reinforcement, backpropagation, weight adjustment); input layer (environment, sensors: image, sound), hidden layer, output layer (pattern recognition, classification).]

Diagram: Prof. Matteo Pasquinelli with Lukas Rehm 2017
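A minimal sketch may make the two technical claims above concrete: that shifting a node's threshold turns an OR gate into an AND gate, and that a control feedback can perform the weight adjustment. The sketch assumes a McCulloch-Pitts style binary threshold node; the function names (threshold_node, adjust), the perceptron-style error-driven update and the learning rate are illustrative assumptions, not part of the original diagram.

```python
# Assumption: a McCulloch-Pitts style threshold node. With weights (1, 1),
# threshold 1 behaves as an OR gate and threshold 2 as an AND gate, so a
# control feedback that shifts the threshold turns one gate into the other.

def threshold_node(inputs, weights, threshold):
    """Binary node: fires (1) if the weighted sum reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b,
              "OR:", threshold_node((a, b), (1, 1), threshold=1),
              "AND:", threshold_node((a, b), (1, 1), threshold=2))

# Illustrative control feedback (assumption): a perceptron-style update that
# adjusts weights and threshold from examples, here learning AND from scratch.
def adjust(weights, threshold, samples, lr=0.1, epochs=50):
    for _ in range(epochs):
        for inputs, target in samples:
            error = target - threshold_node(inputs, weights, threshold)
            # Feedback loop: the error signal modifies the node's parameters.
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            threshold -= lr * error
    return weights, threshold

and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, t = adjust([0.0, 0.0], 0.0, and_samples)
print("learned weights:", w, "threshold:", round(t, 2))
```

The binary threshold keeps the sketch at the level of the diagram's logic gates; the control feedback of actual backpropagation operates on continuous activations and gradients across many layers rather than on this single 0/1 rule.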