Structured Dynamics in the Algorithmic Agent
Abstract
1. Introduction
2. Generative Models as Lie Groups
2.1. Classifying Model-Generated Cat Images
2.1.1. Generative Models
2.1.2. Recursion and Compositionality in Lie Group Action
2.1.3. Lie Pseudogroups and Generative Models
2.2. Implications of Invariance Under Group Action
3. The Agent as a Dynamical System
3.1. General Model
3.2. Conserved Quantities and Symmetries
4. The World-Tracking Condition
- Agent inputs are generated by simple rules (the world is inherently simple), i.e., by a hierarchical generative model as discussed above.
- The agent is able to “track” its inputs, i.e., use an internal generative model to approximately match them.
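The two conditions above can be illustrated with a minimal sketch (illustrative code, not the paper's formulation): a scalar agent state x tracks a sinusoidal world input through error feedback, so the tracking error, which plays the role of a Lyapunov-like quantity, decays after an initial transient. The gain k and the input sin(0.5t) are arbitrary choices.

```python
import math

def world(t):
    # A simple "world" generative model: a low-frequency sinusoidal input.
    return math.sin(0.5 * t)

def simulate(k=20.0, dt=0.01, T=20.0, x0=1.0):
    # The agent's internal state x tracks the input via error feedback:
    # dx/dt = -k (x - u(t)). After an initial transient, the tracking
    # error |x - u| stays small (approximate world-tracking).
    x, t, errors = x0, 0.0, []
    while t < T:
        x += dt * (-k * (x - world(t)))
        errors.append(abs(x - world(t)))
        t += dt
    return errors

errs = simulate()
print(errs[0], max(errs[-100:]))  # large initial error, small late error
```

With these arbitrary parameters the residual error scales roughly as (input rate of change)/k, so a stiffer feedback gain yields tighter tracking.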
4.1. Constrained Dynamics I
4.2. Constrained Dynamics II
4.3. The Fixed Input Problem
Covariance of the Equations
4.4. Tracking Time-Varying Inputs
4.5. Coarse-Graining, Hierarchical Constraints, and Manifolds
5. Discussion
- As a dynamical system, the successful agent displays conserved quantities—with dynamics in a reduced hierarchical manifold—corresponding to the world-tracking constraints generated by a Lie group. For dynamical inputs, this stability is achieved after initial transients, captured by a Lyapunov function (see Section 4.4).
- The constraints force structural symmetries in the dynamical system constitutive equations that meet them: the agent carries the symmetries of the world generative model. This is encoded in the structural elements of the dynamical system (the ws in our formulation).
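A minimal numerical illustration of a conserved quantity (illustrative, not taken from the paper's agent equations): leapfrog integration of a harmonic oscillator keeps the energy H = p²/2 + ω²q²/2 bounded over long runs, the conserved quantity associated with time-translation symmetry.

```python
def leapfrog(q, p, dt, steps, omega=1.0):
    # Harmonic oscillator with H = p^2/2 + omega^2 q^2 / 2.
    # The leapfrog scheme approximately conserves H over long runs,
    # making the conserved quantity visible numerically.
    energies = []
    for _ in range(steps):
        p -= 0.5 * dt * omega**2 * q
        q += dt * p
        p -= 0.5 * dt * omega**2 * q
        energies.append(0.5 * p**2 + 0.5 * omega**2 * q**2)
    return energies

E = leapfrog(q=1.0, p=0.0, dt=0.01, steps=10000)
drift = max(E) - min(E)
print(drift)  # energy fluctuation stays tiny over 10,000 steps
```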
5.1. Symmetry, Groups, and Algorithmic Complexity
5.2. Connections with Empirical Observations
5.3. Connections with Neurophenomenology
5.4. Discovering Structure
5.5. Applications in AI and Neurosynthetic Agent Design
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. Equations for a Turing Pair
Appendix B. Notes on Lie Groups
Appendix B.1. Definitions
1. Associativity. If g, h, and k are elements of G, then g · (h · k) = (g · h) · k.
2. Identity Element. There is a distinguished element e of G, called the identity element, which has the property that e · g = g = g · e for all g in G.
3. Inverses. For each g in G, there is an inverse, denoted g⁻¹, with the property g · g⁻¹ = e = g⁻¹ · g.
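These axioms can be checked mechanically for a small concrete group; the sketch below verifies them exhaustively for the symmetric group S3 (an illustrative choice).

```python
from itertools import permutations

# The symmetric group S3, represented concretely as permutations of {0, 1, 2}.
G = list(permutations(range(3)))

def compose(g, h):
    # (g ∘ h)(i) = g(h(i))
    return tuple(g[h[i]] for i in range(3))

e = (0, 1, 2)  # identity permutation

def inverse(g):
    inv = [0] * 3
    for i, gi in enumerate(g):
        inv[gi] = i
    return tuple(inv)

# Closure, associativity, identity, and inverses, checked exhaustively:
assert all(compose(g, h) in G for g in G for h in G)
assert all(compose(g, compose(h, k)) == compose(compose(g, h), k)
           for g in G for h in G for k in G)
assert all(compose(e, g) == g == compose(g, e) for g in G)
assert all(compose(g, inverse(g)) == e for g in G)
print("S3 satisfies the group axioms")
```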
1. The coordinate charts cover M: ⋃α Uα = M.
2. On the overlap of any pair of coordinate charts Uα ∩ Uβ, the composite map χβ ∘ χα⁻¹ : χα(Uα ∩ Uβ) → χβ(Uα ∩ Uβ) is a smooth function.
3. If x ∈ Uα, x̃ ∈ Uβ are distinct points of M, then there exist open subsets W ⊂ χα(Uα), W̃ ⊂ χβ(Uβ), with χα(x) ∈ W, χβ(x̃) ∈ W̃, satisfying χα⁻¹(W) ∩ χβ⁻¹(W̃) = ∅.
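As an illustration of the chart-overlap condition, the unit circle S¹ admits two stereographic charts whose transition map on the overlap is t ↦ 1/t, smooth away from t = 0; the sketch below checks this numerically (the chart functions are the standard stereographic projections, chosen here for illustration).

```python
import math

# Two stereographic charts on the unit circle S^1.
# chart_n projects from the north pole (0, 1); chart_s from the south pole (0, -1).
def chart_n(x, y):
    return x / (1.0 - y)

def chart_s(x, y):
    return x / (1.0 + y)

# On the overlap (the circle minus both poles) the transition map is
# t -> 1/t, smooth wherever it is defined.
for theta in [0.3, 1.0, 2.0, 4.0, 5.5]:
    x, y = math.cos(theta), math.sin(theta)
    t = chart_n(x, y)
    assert abs(chart_s(x, y) - 1.0 / t) < 1e-12
print("transition map on the chart overlap is t -> 1/t")
```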
1. If (h, x) ∈ U, (g, h · x) ∈ U, and also (g · h, x) ∈ U, then g · (h · x) = (g · h) · x.
2. For all x ∈ M, e · x = x.
3. If (g, x) ∈ U, then (g⁻¹, g · x) ∈ U and g⁻¹ · (g · x) = x.
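These axioms are easy to verify for a concrete action; the sketch below checks them numerically for the additive group (ℝ, +) acting on ℝ by translation (an illustrative choice, where the action is defined globally).

```python
import math

# The additive group (R, +) acting on R by translation: g . x = x + g.
def act(g, x):
    return x + g

e = 0.0  # identity element of (R, +)
for g in [-1.5, 0.7]:
    for h in [2.0, -0.3]:
        for x in [0.0, 1.0, -4.2]:
            assert act(e, x) == x                            # identity axiom
            assert math.isclose(act(g, act(h, x)), act(g + h, x))  # compatibility
            assert math.isclose(act(-g, act(g, x)), x)       # inversion
print("translation action satisfies the group-action axioms")
```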
1. Identity: The identity map 𝟙U for any open subset U ⊆ M belongs to 𝒢.
2. Closure under composition: If φ : U → V and ψ : V → W belong to 𝒢, then their composition ψ ∘ φ also belongs to 𝒢.
3. Closure under inversion: If φ : U → V belongs to 𝒢, then its inverse φ⁻¹ : V → U also belongs to 𝒢.
4. Closure under restriction: If φ : U → V belongs to 𝒢 and U′ ⊆ U is open, then the restriction φ|U′ also belongs to 𝒢.
5. Lie structure: The local diffeomorphisms in 𝒢 are solutions of a system of finite-order partial differential equations defined on M, ensuring 𝒢 has the structure of a smooth (Lie) manifold.
Appendix B.2. Important Theorems
Dimension and Classification
- Finite-Dimensional Pseudogroups: Certain classes of finite-dimensional Lie pseudogroups (e.g., isometry groups, projective groups) are well-studied. However, there is no complete classification of all finite-dimensional Lie pseudogroups, reflecting their richness and the vast variety of PDEs that can define them.
- Infinite-Dimensional Pseudogroups: In highly complex or topologically intricate manifolds, one typically needs infinite-dimensional pseudogroups (or their groupoid of germs) to navigate the space fully.
Appendix B.3. Finite-Dimensional Lie Groups vs. Lie Pseudogroups and Moduli Stacks
Summary
Appendix B.4. Action of a Lie Group on a Manifold
Appendix B.5. Invariance and Equivariance
Appendix B.6. Further Notes on Lie Groups
Appendix C. Are All Lie Generative Models Finite-Dimensional?
Appendix D. Groups, Turing Machines, and Generative Models
- Q is a finite set of states.
- Σ is a finite input alphabet that does not contain the blank symbol ⊔.
- Γ is the tape alphabet, where Σ ⊂ Γ and ⊔ ∈ Γ.
- δ : Q × Γ³ → Q × Γ³ × {L, R, S}³ is the transition function. Here, L, R, and S denote left shift, right shift, and no shift on the tapes, respectively.
- q₀ ∈ Q is the initial state.
- q_accept ∈ Q is the accept state.
- q_reject ∈ Q is the reject state, distinct from the accept state.
- The input tape contains the input string and is read-only.
- The output tape is used to write the output and is write-only.
- The private tape is used for intermediate computations and can be both read from and written to.
- The transition function dictates the machine’s actions based on the current state and the symbols under the tape heads. It specifies the next state, the symbols to write on each tape, and the movements of the tape heads.
- The computation begins in the initial state q₀ and proceeds according to the transition function until the machine enters either the accept state q_accept or the reject state q_reject.
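A toy sketch of such a three-tape machine (illustrative only, not the Turing pair of Appendix A): it scans the read-only input tape, writes each symbol to the write-only output tape, and accepts on the blank symbol. State names and alphabet are invented for the example.

```python
# Minimal sketch of a 3-tape Turing machine (input / output / private tapes).
BLANK = "_"

def run_copy_machine(input_string):
    # Tape 0: read-only input; tape 1: write-only output; tape 2: private.
    tapes = [list(input_string) + [BLANK], [BLANK], [BLANK]]
    heads = [0, 0, 0]
    state = "q0"
    while state not in ("q_accept", "q_reject"):
        sym = tapes[0][heads[0]]
        if sym == BLANK:
            state = "q_accept"  # end of input: halt and accept
        else:
            # write the scanned symbol on the output tape, move both heads right
            tapes[1][heads[1]] = sym
            tapes[1].append(BLANK)
            heads[0] += 1
            heads[1] += 1
    return "".join(c for c in tapes[1] if c != BLANK)

print(run_copy_machine("abba"))  # -> abba
```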
- Closure: Ensuring that the combination of any two states (or actions) results in another valid state within the system.
- Associativity: The sequence of transitions must not affect the final state of the machine, a non-trivial property to verify in computational processes.
- Identity and Invertibility: Identifying a state that acts as an identity element, and for each state, an inverse state, can be complex in the context of a Turing machine.
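The contrast can be made concrete: composition of transition maps on a configuration set is always associative, but a many-to-one transition has no inverse, which is one reason machine dynamics need not form a group. A minimal sketch with invented maps f and g:

```python
# Transitions as maps on a small configuration set {0, 1, 2}.
f = {0: 1, 1: 2, 2: 0}      # a permutation (invertible)
g = {0: 0, 1: 0, 2: 2}      # many-to-one (not invertible)

def compose(a, b):
    # (a ∘ b)(x) = a(b(x)); composition of maps is always associative.
    return {x: a[b[x]] for x in b}

assert compose(f, compose(g, f)) == compose(compose(f, g), f)
assert len(set(g.values())) < 3  # g collapses states, so no inverse exists
print("composition is associative; g has no inverse")
```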
Appendix E. Symmetry in ODEs: Abstract vs. Traditional Definitions
Olver’s Definition of Symmetry of PDEs [73,168]
Abstract Definition of Symmetry for ODEs
Appendix F. Conserved Quantities and Canonical Variables
Appendix F.1. Some Examples
Appendix F.2. A Not-So-Simple Example of Constrained Dynamics
Appendix F.3. Approximate Symmetry After Transients
Appendix F.4. Transition to Canonical Coordinates
References
- Metzinger, T. Artificial suffering: An argument for a global moratorium on synthetic phenomenology. J. Artif. Intell. Conscious. 2021, 8, 43–66. [Google Scholar] [CrossRef]
- Ethics of Artificial Intelligence | Internet Encyclopedia of Philosophy. 2023. Available online: https://iep.utm.edu/ethics-of-artificial-intelligence/ (accessed on 1 December 2024).
- Ruffini, G. Information, complexity, brains and reality (“Kolmogorov Manifesto”). arXiv 2007, arXiv:0704.1147. [Google Scholar]
- Ruffini, G. Reality as Simplicity. arXiv 2009, arXiv:0903.1193. [Google Scholar] [CrossRef]
- Ruffini, G. Models, networks and algorithmic complexity. arXiv 2016, arXiv:1612.05627. [Google Scholar]
- Ruffini, G. An algorithmic information theory of consciousness. Neurosci. Conscious. 2017, 2017, nix019. [Google Scholar] [CrossRef]
- Ruffini, G.; Lopez-Sola, E. AIT foundations of structured experience. J. Artif. Intell. Conscious. 2022, 9, 153–191. [Google Scholar] [CrossRef]
- Ruffini, G.; Castaldo, F.; Lopez-Sola, E.; Sanchez-Todo, R.; Vohryzek, J. The algorithmic agent perspective and computational neuropsychiatry: From etiology to advanced therapy in major depressive disorder. Entropy 2024, 26, 953. [Google Scholar] [CrossRef] [PubMed]
- Ruffini, G. Navigating Complexity: How Resource-Limited Agents Derive Probability and Generate Emergence. PsyArXiv 2024. [Google Scholar] [CrossRef]
- Van Gulick, R. Consciousness. The Stanford Encyclopedia of Philosophy. 2016. Available online: https://plato.stanford.edu/entries/consciousness/ (accessed on 1 January 2022).
- Friston, K. Does predictive coding have a future? Nat. Neurosci. 2018, 21, 1019–1021. [Google Scholar] [CrossRef] [PubMed]
- Parr, T. Active Inference: The Free Energy Principle in Mind, Brain, and Behavior; The MIT Press: Cambridge, MA, USA, 2022. [Google Scholar]
- Tishby, N.; Pereira, F.C.; Bialek, W. The information bottleneck method. arXiv 2000, arXiv:physics/0004057. [Google Scholar] [CrossRef]
- Li, M.; Vitanyi, P.M. Applications of algorithmic information theory. Scholarpedia 2007, 2, 2658. [Google Scholar] [CrossRef]
- Cover, T.M.; Thomas, J.A. Elements of Information Theory, 2nd ed.; John Wiley & Sons: New York, NY, USA, 2006. [Google Scholar]
- Li, M.; Vitanyi, P. An Introduction to Kolmogorov Complexity and Its Applications; Springer: Berlin/Heidelberg, Germany, 2008. [Google Scholar]
- Schmidhuber, J. Discovering neural nets with low Kolmogorov complexity and high generalization capability. Neural Netw. 1997, 10, 857–873. [Google Scholar] [CrossRef] [PubMed]
- Hutter, M. Universal Artificial Intelligence; Texts in Theoretical Computer Science, An EATCS Series; Springer: Berlin/Heidelberg, Germany, 2005. [Google Scholar] [CrossRef]
- Chaitin, G.J. Epistemology as Information Theory: From Leibniz to Omega. arXiv 2005, arXiv:math/0506552. [Google Scholar] [CrossRef]
- Sober, E. Ockham’s Razors: A User’s Manual; Cambridge University Press: Cambridge, UK, 2015. [Google Scholar]
- Conant, R.C.; Ashby, W.R. Every good regulator of a system must be a model of that system. Int. J. Syst. Sci. 1970, 1, 89–97. [Google Scholar] [CrossRef]
- Parr, T.; Pezzulo, G. Active Inference; MIT Press: London, UK, 2022. [Google Scholar]
- Feynman, R.P. Feynman Lectures on Computation; Advanced book program; Perseus: Cambridge, MA, USA, 2001. [Google Scholar]
- Langton, C.G. Computation at the edge of chaos: Phase transitions and emergent computation. Phys. D Nonlinear Phenom. 1990, 42, 12–37. [Google Scholar] [CrossRef]
- Ruffini, G.; Lopez-Sola, E.; Vohryzek, J.; Sanchez-Todo, R. Neural geometrodynamics, complexity, and plasticity: A psychedelics perspective. Entropy 2024, 26, 90. [Google Scholar] [CrossRef]
- McCabe, T. A Complexity Measure. IEEE Trans. Softw. Eng. 1976, SE-2, 308–320. [Google Scholar] [CrossRef]
- Shew, W.L.; Plenz, D. The functional benefits of criticality in the cortex. Neurosci. Rev. J. Bringing Neurobiol. Neurol. Psychiatry 2013, 19, 88–100. [Google Scholar] [CrossRef] [PubMed]
- Erten, E.; Lizier, J.; Piraveenan, M.; Prokopenko, M. Criticality and information dynamics in epidemiological models. Entropy 2017, 19, 194. [Google Scholar] [CrossRef]
- Li, M.; Han, Y.; Aburn, M.J.; Breakspear, M.; Poldrack, R.A.; Shine, J.M.; Lizier, J.T. Transitions in information processing dynamics at the whole-brain network level are driven by alterations in neural gain. PLoS Comput. Biol. 2019, 15, e1006957. [Google Scholar] [CrossRef]
- Mediano, P.A.M.; Rosas, F.E.; Farah, J.C.; Shanahan, M.; Bor, D.; Barrett, A.B. Integrated information as a common signature of dynamical and information-processing complexity. Chaos 2022, 32, 13115. [Google Scholar] [CrossRef]
- Deco, G.; Jirsa, V.K. Ongoing cortical activity at rest: Criticality, multistability, and ghost attractors. J. Neurosci. 2012, 32, 3366–3375. [Google Scholar] [CrossRef] [PubMed]
- Deco, G.; Jirsa, V.; McIntosh, A.R.; Sporns, O.; Kötter, R. Key role of coupling, delay, and noise in resting brain fluctuations. Proc. Natl. Acad. Sci. USA 2009, 106, 10302–10307. [Google Scholar] [CrossRef] [PubMed]
- Cieri, F.; Zhuang, X.; Caldwell, J.Z.K.; Cordes, D. Brain entropy during aging through a free energy principle approach. Front. Hum. Neurosci. 2021, 15, 139. [Google Scholar] [CrossRef]
- Tagliazucchi, E.; Balenzuela, P.; Fraiman, D.; Chialvo, D.R. Criticality in large-scale brain FMRI dynamics unveiled by a novel point process analysis. Front. Physiol. 2012, 3, 15. [Google Scholar] [CrossRef] [PubMed]
- Chialvo, D.R. Emergent complex neural dynamics. Nat. Phys. 2010, 6, 744–750. [Google Scholar] [CrossRef]
- Haimovici, A.; Tagliazucchi, E.; Balenzuela, P.; Chialvo, D.R. Brain organization into resting state networks emerges at criticality on a model of the human connectome. Phys. Rev. Lett. 2013, 110, 178101. [Google Scholar] [CrossRef]
- Marinazzo, D.; Pellicoro, M.; Wu, G.; Angelini, L.; Cortés, J.; Stramaglia, S. Information transfer and criticality in the ising model on the human connectome. PLoS ONE 2014, 9, e93616. [Google Scholar] [CrossRef] [PubMed]
- Hancock, F.; Rosas, F.; Mediano, P.; Luppi, A.; Cabral, J.; Dipasquale, O.; Turkheimer, F. May the 4C's be with you: An overview of complexity-inspired frameworks for analyzing resting-state neuroimaging data. J. R. Soc. Interface 2022, 19, 20220214. [Google Scholar] [CrossRef] [PubMed]
- Ruffini, G.; Deco, G. The 2D Ising model, criticality and AIT. bioRxiv 2021. [Google Scholar] [CrossRef]
- Ruffini, G.; Damiani, G.; Lozano-Soldevilla, D.; Deco, N.; Rosas, F.E.; Kiani, N.A.; Ponce-Alvarez, A.; Kringelbach, M.L.; Carhart-Harris, R.; Deco, G. LSD-induced increase of Ising temperature and algorithmic complexity of brain dynamics. PLoS Comput. Biol. 2023, 19, e1010811. [Google Scholar] [CrossRef]
- Carhart-Harris, R.L.; Friston, K.J. REBUS and the anarchic brain: Toward a unified model of the brain action of psychedelics. Pharmacol. Rev. 2019, 71, 316–344. [Google Scholar] [CrossRef] [PubMed]
- Jensen, H.J. Self-Organized Criticality: Emergent Complex Behavior in Physical and Biological Systems; Cambridge lecture notes in physics; Cambridge University Press: Cambridge, UK, 1998. [Google Scholar] [CrossRef]
- Pruessner, G. Self-Organised Criticality; Cambridge University Press: Cambridge, UK, 2012. [Google Scholar]
- Christensen, K.; Moloney, N.R. Complexity and Criticality; Imperial College Press: London, UK, 2005. [Google Scholar]
- Touboul, J.; Destexhe, A. Can Power-Law Scaling and Neuronal Avalanches Arise from Stochastic Dynamics? PLoS ONE 2010, 5, e8982. [Google Scholar] [CrossRef] [PubMed]
- Mariani, B.; Nicoletti, G.; Bisio, M.; Maschietto, M.; Vassanelli, S.; Suweis, S. Disentangling the critical signatures of neural activity. Sci. Rep. 2022, 12, 10770. [Google Scholar] [CrossRef] [PubMed]
- Morrell, M.C.; Nemenman, I.; Sederberg, A. Neural criticality from effective latent variables. eLife 2024, 12, RP89337. [Google Scholar] [CrossRef] [PubMed]
- Turrigiano, G.G.; Nelson, S.B. Homeostatic plasticity in the developing nervous system. Nat. Rev. Neurosci. 2004, 5, 97–107. [Google Scholar] [CrossRef]
- Plenz, D.; Ribeiro, T.L.; Miller, S.R.; Kells, P.A.; Vakili, A.; Capek, E.L. Self-organized criticality in the brain. Front. Phys. 2021, 9, 639389. [Google Scholar] [CrossRef]
- Hesse, J.; Gross, T. Self-organized criticality as a fundamental property of neural systems. Front. Syst. Neurosci. 2014, 8, 166. [Google Scholar] [CrossRef] [PubMed]
- Carhart-Harris, R.L. The entropic brain-revisited. Neuropharmacology 2018, 142, 167–178. [Google Scholar] [CrossRef] [PubMed]
- Wigner, E.P. The unreasonable effectiveness of mathematics in the natural sciences. Richard Courant Lecture in Mathematical Sciences delivered at New York University, May 11, 1959. Commun. Pure Appl. Math. 1960, 13, 1–14. [Google Scholar] [CrossRef]
- Henry, S.; Kafura, D. Software Structure Metrics Based on Information Flow. IEEE Trans. Softw. Eng. 1981, SE-7, 510–518. [Google Scholar] [CrossRef]
- Aho, A.V.; Hopcroft, J.E.; Ullman, J.D. The Design and Analysis of Computer Algorithms; Addison-Wesley series in computer science and information processing; Addison-Wesley: Reading, MA, USA, 2000. [Google Scholar]
- Rajaraman, V.; Murthy, C.S.R. Parallel Computers: Architecture and Programming; PHI Learning Pvt. Ltd.: Delhi, India, 2016. [Google Scholar]
- Pierce, B.C. Basic Category Theory for Computer Scientists; Foundations of computing; MIT Press: Cambridge, MA, USA, 2011. [Google Scholar]
- Hughes, J. Why Functional Programming Matters. Comput. J. 1989, 32, 98–107. [Google Scholar] [CrossRef]
- Bird, R.; Wadler, P. Introduction to Functional Programming; Prentice-Hall International series in computer science; Prentice Hall: New York, NY, USA, 1995. [Google Scholar]
- Carhart-Harris, R.L.; Leech, R.; Hellyer, P.J.; Shanahan, M.; Feilding, A.; Tagliazucchi, E.; Chialvo, D.R.; Nutt, D. The entropic brain: A theory of conscious states informed by neuroimaging research with psychedelic drugs. Front. Hum. Neurosci. 2014, 8, 20. [Google Scholar] [CrossRef]
- Bengio, Y.; Courville, A.; Vincent, P. Representation learning: A review and new perspectives. arXiv 2012, arXiv:1206.5538. [Google Scholar] [CrossRef] [PubMed]
- Brown, B.C.; Caterini, A.L.; Ross, B.L.; Cresswell, J.C.; Loaiza-Ganem, G. Verifying the union of manifolds hypothesis for image data. In Proceedings of the Eleventh International Conference on Learning Representations, Kigali, Rwanda, 1–5 May 2023. [Google Scholar]
- Shine, J.M.; Hearne, L.J.; Breakspear, M.; Hwang, K.; Müller, E.J.; Sporns, O.; Poldrack, R.A.; Mattingley, J.B.; Cocchi, L. The low-dimensional neural architecture of cognitive complexity is related to activity in medial thalamic nuclei. Neuron 2019, 104, 849–855.e3. [Google Scholar] [CrossRef]
- Noether, E. Invariante Variationsprobleme. In Nachrichten von der Gesellschaft der Wissenschaften zu Göttingen; Mathematisch-Physikalische Klasse: Berlin, Germany, 1918; pp. 235–257. [Google Scholar]
- Neuenschwander, D.E. Emmy Noether’s Wonderful Theorem; Johns Hopkins University Press: Baltimore, MD, USA, 2017. [Google Scholar]
- Poggio, T.; Mutch, J.; Leibo, J.; Rosasco, L.; Tacchetti, A. The Computational Magic of the Ventral Stream: Sketch of a Theory (and Why Some Deep Architectures Work); Technical Report; MIT Computer Science and Artificial Intelligence Laboratory: Cambridge, MA, USA, 2012; Issue: MIT-CSAIL-TR-2012-035. [Google Scholar]
- Anselmi, F.; Poggio, T. Representation learning in sensory cortex: A theory. IEEE Access Pract. Innov. Open Solut. 2022, 10, 102475–102491. [Google Scholar] [CrossRef]
- Kaiser, E.; Kutz, J.N.; Brunton, S.L. Data-driven discovery of Koopman eigenfunctions for control. arXiv 2017, arXiv:1707.01146. [Google Scholar] [CrossRef]
- Kaiser, E.; Kutz, J.N.; Brunton, S.L. Discovering conservation laws from data for control. In Proceedings of the 2018 IEEE Conference on Decision and Control (CDC), Miami Beach, FL, USA, 17–19 December 2018. [Google Scholar]
- Otto, S.E.; Zolman, N.; Kutz, J.N.; Brunton, S.L. A unified framework to enforce, discover, and promote symmetry in machine learning. arXiv 2023, arXiv:2311.00212. [Google Scholar]
- Fukushima, K. Neocognitron: A self organizing neural network model for a mechanism of pattern recognition unaffected by shift in position. Biol. Cybern. 1980, 36, 193–202. [Google Scholar] [CrossRef]
- LeCun, Y.; Boser, B.; Denker, J.S.; Henderson, D.; Howard, R.E.; Hubbard, W.; Jackel, L.D. Backpropagation applied to handwritten zip code recognition. Neural Comput. 1989, 1, 541–551. [Google Scholar] [CrossRef]
- Hall, B.C. Lie Groups, Lie Algebras, and Representations: An Elementary Introduction; Springer: Berlin/Heidelberg, Germany, 2015. [Google Scholar]
- Olver, P.J. Applications of Lie Groups to Differential Equations, 1986th ed.; Graduate texts in mathematics; Springer: New York, NY, USA, 2012. [Google Scholar]
- Rao, R.; Ruderman, D. Learning lie groups for invariant visual perception. In Advances in Neural Information Processing Systems; Kearns, M., Solla, S., Cohn, D., Eds.; MIT Press: Cambridge, MA, USA, 1998; Volume 11. [Google Scholar]
- Poggio, T. The computational magic of the ventral stream. Nat. Preced. 2012. [Google Scholar] [CrossRef]
- Moskalev, A.; Sepliarskaia, A.; Sosnovik, I.; Smeulders, A. LieGG: Studying learned Lie group generators. Adv. Neural Inf. Process. Syst. 2022, 35, 25212–25223. [Google Scholar]
- Stillwell, J. Naive Lie Theory; Springer: Berlin/Heidelberg, Germany, 2008. [Google Scholar]
- Schmid, R. Infinite-dimensional Lie groups and algebras in mathematical physics. Adv. Math. Phys. 2010, 2010, 280362. [Google Scholar] [CrossRef]
- Selig, J.M. Geometric Fundamentals of Robotics, 2nd ed.; Monographs in computer science; Springer: New York, NY, USA, 2004. [Google Scholar]
- Blender Online Community. Blender—A 3D Modelling and Rendering Package. Manual. Blender Foundation, Stichting Blender Foundation, Amsterdam. 2018. Available online: http://www.blender.org (accessed on 15 October 2024).
- Miao, X.; Rao, R.P.N. Learning the Lie groups of visual invariance. Neural Comput. 2007, 19, 2665–2693. [Google Scholar] [CrossRef] [PubMed]
- Ibrahim, M.; Bouchacourt, D.; Morcos, A.S. Robust self-supervised learning with lie groups. arXiv 2023, arXiv:2210.13356. [Google Scholar]
- Lynch, K.M.; Park, F.C. Modern Robotics; Cambridge University Press: Cambridge, UK, 2017. [Google Scholar]
- Hornik, K. Approximation capabilities of multilayer feedforward networks. Neural Netw. 1991, 4, 251–257. [Google Scholar] [CrossRef]
- Poggio, T.; Mhaskar, H.; Rosasco, L.; Miranda, B.; Liao, Q. Why and when can deep—But not shallow—Networks avoid the curse of dimensionality: A review. arXiv 2016, arXiv:1611.00740. [Google Scholar] [CrossRef]
- Mhaskar, H.; Liao, Q.; Poggio, T. Learning Functions: When Is Deep Better Than Shallow. arXiv 2016, arXiv:1603.00988. [Google Scholar]
- Agrawal, D.; Ostrowski, J. A classification of G-invariant shallow neural networks. Adv. Neural Inf. Process. Syst. 2022, 35, 13679–13690. [Google Scholar]
- Moore, C. Generalized shifts: Unpredictability and undecidability in dynamical systems. Nonlinearity 1991, 4, 199–230. [Google Scholar] [CrossRef]
- Branicky, M.S. Universal computation and other capabilities of hybrid and continuous dynamical systems. Theor. Comput. Sci. 1995, 138, 67–100. [Google Scholar] [CrossRef]
- Graça, D.; Zhong, N. Analytic one-dimensional maps and two-dimensional ordinary differential equations can robustly simulate Turing machines. Computability 2023, 12, 117–144. [Google Scholar] [CrossRef]
- Friedenberger, Z.; Harkin, E.; Tóth, K.; Naud, R. Silences, spikes and bursts: Three-part knot of the neural code. J. Physiol. 2023, 601, 5165–5193. [Google Scholar] [CrossRef] [PubMed]
- Charó, G.D.; Chekroun, M.D.; Sciamarella, D.; Ghil, M. Noise-driven topological changes in chaotic dynamics. Chaos 2021, 31, 103115. [Google Scholar] [CrossRef]
- Hydon, P.E. Cambridge Texts in Applied Mathematics: Symmetry Methods for Differential Equations: A Beginner’s Guide Series Number 22; Cambridge texts in applied mathematics; Cambridge University Press: Cambridge, UK, 2000. [Google Scholar]
- Leinster, T. Galois Theory. arXiv 2024, arXiv:2408.07499. [Google Scholar] [CrossRef]
- Coddington, E.A. An Introduction to Ordinary Differential Equations; Dover Books on mathematics; Dover Publications: New York, NY, USA, 1989. [Google Scholar]
- Fradkin, D.M. Existence of the dynamic symmetries O4 and SU3 for all classical central potential problems. Prog. Theor. Phys. 1967, 37, 798–812. [Google Scholar] [CrossRef]
- Campbell, S.; Linh, V.; Petzold, L. Differential-algebraic equations. Scholarpedia 2008, 3, 2849. [Google Scholar] [CrossRef]
- Sanchez, R.M.; Dunkelberger, G.R.; Quigley, H.A. The number and diameter distribution of axons in the monkey optic nerve. Investig. Ophthalmol. Vis. Sci. 1986, 27, 1342–1350. [Google Scholar]
- Li, Y.; Zhang, L.; Qiu, Z.; Jiang, Y.; Zhang, Y.; Li, N.; Ma, Y.; Xu, L.; Yu, J. NIMBLE: A non-rigid hand model with bones and muscles. Acm Trans. Graph. (Tog) 2022, 41, 1–16. [Google Scholar] [CrossRef]
- Herculano-Houzel, S. The remarkable, yet not extraordinary, human brain as a scaled-up primate brain and its associated cost. Proc. Natl. Acad. Sci. USA 2012, 109 (Suppl. 1), 10661–10668. [Google Scholar] [CrossRef]
- Pakkenberg, B.; Pelvig, D.; Marner, L.; Bundgaard, M.J.; Gundersen, H.J.G.; Nyengaard, J.R.; Regeur, L. Aging and the human neocortex. Exp. Gerontol. 2003, 38, 95–99. [Google Scholar] [CrossRef] [PubMed]
- Jaeger, H. The “Echo State” approach to analysing and training recurrent neural networks. Bonn Ger. Ger. Natl. Res. Cent. Inf. Technol. Gmd Tech. Rep. 2001, 148, 13. [Google Scholar]
- Lukoševičius, M. A practical guide to applying echo state networks. In Neural Networks: Tricks of the Trade; Springer: Berlin/Heidelberg, Germany, 2012; pp. 659–686. [Google Scholar] [CrossRef]
- Tanaka, G.; Yamane, T.; Héroux, J.B.; Nakane, R.; Kanazawa, N.; Takeda, S.; Numata, H.; Nakano, D.; Hirose, A. Recent advances in physical reservoir computing: A review. Neural Netw. 2019, 115, 100–123. [Google Scholar] [CrossRef] [PubMed]
- Kingma, D.P.; Welling, M. Auto-encoding variational bayes. arXiv 2013, arXiv:1312.6114. [Google Scholar]
- Khalil, H.K. Nonlinear Systems. Hauptbd, 3rd ed.; Prentice Hall: Upper Saddle River, NJ, USA, 2002. [Google Scholar]
- DiCarlo, J.J.; Zoccolan, D.; Rust, N.C. How does the brain solve visual object recognition? Neuron 2012, 73, 415–434. [Google Scholar] [CrossRef]
- Grill-Spector, K.; Weiner, K.S. The functional architecture of the ventral temporal cortex and its role in categorization. Nat. Rev. Neurosci. 2014, 15, 536–548. [Google Scholar] [CrossRef] [PubMed]
- Bizley, J.K.; Cohen, Y.E. The what, where and how of auditory-object perception. Nat. Rev. Neurosci. 2013, 14, 693–707. [Google Scholar] [CrossRef]
- Perl, Y.S.; Bocaccio, H.; Pérez-Ipiña, I.; Zamberlán, F.; Piccinini, J.; Laufs, H.; Kringelbach, M.; Deco, G.; Tagliazucchi, E. Generative embeddings of brain collective dynamics using variational autoencoders. Phys. Rev. Lett. 2020, 125, 238101. [Google Scholar] [CrossRef]
- Cayton, L. Algorithms for Manifold Learning; Technical report; Department of Computer Science & Engineering: San Diego, CA, USA, 2008. [Google Scholar]
- Urai, A.E.; Doiron, B.; Leifer, A.M.; Churchland, A.K. Large-scale neural recordings call for new insights to link brain and behavior. Nat. Neurosci. 2022, 25, 11–19. [Google Scholar] [CrossRef]
- Nicoletti, G.; Busiello, D.M. Information Propagation in Multilayer Systems with Higher-Order Interactions across Timescales. Phys. Rev. X 2024, 14, 021007. [Google Scholar] [CrossRef]
- Poggio, T. Foundations of Deep Learning: Compositional Sparsity of Computable Functions; CBMM memo, Center for Brains, Minds, and Machines, MIT: Cambridge, MA, USA, 2023; Issue: 138. [Google Scholar]
- Poggio, T. Compositional Sparsity: A Framework for ML; CBMM memo, Center for Brains, Minds, and Machines, MIT: Cambridge, MA, USA, 2022; Issue: 138. [Google Scholar]
- Perl, Y.S.; Geli, S.; Pérez-Ordoyo, E.; Zonca, L.; Idesis, S.; Vohryzek, J.; Jirsa, V.K.; Kringelbach, M.L.; Tagliazucchi, E.; Deco, G. Whole-brain modelling of low-dimensional manifold modes reveals organising principle of brain dynamics. bioRxiv 2023. [Google Scholar] [CrossRef]
- Farnes, N.; Juel, B.E.; Nilsen, A.S.; Romundstad, L.G.; Storm, J.F. Increased signal diversity/complexity of spontaneous EEG, but not evoked EEG responses, in ketamine-induced psychedelic state in humans. PLoS ONE 2020, 15, e0242056. [Google Scholar] [CrossRef]
- Khalili-Ardali, M.; Wu, S.; Tonin, A.; Birbaumer, N.; Chaudhary, U. Neurophysiological aspects of the completely locked-in syndrome in patients with advanced amyotrophic lateral sclerosis. Clin. Neurophysiol. Off. J. Int. Fed. Clin. Neurophysiol. 2021, 132, 1064–1076. [Google Scholar] [CrossRef] [PubMed]
- Zilio, F.; Gomez-Pilar, J.; Chaudhary, U.; Fogel, S.; Fomina, T.; Synofzik, M.; Schöls, L.; Cao, S.; Zhang, J.; Huang, Z.; et al. Altered brain dynamics index levels of arousal in complete locked-in syndrome. Commun. Biol. 2023, 6, 757. [Google Scholar] [CrossRef]
- Varley, T.F.; Craig, M.; Adapa, R.; Finoia, P.; Williams, G.; Allanson, J.; Stamatakis, E.A. Fractal dimension of cortical functional connectivity networks & severity of disorders of consciousness. PLoS ONE 2020, 15, e0223812. [Google Scholar]
- Luppi, A.I.; Craig, M.M.; Pappas, I.; Finoia, P.; Williams, G.B.; Allanson, J.; Pickard, J.D.; Owen, A.M.; Naci, L.; Menon, D.K.; et al. Consciousness-specific dynamic interactions of brain integration and functional diversity. Nat. Commun. 2019, 10, 4616. [Google Scholar] [CrossRef]
- Viol, A.; Palhano-Fontes, F.; Onias, H.; De Araujo, D.B.; Viswanathan, G.M. Shannon entropy of brain functional complex networks under the influence of the psychedelic Ayahuasca. Sci. Rep. 2017, 7, 7388. [Google Scholar] [CrossRef]
- Timmermann, C.; Roseman, L.; Schartner, M.; Milliere, R.; Williams, L.T.J.; Erritzoe, D.; Muthukumaraswamy, S.; Ashton, M.; Bendrioua, A.; Kaur, O.; et al. Neural correlates of the DMT experience assessed with multivariate EEG. Sci. Rep. 2019, 9, 16324. [Google Scholar] [CrossRef] [PubMed]
- Hipólito, I.; Mago, J.; Rosas, F.E.; Carhart-Harris, R. Pattern breaking: A complex systems approach to psychedelic medicine. Neurosci. Conscious. 2023, 2023, niad017. [Google Scholar] [CrossRef]
- Zenil, H. A review of methods for estimating algorithmic complexity: Options, challenges, and new directions. Entropy 2020, 22, 612. [Google Scholar] [CrossRef] [PubMed]
- Gallego, J.A.; Perich, M.G.; Miller, L.E.; Solla, S.A. Neural Manifolds for the Control of Movement. Neuron 2017, 94, 978–984. [Google Scholar] [CrossRef] [PubMed]
- Mitchell-Heggs, R.; Prado, S.; Gava, G.P.; Go, M.A.; Schultz, S.R. Neural manifold analysis of brain circuit dynamics in health and disease. J. Comput. Neurosci. 2023, 51, 1–21. [Google Scholar] [CrossRef] [PubMed]
- Saxena, S.; Russo, A.A.; Cunningham, J.; Churchland, M.M. Motor cortex activity across movement speeds is predicted by network-level strategies for generating muscle activity. eLife 2022, 11, e67620. [Google Scholar] [CrossRef] [PubMed]
- Brennan, C.; Proekt, A. A quantitative model of conserved macroscopic dynamics predicts future motor commands. eLife 2019, 8, e46814. [Google Scholar] [CrossRef]
- Fortunato, C.; Bennasar-Vázquez, J.; Park, J.; Chang, J.C.; Miller, L.E.; Dudman, J.T.; Perich, M.G.; Gallego, J.A. Nonlinear manifolds underlie neural population activity during behaviour. bioRxiv 2024. [Google Scholar] [CrossRef]
- Chaudhuri, R.; Gerçek, B.; Pandey, B.; Peyrache, A.; Fiete, I. The intrinsic attractor manifold and population dynamics of a canonical cognitive circuit across waking and sleep. Nat. Neurosci. 2019, 22, 1512–1520. [Google Scholar] [CrossRef]
- Stopfer, M.; Jayaraman, V.; Laurent, G. Intensity versus Identity Coding in an Olfactory System. Neuron 2003, 39, 991–1004. [Google Scholar] [CrossRef] [PubMed]
- Gardner, R.J.; Hermansen, E.; Pachitariu, M.; Burak, Y.; Baas, N.A.; Dunn, B.A.; Moser, M.B.; Moser, E.I. Toroidal topology of population activity in grid cells. Nature 2022, 602, 123–128.
- Kim, S.S.; Rouault, H.; Druckmann, S.; Jayaraman, V. Ring attractor dynamics in the Drosophila central brain. Science 2017, 356, 849–853.
- Petrucco, L.; Lavian, H.; Wu, Y.K.; Svara, F.; Štih, V.; Portugues, R. Neural dynamics and architecture of the heading direction circuit in zebrafish. Nat. Neurosci. 2023, 26, 765–773.
- Smith, D.W. Phenomenology. In The Stanford Encyclopedia of Philosophy, Summer 2018 ed.; Zalta, E.N., Ed.; Metaphysics Research Lab, Stanford University: Stanford, CA, USA, 2018.
- Kawakita, G.; Zeleznikow-Johnston, A.; Takeda, K.; Tsuchiya, N.; Oizumi, M. Is my “red” your “red”?: Unsupervised alignment of qualia structures via optimal transport. PsyArXiv 2023.
- Zamberlan, F.; Sanz, C.; Martínez Vivot, R.; Pallavicini, C.; Erowid, F.; Erowid, E.; Tagliazucchi, E. The Varieties of the Psychedelic Experience: A Preliminary Study of the Association Between the Reported Subjective Effects and the Binding Affinity Profiles of Substituted Phenethylamines and Tryptamines. Front. Integr. Neurosci. 2018, 12, 54.
- Bzdok, D.; Carhart-Harris, R.; Savignac, C.; Bell, G.; Laureys, S. Large Language Models Auto-Profile Conscious Awareness Changes Under Psychedelic Drug Effects. 2024. Available online: https://www.researchsquare.com/article/rs-4670805/v1 (accessed on 1 September 2024).
- Bonifácio, T.A.d.S.; Carvalho, R.d.M.C.d.; Cravo, A.M. Self-report measures of subjective time: An overview of existing measures and their semantic similarities. PsyArXiv 2024.
- Goldblum, M.; Finzi, M.; Rowan, K.; Wilson, A.G. The No Free Lunch Theorem, Kolmogorov Complexity, and the Role of Inductive Biases in Machine Learning. arXiv 2024, arXiv:2304.05366.
- Huh, M.; Cheung, B.; Wang, T.; Isola, P. The Platonic Representation Hypothesis. arXiv 2024, arXiv:2405.07987.
- Poggio, T.; Fraser, M. Compositional sparsity of learnable functions. Bull. Am. Math. Soc. 2024, 61, 438–456.
- Hu, L.; Li, Y.; Lin, Z. Symmetry Discovery for Different Data Types. arXiv 2024, arXiv:2410.09841.
- Yang, J.; Dehmamy, N.; Walters, R.; Yu, R. Latent Space Symmetry Discovery. arXiv 2024, arXiv:2310.00105.
- Dehmamy, N.; Walters, R.; Liu, Y.; Wang, D.; Yu, R. Automatic Symmetry Discovery with Lie Algebra Convolutional Network. arXiv 2021, arXiv:2109.07103.
- Coleman, A.J. The Betti Numbers of the Simple Lie Groups. Can. J. Math. 1958, 10, 349–356.
- Rao, R.P.; Ballard, D.H. Predictive coding in the visual cortex: A functional interpretation of some extra-classical receptive-field effects. Nat. Neurosci. 1999, 2, 79–87.
- Weiler, M.; Forré, P.; Verlinde, E.; Welling, M. Equivariant and coordinate independent convolutional networks. In A Gauge Field Theory of Neural Networks; World Scientific: Singapore, 2023.
- Ruffini, G.; Wendling, F.; Sanchez-Todo, R.; Santarnecchi, E. Targeting brain networks with multichannel transcranial current stimulation (tCS). Curr. Opin. Biomed. Eng. 2018, 8, 70–77.
- Borgqvist, J.; Ohlsson, F.; Baker, R.E. Symmetries of systems of first order ODEs: Symbolic symmetry computations, mechanistic model construction and applications in biology. arXiv 2022, arXiv:2202.04935.
- Golos, M.; Jirsa, V.; Daucé, E. Multistability in Large Scale Models of Brain Activity. PLoS Comput. Biol. 2015, 11, e1004644.
- Beggs, J.M.; Timme, N. Being critical of criticality in the brain. Front. Physiol. 2012, 3, 163.
- Philipona, D.; O’Regan, J.; Nadal, J.; Coenen, O. Perception of the structure of the physical world using unknown multimodal sensors and effectors. In Advances in Neural Information Processing Systems; MIT Press: Cambridge, MA, USA, 2004; pp. 945–952.
- Ruffini, P. Teoria Generale delle Equazioni, in Cui si Dimostra Impossibile la Soluzione Algebraica delle Equazioni Generali di Grado Superiore al Quarto; Nella Stamperia di S. Tommaso d’Aquino: Bologna, Italy, 1799.
- Galois, E. Mémoire sur les conditions de résolubilité des équations par radicaux. J. Mathématiques Pures Appliquées 1830, 11, 417–433.
- Stewart, I.N. Galois Theory, 4th ed.; Apple Academic Press: Oakville, MO, USA, 2015.
- Kumpera, A.; Spencer, D.C. Lie Equations, Vol. 1: General Theory (AM-73); Number 73 in Annals of Mathematics Studies; Princeton University Press: Princeton, NJ, USA, 2016.
- Moerdijk, I.; Mrcun, J. Introduction to Foliations and Lie Groupoids, 1st ed.; Cambridge University Press: Cambridge, UK, 2003.
- Sharpe, R.W. Differential Geometry: Cartan’s Generalization of Klein’s Erlangen Program; Number 166 in Graduate Texts in Mathematics; Springer: New York, NY, USA, 1997.
- Olver, P.; Asorey, M.; Clemente-Gallardo, J.; Martínez, E.; Cariñena, J.F. Recent Advances in the Theory and Application of Lie Pseudo-Groups. AIP Conf. Proc. 2010, 1260, 35–63.
- Laumon, G.; Moret-Bailly, L. Champs Algébriques; Number 39 in Ergebnisse der Mathematik und ihrer Grenzgebiete, 3. Folge/A Series of Modern Surveys in Mathematics; Springer: Berlin/Heidelberg, Germany, 2000.
- Mostow, G.D. Strong Rigidity of Locally Symmetric Spaces (AM-78); Princeton University Press: Princeton, NJ, USA, 1973.
- Hirsch, M.W. Actions of Lie groups and Lie algebras on manifolds. In A Celebration of the Mathematical Legacy of Raoul Bott; Kotiuga, P.R., Ed.; CRM Proceedings & Lecture Notes; Centre de Recherches Mathématiques, U. de Montréal: Providence, RI, USA, 2010; Volume 50, pp. 69–78.
- Fraleigh, J.B.; Katz, V. A First Course in Abstract Algebra, 7th ed.; World Student Series; Addison-Wesley: Boston, MA, USA, 2003.
- Sipser, M. Introduction to the Theory of Computation, 3rd ed.; Cengage Learning: Boston, MA, USA, 2013.
- Wolfram, S. A New Kind of Science; Wolfram Media: Champaign, IL, USA, 2002.
- Olver, P.J. Symmetry and explicit solutions of partial differential equations. Appl. Numer. Math. 1992, 10, 307–324.
- Landau, L.D.; Lifshitz, E.M. Mechanics, 3rd ed.; Butterworth-Heinemann: Oxford, UK, 1982.
- Arnold, V.I. Mathematical Methods of Classical Mechanics, 2nd ed.; Springer: New York, NY, USA, 1989.
- Goldstein, H. Classical Mechanics, 2nd ed.; Addison-Wesley Publishing: Reading, MA, USA, 1980.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ruffini, G.; Castaldo, F.; Vohryzek, J. Structured Dynamics in the Algorithmic Agent. Entropy 2025, 27, 90. https://doi.org/10.3390/e27010090