| John Hopfield | |
|---|---|
| Born | John Joseph Hopfield, July 15, 1933, Chicago, Illinois, U.S. |
| Education | Swarthmore College (AB); Cornell University (PhD) |
| Known for | Hopfield network; Modern Hopfield network; Hopfield dielectric; Polariton; Kinetic proofreading |
| Awards | |
| Scientific career | |
| Fields | Physics; Molecular biology; Complex systems; Neuroscience |
| Institutions | Bell Labs; Princeton University; University of California, Berkeley; California Institute of Technology |
| Thesis | A quantum-mechanical theory of the contribution of excitons to the complex dielectric constant of crystals (1958) |
| Doctoral advisor | Albert Overhauser |
| Doctoral students | Steven Girvin; Gerald Mahan; Bertrand Halperin; David J. C. MacKay; José Onuchic; Terry Sejnowski; Erik Winfree; Li Zhaoping |
John Joseph Hopfield (born July 15, 1933) [1] is an American physicist and professor emeritus at Princeton University, most widely known for his 1982 study of associative neural networks and for the development of the Hopfield network. Before its invention, research in artificial intelligence (AI) was in a period of decline known as the AI winter; Hopfield's work revitalized large-scale interest in the field. [2] [3]
In 2024, Hopfield and Geoffrey Hinton were jointly awarded the Nobel Prize in Physics for their foundational contributions to machine learning, particularly their work on artificial neural networks. [4] [2] He has also received major physics prizes for work spanning condensed matter physics, statistical physics, and biophysics.
John Joseph Hopfield was born in 1933 in Chicago [1] to physicists John Joseph Hopfield (born in Poland as Jan Józef Chmielewski) and Helen Hopfield (née Staff). [5] [6]
Hopfield received a Bachelor of Arts with a major in physics from Swarthmore College in Pennsylvania in 1954 and a Doctor of Philosophy in physics from Cornell University in 1958. [1] His doctoral dissertation was titled "A quantum-mechanical theory of the contribution of excitons to the complex dielectric constant of crystals". [7] His doctoral advisor was Albert Overhauser. [1]
He spent two years in the theory group at Bell Laboratories, working with David Gilbert Thomas on the optical properties of semiconductors [8] and later collaborating with Robert G. Shulman on a quantitative model of the cooperative behavior of hemoglobin. [1] [5] [9] He subsequently held faculty positions at the University of California, Berkeley (physics, 1961–1964), [2] Princeton University (physics, 1964–1980), [2] the California Institute of Technology (Caltech; chemistry and biology, 1980–1997), [2] and again at Princeton (1997–), [2] [1] where he is the Howard A. Prior Professor of Molecular Biology, emeritus. [10]
In 1976, he participated in a short science film on the structure of hemoglobin, featuring Linus Pauling. [11]
From 1981 to 1983, Richard Feynman, Carver Mead and Hopfield gave a one-year course at Caltech called "The Physics of Computation". [12] Hopfield was invited by Feynman to teach the part on associative neural networks. [12] [13] This collaboration inspired the Computation and Neural Systems PhD program at Caltech, co-founded by Hopfield in 1986. [14] [12]
His former PhD students include Gerald Mahan (PhD in 1964), [15] Bertrand Halperin (1965), [16] Steven Girvin (1977), [16] Terry Sejnowski (1978), [16] Erik Winfree (1998), [16] José Onuchic (1987), [16] Li Zhaoping (1990) [17] and David J. C. MacKay (1992). [16]
In his doctoral work of 1958, he wrote on the interaction of excitons in crystals, coining the term polariton for a quasiparticle that appears in solid-state physics. [18] [19] He wrote: "The polarization field 'particles' analogous to photons will be called 'polaritons'." [19] His polariton model is sometimes known as the Hopfield dielectric. [20]
From 1959 to 1963, Hopfield and David G. Thomas investigated the exciton structure of cadmium sulfide through its reflection spectra. Their experiments and theoretical models advanced the understanding of the optical spectroscopy of II–VI semiconductor compounds. [21]
Condensed matter physicist Philip W. Anderson described John Hopfield as his "hidden collaborator" on his 1961–1970 work on the Anderson impurity model, which explained the Kondo effect. Hopfield was not included as a co-author on the papers, but Anderson acknowledged the importance of Hopfield's contribution in several of his writings. [22]
William C. Topp and Hopfield introduced the concept of norm-conserving pseudopotentials in 1973. [23] [24] [25]
In 1974 he introduced kinetic proofreading, a mechanism for error correction in biochemical reactions, to explain the accuracy of DNA replication. [26] [27]
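The quantitative benefit of proofreading can be seen in a minimal two-step sketch (a simplification for illustration, not Hopfield's full kinetic scheme): if a single equilibrium binding step distinguishes correct from incorrect substrates with an error fraction set by the free-energy difference between them, inserting an essentially irreversible, energy-consuming intermediate step lets the same free-energy difference be exploited a second time, squaring the minimum achievable error.

```latex
% Minimal two-step picture of kinetic proofreading (illustrative sketch).
% A single equilibrium discrimination step gives an error fraction
f = e^{-\Delta G / k_B T}
% An irreversible, energy-consuming intermediate allows a second,
% independent discrimination using the same \Delta G, so
f_{\min} \approx f^{2} = e^{-2\Delta G / k_B T}
```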
Hopfield published his first paper in neuroscience in 1982, titled "Neural networks and physical systems with emergent collective computational abilities", in which he introduced what is now known as the Hopfield network, a type of artificial neural network that can serve as a content-addressable memory and is built from binary neurons that can be "on" or "off". [28] [5] He extended the formalism to continuous activation functions in 1984. [29] The 1982 and 1984 papers are his two most cited works. [10] Hopfield has said that the inspiration came from his knowledge of spin glasses, gained through his collaborations with P. W. Anderson. [30]
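As an illustration of the 1982 model, the following is a minimal NumPy sketch (the function names and the small eight-neuron example are illustrative choices, not code from the paper): ±1 patterns are stored with a Hebbian outer-product rule, and a corrupted cue is recalled by asynchronous updates that drive the state toward a local minimum of the network energy.

```python
import numpy as np

def train_hopfield(patterns):
    """Store binary (+1/-1) patterns with the Hebbian outer-product rule."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)                 # no self-connections
    return W / n

def energy(W, state):
    """Hopfield energy; asynchronous updates never increase it."""
    return -0.5 * state @ W @ state

def recall(W, state, steps=200, seed=0):
    """Asynchronous updates: align one randomly chosen neuron with its local field."""
    rng = np.random.default_rng(seed)
    state = state.copy()
    for _ in range(steps):
        i = rng.integers(len(state))
        state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Store one pattern and recover it from a corrupted cue.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
cue = pattern.copy()
cue[:2] *= -1                              # flip two bits
print(recall(W, cue))                      # converges back to the stored pattern
```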
Together with David W. Tank, Hopfield developed a method in 1985–1986 [31] [32] for solving discrete optimization problems using the continuous-time dynamics of a Hopfield network with continuous activation functions. The optimization problem was encoded in the interaction parameters (weights) of the network, and the effective temperature of the analog system was gradually decreased, as in global optimization with simulated annealing. [33]
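A toy NumPy sketch of the idea follows; it uses a simple "choose exactly one item" encoding rather than the traveling-salesman formulation Hopfield and Tank studied, and the penalty weight A, the payoffs c, and the annealing schedule are illustrative assumptions. The constraint and payoffs are folded into the weights and biases, and the effective temperature of the tanh nonlinearity is lowered gradually as the analog dynamics settle.

```python
import numpy as np

def hopfield_tank(W, b, T0=1.0, T_final=0.05, steps=6000, dt=0.01, seed=0):
    """Continuous Hopfield dynamics with a slowly decreasing effective temperature
    (gain annealing), in the spirit of the 1985-86 Hopfield-Tank approach."""
    u = 0.01 * np.random.default_rng(seed).standard_normal(len(b))  # internal states
    for k in range(steps):
        T = T0 * (T_final / T0) ** (k / (steps - 1))   # geometric annealing schedule
        V = 0.5 * (1.0 + np.tanh(u / T))               # analog neuron outputs in (0, 1)
        u += dt * (-u + W @ V + b)                     # relax toward lower network energy
    return V

# Toy problem: pick exactly one item, preferring the largest payoff in c.
# Energy to minimize: E = (A/2) * (sum(V) - 1)**2 - c @ V, folded into W and b.
A = 4.0
c = np.array([0.2, 0.9, 0.4, 0.1])
n = len(c)
W = -A * (np.ones((n, n)) - np.eye(n))     # mutual inhibition enforces the constraint
b = c + A / 2.0                            # payoff plus constraint-derived bias
print(np.round(hopfield_tank(W, b), 2))    # should approach the one-hot vector at c's maximum
```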
Hopfield is one of the pioneers of the critical brain hypothesis: in 1994 he was the first to link neural networks with self-organized criticality, in reference to the Olami–Feder–Christensen model for earthquakes. [34] [35] In 1995, Hopfield and Andreas V. Herz showed that avalanches of neural activity follow a power-law distribution like that associated with earthquakes. [36] [37]
The original Hopfield network had limited memory capacity; Hopfield and Dmitry Krotov addressed this problem in 2016. [33] [38] Hopfield networks with large memory storage are now known as modern Hopfield networks. [39]
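A brief sketch of the 2016 "dense associative memory" idea follows (the function name and the 64-neuron example are illustrative, not code from the paper): replacing the quadratic interaction with a higher-order term F(x) = max(x, 0)^n in the energy lets the storage capacity grow much faster than the number of neurons, while recall still proceeds by flipping spins only when the energy decreases.

```python
import numpy as np

def dense_recall(patterns, state, n=3, sweeps=5):
    """Dense associative memory recall with energy E = -sum_mu F(xi_mu . sigma),
    F(x) = max(x, 0)**n; greedy spin flips that lower the energy."""
    F = lambda x: np.maximum(x, 0.0) ** n
    energy = lambda s: -np.sum(F(patterns @ s))
    state = state.copy()
    for _ in range(sweeps):
        for i in range(len(state)):
            flipped = state.copy()
            flipped[i] *= -1
            if energy(flipped) < energy(state):
                state = flipped
    return state

# Store many more patterns than a classical Hopfield network of this size could hold.
rng = np.random.default_rng(1)
patterns = rng.choice([-1.0, 1.0], size=(40, 64))   # 40 patterns over 64 neurons
cue = patterns[0].copy()
cue[:8] *= -1                                       # corrupt 8 of the 64 bits
print(np.array_equal(dense_recall(patterns, cue), patterns[0]))   # typically True
```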
In March 2023, Hopfield signed an open letter titled "Pause Giant AI Experiments", calling for a pause on the training of artificial intelligence (AI) systems more powerful than GPT-4. The letter, signed by over 30,000 individuals including AI researchers Yoshua Bengio and Stuart Russell, cited risks such as human obsolescence and society-wide loss of control. [40] [41]
Upon being jointly awarded the 2024 Nobel Prize in Physics, Hopfield said he was very unnerved by recent advances in AI capabilities: "as a physicist, I'm very unnerved by something which has no control". [42] In a follow-up press conference at Princeton University, Hopfield compared AI with the discovery of nuclear fission, which led to nuclear weapons and nuclear power. [2]
Hopfield received a Sloan Research Fellowship in 1962 [43] and, like his father, a Guggenheim Fellowship in 1968. [44] He was elected a member of the American Physical Society (APS) in 1969, [45] [46] a member of the National Academy of Sciences in 1973, a member of the American Academy of Arts and Sciences in 1975, and a member of the American Philosophical Society in 1988. [47] [48] [49] He was President of the APS in 2006. [50]
In 1969, Hopfield and David Gilbert Thomas were awarded the APS Oliver E. Buckley Prize in condensed matter physics "for their joint work combining theory and experiment which has advanced the understanding of the interaction of light with solids". [51]
In 1983 he was awarded a MacArthur Fellowship by the MacArthur Fellows Program. [52] In 1985, Hopfield received the Golden Plate Award of the American Academy of Achievement [53] and the APS Max Delbrück Prize in Biophysics. [9] In 1988, he received the Michelson–Morley Award from Case Western Reserve University. [54] In 1997, Hopfield received the Neural Networks Pioneer Award from the Institute of Electrical and Electronics Engineers (IEEE). [55]
He was awarded the Dirac Medal of the International Centre for Theoretical Physics in 2001 "for important contributions in an impressively broad spectrum of scientific subjects" [56] [57] including "an entirely different [collective] organizing principle in olfaction" and "a new principle in which neural function can take advantage of the temporal structure of the 'spiking' interneural communication". [57]
Hopfield received the Harold Pender Award in 2002 from the Moore School of Electrical Engineering, University of Pennsylvania, for his accomplishments in computational neuroscience and neural engineering. [58] He received the Albert Einstein World Award of Science in 2005 in the field of life sciences. [59] In 2007, he gave the Fritz London Memorial Lecture at Duke University, titled "How Do We Think So Fast? From Neurons to Brain Computation". [60] Hopfield received the IEEE Frank Rosenblatt Award in 2009 for his contributions to understanding information processing in biological systems. [61] In 2012 he was awarded the Swartz Prize by the Society for Neuroscience, [62] in 2019 the Benjamin Franklin Medal in Physics by the Franklin Institute, [63] and in 2022 he shared the Boltzmann Medal in statistical physics with Deepak Dhar. [64]
He was jointly awarded the 2024 Nobel Prize in Physics with Geoffrey E. Hinton for "foundational discoveries and inventions that enable machine learning with artificial neural networks". [65] [66]