
John Hopfield

John Joseph Hopfield is an American physicist and emeritus professor at Princeton University, most widely known for his 1982 study of associative neural networks and for the development of the Hopfield network. Before its invention, research in artificial intelligence (AI) was in a period of decline known as the AI winter; Hopfield's work revitalized large-scale interest in the field.

Biography
Early life and education

John Joseph Hopfield was born in 1933 in Chicago. He received a Bachelor of Arts with a major in physics from Swarthmore College in Pennsylvania in 1954 and a Doctor of Philosophy in physics from Cornell University in 1958. His doctoral advisor was Albert Overhauser.

Career

After his doctorate, Hopfield worked in the theory group at Bell Laboratories, and later developed a quantitative model to describe the cooperative behavior of hemoglobin in collaboration with Robert G. Shulman. Subsequently he became a faculty member at the University of California, Berkeley (physics, 1961–1964), Princeton University (physics, 1964–1980), and the California Institute of Technology (chemistry and biology, 1980–1997), before returning to Princeton. In 1976, he participated in a science short film on the structure of hemoglobin, featuring Linus Pauling. From 1981 to 1983, Richard Feynman, Carver Mead and Hopfield gave a one-year course at Caltech called "The Physics of Computation". This collaboration inspired the Computation and Neural Systems PhD program at Caltech in 1986, co-founded by Hopfield. His doctoral students include Bertrand Halperin (1965), Steven Girvin (1977), and David J. C. MacKay (1992).
Work
In his doctoral work of 1958, he wrote on the interaction of excitons in crystals, coining the term polariton for a quasiparticle that appears in solid-state physics. He wrote: "The polarization field 'particles' analogous to photons will be called 'polaritons'." From 1959 to 1963, Hopfield and David G. Thomas investigated the exciton structure of cadmium sulfide from its reflection spectra. Their experiments and theoretical models made it possible to understand the optical spectroscopy of II-VI semiconductor compounds.

Condensed matter physicist Philip W. Anderson reported that Hopfield was his "hidden collaborator" on his 1961–1970 work on the Anderson impurity model, which explained the Kondo effect. Hopfield was not included as a co-author on the papers, but Anderson acknowledged the importance of Hopfield's contribution in several of his writings. William C. Topp and Hopfield introduced the concept of norm-conserving pseudopotentials in 1973. In 1974, Hopfield introduced a mechanism for error correction in biochemical reactions, known as kinetic proofreading, to explain the accuracy of DNA replication.

Hopfield published his first paper in neuroscience in 1982, titled "Neural networks and physical systems with emergent collective computational abilities", in which he introduced what is now known as the Hopfield network, a type of artificial neural network that can serve as a content-addressable memory, made of binary neurons that can be 'on' or 'off'. The 1982 and 1984 papers are his two most cited works.

Together with David W. Tank, Hopfield developed a method in 1985–1986 for solving discrete optimization problems with continuous-time dynamics, using a Hopfield network with a continuous activation function. The optimization problem was encoded in the interaction parameters (weights) of the network, and the effective temperature of the analog system was gradually decreased, as in global optimization with simulated annealing.
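The content-addressable memory described above can be illustrated with a minimal sketch of a binary Hopfield network: patterns of +1/-1 neurons are stored with the Hebbian outer-product rule, and recall proceeds by asynchronous sign updates until a fixed point is reached. The function names and the toy pattern below are illustrative choices, not taken from the 1982 paper.

```python
import numpy as np

def train(patterns):
    """Build the weight matrix from rows of +/-1 patterns (Hebbian rule)."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n   # W_ij = (1/n) * sum over patterns of x_i * x_j
    np.fill_diagonal(W, 0)          # no self-connections
    return W

def recall(W, state, max_sweeps=100, seed=0):
    """Asynchronously flip neurons toward their local field until stable."""
    rng = np.random.default_rng(seed)
    state = state.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(state)):
            new = 1 if W[i] @ state >= 0 else -1
            if new != state[i]:
                state[i], changed = new, True
        if not changed:             # fixed point reached
            break
    return state

# Store one 8-neuron pattern, then recover it from a corrupted copy.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = train(pattern[None, :])
noisy = pattern.copy()
noisy[:2] *= -1                     # flip two bits
print(np.array_equal(recall(W, noisy), pattern))  # True: stored pattern recovered
```

Each update can only lower the network's energy, which is why the dynamics settle into the stored pattern rather than oscillating; this energy-function view is what connected the model to the statistical physics of spin systems.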
Hopfield is one of the pioneers of the critical brain hypothesis: in 1994 he was the first to link neural networks with self-organized criticality, in reference to the Olami–Feder–Christensen model for earthquakes. In 1995, Hopfield and Andreas V. Herz showed that avalanches in neural activity follow a power-law distribution like that associated with earthquakes.

The original Hopfield networks had a limited memory capacity; this problem was addressed by Hopfield and Dmitry Krotov in 2016. Hopfield networks with large memory storage are now known as modern Hopfield networks.
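The large-memory behavior of modern Hopfield networks can be sketched with the softmax-based retrieval step used in later continuous formulations of dense associative memory (a related but not identical rule to the polynomial energies of the 2016 Krotov–Hopfield paper): with a sharp enough inverse temperature beta, a corrupted query snaps to the nearest stored pattern in a single update. All names and parameter values below are illustrative.

```python
import numpy as np

def retrieve(patterns, query, beta=8.0):
    """One dense-associative-memory update: softmax-weighted mix of stored patterns."""
    scores = beta * patterns @ query          # similarity of the query to each pattern
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                   # softmax over stored patterns
    return weights @ patterns                  # dominated by the best-matching pattern

rng = np.random.default_rng(1)
patterns = rng.choice([-1.0, 1.0], size=(50, 64))   # 50 patterns in 64 dimensions
query = patterns[7].copy()
query[:10] *= -1                                    # corrupt 10 of 64 coordinates
out = retrieve(patterns, query)
print(np.array_equal(np.sign(out), patterns[7]))    # True: pattern 7 is retrieved
```

Note the contrast with the classical network above: 50 random patterns in 64 dimensions would overload the Hebbian rule, whose capacity grows only linearly in the number of neurons, while the sharply peaked softmax keeps the stored patterns well separated.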
Views on artificial intelligence
In March 2023, Hopfield signed an open letter titled "Pause Giant AI Experiments", calling for a pause on the training of artificial intelligence (AI) systems more powerful than GPT-4. The letter, signed by over 30,000 individuals including AI researchers Yoshua Bengio and Stuart Russell, cited risks such as human obsolescence and society-wide loss of control. Upon being jointly awarded the 2024 Nobel Prize in Physics, Hopfield said he found recent advances in AI capabilities worrying: "as a physicist, I'm very unnerved by something which has no control". In a follow-up press conference at Princeton University, Hopfield compared AI to the discovery of nuclear fission, which led to both nuclear weapons and nuclear power.
Awards and honors
[Image caption: Luis Walter Alvarez (left) congratulates David Gilbert Thomas (middle) and John Hopfield (right).]

Hopfield received a Sloan Research Fellowship in 1962 and, like his father, a Guggenheim Fellowship (1968). He was elected a member of the American Physical Society (APS) in 1969, of the National Academy of Sciences in 1973, of the American Academy of Arts and Sciences in 1975, and of the American Philosophical Society in 1988. He was President of the APS in 2006.

In 1969, Hopfield and David Gilbert Thomas were awarded the Oliver E. Buckley Condensed Matter Physics Prize by the APS "for their joint work combining theory and experiment which has advanced the understanding of the interaction of light with solids". In 1983 he was awarded a MacArthur Fellowship by the MacArthur Fellows Program. In 1985, Hopfield received the Golden Plate Award of the American Academy of Achievement and the Max Delbruck Prize in Biophysics from the APS. He received the Neural Networks Pioneer Award from the Institute of Electrical and Electronics Engineers (IEEE) in 1997. He was awarded the Dirac Medal of the International Centre for Theoretical Physics in 2001 "for important contributions in an impressively broad spectrum of scientific subjects", including "an entirely different [collective] organizing principle in olfaction" and "a new principle in which neural function can take advantage of the temporal structure of the 'spiking' interneural communication". He received the Albert Einstein World Award of Science in 2005 in the field of life sciences. In 2007, he gave the Fritz London Memorial Lecture at Duke University, titled "How Do We Think So Fast? From Neurons to Brain Computation". Hopfield received the IEEE Frank Rosenblatt Award in 2009 for his contributions to understanding information processing in biological systems. In 2012 he was awarded the Swartz Prize by the Society for Neuroscience.
In 2019 he was awarded the Benjamin Franklin Medal in Physics by the Franklin Institute, and in 2022 he shared the Boltzmann Medal in statistical physics with Deepak Dhar. He was jointly awarded the 2024 Nobel Prize in Physics with Geoffrey E. Hinton for "foundational discoveries and inventions that enable machine learning with artificial neural networks". In 2025 he was awarded the Queen Elizabeth Prize for Engineering jointly with Yoshua Bengio, Bill Dally, Geoffrey E. Hinton, Yann LeCun, Jen-Hsun Huang and Fei-Fei Li for the development of modern machine learning.