The 2024 Nobel Prize in Physics, announced yesterday, surprised many people.
In recognition of their groundbreaking discoveries and inventions in machine learning with artificial neural networks, this year's Nobel Prize in Physics is awarded to American physicist John J. Hopfield and British-born Canadian computer scientist Geoffrey E. Hinton, who will share the prize of 11 million Swedish kronor (about 7.45 million yuan) equally.
Last night, in an interview with the Royal Swedish Academy of Sciences, Hinton paused for a long time before saying, "I didn't expect that."
In fact, few academics expected this year's Nobel Prize in Physics to go to machine learning, let alone to two pioneers of artificial intelligence (AI). The pair had almost never appeared on prediction lists for the award, and Hinton is better known as the "father of artificial intelligence".
Commenting on this year's Nobel Prize in Physics, many scholars said that if you consider the two laureates' contributions to the world, the result is not so surprising.
Laying the foundation for powerful machine learning with the tools of physics
Today, deep learning models have become AI tools that even middle school students use with ease, and these models have evolved from the work of the two laureates.
As the Nobel citation states, the two laureates used tools from physics to develop methods that underpin today's powerful machine learning. Hopfield created a form of associative memory that can store and reconstruct images and other types of patterns in data. Hinton invented a method that can autonomously discover properties in data and perform tasks such as identifying specific elements in an image.
The 91-year-old Hopfield first proposed the "Hopfield network" in the 1980s: a single-layer, fully connected feedback network that mimics the connections between biological neurons and introduces an energy function. When the energy function reaches a minimum, the whole system settles into a steady state, and in the network this steady state corresponds to a stored memory. The model has proven to have a wide range of applications, spanning machine learning, associative memory, pattern recognition, and optimization.
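To make the idea concrete, here is a minimal Python sketch of a Hopfield-style associative memory. The patterns, the Hebbian storage rule, and the update schedule are illustrative choices rather than details from the article: the network stores two patterns and then recovers one of them from a corrupted copy by repeatedly lowering the energy E = -1/2 Σ w_ij s_i s_j.

```python
import numpy as np

# Two binary (+1/-1) patterns to store (illustrative toy data)
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])

n = patterns.shape[1]
W = np.zeros((n, n))
for p in patterns:                      # Hebbian storage rule
    W += np.outer(p, p)
np.fill_diagonal(W, 0)                  # no self-connections

def energy(s):
    # Hopfield energy: E = -1/2 * sum_ij w_ij * s_i * s_j
    return -0.5 * s @ W @ s

def recall(s, sweeps=20):
    s = s.copy()
    for _ in range(sweeps):             # asynchronous updates lower the energy
        for i in np.random.permutation(n):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

noisy = patterns[0].copy()
noisy[:2] *= -1                         # corrupt two bits
print("energy before:", energy(noisy))
restored = recall(noisy)
print("energy after:", energy(restored))
print("recovered first pattern:", np.array_equal(restored, patterns[0]))
```

Running the sketch shows the corrupted input relaxing into the stored pattern as the energy decreases, which is the "steady state as memory" behavior described above.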
Yan Junchi, a professor in the Department of Computer Science and Engineering at Shanghai Jiao Tong University, said that Hopfield's work not only expanded the boundaries of statistical physics but also created a new language for thinking about computation in the brain and deepened our understanding of the dynamics of neural networks. For his outstanding contributions in this field, Hopfield received the 2022 Boltzmann Medal.
The other laureate, Hinton, is a leading figure in machine learning. Hinton used the tools of statistical physics to extend the Hopfield network stochastically, developing the "Boltzmann machine". He trained the Boltzmann machine by feeding it examples that are very likely to arise when the machine is run, and the trained machine can then be used for tasks such as classifying images. Hinton continued to build on this foundation, helping to set off the explosive development of machine learning. In 2018, he received the Turing Award, the highest honor in computing.
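As a rough illustration of the training-by-example idea, the sketch below uses a restricted Boltzmann machine (a simplified variant) trained with one-step contrastive divergence, Hinton's later practical learning rule. The toy data, network size, and hyperparameters are assumptions made purely for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy restricted Boltzmann machine: binary visible and hidden units,
# trained with one-step contrastive divergence on example patterns.
n_visible, n_hidden = 6, 3
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b_v = np.zeros(n_visible)          # visible biases
b_h = np.zeros(n_hidden)          # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Example patterns the machine should learn to reproduce
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]], dtype=float)

lr = 0.1
for epoch in range(2000):
    v0 = data
    # positive phase: hidden activations driven by the data
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # negative phase: one Gibbs step gives the model's own reconstruction
    p_v1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)
    # contrastive-divergence updates push the model toward the data
    W += lr * (v0.T @ p_h0 - v1.T @ p_h1) / len(data)
    b_v += lr * (v0 - v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)

# After training, a partial input is pulled toward the nearest learned pattern
test = np.array([[1, 1, 0, 0, 0, 0]], dtype=float)
p_h = sigmoid(test @ W + b_h)
print(np.round(sigmoid(p_h @ W.T + b_v), 2))
```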
Xu Xinjian, a professor at the School of Science at Shanghai University, said that Hinton's work greatly advanced artificial neural networks and broke through a bottleneck in machine learning. In addition, quantum artificial intelligence and the design of new quantum algorithms and quantum computers have also benefited from the laureates' work.
Persevering in an "unpopular" direction and surviving the academic winter
In fact, artificial neural networks are not a new research direction: scientists have been studying them since the 1960s, and the field has gone through several ups and downs. For both Hopfield and Hinton, the research went from hot to cold and back to hot again. Hinton in particular is best known in the industry for spending 30 years on the "cold bench".
"As early as the 80s of the last century, artificial neural networks were a popular research direction, but due to problems such as computer computing power at that time, this field was once considered difficult to make breakthroughs, and soon it was no longer concerned. However, these two scholars have been able to continue to work in the direction of neural networks. Deep learning of neural networks can be said to have exploded on the basis of their research. Professor Feng Jianfeng, dean of the Institute of Brain-like Science and Intelligence at Fudan University, studied and dissertation during his doctoral studies in the 80s of the last century, which was related to the Hopfield network, and he also had research intersections with Xinton.
In 1986, Hinton and two collaborators published a paper on the "backpropagation algorithm", which remains the core method for training neural networks. Even so, it did not change the fortunes of neural network research until 2012, when Hinton and two students proposed the AlexNet model, which dramatically improved the accuracy of visual recognition, stunned the global scientific community, and set off the deep learning boom.
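The core of backpropagation can be sketched in a few lines: run the network forward, then use the chain rule to carry the error gradient back through each layer and update the weights. The tiny two-layer network below, trained on XOR, is purely illustrative and not taken from the 1986 paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny two-layer network trained on XOR with plain backpropagation.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.standard_normal((2, 4))
b1 = np.zeros(4)
W2 = rng.standard_normal((4, 1))
b2 = np.zeros(1)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

lr = 0.5
for step in range(10000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: the chain rule carries the error from the output
    # back through each layer, giving a gradient for every weight
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))   # should approach [[0], [1], [1], [0]]
```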
In Feng Jianfeng's view, Hinton, who was born into an academic family, is a typical scholar who pursues learning for its own sake. Precisely because of this, he could not obtain research funding at the University of Edinburgh during the "winter" of neural network research. Even so, he did not give up, instead moving between the United States and Canada to continue working in the same direction.
"It's hard to imagine that Xinton will continue to work on neural networks, which is 'unseen by the academic community', until he has achieved a huge breakthrough in the field of deep learning." Zhang Ya, a distinguished researcher at the School of Artificial Intelligence of Shanghai Jiao Tong University, told reporters that many people, including Xinton's mentor, thought that Xinton should not waste time in this direction, and even persuaded him to change directions. But it is also his hard work in this unpopular field that led him to win the Turing Award in 2018 and the Nobel Prize in Physics this year, and brought a revolutionary breakthrough to the development of artificial intelligence.
The Nobel Prize increasingly recognizes interdisciplinary fields, highlighting the character of cutting-edge disciplines
As this year's Nobel Prize in Physics was announced, many scholars also began to discuss a broader question: in recent years, the prize has shown a growing preference for interdisciplinary research.
"This is tantamount to highlighting the intersection and interoperability of cutting-edge disciplines." Yan Junchi cited in an interview with reporters that the 2020 Nobel Prize in Physics was awarded to mathematician Penrose, and the 2021 Nobel Prize in Physics was awarded to meteorologists Shuro Manabe and Klaus · Hasselmann who study complex systems. Looking at this year, Hopfield and Xinton are also "crossover masters". A Ph.D. in physics across multiple disciplines, Hopfield has developed neural networks at the intersection of physics, chemistry, and biology. Hindon studied physics and physiology at Cambridge University and later earned a bachelor's degree in experimental psychology, where he advanced machine learning at the intersection of physics and mathematics with computer and neuropsychology.
Shi Yu, a professor in the Department of Physics at Fudan University, said that important research and developments in machine learning are inextricably linked to physics. On the one hand, physics has long since moved beyond its traditional domain and now covers a far wider range of research. On the other hand, with the widespread use of AI tools, more and more researchers are using machine learning to keep expanding the boundaries of physics, chemistry, biology, and other fields.

Reporters: Jiang Peng, Chu Shuting
Wen Wei Po