According to foreign media reports this week, the United States has overtaken China in supercomputing with its new Summit supercomputer. But with more powerful machines already under development, this lead may be only temporary. Supercomputers will accelerate progress in fields such as artificial intelligence, new materials and new energy, and Summit is already being used for medical and health research.
The following is the full text of the article:
The United States has just regained its lead in the world’s fastest supercomputer competition.
For the past five years, China had the fastest computer in the world, a symbolic achievement for a country trying to prove it is a technological power. Now, with Summit, a supercomputer developed at the Oak Ridge National Laboratory in Tennessee, the United States has retaken the lead.
According to the announcement this Friday, Summit's speed is staggering: the machine can perform calculations at 200 petaflops, or 200 quadrillion floating-point operations per second. Put simply, if one person performed one calculation per second, it would take more than 6.3 billion years to do what this computer does in a single second.
If that is hard to picture, here is another comparison. If a stadium holding 100,000 people were full and everyone in it had a modern laptop, it would take 20 such stadiums to match Summit's computing power.
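The two comparisons above can be sanity-checked with a few lines of arithmetic. This sketch assumes Summit's quoted 200 petaflops, and the per-laptop figure of roughly 100 gigaflops is an illustrative assumption, not a number from the article:

```python
# Sanity-check the article's two comparisons against Summit's quoted speed.
SUMMIT_FLOPS = 200e15  # 200 petaflops = 2e17 operations per second

# 1) One person doing one calculation per second: how many years to match
#    what Summit computes in a single second?
SECONDS_PER_YEAR = 365.25 * 24 * 3600
years = SUMMIT_FLOPS / SECONDS_PER_YEAR
print(f"{years / 1e9:.1f} billion years")  # about 6.3 billion years

# 2) Stadium comparison: assuming ~100 gigaflops per laptop (an assumed,
#    illustrative figure), how many stadiums of 100,000 laptops match Summit?
LAPTOP_FLOPS = 100e9
PEOPLE_PER_STADIUM = 100_000
stadiums = SUMMIT_FLOPS / (LAPTOP_FLOPS * PEOPLE_PER_STADIUM)
print(f"{stadiums:.0f} stadiums")  # 20 stadiums
```

Both figures in the article come out consistent under these assumptions.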
Overall, China still has the largest number of supercomputers in the world, and with China, Japan and Europe all developing faster machines, the American lead may be short-lived.
Summit was built with $200 million in U.S. government funding. Supercomputers of this class can accelerate frontier computing technologies such as artificial intelligence and big-data processing.
Such capabilities help tackle serious challenges in science, industry and national security, and they sit at the core of the technology competition between China and the United States.
Tasks supercomputers can perform include simulating nuclear tests, forecasting weather, locating oil deposits and breaking encryption. Scientists believe that further advances and new discoveries in medicine, new materials and energy technology will depend on tools like Summit.
John Kelly, the head of IBM Research, which helped develop Summit, said: "These are big data and artificial intelligence machines, and that is where the future lies."
For more than 20 years, a team of computer scientists has maintained the list of the world's top 500 supercomputers. The team is led by Jack Dongarra, a computer scientist at the University of Tennessee. The latest list will not be released until later this month, but Dongarra said he is convinced that Summit is currently the fastest supercomputer.
At 200 petaflops, Summit is faster than the top supercomputer on last November's list, the Sunway TaihuLight in Wuxi, China, and roughly twice as fast.
Summit consists of rows of black, refrigerator-sized cabinets, weighing a total of 340 tons and occupying 9,250 square feet (about 860 square meters) of floor space. The machine carries 9,216 IBM CPUs and 27,648 Nvidia GPUs, connected by 185 miles (about 298 kilometers) of fiber-optic cable.
Cooling Summit requires 4,000 gallons (about 15,000 liters) of water per minute, and the machine consumes enough electricity to power 8,100 U.S. households.
Even as supercomputers advance rapidly, internet giants, including Amazon, Facebook and Google as well as China's Alibaba, Baidu and Tencent, are leading the development of new technologies such as cloud computing and facial recognition.
A supercomputer's performance can serve as a measure of a country's technical strength, though it is a narrow one: speed is only one element of computing performance. Another crucial element is the software that brings the machine to life.
Scientists at U.S. government institutions such as the Oak Ridge National Laboratory are conducting exploratory research, from new materials that make roads more durable, to energy storage for electric cars and power grids, to new energy sources such as nuclear fusion. Supercomputers can help in all of these areas.
Climate modeling, for example, may require running code on a supercomputer for days, processing huge amounts of scientific data such as humidity and wind patterns while accounting for all real-world physics. Ian Buck, a computer scientist and the general manager of Nvidia's data-center business, believes such workloads cannot run efficiently on the cloud computing services offered by internet companies.
Rick Stevens, deputy director of the Argonne National Laboratory in Illinois, said: "The industry is great. We have always worked with them, but Google is never going to work on new materials or design a safe nuclear reactor."
Thomas Zacharia, director of the Oak Ridge National Laboratory, pointed to a large-scale health research project as an example of the future of supercomputing. Summit has begun receiving and processing data from the "Million Veterans Program," which recruits volunteers who give researchers access to all their health records, provide blood samples for genetic analysis, and answer survey questions about their lifestyles and habits. So far, 675,000 veterans have joined the project, and the goal is to reach one million by 2021.
Zacharia said the eventual findings "can help us find new ways to treat veterans and contribute to the whole field of precision medicine."
Dr. Michael Gaziano, the principal investigator of the "Million Veterans Program" and a professor at Harvard Medical School, said the project could be a modern version of the "Framingham Heart Study," which began in 1948 and tracked roughly 5,000 residents of a small town in Massachusetts.
Through decades of tracking, the Framingham study found that heart disease is caused not by a single factor but by many, including cholesterol, diet, exercise and smoking.
Today, given the rapid growth of digital health data and supercomputing, Dr. Gaziano believes population science may be entering a golden age. "We can create new fields from these large, messy data sets and rethink our understanding of disease. This is an exciting time."
Summit's lead may be only temporary. In the United States and abroad, machines five times as fast, reaching 1,000 petaflops (one exaflop), are already under development. Paul Dabbar, the U.S. Department of Energy's under secretary for science, said that over the two fiscal years ending in September 2019, the department's budget for advanced computing projects will increase by 39 percent.
He noted: "We are doing this to drive innovation in areas such as supercomputing."