The world's fastest supercomputer has broken an artificial intelligence record

Date: 2019-02-01 18:15:13


On the US West Coast, the world's most valuable companies are trying to make artificial intelligence smarter. Google and Facebook boast of experiments using billions of photos and thousands of high-performance processors. But at the end of last year, a project in eastern Tennessee quietly surpassed the scale of any corporate artificial intelligence lab. And it was run by the US government.


US government supercomputer breaks records

The record-setting project used Summit, the world's most powerful supercomputer, located at Oak Ridge National Laboratory. The machine took the crown in June last year, returning the title to the US after five years in which the list was led by China. As part of a climate research project, the giant computer ran a machine learning experiment that proceeded faster than any before it.

Summit, which covers an area equivalent to two tennis courts, brought more than 27,000 high-end graphics processors to the project. It used their power to train deep learning algorithms, the same technology that underlies advanced artificial intelligence. During training, the algorithms ran at a billion billion operations per second, a rate known in supercomputing circles as an exaflop.
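
A rough back-of-the-envelope check of that figure, assuming for illustration a round one exaflop spread evenly over 27,000 graphics processors (both simplifications of the numbers above):

```python
# Implied average throughput per GPU, under the simplifying assumptions
# of exactly one exaflop (1e18 ops/s) and exactly 27,000 GPUs.
total_ops_per_second = 1e18   # "a billion billion operations per second"
gpu_count = 27_000            # "more than 27,000 high-end graphics processors"

per_gpu = total_ops_per_second / gpu_count
print(f"~{per_gpu / 1e12:.0f} teraflops per GPU on average")  # ~37 teraflops
```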

"Earlier, deep learning has never reached this level of performance," says Prabhat, head of the research group at the National scientific computing center of energy research at the National laboratory behalf of the Lawrence Berkeley. His group collaborated with researchers at the headquarters of the "Summit", National laboratory of oak ridge.

As you might guess, the AI training on the world's most powerful computer was aimed at one of the world's biggest problems: climate change. Technology companies train algorithms to recognize faces or road signs; the government scientists trained theirs to recognize weather patterns such as cyclones in climate simulations that compress a century of three-hour forecasts of the Earth's atmosphere. (It is unclear, however, how much energy the project consumed and how much carbon was released into the air in the process.)
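
To make the task concrete, the sketch below shows a toy TensorFlow classifier that labels gridded atmospheric snapshots as containing a cyclone or not. The grid size, input channels, and architecture are illustrative assumptions, not the project's actual model.

```python
import tensorflow as tf

# Toy cyclone detector: each example is a gridded snapshot of the simulated
# atmosphere, e.g. 128x128 cells with three stacked variables (assumed here:
# pressure, wind speed, humidity), labeled 1 if it contains a cyclone.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(128, 128, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # cyclone / no cyclone
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```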

The Summit experiment matters for the future of both artificial intelligence and climatology. The project demonstrates the scientific potential of adapting deep learning to supercomputers that traditionally simulate physical and chemical processes, such as nuclear explosions, black holes, or new materials. It also shows that machine learning stands to benefit from more computing power, if you can find it, which could provide breakthroughs in the future.

"We didn't know that this can be done in this scale before you do it," says Rajat Monga, engineering Director for Google. He and other "goglova" helped the project, adapting the software machine learning TensorFlow open source company for giant scale Summit.

Most of the work on scaling up deep learning has been done in the data centers of internet companies, where servers work together on problems by splitting them up, because the servers are relatively loosely coupled rather than wired into one giant computer. Supercomputers like Summit have a different architecture, with specialized high-speed interconnects linking their thousands of processors into a single system that can operate as one unit. Until recently, there had been relatively little work on adapting machine learning to this kind of hardware.
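
One widely used pattern for spreading such training over many tightly connected GPUs is data parallelism, sketched below in the Horovod style with TensorFlow. This is a generic illustration of the approach, under the assumption of one worker process per GPU, not the Summit project's actual code.

```python
import tensorflow as tf
import horovod.tensorflow.keras as hvd

hvd.init()  # one worker process per GPU across the whole machine

# Pin each worker to its own GPU (assumes all GPUs are visible to every process).
gpus = tf.config.list_physical_devices("GPU")
if gpus:
    tf.config.set_visible_devices(gpus[hvd.local_rank()], "GPU")

# A placeholder model; the real workload would be far larger.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(64,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Scale the learning rate with the number of workers and wrap the optimizer
# so gradients are averaged over the high-speed interconnect at every step.
opt = tf.keras.optimizers.SGD(0.01 * hvd.size())
opt = hvd.DistributedOptimizer(opt)

model.compile(optimizer=opt, loss="binary_crossentropy")

# Keep all workers in sync at the start of training; each worker would then
# call model.fit(...) on its own shard of the data with this callback.
callbacks = [hvd.callbacks.BroadcastGlobalVariablesCallback(0)]
```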

Monga says the work of adapting TensorFlow to Summit's scale will also feed into Google's efforts to expand its internal artificial intelligence systems. Nvidia engineers also took part in the project, making sure that the machine's tens of thousands of Nvidia graphics processors ran smoothly.

Finding ways to throw more computing power at deep learning algorithms has played an important role in the technology's recent rise. The same technology that Siri uses for voice recognition and Waymo cars use to read road signs became useful in 2012, when scientists adapted it to run on Nvidia graphics processors.

In an analysis published last May, researchers at OpenAI, a San Francisco research institute co-founded by Elon Musk, estimated that the amount of computing power used in the largest public machine learning experiments has doubled roughly every 3.43 months since 2012, which works out to about an 11-fold increase per year. That progression has helped bots from Alphabet win at demanding board and video games, and has contributed to a significant jump in the accuracy of Google's translator.
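
A quick sanity check of those numbers: doubling every 3.43 months compounds to roughly 11-fold growth over a twelve-month year.

```python
# Doubling every 3.43 months, compounded over 12 months.
doubling_period_months = 3.43
growth_per_year = 2 ** (12 / doubling_period_months)
print(f"~{growth_per_year:.1f}x per year")  # ~11.3x
```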

Google and other companies are now developing new kinds of chips tailored to AI to keep this trend going. Google says that "pods" packing thousands of its AI chips, known as tensor processing units, or TPUs, close together can provide 100 petaflops of computing power, one tenth of the speed reached by Summit.

The Summit project's contribution to climate science shows how giant-scale AI can improve our understanding of future weather conditions. When researchers generate a hundred years of weather predictions, reading the resulting forecast becomes a challenge. "Imagine you have a YouTube movie that is 100 years long. There is no way to find all the cats and dogs in it by hand," says Prabhat. The software usually used to automate that process is imperfect. The Summit results showed that machine learning can do it much better, which should help in predicting storm impacts such as flooding.

According to Michael Pritchard, a professor at the University of California, Irvine, running deep learning on supercomputers is a relatively new idea that has arrived at a convenient time for climate researchers. The slowdown in the improvement of conventional processors has led engineers to equip supercomputers with growing numbers of graphics chips, where performance has kept rising more steadily. "The point has come when you can no longer grow computing power in the usual way," says Pritchard.

This shift has brought traditional simulation to a standstill and forced it to adapt. It also opens the door to harnessing the power of deep learning, which is naturally suited to graphics chips. Maybe we will get a clearer picture of our climate's future.

And how would you use such a supercomputer? Tell us.
