Neural networks are powerful, but power-hungry. Engineers from the Massachusetts Institute of Technology (MIT) have managed to develop a new chip that reduces a neural network's power consumption by 95%, which could in theory allow neural networks to run even on battery-powered mobile devices. Smartphones today are becoming smarter and smarter, offering more and more services powered by artificial intelligence, such as virtual assistants and real-time translation. But usually the neural networks that process the data for these services run in the cloud, with smartphones merely shuttling data back and forth.
This is not ideal: it requires a high-bandwidth connection and means sensitive data is transmitted and stored beyond the user's control. And the enormous amounts of energy needed to run neural networks on GPUs cannot be supplied by a device running on a small battery.
The MIT engineers' chip cuts energy consumption by 95% by dramatically reducing the need to move data back and forth between memory and processors.
Neural networks are composed of thousands of interconnected artificial neurons arranged in layers. Each neuron receives input from several neurons in the layer below, and if the combined input passes a certain threshold, it passes its result on to multiple neurons in the layer above. The strength of the connections between neurons is determined by weights, which are set during training.
This means that for each neuron, a chip has to fetch the input value for a specific connection and that connection's weight from memory, multiply them, store the result, and then repeat the process for every input. That is a lot of data moving back and forth, and a lot of energy spent.
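The per-neuron computation described above can be sketched in a few lines of plain Python (the function name and values are illustrative, not from the article): each input is fetched along with its weight, multiplied, accumulated, and the total is compared against a threshold. On real hardware, every iteration of the loop implies a memory fetch, which is exactly the traffic the MIT chip is designed to avoid.

```python
def neuron_output(inputs, weights, threshold=0.0):
    """Compute one artificial neuron's output.

    Each loop iteration stands for one fetch-multiply-accumulate
    step: on a conventional chip, the input value and the weight
    must both be read from memory before they can be multiplied.
    """
    total = 0.0
    for x, w in zip(inputs, weights):
        total += x * w  # one fetch-multiply-accumulate step
    # Fire (output 1) only if the combined input passes the threshold.
    return 1 if total > threshold else 0

# Example: weighted sum 0.5*1.0 + 0.8*0.5 = 0.9 exceeds the default threshold
print(neuron_output([0.5, 0.8], [1.0, 0.5]))  # prints 1
```

A full network repeats this for thousands of neurons across many layers, which is why the back-and-forth memory traffic dominates the energy budget.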
MIT's new chip fixes this by computing all the inputs in parallel inside memory itself, using analog circuitry. This significantly reduces the amount of data that has to be moved, leading to significant energy savings.
This approach requires that the connection weights be binary rather than a range of values, but previous theoretical work has shown that this has little impact on accuracy, and the researchers found that the chip's results diverged by only 2-3% from those of a conventional neural network running on a standard computer.
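A minimal sketch of the binary-weight idea (plain Python, all names hypothetical; this illustrates the general technique, not the chip's actual circuitry): when each weight is restricted to +1 or -1, the multiply in the neuron's weighted sum degenerates into simply adding or subtracting the input, an operation that is far cheaper to realize in hardware.

```python
def binarize(weights):
    """Collapse real-valued weights to their sign: +1 or -1."""
    return [1 if w >= 0 else -1 for w in weights]

def binary_dot(inputs, bin_weights):
    """Weighted sum with binary weights: no multiplications,
    each input is just added or subtracted."""
    return sum(x if w == 1 else -x for x, w in zip(inputs, bin_weights))

real_weights = [0.7, -0.3, 0.1]
inputs = [2.0, 1.0, 3.0]

bw = binarize(real_weights)    # [1, -1, 1]
print(binary_dot(inputs, bw))  # 2.0 - 1.0 + 3.0 = 4.0
```

Restricting weights this way loses some precision, which is consistent with the small 2-3% divergence the researchers observed relative to a full-precision network.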
This is not the first time scientists have created chips that process data in memory to reduce a neural network's power consumption, but it is the first time the approach has been used to run powerful convolutional neural networks, known for their image-processing abilities.
"The results show impressive specifications for an energy-efficient implementation of convolution operations within a memory array," says Dario Gil, vice president for artificial intelligence at IBM.
"It definitely opens the possibility of using more complex convolutional neural networks to classify images and video in the Internet of Things in the future."
And it is not just R&D groups that are interested. The desire to put AI on devices like smartphones, household appliances and all sorts of IoT devices is pushing many in Silicon Valley toward low-power chips.
Apple has integrated its Neural Engine into the iPhone X to power, for example, its facial recognition technology, and Amazon is rumored to be developing its own AI chips for the next generation of its Echo digital assistants.
Major chipmakers are also leaning increasingly on machine learning, which is pushing them to make their devices ever more efficient. Earlier this year, ARM introduced two new chips: the Arm Machine Learning processor, aimed at general AI tasks from translation to face recognition, and the Arm Object Detection processor, which detects, for example, faces in photographs.
Qualcomm's latest mobile chip, the Snapdragon 845, has a GPU and is heavily focused on AI. The company has also introduced the Snapdragon 820E, which is intended for drones, robots and industrial devices.
Looking further ahead, IBM and Intel are developing neuromorphic chips, whose architecture is inspired by the human brain and its incredible energy efficiency. This could theoretically allow TrueNorth (IBM) and Loihi (Intel) to run powerful machine learning using only a small fraction of the energy of conventional chips, but these projects are still purely experimental.
Making chips that bring neural networks to life while conserving battery power will be very difficult. But at the current pace of innovation, "very difficult" looks quite feasible.