HAL 9000 is one of the most famous artificial intelligences in cinema. This superintelligent computer malfunctions on the way to Jupiter in Stanley Kubrick's iconic film "2001: A Space Odyssey", which is currently celebrating the 50th anniversary of its release. HAL can speak, understand human facial expressions, read lips and play chess. His superior computing capabilities are complemented by uniquely human traits: he can interpret emotional behavior, reason and appreciate art.
By giving HAL emotions, writer Arthur C. Clarke and filmmaker Stanley Kubrick made him one of the most human-like depictions of intelligent technology. In one of the most beautiful scenes in science-fiction cinema, HAL says he is "afraid" when mission commander David Bowman begins to disconnect his memory modules after a series of deadly events.
HAL is programmed to assist the crew of the ship Discovery. He runs the vessel with the support of his powerful artificial intelligence. But it soon becomes clear that he is also emotional: he can feel fear and sympathy, however faintly. This emotional artificial intelligence is fiction, and in our reality it remains fiction. Any depth of emotion or feeling you find in modern technology is entirely false.
In the film, when Bowman begins manually overriding HAL's functions, HAL asks him to stop, and as we witness the astonishing destruction of HAL's "mental" faculties, the AI tries to calm itself by singing "Daisy Bell", probably the first song ever sung by a computer.
Indeed, the audience begins to feel that Bowman is killing HAL. Disconnecting him feels like revenge, especially after what we have learned from the preceding events of the film. But while HAL can make emotional judgments, real-world AI will certainly remain limited to reasoning and decision-making. Moreover, despite the opinions of futurists, we will never be able to program emotions as HAL's science-fiction creators did, because we do not understand them. Psychologists and neuroscientists are certainly trying to work out how emotions interact with cognition, but so far without success.
In one study of Chinese-English bilinguals, researchers examined how the emotional meaning of words can alter unconscious mental processes. When participants read positive or neutral words like "holiday" or "tree", they unconsciously retrieved the Chinese word forms. But when the words had negative meanings, like "murder" or "rape", their brains blocked access to the native language, without their awareness.
Reasoning, on the other hand, we do understand. We can describe how rational decisions are reached, write down the rules, and turn those rules into processes and code. But emotions remain a mysterious evolutionary inheritance. Their source cannot be traced because it is so diffuse; emotion is not simply an attribute of the mind that can be implemented deliberately. To program something, you need to know not only how it works but why. Reasoning has goals and purposes; emotions do not.
In 2015, a study was conducted with Mandarin-speaking students at Bangor University. They were asked to play a game with the chance to win money. In each round they had to take or leave a bet shown on the screen: for example, a 50% chance to win 20 points and a 50% chance to lose 100.
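To see why leaving such a bet can be the rational choice, consider its expected value. A minimal sketch (the numbers are just the example above, not data from the study):

```python
def expected_value(p_win, win, lose):
    """Expected payoff of a simple two-outcome gamble:
    win `win` points with probability p_win, otherwise lose `lose` points."""
    return p_win * win + (1 - p_win) * (-lose)

# The example bet: 50% chance to win 20 points, 50% chance to lose 100.
ev = expected_value(0.5, 20, 100)
print(ev)  # -40.0: on average, taking this bet costs 40 points per round
```

A purely rational player would decline any bet with a negative expected value like this one, regardless of the language the feedback is given in.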
The scientists hypothesized that their native language would engage the participants' emotions, so that they would not behave as they did when communicating in their second language, English. And so it proved: when feedback was given in their native Chinese, subjects were 10% more likely to bet in the next round, regardless of risk. This shows that emotions influence reasoning.
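As a rough illustration (not a reproduction of the study), a short Monte Carlo sketch shows why a 10-point rise in bet acceptance matters when the offered bets have a negative expected value; the acceptance rates, round count and bet values here are all hypothetical:

```python
import random

def simulate(accept_prob, rounds=10_000, seed=0):
    """Total points after `rounds` offers of the example bet
    (50% win 20, 50% lose 100), accepting each with probability accept_prob."""
    rng = random.Random(seed)
    total = 0
    for _ in range(rounds):
        if rng.random() < accept_prob:
            total += 20 if rng.random() < 0.5 else -100
    return total

# Hypothetical acceptance rates 10 percentage points apart:
emotional = simulate(0.60)  # e.g. feedback in the native language
detached = simulate(0.50)   # e.g. feedback in the second language
print(emotional < detached)  # accepting more negative-EV bets loses more points
```

Since each accepted bet loses 40 points on average, the more "emotional" acceptance rate reliably ends up with the worse total.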
Returning to AI: because emotions cannot be fully implemented in a program, however sophisticated it may be, a computer's reasoning will never bend under the pressure of its emotions.
One possible interpretation of HAL's strange "emotional" behavior is that he was programmed to simulate emotions in extreme situations where he had to manipulate people: relying on common sense, but appealing to their emotional selves when human reason fails. This is the only way a convincing simulation of emotion could arise in such circumstances.
In my opinion, we will never create a machine that can truly feel, hope, fear or rejoice. Every attempt will be a simulacrum, because a machine will never be human, and emotions are by default a human trait.