Hawking: underestimating artificial intelligence could be mankind's greatest mistake

    In a recent article, British physicist Stephen Hawking said that underestimating the threat posed by artificial intelligence could be the biggest mistake in the history of mankind.

    The co-authors of the piece are Stuart Russell, a professor of computer science at the University of California, Berkeley, and Max Tegmark and Frank Wilczek, professors of physics at the Massachusetts Institute of Technology. The article points to several recent achievements in artificial intelligence, including self-driving cars, the Siri voice assistant, and IBM's Watson, the supercomputer that defeated human champions on the TV quiz show Jeopardy!.

    As Hawking told The Independent:
    All these achievements pale against what awaits us in the coming decades. The successful creation of artificial intelligence would be the biggest event in human history. Unfortunately, it might also be the last, unless we learn how to avoid the risks.


    The scientists warn that machines with superhuman intelligence may one day begin improving themselves, and nothing will be able to stop that process. This, in turn, would trigger the so-called technological singularity, a period of extremely rapid technological development.
    The article notes that such technology could surpass humans: outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand. Whereas the short-term impact of artificial intelligence depends on who controls it, the long-term impact depends on whether it can be controlled at all.
    It is difficult to say what consequences the creation of artificial intelligence may have for humanity. Hawking believes that little serious research has been devoted to these questions outside of a few non-profit organizations, such as the Cambridge Centre for the Study of Existential Risk, the Future of Humanity Institute, the Machine Intelligence Research Institute, and the Future of Life Institute. According to him, each of us should ask what we can do now to avoid the worst-case scenario for the future.
