Mitigating the risks of artificial superintelligence

June 18, 2011

Ben Goertzel:

What do you think are the biggest misconceptions regarding existential risk, both among individuals in the futurist community broadly conceived and among the general public?

Michael Anissimov:

Underestimating the significance of superintelligence. People labor under the delusion that humanity sits at some theoretical optimum plateau of intelligence (due to brainwashing from Judeo-Christian theological ideas, which also permeate so-called "secular humanism"), which is the opposite of the truth. We're actually among the stupidest possible species smart enough to launch a civilization.