The Singularity Q&A

Q: If superintelligence is possible, shouldn't superintelligent extra-terrestrials have reached us by now?

A: Who's to say they haven't?  Science fiction has already catalogued a number of reasons why alien civilizations might eschew contact with a species like us.  This, by itself, is probably not the reason why we seem to be alone in the universe, but looking at the puzzle of our solitude more carefully will demonstrate why the answer almost certainly has little to do with the difficulty of creating superintelligence.

The Fermi Paradox is the name given to the idea that we appear to be alone in a universe where the odds seem overwhelmingly in favor of intelligent life having evolved and traveled the stars before us.  The reason it is a paradox -- why our isolation is so surprising -- is that the universe is very old and very rich.  With trillions of stars in the cluster of galaxies comprising our "local group", the reasoning goes, at least one should have supported a world that gave rise to a space-faring civilization, and such a civilization would have left obvious marks in our neighborhood; traveling at even one tenth the speed of light would allow it to crisscross galaxies in a small fraction of the time those galaxies have existed.  It would take only one civilization with the drive and the means to colonize the stars exponentially for every corner of the universe to become occupied with alarming speed.
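The "fraction of the time these galaxies have existed" claim is easy to check with rough numbers.  The sketch below uses commonly cited approximate figures (Milky Way diameter and age) that are not from the article itself:

```python
# Back-of-the-envelope check: how long does it take to cross a galaxy at 0.1c?
# All figures are rough, commonly cited approximations, not from the article.
GALAXY_DIAMETER_LY = 100_000        # Milky Way diameter in light-years (approx.)
GALAXY_AGE_YEARS = 13_000_000_000   # rough age of the galaxy, in years
SPEED_FRACTION_OF_C = 0.1           # travel speed assumed in the text

# At 0.1c, each light-year of distance takes 10 years to traverse.
crossing_time_years = GALAXY_DIAMETER_LY / SPEED_FRACTION_OF_C

print(f"Time to cross the galaxy at 0.1c: {crossing_time_years:,.0f} years")
print(f"As a fraction of the galaxy's age: {crossing_time_years / GALAXY_AGE_YEARS:.6f}")
```

Even ignoring stops to colonize along the way, a million-year crossing is well under a ten-thousandth of the galaxy's age, which is why a single expansionist civilization should have had ample time to reach us.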

But it doesn't seem to have happened.  There are perhaps dozens of reasons why this might be the case, but the best solutions are probably those that take many reasons together to paint a picture of cumulative improbability.  In other words, a civilization going "exponential" on the universe might not be impossible, merely very improbable, because of the huge number of prerequisite steps.  To name a few:

A star system must form with the right mix of light and heavy elements, with planets and/or moons at the right distance from a stable star, in a region of a galaxy not subject to frequent cataclysmic bursts of radiation.  Life must appear on one of these worlds early enough to have a chance at evolving complex life forms before the star becomes unstable, and the world must not experience planet-melting collisions with other objects during this process.  Evolution must hit upon an extended combination of adaptations that result in general intelligence, and this general intelligence must go on to acquire the technology to colonize the universe without destroying itself in the process, perhaps by designing and/or evolving into superintelligence.  This intelligence must then consider exponential colonization to be a worthwhile activity that should be done in ways that could be detected by civilizations like ours.  Finally, this program must be executed without successful interference from any other intelligences in the universe, early enough and/or long-lasting enough to be established in the vicinity of Earth by the time we became able to look around and see if we are alone.

Taken individually, none of the steps on the way to this outcome appear unlikely in a universe as vast as ours.  But taken together, the very long string of necessary events resembles an extremely large lottery number.  It is not impossible to hit upon this particular "winning" sequence, but it may be very improbable.  So perhaps we are alone in the universe and perhaps we are not.  But the particular step between "general intelligence" and superintelligence is only a tiny -- and perhaps unnecessary -- piece of the solution, so the fact that we seem to be alone does not provide a very strong argument for the idea that superintelligence may be impossible.
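The "lottery number" intuition is just multiplication: each step's probability may be modest on its own, but their product can shrink below one expected winner even among trillions of stars.  The sketch below uses entirely hypothetical placeholder probabilities (none come from the article) purely to illustrate the arithmetic:

```python
# Toy model of cumulative improbability.  Every probability below is a
# hypothetical placeholder chosen only to show how the product shrinks;
# the article makes no numerical claims.
hypothetical_steps = {
    "suitable star system forms":         0.1,
    "life appears early enough":          0.01,
    "general intelligence evolves":       1e-6,
    "survives to colonization capability": 0.01,
    "chooses detectable expansion":       0.1,
    "expansion proceeds uninterrupted":   0.5,
}

p_expansion = 1.0
for step, p in hypothetical_steps.items():
    p_expansion *= p

STARS_IN_LOCAL_GROUP = 1e12  # "trillions of stars", per the article

expected_expanders = STARS_IN_LOCAL_GROUP * p_expansion
print(f"Per-system probability of an expansionist civilization: {p_expansion:.1e}")
print(f"Expected number across the local group: {expected_expanders:.2f}")
```

With these made-up figures the expected number of expansionist civilizations falls below one, which is the sense in which an empty sky can be merely improbable rather than impossible.  Note that only one of the six factors involves the step from general intelligence to superintelligence, which is why the paradox says little about that step in particular.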

[Back to Futurism] [Back to Main]  ©2002 by Mitchell Howe