The Singularity Q&A

Q: How far off is the Singularity?

A: According to the definition that will be used in this Q&A, the Singularity will be "officially" ushered in as soon as greater intelligence starts solving problems and producing ideas that people from all prior eras might never have managed on their own.  There are at least four distinct technological paths that could play major roles in exceeding the traditional boundaries of human thought: neural interfacing, genetic engineering, Artificial Intelligence, and nanotechnology.

  • Neural interfacing would create direct connections between humans and computers, potentially allowing people to not just rapidly access information or subjectively "experience" simulated environments, but to actually expand their own thought processes in ways that would make them more intelligent.

  • Genetic engineering could theoretically improve brain function someday, although ethical considerations and the many years it takes a new person to mature to adulthood make this particular avenue slow and unlikely.

  • Artificial Intelligence (AI) would create new kinds of minds that, although different from our own, could eventually exceed human performance in every identifiable category of intelligence -- with the added benefit of being able to improve upon their own designs.

  • Nanotechnology refers to the engineering of materials and devices at the molecular level; it is included in this list because its more advanced potential applications, if realized, would make each of the other avenues to greater intelligence dramatically more accessible.

Guessing the arrival date of the Singularity is tricky because technological developments often have unforeseen benefits in seemingly unrelated fields, making it hard to know which technologies will be the first to allow human intelligence to be exceeded.  Also, futurists who agree on the possibility of greater intelligence often disagree about the nature of intelligence, and therefore favor different approaches.  For instance, the strongest AI proponents believe that real AI could run on today's computer systems if we had the right programming design.  In contrast, those who feel that only biological neurons can give rise to intelligence must settle for the decades or centuries it would take to run a genetic engineering program.  At some distance from either of these extremes, there is something of a moderate consensus among futurists favoring the year 2030 as a very rough ETA.

Of course, the Singularity is not a horse race where the bettors cannot directly influence the outcome.  If we as a civilization make the Singularity a priority, it will happen much sooner.  If we are content to let technology run its course and produce greater intelligence as an afterthought, it will take far longer.




©2002 by Mitchell Howe