The Singularity Q&A

Q: Are there other kinds of Singularities?

A: The Singularity concept has a few different definitions worth discussing, each related to the milestone of greater intelligence and each with its own strengths and weaknesses.

The idea that exceeding the limits of human thought will produce currently unforeseeable change is highly relevant to one of these concepts, which defines the Singularity as the moment when the future becomes unpredictable -- for whatever reason.  The Singularity, by this definition, is the "predictive horizon" beyond which we have no meaningful capacity to imagine.  While this definition is a very "safe" one for futurists, relieving them of any pressure to make educated guesses, it suffers from an anchoring problem.  Like a topographical horizon, a predictive horizon is relative to the changing position of the observer, and can no more be reached than can a rainbow.  So, unless an anchor point is decided on (e.g., "The Singularity is hereby and forever defined as the point in the future beyond which no meaningful predictions could have been made by those living in the year 2002."), this interpretation of the Singularity will remain useless for establishing a milestone in the timeline of human history.  It will always be one step ahead of us -- although the distance between these steps seems to be continually shrinking and is therefore an interesting measure in its own right.   (It is theoretically possible for the predictive horizon to become extremely close, but it could never actually be reached unless conditions rendered any and all anticipation of even the next moment impossible.   Such a state would be the effective end of intelligence, short-circuiting the very force presumed to be driving the unpredictable progress.)

Another important variation on the Singularity is the idea of a "developmental Singularity": a point at which the rate of technological advancement becomes infinite, or close to it. The use of terms like "infinite" or "near infinite" throws an obvious wrench into such a definition, of course, since new technology is probably not something that can ever reach a point of infinite increase -- for the same reason that a predictive horizon can never be reached.  And as far as I know, nobody has specified what rate could be considered "close to infinite." Nevertheless, any futurist knows that the rate of technological progress throughout history has itself been accelerating.  Plotting this on a graph shows that we are in the "knuckle" of a curve on the verge of shooting explosively upward if trends continue.  The graph shows a very clear, sharp "Spike," and you can thus see why Damien Broderick chose this name for the Singularity.  This type of Singularity is discussed in greater detail in the question directly addressing the "Spike" idea.

The definition of Singularity emphasized by this Q&A, the achievement of greater-than-human intelligence, does have at least one important shortcoming:  namely, the very real possibility that minds could be smarter than humans in some ways while falling short in others.  In a sense, this is already true.  Computers as we know them today should not really be thought of as minds, but they are certainly better suited to, say, mathematics than we are.  The precise arrival of the Singularity may thus be disputed for lack of an obvious way to recognize it.  The most "traditional" measurement, the so-called Turing Test, is probably not the best candidate.  In its simplest form, the Turing Test is a contest in which an Artificial Intelligence converses textually with a human judge; if the judge can't distinguish the AI from the human participants, the AI passes the test.  The problem is that this type of communication is something humans are very good at, on account of highly specialized mental hardware.  An AI that can pass the Turing Test may therefore be far more intelligent than humans in other important domains -- so much so that it may have already triggered one of the other types of Singularity.

The "greater intelligence" Singularity, despite the potential uncertainty in identifying it, is probably still the most precise of the three Singularites discussed in this response.  And as the milestone most likely to trigger spikes of technological progress and reductions of the predictive horizon, the arrival of greater intelligence is also perhaps the most inclusive definition of Singularity available.

©2002 by Mitchell Howe