Following a recent discussion with colleagues, none of whom knew what the singularity was:
The singularity is a hypothesis that predicts a runaway process if an artificial intelligence manages to improve itself. Here is what Wikipedia (http://en.wikipedia.org/wiki/Technological_singularity) has to say:
The technological singularity is a hypothesised point in the future variously characterized by the technological creation of self-improving intelligence, unprecedentedly rapid technological progress, or some combination of the two.
Statistician I. J. Good first wrote of an "intelligence explosion", suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unseen by their designers, and thus recursively augment themselves into far greater intelligences.
Others, most prominently Ray Kurzweil, define the Singularity as a period of extremely rapid technological progress. Kurzweil argues such an event is implied by a long-term pattern of accelerating change that generalizes Moore's Law to technologies predating the integrated circuit and which he argues will continue to other technologies not yet invented.
I think the concept is very interesting. However, in my view any such event would hit a resource cap, which, of course, does not answer the question of whether the AI would outcompete humans.
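The resource-cap intuition can be sketched numerically. The following is a toy model of my own devising (not something from the singularity literature): assume each step's improvement is proportional to current capability, but throttled by how close capability is to a fixed resource ceiling. That assumption yields logistic growth — an "explosion" at first, then saturation at the cap. All parameter names and values here are illustrative.

```python
def capability(steps, rate=0.5, cap=1000.0, start=1.0):
    """Toy logistic-growth sketch of recursive self-improvement.

    rate  -- per-step improvement factor (hypothetical)
    cap   -- resource ceiling that growth cannot exceed (hypothetical)
    start -- initial capability level
    """
    x = start
    history = [x]
    for _ in range(steps):
        # Growth is proportional to x, but damped as x approaches the cap.
        x += rate * x * (1 - x / cap)
        history.append(x)
    return history

h = capability(40)
# Early steps grow near-exponentially; later steps flatten out below the cap.
print(h[1], h[10], h[-1])
```

Under these assumptions the curve looks like an intelligence explosion only in its early phase; the second half of the run is dominated by the ceiling, which is the point I am making above.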