The definition of the singularity is that we don't know how our world and civilization (both technologically and socially) will be affected by higher-than-human-intelligence AI. Of course we can't know anything about the future for sure, but I'd say we're reasonably good at having some kind of an idea, as Kurzweil's track record shows. The one thing we can reasonably predict is that the impact will be pretty massive. Invention-of-agriculture massive. Anything from Terminator-esque genocide to essentially heaven on earth is conceivable. Guiding this toward a beneficial outcome for the human race is exactly the point of The Singularity Institute.

Kurzweil's books "The Age of Spiritual Machines" and "The Singularity Is Near" take the basic format of spending half the book explaining the singularity theory and making a case for its validity, and the second half talking about how crazy awesome the outcome could potentially be. He's pretty optimistic about it and makes good arguments for his optimism. They are great reads.