"There shall be wings! If the accomplishment be not for me, 'tis for some other. The spirit cannot die; and man, who shall know all and shall have wings..." --Leonardo Da Vinci
I’ve spent the past several months exploring "doomer" themes at my sister blog The Doomer Report, where I think I’ve managed to cover most of the really pessimistic scenarios for the near future. Having thoroughly explored "the dark side", I am happy to report that I’m back in the Singularitarian camp, more convinced than ever that this vision of the future is the more compelling one.
While the case for an impending collapse of technological civilization is in many ways quite strong, and there are many highly intelligent people in the doomer camp, I think what is lacking in their worldview, above all, is creativity. Doomers look at current trends, such as declining per capita energy production, resource depletion, or global warming, project them forward along a straight line, and conclude that we’re all doomed. The role of scientific breakthroughs, technological black swans, cultural adaptations, and human ingenuity is almost totally discounted, or assumed to be negative. History has demonstrated over and over again that such linear thinking about the future is usually wrong, often comically so. While I don’t dismiss the possibility of a future collapse or seek to downplay the seriousness of the challenges our species faces, I think it’s more likely that today’s doomers will look just as silly as Paul Ehrlich did in 1970, when he famously predicted that as many as four billion people would starve to death in the 1980s, including 65 million Americans.
My other criticism of the doomer worldview is that it is a form of fatalism – it promotes the attitude that we are all helpless before huge negative global trends, and that the best we can do is to try to mitigate the impact of the inevitable doomsday. This way of thinking is highly seductive, because by abandoning all hope for the future we no longer feel responsible for it. It can be quite liberating to opt out of the whole progressive program, throw up your hands and say “to hell with it.” But this attitude can quite easily become a self-fulfilling prophecy which will, indeed, result in the world going to hell. If every member of our species were so fatalistic, I have no doubt that we would still be eating our meat raw and subsisting on grubs clawed out of the earth with our bare hands. Singularitarians may be accused of hubris, but it is hubris in the best tradition of our species, and it certainly seems more useful than doomer despair.
In a sense Singularitarians and transhumanists are doomers, in that we envision a future in which Homo sapiens has been surpassed or even replaced by a superhuman intelligence. But this is hardly a radical notion, in view of the long evolutionary past in which every previous apex species has been similarly made obsolete. The difference this time is that we may intentionally knock ourselves off the top of the pyramid, and in so doing determine the future of intelligent life on this planet and beyond, rather than letting random environmental factors determine it for us. In fact, I would argue that the points doomers make about our species having exceeded its sustainable limits on a finite planet only make the case for transhumanism stronger. Since we do seem to be running up against some hard biological limits, rather than passively accepting the “die-off” that these limits imply, why not try to remove some of our shortcomings by any means necessary? Is tinkering with the human genetic code, enhancing our neurocircuitry or developing artificial intelligence somehow more repugnant than accepting the deaths of billions of human beings? Only an extreme despiser of life could think so.
I highly recommend that every Singularitarian explore the doomer scenarios discussed at sites like dieoff.org. If nothing else, reading this material is a good exercise that challenges the simplistic techno-optimism that is so prevalent among transhumanists. For me, transhumanism is not so much an optimistic view of human nature as a deeply realistic one, in that it assumes that real progress derives more from technological enhancement than philosophical enlightenment. In this sense transhumanism is consistent with the views of uber-doomers like Jay Hanson, who likes to point out how deluded human beings are about the extent of their genetic programming. But where Hanson believes that this programming, combined with our technological prowess, will inevitably result in self-destruction, I believe that we may be able to avoid such a fate by modifying the destructive programming itself. This is why my foray into doomerism has only strengthened my belief in the importance of transhumanist research, and why I think it is so critical that we take advantage of this window of opportunity to aggressively pursue transhumanist game-changers before one of the doomer scenarios really does come to pass and such research becomes impossible.