I've been working on this topic for days, and meanwhile, the blog's got a bit stale. To make it more interesting, I'm breaking this topic up into more manageable pieces.
Transhumanism is humanism with an appreciation for the fact that our evolution is just beginning. Already, we augment ourselves with prosthetic limbs, heal ourselves with pharmaceuticals, and use computer systems to enhance our information processing ability. Our lifespans are typically double what they were a century ago. In the future, our posthuman descendants will have unlimited lifespans and reason with minds so powerful that they'll look back on today's humans as we look back on the dinosaurs.
Why is this important now? In a word, proximity. Within 50 years we will possess the technologies required to bestow superintelligence and superlongevity on any human. Unfortunately, those same technologies pose an existential threat to all life on this planet.
As an optimist, I like to believe that humanity will survive its terrestrial adolescence, and emerge as a more mature race. With forward planning, we can avoid the threats of totalitarian world regimes and self-destruction.
In my opinion, the best way to proceed is to anticipate our evolving future and establish mechanisms that will protect the rights of the evolved and the unevolved alike. We must shape our philosophy so that even superbeings will respect our ethical code.
Don't mention the war
Talk of superhumans inevitably finds its way back to Nazi Germany. Who is to say what constitutes enhancement? Should the smartest, most attractive, most "perfect" people hold political power? Potentially scary stuff. If Nazi eugenics is one extreme, an opposite and equally authoritarian poison lurks in religious fundamentalism. If humans were created in God's image, then presumably we are already perfect. Indeed, if we are to believe the fairy tale that is the Bible, we were better off without reason, knowledge, or science. Religious fundamentalists have no need of these things, and many extremists are all too pleased to take freedom of thought from you by way of terror or military force. Nazi eugenics and Taliban-style regression are two faces of the same dark authoritarian nightmare.
People who neither like nor understand technology may propose that we simply outlaw all forms of human enhancement. If life is equally brutish and short for everyone, there will be no superior race trying to displace us. Unfortunately, this is a prescription for certain doom. Our technology will inevitably grow so complex that we will either create a super race by accident, or blow ourselves up. The only way to handle the complexity of our own future technology is to evolve right along with it. We can try to hold back technology with a gigantic police state, but then we're back to authoritarianism.
Ethical Principle #1:
Authoritarianism is bad. Embrace liberal philosophy.
Ethical Principle #2:
Embrace technology and adapt to meet its challenges.
When evolution is self-directed, it accelerates rapidly. If I devise a cybernetic implant that doubles my IQ, I will use it to devise an implant that redoubles my IQ. Each evolutionary step enhances our ability to evolve further, and supersmart life forms will be outpaced by yet supersmarter beings. Fifth generation superintelligences will look like monkeyboys compared to sixth generation superintelligences, and so on.
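The compounding logic above can be sketched as a toy model. The doubling-per-generation assumption and the baseline value of 100 are illustrative stand-ins from the IQ example, not predictions:

```python
# Toy model of recursive self-improvement: each generation of
# intelligence doubles the capability of the one before it.
# Doubling rate and baseline are assumptions for illustration.

def capability(generations, baseline=100):
    """Capability after n doubling generations: baseline * 2**n."""
    return baseline * 2 ** generations

# Six doublings turn a baseline of 100 into 6400; the single jump
# from generation 5 to 6 (3200) exceeds everything that came before.
print([capability(n) for n in range(7)])
# [100, 200, 400, 800, 1600, 3200, 6400]
```

The point isn't the particular numbers: any self-amplifying growth rate makes each generation's lead over the previous one larger than the whole history before it.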
A reigning superintelligence might be tempted to eliminate all potential competitors before they have the chance to steal control of its destiny. Of course, if the superintelligence needs us monkeyboys more than it needs that control, we should feel no less secure than we do today.
Ethical Principle #3:
Respect the rights of less intelligent life forms lest you be one.
Ethical Principle #4:
To be continued...