For many, the term "transhumanism" suggests a rejection of humanity or a dismissal of the body of philosophy we call "humanism." Some of the movement's proponents don't help matters, embracing an Ayn Rand-style libertarian perspective and disdain for "unenhanced" humanity. But not all transhumanists are the same. A growing number see the drive to develop technologies to strengthen and extend human capabilities as part and parcel of the push to improve global social conditions, and recognize that there is a necessary role for society and government in the safe development and fair distribution of new technologies. They refer to themselves as "Democratic Transhumanists," and their founding philosopher is Dr. James Hughes.
Dr. Hughes is a bioethicist and sociologist at Trinity College in Hartford, Connecticut, where he teaches Health Policy, Drug Policy and Research Methods in Trinity's Graduate Public Policy Studies program. He holds a doctorate in sociology from the University of Chicago, where he also taught bioethics. He is a member of the American Society for Bioethics and Humanities and of the Working Group on Ethics and Technology at Yale University. He has been a longtime left activist, having founded EcoSocialist Review while in grad school and worked on systemic reform of health care organizations to empower patients.
He is also a Director of the World Transhumanist Association, and the author of the recently-published Citizen Cyborg: Why Democratic Societies Must Respond to the Redesigned Human of the Future. Dr. Hughes sees Democratic Transhumanism as existing in the space left fallow by both the libertarian transhumanist wing and the Luddite element of the left. As he put it in his lengthy and detailed treatise on the philosophy:
Democratic transhumanism stems from the assertion that human beings will generally be happier when they take rational control of the natural and social forces that control their lives. This fundamental humanistic assertion has led to two intertwined sets of Enlightenment values: the democratic tradition with its values of liberty, equality, solidarity and collective self-governance, and to the belief in reason and scientific progress, that human beings can use reason and technology to improve the conditions of life.
A recent manifestation of these principles is his founding of the Institute for Ethics and Emerging Technologies (IEET), an organization at which I am a Fellow. Over the past month, I've had an extended email conversation with Dr. Hughes, discussing the sometimes-strained relationship between progressive principles and technological utopianism. Because of its length, I'll post the discussion in three parts. Today's installment focuses on the meaning of Democratic Transhumanism.
Cascio: Let's start with the basics: what does "Democratic Transhumanism" mean?
Hughes: To me, the democratic part is a bit redundant, since I see transhumanism as a natural conclusion of the democratic and humanist philosophical tradition: life is better when people are empowered to make decisions about their own lives, individually and collectively. The two basic ways we can be empowered are by pushing back social domination through equality, liberty and social solidarity, and by pushing back the domination of nature through science and technology. It seems a natural conclusion that we should help one another use emerging technologies to push back sickness, aging, suffering and death, which is the key goal of transhumanism.
However, there are variants of the humanist tradition - neo-liberal, libertarian and anarcho-capitalist philosophies - that prioritize liberty to the exclusion of equality and solidarity, and that try to eliminate democratic oversight, regulation, redistribution and public provision. Although transhumanists like Condorcet and Haldane were most often advocates of radical egalitarianism, the 1960s brought an ascendance of the romantic, anti-technology wing of the Left and the ceding of narratives of progress to these champions of the free market and corporate capitalism. So when transhumanism finally found its feet as a social movement in the early 1990s, many transhumanists were attracted to these neo-liberal philosophies, at least up until the dotcom bust.
I think that libertarian tilt to transhumanism is now turning around. Progressives are discovering that their natural allies are not the Christian Right with its anxieties about hubris, but women trying to defend their reproductive rights to use technology, the disabled, like Christopher Reeve, fighting for assistive and restorative technologies, and the world's poor, who need new technologies to provide clean water and abundant food. Transhumanists, in turn, are realizing that Big Pharma is too short-sighted to commit to risky, far-sighted research on things like "negligible senescence." These projects need funding through the National Institutes of Health, the National Nanotechnology Initiative, and the Nano-Bio-Info-Cogno program. They also need to go through the Food and Drug Administration. Nothing is more disastrous for a technology than a thalidomide-type disaster. We've already seen with estrogen replacement therapy that the public is ready to adopt technologies to try to forestall aging, only to find that they are actually killing them.
So progressive or "democratic" transhumanists, unlike the free marketeers, understand that strong oversight and social reform have to accompany technology diffusion. Technologies need to be tested for safety and made universally available. Hopefully, as this perspective becomes more common, we can drop the redundant "democratic."
Cascio: I wonder how much of the negative reaction to transhumanism comes from a reaction to the term itself, and its implied disdain for being human. While you make a good argument that democratic transhumanism is a natural evolution of humanist philosophies, some of the ideas that transhumanism encompasses do include outcomes (uploading, radical bioengineering to the point of speciation, etc.) which discard "human-ness." How does democratic transhumanism speak to those who find such a transition frightening?
Hughes: The first point is that transhumanism does not connote disdain for humanity, but disagreement that the category of "human" is meaningful. Take our newly discovered Hobbit cousins from Indonesia, or Neanderthals. If they were still around, would we consider them human? What would it mean for our society if we denied modern Hobbits or Neanderthals human rights on the grounds that they weren't "human" and treated them as pets or slaves? Are conjoined twins human? Is someone in an intensive care unit, with machines breathing for them, pumping their blood and maintaining their blood chemistry, still human? "Human" and "human dignity" are empty signifiers that have crept into our language as proxies for "soul," and progressives need to rethink their use of these categories.
Francis Fukuyama, in Our Posthuman Future, explicitly argues that humanness is a "Factor X," a black box that combines some combination of genetics, rationality and emotion. But he doesn't want to specify it, because if he did it would be clear that there are humans who don't have those specifics, and that great apes probably do. Specifying what it is that we value about humanness would also allow us to regulate biotechnology to protect that, and allow individual choice on the rest. Are rationality and emotional complexity what make us human? Great - nobody will be allowed to make themselves developmentally disabled or autistic. Is remaining primarily organic what makes us human? OK, then adding lion genes shouldn't be a problem.
The basic argument between transhumanists and human-racists is a debate about what is really important and valuable in the human condition: self-aware existence, consciousness, emotionally rich experience and rational thought, on the one hand, or, on the other, having the modal genome and body type of human beings circa 2000 (which is very different from what it was even 20 years ago, but never mind that). The transhumanist position is known in bioethics as "personhood theory": you can be a self-aware person and not be human (great apes, for instance), and you can be "human" and not be a person (such as fetuses and the brain dead). Rights are for persons, not humans.
But there is a grain of truth to the critics' attack in that we are very upset about the limitations of the human body, and we think that, using reason and technology, we can do much better. That's what medicine is to begin with. Is it a lack of love for and faith in "humanness" to get vaccinated, or have surgery, or take insulin or vitamins? I think one of the things most people consider core to "humanity" is a desire to improve and progress, so in that sense human enhancement technologies are quintessentially "human."
Since "human" is basically a tribal identity with no empirical referent, what Kurt Vonnegut called a "granfalloon" , I fully expect that in four hundred years there will be people with green skin, four arms, wings, endless lives, and nanocomputer brain pans, who proudly consider themselves "human" and who organize big family reunions for all the people with their surname, or all the other descendents of Civil War veterans, or whatever. And there are people today who are ready to give up any claim to membership in the human race because they have glasses or a pacemaker or are pissed off about the persistent ubiquity of ignorance and cruelty in this race that pretends to know better.
I understand that people do get frightened by the idea of a transhuman society, with increasing diversity of persons. People were frightened that the end of slavery and Jim Crow would unleash anarchy and race-mixing, and people are still scared that legal gay marriage will destroy Western civilization. We need to try to convince those who are afraid of human enhancement that we can still have peace, prosperity and tolerance of diversity in that future. And at the same time we need to remember that the transhumanist claim is that people should control their own bodies and minds, and other people don't get to tell us to go to the back of the bus because of their vague anxieties and yuck reactions to our choices.
Cascio: Say a little bit more about the "yuck reaction" -- it's a term I see in use in transhumanist circles, but it doesn't have quite the same impact in broader conversation.
Hughes: "Yuck factor" is bioethics shorthand for the many variants of the argument that something must be unethical just because it freaks people out. For instance people think consensual cannibalism is self-evidently immoral even though the alleged ethical arguments against it are very tenuous. Leon Kass, G.W.'s bioethics czar, is the principal proponent of the theory that people should be guided by their gut instincts in ethics. Don't like chocolate cake? Then there is probably something unethical about chocolate cake. Most bioethicists aren't as bold as Kass in jettisoning reason however, so they have invented two variants on the uncontestable "God don't like it": that something is "unnatural" and that it violates "human dignity." Both of these arguments are just hand waving. As Love and Rockets said "You cannot go against nature, Because when you do, Go against nature, It's a part of nature too." As for violating human dignity its in the eye of the beholder.
Yuck factor is also closely related to "future shock." When society changes fast, people get upset and try to slow things down. In democratic societies they will be able to use quite a few brakes, which is generally a good thing. But balance is provided by the protection of individual liberty and minority rights. So, for instance, when most Americans freaked out about the Massachusetts Supreme Judicial Court's decision that gays should be able to marry, they passed referenda around the country to stop gay marriage. I think the state courts and then the Supreme Court should overturn those referenda on the grounds that gay marriage is a fundamental right. When we are talking about basic rights, like the right to control your own body and mind, or vote, or sit anywhere on the damn bus you feel like, or marry your lover, those rights should trump other people's future shock and yuck reactions.
Cascio: "We can do better" is at the core of what WorldChanging does and what IEET represents. And the "we" is as important as the "do better" -- it's not just atomistic individuals trying to compete for greatest personal satisfaction, it's a social effort, which reflects social concerns.
Hughes: That's absolutely important. Libertarian individualism is completely self-defeating for the human enhancement movement. You want to make yourself and your kids smarter? You can take a smart pill and do your mental gymnastics, but you still need good books, stimulating friends, a solid education, a free and independent press, and a stable, well-regulated economy so your PDA keeps beaming Google searches and email chat into your eyeball through that laser display. And it might be nice to have a strong, independent Food and Drug Administration to make sure that your smart pill doesn't cause dementia in five years, and that that laser display doesn't blind you.
Similarly, the principal determinants of longevity in the 20th century have been improvements in social technology, not medical technology - e.g., getting people to suppress infectious diseases. Universal access to safe, effective life extension and age-retardation technology in the coming decades will require public investment in basic research, reining in our out-of-control intellectual property system, and subsidizing access for the uninsured and the world's poor. The libertarian fantasies that atomistic individualism and an unregulated free market will build an attractive future are just stupid.
Comments (3)
Good stuff.
I could add: It's necessary for a wide variety of humanists to chip in in order for the face of the future to be *sane.* Don't leave it up to the . . . well, apocalyptic oddballs who have invested religious significance into the Singularity.
"I fully expect that in four hundred years . . ."
A couple of decades back, "The Space Gamer" ran a cute little story set in a spaceport bar.
A guy wanders in before things get busy. He chats with people as the after-work crowd wanders in. Various cyborgs, robots, an uplifted dinosaur just in from shooting a monster movie, people adapted to other planets, and so on.
At some point it's revealed that the protagonist is an _alien_.
The other patrons react with shock and horror. He's tossed out -- they don't serve that kind here -- and the fellow he was sitting with feels betrayed and scandalized.
Hilarious.
Posted by Stefan Jones | November 30, 2004 3:53 PM
I agree that a great deal of libertarian fantasies are simply stupid, but I also think that too much regulation and public accountability rhetoric can kill the movement in its infancy. Early technologies are full of mistakes, bugs and problems. That's just how life is. And if these technologies are used on the body, then people will suffer, too. In order for the methods to become safe, the early pioneers will have to make sacrifices -- and they will have to be _allowed_ to make those sacrifices. While libertarian fantasies may be stupid, it is the people who believe in them who are going to make the largest impact on pro-transhumanist technology.
Posted by Sergiy Grynko | November 30, 2004 5:49 PM
What a fine interview, Jamais. Worthy subject and good questions.
I felt my incipient Luddism coming to the fore as I read the articles. We haven't been able to solve the many problems that can easily be remedied with the technology we already have. And in the present culture, I do not trust us to make wise decisions with the tricky issues of biotechnology. GMOs are a case in point.
Perhaps in a social democracy with the corporations under control and a much greater awareness of ecology -- maybe then I'd start to feel enthusiasm rather than a sense of dread.
As it is, transhumanism sounds either like a costly irrelevance (e.g., space flight, artificial intelligence) or a Pandora's box (e.g., nuclear weapons).
To his credit, Hughes seems aware of many of the potential problems.
Posted by bart | December 3, 2004 1:27 AM