Transhumanism: The Quest for the Techno-Savior and Its Consequences
Locked and Uploaded
Transhumanism and the Techno-Savior
I have a passion for delving into lesser-known communities and individuals who possess unique or thought-provoking perspectives. While transhumanism might not be entirely obscure, it certainly seems less prevalent in discussions today.
Nevertheless, the movement persists, with organizations like Humanity+ maintaining interest. Compared to the vibrant conversations of the 90s and early 2000s, it appears to have become more subdued.
In those earlier years, a palpable excitement surrounded the subject. I remember engaging dialogues in online forums where people shared their awakening to transhumanism and expressed eagerness for a future with enhanced bodies. The discussions often felt almost religious in nature.
During that era, many enthusiasts had a key objective: to survive until the 2030s when technology would supposedly allow for centuries of life. The promise of a techno-savior was imminent, and this time it was grounded in science, not faith.
The Über Human
For those unfamiliar, transhumanism is a movement built on the conviction that our current physical forms are not the endpoint of human evolution. It is characterized by a fervent hope for a future in which our minds can reside in more advanced vessels.
Topics frequently discussed include life extension, artificial intelligence, nanotechnology, cryonics, mind uploading, and the concept of the singularity.
The singularity refers to the moment when machines with general intelligence surpass human capabilities, leading to a recursive cycle of machines creating even smarter machines—resulting in advancements far beyond human understanding.
At this juncture, the future of humanity becomes uncertain. A goal for transhumanists is to develop benevolent AI, ensuring that our digital successors do not inadvertently lead to our extinction. (I have previously argued that this threat may be overstated.)
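To make that compounding intuition concrete, here is a deliberately crude toy model in Python. Everything in it is my own illustrative assumption: the simulate function, the starting capability, and the fixed 25% improvement per generation have no basis in any actual research; the point is only to show how a self-reinforcing loop runs away.

```python
# Toy illustration of the recursive self-improvement story:
# each generation of machine designs a successor that is a fixed
# fraction better than itself, so the gains compound.
# All numbers here are made up purely for illustration.

def simulate(generations=25, capability=0.1, human_level=1.0, gain=0.25):
    """Print capability per generation under compounding improvement."""
    for gen in range(generations):
        label = "beyond human" if capability > human_level else "sub-human"
        print(f"generation {gen:2d}: capability {capability:10.2f} ({label})")
        # The smarter the current machine, the larger the jump it can design.
        capability += gain * capability

if __name__ == "__main__":
    simulate()
```

Run it and the numbers crawl for a while, cross the human line, and then blow past it, which is the entire emotional engine of the argument; whether real AI progress behaves anything like a fixed compounding rate is, of course, the part the skeptics dispute.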
It seems that many transhumanists are essentially science fiction enthusiasts who long for immortality, believing that advanced AI will elevate them into an exciting narrative rather than pose a danger.
One of the most prominent critics of transhumanism is Dale Carrico. During the height of post-human enthusiasm, he consistently reminded transhumanists of some uncomfortable truths:
- Enjoying science fiction doesn't equate to conducting science.
- Speculative ideas are not the same as predictive insights.
- There is little merit in asserting that magical concepts would be fascinating if they were real.
Carrico's critiques highlighted the lack of a solid foundation supporting the movement's aspirations.
I’ll Stick with My Monkey Body
In my past visits to the Burning Man festival, I encountered a transhumanist camp adjacent to mine.
Their setup featured rows of chairs under a shade structure, reminiscent of a church gathering or an Amway seminar—an environment where pre-packaged truths are dispensed.
A whiteboard displayed a graph often seen in those days, illustrating the accelerating advance of technology. The chart's line curved ever more steeply upward, passing milestones like the emergence of genuine AI and culminating in the point that marked the singularity and the euphoria expected to follow.
Burning Man is a festival filled with stunning visuals, light displays, and innovative art, so one might expect transhumanists to create an awe-inspiring camp.
Instead, they had only chairs and a whiteboard, overshadowed by camps serving vegan pancakes and more engaging offerings. This led me to question whether my ordinary human existence might be more compelling than the imagined life of a mind-uploaded entity.
While it may not be fair to judge the entire movement based on one encounter, that experience left a lasting impression.
Recipe for a Cult
The transhumanist movement is not without a spiritual core: a belief in the imminent arrival of an AI-driven future that promises eternal life, a sort of techno-religion.
Is it surprising that some cult-like tendencies have emerged? Here are a few characteristics that might apply:
- A shared belief in a transformative idea.
- Significant time and energy invested in that idea.
- A charismatic leader who claims special insights.
- A focus on ideological conformity.
- Apocalyptic perspectives on the future.
Eliezer Yudkowsky, an AI researcher and co-founder of the Machine Intelligence Research Institute (MIRI), has been a significant figure in this realm. He advocates for the development of friendly AI and has extensively written on rationality and cognitive biases.
Yudkowsky promotes the belief that ensuring AI remains benevolent is humanity's most critical task, leading to calls for donations to support this cause.
If you think this mindset is exaggerated, consider a 2010 interview in which he was asked whether he believed there are only two valid paths for intelligent people: working on singularity-related issues, or making money to donate to causes like SIAI (MIRI's earlier name) or Methuselah.
He answered in the affirmative, stressing that these endeavors should take priority, and then escalated further, declaring this a critical moment for humanity.
When asked what would happen if he were not around, his response tipped into narcissism: he indicated that he believed only he could fulfill this vital role.
To summarize, we see a three-part formula for cult-like thinking:
- The most crucial cause in history.
- A logical rationale for substantial support of this cause.
- The belief that only one individual can fulfill this mission.
This rhetoric is reminiscent of L. Ron Hubbard's teachings, combining belief, financial investment, and a central figure.
Fortunately, it seems that followers of Yudkowsky's movement haven't faced the same level of abuse found in more harmful cults, but the similarities are intriguing.
Live Longer and Prosper
Given the last 15-20 years, I expected that transhumanists would have moved beyond the simplistic notion that vitamin intake and patience would lead to immortality.
I was surprised to find that these ideas still persist. The Humanity+ FAQ page includes a query about becoming post-human, suggesting that while it's not feasible today, those who live long enough may have a chance. Recommendations include:
- Leading a healthy lifestyle and minimizing risks.
- Enrolling in cryonics for post-mortem preservation.
- Saving for potential life-extension treatments.
- Supporting transhuman technologies through financial means or assistance.
These suggestions seem tailored to affluent individuals, as many cannot easily avoid risks or afford cryonics.
To address this disparity, Humanity+ proposes that life-extension technologies will eventually become accessible to all. Until then, society may resemble the world depicted in the film Elysium, where the wealthy enjoy luxurious lives while the less fortunate struggle.
The site also advocates progressive taxation and community-funded services, which sits awkwardly with the libertarian leanings of many transhumanists, who tend to resist higher taxation and expanded social services.
Give Me That Old Time Futurism
As previously mentioned, mind-uploading or the singularity was once anticipated by the 2030s. Ray Kurzweil, a prominent futurist, projected the year 2045 for this event. A 2018 poll of AI researchers revealed varying opinions, with 24% suggesting a timeframe between 2036 and 2060, while 21% believed it would "likely never" happen.
It’s certainly a topic to consider for the future. However, if mind-uploading were achievable, the resulting consciousness would not be truly "you," but rather a duplicate version of yourself.
Yet, I’m sure proponents have some philosophical rationale to reconcile this dilemma if they choose to believe strongly enough.
The folk beliefs of the late 20th and early 21st centuries often revolve around UFOs and celestial beings, while the intellectual discourse centers on the singularity and science as our savior. For both, science fiction serves as a revered text.
In my view, one of the most compelling post-singularity science fiction novels is Charlie Stross’ Accelerando. Interestingly, Stross himself does not genuinely endorse the singularity outside of the realm of fiction.