Editing Transhumanism (section)
=== Existential risks ===
{{See also|Existential risk studies|Existential risk from advanced artificial intelligence}}
In his 2003 book ''[[Our Final Hour]]'', British [[Astronomer Royal]] [[Martin Rees]] argues that advanced science and technology bring as much risk of disaster as opportunity for progress. However, Rees does not advocate a halt to scientific activity. Instead, he calls for tighter security and perhaps an end to traditional scientific openness.<ref name="Rees 2003"/> Advocates of the [[precautionary principle]], such as many in the [[environmental movement]], also favor slow, careful progress or a halt in potentially dangerous areas. Some precautionists believe that [[artificial intelligence]] and [[robotics]] present possibilities of alternative forms of cognition that may threaten human life.<ref name="Arnall 2003"/>

Transhumanists do not necessarily rule out specific restrictions on emerging technologies so as to lessen the prospect of [[existential risk]]. Generally, however, they counter that proposals based on the precautionary principle are often [[technorealism|unrealistic]] and sometimes even counter-productive, in contrast to the [[technogaian]] current of transhumanism, which they claim is both realistic and productive. In his television series ''[[Connections (British documentary)|Connections]]'', science historian [[James Burke (science historian)|James Burke]] dissects several views on [[technological change]], including precautionism and the restriction of [[inquiry|open inquiry]]. Burke questions the practicality of some of these views, but concludes that maintaining the ''[[status quo]]'' of inquiry and development poses hazards of its own, such as a disorienting rate of change and the depletion of our planet's resources.
The common transhumanist position is a pragmatic one where society takes deliberate action to ensure the early arrival of the benefits of safe, [[clean technology|clean]], [[alternative technology]], rather than fostering what it considers to be [[anti-science|anti-scientific views]] and [[technophobia]]. [[Nick Bostrom]] argues that even barring the occurrence of a singular [[Global catastrophic risk|global catastrophic event]], basic [[Malthusian]] and evolutionary forces facilitated by technological progress threaten to eliminate the positive aspects of human society.<ref name="bostrom-evolution">{{cite journal|last1=Bostrom|first1=Nick|title=The Future of Human Evolution|journal=Bedeutung|date=2009|url=http://www.nickbostrom.com/fut/evolution.html}}</ref> One transhumanist solution proposed by Bostrom to counter existential risks is control of [[differential technological development]], a series of attempts to influence the sequence in which technologies are developed. In this approach, planners would strive to retard the development of possibly harmful technologies and their applications, while accelerating the development of likely beneficial technologies, especially those that offer protection against the harmful effects of others.<ref name="Bostrom 2002"/>

In their 2021 book ''Calamity Theory'', Joshua Schuster and Derek Woods critique the study of existential risk, arguing against Bostrom's transhumanist perspective, which emphasizes controlling and mitigating such risks through technological advancement. They contend that this approach relies too heavily on [[fringe science]] and speculative technologies, and that it fails to address deeper philosophical and ethical questions about the nature of human existence and its limitations.
Instead, they advocate an approach more grounded in secular [[Existentialism|existentialist philosophy]], focusing on [[psychological resilience|mental fortitude]], [[community resilience]], international [[peacebuilding]], and [[environmental stewardship]] to better cope with existential risks.<ref name="Schuster and Woods 2021"/>

==== Antinatalism and pronatalism ====
Although most people focus on the scientific and technological barriers on the road to human enhancement, Robbert Zandbergen argues that contemporary transhumanists' failure to critically engage with the cultural current of [[antinatalism]] is a far bigger obstacle to a posthuman future. Antinatalism is a stance seeking to discourage, restrict, or terminate [[human reproduction]] in order to solve existential problems. If transhumanists fail to take this threat to human continuity seriously, they run the risk of seeing the collapse of the entire edifice of radical enhancement.<ref>{{Cite journal |last=Zandbergen |first=Robbert |date=2021-12-09 |title=Morality's Collapse: Antinatalism, Transhumanism and the Future of Humankind |url=https://jeet.ieet.org/index.php/home/article/view/76 |access-date=2023-04-05 |journal=Journal of Ethics and Emerging Technologies |volume=31 |issue=1 |pages=1–16 |doi=10.55613/jeet.v31i1.76 |s2cid=248689623 |doi-access=free}}</ref> [[Simone and Malcolm Collins]], founders of Pronatalist.org, are activists known primarily for their advocacy of a secular and voluntaristic form of [[pronatalism]], a stance encouraging higher birth rates to reverse [[population decline|demographic decline]], which they see as a threat to the viability of modern societies and to the possibility of a better future.<ref name="Dodds 2023">{{Cite news|url = https://www.telegraph.co.uk/family/life/pronatalists-save-mankind-by-having-babies-silicon-valley|title = Meet the 'elite' couples breeding to save mankind|newspaper = The Telegraph|date = April 19, 2023|last1 = Dodds|first1 = Io}}</ref>
Critical of transhumanism, they have expressed concern that [[life extension]] would worsen the problem of [[gerontocracy]], causing toxic imbalances in power. The Collinses lament that [[voluntary childlessness|voluntarily childfree]] transhumanists who "want to live forever believe they are the epitome of centuries of human cultural and biological evolution. They don’t think they can make kids that are better than them."<ref name="Weiss 2023">{{Cite news|url = https://www.thefp.com/p/tech-gods-immortality-live-forever|title = The Tech Messiahs Who Want to Deliver Us from Death|newspaper = The Free Press|date = May 24, 2023|last1 = Weiss|first1 = Suzy}}</ref>