Editing Ray Kurzweil (section)
==Views==

===The Law of Accelerating Returns===
{{main|The Law of Accelerating Returns}}
In his 1999 book ''[[The Age of Spiritual Machines]]'', Kurzweil proposed "The Law of Accelerating Returns", according to which the rate of change in a wide variety of evolutionary systems (including the growth of technologies) tends to increase exponentially.<ref>Ray Kurzweil, ''The Age of Spiritual Machines'', [[Viking Press|Viking]], 1999, [https://books.google.com/books?id=ldAGcyh0bkUC&pg=PA630 p. 30] and [https://books.google.com/books?id=ldAGcyh0bkUC&pg=PA632 p. 32].</ref> He expanded on the idea in his 2001 essay "The Law of Accelerating Returns", which proposes an extension of [[Moore's law]] to a wide variety of technologies and argues in favor of [[John von Neumann]]'s concept of a [[technological singularity]].<ref>{{cite web |url=http://www.kurzweilai.net/the-law-of-accelerating-returns |title=The Law of Accelerating Returns |access-date=September 15, 2014 |archive-date=January 14, 2015 |archive-url=https://web.archive.org/web/20150114213500/http://www.kurzweilai.net/the-law-of-accelerating-returns |url-status=dead }}</ref>

===Genetics, nanotechnology, and robotics===
In 2006, Kurzweil worked with the [[Army Science Board]] to develop a rapid-response system for dealing with the possible abuse of biotechnology. He suggested that a bioterrorist could use the same technologies that allow us to reprogram biology away from cancer and heart disease to reprogram a virus to be more deadly, communicable, and stealthy, but argued that we have the scientific tools to defend against such an attack, much as we defend against computer viruses.
Kurzweil has testified before Congress on [[nanotechnology]], saying that it has the potential to solve serious global problems such as poverty, disease, and climate change.<ref name="Global_warming">{{cite web |date=July 2006 |title=Nanotech Could Give Global Warming a Big Chill |url=http://www.qsinano.com/pdf/ForbesWolfe_NanotechReport_July2006.pdf |url-status=dead |archive-url=https://web.archive.org/web/20131023163838/http://www.qsinano.com/pdf/ForbesWolfe_NanotechReport_July2006.pdf |archive-date=October 23, 2013 |access-date=June 16, 2011}}</ref> In media appearances, Kurzweil has stressed nanotechnology's extreme potential dangers<ref name=booktv /><!-- at 85, 147, 167 and 173 minutes into 3-hour interview--> but argues that, in practice, progress cannot be stopped: stopping it would require a totalitarian system, and any attempt would drive dangerous technologies underground and deprive responsible scientists of the tools needed for defense. He suggests that the proper role of regulation is to ensure that technological progress proceeds safely and quickly without depriving the world of profound benefits. He said: "To avoid dangers such as unrestrained nanobot replication, we need relinquishment at the right level and to place our highest priority on the continuing advance of defensive technologies, staying ahead of destructive technologies. An overall strategy should include a streamlined regulatory process, a global program of monitoring for unknown or evolving biological pathogens, temporary moratoriums, raising public awareness, international cooperation, software reconnaissance, and fostering values of liberty, tolerance, and respect for knowledge and diversity."<ref>{{cite web |url=http://www.kurzweilai.net/nanotechnology-dangers-and-defenses |title=Nanotechnology Dangers and Defenses |publisher=KurzweilAI |access-date=2013-07-28 |archive-date=October 4, 2013 |archive-url=https://web.archive.org/web/20131004175320/http://www.kurzweilai.net/nanotechnology-dangers-and-defenses |url-status=dead }}</ref>

===Health and aging===
Kurzweil admits that he cared little for his health until age 35, when he was found to have [[glucose intolerance]], an early form of [[type II diabetes]] (a major risk factor for [[heart disease]]). He then found a doctor, [[Terry Grossman]], who shared his unconventional beliefs and helped him develop an extreme regimen involving hundreds of pills, [[Intravenous therapy|chemical intravenous]] treatments, [[Health effects of wine|red wine]], and various other methods intended to extend his lifespan.
In 2007, Kurzweil was ingesting "250 supplements, eight to 10 glasses of [[alkaline water]] and 10 cups of [[green tea]]" every day and drinking several glasses of red wine a week in an effort to "reprogram" his biochemistry.<ref>{{cite magazine |date=February 12, 2005 |title=Never Say Die: Live Forever |url=https://www.wired.com/news/medtech/0%2C1286%2C66585%2C00.html?tw=wn_tophead_3 |url-status=dead |archive-url=https://web.archive.org/web/20070224022313/http://www.wired.com/news/medtech/0,1286,66585,00.html?tw=wn_tophead_3 |archive-date=February 24, 2007 |access-date=September 15, 2014 |magazine=WIRED}}</ref> By 2008, he had reduced the number of supplement pills to 150,<ref name="CNN-2008" /> and by 2015 to 100 pills a day.<ref>{{cite web |url=https://www.businessinsider.com/ray-kurzweils-immortality-diet-2015-4 |title=The 700-calorie breakfast you should eat if you want to live forever, according to a futurist who spends $1 million a year on pills and eating right |work=Business Insider |access-date=March 3, 2019}}</ref> Kurzweil asserts that in the future, everyone will live forever.<ref>{{cite web |date=July 3, 2012 |title=As Humans and Computers Merge … Immortality? |url=https://www.pbs.org/newshour/show/as-humans-and-computers-merge-immortality#transcript |website=PBS NewsHour}}</ref> In a 2013 interview, he said that within 15 years, medical technology could [[Longevity escape velocity|add more than a year to one's remaining life expectancy for each year that passes]], at which point we could "outrun our own deaths". Among other things, he has supported the [[SENS Research Foundation]]'s approach to repairing aging damage and has encouraged the general public to hasten its research by donating.<ref name=wsj>{{cite web |url=https://www.wsj.com/articles/SB10001424127887324504704578412581386515510 |title=Will Google's Ray Kurzweil Live Forever? |date=April 12, 2013 |work=WSJ |access-date=September 15, 2014}}</ref><ref>{{cite web |url=http://www.exponentialtimes.net/videos/ray-kurzweil-sens-3 |title=Ray Kurzweil At SENS 3 {{!}} Video |publisher=Exponential Times |date=August 25, 2011 |access-date=2013-07-28}}</ref>

===Futurism and transhumanism===
Kurzweil's standing as a [[Futures studies|futurist]] and [[Transhumanism|transhumanist]] has led to his involvement in several singularity-themed organizations. In 2004, he joined the advisory board of the [[Machine Intelligence Research Institute]].<ref>{{cite web |url=http://www.singinst.org/aboutus/board |title=Board – Singularity Institute for Artificial Intelligence |work=Singularity University |access-date=September 15, 2014 |archive-url=https://web.archive.org/web/20100421072724/http://singinst.org/aboutus/board |archive-date=April 21, 2010 |url-status=dead }}</ref> In 2005, he joined the scientific advisory board of the [[Lifeboat Foundation]].<ref>{{cite web |url=http://lifeboat.com/ex/boards#robotics |title=Lifeboat Foundation Advisory Boards |access-date=September 15, 2014}}</ref> On May 13, 2006, Kurzweil was the first speaker at the [[Singularity Summit]] at [[Stanford University]] in [[Palo Alto, California]].<ref>{{cite web |url=http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2006/05/12/BUG9IIMG1V197.DTL&type=printable |title=Printable version: Smarter than thou? / Stanford conference ponders a brave new world with machines more powerful than their creators |date=May 12, 2006 |work=SFGate |access-date=September 15, 2014}}</ref> In 2013, he was the keynote speaker at the Research, Innovation, Start-up and Employment (RISE) international conference in [[Seoul]]. In 2009, Kurzweil, Google, and the [[NASA Ames Research Center]] announced the creation of the [[Singularity University]] training center for corporate executives and government officials.
The university's self-described mission is to "assemble, educate and inspire a cadre of leaders who strive to understand and facilitate the development of exponentially advancing technologies and apply, focus and guide these tools to address humanity's grand challenges". Using [[Vernor Vinge]]'s Singularity concept as a foundation, the university offered its first nine-week graduate program to 40 students in 2009. Kurzweil views the human body as a system of thousands of "programs" and believes that understanding all their functions could be the key to building truly [[sentient AI]].<ref>{{Cite web |date=July 5, 2022 |title=Sentient AI? Convincing you it's human is just part of LaMDA's job |url=https://www.healthcareitnews.com/blog/sentient-ai-convincing-you-it-s-human-just-part-lamda-s-job |access-date=2022-07-07 |website=Healthcare IT News |language=en}}</ref><ref>{{Cite web |last=Kurzweil |first=Ray |date=February 16, 2003 |title=Human Body Version 2.0 |url=https://www.kurzweilai.net/human-body-version-20 |access-date=2022-07-07 |website=Kurzweilai.net |language=en-US}}</ref>

=== Universal basic income ===
Kurzweil advocates [[universal basic income]] (UBI), arguing that progress in science and technology will lead to an [[Post-scarcity|abundance of virtually free resources]], enabling every citizen to live [[Post-work society|without the need to work]]: "We are clearly headed toward a situation where everyone can live very well".<ref>{{Cite web |last=Kurzweil |first=Ray |date=1 May 2018 |title=Supporting universal basic income is a step in world progress |url=https://www.kurzweilai.net/letter-from-ray-supporting-universal-basic-income-as-step-in-world-progress |url-status=live |archive-url=https://web.archive.org/web/20201025023302/https://www.kurzweilai.net/letter-from-ray-supporting-universal-basic-income-as-step-in-world-progress |archive-date=25 October 2020 |access-date=12 September 2020 |website=kurzweilai.net |language=en-US}}</ref> In his view, the major hurdle to introducing UBI is not its feasibility but political will, which is slowly emerging.<ref name=":0">{{Cite web |last=Schwartz |first=Ariel |date=2018-04-14 |title=Google futurist and director of engineering: Basic income will spread worldwide by the 2030s |url=https://www.businessinsider.com/basic-income-worldwide-by-2030s-ray-kurzweil-2018-4 |access-date=2024-04-05 |website=Business Insider |language=en-US}}</ref> In a 2018 [[TED Talk]], he predicted that "in the early 2030s, we'll have universal basic income in the developed world, and worldwide by the end of the 2030s. You'll be able to live very well on that. The primary concern will be [[Meaning of life|meaning and purpose]]."<ref name=":0" />

=== Nuclear weapons ===
In a September 17, 2022, interview, Kurzweil discussed his worries about technology being used for violence. Asked about nuclear armageddon and the [[Russo-Ukrainian War]], he said: "I don't think [nuclear war] is going to happen despite the terrors of that war. It is a possibility but it's unlikely, even with the tensions we've had with [[Zaporizhzhia Nuclear Power Plant|the nuclear power plant]] that's been taken over. It's very tense but I don't actually see a lot of people worrying that's going to happen. I think we'll avoid that. We had two nuclear bombs go off in [1945], so now we're 77 years later... we've never had another one go off through anger... there are other dangers besides nuclear weapons."<ref>[https://www.youtube.com/watch?v=ykY69lSpDdo&t=2650s Ray Kurzweil: Singularity, Superintelligence, and Immortality – Lex Fridman Podcast 321]. September 17, 2022. 53:51 to 55:54. [[Lex Fridman]].</ref>