==Plausibility==

Prominent technologists and academics dispute the plausibility of a technological singularity, including [[Paul Allen]],<ref name="Allen2011"/> [[Jeff Hawkins]],<ref name="ieee-lumi"/> [[John Henry Holland|John Holland]], [[Jaron Lanier]], [[Steven Pinker]],<ref name="ieee-lumi"/> [[Theodore Modis]],<ref name="modis2012"/> and [[Gordon Moore]],<ref name="ieee-lumi"/> whose [[Moore's law|law]] is often cited in support of the concept.<ref name="ieee-whos-who"/>

Most proposed methods for creating superhuman or [[transhuman]] minds fall into one of two categories: intelligence amplification of human brains and artificial intelligence. The many speculated ways to augment human intelligence include [[bioengineering]], [[genetic engineering]], [[nootropic]] drugs, AI assistants, direct [[brain–computer interface]]s and [[mind uploading]]. These multiple possible paths to an intelligence explosion, all of which will presumably be pursued, make a singularity more likely.<ref name="singinst.org">{{cite web |url=http://singinst.org/overview/whatisthesingularity |title=What is the Singularity? {{!}} Singularity Institute for Artificial Intelligence |publisher=Singinst.org |access-date=2011-09-09 |url-status=dead |archive-url=https://web.archive.org/web/20110908014050/http://singinst.org/overview/whatisthesingularity/ |archive-date=2011-09-08}}</ref>

[[Robin Hanson]] expressed skepticism of human intelligence augmentation, writing that once the "low-hanging fruit" of easy methods for increasing human intelligence has been exhausted, further improvements will become increasingly difficult.<ref name="hanson">{{cite web |url=https://mason.gmu.edu/~rhanson/vc.html#hanson |title=Some Skepticism |date=1998 |first=Robin |last=Hanson |author-link=Robin Hanson |access-date=April 8, 2020 |archive-date=February 15, 2021 |archive-url=https://web.archive.org/web/20210215095129/https://mason.gmu.edu/~rhanson/vc.html#hanson |url-status=live}}</ref> Despite the many speculated ways of amplifying human intelligence, non-human artificial intelligence (specifically seed AI) is the most popular option among the hypotheses that would advance the singularity.{{citation needed|date=July 2012}}

The possibility of an intelligence explosion depends on three factors.<ref name="david_chalmers_singularity_lecture_resources_available">David Chalmers John Locke Lecture, 10 May 2009, Exam Schools, Oxford, [http://www.fhi.ox.ac.uk/news/2010/david_chalmers_singularity_lecture_resources_available Presenting a philosophical analysis of the possibility of a technological singularity or "intelligence explosion" resulting from recursively self-improving AI] {{webarchive|url=https://web.archive.org/web/20130115205558/http://www.fhi.ox.ac.uk/news/2010/david_chalmers_singularity_lecture_resources_available |date=2013-01-15}}.</ref> The first, accelerating factor is that each improvement makes new intelligence enhancements possible. However, as intelligences become more advanced, each further advance becomes more and more complicated, possibly outweighing the advantage of increased intelligence. Each improvement should generate at least one more improvement, on average, for movement towards singularity to continue. Finally, the laws of physics may eventually prevent further improvement.
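The criterion that each improvement must yield at least one further improvement on average can be read as the survival condition of a [[branching process]]: a cascade whose mean number of "offspring" improvements is below 1 almost surely dies out, while one above 1 can grow without bound. A minimal simulation sketch of this reading follows; the model and its parameters are illustrative assumptions, not drawn from the cited sources.

<syntaxhighlight lang="python">
import math
import random

def poisson(lam: float) -> int:
    """Sample a Poisson random variable via Knuth's algorithm."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= threshold:
            return k - 1

def improvement_cascade(k_mean: float, max_generations: int = 50) -> list[int]:
    """Toy branching-process model: each improvement enables, on average,
    k_mean further improvements. Returns improvements per generation."""
    counts = [1]  # a single seed improvement
    for _ in range(max_generations):
        children = sum(poisson(k_mean) for _ in range(counts[-1]))
        counts.append(children)
        if children == 0:  # cascade dies out: no runaway self-improvement
            break
    return counts

# Subcritical (mean < 1): the cascade almost surely fizzles out.
print(improvement_cascade(0.8))
# Supercritical (mean > 1): the cascade can grow without bound.
print(improvement_cascade(1.2))
</syntaxhighlight>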
There are two logically independent, but mutually reinforcing, causes of intelligence improvements: increases in the speed of computation, and improvements to the [[algorithm]]s used.<ref name="chalmers2010"/> The former is predicted by [[Moore's law|Moore's Law]] and the forecasted improvements in hardware,<ref name="itrs">{{cite web |url=http://www.itrs.net/Links/2007ITRS/ExecSum2007.pdf |title=ITRS |access-date=2011-09-09 |url-status=dead |archive-url=https://web.archive.org/web/20110929173755/http://www.itrs.net/Links/2007ITRS/ExecSum2007.pdf |archive-date=2011-09-29}}</ref> and is broadly similar to previous technological advances. But Shulman and Sandberg<ref>{{cite web |last1=Shulman |first1=Carl |last2=Sandberg |first2=Anders |title=Implications of a Software-Limited Singularity |url=https://intelligence.org/files/SoftwareLimited.pdf |year=2010 |website=[[Machine Intelligence Research Institute]]}}</ref> argue that software will present more complex challenges than simply operating on hardware capable of running at human intelligence levels or beyond (a toy numerical contrast of the two growth modes is sketched below).

A 2017 email survey of authors with publications at the 2015 [[Conference on Neural Information Processing Systems|NeurIPS]] and [[International Conference on Machine Learning|ICML]] machine learning conferences asked about the chance that "the intelligence explosion argument is broadly correct". Of the respondents, 12% said it was "quite likely", 17% said it was "likely", 21% said it was "about even", 24% said it was "unlikely" and 26% said it was "quite unlikely".<ref name="exceed2017">{{cite arXiv |last1=Grace |first1=Katja |last2=Salvatier |first2=John |last3=Dafoe |first3=Allan |last4=Zhang |first4=Baobao |last5=Evans |first5=Owain |title=When Will AI Exceed Human Performance? Evidence from AI Experts |eprint=1705.08807 |date=24 May 2017 |class=cs.AI}}</ref>
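The hardware-versus-software distinction can be made concrete with a toy comparison: even if raw compute compounds at a Moore's-law-style rate, effective capability tracks the slower curve when software is the binding constraint. The following sketch uses assumed, illustrative growth rates; none of the numbers come from the cited papers or survey.

<syntaxhighlight lang="python">
# Toy contrast of hardware-limited vs. software-limited capability growth.
# Assumed parameters (illustrative only): hardware doubles every 2 years,
# while algorithmic progress compounds at just 15% per year.

def hardware_speed(years: float, doubling_period: float = 2.0) -> float:
    """Raw compute available, normalized to 1.0 at year 0."""
    return 2 ** (years / doubling_period)

def software_limited(years: float, software_rate: float = 1.15) -> float:
    """Capability when algorithmic progress, not compute, is the bottleneck."""
    return min(hardware_speed(years), software_rate ** years)

for y in (4, 10, 20):
    print(f"year {y:2d}: hardware {hardware_speed(y):8.1f}x,"
          f" software-limited {software_limited(y):6.1f}x")
# After 20 years, hardware has grown ~1024x but capability only ~16x,
# illustrating why a software-limited path would be much slower.
</syntaxhighlight>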