== Real-world effects ==

=== Social media ===
In [[social media]], confirmation bias is amplified by the use of [[filter bubble]]s, or "algorithmic editing", which displays to individuals only information they are likely to agree with, while excluding opposing views.<ref name=":0">{{Citation|url=https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles|title=Ted talk: Beware online "filter bubbles"|last=Pariser|first=Eli|date=2 May 2011|website=TED: Ideas Worth Spreading|access-date=1 October 2017|archive-date=22 September 2017|archive-url=https://web.archive.org/web/20170922201521/https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles|url-status=live}}</ref> Some have argued that confirmation bias is the reason why society can never escape from filter bubbles, because individuals are psychologically hardwired to seek information that agrees with their preexisting values and beliefs.<ref>{{Citation|url=https://www.newstatesman.com/science-tech/social-media/2016/11/forget-fake-news-facebook-real-filter-bubble-you|title=Forget fake news on Facebook – the real filter bubble is you|last=Self|first=Will|date=28 November 2016|website=New Statesman|access-date=24 October 2017|archive-date=11 November 2017|archive-url=https://web.archive.org/web/20171111042405/https://www.newstatesman.com/science-tech/social-media/2016/11/forget-fake-news-facebook-real-filter-bubble-you|url-status=live}}</ref> Others have further argued that the combination of the two is degrading [[democracy]]—claiming that this "algorithmic editing" removes diverse viewpoints and information—and that unless filter bubble algorithms are removed, voters will be unable to make fully informed political decisions.<ref>{{Citation|url=https://www.wired.com/2015/05/did-facebooks-big-study-kill-my-filter-bubble-thesis/|title=Did Facebook's big study kill my filter bubble thesis?|last=Pariser|first=Eli|date=7 May 2015|magazine=Wired|access-date=24 October 2017|archive-date=11 November 2017|archive-url=https://web.archive.org/web/20171111042342/https://www.wired.com/2015/05/did-facebooks-big-study-kill-my-filter-bubble-thesis/|url-status=live}}</ref><ref name=":0" />

The rise of social media has contributed greatly to the rapid spread of [[fake news]], that is, false and misleading information that is presented as credible news from a seemingly reliable source. Confirmation bias (selecting or reinterpreting evidence to support one's beliefs) is one of three main hurdles cited to explain why critical thinking goes astray in these circumstances. The other two are shortcut heuristics (when overwhelmed or short of time, people rely on simple rules such as group consensus or trusting an expert or role model) and social goals (social motivation or peer pressure can interfere with objective analysis of the facts at hand).<ref>{{Citation |last1=Kenrick |first1=Douglas T. |first2=Adam B. |last2=Cohen |first3=Steven L. |last3=Neuberg |first4=Robert B. |last4=Cialdini |title=The science of anti-science thinking |journal=Scientific American |year=2020 |volume=29 |issue=4, Fall, Special Issue |pages=84–89}}</ref>

In combating the spread of fake news, social media sites have considered turning toward "digital nudging".<ref>{{Cite journal|last1=Weinmann|first1=Markus|last2=Schneider|first2=Christoph|last3=vom Brocke|first3=Jan|date=2015|title=Digital nudging|journal=SSRN|location=Rochester, NY|ssrn=2708250|doi=10.2139/ssrn.2708250|s2cid=219380211|mode=cs2}}</ref> Nudging currently takes two forms: nudging of information and nudging of presentation. Nudging of information entails social media sites providing a disclaimer or label questioning or warning users about the validity of a source, while nudging of presentation entails exposing users to new information that they may not have sought out but that could introduce them to viewpoints countering their own confirmation biases.<ref>{{Cite journal|last1=Thornhill|first1=Calum|last2=Meeus|first2=Quentin|last3=Peperkamp|first3=Jeroen|last4=Berendt|first4=Bettina|date=2019|title=A digital nudge to counter confirmation bias|journal=Frontiers in Big Data|volume=2|page=11|doi=10.3389/fdata.2019.00011|pmid=33693334|pmc=7931917|issn=2624-909X|doi-access=free|mode=cs2}}</ref>
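Both mechanisms can be made concrete with a minimal sketch. The following Python fragment is purely illustrative: the <code>Item</code> type, the stance scores, and both functions are assumptions invented for this example, not any platform's actual ranking system.

<syntaxhighlight lang="python">
from dataclasses import dataclass


@dataclass
class Item:
    headline: str
    stance: float  # -1.0 .. +1.0; an invented viewpoint label


def filter_bubble(feed: list[Item], user_stance: float, k: int) -> list[Item]:
    """'Algorithmic editing': keep the k items closest to the user's own
    stance, which systematically excludes opposing views."""
    return sorted(feed, key=lambda item: abs(item.stance - user_stance))[:k]


def presentation_nudge(feed: list[Item], user_stance: float, k: int) -> list[Item]:
    """Nudging of presentation: serve the same agreeable feed, but give
    the last slot to the most opposing item the user would otherwise
    never have been shown."""
    ranked = filter_bubble(feed, user_stance, k)
    dissent = max(feed, key=lambda item: abs(item.stance - user_stance))
    if dissent not in ranked:
        ranked[-1] = dissent
    return ranked
</syntaxhighlight>

Nudging of information, by contrast, would leave the ranking untouched and attach a warning label to items from sources of questionable validity.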
=== Science and scientific research ===
{{See also|Planck's principle|Escalation of commitment|Replication crisis}}
A distinguishing feature of [[science|scientific thinking]] is the search for confirming or supportive evidence ([[inductive reasoning]]) as well as falsifying evidence ([[deductive reasoning]]).<ref>{{Citation|journal=Cognitive Therapy and Research|volume=1|issue=3|pages=229–238|title=Psychology of the scientist: An analysis of problem-solving bias|first1=Michael J.|last1=Mahoney|first2=B.G.|last2=DeMonbreun|year=1977|doi=10.1007/BF01186796|s2cid=9703186}}</ref><ref>{{Citation|title=Norms and counter-norms in a select group of the Apollo moon scientists: A case study of the ambivalence of scientists|first=I. I.|last=Mitroff|journal=American Sociological Review|year=1974|volume=39|issue=4|pages=579–595|jstor=2094423|doi=10.2307/2094423}}</ref> Many times in the [[history of science]], scientists have resisted new discoveries by selectively interpreting or ignoring unfavorable data.<ref name="nickerson"/>{{rp|192–194}} Several studies have shown that scientists rate studies reporting findings consistent with their prior beliefs more favorably than studies reporting findings inconsistent with those beliefs.<ref name="Hergovich 2010" /><ref name="Koehler 1993">{{Harvnb|Koehler|1993}}</ref><ref name="Mahoney 1977">{{Harvnb|Mahoney|1977}}</ref> However, assuming that the research question is relevant, the experimental design adequate, and the data clearly and comprehensively described, the empirical data obtained should be important to the scientific community and should not be viewed prejudicially, regardless of whether they conform to current theoretical predictions.<ref name="Mahoney 1977"/> In practice, researchers may misunderstand, misinterpret, or simply not read studies that contradict their preconceptions, or wrongly cite them as if they actually supported their claims.<ref name="Letrud & Hernes 2019">{{cite journal|last1=Letrud|first1=Kåre|last2=Hernes|first2=Sigbjørn|title=Affirmative citation bias in scientific myth debunking: A three-in-one case study|journal=PLOS ONE|volume=14|issue=9|year=2019|pages=e0222213|doi=10.1371/journal.pone.0222213|pmid=31498834|pmc=6733478|bibcode=2019PLoSO..1422213L|doi-access=free|mode=cs2}}</ref> Further, confirmation biases can sustain scientific theories or research programs in the face of inadequate or even contradictory evidence.<ref name="sutherland">{{Citation |last=Sutherland |first=Stuart |title=Irrationality |edition=2nd |publisher=Pinter and Martin |location=London |year=2007 |pages=95–103 |isbn=978-1-905177-07-3 |oclc=72151566}}</ref><ref>{{cite web |url=http://nautil.us/issue/24/error/the-trouble-with-scientists |title=The trouble with scientists: How one psychologist is tackling human biases in science |last=Ball |first=Philip |date=14 May 2015 |website=Nautilus |access-date=6 October 2019 |mode=cs2 |archive-date=7 October 2019 |archive-url=https://web.archive.org/web/20191007124310/http://nautil.us/issue/24/error/the-trouble-with-scientists |url-status=dead}}</ref> The discipline of [[parapsychology]] is often cited as an example.<ref>{{Citation |last=Sternberg |first=Robert J. |author-link=Robert Sternberg |editor1-first=Robert J. |editor1-last=Sternberg |editor2-first=Henry L. |editor2-last=Roediger III |editor3-first=Diane F. |editor3-last=Halpern |title=Critical thinking in psychology |year=2007 |publisher=Cambridge University Press |isbn=978-0-521-60834-3 |oclc=69423179 |page=292 |chapter=Critical thinking in psychology: It really is critical |quote=Some of the worst examples of confirmation bias are in research on parapsychology ... Arguably, there is a whole field here with no powerful confirming data at all. But people want to believe, and so they find ways to believe.}}</ref>

An experimenter's confirmation bias can potentially affect which data are reported. Data that conflict with the experimenter's expectations may be more readily discarded as unreliable, producing the so-called [[publication bias|file drawer effect]]. To combat this tendency, scientific training teaches ways to prevent bias.<ref name="shadish">{{Citation |last=Shadish |first=William R. |title=Critical Thinking in Psychology |editor1-first=Robert J. |editor1-last=Sternberg |editor2-first=Henry L. |editor2-last=Roediger III |editor3-first=Diane F. |editor3-last=Halpern |publisher=Cambridge University Press |year=2007 |page=49 |chapter=Critical thinking in quasi-experimentation |isbn=978-0-521-60834-3}}</ref> For example, [[Design of experiments|experimental design]] of [[randomized controlled trial]]s (coupled with their [[systematic review]]) aims to minimize sources of bias.<ref name="shadish" /><ref>{{Citation |doi=10.1136/bmj.323.7303.42 |last1=Jüni |first1=P. |last2=Altman |first2=D.G. |last3=Egger |first3=M. |title=Systematic reviews in health care: Assessing the quality of controlled clinical trials |journal=BMJ (Clinical Research Ed.) |volume=323 |issue=7303 |pages=42–46 |year=2001 |pmid=11440947 |pmc=1120670}}</ref> The social process of [[peer review]] aims to mitigate the effect of individual scientists' biases, even though the peer review process itself may be susceptible to such biases.<ref>{{Citation|last1=Lee|first1=C.J.|last2=Sugimoto|first2=C.R.|author2-link=Cassidy Sugimoto|last3=Zhang|first3=G.|last4=Cronin|first4=B.|year=2013|title=Bias in peer review|journal=Journal of the Association for Information Science and Technology|volume=64|pages=2–17|doi=10.1002/asi.22784}}</ref><ref>{{Citation |last=Shermer |first=Michael |author-link=Michael Shermer |date=July 2006 |title=The political brain: A recent brain-imaging study shows that our political predilections are a product of unconscious confirmation bias |journal=[[Scientific American]] |volume=295 |issue=1 |pages=36 |issn=0036-8733 |bibcode=2006SciAm.295a..36S |doi=10.1038/scientificamerican0706-36 |pmid=16830675}}</ref><ref name="Mahoney 1977" /><ref>{{Citation |last1=Emerson |first1=G.B. |last2=Warme |first2=W.J. |last3=Wolf |first3=F.M. |last4=Heckman |first4=J.D. |last5=Brand |first5=R.A. |last6=Leopold |first6=S.S. |doi=10.1001/archinternmed.2010.406 |title=Testing for the presence of positive-outcome bias in peer review: A randomized controlled trial |journal=[[Archives of Internal Medicine]] |volume=170 |issue=21 |pages=1934–1939 |year=2010 |pmid=21098355}}</ref><ref name="Bartlett 2011">Bartlett, Steven James, "The psychology of abuse in publishing: Peer review and editorial bias," Chap. 7, pp. 147–177, in [[Steven James Bartlett]], ''Normality does not equal mental health: The need to look elsewhere for standards of good psychological health''. Santa Barbara, CA: Praeger, 2011.</ref> Confirmation bias may thus be especially harmful to objective evaluations of nonconforming results, since biased individuals may regard opposing evidence as weak in principle and give little serious thought to revising their beliefs.<ref name="Koehler 1993" /> Scientific innovators often meet with resistance from the scientific community, and research presenting controversial results frequently receives harsh peer review.<ref>{{Citation |doi=10.1001/jama.263.10.1438 |last=Horrobin |first=David F. |author-link=David Horrobin |year=1990 |title=The philosophical basis of peer review and the suppression of innovation |journal=[[Journal of the American Medical Association]] |pmid=2304222 |volume=263 |issue=10 |pages=1438–1441}}</ref>

=== Finance ===
{{See also|Escalation of commitment|Sunk cost}}
Confirmation bias can lead investors to be overconfident, ignoring evidence that their strategies will lose money.<ref name=WSJ /><ref>{{Citation |title=Behavioral finance and wealth management: how to build optimal portfolios that account for investor biases |first=Michael M. |last=Pompian |publisher=[[John Wiley and Sons]] |isbn=978-0-471-74517-4 |oclc=61864118 |year=2006 |pages=187–190}}</ref> In studies of [[election stock market|political stock markets]], investors made more profit when they resisted bias. For example, participants who interpreted a candidate's debate performance in a neutral rather than partisan way were more likely to profit.<ref>{{Citation |last=Hilton |first=Denis J. |journal=Journal of Behavioral Finance |year=2001 |title=The psychology of financial decision-making: Applications to trading, dealing, and investment analysis |volume=2 |issue=1 |doi=10.1207/S15327760JPFM0201_4 |issn=1542-7579 |pages=37–39 |s2cid=153379653}}</ref> To combat the effect of confirmation bias, investors can try to adopt a contrary viewpoint "for the sake of argument".<ref>{{Citation |first1=David |last1=Krueger |first2=John David |last2=Mann |title=The secret language of money: How to make smarter financial decisions and live a richer life |isbn=978-0-07-162339-1 |oclc=277205993 |year=2009 |publisher=[[McGraw Hill Professional]] |pages=112–113}}</ref> In one technique, they imagine that their investments have collapsed and ask themselves why this might happen.<ref name=WSJ />

=== Medicine and health ===
Cognitive biases are important variables in clinical decision-making by medical general practitioners (GPs) and medical specialists. Two important ones are confirmation bias and the overlapping availability bias. A GP may make a diagnosis early on during an examination and then seek confirming evidence rather than falsifying evidence. This cognitive error is partly caused by the availability of evidence about the supposed disorder being diagnosed. For example, the client may have mentioned the disorder, or the GP may have recently read a much-discussed paper about it.
The basis of this cognitive shortcut or heuristic (termed anchoring) is that the doctor does not consider multiple possibilities based on evidence, but prematurely latches on (or anchors to) a single cause.<ref>{{Citation |last=Groopman |author-link=Jerome Groopman |first=Jerome |title=How doctors think |publisher=Melbourne: Scribe Publications |year=2007 |pages=64–66 |isbn=978-1-921215-69-8}}</ref> In emergency medicine, because of time pressure, there is a high density of decision-making, and shortcuts are frequently applied. The potential failure rate of these cognitive decisions needs to be managed by education about the 30 or more cognitive biases that can occur, so as to set in place proper debiasing strategies.<ref name="croskerry">{{Citation |last1=Croskerry |first1=Pat |title=Achieving quality in clinical decision making: Cognitive strategies and detection of bias |journal=Academic Emergency Medicine |date=2002 |volume=9 |issue=11 |pages=1184–1204 |doi=10.1197/aemj.9.11.1184 |pmid=12414468}}.</ref> Confirmation bias may also cause doctors to perform unnecessary medical procedures due to pressure from adamant patients.<ref name="hospitalbias">{{Citation|last1=Pang|first1=Dominic|last2=Bleetman|first2=Anthony|last3=Bleetman|first3=David|last4=Wynne|first4=Max|title=The foreign body that never was: the effects of confirmation bias|journal=British Journal of Hospital Medicine|date=2 June 2017|volume=78|issue=6|pages=350–351|doi=10.12968/hmed.2017.78.6.350|pmid=28614014}}</ref>

Mental disorders may be prone to misdiagnosis because diagnosis rests on observations and self-reporting rather than objective testing. Confirmation bias may play a role when practitioners stick with an early diagnosis.<ref>{{Cite journal|doi=10.1136/bmjqs-2023-016996|issn=2044-5423|volume=33|issue=10|pages=663–672|last1=Bradford|first1=Andrea|last2=Meyer|first2=Ashley N. D.|last3=Khan|first3=Sundas|last4=Giardina|first4=Traber D.|last5=Singh|first5=Hardeep|title=Diagnostic Error in Mental Health: A Review|journal=BMJ Quality & Safety|access-date=2025-04-13|date=2024-10-01|url=https://qualitysafety.bmj.com/content/33/10/663|pmid=38575311|pmc=11503128}}</ref>

Raymond Nickerson, a psychologist, blames confirmation bias for the ineffective medical procedures that were used for centuries before the [[History of medicine|arrival of scientific medicine]].<ref name="nickerson"/>{{rp|192}} If a patient recovered, medical authorities counted the treatment as successful, rather than looking for alternative explanations, such as that the disease had run its natural course. Biased assimilation is a factor in the modern appeal of [[alternative medicine]], whose proponents are swayed by positive [[anecdotal evidence]] but treat [[scientific evidence]] hyper-critically.<ref>{{Harvnb|Goldacre|2008|p=233}}</ref><ref>{{Citation |last1=Singh |first1=Simon |author-link=Simon Singh |first2=Edzard |last2=Ernst |author-link2=Edzard Ernst |title=Trick or treatment?: Alternative medicine on trial |publisher=Bantam |location=London |year=2008 |isbn=978-0-593-06129-9 |pages=287–288}}</ref><ref>{{Citation |last=Atwood |first=Kimball |year=2004 |title=Naturopathy, pseudoscience, and medicine: Myths and fallacies vs truth |journal=[[Medscape General Medicine]] |volume=6 |issue=1 |page=33 |pmc=1140750 |pmid=15208545}}</ref>

[[Cognitive therapy]] was developed by [[Aaron T. Beck]] in the early 1960s and has become a popular approach.<ref>{{Citation |first1=Michael |last1=Neenan |first2=Windy |last2=Dryden |year=2004 |title=Cognitive therapy: 100 key points and techniques |publisher=Psychology Press |isbn=978-1-58391-858-6 |oclc=474568621 |page=ix}}</ref> According to Beck, biased information processing is a factor in [[depression (mood)|depression]].<ref>{{Citation |first1=Ivy-Marie |last1=Blackburn |first2=Kate M. |last2=Davidson |year=1995 |title=Cognitive therapy for depression & anxiety: a practitioner's guide |publisher=Wiley-Blackwell |isbn=978-0-632-03986-9 |oclc=32699443 |edition=2 |page=19}}</ref> His approach teaches people to treat evidence impartially, rather than selectively reinforcing negative outlooks.<ref name="baron195" /> [[Phobias]] and [[hypochondria]] have also been shown to involve confirmation bias for threatening information.<ref>{{Citation |first1=Allison G. |last1=Harvey |first2=Edward |last2=Watkins |first3=Warren |last3=Mansell |year=2004 |title=Cognitive behavioural processes across psychological disorders: a transdiagnostic approach to research and treatment |publisher=Oxford University Press |isbn=978-0-19-852888-3 |oclc=602015097 |pages=172–173, 176}}</ref>

=== Politics, law and policing ===
[[File:Witness impeachment.jpg|thumb|right|alt=A woman and a man reading a document in a courtroom|[[Mock trial]]s allow researchers to examine confirmation biases in a realistic setting.]]
Nickerson argues that reasoning in judicial and political contexts is sometimes subconsciously biased, favoring conclusions that judges, juries or governments have already committed to.<ref name="nickerson"/>{{rp|191–193}} Since the evidence in a jury trial can be complex, and jurors often reach decisions about the verdict early on, it is reasonable to expect an attitude polarization effect. The prediction that jurors will become more extreme in their views as they see more evidence has been borne out in experiments with [[mock trial]]s.<ref>{{Citation |last1=Myers |first1=D.G. |first2=H. |last2=Lamm |year=1976 |title=The group polarization phenomenon |journal=Psychological Bulletin |volume=83 |pages=602–627 |doi=10.1037/0033-2909.83.4.602 |issue=4}} via {{Harvnb|Nickerson|1998|pp=193–194}}</ref><ref name="halpern">{{Citation |last=Halpern |first=Diane F. |title=Critical thinking across the curriculum: A brief edition of thought and knowledge |publisher=Lawrence Erlbaum Associates |year=1987 |page=194 |isbn=978-0-8058-2731-6 |oclc=37180929}}</ref> Both [[Inquisitorial system|inquisitorial]] and [[Adversarial system|adversarial]] criminal justice systems are affected by confirmation bias.<ref>{{Citation |last=Roach |first=Kent |ssrn=1619124 |title=Wrongful convictions: Adversarial and inquisitorial themes |journal=North Carolina Journal of International Law and Commercial Regulation |volume=35 |year=2010 |pages=387–446 |quote=Both adversarial and inquisitorial systems seem subject to the dangers of tunnel vision or confirmation bias.}}</ref>

Confirmation bias can be a factor in creating or extending conflicts, from emotionally charged debates to wars: by interpreting the evidence in their favor, each opposing party can become overconfident that it is in the stronger position.<ref name="baron191">{{Harvnb|Baron|2000|pp=191, 195}}</ref> On the other hand, confirmation bias can result in people ignoring or misinterpreting the signs of an imminent or incipient conflict.
For example, psychologists [[Stuart Sutherland]] and Thomas Kida have each argued that [[U.S. Navy]] Admiral [[Husband E. Kimmel]] showed confirmation bias when playing down the first signs of the Japanese [[attack on Pearl Harbor]].<ref name="sutherland" /><ref>{{Harvnb|Kida|2006|p=155}}</ref>

A two-decade study of political pundits by [[Philip E. Tetlock]] found that, on the whole, their predictions were not much better than chance. Tetlock divided experts into "foxes" who maintained multiple hypotheses, and "hedgehogs" who were more dogmatic. In general, the hedgehogs were much less accurate. Tetlock blamed their failure on confirmation bias, and specifically on their inability to make use of new information that contradicted their existing theories.<ref>{{Citation |last=Tetlock |first=Philip E. |title=Expert political judgment: How good is it? How can we know? |publisher=Princeton University Press |location=Princeton, NJ |year=2005 |isbn=978-0-691-12302-8 |oclc=56825108 |pages=125–128}}</ref>

In police investigations, a detective may identify a suspect early in an investigation but then sometimes largely seek supporting or confirming evidence, ignoring or downplaying falsifying evidence.<ref>{{Citation|last=O'Brien|first=B.|title=Prime suspect: An examination of factors that aggravate and counteract confirmation bias in criminal investigations|journal=Psychology, Public Policy, and Law|date=2009|volume=15|issue=4|pages=315–334|doi=10.1037/a0017881}}</ref>

=== Social psychology ===
Social psychologists have identified two tendencies in the way people seek or interpret information about themselves. ''[[Self-verification]]'' is the drive to reinforce the existing [[self-image]], and ''[[self-enhancement]]'' is the drive to seek positive feedback. Both are served by confirmation biases.<ref name="reconciling">{{Citation |last1=Swann |first1=William B. |first2=Brett W. |last2=Pelham |first3=Douglas S. |last3=Krull |title=Agreeable fancy or disagreeable truth? Reconciling self-enhancement and self-verification |journal=Journal of Personality and Social Psychology |year=1989 |volume=57 |issue=5 |pages=782–791 |issn=0022-3514 |pmid=2810025 |doi=10.1037/0022-3514.57.5.782}}</ref> In experiments where people are given feedback that conflicts with their self-image, they are less likely to attend to it or remember it than when given self-verifying feedback.<ref name="swannread_jesp" /><ref>{{Citation |last=Story |first=Amber L. |title=Self-esteem and memory for favorable and unfavorable personality feedback |journal=Personality and Social Psychology Bulletin |year=1998 |volume=24 |issue=1 |pages=51–64 |doi=10.1177/0146167298241004 |s2cid=144945319 |issn=1552-7433}}</ref><ref>{{Citation |last1=White |first1=Michael J. |first2=Daniel R. |last2=Brockett |first3=Belinda G. |last3=Overstreet |title=Confirmatory bias in evaluating personality test information: Am I really that kind of person? |journal=[[Journal of Counseling Psychology]] |year=1993 |volume=40 |issue=1 |pages=120–126 |doi=10.1037/0022-0167.40.1.120 |issn=0022-0167}}</ref> They reduce the impact of such information by interpreting it as unreliable.<ref name="swannread_jesp">{{Citation |last1=Swann |first1=William B. |first2=Stephen J. |last2=Read |title=Self-verification processes: How we sustain our self-conceptions |journal=[[Journal of Experimental Social Psychology]] |year=1981 |volume=17 |issue=4 |pages=351–372 |issn=0022-1031 |doi=10.1016/0022-1031(81)90043-3}}</ref><ref name="swannread_jpsp">{{Citation |last1=Swann |first1=William B. |first2=Stephen J. |last2=Read |title=Acquiring self-knowledge: The search for feedback that fits |journal=Journal of Personality and Social Psychology |year=1981 |volume=41 |issue=6 |pages=1119–1128 |issn=0022-3514 |doi=10.1037/0022-3514.41.6.1119 |citeseerx=10.1.1.537.2324}}</ref><ref>{{Citation |last1=Shrauger |first1=J. Sidney |first2=Adrian K. |last2=Lund |title=Self-evaluation and reactions to evaluations from others |journal=[[Journal of Personality]] |year=1975 |volume=43 |issue=1 |pmid=1142062 |pages=94–108 |doi=10.1111/j.1467-6494.1975.tb00574.x}}</ref> Similar experiments have found a preference for positive feedback, and the people who give it, over negative feedback.<ref name="reconciling"/>

=== Mass delusions ===
Confirmation bias can play a key role in the propagation of [[mass delusion]]s. [[Witch trial]]s are frequently cited as an example.<ref>{{cite thesis |last=Lidén |first=Moa |date=2018 |title=Confirmation bias in criminal cases |chapter=3.2.4.1 |publisher=Department of Law, Uppsala University |chapter-url=http://www.diva-portal.org/smash/get/diva2:1237959/FULLTEXT01.pdf |access-date=20 February 2020 |archive-date=20 February 2020 |archive-url=https://web.archive.org/web/20200220083720/http://www.diva-portal.org/smash/get/diva2:1237959/FULLTEXT01.pdf |url-status=live}}</ref><ref>{{cite book |last=Trevor-Roper |first=H.R. |date=1969 |title=The European witch-craze of the sixteenth and seventeenth centuries and other essays |publisher=London: HarperCollins}} {{ISBN?}}</ref>

Another example is the [[Seattle windshield pitting epidemic]], in which windshields appeared to be damaged by an unknown cause. As news of the apparent wave of damage spread, more and more people checked their windshields, discovered that their windshields too had been damaged, and thus confirmed belief in the supposed epidemic. In fact, the windshields had been damaged all along, but the damage went unnoticed until people checked their windshields as the delusion spread.<ref>{{cite podcast |url=https://www.stitcher.com/podcast/the-constant/e/64112747 |title=The constant: A history of getting things wrong |website=constantpodcast.com |host=Chrisler, Mark |date=24 September 2019 |access-date=19 February 2020 |mode=cs2 |archive-date=20 February 2020 |archive-url=https://web.archive.org/web/20200220083721/https://www.stitcher.com/podcast/the-constant/e/64112747 |url-status=live}}</ref>

=== Paranormal beliefs ===
One factor in the appeal of alleged [[psychic]] readings is that listeners apply a confirmation bias which fits the psychic's statements to their own lives.<ref name="toolkit">{{Citation |last=Smith |first=Jonathan C. |title=Pseudoscience and extraordinary claims of the paranormal: A critical thinker's toolkit |publisher=London: Wiley-Blackwell |year=2009 |pages=149–151 |isbn=978-1-4051-8122-8 |oclc=319499491}}</ref> By making a large number of ambiguous statements in each sitting, the psychic gives the client more opportunities to find a match.
This is one of the techniques of [[cold reading]], with which a psychic can deliver a subjectively impressive reading without any prior information about the client.<ref name="toolkit" /> Investigator [[James Randi]] compared the transcript of a reading to the client's report of what the psychic had said, and found that the client showed a strong selective recall of the "hits".<ref>{{Citation |last=Randi |first=James |title=James Randi: Psychic investigator |publisher=London: Boxtree |year=1991 |isbn=978-1-85283-144-8 |oclc=26359284 |pages=58–62}}</ref>

As a striking illustration of confirmation bias in the real world, Nickerson mentions numerological [[pyramidology]]: the practice of finding meaning in the proportions of the Egyptian pyramids.<ref name="nickerson"/>{{rp|190}} There are many different length measurements that can be made of, for example, the [[Great Pyramid of Giza]], and many ways to combine or manipulate them. Hence it is almost inevitable that people who look at these numbers selectively will find superficially impressive correspondences, for example with the dimensions of the Earth.<ref name="nickerson"/>{{rp|190}}
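A short simulation can make this combinatorial point concrete. Everything in the following sketch is assumed for illustration: the "measurements" are random numbers rather than actual pyramid dimensions, and the three combining operations are arbitrary choices. The point is only that a selective search across many combinations reliably turns up apparent matches.

<syntaxhighlight lang="python">
import itertools
import math
import random

random.seed(1)

# Fifty random "measurements": stand-ins for lengths, not real pyramid data.
measurements = [random.uniform(1, 500) for _ in range(50)]
target = math.pi  # the "meaningful" constant being hunted for

hits = tried = 0
for a, b in itertools.permutations(measurements, 2):
    # Three arbitrary ways to combine two measurements.
    for value in (a / b, math.sqrt(a / b), a * b / 1000):
        tried += 1
        if abs(value - target) / target < 0.01:  # "match" within 1%
            hits += 1

print(f"{hits} matches within 1% of pi, out of {tried} combinations")
# Pure noise reliably yields some matches; reporting only the hits
# makes them look like meaningful correspondences.
</syntaxhighlight>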
=== Recruitment and selection ===
Unconscious cognitive bias (including confirmation bias) in [[recruitment|job recruitment]] affects hiring decisions and can hinder efforts to build a diverse and inclusive workplace. A variety of unconscious biases affect recruitment decisions, but confirmation bias is one of the major ones, especially during the interview stage.<ref>{{Citation |title=Here is how bias can affect recruitment in your organization |first=Pragya |last=Agarwal |newspaper=[[Forbes]] |date=19 October 2018 |url=https://www.forbes.com/sites/pragyaagarwaleurope/2018/10/19/how-can-bias-during-interviews-affect-recruitment-in-your-organisation/ |access-date=31 July 2019 |archive-date=31 July 2019 |archive-url=https://web.archive.org/web/20190731082552/https://www.forbes.com/sites/pragyaagarwaleurope/2018/10/19/how-can-bias-during-interviews-affect-recruitment-in-your-organisation/ |url-status=live}}</ref> The interviewer will often select a candidate who confirms their own beliefs, even though other candidates are equally well qualified or better.