=== Polarization of opinion ===
{{Main|Attitude polarization}}

When people with opposing views interpret new information in a biased way, their views can move even further apart. This is called "attitude polarization".<ref name="kuhn_lao" /> The effect was demonstrated by an experiment that involved drawing a series of red and black balls from one of two concealed "bingo baskets". Participants knew that one basket contained 60 percent black and 40 percent red balls; the other, 40 percent black and 60 percent red. The experimenters looked at what happened when balls of alternating color were drawn in turn, a sequence that does not favor either basket. After each ball was drawn, participants in one group were asked to state out loud their judgments of the probability that the balls were being drawn from one or the other basket. These participants tended to grow more confident with each successive draw: whether they initially thought the basket with 60 percent black balls or the one with 60 percent red balls was the more likely source, their estimate of the probability increased. Another group of participants was asked to state probability estimates only at the end of a sequence of drawn balls, rather than after each ball. They did not show the polarization effect, suggesting that it does not necessarily occur when people simply hold opposing positions, but rather when they openly commit to them.<ref>{{Harvnb|Baron|2000|p=201}}</ref>
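The neutrality of the alternating sequence follows from Bayes' rule: a black draw and a red draw shift the odds by reciprocal factors, so each alternating pair cancels out and a rational observer should stay at 50/50. The sketch below (an illustration of this arithmetic, not part of the original study; the function name, the 50/50 prior, and drawing with replacement are assumptions) works through the update:

<syntaxhighlight lang="python">
def posterior_a(draws, p_black_a=0.6, p_black_b=0.4):
    """Return P(basket A | draws), starting from an assumed 50/50 prior.

    Basket A is 60% black / 40% red; basket B is the reverse.
    draws is a sequence of "black"/"red" strings; we assume draws are
    made with replacement, so each is independent given the basket.
    """
    prior_a = 0.5
    for color in draws:
        like_a = p_black_a if color == "black" else 1 - p_black_a
        like_b = p_black_b if color == "black" else 1 - p_black_b
        evidence = like_a * prior_a + like_b * (1 - prior_a)
        prior_a = like_a * prior_a / evidence  # Bayes' rule
    return prior_a

alternating = ["black", "red"] * 5
print(posterior_a(alternating))  # 0.5 -- ten draws, no net evidence either way
</syntaxhighlight>

Each black draw multiplies the odds for basket A by 0.6/0.4 = 1.5, and each red draw by 0.4/0.6 = 2/3, so every black-red pair leaves the odds unchanged; the participants' growing confidence therefore cannot be justified by the evidence.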
A less abstract study was the Stanford biased interpretation experiment, in which participants with strong opinions about the death penalty read about mixed experimental evidence. Twenty-three percent of the participants reported that their views had become more extreme, and this self-reported shift correlated strongly with their initial attitudes.<ref name="lord1979">{{Citation |last1=Lord |first1=Charles G. |first2=Lee |last2=Ross |first3=Mark R. |last3=Lepper |year=1979 |title=Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence |journal=Journal of Personality and Social Psychology |volume=37 |issue=11 |pages=2098–2109 |issn=0022-3514 |doi=10.1037/0022-3514.37.11.2098 |citeseerx=10.1.1.372.1743 |s2cid=7465318}}</ref> In later experiments, participants also reported their opinions becoming more extreme in response to ambiguous information. However, comparisons of their attitudes before and after the new evidence showed no significant change, suggesting that the self-reported changes might not be real.<ref name="taber_political">{{Citation |last1=Taber |first1=Charles S. |first2=Milton |last2=Lodge |date=July 2006 |title=Motivated skepticism in the evaluation of political beliefs |journal=American Journal of Political Science |volume=50 |issue=3 |pages=755–769 |issn=0092-5853 |doi=10.1111/j.1540-5907.2006.00214.x |citeseerx=10.1.1.472.7064 |s2cid=3770487}}</ref><ref name="kuhn_lao">{{Citation |last1=Kuhn |first1=Deanna |first2=Joseph |last2=Lao |date=March 1996 |title=Effects of evidence on attitudes: Is polarization the norm? |journal=Psychological Science |volume=7 |issue=2 |pages=115–120 |doi=10.1111/j.1467-9280.1996.tb00340.x |s2cid=145659040}}</ref><ref>{{Citation |last1=Miller |first1=A.G. |first2=J.W. |last2=McHoskey |first3=C.M. |last3=Bane |first4=T.G. |last4=Dowd |year=1993 |title=The attitude polarization phenomenon: Role of response measure, attitude extremity, and behavioral consequences of reported attitude change |journal=Journal of Personality and Social Psychology |volume=64 |issue=4 |pages=561–574 |doi=10.1037/0022-3514.64.4.561 |s2cid=14102789}}</ref> Based on these experiments, Deanna Kuhn and Joseph Lao concluded that polarization is a real phenomenon but far from inevitable, happening only in a small minority of cases, and that it was prompted not only by considering mixed evidence, but also by merely thinking about the topic.<ref name="kuhn_lao" />

Charles Taber and Milton Lodge argued that the Stanford team's result had been hard to replicate because the arguments used in later experiments were too abstract or confusing to evoke an emotional response. The Taber and Lodge study used the emotionally charged topics of [[gun politics|gun control]] and [[affirmative action]].<ref name="taber_political" /> They measured the attitudes of their participants towards these issues before and after reading arguments on each side of the debate. Two groups of participants showed attitude polarization: those with strong prior opinions and those who were politically knowledgeable. In part of this study, participants chose which information sources to read from a list prepared by the experimenters. For example, they could read arguments on gun control from the [[National Rifle Association of America]] and the [[Brady Campaign|Brady Anti-Handgun Coalition]]. Even when instructed to be even-handed, participants were more likely to read arguments that supported their existing attitudes than arguments that did not. This biased search for information correlated well with the polarization effect.<ref name="taber_political" />

The '''{{vanchor|backfire effect}}''' is a name for the finding that, given evidence against their beliefs, people can reject the evidence and believe even more strongly.<ref>{{Citation |url=http://www.skepdic.com/backfireeffect.html |title=Backfire effect |work=[[The Skeptic's Dictionary]] |access-date=26 April 2012 |archive-date=6 February 2017 |archive-url=https://web.archive.org/web/20170206213300/http://www.skepdic.com/backfireeffect.html |url-status=live}}</ref><ref name="CJR backfire">{{Citation |url=https://www.cjr.org/behind_the_news/the_backfire_effect.php |title=The backfire effect |access-date=1 May 2012 |last=Silverman |first=Craig |date=17 June 2011 |work=Columbia Journalism Review |quote=When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger. |archive-date=25 April 2012 |archive-url=https://web.archive.org/web/20120425224027/http://www.cjr.org/behind_the_news/the_backfire_effect.php |url-status=live}}</ref> The phrase was coined by [[Brendan Nyhan]] and Jason Reifler in 2010.<ref>{{Citation |last1=Nyhan |first1=Brendan |last2=Reifler |first2=Jason |year=2010 |title=When corrections fail: The persistence of political misperceptions |journal=Political Behavior |volume=32 |pages=303–320}}</ref> However, subsequent research has failed to replicate findings supporting the backfire effect.<ref>{{Cite news |url=https://educationblog.oup.com/theory-of-knowledge/facts-matter-after-all-rejecting-the-backfire-effect |title=Facts matter after all: rejecting the "backfire effect" |date=12 March 2018 |work=Oxford Education Blog |access-date=23 October 2018 |language=en-GB |archive-date=23 October 2018 |archive-url=https://web.archive.org/web/20181023234412/https://educationblog.oup.com/theory-of-knowledge/facts-matter-after-all-rejecting-the-backfire-effect |url-status=live}}</ref> One study, conducted at Ohio State University and George Washington University, tested 10,100 participants on 52 issues expected to trigger a backfire effect. While the findings confirmed that individuals are reluctant to embrace facts that contradict their existing ideology, no cases of backfire were detected.<ref name="wood">{{Cite journal |last1=Wood |first1=Thomas |last2=Porter |first2=Ethan |date=2019 |title=The elusive backfire effect: Mass attitudes' steadfast factual adherence |journal=Political Behavior |volume=41 |pages=135–163 |doi=10.2139/ssrn.2819073 |issn=1556-5068 |mode=cs2}}</ref> The backfire effect has since been characterized as a rare phenomenon rather than a common occurrence<ref>{{Cite web |url=https://www.poynter.org/news/fact-checking-doesnt-backfire-new-study-suggests |title=Fact-checking doesn't 'backfire,' new study suggests |website=Poynter |language=en |access-date=23 October 2018 |date=2 November 2016 |mode=cs2 |archive-date=24 October 2018 |archive-url=https://web.archive.org/web/20181024035251/https://www.poynter.org/news/fact-checking-doesnt-backfire-new-study-suggests |url-status=live}}</ref> (compare the [[Boomerang effect (psychology)|boomerang effect]]).