==Operationalization==
The [[Shorenstein Center on Media, Politics and Public Policy|Shorenstein Center]] at Harvard University defines disinformation research as an academic field that studies "the spread and impacts of misinformation, disinformation, and media manipulation," including "how it spreads through online and offline channels, and why people are susceptible to believing bad information, and successful strategies for mitigating its impact".<ref>{{Cite web |title=Disinformation |url=https://shorensteincenter.org/research-initiatives/disinformation/ |access-date=2023-10-30 |website=Shorenstein Center |language=en-US |archive-date=30 October 2023 |archive-url=https://web.archive.org/web/20231030110857/https://shorensteincenter.org/research-initiatives/disinformation/ |url-status=live }}</ref> According to a 2023 research article published in [[New Media & Society]],<ref name=":5" /> disinformation circulates on [[social media]] through deception campaigns implemented in multiple ways, including [[astroturfing]], [[Conspiracy theory|conspiracy theories]], [[clickbait]], [[culture war]]s, [[Echo chamber (media)|echo chambers]], hoaxes, [[fake news]], [[propaganda]], [[pseudoscience]], and [[rumor]]s.

{| class="wikitable"
! colspan="4" |Activities that operationalize disinformation campaigns online<ref name=":5" />
|-
!Term
!Description
!Term
!Description
|-
|[[Astroturfing]]
|A centrally coordinated campaign that mimics grassroots activism by making participants pretend to be ordinary citizens
|[[Fake news]]
|Genre: The deliberate creation of pseudo-journalism<br />Label: The instrumentalization of the term to delegitimize news media
|-
|[[Conspiracy theory|Conspiracy theories]]
|Rebuttals of official accounts that propose alternative explanations in which individuals or groups act in secret
|[[Greenwashing]]
|Deceptive communication that makes people believe a company is environmentally responsible when it is not
|-
|[[Clickbait]]
|The deliberate use of misleading headlines and thumbnails to increase online traffic for profit or popularity
|[[Propaganda]]
|Organized mass communication with a hidden agenda and a mission to conform belief and action by circumventing individual reasoning
|-
|[[Culture war]]s
|A phenomenon in which multiple groups of people, who hold entrenched values, attempt to steer public policy contentiously
|[[Pseudoscience]]
|Accounts that claim the explanatory power of science and borrow its language and legitimacy but diverge substantially from its quality criteria
|-
|[[Doxing|Doxxing]]
|A form of online harassment that breaches privacy boundaries by releasing information intended to cause physical or online harm to a target
|[[Rumor]]s
|Unsubstantiated news stories that circulate without being corroborated or validated
|-
|[[Echo chamber (media)|Echo chamber]]
|An epistemic environment in which participants encounter beliefs and opinions that coincide with their own
|[[Troll (slang)|Trolling]]
|Networked groups of digital influencers that operate 'click armies' designed to mobilize public sentiment
|-
|[[Hoax]]
|News in which false facts are presented as legitimate
|[[Urban legend]]s
|Moral tales featuring durable stories of intruders incurring boundary transgressions and their dire consequences
|-
| colspan="4" |Note: This is an adaptation of Table 2 from [https://doi.org/10.1177/14614448231207644 Disinformation on Digital Media Platforms: A Market Shaping Approach], by Carlos Diaz Ruiz, used under [http://creativecommons.org/licenses/by/4.0/ CC BY 4.0] / Adapted from the original.
|}

To distinguish disinformation from similar terms, including misinformation and malinformation, scholars generally agree on the following definitions: (1) disinformation is the strategic dissemination of false information with the intention to cause public harm;<ref>Center for Internet Security. (3 October 2022). "Essential Guide to Election Security: Managing Mis-, Dis-, and Malinformation". [https://essentialguide.docs.cisecurity.org/en/latest/bp/mdm_info.html CIS website] {{Webarchive|url=https://web.archive.org/web/20231218162438/https://essentialguide.docs.cisecurity.org/en/latest/bp/mdm_info.html |date=18 December 2023 }} Retrieved 18 December 2023.</ref> (2) [[misinformation]] is the unintentional spread of false information; and (3) [[malinformation]] is factual information disseminated with the intention to cause harm.<ref>{{Cite journal |last1=Baines |first1=Darrin |last2=Elliott |first2=Robert J. R. |date=April 2020 |title=Defining misinformation, disinformation and malinformation: An urgent need for clarity during the COVID-19 infodemic |url=https://ideas.repec.org//p/bir/birmec/20-06.html |journal=Discussion Papers |language=en |access-date=14 December 2022 |archive-date=14 December 2022 |archive-url=https://web.archive.org/web/20221214124131/https://ideas.repec.org//p/bir/birmec/20-06.html |url-status=live }}</ref><ref>{{Cite web |title=Information disorder: Toward an interdisciplinary framework for research and policy making |url=https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html |access-date=2022-12-14 |website=Council of Europe Publishing |language=en |archive-date=14 December 2022 |archive-url=https://web.archive.org/web/20221214125635/https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html |url-status=live }}</ref> Together, these terms are abbreviated 'DMMI'.<ref>{{Cite journal |last=Newman |first=Hadley |orig-date=16 September 2021 |title=Understanding the Differences Between Disinformation, Misinformation, Malinformation and Information – Presenting the DMMI Matrix |url=https://committees.parliament.uk/writtenevidence/39289/html/ |journal=Draft Online Safety Bill (Joint Committee) |location=UK |publisher=UK Government |access-date=4 January 2023 |archive-date=4 January 2023 |archive-url=https://web.archive.org/web/20230104112358/https://committees.parliament.uk/writtenevidence/39289/html/ |url-status=live }}</ref>

In 2019, [[Camille François]] devised the "ABC" framework for understanding the different modalities of online disinformation:
* Manipulative ''Actors'', who "engage knowingly and with clear intent in viral deception campaigns" that are "covert, designed to obfuscate the identity and intent of the actor orchestrating them." Examples include personas such as [[Guccifer 2.0]], [[Troll (slang)|Internet trolls]], [[state media]], and military operatives.
* Deceptive ''Behavior'', which "encompasses the variety of techniques viral deception actors may use to enhance and exaggerate the reach, virality and impact of their campaigns." Examples include [[troll farm]]s, [[Internet bots]], [[astroturfing]], and "[[Facebook like button#Fake "likes"|paid engagement]]".
* Harmful ''Content'', which includes [[Misinformation|health misinformation]], [[Media manipulation|manipulated media]] such as [[deepfakes]], [[Cyberbullying|online harassment]], [[violent extremism]], [[hate speech]], or [[terrorism]].<ref>{{Cite web |last=François |first=Camille |date=2019-09-20 |title=Actors, Behaviors, Content: A Disinformation ABC – Highlighting Three Vectors of Viral Deception to Guide Industry & Regulatory Responses |url=https://docs.house.gov/meetings/SY/SY21/20190926/109980/HHRG-116-SY21-Wstate-FrancoisC-20190926-SD001.pdf |archive-url=https://web.archive.org/web/20230321071912/https://docs.house.gov/meetings/SY/SY21/20190926/109980/HHRG-116-SY21-Wstate-FrancoisC-20190926-SD001.pdf |archive-date=2023-03-21 |access-date=2024-05-17}}</ref>

In 2020, the [[Brookings Institution]] proposed amending this framework to include ''Distribution'', defined by the "technical protocols that enable, constrain, and shape user behavior in a virtual space".<ref>{{Cite web |last=Alaphilippe |first=Alexandre |date=2020-04-27 |title=Adding a 'D' to the ABC disinformation framework |url=https://www.brookings.edu/articles/adding-a-d-to-the-abc-disinformation-framework/ |archive-url=https://web.archive.org/web/20231027042531/https://www.brookings.edu/articles/adding-a-d-to-the-abc-disinformation-framework/ |archive-date=2023-10-27 |access-date=2024-05-18 |website=[[Brookings Institution]] |language=en-US}}</ref> Similarly, the [[Carnegie Endowment for International Peace]] proposed adding ''Degree'' ("distribution of the content ... and the audiences it reaches") and ''Effect'' ("how much of a threat a given case poses").<ref>{{Cite report |url=https://www.jstor.org/stable/resrep26180.6 |title=The ABCDE Framework |last=Pamment |first=James |date=2020 |publisher=[[Carnegie Endowment for International Peace]] |pages=5–9 |archive-url=https://web.archive.org/web/20240318053702/https://carnegieendowment.org/files/Pamment_-_Crafting_Disinformation_1.pdf |archive-date=2024-03-18}}</ref>

===Comparisons with propaganda===
Whether and to what degree disinformation and propaganda overlap is subject to debate. Some (like the [[U.S. Department of State]]) define propaganda as the use of non-rational arguments to either advance or undermine a political ideal, and use disinformation as an alternative name for undermining propaganda,<ref>{{citation|url=https://www.state.gov/documents/organization/271028.pdf|date=May 2017|archive-date=30 March 2019|title=Can public diplomacy survive the internet?|archive-url=https://web.archive.org/web/20190330160444/https://www.state.gov/documents/organization/271028.pdf}}</ref>{{Page needed|date=March 2025}} while others consider them to be separate concepts altogether.<ref>{{citation|url=https://www.interpretermag.com/wp-content/uploads/2014/11/The_Menace_of_Unreality_Final.pdf|date=2014|title=The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money|publisher=Institute of Modern Russia|archive-url=https://web.archive.org/web/20190203160823/https://www.interpretermag.com/wp-content/uploads/2014/11/The_Menace_of_Unreality_Final.pdf |archive-date=3 February 2019 }}</ref> One popular distinction holds that disinformation also describes politically motivated messaging designed explicitly to engender public cynicism, uncertainty, apathy, distrust, and paranoia, all of which disincentivize citizen engagement and mobilization for social or political change.<ref name=ned/>