Towards a Science of Misinformation and Deception


Anthony Judge | Laetus in Praesens - TRANSCEND Media Service

The Challenge of an Information Pandemic — A COVID-19 Infodemic


2 Aug 2021 – The study of information has been remarkably clarified by information theory. This is the scientific study of the quantification, storage, and communication of digital information. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.

Information theory has found applications in other areas, including statistical inference, cryptography, neurobiology, perception, linguistics, the evolution and function of molecular codes (bioinformatics), thermal physics, molecular dynamics, quantum computing, black holes, information retrieval, intelligence gathering, plagiarism detection, pattern recognition, anomaly detection and even art creation.
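The central quantity of that "quantification of information" is Shannon entropy, the average information content of a message in bits per symbol. As a minimal illustration (not drawn from the article itself), the following Python sketch computes the entropy of a text's character distribution:

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Average information content of text's character
    distribution, in bits per symbol."""
    counts = Counter(text)
    n = len(text)
    return sum((c / n) * -math.log2(c / n) for c in counts.values())

# A repetitive message carries no information per symbol; a maximally
# varied one of 8 distinct characters carries log2(8) = 3 bits per symbol.
print(shannon_entropy("aaaaaaaa"))  # 0.0
print(shannon_entropy("abcdefgh"))  # 3.0
```

The same measure underlies compression, channel capacity and several of the application areas listed above.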

The world has recently been witness to much emphasis on misinformation in relation to the pandemic. The United Nations and its Specialized Agencies have been very explicit regarding the challenge of misinformation:

Misinformation is understood to be false, inaccurate, or misleading information that is communicated regardless of an intention to deceive. Disinformation is a subset of misinformation that is deliberately deceptive. The principal effect of misinformation is to elicit fear and suspicion among a population. News parody or satire can become misinformation if it is believed to be credible and communicated as if it were true. The terms “misinformation” and “disinformation” have often been associated with the concept of “fake news” (Varieties of Fake News and Misrepresentation: when are deception, pretence and cover-up acceptable? 2019).

Initiatives are underway to detect misinformation and fake news in digital media using the resources of artificial intelligence — most notably by social media platforms as a consequence of criticism of their irresponsibility in purveying biased content. As might be expected, these include approaches which benefit explicitly from information theory (Victoria Patricia Aires, et al, An Information Theory Approach to Detect Media Bias in News Websites, WISDOM, 24 August 2020; Carlo Kopp, et al, Information-theoretic Models of Deception: modelling cooperation and diffusion in populations exposed to “fake news”, PLOS One, 28 November 2018).
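Such information-theoretic approaches build on quantities like relative entropy (Kullback–Leibler divergence), which measures how far one probability distribution diverges from a reference. As a hedged, minimal sketch — the function and the topic-coverage distributions below are illustrative assumptions, not the method of either cited paper — one might compare an outlet's coverage against a baseline corpus:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(P || Q) in bits; assumes q[i] > 0
    wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical coverage proportions over four topics:
baseline_corpus = [0.25, 0.25, 0.25, 0.25]  # balanced reference sample
outlet_coverage = [0.70, 0.10, 0.10, 0.10]  # one topic over-represented

# Zero divergence means identical coverage; larger values flag skew.
print(kl_divergence(baseline_corpus, baseline_corpus))  # 0.0
print(kl_divergence(outlet_coverage, baseline_corpus))
```

A nonzero divergence flags statistical skew, not falsehood — which is precisely the gap between measuring bias and adjudicating truth that the following paragraphs explore.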

Less evident is how to distinguish speculative misinformation, by which collective understanding is confused, from disinformation about the nature of the crisis. How are deliberate lies in support of particular agendas to be distinguished?

A valuable article, relative to the quantity of confusing and misleading information otherwise available, is the study by Adam M. Enders, et al, (The Different Forms of COVID-19 Misinformation and their Consequences, Misinformation Review, 16 November 2020). Appropriately this argues that as the COVID-19 pandemic progresses, an understanding of the structure and organization of beliefs in pandemic conspiracy theories and misinformation becomes increasingly critical for addressing the threat posed by these dubious ideas. As stated, this preoccupation is itself problematic in that it appears to frame and conflate in a rather particular manner what is “misinformation”, “dubious”, and the focus of “conspiracy theories” — potentially excluding what some would argue (with evidence) as being of legitimate and strategic scientific concern.

Reference to “misinformation”, as being a major problem of the “infodemic”, can be recognized as exploiting this confusion. There is great advantage to vested interests in disguising deliberate lies within a context of speculative claims which can be readily dismissed — and claimed to be harmful. This has resulted in major initiatives by social media platforms and search engines to eliminate anything that can be readily labelled as “misinformation”. Sophisticated use is being made of artificial intelligence to that end.

Which truths upheld by one party, however, would not now be dismissed as misinformation — if not a lie — by another? Opposing factions, whether in politics, science, religion or business, typically accuse each other of misrepresenting the truth — if not of “lying”, possibly even with “evil” intent. Does disagreement automatically imply misinformation, in that one party holds the other to be misrepresenting the truth — lying — to it?

It is unfortunate that efforts to apply information theory, together with insights into the nature of bias, seemingly take little account of the extent to which such efforts may themselves be embedded in a particular pattern of bias associated with a preferred discipline, model or institutional funding context — as would be argued by critics. There is also the confusion between bias and belief, raising the issue of how fundamental beliefs are to be recognized, or not, as misinformation (Reframing Fundamental Belief as Disinformation? Pandemic challenge to advertising, ideology, religion and science, 2020; Comparability of “Vaxxing Saves” with “Jesus Saves” as Misinformation? Problematic challenge of global discernment, 2021).

Far more challenging are the criteria by which the institutional promotion of any “Big Lie” would be detectable with the tools of information theory (Existential Challenge of Detecting Today’s Big Lie, 2016). The difficulty more generally is that any claim regarding such a “lie” is itself increasingly dismissed by authorities as “misinformation” meriting suppression — dismissing those who claim an unquestionably truthful alternative perspective. This pattern is most dramatically evident among political leaders accused of corruption — who deem their indictment “political”.

Framed as a “war” by many leaders — thereby justifying a deceptive propaganda modality — is it then totally naive to assume that the official narrative regarding the pandemic is not based in some measure on misinformation, disinformation, deception, or deliberate lying? From a military perspective, this would be fully justified, given the highly valued role of deception in warfare.

With respect to the large-scale production and dissemination of misinformation in the pandemic context, one study investigates those who believe it (Seoyong Kim and Sunhee Kim, The Crisis of Public Health and Infodemic: Analyzing Belief Structure of Fake News about COVID-19 Pandemic, Sustainability, 12, 2020, 9904). This would however appear to avoid the issue of how to establish whether any information is true or false, given that critical counter-claims are necessarily dismissed or framed as false.

Are “fact-checking” initiatives to be upheld as totally free of bias — and unquestionably so, as some would claim them to be (Samikshya Siwakoti, et al, How COVID Drove the Evolution of Fact-Checking, Misinformation Review, 6 May 2021)? Or does fact-checking depend on selectively framing particular information as false, according to the constraints of unquestionable criteria, governed by the prescriptions of an often undeclared agenda (Sungkyu Park, et al, The Presence of Unexpected Biases in Online Fact-Checking, Misinformation Review, 27 January 2021)?

Little is said in the current context, for example, about widespread tolerance of the misleading information presented to an ever-increasing degree in advertising (Victor Pickard, Unseeing Propaganda: how communication scholars learned to love commercial media, Misinformation Review, 22 April 2021). In the detection of misinformation, are claims to scientific objectivity then themselves questionable — if they fail to address their own degree of complicity in framing the process (Cui bono? Quis custodiet ipsos custodes?)

How durable are “facts”, given the “half-life of knowledge”, as queried by Samuel Arbesman (The Half-Life of Facts: why everything we know has an expiration date, 2012)? Expressed otherwise by Marcia Angell:

It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines. I take no pleasure in that conclusion, which I reached slowly and reluctantly over my two decades as an editor of The New England Journal of Medicine. (Drug Companies and Doctors: a story of corruption, The New York Review of Books, 15 January 2009).

The problematic nature of the distinction between true and false is highlighted in a different context by the highly controversial discussion of critical race theory, especially in the USA — where it is proving to be an existential challenge to academia. As argued by Kerry Cosby, this raises the more general issue that the problems of today call for less binary and more systems thinking (Is Critical Race Theory Too Complex for U.S. Politics? The Globalist, 20 July 2021). The article notes the relevance of the question to the pandemic, climate change and other issues. Does politics have the capacity to confront issues that academia has (in some cases) begun to examine with more complex methods? Cosby argues:

Today, the U.S. political system largely relies upon establishing a binary choice for voters. Meanwhile, for many of today’s problems, researchers use models that consider multiple causes, feedback loops and systemic structural influences.

The argument can be evoked with respect to the manner in which “information” and “misinformation” are distinguished — given the limitations of the binary mindset focusing primarily (if not solely) on “truth” versus “falsehood”. Is information really either true or false — or possibly both, or even neither? This complexification has been explored by Kinhide Mushakoji (Global Issues and Interparadigmatic Dialogue, 1988). The non-binary insights of quantum computing, and their social implications, suggest that other perspectives may be pertinent (Alexander Wendt, Quantum Mind and Social Science: unifying physical and social ontology, 2015).

The title of this document is itself necessarily ambiguous — a “science of misinformation and deception” — given that considerable science has been acknowledged to have been applied in the course of the Facebook-Cambridge Analytica data scandal. As a marketing initiative to manipulate the opinion of voters, that scandal frames the question as to whether advertising and puffery merit exploration as exercises in misinformation — in the guise of information (Rebecca A. Clay, Advertising as Science, American Psychological Association, 33, 2002, 9; Livia Gershon, Can Advertising Be a Science? JStor Daily, 4 December 2016; Adi Ignatius, Advertising Is an Art — and a Science, Harvard Business Review, March 2013).

Does the framing of the pandemic as a “war” preclude any assumption that authorities are complicit in processes of deception — as would be a natural option in the implication of the need for a “military” response?





