The Challenges of Combating Online Fake News: A Review of ‘Dead Reckoning’
This article was originally posted on the AACE Review by Sandra Rogers.
The Data & Society Research Institute has produced and shared informative articles on the many facets of fake news (its producers, sharers, and promoters, as well as those who denounce real news) as part of its Media Manipulation Initiative. In Dead Reckoning (Caplan, Hanson, & Donovan, 2018), the authors acknowledged that fake news is an ill-structured problem, difficult to define across its many disguises (e.g., hoaxes, conspiracy theories, supposed parody or satire, trolling, partisan-biased content, hyper-partisan propaganda disguised as news, and state-sponsored propaganda). Nevertheless, they stressed the critical need to define it in order to produce a problem statement; only then can a proper needs assessment and subsequent solutions be explored.
Based on their critical discourse analysis of information reviewed during their field research, they identified two descriptions of fake news: problematic content, and a critique of mainstream media’s efforts to produce trustworthy news. They reported how the denouncement of mainstream media as fake news serves to legitimize alternative media sources. Beyond defining fake news, the authors also seek parameters for what makes news real in their efforts to address information disorder.
Neither Man nor Machine Can Defeat Fake News
Kurzweil (1999) predicted that by 2029 humans will have developed software that masters intelligence. However, the idea that cognition can be produced through computation has been refuted (Searle, 1980; McGinn, 2000). In Dead Reckoning, the authors described the problem of combating fake news as twofold: artificial intelligence (AI) currently lacks the capability to detect subtleties, and news organizations cannot provide the manpower to verify the vast proliferation of unmoderated global media. Worse, once one avenue is addressed, fake news producers circumvent the new safeguards. Several machine-learning efforts are underway, such as PBS’ NewsTracker and Lopez-Brau and Uddenberg’s Open Mind.
Fake News Endangers Our Democracy & Leads to Global Cyberwars
The social media applications that have become part of the fabric of our society are used as propaganda tools by foreign and domestic entities. For example, prior to the 2016 presidential election, Facebook’s ads and users’ news feeds were inundated with fake news that, from August to September, generated more engagement than the content of 19 major news agencies combined (BuzzFeed News, 2016). The authors described how concerned parties (e.g., the news industry, platform corporations, civil organizations, and the government) have moved beyond asking whether fake news should be regulated to asking who will set standards and enforce regulations: “…without systemic oversight and auditing platform companies’ security practices, information warfare will surely intensify” (Caplan, Hanson, & Donovan, 2018, p. 25).
The potential for fake news to reach Americans through digital news consumption on smartphone apps and text alerts compounds the issue. The Pew Research Center surveyed 2,004 randomly selected Americans who consume digital news, polling each twice a day for one week, and found these pathways to news: 36% used online news sites, 35% used social media, 20% searched the Internet, 15% used email, 7% relied on family, and 9% were categorized as other (Mitchell, Gottfried, Shearer, & Lu, 2017).
Strategic Arbitration of Truth
Caplan et al. describe how organizations and AI developers approach defining fake news by type, features, and signifiers of intent (e.g., characteristics of common fake news providers, common indicators in fake news posts, and sharing patterns). For example, one common signifier of fake news is the use of enticing terms such as ‘shocking.’ Digital intervention efforts include developing a taxonomy for verifying content, crafting responsive corporate policy, banning the accounts of fake news promoters, tightening verification processes for posting and opening accounts, and teaching users how to identify fake news. See the Public Data Lab’s Field Guide to Fake News and Other Information Disorders.
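To make the signifier idea concrete, here is a minimal, hypothetical sketch of surface-signal screening in Python. The term list, function names, and threshold are all illustrative assumptions, not anything from the report or from a real platform’s system; production detectors combine far richer features such as source reputation and sharing patterns.

```python
import re

# Illustrative only: a tiny list of enticing terms of the kind the report
# mentions (e.g., 'shocking'); real systems use much larger, learned features.
SENSATIONAL_TERMS = {"shocking", "unbelievable", "secret", "exposed"}

def signifier_score(headline: str) -> int:
    """Count simple surface signifiers in a headline; higher means more suspect."""
    words = re.findall(r"[A-Za-z']+", headline)
    score = sum(1 for w in words if w.lower() in SENSATIONAL_TERMS)
    # SHOUTED words (all caps, longer than three letters) count as a signal.
    score += sum(1 for w in words if len(w) > 3 and w.isupper())
    # Excessive exclamation punctuation counts as a signal.
    score += headline.count("!")
    return score

def looks_suspect(headline: str, threshold: int = 2) -> bool:
    """Flag a headline when enough surface signifiers co-occur (hypothetical cutoff)."""
    return signifier_score(headline) >= threshold
```

A screen this crude illustrates why the authors note that AI struggles with subtlety: it catches only blatant surface patterns, and producers can trivially rewrite around any fixed word list.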
Caplan et al. raise many unanswered questions in the struggle to defeat fake news. How can we arbitrate truth without giving more attention to fake news? Will Google’s AdSense allow users to control where their ads are placed? Can Facebook really reduce the influence of fake news promoters on its site all of the time? Caplan, Hanson, and Donovan (2018) proposed these strategies to combat fake news:
- Trust and verify: by trust, they mean going beyond fact-checking and content moderation to incorporate interoperable mechanisms for digital content verification through collaborative projects with other news agencies;
- Disrupt economic incentives: stop the pay-per-click mill of online advertising, in which advertisers have no say in the type of site their ads appear on;
- Ban bad actors: online platform providers should ban accounts, or decline to feature content, built on falsehoods, click-bait, or spam; and
- Call for news regulation within First Amendment boundaries and the “Good Samaritan” provision of Section 230 of the Communications Decency Act.
For information on single-user technology and critical thinking skills to avoid fake news, visit my previous AACE Review blog on Navigating Post-truth Societies: Strategies, Resources and Technologies.
Caplan, R., Hanson, L., & Donovan, J. (2018, February). Dead reckoning: Navigating content moderation after “fake news”. Retrieved from https://datasociety.net/output/dead-reckoning/
Kurzweil, R. (1999). The age of spiritual machines: When computers exceed human intelligence. New York, NY: Penguin Books.
McGinn, C. (2000). The mysterious flame: Conscious minds in a material world. New York, NY: Basic Books.
Mitchell, A., Gottfried, J., Shearer, E., & Lu, K. (2017, February 9). How Americans encounter, recall, and act upon digital news. Retrieved from http://www.journalism.org/2017/02/09/how-americans-encounter-recall-and-act-upon-digital-news/
Searle, J. (1980). Minds, brains and programs. Behavioral and Brain Sciences, 3(3), 417–457. doi:10.1017/S0140525X00005756
P.S. Disinformation (a.k.a. fake news) is content spread with the intent to deceive, in contrast to misinformation, which is shared without that intent.
Sandra Annette Rogers, Ph.D.