The Challenges of Combating Online Fake News: A Review of ‘Dead Reckoning’


This article was originally posted on the AACE Review by Sandra Rogers.

The Data & Society Research Institute, as part of its Media Manipulation Initiative, has produced and shared informative articles on the many facets of fake news: its producers, sharers, and promoters, as well as those who denounce real news as fake. In Dead Reckoning (Caplan, Hanson, & Donovan, 2018), the authors acknowledged that fake news is an ill-structured problem that is difficult to define in its many disguises (e.g., hoaxes, conspiracy theories, supposed parody or satire, trolling, partisan-biased content, hyper-partisan propaganda disguised as news, and state-sponsored propaganda). Nevertheless, they stressed the critical need to define it in order to produce a problem statement; only then can a proper needs assessment be conducted and solutions explored.

Based on their critical discourse analysis of information gathered during field research, they identified two senses of fake news: problematic content itself, and a critique leveled at mainstream media’s efforts to produce trustworthy news. They reported how the denouncement of mainstream media as fake news serves to legitimize alternative media sources. Beyond defining fake news, the authors seek parameters for what makes news real in their efforts to address information disorder.

Neither Man nor Machine Can Defeat Fake News

Kurzweil (1999) predicted that by the year 2029 humans will have developed software that masters intelligence. However, the idea that cognition can be produced through computation has been refuted (Searle, 1980; McGinn, 2000). In Dead Reckoning, the authors described the problem of combating fake news as twofold: artificial intelligence (AI) currently lacks the capability to detect subtleties, and news organizations cannot provide the manpower to verify the vast proliferation of unmoderated global media. Worse, once one avenue is addressed, fake news producers circumvent the new safeguards. Several efforts to develop machine-learning algorithms are underway, such as PBS’s NewsTracker and Lopez-Brau and Uddenberg’s Open Mind.

Fake News Endangers Our Democracy & Leads to Global Cyberwars

The social media applications that have become part of the fabric of our society are used as propaganda tools by foreign and domestic entities. For example, prior to the 2016 presidential election, Facebook’s ads and users’ news feeds were inundated with fake news that generated more engagement from August to September than that of 19 major news agencies combined (BuzzFeed News, 2016). The authors shared how concerned parties (e.g., the news industry, platform corporations, civil organizations, and the government) have moved beyond asking whether fake news should be regulated to asking who will set standards and enforce regulations: “…without systemic oversight and auditing platform companies’ security practices, information warfare will surely intensify” (Caplan, Hanson, & Donovan, 2018, p. 25).

The potential for fake news to reach Americans through digital news consumption via smartphone apps and text alerts compounds the issue. The Pew Research Center surveyed 2,004 randomly selected Americans who consume digital news, polling them twice a day for one week, and found the following pathways to news: 36% used online news sites, 35% used social media, 20% searched the Internet, 15% used email, 7% relied on family, and 9% fell into other categories (Mitchell, Gottfried, Shearer, & Lu, 2017).

Strategic Arbitration of Truth

Caplan et al. describe how organizations and AI developers approach defining fake news by type, features, and news signifiers of intent (e.g., characteristics of common fake news providers, common indications of fake news posts, and sharing patterns). For example, one common signifier of fake news is the use of enticing terms such as ‘shocking.’ Digital intervention efforts include developing a taxonomy for content verification, drafting responsive corporate policy, banning the accounts of fake news promoters, tightening verification processes for posting and opening accounts, and teaching users how to identify fake news. See the Public Data Lab’s Field Guide to Fake News and Other Information Disorders.
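To make the signifier-based approach concrete, here is a toy rule-based scorer. It is purely illustrative: the term list, weights, and punctuation rules below are invented for demonstration and are not drawn from NewsTracker, Open Mind, or any real detection system, which rely on far richer features and machine learning.

```python
# Toy illustration of scoring a headline by "news signifiers of intent."
# Term list and weights are invented for demonstration purposes only.
ENTICING_TERMS = {"shocking": 2, "unbelievable": 2, "secret": 1, "exposed": 1}

def signifier_score(headline: str) -> int:
    """Return a crude score: higher means more fake-news signifiers."""
    words = headline.lower().split()
    # Count weighted enticing terms, ignoring surrounding punctuation.
    score = sum(ENTICING_TERMS.get(w.strip("!?.,'\""), 0) for w in words)
    score += headline.count("!")  # sensational punctuation
    if headline.isupper():        # ALL-CAPS headlines
        score += 2
    return score

print(signifier_score("Shocking secret the media won't show you!!"))  # → 5
print(signifier_score("Local council approves budget"))               # → 0
```

Even this trivial sketch shows why the authors consider AI insufficient on its own: producers can simply swap vocabulary or punctuation to slip past any fixed rule set, which is the circumvention problem described above.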

Caplan et al. raise many unanswered questions in the struggle to defeat fake news. How can we arbitrate truth without giving more attention to fake news? Will Google’s AdSense allow users to control where their ads are placed? Can Facebook consistently reduce the influence of fake news promoters on its site? Caplan, Hanson, and Donovan (2018) proposed these strategies to combat fake news:

  • Trust and verify: go beyond fact-checking and content moderation by incorporating interoperable mechanisms for digital content verification through collaborative projects with other news agencies;
  • Disrupt economic incentives: stop the pay-per-click mill of online advertising that gives advertisers no say in the type of site their ads appear on;
  • Ban accounts: online platform providers should ban accounts, or decline to feature content, based on falsehoods, click-bait, or spam; and
  • Regulate: call for news regulation within the bounds of the First Amendment and the Good Samaritan provision (Section 230 of the Communications Decency Act).

For information on single-user technology and critical thinking skills to avoid fake news, visit my previous AACE Review blog on Navigating Post-truth Societies: Strategies, Resources and Technologies.

References 

Caplan, R., Hanson, L., & Donovan, J. (2018, February). Dead reckoning: Navigating content moderation after “fake news”. Retrieved from https://datasociety.net/output/dead-reckoning/

Kurzweil, R. (1999). The age of spiritual machines: When computers exceed human intelligence. New York, NY: Penguin Books.

McGinn, C. (2000). The mysterious flame: Conscious minds in a material world. New York, NY: Basic Books.

Mitchell, A., Gottfried, J., Shearer, E., & Lu, K. (2017, February 9). How Americans encounter, recall, and act upon digital news. Retrieved from http://www.journalism.org/2017/02/09/how-americans-encounter-recall-and-act-upon-digital-news/

Searle, J. (1980). Minds, brains and programs. Behavioral and Brain Sciences, 3(3), 417–457. doi:10.1017/S0140525X00005756


P.S. Disinformation (a.k.a. fake news) is false content spread with the intent to deceive, as opposed to unintentional misinformation.


Sandra Annette Rogers, Ph.D.


Remember the Human in Online Courses

“Remember the human” is something we do intuitively in traditional face-to-face classrooms, but somehow it gets lost in distance education. If a course is only a text-based independent study, then we have silenced our students by failing to provide communication platforms supported in the grading criteria. Virginia Shea (1994) asks us to remember the human in impersonal cyberspace as part of her Core Rules of Netiquette. She was referencing politeness. I, on the other hand, am referencing the instructional goal of teaching to the whole student.

This blog focuses on the basics of computer-mediated instruction in terms of the dichotomy between transmissive (authoritarian) education and transformative (democratic) education. Whenever I present on this topic at conferences, participants share that they or their peers have also encountered and endured transmissive online courses, which makes me wonder how big the problem really is. Since first encountering this problem in 2012 as a doctoral student, I have dedicated my research efforts to addressing it.

Transmissive vs. Transformative

Critical pedagogies (e.g., Ignatian pedagogy and Freirean praxis) place the human in a real-world context as much as possible through learning experiences, questioning of norms, and reflection. The goal is transformative learning experiences rather than transmissive ones built on the antiquated banking model of education, in which the teacher deposits knowledge for the student to withdraw (Bradshaw, 2017). One example of transformative learning is Ignatian pedagogy, which advocates for context, experience, action, reflection, and evaluation (Korth, 1993).

Classroom interactions for transformative learning align with constructivism. “Meaningful learning, as opposed to reproductive learning, is active, constructive, intentional, authentic, and collaborative” (Jonassen, 2009, p. 49). Hooks (1994) called this a humanity-affirming location of possibility. The design of online interaction treatments doesn’t rely solely on synchronous sessions with everyone present via web hosting. Instead, the goal of high-quality online instruction is to avoid passive learning that requires little cognitive engagement. A good example of a transformative learning activity is a student (or group) project in which students provide each other with authentic feedback.

Interaction treatments are any direct or indirect actions between and among students, teachers, and content. Besides the written and spoken word, they include nonverbal immediacy behaviors such as an instructor’s response time. The alternative, a transmissive education of information dumping, is unethical; Freire (1970) called it a corpse of knowledge. Nowadays, it is delivered by the uninformed online instructor through text-based study devoid of interaction with other students (e.g., read-write-submit). The lack of contact with others in the class is not only isolating, cutting students off from social learning, but also frustrating for some students.

Are we teaching machines to learn better than we teach humans?


I recently read an introductory book about artificial intelligence (AI) and was struck by how even early AI addressed the robot’s environment, something online instructors sometimes overlook for humans. If we want to come away as winners in the man-versus-machine competition, when humanoids such as Erica the robot attain humanlike feelings and the predicted singularity occurs in 2045, we should focus on providing human interactions in online courses.

Through trial and error, AI researchers have developed heuristics to address robots’ interaction with the environment, such as the symbol grounding problem: symbols are meaningless unless they’re grounded within a real-world context. For example, the Skydio R1 drone may become the ultimate selfie camera as it maps its environment using GPS, cameras, and other sensors. How often are instructors grounding instructional content in the lifeworld of human learners?

What are the heuristics for effective human interaction in distance education?

Provide an online community of inquiry (COI) to dispel the perceived psychological distance between students and teachers in distance education and to improve student learning outcomes and satisfaction. An online COI, a sublime goal, requires consideration of the types of interaction treatments that could engender social, teaching, and cognitive presence, going beyond generative learning. These presences are the key elements of the COI loop (Garrison, Anderson, & Archer, 2000).

Technological affordances can provide multimodal instruction, such as narrated PowerPoints or audio feedback, to establish teaching presence in an online COI. For example, podcasts have been found to increase student achievement and satisfaction, partly because students can replay them (Beylefeld, Hugo, & Geyer, 2008; McKinney, Dyck, & Luber, 2009; Seed, Yang, & Sinnappan, 2009). Learning management systems allow for student-student discussions and the sharing of projects with opportunities for peer feedback, engendering social presence. For example, Schoology’s Media Album allows students to upload their media projects for peer feedback. Projects also give students agency in the design of their own learning.

Cognitive presence is the third component of the COI triad. Instructors generally provide it through interesting and challenging online activities honed over years of teaching their F2F courses. In my two research studies (Rogers & Van Haneghan, 2016; Rogers & Khoury, 2018), planned cognitive presence was high at the institutions studied, while social presence was average and teaching presence below average.

Designing interaction treatments (e.g., student-student, student-teacher, and student-content) will help address the psychologically perceived distance in computer-mediated courses (Bernard et al., 2009). These designed interactions need to focus on meaningful activities for the students’ lifeworld to aid their learning. Remember the human as you plan your online course; otherwise, the robots will overtake us.

References

Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three types of ITs in distance education. Review of Educational Research, 79, 1243-1288. doi:10.3102/0034654309333844

Beylefeld, A. A., Hugo, A. P., & Geyer, H. J. (2008). More learning and less teaching? Students’ perceptions of a histology podcast. South African Journal of Higher Education, 22(5), 948-956. doi:10.4314/sajhe.v22i5.42914

Bradshaw, A. C. (2017). Critical pedagogy and educational technology, in A.D. Benson, R. Joseph, & J.L. Moore (eds.) Culture, Learning and Technology: Research and Practice (pp. 8-27). New York, NY: Routledge.

Freire, P. (1970). Pedagogy of the oppressed. New York, NY: Continuum.

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105. doi:10.1016/s1096-7516(00)00016-6

Hooks, B. (1994). Teaching to transgress: Education as the practice of freedom. New York, NY: Routledge.

Jonassen, D. H. (2009). Externally modeling mental models. In L. Moller et al. (Eds.), Learning and Instructional Technologies for the 21st Century: Visions of the Future (pp. 49-74). New York, NY: Springer.

Korth, S. J. (1993). Precis of Ignatian pedagogy: A practical approach. International Center for Jesuit Education, Rome, Italy.

McKinney, D., Dyck, J. L., & Luber, E. S. (2009). iTunes university and the classroom: Can podcasts replace professors? Computers & Education, 52, 617-623. doi:10.1016/j.compedu.2008.11.004

Rogers, S., & Van Haneghan, J. (2016). Rubric to evaluate online course syllabi plans for engendering a community of inquiry. Proceedings of Society for Information Technology & Teacher Education International Conference, 349-357. Chesapeake, VA: AACE.

Shea, V. (1994). Netiquette. San Francisco, CA: Albion Books.

