Navigating Post-Truth Societies: Strategies, Resources, and Technologies

This post was originally published on the AACE Review.

The Problem

While fake news and information bubbles are not new, awareness of their impact on public opinion has increased. The Wall Street Journal (2016) reported on a Stanford study that found secondary and postsecondary students could not distinguish between real and sponsored content in Internet searches. This became apparent to me when I watched my college-bound niece google her bank, quickly click the name at the top of the list within the sponsored content, and then have her computer freeze from a potential malware attack. If teenagers cannot discern between promoted and regular content, imagine their encounters with fake news. The WSJ article recommended lateral reading (i.e., leaving a site to learn about it) and suggested that adults ask teens about their selection choices during Internet searches. In the instance with my niece, she was unaware of sponsored content. She also didn’t know that the first item in a browser’s search results is often pushed to the top strategically through search engine optimization (SEO) with keywords (meta-tagging).

Figure 1. Tag cloud of words from blog post

How can we help? What are good heuristics to determine the quality of online content?

Solution 1. Critical Reading and Thinking Skills

Determine the purpose of the Website by its domain (e.g., .com, .org, .gov). Analyze its content and graphics. Analytical questions to consider are as follows:

  • Is it current? Broken hyperlinks indicate a lack of attention to the site (a quick automated check is sketched after this list).
  • Does it look professional? Is it well written?
  • Does it have a point of contact?
  • Does the writer provide proper citations?
  • What is the author’s tone? Is the content biased toward a view? If so, is it substantiated with empirical evidence? Does the author present the complete narrative or are certain important elements omitted?
  • Do the graphics illustrate a valid point? Do they make sense statistically?
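
One of these checks, scanning a page for broken hyperlinks, is easy to automate. Below is a minimal Python sketch, assuming the requests and beautifulsoup4 packages are installed; the example.com URL is a placeholder, and this is an illustration of the heuristic rather than a production link checker.

```python
# Minimal broken-link check for a single page (illustrative sketch).
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def find_broken_links(page_url, timeout=5):
    """Return (link, status) pairs for links that failed to load."""
    html = requests.get(page_url, timeout=timeout).text
    soup = BeautifulSoup(html, "html.parser")
    broken = []
    for anchor in soup.find_all("a", href=True):
        link = urljoin(page_url, anchor["href"])
        if not link.startswith("http"):
            continue  # skip mailto:, javascript:, and in-page anchors
        try:
            status = requests.head(link, timeout=timeout, allow_redirects=True).status_code
        except requests.RequestException:
            status = None  # network error or timeout
        if status is None or status >= 400:
            broken.append((link, status))
    return broken

if __name__ == "__main__":
    # Hypothetical URL; replace with the site you are evaluating.
    for link, status in find_broken_links("https://example.com"):
        print(status, link)
```

A page with many entries in this list has probably not been maintained in a while, which is exactly the signal this checklist item is after.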

Are you an IT specialist, researcher, or educator? Each field has its own approaches to thinking, and the strategies you select will depend on the nature of the content. Bruning, Schraw, and Norby (2011) refer to these as thinking frames, such as the way one thinks about scientific inquiry and the use of research methods. If you’re an educator, you might be interested in a WebQuest I developed to help students create their own job aid for critical thinking. It asks students to tap into the critical lens of their future field of study.

Solution 2. Primary Sources

Combat fake news by seeking the original source of information. Take time to verify the authenticity of what is being shared online. Use various sources whenever possible for triangulation (e.g., interviews, observations, and documentation); this helps ensure that what you read is corroborated by other articles presenting the same information. A good legislative resource is the U.S. Government Publishing Office, which provides congressional records, bills, codes, and Federal Register items. Their govinfo.gov website explains how to check the integrity of a government document found on the web by revealing its verification seal upon printing. It’s a digital signature placed in their PDFs; if the document has been modified, the verification breaks.
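
As a rough illustration of what that seal is from a technical standpoint, the Python sketch below uses the pypdf package (an assumption; any PDF library with form-field access would do) to report whether a downloaded PDF contains a digital-signature form field. Note that this only detects the presence of a signature; it does not cryptographically validate it the way govinfo.gov’s own verification does.

```python
# Check whether a PDF carries a digital-signature form field (illustrative sketch).
# Assumes: pip install pypdf; "document.pdf" is a hypothetical local file.
from pypdf import PdfReader

def has_signature_field(path):
    """Return True if any form field in the PDF is a signature field (/Sig)."""
    reader = PdfReader(path)
    fields = reader.get_fields() or {}
    return any(field.get("/FT") == "/Sig" for field in fields.values())

if __name__ == "__main__":
    print("Signature field present:", has_signature_field("document.pdf"))
```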

Solution 3. Technology Resources

Use technology to decipher the trustworthiness of online content. Several Internet browser extensions provide visible alerts. For example, the Fake News Detector extension displays the word FAKE in red capital letters, or CLICKBAIT/Probably FAKE in orange, on the web page. It’s available in the Chrome store along with a few others and their user ratings. I started curating reputable fact-checking tools such as PolitiFact and Snopes on my Scoop.it! e-magazine, The Critical Reader. Some extensions are application-specific, such as the Official Media Bias/Fact Check Extension, which determines the veracity of articles on Facebook. It provides factuality (e.g., High), references, popularity, and positionality (e.g., left-center) at the base of the article on your Facebook feed. I personally use this one, as displayed in Figure 2.
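
Under the hood, extensions like these typically compare a page’s domain against curated lists of known low-credibility, satirical, or fact-checking sources. The Python sketch below mimics that idea with a tiny, invented domain list; the domains and labels are hypothetical placeholders, not real ratings from any of the tools named above.

```python
# Toy version of a browser extension's source check (illustrative sketch).
# The domain labels below are hypothetical placeholders, not real ratings.
# Requires Python 3.9+ for str.removeprefix.
from urllib.parse import urlparse

SOURCE_LABELS = {
    "example-satire.com": "CLICKBAIT / Probably FAKE",
    "example-hoax.net": "FAKE",
    "example-factcheck.org": "Fact-checker",
}

def label_url(url):
    """Return a credibility label for a URL's domain, if one is known."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    return SOURCE_LABELS.get(domain, "No rating available")

print(label_url("https://www.example-hoax.net/shocking-story"))  # -> FAKE
```

The real extensions add crowd ratings, reporting, and much larger maintained databases, but the core lookup is this simple.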

Figure 2. Facebook post of Smithsonian article with Official Media Bias/Fact Check results

Solution 4. Seek Professional Content

Seek information from reputable researchers and educational leaders. Most professions adhere to ethical standards as a promise to their constituents. For example, the American Educational Research Association (AERA) states that members will not fabricate, falsify, or plagiarize “in proposing, performing, or reviewing research, or in reporting research results” (AERA Code of Ethics, 2011). This standard is taken very seriously in the field of educational research. Those who didn’t heed ethical rules in the past have paid the price of being exposed by plagiarism-detection tools, as was the case for Germany’s education minister and former defense minister, whose plagiarized dissertations cost them their government posts (CNN World, 2013).

As an educator, I took the Kappa Delta Pi (KDP) pledge of fidelity to humanity, science, service, and toil as an initiate into this international honor society. The Ideal of Science relates to the topic of this discussion: “…This Ideal implies that, as an educator, one will be faithful to the cause of free inquiry and will strive to eliminate prejudice and superstition by withholding judgment until accurate and adequate evidence is obtained. One will not distort evidence to support a favorite theory; not be blinded by the new or spectacular; nor condemn the old simply because it is old. All this is implied in the Ideal of Science” (KDP Initiation Ceremony, 2015).

Do you have good fact-checking resources or more solutions to share? Please share those in the comments section.

References

Brumfield, B. (2013, February 6). German education minister loses Ph.D. over plagiarized thesis. CNN World. Retrieved from https://www.cnn.com/2013/02/06/world/europe/german-minister-plagiarism/index.html

Bruning, R. H., Schraw, G. J., & Norby, M. M. (2011). Cognitive psychology and instruction. New York, NY: Pearson.

Caplan, R., Hanson, L., & Donovan, J. (2018). Dead reckoning: Navigating content moderation after “fake news”. Data & Society Research Institute. Retrieved from https://datasociety.net/pubs/oh/DataAndSociety_Dead_Reckoning_2018.pdf

Code of ethics. (2011). American Educational Research Association. Educational Researcher, 40(3), 145-156. doi:10.3102/0013189X11410403

Ceremonies and rituals. (2015). Kappa Delta Pi International Honor Society in Education.

Shellenbarger, S. (2016, November 21). Most students don’t know when news is fake, Stanford study finds. The Wall Street Journal. Retrieved from https://www.wsj.com/articles/most-students-dont-know


Sandra Annette Rogers, Ph.D.


A Review of ‘Media Manipulation and Disinformation Online’

This was previously posted on the AACE Review by Sandra Rogers.

Photo by Markus Spiske (temporausch.com) on Pexels.com: a digital screen with green code on a black background

In Media Manipulation and Disinformation Online, Marwick and Lewis (2017) of the Data & Society Research Institute described the agents of media manipulation, their modus operandi, their motivators, and how they’ve taken advantage of the vulnerability of online media. The researchers described the manipulators as right-wing extremists (RWE), also known as the alt-right, who run the gamut from sexists (including male sexual conquest communities) to white nationalists to anti-immigration activists, and even those who reject the RWE label but whose actions warrant such classification.

These manipulators rally behind shared beliefs on online forums, blogs, podcasts, and social media through anonymous pranks and ruinous trolling, the usurping of participatory-culture methods (networking, humor, mentorship) for harassment, and competitive cyber brigades that earn status by escalating bullying, such as sharing a target’s private information. The researchers proposed that using the more digestible term alt-right to convey the collective agenda of misogynists, racists, and fascists propelled their causes into mainstream discourse through various media streams. Therefore, I’ll use the term RWE instead.

MEDIA ECOSYSTEM MALLEABILITY

The Internet provides a shared space for good and evil. Subcultures such as white nationalists can converge with other anti-establishment actors on an international scale thanks to the connected world we live in. Marwick and Lewis reported on how RWE groups have taken advantage of the media’s appetite for novelty and sensationalism, as well as its interactions with the public via social media, to manipulate coverage for their agenda. For instance, YouTube provides any individual with a portal and potential revenue to contribute to the media ecosystem. The researchers shared the example of the use of YouTube by conspiracy theorists, which can serve as fodder for extremist networks, as conspiracies generally focus on the loss of control of important ideals, health, and safety.

The more outrageous conspiracies get picked up by the media for their audiences, and in doing so, the media are partly to blame for their proliferation. In the case study provided with the report, The White Student Union, an influencer successfully sought moral outrage as a publicity stunt. Why isn’t the media more astute about this? “The mainstream media’s predilection for sensationalism, need for constant novelty, and emphasis on profits over civic responsibility made them vulnerable to strategic manipulation” (Marwick & Lewis, 2017, p. 47).

ONLINE ATTENTION HACKING

Marwick and Lewis shared how certain RWE influencers gained prominence based on their technological skills and manipulative tactics. One tactic is to package their hate in a way that appeals to millennials. Another is attention hacking to increase their status, such as posting hate speech that is later recanted as trickster trolling, all the while gaining the media’s attention for further propagation. Then there are the RWE so-called news outlets and blogs that promote a hyper-partisan agenda and falsehoods. These were successful in attention hacking the nation in the run-up to the 2016 presidential election at a scale that outpaced regular news outlets on Facebook (BuzzFeed News, 2016). Are they unstoppable?

The researchers indicated that the only formidable enemy of alt-right media is the opposing factions within its fractured, yet hate-sharing, assemblage. Unfortunately, mainstream media’s reporting on political figures who engage in conspiracy theories, albeit noteworthy as to their mindset, raises those theories to the level of other important news worthy of debate. Berger and Luckmann (1966) referred to this as ‘reality maintenance’ through dialogue: reality is confirmed through interactions, continually modified, and legitimized through certain conversations. The media needs to stop the amplification of RWE messages; otherwise, as Marwick and Lewis stated, it could gravely darken our democracy.

ONLINE MANIPULATORS SHARED MODUS OPERANDI

Marwick and Lewis reported the following shared tactics various RWE groups use for online exploits:

  • Ambiguity of persona or ideology,
  • Baiting a single or community target’s emotions,
  • Bots for amplification of propaganda that appears legitimately from a real person,
  • “…Embeddedness in Internet culture… (p. 28),”
  • Exploitation of young male rebelliousness,
  • Hate speech and offensive language (under the guise of First Amendment protections),
  • Irony to cloak ideology and/or skewer intended targets,
  • Memes for stickiness of propaganda,
  • Mentorship in argumentation, marketing strategies, and subversive literature in their communities of interest,
  • Networked and agile groups,
  • “…Permanent warfare… (p.12)” call to action,
  • Pseudo scholarship to deceive readers,
  • “…Quasi moral arguments… (p. 7)”
  • Shocking images for filtering network membership,
  • “Trading stories up the chain… (p. 38)” from low-level news outlets to mainstream, and
  • Trolling others with asocial behavior.

This is a frightful attempt at the social reconstruction of our reality, as the verbal and nonverbal language we use objectifies and orders our shared world (Berger & Luckmann, 1966).

DISINFORMATION MOTIVATORS

According to Marwick and Lewis, media manipulators are motivated by pushing their ideological agendas, the joy of sowing chaos in the lives of targets, financial gain, and/or status. The RWE’s shared use of online venues to build a counter-narrative and to radicalize recruits is not going away any time soon. As their article explains, this is because, with the Internet, the usual media gatekeepers have been removed.

Some claimed their impetus was financial rather than political, such as the teenagers in Veles, Macedonia, who profited around $16,000 per month via Google’s AdSense from Facebook engagement with their 100 fake news websites (Subramanian, 2017). “What Veles produced, though, was something more extreme still: an enterprise of cool, pure amorality, free not only of ideology but of any concern or feeling about the substance of the election” (Subramanian, 2017). Fortunately for those of us living in the US, Google eventually suspended the ads from these and other fake news sites. However, as reported in Dead Reckoning, new provocateurs will keep figuring out how to circumvent the gateways of Google’s AdSense and other online companies as quickly as new ones are developed. This is because, as mentioned, the RWE influencers are tech-savvy.

PUBLIC MISTRUST OF MAINSTREAM MEDIA

Marwick and Lewis acknowledged a long history of mistrust of mainstream media. However, the current distrust appears worse than ever. For example, youth reported having little faith in mainstream media (Madden, Lenhart, & Fontaine, 2017), and Republicans’ trust in the mainstream media was the lowest ever recorded by the Gallup poll (Swift, 2016). Why has it worsened? The researchers pinpointed The New York Times’ lack of evidence in various articles on Iraq’s alleged nuclear arsenal as an example of long-lasting readership dismay. They reported on how a lack of trust in the mainstream media has pushed viewers to watch alternative networks instead. Moreover, the right-wing extremists’ manipulation of the media demonstrates the media’s weakness, which in turn sows mistrust. Marwick and Lewis acknowledged that the RWE subculture has been around the Internet for decades and will continue to thrive off the mainstream media’s need for novelty and sensationalism if allowed. I, for one, appreciate what Data & Society is doing to shed light on the spread of fake news and hatemongers’ agendas on the Internet.

Instructional Material

If you’re a college instructor of communications or teach digital literacy as a librarian, see the corresponding syllabus for this article. It provides discussion questions and assignments for teaching students about media manipulation. To teach your students how to combat fake news online, see my other AACE Review post on Navigating Post-Truth Societies: Strategies, Resources, and Technologies.

References

Berger, P. L., & Luckmann, T. (1966). The social construction of reality: A treatise in the sociology of knowledge. New York, NY: Anchor Books.

Madden, M., Lenhart, A., & Fontaine, C. (February 2017). How youth navigate the news landscape. Data & Society Research Institute. Retrieved from https://kf-siteproduction.s3.amazonaws.com/publications/pdfs/000/000/230/original/Youth_News.pdf

Marwick, A. & Lewis, R. (2017). Media manipulation and disinformation online. Data & Society Research Institute. Retrieved from https://datasociety.net/pubs/oh/DataAndSociety_MediaManipulationAndDisinformationOnline.pdf

Subramanian, S. (February 2017). Inside the Macedonian fake-news complex. WIRED. Retrieved from https://www.wired.com/2017/02/veles-macedonia-fake-news/

Swift, A. (2016). Americans’ trust in mass media sinks to new low. Gallup. Retrieved from http://www.gallup.com/poll/195542/americans-trust-mass-media-sinks-new-low.aspx

Interview with the Creators of Hoaxy® from Indiana University

This post was previously published on the AACE Review by Sandra Rogers.

Figure 1. A Hoaxy® diffusion network showing the spread of a misleading news article about the HPV vaccine via Twitter.

Falsehoods are spread due to biases in the brain, society, and computer algorithms (Ciampaglia & Menczer, 2018). A compounding problem is that “information overload and limited attention contribute to a degradation of the market’s discriminative power” (Qiu, Oliveira, Shirazi, Flammini, & Menczer, 2017). Falsehoods spread quickly in the US through social media because this has become Americans’ preferred way to read the news (59%) in the 21st century (Mitchell, Gottfried, Barthel, & Shearer, 2016). While a mature critical reader may recognize a hoax disguised as news, there are those who share it intentionally. A 2016 US poll revealed that 23% of American adults had shared misinformation unwittingly or on purpose; respondents reported high to moderate confidence in their ability to identify fake news, with only 15% not very confident (Barthel, Mitchell, & Holcomb, 2016).

What’s the big deal?

The Brookings Institution warned that organized disinformation campaigns are especially dangerous for democracy: “This information can distort election campaigns, affect public perceptions, or shape human emotions” (West, 2017). Hoaxes are being revealed through fact-checking sites such as FactCheck.org, Politifact.com, Snopes.com, and TruthorFiction.com. These have the potential to expose falsehoods and provide whatever corresponding truth lies in the details. For example, PolitiFact’s Truth-O-Meter is run by the editors of The Tampa Bay Times. This tool was so crucial for checking the veracity of candidates’ statements during the 2008 presidential campaign season that its staff won a Pulitzer Prize in 2009.

Hoaxy® (beta)

Hoaxy® takes it one step further and shows you who is spreading or debunking a hoax or disinformation on Twitter. It was developed by the Indiana University Network Science Institute and the Center for Complex Networks and Systems Research, with support from the Knight Prototype Fund and the Democracy Fund. The visuospatial interactive maps it produces are called diffusion networks and provide real-time data if you grant the program access to your Twitter account. Hoax purveyors be warned: it shows the actual Twitter user or bot promoting a claim, with low-credibility claims displayed in grey. Conversely, it displays in yellow the Twitter accounts fact-checking the claim.

Bots are identified by computer algorithms and given a score based on the science behind the Botometer, with red marking the most ‘Bot-Like’ accounts and blue the most ‘Human-Like.’ The website’s landing page provides trending news, popular claims, popular fact-checks, and a search box for queries. The site’s Dashboard shows a list of influential Twitter accounts and the number of tweets for those sharing claims or fact-checking articles, with the corresponding Botometer score.

Use Hoaxy® to find out who is at the center of a hoax by clicking a node to reveal the Twitter account. It’s also interesting to see who the outliers are and their six degrees of separation. Select a node, and it will provide the Botometer score and whether they quoted someone or someone quoted them (retweets) on Twitter. For example, a magician named Earl is the approximate epicenter for spreading misinformation claiming that the human papillomavirus (HPV) vaccine caused an increase in cervical cancer among Swedish girls. See the vaccine query visualized on Hoaxy in Figure 1. Based on the sharing of the article from Yournewswire.com, 1,665 accounts were spreading the claim and zero were debunking it on Twitter, as of 7/18/18. As for the facts, according to the Centers for Disease Control and Prevention (2018), HPV vaccines are safe and prevent cervical cancer.
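
To make the idea of a diffusion network concrete, the sketch below builds a tiny retweet graph with the networkx package and flags accounts whose bot scores exceed a threshold, loosely mirroring how Hoaxy pairs its maps with Botometer scores. The account names and scores are invented for illustration; this is not Hoaxy’s or Botometer’s actual code.

```python
# Toy diffusion network with bot-score flags (illustrative sketch).
# Assumes: pip install networkx. Accounts and scores are hypothetical.
import networkx as nx

# Directed edges: (retweeter, original_poster) for a single claim.
retweets = [
    ("@follower1", "@earl_the_magician"),
    ("@follower2", "@earl_the_magician"),
    ("@amplifier_bot", "@earl_the_magician"),
    ("@factchecker", "@amplifier_bot"),
]
bot_scores = {  # 0 = human-like, 1 = bot-like (hypothetical values)
    "@earl_the_magician": 0.2,
    "@follower1": 0.3,
    "@follower2": 0.4,
    "@amplifier_bot": 0.9,
    "@factchecker": 0.1,
}

G = nx.DiGraph()
G.add_edges_from(retweets)

# The "epicenter" is the account retweeted most often (highest in-degree).
epicenter = max(G.nodes, key=lambda n: G.in_degree(n))
print("Epicenter of the claim:", epicenter)

for account in G.nodes:
    flag = "bot-like" if bot_scores.get(account, 0) >= 0.5 else "human-like"
    print(f"{account}: shares received={G.in_degree(account)}, {flag}")
```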

It was a privilege to talk to the Hoaxy project coordinator, Dr. Giovanni Ciampaglia, on behalf of his co-coordinators Drs. Alessandro Flammini and Filippo Menczer:

What was the inspiration or tipping point to invent Hoaxy®?

We started Hoaxy because we could not find a good tool that would let us track the spread of misinformation on social media. The main inspiration was a project called Emergent (emergent.info), which was a really cool attempt at tracking rumors spreading through the news media. However, it was a completely manual effort by a group of journalists, and it was hard to scale to social media, where there are just so many stories published at once. So, we set out with the idea in mind of building a platform that would work in a completely automated fashion.

Since its creation in 2016, what are some of the overhauls that the Hoaxy® software program required for updates?

Hoaxy has evolved quite a bit since we first launched in 2016. The main overhaul was a complete redesign of its interface, during which we also integrated our social bot detection classifier called Botometer. In this way, Hoaxy can be used to understand the role and impact of social bots in the spread of both misinformation, and of low-credibility content in general.

What are some of the unexpected uses of Hoaxy?

We were not entirely expecting it when we first heard it, but several educators use Hoaxy in their classrooms to teach about social media literacy. This is of course really exciting for us because it shows the importance of teaching these skills and of using interactive, computational techniques for doing so.

What hoax is currently fact-checked the most?

Hoaxes are constantly changing, so it’s hard to keep track of what is a most fact-checked hoax. However, Hoaxy shows what fact-checks have been shared the most in the past 30 days, which gives you an idea of the type of hoaxes that are circulating on social media at any given time.

What’s the most absurd claim you encountered?

There are just too many… my favorite ones have to do with ancient prophecies and catastrophes (usually about asteroids and other astronomical objects).

Has Hoaxy® won any awards? (If not, what type of award categories does it fit in?)

It has not won an award (yet!). We are grateful however to the Knight Foundation Prototype Fund and to the Democracy Fund, who supported the work of integrating Botometer into Hoaxy.

I noted the work of Mihai Avram (an Indiana University graduate student) on Fakey, a gamified teaching app for discerning disinformation. Are you involved with overseeing that project as well?

Yes, Filippo is involved in it. Mihai has also worked on Hoaxy; in fact, without him, the current version of Hoaxy would have certainly not been possible!

What are some other resource projects your team is working on now?

Hoaxy is part of the Observatory on Social Media (osome.iuni.iu.edu), and we provide several other tools for open social media analytics (osome.iuni.iu.edu/tools). We are working on improving Hoaxy and making it operable with other tools. The ultimate goal would be to bring Hoaxy into the newsroom so that reporters can take advantage of it as part of their social media verification strategies.

What type of research is critically needed to better understand the spread of disinformation and its curtailing?

We definitely need to better understand the “demand” side of dis/misinformation — what makes people vulnerable to misinformation? The complex interplay between social, cognitive, and algorithmic vulnerabilities is not well understood at the moment. This will need a lot of investigation. We also need more collaboration between academia, industry, civil society, and policymakers. Platforms are starting to open up a little to partnering with outside researchers, and there will be much to learn on all sides.

Is there anything else you would like to share?

Yes! We are always happy to hear what users think about our tools, and how we can improve them. To contact us you can use email, Twitter, or our mailing list. More information here: http://osome.iuni.iu.edu/contact/

About Giovanni Ciampaglia

Dr. Ciampaglia is an assistant professor in the Department of Computer Science and Engineering at the University of South Florida. Previously, he was an assistant research scientist and postdoctoral fellow at the Indiana University Network Science Institute, where he collaborated on information diffusion with Drs. Menczer and Flammini and co-created Hoaxy. Prior to that, he was a research analyst contractor at the Wikimedia Foundation. He has a doctorate in Informatics from the University of Lugano in Switzerland. His research interest is in large-scale, collective, social phenomena on the Internet and other complex social phenomena such as the emergence of social norms.

References

Barthel, M., Mitchell, A., & Holcomb, J. (December 2016). Many Americans believe fake news is sowing confusion. Trusts, Facts, & Democracy. Pew Research Center Journalism and Media. Retrieved from http://www.journalism.org/2016/12/15/many-americans-believe-fake-news-is-sowing-confusion/

Ciampaglia, G. L., & Menczer, F. (June 2018). Misinformation and biases infect social media, both intentionally and accidentally. The Conversation. Retrieved from http://theconversation.com/misinformation-and-biases-infect-social-media-both-intentionally-and-accidentally-97148

Mitchell, A., Gottfried, J., Barthel, M., & Shearer, E. (July 2016). The modern news consumer. Trusts, Facts, & Democracy. Pew Research Center Journalism and Media. Retrieved from http://www.journalism.org/2016/07/07/the-modern-news-consumer/

Qiu, X., Oliveira, D., Shirazi, S., Flammini, A., & Menczer, F. (2017). Limited individual attention and online virality of low-quality information. Nature Human Behaviour, 1, 0132. doi:10.1038/s41562-017-0132

West, D. M. (2017). How to combat fake news and disinformation. Retrieved from https://www.brookings.edu/research/how-to-combat-fake-news-and-disinformation/?gclid=EAIaIQobChMI2eK5nbrA3AIVDgFpCh1k1wepEAAYASAAEgIL0PD_BwE

 

Join me at AECT 2019 in Las Vegas!

AECT 2019 ‘Inspired’ conference theme (the word Inspired against a purple splash of paint)

Association for Educational Communications and Technology

The Association for Educational Communications and Technology (AECT) is a fantastic professional organization for instructional designers, instructional technologists, educational technology support staff, instructors, and education researchers. Why? Because they do fun stuff like ‘Breakfast with Champions’ and ‘Game Night.’ I learned about it from the professors in my doctoral program, who promoted AECT and its educational technology standards to their students. AECT’s 2019 international convention will be held in Las Vegas, NV, from October 21st-25th at the Convention Center. This year’s convention theme is Inspired Professional Learning. Inspired Learning Professionals. Let me know if you plan to attend so we can network and attend sessions and events together.

Sessions

I’m excited to share that the following three presentations were accepted! I’m really happy to be able to lead an Inspire! session, a new format that provides 50 minutes of professional development without extra cost. I invite you to attend my sessions below.

Host: Design and Development (D&D) Division

Magis Instructional Design Model for Transformative Teaching, Dr. Rogers

Wed, Oct 23, 10:00 to 10:20am, Convention Center, Pavilion 6 (Note: I’m first in this concurrent session.)

Description. The Magis Instructional Design Model endeavors to transform online teaching through the lens of critical pedagogy, placing the learner in a real-world context as much as possible through learning experiences and reflection. The goal is transformative learning experiences instead of transmissive ones based on the antiquated banking model of education. The model includes instructional strategies from the cognitive and affective domains. The author asks for input and feedback on this model.

Host: D&D: Instructional Design in Context – Service

Roadmap to Reentry Resources in Mobile County to Prevent Recidivism Service Project, Dr. Sandra Rogers, Dr. Demetrius Semien, & Aubrey Whitten

Wed, Oct 23, 2:20 to 2:50pm, Convention Center, Ballroom C (Note: We go second in this session.)

Description. Would you like to start a service project? Consider creating a Google Map of service providers that meet a strong need in your community (food deserts, homeless shelters, or the previously incarcerated). Presenters will share their service project developing a reentry map of service providers to combat recidivism in their community. Learn to plot locations, draw pathways, and add information to a Google Map. Participants will also share what they are doing in their communities.
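
If you want to script part of that mapping work, one simple route is to generate a KML file that Google My Maps can import. The Python sketch below writes a few placemarks from hypothetical provider data; the names and coordinates are placeholders, not our actual map data.

```python
# Write a minimal KML file of service providers for import into Google My Maps.
# Provider names and coordinates below are hypothetical placeholders.
from xml.sax.saxutils import escape

providers = [
    ("Example Food Pantry", "Weekday food assistance", -88.0399, 30.6954),
    ("Example Job Center", "Resume and job-search help", -88.1000, 30.7000),
]

placemarks = "".join(
    f"""
  <Placemark>
    <name>{escape(name)}</name>
    <description>{escape(desc)}</description>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>"""
    for name, desc, lon, lat in providers
)

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <name>Reentry Service Providers (sample)</name>{placemarks}
  </Document>
</kml>"""

with open("providers.kml", "w", encoding="utf-8") as f:
    f.write(kml)
print("Wrote providers.kml; import it in Google My Maps via the Import option.")
```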

Host: Culture, Learning, and Technology (CLT) Inspire!

Safeguard Your Online Persona by Using Various Techniques and Technologies, Dr. Sandra Rogers

Oct 25, 9:00 to 9:50am, Convention Center, Conference Rm 1 (Note: Workshop format so bring your devices!)

Description. Have you googled yourself lately? What does the Internet search reveal about you? With each hashtag, blog post, tweet, and online project, you are building your online reputation whether you want to or not. In the absence of professional branding, your online persona brands you. Learn to curate your online personal data (e.g., Google Alert for keywords & reverse search images) and leave with an action plan.

Handouts

For AECT members, I’ll place my presentation and paper on the conference online portal. For my blog readers, I’ll post my presentations to SlideShare and then embed them here once they’re finalized. (Forthcoming!)

In closing, the sessions at AECT are really good. The organization’s special interest groups are dynamic. Conference-goers are very open to making new friends and learning, and this includes the big names in the field. You may find yourself sitting beside David Wiley, Curt Bonk, Lloyd Rieber, Amy Bradshaw, or George Veletsianos!

Breakfast with Champions guest, George Veletsianos

#aect19inspired

Curation of Your Online Persona Through Self-Care and Responsible Citizenship


I’m excited to announce that I finalized my first chapter for the K-12 book titled, Leveraging Technology to Improve School Safety and Student Wellbeing (forthcoming). My contribution to the edited book is titled, Curation of Your Online Persona Through Self-Care and Responsible Citizenship. It is written for secondary teachers and their students. It started as a few lesson plans for an interdisciplinary course at Spring Hill College (IDS 394: Wired) and grew into blog posts and eventually this chapter. See my previous blog post on the Recipe for Digital Curation of Your Online Persona and the one about the Global Interdisciplinary College Course.

ABSTRACT

With each blog post, tweet, and online project, Internet users are building their online reputation whether they want to or not. In the absence of professional branding, users’ online presence contributes vastly to what brands them. Through critical digital pedagogy, teachers and students question all technology practices (e.g., self, school, society). This chapter addresses the safety, security, and perception of their online data through self-determined prevention, weeding, and branding based on their short- and long-term goals. Methods, resources, and a lesson plan are provided as guidance to support students’ well-being pertaining to the online dimensions of their academic and personal lives. Strategies discussed include online identity system checks to review current digital footprint and data vulnerabilities, contemplation of technology usage in terms of self-care and responsible citizenship, and curation and development of their online persona. These participatory practices address two of the ISTE Standards for Students regarding digital citizenship.


The book’s release date is October 2019. Preorders are available now at IGI Global. There are many interesting chapters on school safety from different perspectives, including those of marginalized groups. If you’re interested in purchasing, let me know and I’ll provide you with a 40% discount coupon code.

I’ll present some of the curation strategies mentioned in the book at the Association of Educational Communications and Technology’s annual conference held in Las Vegas, NV this fall. My session is hosted by the Culture, Learning, and Technology special interest group in a new free workshop-style Inspire session on Friday, October 25th at 9:00-9:50 in the Convention Center, Room 1. It’s titled, Safeguard Your Online Persona by Using Various Techniques and Technologies. I’ve learned so much from taking a deep dive into this topic to write this chapter and look forward to sharing it with you.

Reference

Rogers, S. (in press). Curation of your online persona through self-care and responsible citizenship: Participatory digital citizenship for secondary education. In S. P. Huffman, S. Loyless, S. Albritton, & C. Green (Eds.), Leveraging Technology to Improve School Safety and Student Wellbeing. Hershey, PA: IGI Global. doi:10.4018/978-1-7998-1766-6


Sandra Annette Rogers, Ph.D.


Recipe for Digital Curation of Your Online Persona

Cartoon headshot of blogger, Sandra Rogers

Have you googled yourself lately? What does the Internet search reveal about you? With each hashtag, blog post, tweet, and online project, you’re building your online reputation whether you want to or not. In the absence of professional branding, your online persona brands you. Curation of our online personal data is more important than ever, because our online information and interactions are being used to analyze us for commercial benefit, credit ratings, job selection, relationships, health care decisions, harassment, law enforcement, and machine learning (Matsakis, 2019).

I’m putting together a few basic curation tasks in the ‘recipe’ below for a class lesson. Curation, of course, will take ongoing effort. These are simple actions to get you started.

Tag words from my blog

RECIPE

Curating Your Online Persona 

Time: Ready in minutes based on diversity of digital tools used and length of your digital footprint
Serves: Average technology users
Calories: 0

TIPS

  • Log out of all accounts to fully see information that you publicly shared.
  • Use passphrases that mix letters, numbers, and symbols for strong login credentials (e.g., @T!mBuk2B42Long), and create different ones for different types of accounts (see the generator sketch after this list).
  • Consider the long-term impact of posting or otherwise reacting online.
  • Subscribe to a technical news service that shares how to keep your data safe, such as Mashable, TechCrunch, or Wired.
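
For the passphrase tip above, here is a minimal Python sketch that assembles a random passphrase from words, digits, and symbols using the standard-library secrets module; the word list is a small hypothetical sample, and a longer list gives more entropy.

```python
# Generate a random passphrase with letters, numbers, and symbols (illustrative sketch).
# The word list is a small hypothetical sample; use a longer list in practice.
import secrets
import string

WORDS = ["timbuktu", "lantern", "cobalt", "meadow", "quartz", "harbor"]

def make_passphrase(n_words=3):
    """Join random words with random digits and symbols between them."""
    parts = [secrets.choice(WORDS).capitalize() for _ in range(n_words)]
    separators = [secrets.choice(string.digits + "!@#$%&*") for _ in range(n_words - 1)]
    passphrase = parts[0]
    for sep, word in zip(separators, parts[1:]):
        passphrase += sep + word
    return passphrase

print(make_passphrase())  # e.g., Cobalt7Meadow!Harbor
```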

INGREDIENTS

Benevolent Intention
Critical Thinking
Persistence
Relevant Safeguards

PREPARATION

  1. Search for your name on different Internet browsers (e.g., Google Chrome, Firefox, Internet Explorer, Safari). View at least the first three pages of results to find older accounts that you may have forgotten about and should close.
  2. Use Google’s reverse image search tool to see if your shared photos (e.g., headshot, Facebook profile, or wedding pictures) are used elsewhere without permission. For example, did you know that FB profile photos are publicly available? Anyone could be reusing or repurposing them.
  3. Set short and long-term goals based on your findings and personal insight.

CURATION

  1. Set up a Google Alert on your name to stay informed of its mentions on the Internet.
  2. Cleanse unprofessional social media posts. For example, use GoCardigan to remove retweets and likes on Twitter. Why? Twitter users can delete their own tweets but not their reactions to others.
  3. Close compromised or unused online accounts to safeguard your data and reduce your digital footprint. Review Wikipedia’s list of data breaches. Recheck the list periodically.
  4. Tighten the privacy settings on your social media accounts.

Please share your techniques and issues in safeguarding your online persona. I’ll continue to add to this post as I dive deeper into this topic and as new technologies surface.


Recommended Readings

Bates, C. (2018). Take charge of your online reputation. Educause. Retrieved from https://er.educause.edu/articles/2018/10/take-charge-of-your-online-reputation

Internet safety and cyber security awareness for college students. (n.d.). Retrieved from https://www.cyberdegrees.org/resources/internet-safety-for-college-students/

Matsakis, L. (2019). The Wired guide to personal data collection. Condé Nast. Retrieved from https://www.wired.com/story/wired-guide-personal-data-collection/


Sandra Annette Rogers, Ph.D.


What I’m doing to help combat disinformation online

A word cloud based on a blog about fake news detection resources.

I’ve spent a lot of time in the past two years reading and figuring out how to use technology and critical thinking to identify false information. I realized that I hadn’t posted anything on my personal blog about it. Instead, I’ve blogged about it on the academic site, the AACE Review. In Navigating Post-Truth Societies, I provided useful strategies, resources, and technologies. For example, if you’re still on Facebook, use the Official Media Bias/Fact Check Extension to determine the accuracy of posted articles. In my review of Data & Society’s Dead Reckoning, I summarized why it’s so difficult for humans and machine algorithms to defeat fake news. I also summarized Data & Society’s article on who’s manipulating the media and why. Recently, I interviewed the creators of Hoaxy to learn more about their social diffusion network that pinpoints claims posted on Twitter. Again, all of these are available on the AACE Review blog.

Additionally, I’ve been curating useful strategies and technologies that students can use to combat fake news on Scoop.it. The e-magazine is called The Critical Reader. This digital curation includes useful videos, articles, games, and technology tools for detecting biased or false information. For example, it describes how the Open Mind Chrome extension not only detects fake news but also suggests veritable articles instead. The target audience is high school and college students. Let me know if you would like to collaborate on this endeavor.

Lastly, I wrote my first chapter for an academic book on the curation of your online data, which includes strategies, technologies, and lessons on digital citizenship for secondary students. It’s titled Curation of Your Online Persona through Self-Care and Responsible Citizenship. It promotes benevolent intention and reflection in students’ online interactions through participatory practices, in the hope of avoiding the spread of misinformation and hate.


Sandra Annette Rogers, Ph.D.
