Curation of Your Online Persona Through Self-Care and Responsible Citizenship


I’m excited to announce that I finalized my first chapter for the K-12 book Leveraging Technology to Improve School Safety and Student Wellbeing (forthcoming). My contribution to the edited book is titled Curation of Your Online Persona Through Self-Care and Responsible Citizenship and is written for secondary teachers and their students. It started as a few lesson plans for an interdisciplinary course at Spring Hill College (IDS 394: Wired) and grew into blog posts and eventually this chapter. See my previous blog post on the Recipe for Digital Curation of Your Online Persona and the one about the Global Interdisciplinary College Course.

ABSTRACT

With each blog post, tweet, and online project, Internet users build their online reputations whether they want to or not. In the absence of professional branding, users’ online presence contributes vastly to what brands them. Through critical digital pedagogy, teachers and students question all technology practices (e.g., of self, school, and society). This chapter addresses the safety, security, and perception of students’ online data through self-determined prevention, weeding, and branding based on their short- and long-term goals. Methods, resources, and a lesson plan are provided as guidance to support students’ well-being in the online dimensions of their academic and personal lives. Strategies discussed include online identity system checks to review one’s current digital footprint and data vulnerabilities, contemplation of technology usage in terms of self-care and responsible citizenship, and curation and development of one’s online persona. These participatory practices address two of the ISTE Standards for Students regarding digital citizenship.


The book’s release date is October 2019. Preorders are available now at IGI Global. There are many interesting chapters on school safety from many different perspectives, including those of marginalized groups. If you’re interested in purchasing, let me know and I’ll provide you with a 40% discount coupon code.

I’ll present some of the curation strategies mentioned in the book at the Association for Educational Communications and Technology’s annual conference, held in Las Vegas, NV, this fall. My session is hosted by the Culture, Learning, and Technology special interest group in a new, free, workshop-style Inspire session on Friday, October 25th, from 9:00 to 9:50 a.m. in the Convention Center, Room 1. It’s titled Safeguard Your Online Persona by Using Various Techniques and Technologies. I’ve learned so much from taking a deep dive into this topic to write this chapter and look forward to sharing it with you.

Reference

Rogers, S. (in press). Curation of your online persona through self-care and responsible citizenship: Participatory digital citizenship for secondary education. In S. P. Huffman, S. Loyless, S. Albritton, & C. Green (Eds.), Leveraging Technology to Improve School Safety and Student Wellbeing. Hershey, PA: IGI Global. doi:10.4018/978-1-7998-1766-6


Sandra Annette Rogers, Ph.D.


A Rubric to Identify Online Course Plans for a Community of Inquiry

This blog was originally posted on the AACE Review (Rogers, 2018).

Community of Inquiry

A community of inquiry (COI) is what it sounds like: people gathered to learn from each other. I argue that a COI can be preplanned to engender a robust learning environment. What that entails is under investigation. For instance, a query of COI educational research on the EdTechLib database garnered 6,500 articles. “The ‘community’ in ‘community of inquiry’ is not defined by time or space. A common question, problem, or interest helps to forge the connection” (Shields, 1999, para. 2).

Historically, interdisciplinary scholarly communities date back to the time of Theagenes of Rhegium, who orally interpreted texts for pupils in the 6th century B.C.E. (Hornblower & Spawforth, 1998). Those ancient Greek gatherings were generally teacher-centered, with a unidirectional flow of information from the teacher to listening participants, until they eventually took on the Socratic method of shaping pupils’ understanding through questioning for critical thinking in the 3rd century B.C.E.

As for the American educational setting, the foundations of a COI can be found in John Dewey’s writings and reform efforts, which were influenced by Charles Sanders Peirce’s logic of inquiry for scientific methods and Jane Addams’ pragmatic approach to social analysis (Shields, 1999). For example, Dewey strongly believed that, through experience-based learning, students could intellectually address the subject matter with the assistance of their teachers (Dewey, 1938).

Fast-forward to computer-mediated instruction: Garrison, Anderson, and Archer (2000) proposed a COI framework for distance education. It includes three elements they deem essential: social presence (SP), cognitive presence (CP), and teaching presence (TP). According to Google Scholar, their COI framework has been cited 4,817 times. Based on their research and the related literature, my interpretation of the COI presences is as follows:

  • SP is the co-construction of meaning through shared learning experiences to engender student agency from connectedness.
  • CP is the engagement in learning activities that demand higher-order thinking skills.
  • TP refers to instruction and feedback, which can be provided through instructor-led or student-led activities.

Online Community of Inquiry Syllabus Rubric©

The online course syllabus serves as a plan of action that course design collaborators (i.e., instructional designers, course developers, instructors) can use to discuss continuous improvement. To that end, I developed a rubric to evaluate online instructors’ planned interactions for delivering computer-mediated instruction based on their syllabi. It is used to analyze proposed interaction treatments (ITs), such as student-student opportunities for discussion, not the actual course. Our purpose was to determine the inclusion and strength of ITs in order to provide instructional design (ID) feedback to online instructors regarding their course plans. The underlying theoretical premise is that the more interactive the course, the higher the level of student satisfaction and course achievement. Cummings, Bonk, and Jacobs (2002) conducted a similar syllabi study that looked at the formats and levels of communication in online courses from colleges of education.

The rubric’s purpose is to provide a pragmatic solution to prevent problematic teacher-led (passive-knowledge) online courses with little student interaction or rigorous academic challenge. The Online Community of Inquiry Syllabus Rubric© is based on general concepts from Garrison, Anderson, and Archer’s (2000) COI framework; quality distance education rubrics (California State University-Chico, 2009; Johnson, 2007; Quality Matters™, 2014; Roblyer & Wiencke, 2004); and the relevant literature. It consists of the following categories: ID for CP, technology tools for COI, COI loop for SP, support for learner characteristics, and instruction and feedback for TP. The 5-point rubric has the following scale for the criteria: low, basic, moderate, above average, and exemplary. The points awarded determine the course’s potential level of engendering an online COI (i.e., low, moderate, or high). See rubric.
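To make the scoring mechanics concrete, here is a minimal sketch in Python of how a reviewer might tally category scores into an overall COI potential level. The five category names come from the rubric itself, but the point cutoffs for the low/moderate/high levels are illustrative assumptions for this sketch, not the rubric’s published thresholds.

```python
# Hypothetical tally for the Online Community of Inquiry Syllabus Rubric.
# Category names are from the rubric; the low/moderate/high cutoffs below
# are illustrative assumptions, not the published thresholds.

CATEGORIES = [
    "ID for cognitive presence",
    "Technology tools for COI",
    "COI loop for social presence",
    "Support for learner characteristics",
    "Instruction and feedback for teaching presence",
]

SCALE = {1: "low", 2: "basic", 3: "moderate", 4: "above average", 5: "exemplary"}

def coi_potential(scores: dict[str, int]) -> str:
    """Sum the five category scores (1-5 each) and bucket the total."""
    if set(scores) != set(CATEGORIES):
        raise ValueError("Score each rubric category exactly once.")
    if any(s not in SCALE for s in scores.values()):
        raise ValueError("Each score must be an integer from 1 to 5.")
    total = sum(scores.values())   # 5-25 points possible
    if total <= 11:                # assumed cutoff
        return "low"
    if total <= 18:                # assumed cutoff
        return "moderate"
    return "high"

example = {c: 4 for c in CATEGORIES}   # 'above average' in every category
print(coi_potential(example))          # -> 'high'
```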

Content Analysis Research of Online Course Syllabi

Rogers and Van Haneghan (2016) conducted the initial research utilizing the rubric with two raters. Good interrater reliability was obtained in the review of 23 undergraduate and graduate education online course syllabi, intraclass correlation coefficient (ICC) = .754, p < .001, 95% CI [.514, .892]. Results indicated the potential for above-average CP (M = 4.7); however, SP (M = 3.1) was moderate, and TP (M = 2.7) was basic. Rogers and Khoury (2018) replicated the study at a different institution, across disciplines, with 31 syllabi; those findings mirrored the previous study’s levels of COI presences, indicating a weakness in TP. For action research, the rubric criteria and results can serve as talking points between instructional designers and course developers to address gaps. Table 1 provides common ID feedback based on our 2018 syllabi analysis.
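For readers who want to run a similar interrater reliability check on their own rubric data, here is a minimal sketch using the open-source pingouin package in Python. The ratings below are made-up placeholders for a two-rater, 23-syllabus design like ours; the study’s actual ratings are not reproduced in this post.

```python
import numpy as np
import pandas as pd
import pingouin as pg  # pip install pingouin

# Placeholder ratings: 23 syllabi scored 1-5 by two raters (illustrative only;
# the actual ratings from the study are not published in this post).
rng = np.random.default_rng(7)
rater1 = rng.integers(2, 6, size=23)
rater2 = np.clip(rater1 + rng.integers(-1, 2, size=23), 1, 5)

df = pd.DataFrame({
    "syllabus": np.tile(np.arange(23), 2),
    "rater": np.repeat(["rater1", "rater2"], 23),
    "score": np.concatenate([rater1, rater2]),
})

# ICC2 (two-way random effects, absolute agreement, single rater) is a
# common choice for a fully crossed design like this one.
icc = pg.intraclass_corr(data=df, targets="syllabus",
                         raters="rater", ratings="score")
print(icc.set_index("Type").loc[["ICC2"], ["ICC", "pval", "CI95%"]])
```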

Table 1

Common Feedback Based on the Online Community of Inquiry Syllabus Rubric Analysis

Instructional Design for Cognitive Presence
  • Include higher-order thinking activities, such as case analyses and papers that require synthesis or evaluation of peers, self, and/or products. See the list of cognitive activities in the Online Course Design Guide in Table 3.

Education Technology for COI
  • Add group work for collaborating on projects with Google Hangouts or Skype so that students can learn from each other.
  • Use Schoology’s Media Album for students to share their projects and obtain peer feedback. For example, students could narrate a PowerPoint project and save it as an MP4 to create a video presentation to add to a digital portfolio.

COI Loop for Social Presence
  • Provide a rubric for discussions to make the criteria clear.
  • Provide discussions on readings to enhance learning from each other.

Support for Learner Characteristics
  • Add the College’s accommodation statement.
  • Provide links to academic tutoring services.
  • Provide strategies for remediation and/or resources for building background knowledge.

Instruction and Feedback for Teaching Presence
  • Add specific online virtual office hours and format options. For example, use Skype, Google Hangouts, or FaceTime with your smartphone for human interaction.
  • Describe direct instruction. Will there be narrated PowerPoints, audio summaries, lecture notes, or a commercial program?
  • Add information on feedback response time and format.

References

Cummings, J. A., Bonk, C. J., & Jacobs, F. (2002). Twenty-first century college syllabi: Options for online communication and interactivity. The Internet and Higher Education, 5(1), 1-19.

Dewey, J. (1938). Experience and education. The Kappa Delta Pi Lecture Series. New York, NY: Collier Books.

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105. doi:10.1016/s1096-7516(00)00016-6

Hornblower, S., & Spawforth, A. (1998). The Oxford companion to classical civilization. New York, NY: Oxford University Press.

Johnson, E. S. (2007). Promoting learner-learner interactions through ecological assessments of the online environment. Journal of Online Learning and Teaching, 3(2). Retrieved from http://jolt.merlot.org/vol3no2/johnson.htm

QM Higher Education Rubric Fifth Edition. (2014). Quality Matters. Retrieved from https://www.qualitymatters.org/sites/default/files/PDFs/StandardsfromtheQMHigherEducationRubric.pdf

Roblyer, M., & Wiencke, W. (2004). Exploring the interaction equation: Validating a rubric to assess and encourage interaction in distance courses. Journal of Asynchronous Learning Networks, 8(4).

Rogers, S., & Khoury, S. (2018, October). Rubric to evaluate online course syllabi plans for engendering a community of inquiry: Round II. Paper presented at the meeting of the Association for Educational Communications & Technology, Kansas City, MO.

Rogers, S., & Van Haneghan, J. (2016). Rubric to evaluate online course syllabi plans for engendering a community of inquiry. Proceedings of Society for Information Technology & Teacher Education International Conference, 349-357. Chesapeake, VA: AACE.

Rubric for Online Instruction. (2009). Center for Excellence in Learning and Teaching. California State University-Chico. Retrieved from http://www.csuchico.edu/tlp/resources/rubric/rubric.pdf

Shields, P. M. (1999). The community of inquiry: Insights for public administration from Jane Addams, John Dewey and Charles S. Peirce. Archives of the Digital Collections at Texas State University. Retrieved from https://digital.library.txstate.edu/bitstream/handle/10877/3979/fulltext.pdf?sequence=1&isAllowed=y


Sandra Annette Rogers, Ph.D.


Join me at AERA 2019 in Toronto

Sandra Rogers standing near AERA conference sign celebrating 100 years

I’ll be attending my second conference of the American Educational Research Association (#AERA19) this year. The theme is ‘Leveraging Education Research in a Post-Truth Era: Multimodal Narratives to Democratize Evidence.’ It will be held in Toronto, Canada, from April 5-9 at the Metro Toronto Convention Centre. I was impressed with last year’s conference but a bit overwhelmed. Hopefully, with the help of their conference app, I’ll find the sessions I need.

View this link to see the poster for Dr. Khoury’s and my session: Rubric to Analyze Online Course Syllabi Plan for Engendering a Community of Inquiry: Round II. Come join me on Saturday morning, April 6, from 8:00 to 9:30 a.m. in the Metro Toronto Convention Centre, 300 Level, Hall C. It’s hosted by Division C – Section 3b: Technology-Based Environments, in the subunit for Distance and Online Education. I’ll be sharing copies of my Online Community of Inquiry Syllabus Rubric.

I’ve shared our research paper in the AERA Online Repository. Read this blog page to learn more about our study. My hope is that the study will be replicated to further validate the rubric and improve not only instructors’ syllabi but also teaching and learning in distance education. Let me know if you’re interested in replicating our study.

Are you going to AERA? Let’s connect in Toronto!

Sandra Annette Rogers, Ph.D.


Join me at AECT in Kansas City, MO!

Photo of Sandra Annette Rogers
Say hello if you see me.

The Association for Educational Communications & Technology (AECT) is, in my humble opinion, the premier association for instructional designers. Throughout my doctoral studies, my professors promoted this professional organization and its educational technology standards to their students. I finally attended the AECT conference last year and was blown away by the professionalism of everyone I met and how cordial they were to newcomers. This year, the 2018 conference will be held in Kansas City, MO, from October 23-27 at the Kansas City Marriott Downtown. I’ll be there, so let me know if you plan to attend. For AECT members, I placed my slides and research paper on the new conference online portal.

This time around, I’ll be presenting my latest research and giving a workshop on the Online Community of Inquiry Syllabus Rubric that I co-developed with Dr. Van Haneghan. It serves as a great collaboration tool for providing feedback to instructors and for syllabi content analysis in action research. Here’s my schedule:

Wed, Oct 24, 9:00am to 12:00pm, Marriott, Room-Bennie Morten B

Use of Online Community of Inquiry Syllabus Rubric for Course Developers and Collaborators, Drs. Rogers & Khalsa

Workshop – Registration Required
The syllabus serves as an action plan, which can be used as a resource for collaboration with instructional designers. In this session, participants will discuss how the Online Community of Inquiry Syllabus Rubric© (Rogers & Van Haneghan, 2016) can be used to pinpoint course development discussions on cognitive, social, and teaching presence for distance education instructors. Research and development of the rubric, a worked sample, commonly shared feedback, and rubric rater training will be provided.


Division of Distance Learning

Thu, Oct 25, 9:40 to 10:05am, Marriott, Room-Julia Lee A

Rubric to Evaluate Online Course Syllabi Plans for Engendering a Community of Inquiry: Round II, Drs. Rogers & Khoury

We replicated a research study that analyzed online course syllabi with the Online Community of Inquiry (COI) Syllabus Rubric© (Rogers & Van Haneghan, 2016). The rubric consists of the following elements: instructional design for cognitive presence, technology tools for COI, COI loop for social presence, support for learner characteristics, and instruction and feedback for teaching presence. We reviewed 31 syllabi across disciplines and found above average cognitive presence, average social presence, and basic teaching presence.

#AECT2018 #elearning #communityofinquiry #edtech

An Observer’s Notes on the Socratic Method in Action

Socrates talking to a man who is eagerly listening at his side.
Image source: Wikimedia
Here are my notes from the dialectic dialogue of the Socratic Seminar: An International Forum on Socratic Teaching, held at the Association for Educational Communications and Technology (AECT) conference in Jacksonville, Florida, in 2017. I attended to learn more about the #Socratic method in general, but also to learn how to apply it to the academic task of advising doctoral students’ dissertation writing. This is what occurred in a simulated environment with a doctoral student, her advisor, and a panel of experts. It was the most amazing thing I’ve ever seen offered at a conference, and far too few people saw it, as the panel outnumbered the attendees. I took notes for future reference and also to share with the student who was the focus of this activity.
 
Introduction by Adviser, Dr. Abbas Johari: “This is a respectful dialogue between master and student….An example would be guided questions for the learner…Panelists should not make a statement but bring her to an understanding of a concept via questioning.”
Topic of Dissertation: The student, Cheng Miaoting, gave a brief overview of her dissertation, titled Technology Acceptance of LMS in Postsecondary Schools in Hong Kong.
Methodology: The student used survey and interview methods to address several variables (e.g., SES, environment, context) based on the technology acceptance model (TAM 3).
Panel’s Questions: Each expert asked the student a question while she listened. I was not always able to attribute who said what as I feverishly took notes; please excuse the missing attributions. See the link below for panelists’ names.
  1. What is the problem? Tech or culture?
  2. What are you expecting to find? Recommendation for action? The assumption is __________.
  3. What are the assumptions underlying acceptance? Why is this good? Response to facilitate learning?
  4. Which theory will you use and why?
  5. Which variables affect learning?
Dr. Michael Thomas’ statement: “Tool has no agenda as in gun law. Is it possible to argue if a bad thing?” He recommended seeing Technological Sublime (aka Machine Messiah).
Dr. Amy Bradshaw’s statement: “What is modernity with Chinese characteristics?” She described a deficit ideology in which X fixes them, where X is technology: mainland Chinese are framed as needing a fix, and the solution is technology.
Adviser’s Guidance to Student: He told his student to address the master’s guidance by asking questions or by paraphrasing what she had learned. She had a question about the term ‘factors’ in research.
Panel Questions continued:
6. What type of psychological adaptation will you use? Acculturation Framework? Cat mentioned Hofstede’s, but the panel discouraged it based on its hostility and stereotypical frame.
7. Fundamentally, what is the burning question you want to answer? The human question—why you want to do it. Solve one problem at a time.
8. How do things change in society? Need theory on societal change.
9. Why are immigrants coming to HK?
10. What are schools doing to address this? (Here is where you addressed the practical significance or human question, which was the missing piece of training for technology.)
11. Have you looked at other countries tech adaption for immigrants?
Adviser called for Debrief: The student acknowledged the need to focus study and reflect. She will reach out to other researchers to negotiate understanding, as was done today. She will talk in practical terms and not just in research methodology.
Panel Debriefed with Suggestions: 
  • Free yourself, but 1-directional.
  • What is the one thing they do not want you to talk about? That is your research question.
  • Focus on commonality and not just differences.
  • Find ways to hear immigrants to inform study.
  • Remember the humane as well as the human.
  • Have an open mind in research design—always question research design.
  • Look at the polarity of human existence. What is up/down? In/out? What is not there? What’s obvious? Hidden? Who implemented these types of change?
  • Listen to your adviser.
  • See work by Charles Reigeluth and Carl Rogers.

Here is a link to the #AECT conference abstract and list of panel members.


Sandra Annette Rogers, Ph.D.


Bibliography on Active Learning

A word cloud based on my blog post on active learning, shaped like a Rubik’s cube, with these main words: learning, students, course, provide, can, and UDL.

Want to learn more about active learning? Check out this reading list. I prepared this bibliography last year in preparation for my Fulbright application to Norway for an active learning research project. It includes some Norwegian research on the topic. I didn’t get that postdoctoral Fulbright but will try again next year for something else. Preparing the application took a lot of time, and my references and potential host institution were so helpful in the process. Special thanks to Dr. Rob Gray for serving as an intermediary in the application process! You can read about his work below. If you have any seminal articles on active learning, please leave the citation in the comments section for inclusion. See my blog post for designing active learning in distance education.

Bibliography

Astin, A. W., & Antonio, A. L. (2012). Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education (2nd ed.). New York, NY: Rowman & Littlefield Publishers, Inc.

Baird, J-A., Hopfenbeck, T. N., Newton, P., Stobart, G., & Steen-Utheim, A. T. (2014). Assessment and learning: State of the field review, 13/4697. Oslo, Norway: Knowledge Center for Education. Retrieved from http://taloe.up.pt/wp-content/uploads/2013/11/FINALMASTER2July14Bairdetal2014AssessmentandLearning.pdf

Banta, T. W., & Palomba, C. A. (2015). Assessment essentials: Planning, implementing, and improving assessment in higher education (2nd ed.). San Francisco, CA: Jossey-Bass.

Barkley, E. F., & Major, C. H. (2016). Learning assessment techniques: A handbook for college faculty. San Francisco, CA: Jossey-Bass.

Biggs, J., & Tang, C. (2007). Teaching for quality learning at university: What the student does (3rd ed.). Maidenhead, Berkshire: Open University Press.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5, 7-74. doi:10.1080/0969595980050102

Brookhart, S. M. (2007). Expanding views about formative classroom assessment: A review of the literature. In J. H. McMillan (Ed.), Formative classroom assessment: Theory into practice, 43-62. New York, NY: Teachers College Press.

Chickering, A. W., & Gamson, Z. F. (1991). Applying the seven principles for good practice in undergraduate education. New Directions for Teaching and Learning, 47. San Francisco, CA: Jossey-Bass.

Deci, E., & Ryan, R. M. (2014). Intrinsic motivation and self-determination in human behavior. Berlin: Springer.

Dysthe, O., Engelsen, K. S., & Lima, I. (2007). Variations in portfolio assessment in higher education: Discussion of quality issues based on a Norwegian survey across institutions and disciplines. Assessing Writing, 12(2), 129-148. doi:10.1016/j.asw.2007.10.002

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. PNAS, 111(23), 8410-8415. doi:10.1073/pnas.1319030111

Gagné, R. M. (1985). The conditions of learning. New York, NY: Holt, Rinehart, & Winston.

Gray, R., & Nerheim, M. S. (2017). Teaching and learning in the digital age: Online tools and assessment practices, P48. Norgesuniversitetet: University of Bergen. Retrieved from https://norgesuniversitetet.no/prosjekt/teaching-and-learning-in-the-digital-age

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112. doi:10.3102/003465430298487

Hopfenbeck, T. N., & Stobart, G. (2015). Large-scale implementation of assessment for learning. Assessment in Education: Principles, Policy & Practice, 22(1), 1-2. doi:10.1080/0969594X.2014.1001566

Johnson, D. W., Johnson, R., & Smith, K. (2006). Active learning: Cooperation in the university classroom (3rd ed.). Edina, MN: Interaction Book Company.

Klenowski, V. (2009). Assessment for learning revisited: An Asia-Pacific perspective. Assessment in Education: Principles, Policy & Practice, 16(3), 263-268. doi:10.1080/09695940903319646

National Dropout Prevention Center/Network. (2009). 15 effective strategies for dropout prevention. NDPC: Clemson University. Retrieved from http://dropoutprevention.org/wp-content/uploads/2015/03/NDPCN15EffectiveStrategies.pdf

Norwegian Ministry of Education and Research. (2017). Quality culture in higher education, Meld. St. 16. Retrieved from https://www.regjeringen.no/no/dokumenter/meld.-st.-16-20162017/id2536007/

Nusche, D., Earl, L., Maxwell, W., & Shewbridge, C. (2011). OECD reviews of evaluation and assessment in education: Norway. Organisation for Economic Co-operation and Development. Retrieved from https://www.oecd.org/norway/48632032.pdf

Rogers, E. (2003). Diffusion of innovations (5th ed.). New York, NY: Simon and Schuster.

Thum, Y. M., Tarasawa, B., Hegedus, A., You, X., & Bowe, B. (2015). Keeping learning on track: A case-study of formative assessment practice and its impact on learning in Meridian School District. Portland, OR: Northwest Evaluation Association. Retrieved from http://files.eric.ed.gov/fulltext/ED567844.pdf

Wiliam, D. (2007). Keeping learning on track: Formative assessment and the regulation of learning. In F. K. Lester, Jr. (Ed.), Second handbook of mathematics teaching and learning (pp. 1053–1098). Greenwich, CT: Information Age Publishing.


Sandra Annette Rogers, Ph.D.


Use Gwet’s AC1 instead of Cohen’s Kappa for Inter-rater Reliability

Last year, I attended a lecture by my former assessment and measurement professor, Dr. Van Haneghan, at the University of South Alabama. He addressed the paradox of using Cohen’s Kappa (k) for inter-rater reliability and acknowledged that it was identified in the literature two decades ago but has been largely overlooked. The problem is that Cohen’s k skews data when there is high agreement between raters or an imbalance in the margins of the data tables (Cicchetti & Feinstein, 1990; Gwet, 2008). This is contradictory to the statistical technique’s purpose, as researchers want to obtain an accurate degree of agreement. So, if you’ve ever used Cohen’s Kappa for inter-rater reliability in your research studies, I recommend recalculating it with Gwet’s first-order agreement coefficient (AC1).

I decided to rerun the stats for my research study, in which two raters analyzed the content of 23 online syllabi with the Online Community of Inquiry Syllabus Rubric, for my presentation at AERA. AgreeStat was used to obtain Cohen’s k and Gwet’s AC1 to determine inter-rater reliability per category. Tables 1A-B show how the k statistic was affected by high agreement in the category of instructional design (ID) for cognitive presence (CP), while Gwet’s AC1 was not. Overall, Gwet’s AC1 values ranged from .102 to .675 (Mean SD = .135 ± .128). Interrater reliability for scoring this category was good according to Altman’s (1991) benchmark, Gwet’s AC1 = .675, p < .001, 95% CI [.408, .943].

Table 1A

Distribution of Scores by Rater and Category (Instructional Design for Cognitive Presence)

                   Rater 2 (CP)
Rater 1 (CP)    3         4        5         Missing   Total
3               0         0        0         0         0  [0%]
4               1         2        1         0         4  [17.4%]
5               4         0        15        0         19 [82.6%]
Missing         0         0        0         0         0  [0%]
Total           5         2        16        0         23 [100%]
                [21.7%]   [8.7%]   [69.6%]   [0%]      [100%]
 

Table 1B

 

Inter-rater Coefficients and Associated Parameters for ID for CP

METHOD                 Coeff.     StdErr      95% C.I.           p-Value
Cohen’s Kappa          0.36406    0.172287    0.007 to 0.721     4.617E-02
Gwet’s AC1             0.67549    0.128882    0.408 to 0.943     2.944E-05
Scott’s Pi             0.33494    0.195401    -0.07 to 0.74      1.006E-01
Krippendorff’s Alpha   0.34940    0.195401    -0.056 to 0.755    8.754E-02
Brennan-Prediger       0.60870    0.140428    0.317 to 0.9       2.664E-04
Percent Agreement      0.73913    0.093618    0.545 to 0.933     7.344E-08

Note. Unweighted agreement coefficients (Coeff.). StdErr = standard error (the standard deviation of the estimate). CI = confidence interval.

Gwet’s AgreeStat, Version 2015.6.1 (Advanced Analytics, Gaithersburg, MD, USA) currently costs $40. It’s fairly easy to use. See Kilem Gwet’s blog to learn more.
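If you’d rather verify the numbers without AgreeStat, here is a short Python sketch of the standard unweighted formulas for percent agreement, Cohen’s kappa, and Gwet’s AC1 (Gwet, 2008). Run against the counts in Table 1A, it reproduces the corresponding coefficients in Table 1B and shows why kappa drops when one score dominates.

```python
import numpy as np

# Joint frequencies from Table 1A (rows: Rater 1; columns: Rater 2; scores 3-5).
counts = np.array([
    [0, 0,  0],   # Rater 1 scored 3
    [1, 2,  1],   # Rater 1 scored 4
    [4, 0, 15],   # Rater 1 scored 5
], dtype=float)

n = counts.sum()        # 23 syllabi
p = counts / n          # joint proportions
po = np.trace(p)        # observed agreement (diagonal)

# Cohen's kappa: chance agreement from the product of each rater's marginals,
# which inflates when one category (here, a score of 5) dominates.
row, col = p.sum(axis=1), p.sum(axis=0)
pe_kappa = (row * col).sum()
kappa = (po - pe_kappa) / (1 - pe_kappa)

# Gwet's AC1: chance agreement from the average marginal per category,
# which stays stable under highly skewed marginals.
q = counts.shape[0]              # number of categories
pi = (row + col) / 2
pe_ac1 = (pi * (1 - pi)).sum() / (q - 1)
ac1 = (po - pe_ac1) / (1 - pe_ac1)

print(f"Percent agreement: {po:.5f}")     # 0.73913
print(f"Cohen's kappa:     {kappa:.5f}")  # 0.36406
print(f"Gwet's AC1:        {ac1:.5f}")    # 0.67549
```

Both coefficients start from the same observed agreement (.739); they differ only in how they estimate chance agreement, which is exactly where the paradox lives.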

#AgreeStat #GwetAC1 #CohenKappa #InterraterReliability

References

Altman, D. G. (1991). Practical statistics for medical research. London: Chapman and Hall.

Cicchetti, D. V., & Feinstein, A. R. (1990). High agreement but low kappa: II. Resolving the paradoxes. Journal of Clinical Epidemiology, 43(6), 551-558. doi:10.1016/0895-4356(90)90159-m

Gwet, K. L. (2008). Computing inter-rater reliability and its variance in the presence of high agreement. British Journal of Mathematical and Statistical Psychology, 61(1), 29-48. doi:10.1348/000711006X126600