An observer’s notes on the Socratic method in action

Socrates talking to a man who is eagerly listening at his side.
Image source: Wikimedia
Here are my notes from the dialectic dialogue of the Socratic Seminar: An International Forum on Socratic Teaching, held at the Association for Educational Communications and Technology (AECT) conference in Jacksonville, Florida, in 2017. I attended to learn more about the #Socratic method in general, but also to learn how to apply it to the academic task of advising doctoral students’ dissertation writing. What follows occurred in a simulated environment with a doctoral student, her adviser, and a panel of experts. It was the most amazing thing I’ve ever seen offered at a conference, yet far too few people saw it; the panel outnumbered the attendees. I took notes for future reference and also to share with the student who was the focus of this activity.
 
Introduction by Adviser, Dr. Abbas Johari: “This is a respectful dialogue between master and student…. An example would be guided questions for the learner…. Panelists should not make a statement but bring her to an understanding of a concept via questioning.”
Topic of Dissertation: The student, Cheng Miaoting, gave a brief overview of her dissertation, titled Technology Acceptance of LMS in Postsecondary Schools in Hong Kong.
Methodology: The student used survey and interview methods to address several variables (e.g., SES, environment, context) based on the technology acceptance model (TAM 3).
Panel’s Questions: Each expert asked the student a question while she listened. I was not always able to attribute who said what as I feverishly took notes, so please forgive the missing attributions. See the link below for the panelists’ names.
  1. What is the problem? Tech or culture?
  2. What are you expecting to find? Recommendation for action? The assumption is __________.
  3. What are the assumptions underlying acceptance? Why is this good? Response to facilitate learning?
  4. Which theory will you use and why?
  5. Which variables affect learning?
Dr. Michael Thomas’ statement: “Tool has no agenda as in gun law. Is it possible to argue if a bad thing?” He recommended seeing Technological Sublime (aka Machine Messiah).
Dr. Amy Bradshaw’s statement: “What is modernity with Chinese characteristics?” She raised the issue of deficit ideology, where X fixes them: X is technology, mainland Chinese are framed as needing a fix, and the solution is technology.
Adviser’s Guidance to Student: He told his student to address the master’s guidance by asking follow-up questions or by paraphrasing what she had learned. She had a question about the term ‘factors’ in research.
Panel Questions continued:
6. What type of psychological adaptation will you use? Acculturation Framework? Cat mentioned Hofstede’s, but the panel discouraged it based on its hostility and stereotypical framing.
7. Fundamentally, what is the burning question you want to answer? The human question—why you want to do it. Solve one problem at a time.
8. How do things change in society? Need theory on societal change.
9. Why are immigrants coming to HK?
10. What are schools doing to address this? (Here is where you addressed the practical significance or human question, which was the missing piece of training for technology.)
11. Have you looked at other countries’ tech adoption for immigrants?
Adviser called for Debrief: The student acknowledged the need to focus her study and to reflect. She will reach out to other researchers to negotiate understanding, as was done today. She will talk in practical terms and not just in research methodology.
Panel Debriefed with Suggestions: 
  • Free yourself, but 1-directional.
  • What is the one thing they do not want you to talk about? That is your research question.
  • Focus on commonality and not just differences.
  • Find ways to hear immigrants to inform study.
  • Remember the humane as well as the human.
  • Have an open mind in research design—always question research design.
  • Look at the polarity of human existence. What is up/down? In/out? What is not there? What’s obvious? Hidden? Who implemented these types of change?
  • Listen to your adviser.
  • See work by Charles Reigeluth and Carl Rogers.

Here is a link to the #AECT conference abstract and list of panel members.

My TeachersPayTeachers Best-Selling Products

Green katydid eating pollen off of a pink zinnia
TPT Freebie Photo in Teacherrogers Store

TeachersPayTeachers (TPT) is holding its back-to-school sale this week, with up to 25% off using the #BTSFRESH code. I thought it would be good to share a list of my six best-selling TPT products with you. They are as follows (some are aligned with the Common Core State Standards):



TPT is an open marketplace for teachers to sell their self-produced (teacher-authored) material. To learn more about TPT, see my blog page.

Sandra Annette Rogers

Teacherrogers’ store is on #TPT

Bibliography on Active Learning

Want to learn more about active learning? Check out this reading list. In preparation for my Fulbright application to Norway for an active learning research project, I prepared this bibliography last year. It includes some Norwegian research on the topic. I didn’t get that postdoctoral Fulbright but will try again next year for something else. Preparing the application took a lot of time, and my references and potential hosting institution were so helpful in the process. Special thanks to Dr. Rob Gray for serving as an intermediary in the application process! You can read about his work below. If you have any seminal articles on active learning, please leave the citation in the comments section for inclusion. #activelearning

Bibliography

Astin, A. W., & Antonio, A. L. (2012). Assessment for excellence: The philosophy and practice of assessment (2nd ed.). New York, NY: Rowman & Littlefield Publishers, Inc.

Baird, J-A., Hopfenbeck, T. N., Newton, P., Stobart, G., & Steen-Utheim, A. T. (2014). Assessment and learning: State of the field review, 13/4697. Oslo, Norway: Knowledge Center for Education. Retrieved from http://taloe.up.pt/wp-content/uploads/2013/11/FINALMASTER2July14Bairdetal2014AssessmentandLearning.pdf

Banta, T. W., & Palomba, C. A. (2015). Assessment essentials: Planning, implementing, and improving assessment in higher education (2nd ed.). San Francisco, CA: Jossey-Bass.

Barkley, E. F., & Major, C. H. (2016). Learning assessment techniques: A handbook for college faculty. San Francisco, CA: Jossey-Bass.

Biggs, J., & Tang, C. (2007). Teaching for quality learning at university: What the student does (3rd ed.). Maidenhead, Berkshire: Open University Press.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5, 7-74. doi:10.1080/0969595980050102

Brookhart, S. M. (2007). Expanding views about formative classroom assessment: A review of the literature. In J. H. McMillan (Ed.), Formative classroom assessment: Theory into practice (pp. 43-62). New York, NY: Teachers College Press.

Chickering, A. W., & Gamson, Z. F. (1991). Applying the seven principles for good practice in undergraduate education. New Directions for Teaching and Learning, 47. San Francisco, CA: Jossey-Bass.

Deci, E., & Ryan, R. M. (2014). Intrinsic motivation and self-determination in human behavior. Berlin: Springer.

Dysthe, O., Englesen, K. S., & Lima, I. (2007). Variations in portfolio assessment in higher education: Discussion of quality issues based on Norwegian survey across institutions and disciplines. Assessing Writing, 12(2), 129-148. doi:10.1016/j.asw.2007.10.002

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. PNAS, 111(23), 8410-8415. doi:10.1073/pnas.1319030111

Gagné, R. M. (1985). The conditions of learning. New York, NY: Holt, Rinehart, & Winston.

Gray, R., & Nerheim, M. S. (2017). Teaching and learning in the digital age: Online tools and assessment practices, P48. Norgesuniversitetet: University of Bergen. Retrieved from https://norgesuniversitetet.no/prosjekt/teaching-and-learning-in-the-digital-age

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112. doi:10.3102/003465430298487

Hopfenbeck, T. N., & Stobart, G. (2015). Large-scale implementation of assessment for learning. Assessment in Education: Principles, Policy & Practice, 22(1), 1-2. doi:10.1080/0969594X.2014.1001566

Johnson, D. W., Johnson, R., & Smith, K. (2006). Active learning: Cooperation in the university classroom (3rd ed.). Edina, MN: Interaction Book Company.

Klenowski, V. (2009). Assessment for learning revisited: An Asia-Pacific perspective. Assessment in Education: Principles, Policy & Practice, 16(3), 263-268. doi:10.1080/09695940903319646

National Dropout Prevention Center/Network. (2009). 15 effective strategies for dropout prevention. NDPC: Clemson University. Retrieved from http://dropoutprevention.org/wp-content/uploads/2015/03/NDPCN15EffectiveStrategies.pdf

Norwegian Ministry of Education and Research. (2017). Quality culture in higher education, Meld. St. 16. Retrieved from https://www.regjeringen.no/no/dokumenter/meld.-st.-16-20162017/id2536007/

Nusche, D., Earl, L., Maxwell, W., & Shewbridge, C. (2011). OECD reviews of evaluation and assessment in education: Norway. Organisation for Economic Co-operation and Development. Retrieved from https://www.oecd.org/norway/48632032.pdf

Rogers, E. (2003). Diffusion of innovations (5th ed.). New York, NY: Simon and Schuster.

Thum, Y. M., Tarasawa, B., Hegedus, A., You, X., & Bowe, B. (2015). Keeping learning on track: A case-study of formative assessment practice and its impact on learning in Meridian School District. Portland, OR: Northwest Evaluation Association. Retrieved from http://files.eric.ed.gov/fulltext/ED567844.pdf

Wiliam, D. (2007). Keeping learning on track: Formative assessment and the regulation of learning. In F. K. Lester, Jr. (Ed.), Second handbook of mathematics teaching and learning (pp. 1053–1098). Greenwich, CT: Information Age Publishing.

 

What I’m doing to help combat disinformation online

A word cloud based on a blog about fake news detection resources.

I’ve spent a lot of time over the past two years reading and figuring out how to use technology and critical thinking to identify false information. I realized that I hadn’t posted anything on my personal blog about it. Instead, I’ve blogged about it on the academic site, the AACE Review. In Navigating Post-Truth Societies, I provided useful strategies, resources, and technologies. For example, if you’re still on Facebook, use the Official Media Bias/Fact Check Extension to determine the accuracy of posted articles. In my review of Data & Society’s Dead Reckoning, I summarized why it’s so difficult for humans and machine algorithms to defeat fake news. I also summarized Data & Society’s article on who’s manipulating the media and why, a must-read!

Additionally, I’ve been curating useful strategies and technologies that students can use to combat fake news on Scoop.It. The e-magazine is called The Critical Reader. This digital curation includes videos, articles, games, and technology tools for detecting biased or false information. For example, it describes how the Open Mind Chrome extension not only detects fake news but also suggests credible articles instead. The target audience is high school and college students. Let me know if you would like to collaborate on this endeavor.

And last but not least, I spent 2016-2017 searching for the truth about the workings of the Trump administration. I curated these articles on another Scoop.It collection titled The News We Trust. Each of these articles, videos, and tweets was evaluated with a critical lens prior to being added to the collection. The evaluation measures used were confirming authenticity, triangulation of evidence (e.g., interviews, observations, and documentation), relevance, and currency. Many other items I read didn’t make the cut due to biased commentary or straying off topic. The reason I’m sharing this now is that the collection is still useful going forward in our shared effort to maintain a free democracy. It can also be useful in the 2018 midterm elections. If you notice a pertinent article missing, send it to me, and I’ll review it for consideration.

#fakenews #mediamanipulation #disinformation #hoaxbusters

Remember the Human in Online Courses

“Remember the human” is something we intuitively do in traditional face-to-face classrooms, but somehow it gets lost in distance education. If a course is only text-based independent study, then we’ve silenced our students, treating them as if they were mute, by not providing communication platforms that are supported in the grading criteria. Virginia Shea asks us to remember the human in impersonal cyberspace as part of her Core Rules of Netiquette. She was referencing politeness. I, on the other hand, am referencing the instructional goal of teaching to the whole student.

This blog post focuses on the basics of computer-mediated instruction in terms of the dichotomy between transmissive (authoritarian) education and transformative (democratic) education. Whenever I present on this topic at conferences, participants share that they or their peers have also encountered and endured transmissive online courses, and I wonder how big the problem really is. Since first encountering this problem in an online course in 2012, I have dedicated my research efforts to addressing it.

Transmissive vs. Transformative

Critical pedagogies, Ignatian pedagogy, a community of inquiry (COI), and Freirean praxis place the human in a real-world context as much as possible through learning experiences and reflection. The goal is transformative learning experiences instead of transmissive ones that use the antiquated banking model of education, in which the teacher deposits knowledge for the student to withdraw (Bradshaw, 2017). A good example of transformative learning is Ignatian pedagogy, which advocates for context, experience, action, reflection, and evaluation (Korth, 1993).

Interactions for transformative learning are active, authentic, constructive, cooperative, and intentional. Hooks (1994) calls this a humanity-affirming location of possibility. The design of interaction treatments online doesn’t rely solely on synchronous sessions through web hosting with everyone present. Instead, the goal of high-quality online instruction is to avoid passive learning that requires little cognitive engagement. A good example of a transformative learning activity would be a student (or group) project where students provide each other with authentic feedback.

Interaction treatments are any direct or indirect action between and among students, teachers, and the content. This includes nonverbal immediacy behaviors such as an instructor’s response time. The alternative is unethical: a corpse of knowledge delivered by the unknowing instructor through text-based study devoid of interactions with other students (e.g., read-write-submit). The lack of contact with others in the class is not only isolating, cutting students off from social learning, but can also be frustrating for some students.

Are we teaching machines to learn better than we teach humans?


I recently read an introductory book about artificial intelligence (AI). I was struck by how even the old AI addressed the environment of the robot, something online instructors sometimes overlook for humans. If we want to come out as winners in the human-versus-machine competition, once humanoids such as Erica the robot have fully human feelings and the singularity occurs in 2045, we need to focus on providing human interactions in online courses.

Through trial and error, AI researchers have developed heuristics to address the robot’s interaction with its environment, such as the symbol grounding problem, in which symbols are meaningless unless they’re grounded within a real-world context. For example, the Skydio R1 drone may become the ultimate selfie camera, as it maps its environment using GPS, cameras, and other sensors. How often are instructors grounding instructional content in the lifeworld of human learners?

What are the heuristics for effective human interaction in distance education?

Provide an online COI to dispel the perceived psychological distance between students and teachers in distance education and to improve student learning outcomes and satisfaction. An online COI, a sublime goal, requires consideration of the types of interaction treatments that could engender social, teaching, and cognitive presence for going beyond generative learning. These presences are the key elements of the COI loop (Garrison, Anderson, & Archer, 2000).

Technological affordances can provide humans with multimodal instruction, such as narrated PowerPoints or audio feedback, to support teaching presence in an online COI. For example, podcasts increase student achievement and satisfaction because students can listen to them over and over (Beylefeld, Hugo, & Geyer, 2008; McKinney, Dyck, & Luber, 2009; Seed, Yang, & Sinnappan, 2009). Learning management systems allow for student-student discussions and the sharing of projects with opportunities for peer feedback, which engenders social presence in a COI. For example, Schoology’s Media Album allows students to upload their media projects for peer feedback. Projects also give students agency in the design of their own learning.

Cognitive presence is the third component of the COI triad. Instructors generally provide it through interesting and challenging online activities that they’ve honed over the years in their F2F courses. In my two research studies (Rogers & Van Haneghan, 2016; Rogers & Khoury, unpublished), plans for cognitive presence have been high at the institutions studied; however, social presence has been average and teaching presence below average.

Designing interaction treatments (e.g., student-student, student-teacher, and student-content) will help address the psychologically perceived distance in computer-mediated courses (Bernard et al., 2009). These designed interactions need to focus on meaningful activities for the students’ lifeworld to aid their learning. Remember the human as you plan your online course; otherwise, the robots will overtake us.

#criticalpedagogy #transformativeeducation #AI #elearning #Ignatianpedagogy #CoI

References

Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79, 1243-1288. doi:10.3102/0034654309333844

Beylefeld, A. A., Hugo, A. P., & Geyer, H. J. (2008). More learning and less teaching? Students’ perceptions of a histology podcast. South African Journal of Higher Education, 22(5), 948-956. doi:10.4314/sajhe.v22i5.42914

Bradshaw, A. C. (2017). Critical pedagogy and educational technology. In A. D. Benson, R. Joseph, & J. L. Moore (Eds.), Culture, learning and technology: Research and practice (pp. 8-27). New York, NY: Routledge.

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105. doi:10.1016/s1096-7516(00)00016-6

Hooks, B. (1994). Teaching to transgress: Education as the practice of freedom. New York, NY: Routledge.

McKinney, D., Dyck, J. L., & Luber, E. S. (2009). iTunes university and the classroom: Can podcasts replace professors? Computers & Education, 52, 617-623. doi:10.1016/j.compedu.2008.11.004

Rogers, S., & Van Haneghan, J. (2016). Rubric to evaluate online course syllabi plans for engendering a community of inquiry. Proceedings of Society for Information Technology & Teacher Education International Conference, 349-357. Chesapeake, VA: AACE.

Using Google Suite for the Universal Design of Learning

Design for a gardening website interface that displays tools and supplies as icons
This Google Drawing was created with one of my peers for a doctoral mini-project in an online course: an interface design task for developing a gardening website. It was created prior to my understanding of accessibility issues. Notice that not all icons are labeled, so the design would not be accessible to all. Additionally, alternative text would need to be embedded with each image.

Google Suite, along with the Chrome browser’s Omnibox and useful extensions, can be used to enhance the teaching of all learners with universal instructional design principles. Google Suite is the new name for these features: Google Apps (Docs, Forms, Sheets, Slides), Classroom, and Drive. This blog post focuses on the use of technology to augment instruction through differentiation via scaffolding, formative assessments, and student collaboration. Google professional development opportunities and teacher resources are also addressed.

There are several efforts to design education with universal design in mind. Palmer and Caputo (2003) proposed seven principles for universal instructional design (UID): accessibility, consistency, explicitness, flexibility, accommodating learning spaces, minimization of effort, and supportive learning environments. The UID model recognizes those needs for course design. Its main premise is equal access to education and extends this to all types of learners and not just those with disabilities. For example, all learners can benefit from multi-modal lessons. Palmer and Caputo’s principles should be kept in mind as you develop differentiated instructional learning scenarios with Google Suite. See my blog post to learn more about universal design.

My college is a Google Apps for Education campus, which means we have unlimited storage on our Drive and seamless access to Google Suite through our school Gmail. Speak with your Google Suite administrator to learn about the features and functions of your access, as some institutions, like my alma mater, block YouTube and Google+.

The following scenarios address possible technology solutions for teaching all learners. For instance, scaffolding supports different learners’ preferences, as well as the needs of lower-performing students. Formative assessments are important for obtaining ongoing feedback on student performance; use them often. They can be formal or informal (practice tests, exit tickets, polls). Formative tests promote active learning, which leads to higher retention of the information learned. Use the following list to add your ideas and scenarios for differentiated lesson planning.

Each item pairs a scaffold with the relevant Google tools and features and a formative assessment; add your own ideas and scenarios to each.

  • Scaffold: Provide visuals for structure, context, or direction and just-in-time definitions. Tools: Google Drawings, Docs’ Explore tool, and Drive. Formative assessment: Students make their own graphic representation of a concept or complete guided tasks with the frame provided by an instructor.
  • Scaffold: Provide authentic speaking practice prior to an oral test or presentation. Tools: Google Docs’ Voice Typing, the Chrome browser’s Omnibox for a timer, and Drive. Formative assessment: Students work individually or in small groups, taking turns voice typing their scripts or stories in a Google Doc within a timed parameter on a split screen.
  • Scaffold: Check for comprehension to obtain data to drive instruction or remediation. Tools: Google Forms, Sheets, Classroom, and Drive. (Alternative: a new Google Slides feature allows for asking questions and polling question priority live from a slide.) Formative assessment: Students take a quiz on Google Forms to demonstrate knowledge after a lesson (exit ticket) or homework. Instructors receive Form responses in a Google Sheet; Sheets’ Explore tool can analyze the data for visual display in data-driven discussions among a teacher cohort or supervisors. Grades can be auto-imported from Forms to the Classroom gradebook.
  • Scaffold: Students use an app with embedded choices to check their own grammar. Tools: the free Chrome extension Grammarly and/or its app. Formative assessment: Students correct errors in their first writing drafts in the app or within online writing platforms (e.g., wiki, blog, or email). Grammarly is also available for MS Office and Windows but not for Google Docs; use its app to check Docs or other writing formats by pasting content into a New Document.
  • Scaffold: High/low peer collaboration and/or tutoring. Tools: Google Apps, Classroom, and Drive. Formative assessment: Students share settings on project Docs, Drawings, etc., to collaborate via text comments or synchronous video chat sessions.

Resources for Digital Literacy Skill Training

  • Did you know that Google provides lesson plans for information literacy?
  • Do you need to teach your students how to refine their web searches? See Google Support.
  • Internet Safety Tip: Recommend that students use incognito browsing on Google Chrome when conducting searches to reduce their digital footprint. See Google’s YouTube playlist, Digital Citizenship and Security, and their training site for more information.

Accessibility Resources for Assistive Technology

  • ChromeVOX – Google’s screen reading extension for the Google Chrome browser and the screen reader used by Chrome Operating System (OS).
  • TalkBack – This is Google’s screen reading software that is typically included with Android devices. Due to the design of Android and its customizability by hardware manufacturers, TalkBack can vary and may not be included on some Android devices.
  • Screen Magnifier – This is the screen magnification software included with ChromeOS. Unlike its counterparts on other platforms, the magnification function in ChromeOS doesn’t have a unique product name.
  • Hey, Google – This is Google’s personal assistant, which is available in the Google Chrome browser, ChromeOS, and many Android devices.

Professional Development for Educators

Other

#Google #Edtech #Accessibility #UDL

References

Palmer, J., & Caputo, A. (2003). Universal instructional design: Implementation guide. Guelph, Ontario: University of Guelph.

Use Gwet’s AC1 instead of Cohen’s Kappa for Inter-rater Reliability

Last year, I attended a lecture by my former assessment and measurement professor, Dr. Van Haneghan, at the University of South Alabama. He addressed the paradox of using Cohen’s kappa (k) for inter-rater reliability and noted that it was identified in the literature two decades ago but has been mainly overlooked. The problem is that Cohen’s k skews the results when there is high agreement between raters or an imbalance in the margins of the data tables (Cicchetti & Feinstein, 1990; Gwet, 2008). This contradicts the statistic’s purpose, as researchers want an accurate degree of agreement. So if you’ve ever used Cohen’s kappa for inter-rater reliability in your research studies, I recommend recalculating it with Gwet’s first-order agreement coefficient (AC1).
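To see the paradox in a made-up example: suppose two raters each code 100 items as yes or no, agreeing on 85 yes items and 5 no items, with the remaining 10 disagreements split evenly. Observed agreement is .90, but because both raters marked 90% of the items yes, Cohen’s chance-agreement term is .9(.9) + .1(.1) = .82, so k = (.90 - .82)/(1 - .82) ≈ .44. Gwet’s AC1 instead estimates chance agreement from the average category prevalence, .9(.1) + .1(.9) = .18, yielding AC1 = (.90 - .18)/(1 - .18) ≈ .88, which is far more in line with the raters’ near-total agreement.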

I decided to rerun the stats for my research study, in which two raters analyzed the content of 23 online syllabi with the Online Community of Inquiry Syllabus Rubric, for my presentation at AERA. AgreeStat was used to obtain Cohen’s k and Gwet’s AC1 to determine inter-rater reliability per category. Tables 1A-B show how the k statistic was affected by high agreement in the category of instructional design (ID) for cognitive presence (CP), while Gwet’s AC1 was not. Overall, Gwet’s AC1 values ranged from .102 to .675 (M = .135, SD = .128). Inter-rater reliability for scoring this category was good according to Altman’s (1991) benchmark, Gwet’s AC1 = .675, p < .001, 95% CI [.408, .943].

Table 1A

Distribution of Scores by Rater and Category (Instructional Design for Cognitive Presence)

Rater1CP \ Rater2CP    3            4            5            Missing      Total
3                      0            0            0            0            0   [0%]
4                      1            2            1            0            4   [17.4%]
5                      4            0            15           0            19  [82.6%]
Missing                0            0            0            0            0   [0%]
Total                  5 [21.7%]    2 [8.7%]     16 [69.6%]   0 [0%]       23  [100%]
 

Table 1B

 

Inter-rater Coefficients and Associated Parameters for ID for CP

METHOD                 Coeff.    StdErr     95% C.I.          p-Value
Cohen’s Kappa          0.36406   0.172287   0.007 to 0.721    4.617E-02
Gwet’s AC1             0.67549   0.128882   0.408 to 0.943    2.944E-05
Scott’s Pi             0.33494   0.195401   -0.07 to 0.74     1.006E-01
Krippendorff’s Alpha   0.34940   0.195401   -0.056 to 0.755   8.754E-02
Brennan-Prediger       0.60870   0.140428   0.317 to 0.9      2.664E-04
Percent Agreement      0.73913   0.093618   0.545 to 0.933    7.344E-08

Note. Coeff. = unweighted agreement coefficient. StdErr = standard error (the standard deviation of the coefficient estimate). C.I. = confidence interval.

Gwet’s AgreeStat, Version 2015.6.1 (Advanced Analytics, Gaithersburg, MD, USA) currently costs $40. It’s fairly easy to use. See Kilem Gwet’s blog to learn more.
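If you would like to double-check the unweighted coefficients without AgreeStat, the short Python sketch below reproduces the percent agreement, Cohen’s kappa, and Gwet’s AC1 reported in Table 1B directly from the counts in Table 1A. The function name and layout are my own; only the counts and the expected printed values come from the tables above.

# A minimal cross-check written from scratch (not AgreeStat output): it computes
# unweighted percent agreement, Cohen's kappa, and Gwet's AC1 from a two-rater
# contingency table such as Table 1A.

def agreement_coefficients(table):
    """table[i][j] = number of items rater 1 scored as category i
    and rater 2 scored as category j (same category order on both axes)."""
    k = len(table)                                   # number of categories
    n = sum(sum(row) for row in table)               # number of items rated
    row = [sum(table[i]) / n for i in range(k)]      # rater 1 marginal proportions
    col = [sum(table[i][j] for i in range(k)) / n for j in range(k)]  # rater 2 marginals

    p_obs = sum(table[i][i] for i in range(k)) / n             # observed agreement
    pe_kappa = sum(row[i] * col[i] for i in range(k))          # chance agreement (Cohen)
    prevalence = [(row[i] + col[i]) / 2 for i in range(k)]     # mean category prevalence
    pe_ac1 = sum(p * (1 - p) for p in prevalence) / (k - 1)    # chance agreement (Gwet)

    kappa = (p_obs - pe_kappa) / (1 - pe_kappa)
    ac1 = (p_obs - pe_ac1) / (1 - pe_ac1)
    return p_obs, kappa, ac1

# Counts from Table 1A (categories 3, 4, 5; rows = Rater 1, columns = Rater 2)
table_1a = [
    [0, 0, 0],
    [1, 2, 1],
    [4, 0, 15],
]

p_obs, kappa, ac1 = agreement_coefficients(table_1a)
print(f"Percent agreement = {p_obs:.5f}")  # 0.73913
print(f"Cohen's kappa     = {kappa:.5f}")  # 0.36406
print(f"Gwet's AC1        = {ac1:.5f}")    # 0.67549

The output shows the paradox in miniature: the two raters agreed on 17 of the 23 syllabi (74%), yet Cohen’s kappa drops to .36 because most ratings fall in the highest category, while Gwet’s AC1 remains at .68.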

#AgreeStat #GwetAC1 #CohenKappa #Interrater-reliability

References

Altman, D. G. (1991). Practical statistics for medical research. London: Chapman and Hall.

Cicchetti, D. V., & Feinstein, A. R. (1990). High agreement but low kappa: II. Resolving the paradoxes. Journal of Clinical Epidemiology, 43(6), 551-558. doi:10.1016/0895-4356(90)90159-m

Gwet, K. (2008). Computing inter-rater reliability and its variance in the presence of high agreement. British Journal of Mathematical and Statistical Psychology, 61(1), 29-48. doi:10.1348/000711006X126600