Remember the Human in Online Courses

Remembering the human is something we do intuitively in traditional face-to-face classrooms, but somehow it gets lost in distance education. If a course is only text-based independent study, then we have effectively silenced our students by not providing communication platforms that are supported in the grading criteria. Virginia Shea asks us to remember the human in impersonal cyberspace as part of her Core Rules of Netiquette. She was referencing politeness. I, on the other hand, am referencing the instructional goal of teaching the whole student.

This blog focuses on the basics of computer-mediated instruction in terms of the dichotomy between transmissive (authoritarian) education and transformative (democratic) education. Whenever I present on this topic at conferences, participants share that they or their peers have also encountered and endured transmissive online courses. I wonder how big the problem really is.

Transmissive vs. Transformative

Critical pedagogies, Ignatian pedagogy, the community of inquiry (COI) framework, and Freirean praxis place the human in a real-world context as much as possible through learning experiences and reflection. The goal is transformative learning experiences rather than transmissive ones built on the antiquated banking model of education, in which the teacher deposits knowledge for the student to withdraw (Bradshaw, 2017). A good example of transformative learning is Ignatian pedagogy, which advocates for context, experience, action, reflection, and evaluation (Korth, 1993).

Interactions for transformative learning are active, authentic, constructive, cooperative, and intentional. hooks (1994) calls this a humanity-affirming location of possibility. Designing interaction treatments online does not mean relying solely on synchronous web-conferencing sessions with everyone present. Instead, the goal of high-quality online instruction is to avoid passive learning that requires little cognitive engagement. A good example of a transformative learning activity is a student (or group) project in which students provide each other with authentic feedback.

Interaction treatments are any direct or indirect actions between and among students, teachers, and the content. This includes nonverbal immediacy behaviors such as an instructor's response time. The alternative is unethical: a corpse of knowledge delivered by the unknowing instructor through text-based study devoid of interactions with other students (e.g., read-write-submit). The lack of contact with others in the class is not only isolating, cutting students off from social learning, but also frustrating for some students.

Are we teaching machines to learn better than we teach humans?


I recently read an introductory book about artificial intelligence (AI). I was struck by how even early AI addressed the environment of the robot, something online instructors sometimes overlook for humans. If we want to come out ahead in the human-versus-machine competition, should humanoids such as Erica the robot ever attain humanlike feelings and the predicted 2045 singularity occur, we need to focus on providing human interactions in online courses.

Through trial and error, AI researchers have developed heuristics to address a robot's interaction with its environment, such as the symbol grounding problem: symbols are meaningless unless they are grounded in a real-world context. For example, the Skydio R1 drone may be the ultimate selfie machine because it maps its environment using GPS, cameras, and other sensors. How often are instructors grounding instructional content in the lifeworld of human learners?

What are the heuristics for effective human interaction in distance education?

Provide an online COI to dispel the perceived psychological distance between students and teachers in distance education and to improve student learning outcomes and satisfaction. An online COI, a sublime goal, requires consideration of the types of interaction treatments that could engender social, teaching, and cognitive presence and move students beyond generative learning. These presences are the key elements of the COI loop (Garrison, Anderson, & Archer, 2000).

Technological affordances can provide humans with multimodal instruction, such as narrated PowerPoints or audio feedback, to build teaching presence in an online COI. For example, podcasts can increase student achievement and satisfaction because students can replay them as often as needed (Beylefeld, Hugo, & Geyer, 2008; McKinney, Dyck, & Luber, 2009; Seed, Yang, & Sinnappan, 2009). Learning management systems allow for student-student discussions and the sharing of projects with opportunities for peer feedback, engendering social presence in a COI. Projects also give students agency in the design of their own learning.

Cognitive presence is the third component of the COI triad. Instructors generally provide it through interesting and challenging online activities honed over years of teaching their face-to-face courses. In my two research studies (Rogers & Van Haneghan, 2016; Rogers & Khoury, unpublished), planned cognitive presence rated high at the institutions studied; however, social presence was average and teaching presence below average.

Designing interaction treatments (e.g., student-student, student-teacher, and student-content) helps address the psychologically perceived distance in computer-mediated courses (Bernard et al., 2009). These designed interactions need to center on activities meaningful to students' lifeworld to aid their learning. Remember the human as you plan your online course; otherwise, the robots will overtake us.

References

Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79, 1243-1288. doi:10.3102/0034654309333844

Beylefeld, A. A., Hugo, A. P., & Geyer, H. J. (2008). More learning and less teaching? Students’ perceptions of a histology podcast. South African Journal of Higher Education, 22(5), 948-956. doi:10.4314/sajhe.v22i5.42914

Bradshaw, A. C. (2017). Critical pedagogy and educational technology. In A. D. Benson, R. Joseph, & J. L. Moore (Eds.), Culture, learning and technology: Research and practice (pp. 8-27). New York, NY: Routledge.

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105. doi:10.1016/s1096-7516(00)00016-6

hooks, b. (1994). Teaching to transgress: Education as the practice of freedom. New York, NY: Routledge.

McKinney, D., Dyck, J. L., & Luber, E. S. (2009). iTunes University and the classroom: Can podcasts replace professors? Computers & Education, 52, 617-623. doi:10.1016/j.compedu.2008.11.004

Rogers, S., & Van Haneghan, J. (2016). Rubric to evaluate online course syllabi plans for engendering a community of inquiry. Proceedings of Society for Information Technology & Teacher Education International Conference, 349-357. Chesapeake, VA: AACE.

Using Google Suite for the Universal Design of Learning

Design for a gardening website interface, displaying tools and supplies as icons
This Google Drawing was created with one of my peers for a doctoral mini-project on an interface design task: developing a gardening website in an online course. It was created before I understood accessibility issues. Notice that not all icons are labeled, so the design would not be accessible to all; additionally, alternative text would need to be embedded with each image.

Google Suite, along with the Chrome browser's Omnibox and useful extensions, can be used to enhance the teaching of all learners through universal instructional design principles. Google Suite is the new name for these features: Google Apps (Docs, Forms, Sheets, Slides), Classroom, and Drive. This blog focuses on using technology to augment instruction through differentiation via scaffolding, formative assessments, and student collaboration. Google professional development opportunities and teacher resources are also addressed.

There are several efforts to design education with universal design in mind. Palmer and Caputo (2003) proposed seven principles for universal instructional design (UID): accessibility, consistency, explicitness, flexibility, accommodating learning spaces, minimization of effort, and supportive learning environments. The UID model applies these principles to course design. Its main premise is equal access to education for all types of learners, not just those with disabilities. For example, all learners can benefit from multimodal lessons. Keep Palmer and Caputo's principles in mind as you develop differentiated instructional learning scenarios with Google Suite. See my blog post to learn more about universal design.

My college is a Google Apps for Education campus, which means we have unlimited storage on our Drive and seamless access to Google Suite through our school Gmail. Speak with your Google Suite administrator to learn about the features and functions of your access, as some institutions, like my alma mater, block YouTube and Google+.

The following scenarios address possible technology solutions for teaching all learners. For instance, scaffolding supports different learners' preferences as well as the needs of lower-performing students. Formative assessments, which can be formal or informal (practice tests, exit tickets, polls), provide ongoing feedback on student performance; use them often. They promote active learning, which leads to higher retention of the information learned. Use the last column of the following table to add your own ideas and scenarios for differentiated lesson planning.

Scaffold Learning | Google Tools & Features | Formative Assessments | Your Ideas & Scenarios
Provide visuals for structure, context, or direction, plus just-in-time definitions | Google Drawings, Docs' Explore tool, & Drive | Students make their own graphic representation of a concept or complete guided tasks within a frame provided by the instructor. |
Provide authentic speaking practice prior to an oral test/presentation | Google Docs' Voice Typing, the Chrome browser's Omnibox timer, & Drive | Students work individually or in small groups, taking turns voice typing their scripts/stories in a Google Doc within a set time on a split screen. |
Check for comprehension to obtain data to drive instruction/remediation | Google Forms, Sheets, Classroom, & Drive (Alternative: a newer Google Slides feature allows audience questions and live polling of question priority from a slide.) | Students take a quiz in Google Forms as an exit ticket or homework check. Instructors receive Form responses in a Google Sheet, whose Explore tool analyzes and visualizes the data for data-driven discussions among teacher cohorts/supervisors. Grades can be auto-imported from Forms into the Classroom gradebook. |
Students use an app with embedded choices to check their own grammar | Grammarly (free Chrome extension and/or app) | Students correct errors in their first drafts in the app or within online writing platforms (e.g., wiki, blog, or email). Grammarly is also available for MS Office and Windows but not for Google Docs; paste Docs content into a New Document in its app to check it. |
High/low peer collaboration and/or tutoring | Google Apps, Classroom, & Drive | Students use share settings on project Docs, Drawings, etc., to collaborate via text comments or synchronous video chat sessions. |

Resources for Digital Literacy Skill Training

  • Did you know that Google provides lesson plans for information literacy?
  • Do you need to teach your students how to refine their web searches? See Google Support.
  • Internet safety tip: Recommend that students use incognito browsing in Google Chrome when conducting searches to limit the search history and cookies stored on their device. See Google's YouTube playlist, Digital Citizenship and Security, and their training site for more information.

Accessibility Resources for Assistive Technology

  • ChromeVox – Google's screen-reading extension for the Google Chrome browser and the screen reader used by the Chrome Operating System (OS).
  • TalkBack – This is Google’s screen reading software that is typically included with Android devices. Due to the design of Android and its customizability by hardware manufacturers, TalkBack can vary and may not be included on some Android devices.
  • Screen Magnifier – This is the screen magnification software included with ChromeOS. The magnification function in ChromeOS doesn’t have a unique product name like other platforms.
  • Hey, Google – This is the wake phrase for Google Assistant, Google's personal assistant, which is available in the Google Chrome browser, ChromeOS, and many Android devices.

References

Palmer, J., & Caputo, A. (2003). Universal instructional design: Implementation guide. Guelph, Ontario: University of Guelph.

Use Gwet’s AC1 instead of Cohen’s Kappa for Inter-rater Reliability

Last year, I attended a lecture by my former assessment and measurement professor, Dr. Van Haneghan, at the University of South Alabama. He addressed the paradox of using Cohen's Kappa (k) for inter-rater reliability, a problem identified in the literature two decades ago but mainly overlooked since. The problem is that Cohen's k skews when there is high agreement between raters or an imbalance in the margins of the data tables (Cicchetti & Feinstein, 1990; Gwet, 2008). This contradicts the statistic's purpose, as researchers want an accurate degree of agreement. So if you've ever used Cohen's Kappa for inter-rater reliability in your research studies, I recommend recalculating it with Gwet's first-order agreement coefficient (AC1).
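For readers who want the mechanics: both coefficients rescale the observed agreement by an estimate of chance agreement; they differ only in how chance is modeled. Here is a sketch in the spirit of Gwet's (2008) notation (check the original paper for edge cases), for q rating categories with observed agreement $p_a$ and marginal proportions $p_{k\cdot}$ (Rater 1) and $p_{\cdot k}$ (Rater 2):

$$\kappa = \frac{p_a - p_e}{1 - p_e}, \qquad p_e = \sum_{k=1}^{q} p_{k\cdot}\, p_{\cdot k}$$

$$AC_1 = \frac{p_a - p_e^{\gamma}}{1 - p_e^{\gamma}}, \qquad p_e^{\gamma} = \frac{1}{q-1}\sum_{k=1}^{q} \pi_k \left(1 - \pi_k\right), \qquad \pi_k = \frac{p_{k\cdot} + p_{\cdot k}}{2}$$

Because kappa builds its chance term from the product of the raters' marginals, ratings that pile up in one category push $p_e$ toward $p_a$ and collapse kappa even when the raters almost always agree. AC1's chance term is based on average category prevalence and stays bounded, which is the behavior Table 1B below illustrates.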

I decided to rerun the statistics for my research study, in which two raters analyzed the content of 23 online syllabi with the Online Community of Inquiry Syllabus Rubric, for my presentation at AERA. AgreeStat was used to obtain Cohen's k and Gwet's AC1 for inter-rater reliability per category. Tables 1A-B show how the k statistic was affected by high agreement in the category of instructional design (ID) for cognitive presence (CP), while Gwet's AC1 was not. Overall, Gwet's AC1 values ranged from .102 to .675 (M ± SD = .135 ± .128). Inter-rater reliability for scoring this category was good according to Altman's (1991) benchmark, Gwet's AC1 = .675, p < .001, 95% CI [.408, .943].

Table 1A

Distribution of Scores by Rater and Category (Instructional Design for Cognitive Presence)

Rater1CP \ Rater2CP | 3 | 4 | 5 | Missing | Total
3 | 0 | 0 | 0 | 0 | 0 [0%]
4 | 1 | 2 | 1 | 0 | 4 [17.4%]
5 | 4 | 0 | 15 | 0 | 19 [82.6%]
Missing | 0 | 0 | 0 | 0 | 0 [0%]
Total | 5 [21.7%] | 2 [8.7%] | 16 [69.6%] | 0 [0%] | 23 [100%]

Table 1B

Inter-rater Coefficients and Associated Parameters for ID for CP

Method | Coeff. | StdErr | 95% C.I. | p-Value
Cohen's Kappa | 0.36406 | 0.172287 | 0.007 to 0.721 | 4.617E-02
Gwet's AC1 | 0.67549 | 0.128882 | 0.408 to 0.943 | 2.944E-05
Scott's Pi | 0.33494 | 0.195401 | -0.07 to 0.74 | 1.006E-01
Krippendorff's Alpha | 0.34940 | 0.195401 | -0.056 to 0.755 | 8.754E-02
Brennan-Prediger | 0.60870 | 0.140428 | 0.317 to 0.9 | 2.664E-04
Percent Agreement | 0.73913 | 0.093618 | 0.545 to 0.933 | 7.344E-08

Note. Unweighted agreement coefficients (Coeff.). StdErr = standard error (the standard deviation of the coefficient); CI = confidence interval.

Gwet’s AgreeStat, Version 2015.6.1 (Advanced Analytics, Gaithersburg, MD, USA) currently costs $40. It’s fairly easy to use. See Kilem Gwet’s blog to learn more.

References

Altman, D. G. (1991). Practical statistics for medical research. London: Chapman and Hall.

Cicchetti, D.V., & Feinstein, A.R. (1990). High agreement but low kappa: II. Resolving the paradoxes. Journal of Clinical Epidemiology, 43(6), 551-558. doi:10.1016/0895-4356(90)90159-m

Gwet, K. (2008). Computing inter-rater reliability and its variance in the presence of high agreement. British Journal of Mathematical and Statistical Psychology, 61(1), 29-48. doi:10.1348/000711006X126600

Guest Blogging for the new AACE Review

A word cloud based on a blog about fake news detection resources.

I’m enjoying the challenge of guest blogging for the Association for the Advancement of Computing in Education’s (AACE) new blog, the AACE Review. AACE is the professional organization that produces the LearnTechLib database and several educational research journals (e.g., International Journal on E-Learning, Journal of Computers in Mathematics and Science Teaching, Journal of Online Learning Research). It hosts several educators’ conferences that I like to attend, such as the Society for Information Technology & Teacher Education (SITE) conference and the World Conference on E-Learning (E-Learn). See images of my past involvement with AACE.

So far, I’ve blogged about four topics in educational technology and learning: grit, computer-assisted language learning, strategies and resources for identifying the veracity of online content, and an API for children’s speech recognition. See these posts on my Author page.


As for my Teacherrogers blog, I haven’t slowed down on my writing. I recently updated the page on my teaching philosophy, added my research statement, and a page on my Google Map project. These are the static pages at the top of this blog. You may have noticed the new award for landing in the top 75 blogs on Feedspot on the topic of educational technology. I was actually #58! Thanks for reading and sharing my blogs. I’ve been blogging here since 2011, and it serves as my knowledge base that I’m continuously updating, as I learn from and share with educators at my college and peers worldwide.

Instructional Design Graduate Assistantship Provided Apprenticeship

Dr. Rogers shows participants the various learning activities provided in the StudyMate program
Sandra trained faculty at the University of South Alabama on various software programs, such as Respondus’ StudyMate, shown here.

This year, I’m celebrating my fifth anniversary as an instructional designer (ID). Prior to this career path, I was an educator for 18 years, so the transition was not difficult. As I reflect on the success I’m enjoying at Spring Hill College (SHC) now, I want to acknowledge the invaluable practical experience I gained as an instructional designer during my doctoral program at the University of South Alabama (USA). I had a graduate assistantship with the Innovation in Learning Center (ILC) at USA for two years.

Besides benefiting from tuition remission and a stipend, the apprenticeship gave me the opportunity to work alongside skilled IDs, collaborate with a dozen of my classmates, and interact with faculty and students to address their needs. The assistantship purposefully had us cycle through various project teams, train-the-trainer sessions, and production tasks. Specifically, I was able to add these experiences to my resume:

  • Assisted the director of online learning with designing, developing, and delivering professional development and teaching tips for faculty to support student online learning via Sakai learning management system (LMS);
  • Moderated and maintained the online competency-based certificate course for faculty (Sakai 101: The Basics Online) and the orientation course for students (USAonline Student Course);
  • Supported the LMS administrator by answering technical calls from faculty and students; and
  • Served on the accessibility, resources, and USAonline teams to produce corresponding questionnaires, job aids, video tutorials, and reports (including photography).

This apprenticeship grounded my doctoral studies, as I was able to develop trainer scripts based on Gagné’s nine events of instruction. See my previous post on a Pixlr workshop training plan. Additionally, the formal and informal interactions with my peers provided opportunities to learn from each other, as the ID program is interdisciplinary. For example, my peers had advanced degrees in engineering, English, math, sociology, and IT. Many of my peers and co-workers from the ILC continue to shape my understanding of ID today through networking, professional development, and subject matter expertise on research interests.

If I didn’t have this well-rounded training and hands-on experience along with my doctoral coursework, I probably wouldn’t have had such a good start at my current workplace. For example, I was the first ID hired at SHC with a degree in the field. The previous person serving in the capacity of ID was actually the learning management system administrator and instructional technologist. All of the framework for collaborating with instructors as the ID (e.g., Online Course Design Guide, benchmarks, needs assessments, knowledge management, training) needed to be created from scratch. These documents initially relied on my ILC work experience but have since shifted to include the mission and identity of SHC. Nevertheless, I’m forever indebted to the ILC and my cohort of peers during my graduate assistantship!

Call for Comprehensive Commonsense Gun Reform

American Flag

Let me begin by stating that I don’t have the answer to gun violence in America, but that doesn’t stop me from trying to understand the situation or from advocating on behalf of those who have lost their lives to it. This blog serves as a summary of the current gaps in legislation, school environments, consumer protection, and research. The purpose is to consider all factors causing the problem and then develop problem statements. Only by understanding the current situation fully can we move forward with our objectives and (non)training solutions.

This categorized list will, I hope, help us form a solid argument for gun control. Through revision based on your feedback, and as I learn more details, I seek a plan of action based on commonsense gun laws. In my opinion, the current situation is riddled with inadequacies in regard to public safety due to lax and inconsistent laws. Today, in honor of the #MarchForOurLives, I advocate change for good and applaud those involved in making informed decisions about gun laws that aren’t based on political or financial gains.

School Safety- Here are some of the ideas being promoted that require proof of efficacy: (A) Restrict entry to a single point and require visitors to sign in to limit access to nonstudents and nonpersonnel. (B) Provide a sufficient number of resource officers and counselors, in accordance with school size, to address student and staff needs. (C) Provide active-shooter training and drills to prepare students and staff for such situations. (D) Use metal detectors at the entryway to deter crime. (E) In my opinion, arming teachers is not the solution, but I include it here and will search the research on its effectiveness.

Gun Restrictions- Here’s a list of proposed measures to reduce gun violence: (A) Raise the age restriction to 21 to purchase a rifle or shotgun, in accordance with the existing federal laws regarding handgun purchases from licensed dealers. Additionally, handguns and rifles purchased from unlicensed dealers (e.g., a neighbor, gun show seller, or online store) should carry the same age restrictions. (B) Require comprehensive background checks on nonlicensed buyers and enforce a centralized database to keep guns out of the hands of criminals, suspected terrorists on the no-fly list, the mentally ill, and other federally prohibited persons. A panel of gun violence experts cited these as effective means to curb gun violence (The New York Times). (C) Reinstate the 1994 federal ban on assault weapons and high-capacity ammunition magazines. This is supported by the American Public Health Association (APHA), and gun violence experts also cited these as effective measures (The New York Times). (D) Ban the sale of bump stocks, which modify regular guns to perform as rapid-fire assault weapons. This should already be strictly enforced by the government, as it runs afoul of existing federal laws for machine guns (18 U.S.C. § 921(a)(23); 27 C.F.R. § 479.11; see also 26 U.S.C. § 5845(b)). (E) Ban online sales of ‘ghost guns,’ which are sold as maker kits and bear no serial numbers.

Consumer Protection- (A) Congress should ensure unsafe guns are recalled through an oversight agency such as the Bureau of Alcohol, Tobacco, Firearms, and Explosives (ATF). Our Consumer Product Safety Commission does not have jurisdiction over firearms and ammunition. Currently, unsafe guns are recalled only by the manufacturers. Governmental oversight of unsafe guns was blocked by Rep. Dingell in 1972 and 1975 and has not been brought up for legislation since (Bloomberg). (B) Gun manufacturers should be required to test guns to ensure they work properly. For example, according to the Bloomberg report, nine different Taurus guns may fire when bumped or dropped, even with the safety on. (C) Gun sellers, as defined by the ATF, should obtain a federal firearms license. Moreover, the ATF needs to provide sufficient oversight, as US DOJ Report #1-2004-005 found negligence in its inspections of licensure. (D) Congress should allow the use of smart gun technology, such as devices that scan the owner’s fingerprint before the gun can fire. See President Obama’s memorandum based on the DOJ review (Federal Register), which reported its potential for reducing accidental gun deaths and the use of stolen guns in criminal activities. Gun lobbyists kept Smith & Wesson from developing smart gun technologies through slander and a boycott of its products after President Clinton pushed the Gun Safety Agreement with the company in 2000. The APHA supports innovative technology to reduce gun violence and accidental shootings.

Research- (A) Congress should lift current restrictions on federal funding for research into gun violence. For example, the CDC National Violent Death Reporting System needs support from all 50 states, U.S. territories, and D.C. (B) Gun laws should be based on research and safe practices for society.

Note: I’ve written 168 blogs on this WordPress site. This is the only political one.

Join me at AERA in NYC

Photo of Sandra Annette Rogers
Say hello if you see me.

I’m so excited about attending my first conference of the American Educational Research Association (#AERA18). This year’s theme is the dreams, possibilities, and necessity of public education. The conference will be held in New York City from April 13-17 at various participating hotels. There are 17,000 registrants!

My first event at the conference is meeting my second language research mentor on Friday! The Second Language Research special interest group (SIG) offered mentorship from volunteers in the group, and I signed up. My mentor is Dr. Megan Madison Peercy from the University of Maryland.

On Tuesday the 17th, I’ll be participating in a roundtable to discuss the research study Dr. Van Haneghan and I conducted with the Online Community of Inquiry Syllabus Rubric©. It will be held in conjunction with other roundtables on the topic of Quality Assurance of Online Teaching & Learning, hosted by the Online Teaching & Learning SIG. Come join my roundtable from 10:35 a.m. to 12:05 p.m. at the New York Marriott Marquis, Fifth Floor, Westside Ballroom Salon 4. If you can’t make it, the paper will be available in the AERA Online Repository.

Lastly, I’d like to thank the Spring Hill College Friends of the Library for helping fund this professional development activity!