Join me at AERA 2019 in Toronto

Sandra Rogers standing near AERA conference sign celebrating 100 years

I’ll be attending my second conference of the American Educational Research Association (#AERA19) this year. The theme is ‘Leveraging Education Research in a Post-Truth Era: Multimodal Narratives to Democratize Evidence.’ It will be held in Toronto, Canada, from April 5-9 at the Metro Toronto Convention Centre. I was impressed with last year’s conference but a bit overwhelmed. Hopefully, with the help of their conference app, I’ll find the sessions I need.

View this link to see the poster for the session Dr. Khoury and I are presenting: Rubric to Analyze Online Course Syllabi Plan for Engendering a Community of Inquiry: Round II. Come join me on Saturday morning, April 6, from 8:00 to 9:30am in the Metro Toronto Convention Centre, 300 Level, Hall C. It’s hosted by Division C – Section 3b: Technology-Based Environments, in the subunit for Distance and Online Education. I’ll be sharing copies of my Online Community of Inquiry Syllabus Rubric.

I’ve shared our research paper in the AERA Online Repository. Read this blog page to learn more about our study. My hope is that it will be replicated to validate the rubric and improve not only instructors’ syllabi but also teaching and learning in distance education. Let me know if you’re interested in replicating our study.

Are you going to AERA? Let’s connect in Toronto!

Sandra Annette Rogers, PhD


Join me at AECT in Kansas City, MO!

Photo of Sandra Annette Rogers
Say hello if you see me.

The Association for Educational Communications & Technology (AECT) is, in my humble opinion, the premier association for instructional designers. During my doctoral studies, my professors promoted this professional organization and its educational technology standards to their students. I finally attended the AECT conference last year and was blown away by the professionalism of everyone I met and how cordial they were to newcomers. This year’s conference will be held in Kansas City, MO, from October 23-27 at the Kansas City Marriott Downtown. I’ll be there, so let me know if you plan to attend. For AECT members, I placed my slides and research paper on the new conference online portal.

This time around, I’ll be presenting on my latest research and giving a workshop on the Online Community of Inquiry Syllabus Rubric that I co-developed with Dr. Van Haneghan. It serves as a great collaboration tool for providing feedback to instructors and for syllabi content analysis in action research. Here’s my schedule:

Wed, Oct 24, 9:00am to 12:00pm, Marriott, Room: Bennie Moten B

Use of Online Community of Inquiry Syllabus Rubric for Course Developers and Collaborators, Drs. Rogers & Khalsa

Workshop – Registration Required
The syllabus serves as an action plan, which can be used as a resource for collaboration with instructional designers. In this session, participants will discuss how the Online Community of Inquiry Syllabus Rubric© (Rogers & Van Haneghan, 2016) can be used to focus course development discussions on cognitive, social, and teaching presence for distance education instructors. Research and development of the rubric, a worked sample, commonly shared feedback, and rubric rater training will be provided.


Division of Distance Learning

Thu, Oct 25, 9:40 to 10:05am, Marriott, Room-Julia Lee A

Rubric to Evaluate Online Course Syllabi Plans for Engendering a Community of Inquiry: Round II, Drs. Rogers & Khoury

We replicated a research study that analyzed online course syllabi with the Online Community of Inquiry (COI) Syllabus Rubric© (Rogers & Van Haneghan, 2016). The rubric consists of the following elements: instructional design for cognitive presence, technology tools for COI, COI loop for social presence, support for learner characteristics, and instruction and feedback for teaching presence. We reviewed 31 syllabi across disciplines and found above-average cognitive presence, average social presence, and basic teaching presence.

#AECT2018 #elearning #communityofinquiry #edtech

An observer’s notes on the Socratic method in action

Socrates talking to a man who is eagerly listening at his side.
Image source: Wikimedia
Here are my notes from the dialectic dialogue of the Socratic Seminar: An International Forum on Socratic Teaching, held at the Association for Educational Communications and Technology (AECT) conference in Jacksonville, Florida, in 2017. I attended to learn more about the #Socratic method in general and also to learn how to apply it to the academic task of advising doctoral students’ dissertation writing. This is what occurred in a simulated environment with a doctoral student, her advisor, and a panel of experts. It was the most amazing thing I’ve ever seen offered at a conference, and far too few people saw it, as the panel outnumbered the attendees. I took notes for future reference and also to share with the student who was the focus of this activity.
 
Introduction by Adviser, Dr. Abbas Johari: “This is a respectful dialogue between master and student….An example would be guided questions for the learner…Panelists should not make a statement but bring her to an understanding of a concept via questioning.”
Topic of Dissertation:  The student, Cheng Miaoting, gave a brief overview of her dissertation titled,  Technology Acceptance of LMS in Postsecondary Schools in Hong Kong.
Methodology: The student used survey and interview methods to address several variables (e.g., SES, environment, context) based on the technology acceptance model (TAM 3).
Panel’s Questions: Each expert asked the student a question while she listened. I was not always able to attribute who said what as I feverishly took notes; please excuse the missing attributions. See the link below for panelists’ names.
  1. What is the problem? Tech or culture?
  2. What are you expecting to find? Recommendation for action? The assumption is __________.
  3. What are the assumptions underlying acceptance? Why is this good? Response to facilitate learning?
  4. Which theory will you use and why?
  5. Which variables affect learning?
Dr. Michael Thomas’ statement: “Tool has no agenda as in gun law. Is it possible to argue if a bad thing?” He recommended seeing Technological Sublime (aka Machine Messiah).
Dr. Amy Bradshaw’s statement: “What is modernity with Chinese characteristics?” Deficit ideology where X fixes them, whereas X is tech, mainland Chinese are needing a fix and solution is technology.
Adviser’s Guidance to Student: He told his student to address the masters’ guidance by asking follow-up questions or paraphrasing what she had learned. She had a question about the term ‘factors’ in research.
Panel Questions continued:
6. What type of psychological adaptation will you use? Acculturation framework? Cat mentioned Hofstede’s, but the panel discouraged it based on its hostility and stereotypical frame.
7. Fundamentally, what is the burning question you want to answer? The human question—why you want to do it. Solve one problem at a time.
8. How do things change in society? Need theory on societal change.
9. Why are immigrants coming to HK?
10. What are schools doing to address this? (Here is where you addressed the practical significance or human question, which was the missing piece of training for technology.)
11. Have you looked at other countries’ tech adoption for immigrants?
Adviser called for Debrief: The student acknowledged the need to focus study and reflect. She will reach out to other researchers to negotiate understanding, as was done today. She will talk in practical terms and not just in research methodology.
Panel Debriefed with Suggestions: 
  • Free yourself, but 1-directional.
  • What is the one thing they do not want you to talk about? That is your research question.
  • Focus on commonality and not just differences.
  • Find ways to hear immigrants to inform study.
  • Remember the humane as well as the human.
  • Have an open mind in research design—always question research design.
  • Look at the polarity of human existence. What is up/down? In/out? What is not there? What’s obvious? Hidden? Who implemented these types of change?
  • Listen to your adviser.
  • See work by Charles Reigeluth and Carl Rogers.

Here is a link to the #AECT conference abstract and list of panel members.

Bibliography on Active Learning

Want to learn more about active learning? Check out this reading list. I prepared this bibliography last year in preparation for my Fulbright application to Norway for an active learning research project. It includes some Norwegian research on the topic. I didn’t get that postdoctoral Fulbright but will try again next year for something else. The application took a lot of time to prepare, and my references and potential hosting institution were so helpful in the process. Special thanks to Dr. Rob Gray for serving as an intermediary in the application process! You can read about his work below. If you have any seminal articles on active learning, please leave the citation in the comments section for inclusion. #activelearning

Bibliography

Astin, A. W., & Antonio, A. L. (2012). Assessment for excellence: The philosophy and practice of assessment (2nd ed.). New York, NY: Rowman & Littlefield Publishers, Inc.

Baird, J-A., Hopfenbeck, T. N., Newton, P., Stobart, G., & Steen-Utheim, A. T. (2014). Assessment and learning: State of the field review, 13/4697. Oslo, Norway: Knowledge Center for Education. Retrieved from http://taloe.up.pt/wp-content/uploads/2013/11/FINALMASTER2July14Bairdetal2014AssessmentandLearning.pdf

Banta, T. W., & Palomba, C. A. (2015). Assessment essentials: Planning, implementing, and improving assessment in higher education (2nd ed.). San Francisco, CA: Jossey-Bass.

Barkley, E. F., & Major, C. H. (2016). Learning assessment techniques: A handbook for college faculty. San Francisco, CA: Jossey-Bass.

Biggs, J., & Tang, C. (2007). Teaching for quality learning at university: What the student does (3rd ed.). Maidenhead, Berkshire: Open University Press.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5, 7-74. doi:10.1080/0969595980050102

Brookhart, S. M. (2007). Expanding views about formative classroom assessment: A review of the literature. In J. H. McMillan (Ed.), Formative classroom assessment: Theory into practice (pp. 43-62). New York, NY: Teachers College Press.

Chickering, A. W., & Gamson, Z. F. (1991). Applying the seven principles for good practice in undergraduate education. New Directions for Teaching and Learning, 47. San Francisco, CA: Jossey-Bass.

Deci, E., & Ryan, R. M. (2014). Intrinsic motivation and self-determination in human behavior. Berlin: Springer.

Dysthe, O., Englesen, K. S., & Lima, I. (2007). Variations in portfolio assessment in higher education: Discussion of quality issues based on a Norwegian survey across institutions and disciplines. Assessing Writing, 12(2), 129-148. doi:10.1016/j.asw.2007.10.002

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. PNAS, 111(23), 8410-8415. doi:10.1073/pnas.1319030111

Gagné, R. M. (1985). The conditions of learning. New York, NY: Holt, Rinehart, & Winston.

Gray, R., & Nerheim, M. S. (2017). Teaching and learning in the digital age: Online tools and assessment practices, P48. Norgesuniversitetet: University of Bergen. Retrieved from https://norgesuniversitetet.no/prosjekt/teaching-and-learning-in-the-digital-age

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112. doi:10.3102/003465430298487

Hopfenbeck, T. N., & Stobart, G. (2015). Large-scale implementation of assessment for learning. Assessment in Education: Principles, Policy & Practice, 22(1), 1-2. doi:10.1080/0969594X.2014.1001566

Johnson, D. W., Johnson, R., & Smith, K. (2006). Active learning: Cooperation in the university classroom (3rd ed.). Edina, MN: Interaction Book Company.

Klenowski, V. (2009). Assessment for learning revisited: An Asia-Pacific perspective. Assessment in Education: Principles, Policy & Practice, 16(3), 263-268. doi: 10.1080/09695940903319646

National Dropout Prevention Center/Network. (2009). 15 effective strategies for dropout prevention. NDPC: Clemson University. Retrieved from http://dropoutprevention.org/wp-content/uploads/2015/03/NDPCN15EffectiveStrategies.pdf

Norwegian Ministry of Education and Research. (2017). Quality culture in higher education, Meld. St. 16. Retrieved from https://www.regjeringen.no/no/dokumenter/meld.-st.-16-20162017/id2536007/

Nusche, D., Earl, L., Maxwell, W., & Shewbridge, C. (2011). OECD reviews of evaluation and assessment in education: Norway. Organisation for Economic Co-operation and Development. Retrieved from https://www.oecd.org/norway/48632032.pdf

Rogers, E. (2003). Diffusion of innovations (5th ed.). New York, NY: Simon and Schuster.

Thum, Y. M., Tarasawa, B., Hegedus, A., You, X., & Bowe, B. (2015). Keeping learning on track: A case-study of formative assessment practice and its impact on learning in Meridian School District. Portland, OR: Northwest Evaluation Association. Retrieved from http://files.eric.ed.gov/fulltext/ED567844.pdf

Wiliam, D. (2007). Keeping learning on track: Formative assessment and the regulation of learning. In F. K. Lester, Jr. (Ed.), Second handbook of mathematics teaching and learning (pp. 1053–1098). Greenwich, CT: Information Age Publishing.

 

Use Gwet’s AC1 instead of Cohen’s Kappa for Inter-rater Reliability

Last year, I attended a lecture by my former assessment and measurement professor, Dr. Van Haneghan, at the University of South Alabama. He addressed the paradox of using Cohen’s Kappa (k) for inter-rater reliability and acknowledged that it was identified in the literature two decades ago but has been largely overlooked. The problem is that Cohen’s Kappa skews when there is high agreement between raters or an imbalance in the margins of the data tables (Cicchetti & Feinstein, 1990; Gwet, 2008). This contradicts the statistic’s purpose, as researchers want to obtain an accurate degree of agreement. So if you’ve ever used Cohen’s Kappa for inter-rater reliability in your research studies, I recommend recalculating it with Gwet’s first-order agreement coefficient (AC1).

I decided to rerun the stats for my research study, in which two raters analyzed the content of 23 online syllabi with the Online Community of Inquiry Syllabus Rubric, for my presentation at AERA. AgreeStat was used to obtain Cohen’s k and Gwet’s AC1 to determine inter-rater reliability per category. Tables 1A-B show how the k statistic was affected by high agreement in the category of instructional design (ID) for cognitive presence (CP), while Gwet’s AC1 was not. Overall, Gwet’s AC1 values ranged from .102 to .675 (M = .135, SD = .128). Inter-rater reliability for scoring this category was good according to Altman’s (1991) benchmark, Gwet’s AC1 = .675, p < .001, 95% CI [.408, .943].

Table 1A

Distribution of Scores by Rater and Category (Instructional Design for Cognitive Presence)

Rater 1 \ Rater 2   3         4        5         Missing   Total
3                   0         0        0         0         0   [0%]
4                   1         2        1         0         4   [17.4%]
5                   4         0        15        0         19  [82.6%]
Missing             0         0        0         0         0   [0%]
Total               5         2        16        0         23  [100%]
                    [21.7%]   [8.7%]   [69.6%]   [0%]      [100%]
 

Table 1B

 

Inter-rater Coefficients and Associated Parameters for ID for CP

METHOD                 Coeff.    StdErr     95% C.I.          p-Value
Cohen’s Kappa          0.36406   0.172287   0.007 to 0.721    4.617E-02
Gwet’s AC1             0.67549   0.128882   0.408 to 0.943    2.944E-05
Scott’s Pi             0.33494   0.195401   -0.070 to 0.740   1.006E-01
Krippendorff’s Alpha   0.34940   0.195401   -0.056 to 0.755   8.754E-02
Brennan-Prediger       0.60870   0.140428   0.317 to 0.900    2.664E-04
Percent Agreement      0.73913   0.093618   0.545 to 0.933    7.344E-08

Note. Unweighted agreement coefficients (Coeff.). StdErr = standard error. CI = confidence interval.

Gwet’s AgreeStat, Version 2015.6.1 (Advanced Analytics, Gaithersburg, MD, USA) currently costs $40. It’s fairly easy to use. See Kilem Gwet’s blog to learn more.
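If you’d rather not purchase AgreeStat, the coefficients in Table 1B can be reproduced by hand from the Table 1A counts. Here is a minimal Python sketch (my own, not AgreeStat) of the standard formulas for Cohen’s kappa and Gwet’s AC1 (Gwet, 2008), applied to the instructional-design-for-cognitive-presence ratings:

```python
# Contingency table from Table 1A: rows = Rater 1, columns = Rater 2,
# for rubric scores 3, 4, and 5 (the "Missing" row/column is all zeros).
table = [
    [0, 0, 0],   # Rater 1 scored 3
    [1, 2, 1],   # Rater 1 scored 4
    [4, 0, 15],  # Rater 1 scored 5
]

n = sum(sum(row) for row in table)  # 23 syllabi
q = len(table)                      # 3 rating categories

# Observed agreement: proportion of syllabi where both raters matched.
po = sum(table[k][k] for k in range(q)) / n

row_totals = [sum(row) for row in table]
col_totals = [sum(table[i][k] for i in range(q)) for k in range(q)]

# Cohen's kappa: chance agreement from the product of marginal proportions.
pe_kappa = sum((row_totals[k] / n) * (col_totals[k] / n) for k in range(q))
kappa = (po - pe_kappa) / (1 - pe_kappa)

# Gwet's AC1: chance agreement from average category propensities, which
# stays small when one category dominates the margins (here, a score of 5).
pi = [(row_totals[k] + col_totals[k]) / (2 * n) for k in range(q)]
pe_ac1 = sum(p * (1 - p) for p in pi) / (q - 1)
ac1 = (po - pe_ac1) / (1 - pe_ac1)

print(f"Percent agreement: {po:.5f}")    # 0.73913
print(f"Cohen's kappa:     {kappa:.5f}") # 0.36406
print(f"Gwet's AC1:        {ac1:.5f}")   # 0.67549
```

The output matches the point estimates in Table 1B and illustrates the paradox: because 82.6% of Rater 1’s scores were 5s, kappa’s chance-agreement term balloons and drags the coefficient down to .36, while AC1 remains at .68 despite identical data.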

#AgreeStat #GwetAC1 #CohenKappa #InterRaterReliability

References

Altman, D. G. (1991). Practical statistics for medical research. London: Chapman and Hall.

Cicchetti, D.V., & Feinstein, A.R. (1990). High agreement but low kappa: II. Resolving the paradoxes. Journal of Clinical Epidemiology, 43(6), 551-558. doi:10.1016/0895-4356(90)90159-m

Gwet, K. (2008). Computing inter-rater reliability and its variance in the presence of high agreement. British Journal of Mathematical and Statistical Psychology, 61(1), 29-48. doi:10.1348/000711006X126600

Join me at AERA 2018 in NYC

Photo of Sandra Annette Rogers
Say hello if you see me.

I’m so excited about attending my first conference of the American Educational Research Association (#AERA18) this year. The theme is the dreams, possibilities, and necessity of public education. It will be held in New York City from April 13-17 at various participating hotels. There are 17,000 registrants!

My first event at the conference is to meet my second language research mentor on Friday! The Second Language Research special interest group (SIG) offered mentorship from volunteers in their group, and I signed up.

On Tuesday the 17th, I’ll be participating in a roundtable to discuss the research study with the Online Community of Inquiry Syllabus Rubric© that Dr. Van Haneghan and I conducted. It will be held in conjunction with other roundtables on the topic of Quality Assurance of Online Teaching & Learning, hosted by the Online Teaching & Learning SIG. Join my roundtable from 10:35am to 12:05pm at the New York Marriott Marquis, Fifth Floor, Westside Ballroom Salon 4. If you can’t make it, the paper will be provided in the AERA Online Repository.

Lastly, I’d like to thank the Spring Hill College Friends of the Library for helping fund this professional development activity!

Ask Congress to support and fund gun violence research

Dear Readers,

I signed a petition on the Action Network to ask Congress to support and fund gun violence research. See petition below.


Join a diverse, nonpartisan, and interdisciplinary group of organizations in adding your name as an advocate to call for Congress to provide dedicated federal funding for research into gun violence. The current restriction on federal funding for gun violence research limits our understanding of this epidemic and prevents us from enacting evidence-based policies that will protect our lives, our families, and our communities. We also ask for Congress to remove restrictions preventing federal agencies from sharing information that could help them better understand – and ultimately prevent – injuries and loss of life.

Three of the deadliest shootings in modern US history have happened in the last six months. These are but three of hundreds of recent mass shootings that have torn apart families and communities.  These acts of violence now happen with such frightening regularity that in some cases they pass almost without recognition, not even registering in the public conscience long enough for us to know the names of the lives lost and communities shattered.

Gun violence is a public health crisis that, on average, takes the lives of 100 people (Bauchner et al., 2018) and injures hundreds more (CDC) in the United States every day. In order to address gun violence as the public health issue that it truly is, both the public and the elected officials who serve us need to understand what works to prevent gun violence, and this cannot be accomplished without credible, scientific research.

Research into the causes and prevention of violence is not a partisan issue.  Yet for more than two decades, Congress has failed to provide dedicated funding for gun violence research, in part because of the Dickey Amendment, a law that states that “None of the funds made available for injury prevention and control at the Centers for Disease Control and Prevention (CDC) may be used to advocate or promote gun control.” Although the Dickey Amendment does not explicitly prevent research on gun violence, it is widely acknowledged that absent clearer guidance from Congress it has had a devastating effect on violence prevention research at the CDC.  As advocates for science, we demand policies based on scientific evidence, and we ask that Congress immediately repeal the Dickey Amendment and provide dedicated funding for research into the causes and prevention of gun violence.

Without this research, we cannot identify risk and protective factors, nor can we develop prevention strategies.  Gun violence affects all communities, but disproportionately affects marginalized communities, who will continue to suffer the greatest consequences of our inaction. The lack of publicly funded research on gun violence has left us without evidence to guide us in responding to an epidemic that kills tens of thousands of people each year and adversely impacts millions more.

We further ask that the federal government repeal the Tiahrt Amendment, a 2003 provision prohibiting the Bureau of Alcohol, Tobacco, Firearms, and Explosives from releasing information about its firearms database to the CDC and the National Institutes of Health.  Researchers need systematic data collection and a national database dedicated to storing and collecting data on gun sales and registrations.  This information must be coupled with a database on firearms injuries and deaths nationwide to monitor and better understand the scope of this national public health problem.  To help accomplish this goal, we ask Congress to provide funding for the CDC National Violent Death Reporting System to support the participation of all 50 states, U.S. territories, and the District of Columbia in reporting gun violence statistics to the national database; currently, 42 states receive funding.  In order to prevent gun violence, we must understand how it affects adults and children in all states, without exception.

Research and policy development on firearm-related injuries and deaths warrant the same level of attention, and dedicated federal and state funding and support, as are currently directed to public health challenges presented by the opioid epidemic, cigarette smoking, and HIV/AIDS. Regardless of political party, every member of Congress must play a role in supporting the research we need to protect our communities and enact evidence-based policy to combat gun violence.

We urge you to honor victims, survivors, and their loved ones by writing and implementing evidence-based policies to protect our communities from gun violence.  We stand together in asking Congress for the support and funding needed to make these policies a reality. Signing this petition will add your name to this open letter calling for action.

References (I’m adding the references within the document and a few that were missing from the original call to action hyperlinked message.)

American Public Health Association. (2016). Fact sheet on preventing gun violence. Retrieved February 28, 2018, from https://www.apha.org/~/media/files/pdf/factsheets/160317_gunviolencefs.ashx

Dockrill, P. (2017, September 16). Here’s why gun violence research in the US is about to come to a grinding halt. Retrieved from https://www.sciencealert.com/shelved-obama-gun-research-program-could-terminate-studies-of-firearm-violence

Bauchner, H., Rivara, F. P., Bonow, R. O., Bressler, N. M., Disis, M. L. N., Heckers, S., … & Rhee, J. S. (2018). Death by gun violence—A public health crisis. JAMA Facial Plastic Surgery, 20, 7-8. doi:10.1001/jama.2017.16446

Beckett, L. (2014, May 15). Why don’t we know how many people are shot each year in America? Retrieved from https://www.propublica.org/article/why-dont-we-know-how-many-people-are-shot-each-year-in-america

Bieler, S., Kijakazi, K., La Vigne, N., Vinik, N., & Overton, S. (2016). Engaging communities in reducing gun violence. Washington, DC: Urban Institute.

Branas, C. C., Richmond, T. S., Culhane, D. P., Ten Have, T. R., & Wiebe, D. J. (2009). Investigating the link between gun possession and gun assault. American Journal of Public Health, 99, 2034-2040. doi:10.2105/AJPH.2008.143099.

Chapman, S., Alpers, P., & Jones, M. (2016). Association between gun law reforms and intentional firearm deaths in Australia, 1979-2013. Journal of the American Medical Association, 316, 291-299. doi:10.1001/jama.2016.8752

Centers for Disease Control and Prevention (CDC). Leading causes of nonfatal injury reports, 2000-2006. U.S. Department of Health and Human Services. Retrieved from https://webappa.cdc.gov/sasweb/ncipc/nfilead.html

Fenway Health. (2016). Gun violence and LGBT health. Retrieved February 28, 2018, from http://fenwayhealth.org/

Gani, F., Sakran, J. V., & Canner, J. K. (2017). Emergency department visits for firearm-related injuries in the United States, 2006-14. Health Affairs, 36, 1729-1738.

Kellermann, A. L., & Rivara, F. P. (2013). Silencing the science on gun research. Journal of the American Medical Association, 309, 549-550. doi:10.1001/jama.2012.208207

Kellermann, A. L., Rivara, F. P., Rushforth, N. B., Banton, J. G., Reay, D. T., Francisco, J. T., … & Somes, G. (1993). Gun ownership as a risk factor for homicide in the home. New England Journal of Medicine, 329, 1084-1091.

Wellford, C. F., Pepper, J. V., & Petrie, C. V. (2005). Firearms and violence: A critical review. Washington, DC: National Academies Press.

Can you join me and take action? Click here: https://actionnetwork.org/petitions/12b649467ddaddbdee0b0564502325b49833096c