Bibliography on Active Learning

Want to learn more about active learning? Check out this reading list. I prepared this bibliography last year for my Fulbright application to Norway for an active learning research project, so it includes some Norwegian research on the topic. I didn’t get that postdoctoral Fulbright but will try again next year for something else. Preparing the application took a lot of time, and my references and potential host institution were very helpful in the process. Special thanks to Dr. Rob Gray for serving as an intermediary in the application process! You can read about his work below. If you have any seminal articles on active learning, please leave the citation in the comments section for inclusion.


Astin, A. W., & Antonio, A. L. (2012). Assessment for excellence: The philosophy and practice of assessment (2nd ed.). New York, NY: Rowman & Littlefield.

Baird, J.-A., Hopfenbeck, T. N., Newton, P., Stobart, G., & Steen-Utheim, A. T. (2014). Assessment and learning: State of the field review, 13/4697. Oslo, Norway: Knowledge Center for Education. Retrieved from

Banta, T. W., & Palomba, C. A. (2015). Assessment essentials: Planning, implementing, and improving assessment in higher education (2nd ed.). San Francisco, CA: Jossey-Bass.

Barkley, E. F., & Major, C. H. (2016). Learning assessment techniques: A handbook for college faculty. San Francisco, CA: Jossey-Bass.

Biggs, J., & Tang, C. (2007). Teaching for quality learning at university: What the student does (3rd ed.). Maidenhead, Berkshire: Open University Press.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5, 7-74. doi:10.1080/0969595980050102

Brookhart, S. M. (2007). Expanding views about formative classroom assessment: A review of the literature. In J. H. McMillan (Ed.), Formative classroom assessment: Theory into practice (pp. 43-62). New York, NY: Teachers College Press.

Chickering, A. W., & Gamson, Z. F. (1991). Applying the seven principles for good practice in undergraduate education. New Directions for Teaching and Learning, 47. San Francisco, CA: Jossey-Bass.

Deci, E., & Ryan, R. M. (2014). Intrinsic motivation and self-determination in human behavior. Berlin: Springer.

Dysthe, O., Engelsen, K. S., & Lima, I. (2007). Variations in portfolio assessment in higher education: Discussion of quality issues based on a Norwegian survey across institutions and disciplines. Assessing Writing, 12(2), 129-148. doi:10.1016/j.asw.2007.10.002

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. PNAS, 111(23), 8410-8415. doi:10.1073/pnas.1319030111

Gagné, R. M. (1985). The conditions of learning. New York, NY: Holt, Rinehart, & Winston.

Gray, R., & Nerheim, M. S. (2017). Teaching and learning in the digital age: Online tools and assessment practices, P48. Norgesuniversitetet: University of Bergen. Retrieved from

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112. doi:10.3102/003465430298487

Hopfenbeck, T. N., & Stobart, G. (2015). Large-scale implementation of assessment for learning. Assessment in Education: Principles, Policy & Practice, 22(1), 1-2. doi:10.1080/0969594X.2014.1001566

Johnson, D. W., Johnson, R., & Smith, K. (2006). Active learning: Cooperation in the university classroom (3rd ed.). Edina, MN: Interaction Book Company.

Klenowski, V. (2009). Assessment for learning revisited: An Asia-Pacific perspective. Assessment in Education: Principles, Policy & Practice, 16(3), 263-268. doi: 10.1080/09695940903319646

National Dropout Prevention Center/Network. (2009). 15 effective strategies for dropout prevention. NDPC: Clemson University. Retrieved from

Norwegian Ministry of Education and Research. (2017). Quality culture in higher education, Meld. St. 16. Retrieved from

Nusche, D., Earl, L., Maxwell, W., & Shewbridge, C. (2011). OECD reviews of evaluation and assessment in education: Norway. Organisation for Economic Co-operation and Development. Retrieved from

Rogers, E. (2003). Diffusion of innovations (5th ed.). New York, NY: Simon and Schuster.

Thum, Y. M., Tarasawa, B., Hegedus, A., You, X., & Bowe, B. (2015). Keeping learning on track: A case-study of formative assessment practice and its impact on learning in Meridian School District. Portland, OR: Northwest Evaluation Association. Retrieved from

Wiliam, D. (2007). Keeping learning on track: Formative assessment and the regulation of learning. In F. K. Lester, Jr. (Ed.), Second handbook of mathematics teaching and learning (pp. 1053–1098). Greenwich, CT: Information Age Publishing.


Use Gwet’s AC1 instead of Cohen’s Kappa for Inter-rater Reliability

Last year, I attended a lecture by my former assessment and measurement professor, Dr. Van Haneghan, at the University of South Alabama. He addressed the paradox of using Cohen’s Kappa (k) for inter-rater reliability and noted that, although the paradox was identified in the literature two decades ago, it has been largely overlooked. The problem is that Cohen’s Kappa understates agreement when there is high agreement between raters or an imbalance in the marginals of the data table (Cicchetti & Feinstein, 1990; Gwet, 2008). This contradicts the statistic’s purpose, as researchers want to obtain an accurate degree of agreement. So if you’ve ever used Cohen’s Kappa for inter-rater reliability in your research studies, I recommend recalculating it with Gwet’s first-order agreement coefficient (AC1).
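The paradox is easy to reproduce. The two-rater table below uses made-up counts for illustration: the raters agree on 85% of cases, yet Cohen’s Kappa comes out low because the marginals are skewed toward one category, while Gwet’s AC1, which estimates chance agreement from the average marginal proportions instead of their product, stays close to the observed agreement. A minimal sketch in Python using the standard unweighted formulas (Gwet, 2008):

```python
# Hypothetical 2x2 two-rater table (rows = Rater 1, columns = Rater 2).
# 85 of 100 cases agree, but almost everything falls in the first category.
counts = [[80, 10],   # Rater 1 said "yes"
          [5,   5]]   # Rater 1 said "no"

q = len(counts)                       # number of categories
n = sum(map(sum, counts))             # number of subjects
row = [sum(r) for r in counts]        # Rater 1 marginals
col = [sum(c) for c in zip(*counts)]  # Rater 2 marginals

# Observed (percent) agreement: the diagonal of the table.
pa = sum(counts[k][k] for k in range(q)) / n

# Cohen's Kappa: chance agreement from the product of the marginals,
# which balloons when one category dominates.
pe_k = sum(row[k] * col[k] for k in range(q)) / n**2
kappa = (pa - pe_k) / (1 - pe_k)

# Gwet's AC1: chance agreement from the average marginal proportions.
pi = [(row[k] + col[k]) / (2 * n) for k in range(q)]
pe_g = sum(p * (1 - p) for p in pi) / (q - 1)
ac1 = (pa - pe_g) / (1 - pe_g)

print(f"agreement = {pa:.2f}, kappa = {kappa:.2f}, AC1 = {ac1:.2f}")
# agreement = 0.85, kappa = 0.32, AC1 = 0.81
```

Despite 85% observed agreement, Kappa sits near .32 because its chance-agreement term (pe = .78) is inflated by the skewed marginals; AC1’s chance term (pe = .22) is not.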

I decided to rerun the statistics for my research study, in which two raters analyzed the content of 23 online syllabi with the Online Community of Inquiry Syllabus Rubric, for my presentation at AERA. AgreeStat was used to obtain Cohen’s k and Gwet’s AC1 to determine inter-rater reliability per category. Tables 1A-B show how the k statistic was affected by high agreement in the category of instructional design (ID) for cognitive presence (CP), while Gwet’s AC1 was not. Overall, Gwet’s AC1 values ranged from .102 to .675 (M ± SD = .135 ± .128). Inter-rater reliability for scoring this category was good according to Altman’s (1991) benchmark, Gwet’s AC1 = .675, p < .001, 95% CI [.408, .943].

Table 1A

Distribution of Scores by Rater and Category (Instructional Design for Cognitive Presence)

Rater 1 \ Rater 2     3         4        5         Missing   Total
3                     0         0        0         0         0  [0%]
4                     1         2        1         0         4  [17.4%]
5                     4         0        15        0         19 [82.6%]
Missing               0         0        0         0         0  [0%]
Total                 5         2        16        0         23 [100%]
                      [21.7%]   [8.7%]   [69.6%]   [0%]      [100%]

Table 1B


Inter-rater Coefficients and Associated Parameters for ID for CP

METHOD Coeff. StdErr 95% C.I. p-Value
Cohen’s Kappa 0.36406 0.172287 0.007 to 0.721 4.617E-02
Gwet’s AC1 0.67549 0.128882 0.408 to 0.943 2.944E-05
Scott’s Pi 0.33494 0.195401 -0.07 to 0.74 1.006E-01
Krippendorff’s Alpha 0.34940 0.195401 -0.056 to 0.755 8.754E-02
Brennan-Prediger 0.60870 0.140428 0.317 to 0.9 2.664E-04
Percent Agreement 0.73913 0.093618 0.545 to 0.933 7.344E-08

Note. Unweighted agreement coefficients (Coeff.). Standard error (StdErr) is the standard deviation of the coefficient estimate. CI = confidence interval.

Gwet’s AgreeStat, Version 2015.6.1 (Advanced Analytics, Gaithersburg, MD, USA) currently costs $40. It’s fairly easy to use. See Kilem Gwet’s blog to learn more.
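If you don’t have AgreeStat, the point estimates in Table 1B can also be reproduced by hand from the Table 1A counts. This sketch applies the same standard unweighted formulas (Gwet, 2008) in Python; it recovers the percent agreement, Cohen’s Kappa, and Gwet’s AC1 exactly (standard errors and confidence intervals are omitted):

```python
# Table 1A counts: rows = Rater 1, columns = Rater 2; rubric scores 3, 4, 5.
counts = [
    [0, 0, 0],   # Rater 1 scored 3
    [1, 2, 1],   # Rater 1 scored 4
    [4, 0, 15],  # Rater 1 scored 5
]

q = len(counts)                       # number of categories
n = sum(map(sum, counts))             # number of subjects (23 syllabi)
row = [sum(r) for r in counts]        # Rater 1 marginals
col = [sum(c) for c in zip(*counts)]  # Rater 2 marginals

# Observed (percent) agreement: the diagonal of the table.
pa = sum(counts[k][k] for k in range(q)) / n

# Cohen's Kappa: chance agreement from the product of the marginals.
pe_k = sum(row[k] * col[k] for k in range(q)) / n**2
kappa = (pa - pe_k) / (1 - pe_k)

# Gwet's AC1: chance agreement from the average marginal proportions,
# which stays stable even when the raters pile up on one category.
pi = [(row[k] + col[k]) / (2 * n) for k in range(q)]
pe_g = sum(p * (1 - p) for p in pi) / (q - 1)
ac1 = (pa - pe_g) / (1 - pe_g)

print(f"Percent agreement: {pa:.5f}")     # 0.73913
print(f"Cohen's Kappa:     {kappa:.5f}")  # 0.36406
print(f"Gwet's AC1:        {ac1:.5f}")    # 0.67549
```

All three values match Table 1B to five decimal places, which makes clear that the gap between k = .364 and AC1 = .675 comes entirely from how each statistic estimates chance agreement on this heavily skewed table.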


Altman, D. G. (1991). Practical statistics for medical research. London: Chapman and Hall.

Cicchetti, D.V., & Feinstein, A.R. (1990). High agreement but low kappa: II. Resolving the paradoxes. Journal of Clinical Epidemiology, 43(6), 551-558. doi:10.1016/0895-4356(90)90159-m

Gwet, K. (2008). Computing inter-rater reliability and its variance in the presence of high agreement. British Journal of Mathematical and Statistical Psychology, 61(1), 29-48. doi:10.1348/000711006X126600

Join me at AERA in NYC

Photo of Sandra Annette Rogers
Say hello if you see me.

I’m so excited about attending my first conference of the American Educational Research Association (#AERA18) this year. The theme is the dreams, possibilities, and necessity of public education. It will be held in New York City from April 13-17 at various participating hotels. There are 17,000 registrants!

My first event at the conference is to meet my second language research mentor on Friday! The Second Language Research special interest group (SIG) offered mentorship from volunteers in their group, and I signed up.  My mentor is Dr. Meagan Madison Peercy from the University of Maryland.

On Tuesday the 17th, I’ll be participating in a roundtable to discuss the research study with the Online Community of Inquiry Syllabus Rubric© that Dr. Van Haneghan and I conducted. It will be held in conjunction with other roundtables on the topic of Quality Assurance of Online Teaching & Learning, hosted by the Online Teaching & Learning SIG. Come join my roundtable from 10:35 a.m. to 12:05 p.m. at the New York Marriott Marquis, Fifth Floor, Westside Ballroom Salon 4. If you can’t make it, the paper will be provided in the AERA Online Repository.

Lastly, I’d like to thank the Spring Hill College Friends of the Library for helping fund this professional development activity!

Ask Congress to support and fund gun violence research

Dear Readers,

I signed a petition on the Action Network to ask Congress to support and fund gun violence research. See petition below.

Join a diverse, nonpartisan, and interdisciplinary group of organizations in adding your name as an advocate to call for Congress to provide dedicated federal funding for research into gun violence. The current restriction on federal funding for gun violence research limits our understanding of this epidemic and prevents us from enacting evidence-based policies that will protect our lives, our families, and our communities. We also ask for Congress to remove restrictions preventing federal agencies from sharing information that could help them better understand – and ultimately prevent – injuries and loss of life.

Three of the deadliest shootings in modern US history have happened in the last six months. These are but three of hundreds of recent mass shootings that have torn apart families and communities.  These acts of violence now happen with such frightening regularity that in some cases they pass almost without recognition, not even registering in the public conscience long enough for us to know the names of the lives lost and communities shattered.

Gun violence is a public health crisis that, on average, takes the lives of 100 people (Bauchner et al., 2018) and injures hundreds more (CDC) in the United States every day. To address gun violence as the public health issue that it truly is, both the public and the elected officials who serve us need to understand what works to prevent gun violence, and this cannot be accomplished without credible, scientific research.

Research into the causes and prevention of violence is not a partisan issue.  Yet for more than two decades, Congress has failed to provide dedicated funding for gun violence research, in part because of the Dickey Amendment, a law that states that “None of the funds made available for injury prevention and control at the Centers for Disease Control and Prevention (CDC) may be used to advocate or promote gun control.” Although the Dickey Amendment does not explicitly prevent research on gun violence, it is widely acknowledged that absent clearer guidance from Congress it has had a devastating effect on violence prevention research at the CDC.  As advocates for science, we demand policies based on scientific evidence, and we ask that Congress immediately repeal the Dickey Amendment and provide dedicated funding for research into the causes and prevention of gun violence.

Without this research, we cannot identify risk and protective factors, nor can we develop prevention strategies.  Gun violence affects all communities, but disproportionately affects marginalized communities, who will continue to suffer the greatest consequences of our inaction. The lack of publicly funded research on gun violence has left us without evidence to guide us in responding to an epidemic that kills tens of thousands of people each year and adversely impacts millions more.

We further ask that the federal government repeal the Tiahrt Amendment, a 2003 provision prohibiting the Bureau of Alcohol, Tobacco, Firearms, and Explosives from releasing information about its firearms database to the CDC and the National Institutes of Health.  Researchers need systematic data collection and a national database dedicated to storing and collecting data on gun sales and registrations.  This information must be coupled with a database on firearms injuries and deaths nationwide to monitor and better understand the scope of this national public health problem.  To help accomplish this goal, we ask Congress to provide funding for the CDC National Violent Death Reporting System to support the participation of all 50 states, U.S. territories, and the District of Columbia in reporting gun violence statistics to the national database; currently, 42 states receive funding.  In order to prevent gun violence, we must understand how it affects adults and children in all states, without exception.

Research and policy development on firearm-related injuries and deaths warrant the same level of attention, and dedicated federal and state funding and support, as are currently directed to public health challenges presented by the opioid epidemic, cigarette smoking, and HIV/AIDS. Regardless of political party, every member of Congress must play a role in supporting the research we need to protect our communities and enact evidence-based policy to combat gun violence.

We urge you to honor victims, survivors, and their loved ones by writing and implementing evidence-based policies to protect our communities from gun violence.  We stand together in asking Congress for the support and funding needed to make these policies a reality. Signing this petition will add your name to this open letter calling for action.

References (I’m adding the references within the document and a few that were missing from the original call to action hyperlinked message.)

American Public Health Association. (2016). Fact sheet on preventing gun violence. Retrieved February 28, 2018, from

Bauchner, H., Rivara, F. P., Bonow, R. O., Bressler, N. M., Disis, M. L. N., Heckers, S., … & Rhee, J. S. (2018). Death by gun violence—A public health crisis. JAMA Facial Plastic Surgery, 20, 7-8. doi:10.1001/jama.2017.16446

Beckett, L. (2014, May 15). Why don’t we know how many people are shot each year in America? Retrieved from

Dockrill, P. (2017, September 16). Here’s why gun violence research in the US is about to come to a grinding halt. Retrieved from

Bieler, S., Kijakazi, K., La Vigne, N., Vinik, N., & Overton, S. (2016). Engaging communities in reducing gun violence. Washington, DC: Urban Institute.

Branas, C. C., Richmond, T. S., Culhane, D. P., Ten Have, T. R., & Wiebe, D. J. (2009). Investigating the link between gun possession and gun assault. American Journal of Public Health, 99, 2034-2040. doi:10.2105/AJPH.2008.143099.

Chapman, S., Alpers, P., & Jones, M. (2016). Association between gun law reforms and intentional firearm deaths in Australia, 1979-2013. Journal of the American Medical Association, 316, 291-299. doi:10.1001/jama.2016.8752

Centers for Disease Control and Prevention (CDC). Leading causes of nonfatal injury reports, 2000-2006. U.S. Department of Health and Human Services. Retrieved from

Fenway Health. (2016). Gun violence and LGBT health. Retrieved February 28, 2018, from

Gani, F., Sakran, J. V., & Canner, J. K. (2017). Emergency department visits for firearm-related injuries in the United States, 2006-14. Health Affairs, 36, 1729-1738.

Kellermann, A. L., & Rivara, F. P. (2013). Silencing the science on gun research. Journal of the American Medical Association, 309, 549-550. doi:10.1001/jama.2012.208207

Kellermann, A. L., Rivara, F. P., Rushforth, N. B., Banton, J. G., Reay, D. T., Francisco, J. T., … & Somes, G. (1993). Gun ownership as a risk factor for homicide in the home. New England Journal of Medicine, 329, 1084-1091.

Wellford, C. F., Pepper, J. V., & Petrie, C. V. (2005). Firearms and violence: A critical review. Washington, DC: National Academies Press.

Can you join me and take action? Click here:

ECTESOL Conference in Pensacola Feb. 3rd

Tag words from my blog

The Emerald Coast TESOL (Teaching English to Speakers of Other Languages) conference is this Saturday from 10 a.m. to 3 p.m. at the University of West Florida International Center. Registration is $25 and includes lunch. The conference will feature professionals from the northern Florida panhandle and the Alabama Gulf Coast. As a new Board member, I’ll be attending for the first time. Here’s the schedule:

9:30 – 10:00 Registration
10:00 – 10:10 Welcome – Council Vaughn, Director, International English Program
Overview of Conference – Dr. Arlene Costello, VP/ECTESOL Conference Chair
10:15 – 10:50 Keynote Speaker: Chane Eplin, Bureau Chief, Student Achievement through Language Acquisition, Florida Department of Education
Topic Address: Quality Education for English Learners K-12 and Beyond
10:55 – 11:30 Concurrent Sessions
Room 1: ELs as Independent and Autonomous Learners (Kiss/Costello)
Room 2: Google Suite to Enhance English Language Instruction (Rogers)
11:35 – 12:00 Lunch and 12:00 – 12:15 Cultural Performances DOOR PRIZES
12:20 – 1:00 Featured Keynote Speaker: Dr. Susan Ferguson Martin, Faculty, ESOL and Educational Leadership, University of South Alabama
Topic Address: Academic Language in Teaching and Learning Across the Curriculum: A Functional Approach
1:05 – 1:35 Panel – Speakers
Grace McCaffery, Founder, Costa Latina
Shannon Nickinson, Project Manager, Early Learning Studer Institute
1:40 – 2:15 Concurrent Sessions
Room 1: Sowing Seeds (Sessions & Cuyuch)
Room 2: ESOL, EFL, and Reciprocal Service Learning (Fregeau, Leier, Ojiambo, Cornejo, and Chikatia)
2:20 – 2:50 Concurrent Sessions
Room 1: The SUCCESS from Teachers, Students, and Parents Working Together (Baker)
Room 2: Saudi ELLs’ Digital Gameplay Habits and Effects on LA (Rogers)
2:50 – 3:00 Brief Business Meeting: Report by President; Paper Report by Treasurer
Closing: Amany Habib, ECTESOL President DOOR PRIZES
3:00 – 3:20 ECTESOL Board Meeting

I’ll be presenting a case study on gameplay habits and an information session on Google Suite for enhancing English language instruction. I hope to see you there!

Elements of Cooperative Learning and Their Application to Distance Ed



According to Wikipedia, cooperative learning theory has been around since the 1930s and has been discussed by researchers from fields as diverse as philosophy and psychology. Cooperative learning involves strategic group practices and elements that aid critical thinking. As an educator, I’m most familiar with Kagan’s (1985) approach to cooperative learning. Additionally, I learned about Palincsar and Brown’s reciprocal teaching method; their article on reciprocal teaching of comprehension-fostering and comprehension-monitoring activities (1984) predates Kagan’s work. Johnson and Johnson researched and wrote about cooperative learning activities in the 70s, 80s, and 90s. I learned about their work in my doctoral coursework on instructional strategies.

Johnson and Johnson (1994) were the first to describe the following five essential elements of cooperative learning: positive interdependence, face-to-face (F2F) promotive interaction, individual & group accountability, social skills, and group processing.  The following lists their elements and how they can be implemented in online courses.

  1. Element of Cooperative Learning: Positive Interdependence

Course Design– A) Provide examples of project team roles. B) Then divide the content assignment into specific components and assign them to team members.

Resources–  I modified the list that Dr. Dempsey shared in our doctoral course on instructional strategies at the University of South Alabama: team leader, timekeeper, idea monitor, QA monitor, and Wild Card (for the extra item that varies according to the content or situation).

Difference from F2F Instruction: A) Not all students will be able to meet F2F on campus due to geographic distance. B) Not all students will see information (log in) at the same time. Delays can cause emotional distress to team members and create psychological distance.

2. Element of Cooperative Learning: F2F Promotive Interaction

Course Design- Include synchronous sessions with live audiovisual possibilities.

Resources– Use virtual meeting spaces such as BigBlueButton, Skype, Google+ Hangout, & Second Life

Difference from F2F Instruction: A) Students can discuss items freely without being in earshot of the teacher or other teams. B) Students need technical skills to be able to participate online. C) Meetings can easily be recorded for review.

3. Element of Cooperative Learning: Individual & Group Accountability

Course Design– Create a rubric that explicitly describes individual and group tasks.  Ask students to complete a peer evaluation of team members according to their assigned components.

Resources- The teacher can ask students to create the rubric themselves for greater understanding of the requirements.

Difference from F2F Instruction- No real difference except for no F2F lecture mode to explain rubric.

4. Element of Cooperative Learning: Social Skills

Course Design– Teachers model social skills with teacher talk.  They shape students’ behavior by providing praise when appropriate actions are taken.  They provide rubrics that describe the actions such as how many times to post in forums and to whom.  Students set up their own agreed upon ground rules.

Resources– Netiquette: There are several versions out there.  There’s even a multiple-choice test that scores a student’s netiquette knowledge automatically.

Difference from F2F Instruction– A) Etiquette rules differ. B) In OL, every student gets the opportunity to respond. C) For OL, there’s a larger chance of procrastination due to the “absence” of the traditional classroom routine, physical building, seeing friends in the hallway to remind you, etc.

5. Element of Cooperative Learning: Group Processing

Course Design– Ask students to create their own set of group rules and definitions. (This was another Dr. Dempsey idea.) Monitor group work by asking to be added to their collaborative project sites.

Resources– Use Web 2.0 tools like a wiki, blog, and/or Google Drive to collaborate.

Difference from F2F Instruction- A) Teams must decide which synchronous and Web 2.0 tools to use and create accounts. B) Scheduling meetings requires consulting a world map for time zones and dates. C) Students may be grouped with someone they will never meet F2F (I’m unsure of the psychological ramifications but certain this plays a role in online behavior).


Johnson, D., & Johnson, R. (1994). Learning together and alone: Cooperative, competitive, and individualistic learning. Needham Heights, MA: Prentice-Hall.

Kagan, S. (1985). Cooperative learning. San Clemente, CA: Resources for Teachers, Inc.

Palincsar, A. S., & Brown, A. L. (1984). Reciprocal teaching of comprehension-fostering and comprehension-monitoring activities. Cognition and Instruction, 1(2), 117-175.

Magis Instructional Design Model for Ignatian Pedagogy

Saint Ignatius of Loyola. Engraving by C. Klauber. Wellcome M0005653

The Magis Instructional Design (ID) Model for online courses was developed by Sandra Rogers (2015) with input from the Jesuits at Spring Hill College, as subject matter experts, and her professor in instructional design, Dr. Davidson-Shivers. It’s unique in that it addresses religion, spirituality, and social justice in addition to intellectual growth.

Jesuit school educators include techniques for reflection within their units of study in order to challenge students to serve others (Korth, 1993). According to one theology professor, Jesuit educators focus instructional activities on experiential learning to engender the cycle of experience leading to reflection and further action. This is based on the dynamics of Saint Ignatius’ Spiritual Exercises from which Ignatian pedagogy is derived.

The principles of Ignatian pedagogy include context, experience, reflection, action, and evaluation (Korth, 1993). Further action and service to others are for the greater glory of God. Magis means doing more for God’s Kingdom (Ad majorem Dei gloriam). The Magis ID Model is an alternative to existing ones in that it embeds the following Ignatian pedagogical layers into the systematic design of instruction to develop learners into caring leaders by addressing the whole person:

  1. Analyze human learning experience online/offline
  2. Establish relationships of mutual respect online/offline
  3. Tap into learner’s prior knowledge & experience
  4. Design optimal learning experience for the whole person
  5. Assimilate new information
  6. Transfer learning into lifeworld
  7. Encourage lifelong learning & reflections beyond self-interest
  8. Learners become contemplatives in action

Online Community of Inquiry

Designing for a community of inquiry (COI) loop addresses the Ignatian principle of teaching to the whole person. A COI exists when social presence, cognitive presence, and teaching presence are all established; these are the essential elements of the communication loop for an online COI (Garrison, Anderson, & Archer, 2000). This means that learners in an online environment are involved in activities that are cognitively challenging, are able to interact with their classmates, and experience teaching in some way through words (e.g., text-based discussion), voice (e.g., podcasts), or person (e.g., webcast). The teaching can be delivered by student moderators or the instructor.

Bernard et al. (2009) conducted a meta-analysis of 74 studies on online course interactions and found substantive evidence of a positive effect on learning when online educators build these types of interactions into their courses: student-student, student-teacher, and student-content. These interaction treatments (ITs) were defined as the environments and not the actual behaviors that occur within them. Through ID processes, one can design and develop these types of environments for distance education. Table 1 displays the main components of a Jesuit education, COI, and ITs, and their interrelationships.

Table 1

Comparison of Jesuit Education and Research-Based Best Practices

Jesuit Education of the Whole Person: Mind / Body / Spirit
Necessary Elements for an Online Community of Inquiry: Intellectual Presence / Social Presence / Teaching Presence
Research-based Best Practices for Interaction Treatments: Student-content interactions / Student-student interactions / Student-teacher interactions

Designing Optimal Learning Experiences for the Whole Person

The Magis ID Model analyzes the type of instructional strategies used in distance education to ensure they address the whole person through cura personalis (mind, body, & spirit). Strategy selection should vary to meet the needs of diverse learners and engender higher-order thinking for cognitive presence. Selection depends on various affordances and constraints such as time and resources. For example, an activity-centered lesson is based on an interactive task and requires collaborative tools and student groupings. Content-centered lessons are passive tasks where the student generally only interacts with the content; the exception being discussions of content. Experience-centered activities require a hands-on approach to developing something or serving/working with others. The learner-centered activity provides the learner with more autonomy over their pursuit of knowledge and includes metacognitive actions for self-regulation of learning; the affordances and constraints for this type of activity are highly dependent on the task. Ideally, online educators should provide active learning activities to enhance cognitive transfer of new information and skills learned to long-term memory.

Contact Dr. Rogers ( at Spring Hill College to learn more about this ID model and how it’s being used to develop distance education courses.