Application of Gagné's Nine Events of Instruction to Well-Designed Educational (WDE) Gaming

(This chart was published in my dissertation. See references below.)

| Gagné's Nine Events of Instruction (1985) | Comparison to WDE Gaming (Adapted from Becker, 2008, and Van Eck, 2006) | Mental Processes (Gagné & Driscoll, 1988) |
| --- | --- | --- |
| Gain attention | Capture attention with movement, scenes, sounds, speech, and health status updates | Reception |
| State the learning objectives | Inform learner of quest and related game documentation to include limitations and cutscenes (e.g., set mood) | Expectancy |
| Stimulate recall of prior learning | Present stimulus through environmental structures that provide familiarity with obstacles or behaviors of characters | Retrieval to working memory |
| Present content | Present content according to the objectives of the game, such as a storyline embedded within the virtual environment | Selective perception |
| Provide guidance | Guide users with storylines, profiles, help section, map, sale of higher-level gear as you level up, hint books, friendly gamers' verbal and nonverbal input, NPCs' model language, and partial clues for quests found in gameplay | Semantic encoding |
| Elicit performance | Require adequate knowledge to advance to the next level | Responding |
| Provide feedback | Provide feedback via speech, sounds, visuals, text, or motion directives, including no motion | Reinforcement |
| Assess performance | Assess users' performance as they progress to the end goal and achieve rewards for knowledge and skill | Retrieval and reinforcement |
| Enhance retention | Interweave past learning experiences with new challenges; otherwise, learners repeat prior mistakes | Retrieval and generalization |

References

Becker, K. (2008). Video game pedagogy: Good games = Good pedagogy. In C. T. Miller (Ed.), Games: Purpose and potential in education (pp. 73-122). New York, NY: Springer.

Gagné, R. M. (1985). The conditions of learning. New York, NY: Holt, Rinehart, & Winston.

Gagné, R. M., & Driscoll, M. P. (1988). Essentials of learning for instruction (2nd ed.). Englewood Cliffs, NJ: Prentice Hall.

Rogers, S. A. (2017). A MMORPG with language learning strategic activities to improve English grammar, listening, reading, and vocabulary (Doctoral dissertation). Available from ProQuest Dissertations and Theses database. (UMI No. 10265484)

Van Eck, R. (2006). Building artificially intelligent learning games. In D. Gibson, C. Aldrich, & M. Prensky (Eds.), Games and simulations in online learning research & development frameworks (pp. 271–307). Hershey, PA: Idea Group.

5 Pitfalls of Online Teaching


I took my first series of online courses for professional development in 2009. The courses were highly interactive and well designed because they were taught by experts in the field of computer-assisted language learning. A shout-out to my professors in the Teaching English to Speakers of Other Languages (TESOL) certificate program, Principles and Practices of Online Teaching! (See my blog post on this topic.) Ever since then, I've compared every online course to those.

As a working instructional designer and a current PhD student enrolled in online courses, I bring a well-rounded perspective to the topic of distance education. I've researched and written about how to develop an online community of inquiry. It has become my personal agenda to ensure that students taking online courses don't get frustrated by the course design or a lack of teacher presence.

Here's a list of what I consider the top 5 pitfalls, each of which will surely diminish student learning outcomes and student satisfaction:

  1. Lack of a pattern in weekly assignments causes confusion, especially in a hybrid (blended) course. As you plan threaded discussions, quizzes, and assignments, make sure they follow a consistent pattern; otherwise, indicate on your syllabus any gaps in the established pattern of assignments.
  2. Numerous clicks to find content lead to frustration. To increase findability, use clear navigation practices that reduce time lost searching and lower frustration levels (Simunich, Robins, & Kelly, 2012).
  3. Lack of synchronous sessions to connect with a human leads to reduced achievement. To increase student achievement, include synchronous sessions (Bernard et al., 2009); Arbaugh and Hornik (2006) suggested video conferencing, voice messaging, or other types of multimedia.
  4. Instructors not responding to students' discussions in a timely manner can cause missed learning opportunities. Several theories of human learning address delivering targeted instruction at the right time, such as Vygotsky's (1978) zone of proximal development, which posits that a student can only attain so much without assistance from others. Students need prompt feedback that targets their instructional needs (Arbaugh, 2001). See my blog post on instructor feedback for online courses.
  5. Lack of student-student interactions may decrease student satisfaction and student achievement (Bernard et al., 2004). Make sure students can talk to one another and share their finished projects.

Do you agree with my top 5?

References

Arbaugh, J. B. (2001). How instructor immediacy behaviors affect student satisfaction and learning in web-based courses. Business Communication Quarterly, 30, 42-54.

Arbaugh, J. B., & Hornik, S. (2006). Do Chickering and Gamson’s seven principles also apply to online MBAs? The Journal of Educators Online, 3(2), 1-18.

Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79, 1243-1288.

Simunich, B., Robins, D., & Kelly, V. (2012). Does findability matter? Findability, student motivation, and self-efficacy in online courses. Quality Matters (QM) Research Grant, Kent State University.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Join me at MSERA 2016 in Mobile, Alabama!

Photo of Sandra Annette Rogers
Say hello if you see me.

Join me in Mobile, AL, this November 2nd-4th for the Mid-South Educational Research Association's (MSERA) 2016 annual meeting. Click this link to see the full conference schedule. The conference takes place at the Renaissance Mobile Riverview Plaza Hotel on Water Street downtown. For more information on the MSERA, visit their website. The great thing about #MSERA is that they are friendly and welcome newcomers, and they remember your name the next time they see you!

I’ll be making two brief paper presentations and chairing these same sessions. Here’s my schedule:

  • eLearning Session: Thursday, November 3, 2:00, Grand Bay Room I/II

    Rubric to Evaluate Online Course Syllabi Plans for Engendering a Community of Inquiry

    Sandra A. Rogers & James Van Haneghan, University of South Alabama

  • Instructional Design Session: Friday, November 4, 10:00, Windjammer Room

    Magis Instructional Design Model for Ignatian-based Distance Education

    Sandra A. Rogers, Spring Hill College

Goodbye eCollege, Hello Schoology!

Venn Diagram of the tools and features of eCollege compared to those of Schoology LMS

Here's a link to the PDF of this image. Pearson is closing its doors on eCollege and eCompanion, so we adopted a new learning management system (LMS). Schoology, by comparison, has many more features for our learners.


Instructional and Learner Analysis in Instructional Design


Instructional design (ID) is commonly segmented into five iterative phases: analysis, design, development, implementation, and evaluation (ADDIE). Instructional analysis and learner analysis are processes in the systematic approach to ID of a learning event or product. They occur simultaneously in the analysis phase, along with a context analysis, because they're intrinsically tied to the performance objectives, which are the outcome of the analysis phase. Other important activities in the analysis phase are the needs assessment (NA) and the performance analysis, both of which precede the instructional analysis and learner analysis.

The NA identifies the gap between the learners' optimal status and their actual status. The performance analysis is conducted to determine whether the problem can be addressed with instruction. If so, a goal statement is produced based on the findings of the performance analysis. The instructional analysis breaks down the goal statement into supraordinate, subordinate, and entry-level skills by identifying the aspects that will need to be taught to reach the goal. The learner analysis identifies the learners' current knowledge, skills, and attitudes, as well as other pertinent information, such as preferences or cultural constraints, that may impact learning. Overall, the goal of ID is to design effective, efficient, and innovative learning experiences.

In the instructional analysis, the instructional designer determines what the learners will actually be doing to reach the goal and the instructional pathway. During the goal analysis, the designer graphically displays the specific steps needed. In the diagram of the analysis, the designer can include alternative actions, breaks in the process, and the type of learning. Types of learning outcomes include verbal information, intellectual skills, cognitive strategies, psychomotor skills, and attitudes. Each type of learning outcome requires a different type of analysis. For example, verbal information can be clustered according to a particular schema. For intellectual or psychomotor skills, instructional designers use a hierarchical approach because a subordinate skill must be achieved before a supraordinate one.

The outcomes of the goal analysis become the supraordinate skills. During the subordinate skill analysis of a complex skill, the supraordinate steps are broken down into main rules, concepts, and discriminations. The corresponding verbal information and attitudinal skills are attached horizontally. Once the substeps have been fleshed out, the instructional designer determines the entry-level skills. These are what the learner should already know how to do in order to successfully achieve the new learning goal. For example, the instruction will generally require a certain reading level, language ability, and topic-specific knowledge.
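For readers who like to see the structure, here's a minimal sketch of such a skill hierarchy as a Python data structure. The goal, skill names, and outcome labels are hypothetical examples for illustration only, not taken from an actual analysis:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Skill:
    """One node in the instructional-analysis hierarchy."""
    description: str
    outcome_type: str                       # e.g., "intellectual" or "verbal"
    subskills: List["Skill"] = field(default_factory=list)
    is_entry_skill: bool = False            # assumed known before instruction begins

# Hypothetical goal analysis: supraordinate steps broken into
# subordinate skills, bottoming out in entry-level skills.
goal = Skill("Write a well-formed paragraph", "intellectual", subskills=[
    Skill("Compose a topic sentence", "intellectual", subskills=[
        Skill("Identify a complete sentence", "intellectual", is_entry_skill=True),
    ]),
    Skill("Order supporting details logically", "cognitive strategy"),
])

def entry_skills(skill: Skill) -> List[str]:
    """Walk the hierarchy and collect the entry-level skills."""
    found = [skill.description] if skill.is_entry_skill else []
    for sub in skill.subskills:
        found.extend(entry_skills(sub))
    return found

print(entry_skills(goal))  # -> ['Identify a complete sentence']
```

The recursive walk mirrors the point above: the designer works down from the goal to find what must be taught and stops at the skills learners are assumed to bring with them.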

As mentioned earlier, the learner analysis is done simultaneously with the instructional analysis because they inform one another. The function of the learner analysis is to understand the wide array of variables that affect the learner. These variables include entry skills, educational level, prior topic knowledge, attitudes toward the content, attitudes about the delivery system, attitude toward the organization, learning preferences, group characteristics, and motivation. The instructional designer collects information on the learners by conducting structured interviews with those familiar with their current performance. Additionally, the designer conducts site visits to observe the learners in the performance and instructional contexts. The designer can also collect data on the learners via pretests, self-reports, or one-on-one informal discussions.

The output of the learner analysis is a report on all the previously mentioned variables potentially affecting the learner. The context analysis is interrelated with the learner analysis, as it collects information on another category of variables affecting the learner: administrative support, physical site, social aspects of the site, and relevance of the skill (goal) to the workplace or school.
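As an illustration only, the two reports could be captured as simple record structures. The field names below mirror the variables listed in this post; the structure itself is a hypothetical convenience for organizing findings, not a prescribed ID artifact:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LearnerAnalysisReport:
    """Learner variables named in this post, one field each."""
    entry_skills: List[str]
    educational_level: str
    prior_topic_knowledge: str
    attitudes_toward_content: str
    attitudes_toward_delivery_system: str
    attitude_toward_organization: str
    learning_preferences: List[str]
    group_characteristics: str
    motivation: str
    data_sources: List[str]  # e.g., interviews, site visits, pretests

@dataclass
class ContextAnalysisReport:
    """Context variables named in this post."""
    administrative_support: str
    physical_site: str
    social_aspects_of_site: str
    relevance_of_goal_to_workplace_or_school: str
```

Keeping the two reports as separate records reflects the point above: they describe different categories of variables, even though both feed the same design decisions.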

All three analyses (instructional, learner, and context) are critical to the appropriate design and development of instruction. If any of the skills (supraordinate, subordinate, or entry-level) are overlooked, or learning-context variables are not addressed, the effectiveness of the instruction will be diminished. For example, if your target audience is English language learners, you'll need to collect data on their language skills, reading levels, and cultural norms; otherwise, the instruction created will not meet the learners' needs and will therefore be a waste of time, money, and effort.

Quality Matters for Online Instruction


What is it?

Quality Matters™ (QM) is a peer-review process for providing feedback and guidance on online course design. According to the QM website, it originated from the MarylandOnline Consortium project in 2003. They received a grant from the US Department of Education to create a rubric and review process based on research and best practices. In 2014, QM became its own nonprofit organization. Through a subscription service, the organization now provides training, resources, conference events, and research collaborations. It currently has 5,000 QM-certified reviewers to assist subscribers with the peer review of their online courses.

Who uses it?

QM provides specific rubrics and guidelines for the quality assurance review process for K-12, higher education, publishers, and continuing education programs that offer distance education. QM also has a new program that brings the rubric and process to students. The QM process is specifically for hybrid and fully online courses; it's not for web-enhanced face-to-face courses. QM currently has 900 subscribers, and subscription prices are adjusted to the size of your online programs.

How does it work?

A subscribing institution (or individual) requests a QM review of a course and submits an application. QM recommends that you familiarize yourself with the rubric through the training process in advance of the review. They also recommend that the course under review not be new; it should have run for a few semesters to work out the bugs. A QM coordinator assigns your course a team of reviewers consisting of a team leader and two other certified peer reviewers, one of whom is a subject matter expert. They read your self-report about the course and review it using the rubric and guidelines. The rubric covers these general standards: 1. Course Overview & Introduction, 2. Learning Objectives (Competencies), 3. Assessment & Measurement, 4. Instructional Materials, 5. Course Activities & Learner Interaction, 6. Course Technology, 7. Learner Support, and 8. Accessibility & Usability. The team contacts you with questions throughout the 4-6 week process. They then present you with your evaluation, giving you time to address any major issues before the report is finalized.

What are the benefits?

Courses that pass the review process receive recognition on the QM website. Even if you meet the standards, the peer reviewers provide recommendations for further improvement. Instructors can use this feedback in other courses they teach or debrief with colleagues about it, which serves as an ongoing continuous improvement process. It's something institutions can promote to their clients and instructors can add to their curricula vitae. From personal experience in becoming a QM-certified peer reviewer, I can attest to the benefits of knowing the best practices and accessibility requirements for online course design. It has helped me become a better online instructor and provided me with a wealth of knowledge for my work as an instructional designer. I'm grateful to the Innovation in Learning Center at the University of South Alabama for training me in the QM process and providing the opportunity to become a certified peer reviewer.