CALL Criteria for Use of the EverQuest II Video Game

Ocelot in full armor with sword on a snowy tundra with orcs running in the background
Meet my virtual identity, Kerrannie

As a budding computer-assisted language learning (CALL) researcher, I selected EverQuest II (EQ2) for my second language acquisition (SLA) research study based on a previous study and similar gaming literature. Little did I know how much reading and advanced vocabulary were involved in this game, vocabulary that you need to know in order to advance to the next level. Reading fiction is a good way to improve your vocabulary. Reading while immersed in the context is even better for the language learner!

EQ2 belongs to the game genre of massively multiplayer online role-playing games (MMORPGs). Scholars like Millard (2002) believe that modern technologies can improve literacy. I'm using EQ2 combined with SLA strategies as an after-school intervention with English language learners to see if it will improve their grammar, reading, and vocabulary.

Chapelle (2001) developed criteria for CALL media selection that include language learning potential, learner fit, meaning focus, authenticity, positive impact, and practicality. Other SLA researchers have used these criteria to vet video game selection for their research (Miller & Hegelheimer, 2006). The criteria give me a great way to share how impressed I am as an ESL educator with EQ2 as a medium for informal learning. Here are my initial, albeit brief, understandings of how EQ2 fits the CALL criteria proposed by Chapelle:

  • Language Learning Potential: Text-based and/or live chats with native English speakers; written support of all communication in chat logs and speech bubbles; scaffolded introduction to each player's role; and environmental cues, animation, and audible alerts that enhance understanding
  • Learner Fit: Current literature indicates gaming holds promise for educational purposes; EQ2 is rated T for Teen (ESRB, 2016), making for a more approachable theme; and participants are university students who are familiar with online gaming
  • Meaning Focus: Role-play draws meaning from several narratives set in various kingdoms; and encounters provide salutations, skirmishes, and humor
  • Authenticity: 5,000 creatures to encounter on 8,000 quests for situated learning encounters with non-player characters and gamers; capability to build your own virtual identity; and possibility of failure
  • Positive Impact: Level-up announcements; tokens for continuance in gameplay; game currency for quest completion; and rewards for being courageous, etc.
  • Practicality: Free up to 91 levels of play; online for ease of access anytime; tutorials available in-game and on YouTube; and user-friendly tips and error messages

Drawbacks include the need for a sufficient computer graphics card, hard drive storage space, and the support of a "gaming coach" for first-time gamers. I realize that EQ2 is no longer the most sophisticated or popular game; its heyday was around 2011. Actually, this is why I selected this video game for my research study: participants will likely not be familiar with it.

References

Chapelle, C. A. (2001). Computer applications in second language acquisition: Foundations for teaching, testing, and research. Cambridge, England: Cambridge University Press.

Entertainment Software Rating Board. (2016). ESRB ratings. New York, NY: Entertainment Software Association. Retrieved from https://www.everquest2.com/news/february-2016-producers-letter-holly

Millard, E. (2002). Boys and the Blackstuff. National Association for the Teaching of English (NATE) Newsletter, 16, January.

Miller, M., & Hegelheimer, V. (2006). The Sims meet ESL: Incorporating authentic computer simulation games into the language classroom. International Journal of Interactive Technology and Smart Education, 3(4), 311–328.

Thanks to all my followers!

Heart Tagxedo for blog post image

I wanted to celebrate the milestone of reaching 1,000 followers on my blog! Thanks to all of you who subscribe to Teacherrogers' blog on WordPress. My first blog post was in 2010, but I didn't really become active until 2011. This will be post #139. I also blogged for TESOL International Association on their website during 2011-2012. Additionally, I blogged for a workforce education nonprofit I spearheaded from 2007 to 2009. Some of those blog posts have been republished here.

As a subscriber or regular reader, you know that I strive to provide you with relevant information on instructional design, learning theories, and the integration of technology and social media into the learning environment, as well as specific information in my areas of interest (second language acquisition, gaming, and e-learning). My blog posts also serve as an archive of my learning. This provides me with a place to review and reflect. I hope my blog posts have provided you with the information you needed or, at the very least, an idea or link to follow up on.

Thanks again for following me on this journey of social blogging!  Please join me in this celebration by leaving me a comment.

Gratefully,

Sandra Annette Rogers, MAT

Instructional and Learner Analysis in Instructional Design

The ADDIE acronym: Analysis, Design, Development, Implementation, Evaluation

Instructional design (ID) is commonly segmented into five iterative phases: analysis, design, development, implementation, and evaluation (ADDIE). Instructional analysis and learner analysis are processes in the systematic approach to the ID of a learning event or product. They occur simultaneously in the analysis phase, along with a context analysis, because they are intrinsically tied to the performance objectives, which are the outcome of the analysis phase. Other important activities in the analysis phase are the needs assessment (NA) and the performance analysis, both of which precede the instructional analysis and learner analysis.

The NA identifies the gap between the optimal status and the actual status of the learners. The performance analysis is conducted to determine whether the problem can be addressed with instruction. If so, a goal statement is produced based on the findings of the performance analysis. The instructional analysis breaks down the goal statement into supraordinate, subordinate, and entry-level skills by identifying the aspects that will need to be taught to reach the goal. The learner analysis identifies the learners' current knowledge, skills, and attitudes, as well as other pertinent information, such as preferences or cultural constraints, that may impact learning. Overall, the goal of ID is to design effective, efficient, and innovative learning experiences.

In the instructional analysis, the instructional designer determines what the learners will actually be doing to reach the goal and the instructional pathway. During the goal analysis, the instructional designer graphically displays the specific steps needed. In the diagram of the analysis, she can include alternative actions, breaks in the process, and the type of learning. Types of learning outcomes include verbal information, intellectual skills, cognitive strategies, psychomotor skills, and attitudes. Each type of learning outcome requires a different type of analysis. For example, verbal information can be clustered according to a particular schema. For intellectual or psychomotor skills, instructional designers use a hierarchical approach because a subordinate skill must be achieved before a supraordinate one.

The outcome of the goal analysis becomes the supraordinate skills. During the subordinate skill analysis of a complex skill, the supraordinate steps are broken down into main rules, concepts, and discriminations. The corresponding verbal information and attitudinal skills are attached horizontally. Once the substeps have been fleshed out, the instructional designer determines the entry-level skills. These are what the learner should already know how to do in order to successfully achieve the new learning goal. For example, the instruction will generally require a certain reading level, language ability, and topic-specific knowledge.
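
To make the hierarchy concrete, here is a brief hypothetical breakdown of my own (an illustrative sketch, not drawn from any published analysis) for the goal "write a summary of a short news article":

  • Supraordinate skills (from the goal analysis): read the article; identify the main idea; select supporting details; draft the summary in one's own words
  • Subordinate skills (for "identify the main idea"): discriminate between main ideas and supporting details; apply the concept of a topic sentence; recall the rule that a summary omits minor details
  • Entry-level skills: decode text at the required reading level; write grammatically complete sentences

Everything above the entry-level line must be taught; everything below it is assumed of the learner before instruction begins.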

As mentioned previously, the learner analysis is done simultaneously with the instructional analysis because they inform one another. The learner analysis serves to capture the wide array of variables that affect the learner. These variables include entry skills, educational level, prior topic knowledge, attitudes toward the content, attitudes about the delivery system, attitudes toward the organization, learning preferences, group characteristics, and motivation. The instructional designer collects information on the learners by conducting structured interviews with those familiar with the current performance. Additionally, the instructional designer conducts site visits to observe the learners in the performance and instructional contexts. Furthermore, she can collect data on the learners via pretests, self-reports, or one-on-one informal discussions.

The output of the learner analysis is a report on all the previously mentioned variables potentially affecting the learner. The context analysis is interrelated with the learner analysis as it collects information on another category of variables affecting the learner: administrative support, physical site, social aspects of the site, and relevance of skill (goal) to the workplace/school.

All three analyses (instructional, learner, and context) are critical to the appropriate design and development of instruction. If any of the skills (supraordinate, subordinate, and entry-level) are overlooked, or learning-context variables are not addressed, the effectiveness of the instruction will be diminished. For example, if your target audience is English language learners, you'll need to collect data on their language skills, reading levels, and cultural norms; otherwise, the instruction created will not meet the needs of the learners and will therefore be a waste of time, money, and effort.

Goals of Research Study on MMORPGs + SLA Strategies

This summer, I started the research study for my dissertation on massively multiplayer online role-playing games (MMORPGs) combined with second language acquisition (SLA) strategies. I want to find out if free, commercial video games, MMORPGs in particular, are useful in helping English language learners (ELLs) acquire English. Could MMORPGs be used to supplement language programs or personal learning agendas? I'll be using EverQuest II combined with three language strategies as an after-school add-on in a pretest-posttest control-group design.

In my previous case study on gaming and language learning, as in the wider literature I reviewed, ELLs self-reported that they learn a lot of English from playing video games. Researchers on this topic are also reporting positive gains for ELLs in vocabulary and language skills (reading, writing, listening, and speaking). My study focuses on vocabulary and reading gains, as well as student attitudes toward gaming as a language learning tool. I'll use statistical techniques to control for prior gaming experience and prior vocabulary and reading knowledge.

The goal of my study is to foster ELLs' communicative competence, no matter their locale or socioeconomic situation. Free role-play gaming (EQ2 provides 91 levels of free play) can provide access to authentic learning environments for experiential learning. MMORPGs may challenge ELLs linguistically, but with accessible themes and embedded support systems. The literature on gaming indicates that gamers practice information literacy skills (seeking and disseminating information), collaboration, problem-solving, and decision-making through meaningful and relevant tasks.

I’ll keep you posted on my progress and findings on this blog.

Quality Matters for Online Instruction

Quality Matters (QM) logo

What is it?

Quality Matters™ (QM) is a peer-review process for providing feedback and guidance on online course design. According to the QM website, it originated from the MarylandOnline Consortium project in 2003, which received a grant from the US Department of Education to create a rubric and review process based on research and best practices. In 2014, QM became its own nonprofit organization. Through a subscription service, the organization now provides training, resources, conference events, and research collaborations. It currently has 5,000 QM-certified reviewers to assist subscribers with the peer review of their online courses.

Who uses it?

QM provides specific rubrics and guidelines for the quality-assurance review process for K-12, higher education, publishers, and continuing education programs that offer distance education. QM also has a new program to bring the rubric and process to students. The QM process is specifically for hybrid and fully online courses; it's not for web-enhanced face-to-face courses. QM currently has 900 subscribers, and subscription prices are scaled to the size of your online programs.

How does it work?

A subscribing institution (or individual) requests a QM review of a course and submits an application. QM recommends that you familiarize yourself with the rubric through the training process in advance of the review. They also recommend that the course under review not be new; it should have run for a few semesters to work out the bugs. A QM coordinator assigns your course a team of reviewers consisting of a team leader and two other certified peer reviewers, one of whom is a subject matter expert. They read your self-report about the course and review it using the rubric and guidelines. The rubric covers these general standards: 1. Course Overview & Introduction, 2. Learning Objectives (Competencies), 3. Assessment & Measurement, 4. Instructional Materials, 5. Course Activities & Learner Interaction, 6. Course Technology, 7. Learner Support, and 8. Accessibility & Usability. The team contacts you with questions throughout the 4-6 week process. Then they present you with your evaluation, with time to address any major issues before the report is finalized.

What are the benefits?

Courses that pass the review process receive recognition on the QM website. Even if you meet the standards, the peer reviewers provide you with recommendations for further improvements. Instructors can use this feedback for other courses they teach or debrief with colleagues about it, making the review an ongoing continuous-improvement process. It is also something that institutions can promote to their clients and that instructors can add to their curriculum vitae. From personal experience in becoming a QM-certified peer reviewer, I can attest to the benefits of knowing the best practices and accessibility requirements for online course design. It has helped me become a better online instructor and provided me with a wealth of knowledge for my work as an instructional designer. I'm grateful to the Innovation in Learning Center at the University of South Alabama for training me on the QM process and providing the opportunity to become a certified peer reviewer.

SITE Conference Day 2: My Itinerary for PD

I finally decided on the presentations to attend on Wednesday.

Wednesday, March 23rd, 2016

  • 8:30 AM-9:45 AM: General Session, Paper Awards & Keynote: Larysa Nadolny, Iowa State University, EPIC WIN: Designing for success with game-based learning
  • 10:15-10:45 AM in Scarbrough 1: What Features We Like When We Like Educational Games, Spencer Greenhalgh, Matthew Koehler & Liz Owens Boltz
  • 10:45 AM-11:15 AM in Regency F: Establishing Presence and Community in the Online Classroom, Brianne Leigh Moore-Adams & Sarah Warnick
  • 11:50 AM-12:10 PM in Scarbrough 1: Applying Conceptual Change Model in the Professional Development for Online Faculty, La Tonya Dyer & Liyan Song
  • 12:10-12:30 PM in Verelst: I'm Just A Blog, Yes I'm Only a Blog: Educating Teachers to Develop Students' Skills in Digital Rhetoric, Teresa Marie Kelly & Barbara Green
  • 12:30 PM-1:45 PM in Harborside Center: Universal Design for Learning SIG (TEC)
  • 1:45 PM-2:45 PM in Harborside Center: Developing Google Certified Educators in Undergraduate Teacher Education, Ryan Visser & D. Matthew Boyer
  • 1:45 PM-2:45 PM in Harborside Center: Investigating the Impact of Gamified Learning on Post-Secondary Education Students' Ability to Self-Regulate their Learning, Stein Brunvand & David Hill
  • 1:45 PM-2:45 PM in Harborside Center: Use of Piktochart to Enhance Teacher Action Research, Heather Leaman, Connie DiLucchio & Michelle Fisher
  • 3:00 PM-4:00 PM in Regency AB: Establishing STEAM Technology/Maker Labs in Colleges of Education: Challenges, Opportunities, and Lessons Learned, Jonathan Cohen, Monty Jones & Shaunna Smith
  • 4:15 PM-5:15 PM in Harborside Center: Evaluation of Faculty Boot Camp Professional Development for Online Course Instruction, Barbara Duchardt, Paula Furr, Steve Horton & Ronald McBride
  • 5:30 PM-7:00 PM in Harborside Center: Saudi ELLs' Digital Gameplay Habits and Effects on Second Language Acquisition, Sandra Rogers (me) & Burke Johnson

SITE Conference Day 1: My Itinerary for Professional Development

I can’t wait to see all of these great presentations at the SITE conference next week!

Society for Information Technology & Teacher Education (SITE), Tuesday, March 22nd, 2016, Planner

  • 8:30 AM-9:45 AM: Welcome, General Session & Keynote, Marc Prensky, Global Future Education Foundation and Institute, "PLAN B": Education to Improve the World
  • 10:15 AM-11:15 AM in Harborside Center: Exploring simSchool: A Simulation-Based Learning Tool for Educators, David Collum, Melanie Bishop & Timothy Delicath
  • 11:30-11:50 AM in Regency F: Best Practices for Diverse Learners: Universal Design for Learning Online & Off, Dr. Elizabeth Dalton & Liz Berquist
  • 11:50 AM-12:10 PM in Regency F: Rubric to Evaluate Online Course Syllabi Plans for Engendering a Community of Inquiry, Sandra Rogers (me) & James Van Haneghan
  • 12:30 PM-1:45 PM in Regency D: Digital Games & Simulations (ITC) Special Interest Group Meeting
  • 1:45 PM-2:05 PM in Scarbrough 4: Avoiding Epic Losses: Steps for Integrating Meaningful Gamification into the Classroom, Lorraine Beaudin
  • 2:05-2:25 PM in Scarbrough 4: Self-Organized Learning Environments (SOLEs), Selma Koc & Ahmed Ali
  • 3:00 PM-4:00 PM in Harborside Center: Wearables as Assistive Technology, Cindy Anderson & Kevin Anderson
  • 3:40 PM-4:00 PM in Scarbrough 4: Serious Games Classroom Implementation: Teacher Perspectives and Student Learning Outcomes, Monica Trevathan, Michelle Peters, Jana Willis & Linda Sansing
  • 4:15-4:35 PM in Regency E: Interactive Video Authoring: Student and Instructor Experiences, Liz Berquist & Lance Cassell
  • 4:55 PM-5:15 PM in Regency E: edTPA Videoing Made Easy: Standardizing the Process with iPads, Holley Roberts & Christopher Greer
  • 5:30 PM-6:30 PM: Welcome Reception!

Join me at SITE 2016 in Savannah, GA!

Photo of Sandra Annette Rogers
Say hello if you see me.

Two of my proposals were accepted for presentation at the Society for Information Technology and Teacher Education (SITE) International Conference in Savannah, GA. I'd love to connect with any of my readers who are also going to SITE. This will be my second time attending this conference and my first time in the city of Savannah. I can't wait!

Here's my current schedule for the conference: (All times are Eastern Time.)

1. Brief Paper: Rubric to Evaluate Online Course Syllabi Plans for Engendering a Community of Inquiry, March 22, 2016, 11:50 A.M.-12:10 P.M., in Hyatt Regency F.

2. Poster Session: Saudi ELLs' Digital Gameplay Habits and Effects on SLA: A Case Study, March 23, 2016, 5:30-7:00 P.M., in the Hyatt Regency Harborside Center. See my poster below.

My Human Performance Improvement Toolbox

HPI Image for blog

Beresford and Stolovitch (2012) defined human performance improvement (HPI) in terms of three perspectives: vision, concept, and end. Vision is for individuals to succeed in areas that are valued by their organization's stakeholders. Concept is to use the vision to accomplish the organization's goals through successful interactions not only with the organization's stakeholders but also with customers, regulatory agencies, and society. End refers to terminal behaviors, products, and other outcomes that provide a return on investment (ROI). I'll use Beresford and Stolovitch's perspectives on HPI in my toolbox to address the needs of an organization.

Gilbert (2007) provided HPI with a formula for worthy performance (Pw): Pw = Av/Bc, where Av refers to valued accomplishments and Bc refers to costly behaviors. The term "costly" can have positive and negative connotations; it references the costs involved with each performance (e.g., salaries, resources, and training). Gilbert's formula is a powerful tool for better determining worthy performances.
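
To illustrate the formula with hypothetical numbers of my own (not Gilbert's): suppose a sales team's accomplishments are valued at $300,000 per year (Av), and the behaviors that produce them cost $60,000 per year in salaries, resources, and training (Bc). Then Pw = 300,000/60,000 = 5, meaning each dollar spent on behavior returns five dollars in valued accomplishment. A ratio well above 1 suggests a worthy performance; a ratio near or below 1 signals that the behavior costs as much as, or more than, the value it produces.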

The first step in improving a particular performance is to conduct a needs assessment (NA) to better understand the current performance in relation to desired outcomes, such as industry standards (benchmarking), coupled with the vision of the organization. An NA helps organizations identify the gap (need) between their actual and optimal performance levels. I would rely on Altschuld's (2010) three-phase NA model (preassessment, NA, postassessment) as a guide for interacting with an NA team and an NA committee of stakeholders. In the preassessment, my team would gather data on the topic from key informants, literature, and extant resources.

The NA team would follow up on emergent themes describing the perceived need and gather specific information via interviews, questionnaires, and focus groups on what the respondents value as possible solutions. The NA postassessment process identifies the problem succinctly. Is the gap due to a lack of incentives, knowledge, skills, or institutional support? Training is not always the answer. Interactions and behaviors can be improved via instructional and/or noninstructional interventions. For instance, HPI can be as simple as buying a better writing instrument (e.g., a Dr. Grip pen) to expedite note-taking on the job. This would be a noninstructional intervention.
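
As a hypothetical walk-through of the three phases (my own example, not Altschuld's): in the preassessment, help-desk logs and informal complaints suggest that quarterly reports are chronically late; in the NA phase, interviews and a questionnaire reveal that staff know how to write the reports but lack a shared template and a published deadline; in the postassessment, the gap is defined as a lack of institutional support rather than a skill deficit, so the recommended intervention is a template and a posted schedule (noninstructional) rather than training.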

I'd utilize the various job aids provided in Altschuld's series of books to identify and address the problem in light of the organization's concepts. For example, I favor Ishikawa's fishbone diagram, with the bones representing the various issues within labeled categories of performance. Moreover, I'd collect solutions from stakeholders and conduct a Sork feasibility study to determine the appropriate solutions. Given the complexity of an NA, the Altschuld series would serve as another item in my HPI toolbox.

I created a manual of methods for problem analysis (PA) for novice instructional designers that can be used on a daily basis when a full NA is impossible. I studied Jonassen's typology of problems to determine the problem type and possible actions required. I learned that if the problem is well-structured, then a quick solution can be found because it is easily solved. If it is ill-structured, then I should conduct a PA to get to the root of the problem. I would use Harless' (1974) list of 14 questions for PA. I recognize his first one as being very important: Is there a problem? After a problem is identified, I would use Toyoda's Why Tree for root cause analysis; this technique keeps asking why for each response given until the root(s) is identified. Then I would use Sanders and Thiagarajan's six-box model to see which areas of an organization are affected by these performance problems: knowledge, information, motives, process, resources, and wellness. I also learned from Jonassen's (2004) work that we should collect our problems in a fault database. This is something I have been doing to improve our turnaround in resolving learning management system (LMS) issues at my workplace and to increase our ROI for cost, labor, and learning outcomes.
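
Here is a hypothetical Why Tree of my own to show how the technique drills down: Why are instructors not posting grades in the LMS? Because the gradebook setup confuses them. Why does it confuse them? Because the grade categories were imported incorrectly. Why were they imported incorrectly? Because the course template was built from an outdated shell. Why was an outdated shell used? Because there is no review step before templates are copied each term. The root cause is a missing process step, so the appropriate fix is procedural rather than instructional.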

For interventions at my workplace, I use job aids, embedded performance-support systems, and the aforementioned idea of a fault database. I purchased Rossett and Gautier-Downes' (1991) HPI resource book, A Handbook of Job Aids. This book provides matrices (Frames Type II) for the user to discern which job aid should be used with which type of task. I also create job aids for the workplace to facilitate teaching and learning. For example, I create how-to guides for instructional technology software (e.g., Camtasia Studio) for instructors who are unable to attend trainings and must learn on their own. Job aids are useful HPI tools for infrequent tasks, like the occasional instructional video one might need to create for class. I have also been focusing on providing performance-support mechanisms for the right-time needs of students and instructors. I noticed an overreliance on the instructional designer to answer all LMS-related questions. To provide an embedded support system, I added a webpage on our LMS to answer frequently asked questions. This has greatly reduced my queue of email requests, all the while improving the performance of those affected. In closing, for my general HPI framework, I rely on Beresford and Stolovitch's HPI perspectives of vision, concept, and end. To put my framework into action, I rely on the works of Gilbert, Altschuld, Jonassen, Harless, Ishikawa, Sanders, Thiagarajan, and Toyoda.

References

Altschuld, J. W., & Kumar, D. D. (2010). Needs assessment. Thousand Oaks, CA: SAGE Publications.

Beresford, B., & Stolovitch, H. D. (2012). The development and evolution of human performance improvement. In R. A. Reiser & J. V. Dempsey (Eds.), Trends and issues in instructional design & technology (3rd ed., pp. 135-146). Boston, MA: Allyn & Bacon Pearson Education.

Gilbert, T. F. (2007). Human competence: Engineering worthy performance (Tribute ed.). San Francisco, CA: Pfeiffer.

Harless, J. H. (1974). An analysis of front-end analysis. Improving Human Performance, 2(4), 229-244.

Jonassen, D. H. (2004). Learning to solve problems: An instructional design guide. San Francisco, CA: Pfeiffer.

Rossett, A., & Gautier-Downes, J. (1991). A handbook of job aids. San Francisco, CA: Pfeiffer & Company.

Thanks and Happy New Year!

Cartoon headshot of blogger, Sandra Rogers

Dear Readers,

Thank you for all of your comments and re-sharing of my blog posts. I'm so humbled to have such a growing readership. I hope I have created some useful content for each of you. Let me know if you have a topic of interest that I might blog about. I also renewed my Podbean podcast and hope to interview educators, instructional designers, game designers, and innovators on learning. Let me know if you're interested!

Since I began this blog in 2011, I have tried to make it practical. In 2015, I decided to transform it into a scholarly blog as I near the end of my doctoral program of study. I'm slowly going back to previous posts to update them with citations and more precise advice based on research.

In 2016, I will continue to blog at least once a week to share my learning with you. Blogging is my way of reviewing information to help me remember it. It also helps me synthesize information to share with the educators I work with now. Whenever I find myself explaining something verbally, I check to see if I have a blog post on it to share as a follow-up. If I don't, I write one. My blog posts have become job aids!

This is the year that I start, and hopefully complete, my dissertation on gaming for second language acquisition. I will share more on this topic once I complete the proposal.

Happy New Year!

Sandra Annette Rogers

aka Teacherrogers