by Ethan Edwards, chief instructional strategist
The standard accepted methodology for designing instruction, including e-learning, is still centered solidly on the ADDIE process…a time-honored methodology composed of five steps: Analysis, Design, Development, Implementation, and Evaluation. I’m a strong proponent of the idea that the inherent linearity of this process creates huge obstacles to creating highly engaging interactivity in e-learning.
It isn’t that these tasks are necessarily wrong; rather, it’s that design for e-learning needs to be more exploratory and experimental in approach instead of prescriptive. At Allen Interactions we do our designing using an iterative, rapid-prototyping approach to creating effective instruction. Many training groups are beginning to modify ADDIE to include some flexibility, but generally they don’t seem to go far enough to effectively design really engaging interactions.
But whether you are using ADDIE or rapid prototyping, the success of a project will often rest on the kind of analysis activities carried out. When I ask students about some of the weaknesses or failures they’ve experienced in implementing ADDIE in creating e-learning, a very common response is that too little emphasis is placed on analysis—or that sometimes analysis is effectively skipped altogether. Often, subject matter has been pre-screened through the eyes of a subject matter expert, or worse yet, the content already exists in PowerPoint displays for an instructor-led course, and the belief is that Analysis is already complete.
Unfortunately, analysis carried out by dedicated SMEs is often completely content-bound. Focus falls entirely on content knowledge goals without addressing how the knowledge is applied. And analysis done for ILT usually didn’t have the particular strengths of e-learning—individualization, user control, and judgment-free activity—in mind when investigating the content.
You’ll still probably need to ask some of the standard analysis questions to filter and organize scope effectively, but regardless of what else is asked in analysis, I propose these 5 questions as the ones most critical for making effective e-learning even possible.
The 5 Most Important Analysis Questions You’ll Ever Ask:
1. What do you expect learners to be able to DO after completing the course that they can’t do now?
Don’t listen to the “They need to know this, and they need to know this, and they need to know this, blah, blah, blah” talk. (Well, you probably have to listen to it, but don’t pay much attention.) This approach usually reflects fairly thoughtless analysis, and if it actually were true, it’s likely that e-learning isn’t even your best option for dealing with it. Be precise about specific performance outcomes and relentless in pruning content that doesn’t directly support those desired behaviors.
2. What are the consequences TO THE LEARNER if the learner fails to master the intended outcomes?
Oftentimes, initial analysis identifies consequences of failure for the organization. While these are certainly important and might be the justification for funding the project, don’t assume that the same drivers have equal significance for the learner. This information will be critical in setting Context for every interaction, and it can often inspire elements of risk (and Challenge) that bring the interaction to life.
3. Insist on an active demonstration, a detailed simulation, or an opportunity to directly observe the desired performance. (ok…sorry this isn’t a question, but I’m sure you’ll make sense of it anyway.)
Until you can see the intended performance executed in the real environment, it is easy to overlook critical complexities that are masked by the logical structuring of content. We were recently working with a client to create a course on driver safety. The content was a straightforward six-step process, and each step was completely understandable and seemingly easy (e.g., “slow down,” “look both ways,” etc.). Reviewing that orderly content totally failed to capture the difficulty of the challenge. It wasn’t until we rigorously put the plan into play in a real situation and analyzed the errors that we grasped the real difficulty: it wasn’t knowing the steps, but becoming so fluent and prepared that learners could execute the steps flawlessly, almost simultaneously, without hesitation or delay.
4. What specific performance mistakes do new learners regularly make?
This question is essential for designing the right challenge and the right actions in your interactions. Learners learn mainly from the mistakes they make; if they can’t make during training the kinds of mistakes they tend to make on the job, the e-learning is unlikely to have any effect on the problems your organization is probably suffering from most. Again, this information is absolutely essential for deciding on the Actions and corresponding Feedback incorporated into your design.
5. What tools, resources, job aids, or help do successful performers (or even experts) use to do these tasks?
Oftentimes a training task is made harder than necessary: in a frenzy to “test” the learner, the design withholds basic information that even the most proficient performers use regularly, making the learning environment more difficult than the performance environment itself. Just as important as teaching the desired skills is making sure the learner is aware of all the tools and resources available as support to make success more likely.
If you are dedicated to investigating these questions and really listening to the answers, I’m certain that you will have the basic ingredients needed to begin designing true Instructional Interactivity.