Remember Dear Abby? Well, here is her counterpart in the Instructional Design field…
My work husband and I argue about how much content learners want. He feels we should load up each page with optional information – things learners might find useful. He says the advanced learners who want deeper levels of information should be accommodated, or else we’re losing the potential for star performers.
I worry about information overload. I strive for minimal content so learners can focus on mastering what’s there. Already, post-test scores are in the 60s. Why dump more words onto learners? Other than that, he’s a good work husband. He buys me chocolate shakes in the summer. Your thoughts?
-Divorce Isn’t an Option, Seattle
You and your work husband are both arguing the wrong question. It’s not about how much or how little content; it’s whether the content supports the desired performance. What do you want learners to DO? Define the mastery level of performance you desire, and you’ll know exactly what to include: you need only as much content as is necessary to perform that skill. Avoid divorce. Focus on performance.
My learners are the worst. They wait to complete required training the very day before it’s due. They’re not motivated to take additional courses. Post-tests are dismal, usually the lowest necessary scores to pass. How do I get them to get their act together?
-Tired of Lazy Learners, Denver
Sit down, dear. You may not like this. Have you considered the problem is actually you? What if they’re not lazy and unmotivated? What if they’re tired of your crappy training and your lame post-tests designed to assess the bare minimum? The same people you call lazy might be involved as coaches at their kids’ school, may ride horseback every weekend, or are planning a backpacking trip through Tibet. They’re just not into your training.
Your willingness to generalize about an entire group of people as “they’re all lazy” is a pretty good indicator the problem isn’t them. Sorry, but you asked. Start seeing learners as people with busy lives who don’t enjoy having their time wasted. That may help you gain insight into understanding your audience and finding ways to motivate them.
My boss won’t let me do anything. I’ve approached her a dozen times about the need for instructional interactivity and she says, “Oh, it sounds like a good idea, but we have no resources. We can’t afford that.” I know she’s right, but at the same time, I think we should somehow find the money or make ourselves pursue better training. How do I get her to come around?
-Workin’ That Cubicle, Toronto
You must speak her language. If her concern is money and resources, you must talk money and resources. Compare how much it costs to produce instructional interactivity and weigh that against the missed opportunities and losses incurred by NOT performing the desired behavior. When companies understand the actual cost of poor performance (often running into the hundreds of thousands or even millions of dollars at big companies), suddenly the cost of developing instructional interactivity doesn’t seem so grand. Get one of your business analysts to help you review numbers and costs until you’re fluent.
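The comparison above is just arithmetic, and it helps to walk into the meeting with it already worked out. Here is a minimal back-of-the-envelope sketch; every figure (employee count, error rate, cost per error, improvement rate) is a hypothetical placeholder you would replace with numbers from your own business analysts:

```python
# Back-of-the-envelope ROI comparison for investing in instructional
# interactivity. All figures below are hypothetical placeholders --
# swap in numbers from your own business analysts.

def annual_cost_of_poor_performance(employees, errors_per_year, cost_per_error):
    """Rough annual loss from the undesired behavior."""
    return employees * errors_per_year * cost_per_error

def payback_months(development_cost, annual_savings):
    """Months until the training investment pays for itself."""
    return development_cost / (annual_savings / 12)

# Hypothetical scenario: 2,000 employees, each making 5 costly
# errors a year at $150 per error.
loss = annual_cost_of_poor_performance(2000, 5, 150)   # $1,500,000/yr
training_cost = 120_000   # hypothetical cost to build the interactivity
improvement = 0.30        # assume training prevents 30% of errors

savings = loss * improvement                           # $450,000/yr
print(f"Annual loss from poor performance: ${loss:,.0f}")
print(f"Annual savings at 30% improvement: ${savings:,.0f}")
print(f"Payback period: {payback_months(training_cost, savings):.1f} months")
```

Even with deliberately conservative assumptions, framing the request this way turns "we can't afford that" into a payback-period conversation, which is the language a budget-minded boss actually speaks.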
Also, are you sure it’s about money?
When my kids wanted us to go to Disney World, I told them we couldn’t afford to go. That was true, but not completely true. When they came back to me with a plan for cheap hotels and reasonable airfare (those eager beavers discovered kayak.com), I was forced to admit that it wasn’t about the money so much as I hated massive crowds. By addressing the concerns I put out there, they found a way to unearth my true objection. We’re going to Disney next March. Sigh. Consider that there’s more to your boss’s objection than you understand.
I want audio narration throughout the course because it appeals to learners who have an auditory learning style, but the cost to record (and re-record when course updates come out) is prohibitive. What do you advise?
-Do You Hear What I Hear, Ann Arbor
First, let’s eliminate your misperception about learning styles. Do a Google search. There is no research suggesting that the differences in learning acquisition achieved through different learning styles are *significant enough* to warrant implementing multiple approaches. We need to stop justifying audio and video because of “learning styles.” Let’s put an end to this conversation. Some learners may prefer bulleted items and others prefer video. Sure. Why not? But these preferences won’t stop them from mastering material.
Now, let’s talk about audio in your course. Should you do it? Yes, when it supports the performance goals. The same applies to video. If the skill you want learners to master is interpreting a customer’s attitude over the phone, then yes, audio is necessary to include. It’s a powerful part of the context necessary for performance. If reading ability is a concern in your target audience, maybe audio should be a consideration here as well. But it shouldn’t be a consideration because of learning-style garbage. Use audio (or video or animation or whatever) because it enhances the context of the skill to be performed.
My learners’ test scores are usually in the 70s. I’ve been charged with bringing the average score up into the 80s. It’s a standard 25 question, multiple-choice test. Suggestions?
-Test Mastery, Libertyville
Make the test easier. People will score higher. There. Problem solved.
But let me address a bigger problem, one mentioned in your letter and in a few others during this week’s column: low test scores.
What exactly are you people measuring?
If you’re performing a Kirkpatrick Level 2 pre- and post-test, you know the limitations, right? Stakeholders want the post-test to reveal “how much was learned,” but that’s not an accurate takeaway. More accurate: how much the learners memorized for the post-test. Or: how good a guesser the learner is in test-taking situations.
Most post-tests are so poorly designed, and the multiple-choice questions so poorly written, that learners don’t need to complete the e-learning to pass. A test-wise guesser can score well without learning a thing.
Better approaches are possible. If the training truly asked learners to perform the behavior they perform in the real world, well, the post-test would ask the exact same of the learners, but would remove some of the “help elements” that offered assistance during the e-learning. The test would match the e-learning 100% and be completely fair since it’s assessing the behavior taught.
If you can’t create a post-test with this fidelity, consider a test asking learners how they intend to implement the new skills. Promising studies show links between learner intention to implement skills and learners actually implementing the new skill. Of course, these aren’t multiple-choice questions, so they’re not easily scored. Your staff might actually have to—gasp—care about the answers to read and evaluate them.
Good post-tests are possible. But fretting about the low scores on a poorly-designed test is a waste of your brain.
End of rant—I’ll step off my soapbox now.