Call them knowledge checks, quizzes, or tests: they should be part of your eLearning. But are you assessing learner skills, or are your quizzes just reading comprehension tests? How can you test the target objectives? Assessments are a critically important component of eLearning, whether the course covers compliance topics, software skills, or soft skills.

I interviewed Jean Marrapodi recently about the challenges of assessment.

BB: What are the mistakes designers make in devising assessments?

JM: They assess the wrong thing. I see that over and over again. Designers will pull a bit of trivia out of some point in the course, as if everyone had a photographic memory for the details. That's the most common mistake.

Second is asking a subject matter expert to write the assessment. Experts think differently than beginners, and as a designer you really have to work hard to help subject matter experts think at the level of the learner who will be doing the work. This includes not underestimating the learners.

Third is my really big beef: making assessments obvious. Designers want people to pass. Great, but is the goal for learners to pass, or is the goal for them to show that they know what to do? Let's think about that. We don't want to make an assessment too hard, but we do want learners to think and apply.

Fourth, do we allow them to learn and fail along the way? We have formative assessments and we have summative assessments. Think about the difference between a quiz and a test. The formative assessment happens while they are learning and figuring things out; if they get something wrong, they learn from it. That's formative: the learning is being formed as they go. The summative evaluation is about the stuff that counts in the end, when they have to have a record of something. What are you trying to have the learner accomplish? How do we balance the formative and the summative?

The fifth mistake would be tacking on a multiple-choice test, just because that's the way we've always done it. I've been asked to write software training with multiple-choice questions at the end. If we're training people on using software, we should have a software simulation where they have to click what they're supposed to click next. And we scaffold that the way the software is scaffolded.
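To make that concrete, here is a minimal sketch, in TypeScript, of how a click-through simulation assessment of that kind could track whether learners click the right thing at each step. The step prompts, element IDs, and scoring rule are hypothetical illustrations, not taken from the interview or from any particular authoring tool.

    // Minimal sketch of a click-through software-simulation assessment.
    // Step names and element IDs below are hypothetical examples.
    interface SimStep {
      prompt: string;        // what the learner is asked to do
      correctTarget: string; // id of the UI element they must click
    }

    const steps: SimStep[] = [
      { prompt: "Open the customer record", correctTarget: "btn-open-record" },
      { prompt: "Edit the mailing address", correctTarget: "btn-edit-address" },
      { prompt: "Save your changes",        correctTarget: "btn-save" },
    ];

    let current = 0;
    let errors = 0;

    // Call this from the click handler of every clickable hotspot.
    function handleClick(targetId: string): void {
      if (current >= steps.length) return; // simulation already finished
      if (targetId === steps[current].correctTarget) {
        current++; // advance the scaffold only on the correct action
        if (current === steps.length) {
          console.log(`Simulation complete with ${errors} missteps.`);
        }
      } else {
        errors++; // record the wrong click; feedback could be shown here
        console.log(`Not quite. Try: ${steps[current].prompt}`);
      }
    }

The scaffolding lives in the step order: the learner can only advance by performing the correct action, just as the software itself would require.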

BB: How does your design or delivery choice affect your choice of assessment?

JM: Well, the great qualifier here is: it depends. One of the biggest challenges that we face in eLearning is that it's really easy to create multiple-choice questions, drag-and-drop questions, or questions that deal with information recall. But when we are trying to train people to do something, we want them to do the thing that we're trying to have them do. We don't want to grill them on vocabulary, and that's one of the things that I see way too much in eLearning.

Designers sometimes believe we have to do something so that learners can score 80 percent. Where 80 percent comes from is a great enigma left over from grammar school, so a designer will write five questions, which allows a learner to get one wrong (four of five correct is 80 percent). So the learners answer the questions and, yay, if they get enough right, they're done and supposedly they're competent in whatever it is. But it comes back to the initial goal. We're about learning how to do things. And so we want to create assessments that assess not only whether they know, but whether they can do.

My critical bottom line is: Does your assessment measure to the goal that you're trying to achieve?

In compliance training, we give learners lots of facts and figures and cite all kinds of stuff. And then we grill them about the rule. Okay, they know the rule, but that's not the point of compliance. Compliance training is about doing whatever behavior it is that complies with the regulation. Back in the old days when HIPAA was rolling out, we would grill people on keeping personal information private, which we still do today. And then your “do” was, “Don't leave stuff on the fax machine. Don't leave your computer unlocked, and in the banking world don't leave personally identifiable information out where people can see it.”

Well, if an employee knew the rule was “don't share personal information,” wonderful, she passed the test. But then she didn't comply with the behavior that she needed to have. So one of the things that we want to do when we're writing compliance training is to create situations that are as close to the real world as we can get, and ask the learner to make decisions like the ones they would have to make in the real world. One way you could measure for that particular scenario is to have learners find all the places where there is possible PII—personally identifiable information. Where could PII possibly be exposed? Have learners click on those spaces. Create a pop-up that shows the item, say a piece of paper and what it says, so the learner can determine, “Is this okay, or is this not okay?” That helps determine whether learners are thinking about the behavior they have to perform in order to comply with the rules.
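As a rough sketch of how that hotspot activity could be scored, here is one possible TypeScript implementation. The hotspot IDs, the descriptions, and the idea of counting found, missed, and falsely flagged items are all assumptions made for illustration, not details from the interview.

    // Minimal sketch of the PII hotspot scenario described above.
    // Hotspot names, descriptions, and the scoring rule are hypothetical.
    interface Hotspot {
      id: string;
      description: string;  // shown in the pop-up, e.g. the paper on the fax machine
      containsPII: boolean; // is personally identifiable information exposed here?
    }

    const scene: Hotspot[] = [
      { id: "fax-tray",   description: "Customer statement left on the fax machine",  containsPII: true },
      { id: "monitor",    description: "Unlocked computer showing an account screen", containsPII: true },
      { id: "whiteboard", description: "Whiteboard with this week's meeting schedule", containsPII: false },
    ];

    const flagged = new Set<string>();

    // Called when the learner clicks a hotspot and decides “not okay.”
    function flagAsExposed(id: string): void {
      flagged.add(id);
    }

    // Compare the learner's decisions against the scene definition.
    function scoreScene(): { found: number; missed: number; falseAlarms: number } {
      const pii = scene.filter(h => h.containsPII);
      const found = pii.filter(h => flagged.has(h.id)).length;
      const missed = pii.length - found;
      const falseAlarms = [...flagged].filter(
        id => !scene.find(h => h.id === id)?.containsPII
      ).length;
      return { found, missed, falseAlarms };
    }

Scoring on found versus missed exposures, rather than on recall of the rule's wording, is what ties the assessment back to the compliant behavior.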

BB: What about the soft skills, the non-procedural skills, the communication skills? How do you assess those?

JM: Well, let's step back to see what we're trying to do. Are we teaching them to handle difficult customers? One way to do that is to create personas: teach how you react to this one and how you react to the other one, and now you have the outlier, with the cross-cultural concerns that come with globalization. Should cross-cultural behavior be a part of that course? It depends on the audience that you're working with. If you're teaching sensitivity to globalization issues and intercultural issues, then you train to that and you measure against that. Put a scenario in. You want learners to apply the knowledge that they gained, so you could, for example, have them drag and drop responses to match each persona's expectations.
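One minimal way such a persona-matching, drag-and-drop check could be scored is sketched below in TypeScript. The persona names and expected response labels are invented for illustration and are not from the interview.

    // Minimal sketch of scoring a drag-and-drop persona-matching activity.
    // Persona names and response labels are hypothetical examples.
    interface Persona { name: string; expectedResponse: string; }

    const personas: Persona[] = [
      { name: "Impatient caller",      expectedResponse: "acknowledge-and-expedite" },
      { name: "Confused first-timer",  expectedResponse: "slow-down-and-explain" },
      { name: "Cross-cultural guest",  expectedResponse: "check-assumptions-first" },
    ];

    // matches maps each persona name to the response the learner dragged onto it.
    function scoreMatches(matches: Map<string, string>): number {
      return personas.filter(p => matches.get(p.name) === p.expectedResponse).length;
    }

    // Example: the learner matched two personas, one of them incorrectly.
    const learnerMatches = new Map<string, string>([
      ["Impatient caller",     "acknowledge-and-expedite"],
      ["Confused first-timer", "check-assumptions-first"],
    ]);
    console.log(`Correct matches: ${scoreMatches(learnerMatches)} of ${personas.length}`);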

We want to make our assessments valid, and that's something I'll talk about in my class, along with the challenge of dealing with soft skills: the validity and the reliability of the assessments, and more.

But the most important question is: Are the assessments measuring to the goal behavior that the learners must be able to do?

Improve your assessments in an hour

During The eLearning Foundations Online Conference, December 11-12, 2019, Jean Marrapodi will show you how to align your assessments with the intended outcomes of your eLearning. In "Evaluating Your Assessments: Are You Testing The Right Thing?", Jean will expand on the ideas that she shared in this interview, including the specifics of building authentic assessments and creating online scenario-based assessment activities.