In today’s world there’s a lot of talk about “research” and “data.” What these terms mean can vary widely, depending on the context we’re working in or even the individual we’re talking with. In a learning context, research might refer to learner research (e.g., interviewing learners to figure out what their needs and challenges are) or smile sheets (i.e., asking learners after a training session whether they were engaged or whether a workshop was useful). Data might refer to xAPI statements as evidence of learner behavior, or to system data as evidence of high or low worker performance.
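For readers who haven’t encountered xAPI, each “statement” records learner behavior as a simple actor-verb-object record. Below is a minimal sketch, written as a Python dictionary; the learner, verb display, and course identifiers are illustrative placeholders, not references to any real system.

  # A minimal xAPI-style statement: who (actor) did what (verb) to which
  # learning activity (object). All identifiers below are illustrative
  # placeholders.
  statement = {
      "actor": {"name": "A. Learner", "mbox": "mailto:a.learner@example.com"},
      "verb": {
          "id": "http://adlnet.gov/expapi/verbs/completed",
          "display": {"en-US": "completed"},
      },
      "object": {
          "id": "http://example.com/courses/safety-101",
          "definition": {"name": {"en-US": "Safety 101"}},
      },
  }

Aggregated across many learners, records like this can serve as behavioral evidence, although, as discussed below, the strength of that evidence still has to be weighed.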

Research and data ideally lead to evidence-informed practice, which is essential to improving the learner’s experience through effective learning experience (LX) design.

As L&D professionals, we must understand that some types of evidence are stronger than others; that nuance is sometimes missing from discussions of applying research and data to learning experience design (LXD). But that’s not all.

What’s really worrisome is how little discussion there is of evidence from scientific research in the context of workplace learning. This “leg” of evidence-informed practice is still very much underutilized, and that needs to change.

Evidence-informed practice

Note the emphasis on “evidence-informed” practice, rather than “evidence-based.” The difference is significant.

  • Evidence-based practice is an interdisciplinary approach to clinical practice and is grounded in medicine. It integrates:
    — The best available scientific research evidence bearing on whether and why a treatment works;
    — Clinical expertise (clinical judgment and experience); and 
    — Client preferences and values.

  • Evidence-informed practice is also grounded in scientific research, but in the learning sciences the evidence can look slightly different. After all, real life exerts lots of muddy and mucky influences on people’s learning experiences, simply because we’re dealing with many variables that are hard to control.

Hence, when we use scientific evidence from the learning sciences, we need to acknowledge that what works in one context doesn’t necessarily work in another. We also tend to rely more on qualitative data, so the evidence is generally weaker than in controlled clinical research.

Rest assured that there’s still a lot of scientific evidence that we can and should use to the learners’ benefit. Evidence-informed practice is the only way to have nuanced conversations about how to design learning experiences effectively. It’s also the only way the L&D team can become trusted advisers and strategic partners: partners who can ensure that their work aligns with business goals and delivers solutions to the real problems that both the business and learners face.

Scientific evidence need not be intimidating

One reason that scientific research is still underused might be that some practitioners find it a bit intimidating. People who have not been exposed to scientific articles might not know where to start. Becoming aware of scientific evidence in the learning field is a great first step; putting that into practice is the next. Although it might be challenging, it can be done.

There are two pieces of good news. First, one does not have to read scientific articles to understand how evidence from the learning sciences can support practice. Second, there are ways to make better-informed design decisions by following four simple steps suggested by Daniel Willingham:

  1. Strip it & flip it
    In this step, one focuses on the language used, for example in an article or blog post. Is the language emotional or hyped-up? The second part, “flip it,” means that you try to turn the argument upside down. For example, the idea of learning styles intuitively and emotionally makes sense to people, but if you flip it and ask, “How do you feel about pigeonholing people?” then suddenly learning styles sound less appealing.
  2. Trace it
    This means that one needs to dig a bit deeper and ask what kind of evidence there is for the claim being made. What sources has the author or speaker used? Take a critical look.
  3. Analyze it
    This step may require some basic statistical knowledge, but even a critical eye can take you a long way. Put simply: if something sounds too good to be true, it probably is.
  4. Should you do it?
    Based on the findings from the previous steps, one needs to decide whether it’s worth “doing,” which in our profession might mean, for example, applying a method, implementing a strategy, or buying a tool.

By practicing these steps when reading articles online, watching videos, listening to podcasts, or just reading a good old book, we can learn to recognize when we need to dig deeper. This approach helps us develop an eye for truth versus truthiness, a term that comedian Stephen Colbert coined to describe something that sounds plausible and that people are likely to believe without considering facts, logic, or any contradictory evidence.

As we L&D professionals get better at recognizing truth versus truthiness, we can use the evidence to increase our knowledge and expertise; have informed conversations with clients, partners, and stakeholders; and explain why we recommend certain learning solutions or make particular design decisions.

Improved L&D expertise increases our value to organizations and, most important, improves learning experience design, so that we can deliver or support more effective, efficient, and enjoyable learning experiences.

Design evidence-informed learning experiences

Mirjam Neelen will present “Evidence-Informed Learning Experience Design” as part of the Science of Learning Summit, May 15–16, 2019. In her session, participants will learn to apply Willingham’s four steps and make better LX design decisions that are grounded in scientific evidence. Register today!

Other sessions will explore affective learning, reducing cognitive load, and designing so that learners will remember their training longer.