How do you know for certain when a learning experience has been valuable and effective? What are the signs? If you're like many organizations out there, you're not quite sure how to measure the impact of a modern learning experience. Test scores and module completions have long been held up as the mainstay of learning measurement. And why not? It's how we've traditionally measured success in the classroom.
But the evolution of the learning experience has now gone far beyond the classroom, and we have so many more ways to measure our successes – and our failures – when it comes to learning in the workplace. Studies show that we don't even prefer to learn in the classroom anymore. According to LinkedIn's 2018 Workplace Learning Report:
- 68 percent of employees prefer to learn at work
- 58 percent of employees prefer to learn at their own pace
- 49 percent of employees prefer to learn at the point of need
And when you factor in the rise of digital learning opportunities, microlearning, self-directed learning, highly personalized employee development interests, and informal learning (just to name a few), there are endless ways employees create their own learning experiences. But that doesn't mean your L&D team can't develop a cohesive, strategic plan for leveraging these diverse learning opportunities and measuring the outcomes to prove the value in your learning activity investments.
Brandon Hall Group's Principal Learning Analyst David Wentworth and I have already shown you the first three critical steps in creating a modern learning experience:
- How to find your "why," since a clear strategy is the starting point for a modern learner experience
- Three key things you need to create a stellar personalized learning experience
- The agile decisions and actions you need to take during the implementation phase
The goal is to make it engaging, outcome-driven and impactful. Only once you've taken these first three steps can you start to measure the outcomes of the learning experience you've deployed at your organization. Without a rock-solid foundation, the learning experience you want to deliver to your end-users – your employees, your people – will still be an experience, but it won't be very positive for anyone.
A positive, effective learning experience needs to be modern, strategic, personalized, inextricably linked to performance outcomes and, above all, its outcomes need to be measured.
Measure the right metrics, make a major impact
So what strategic learning metrics do you need to measure, and how do you know when you're on the right track?
Listen in to the final audio blog episode in our series of four Q&A-style chats about creating a modern learning experience with Brandon Hall Group's Principal Learning Analyst David Wentworth. We dive into strategies about how to measure the impact of your modern learner experience, leveraging data beyond scores and completions.
In our conversation, you'll hear insights about:
- What new research reveals about organizations' approach to measuring learning outcomes
- How layering tools like Google Analytics over digital learning activities taps into a plethora of new insights
- The most important thing you need before you can start measuring and reporting on any data
It's time to listen in and learn:
David Wentworth: Hi, and welcome to another Audio Blog in our series on Learning Experience. I'm David Wentworth, I'm the Principal Learning Analyst here at Brandon Hall Group. I'm joined by Carl Crisostomo, who is the Product Manager for Content for Saba Software.
In this Audio Blog, we're going to talk a little bit about effectively measuring the impact of a modern learning experience. We've got other Audio Blogs on the importance of this type of experience, what goes into a good learning experience, as well as the implementation of this type of learning.
In this Audio Blog, we wanted to talk a bit about measurement and how important it is that we are able to demonstrate value. This isn't just some sort of fad that we're exploring; there is tangible value in a modern learning experience. So how do we demonstrate that?
Carl with that, let's just take the question on its face. How do you effectively measure the impact of this type of learning experience? Because it tends to be a bit different from the traditional learning and measurement models we've seen.
Carl Crisostomo: Yeah. Thank you, David. I think the good thing about learning experience is that you often have more data points to measure. I think it goes beyond the usual scores and completions. Scores and completions have been the mainstay of learning measurement for a while now. They are useful – they provide insight. But they're fairly narrow, so I think a modern learning experience usually gives us a lot more data points that we can use for measurement.
I'm just going to focus on an example of measuring the experience itself. This example comes out of our custom content team, Saba Studio. We recently created a program that's focused around career conversations. It was a portal. It contained videos, PDF resources, links to learning and apps. One of the apps included a tool to help you kind of rate the potential of your team members.
What they did there was they layered Google Analytics over this portal. This allowed you to measure a whole host of things. You could measure video watches, for example. You may see that lots of people were watching the video on "How to run a conversation with a low or poor performer." And there were other things you could measure, like bounce rates and how people interacted with the app.
If the team-potential tool is constantly outputting low-potential results, this could be an issue, especially if you match it with the video watch data. You want to start to uncover problems. And there are other measurable things in there as well, such as how people arrived at the portal. So all those campaigns you put in place to lure people to your learning experience – how effective were they? You can start gauging that.
This gives you a completely different level of insight than you would normally have when you're delivering a less-modern learning experience. These insights really allow you to react with agility while you're implementing and delivering your modern learning experience. So that's the kind of stuff within the experience itself.
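To make the idea concrete, here is a minimal sketch of the kind of event tracking Carl describes: sending a "video watch" event from a learning portal to Google Analytics via its Measurement Protocol. The tracking ID, client ID, and event names below are hypothetical examples, not details of the Saba Studio portal itself.

```python
from urllib.parse import urlencode

# Google Analytics Measurement Protocol collection endpoint
GA_ENDPOINT = "https://www.google-analytics.com/collect"

def video_watch_event(tracking_id, client_id, video_title):
    """Build a Measurement Protocol payload for a video-watch event."""
    payload = {
        "v": "1",                  # protocol version
        "tid": tracking_id,        # GA property ID (hypothetical)
        "cid": client_id,          # anonymous client/visitor ID
        "t": "event",              # hit type
        "ec": "learning-portal",   # event category (our own naming)
        "ea": "video_watch",       # event action
        "el": video_title,         # event label: which video was watched
    }
    return urlencode(payload)

# In a real portal you would POST this body to GA_ENDPOINT; here we only
# build it so the reportable fields are visible.
body = video_watch_event("UA-00000000-1", "user-1234",
                         "How to run a conversation with a low performer")
```

Once events like this flow in, the standard Google Analytics reports give you watch counts, bounce rates, and traffic sources for free.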
David Wentworth: Yeah. I think this is an important turning point for organizations, because one of the pushbacks we've always heard from a measurement standpoint is, "Well, we don't do a lot of this type of learning because we can't track and measure it."
The reality is that companies can't track and measure it with the traditional measurement model. You can't measure it by butts in seats and hours consumed, so they just assume it can't be measured.
If the traditional measurement was getting the job done, like if people really could understand the impact of their learning programs then I would say, "Okay, fine, let's leave it the way it is." But our research has shown, and we've talked to thousands of organizations, they're terrible at it. They're not good at measuring. They can basically measure completions and figure out how many people are going to their programs and whether they like them. But they're having a terrible time measuring the impact that it's having on the business. Why not take this opportunity to completely rethink your measurement strategy and focus on things like performance and outcomes.
On the one hand, Carl, you're talking about how this is so great; it gives us all these new data points and more data. I think a lot of organizations are scared of that. They can't even manage the data they already collect. But that's because they're trying to shove it into these boxes that don't work anymore.
The idea is to open yourself up to these new interactions, these new data points and realize that what you're getting from those is the actual impact. It's not so much now that, "Oh, somebody clicked on this and watched it," or "Somebody clicked on this and interacted with it," and that's the end of the story. It's more about: because somebody clicked on this, XYZ happened. This thing happened faster, this thing happened better – whatever it is. But by changing your measurement thought process to fit more of this modern learning experience, you're actually doing yourself a huge favor in rethinking the way you measure learning in general, like focusing on these outcomes, so that you're able to demonstrate the value of the learning and the impact it has.
I think one of the things I hear a lot – and Carl, you're probably used to this too from a measurement standpoint – is a company will say, "Well, I can't tell you that our training improved our sales performance... Even though that's why we built it. That's why we delivered it. But I can't tell you that's what it was because there are too many other variables. So why bother?"
That's a complete cop-out. You need to be able to demonstrate the impact that it does have and include those other variables because, with all those other variables, people are going to claim credit for those. So why doesn't learning claim credit for its role in how well things go?
Sometimes you get pushback: "Well, what if things don't get better? What if there's a downturn?" Well, if you're doing your job and doing your homework, you're able to say, "Yes, these are the factors that led to things turning down, but because we trained people this way, we were actually able to mitigate how bad things could have gotten." Learning really needs to take this active role and own the impact it's having on outcomes.
Carl Crisostomo: Yeah. I'd agree with you 100 percent – other departments do exactly that. Say a change comes along, such as a business objective around customer satisfaction. You want to drive up customer satisfaction scores because you can correlate increasing revenue to that. There may be a whole host of different initiatives going on that are all going to impact customer satisfaction.
I think learning and development needs to know where it fits within that picture. They need to see (continuing with this example) how they can be really effective in helping drive customer satisfaction up. And then they can really focus on changing behavior, looking at how they can break old habits and get people to form new ones.
I think it's really important to understand the reason or the objective that they're going after, the change they want to enact, and to look at some direct measurements around that. But also not feeling afraid to say, "We're part of this bigger thing; we're part of improving customer satisfaction scores; we're part of increasing revenue." I think it's really, really important for them to do that.
David Wentworth: Yeah. I'm not going to sit here and say it's just a matter of flipping a switch; clearly, for a lot of organizations, this is a complete re-engineering of how they approach learning measurement. It may involve investing in a tool like a learning record store (LRS) that's able to track, manage and store all of these different types of interactions, because you're trying to measure things beyond just completions and final scores. You might need some tools to do that.
I think the other key thing to really think about is: What kind of data analysis horsepower does your organization have access to? Do you have data analysts within learning, or are there data analysts in the building who can be shared with learning? Do you need external help with this? In order for this to work, you need people who are able to look not just at all the data you're collecting from these learning experiences, but at data from around the business, so that you can actually ask the right questions, line these things up and see how things are working. And that takes a different skill set than many learning teams are used to.
In order to get serious about this, there are some things organizations are going to have to do. It's not as simple as using a different formula in an Excel spreadsheet; you really need to rethink the resources you're putting toward measurement.
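The interactions an LRS stores are typically xAPI (Experience API) statements: small JSON records of the form "actor – verb – object." Below is a minimal sketch of building one such statement in Python; the activity URL and learner email are hypothetical, while the verb IRI is a standard ADL vocabulary entry.

```python
import json

def build_statement(email, verb_id, verb_name, activity_id, activity_name):
    """Assemble a minimal xAPI statement: who did what to which activity."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

stmt = build_statement(
    "learner@example.com",                        # hypothetical learner
    "http://adlnet.gov/expapi/verbs/completed",   # standard ADL verb IRI
    "completed",
    "https://example.com/activities/career-conversations",  # hypothetical
    "Career Conversations portal",
)
payload = json.dumps(stmt)
# An LRS accepts this via an authenticated POST to its /statements
# endpoint with the header X-Experience-API-Version; omitted here.
```

Because verbs are open-ended ("watched," "discussed," "mentored"), this format can capture far richer interactions than a completion flag ever could.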
Carl Crisostomo: Yeah. That point about skills is really, really interesting; I think it does need a brand-new set of skills. You often hear the phrase, "L&D needs to be more like marketing." I think this is probably true; the CLO probably has more in common with the CMO than with most other functional heads within the organization.
When I go out and speak to recruiters in our space, one of the things they're saying to me is that over the years there have been incremental changes in the types of job roles they're being asked to fill. But more recently there has been a big, fundamental change, and some of them are reappraising how they recruit and bring talent into L&D because the number of new skills is so large. And those skills lean on things like data analysis and marketing communications.
David Wentworth: Ultimately, when we think about measurement, it really circles back to the beginning of the learning experience discussion: in order for that experience to be effective, you've got to know what outcomes you're after to start with, and chase those things and build toward those things. If you do that, the measurement becomes self-evident. You've already determined what it is you're trying to change, and you're building the interactions that you hope are going to get you there. Then the measurement is really a comparison of whether the people engaging in those interactions are actually exhibiting the changes you're after. So it really comes full circle.
In order to effectively measure, you need to start with that measurement at the beginning and that's going to help you create that effective modern learning experience. Otherwise, how else are you going to be able to connect that experience to personal and individual goals as well as the business goals, if you don't know what those are at the beginning?
Measurement is not an afterthought. It's not the end product. It's really wrapped up in the entire learning experience, so it's a really key element and one that you should focus on pretty strongly.
With that, I think we're going to sign off on this particular Audio Blog. Be sure to check out the other ones in this learning experience series. We've done them on why an organization needs to think about a modern learning experience, what goes into a good learning experience, and the implementation of a modern learning experience.
We thank you for joining us on this Audio Blog. Carl, thanks for joining me.
Carl Crisostomo: Thank you, David.
David Wentworth: See you all in the next one.