The learning and development (L&D) industry has several challenges to solve in the years ahead. Two of the more prevalent ones are related to learning design and measuring learning impact. On one hand, L&D professionals tend to recognize that learning has to be more engaging, relevant, and personalized for it to cut through the clutter and have the desired impact. On the other hand, there’s an increased need to quantify and demonstrate that impact and the value of training for the businesses we operate in.
Now, what do these two challenges have in common? Both are a source of frustration for L&D professionals, myself included. But perhaps the more meaningful commonality is in the solution. We could potentially solve both of these problems with smart use of data.
On the design front, smart use of data and analytics can help us personalize learning. Instead of one-size-fits-all courses, we could provide our people with learning activities that work for them, taking into account their existing knowledge, skills, job functions, workflows, and even career paths, among many other things. As a learning designer myself, the more I know about my learners, the better job I can do. And it tends to make a world of difference to the learners too.
But learning design work, of course, doesn’t stop at delivery. Ultimately, it’s quite unlikely that we get everything right the first time. At least in our work, we quite often find that learners didn’t engage with the content the way we initially expected. Sometimes we even realize that we’ve addressed entirely the wrong set of problems as we begin to uncover the real ones. It’s therefore important that we use data to build an understanding of what actually works, how learners engage with the content, and where we could improve, and then iterate accordingly. That puts us on the path towards more impactful learning.
Impact, though, is easier said than measured. Measuring learning itself is not impossible, but to be frank, most organizations are looking at the wrong things. The reality is that the most common corporate learning metrics, such as completion rates, time spent, or results of immediate summative assessments, don’t tell us much about the actual learning at all. But neither do the happy sheets filled in at the end of a classroom session, if that’s the point of comparison. Thankfully, the comprehensive tracking, data collection, and analytics enabled by recent technological developments allow us to move beyond those metrics, use more formative approaches, and understand learning more deeply.
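To make the contrast concrete, here is a minimal sketch in Python. The record format and the metric names are my own illustrative assumptions, not any real standard: it simply shows how a more formative signal (say, how many attempts learners need) can sit alongside a conventional completion rate and tell a different story.

```python
# Hypothetical learning records: one entry per learner per activity.
# The field names here are assumptions for illustration only.
records = [
    {"learner": "ana",  "completed": True,  "attempts": 1, "score": 0.9},
    {"learner": "ben",  "completed": True,  "attempts": 4, "score": 0.6},
    {"learner": "carl", "completed": False, "attempts": 2, "score": 0.3},
]

def completion_rate(records):
    """The conventional metric: share of learners who finished."""
    return sum(r["completed"] for r in records) / len(records)

def struggle_index(records):
    """A more formative view: average attempts per learner,
    which hints at where learners actually get stuck."""
    return sum(r["attempts"] for r in records) / len(records)

print(round(completion_rate(records), 2))  # 0.67
print(round(struggle_index(records), 2))   # 2.33
```

A completion rate of 0.67 alone says little; seeing that completers averaged several attempts points to where the design needs iteration.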
But the fun certainly doesn’t stop there! After all, we still need to make the case for the value of the learning to the business, don’t we? To do that, we need to move beyond the conventional data sources available to the L&D function. First, we’ll want to find out whether learning transfer actually happens, i.e. whether people apply the knowledge in their day-to-day work. Second, we’ll want to find out what kind of impact, if any, these new practices have on the business and its performance.
Naturally, complicating matters further is the fact that none of this happens in a vacuum: behaviors, and especially performance, are influenced by many other factors too. Nor do learning and its impact occur overnight. But even taking that into account, the only credible way of getting closer to demonstrating the impact and value of learning is through data. At the end of the day, the more information we have, even if it’s imperfect, the better equipped we are to make those important business and learning decisions.
About the author: Lauri Sulanto is a learning technologist and strategist trying to make sense of data to help people learn better. He will be sharing his thoughts on data-driven learning at the Learning and Development Asia 2019 conference in the Philippines on September 26th, 2019.