I got in a Twitter debate a while ago. Shock, I know. It wasn’t about learning styles or indeed whether it’s elearning, or eLearning or e-learning (it’s elearning, btw). No, I ended up in a really interesting discussion around employee engagement. What it is, what it means. Why it’s a thing. Why we’re trying to accomplish it.
And I’ll tell you this – there are a lot of opinions out there. After the debate, I found myself asking: what on earth are we even talking about? Like many buzzwords in this industry, learner engagement has L&D department after L&D department blinkered – but do they even know what it means? The pursuit of this mecca, the modern learner experience, is swathed in complexity and assumptions, and I’m not sure many of us really understand what we’re trying to accomplish here.
Let’s examine. Strap in, this is going to get a bit spicy.
The employee engagement debacle
So, this is the tweet that started it all:
— Lorna Matty (@Lorna_Matty) 5 September 2018
It says: “We will no longer measure learning by completions, but by engagement and sentiment scores.”
Wow. Did that ever open a can of worms. From reading the thread, there were two key areas which people took issue with right away.
Issue #1: Completion = learning, no?
Many people had a good point: why was a completion ever a measure of learning in the first place? Surely completions are for the business, not the learner. How on earth does a completion signify that learning took place? How can we measure whether an individual acquired any knowledge simply by them completing the course?
I don’t want to, but I could give at least 10 examples of occasions where I’ve endured/wallowed/trudged through information, only to learn absolutely nothing. Thankfully my driving test wasn’t one of these, but the speed awareness course I consequently took several years later, which dragged on for over 5 hours, certainly was…
What’s worrying is that this measurement of ‘success’ merely states that a process, procedure or series of events has taken place. And it’s the backbone of L&D. Sadly, it’s not a great measurement tool. A completion does not:
- Provide any indication of a change in behaviour
- Correlate impact of training undertaken with business performance and goals
- Prove that the training is useful, interpretable and implementable
The goal is learning here – does a completion, a tick box, a mere finish of a course signify that? I don’t personally believe so.
I agree the goal should always be learning. But learning is more than completion which for some has been the default.
— Lorna Matty (@Lorna_Matty) 5 September 2018
Issue #2: Engagement doesn’t prove knowledge acquisition
Just as many took issue with the concept of a completion proving learning, many also argued that engagement really isn’t an indication of learning either. Which is fair, really. I argued that engagement doesn’t prove learning, but it does prove that your learners are interested, connected and listening (at the very least). Engagement still has a place in the workflow – it just doesn’t necessarily mean what we think it does.
Engagement is not a measure of anything other than… engagement. Learning is about cognitive change not just making people feel good (it’s easy to do that). To be honest I’m not sure what engagement actually is… maybe we’re agreeing and this is a linguistic problem…
— Donald Clark (@DonaldClark) 5 September 2018
I kind of see Donald’s point. Just as I, a data-driven marketer, would measure engagement in marketing (clicks on emails, visits to the site and so on), that data on its own does not prove buying sentiment or indeed win new customers. They’re nice stats, but they don’t always correlate with tangible evidence of sales or alignment with business KPIs. As standalone items, they actually mean very little.
And that’s what really got me thinking – we’re giving engagement entirely the wrong meaning. We’re tying learning to engagement and that’s not what it’s there for. We’re chasing phantoms – pursuing vanity figures like increases in visits to the LMS, ‘views’ on videos or even time spent on specific areas of the LMS – and trying to retrospectively tie them to learning. But, as I said in my data needs context article, this sort of vanity data collection means very little unless you can attach it to behaviours and correlate it with actual results.
More people came to your LMS. So what?
More learners watched a video you shared. So what?
People ‘liked’ 100 articles this month. SO WHAT.
I could go on. These are all engagement stats – but when you look at them as isolated data points, you can see how little context they have in the traditional constraints of ‘learning outcomes’.
But does that really matter? Is the only goal in our business to teach people in a formal way, to prove learning and indeed to have nice reports which say people completed their training in a timely manner? Doesn’t sound all that modern to me. Surely a completion isn’t the only way to prove that people are learning stuff in your business?
Engagement is not the holy grail of learning
As Lorna’s original tweet said, we will soon evolve way beyond the idea of a completion being a clear indicator of learning success. I’m in her camp – I believe we’ll move to behavioural data, insights and iteration, and work much more like marketers when trying to understand our audiences (read: learners). We’ll move away from measuring learning in a traditional way – and accept new data and stats as the status quo.
I can see that happening, but the worry for me is that we’ll just use these new statistics in the same way we’ve used completions in the past. And to me, that’s not what this data is about. It goes so much further than saying: “Janet completed her training today. That means she has learned something.”
I may be wrong, but I don’t think we should be using engagement to evidence learning – because what Donald said above is right; it doesn’t prove learning (no more than a completion does, anyway). Rather, I believe engagement stats should be used as a springboard to better understand learner sentiment, to dig deeper into what works for them (what they like and dislike) and ultimately to help L&D iteratively improve their approach so that, hopefully, when people are undertaking training, they’re more interested, more connected and trust the content they’re consuming.
As a marketer, I would never use engagement stats in isolation to discern someone’s intent to buy. But I would use that data to improve the interactions my business has with them and to provide more relevant, personalised content which adds value to their lives and improves the way they work. L&D should be doing the same, instead of trying to use engagement as the new metric for success.
Not sure I agree Donald. Engagement doesn’t prove learning has taken place, but is that really the primary goal for most businesses these days? Surely it’s more about building trust and making learning meaningful and relevant? I’d say engagement is a great measurement of that…
— AS (@ashmsinclair) 5 September 2018
What is learning, anyway?
This conflict between engagement and learning is a really interesting one and begs the question: “What is learning, anyway?”
Well, if we look at the way L&D typically proves learning at present, it’s completion of training. It’s no surprise that people who believe that a completion = knowledge acquisition also completely disregard engagement as a metric for learning. The two are poles apart in terms of the substance and information they provide – but the reality is, to the learner they both likely signify very little.
Proving ‘learning’ is a strange concept in general to me. Rarely am I asked to prove that I learned something. If I have a need, I solve it by learning something. Ideally, that changes or improves my behaviour and approach, and consequently has positive results for the business overall, be that in productivity, output, sales or general morale and job satisfaction. No one here at Thrive is asking me what I learned, or asking me to complete some learning to prove it. I just learned it, and that’s the end of it.
Measuring ‘knowledge acquisition’ as an output is, for me, an outdated approach. There. I said it. Imagine if I were a musician using ONLY tangible asset sales (i.e. CDs, records and so on) as my sole measure of success, much like we did in the 90s. As a modern artist I’d have to accept that’s not the world we live in, and if I ignored MP3s and streaming sites, I’d be doing myself a massive injustice. L&D – by ignoring the modern ways of learning, you’re doing the same to yourself.
Latching onto the concept of learning
Latching onto this concept of measuring learning as a completion is a real problem for me. It proves nothing.
Some people have warmed to the idea of engagement – it feels more modern, it’s using behavioural data. But if we just use the statistics as a measurement of learning, then we really are missing a huge piece of the pie (much like the analogue-focused artist mentioned above).
Because what we actually need in this industry is a complete change in mindset. An absolute change in approach and really, a change in the way that we perceive learning happening in businesses. It’s great that we’re starting to attribute value to statistics and data measurement, but they don’t prove learning any more than a completion does.
The point for me is the fact that learning is always happening in a business, it’s ubiquitous. So measuring things like clicks, views etc shows that people are indeed interacting with your training and content. But does that mean they learned anything? Does it even matter if they did?
Does learning = success?
Here we are telling ourselves that proving learning means that we have been a success. Engagement is a perceived powerhouse in L&D because it means that people are interacting with your stuff and using it. Maybe it means they like it. Maybe that will change their behaviour over time, maybe it will not.
Unravelling the complexity of learning at work is no mean feat, but at least engagement statistics can be correlated with the training undertaken, giving L&D functions real data with which to improve, iterate and evolve their approach.
And maybe, one day, we’ll stop measuring whether learning took place and accept that learning is a by-product of content which is accessible, easy to use and easy to understand. It’s not a completion. It’s not engagement. It’s information being consumed and used at a later date and, really, measuring that is an impossible task. Instead – let’s focus on fostering and facilitating environments where learners can share and collaborate with one another, and drive ecosystems where learning becomes a natural habit, not something you have to measure.