Engagement = learning, right?

A dive into the fundamentals of the L&D industry.

Richard Bailey Marketing Executive


When it comes to employee engagement, there are a lot of opinions out there on exactly what it means. There is so much discussion around this topic that you start to wonder what on earth we are even talking about. Like many buzzwords in this industry, learner engagement has left many, many L&D departments blinkered - but do they even know what it means? The pursuit of this mecca, the modern learner experience, is swathed in complexity and assumptions, and I'm not sure many of us really understand what we're trying to accomplish here.

Let’s examine. Strap in, this is going to get a bit spicy.

The employee engagement debacle

So, this is the tweet that started it all: 

It says: “We will no longer measure learning by completions, but by engagement and sentiment scores.” 

Wow. Did that ever open a can of worms. From reading the thread, there were two key areas which people took issue with right away.

Issue #1: Completion = learning, no? 

Many people had a good point: why was a completion ever a measure of learning in the first place? Surely completions are for the business, not the learner. How on earth does a completion signify that learning took place? How can we measure whether an individual acquired any knowledge simply by them completing the course? 

I don’t want to, but I could give at least 10 examples of occasions where I’ve endured/wallowed/trudged through information, only to learn absolutely nothing.

What’s worrying is that this measure of ‘success’ merely states that a process or procedure, or indeed a series of events, has taken place. It’s the backbone of L&D - but sadly it does not prove that any learning happened.

The goal here is learning - does a completion, a tick box, the mere finishing of a course signify that? I don’t personally believe so.

Issue #2: Engagement doesn’t prove knowledge acquisition

Just as many took issue with the concept of a completion proving learning, many also argued that engagement really wasn’t an indication of learning either. Which is fair really. I would argue that engagement doesn’t prove learning, but it does prove that your learners are interested, connected and listening (at the very least). Engagement still has a place in the workflow - but it just doesn’t necessarily mean what we think it does.

I kind of see Donald’s point. Just as I, a data-driven marketer, would measure engagement in marketing (clicks on emails, visits to the site and so on), that data does not prove buying sentiment or drive new customers on its own. Those are nice stats, but they do not always correlate with tangible evidence of sales and alignment with business KPIs. As standalone items, they actually mean very little.

And that’s what really got me thinking - we’re giving engagement entirely the wrong meaning. We’re tying learning to engagement, and that’s not what it’s there for. We’re chasing phantoms - pursuing vanity figures like increases in visits to the LMS, ‘views’ on videos or even time spent in specific areas of the LMS. But this sort of vanity data collection means very little unless you can attach it to behaviours and correlate it with actual results.

More people came to your LMS. So what? 

More learners watched a video you shared. So what? 

People ‘liked’ 100 articles this month. So what?

I could go on. These are all engagement stats - but when you look at them as isolated data points, you can see how little context they have in the traditional constraints of ‘learning outcomes’.

But does that really matter? Is the only goal in our business to teach people in a formal manner, to prove learning and indeed to have nice reports which say people completed their training in a timely manner? Doesn’t sound all that modern to me. Surely a completion isn’t the only way to prove that people are learning stuff in your business?

Engagement is not the holy grail of learning

Engagement is not learning - I think we can all agree on that. But that doesn’t mean it isn’t valuable to L&D.

As Lori’s original message said, we will soon evolve way beyond the idea of a completion being a clear indicator of learning success. I’m in her boat - I believe we’ll move to behavioural data, insights and iteration and work much more like marketers when trying to understand our audiences (read: learners). We’ll move away from measuring learning in a traditional way - and accept new data and stats as the status quo.

I can see that happening, but the worry for me is that we’ll just use these new statistics in the same way we’ve used completions in the past. And to me, that’s not what this data is for. It goes so much deeper than saying: “Janet completed her training today. That means she has learned something.”

I may be wrong, but I don’t think we should be using engagement to evidence learning - because what Donald said above is right; it doesn’t prove learning (no more than a completion does, anyway). Rather, I believe engagement stats should be used as a springboard to better understand learner sentiment, to dig deeper into what works for learners (what they like and dislike) and ultimately to help L&D iteratively improve their approach - so that hopefully, when people are undertaking training, they’re more interested, more connected and trust the content they’re consuming.

As a marketer, I would never use engagement stats in isolation to discern someone’s intent to buy. But I would use that data to improve the interactions my business has with them and to provide more relevant, personalised content which adds value to their lives and improves the way they work. L&D should be doing the same, instead of trying to use engagement as the new metric for success.

What is learning, anyway?

This conflict between engagement and learning is a really interesting one and begs the question: “What is learning, anyway?” 

Well, if we look at the way L&D typically proves learning at present, it’s completion of training. It’s no surprise that people who believe that a completion = knowledge acquisition also completely disregard engagement as a metric for learning. The two are poles apart in terms of the substance and information they provide - but the reality is, to the learner they both likely signify very little. 

Proving ‘learning’ is a strange concept to me in general. Rarely am I asked to prove that I learned something. If I have a need, I solve it by learning something. Ideally, that changes or improves my behaviour and approach, and consequently has positive results for the business, be that in productivity, output, sales or general morale and job satisfaction. No one here at Thrive is asking me what I learned, or asking me to complete some learning to prove it. I just learned it, and that’s the end of it.

Measuring ‘knowledge acquisition’ as an output is, for me, an outdated approach. There. I said it. Imagine if I were a musician measuring ONLY tangible asset sales (i.e. CDs, records and so on) as my sole measure of success, much as we did in the ’90s. As a modern artist I have to accept that’s not the world we live in, and if I ignore MP3s and streaming sites, I’m doing myself a massive injustice. L&D - by ignoring the modern ways of learning, you’re doing the same to yourself.

Latching onto the concept of learning

Latching onto this concept of measuring learning as a completion is a real problem for me. It proves nothing.

Some people have warmed to the idea of engagement - it feels more modern, it’s using behavioural data. But if we are just using the statistics as a measurement of learning, then we really are missing a huge piece of the pie (much like the analogue focused artist mentioned above).

Because what we actually need in this industry is a complete change in mindset. An absolute change in approach and really, a change in the way that we perceive learning happening in businesses. It’s great that we’re starting to attribute value to statistics and data measurement, but they don’t prove learning any more than a completion does. 

The point for me is the fact that learning is always happening in a business, it’s ubiquitous. So measuring things like clicks, views etc shows that people are indeed interacting with your training and content. But does that mean they learned anything? Does it even matter if they did?

Does learning = success? 

Here we are telling ourselves that proving learning means that we have been a success. Engagement is a perceived powerhouse in L&D because it means that people are interacting with your stuff and using it. Maybe it means they like it. Maybe that will change their behaviour over time, maybe it will not. 

Unravelling the complexity of learning at work is no mean feat, but at least engagement statistics can be correlated with the training undertaken, giving L&D functions real data with which to improve, iterate and evolve their approach.
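As a rough illustration of what “correlating engagement with results” could look like in practice, here is a minimal sketch in Python. Everything in it is hypothetical - the metric (monthly LMS visits), the outcome (a team’s error rate) and the numbers are invented for the example; the point is simply that an engagement stat only becomes meaningful once you put it next to a business outcome:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical monthly figures for one team:
lms_visits = [120, 150, 170, 160, 200, 210]   # engagement metric
error_rate = [5.1, 4.8, 4.2, 4.4, 3.6, 3.3]   # business outcome

r = pearson(lms_visits, error_rate)
print(f"correlation: {r:.2f}")  # negative here: more visits, fewer errors
```

Of course, correlation on its own still isn’t proof that the training caused the improvement - but it is the kind of behavioural evidence that lets L&D start a conversation about impact, rather than waving a completion report.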

And maybe, one day, we’ll stop measuring whether learning took place and accept that learning is a by-product of content which is accessible, easy to use and easy to understand. It’s not a completion. It’s not engagement. It’s information being consumed and used at a later date and really, measuring that is an impossible feat. Instead - let's focus on fostering and facilitating environments where learners can share and collaborate with one another and drive ecosystems where learning becomes a natural habit, not something you have to measure.
