May 21, 2025
|
5 mins to read

How to use learning data with confidence

Data is an invaluable resource for L&D teams – but it’s only truly useful if you know what to do with it. In this blog, we’ll explore how to turn raw insights into real impact.
Alex Mullen
Web Content Writer

As an L&D professional, you’ll be well aware of the need for data. 

Perhaps you anxiously track your team’s engagement scores; sit and wait for the feedback survey results to roll in; keep your eagle eyes on the completion rates like it's a football match. 

But do you know what to do with that information once you have it in your (proverbial) hands? 

Many Learning & Development teams are data-driven, even data-obsessed. But in this blog, written with expert help from Barbara Poblano (Product & Customer Analyst at Thrive), we’ll dive into what it means to truly use the data gathered from your learners.

We’ll define qualitative and quantitative data, explain how they work together, and explore why the real transformation happens when you connect the dots between them. 

Quantitative vs qualitative learning data: what's the difference?


Let’s start at the beginning by defining our terms. What is the difference between quantitative and qualitative data? If you’re not “a data person”, you might have heard these words thrown around before without knowing what they actually mean. 

They’re essential if you want to know how to use your learner data to its fullest potential, so here’s a clear definition: 

  • Quantitative data refers to hard numbers. It’s measurable, trackable and objective. (For example: Survey scores, course completion rates, or number of logins.)

  • Qualitative data is everything that’s harder to measure. It’s the feelings, experiences and feedback that come from open text responses, interviews, focus groups, or observational notes.

To put it simply: quantitative data tells you what’s happening. Qualitative data helps you understand why.

Does it really matter if you have the data?


Yes, but (not to belabour the point) only if you know how to use it.

  • Don’t gather data just because you feel as though you should. Collecting endless surveys and engagement metrics is no use if you don’t use them to make decisions.

  • Avoid collecting data that you’re not planning to analyse or act on — it can quickly become noise and erode trust if people don’t see outcomes.

Ask yourself this:

If someone handed you a spreadsheet with every piece of feedback your learners have ever submitted, what would you actually do with it?

Why you need both data types


So, do you need both qualitative and quantitative data?

It’s tempting to lean heavily on the numbers. Quantitative data is easy to report on. It looks neat in a graph. You can compare it month-on-month, and show off increases in satisfaction scores.

But numbers without context can be misleading.

For example, your survey might show that 85% of learners say their learning needs are being met. Great, right? But unless you dig deeper, you won’t know what that actually means to them. Which learning needs? Why do they feel that way? And more importantly… What about the other 15%?

That’s where qualitative data comes in. Learner comments, manager responses, open feedback boxes — all of this helps you make sense of the numbers.

When you combine the two, you get a full picture:

  • Quantitative data shows you trends, patterns and where to focus.
  • Qualitative data helps you understand behaviour, motivation and context.

You can’t drive meaningful change without both.

Turning “how learners feel” into data you can use


One of the biggest challenges in L&D is translating fuzzy, qualitative feelings like "I feel supported" or "I enjoy the training" into something you can measure and improve.

Here’s an example. Let’s say you send out an annual learner survey and one of the questions is:

"Are you happy with your learning experience?"

Simple yes or no. Maybe even a 1-10 scale.

That’s a decent start, but it’s not enough. You need to ask:

What does “happy” mean?

An existential question, for sure. But putting it strictly in L&D terms: Does it mean they get enough learning opportunities? That the content is relevant? That their manager supports them?

Until you break that vague feeling down into specific, measurable areas, you won’t know where to improve.

Take this next question as a further example:

"How satisfied are you with your development this year?"

A qualitative answer to this might be "good", "bad", or "satisfied" — but those responses can quickly get ambiguous. What if this year you get more “satisfied” responses than last year, but fewer “good” ones? Or more “bad,” which might actually reflect someone’s personal workload or circumstances, rather than the L&D strategy itself?

This is where combining qualitative and quantitative data becomes powerful. Asking for a rating on a scale (e.g. "From 0 to 10, how satisfied are you with your development this year?") gives you a measurable data point you can compare year-on-year. Perhaps someone scored a 7 this year and a 4 last year, showing clear improvement.

From there, you can go deeper. Follow up with a qualitative prompt to understand the reasoning: What influenced your score? You can even break that experience down further by specific metrics like:

  • Was the amount of learning time right for you?
  • Did you feel supported by your manager or team?
  • Was the content relevant to your role?

This layered approach helps you interpret the story behind the score and make data-informed improvements that actually matter. Speaking of which…

How to translate qualitative feedback into quantitative action


So how do you bridge the gap? How do you turn “I feel supported” into something you can actually work with?

Here’s a simple framework to get started:

1. Identify common themes in qualitative feedback


Look for patterns in comments, conversations or survey answers. Are people mentioning the same frustrations? Praising the same things?

2. Categorise those themes into measurable areas


Once you’ve spotted the common threads, group them. For example:

  • Content relevance
  • Ease of access
  • Manager support
  • Frequency of learning opportunities

3. Assign metrics


Now, create survey questions or feedback scores linked to those categories. Instead of asking, "Are you happy?", ask:

"How relevant do you find the content to your role? (1-10)"
"How often do you have time to complete learning at work?"

This way, you move from vague qualitative feedback to tangible, trackable data points.
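To make the framework above concrete, here's a minimal sketch of steps 1 and 2 in Python: tagging free-text comments against theme categories and counting how often each theme comes up. The comments, theme names, and keyword lists are invented for illustration; in practice you'd refine them from your own feedback, or use a proper text-analysis tool.

```python
# Illustrative sketch: group free-text feedback into the measurable
# theme categories from step 2, then count mentions per theme.
# Themes, keywords, and comments below are made up for this example.
from collections import Counter

THEMES = {
    "content relevance": ["relevant", "role", "content"],
    "ease of access": ["access", "find", "login"],
    "manager support": ["manager", "support"],
    "learning frequency": ["time", "often", "opportunity"],
}

def tag_comment(comment: str) -> list[str]:
    """Return every theme whose keywords appear in the comment."""
    text = comment.lower()
    return [theme for theme, words in THEMES.items()
            if any(word in text for word in words)]

comments = [
    "My manager never gives me time for learning",
    "The content wasn't relevant to my role",
    "Hard to find the courses I need",
]

# Tally how many comments touch each theme.
counts = Counter(theme for c in comments for theme in tag_comment(c))
print(counts.most_common())
```

Even a rough tally like this turns a pile of comments into a ranked list of themes, which you can then attach survey metrics to.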

Data alone isn’t enough


Even when you’ve got clean, well-structured data, that’s only half the job.

The true value lies in who’s looking at it and what questions they’re asking.

Numbers can’t tell you what action to take. That’s a human job. You need someone with curiosity; someone who’s willing to dig into the data, mine for inconsistencies, and ask why some groups are thriving while others are falling behind.

For example:

  • Why is satisfaction lower in Region A compared to Region B?
  • Why do new starters rate the learning content higher than long-term employees?
  • Why is engagement dropping off after the first month?

The data can point you in the right direction — but it’s people who will connect the dots and turn insight into action.

Moving from data to action: A simple roadmap


If you’re looking to make better use of your learner data, here’s a straightforward process to follow:

Step 1: Start with a clear question

What are you trying to understand or improve? Be specific. For example:

  • Are learners satisfied with onboarding content?
  • Are people applying what they’ve learned in their roles?

Step 2: Collect both quantitative and qualitative data

Use surveys, completion rates, and feedback forms — but also conversations, interviews and open-ended feedback boxes.

Step 3: Look for trends and patterns

Don’t stop at surface-level averages. Break the data down by team, location, tenure, or demographic to spot disparities.
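As a quick sketch of why this step matters, here's a toy example in Python: the overall average looks healthy, but breaking scores down by team surfaces a disparity the average hides. The team names and scores are invented for illustration.

```python
# Illustrative sketch of step 3: don't stop at the overall average --
# break satisfaction scores down by segment (here, team) to spot
# disparities. All names and scores are invented for this example.
from statistics import mean

responses = [
    {"team": "Sales", "score": 8},
    {"team": "Sales", "score": 9},
    {"team": "Support", "score": 4},
    {"team": "Support", "score": 5},
]

# The headline number looks acceptable on its own.
overall = mean(r["score"] for r in responses)  # 6.5

# Group scores by team to reveal the gap behind the average.
by_team: dict[str, list[int]] = {}
for r in responses:
    by_team.setdefault(r["team"], []).append(r["score"])

for team, scores in sorted(by_team.items()):
    print(f"{team}: average {mean(scores):.1f}")
```

The same grouping works for location, tenure, or any other segment you hold: the disparities it reveals are exactly the "why" questions to dig into in step 4.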

Step 4: Dig deeper

Ask why the data looks the way it does. What stories are emerging? What behaviours or blockers might be driving the numbers?

Step 5: Take action, and measure the impact

Once you’ve identified areas to improve, make changes. Then re-measure to see if your actions had the desired effect.

The power of data


Data shouldn’t exist for the sake of it. In L&D, it’s not enough to say, “Our learner satisfaction score is 85%” and call it a day. That number means nothing if you don’t understand what’s behind it or how to make it better.

The real power of data lies in combining the numbers with the narrative (the quantitative with the qualitative) and using both to drive action.

So next time you’re faced with a spreadsheet full of engagement scores, ask yourself:

What is this actually telling me? And what can I do about it?

Data doesn’t drive change; you do. It’s how you interpret it, act on it, and turn insight into meaningful outcomes that really makes the difference.

Thanks again to Barbara Poblano for her data expertise, and if you want to see how Thrive can help you turn learning data into real business impact, take a look at our Thrive Impact service and book a demo today.
