L&D Data - Behind the Scenes Part 2

Kick starting our 2022 Learner Intelligence project: survey analysis.

Written by Anna Barnett
Published 16 February 2022

Welcome to the second blog of our miniseries, where we will be opening the door to our research practices and giving you a behind-the-scenes look at how we do things at Mind Tools. At this point in the year, we’re working towards producing our next Learner Intelligence report, which you can read more about in our first blog, here. You can also take a look at our earlier Learner Intelligence reports, here – there’s plenty to keep you going; we’ve been producing them for years! In this blog, though, we’ll be talking about analysis – our favourite eight-letter word starting with A (avocados is second).

As food is always on our mind, let us introduce you to our first course:

Bittersweet Excel files served with lashings of columns and rows

There are lots of tools that help with data exploration, transformation, and manipulation but, truth be told, Excel is amazing at most things when it comes to data. And so, a key step in our research is bringing the raw data into a shape and format that is understandable and meaningful when doing further analysis.

It’s also an opportunity to explore the data (in a not-so-sophisticated way just now) to gather ideas about what might be interesting to investigate further down the line. Indeed, the final Excel file for us is one with lots of columns (these are our demographic cuts of interest, e.g., company size, turnover) and variables as rows. This gives us a neat overview of our dataset, which is filterable and easy to interact with. It’s also super useful for report writing.

As a snapshot, this is what it looks like:


Column C is simply the count of learners that completed the survey (in this case, 2044) and the rest is cut to tell us how many learners work in the industries we care about (construction through to real estate). In Column A, we have the variables that we have measured, which look something like this:
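For readers who prefer code to spreadsheets, that cross-tab shape – variables as rows, demographic cuts as columns, plus an overall count – can be sketched in pandas. The field names and toy data below are purely illustrative, not our actual survey columns:

```python
import pandas as pd

# Hypothetical raw survey export: one row per respondent.
raw = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "industry": ["Construction", "Real estate", "Construction", "Finance"],
    "has_access_to_learning": ["Yes", "No", "Yes", "Yes"],
})

# Cross-tabulate so each survey variable's responses become rows and
# each demographic cut (here, industry) becomes a column, with an
# overall respondent count alongside.
cut = pd.crosstab(raw["has_access_to_learning"], raw["industry"])
cut.insert(0, "All respondents", cut.sum(axis=1))
print(cut)
```

The resulting table is filterable and easy to scan, much like the Excel version described above.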


Whilst we are big fans of Excel, we reserve our more advanced statistical analysis for more specialised programs such as R, Python, or Stata. Which brings us to the second course of our meal:

The meat and potatoes of our reports

It doesn’t seem to matter how many questions you ask in a survey, there’s always massively more analysis than you can (or should) do. Sometimes it’s easy, Excel-only stuff like reporting group averages in pretty bar charts, and other times we need to delve a bit deeper, addressing complicated questions (e.g., what sort of challenges do learners face when they work onsite, remote or both?) with more sophisticated analytic techniques (see Figure 1).
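As a rough illustration of the "easy stuff", the group averages that feed a bar chart are a one-liner with pandas. The arrangement labels and scores here are invented for the example:

```python
import pandas as pd

# Hypothetical responses scored 1-5, with a working-arrangement cut.
df = pd.DataFrame({
    "arrangement": ["Onsite", "Remote", "Hybrid", "Onsite", "Remote", "Hybrid"],
    "learning_score": [2, 4, 3, 3, 5, 4],
})

# Group averages of the kind that end up in a pretty bar chart.
means = df.groupby("arrangement")["learning_score"].mean()
print(means)
```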

Figure 1. A snippet of our R script for this year’s Learner Intelligence data


It’s so important to focus and to think carefully about what it is we want to achieve before we’ve even opened RStudio, otherwise we risk being swamped. It’s also important to remember that just because it doesn’t make the cut for this year’s report, doesn’t mean we can’t use it! Returning to older datasets can be just as insightful months or even years later.

Sometimes the data guides you and other times you need to think more objectively: what questions do L&D leaders have right now? What answers can’t we find elsewhere? What hypotheses did we predict when we drafted the survey? These are all great starting points, and exactly how we approached this year’s data.

In our survey we asked about working arrangements pre-COVID and today. Thinking about the tough decisions that business leaders now need to make about working from home, this seemed like a great place to start. Quickly, and as predicted, we discovered that working arrangements have shifted dramatically over the course of the pandemic (see Figure 2), which guided us to look a bit closer.

Figure 2. How have working arrangements changed over the course of the pandemic?


The next step was to explore whether working onsite, remotely, or via some combination of both had any impact on perceptions of learning at work.

And surprise surprise, it did.

One of the more interesting findings to emerge from the data so far is that for those that report having no access to learning resources at work, the majority work onsite for most or all of the time (see Figure 3). The challenge next, of course, is to identify why this is the case (which you will be able to read about in our Learner Intelligence 2022 report), and to think about what this means in the bigger picture.

Figure 3. Some employees claim to have no access to learning resources at work


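A finding like the one behind Figure 3 can be checked with a normalized cross-tab: within each access group, what share of respondents work in each arrangement? The data below is made up to show the shape of the calculation, not our real result:

```python
import pandas as pd

# Hypothetical coded data: where do respondents without access work?
df = pd.DataFrame({
    "arrangement": ["Onsite", "Onsite", "Remote", "Hybrid", "Onsite"],
    "access": ["No", "No", "Yes", "Yes", "Yes"],
})

# Within each access group, the proportion working in each arrangement.
share = pd.crosstab(df["access"], df["arrangement"], normalize="index")
print(share.loc["No"])
```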
When analysing data, there are always a few surprises along the way. Sometimes, the “obvious” hypotheses don’t ring true, or you discover really cool things in unexpected places.

At the start of this research project, we were pretty sure we would be able to confirm that learners who participate in self-led (rather than mandatory) learning would be more confident applying what they had learned. The logic is simple – if you choose to develop a particular skill, chances are that you’ll be more invested in the learning process.

To assess if the relationship between self-led learning and confidence exists, we ran a simple correlation between the two variables. To our surprise, there was no relationship, which prompted us to look a bit closer. Of the 2044 employees that took part in the survey, 80% reported feeling able to “apply what they had learned” from the most recent skill/capability that they had developed. Interestingly, only 5% said that they couldn’t apply it, and the rest were unsure.
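As an illustration of the kind of check we ran, a simple correlation between two coded variables is a one-liner in pandas. The toy coding here (1 = self-led or "can apply", 0 = mandatory or "can't") is hypothetical, chosen only to show the mechanics:

```python
import pandas as pd

# Hypothetical coded responses: self_led is 1 for self-led learning,
# 0 for mandatory; can_apply is 1 for "can apply what I learned", 0 otherwise.
df = pd.DataFrame({
    "self_led":  [1, 0, 1, 1, 0, 0, 1, 0],
    "can_apply": [1, 1, 0, 1, 1, 0, 1, 1],
})

# A simple (point-biserial) correlation between the two binary variables;
# for this toy data it comes out at roughly zero, i.e. no relationship.
r = df["self_led"].corr(df["can_apply"])
print(round(r, 2))
```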

On reflection, having the confidence to apply what you’ve learned is quite a difficult thing to measure – particularly so when self-reported. And don’t we all like to think that the time and energy we’ve invested into something has paid off? In psychology terms, this would be some sort of sunk cost fallacy-endowment effect hybrid, which describes the nonrational, behavioural outcomes of making personal investments. In this case, the outcome being an overinflated perception of how valuable a learning experience really is.

This experience reminded us of a significant pitfall of self-reported measures. While survey data shouldn’t be regarded as erroneous, it should be taken with a pinch of salt. The salt being a moment to reflect on what we find and ask, “what else could be influencing this particular finding?”.

The final course: The crème de la crème

At Mind Tools, we are now busy pulling all our Learner Intelligence insights together to produce this year’s report. To be the first to receive the report, please submit the pre-order form below. Before then though, keep an eye out for our third and final blog in this miniseries, where we’ll be talking all things report writing.

Bon appétit!


About the author

Anna Barnett

With a strong background in academic research, Anna completed her PhD in 2020 and is now a core member of our Impact & Insights team. Anna specializes in producing high-quality, comprehensive and topical evidence-based reports for the L&D community.
