Learning analytics is in high demand but in short supply. However, those who do it firmly believe in its value. This finding is reflected directly in our own experience—with Watershed clients consistently expanding their capabilities once they’ve applied learning analytics and seen the value of using data-driven insights to guide business strategy.
So what does the demand for learning analytics look like? Is L&D offering the C-suite the reports and insights they need to inform their decision-making?
This blog post covers the following key areas:
- Learning analytics: Rapidly evolving, but early adopters are still ahead of the rest
- What is the demand for learning analytics?
- Does having a budget for measurement make a difference?
- Who’s asking for reports, and how frequently?
- Are L&D’s success metrics holding back the adoption of learning analytics?
- How L&D can avoid the “Chicken or Egg” trap
Learning Analytics: A rapidly evolving industry, but early adopters are still ahead of the rest
As our CEO David Ells mused in his recent blog post, we’ve come a long way since learning analytics was a rare term—taking the form of traditional training evaluation models, smile sheet statistics, and countless Excel spreadsheets.
Whereas once we would see learning analytics applied to single programs, often as a proof of concept, we now see a holistic approach that uncovers trends from the entire learning ecosystem. Add HRIS data into the mix, and you suddenly take your analytics to the next level, offering the C-Suite the type of on-demand insights they need to drive business strategy. Combining your learning data with HRIS data is crucial, as it allows you to view trends by organizational hierarchy (i.e., by job role, region, and individual learning tracks).
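To make that idea concrete, here's a minimal sketch of what joining learning records with HRIS attributes could look like so that trends can be sliced by job role and region. This is our own illustration, not a description of any particular product: the data, column names, and join key are all hypothetical.

```python
import pandas as pd

# Hypothetical learning records exported from an LMS or LRS (one row per course attempt).
learning = pd.DataFrame({
    "employee_id": [101, 102, 103, 104, 101],
    "course": ["Data Basics", "Data Basics", "Safety 101", "Safety 101", "Leadership"],
    "completed": [True, True, False, True, True],
    "score": [88, 72, None, 95, 81],
})

# Hypothetical HRIS extract carrying the organizational hierarchy attributes.
hris = pd.DataFrame({
    "employee_id": [101, 102, 103, 104],
    "job_role": ["Analyst", "Analyst", "Engineer", "Engineer"],
    "region": ["EMEA", "NA", "NA", "EMEA"],
})

# Join the two sources on a shared employee identifier.
combined = learning.merge(hris, on="employee_id", how="left")

# Completion rates and average scores by job role and region.
trends = (
    combined.groupby(["job_role", "region"])
    .agg(completion_rate=("completed", "mean"), avg_score=("score", "mean"))
    .reset_index()
)
print(trends)
```

The design point is simply that once a common employee identifier links the two datasets, any HRIS attribute (role, region, tenure, and so on) becomes a dimension you can report learning trends against.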
For all these positive advances, we know anecdotally that there is a divide between global organizations blessed with forward-thinking learning leaders and budgets, and those that find it hard to get started.
Throughout our “State of Learning Analytics” blog series, we’re validating these observations using the data (and voices, to add the human touch!) of more than 1,000 L&D professionals who responded to two surveys:
- Measuring the Business Impact of Learning. Since 2016, Watershed has partnered with LEO Learning to conduct the annual Measuring the Business Impact of Learning (MBIL) survey.
- Adopting Learning Analytics: Closing the C-Suite/L&D Language Gap. Watershed teamed up with Chief Learning Officer magazine to survey its readers on topics relating to L&D and learning analytics.
So when it comes to the demand for learning analytics, what did we find from the results?
What Is the Demand for Learning Analytics?
When we look at demand, we’re really asking, “Does the business ask L&D to report on its impact?”
Both surveys showed similar results, with half of the respondents saying they feel executive pressure to measure learning’s impact. Our historical data from the last six years tells a consistent story. Furthermore, it’s not unreasonable to assume that the slight dip in 2021 could be attributed to COVID-19 (in 2020 and 2021, more than a third of respondents said their response to COVID-19 was to “focus on operational changes first and do measurement next”).
So we can see that executive pressure to measure has remained consistent for half of our respondents in recent years. This finding is backed up by a high desire to measure and a strong belief that it’s actually possible to do so.
Does Having a Budget for Measurement Make a Difference?
When we break these figures down further to compare organizations that set aside a budget for measurement with those that don’t, we find that both desire and belief increase among those with a budget. This, of course, makes sense and reinforces the notion that those who have invested in learning analytics see the benefits pay off.
The 2022 Measuring the Business Impact of Learning survey found that 94% of organizations want to measure the business impact of learning programs, and 84% believe that it’s possible to do so. For organizations with a budget for learning analytics, this rises to 99% and 94%, respectively.
In terms of response to the COVID-19 pandemic during the past few years, there is a sharp distinction between organizations that already have a budget for learning analytics and those that don’t.
Organizations with a budget are more likely to have chosen to double down on their measurement strategies. In contrast, those without a budget are more likely to have postponed learning measurement in favor of other priorities.
Given this high demand for learning analytics, it is surprising that many organizations struggle to integrate learning analytics into their processes for developing, launching, and supporting learning programs.
However, we’ve found that these common roadblocks are often to blame: not knowing how or where to start, trouble securing budgets, and finding time for analytics among competing priorities. (We’ll explore the role of impact metrics and how to break the chicken-or-egg cycle toward the end of this blog post.)
You can dive deeper into the difference budget makes in the full version of our Measuring the Business Impact of Learning in 2022 report.
Learning Analytics Demand: Who’s asking for reports, and how frequently?
The CLO survey asked for more detail about who explicitly asks for reports on learning data. It found that the most frequent requests* came from:
- Senior management (70% of respondents)
- Heads of L&D (43%)
- Heads of HR (37%)
- Line Managers (28%)
- Instructional Designers (24%)
*Note that respondents checked “all that apply,” so these totals add up to more than 100%.
The other stakeholders (i.e., board, regulators, learners, legal, head of sales, and finance) do not commonly request reports on learning, with each selected by only 7–16% of respondents.
The pattern here is that more senior people—including senior management and heads of HR and L&D—are more likely to ask for reports than frontline employees who develop content and directly manage learners.
Generally speaking, this pattern is often seen with many different types of data (i.e., this doesn’t just apply to learning data). For example, as we reported in “3 Reasons It's Time for L&D to Invest in Data and Learning Analytics,” 81% of senior executives and managers in the United States report having access to data, compared to only 48% of frontline workers.
Democratizing Data: The more the merrier
In our recent webinar, “The State of Learning Analytics: Views from 1,000 L&D Professionals,” we spoke of the importance of building a culture of curiosity. For an organization looking to embrace learning analytics, having dedicated people within Learning and Development who have the desire, time, and skills to dig into data and look for trends is critical.
But this isn’t, and shouldn’t be, limited to those in dedicated L&D roles; learning analytics can be just as valuable and insightful to line managers and instructional designers as it is to senior colleagues.
Line managers need to understand what their people are learning and the new skills they are developing so they can make the best use of those talents. Managers must also understand knowledge and skill gaps to best support their people's improvement. And yet, in 72% of organizations surveyed, line managers are not asking for reports on learning data.
Similarly, instructional designers need analytics to understand how their content is and isn't working. Without evaluating the success of their programs, how can instructional designers improve their craft and create more effective learning? How do they identify and resolve issues with existing programs if 76% of them are not asking for reports on learning data?
This may be because they:
- don’t realize that it’s possible to get reports on learning data,
- don’t see themselves as able to ask for reports, or
- don’t recognize the value of reporting on learning.
Addressing these issues means communicating to line managers and instructional designers what’s possible, what’s available, and the value of reporting. This ties in with the essential role learning analytics specialists play in educating business stakeholders on the broader reporting possibilities.
See how athenahealth’s L&D team uses a newsletter to keep colleagues updated with the organization’s program data, insights, and best practices.
What report types do people want from L&D, and how often?
The CLO survey asked respondents about commonly requested report types and their frequency. The results for how often people request reports are mixed, so it is arguably hard to draw meaningful insights from this data:
- Some teams receive requests at intervals between monthly and yearly (47% in total).
- Other teams receive requests at the end of learning programs (21%).
- And some teams don't receive any requests (22%).
When we look at the kinds of reports requested by type, we see the focus prioritized like this:
- Learning Experience
- Learning Programs
- Learners
When we dive into more detail on the report types, we see a mixed bag with knowledge and skill gain topping the lot:
- About six report types are each requested by 40–50% of organizations.
- Usage reports (56%), completion reports (60%), and knowledge/skill gain reports (63%) only marginally stand out from the crowd as more commonly requested.
We’ll be exploring more on the currently requested report types versus the reports that L&D wishes for (if there were no limitations) later in this series.
For instance, we found that “Completions” dropped from second place (currently reporting on) to fifth place when asked for ideal types of reports—possibly reflecting that mandatory/compliance reporting is seen as a necessary evil rather than a true reflection of on-the-job performance.
Are L&D’s Success Metrics Hindering the Adoption of Learning Analytics?
When we looked at the data from our CLO survey, our whitepaper concluded that L&D was falling behind in providing the C-Suite with quantitative metrics. This finding indicates an over-reliance on reporting on L&D’s own productivity (courses created, completions) and learner satisfaction.
While these metrics are useful, alone they do little to honestly assess the impact the learning had on changing a learner’s behavior. The link to business metrics (performance KPIs) appears to be the missing piece.
The wider mismatch between demand for learning analytics and supply may be due to how L&D’s success is measured and, therefore, how L&D teams are incentivized and rewarded for their work. For example, one Watershed client expressed the challenges of prioritizing learning analytics like this:
“Our team is not accountable for showcasing business impacts. We’re not incentivised to focus on business impact. We’re responsible for putting on great experiences that people enjoy.”
The survey results also reflect this sentiment, with only a third (33%) of respondents saying the success of their department is measured by organizational impact or performance improvement. And 60% said they’re measured based on learner satisfaction, content utilization, or not at all.
The solution is to ensure that L&D is held accountable for, and incentivized to deliver and measure, performance and business impacts. Agreeing on the right business metrics with key stakeholders beforehand is critical. With those metrics agreed, you can embed business impact into the heart of the instructional design process, allowing programs to address a particular business problem with clearly defined and measurable performance goals.
Not only is this good for effectively measuring impact, but it will also naturally focus your learning objectives on how learners apply what they’ve learned in their daily roles to meet the business KPIs.
Check out our BALDDIE instructional design process (adapted from the ADDIE design method), which walks you through each step.
How L&D Can Avoid the “Chicken or Egg” Trap
As stated above, L&D can help break the cycle of reporting on success metrics by engaging the business with meaningful, pre-agreed impact metrics.
While this is a good starting point, creating a learning analytics program can be challenging, as L&D teams can often find themselves in a chicken-or-egg situation. That's because:
- There’s no budget for learning measurement as there hasn’t been one previously.
- There’s no expectation from the organization for L&D to report on learning’s performance and business impact because they’re not accustomed to having those reports.
- There’s no team in place for learning analytics, so there’s nobody to drive the launch of a learning analytics implementation.
- There’s no technology in place for learning analytics, so any attempts at learning analytics are more time-consuming and less impactful.
Building a culture of curiosity is vital to getting started. Many of the forward-thinking organizations we work with have a learning analytics champion who embeds a data-driven culture within L&D first and then educates the wider business. Getting started is key; creating a clear business case for learning analytics to win over your business stakeholders is the next step.
Up Next: Are You Ready for Learning Analytics?
The next post in this series looks at how prepared organizations are for learning analytics and where organizations are on their learning analytics maturity journey. In fact, most organizations see themselves at the lower end of the learning analytics maturity scale, so we’ll discuss ways to help your organization progress up that scale.
About the author
Having worked in almost every job going in marketing, Ash loves the diversity and variation of challenges marketing handles. From acknowledging pain points to genuine, straightforward messaging, there’s a lot to be said and many ways to say it!