Nearly seven years after Watershed was founded, we have watched profound changes take place in the learning analytics industry. We’ve come a long way from a time when learning analytics was a rare term and measurement took the form of traditional training evaluation models, smile-sheet statistics, and piles of Excel spreadsheets.
Today, we have many emergent and valuable areas of analytics in the learning domain—like program analytics, content analytics, cross-system utilization metrics, and business impact measurement, among others. This breadth of capability has been enabled by new platforms, automated processes, standard data formats, and the growing expertise of practitioners who share and support one another in this effort.
This journey of co-exploration among trailblazers has created a defined space that I believe has taken clear enough shape to welcome the early majority into the fold. While this year’s survey results (see below) remain consistent on some themes we’ve been watching, they also reveal major differences between respondents who took an early chance on learning analytics and those who are still looking to get started.
Ultimately, what we’re watching unfold in the data is a set of organizations moving along a scale of learning analytics maturity. The data also surfaces differences that reveal how less mature organizations can move further along that scale.
An industry in motion
Many organizations are aspirational when it comes to learning analytics. While roughly 19 out of 20 respondents to our Measuring the Business Impact of Learning (MBIL) survey believe in the importance and possibility of applying learning analytics to measure business impact in their organizations, and approximately 3 out of 4 believe that investments should be made in this area, only 1 out of 4 report actually setting aside budget to this end.
That is a huge disparity: it suggests the vast majority of respondents want to realize the rewards of learning analytics but lack the budget to do so. Furthermore, the CLO survey results show significant differences between the types of analytics prevalent at respondents’ organizations and the types they wish were prevalent, moving along the spectrum from basic compliance reporting to skill and learning path analysis.
However, it’s good to remember this is not a static picture. These results give us a snapshot of an industry in motion. It is noteworthy in itself that 1 out of 4 MBIL respondents report that they do have a budget set aside for learning measurement. And half of the respondents to the CLO survey report they are planning to graduate from spreadsheets and implement a learning analytics platform, which will help automate the analytics workflow and drive more sophisticated inquiries into the data.
These results reveal the aforementioned spectrum of maturity. I believe this can create a powerful positive feedback loop, as results from organizations with mature learning analytics practices help drive motivation and buy-in for those that are still putting the pieces together.
See some great examples of mature learning analytics in our webinar recording “5 Ways Learning Analytics Can Transform Your Business.”
Data is not easy
One of the most important pieces of the effort is talent. After all, analysis requires analysts. Whether that talent is outsourced, drawn from another department, or embedded within the team, we need skilled professionals who can understand, access, manipulate, and present data in order for that data to have any impact.
No matter how data is collected and stored, and even if gathering and cleaning it is automated, making use of that data is a nontrivial exercise that requires significant specialized skills.
Even within the learning domain, where many common areas of analysis exist, every data ecosystem and set of analysis goals has unique elements. Organizations with access to technical data skills will be far more agile in fitting their efforts directly to their needs.
Here we see a progression very similar to the one we saw with budget. About 1 out of 4 respondents report dedicated analytics capability in the form of a data analyst or data engineer. Others report a more general capacity, but 3 out of 5 respondents report no analytics capability on their team at all.
A similar proportion, 3 out of 4, report that they have no intention of adding to this capability. From the perspective I’ve gathered at Watershed over many years, I can report that the most successful learning analytics efforts invest in both technology and talent. Having a high-end racecar is not enough to win—you need an able driver as well.
The audience is there
Though the metric has fallen in the most recent MBIL report, reported pressure from executives to measure learning’s impact has still risen by more than 40% since the first survey six years ago.
Roughly 1 out of 2 respondents report this pressure to measure. This result aligns with the CLO findings on who is asking for learning data within the organization, with the top three stakeholders, in order, being Senior Management, Head of HR, and Head of L&D.
The fact is, as we’ve understood for some time, thriving organizations use a lot of data to understand their business and make decisions, and learning cannot linger as an exception to this rule.
While operations, marketing, sales, and finance bring significant data insights to the table in the C-suite, learning is increasingly expected to have analogous capabilities—especially as the assessment, retention, and development of people take center stage in the post-pandemic war for talent.
One of the most notable results along these lines is that “knowledge gained and skills acquired” was reported by CLO survey respondents as the most common type of report requested. While “completions” (implying compliance reporting) is next highest on the list, seeing skills as the number one inquiry is telling.
This underscores the fact that organizations continue to face shifting talent needs, driven by a changing employee base, new strategic goals, and the emergence of new capabilities and skills that could materially affect the business (e.g., learning analytics!).
Learners still matter—a lot
Demand for skills reporting also emphasizes an important truth: a business is ultimately its people. A business’s ability to meet the market’s demands comes down to its people’s capabilities. The significant employee churn of the post-pandemic era has been a huge reminder of this point for all organizations. Understanding, developing, and retaining skilled talent is essential to success.
To that end, I think it makes perfect sense that, coinciding with this interest in skills, MBIL respondents report learner satisfaction as the most common measure of success for their L&D teams.
In a competitive talent landscape with more options for potential employees, things like individual learning budgets, skill development programs, and growth opportunities are major differentiating factors. People have the choice right now to not just find a job but the best job—and they have underscored personal development as a key factor in that choice.
While all the other success metrics, including business impact and program effectiveness, are crucial to the long-term success of a learning function, they are neither mutually exclusive with learner satisfaction and engagement nor do they make it secondary.
No training program, no matter how well designed, will have any impact on learners who are disengaged and leaving for another organization that better demonstrates care for, and investment in, its people.
Up Next: What’s the demand for learning analytics?
This series will take us into an incredibly detailed discussion of the state of learning analytics, informed not just by a wide view of survey results but by our experience in the industry and years of observation.
We are watching organizations move up the maturity curve of learning analytics. And we offer many insights into that progression in hopes of illuminating a path for all organizations to realize a more valuable and effective learning analytics practice. Join us for the next post in this series as we discuss the demand for learning analytics and what it means for L&D.
About the author
David Ells takes great pride in leading a dynamic team and turning innovative ideas into reality. His passion for technology and development found its roots at Rustici Software. During his tenure, he contributed significantly to the creation of SCORM Cloud, a groundbreaking product in the eLearning industry, and led the development of the world’s first learning record store powered by xAPI.