Welcome to our fifth installment exploring results from our Learning Analytics Research Study and what they mean in practice. This week, we’re diving into the dimensions under the learning program category. (We’ll pick up the learning experience category next week.)
Learning Program Dimensions
Learning program analytics means looking at a group of learners in a particular learning or training program. A common question is whether people have completed the required program elements, but this category can also include questions about a learning program’s effectiveness in improving job performance or driving business KPIs.
We identified four dimensions under the learning program category:
- Attempt: comparing different attempts at a course or other program element
- Status: comparing people with different statuses for a program (e.g., those whose status is “passed” against those whose status is “failed”)
- Interaction: data organized by individual learner interaction that’s often displayed in interaction streams (e.g., being hired, completing a program of learning, or simply interacting with an eLearning course element); see the sketch after this list for what a single interaction record can look like
- Program: data about the program presented together, which is often displayed in Watershed’s program report (e.g., completed learner milestones, amount of time learners took to finish, or program completion percentages of various groups or individuals)
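To make the interaction dimension more concrete, here’s a minimal sketch of the kind of learner interaction data an interaction stream is built from. It follows the standard xAPI statement structure (actor, verb, object, timestamp); the learner name, email address, and activity ID are hypothetical placeholders, not data from the study.

```python
# A minimal, hypothetical xAPI-style statement recording a single learner
# interaction: one learner completing one element of a learning program.
# All names, email addresses, and activity IDs below are illustrative.
statement = {
    "actor": {
        "mbox": "mailto:learner@example.com",   # who interacted
        "name": "Example Learner",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",  # what they did
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/onboarding/module-1",  # what they interacted with
        "definition": {"name": {"en-US": "Onboarding Module 1"}},
    },
    "timestamp": "2023-04-01T09:30:00Z",  # when the interaction happened
}

# An interaction stream is simply many statements like this, ordered by time,
# which interaction reports then group and display per learner.
```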
In practice, most of the reports we observed under the learning program category were program reports, making up nearly 87% of the view share.
Most of the remaining reports in this category (nearly 12%) were interaction reports and leaderboards organized by interaction.
So, how popular is the learning program category in practice?
Looking at our real-world learning analytics data, we found that learning program reports had significantly higher average view counts per report, even though organizations in our study created nearly double the number of learning experience reports compared to learning program or learner reports.
(This remains true, though to a lesser extent, if we exclude the organization with the most report views, which also mainly uses learning program reports.)
What does this mean?
This information provides some helpful insights:
- Reports within the learning program category, which look at a group of learners within the context of a particular learning program, tend to be used most frequently.
That doesn’t necessarily mean these reports are more valuable or impactful in terms of resulting actions, but it’s certainly a good indication that learning program reporting is worthwhile.
- Seasonal changes, such as business financial cycles, can affect learning program report usage because these reports are most relevant while programs are running, especially around deadlines. For instance, learners might need to complete training before a product launch.
In fact, if we look at data over time, learning experience reports have more views than learning program reports in the second half of the year, while learning program report views spike around the end of the financial year in April.
Actionable Insights
Here are two actions you can take away from this blog post:
- Learning program reports tend to get the most attention, so make these reports available to your organization’s managers!
- Reporting on data about learning programs lends itself to specialized learning program reports (such as Watershed’s program report) rather than more generic visualizations (such as pie charts or bar charts).
Up Next: The Learning Experience Category
Next week, we’ll explore the different dimensions within the learning experience category. We identified many more dimensions under this category, so stay up to date and sign up for Watershed Insights for the latest blog posts.
About the author
As a co-author of xAPI, Andrew has been instrumental in revolutionizing the way we approach data-driven learning design. With his extensive background in instructional design and development, he’s an expert in crafting engaging learning experiences and a master at building robust learning platforms in both corporate and academic environments. Andrew’s journey began with a simple belief: learning should be meaningful, measurable, and, most importantly, enjoyable. This belief has led him to work with some of the industry’s most innovative organizations and thought leaders, helping them unlock the true potential of their learning strategies. Andrew has also shared his insights at conferences and workshops across the globe, empowering others to harness the power of data in their own learning initiatives.