As we completed our Learning Analytics Research Study, we found many reports that fell into the Learning Experience category. Not only were there more reports, but they were also more varied, leading us to create nearly double the number of dimensions under this category compared to the learner and learning program categories. In this post, we'll explore these learning experience dimensions in practice.
What Are Learning Experience Analytics?
Learning experience analytics focus on learning experiences, platforms, and content. They answer questions such as:
- What learning content is popular?
- How can I make L&D videos more engaging?
- Why are average scores for one question so much higher than for others?
- What topics need additional training content?
- What time of day is my learning platform the busiest?
These analytics will most likely interest the L&D team and those responsible for creating, sourcing, and providing these learning experiences.
Conversely, learning experience analytics are less likely to be helpful to people managers and those more interested in how particular groups of people (or the organization as a whole) are performing.
What caused so many varied learning experience reports?
L&D team members tend to be more advanced Watershed users and have time to really dig into and explore their data. So, it's not surprising that we see a wide variety of reports regarding learning experience analytics.
A Quantum Leap into Learning Experience Dimensions
To help simplify these findings, we've grouped them into a few buckets. So let's tackle them one at a time.
1) Asset and experience dimensions
These dimensions compare the individual assets and experiences a learner interacts with, including:
- Resource: comparing resources (e.g., eLearning content, videos, or documents)
- Experience: comparing different experiences (e.g., events or classes)
- Question: comparing questions of an assessment or survey
- Session: comparing different sessions of a course or other experience
- Section: comparing parts of a document or other resource
For example, this report compares two survey questions:
Both questions show high percentages of always and sometimes responses, but the second question indicates comparatively more rarely and never responses. So, this may be a better area to focus on for future training initiatives.
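To make the idea concrete, here's a minimal sketch of how a question-level comparison like this could be produced from raw response records. The question text, response scale, field names, and sample data are illustrative assumptions, not Watershed's actual data model.

```python
import pandas as pd

# Hypothetical survey responses; the questions, responses, and field
# names are made up for illustration.
records = pd.DataFrame([
    {"question": "I know where to find our safety procedures", "response": "always"},
    {"question": "I know where to find our safety procedures", "response": "always"},
    {"question": "I know where to find our safety procedures", "response": "sometimes"},
    {"question": "I report near misses when I see them",       "response": "sometimes"},
    {"question": "I report near misses when I see them",       "response": "rarely"},
    {"question": "I report near misses when I see them",       "response": "never"},
])

# Count responses per question, then convert to percentages so the
# two questions can be compared side by side.
counts = records.groupby(["question", "response"]).size().unstack(fill_value=0)
percentages = counts.div(counts.sum(axis=1), axis=0) * 100
print(percentages.round(1))
```

The same grouping pattern applies to any of the dimensions above; only the field you group by changes.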
2) Asset and experience grouping dimensions
Other dimensions compare groupings or collections of assets or experiences, including:
- Content Type: comparing different types of content (e.g., eLearning vs. video)
- Content Provider: comparing various sources of the content (e.g., vendors)
- Data Source: comparing the applications that sent the data
- Duration: comparing different lengths of content or experiences
- Version: comparing different versions of a piece of content
For example, this report compares the usage of learning content by duration:
This report shows people are more likely to consume content with a duration of 5, 10, or 15 minutes. However, this may be due to the content's overall availability, which can affect what people can watch.
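As a rough illustration, a duration dimension can be derived by bucketing content length before aggregating usage. The bucket edges, titles, and numbers below are assumptions chosen for the sketch.

```python
import pandas as pd

# Hypothetical content-usage records; titles, durations (in minutes),
# and view counts are invented for this sketch.
usage = pd.DataFrame({
    "content":  ["Intro video", "Fire safety", "GDPR basics", "Leadership 101", "Onboarding"],
    "duration": [5, 10, 15, 45, 90],
    "views":    [320, 280, 260, 40, 12],
})

# Bucket each piece of content by length, then total the views per bucket.
buckets = pd.cut(
    usage["duration"],
    bins=[0, 5, 10, 15, 30, 60, 120],
    labels=["0-5 min", "6-10 min", "11-15 min", "16-30 min", "31-60 min", "61-120 min"],
)
print(usage.groupby(buckets, observed=True)["views"].sum())
```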
3) How the experience or asset was accessed
Some dimensions compare how the experience or asset was accessed, which include:
- Browser: comparing internet browsers (e.g., Google Chrome, Internet Explorer, etc.)
- Device: comparing different devices (e.g., mobile phone, desktop, or tablet)
- Workflow: comparing navigation flow to the content (e.g., search, share, homepage link, etc.)
For example, this report compares usage between an iPad and an iPhone:
4) Outcome dimensions
Some dimensions compare various outcomes of the learning experience, which include:
- Mistake: comparing the number of times different errors occur
- Response: comparing different responses to a question
- Score: comparing scores or ranges of scores
For example, this report compares responses to a question to highlight common knowledge gaps.
5) Time period dimensions
Some dimensions compare different timeframes, whether looking at data during a specific period or comparing recurring periods (e.g., day of the week or hour of the day).
For example, this report compares LMS activity by day of the week:
Aside from enrollments, LMS usage tails off toward the end of the week, with a peak on Tuesday. This finding suggests Tuesday may be an excellent day to enroll people in new programs.
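Time-period dimensions like this usually boil down to deriving a period (weekday, hour, month) from each activity timestamp and then counting. Here's a minimal sketch with invented timestamps and actions.

```python
import pandas as pd

# Hypothetical LMS activity log; timestamps and actions are invented.
activity = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2023-03-06 09:15", "2023-03-07 10:02", "2023-03-07 14:40",
        "2023-03-08 11:30", "2023-03-10 16:05",
    ]),
    "action": ["launched", "completed", "launched", "passed", "launched"],
})

# Derive the weekday from each timestamp, then count activity per day.
by_day = (
    activity
    .assign(weekday=activity["timestamp"].dt.day_name())
    .groupby("weekday")["action"]
    .count()
)
print(by_day)
```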
6) The remaining (action and search) dimensions
We observed a verb dimension that compares the different actions taken by the learner, as shown in the following heatmap:
We also observed a search term dimension that compares different search terms:
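Behind both of these dimensions sits a simple cross-tabulation of statement counts: by verb (and, say, month) for the heatmap, or by search term for the search report. The verbs, months, and counts below are placeholders, not data from the study.

```python
import pandas as pd

# Hypothetical xAPI-style statement summary; verbs and months are placeholders.
statements = pd.DataFrame({
    "month": ["Jan", "Jan", "Feb", "Feb", "Feb", "Mar"],
    "verb":  ["launched", "completed", "launched", "passed", "failed", "launched"],
})

# Cross-tabulate verbs against months -- this grid of counts is the data
# behind a verb heatmap. Swap "verb" for a search-term field to build a
# search report the same way.
heatmap_data = pd.crosstab(statements["verb"], statements["month"])
print(heatmap_data)
```

Feeding a grid like heatmap_data into any charting tool reproduces a heatmap like the one above.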
How Are Learning Experience Analytics Used?
Interestingly, the organizations in our study that primarily conduct learning experience analytics differ from those that mainly use learner and learning program analytics.
That's to be expected if you view the learning analytics triangle as a maturity model, and it's why we recommend starting with one category of analytics before expanding to others.
It also suggests that learning experience analytics are more likely to appeal to learning and development professionals, while learner and learning program analytics are more targeted at people managers.
Let's look at the kinds of learning experience analytics in use. Reports mostly compare individual learning experiences (51% of reports and 58% of report views) and time periods (31% of reports and 24% of views).
Reports organized by search term are also widespread (7% of reports and 10% of views), which is notable because not all of the clients in our study have platforms with search capability.
Actionable Insights
We've seen that reports comparing individual experiences are used most often. This approach can be great for identifying the most popular learning content, but what do you do with that data?
Once you've identified your best content, consider further analysis to understand why it's the best. That might involve more quantitative analysis, looking at performance by factors such as duration or workflow, as shown above. You should also consider qualitative research following Brinkerhoff's Success Case Method.
Up Next: Complexities & Analysis Types
Next time, we move from categories and dimensions to complexities and analysis types, exploring the questions people ask as they analyze their data.
About the author
As a co-author of xAPI, Andrew has been instrumental in revolutionizing the way we approach data-driven learning design. With his extensive background in instructional design and development, he’s an expert in crafting engaging learning experiences and a master at building robust learning platforms in both corporate and academic environments. Andrew’s journey began with a simple belief: learning should be meaningful, measurable, and, most importantly, enjoyable. This belief has led him to work with some of the industry’s most innovative organizations and thought leaders, helping them unlock the true potential of their learning strategies. Andrew has also shared his insights at conferences and workshops across the globe, empowering others to harness the power of data in their own learning initiatives.