Learning games are an increasingly popular way to engage learners with essential topics or to give them practice in a skill. By their nature, games don’t follow a standard structure. Instead, each one has its own rules, ways of playing, and measures of success.
So how do you report on a game that’s unique, and what’s the business case for doing so in the first place? This post explores game analytics and real-world examples before outlining the business case and how you can get stakeholder buy-in.
If you’re new to our blog series on Building a Business Case for Learning Analytics, check out the introduction for an overview and recommendations for making the most of this series.
What Is Game-Based Training?
Games are often used in training to share knowledge and facts, relying on learning by repetition: seeing and using the same information again and again. The term “serious games” is sometimes used to describe game-based training because the purpose is personal development rather than entertainment.
Game-based training is a fun, competitive approach to engaging and motivating learners. Typically, these games involve quizzes where individuals or teams compete against each other, sometimes with incentives for the winners.
eLearning games can be particularly helpful with more competitive audiences or with mundane content that benefits from having a hook to encourage learner engagement. For instance, you can use learning games to follow up on product knowledge sales training.
What Is Learning Game Analytics?
Learning game analytics means using player data about usage and engagement to inform the design, development, and operation of L&D training games. These analytics answer questions such as:
- Who is playing?
- How do players interact with the game?
- Are there any problems with the game that need to be addressed or lessons learned for future game development?
- How are players performing compared to one another? Who is winning and losing?
- Is the game an effective learning activity? Is it having the desired business impact?
eLearning games often have unique gameplay interactions and rules. For example, one game might involve cards, another might include quizzing, and yet another might be a workplace simulation. Even within these offerings, different games work in different ways. These variations add interest to training games, as learners don’t have to follow the same process for every piece of learning.
The following report uses demo data from an information security card game and shows how reporting needs to be game-specific. For example, this report shows whether players have viewed all the content that randomly appears during gameplay, with lighter colors indicating unseen content. You can use insights like these to encourage people to keep playing, so they view all the necessary information.
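If you were working with that kind of “viewed content” event data yourself, the underlying coverage calculation is straightforward. Here’s a minimal Python sketch (not Watershed’s implementation), using entirely hypothetical player and content IDs, that works out how much of the game’s content each player has seen:

```python
# A minimal sketch: given exported "viewed" events, compute each player's
# coverage of the game's content. All IDs and data below are hypothetical.
from collections import defaultdict

ALL_CONTENT_IDS = {"phishing", "passwords", "tailgating", "usb-drops"}  # hypothetical content IDs

# Hypothetical export: one record per "viewed" event.
viewed_events = [
    {"player": "alice@example.com", "content": "phishing"},
    {"player": "alice@example.com", "content": "passwords"},
    {"player": "bob@example.com", "content": "phishing"},
]

seen = defaultdict(set)
for event in viewed_events:
    seen[event["player"]].add(event["content"])

for player, content in seen.items():
    coverage = len(content & ALL_CONTENT_IDS) / len(ALL_CONTENT_IDS)
    missing = ALL_CONTENT_IDS - content
    print(f"{player}: {coverage:.0%} of content seen; still unseen: {sorted(missing)}")
```

Players with low coverage or specific unseen items are the ones to nudge back into the game.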
What Does Game-Based Analytics Look Like in Practice?
Games & Analytics: The perfect combination
A pharmaceutical company’s L&D department used Watershed to support a quiz game running on Scrimmage’s mobile learning platform. Watershed provided game leaderboards so teams of sales reps in different territories could compete with one another for the highest scores.
The game’s goal was to identify knowledge gaps following an in-person training event. By discovering and analyzing the questions with the most incorrect answers, the team was able to identify a lack of knowledge around the company’s segmentation strategy. Using this knowledge, the team rolled out further training to address the gap.
The game was so successful and the insights so valuable that the training manager compared the combination of Watershed and Scrimmage to peanut butter and chocolate: “Each is tasty on its own. But dip the chocolate in the peanut butter, and you’ve really got something special!”
Improving sales performance with training games
A financial services company identified that salespeople who sold bundled products rather than individual products performed better than those who didn’t. Using this insight, the company used Gomo Learning to develop a training game designed to shift salespeople to this way of selling.
They also used Watershed to evaluate the learning’s impact. As a result, the team was able to show how:
- the game positively impacted the way salespeople sold products, and
- salespeople who played the game for more than four hours could identify four times as many sales opportunities as the company average.
This use case is not only a great example of learning game analytics, but also of good learning design. The company started with a business goal of increased sales, identified what salespeople needed to do differently to sell more bundles (rather than individual products), and then designed the learning solution to bring about that change. And with detailed learning evaluation work that included both Kirkpatrick Level 3 and Level 4 measures, the company proved that carefully thought-through learning design really does work.
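To make the shape of that Level 3/4 comparison concrete, here’s a rough Python sketch: contrast a business metric for salespeople who played the game for more than four hours against everyone else. All names, fields, and numbers are hypothetical, not the company’s actual data.

```python
# A rough sketch of the kind of level-4 comparison described above: compare a
# business metric (sales opportunities identified) between heavy players and
# everyone else. All data below is made up for illustration.
from statistics import mean

records = [
    {"rep": "r1", "hours_played": 5.5, "opportunities": 12},
    {"rep": "r2", "hours_played": 1.0, "opportunities": 3},
    {"rep": "r3", "hours_played": 4.5, "opportunities": 10},
    {"rep": "r4", "hours_played": 0.5, "opportunities": 2},
]

heavy = [r["opportunities"] for r in records if r["hours_played"] > 4]
others = [r["opportunities"] for r in records if r["hours_played"] <= 4]

print(f"Played >4 hours: avg {mean(heavy):.1f} opportunities identified")
print(f"Everyone else:   avg {mean(others):.1f} opportunities identified")
print(f"Ratio: {mean(heavy) / mean(others):.1f}x")
```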
Furthermore, this is an excellent example of using Watershed to add competitive elements to a quiz. The company used Watershed reports to give team leaders visibility into their teams’ performance.
Tracking game content completion and drop-out rates in Zero Threat
Another example is Zero Threat—a digital card game in which learners must defeat hackers by playing the correct information security best practice card against the corresponding hacker threat.
The game uses this approach to teach players about potential information security threats and the steps to counter them. And it’s done in a way that’s a lot more fun and engaging than simply displaying a list of threats and security precautions for learners to read.
We helped the creators of Zero Threat implement xAPI tracking so Watershed can capture every learner action—from cards received to cards played. Watershed also tracks the “hacker’s” actions, which means you can see what a player is responding to and ensure they have encountered all the possible threats during gameplay.
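For illustration, here’s a minimal Python sketch of what sending one such xAPI statement to an LRS could look like when a player plays a card. The verb and activity IDs, endpoint, and credentials are hypothetical placeholders, not Zero Threat’s or Watershed’s actual values.

```python
# A minimal sketch of sending an xAPI "played a card" statement to an LRS.
# Endpoint, credentials, verb ID, and activity ID are all hypothetical.
import requests

LRS_ENDPOINT = "https://lrs.example.com/xapi/"  # hypothetical LRS endpoint
AUTH = ("lrs_key", "lrs_secret")                # your LRS credentials

statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {
        "id": "http://example.com/verbs/played",   # hypothetical verb ID
        "display": {"en-US": "played"},
    },
    "object": {
        "id": "http://example.com/zero-threat/cards/firewall",  # hypothetical card activity ID
        "definition": {"name": {"en-US": "Firewall card"}},
    },
    "context": {
        "extensions": {
            # Hypothetical extension recording which round the card was played in.
            "http://example.com/extensions/round": 3
        }
    },
}

response = requests.post(
    LRS_ENDPOINT + "statements",
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()
print("Statement stored:", response.json())
```

Tracking the “hacker’s” moves works the same way, just with a different actor and verb, which is what makes the coverage and drop-off reports below possible.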
We’ve demoed Zero Threat and the corresponding Watershed reports at many industry conferences, and even that demo data has generated some helpful insights. For example, the following report shows how many of the 20 rounds players complete. As you might expect, many players got a feel for the game and stopped playing in the first few rounds. However, if players made it to Round 8, most continued until at least Round 14 before dropping out.
This data suggests a conference version of the game with fewer rounds might be preferable so players can experience the whole game. And for the game’s design in workplace training settings, it suggests that attention-grabbing elements are most needed at the start of the game and from Round 14 onward, where drop-off is highest.
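As a rough illustration of how that drop-off view can be derived from tracked gameplay data, here’s a small Python sketch that counts how many players reach each of the 20 rounds. It assumes a hypothetical export of the furthest round each player reached, rather than any particular Watershed report.

```python
# A small sketch of the drop-off analysis: count how many players reached each
# of the 20 rounds. The per-player data below is hypothetical.
from collections import defaultdict

# Hypothetical export: the furthest round each player reached.
furthest_round = {"alice": 3, "bob": 14, "carol": 8, "dave": 20, "erin": 2}

players_reaching = defaultdict(int)
for rounds_reached in furthest_round.values():
    for round_number in range(1, rounds_reached + 1):
        players_reaching[round_number] += 1

for round_number in range(1, 21):
    count = players_reaching.get(round_number, 0)
    print(f"Round {round_number:2d}: {count} player(s) reached this round")
```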
How Does Watershed Support Learning Game Analytics?
Improving learner engagement with leaderboards
Watershed supports gameplay by providing the functionality to track and share players’ scores on leaderboards, which adds some friendly competition and entices players to outperform their colleagues. You also can incorporate Watershed reports into games themselves to give learners instant feedback on their performance compared to their peers.
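Under the hood, a leaderboard is simply an aggregation of score data; Watershed provides this as reporting functionality rather than code you write. The following Python sketch just shows the underlying idea, using hypothetical exported score events:

```python
# A minimal leaderboard sketch: keep each player's best score, then rank
# highest first. The score events below are hypothetical.
score_events = [
    {"player": "alice", "team": "North", "score": 85},
    {"player": "bob",   "team": "South", "score": 92},
    {"player": "alice", "team": "North", "score": 97},
    {"player": "carol", "team": "South", "score": 78},
]

best = {}
for event in score_events:
    key = (event["player"], event["team"])
    best[key] = max(best.get(key, 0), event["score"])

leaderboard = sorted(best.items(), key=lambda item: item[1], reverse=True)
for rank, ((player, team), score) in enumerate(leaderboard, start=1):
    print(f"{rank}. {player} ({team}): {score}")
```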
Measuring game-based learning with Watershed
You can assess learning with Watershed’s reports and analytics. For instance, this might be:
- a learning assessment following previous training (as in the pharmaceutical example above),
- an evaluation of whether a game is improving skills and knowledge, or
- a report showing whether a quiz game results in higher scores and faster responses the more learners play (a quick sketch of this follows the list).
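As a quick illustration of that last report type, here’s a hedged Python sketch that checks whether scores rise and response times fall as learners make more attempts. It uses hypothetical per-attempt data and simple correlations (Python 3.10+ for `statistics.correlation`), not any particular Watershed report.

```python
# A sketch: do scores improve and response times shrink with repeated play?
# Hypothetical per-attempt data; requires Python 3.10+ for statistics.correlation.
from statistics import correlation

attempts = [
    # (attempt number, score, response time in seconds)
    (1, 60, 42), (2, 68, 35), (3, 75, 30), (4, 81, 26), (5, 84, 24),
]

attempt_numbers = [a[0] for a in attempts]
scores = [a[1] for a in attempts]
response_times = [a[2] for a in attempts]

print("Attempt vs score correlation:        ", round(correlation(attempt_numbers, scores), 2))
print("Attempt vs response time correlation:", round(correlation(attempt_numbers, response_times), 2))
```

A strongly positive score correlation and a negative response-time correlation would support the claim that repeated play builds fluency.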
It’s also important to monitor the usage of game-based learning. In fact, monitoring utilization and exploring how learners use the game are even more critical here than for other learning activities, because games tend either to take off and explode in popularity or to perform quite poorly.
There may also be an increased risk of glitches, given the technical complexity of games compared with other learning activities, such as watching a training video or taking a survey. Closely monitoring games, especially when they first launch, helps you address issues faster, whether they’re technical or related to promoting the game.
Evaluating training game effectiveness
Finally, you can use Watershed to evaluate the effectiveness of a training game. For example, you can compare how often people play the game and how well they score with skills assessments and on-the-job application data.
What Is the Business Case for Watershed in This Scenario?
We’ve illustrated that training game analytics isn’t just an optional add-on for a learning game project; it should be integral. These analytics help ensure a game delivers the greatest possible impact and return on investment by:
- powering leaderboards that foster friendly competition and motivate learners,
- helping you spot and address any issues with the launch of the game, and
- providing data to evaluate the game’s impact, helping you make a case for other games in the future.
How Can You Convince Stakeholders of the Value?
Stories and case studies such as those in this blog post play an essential role in convincing stakeholders of the potential value of eLearning games. For instance, the fourfold increase in sales opportunities described in the financial services example is compelling evidence to show how games can have a significant and positive impact on a business’s bottom line.
You need learning analytics to confirm whether your games live up to that potential and deliver their intended impact.
Understand your stakeholders and how they benefit from learning game analytics.
| Stakeholders | Pain Points | Benefits |
|---|---|---|
| C-Suite (CLO, CEO, CFO) | C-suite wants to ensure the game is achieving its intended business results. | Learning analytics measure the impact of the game and evaluate its success at achieving its goals. |
| Human Resources | HR wants to ensure the game equips learners with needed skills and knowledge. | Learning analytics measure the game’s effectiveness at helping learners develop knowledge and skills. |
| Learning Leaders | Learning leaders want to ensure the game is well used and results in learning. | Learning analytics monitor game usage and measure the knowledge and skills learners develop after playing. |
| Instructional Designers | Instructional designers want to learn how to produce an even more effective game and understand the effectiveness of games as a mode of learning. | Learning analytics can give instructional designers granular detail about how learners interact with the game, which can inform the design of future games. |
| Compliance | Compliance teams want to ensure learners are playing compliance training games and that the games drive compliant behavior. | Learning analytics measure the use and impact of compliance-related learning games. |
| Line Managers | Line managers want to ensure their team members get the most out of the game while addressing any gaps in knowledge and skills within their team. | Learning analytics monitor whether all team members play the game and highlight particular gaps in skills and knowledge. |
| Learners | Learners want to compare their success to others to claim bragging rights. | Leaderboards keep learners updated with their success compared to others to feed competitive impulses! |
Next Course: Why Learning Ecosystem Analytics Is Good for Business
In addition to game-based learning, you may also have eLearning content, instructor-led training, social learning, video tutorials, and external content libraries. You might deliver these experiences via a well-planned, integrated learning ecosystem. Or, in some cases, your ecosystem may be more of a loose collection of platforms.
Whatever the state of your ecosystem, you need proper visibility of your organization’s learning to inform L&D’s decision-making properly. That’s why it’s vital to be able to report on learning wherever it happens and without having to log into several different platforms. In the next post, we explore learning ecosystem analytics and the business case for having all your learning data and reporting in one central place.
About the author
As a co-author of xAPI, Andrew has been instrumental in revolutionizing the way we approach data-driven learning design. With his extensive background in instructional design and development, he’s an expert in crafting engaging learning experiences and a master at building robust learning platforms in both corporate and academic environments. Andrew’s journey began with a simple belief: learning should be meaningful, measurable, and, most importantly, enjoyable. This belief has led him to work with some of the industry’s most innovative organizations and thought leaders, helping them unlock the true potential of their learning strategies. Andrew has also shared his insights at conferences and workshops across the globe, empowering others to harness the power of data in their own learning initiatives.