If you want good, comprehensive reporting that answers all your questions, you need good, comprehensive data. But when you have multiple tools, systems, and platforms across your learning ecosystem, how can you ensure you’re getting all the data you need and in the proper format?
In this blog series, we’ll explore data requirements for the various tools and technologies that are commonly found in learning ecosystems as well as requirements and best practices for extracting data from each system into your learning analytics platform (LAP).
Whether you are planning an integration with an existing platform, writing an RFP, or discussing options with a vendor, this series will help you to ask the right questions.
Key takeaways
At its simplest, this series will answer:
- What do I need to do when implementing learning tools and technologies across an ecosystem?
- What do I need to consider for each system—compliance, conformance, best practices, etc.?
This series will cover the key platforms in your learning ecosystem and beyond, including:
- Learning Management System (LMS)
- Learning Experience Platform (LXP)
- Video Platform
- Survey Tool
- Digital Credentialing Service
- Observation Checklist Apps
- HRIS
- BI Tool
- ILT/VILT
- AR/VR
- Custom Tools (Quizzes, Games, Intranet & More)
What are my L&D reporting requirements?
Before we dive into the specific data requirements for different types of systems, let’s cover some general principles that are useful whatever tool you’re extracting data from.
Helpful Hint: Good reporting design starts with good learning design because the purpose of the learning will inform what you want to report on. Be sure to identify a clear goal for the learning you are designing.
For instance, someone may ask you how many people in department Y have passed course X, but is that really the question they want answered? The best learning programs are also planned with reporting and data capture in mind: if the program is designed to change X, what data will demonstrate that X is changing?
When reporting on learning, you’re typically looking to accomplish one or more of the following tasks:
- I’m tired of going to five disparate systems to get information. I want to report on all of my organization’s learning data in one place.
- I want to see how my organization's learning has changed over time. Are people more engaged, have they increased their training, are they getting better at learning? (This might sound similar to the last item, but this request might only apply to a few systems—such as an LMS or a new LXP—rather than apply across all systems and technologies. This step is often used to begin to prove the value of investment in that system.)
- Show me how a specific piece of learning has changed how my business operates. This is something along the lines of:
- Does sales training increase sales of a product or service?
- Does product training increase product knowledge?
- Does customer service training increase customer satisfaction?
These questions will require data from both your learning systems and the system that tracks the metric you want to change. In this blog series, however, we will focus on platform requirements: helping you ask the right questions and get the best data out of your learning platforms to support your learning measurement goals.
What should I look for from a new platform or vendor?
When it comes to pulling data into an LAP, a good xAPI implementation is a great start, but ideally you want to have tested that xAPI connection in your LAP's reports.
When you can see how the data appears in reports and visualizations, you can more easily spot any issues in the data.
If possible, ask your platform vendor to work with your LAP team to ensure there is a good data connection between the two platforms before you sign off on a long-term contract. In particular, you should check:
- The xAPI implementation is complete and working on the live platform, and you’re not just looking at test or example data. Ideally, look for platforms that have customers already using the xAPI implementation (i.e., ask the vendor for references). This helps ensure the implementation will be properly maintained and provides confidence that the integration works because it is already in use.
- The xAPI implementation tracks all the events and includes all the data that you need for reporting.
- The vendor is contractually obliged to maintain the xAPI implementation and fix any issues that emerge with the data over the duration of your contract.
- The xAPI implementation is well documented and includes examples of xAPI statements showing the technical structure of the data. You’ll need this information to support your xAPI data governance processes.
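To make the documentation point concrete, here is a minimal sketch of the kind of xAPI statement a vendor's documentation should include. The learner ID, URLs, and course names are illustrative, but the overall shape (actor, verb, object, result) follows the xAPI specification:

```python
# A minimal, illustrative xAPI statement. All names and IDs are made up;
# the structure (actor, verb, object, result) follows the xAPI spec.
statement = {
    "actor": {
        # The learner identifier you will later match to your HR system
        "account": {"homePage": "https://example.com/hr", "name": "employee-12345"}
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/passed",
        "display": {"en-US": "passed"},
    },
    "object": {
        "id": "https://example.com/courses/product-training-101",
        "definition": {"name": {"en-US": "Product Training 101"}},
    },
    "result": {"score": {"scaled": 0.85}, "success": True},
    "timestamp": "2024-03-01T09:30:00Z",
}

# A quick sanity check you can run against a vendor's example statements:
# are the data points you need for reporting actually present?
required_keys = {"actor", "verb", "object", "timestamp"}
assert required_keys <= statement.keys()
```

Having real example statements like this in the vendor's documentation makes it far easier to run your own data governance checks before signing.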
Look for platforms whose vendors can share examples or case studies of clients who have used the data the platform provides; this helps ensure the platform can meet your needs.
For instance, if you want to send L&D data to an LRS, ask for examples of how that works and what the data looks like. If you want to show utilization of the platform back to the business, ask for examples of that too.
In other words, avoid providers that cannot point to customers who have successfully integrated the platform with other systems, and be wary of providers that claim to do everything inside their own platform; that is usually a sign the platform doesn't play well with others.
Why are data requirements important for reporting?
Good reporting stands on the foundation of good data. You can have the snazziest charts and the most beautiful dashboards, but if the underlying data is missing, inaccurate, or misleading, they are of no use to anybody.
Defining clear data requirements at the outset of your project and as you connect additional data sources helps to ensure that your data is:
- Complete. When reporting on all learning activity across your organization, you want to see all learning activity across your organization, not just most of it.
- Accurate. It’s important that data is accurate to maintain confidence in reporting and avoid wrong conclusions.
- In line with expectations. Data can be complete and accurate, but still mislead report viewers if it shows something slightly different than what’s expected. For instance, a report of everyone who got to the last slide of an e-learning course is different from a report of everybody who passed the quiz in the course. That’s why it’s important to be explicitly clear with your reporting and data requirements.
Good data requirements translate the reporting requirements into clear, unambiguous requirements about exactly what data is required, and what is included and excluded.
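The completion-versus-pass distinction above can be made concrete with a small sketch. The records and verb names here are simplified assumptions, but they show how the same data answers two different questions depending on which event you count:

```python
# Illustrative sketch: the same set of simplified xAPI-style records
# answers two different questions depending on which verb you filter by.
records = [
    {"learner": "a", "verb": "completed"},
    {"learner": "a", "verb": "passed"},
    {"learner": "b", "verb": "completed"},  # reached the end but never passed
]

# "Everyone who got to the last slide" vs. "everyone who passed the quiz"
reached_end = {r["learner"] for r in records if r["verb"] == "completed"}
passed_quiz = {r["learner"] for r in records if r["verb"] == "passed"}

print(sorted(reached_end))  # ['a', 'b']
print(sorted(passed_quiz))  # ['a']
```

Learner "b" appears in one report but not the other, which is exactly the kind of gap that unambiguous data requirements are meant to surface up front.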
Ensure data collection efforts meet reporting needs
For your reports to be relevant and useful to your primary reporting users, you need the data underpinning those reports to be good. This means considering your reporting requirements in order to determine your data requirements.
Here are our top three tips for defining reporting requirements:
- Consider all your reporting stakeholders, not just the obvious ones. Who else might want to have a view into the data? These stakeholders might include the learner, operational managers, regulators, senior managers, the learning designer/developer, compliance managers, or customers, just to name a few.
- Think big and inspire your stakeholders. If they are not used to having access to data and reporting, their initial requests may be limited. Give them what they ask for, but also consider how you might inspire them to make greater use of the data for a bigger impact (e.g., David Rosenfeld shares his experience of doing just that at athenahealth).
- Run a pilot, create a mockup, or test with real data. If you’re not experienced at defining reporting requirements, it’s often not until you can explore the data that your requirements become clear. Give your reporting stakeholders something to look at and ask for feedback.
Once you know your reporting requirements, build your data requirements from there. Consider what data and in what format is required to power the reports you need. And then build some test reports based on test data that match those requirements.
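One lightweight way to test data against your requirements before building reports is a simple validation pass. The field names below are assumptions for illustration, not a standard:

```python
# Sketch of validating test data against agreed data requirements
# before building test reports. Field names are illustrative.
REQUIRED_FIELDS = {"learner_id", "activity", "timestamp"}

def missing_fields(record):
    """Return any required fields that a record lacks."""
    return REQUIRED_FIELDS - record.keys()

test_data = [
    {"learner_id": "emp-001", "activity": "course-x",
     "timestamp": "2024-01-15", "result": "passed"},
    {"learner_id": "emp-002", "activity": "course-x"},  # missing timestamp
]

# Collect (index, missing fields) for every record that falls short
problems = [(i, missing_fields(r))
            for i, r in enumerate(test_data) if missing_fields(r)]
print(problems)  # [(1, {'timestamp'})]
```

Catching gaps like the missing timestamp at this stage is far cheaper than discovering them once stakeholders are relying on the reports.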
Do data requirements differ between a learning ecosystem and a data ecosystem?
Most organizations are used to working with all sorts of data. Learning data, however, tends to be data about the activities of people, and is normally used by report viewers interested in those people, in the activities they undertake, or in both.
As a result, there are some particular considerations to make when working with learning data:
- You will almost always want to connect your learning activity data to data about your organizational hierarchy from your HR systems. For all learning data, make sure there is a learner identifier in that data that can be matched to an identifier in the HR system.
- For learning activity data, you need to know what was done, who did it, when it was done and—where relevant—what the result was. Make sure all of those data points are available.
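The two points above can be sketched together: as long as each learning record carries a learner identifier that matches the HR system, enriching activity data with organizational context is a straightforward join. The field names and values here are hypothetical:

```python
# Sketch of joining learning activity data to HR hierarchy data on a
# shared learner identifier. All field names and values are illustrative.
hr_records = {
    "emp-001": {"name": "Avery", "department": "Sales"},
    "emp-002": {"name": "Blake", "department": "Support"},
}

# Each activity record captures what was done, who did it, when,
# and (where relevant) the result.
activity = [
    {"learner_id": "emp-001", "what": "course-x",
     "when": "2024-01-15", "result": "passed"},
    {"learner_id": "emp-002", "what": "course-x",
     "when": "2024-01-16", "result": "failed"},
]

# Enrich each activity record with the learner's department for reporting
enriched = [
    {**a, "department": hr_records[a["learner_id"]]["department"]}
    for a in activity
]
print(enriched[0]["department"])  # Sales
```

If the identifiers don't match across systems (e.g., email in one, employee number in the other), this join fails, which is why the identifier requirement deserves attention up front.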
Up Next: LMS Data Requirements
Now that we’ve covered the general principles of data requirements that you can apply across your ecosystem, we’re ready to start exploring data requirements unique to specific platforms and systems—starting with a learning management system (LMS). Be sure to sign up for our blog so you don’t miss out!
About the author
Peter Dobinson is passionate about developing connected learning ecosystems that empower organizations to deliver exceptional learning experiences. With a strong foundation in product design and management, eLearning interoperability, system integrations, user-centered design, and data analytics, he thrives in helping organizations get the most out of their L&D data. Peter's background in learning technology means he has the knowledge and expertise needed to drive the implementation of innovative solutions such as xAPI within the L&D industry. In other words, Peter helps organizations unlock the true potential of learning—one ecosystem at a time.