3 things to know about eLearning measurement and tracking (and 5 ways it makes content better)
The measurement and tracking of eLearning isn't just a bit of housekeeping to do at the end of your digital learning process. In this chapter from our ebook 'From Tool Selection to Measurement: 6 Steps to eLearning Authoring Success' we explore why they are a new beginning for your wider L&D strategy.
While we all understand the importance of measurement and tracking, there are plenty of factors stopping L&D departments from using the available technology to its full extent. Whether through limited systems or a reliance on old structures, organizations are often stuck reporting little more than pass, fail and completion events. We could be achieving so much more with learning measurement. In this article, we offer a few tips for taking your approach to the next level.
How to use measurement and tracking to inform future content
Your learning analytics approach will vary on a case-by-case basis, but there are some common principles. Primarily, you’re looking for the weaker areas of your content and determining how you can improve those areas either in live content or in the content you build in the future. Key things to look for include:
- Failure rates: Have learners failed specific questions or sections of the course? Look for failure rates significantly above the average for other sections, bearing in mind that some questions are just naturally harder than others. Peer-review, rewrite, and restructure material to get a better result.
- Time on page: Authors should have some idea of how long a piece of content takes to read and absorb. If time on page is low, learners may be skipping it. If time on page is high, learners may be bored, or alternatively, struggling to understand the concepts or language used. Experiment with rewrites and restructures.
- Progress through a course: Do learners get halfway through a topic, then skip it, only to fail the quiz later? This can happen when learners are initially presented with too much they already know, while genuine new information they aren’t aware of is buried several screens in. You could restructure the course accordingly.
- Device data: Are mobile users taking longer to complete courses, or taking less time on a page? This may indicate an issue with how the course is being delivered on different devices.
- Before/after scores: If you’re running pre- and post-training tests, you’re obviously looking for improvement in before/after scores. If there’s no improvement, something in your approach isn’t working. Either the material itself needs review, or you should consider offering extra training that reinforces the same points.
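As a rough illustration, the failure-rate check described above could be scripted over raw question-level results. The data shape and the outlier threshold here are hypothetical, not taken from any particular tool:

```python
from collections import defaultdict

# Hypothetical question-level results pulled from your tracking data:
# (question_id, passed) pairs.
results = [
    ("q1", True), ("q1", True), ("q1", False),
    ("q2", False), ("q2", False), ("q2", True),
    ("q3", True), ("q3", True), ("q3", True),
]

def failure_rates(results):
    """Return the failure rate per question as a fraction between 0 and 1."""
    totals = defaultdict(int)
    fails = defaultdict(int)
    for qid, passed in results:
        totals[qid] += 1
        if not passed:
            fails[qid] += 1
    return {qid: fails[qid] / totals[qid] for qid in totals}

rates = failure_rates(results)
average = sum(rates.values()) / len(rates)
# Flag questions whose failure rate sits well above the course average
# as candidates for peer review and rewriting.
outliers = sorted(q for q, r in rates.items() if r > average * 1.5)
```

The same pattern extends naturally to time on page or before/after score deltas: aggregate per item, compare against the course-wide average, and flag the outliers for review.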
Getting to grips with xAPI and SCORM course tracking
xAPI can go beyond the parameters listed above to deliver some very fine-grained insights. For example, for a multiple-choice question with multiple correct answers, xAPI will report each choice individually. It can also tell you things like how many times a question was attempted, or how much time was spent on a course.
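As a sketch, an xAPI statement for that kind of multiple-choice question might look like the dictionary below. The `answered` verb, the `choice` interaction type, and the `[,]` response delimiter come from the xAPI specification; the actor, activity IDs, and responses are made up:

```python
# Illustrative xAPI statement for a multiple-choice question with
# several correct answers. IDs and names are placeholders.
statement = {
    "actor": {"mbox": "mailto:adam@example.com", "name": "Adam"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/answered",
        "display": {"en-US": "answered"},
    },
    "object": {
        "id": "http://example.com/course/quiz/question-4",
        "definition": {
            "type": "http://adlnet.gov/expapi/activities/cmi.interaction",
            "interactionType": "choice",
            # Both "a" and "c" must be selected for a correct answer.
            "correctResponsesPattern": ["a[,]c"],
        },
    },
    "result": {
        # Each selected choice is reported individually,
        # joined with the spec's "[,]" delimiter.
        "response": "a[,]b",
        "success": False,
    },
}

chosen = statement["result"]["response"].split("[,]")
```

Because every selected choice is recorded, you can see not just that a learner got Question 4 wrong, but exactly which distractor they picked.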
The granular tracking discussed above focuses on leveraging capabilities of the xAPI tracking standard. However, there are other tracking standards such as SCORM 1.2 and 2004 that, while not quite as new or comprehensive as xAPI, are widely used, particularly in tracking compliance learning.
The various SCORM standards have traditionally emphasized the completion and success status of an entire course. Although SCORM does offer the ability to track smaller-scale interactions, its tracking model is very rigid and not adaptable to different types of learning. It isn’t as well suited to offering especially in-depth analysis of how the content is performing. As a learning author, it’s imperative that measurement helps you understand which areas work well and what needs improving so you can continually improve your learning materials.
It’s worth remembering that xAPI isn’t just about improving eLearning content. Digital courses are often used as part of a blended learning approach, and learners’ successes and failures reflect on the effectiveness of classroom content too.
The benefits of advanced tracking extend beyond measurement
SCORM is LMS-dependent, whereas xAPI content isn’t subject to this technical constraint. Instead, tracking statements are recorded in a Learning Record Store (LRS). The LRS can be completely detached from the learning, which can be hosted anywhere you like. This could be on a website, or on a company intranet, for example.
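To make the idea concrete, here is a minimal standard-library sketch of how a tracking statement might be posted to an LRS over HTTP. The endpoint URL and credentials are placeholders, and in practice delivery is handled by your authoring tool or platform rather than hand-written code:

```python
import base64
import json
import urllib.request

def build_lrs_request(endpoint, username, password, statement):
    """Build (but do not send) an HTTP POST delivering one xAPI
    statement to an LRS. Endpoint and credentials are illustrative."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        url=endpoint.rstrip("/") + "/statements",
        data=json.dumps(statement).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic " + token,
            # Version header required by the xAPI specification.
            "X-Experience-API-Version": "1.0.3",
        },
        method="POST",
    )

req = build_lrs_request(
    "https://lrs.example.com/xapi",  # hypothetical LRS endpoint
    "user", "secret",
    {"actor": {"mbox": "mailto:a@example.com"},
     "verb": {"id": "http://adlnet.gov/expapi/verbs/completed"},
     "object": {"id": "http://example.com/course/intro"}},
)
# urllib.request.urlopen(req) would then deliver it to the LRS.
```

Note that nothing in this exchange involves an LMS: the content can live on a website or intranet, and it only needs the LRS endpoint to report tracking data.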
This isn’t to say that xAPI makes traditional Learning Management Systems defunct. They still provide some very important features such as access control and learning paths. Because of this, standards such as cmi5 define a common packaging format that allows xAPI content to function inside an LMS.
In this way, xAPI doesn’t just allow you to change the depth at which you can track the content, but it also allows you to distribute the content in a different way. This can help you deal with bandwidth issues, or host the content more locally to the users taking your courses. This has obvious benefits for international organizations testing employees abroad.
Tips for avoiding analysis paralysis
Of course, the extra granularity that xAPI allows is a double-edged sword. Its structure of Actor-Verb-Object (“Adam-Answered-Question 4”) allows for a huge number of different combinations and comparisons between data points. While you can filter specific verbs to look at certain groups—such as all fails, all passes and all ‘experiences’—there has to be a purpose for doing so.
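For example, filtering a set of simplified Actor-Verb-Object records down to all fails might look like the sketch below. The statement shape is deliberately stripped down for illustration:

```python
# Simplified records following the Actor-Verb-Object pattern.
statements = [
    {"actor": "Adam",  "verb": "answered", "object": "Question 4"},
    {"actor": "Beth",  "verb": "failed",   "object": "Module 2 quiz"},
    {"actor": "Adam",  "verb": "passed",   "object": "Module 1 quiz"},
    {"actor": "Carol", "verb": "failed",   "object": "Module 2 quiz"},
]

def by_verb(statements, verb):
    """Keep only statements matching one verb, e.g. all fails."""
    return [s for s in statements if s["verb"] == verb]

fails = by_verb(statements, "failed")
```

The filter itself is trivial; the discipline lies in deciding up front which verb-level questions are actually worth asking.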
If you have a learning analytics platform (such as Watershed), you can avoid going too far off on a tangent by simply keeping to the recommended dashboards. Such platforms also make the process of combining various xAPI statements effortless, meaning you waste less time exploring dead ends.
Troubleshooting tips for optimal eLearning course delivery
Troubleshooting is a complex area, and the most serious problems should be discussed with all relevant vendors in order to uncover the root cause. However, we do see a couple of basic things that commonly trip teams up.
When setting up your xAPI tracking, you may be asked to provide end-point details (for example, your username/password). It seems obvious, but it’s important to double-check that these details are correct and firing off the information that you expect. Sometimes details change, other times an unchecked typo ruins everyone’s day. If you’re devoting adequate time to testing, this is something that should be weeded out early, ideally in the prototyping stage.
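One quick sanity check, assuming your LRS follows the xAPI specification, is to request its `about` resource with the details you have been given; a failure at this stage points to an endpoint or credential problem rather than a content problem. The sketch below only builds the request, and the URL is illustrative:

```python
import base64
import urllib.request

def credential_check_request(endpoint, username, password):
    """Build a GET against the LRS 'about' resource as a quick sanity
    check that the endpoint URL and credentials are wired up correctly.
    Endpoint and credentials here are placeholders."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        url=endpoint.rstrip("/") + "/about",
        headers={
            "Authorization": "Basic " + token,
            "X-Experience-API-Version": "1.0.3",
        },
    )

req = credential_check_request(
    "https://lrs.example.com/xapi/",  # hypothetical LRS endpoint
    "user", "secret",
)
# urllib.request.urlopen(req) would confirm the endpoint responds.
```

Running a check like this in the prototyping stage catches the changed-password and unchecked-typo cases before any learners are affected.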
Though it isn’t strictly an implementation problem, one very common tracking issue to be aware of is caused by dropped connections. Because courses are run locally on a user’s machine, poor connections can sometimes result in tracking statements never returning to the server. Furthermore, the learner won’t necessarily notice that anything is up—content can be cached locally so it appears to work.
Therefore, make sure you’re aware of whether your user is subject to an intermittent connection during your troubleshooting process. There’s no technical fix you can deploy for this—unless your organization is responsible for providing the internet connection itself. Ideally, authoring tools should make the user aware of the issue, for example, with a ‘lost connection’ pop-up. This should also inform the user that course progress may not be recorded.
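Tool vendors typically handle delivery internally, but the usual mitigation for dropped connections is a client-side queue that only discards a statement once the server confirms receipt. The sketch below is a hypothetical illustration of that pattern, not any specific product's behavior:

```python
class StatementQueue:
    """Sketch of a client-side buffer: statements are queued locally and
    only removed once delivery succeeds, so an intermittent connection
    doesn't silently lose tracking data."""

    def __init__(self, send):
        self.send = send      # callable that delivers one statement
        self.pending = []

    def record(self, statement):
        self.pending.append(statement)
        self.flush()

    def flush(self):
        remaining = []
        for statement in self.pending:
            try:
                self.send(statement)
            except ConnectionError:
                remaining.append(statement)  # keep for the next retry
        self.pending = remaining

# Simulated flaky connection: the first attempt fails, the retry succeeds.
delivered = []
attempts = {"n": 0}

def flaky_send(statement):
    attempts["n"] += 1
    if attempts["n"] == 1:
        raise ConnectionError("network dropped")
    delivered.append(statement)

queue = StatementQueue(flaky_send)
queue.record({"verb": "completed"})  # delivery fails, statement stays queued
queue.flush()                        # connection restored: statement delivered
```

Pairing a queue like this with a visible 'lost connection' notice covers both halves of the problem: the data eventually arrives, and the learner knows when it hasn't yet.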
Without measurement and tracking, you’ll never quite know whether you’ve successfully introduced your new eLearning authoring platforms, and whether your content choices have hit the mark. Therefore, you neglect them at your peril. They are your most important tools for developing, improving, and supporting your learning content going forward.
In the remaining chapters of the ebook, we also cover how to:
- Define your needs and lay the groundwork before purchasing a new learning tool
- Work to your budget and build content that secures budget increases
- Introduce and effectively transition staff to a new tool
- Create high-quality, effective learning content
- Review your content in line with best practice in order to avoid common oversights
About the author: Adam Fox
Adam Fox is Director of Engineering at PeopleFluent and has been part of the LTG group since 2014. As a member of the engineering leadership team, Adam is responsible for technical strategy as well as software and quality assurance engineering teams.
He brings to the business a wealth of experience in designing, developing, and leading teams to create innovative systems that draw on the most appropriate and cutting-edge technologies. Adam has worked across health, defence, education, and professional sectors, producing award-winning solutions for over 15 years.