Course testing and review: Are you guilty of these common oversights?
Why do you need to review the courses that you create?
This might seem like a bit of a trick question, but the answer matters: reviews are an essential way to make sure there are no errors or omissions that impact the quality of your training.
Although we’re all aware that eLearning authoring should be as error-free as possible, it’s still easy to overlook the value of a good review process. No matter how much care and attention you’ve put into an eLearning course, there’ll always be a few blips and glitches that can make their way to the review and testing stage.
That’s why you need to embrace reviewing and testing and keep a watchful eye out for problems that detract from the learning experience. Read on to discover some of the common oversights we encounter (and how to address them!).
Failing to test different end-user environments
From spurious error messages to courses that just plain fail to load, your end users’ environment can throw up unanticipated errors that result in a bad experience. However, it’s not just errors that can throw things off. You need to test whether each possible user environment results in different and potentially inferior experiences:
Devices

We’re now long past the assumption that desktop computers are the only device your learners will use. Today, desktops, laptops, tablets, iOS phones, Android phones, and other devices are all on the table. Of course, the best authoring tools (like Gomo!) will use responsive eLearning design to handle your content on all of these devices—but you still need to check what the output actually looks like. Remember: you’re not just checking whether content works on each device, but whether your design ensures users are getting the best experience no matter how they access your content.
Browsers

Most people are largely settled on their favorite browser, whether it’s Chrome, Safari, Firefox, Edge, and so on. Amidst this array of options, it can be easy to miss the subtly different ways browsers can handle content—even content that’s compliant with web standards. As such, it’s never been more important to test your content in every popular browser to catch all of these errors. Don’t assume your content is faultless just because it works in the browser you’re authoring in! In the spirit of thoroughness, it’s worth bearing in mind that there are multiple mobile browsers to take into account, too.
Operating systems

It doesn’t take long to list out the usual suspects in the world of operating systems, but—as ever—that’s not the whole story. For mobile devices, Android and iOS are obvious candidates for thorough testing. Meanwhile, a good chunk of business machines run Windows. However, you only have to take a walk through a design department to find someone using macOS, and it’s also far from impossible that your technical teams are running some flavor of Linux. When it comes to operating systems, as with so many aspects of testing, it’s best to leave no stone unturned!
Learning management systems

We often see people reviewing their courses via a test LMS but forgetting to engage with the LMS on which they’ll actually deliver their content. Just as browsers sometimes treat web standards in slightly different ways, SCORM and xAPI implementations can vary across different LMSs. For example, SCORM imposes character limits on certain fields: one LMS may return an error that clearly flags the character issue, while another may present only a confusing general message.
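To make the character-limit point concrete: SCORM 1.2 specifies a 4,096-character limit on the cmi.suspend_data field, and LMSs respond to overruns in different ways. A defensive check during testing can surface the problem before learners hit it. The sketch below is illustrative only—the `buildSuspendData` helper and the state shape are hypothetical, not part of any SCORM API:

```javascript
// Sketch: guard against SCORM 1.2's 4,096-character limit on
// cmi.suspend_data before handing saved state to the LMS.
// buildSuspendData and the state shape are hypothetical examples.
const SCORM_12_SUSPEND_DATA_LIMIT = 4096;

function buildSuspendData(state) {
  // Serialize whatever course state you track (bookmarks, answers, etc.)
  return JSON.stringify(state);
}

function checkSuspendData(state) {
  const data = buildSuspendData(state);
  return {
    data,
    length: data.length,
    withinLimit: data.length <= SCORM_12_SUSPEND_DATA_LIMIT,
  };
}

// Example: a large state object that quietly blows past the limit.
const bigState = { answers: Array(1000).fill("option-b") };
const result = checkSuspendData(bigState);
if (!result.withinLimit) {
  console.warn(
    `suspend_data is ${result.length} chars; ` +
    `SCORM 1.2 allows ${SCORM_12_SUSPEND_DATA_LIMIT}`
  );
}
```

Running a check like this against realistic worst-case learner data during the review stage tells you whether you need to trim your saved state—before an LMS silently truncates it in production.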
The challenge of testing for multiple user environments
Clearly, all creators will benefit from an awareness of the disparities created by different user environments. As you test each option, ask yourself: does my content still achieve my desired learning outcome? And is it more or less effective in certain formats?
What should be clear from this list is that it would be impractical to test for every permutation of every device, browser, operating system, and LMS that could be used to access a course. The important thing is to make sure that you have a set of test devices that are representative of your end-user group. Not sure which devices fit the bill? Take advantage of existing user data or survey your learners to find out what you need to test with thoroughness and integrity.
Building great courses involves spinning a lot of plates. Our how-to guides are here to help: ‘How to deliver multiple versions of an eLearning course to different learner groups’
Overlooking divergent user journeys
A common mistake organizations make when testing is to individually check each content unit via the contextless list of screens in their editor. Once these all look correct, they’ll congratulate themselves on a job well done. Here’s the problem: though this approach covers every screen in the course, it doesn’t account for all the possible journeys and paths through the content.
If you’re using any form of question logic, you’re often sending learners through diverging paths when they answer a question in different ways. It’s important to test each and every one of these paths to ensure that every learner gets the experience you intend for them to have. Omitting a given journey, or even a number of journeys, can cause unexpected issues to emerge at late stages of the course creation process.
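One way to make sure no journey goes untested is to enumerate every path through the course before the review begins. The sketch below walks a simple branching map with a depth-first search and lists each distinct journey; the screen names and map format are hypothetical, not taken from any particular authoring tool:

```javascript
// Sketch: list every distinct learner journey through a branching course.
// The course map here is a hypothetical example (assumes no loops back
// to earlier screens); real tools export their own structures.
const courseMap = {
  intro:     ["question1"],
  question1: ["remedial", "advanced"], // branch on the learner's answer
  remedial:  ["summary"],
  advanced:  ["summary"],
  summary:   [],                       // end of course
};

function listPaths(map, screen, path = []) {
  const current = [...path, screen];
  const next = map[screen];
  if (!next || next.length === 0) return [current]; // reached an end screen
  // Recurse into each branch and collect the complete journeys.
  return next.flatMap((target) => listPaths(map, target, current));
}

const journeys = listPaths(courseMap, "intro");
journeys.forEach((p) => console.log(p.join(" -> ")));
// This map yields two journeys: one via "remedial", one via "advanced".
```

Even a rough enumeration like this turns “test every path” from a vague aspiration into a concrete checklist your reviewers can tick off.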
Skipping instant previews
When you’re pressed for time, it can be tempting to skip the preview process as you build your content. This is an understandable omission for time-poor learning designers. After all, in-tool previews aren’t a substitute for that all-important final review within your LMS. However, that doesn’t mean previews aren’t a useful time-saving device.
By glancing at how your eLearning content looks as you build it, you’ll have a valuable opportunity to nip any obvious mistakes in the bud. Plus, the right authoring tool will allow for instant previews, helping you to finesse your course layout in real time.
Interested in instant previews? Check out Gomo’s preview process—alongside a host of features and updates: ‘Gomo in 2022: 5 exciting updates, from live delivery to instant previews’
Non-technical pitfalls that are surprisingly easy to forget
Faced with the complexity of the systems and learning designs that require review, it’s not uncommon for teams to gloss over one or more aspects of the actual content. A robust device testing process might reveal that a screen doesn’t load properly on mobile, but it won’t catch an obvious spelling mistake in the second paragraph.
That’s why it’s worth getting someone knowledgeable to review every aspect of your language use, including:
- Spelling and grammar
- Tone of voice
- UK and US English (and dialectal differences in other languages)
- Company style guidelines
Company style guidelines, in particular, can catch teams out—and the last thing your project needs is for brand compliance to throw a last-minute spanner in the works. In some scenarios, a project may encompass 50 or more courses, all created at the same time. If the theme used across all of these courses is not brand-compliant, every course will need to be corrected.
The right platform can help you avoid this type of disaster. Gomo allows you to update your master theme file: a strategy that could quickly correct the problem. However, even this is dependent on building courses with a unified theme. Plus, this kind of problem incurs a further round of testing and review to make sure the change has taken effect on all your courses—and that there are no obscure issues that have arisen as a result.
About the author: Pratibha Shah
Since I joined the team in 2014, I’ve been a part of Gomo’s journey from a simple authoring solution to the outstanding content creation tool it is today.
For many years, I've contributed to numerous projects within the eLearning sector as a gatekeeper of quality. My passion lies in delivering outstanding software to our clients and continuing Gomo’s growth through the implementation of exciting new features.
I started my career with Gomo as a Tester and now lead the Gomo Engineering Team. I take pride in supporting our developers to create innovative and effective solutions for our clients in collaboration with our Product Manager. I oversee projects from their conceptual stage through to final delivery and work alongside my team to help us achieve our goals.