
Monitoring the success of eLearning

How to evaluate your online courses


When exactly can an online course be said to have been successful? The answer to this question is essential for assessing whether you are getting a return on your investment—after all, eLearning must first and foremost be cost-effective and move your company forward, regardless of any idealistic motives. This article tells you about the different levels you can use to analyze the success of your online courses, and how to obtain and analyze specific data.


4 levels of analysis for evaluating learning outcomes

Whenever a number of stakeholder groups are involved in something, all perspectives must be taken into account. This also applies to eLearning. In the long run, it is no use if test scores are high but employee satisfaction with the learning process is low, or vice versa. The renowned evaluation model developed by the US scientist Dr. Donald Kirkpatrick ensures that all key viewpoints are taken into consideration. Donald “Don” L. Kirkpatrick (15 March 1924 to 9 May 2014) was a US economist and professor at the University of Wisconsin. In the 1950s, he researched the evaluation of educational processes as part of his PhD dissertation. He later used this research as the basis for his four-level evaluation model—the foundation of contemporary training monitoring. His 1975 book, Evaluating Training Programs, earned him worldwide recognition.


Level 1: Reaction

The first level asks how satisfied learners are with a course. Their satisfaction depends on a wide variety of factors. Some of the most important are:

  • Relevance and usefulness of content
  • Usability of software
  • Ease with which learning can be integrated into everyday (working) life
  • Applicability of content
  • Enjoyable communication of content that makes neither too many nor too few demands of learners

The more relevant and useful learners perceive new content to be, the higher their intrinsic motivation to engage with it will be, and the greater the attention they will pay to their learning. However, in the era of hybrid working, widespread individualization, and on-demand consumption, online courses also need to be easy to fit into everyday life and to give learners space for self-direction. Otherwise, learners will perceive even the most relevant online course as a burden.


Level 2: Learning

It is crucial to learning outcomes that learners are satisfied with online courses—and of course, this also promotes employee loyalty. In addition, however, you need to ensure that learners actually assimilate and retain what they have been taught. The easiest way of determining this is to build tests into your courses or, where necessary, run external tests. Self-assessment by learners can also provide useful information.


Level 3: Behavior

Although it may be important for your employees to acquire as much new knowledge as possible, they must of course also be able to put it into practice. After all, the key reason companies invest in training is to improve everyday work and processes. At the same time, as explained in level 1, if content is relevant and applicable, this will increase your learners’ motivation.

But can it actually be applied in practice? Unfortunately, this question is a great deal more difficult to answer than the first two. The principal challenge is to identify parameters that will enable the applicability of content to be determined. If you’re delivering an online course on sales strategies, for example, you can analyze trends in sales figures—both in your own company and in comparison with subsidiaries that may either not have introduced the online course or may have higher/lower completion rates.
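The sales-strategy example above can be made concrete with a small calculation. The sketch below is purely illustrative: the team names, figures, and six-month window are invented, and real analyses would need to control for seasonality, staffing, and other influences.

```python
# Hypothetical example: compare sales trends before and after an online
# course between a trained team and an untrained control team.
# All figures are invented for illustration.

def pct_change(before, after):
    """Percentage change between two average monthly sales figures."""
    return (after - before) / before * 100

# Average monthly sales (e.g. in EUR) for the six months before and
# after the course was rolled out.
trained = {"before": 120_000, "after": 138_000}
control = {"before": 115_000, "after": 117_300}

trained_trend = pct_change(trained["before"], trained["after"])   # +15.0 %
control_trend = pct_change(control["before"], control["after"])   # +2.0 %

# The gap between the two trends is a rough indicator of the course's
# effect, ignoring all other influences on sales.
effect = trained_trend - control_trend
print(f"Trained: {trained_trend:+.1f} %, control: {control_trend:+.1f} %, "
      f"difference: {effect:+.1f} percentage points")
```

Comparing against a control group that did not take the course (or had lower completion rates) is what separates a plausible effect from a company-wide trend.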


Level 4: Results

The last of the four levels of analysis deals solely with the efficacy of the eLearning project with regard to company objectives. This analysis is intended to show what tangible results the training will deliver for the company. The answer cannot be generalized, but will depend on the company in question: The goal may be to increase leads or measurable productivity, for example, or to increase company sales, or to deal with fewer customer complaints.

In contrast to levels one and two, however, a needs analysis is essential here, in order to define the indicators that the eLearning project is aiming to optimize. Without being able to compare performance directly with such indicators, it is not possible to evaluate an eLearning project.


Methods of obtaining meaningful data


The most effective way to measure learners’ satisfaction is to ask them. The same applies to the applicability of knowledge if neither indicators nor monitoring options are available. Digital surveys can help in this context and, depending on the capabilities of your software, can even be integrated directly into your learning management system. Ideally, you should conduct two surveys for each course—one directly after it is completed and another one to two weeks later. This will enable you to hear participants’ views when they have completed the course, but also to learn whether and how the online course has had an impact on their day-to-day work.

Make sure not to overwhelm your employees with too many questions; instead, formulate clear questions about issues that are really relevant to you. You can expect to receive particularly honest, and therefore meaningful, answers if the survey is conducted anonymously.
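Once anonymous answers come in, evaluating them is mostly a matter of aggregation. A minimal sketch, assuming a five-point rating scale (1 = poor, 5 = excellent); the question texts and scores are invented:

```python
from statistics import mean

# Minimal sketch: aggregate anonymous rating-scale answers per survey
# question. Because responses are anonymous, only the scores are stored,
# never who gave them.
responses = {
    "How relevant was the course content for you?": [4, 5, 3, 4, 5],
    "How easy was it to use the course?": [5, 4, 4, 5, 5],
}

for question, scores in responses.items():
    print(f"{question} -> average {mean(scores):.1f} (n={len(scores)})")
```

Tracking these averages per course and over time shows whether changes to a course actually improve satisfaction.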


Questions that may help you evaluate learner satisfaction

  • How relevant was the course content for you?
  • Are there any topics that you think the course should have covered but did not include?
  • How do you rate the amount of material covered?
  • Did you enjoy the course?
  • How challenging did you find the course (too easy / just right / too hard)?
  • Was the content sufficiently explained?
  • Did you find the course structure clear?
  • How easy was it to use the course?
  • Did you have any technical difficulties during the course?

Questions about the applicability of content

  • How helpful were the examples in the course?
  • How would you rate your competence in relation to the content covered?
  • Have you been able to apply your new knowledge to your everyday work?
  • Please describe a situation where you have applied your new knowledge.
  • Name three key things/concepts that you have learned from this course.
  • List three things that you will change about your approach to your day-to-day work as a result of the course.

Tests within online courses

It’s easy to find out how much of your online courses’ content your employees have learned—at least if you have the right authoring tool. This will make it easy to check learning by incorporating different types of tests into your online courses. Tests also help learners to embed knowledge they have just learned, to get a better idea of their progress, and to gain an overview of learning outcomes. Knowledgeworker Create, for example, allows you to integrate the following test options at any point in your courses.


  • Multiple-choice tests
  • Gap-fill exercises
  • Open questions
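The three test types differ in how they can be scored. The sketch below is an illustration of that difference, not a real authoring-tool API: item fields and the `grade` helper are invented for this example.

```python
# Illustrative sketch: the three test types as simple data structures,
# with automatic grading where it is possible. Open questions cannot be
# auto-graded and must be reviewed by a trainer.

def grade(item, answer):
    """Return True/False for auto-gradable items, None for open questions."""
    kind = item["type"]
    if kind == "multiple_choice":
        return answer == item["correct_option"]
    if kind == "gap_fill":
        # Tolerate case and surrounding whitespace in typed answers.
        return answer.strip().lower() == item["solution"].lower()
    if kind == "open_question":
        return None  # needs manual review

items = [
    {"type": "multiple_choice", "question": "...", "correct_option": 2},
    {"type": "gap_fill", "question": "...", "solution": "Kirkpatrick"},
    {"type": "open_question", "question": "..."},
]

print(grade(items[0], 2))               # True
print(grade(items[1], " kirkpatrick ")) # True
print(grade(items[2], "free text"))     # None
```

Because only the first two types produce scores automatically, they are the ones that feed directly into the learning analytics discussed below; open questions add qualitative depth instead.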


Learning analytics: Using performance indicators

Just as the performance of websites can be evaluated, eLearning has its own performance indicators—known as learning analytics—that enable you to evaluate learning processes and outcomes. Course evaluation has numerous advantages: It helps you to optimize your company’s training setup where necessary, and to identify and address skills gaps; it also provides information about the return on your investment and where to deploy your resources to the greatest effect.

Learning analytics also provide basic data to enable you to personalize and customize learning processes. So instead of offering the same rigid learning paths to all your learners, you can use the data automatically—i.e., without manual intervention—to offer the precise learning path that will suit each individual employee, based on their user behavior. Where employees’ user data highlights gaps in their qualifications, for example, this enables you to offer them the opportunity to repeat courses. Conversely, employees who can demonstrate that they have sufficient knowledge can skip refresher courses. In this way, learners perceive content as being even more relevant and you save on resources.
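The repeat-or-skip logic described above can be reduced to a simple rule. A minimal sketch, with invented threshold values; a real system would base these on the course and role in question:

```python
# Minimal rule sketch of an adaptive learning path: use a learner's test
# score (in percent) to decide the next step automatically. The two
# thresholds are invented for illustration.

REPEAT_BELOW = 60   # below this score, the course is offered again
SKIP_ABOVE = 90     # above this score, the refresher can be skipped

def next_step(score):
    if score < REPEAT_BELOW:
        return "repeat course"
    if score > SKIP_ABOVE:
        return "skip refresher"
    return "continue learning path"

print(next_step(45))   # repeat course
print(next_step(75))   # continue learning path
print(next_step(95))   # skip refresher
```

Even this crude rule captures the two benefits named above: weaker learners get targeted repetition, and demonstrably knowledgeable ones save time.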


The bottom line

Monitoring the success of eLearning, including evaluation, is as natural a part of the training process as the design and delivery of online courses. Without it, you will never find out how successful your digital training actually is: Skills gaps will go unnoticed and you will not know whether learning outcomes have been achieved.

However, the decisive argument for evaluating the success of courses is at a higher level: Every investment your company makes in training measures must be worthwhile and must contribute to the goals you have set. So you should consider monitoring and evaluating your training products as less of an obligation and more of an opportunity to deliver a measurable increase in the value of your business through successful knowledge transfer.

Magda Lehnert | Blogger




