 
 

Learning analytics

How to use data to optimize learning outcomes

 

eLearning analytics are to digital learning what Google Analytics is to websites: data on usage and behavior are collected and then analyzed and evaluated. The aim, of course, is to enable you to optimize your products further in order to ensure the best possible learning outcomes. It is therefore essential to look at the facts and figures (learning analytics), both to provide the best possible employee development and to make your products as efficient as possible. Discover which data you absolutely have to measure, what uses you can make of them, and how you can use KPIs to demonstrate the success of a training product.

 

What does “learning analytics” mean?

A definition

“Learning analytics refers to the interpretation of a wide range of data produced […] and gathered […] in order to assess academic progress, predict future performance, and spot potential issues.” (Horizon Report 2012). Accordingly, the term “learning analytics” always describes a two-part process: 

  1. The gathering of data.
  2. The analysis of data with the aim of adding value to a company’s skills development. 

 

eLearning software enables access to such data, making learner and user behavior transparent and thus measurable. Processes that take place out of sight in analog learning and are difficult to measure (if they can be measured at all) become visible in detail on electronic devices—far beyond obvious performance indicators, such as test results or course completion. Evaluation can accordingly be more complex: In addition to summative results evaluation (“learned or not learned”), learning analytics also enable formative evaluation, providing you with a wide variety of information and allowing you to make ongoing improvements to the learning process.

 

Why are learning analytics so important?

Determination of return on investment (ROI)

The aim of any deliberately initiated process of learning is to demonstrably increase knowledge or modify behavior. This starts with the development of learning objectives and ends with the review of such objectives. In a corporate context, however, learning outcomes are less idealized, and success is considered much more from a commercial point of view. In short: Is the investment in training paying off?

Data can be correlated with individual key performance indicators (KPIs) to enable a precise ROI to be identified at the end of the analysis. Numerous other data points can also be collected and evaluated in order to test specific hypotheses. 

Of course, it is helpful not to launch any other initiatives that could also affect your KPIs during the learning and evaluation period. Otherwise, it will no longer be possible to identify the source of any impact (a time saving, for example) with any reliability: your results will be distorted, and you might over- or underestimate the effectiveness of your training measures.

 

Data

  • Training costs
  • Number of training hours
  • Test results
  • Completion rates

KPIs

  • Increase in sales in %
  • Time saving in %
  • Error rate reduction in %
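
Taken together, these data and KPIs already allow a rough ROI calculation. The following is a minimal sketch in Python; all figures are invented, and how you put a monetary value on KPI improvements such as a sales increase is an assumption you have to make for your own business.

# Minimal ROI sketch. All figures are hypothetical; monetizing the KPI
# improvements (sales increase, time saving, error reduction) is up to you.

def training_roi(training_costs: float, monetized_benefits: float) -> float:
    """ROI in percent: (benefits - costs) / costs * 100."""
    return (monetized_benefits - training_costs) / training_costs * 100

costs = 20_000.00           # course production plus paid training hours
benefit_sales = 18_000.00   # estimated value of the measured sales increase
benefit_time = 7_000.00     # estimated value of the measured time saving
benefit_errors = 5_000.00   # estimated value of the reduced error rate

roi = training_roi(costs, benefit_sales + benefit_time + benefit_errors)
print(f"ROI: {roi:.0f} %")  # -> ROI: 50 %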
 

The most frequently measured data and their purpose


Target group information

  • Device: to check the need to optimize units for mobile learning
  • Location/country: to check for differences (correlating with learning-time data)
  • Selected content language
  • Time of day / day: often correlated with time spent learning or learning outcome
  • Mobile network access (indicator of remote learning) vs. WiFi (learning at work / while working from home)

Learner behavior data

  • Number of views of content/section/video: to check which content was assimilated at the first attempt (success) or as an indicator of overcomplicated/incomprehensible content
  • User click behavior: as an indicator of interests
  • How long videos are played for: to check whether they are viewed in full (potential savings for future videos, because these are particularly expensive)
  • Time required per section / media type: to determine preferences
  • Total time required: e.g., to compare target groups and calculate time spent studying
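
As an illustration, the following Python sketch turns raw tracking events into two of the indicators above: how much of each video is actually watched, and how much time is spent per section. The event structure and field names are assumptions for illustration, not a fixed schema.

# Sketch: aggregating raw tracking events into behavioral indicators.
# The field names (user, section, media_type, ...) are illustrative only.
from collections import defaultdict
from statistics import mean

events = [
    {"user": "u1", "section": "intro", "media_type": "video",
     "seconds_spent": 95, "video_length": 120},
    {"user": "u2", "section": "intro", "media_type": "video",
     "seconds_spent": 120, "video_length": 120},
    {"user": "u1", "section": "case_study", "media_type": "text",
     "seconds_spent": 180, "video_length": 0},
]

watch_ratios = defaultdict(list)     # share of each video that is played
time_per_section = defaultdict(int)  # total seconds spent per section

for e in events:
    time_per_section[e["section"]] += e["seconds_spent"]
    if e["media_type"] == "video" and e["video_length"]:
        watch_ratios[e["section"]].append(e["seconds_spent"] / e["video_length"])

for section, ratios in watch_ratios.items():
    print(f"{section}: average watch ratio {mean(ratios):.0%}")
for section, seconds in time_per_section.items():
    print(f"{section}: {seconds} s in total")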

Results data

  • Completion status (pending, started, in progress): to check for successful completion and allow reminders to be sent if necessary
  • Knowledge test score
    • via pre-testing (to enable products to be adapted to learners’ prior knowledge)
    • to evaluate learning outcomes (including the quality of the relevant learning content)
    • as a list of participants / proof of participation
  • Number of attempts required for tests: as an indicator of the quality of the relevant learning material
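
The number of test attempts in particular can easily be turned into an automatic quality check, as in the following sketch. The attempt counts and the threshold of two average attempts are invented for illustration.

# Sketch: flagging learning material whose tests need conspicuously many
# attempts. The threshold of 2 average attempts is an arbitrary assumption.
attempts_per_course = {
    "compliance-basics": [1, 1, 2, 1],
    "data-protection": [3, 4, 2, 5],
}

for course, attempts in attempts_per_course.items():
    average = sum(attempts) / len(attempts)
    if average > 2:
        print(f"{course}: {average:.1f} attempts on average - review the material")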
 

Observation of real-world skills development

Despite its commercial importance, ROI is only the tip of the learning analytics iceberg. Learning analytics also enable you to look at the complexity of developing real-world skills—an essential task, but one that, given the advancement of digitization and hybrid working models, also requires a digital solution.

Skills development is no longer merely a question of stocking up on knowledge, but of achieving higher levels of competence, such as solving specific problems or learning in a self-directed way.

 

Example: Teaching compliance at different levels


Level 1

Familiarity with compliance rules


Level 2

Ability to evaluate violations of compliance rules 

 

Whereas level 1 (naming, repeating, testing) can be taught and analyzed with little effort, level 2 requires a much more complex process. Compliance rules must be firmly embedded in learners’ understanding and must also be communicated, discussed, reflected upon, and tested with regard to specific situations in terms of theory and/or practice. 

The higher the achievement level required and the freer and more self-directed the learning process, the harder it is to monitor learning success. It’s only logical: Whereas success at level 1 of our example can easily be determined by summative results monitoring (“learned or not learned”), the evaluation of level 2 requires further data. The question is therefore: What information can be collected to examine skills development and optimize learning outcomes?

 
 

Collecting data

Step 1: Learning analytics

Once you have defined the objectives and assumptions of your learning scenario, you can deduce which data are required to check the presumed interdependencies. But how can these data be collected?

Digital content in SCORM, xAPI, and cmi5 format is particularly suited to the collection of data on learning activities. All online courses created in accordance with these eLearning standards have the necessary technical capacity to measure learning activity and store it in the learning management system.
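
With xAPI, for example, each learning activity is recorded as an “actor - verb - object” statement. A minimal statement could look like the following Python dictionary; the learner, course URL, and score are made up for illustration.

# A minimal xAPI statement as a Python dict: who (actor) did what (verb)
# with which learning object, and with what result. The IDs are invented.
statement = {
    "actor": {"name": "Jane Doe", "mbox": "mailto:jane.doe@example.com"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/compliance-basics",
        "definition": {"name": {"en-US": "Compliance Basics"}},
    },
    "result": {"score": {"scaled": 0.85}, "success": True, "completion": True},
}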

 

In addition to digital data, analog parameters such as book chapters that have been read can also be recorded. All you need to do to evaluate learning as a whole is to digitalize and store these in a central database together with all other relevant data. Ideally, this should be done via what is known as a “learning record store”, which allows data from a wide variety of sources to be centrally stored, processed, and analyzed. In addition to data processing, popular learning record stores (such as Learning Locker) also offer options for evaluating and visualizing data in individual reports.
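
If your learning record store exposes the standard xAPI interface, storing such a statement is a single HTTP request. The following sketch uses the Python requests library; the endpoint URL and credentials are placeholders for those of your own LRS.

# Sketch: sending an xAPI statement to a learning record store (LRS).
# Endpoint and credentials are placeholders, not a real service.
import requests

LRS_STATEMENTS_URL = "https://lrs.example.com/xapi/statements"

statement = {
    "actor": {"mbox": "mailto:jane.doe@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed"},
    "object": {"id": "https://example.com/courses/compliance-basics"},
}

response = requests.post(
    LRS_STATEMENTS_URL,
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},  # required by the xAPI spec
    auth=("lrs_key", "lrs_secret"),                 # placeholder credentials
    timeout=10,
)
response.raise_for_status()
print("Stored statement ID:", response.json()[0])   # the LRS returns a list of IDs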

 

Use data wisely

Step 2: Learning analytics

As the explanations show, learning analytics can be used to collect vast amounts of data. The real challenge is to evaluate the data in a meaningful way and to draw conclusions about skills development and learning outcomes. The basis for this is your knowledge of the internal features of your company that determine the quality of your in-house personnel development. Just as it is important to define KPIs, it is necessary to identify criteria that enable you to describe these features. Although this sounds quite theoretical, it can be better understood from the following three perspectives:

 
 

The learners’ perspective

Learners’ main interest in eLearning is the usability of the learning platforms and the user experience offered by the learning products. The goal here is therefore a high level of user-friendliness, for example through

  • saving learning progress,
  • displaying current status (new, completed, in progress) and
  • displaying learning outcomes (successfully completed, test results, overview of certificates, etc.)
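
To provide these features, the course player only has to store a small amount of state per learner and course. Here is a sketch of what such a record could look like; the field names and status values are assumptions for illustration.

# Sketch: minimal per-learner state behind the usability features above.
# Field names and status values are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CourseProgress:
    course_id: str
    status: str = "new"                  # "new", "in progress", "completed"
    last_section: Optional[str] = None   # where to resume
    test_result: Optional[float] = None  # e.g., 0.85 for 85 %
    certificates: List[str] = field(default_factory=list)

progress = CourseProgress(course_id="compliance-basics")
progress.status = "in progress"
progress.last_section = "chapter-2"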

 

Learner satisfaction has a direct impact on learning outcomes and can be increased through optimized usability. The user experience cannot be seen directly from the learner’s behavior, but is determined by means of a survey taken immediately after engagement with the content.

 

The perspective of content authors and eLearning designers

Data such as length of visit, pages viewed, and test attempts provide content authors and instructional designers with information about the quality of teaching, aesthetics, functions, and content provided. Test results relating to a specific video, for example, can provide information on whether the video conveys the information in the best way or whether certain aspects require further optimization.
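
One simple way to test such a hypothesis is to correlate how much of a video learners watched with their scores in the follow-up test, as in the sketch below. The values are invented, and statistics.correlation requires Python 3.10 or newer.

# Sketch: does watching more of the explainer video go hand in hand with
# better test results? The values are invented for illustration.
from statistics import correlation  # Python 3.10+

watch_ratio = [0.4, 0.6, 0.8, 0.9, 1.0, 1.0]  # share of the video played
test_score = [55, 60, 78, 85, 92, 88]         # score in the follow-up test

r = correlation(watch_ratio, test_score)
print(f"Pearson r = {r:.2f}")  # values near +1: the video carries the content well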

 

The perspective of personnel development and the company

Of course, from a corporate perspective, training is successful if individuals’ learning outcomes lead to an increase in turnover. The measurable value of training can be determined by comparing the training costs and related data, such as the duration of training (downtime costs), with KPIs such as the increase in sales.

However, it is important not to disregard achievements that may not be measurable in financial terms, but that contribute to the development of employees’ skills. A variety of data are used for this purpose, for instance 

  • device used 
  • learning location
  • prior knowledge
  • click behavior
  • playing time for videos, etc.,
     

in order to determine individual preferences and prior knowledge and thus to adjust future content.

This promotes the most effective development of individuals’ skills and, ideally, shortens time spent learning. Thus the data not only help to develop skills, but also reduce personnel costs resulting from downtime.

 

The bottom line

In order for companies to gain added value from the many different types of data, they need a system that can recognize and exploit potential, and perhaps also a little imagination. Companies wanting to develop the knowledge and skills of their employees over the long term therefore need to safeguard the learning outcomes of their digital learning processes through systematic evaluation and employee feedback as a fixed part of their digital program. Learning analytics also help you identify the strengths and weaknesses of your in-house digital training and improve them so as to enhance skills development, with no additional effort for your learners.

 
Magda Lehnert | Copywriter
 
 

Title image: Gorodenkoff/shutterstock.com