Learning Metrics that Matter

    As an L&D professional, how will you demonstrate that your learning technology selection and implementation process delivers the Return on Investment (ROI) you expect? Your answer may incorporate learning metrics, which you can gather across your learner journeys. This is particularly likely if your organization relies heavily on data-driven insights to measure success.

    What are learning metrics?

    Let us first look at what exactly we mean by ‘learning metrics’. There are a number of types of ‘activity’ that can be tracked in your learning technology (whether you are using an LMS or an LCMS). Let’s discuss these in a little more detail. Afterwards, we will zero in on the training effectiveness metrics that might deliver the most useful insight to your organization.

    Groups and types of learning metrics

    You might consider dividing learning metrics into two groups, depending on whether you are measuring activity or training effectiveness.

    Activity metrics

    You might call ‘activity’ metrics those that are more focused on ‘functional performance’. These might include the following (a brief sketch of how a couple of them could be computed appears after the list):

    – number of learners enrolled on a course;

    – number of learners completing a course;

    – number of learners studying (and, separately, completing) a module;

    – rate of ‘abandonment’ of a course;

    – questions/tasks, etc, that are most commonly answered or completed incorrectly.
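
    To make these concrete, here is a minimal sketch in Python, using entirely hypothetical learner records, of how a couple of these activity metrics (completion count and abandonment rate) might be derived from data exported from an LMS or LCMS.

        from dataclasses import dataclass

        @dataclass
        class CourseRecord:
            learner_id: str
            completed: bool
            abandoned: bool  # started the course but explicitly dropped out

        # Hypothetical export of per-learner records for a single course
        records = [
            CourseRecord("a01", completed=True, abandoned=False),
            CourseRecord("a02", completed=False, abandoned=True),
            CourseRecord("a03", completed=True, abandoned=False),
        ]

        enrolled = len(records)                        # learners on the course
        completed = sum(r.completed for r in records)  # learners completing the course
        abandonment_rate = sum(r.abandoned for r in records) / enrolled

        print(f"Enrolled: {enrolled}, completed: {completed}, "
              f"abandonment rate: {abandonment_rate:.0%}")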

    Training effectiveness metrics

    These might include:

    – duration that a learner is focused on a course, module, content chunk, etc;

    – simulation scores/performance and retention of knowledge;

    – behavior change; and

    – business impact.

    Duration

    Why is the length of time a learner spends on a course worth measuring? From learning time you can gauge whether the learner was really engaged (or not). For instance, if they got through the course very quickly, this might suggest one of two things:

    a) that they know the subject inside out already; or

    b) they’re not really engaged and they see the course as a ‘box-ticking’ exercise.

    With access to granular-level data on the learner, you should be able to ascertain which scenario is more likely. One cause for concern would be if you work in a highly regulated industry and the training exists for compliance or regulatory reasons. In that case, more scrutiny may be needed on why the content is being completed particularly quickly, and you should consider the implications for your business.
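
    As an illustration, here is a minimal Python sketch, with hypothetical durations and an assumed minimum expected time, of how you might flag completions of a compliance course that look suspiciously quick and therefore warrant that extra scrutiny.

        # Hypothetical minutes each learner spent on a mandatory compliance course
        durations_minutes = {"a01": 42, "a02": 3, "a03": 55}

        # Assumed floor below which a completion looks like a box-ticking exercise
        expected_minimum = 15

        flagged = [learner for learner, minutes in durations_minutes.items()
                   if minutes < expected_minimum]

        print("Completions to review:", flagged)  # -> ['a02']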

    Simulation performance

    Simulations can give you extra insight into how the learner has engaged with the learning content, i.e. how much knowledge they have actually retained. Poor performance here might indicate that they had not really engaged or were struggling with the concepts in the learning. Perhaps this is also an opportunity to review the validity of the learning content in a particular area. You might decide to ‘tweak’ the content accordingly; revising it helps ensure that concepts are thoroughly understood and that the content has an impact on learners.

    Next we move on to the metrics that may inform the third and fourth levels of the ‘Kirkpatrick Model’. (Organizations may also pay attention to other methods of measuring training effectiveness, such as the Kaufman Model, the Anderson Model and the Brinkerhoff Method.)

    Behavior change

    You might consider the many ways in which you can set up measures, post-learning, to collect information on behavioral changes. One example might be to integrate feedback from learners via a ‘social’ informal learning platform within your organization. You might set up specific, tailored groups for your learners however you like, for instance around a particular area of learning. Individuals who have completed learning could engage in discussions on learning behaviors, and/or engage with ‘gamification’ by earning points or badges, etc, for specific performances. eXact learning has an informal, knowledge-based learning platform, ‘con-X’, that provides this functionality. The technology allows users to ‘rate’ knowledge-based content, which can be uploaded by group, business unit, function, etc.

    Business impact and KPIs

    Depending on how your organization tracks employee performance in general, individuals may have been set Key Performance Indicators (KPIs) that link to high-level company strategic objectives.

    There are many ways in which you might be able to track the achievement of these KPIs via the achievement of Learning Intentions (LIs). For instance, for sales teams, you could track improvements in the achievement of sales targets or pipelines after completion of sales elearning courses. There are many other examples of strategic objectives that you could measure via KPIs if they are related to specific elearning programmes, e.g. improvements in Service-Level Agreement (SLA) performance in Customer Service and/or in customer feedback. Naturally, you will need to ensure your KPIs are SMART.
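
    As a simple illustration, the Python sketch below compares a sales KPI before and after completion of an elearning programme. The figures and the naive before/after comparison are assumptions for the example, not a full evaluation method.

        # Hypothetical quarterly sales figures per rep, before and after the course
        pre_training_sales = {"rep1": 80_000, "rep2": 65_000}
        post_training_sales = {"rep1": 95_000, "rep2": 70_000}

        for rep, before in pre_training_sales.items():
            after = post_training_sales[rep]
            change = (after - before) / before
            print(f"{rep}: sales changed by {change:+.1%} after the course")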

    xAPI

    Now, let’s take a look at how learning metrics work in practical terms.

    While SCORM allows you to track the complete learner journey within learning objects, xAPI (the ‘experience’ API) allows you to measure further interactions. That is, xAPI tracks how the learner goes beyond the actual learning object and course itself. For instance, if you have a ‘social’ informal learning platform, xAPI allows you to track learner interactions with it, including participation in discussion boards, gamification elements such as badges earned, and so on. This gives you, as the L&D professional, a much ‘rounder’ perspective on how well the learner has interacted, and engaged, with the learning content.
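
    In practice, these interactions are recorded as xAPI ‘statements’ (actor, verb, object) sent to a Learning Record Store (LRS). The Python sketch below posts a simple ‘completed’ statement; the LRS URL and credentials are placeholders, not a real endpoint, and the third-party ‘requests’ package is assumed to be installed.

        import requests

        statement = {
            "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
            "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
                     "display": {"en-US": "completed"}},
            "object": {"id": "https://example.com/courses/data-protection",
                       "definition": {"name": {"en-US": "Data Protection Basics"}}},
        }

        response = requests.post(
            "https://lrs.example.com/xapi/statements",  # placeholder LRS endpoint
            json=statement,
            headers={"X-Experience-API-Version": "1.0.3"},
            auth=("lrs_user", "lrs_password"),          # placeholder credentials
        )
        print(response.status_code)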

    xAPI statements and flexibility

    xAPI tracking statements give L&D departments an advantage in that they are ‘customizable’ for the organization. This allows an organization to focus on collecting the activity or learning metrics that are relevant and of interest to it, rather than tracking training metrics aligned with a different organization’s needs.
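
    For example, the statement below (all identifiers are illustrative) records an organization-specific activity, a post in a discussion group, and attaches a custom context extension for the learner’s business unit, the kind of tailoring a generic completion record cannot capture.

        # A customized xAPI statement with an organization-specific verb and extension
        custom_statement = {
            "actor": {"mbox": "mailto:learner@example.com"},
            "verb": {"id": "https://example.com/xapi/verbs/posted",
                     "display": {"en-US": "posted"}},
            "object": {"id": "https://example.com/social/discussions/onboarding-tips"},
            "context": {
                "extensions": {
                    "https://example.com/xapi/extensions/business-unit": "Customer Service"
                }
            },
        }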

    ‘con-X’ and xAPI

    A screenshot of the analytics available via xAPI integrated with ‘con-X’, eXact’s informal-learning platform.

    The above figure shows training metrics extracted from the eXact learning LCMS. Here, we provide an example of how a ‘third-party’ Learning Record Store (LRS) can be integrated with eXact’s LCMS, so that tracking of content created in the LCMS is displayed on the eXact learning dashboard. This demonstrates the flexibility that xAPI offers for tracking behaviors across different, integrated systems.
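
    To give a feel for that integration, the Python sketch below queries an LRS for recent ‘completed’ statements, the kind of call a dashboard might make to populate its charts. The endpoint and credentials are placeholders, and the ‘requests’ package is assumed.

        import requests

        response = requests.get(
            "https://lrs.example.com/xapi/statements",  # placeholder LRS endpoint
            params={"verb": "http://adlnet.gov/expapi/verbs/completed", "limit": 10},
            headers={"X-Experience-API-Version": "1.0.3"},
            auth=("lrs_user", "lrs_password"),          # placeholder credentials
        )
        for stmt in response.json().get("statements", []):
            print(stmt["actor"].get("mbox"), "->", stmt["object"]["id"])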

    Views from the top on learning metrics integration

    Again, on a practical level, it seems that organizations still find it difficult, or remain uncertain about how, to fully embrace learning metrics as a way of measuring training effectiveness.

    According to a survey of Chief Learning Officers (CLOs), from CLO magazine (2019):

    “the vast majority of organizations are struggling with integration. Thirty-seven percent are manually generating metrics, 20 percent are automating metrics from their learning management system and 14 percent have no formal metrics in place”.

    This is surprising, considering how valuable learning metrics can be for assessing the effectiveness of elearning and its impact, not only on individual performance but also on overall strategic goals, as we have discussed above.

    However, according to the same CLO survey (gathered in 2018), “71 percent of learning professionals say their organization plans to increase its learning analytics capability over the next two years”.

    Perhaps the pandemic, and the resulting increase in remote working and remote learning technology, means that organizations have accelerated these plans. Currently (Q4 2020) there is a plethora of articles and studies in the elearning press forecasting the growth of learning metrics in 2020; however, most of these were published before the pandemic. This means we will have to wait and see whether the appetite for using learning metrics to measure training effectiveness really has accelerated.

    Advantages of ‘feedback’ from learning metrics

    Forward-thinking organizations are recognizing and taking advantage of the benefits that come from gaining ‘feedback’ (via learning metrics) on learning content effectiveness. Crucially, the intelligence gained by L&D teams via learning metrics means that content can be ‘tweaked’, revised and re-focused to achieve the desired effect on learners. Importantly, learning metrics allow this review to happen as quickly as the organization desires. The prospect of that re-alignment, and of any resulting organizational gains, has to be tremendously appealing.