When you get a group of learning professionals together, there are a few topics of conversation you can be fairly certain will come up:

  • which new methods, emerging tools and evolving practices are worth pursuing
  • how to manage constraints (both time and financial)
  • the inevitable discussion about how we get a “seat at the table”, i.e. how we stop being viewed as a line item and start being seen as a key part of organisational success

The reality is, those topics are all really the same topic – at the end of the day, they all relate to the real-world impact of learning.

We look for the emerging tools and approaches that will best support real learning.  

We face time and budget constraints because our resources hinge on how the business is doing, sometimes on how L&D is perceived as contributing, and sometimes simply on budgets handed down from on high that we have no say in.

If we have an impact on performance in the organisation – and can define, measure and communicate it – it helps validate our innovative methods, informs how best to apply our resources, and puts us in a position to negotiate for what we need.

It also puts us solidly in a seat at the proverbial table.

Show Me the Money

We need a plan; and not just for how to collect and analyse our data, although we do need that too.

If we are going to show the ROI of our learning activities, we need to know the business motivations behind those activities.  

This may seem obvious, but it’s surprisingly common for stakeholders to request courses or learning data without any reference to where and how these fit in with organisational strategy and goals.

If that’s not defined at the outset, it’s going to be pretty difficult to demonstrate value!

Understanding “Why”

Knowing the intended business outcomes for a learning activity allows us to determine the relevant performance metrics, and to map out potential correlations between performance improvements and learning experiences.  

This in turn helps define what the evidence trail will be: the data that will demonstrate whether goals were met, to what degree, by whom, and correlated with which activities or resources.

If we are really looking at performance impacts, not all of the data we need will be found under the charge of L&D. This makes things more complicated, but it’s well worth the effort: the performance data is where we can explore the impact of learning on organisational KPIs.

Quantitative data, analysed well, is a powerful tool for demonstrating the real world impacts of L&D activities.

Whether we are looking at a 20% reduction in time to competency for new hires, improved customer satisfaction with tech support, or decreased infection rates in a hospital, learning activities can have a significant impact on the bottom line for a business.
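To make the arithmetic concrete, a back-of-the-envelope ROI calculation might look like the sketch below. Every figure is hypothetical, chosen purely to show the shape of the calculation:

```python
# Back-of-the-envelope ROI for a training programme.
# All figures are made up for illustration, not data from a real programme.

programme_cost = 40_000  # design, delivery and learner time (hypothetical)

# Hypothetical benefit: 50 new hires reach competency 2 weeks sooner,
# and a fully competent hire is worth ~1,000 more per week than a trainee.
hires = 50
weeks_saved_per_hire = 2
value_per_week = 1_000
benefit = hires * weeks_saved_per_hire * value_per_week  # 100,000

roi = (benefit - programme_cost) / programme_cost
print(f"ROI = {roi:.0%}")  # (100,000 - 40,000) / 40,000 = 150%
```

The hard part, of course, is not the division – it’s defining and defending the benefit figure, which is exactly why the business outcomes need agreeing up front.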

Can’t Buy Me Love

Measuring learning impact isn’t always straightforward, however. Beyond the challenges involved in gaining access to performance data, there are some learning impacts that are difficult to measure in an immediate, direct manner:

One can’t fully measure what was avoided. We can estimate the benefits of preventing accidents at a chemical plant or cyber attacks, but we might never know how many such incidents were avoided or how severe their impacts might have been.

Complex learning that involves changing habits or processes does not necessarily provide immediate performance metrics. Measures taken immediately after a learning event are only part of the picture; they don’t demonstrate long-term impacts.

It is difficult to measure impact when a performance activity is part of a larger whole. Infection-reduction efforts in a medical facility will likely have cumulative effects, and it may prove challenging to quantify the impact of a single learning event within the overall ecosystem.

It’s been said that “what’s measured, matters”, but there is information that numbers can’t provide; in complex situations we need to rely on qualitative data as well as quantitative.  

Whether it be user interviews or text analytics from course submissions, it’s the qualitative data that gives a window into how learners have really responded to what they’ve learned:

  • Do they have concrete plans to carry things forward?  
  • Do they have organisational or resource related barriers to success? 
  • Are they looking for more resources and support to help them follow through?
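A very simple first pass at that kind of text analysis can be sketched in a few lines. This toy example just tags free-text responses against the three questions above using keyword matching – the responses and keyword lists are invented, and real text analytics would use proper NLP tooling rather than substring matches:

```python
# Toy qualitative coding pass over free-text course feedback.
# Responses and keyword lists are invented for illustration only.

responses = [
    "I plan to run a pilot with my team next sprint",
    "I'd like to apply this but we don't have budget for the tooling",
    "Not sure this applies to my role",
    "Already started using the checklist; could use more examples",
]

# Hypothetical coding scheme mapping themes to indicative phrases
codes = {
    "concrete plan": ["plan to", "next sprint", "started using"],
    "barrier": ["don't have budget", "no time", "not allowed"],
    "wants support": ["more examples", "more resources", "could use"],
}

counts = {code: 0 for code in codes}
for text in responses:
    lowered = text.lower()
    for code, keywords in codes.items():
        if any(keyword in lowered for keyword in keywords):
            counts[code] += 1

print(counts)
```

Even a crude count like this surfaces patterns worth a closer read; the real value comes from going back to the flagged responses themselves.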

Improving the ROI of Learning

So how can you ensure your learning interventions deliver useful performance metrics every time?

First up, the next time you commission a course or training programme, ensure that you’ve taken time to really consider its purpose and exactly what impact it should have on the business.

By first defining what you’re trying to achieve, you’ll be in a better position not only to measure it, but to design your programme to produce metrics in the most appropriate format.

Secondly, remember that performance isn’t always best measured in numbers, and investment isn’t just measured in dollars.

Using qualitative analysis techniques to measure learners’ cognitive presence, for example, you can identify behavioural change (or understanding) that might not show up in quantitative measures.

Additionally, leadership programmes may be developed to influence a change in organisational culture, and by analysing the way people talk you can identify whether that change has taken place.

We’ll be sharing examples of how we’ve worked with organisations to look at the bigger picture impact of learning in a series of future posts.  If you’re interested in getting started already, get in touch – we’d love to hear from you.