
Why measuring learning impact is more important than ever

May 2, 2017

Learning professionals should be playing a vital role in helping people and organisations thrive in times of change. However, to do this, they must be agile. This article, by LEO’s Imogen Casebourne and Gareth Jones, and Watershed’s Andrew Downes, explains the importance of measuring the impact of learning on business.

The world is moving and changing ever more quickly. As a result, organisations, and the individuals who work for them, are finding themselves having to adapt to constant change. Learning and Development (L&D) departments should be playing a vital role in helping people and organisations thrive in times of change. However, to do this, they must themselves be nimble and agile – ready to try new methods and tools and to discard them quickly if they prove ineffective. Unfortunately, this is not always the case. While marketing departments have moved to using research, testing and impact measurement to select the most effective approaches and to demonstrate the value of the marketing spend at board level, L&D has not yet followed far down the path of measuring learning impact.

Proving value with learning analytics

Learning analytics is a powerful new tool in the learning and development ‘toolbox’. Being able to see quickly the impact of different approaches enables designers to experiment with new methods and assess how well they meet changing needs.

The increasing demand for better evaluation and analytics in learning has largely been driven by better data in other parts of the business. In particular, much like learning, the marketing function has moved from being a ‘black box’ where investments had an unknown impact to an area where organisations can very clearly and immediately see the direct benefits of specific investments.

As a result, marketing departments have moved from a ‘spray and pray’ approach, to a much more evidence-driven strategy where they can predict the impact of their initiatives. Marketing has gone from one of the first things to be cut in a downturn to one of the last.

Now it’s the turn of L&D to go through a similar transition…

Winning a seat at the boardroom table

At Watershed, LEO’s partner company and learning data analytics specialist, clients are keen to implement learning analytics to see which learning resources and platforms are being used, and by whom. In organisations where learning happens everywhere, there’s an urgent need to bring records of both formal and informal learning together in one place, ideally viewed through visually effective dashboards.
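
As a concrete (and deliberately simplified) illustration of what bringing learning records together in one place can look like, the sketch below sends a single activity record to a Learning Record Store using an xAPI-style statement – the kind of data format platforms such as Watershed typically work with. The endpoint URL, credentials, learner details and course identifier are all invented placeholders rather than a real configuration.

```python
# Minimal sketch: recording one learning event in a Learning Record Store (LRS).
# The LRS URL, credentials, learner and course below are illustrative placeholders.
import requests

LRS_URL = "https://lrs.example.com/xapi/statements"  # hypothetical endpoint
AUTH = ("lrs_key", "lrs_secret")                     # placeholder credentials

statement = {
    "actor": {"name": "Jane Doe", "mbox": "mailto:jane.doe@example.com"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-GB": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/leadership-101",
        "definition": {"name": {"en-GB": "Leadership 101"}},
    },
}

response = requests.post(
    LRS_URL,
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()
print("Statement stored:", response.json())
```

Recording formal course completions and informal activities in the same consistent way is what makes it possible to view them side by side on a single dashboard.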

If you don’t know who’s using the resources and experiences you create, pay for and curate, then your approach to e-learning is very similar to that ‘spray and pray’ approach of old-fashioned marketing. It’s likely that at least some of your courses, campaigns and platforms either aren’t being used or aren’t working. But how do you know which elements are ineffective?

L&D teams in organisations like Visa are using learning analytics to assert a far more strategic role in the business. They are gathering data on the key learning moments that help to develop great leaders. This is enabling the L&D team to develop a better understanding of how these leaders are learning, how often, and from what sources.

As we all know from experience, many senior executives still treat learning and development as a cost. When we can demonstrate – using big data – that we know which capabilities to fine-tune in order to achieve business goals quicker, then we will be invited to take a seat at the boardroom table.

Proving the value of learning isn’t the whole story; as learning professionals, we already know that learning works. Learning analytics gets really interesting when we not only prove, but also improve, learning. This is possible with more detailed data and analysis that tells us not only whether our learning programmes, strategies and offerings are working, but also which elements and approaches are most effective. These detailed analytics can also be used to highlight practical and cultural blockers that need to be addressed.
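
To make the ‘improve, not just prove’ point more tangible, here is a minimal sketch of the kind of comparison that detailed learning data supports: which elements of a programme learners actually complete, and which are associated with better assessment scores. The column names and figures are invented for the example; in practice the data would come from your LMS or Learning Record Store.

```python
# Minimal sketch: comparing how effective different learning elements are.
# The records below are invented; a real export would come from an LMS or LRS.
import pandas as pd

records = pd.DataFrame({
    "element":    ["video", "video", "scenario", "scenario", "quiz", "quiz"],
    "completed":  [True, False, True, True, True, False],
    "post_score": [62, None, 81, 77, 70, None],
})

summary = records.groupby("element").agg(
    completion_rate=("completed", "mean"),
    avg_post_score=("post_score", "mean"),
    learners=("completed", "size"),
)

# Elements with low completion or weak post-assessment scores are candidates
# for redesign; strong performers suggest approaches worth reusing elsewhere.
print(summary.sort_values("avg_post_score", ascending=False))
```

Even this toy comparison shows how the conversation shifts from ‘did people do the training?’ to ‘which parts of the training are working?’.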

Real-world learning analytics

Watershed has lots of examples of this principle at work:

  • AT&T used learning analytics to prove that more engaging, higher fidelity compliance learning content was more effective than existing training. During the project, they were able to save 670,562 production hours and 160,380 employee course hours.
  • Medstar uses learning analytics to evaluate the effectiveness of training on clinical metrics. Almost immediately after launch, they were able to identify a problem with a particular step of their simulation app when they noticed a surprisingly high number of clinicians missing that step.
  • Nuance Communications uses learning analytics to report on completions of required training. The greater level of detail made possible with Watershed enabled Nuance to uncover instances where learners were trying to ‘game’ the system by first attempting an identical copy of the test embedded in the course, then taking the pre-test once they knew the right answers.
  • What’s wrong with Thursdays? Another client recently uncovered the fact that significantly more people drop out of courses on Thursdays than on any other day, and they are now exploring ways of rescheduling courses to reduce drop-out rates in future (see the sketch below).
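
The Thursday finding lends itself to a small worked example. The sketch below counts drop-out events by weekday; the learners, dates and column names are invented, and a real analysis would run over the full event data held in a learning analytics platform.

```python
# Minimal sketch: do drop-outs cluster on a particular weekday?
# Learners, events and dates are invented for illustration.
import pandas as pd

events = pd.DataFrame({
    "learner": ["a1", "b2", "c3", "d4", "e5"],
    "event":   ["dropped", "completed", "dropped", "dropped", "completed"],
    "timestamp": pd.to_datetime([
        "2017-03-02", "2017-03-03", "2017-03-09",
        "2017-03-16", "2017-03-17",
    ]),
})

dropouts = events[events["event"] == "dropped"]
by_weekday = dropouts["timestamp"].dt.day_name().value_counts()

# A pronounced spike on one day (here, Thursday) is a prompt to look at
# scheduling, workload or other practical blockers around that day.
print(by_weekday)
```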

Changing perceptions by measuring learning impact

Gathering learner data is a win-win situation. More than that, with big data making ever greater inroads into other areas of most organisations, L&D departments that do not embrace a more data-driven approach risk being viewed as old-fashioned and slow moving.

So what are you waiting for? We asked a range of organisations how they're measuring learning impact and found that while some had already started using learning analytics to prove value, others were being held back by a variety of factors. You can view the results of that research here.

Imogen Casebourne is LEO’s Director of Learning.
Gareth Jones is LEO’s Product Development Director.
Andrew Downes is a Learning and Interoperability Consultant at Watershed.
This article was first published on LEO’s website.


