May 29, 2025

Let's be honest about impact

Vicki Culpin, Anna Odumodu, Marcela Paz, Leah Henderson

Why is proving learning impact so difficult – and how can we make it easier?

When you can define impact, you get better at creating it

One of the big issues with measuring impact is that it means different things to different people, even within the same organization. As Dr Vicki Culpin, Professor of Organizational Behavior at Hult International Business School, explained: “What that means is we spend a huge amount of time and energy trying to define it” – rather than getting on with it practically. 

Taking a step back, she said we can think of impact as a ‘positive difference’ that could happen at the individual, team, or organizational level. By asking what that looks like at each level, you can get specific about which changes are most important, how they will manifest, and how they can be measured. For example, what does ‘better decision-making’ mean? Faster decisions? More risk-taking? What change will people see in the organization – in behavior, projects, collaboration, or innovation?  

Anna Odumodu, VP Custom Solutions at Hult Ashridge Executive Education, said:

“The better you can define the change you want to see, the more it can be integrated into a solution to make the learning relevant.”

This will allow you to make conscious choices from the beginning of an intervention or partnership. She also emphasized the importance of aligning the desired learning outcomes with strategic objectives of the organization. 

Don’t let your assumptions get in the way of impact measurement

Dr Culpin highlighted the importance of testing assumptions around impact – “Otherwise our measures of impact will be driven by the assumptions we make rather than the impact we’re actually having on the organization.” She shared an example from her research showing that virtual delivery was as powerful as face-to-face delivery in learning transfer (despite the common misconception that the opposite is true). Her key advice:

“Don’t assume that absence of data is data of absence.”

A learning intervention can create many ripples of change. If your measures don’t show impact, you may not be looking in the right place. And although you can’t measure everything, think about what else you could look at in terms of wider impact. “Go beyond your set of assumptions because that’s when you’ll really see the difference that programs make.”

Engage your people from the start

Marcela Paz, Senior Learning Consultant at Hager Group, drew attention to the connection between learning and change.

“If we treat every learning program as a change process, then we know that the first thing we need to do to capture impact at the end is raise awareness – why is this important for the learner, team, or business?”

From here, we can build motivation. “When you have this mindset and you are prepared psychologically, it’s easier to implement that knowledge in your real work or life.” 

Engagement sets the foundation for change – but it isn’t enough on its own

Paz described engagement as “the first indicator of acceptance and psychological safety in the learning environment. It sets the foundation for change.” 

Dr Culpin agreed that “memorability has to come first, but it doesn’t stop there; it has to go beyond that.” Remembering a session doesn’t necessarily mean there will be a change for the individual or organization. Experiential learning supports learning transfer, so while engaged participants matter, it is when participants experience discomfort in a psychologically safe environment that behavior change is best supported.

Impact measurement is a human process – that’s what makes it imperfect

“So often, we can sit back and try to design the perfect suite of measures,” said Dr Culpin. “You will never do that. Human beings by their nature are unpredictable.” The learning journey is influenced by individual circumstances. “We don’t behave like chemicals in a test tube: we will always have other things that influence our behavior and ability to learn and translate that back into action in the workplace.”  

Rather than investing time and energy in creating something perfect – accept the imperfection and work with the complexity.

Work with the complexity of the data

Elaborating on the complexities of impact measurement, Odumodu highlighted issues like participant bias, inconsistency among line managers, and the ‘logic dilemma’ of misreading correlations in the data as causation. “If a new policy has come in or there’s a new business unit leader, the upward trajectory score doesn’t all have to be allocated or isolated to L&D,” she explained. And how do you measure all the impact from one intervention?

So, there’s a lot of complexity – but with this understanding, the data you gather through impact measurement is still helpful information. For Odumodu, it’s about understanding these complexities and combining the information – whether it’s quantitative data, stories from your people, or the innovations they’re making – to start seeing patterns and relationships more clearly. 

Listen to the stories your people tell

“People want quick wins,” said Paz. While quantitative data is invaluable, it takes time and resources to gather. For Paz, being able to showcase the qualitative storytelling from participants to the business is a must – “the first-hand information from the learners about how it has been implemented and why it has been a success.” 

Their stories, along with projects and graduation events within a program, are all opportunities to gather impact data – and to invite senior leaders to hear it. “Senior sponsorship also helps to make learning ‘important’ in the eyes of the participant and grow the impact,” said Odumodu.

Make impact part of the learning journey

Dr Culpin highlighted that, often, impact can be lost if it isn’t captured from participants, even if they feel something has had a huge impact on them. At Hult Ashridge, Odumodu explained that the focus has been on integrating impact measurement into the learning journey so that there isn’t an additional ask of the participants, and so they can engage with their own change data.  

“Trust participants to be part of the impact journey,” she said. Integrating measurement into the learning journey captures data more effectively and enhances the learning experience for your people. 

What now?

1. Start at the end and work backwards

What do you want the end result to be? This comes back to your definition of impact: what are the positive differences you want to see at each level?  


2. Make learning part of strategic priorities and involve senior leadership

They’re the people who need to hear stories of impact – and their engagement elevates the learning in the eyes of participants.


3. Don’t assume absence of data is data of absence

If your measures don’t show impact, you may not be looking in the right place. Go beyond your assumptions and consider where else the ripples of change might appear.


4. Have patience and stay curious

While quick wins are important, sustained impact takes time. Manage expectations about when results can be fed back to the business, and build in dates for post-program measurements.


5. Don’t wait for perfection to start measuring

Impact measurement is a human process. Keep the complexities in mind, and just start.

Meet the experts

Vicki Culpin

Professor of Organizational Behavior at Hult International Business School

Vicki specializes in how to master leadership and resilience in turbulent times and has spent over 20 years researching the impact of well-being and memory. 

Vicki also researches in the field of adult pedagogy, specifically in relation to learning transfer. Vicki works with a range of clients from across the world, advising on leadership development, along with sharing her research findings.

Anna Odumodu

Vice President Custom Solutions at Hult Ashridge Executive Education

At Hult Ashridge, Anna brings over two decades of expertise in creating custom learning and development solutions that drive change and sustainable growth for organizations. Passionate about understanding and meeting client needs, her current focus is on measuring the impact of learning interventions. Drawing on 10 years as a civil engineer managing multi-million pound projects, Anna brings this rigor and structure to her L&D work. She holds an Executive MBA with Distinction from Bayes Business School.

Marcela Paz

Senior Learning Consultant at Hager Group

As Learning Consultant at Hager Group, Marcela brings over 15 years of experience as consultant and project manager for custom Learning & Development initiatives that drive growth and organizational transformation.

Specialized in human-centered development, she designs and leads end-to-end, high-impact learning strategies and projects that align with business needs and enable sustainable success.

Marcela holds a degree in Social Communication and Journalism, with specializations in Organizational Communication and Marketing Management, and certifications in instructional design, change management, conflict management, and behavioral assessment methodologies.

Watch the webinar on-demand

Watch our webinar on L&D impact – our panel discuss the pressures of proving it, creative approaches to tracking it, and the perennial challenge of understanding true learning impact.

We help leaders and organizations to change.