Measuring impact is one of the 3 main challenges (needs, transfer, measurement) to demonstrating that leadership development pays off, as identified in the article 'How to make leadership development effective'. It is possible to measure impact, and in this article we provide practical tips on how to improve the measurement of participants' learning.
The ultimate proof of the impact of leadership development programmes lies in their influence on pre-defined tangible results like customer/user satisfaction, efficiency, quality or profitability. This is often thought to be a step too far as there are so many factors that can influence these outcomes.
This problem has been partly solved by the widespread adoption of Kirkpatrick's framework. Kirkpatrick describes 4 levels of results, each building on the one before and each more important than the last: Reaction has to be positive, supporting effective Learning, which leads to changes in Behaviour, which in turn drive Results.
Smile-sheets, with questions like 'would you recommend the programme to a colleague?', usually filled in immediately after a session, give data on participant reactions, i.e. level 1. Many practitioners are content with results at this level. They follow a standard market logic: if customers are satisfied with the product, they will probably recommend it to others and buy other products as well.
The main problem with traditional smile-sheets is that they do not predict learning. Two meta-analyses looking at 150 studies have shown there to be no correlation between level 1 Reaction scores and level 2 Learning outcomes.
All is not lost. Thalheimer, a leading researcher in the area, has shown that smile-sheets can be improved with a few well-designed questions focusing on what the participant believes they have learned, their motivation to apply that learning, and the support they expect when they try to apply it.
One key tip is to change the answer options from agree/disagree scales to statements that relate more closely to the learner's context.
The data such questions give is still limited to the learner's reactions, but the learner's motivation to apply the learning (question 2) does predict actual application, and a self-assessment of the support the learner can expect (question 3) is relevant to whether the learning will have an effect.
Question 1 (above) is as good a question as possible for testing the participant's beliefs about what has been learned, but we know that learners are overly optimistic and know little about how learning works. It is therefore important to move to level 2 and test remembering and understanding a few days or more after the learning has taken place. Such a test of knowledge and retention is straightforward and easy to apply digitally.
Reacting positively and retaining Learning is good, but not enough. Behaviour has to change if Results are to be created.
The process below is a simple approach to using a 360° test and retest to measure whether the individual leader changes their behaviour as a consequence of participating in a leadership development programme.
The secondary benefits of running such a process, higher expectations and more support, are just as important as the solid measurement itself. The stakeholders involved in the anchoring step include people the participants see as important. These stakeholders can set expectations that influence participants' behaviour, increasing their motivation to get whatever they can out of the leadership development programme.
Involving more stakeholders in going through the results of the 360° tests and retests ensures the participants receive the support they need to learn and to apply that learning. In combination, higher expectations and more support increase the effectiveness of the leadership development programme.
The test-retest process gives solid data on whether leaders have developed and changed their behaviour as a result of a development programme. The data and graphs are very helpful in communicating about the programme to stakeholders.
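As a minimal illustration of the comparison involved, the sketch below averages hypothetical 360° ratings per behaviour before and after a programme and reports the change. The behaviours, raters and scores are invented; this is not the tooling or report format used in the programmes described here.

```python
# A minimal sketch (hypothetical data) of comparing 360-degree test and retest ratings.
from statistics import mean

# Ratings on a 1-5 scale, collected before ("test") and after ("retest") the programme.
ratings = {
    "Gives clear direction":    {"test": [3, 3, 2, 4], "retest": [4, 4, 3, 4]},
    "Listens before deciding":  {"test": [2, 3, 3, 3], "retest": [3, 4, 4, 3]},
    "Follows up on agreements": {"test": [3, 2, 3, 3], "retest": [4, 3, 4, 4]},
}

for behaviour, scores in ratings.items():
    before, after = mean(scores["test"]), mean(scores["retest"])
    print(f"{behaviour:26} {before:.1f} -> {after:.1f} (change {after - before:+.1f})")
```

In practice the same comparison would be run per rater group (boss, colleagues, reports) and presented graphically, as described above.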
The discussions between a leader and their own boss, colleagues or reports ensure the leader also receives feedback about issues the test does not manage to address. However well-designed, a 360° test read-out doesn't capture everything a colleague might want to say to a leader.
Reliably measuring whether changes in behaviour impact tangible results like more efficiency, stronger innovation or happier colleagues is very challenging. In our article 'How to make leadership development effective' we made the point that so many factors influence efficiency, or any other major result, that organisations find attributing impact simply too difficult.
That is true. But there is hope.
We have 2 suggestions: the first is based on a survey, the second on projects.
Asking stakeholders which factors have contributed to a particular Result, and to what extent, gives a qualitatively informed assessment. It isn't scientific proof, but it is a strong indication from the internal experts the organisation trusts. Asking the same people who were involved in the 360° test-retest process ensures that they have the insight necessary to judge whether the leadership development programme has contributed significantly.
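To make the idea concrete, the sketch below shows one hypothetical way such responses could be summarised: each stakeholder splits 100% of an observed improvement across contributing factors, and the shares are averaged. The factors and numbers are invented for illustration and are not taken from the mock-up discussed next.

```python
# Hypothetical sketch: averaging stakeholder attributions of an improvement.
from statistics import mean

# Each stakeholder splits 100% of an observed improvement across the factors
# they believe contributed to it.
responses = [
    {"Leadership programme": 30, "New IT system": 40, "Market conditions": 30},
    {"Leadership programme": 25, "New IT system": 50, "Market conditions": 25},
    {"Leadership programme": 40, "New IT system": 35, "Market conditions": 25},
]

for factor in responses[0]:
    share = mean(r[factor] for r in responses)
    print(f"{factor:22} average attributed share: {share:.0f}%")
```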
Each survey has to be designed to fit the specific context. The mock-up below gives you an idea:
To make it easier for you to adapt and apply this method in practice, these are the principles the mock-up is based on:
Some leadership development programmes include real-life projects where the participant can try out new learning and behaviours. These projects can be of considerable value. As an example, in a programme in a national Post Office each leader’s programme-related project had an average value of €20,000. With 25 participants, the programme’s projects for each cohort added half a million euros in value.
Most of these projects would not have been run without the programme. The project value is not a measure of the impact of the leader's change in behaviour, but it does give one answer to the question of how to measure the impact of leadership development programmes. The measurable ROI for running a cohort through the programme was 230%, a very handsome return for the Post Office.
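For readers who want to see how such a figure can arise, here is a minimal sketch of the arithmetic. The project value and participant count come from the example above; the programme cost is not stated, so the figure used below is purely an assumption chosen to illustrate how a return of roughly 230% is reached.

```python
# Figures from the article: EUR 20,000 average project value, 25 participants per cohort.
avg_project_value = 20_000
participants = 25
total_value = avg_project_value * participants           # EUR 500,000 per cohort

# The programme cost is NOT given in the article; EUR 150,000 is an assumption
# used only to show how a return of roughly 230% can arise.
assumed_programme_cost = 150_000
roi = (total_value - assumed_programme_cost) / assumed_programme_cost

print(f"Total project value per cohort: EUR {total_value:,}")
print(f"ROI with the assumed cost: {roi:.0%}")           # about 233%
```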
Another example of applying the project approach can be seen in the first-100-days segment. Approaches in this segment address the challenges and potential a manager faces in a new role.
The digital coach ELLA is an example of a product in the first-100-days segment. ELLA helps organisations provide the kind of support managers need to make a success of their new roles within 100 days.
ELLA encourages users to treat their first 100 days as a project, evaluating their progress and thinking through next steps every week. The key deliverable is Results by Day 70, a milestone that reflects best practice for a manager succeeding in a new role.
The early win delivered by Day 70 does not have as high an average value as the projects run in traditional leadership development programmes, which are driven by established managers over more than 6 months. Early wins do, however, establish the new manager both in their role and with their team. Together these set the platform for delivering far more value than a single limited project produces.
The short-term ROI is easy to calculate. An early win can, for example, have a value of €5,000. Given that ELLA costs about the same as a mobile phone, the ROI works out at roughly 400%.
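As a back-of-the-envelope check, and assuming a cost in the region of €1,000 for illustration (our assumption; the article only says 'about the same as a mobile phone'), the arithmetic looks like this:

```python
early_win_value = 5_000   # EUR, example value from the article
assumed_cost = 1_000      # EUR, our assumption standing in for "about the same as a mobile phone"

roi = (early_win_value - assumed_cost) / assumed_cost
print(f"ROI: {roi:.0%}")  # 400%
```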
It is possible, with limited effort, to improve methods for measuring the impact of leadership development. This article has pointed to solid recommendations at all 4 levels: Reactions, Learning, Behaviour and Results.
If you want to know more about the first-100-days segment, please check out our e-book.
Organisational psychologist with an MBA. Broad top management experience spanning many industries, functions and countries, including 10 years with corporate responsibility for HR. Extensive experience as consultant in private, public and voluntary sectors. A number of board positions in the education and culture sectors. Started career as counsellor for drug and alcohol abusers.