Measuring the Effectiveness of Leadership Development
By Dr. Francis Eberle
Leadership development is worth the effort and time. According to a Lorman survey, retention rates rise 30-50% for companies with strong learning cultures. And 74% of surveyed employees feel they aren’t reaching full potential at work due to lack of development opportunities.
Particularly with workforces shifting, employees retiring, and younger employees taking over those roles, leadership development is a crucial need. It can help address the gaps created by these transitions, the fast pace of work, demands for effective and agile teams, and employees’ desires for advancement.
When you deploy leadership development training in your organization, it’s important to measure its effectiveness. Measurement doesn’t have to be difficult, and you don’t have to be a large company with many resources in order to see the impact of leadership development.
There are four common traps when it comes to measuring development programs:
- Can’t we just ask the participants? This is a common trap and what many companies have been doing all along. These types of questions typically measure whether the participants enjoyed the session, but not what they learned. And if you ask them what they learned, you may actually be measuring their level of guilt, instead of their level of learning, as they want to report that they learned something.
- Can’t we check with the participants a couple of months later and see if they are applying what they learned? This is similar to trap #1. If your supervisor asked you, how would you respond? Even if leaders are applying what they have learned, what companies truly want to understand is the impact. But it is hard for individual employees to see the extent of their own impact.
- Couldn’t we ask if it was a worthwhile program? Yes, but what does this tell you? Let’s say it was a challenging program and they didn’t have fun. Or perhaps they were in the middle of a huge high-risk project and they didn’t want to be there? Results from this line of questioning could be skewed.
- What about the range of responses? This can be a trap if, perhaps, only half of participants return the survey. Typically, the people responding to the survey are those with extreme opinions—either they are enthusiastic about everything, or they really didn’t enjoy the training. And in either case, you are gathering biased responses.
One additional measurement to consider is whether the training sticks after the program is over. Consider asking the participant’s direct reports and supervisors whether they are seeing a change in the participant’s behavior. What is the impact on the company? Can this program be used over and over with the same impact?
To answer these questions and more, think about what you want to measure. Here is a breakdown, in levels, of what you can evaluate in a leadership development program.
Level 1: Did they enjoy participating, and how satisfied were they with the program?
Level 2: Did they learn anything new?
Level 3: Are they using or showing new behaviors as a result of what they learned?
Level 4: To what extent did this program impact the business?
Level 5: What is the return on investment of the program for the business?
Fortunately, there are ways to measure all of these aspects. Most measurement today is of Levels 1 and 2. I will focus here on Levels 3-5 as they are less commonly measured. Measuring them requires more upfront time, but still only one survey. The steps below walk through the process. This is explained in much more detail in the book Measuring the Success of Leadership Development by Phillips, Phillips & Ray.
Step 1: Before the program starts, clearly determine what you want to accomplish or what change you want to occur as a result of the program. For example, “creating more effective teams” is good, but not clear enough. Something such as “creating teams that are actively engaged with each other and communicate well” is more specific.
Step 2: Ensure your leadership development program’s objectives are aligned with your outcome(s). If they don’t match, look for another program. Ask yourself if the duration of the program is enough to help you reach the outcome(s).
Step 3: Notify the participants that the program’s effectiveness will be evaluated three months after it is over. Also, let them know that the evaluation will include questions for the people they work with and the people who report to them.
Step 4: Three months after the program has ended, send one survey to the participants and to people who work closely with them. Three months is a reasonable time to see if their learning is stable and they are using what they learned.
The survey questions should ask about behaviors and structures, such as “Do I have more opportunities to speak?” or “I feel listened to when I do speak.” Ask both quantitative and qualitative questions so you can conduct some statistical analysis (effect size) and back it up with anecdotal data. You can also add interviews if you have time for a more detailed report. And if you have a cohort of people at the same level who did not participate, you could involve them as a kind of control group.
Step 5: Analyze the results and illustrate the extent to which behaviors changed, as reported by the participants and their reports. For the quantitative data, display the number of participants who changed a lot, the number who did not, and in what areas. Then conduct an effect-size analysis to see to what extent the change was caused by the program rather than by natural growth over time or by another program they attended. Add the anecdotal data as examples.
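If your team is comfortable with a bit of scripting, the effect-size calculation in Step 5 can be done with a short script. The sketch below, using only Python’s standard library, computes Cohen’s d (a common standardized effect-size measure) between participants’ survey scores and a control group’s scores; the ratings shown are hypothetical 1–5 Likert responses, not data from the article.

```python
from statistics import mean, stdev

def cohens_d(treated, control):
    """Cohen's d: standardized difference between two groups' mean scores."""
    n1, n2 = len(treated), len(control)
    # Pooled standard deviation across both groups
    pooled_sd = (((n1 - 1) * stdev(treated) ** 2 +
                  (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treated) - mean(control)) / pooled_sd

# Hypothetical 1-5 Likert ratings for "I feel listened to when I speak"
participants = [4, 5, 4, 3, 5, 4, 4, 5]   # attended the program
control      = [3, 3, 4, 2, 3, 4, 3, 3]   # same-level cohort, no program

d = cohens_d(participants, control)
# Rough benchmarks: 0.2 is a small effect, 0.5 medium, 0.8 large
print(f"Effect size d = {d:.2f}")
```

A large d suggests the program, rather than natural growth over time, is driving the change; pairing this number with the anecdotal responses makes the report much more persuasive.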
This approach focuses on whether participants are using their learned knowledge, changing their behaviors and impacting others. And for those wanting to go a step further with information about customers and sales numbers, you can add ROI to the leadership development program evaluation.
Measuring impact is doable with planning. You can target effectiveness, skill areas, behaviors, and whether participants are using what they learned. This approach allows you to improve the program so that it is better each time.
Header image by Ron Lach/Pexels.