Rule 11: Create a constantly flowing feedback loop – evaluate and measure.

Written by Alistair Gordon, 04 Jun 2020

Because small group coaching leadership programs run over a longer period of time, program designers have the ability to course correct, provided they know that course corrections are needed.

Our advice – measure everything, regularly.

None of this is rocket science, of course. But it requires advance planning, and it requires coaches to remind participants to complete surveys, and so on.

The feedback process needs to be communicated ahead of time as part of the program design.

Ethically, for example, if we are tracking data about email opens etc we should be clear with people that that is what we will be doing. (This actually builds compliance, as well.)

One final piece of advice here – we always position the gathering of ‘performance data’ as a way for us to improve the program, and help any participants or managers who are struggling, rather than talking about scoreboards.

But if we say this, we have to be seen to be doing it – that is improve the program and take feedback on board (see next tip).

What is best practice to evaluate the success of management coaching and small group coaching?

Coaching Calls

Call coaches every now and then to discover what themes are coming up (without discussing specifics) and how engaged participants are overall (without mentioning individuals). The right coaches are a rich source of insight.

Mid-cycle surveys

Of both managers and participants. How is the program being experienced? What improvements would you suggest? Which bits appear to be working best? How are you seeing your report/yourself leading differently, and what impact is this having?

End-of-program surveys

Ask the same questions as above, but across a wider range.

Activity measures

Those of you with advanced Learning Management Systems can design processes that track what's been opened and downloaded, and when. You can measure the number of participants submitting materials such as Personal Growth Plans (see our website for our approach to these in an episodic design). It is also possible to track who opened their pre-work and post-work, and who didn't.

You can also track how many managers opened their Manager Guides.

If you have a scoring system (we use Pardot, part of Salesforce), you can attach scores to all of these activities to build a scoreboard showing who is engaged, who is less engaged, and who is lost. This can reveal which participants and managers need attention, or recognition.
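The scoreboard idea above can be sketched in a few lines. This is a minimal, generic illustration of activity-based engagement scoring, not Pardot's actual API; the activity names and point values are hypothetical.

```python
# Hypothetical point values per tracked activity (not Pardot's real schema).
ACTIVITY_POINTS = {
    "opened_prework": 2,
    "downloaded_materials": 1,
    "submitted_growth_plan": 5,
    "opened_manager_guide": 2,
}

def engagement_scores(activity_log):
    """Total up points per person from a list of (participant, activity) events,
    returning a dict ordered from most to least engaged."""
    scores = {}
    for participant, activity in activity_log:
        scores[participant] = scores.get(participant, 0) + ACTIVITY_POINTS.get(activity, 0)
    return dict(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))

log = [
    ("Asha", "opened_prework"),
    ("Asha", "submitted_growth_plan"),
    ("Ben", "downloaded_materials"),
    ("Cara", "opened_prework"),
    ("Cara", "opened_manager_guide"),
]
print(engagement_scores(log))  # most engaged first
```

The low scorers at the bottom of such a list are the participants and managers who may need a check-in call.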

Anecdotal calls

Ask random participants and managers throughout the process how they think the program is landing. This may seem obvious, but we often find it is a rich source of ideas and feedback, and also a rich source of compliments.

@Fastlead

Feedback from participants led us to a very important improvement in our process which would otherwise have gone unseen. We asked participants at mid-cycle and end-of-cycle a range of questions about their coach.

All of our coaches passed most of these measures with flying colours – except one.

We ask participants whether their coach challenged them in coaching sessions. We noticed that "sometimes" (a score of 3 on a five-point scale) was the most common answer. Our conclusion: our coaches were not challenging Fastlead participants enough. We discussed this with coaches and reset their understanding of our expectation. We also ran some sessions at our conference on how to challenge diplomatically, more often. The last batch of figures shows far higher rates of challenge across all coaches.

Message received, understood, and acted upon.
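Spotting the pattern above amounts to tallying survey answers and flagging when the modal score is too low. A minimal sketch, assuming a 1-5 response scale and a hypothetical flagging threshold (this is not the Fastlead tooling):

```python
from collections import Counter

def modal_response(responses):
    """Return the most common score in a list of 1-5 survey answers,
    together with its share of all responses."""
    counts = Counter(responses)
    score, n = counts.most_common(1)[0]
    return score, n / len(responses)

# Example: answers to "Did your coach challenge you?" for one coach
answers = [3, 3, 2, 3, 4, 3, 2, 3]
score, share = modal_response(answers)
if score <= 3:  # hypothetical threshold: "sometimes" or lower means follow up
    print(f"Flag: modal answer is {score} - coaching may not be challenging enough")
```

Run per coach, per survey cycle, this turns a stack of raw responses into the kind of signal that prompted the reset described above.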
