Short Communication

Evaluation for learning and improvement at the right time: an example from the field

AUTHORS

Alison Rogers
1 Master of Evaluation, Strategic and Innovation Adviser and PhD student *

Carol Watson
2 PhD, Pandanus Evaluation

Nea Harrison
3 Master of Education (Hons), Pandanus Evaluation

Sharon Manhire
4 Diploma in Project Management and Applied Science, Team Leader

Catherine Malla
5 Master of International Public Health, Knowledge Management Advisor

AFFILIATIONS

1 Indigenous Australia Program, The Fred Hollows Foundation, Scaturchio St, Casuarina, Darwin, NT 0820, Australia; and Centre for Program Evaluation, University of Melbourne, Carlton, Vic. 3053, Australia

2, 3 Pandanus Evaluation, PO Box 349, Parap, NT 0804, Australia

4 Indigenous Australia Program, The Fred Hollows Foundation, Scaturchio St, Casuarina, Darwin, NT 0820, Australia

5 Knowledge and Innovation Division, The Fred Hollows Foundation, Scaturchio St, Casuarina, Darwin, NT 0820, Australia

ACCEPTED: 4 October 2019


ABSTRACT

Evaluation expertise to assist with identifying improvements, sourcing relevant literature and facilitating learning from project implementation is not routinely available or accessible to not-for-profit organisations. As a result, program managers rarely receive the right information, at the right time and in an appropriate format. Members of the program management team implementing The Fred Hollows Foundation’s Indigenous Australia Program’s Trachoma Elimination Program required information about what was working well and what needed improvement. This article describes a way of working in which the program management team and an external evaluation consultancy collaboratively designed and implemented a utilisation-focused evaluation, informed by a developmental evaluation approach. Principles of knowledge translation were embedded in this process, thereby supporting the translation of knowledge into practice. The lessons learned were that combining external information and practice-based knowledge with local knowledge and experience is invaluable; that it is useful to incorporate evaluative information from a program’s inception and for its duration; that a collaborative working relationship can result in higher-quality information being produced; and that it is important to communicate findings to different audiences in different formats.