

Monitoring & Evaluation

Making sense of evidence to understand change over time.

Over two decades of public sector work, we have learned about the complexity of public service decision-making and policy cycles, and we feel privileged to have helped many public sector organisations to plan, monitor and evaluate their policies, programs, services and systems so they are as effective and equitable as possible. We have been delivering impactful M&E projects since 2008, drawing on leading theories and choosing the most appropriate ways to deliver work in complex, real-world settings.

Charlie Tulloch founded Policy Performance in 2018 to help public sector organisations maximise the use of their scarce funding and related resources, so that investments achieve their full potential for those in greatest need.

People often reach out to Policy Performance when they are seeking a trusted monitoring and evaluation partner: to help their teams plan and conduct monitoring and evaluation projects, or to sensitively engage their clients and stakeholders in targeted data collection activities to find out what they think.

We then engage sensitively, collect data robustly and report creatively.

Understanding Change Over Time

We are deeply involved in monitoring and evaluation practice. We understand that there are many ways to plan, measure, collect, analyse and report on implementation and results achieved. We select the most appropriate approaches and data collection methods based on the available time, datasets and resources. Monitoring and evaluation efforts are most often attempts to better understand change over time. In other words: impact.

We also bring extensive quantitative skills to the task, led by Arun’s Master’s-level studies in Statistics, applied through randomised controlled trials (RCTs) in India.

We also bring extensive qualitative skills, including in-depth interviewing, focus group facilitation, survey design and careful collection of information.

Recent projects by Policy Performance to develop national monitoring and evaluation guidelines have provided us with a ready bank of resources and tools that we apply to our monitoring and evaluation tasks and share with clients through our work, so you are better able to use evidence effectively in the longer term.

Charlie Tulloch

01
Readiness Review

Organisations that intend to monitor and evaluate their programs, services and supports may benefit from an initial diagnostic and assessment process to understand their readiness. This can be an essential early step when establishing a credible and reliable M&E system or approach, or it can be undertaken mid-rollout to gain clarity on options for monitoring and evaluation activities.

Readiness reviews also consider how to overcome M&E challenges that arise, particularly where there are data, funding or staffing constraints.

Policy Performance conducts readiness review projects that consider: 

  • Timing: when is the timing right to activate M&E activities? 

  • Data: what information is already available? Is it credible and consistent? What else may be needed?

  • Stakeholders: what is the best way to engage different stakeholder groups in M&E efforts? 

OUR EXPERIENCES:

Undertaking a readiness review to consider whether a suite of education reforms can be effectively evaluated.

Reviewing datasets for a road safety program and advising on a new set of key performance indicators that can be better used for monitoring and management purposes.

02
Monitoring Plans & Systems

Monitoring is the routine data collection process to track implementation and emerging outcomes during delivery of a given program or service. It serves a critical management function and can provide essential data about change over time to support later evaluation activities. Monitoring need not be limited to the measurement of outputs. It can also reflect outcome indicators, to provide real-time feedback on results. 

Policy Performance leads monitoring work that supports: 

  • End-to-end preparation of Monitoring Plans

  • Defining clear key performance indicators, supported by robust datasets

  • Establishing baselines and targets

  • Advising on systems for data collection

  • Developing new data collection tools for monitoring

  • Capability uplift across teams and organisations involved in monitoring

  • Visual data reporting on monitoring results. 

OUR EXPERIENCES:

Developing national guidelines for monitoring, and associated tools and templates.

Various projects to support compelling reports that share monitoring findings.

Establishing data tools (e.g. surveys) for routine and repeat data collection.

03
Evaluation Framework Design

Evaluation framework design is critical. It considers in detail how various data sources will help to answer the key questions, which is necessary for later interpreting results and analysing explanations for change.

Policy Performance works in close collaboration with clients and affected stakeholders during this important phase to define frameworks for organisations, service systems, programs/policies and related activities.

At Policy Performance, we know that quality evaluation requires sound planning. This includes:

  • Understanding the scope of the evaluation

  • Determining key questions

  • Agreeing on an appropriate evaluation approach

  • Addressing ethical concerns

  • Understanding existing datasets

  • Deciding on methods for additional data collection, and 

  • Settling the project timing, governance, deliverables, management and stakeholders to be involved

OUR EXPERIENCES:

Developing a new set of performance measures for a road safety program.

Establishing an evaluation plan and data collection tools for a national climate-related review.

Establishing evaluation frameworks to support program and policy reviews.

04
Policy & Program Reviews

We work through a 7-stage process to undertake high-quality evaluation projects, as shown below. Our ambition is to provide rigorous and robust answers to the agreed key questions. Our reports present the narrative in compelling ways, drawing on quantitative and qualitative datasets to tell the story as clearly as possible.

1. Understand the topic

Clearly understand the subject of the evaluation and how the public response is expected to create change.

2. Scope the evaluation

Define the boundaries of the evaluation and develop a set of key questions to explore.

3. Valuing/judging success

Consider how judgements regarding success will be made.

4. Define approach & methods

Agree the overarching evaluation approach and methods for data collection to respond to key questions.

5. Systematic data collection

Use existing data sources and any required additional sources, collecting data in line with the agreed approach and methods.

6. Analyse findings

Review and carefully analyse collected information to generate robust findings in response to key questions.

7. Reporting

Develop reporting documents to address key evaluation questions, outlining findings. This may also include recommendations for adaptation, change or improvement.

OUR EXPERIENCES:

Review of national model for drought support.

Review of a new approach to the work of environmental inspectors.

Evaluation of international school teacher recruitment processes.

Evaluation of a 5-year strategic landscape plan.

Review of a grants process, including results achieved by grant recipients.

05
Impact Evaluation

Impact evaluations focus primarily on understanding change over time. Robust methods tend to be established early in a program’s life cycle to generate a baseline dataset that can be used as the basis for setting targets and analysing results achieved. There are quantitative and qualitative impact evaluation techniques; Policy Performance has expertise and experience in both.

Arun Callapilli has been involved in numerous randomised controlled trials in India during roles in government and non-profit organisations. Drawing on a Master of Statistics, she is able to analyse large datasets to understand changes in results and to draw causal inferences. Charlie Tulloch has most often used qualitative methods for exploring impacts and results, drawing out reflections via interviews, surveys and related data collection processes to understand different perspectives on questions of effectiveness.

OUR EXPERIENCES:

Contribution to large-scale randomised controlled trials in India.

Tutoring in Impact Evaluation at the University of Melbourne.

Application of rubrics to understand baseline and over-time changes.

Use of counterfactual analysis to understand what would have happened otherwise.

Charlie at Policy Performance did an outstanding job on the mid-program review. His attention to detail and thorough analysis of the data helped us to identify areas for improvement and implement effective solutions. His communication skills were exceptional, and he kept us informed every step of the way. His hard work and dedication to the project made it a success. We are extremely satisfied with the results and highly recommend Charlie to anyone in need of his services.

Program Review Evaluation Client
