Project Managers' Guide to Performance Measurement and Evaluation

APPENDIX 3: Tools for Project Managers - Project Level Evaluation Plan - Checklist

Element: Project Description
Look for:
  • Project objectives
  • Target group or beneficiaries
  • Activities
  • Outputs
  • Expected results (outcomes)
Tools: Consider providing a logic model or project "road map"

Element: Indicators of success/impact
Look for:
  • What are the indicators of success/impact?
  • Are they measurable?
Tools: Specific indicators

Element: Data collection
Look for:
  • Methods (qualitative and quantitative)
  • Data sources
  • Feasibility
  • Logistics
  • Timing/frequency of data collection
  • Roles and responsibilities
  • Protocols for collecting and monitoring
  • Appropriate methods that are sensitive to the situation and population (gender, culture, language, literacy, age, community, disability)
Tools: Data collection plan and protocols; ethical standards and confidentiality provisions

Element: Who is responsible for conducting the evaluation?
Look for:
  • Is the evaluator internal or third party?
  • Does the evaluator have the appropriate knowledge and skills, including cultural/diversity competence?
  • Are there any conflict of interest issues to consider?
  • How will privacy and confidentiality be addressed?
  • Is there good communication between the evaluator and the project sponsor?
Tools: Agreements, contracts and protocols

Element: Partner and stakeholder involvement
Look for:
  • How will partners be involved in the evaluation?
  • How will stakeholders (e.g. funders) be involved?
Tools: Agreements, terms of reference for committees

Element: Evaluation resources
Look for:
  • Are sufficient resources allocated to carry out the plan?
  • Is the evaluation cost-effective?
Tools: Evaluation budget as a percentage of the project budget; actual and in-kind resources

Element: Utilization of results
Look for:
  • How will the project use the results?
  • How will DOJ use the results?
Tools: Project statement of how the results will be used to improve the project; DOJ statement of how the results will be used to inform decision-making

Element: Does the evaluation make sense?
Look for:
  • Is the type of evaluation planned appropriate? Realistic?
  • Is the evaluation plan practical and achievable?
  • Will results be meaningful and credible?
  • Will results be timely?
Tools: Your overall assessment; advice of others

Element: Considerations
Look for:
  • Are there more suitable methods that would be better matched to the project?
  • Are there more cost-effective strategies?


An Overview of Information/Data Collection Methodologies

There are various types of information or data, and various collection methods. Here’s an overview of some of the most commonly used methods.

Type of information/data: Quantitative data
Examples of methods to collect information/data:
  • Closed-question surveys (mail-out, e-mail, web site, telephone)
  • Project records/statistical reviews (client processing information; project dissemination log analysis)
Some advantages:
  • You can gather information from many people, and you can count and measure to produce statistics.
  • You can provide a quick overview of your project's activities (e.g. how many clients you served, how many pamphlets you disseminated, costs per activity).

Type of information/data: Qualitative data
Examples of methods to collect information/data, with some advantages:
  • Project file or document reviews: You can build an understanding of the context and experiential process from the project record.
  • Literature reviews: You can assess the relevance of your work within the broader state of knowledge development in the field.
  • Policy reviews: You can situate your work within the broader state of policy development in the field.
  • Key informant interviews: You can discover the context and meaning of people's experiences with the project.
  • Case studies: You can get in-depth information, or a story of what happened and what the results were.
  • Expert panels: You can acquire further knowledge and insights.
  • Focus groups: Like a group interview, you can get collective insight on a specific topic or questions.
  • Dialogue or learning circles: You can gather stakeholders together to share experiences and identify key learnings in a culturally appropriate way.

Guidelines for Tool Development & Examples

There are many different ways to collect project evaluation information, including the compilation of basic statistical information. This appendix briefly describes several of the tools that can be used to evaluate projects – and to determine, in particular, project impacts:

Workshop or Conference Evaluation Feedback Forms

Workshops and conferences bring individuals together to share their experiences, exchange ideas, develop knowledge and acquire new skills. Participant feedback from such events can provide valuable information to determine the immediate impact of the event. You can also use evaluation feedback forms to get a sense of how people will use the knowledge or skills they acquired at the event. You would need to do further follow-up at a later point in time – such as participant interviews or a survey – to find out whether and how people have applied the knowledge and skills that they acquired and how this has affected their work.

What’s Involved?

Before the event: Once you have set your agenda, design a brief feedback form and include it in the participant package. Participants should fill out this form anonymously.

At the event: Have participants fill out the form and hand them in at the end of the event.

After the event: Compile the answers to assess what worked well, what did not work so well, and participants’ suggestions for improvements and/or next steps. Use this information in future work (e.g. future workshops or conferences, follow-up steps).
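As an illustration of the compilation step, here is a minimal sketch of tallying closed-question responses, assuming each completed form is recorded as a mapping from a question to a 1-to-5 rating. The questions and ratings shown are hypothetical, not taken from any actual feedback form.

```python
# A minimal sketch of compiling closed-question feedback, assuming each
# completed form is recorded as a dict mapping a question to a 1-5 rating.
# The questions and ratings below are hypothetical illustrations.
from collections import Counter
from statistics import mean

forms = [
    {"The event met my expectations": 4, "I will use what I learned": 5},
    {"The event met my expectations": 5, "I will use what I learned": 3},
    {"The event met my expectations": 3, "I will use what I learned": 4},
]

def summarize(forms):
    """For each question, report the average rating and a count per rating."""
    summary = {}
    for question in forms[0]:
        ratings = [form[question] for form in forms if question in form]
        summary[question] = {
            "average": round(mean(ratings), 2),
            "counts": dict(Counter(ratings)),
        }
    return summary

for question, stats in summarize(forms).items():
    print(question, stats)
```

The per-rating counts matter as much as the average: an average of 3 can hide a split between very satisfied and very dissatisfied participants.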

Overall Design

A participant feedback form should:

Designing the Questions

Close-ended (or closed) questions provide individuals with a set of answers to choose from, such as a multiple choice list of answers, “yes” or “no” boxes to check, or a rating scale to complete.

Open-ended (or open) questions do not provide individuals with a set of answers to choose from – the individual is expected to formulate their own answer, in their own way.

Here are some examples of topics suited to closed questions:

Here are some examples of topics that may require “open-ended” questions:

Interviews

Interviewing individuals who have been involved in – or impacted by – a project can provide in-depth and detailed information about their perspectives and experiences.

One-on-one interviews permit individuals to make anonymous comments and express their opinions freely.

Interview data can supplement – and permit a crosscheck of – information obtained from various sources.

Interviews can be conducted in-person or on the telephone.

What’s Involved?

Before conducting the interviews

When conducting interviews

After conducting the interviews…

Overall Design

Interviews should:

Developing Interview Questions

Interview questions should be:

Here are a couple of examples of interview questions that could be asked of those involved in a newsletter project:

Surveys

A survey (or questionnaire) is a set of questions that is given to a group of individuals to complete. A survey can be used in a variety of different settings to collect information about the same set of questions from many different people. Surveys may consist of a few brief questions – or they may be more detailed and lengthy.

Although surveys may include either close-ended or open-ended questions (see definitions above), they usually consist primarily of close-ended questions, because these take less time to complete, and the results are easier to analyze statistically.

A survey can be administered in a number of different ways: the questions can be printed and sent (or given) out; an electronic survey form can be emailed out or posted on a web site; or individuals can be asked to respond to a telephone survey.

What’s Involved?

Before conducting the survey….

While conducting the survey…

Collect and record/keep track of all completed surveys.

After conducting the survey…

Overall Design
Developing the Questions

Here are a couple of examples of survey questions (open and closed) that could be asked of those who participated in an expert consultation to develop a research plan:

Focus Groups

A focus group is a type of “group interview” in which a small number of people are asked to provide their perspectives on a specific topic. The group’s facilitator encourages all participants to express their views, but the group is not expected to reach consensus. For evaluators, focus groups can provide diverse perspectives and insights on an issue. The opportunity for group interaction and discussion may stimulate participants to make observations and comments that they otherwise may not have offered.

What’s Involved?
Overall Design
Developing Questions

Here are some questions that might be asked of a small group of practitioners who have been involved in implementing an amended (or new) piece of legislation on specific offences against children:

Cluster Evaluations

Cluster evaluations look at how well a collection of similar projects meets a particular objective of change. Cluster evaluations are a potential way for the Family Violence Initiative to look across projects to identify common threads, themes and impacts, and to identify overall lessons learned.

Some potential goals of a cluster evaluation include to:

Cluster evaluations are not a substitute for project-level evaluations. They are typically conducted by a third-party cluster evaluator, and may rely in part on data collected by project-level evaluators. See the Logic Model Development Guide and the W.K. Kellogg Foundation Evaluation Handbook, p. 17 (W.K. Kellogg Foundation).

What’s Involved?
Overall Design

Cluster evaluation is a good method for obtaining information on projects that cumulatively are designed to bring about policy or systemic change. Such evaluations can lead to important “lessons learned”. This makes cluster evaluation particularly attractive for family violence issue-oriented projects.
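To illustrate the kind of cross-project roll-up a cluster evaluation performs, here is a minimal sketch, assuming each project-level evaluation reports the same shared indicator and a list of themes. The project names, figures and themes are hypothetical illustrations, not drawn from any actual initiative.

```python
# A minimal sketch of the cross-project roll-up a cluster evaluation performs,
# assuming each project-level evaluation reports the same shared indicator and
# a list of themes. All names and figures below are hypothetical.

projects = [
    {"name": "Project A", "clients_served": 120, "themes": ["outreach", "training"]},
    {"name": "Project B", "clients_served": 85,  "themes": ["training", "partnerships"]},
    {"name": "Project C", "clients_served": 150, "themes": ["outreach", "partnerships"]},
]

def roll_up(projects):
    """Aggregate a shared indicator and surface themes common to several projects."""
    total = sum(p["clients_served"] for p in projects)
    theme_counts = {}
    for p in projects:
        for theme in p["themes"]:
            theme_counts[theme] = theme_counts.get(theme, 0) + 1
    # "Common threads" here means themes reported by more than one project.
    common = sorted(t for t, n in theme_counts.items() if n > 1)
    return {"total_clients_served": total, "common_themes": common}

print(roll_up(projects))
```

In practice this only works if the project-level evaluations were designed around at least a few shared indicators, which is one reason cluster evaluations complement rather than replace them.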