Evaluation of the Justice Partnership and Innovation Program
3. Methodology
The methodology for the evaluation consisted of multiple lines of evidence, including a document/literature review, key informant interviews, a file review, a survey of applicants, and case studies. As noted, the methodology follows the approach outlined in the JPIP Performance Measurement Strategy. The evaluation was guided by an evaluation matrix that addresses questions of relevance and performance and includes indicators and data sources for each question. Performance questions consider the achievement of the outcomes defined in the JPIP logic model. Appendix A provides the logic model and evaluation matrix.
The evaluation methodologies are described below and data collection instruments are provided in Appendix B. All data collection instruments were developed in consultation with the EWG.
3.1. Document and Literature Review
The document review involved a review of internal documents as well as publicly available documents and included the following:
- terms and conditions for the Program
- templates or examples of recipient project or contribution agreements
- annual reporting by the Program, such as through Departmental Performance Reporting
- other relevant publicly available information such as reports on plans and priorities, Budget Speeches, and Speeches from the Throne
- previous evaluations or sub-studies of funded areas
- annual data on the number of recipients and funding amounts by funding areas
- any other relevant documents identified by the Program or ED
While this task did not include an exhaustive literature review, any recent literature on access to justice issues relevant to the JPIP that was identified over the course of the evaluation was also reviewed.
3.2. Key Informant Interviews
Key informant interviews were conducted to obtain the opinions, perceptions, and experiences of key stakeholders with knowledge of the JPIP. Interviews included Department of Justice Canada staff, partners, and stakeholders.
Information was collected from 21 key informants in total. Input was provided through 18 interviews (two of the interviews involved two participants each) and one written submission, with the breakdown of key informants as follows:
- Department of Justice representatives – five interviewees
- Family Violence Initiative (FVI) recipients – five interviewees
- PLEI core funding recipients – four interviewees
- Violence Against Aboriginal Women and Girls (VAAWG) fund recipients – three interviewees
- JPIP general funding recipients – two interviewees
- AJA PLEI recipients – two interviewees
The EWG identified potential key informant interview participants. Interviews were conducted using structured interview guides tailored to the specific groups, with questions designed to address the evaluation issues and questions.
The representatives for the AJA PLEI components were asked some basic questions regarding how PLEI services are delivered in the territories, what service gaps exist, and what challenges arise in delivering PLEI in the northern territories.
3.3. File Review
The file review involved a review of performance information and files for specific funded JPIP recipients. Files were selected to include a sample from each of the recipient groups, including named grants, PLEI organizations, VAAWG initiatives, FVI initiatives, and JPIP general funding. Given the small number of named grants, it was decided to include all five recipients in the review. Half of the PLEI organizations were included in the review, and the remaining files were chosen from the other initiatives.
A total of 31 JPIP recipient files were reviewed, with the breakdown as follows:
- named grants – five
- PLEI organizations – five
- VAAWG – seven
- FVI – nine
- JPIP general funding – five
Program staff randomly selected the files for review, according to the numbers per group outlined above. Recipients that were included in case studies were not included in the file review.
Files were reviewed at Department of Justice program offices and included reviews of data and records maintained for each funding recipient, such as the following:
- recipient applications
- contribution agreements
- recipient reporting
- financial data on reviewed files
- correspondence between the recipient and program staff
Each file was reviewed using a file review template to ensure that all data were reviewed in a consistent fashion and to facilitate analysis and reporting. Two file review templates were developed based on the information required in JPIP applications and reporting templates, since JPIP uses two reporting templates: one for named grants and one for all other recipients.
3.4. Survey of Applicants
The evaluation included an online survey of project applicants in order to obtain input from a broad range of Program stakeholders and to capture applicant input in a manner that can be aggregated and quantified. The survey included both successful and unsuccessful applicants, with some questions applicable to both groups and others specific to either funding recipients or unsuccessful applicants. As well, the survey questionnaire was developed in a manner that allows some comparisons with the survey results from the previous evaluation.
The Program provided the email addresses of primary contacts for project applicants. Applicants received an email invitation from the ED to explain the survey and encourage their participation. Applicants were emailed a unique link to the survey, with options for completing the survey in either official language. Several rounds of follow-up emails to non-respondents were conducted to encourage participation. A total of 114 applicants were invited to participate in the survey. Forty-six applicants completed the survey, representing a response rate of 40%.
3.5. Case Studies
Five case studies of funded projects/initiatives were conducted to provide more in-depth insight into how the JPIP contributes to each initiative and the results achieved. Case study candidates were identified with the assistance of the ED and included the following recipients:
- the HCCH (receives JPIP general funding)
- UNIDROIT (receives JPIP general funding)
- Indspire (which administers the LSAP Program through JPIP general funding)
- Girls Action Foundation (receives funding through both the FVI and the VAAWG initiative)
- Community Legal Education Association Incorporated (receives JPIP general funding for PLEI)
Each case study involved a review of relevant documents and files for each project/initiative (e.g., applications, contribution agreements, recipient reporting), as well as interviews with several key stakeholders for each of the chosen projects/initiatives. Two interview guides were developed specifically for the case studies: one for PLEI recipients and one for other recipients.
Interviews were conducted with the primary contact for each case study initiative, who was asked to identify other relevant stakeholders to interview, such as partners and/or program participants.
3.6. Limitations
The methodological limitations of the evaluation are listed below, along with the mitigating strategies taken.
- Review of documents and data
  - There were limited documents and data available to inform the evaluation beyond program objectives, program terms and conditions, a few previous evaluations, financial data on funding, and project reporting.
- Responses from surveys and interviews with key informants and case study stakeholders
  - Survey and interview findings are potentially affected by self-reported response bias and strategic response bias. Self-reported response bias occurs when individuals report on their own activities and so may want to portray themselves in the best light. Strategic response bias occurs when participants answer questions with the desire to affect outcomes.
- Small sample size for the survey
  - The survey included all organizations that applied for JPIP funding over the evaluation period, other than the HCCH and UNIDROIT, including both successful and unsuccessful applicants. As a smaller program, the JPIP has not had a large number of applicants over the course of the evaluation. The Program did not have up-to-date contact information for some applicants, primarily unsuccessful applicants. As a result, 114 applicants were asked to respond to the survey and 46 responded, which provides a fairly small sample for reporting results.
- Mitigation strategies
  - The main mitigation strategies for the above methodological limitations were to use multiple lines of evidence, make use of both quantitative and qualitative data, and include a range of stakeholder groups across the various lines of evidence. By triangulating the findings from these different sources, the evaluation was able to strengthen its conclusions despite the limitations. The different stakeholder groups included key informant interviews with Department of Justice representatives, as well as with funding recipients from each type of JPIP funding initiative (e.g., named grants, PLEI organizations, FVI, VAAWG, and JPIP general funding); the survey of applicants, which included all recipient types except for the HCCH and UNIDROIT; the case studies of five funding recipients; and the file review of 31 recipient files.