2.0 Scope and Methodology

The evaluation was conducted in accordance with the directive and standards specified in the Treasury Board of Canada’s 2009 Policy on Evaluation. It covers the period from April 1, 2010, to March 31, 2015, and was carried out between June 2015 and January 2016.

Previous audits and evaluations, such as the 2015 VIP Follow-up Audit and the Evaluation of the Veterans Independence Program 2011, were used to calibrateFootnote 13 the scope of this evaluation. The evaluation team focused primarily on the housekeeping and grounds maintenance components of the Program for several reasons:

  • housekeeping and grounds maintenance are the Program’s two largest components, accounting for 77% of program expenditures in 2014-15 (a combined $281M out of total program spending of $363M)Footnote 14;
  • the method of payment for these components has changed from a contribution to a grant since the last evaluation was completed in July 2011;
    • the change from a contribution to a grant in January 2013 is the single biggest change to the program since the last evaluation; and
    • the implementation of the grant necessitated the development of a new Grant Determination Tool (GDT) to assist in consistent calculation of funding across the country.

The following areas were excluded from the scope of the evaluation:

  • intermediate care was not reviewed as it was included as part of the Evaluation of the Intermediate and Long Term Care Programs in 2014;
  • recent departmental announcements (e.g., changes to the follow-up process and future anticipated system changes) were not analyzed in depth as their planned implementation date is outside the evaluation period; and
  • smaller elements of the Program, which together account for only 10% of program expendituresFootnote 15.

Program eligibility criteria and processes were examined only at a high level, as they are being reviewed as part of the Department’s five-year strategy. The five-year strategy (2015-20) is being developed to enhance support to Veterans, focusing on cultural change within the Department as well as on departmental outputs, policies, practices, and processes. The strategy is built on three objectives:

  • a Veteran-focused approach that places Veterans firmly at the centre of all VAC business, ultimately fostering the well-being of Canada’s Veterans;
  • a seamless integration of Veterans Affairs and National Defence transition programs and services by removing the complexities of navigating between the two organizations in order to access benefits during the release process; and
  • a focus on service excellence that will recognize and create opportunities to exceed expectations by understanding Veterans and their needsFootnote 16.

It should be noted that the Program underwent a gender-based analysis in December 2011; no issues were identified with respect to gender biasFootnote 17. To further validate this finding, the evaluation team completed a statistical analysis of current housekeeping and grounds maintenance data on services provided to recipients. No issues with regard to gender bias were noted.

2.1 Multiple Lines of Evidence

The research methodology incorporated multiple lines of evidence to strengthen the reliability of the information collected and the results reported. The lines of evidence used to evaluate the Program’s relevance and performance are outlined in Table 3.

Table 3 – Sources of Information Reviewed During the Program Evaluation, by Methodology
Non-Departmental Literature Review
  • Senate and House of Commons reports, Budget speeches, and Speeches from the Throne;
  • Program documents and data from the United States, Australia, and the United Kingdom;
  • Program documents and data related to provincial home care programs offered across Canada;
  • Media articles relating to the Program; and
  • Policies and procedures developed by the Processor.
Departmental Documentation and Secondary Research Review
  • Departmental acts and regulations, Treasury Board Submissions;
  • VAC reports/published research papers, policies, procedures, strategic documents, performance reports, and recipient complaint records;
  • Pre-existing recipient survey/public opinion research (e.g. VAC National Client Survey 2010); and
  • Previous audits and evaluations.
Interviews and/or Work Observations
  • Telephone and in-person interviews with 60 VAC and Processor staff involved in the delivery of the Program;
  • Interviews with 12 VAC senior executives and program experts; the Office of the Veterans Ombudsman; and representatives from provincial home care programs; and
  • Observation of business processes and procedures used by the Processor.
Recipient Feedback/File Review
  • VIP follow-up forms (to confirm that benefits received by Program recipients are appropriate and are meeting their needs);
  • File reviews (to determine the timeliness of Program decisions, whether Program amounts changed for Veterans after the Program switched from a contribution to a grant, and whether Program amounts changed for survivors/primary caregivers after the implementation of the GDT).
Statistical Analysis
  • Financial, demographic, and operational data collected by VAC and analyzed by the evaluation team for fiscal years ending 2010-11 to 2014-15.

2.2 Limitations and Analytical Challenges

The following limitations were identified during the evaluation:

  1. The evaluation team did not speak directly with individuals in receipt of a Program benefit.

    The team partially mitigated this limitation by:
    • reviewing existing data (e.g. VAC’s 2010 National Client Survey and VIP annual follow-up forms);
    • conducting interviews with VAC employees who deal directly with Program recipients, in order to capture the perspectives of Veterans; and
    • observing follow-up calls and GDT interviews conducted by the Processor.
  2. The Program did not have mechanisms in place to assess performance during the timeframe covered by this evaluation:
    • Program management had limited ability to capture program performance data;
    • changes to the systems used to track Program data will not be implemented until 2016; it is anticipated that these changes will assist in compiling performance data;
    • VAC quality assurance procedures were not implemented until 2015-16; and
    • quality assurance procedures for the Processor were not in place, but are currently under development.
    The evaluation team partially mitigated these limitations by conducting a file review to determine whether Program decisions were being made in a timely manner. The team also made inquiries as to the nature of complaints received by VAC and the Office of the Veterans Ombudsman (OVO); no significant issues were identified.

  3. Administration expenses for the Program were reviewed. However, they could not be compared with programs offered by veterans affairs departments in other countries, as those programs are too dissimilar.

    The evaluation team also attempted to conduct a literature review of similar provincial home care programs. Administration information on provincial programs was limited and, where available, could not be measured against the Program because of differences in the delivery models used. This prevented a meaningful comparison of administration costs (see section 4.2 for more details).

The above limitations should be considered when reading the evaluation findings.