Five-Year Plan for Evaluation and Performance Measurement Strategies 2012-13 to 2016-17


Executive Summary

This document presents Aboriginal Affairs and Northern Development Canada’s Five-Year Plan for Evaluation and Performance Measurement Strategies for 2012-13 to 2016-17. The plan adheres to guidance provided by the Treasury Board of Canada Secretariat: it aligns with and supports the departmental Management, Resources and Results Structure; ensures the evaluation of all ongoing grant and contribution (G&C) programs on a five-year cycle; and implements a risk-based approach for calibrating the level of effort for evaluations.

A number of factors were considered in developing this plan. Planned G&C spending for 2012-13 is included to indicate the level of coverage. The evaluation deadline, calculated by adding five years to the date of the last evaluation, confirms that proposed evaluations fall within the five-year period required by Treasury Board. The inclusion of planned audits allows strategies to be developed to reduce the impact on programs and, along with risk rankings and the status of performance measurement strategies, informs the calibration of effort.

Given the uncertainties presented by the Deficit Reduction Action Plan, this plan focuses on calibrating the level of effort for proposed work in 2012-13 only and will be revisited in the summer of 2012, when the impact of budget announcements is better understood.

The previous plan provided the basis for this update; changes are based on an analysis of the evaluation universe and input received from senior management. A total of 48 evaluations are scheduled between 2012-13 and 2016-17, distributed fairly evenly across departmental strategic outcomes. Fifty percent of these evaluations have a performance measurement strategy in place, which is expected to reduce the level of effort they require. Performance measurement strategies are scheduled to be developed in time for another 30 percent of scheduled evaluations.

A calibration of the level of effort was conducted for new evaluations scheduled for 2012-13. The analysis of materiality, risk and complexity revealed a very high risk portfolio of evaluations for year 1 of the plan.






1. Introduction

This document outlines Aboriginal Affairs and Northern Development Canada’s (AANDC) plan for evaluating 100 percent of grant and contribution (G&C) programs over the next five years and for developing performance measurement strategies.

1.1 Purpose of the Evaluation Plan

The primary purpose of the plan is to help the Deputy Head ensure that credible, timely and neutral information on the ongoing relevance and performance of direct program spending is available to support evidence-based decision making on policy, expenditure management and program improvement. The plan also:

  1. Provides an opportunity to align evaluations with the information needs of the Department and of others (e.g. central agencies), as articulated in the Policy on Evaluation;

  2. Helps ensure that evaluations supporting program redesign are planned and completed in advance of program renewal;

  3. Provides an annual platform for program managers and heads of evaluation to discuss the development and implementation of performance measurement strategies that effectively support evaluations;

  4. Allows departmental units responsible for the development of the Report on Plans and Priorities (RPP) and the Departmental Performance Report (DPR), as well as other groups engaged in strategic planning and reporting activities, to identify when evaluations will be available to inform their work;

  5. Initiates regular communication and consensus building on evaluation needs and priorities across the Department; and

  6. Provides central agencies with advance notice of when evaluations will be available to inform their work (e.g. in support of Memoranda to Cabinet, Treasury Board submissions and strategic reviews).

Moreover, the plan serves as a management tool for the Head of Evaluation by enabling workflow and human resources planning for the coming years.

1.2 Drivers for Evaluation Planning

In the Government of Canada, evaluation is defined as the systematic collection and analysis of evidence on the outcomes of programs to make judgments about their relevance and performance, and to examine alternative ways to deliver them or to achieve the same results. Evaluation serves to help establish whether or not a program contributed to observed results and to what extent. It also provides an in-depth understanding of why program outcomes were, or were not, achieved.

The Government of Canada adopted its first Evaluation Policy in 1977 to inform expenditure management and program decision making. A renewed policy, standard and directive, introduced in 2009, require the evaluation of all ongoing G&C programs every five years and clarify the management responsibilities and accountabilities of ministers and deputy heads.

The April 2009 Treasury Board Secretariat (TBS) Policy on Evaluation states:

3.2 Evaluation provides Canadians, parliamentarians, ministers, central agencies and deputy heads an evidence-based, neutral assessment of the value for money, i.e. relevance and performance, of federal government programs. Evaluation:

  1. Supports accountability to Parliament and Canadians by helping the Government to credibly report on the results achieved with resources invested in programs;

  2. Informs government decisions on resource allocation and reallocation by:

    1. Supporting strategic reviews of existing program spending, to help ministers understand the ongoing relevance and performance of existing programs;

    2. Providing objective information to help ministers understand how new spending proposals fit with existing programs, identify synergies and avoid wasteful duplication;

  3. Supports deputy heads in managing for results by informing them about whether their programs are producing the outcomes that they were designed to produce, at an affordable cost; and

  4. Supports policy and program improvements by helping to identify lessons learned and best practices.

In accordance with Section 6.1.7 of the Policy on Evaluation, this plan aligns with and supports the departmental Management, Resources and Results Structure (MRRS), which is the framework for the systematic collection and analysis of performance information. It also ensures the evaluation of all ongoing G&C programs every five years, as required by Section 42.1 of the Financial Administration Act (FAA). In compliance with Section 6.1.3 of the Directive on the Evaluation Function, this plan identifies a risk-based approach for determining the methodologies, level of effort and resources required to conduct each evaluation.

1.3 Departmental Context

AANDC Mandate

The vision of AANDC is a future in which First Nations, Inuit, Métis and northern communities are healthy, safe, self-sufficient and prosperous - a Canada where people make their own decisions, manage their own affairs and make strong contributions to the country as a whole.

To this end, the Department supports Aboriginal peoples (First Nations, Inuit and Métis) and Northerners in their efforts to:

  • improve social well–being and economic prosperity;

  • develop healthier, more sustainable communities; and

  • participate more fully in Canada’s political, social and economic development — to the benefit of all Canadians.

AANDC is the federal department primarily responsible for meeting the Government of Canada’s obligations and commitments to First Nations, Inuit and Métis, and for fulfilling the federal government’s constitutional responsibilities in the North. The Department’s overall mandate and wide–ranging responsibilities are shaped by centuries of history, and unique demographic and geographic challenges. It derives from the Canadian Constitution, the Indian Act, the Department of Indian Affairs and Northern Development Act, territorial acts, treaties, comprehensive claims and self–government agreements as well as various other statutes affecting Aboriginal people and the North.

The Indian and Inuit Affairs mandate derives from the Indian Act and its amendments over the years, from specific statutes enabling modern treaties, such as the Nisga'a Final Agreement Act or the Labrador Inuit Land Claims Agreement Act, and from more recently enacted statutes, among which are statutes like the First Nations Fiscal and Statistical Management Act and the First Nations Jurisdiction Over Education in British Columbia Act, designed to provide First Nations with jurisdictional powers beyond the Indian Act. A significant amount of the Department's mandate is derived from policy decisions and program practices that have developed over the years; it is framed by judicial decisions with direct policy implications for the Department; and it is structured by funding arrangements or formal agreements with First Nations and/or provincial or territorial governments.

The AANDC Minister is also the Federal Interlocutor for Métis and Non-Status Indians, and is responsible for the Office of the Federal Interlocutor. The Office of the Federal Interlocutor uses its relationships and partnerships with other federal departments, other governments, Aboriginal representative organizations and community leaders to raise awareness of the circumstances of Métis, non-status Indians and urban Aboriginal people, and to increase opportunities for their improved participation in the economy and society.

The Northern Development mandate derives from the Department of Indian Affairs and Northern Development Act; from statutes enacting modern treaties north of 60°, such as the Nunavut Land Claims Agreement Act, or self-government agreements, such as the Yukon First Nations Self-Government Act; and from statutes dealing with environmental or resource management. It is also framed by statutes that enact the devolution of services and responsibilities from AANDC to territorial governments, such as the Canada-Yukon Oil and Gas Accord Implementation Act.

Most of the Department’s programs, representing a majority of its spending, are delivered through partnerships with Aboriginal communities and federal–provincial or federal–territorial agreements. AANDC also works with urban Aboriginal people, Métis and non–status Indians (many of whom live in rural areas) through the Office of the Federal Interlocutor. AANDC is one of 34 federal departments and agencies delivering Aboriginal and northern programs and services.

Program Activity Architecture (PAA)

AANDC’s broad mandate is demonstrated by the Program Activity (PA) Architecture, which supports five strategic outcomes (SO):

  1. The Government - Good governance and co–operative relationships for First Nations, Inuit and Northerners.

  2. The People - Individual, family and community well–being for First Nations and Inuit.

  3. The Land and Economy - Full participation of First Nations, Inuit and Métis individuals and communities in the economy.

  4. The North - Self–reliance, prosperity and well–being for the people and communities of the North.

  5. Office of the Federal Interlocutor - Socio–economic well–being of Métis, non–status Indians and urban Aboriginal people.

Appendix A provides a more detailed breakdown of the PAA.

Transfer Payment Programs

According to the 2010-11 Public Accounts of Canada, AANDC is the fifth largest federal organization in terms of total ministerial net expenditures (behind Finance, Human Resources and Skills Development Canada (HRSDC), National Defence and Public Safety) and the third largest in terms of total transfer payments/G&C (behind Finance and HRSDC). In 2010-11, AANDC’s total net expenditures were $8,257.8 million and total transfer payments were $6,722.1 million. The requirement that all direct program spending and G&C be evaluated every five years represents a significant volume of work for the Department.

Planned expenditures for Fiscal Year 2012-13

2012-13 Planned Expenditures ($ millions)

The Government | 1,581.6
The People | 3,472.0
The Land and Economy | 1,314.7
The North | 207.3
The Office of the Federal Interlocutor | 28.0
AANDC Total | 6,986.1

Due to rounding, figures may not add to totals shown.
Source: 2010-11 AANDC Report on Plans and Priorities

AANDC Challenges and Opportunities

The Aboriginal population is one of the fastest-growing segments of Canadian society. The 2006 Census data show that the number of people who identified themselves as an Aboriginal person has surpassed the one million mark. This growth is bringing with it ever-increasing demands for services as well as the opportunities that an educated, capable Aboriginal youth cohort can offer the labour force of tomorrow. Canada's North possesses unparalleled opportunities for resource development that will transform the lives of all Northerners, including Aboriginal people, the communities they live in, and Canada as a whole.

To contribute to Aboriginal and northern aspirations, the Department must work with Aboriginal and northern people living in isolated communities while others are concentrated in, or in close proximity to, urban areas. In addition, many Aboriginal communities in the North, in particular, are on the front line of environmental and climate change. At the same time, unparalleled opportunities are emerging for Aboriginal and northern communities, arising out of resource development, claims settlements, new program delivery arrangements, new legislative frameworks and, most importantly, out of the growing capacity of these communities to manage their own affairs and pursue their own priorities.






2. Planning Methodology

Section 6.2.3 of the Directive on the Evaluation Function states that the Head of Evaluation is responsible for developing and annually updating a rolling five-year evaluation plan. Accordingly, at AANDC, the Evaluation Plan was developed by the Evaluation, Performance Measurement and Review Branch (EPMRB), which is part of the Audit and Evaluation Sector. The approach for the creation of the 2012-13 Plan was as follows:

Define the evaluation universe and scoping

Defining the evaluation universe leads to the identification of units of evaluation that will contribute to full coverage of the PAA. These units of evaluation are the smallest logical programming units that could be reasonably subject to an individual evaluation.

Past AANDC plans were organized by the Department’s Strategic Outcomes and included associated authorities (sources of funds); however, they did not show direct links to the MRRS or the amount of G&C. For the 2012-13 Plan, the AANDC evaluation universe in Appendix B was defined to show the linkages between the MRRS, programs, G&C spending, authorities and proposed evaluations. In this way, it is easier to see the level and scope of evaluations, and the coverage of authorities.

In the AANDC MRRS, there are 46 Sub-activities (SA). A total of 26 evaluations, or approximately 54 percent of all evaluations, are at the Sub-activity level. Five evaluations target multiple Sub-activities and another six are at the Sub-sub-activity (SSA) level or lower. There are a number of reasons for the varying scoping of evaluations:

  • In the past, to meet Evaluation Policy requirements for coverage and deadlines of expiring authorities, multiple programs were clustered into a single evaluation for efficiency.

  • Programs were also clustered into a single evaluation where Sub-activities are closely linked and contribute to common outcomes.

  • At AANDC, Sub-activities are not consistent in terms of their size and complexity. One Sub-activity might entail a small evaluation, such as Registration Administration, while another might require a large evaluation, such as Activation of Community Assets.

  • There are also instances where, within the PAA, a program is unique within the Sub-activity. In order to achieve economies of scale, where possible, these unique programs are incorporated into other evaluations.

Efforts to define the evaluation universe were complicated by the associated authorities: some programs use multiple authorities, and some authorities direct funding to multiple programs. Some gaps in information remain for the evaluation universe, so EPMRB will continue to work with the Office of the Chief Financial Officer to ensure 100 percent coverage in the plan.

Conduct a risk assessment of units of evaluation

Each year, the Audit and Assurance Services Branch (AASB) in the Audit and Evaluation Sector at AANDC develops risk rankings for Sub-activities. The process followed involves extensive review of corporate documents, workshops and consultations with program representatives and external stakeholders and is an essential component for the preparation of the Risk-Based Audit Plan.

The AASB risk approach is consistent with guidance from the Office of the Comptroller General and is thorough; for the most part, the audit and evaluation units were the same. Risk considerations include the impact of identified risks on the achievement of departmental outcomes, materiality, scope, potential for public scrutiny, legal risk, prevalence of risk, and the severity of consequences.

Previously, in the preparation of the plan, EPMRB conducted its own risk analysis and assigned risk rankings of low, medium or high for mission risk, materiality and availability of performance information. These rankings were used to prioritize evaluation and performance measurement projects, and for scoping, resource assignment and timing considerations. In order to take advantage of the risk ranking process above and not duplicate efforts, EPMRB decided to adopt the AASB risk ranking to inform the level of effort and allocation of resources for evaluations.

There are five levels of risk identified in the Risk-Based Audit Plan, which take into account the number of risk factors and the severity of consequences on the Department and stakeholders. The five levels are defined as follows:

  • Very high risk – The auditable entity is inherently exposed to multiple risks that are expected to remain or get worse over time. Business conditions contain considerable risk factors. If one or more risks were to materialize, consequences would be severe and could include permanent or long-term damage to AANDC’s ability to achieve its objectives. Consequences would be felt by the majority of stakeholders, both internally and externally to AANDC.

  • High Risk - The auditable entity is inherently exposed to many risks that are expected to remain over time. Business conditions contain many risk factors. If one or more risks were to materialize, consequences would be significant and could be endured by AANDC with significant management attention. Some AANDC activities could be subject to significant review or changed ways of operation. Consequences would be felt by many stakeholders.

  • Moderate Risk - The auditable entity is inherently exposed to several risks that are expected to remain over time. Business conditions contain some risk factors. If one or more risks were to materialize, consequences would be moderate and could be managed with a minor level of management attention. Consequences would be felt by a sub-set of stakeholders.

  • Low Risk - The auditable entity is exposed to a few risks that may diminish over time. Business conditions contain a few risk factors. If one or more risks were to materialize, consequences would be minor and could be absorbed through normal activity. Consequences would be isolated.

  • Very Low Risk - The auditable entity is inherently exposed to few or no risks. Business conditions contain few or no risk factors. If one or more risks were to occur, the consequences would be negligible and could be absorbed through normal activity.

Create the Plan

The 2012-13 Plan builds upon the previous year’s plan. As discussed above, the units of evaluation were aligned with the MRRS and planned G&C spending was added. Deadlines for evaluations were documented to ensure that an evaluation was scheduled to take place within five years from the date of the previous one.
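The deadline rule described above is a simple date calculation. As a minimal illustrative sketch (the function name is invented for this example; only the five-year rule itself comes from the plan):

```python
from datetime import date

def evaluation_deadline(last_evaluation: date) -> date:
    """Latest permissible date for the next evaluation: five years
    after the previous one, per the five-year coverage requirement."""
    try:
        return last_evaluation.replace(year=last_evaluation.year + 5)
    except ValueError:
        # Feb 29 with a non-leap target year: fall back to Feb 28
        return last_evaluation.replace(year=last_evaluation.year + 5, day=28)

# A program last evaluated on 31 March 2010 must be re-evaluated
# no later than 31 March 2015, i.e. within this plan's horizon.
print(evaluation_deadline(date(2010, 3, 31)))  # 2015-03-31
```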

A central goal in the design of the plan was an even and strategically clustered distribution of evaluation projects, to balance the impact on program managers, regional operations and sectors, as well as on EPMRB.

On occasion in the past, an evaluation was scheduled in the same year as an audit leading to pressures on program staff. To mitigate this in the future, EPMRB included audits from the 2012-13 Risk-Based Audit Plan. Where audits and evaluations are scheduled to occur in the same year, it was agreed with AASB that audit work will be conducted early in the fiscal year, so that programs are not overburdened with requests for information.

The addition of planned audits is also useful for the calibration of effort for evaluations. Audits generally reveal a considerable amount about the design and management of programs and give early indications on the achievement of program outcomes. Audits scheduled in advance of an evaluation can impact the scope and level of effort needed for an evaluation.

An important consideration in the design of this plan was the anticipated Deficit Reduction Action Plan (DRAP). The Government of Canada has committed to reducing expenditures by 10 percent during the 2012-13 fiscal year to support efforts to reduce the country’s deficit. Although details of DRAP have yet to emerge, it is against this backdrop that evaluation planning has taken place. To compensate for the uncertainty going forward, EPMRB focused largely on defining work for 2012-13 and will revisit the plan in the summer of 2012.

In this plan, 48 evaluations are scheduled between 2012-13 and 2016-17. The table below shows the distribution of evaluations for the five years covered by the plan.

Distribution of Evaluations by Strategic Outcome and by Year

Year | Government | People | Land and Economy | North | Office of the Federal Interlocutor | TOTAL
Carry over from 2011/12 | 0 | 4 | 0 | 3 | 0 | 7
2012/13 | 4 | 2 | 3 | 2 | 1 | 12
2013/14 | 2 | 2 | 5 | 2 | 0 | 11
2014/15 | 3 | 0 | 1 | 2 | 0 | 6
2015/16 | 3 | 1 | 3 | 2 | 1 | 10
2016/17 | 0 | 1 | 0 | 1 | 0 | 2
TOTAL | 12 | 10 | 12 | 12 | 2 | 48

There is a fairly even distribution of total evaluations by Strategic Outcome, with 10 to 12 evaluations per strategic outcome area over the five-year period (the Office of the Federal Interlocutor, with two, is the exception). The annual distribution by sector is also fairly even; however, in a few years no evaluations are scheduled for certain sectors, while in others a single sector is targeted for a relatively large number of evaluations, such as the Land and Economy Sector in 2013-14, which has five scheduled. These discrepancies will be addressed when the plan is revisited in June 2012.

Similar to the identification of evaluations, the schedule for the development of performance measurement strategies has been taken from the previous year’s plan. Typically, performance measurement strategies have been targeted to follow an evaluation. Interestingly, all performance measurement strategies in the previous plan, except one, were scheduled to be completed by 2012-13.

Currently, 24 of the 48 planned evaluations (50 percent) have a performance measurement strategy in place, and 14 have one scheduled to be developed in time for the evaluation. The remaining 10 evaluations (roughly 20 percent) are only partially covered, in that a performance measurement strategy exists for a single component (i.e. a Sub-sub-activity) of the evaluation, or have no performance measurement strategy planned at all. The strategy and timeline for developing new performance measurement strategies will also be revisited in June 2012.

Consultation with Senior Management

Once a draft plan had been generated, meetings were organized with all direct reports to the Deputy Minister to confirm the scoping and timing of proposed evaluations. The meetings with senior managers also provided an opportunity to ensure evaluations were aligned with their information needs (e.g. reporting commitments to Parliament, program reviews). As a result of these discussions, two evaluations were moved forward to 2012-13.

Calibration of Level of Effort

Section 6.2.1, subsection c, of the Standard on Evaluation for the Government of Canada gives departments the flexibility to calibrate the nature and depth of each evaluation undertaken in relation to the risks associated with the program and the information needs of the Deputy Head.

An analysis of the twelve evaluations to be started in 2012-13 was undertaken to determine the level of effort required for each and to better align available resources. Four considerations went into the calibration exercise:

  1. Materiality – A score of 3 was assigned to evaluations of programs with annual G&C spending over $500 million; a score of 2 for spending between $100 million and $500 million; and a score of 1 to evaluations covering less than $100 million.

  2. Risk – A score of 4 was assigned to evaluation units with a risk ranking of “very high”; 3 for “high”; 2 for “moderate”; and 1 for “low”.

  3. Complexity – The scores for complexity were guided by the number of delivery partners, coverage, governance structure, number of delivery mechanisms and number of objectives:

     • High (score = 3): Large number of external delivery partners or delivery through the regions; broad coverage; complex governance structure; multiple delivery structures for multiple objectives.

     • Medium (score = 2): Small number of delivery partners (i.e. regions); targeted to a few groups with slightly different characteristics; multiple layers of governance; one delivery structure.

     • Low (score = 1): Delivery out of Headquarters or a single entity; targeted to one group; simple governance structure; one objective.

  4. Performance Measurement – The availability of performance measurement data was ranked according to the number of years a performance measurement strategy had been in place. Green indicates a performance measurement strategy has been in place for three or more years; yellow signifies a new performance measurement strategy has been in place for one to two years; red means no performance measurement strategy exists.

The scoring permits a maximum score of 10. Results of the calibration exercise are as follows:

Planned Evaluation | Materiality | Risk | Complexity | Total | Performance Measurement
Impacts of Comprehensive Land Claims Agreements and Self-Government Agreements | 3 | 4 | 3 | 10 | Yellow
Negotiations and Implementation of Self-Government Agreements and Comprehensive Land Claim Agreements | 3 | 4 | 3 | 10 | Red
Specific Claims Action Plan (Summative Evaluation) | 3 | 3 | 3 | 9 | Red
Income Assistance, National Child Benefit Reinvestment and Assisted Living | 3 | 3 | 3 | 9 | Yellow
First Nations Water and Wastewater Infrastructure | 3 | 3 | 3 | 9 | Green
Federal Contaminated Sites Action Plan | 3 | 3 | 3 | 9 | Yellow
Enhanced Prevention Focused Approach for the First Nations Child and Family Services Program in Quebec and Prince Edward Island | 3 | 4 | 1 | 8 | Yellow
Activation of Community Assets | 1 | 4 | 3 | 8 | Red
Aboriginal Entrepreneurship | 2 | 3 | 2 | 7 | Red
Northern Nutrition Contribution (Implementation Evaluation) | 1 | 3 | 1 | 5 | Yellow
Federal Interlocutor's Contribution Program and Powley: Reconciliation and Management of Métis Aboriginal Rights | 1 | 1 | 2 | 4 | Red
Consultation and Engagement Initiatives | 1 | 2 | 1 | 4 | Red
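The calibration scheme described above can be sketched as a small scoring function. This is an illustrative sketch only; the function and variable names are invented, but the thresholds and rankings are those stated in the plan:

```python
def materiality_score(gc_spending_millions: float) -> int:
    """3 for annual G&C spending over $500M, 2 for $100M-$500M, 1 below $100M."""
    if gc_spending_millions > 500:
        return 3
    if gc_spending_millions >= 100:
        return 2
    return 1

# Risk rankings are taken from the Risk-Based Audit Plan;
# complexity is judged from delivery partners, coverage and governance.
RISK_SCORES = {"very high": 4, "high": 3, "moderate": 2, "low": 1}
COMPLEXITY_SCORES = {"high": 3, "medium": 2, "low": 1}

def calibration_total(gc_spending_millions: float, risk: str, complexity: str) -> int:
    """Total calibration score out of a maximum of 10 (3 + 4 + 3)."""
    return (materiality_score(gc_spending_millions)
            + RISK_SCORES[risk]
            + COMPLEXITY_SCORES[complexity])

# A program with over $500M in annual G&C spending, a "very high" risk
# ranking and high complexity receives the maximum score of 10.
print(calibration_total(600, "very high", "high"))  # 10
```

The availability of performance measurement data (the green/yellow/red ranking) is tracked separately from this numeric total and is used alongside it when assigning resources.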

The analysis above shows that of the 12 evaluations scheduled for 2012-13, seven are of high materiality, involving annual G&C spending over $500 million. Four received a risk ranking of "very high" and six were ranked "high". Seven are deemed complex. Two evaluations received the maximum score of 10, four scored 9 and two scored 8; only three received a score of 5 or below. EPMRB therefore has a complex, high-risk portfolio of evaluations for the first year of the plan. The calibration levels will be used to determine the level of effort and assignment of resources.

The performance measurement ranking reveals that six evaluations do not have a performance measurement strategy to work from. Only one evaluation has a strategy that has been in place for three or more years; the remaining five have new performance measurement strategies. The absence of a performance measurement strategy will affect the level of effort for an evaluation: methodologies must be introduced to address the data gaps, which typically requires greater effort and increased resources.






3. 2012-13 to 2016-17 AANDC Plan for Evaluation and Performance Measurement Strategies






Appendix A: AANDC Program Activity Architecture






Appendix B: AANDC Evaluation Universe