Revisiting Innovation Canada: A Call to Action (A Critical Analysis)

Introduction

The Innovation Canada: A Call to Action report, written by the Independent Panel on Federal Support to Research and Development (R&D), was released on October 17, 2011. The concept for the report originated in the 2010 federal budget, which announced a comprehensive review of federal support for R&D. The report has previously been criticized by organizations such as PricewaterhouseCoopers (2011) and The School of Public Policy at the University of Calgary (2012), among others; however, the majority of those responses focused on debating the final recommendations. This paper instead provides a critical analysis of the strengths and weaknesses of the research that underpins the recommendations.

Summary

The panel was tasked with answering the following questions: (1) What federal incentives are most effective in increasing business R&D and facilitating commercially relevant R&D partnerships? (2) Is the current mix and design of tax incentives and direct support for business R&D and business-focused R&D appropriate? (3) What, if any, gaps are evident in the current suite of programming, and what might be done to fill them? The answers to these questions were evaluated against eight principles, research was conducted via stakeholder meetings and a national survey, consolidated responses are provided, and six recommendations for improvement are put forward.

Analysis

Areas of Strength

A key tenet of evidence-based policy design is rigorously established objective evidence; however, to assess the efficiency and effectiveness of existing programs when designing (or redesigning) policy frameworks, one must first identify what is currently in use. Only once the existing programs have been identified can details regarding their design, implementation, and desired and actual outputs be evaluated. As such, the key strength of the report is its identification and consolidation of over 60 funding programs, permitting side-by-side comparison of costs and other analyses. This simple exercise allows the reader to observe the diversity and complexity of locating innovation funding programs within Canada. It further shows that several sector-specific programs exist but may not be fully understood or utilized. Finally, it allows the reader to examine the mix of direct and indirect funding currently available, and to see that a disproportionate amount (70%) is delivered via a single program, the Scientific Research & Experimental Development (SR&ED) tax credit (Expert Panel, 3-8).

Having collected and reviewed this statistical data, the panel could make comparisons with the direct and indirect incentives offered by other governments. This provides additional detail when examining the mix of policy tools that may affect business expenditure on R&D (BERD), such as comparing tax subsidy rates against the associated BERD intensity (Expert Panel, 6-4).

A potential issue when comparing Canadian programs to those of other governments is whether the definitions used for R&D are identical. The report acknowledges this limitation:

The Frascati Manual is accepted by consensus of all OECD countries, thus ensuring international agreement on the definition of R&D, as well as the application of guidelines for its measurement. There is nevertheless still some room for national differences of interpretation as well as variation in the depth and specificity of data collected by statistical agencies for indicators such as HERD and BERD. So while OECD data allow for meaningful international comparisons of R&D activity, perfect cross-national comparability remains an aspiration and not yet a reality. (Expert Panel, 2-5)

A considerable amount of innovation policy writing mixes terminology; in particular, innovation strategy is often confused with science and technology strategy. According to the report, business innovation transcends science and technology and R&D (Expert Panel, 8-3). While broadening the scope would be interesting, consolidating existing R&D program information and expenditures represents an important first step in determining the actual mix of tax incentives and direct support for business R&D and business-focused R&D.

Criticism

Given the significant number of programs reviewed (over 60), spanning multiple departments, and the highly politicized nature of gross domestic expenditure on R&D (GERD), the report data may contain minor errors. In the table on Total Envelope Expenditure (Expert Panel, 3-3), the Automotive Innovation Fund shows no expenditures between 2008 and 2011. This differs from the Industry Canada website, which indicates that the Government of Canada had designated $250 million to the program, of which a combined $134.8 million had been approved prior to the release of the report (Industry Canada, 2013). No footnotes or other details explain this discrepancy.

The report's discussion of program design focuses heavily on the SR&ED program because of the estimated $3.5 billion distributed annually through it (Expert Panel, E-9). The report could be improved by reviewing some of the other programs in more detail, citing examples of unbalanced programs. One example (although it may not have been apparent when the report was issued) is the limited number of recipients of the Automotive Innovation Fund (AIF). Of the awards made between 2008 and 2015, the recipients have been Ford (two awards / $151.6 million total), Linamar (two awards / $105.5 million total), and Toyota (three awards / $146.715 million total).

Insight into the administration costs of these programs would also have been beneficial, so that one might determine, for example, how much it costs to administer the AIF, a program that services only a handful of companies. The authors repeatedly note whether administration costs are included in or excluded from the reported figures, but the only specific reference to costs is in aggregate: “the Panel reviewed 60 programs and institutes totaling $5.14 billion (or $4.96 billion when federal program administration costs are removed)” (Expert Panel, 3-3). This implies that administering the programs costs approximately $180 million, yet no further details are given. The authors make only passing reference to the issue: “All programs generate costs of administration and compliance, which must be netted against the benefits” (Expert Panel, 3-2).

Finally, the major criticism of the evaluation is that the report falls into the common trap of focusing on the inputs (BERD and GERD) rather than the outputs of the funding. This is a larger issue, one that the authors mention in passing but do not address in the extended discussion or results.

Conclusion

This paper discussed some of the strengths and weaknesses of the Innovation Canada: A Call to Action report. An area for additional research would be annually updating the list of funding programs offered by the Canadian government, the funds distributed, and the associated administration costs. Combined with updated evaluation frameworks (a tentative one is suggested by the panel), this could be a powerful means of assessing one of the primary methods by which the Canadian government currently supports R&D.

References

Expert Panel. 2011. Innovation Canada: A Call to Action: Review of Federal Support to Research and Development. Publishing and Depository Services, Public Works and Government Services Canada, Ottawa.

PricewaterhouseCoopers. 2011. Narrowing Canada’s Innovation Gap: PwC’s Observations on the Jenkins Report.

Mintz, J., & Manning, P. 2012. Implications of the Recommendations of the Expert Panel on Federal Support to Research and Development. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2028124.
