Define, Design & Refine: Revisiting SR&ED as an Innovation Policy Tool


Inspiration, intention and incentives are three of the key tenets of the innovation-focused economics espoused by Atkinson & Ezell (2012). In Canada, we already have an excellent channel for delivering incentives to upwards of 24,000 corporations annually: the Scientific Research and Experimental Development (SR&ED) tax credit, the single largest source of federal government support for research and development in Canada. By (re)defining and (re)designing the program, and then refining it through a review of its outputs, the current program could be improved.


In 2015, the retirement of the “Industry” ministry title in favour of a new “Innovation, Science and Economic Development” title implied that the new government seeks to encourage those three concepts, ideally concurrently; however, these concepts are often conflated. The OECD Oslo Manual (2005) defines four types of innovation: product, process, marketing and organizational. By contrast, the OECD Frascati Manual (2002) defines R&D as “creative work undertaken on a systematic basis in order to increase the stock of knowledge, including knowledge of man, culture and society, and the use of this stock of knowledge to devise new applications.” It goes into further detail regarding basic research, applied research, and experimental development:

Basic research is experimental or theoretical work undertaken primarily to acquire new knowledge of the underlying foundation of phenomena and observable facts, without any particular application or use in view.

Applied research is also original investigation undertaken in order to acquire new knowledge. It is, however, directed primarily towards a specific practical aim or objective.

Experimental development is systematic work, drawing on existing knowledge gained from research and/or practical experience, which is directed to producing new materials, products or devices, to installing new processes, systems and services, or to improving substantially those already produced or installed. R&D covers both formal R&D in R&D units and informal or occasional R&D in other units. (OECD, 2002)

There is evidence of some overlap in the definitions. Product and process innovation may include applied research or experimental development; however, there are points of divergence regarding marketing and organizational innovation. The SR&ED program further compounds the issue by introducing a slightly different definition:

‘scientific research and experimental development’ means systematic investigation or search that is carried out in a field of science or technology by means of experiment or analysis and that is

(a) basic research, namely, work undertaken for the advancement of scientific knowledge without a specific practical application in view,

(b) applied research, namely, work undertaken for the advancement of scientific knowledge with a specific practical application in view, or

(c) experimental development, namely, work undertaken for the purpose of achieving technological advancement for the purpose of creating new, or improving existing, materials, devices, products or processes, including incremental improvements thereto (CRA, 2012)

As detailed above, the SR&ED definitions of basic and applied research closely follow those of the Frascati Manual, but the two diverge significantly on experimental development. The focus of the SR&ED program is on advancing knowledge within a field of science or technology. This limited view of R&D is by no means a new issue; the boundaries of experimental R&D and how it relates to innovation have been debated for many years in Canada. As Doern and Levesque (2002) observe:

Issues concerning how to define experimental R&D were also similar in many ways to IRAP and the NRC’s continuing practical concerns about how to define innovation, in that both dealt with the existence of uncertainty as opposed to routine technical advance.

The boundary between innovation and experimental R&D appears to remain a live issue with taxpayers, as these definitions are specifically addressed in the recent Eligibility of Work for SR&ED Investment Tax Credits policy (CRA, 2012):

The creation of new, or improvement of existing, materials, devices, products, or processes can be achieved without technological advancement. Also, novelty, innovation, uniqueness, feature enhancement, or increased functionality alone does not represent or establish technological advancement. Instead, it is how these attributes or features arise (that is, whether or not they arise through technological advancement) that is important.

The above lack of clarity regarding definitions has spillover effects in two areas. First, Hawkins (2012) has warned that Canadian innovation policies are inadequate, as they do not reflect current knowledge of the innovation process and its relationship to economic development. Second, the unreliability of the SR&ED program, due to uncertainties associated with qualification and timing, is sometimes so great that the program is excluded from R&D investment decisions in corporate environments (Jenkins, 2011).

While there will always be varying definitions, establishing clear definitions based on current knowledge of innovation and R&D will allow the new government to build a more effective policy portfolio. Clearly defining the boundaries between these concepts, drawing on updated literature such as Hawkins et al.’s (2013) Canada’s Future as an Innovative Society: A Decalogue of Policy Criteria, will enable the government to determine the objectives of the program and to make explicit the relationship between national strategy and the associated policy tools.

Design and Refine

The current SR&ED program provides over $4 billion annually to more than 24,000 corporations, a significant increase from the $1 billion to just under 6,500 corporations in 1990 (Doern, 1995). Once a clear definition of this policy tool in relation to national strategy has been established, it is likely worth reviewing the program for effectiveness, oversight, and other areas for improvement.

The Treasury Board of Canada Secretariat (2009) defines “effectiveness” in the federal framework for program evaluation as “the extent to which a program is achieving expected outcomes.” Currently, the SR&ED program is delivered through the tax system, which means that it is not subject to the same results-based accountability framework as direct spending programs (Jenkins, 2011). Instead, multiple economic assessments of the program from a net-benefit perspective have been conducted; however, these are known to have significant limitations and margins of error.

In 2014, the CRA responded to allegations of program abuse by external consultants by adding a mandatory section to the existing application form, making explicit the anticipated payment to individuals involved in claim preparation. The addition of this section implies that further changes could be made in order to track some of the outputs of the investment in the SR&ED program. But what could be tracked? What should be tracked? Bruland and Mowery (2005) suggest that patents are a means of tracking innovation through time; however, there are known limitations with this method. Other output measures include articles published in peer-reviewed journals, but these metrics may only be applicable to basic and applied research. The question of effectiveness is inextricably linked to the desired outcomes; only once those have been established can research begin into the best means of measuring them in a quantifiable manner.

If the objective and outcomes of the program are explicit, who should administer it? Hagel and Singer (1999) originated the concept of the “unbundled corporation”, arguing that companies should separate departments that have different economic, competitive, and cultural imperatives. The same logic applies to the organization currently overseeing the delivery of the SR&ED program. The objective of the Canada Revenue Agency is to collect taxes, not to provide R&D funding. This point is reiterated by Doern (1995), who identified that these conflicting roles at the CRA “can and do create tensions as to how hard to push the tax collection versus program delivery levers” and that the subsequent internal micro-policy development may be at odds with the original intent of the program. Borrowing from the guidelines outlined by Bardach (2011), “making use of what looks like good ideas from someplace else”, there are a few examples from which solutions may be sourced. The most relevant is the administration of the Ontario Interactive Digital Media Tax Credit via the Ontario Media Development Corporation (OMDC): companies apply to the OMDC for approval, receive a Certificate of Eligibility, and submit it to the CRA, which processes the refund. While there are limitations to this setup (including longer wait times than SR&ED), it provides a proven example of a viable Canadian alternative. Removing the administration of the program from the Canada Revenue Agency, as suggested by Doern (1995), Muller (2009), Jenkins (2011) and CATA (2015), is possible and should be considered.

Finally, the Canadian R&D funding portfolio is heavily weighted towards indirect funding (via SR&ED). There are benefits to this approach, as indirect funding lets market demand determine where R&D is performed. By contrast, direct funding allows R&D support to be focused on specific areas, projects, industries, and/or regions; the tradeoff is that it often incurs higher administration costs, due to its selection and evaluation processes, as well as higher compliance costs for recipients (State of the Nation, 2012). Could SR&ED be hybridized in order to benefit from the large application volume (over 24,000 annually), encourage spending in areas that corporations would normally consider unprofitable, and keep the cost of program administration low? If the government can work towards measuring outcomes via a few additional checkboxes, redirecting some funding towards specific initiatives (e.g., clean technology) within the indirect funding program would be possible. The key, of course, will be continuing to reduce the complexity of the program as it evolves and monitoring whether it meets key objectives.


The new Canadian government has a unique opportunity to update an older policy tool to reflect emerging knowledge regarding science, technology, and innovation policy in Canada. By clarifying definitions and roles, defining the outputs, and revisiting the organizational culture, it may be possible to meet the ambitious goals of the new Ministry of Innovation, Science and Economic Development.


Atkinson, R. D., & Ezell, S. J. 2012. Innovation economics: the race for global advantage. Yale University Press.

Bardach, E. 2011. Practical guide for policy analysis: the eightfold path to more effective problem solving. Sage.

Bruland, K., & Mowery, D. C. 2005. Innovation Through Time. In J. Fagerberg, D. C. Mowery, & R. R. Nelson (Eds.), The Oxford Handbook of Innovation. Oxford University Press.

CRA. 2014, January 30. SR&ED T661 Claim Form – Revised optional filing measure for Part 9. Canada Revenue Agency.

CRA. 2012, October 9. Eligibility of work for SR&ED investment tax credits. Canada Revenue Agency.

Doern, G. B., & Levesque, R. 2002. The National Research Council in the Innovation Policy Era: Changing Hierarchies, Networks and Markets. University of Toronto Press.

Doern, G. B. 1995. Institutional Aspects Of R&D Tax Incentives: The SR&ED Tax Credit. no. 6, Industry Canada.

Hagel, J., 3rd, & Singer, M. 1999. Unbundling the corporation. Harvard Business Review, 77(2): 133–41, 188.

Hawkins, R., Gault, F., Dufour, P., Geelen, J., & Saner, M. 2013. Canada’s Future as an Innovative Society A Decalogue of Policy Criteria. Institute for Science, Society and Policy, University of Ottawa.

Jenkins, T., Gupta, A., Naylor, D., Robinson, N., Leroux, M., et al. 2011. Innovation Canada: A Call to Action. Industry Canada Expert Panel.

Mintz, J., & Manning, P. 2012. Implications of the Recommendations of the Expert Panel on Federal Support to Research and Development.

OECD. 2002. Frascati Manual: Proposed Standard Practice for Surveys on Research and Experimental Development. OECD.

OECD & Eurostat. 2005. Oslo Manual: Guidelines for Collecting and Interpreting Innovation Data (3rd ed.). OECD.




About the author: elizabeth
