:
I call the meeting to order.
Good morning, everybody. Welcome to meeting number two of the Standing Committee on Science and Research.
Pursuant to the motion of the committee on June 18, 2025, the committee is meeting to study the impact that the criteria for awarding federal funding have on research excellence here in Canada.
Today's meeting is taking place in a hybrid format. Pursuant to the Standing Orders, members are attending in person, in the room, and remotely by using the Zoom application. I think all of the members are in person right now for this one.
Before we continue, I would like to ask all in-person participants to consult the guidelines written on the cards on the table. These measures are in place to help prevent audio and feedback incidents and to protect the health and safety of all of the participants, including the interpreters.
You will also notice a QR code on the card. It links to a short awareness video.
I would like to make a few comments for the benefit of witnesses and members.
Please wait until I recognize you by name before speaking. To those participating by video conference, click on the microphone icon to activate your mic, and please mute yourself when you are not speaking. Those of you on Zoom can select the appropriate channel for “floor”, “English” or “French” at the bottom of your screen. Those in the room can use the earpiece and select the desired channel.
I will remind you that all comments should be addressed through the chair. Members in the room, if you wish to speak, please raise your hand. Members on Zoom, please use the “raise hand” function. The clerk and I will manage the speaking order as best as we can, and we appreciate your patience and understanding in this regard.
I would like to welcome our three witnesses for this panel. We are joined virtually by Gita Ljubicic, professor at McMaster University. We are joined in person by Steven Pinker, Johnstone family professor of psychology, Harvard University. The third witness for today is Azim Shariff, professor at the University of British Columbia. He has joined us via video conference.
Welcome, and thanks a lot for coming.
With that, we will go to the witnesses.
The first one will be Ms. Gita Ljubicic. You will have five minutes for your opening remarks. Please go ahead.
:
Thank you for the opportunity to appear before this committee on science and research today. It's an honour to share my experiences on how federal funding criteria impact research excellence in Canada.
Public investment in research is vital for advancing knowledge, solving complex problems and training future researchers. However, evaluating proposals is challenging. Reviewers aim to fund research that positively impacts Canadians through innovation, evidence and creativity, and in ways that improve our understanding of the world, quality of life and equity.
Conventional measures of excellence often focus on quantitative indicators like the number of grants, publications, awards and scholarships; the number of students graduated; an individual's track record; and proposal strength. While these do reflect academic productivity, they don't always capture real-world impacts, such as informing policy and community decisions, improving health and education practices and outcomes, supporting economic growth, advancing reconciliation, and promoting environmental sustainability and social equity. Researchers highlight these impacts in applications, but measuring them remains difficult, and this creates challenges for rigorous, fair and consistent approaches to evaluation.
My name is Gita Ljubicic. I'm a professor in the school of earth, environment and society at McMaster University, and I lead the StraightUpNorth, or SUN, research team. I'm a geographer, trained in both natural and social sciences, working at the intersection of cultural and environmental geography. My research is rooted in respectful collaboration with indigenous knowledge holders to address complex social and ecological issues. For over 25 years I've worked primarily with Inuit communities in Nunavut, and, through students and collaborators, I've been involved in projects across Inuit Nunangat—which are Inuit homelands in the Canadian Arctic—and with first nations and Métis communities in Yukon and Northwest Territories. Our SUN team aims to ensure that research benefits our community partners, informs decision-making, improves research practice and supports emerging northern researchers.
My recommendations here today reflect personal experience in community-engaged and interdisciplinary research. Federal funding criteria must include qualitative indicators that rigorously and fairly assess research excellence. I have experience working with NSERC, SSHRC, CIHR—receiving those funds as well as reviewing applications—and interdisciplinary initiatives through the tri-council, Environment and Climate Change Canada, and Crown-Indigenous Relations and Northern Affairs Canada.
Funding policies have evolved to better support interdisciplinary research, EDI initiatives, indigenous leadership partnerships, early career researchers, mentorship, and knowledge mobilization. However, alongside these important policy changes, targeted funding opportunities and new requirements in proposals, the conventional quantitative and academically focused metrics of excellence need to be re-envisioned.
There are six ways that I propose this can be achieved, and I would be happy to expand on any of these today or in follow-up written testimony. My suggestions are to ensure the representation of reviewers with direct cultural or community-specific experience in funding evaluation committees; to ensure the representation of early career researchers as reviewers for early career research-specific funding pools; to consider the amount of time dedicated to community-engaged and partnership research when assessing the rationale, methodology, budget and claims of significance in a proposal; to extend consideration of training and mentorship contributions beyond academic, highly qualified personnel; to assess partnerships according to their diversity of roles, strengths of relationships and evidence of collective planning and implementation; and to recognize that knowledge mobilization goes beyond academic audiences and public outreach.
In the few minutes I've had today, I've offered these six specific recommendations to refine how federal research funding is assessed and allocated.
:
Madam Zahid, Monsieur Blanchette-Joncas, and members of the Standing Committee on Science and Research, as a proud Canadian and graduate of Dawson College and McGill University, it's a tremendous honour to speak to you today about diversity in science.
Starting in the late 1970s, the concept of diversity became popular in the United States after the Supreme Court ruled that explicit racial quotas in university admissions were a form of unconstitutional discrimination, but that it was acceptable for schools to favour minority students if the goal was to enhance the educational experience of all students by having a diverse student body. Over time, the laudable goal of diversity morphed into policies that increasingly used race and sex as criteria in admissions, hiring and funding. That was the “D” in DEI, or EDI.
More recently, the term “viewpoint diversity” became popular as an ironic response to racial and gender diversity. The joke went that in a university, “diversity” means people who look different and think alike; viewpoint diversity, in contrast, is the form of diversity that really matters in scientific and intellectual life. It is simply not true—indeed, one might say it is a form of prejudice—to assume that all women or all members of a racial or ethnic minority think in a particular way.
A diversity of viewpoints, though, is necessary to do science properly. This is not because diversity is aesthetically pleasing; it's because people are not omniscient or infallible. As a cognitive scientist, I can attest that the human mind is vulnerable to many biases and fallacies. The strongest is the “myside” bias, the conviction that my own tribe, coalition or party is correct and that a rival coalition is ignorant or evil or both. People are poor at spotting their own biases. As the economist Joan Robinson put it, “Ideology is like breath. You never smell your own.”
The reason that science can proceed despite these blind spots is that we're much better at spotting someone else's biases. In a community in which people with different viewpoints can criticize those they disagree with without fear of punishment, censorship or cancellation, one person can point out another's errors, and the whole community can be more rational than any of the individuals in it.
In contrast, there are several reasons to fear that diversity, in the DEI sense of allocating funding to scientists based on their race or sex, works against the interests of science and the nation.
First, it can be inherently unfair. Funding is a zero-sum game. If people of one sex or skin colour are given an advantage, then others of a different sex or skin colour are being put at a disadvantage. This was the reason that my own institution lost another famous Supreme Court case, Students for Fair Admissions, Inc. v. President and Fellows of Harvard College, in 2023. The court ruled that in favouring Hispanics and African Americans in admissions, Harvard was unconstitutionally discriminating against Asian Americans.
Second, it can be a waste of taxpayer money if grant dollars don't go to the scientific research that is judged to be of the highest quality and priority. Of course, reviewers of grant proposals are themselves subject to biases, including racism and sexism, but this means that the biases themselves should be minimized through blind review, audits and the most objective measures of quality and influence we can find.
Third, while it's laudable to attract the widest range of talent in science and to overcome past barriers to inclusion, the awarding of grants takes place at the end of the science training pipeline, far too late in a person's life to rectify social and historical inequities. Obsessing over statistical differences in the awarding of research grants draws attention away from formative influences that create inequities in the first place, including education from the preschool years through university as well as social and cultural norms that make science attractive as a career.
Finally, the promotion of diversity in gender and ethnicity at the same time that diversity in opinion is constricted by censorship, cancellation or intellectual monocultures undermines public trust in science. I often mention to audiences or interviewers that the massive scientific consensus is that human activity is warming the planet. Many times a listener has replied, “But why should we trust the consensus if it comes from a clique that does not favour the best science and that punishes anyone who disagrees with the orthodoxy?”
Recent events in the United States—with which, I'm guessing, you're familiar—illustrate the dangers that can result when politicians and the public lose trust in science.
:
Thank you for the opportunity to speak before this committee.
My name is Azim Shariff, and I'm a professor of psychology at the University of British Columbia. I was born and educated in Canada—first at the University of Toronto and then, for my doctorate, at UBC—and I later held faculty positions in the U.S. before being invited back home under the Canada 150 research chairs program. In light of this committee's study, my most useful contribution today will be to share my observations about how well-intentioned policies surrounding the Canada research chair program have played out in practice.
As you all know, the CRC program serves as one of Canada's primary tools for attracting and retaining highly impactful researchers. To fulfill its mandate to support research excellence, the program has, over its 25-year tenure, adjusted its policies with regard to equity, diversity and inclusion. There are many rationales for why academia should prioritize these values: A faculty that is more representative of the Canadian population earns trust and legitimacy with the community; it is also more tuned to the full spectrum of questions that Canadians care about. Chief among the reasons, from a public interest standpoint, is that removing barriers to access means that nothing prevents the most talented scholars from transmuting their talent into the products of research that benefit us all.
To achieve this goal, the CRC program set 2029 equity targets for groups that were severely under-represented at the program's outset: women and gender equity-seeking groups; racialized individuals, like me; persons with disabilities; and indigenous peoples. The targets have largely been reached nationally for all groups.
That said, not all targets for all groups have been reached at all institutions. As per the 2019 policy adjustment, so long as an institution trails behind its targets on any one group, it is restricted from submitting new chair holder nominations for individuals outside of any of these groups.
There are two concerns with this policy in terms of how it operates on the ground.
First, aggregating the equity groups in this way serves as a blunt and sometimes ineffective way of addressing barriers. The pool of scholars who are racialized individuals or are from women and gender equity-seeking groups is much larger and is therefore much easier to hire from than the pool of indigenous peoples or persons with disabilities. As a consequence, the policy incentivizes some institutions—like mine—to swell their ranks of women and racialized individuals well beyond their targets while continuing to trail the targets for the latter two groups.
The second concern is the impact of the restriction itself on the public interest. As I noted earlier, any barrier to equal access impoverishes everyone because it fails to position the most talented individuals into the roles where their talent can do the most good, yet with the restrictive policy, the CRC program employs exactly this kind of barrier—closing doors rather than opening them.
Here is a case study of how this plays out. Several years ago, my department sought to fill a tier one CRC vacancy. We were replacing the retiring director of a highly productive global excellence research cluster on language sciences. Since this needed to be a senior scholar with a particular expertise, the pool of candidates was already small. Since it was a CRC hire, the pool was further narrowed to members of the four equity-seeking groups, excluding many of the most relevant and impactful scholars. This left very few qualified candidates, and indeed only one was both above our thresholds and open to moving from her institution in the U.S. Unable to meet her requirements and without any backup options, the search failed, the CRC was revoked, and the future of the institute and the research cluster is now in jeopardy.
Equity and social justice are important goals of the CRC program. However, by explicitly excluding a body of scholars, this restrictive policy creates an unnecessary conflict. It sets those aims against the program's broader goal of improving our depth of knowledge and quality of life for all Canadians, leaving talent on the table.
This is especially pressing right now. We're currently seeing the academic environment in the United States undermined by attacks on academic freedom and by devastating cuts to research funding. America is the global centre of science and research. The whole world will lose out from the disruption to knowledge creation that they will now experience. Canada is best positioned to pick up that slack. For high-impact scholars choosing to leave the U.S., the most attractive alternatives are to come to the University of Toronto, Waterloo or UBC.
The world needs these people to remain productive. I would encourage Canada to reconsider the trade-offs involved in keeping one hand of its CRC program tied behind its back. We should refine our policies accordingly. Science and scholarship work best when everyone is invited to participate.
:
Thank you, Madam Chair.
It's good to see all of my colleagues again. I look forward to working with all of you on this committee as we proceed. Thank you to the witnesses for being here as well.
It's an important study; this past July, it was announced that $1.3 billion had been awarded in federal research funding. This study, which we're picking up, builds upon the work of the committee from the previous Parliament and seeks to examine and receive input and feedback on the various criteria used in awarding these federal funds.
Ms. Ljubicic, you talked about the use of quantitative criteria and how that may be harmful. We've heard previous testimony from colleges that say they're precluded from some of this research funding, for example. We've also heard about the issue of DEI and its use in criteria, and how that may impact science as well.
Mr. Pinker, I'd like to thank you for your comments. You talked about how DEI works against the interests of science.
I was looking back at some of the previous testimony. Going back to November 2024, we had Dr. Jeremy Kerr, a professor in the department of biology at the University of Ottawa. When he was asked by one of the committee members, “How important are diversity and inclusion in research when producing reliable and accurate data?”, he replied, “I want to be really clear here. As I said, our objective is not to implement an affirmative action program; our objective is to achieve excellence, on behalf of Canadians....”
That's not to say that a diversity of views or diverse backgrounds are not important. Can you pick up on what you said in some of your comments and that notion of how DEI works against the interests of science?
:
Thank you for that. To your point, it also leads to faulty science.
I was reading an article by Geoff Horsman, who's an associate professor of chemistry and biochemistry at Wilfrid Laurier University. When he was talking to a colleague, this colleague basically said to him, “I have made my peace with EDI. I will lie about my most deeply held beliefs or convictions on paper in order to get funding.” They're basically saying that if you believe in merit and competency, shut up and just lie on your application to get the funding. That doesn't advance science.
What we have now is individuals being put in a position where they know that unless they tick off a box, they're not going to get their program funded.
I was wondering if you could elaborate on that.
:
Yes, well, many American universities require so-called “diversity statements” in which an applicant for a professorship has to basically endorse the policies of DEI, including racial preferences, and has to endorse the critical social justice theory as to why there are racial disparities.
I've had students who've had ChatGPT write their diversity statements because they could not honestly fill them out. It would go against their conscience to say things that they knew were not true, but they knew they would be blackballed and eliminated from a job if they expressed their true opinions. That's one of the reasons that many universities—now including my own, Harvard—have got rid of diversity statements.
Also, I think it is a peculiar version of social justice that says that the composition of a scientific body, a university body or a pool of funded scientists has to match the demographics of the population at large. It leads to, I think, rather monstrous consequences, like saying that there are too many Asians on a committee, or that too many Asians are getting funded, or too many Jews, or too many Sikhs or too many Arabs. It is just not going to be the case that every ethnic group or every sex is going to be perfectly represented in proportion to their membership in the population. If we are truly seeking quality, that should not matter. We don't have to count. There may be discrepancies, and they can go in different directions, but if we're funding the best science, we get the best science.
:
Thank you so much, Chair.
Thank you to our witnesses for being here.
I'm going to pick up, Professor Shariff, where you left off.
It's good to see you again. It's been a long time—probably 30 years or maybe longer. It's great to see you.
I want to pick up where you left off in terms of talking about the importance of making sure that we are able to attract and keep our best and brightest minds, regardless of some of these criteria. Some of these criteria may be important, but we should not over-index on them in making sure that we have the best folks around the table.
What, in your view, is the best way for Canada to approach poaching talent—I'm going to say it bluntly—from the U.S., where folks are feeling uncomfortable right now about the threats to academia and there is this pervasive attitude that you have to think a certain way or else your funding is going to be cut? What do we need to make sure that we aren't falling into the wrong traps on either side of this conversation, to make sure that we're attracting the best talent—without leaning in on this perceived attack on “woke ideology”, which I want to get to, whatever the hell that means—in a way that gets us the best talent here and allows us to do the best types of research while also building an inclusive environment for academics?
:
One of the areas of my research is looking at institutional trust and perceived politicization. One of the challenges that we've discovered is that once people perceive an institution to be politicized, it has a negative impact on trust, not just for the people who perceive the institution to be on the opposite side from their politics but also for the people who perceive the institution to be on the same side as their politics. Scientists, the consumers of science and the consumers of scholarship do not want their institutions to be politicized.
Canada, unfortunately, has a reputation of having a somewhat politicized academy. Dr. Pinker talked about the impact that the perceptions of politicization are now having in the U.S.; Canada has an opportunity here to try to be a safe haven for a more objective, less politicized academy. People who are trying to flee a politicized and undermined academic environment in the U.S. could hopefully find a more flexible, free funding climate in Canada, as well as an academy that tries to lower the temperature on politicization.
Politicization in science is like bacteria in an operating room. There's no way you'll be able to get rid of it entirely, but you do want to do as much as you can to remove it. I don't think you should trust any surgeon who's not trying to do that.
I'm going to follow up with one more question for you, and then I'm going to throw the same question over to Professor Pinker.
In watching President Trump's attacks on my two alma maters, Princeton and Harvard, and the threats of cuts, what we've seen are cuts to funding for cancer research, diabetes, new ways of farming, preschool development and teacher quality. These are all things that have been affected by this attack on what is being termed “woke ideology”. We've heard this term “woke ideology” being used by the in this country; he says he wants to cut “woke ideology” from Canadian universities.
When you hear terms like that and the types of attacks on universities that are being made under that guise and that cover, does it concern you that Canada might go down a similar road in terms of using that as a cover to attack academic freedom, academic research and academic intellectual expansion?
Professor Pinker, I'm going to throw the same question over to you. I just want to say that I really enjoyed your piece “Harvard Derangement Syndrome”, because I think it actually brought to light some of the concerns that folks have about when the attacks become blanket attacks. You used the example of the impact on Jewish professors as a perfect example of how, when you're using, perhaps, one angle, there is a broader impact on research, on science and on folks whose funding is getting cut.
Can you talk a little bit about how we dial down that type of rhetoric and why it's important to dial down that type of rhetoric in Canada so that we don't fall into the same trap? Can you then also follow up on the question I asked Professor Shariff?
:
Thank you, Madam Chair.
I'd like to thank the witnesses who are here today to take part in this important study.
My first questions are for Professor Pinker.
Don't equity, diversity and inclusion policies risk replacing merit with political considerations, thereby undermining public trust in science?
If science is perceived as ideological, doesn't that also risk undermining public trust, even when it comes to issues like climate?
:
Thank you very much, Madam Chair.
It's an honour to be a part of this committee, as I've been a student of science throughout my life. After doing my master's in civil engineering at Toronto Metropolitan University, I applied for my Ph.D. as well, but I couldn't make it due to some time constraints.
My question is for Professor Gita Ljubicic. She's a professor at McMaster University, which is next to my riding.
I'm pleased to see the work you have been doing in the faculty of science. I'm curious to know whether you, based on your experience with the braiding project, have faced more or fewer issues regarding the funding for research around indigenous knowledge.
:
Thank you for the question.
As I mentioned, there have been a lot of changes over the past decade or so in tri-council policies that increasingly recognize the value of indigenous knowledge, encourage indigenous leadership in research and encourage partnerships. There's actually more and more funding available for indigenous scholars and partnerships with indigenous communities. The challenge that I was trying to highlight, though, is how to effectively assess whether those partnerships are respectful, whether they are upholding indigenous leadership and whether they enable indigenous scholars to access those funds.
A lot of the comments that were discussed around EDI are also really important in the context of indigenous and other community-engaged research in terms of how it's evaluated, so that researchers are not just ticking boxes to be able to apply for these particular sources of funding but are actually following through on what they say, and you can actually track partnerships and respectful, culturally appropriate methodologies in how they write their methods, in who their team is and in how they allocate their budgets. This is a big factor.
Yes, I think there's more support for and recognition of indigenous research, but that's where I think some of the qualitative assessments are really important, to differentiate between those who get really good at writing proposals in a certain way and those who are actually implementing meaningful, respectful approaches to research.
Thank you to all the panellists for being here today.
My question is for you, Mr. Pinker. You mentioned that, with the system as it is today, if anyone goes against the current orthodoxy, it creates a loss of trust in science, and I think this is the most detrimental effect of what's happening today in the DEI space.
I'm going to echo very quickly an article that came out: “a fellow academic scientist...said, 'I have made my peace with EDI. I will lie about my most deeply held beliefs or convictions on paper in order to get funding.'”
How would you assess the current merit-based criteria for federal funding in Canada? How will that trust be eroded in time, and how quickly, especially in innovation hubs like Kitchener Centre, where I'm from?
Earlier, a member opposite mentioned world views and the importance of incorporating world views. I am a western scientist. That's how I've been trained. That is the world view I use, but I also know that if I look at the world only in that context or if we use only that context, it is incomplete. The example I want to use with regard to that is indigenous knowledge. In the environmental sciences, where I come from, we know that the first nations of this land already knew that there was an ice age in the past. They already knew that the Great Lakes were in a different location. There is a tremendous amount of knowledge that is there, especially around sustainability, so it's important that we work together.
My question is for our first speaker, Dr. Ljubicic.
Could you just speak to the importance that you see of indigenous knowledge and of collaboration and partnerships so that we can further the field of science?
I would like to make a few comments for the benefit of the new witnesses. Please wait until I recognize you by name before speaking. Those participating by video conference can click on the microphone icon to activate their mic. Please mute yourself when you are not speaking. Those on Zoom can select the appropriate channel for interpretation at the bottom of the screen: floor, English or French. Those in the room can use the earpiece and select the desired channel. I will remind you that all comments should be addressed through the chair.
For this panel, I would like to welcome Dr. Kelly Cobey, scientist at the University of Ottawa Heart Institute. We are also joined, via video conference, by Dr. Grace Karram, assistant professor of higher education and coordinator of the higher education graduate program at the University of Toronto. Our third witness for this panel is Mr. Vincent Larivière, professor, Université de Montréal.
Welcome to all the witnesses.
Each of you will have five minutes for your opening remarks, and then we will proceed to the round of questioning. We will start with Dr. Cobey.
Dr. Cobey, please go ahead. You have five minutes. Thank you.
:
Thank you, Madam Chair and members of the committee, for the invitation to discuss the impact of federal funding criteria on research excellence in Canada.
I am a scientist at the University of Ottawa Heart Institute and an associate professor at the University of Ottawa. I also co-chair an international initiative called DORA, the Declaration on Research Assessment. DORA operates globally and across all disciplines. Our recommendations at DORA apply to funding agencies, academic institutions, journals, metrics providers and individual researchers. DORA advocates broader assessment criteria to acknowledge the diversity of researcher activities.
Our meeting today comes at a time when the criteria to assess researchers in this country are shifting. Historically, decisions were based on quantitative metrics, such as the number of articles we published, the journal impact factor of where those publications sat and the amount of funding that we brought in. Quantitative metrics are easy to calculate, which makes them convenient for assessing a lot of people very quickly. Unfortunately, they're not evidence-based, they're not responsive to changes in the research ecosystem and they can't be used for any mission-driven goals of the federal government.
The misuse of the journal impact factor, as well as the overemphasis on quantitative metrics, has created a culture in the research ecosystem of “publish or perish”. As researchers, we often feel that the surest or only pathway to success in our domain is through publishing more and doing more, with less emphasis on quality and more on quantity.
However, presently in Canada, we're seeing a principled shift away from these quantitative metrics and toward consideration of qualitative metrics that consider a broader impact of research. Canada's tri-agencies signed DORA in 2019 and have been working to implement its recommendations since then. This process is an evolution, not a revolution. In my view, Canada is becoming active on the global science policy stage with respect to the criteria to assess researchers. The tri-agencies are actively involved in DORA's community of practice for funders, they have a leadership role in the Global Research Council's research assessment committee and, through SSHRC, they have joined RORI, the Research on Research Institute.
Concretely, as researchers, we see recent changes that have had a widespread and immediate impact on us. For example, CIHR has an entirely new research excellence framework that now considers research excellence across eight domains, one of which is open science. The tri-agencies as a collective are implementing a new narrative CV, which sounds exactly like what it is: a descriptive report on what a researcher is doing, how they did it and why it had an impact. This is replacing the traditional CV, which was much more a list of outputs than a qualitative, nuanced assessment.
This new format requires researchers and reviewers alike to be trained in how to create these narrative CVs as well as how to appropriately adjudicate them. Otherwise, there's the concern that old habits and legacy quantitative metrics are going to persist in the written narrative form. Narrative CVs are part of the solution to assessing research appropriately; however, I'm concerned about how these reforms are being implemented in our country and about the gap between the strong science policy we're creating and the actual realities of what's happening at committees. We need to ensure effective monitoring and implementation as we roll out these changes.
I have three final short points.
First, how the federal government chooses to assess research excellence directly impacts what research is done, how it is done and who does it.
Second, the tri-agencies' new definitions of research excellence do not always come to be considered in practice in how research is evaluated by committees. This again comes back to repeated implementation gaps between what we say we want to do and what actually happens.
Finally, even if we assume that the criteria used to assess excellence in this country, historically or presently, were appropriate, there are a series of issues with how funding is administered in this country that prevent us from achieving that excellence in an efficient way. One example is the across-the-board funding cut for funded research projects.
There's also, in my view, incredibly limited grant monitoring. Once we get funds based on the promises of what we wrote in our grant, there's very little monitoring to see that, as researchers and as a federal government, we're providing returns on that investment.
Thank you.
:
Thank you very much, honourable members.
[Translation]
Thank you for the opportunity to talk about this important topic.
[English]
When we compare Canada's position in scientific research with the positions of other members of the OECD, several paradoxes come to light. I will present these paradoxes as a way of clarifying Canada's research and development sector and those who work within it.
Specifically, I'll examine the role of post-secondary institutions, the impact of international research collaborations, the role of the business sector and labour market inefficiencies that have led to an underutilization of our Ph.D.s. I'm going to conclude with several recommendations to help strengthen Canada's research production.
How does Canada compare globally? Well, Canada's gross expenditure on research and development, as a percentage of GDP, is notably below the OECD average, and it has declined steadily since 2001. The paradox, of course, is that higher education expenditure on research and development has increased by 30% over those same 20 years, so Canadian post-secondary institutions and the researchers they house play a significant role in the country's research and development.
The second paradox is that while our number of publications per researcher places us seventh in the world—and that's great—in our production of patents, we're actually 18th from the bottom. This is likely because of fairly low levels of R and D in the business sector. Even though industry tends to fund some R and D in post-secondary institutions, the ties are relatively loose.
The third paradox relates to international collaboration and a significant gender divide. Studies have repeatedly confirmed that international collaboration is correlated with an increase in research production, often measured by publications, however limited a measure that is. However, in Canada, a statistically significant divide exists between men and women researchers: Men have significantly more international collaborations, and thus more high-impact research outputs.
The final paradox relates to labour and personnel. Although Canada has increased the number of individuals graduating with doctoral degrees, the number of tenure-track positions has plateaued. This has led to highly skilled researchers being employed in part-time, precarious positions mainly focused on teaching, and some eventually leave academia. You just have to visit one of Canada's amazing colleges, universities, CEGEPs or polytechnics to see a huge labour force of underemployed Ph.D.s, many with international experience and many who are women. Because much of our R and D is housed in post-secondary institutions, our private sector does not absorb Ph.D.s in the same way as other countries.
What does this tell us about scientific research in Canada? Higher education is a significant actor. We have relatively loose business ties, limited participation in global collaboration and an inefficient labour market that's not making the most of its skilled labour.
What do I recommend? Well, first, post-secondary institutions are at the heart of our research success, so keep funding universities and colleges. Canada needs to increase research funding to build the infrastructure at smaller institutions, as others have said in these panels, and definitely at our colleges, with their ties to industry and applied research. This practice of funding both projects and institutions has been very successful in the European context. In contrast, Canada tends to focus more on the projects than the institutional infrastructure, and we need to bring institutions up as well.
Second, fund both theoretical and applied research, establish strong partnerships with industry and make a pipeline to patents. However, as gatekeepers of research funding, we need thoughtful regulatory frameworks that ensure that it's done ethically and equitably and that it considers the social impact of research.
Third, we have to expand who is considered a researcher. Our precarious faculty who teach on part-time, limited contracts are rarely eligible to apply for federal funding. Moreover, federal funding prevents salaries from going to principal investigators, meaning that part-time researchers, when they are eligible to receive a grant, cannot increase their income to a living wage with funds from the grant. Our selection criteria need to adapt to the reality that not all researchers have the same conditions of employment.
Fourth, we need to increase our global collaborations and provide funding for travel to work globally with other teams. When I have conducted research on international publications, teams in other countries have been shocked that international collaboration is not one of our funding requirements. We need to focus on the big issues that impact our planet.
Lastly, we need targeted programming to support populations of researchers who are left outside the high-impact world of scientific research: women, researchers of colour and indigenous communities. In short, we want to see research funding going to diverse institutions and diverse researchers who can make Canada a global leader in scientific research with a positive social impact.
:
Thank you very much for the invitation to testify on the important issue of research excellence.
My name is Vincent Larivière, and I'm a professor of information sciences at the Université de Montréal. I'm also the UNESCO Chair on Open Science and the Quebec research chair on the discoverability of scientific content in French. I'm not representing the Université de Montréal today. I'm appearing as an individual, as an expert who has spent about 20 years studying the scientific community, and specifically the issue of research excellence and evaluation.
The first thing that's important to mention is the lack of consensus on what research excellence is. This can be seen virtually everywhere in the scientific community. Funding evaluation committees don't always agree on which projects are the most important. Journal editors and reviewers don't always agree on the quality of a paper.
Excellence in research is, in a way, the holy grail of the scientific world, but it remains quite difficult to define. There's a lot of subjectivity in all of this. It can be explained in a number of ways, but one thing is clear: Scientific excellence is multi-faceted. It can vary depending on the context. It can be the ingenuity of a method, the originality of a research issue, the quality of an argument's construction or the potential applications of a research project.
Because of this lack of consensus, evaluation committees often rely on quantifiable indicators, things that can be measured: the number of papers written in prestigious journals, the number of times they are cited, whether the person graduated from a prestigious university or whether they have gotten funding before. One of the main criteria for getting funding is having already gotten it. Those quantifiable markers don't always reflect research excellence, but they make the evaluation much simpler. A dozen or so publications will always be more than five. A million dollars will always be more than $100,000. That way of evaluating scientists and their projects, often done implicitly, raises important questions for the Canadian scientific community.
Focusing on publication volume will promote certain works, but also certain themes that are more easily published. That contributes to an overproduction of papers, which shouldn't be confused with an overproduction of knowledge. Overproduction of papers contributes to noise and information overload, much of it of mediocre quality. Many Nobel Prize winners, including Peter Higgs, have said that they wouldn't have been able to make their discoveries in today's context of research evaluation.
I'd like to make three recommendations for improving research excellence in Canada.
The first one is quite complicated, but I think it's doable. The idea would be to enable funding agencies to experiment with peer review. Peer review is known to be imperfect, but many countries are experimenting with it, including Switzerland, Norway and the United Kingdom. We can't say that those countries are lagging behind in science. There are countries that have taken the bull by the horns, realized the biases currently associated with research evaluation and decided that they should try to find new ways to encourage excellence. As my colleague Julien Larrègue says, it's important for the results of those experiments to be available to the expert community.
The second recommendation is somewhat related to what my colleague Ms. Cobey said on the issue of CVs, which are evaluated by the various committees. Narrative CVs were recently put in place, which I think sounds like a good idea on the surface, but it isn't entirely clear how those CVs are going to be interpreted. They will, in fact, also be interpreted based on their volume. I recently received a seven-page narrative CV that was longer than the application itself. We have absolutely no idea how committees are going to evaluate that. That has to be considered. Some countries have implemented a requirement for short, two-page CVs that don't focus on the publication volume and that can then show the publications that are most relevant to the project.
The third recommendation goes back to indicators. In Canada, there usually isn't an explicit request to provide indicators for evaluations. However, during evaluations, committee members often pull indicators out from nowhere. Obviously, committees are often sovereign, so there isn't much that can be done. I think there needs to be a ban on using those indicators in the evaluation committees of granting agencies. It isn't just a matter of not encouraging them; it's also about telling the committees that all of that is outside the scope of the evaluation.
Thank you, and I look forward to taking your questions.
:
Thank you, Madam Chair.
I'd like to quickly indicate that I'll ask one question, and then I'll cede some of my time to my colleague Ms. DeRidder. She has a few questions she'd like to ask.
I'd like to follow up with Ms. Cobey and Mr. Larivière.
Mr. Larivière, you mentioned a notion that struck me—the lack of consensus on what constitutes research excellence.
Ms. Cobey, you talked about DORA and the move away from quantitative to qualitative metrics, for example. The DORA principle is being accepted by the three federal granting agencies, but I read in the briefing materials provided that only nine universities have accepted that principle. Why do you think there's only a limited uptake on that with regard to accepting the DORA principle? What's precluding others from accepting that idea?
I think one thing that needs to be done is that there needs to be more consultation on an ongoing basis between, say, the tri-agencies and the government and the researchers in the institutions. There's a bit of a siloing, I think, in terms of how messages and policies translate from the federal funders to the institutions. At the tri-agencies, they may be saying that they signed DORA and they value a broad range of impacts, including community-based research and these types of things, but if the institutions don't send that same message, there's a bit of a mismatch.
I feel that researchers are often caught between two systems as these reforms roll out at the federal level. We're being told to value EDI, open science and broader excellence by our federal funders, but many of our institutions are still focused on those quantitative indicators. That creates a duplication of effort for us as researchers.
:
Thank you so much, Madam Chair.
Thank you to our witnesses.
The purpose of this committee is to make recommendations, as all three of you have done, on the type of evaluation criteria to be used for the allocation of federal funding through the tri-agencies. I was one of the members of this committee who was here in the previous Parliament when we studied this question. We heard a number of ideas.
Dr. Cobey, you mentioned the “publish or perish” imperative. One of the ideas we heard was that perhaps applications should be completely blinded as to who the proponent is. In other words, that potential bias would be removed, and the evaluation would be done simply on the quality of the proposal. That could perhaps be part of a stepwise review of the application: Once various proposals were considered excellent, they could then proceed to further evaluation of the team. Since we're so interested in EDI evaluation criteria, perhaps that could be part of the second step. It might relate to the training of researchers, and so on.
I'd like some of your comments to see how, in a very practical way, you could remove some of the potential biases that have existed institutionally for a long time.
:
Through you, Madam Chair, thank you for the question.
I think you're pointing at potential shifts in the way peer review is done at grant panels, and I would agree with your suggestion. Blinded peer review at these committees could help address some of these issues, with selection for excellence in a second phase that perhaps isn't blinded.
I would encourage the committee to consider making peer review more open, generally speaking. While we may have blinded peer review initially, at the end phase, once selection is done, I think it would be of extreme value to open up the black box that is the peer review process for federal funding in this country and make those peer review reports as available and as open as possible. Sometimes there are trade secrets or things that need to stay closed, but to improve the system, we need to know how the system is working, and we need to do active research, or metaresearch, on peer review to improve it. We don't want to go from one system that's clearly not working to another system that we think might be working better without actually having the evidence. As a researcher, I think we need metaresearch to show that the goals for how we'd like to change peer review and select for excellence are actually being achieved. Right now, across the board, I would say there's very little monitoring of how our policies and practices are implemented.
Right now in Canada, for the vast majority of grants, once you get funding, there's essentially no monitoring until your final report. In other jurisdictions, there are grants officers assigned to funded projects to ensure that certain milestones are met and that overall outputs are delivered.
I'll use, from my area of expertise, the concept of open science. For instance, with clinical trials, we have federal policy to ensure that these trials are registered prospectively in an appropriate registry. We know from metaresearch that we're not doing that.
We have a policy and we need to monitor, when we do fund a trial, that those trials are indeed getting registered and that the results are subsequently being reported fully and completely. We know from an audit we've done that about half of the trials conducted in this country never see the light of day in terms of having their results reported in a public registry or even in a peer-reviewed publication. That suggests inefficiency. We want to make sure that there's monitoring to make sure that some of these basic science policies that we have—our science policies are quite strong and getting stronger—are being implemented on the ground.
:
Thank you, Madam Chair.
Thank you to the witnesses for being here.
Professor Larivière, you have shown in your work that the group of Canada's 15 major research universities, or U15 Canada, has received about 80% of research funding in Canada over the past 20 years. Out of that group, five universities in particular receive nearly half of that amount.
Does this federal allocation criterion really promote excellence, or does it preserve a concentration that perpetuates institutional prestige without improving scientific production?
:
It's obviously a complex issue, and it's hard to find cause and effect relationships in all of this.
However, there's a well-known phenomenon in the sociology of science called the Matthew effect. Basically, the scientists or institutions with the greatest symbolic capital and prestige will receive even more, regardless of intrinsic quality. If two scientists discover the same thing at the same time, the discovery will most likely be attributed to the one who already has an enormous amount of capital. We know this to be one of the natural effects, say, of the scientific system: giving more to those who already have it.
If future funding is based on past funding, that obviously leads to the concentration of funds, largely in the hands of researchers affiliated with U15 Canada.
You mentioned a little earlier that peer review committees can use bibliometric indicators as criteria for excellence—you just now said, “pull indicators out from nowhere”—such as citations, impact factors, the h-index and publication volumes. However, your research shows that the committees favour certain disciplines, as well as scientific publications in English, to the detriment of francophones, the humanities and emerging researchers.
Do those criteria really reflect excellence, or do they introduce new biases into research funding?
:
Thank you, Madam Chair.
Thank you to all the witnesses. Thank you to the members opposite as well.
Since this committee is about government funding for research and academic excellence, I will focus my questions on where grants should go and how better use can be made of them. I will start with the much-discussed question of DEI in this session.
My question is for Dr. Karram.
You said in your evidence that striking a balance is important when federal funding comes into question. Here is my question for you. DEI is a mandatory condition placed by the government on educational institutions. How do you think a balance can be struck for researchers who do not want to disclose that information? If they disclose it, they risk not getting that funding or not having their application approved. How would you respond to that?
:
We have years of research that shows that names, for instance, bias hiring with résumés.... This is well documented in sociology. When we have things like blind peer review as part of the process, that can be a way to evaluate the research without considering EDI. That's one option.
We also have really good streaming that we do for early career researchers. You say that you're an early career researcher, and you're in a particular pool that allows you to be evaluated accordingly. Why can we not have certain pools for people who are saying, “I want to be evaluated on merit”, and for people who say, “Look, I'm from a group that has historically been marginalized in the academy. I do not have the family resources. I went to a small institution, so I don't have the institutional resources to get these great research ideas off the ground”?
When I say that we're going to fund research infrastructure.... I work at a university where my hand is held from the inception of my research ideas right until the moment I click send on my SSHRC grant applications. I have such a robust community, offices that help me, and we want to make sure that groups of people who don't have access to that have access to that.
In their files, if we know they're from one of these groups and if they've been able to identify the reasons they have been on the outside of the research community, we want to bring them in, because that research that's getting missed is actually putting Canada behind. We know that international collaboration increases publications. The people who have natural ties to other parts of the world—because they're first-generation Canadians in the academy or have come here with a Ph.D. from another place—are a huge asset in the Canadian labour force, and we are missing it.
:
Thank you, Madam Chair.
We think of research as the time you start the experiment until the time you end the experiment. However, there is a lot that happens before that, and there's a lot that happens after that.
On the research funding criteria—which was the actual basis of this study and why we brought you here—I heard some great comments earlier. I'm hoping you can elaborate on that around knowledge mobilization and uptake and how we should be including, or not including, that in research criteria for awarding.
I'll start with you, Dr. Cobey.
:
When we look at the amazing research that's happening in our colleges, which needs to be funded more strongly, we have this applied research that is looking at how work-integrated learning, for instance, is able to build curriculum, and we're getting a much faster pipeline from post-secondary into the labour market. That's really important.
When we have people sitting at the table who've worked in other jurisdictions, they begin to make these networks. Right now, we have incredible programs happening at a few of our community colleges where we're making links to institutions in other nations. Canada plays both a development role and a learning role in seeing how places with strong, robust technical education can move forward.
The sharing of best practices internationally is one of the most effective ways to improve your post-secondary sector, which, again, is at the heart of your research. It goes wrong when we have people who have worked in only one small community, where they are. They've done an effective job, but they haven't included the diverse voices of, say, the international students—
:
Thank you, Madam Chair.
My next question is for Ms. Cobey.
You mentioned earlier that the application of the San Francisco Declaration on Research Assessment, or DORA, was quite robust. However, the three granting agencies took six years to apply DORA.
After a six-year delay, how can you say that the application is robust enough? In my opinion, it shows instead a lack of leadership or vision. Can you also confirm how many universities in Canada have signed DORA?
With that, we will end the round of questioning.
Thank you to all the witnesses for appearing today and providing important testimony. If there is anything you want to bring to the attention of the committee members that you were not able to address because of limited time, you can always send written submissions to the clerk of the committee. Once you do that, those submissions will be circulated to all members of the committee and will be taken into consideration at the time of the drafting of the report.
I really want to thank all the witnesses.
Do I have the consent of the members to adjourn the meeting?
Some hon. members: Agreed.
The Chair: Okay. The meeting is adjourned.