Etienne Vignola-Gagné, Dr. phil., has 12 years of experience as an analyst of public policy, innovation systems and organizational change. He has led in-depth case studies of major biomedical innovation initiatives and policies, conducting hundreds of expert interviews with researchers, policymakers, industry representatives and others along the way. He has carried out commissioned enquiries for British, Canadian, German and European agencies aimed at improving public policy in areas such as the evaluation of transdisciplinary research programs, the development of bibliometric indicators for capturing the societal outcomes of research, international comparisons of COVID-19 test-and-trace systems, and the tracking of genomic innovation in cancer care.
He has authored or co-authored scientific contributions in venues such as Quantitative Science Studies, History and Philosophy of the Life Sciences, and Science and Public Policy. Dr. Vignola-Gagné holds a doctoral degree in political science from the University of Vienna, with prior training in science and technology studies.
Etienne Vignola-Gagné
Science-Metrix / Elsevier

As one of the Science-Metrix co-authors of the Technical Note listed in the background documentation for this discussion, I wanted to provide a bit more context on the general orientation that my co-author, Christina Zdawczyk, and I gave to this framework for deploying bibliometric strategies as part of CGIAR QoS evaluations.
You may notice that the ubiquitous publication counts and citation impact indicators receive only a small portion of our attention in this Technical Note. One of our intentions with the note was to showcase how bibliometrics now offers indicators for a much broader range of dimensions, including cross-disciplinarity, gender equity, preprinting as an open science practice, and the prevalence of complex multinational collaborations.
That is, there is (in our opinion, often untapped) potential in using bibliometrics as indicators of relevance and legitimacy. At the same time, some of the bibliometrics we have suggested can also be used as process or even input indicators, rather than in their traditional role as output indicators of effectiveness. For instance, bibliometrics can be used to monitor whether cross-disciplinary research programs are indeed contributing to increased disciplinary integration in daily research practice, given that project teams and funders often underestimate the complexity of such research proposals. Moreover, such projects often require dedicated support at levels that are seldom properly planned for (Schneider et al., 2019). With the caveat that the publications to be monitored only become available later in the project life cycle, findings on cross-disciplinarity can help modulate and re-adjust research programs and associated support instruments on a mid-term timeline.
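To make this a bit more concrete, here is a minimal sketch, in Python, of one widely used way to operationalize cross-disciplinarity at the publication level: the Rao-Stirling diversity of the disciplines cited in a paper's reference list. This is offered purely as an illustration, not as the indicator prescribed in the Technical Note, and the discipline shares, distances and function names below are invented for the example; in practice they would come from a journal classification scheme and an inter-discipline similarity matrix.

```python
from itertools import combinations

def rao_stirling_diversity(discipline_shares, distance):
    """Rao-Stirling diversity for one publication (illustrative sketch).

    discipline_shares: dict mapping discipline -> share of cited references (shares sum to 1).
    distance: dict mapping frozenset({disc_a, disc_b}) -> dissimilarity between the two
              disciplines on a 0-1 scale (e.g., 1 minus citation-pattern similarity).
    """
    score = 0.0
    for a, b in combinations(discipline_shares, 2):  # unordered pairs of disciplines
        d = distance.get(frozenset((a, b)), 1.0)     # unknown pairs treated as maximally distant
        score += discipline_shares[a] * discipline_shares[b] * d
    return score

# Made-up shares and distances for a single paper's cited references
shares = {"agronomy": 0.5, "economics": 0.3, "nutrition": 0.2}
dist = {
    frozenset(("agronomy", "economics")): 0.8,
    frozenset(("agronomy", "nutrition")): 0.4,
    frozenset(("economics", "nutrition")): 0.6,
}

print(round(rao_stirling_diversity(shares, dist), 3))  # -> 0.196
```

Tracked over a program's output as publications accumulate, trends in a score of this kind are what would allow evaluators to see whether disciplinary integration is actually deepening, and to adjust support instruments accordingly.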
In short, our view is very much one of using program evaluation tools, including bibliometrics, to improve research and innovation governance and the support mechanisms available to project teams, rather than to rank performance.
I hope you have enjoyed, or will enjoy, the read.
Etienne
Senior analyst, Science-Metrix (Elsevier)
Reference
Schneider, F., Buser, T., Keller, R., Tribaldos, T., & Rist, S. (2019). Research funding programmes aiming for societal transformations: Ten key stages. Science and Public Policy, 46(3), 463–478. https://doi.org/10.1093/scipol/scy074