
Multi-actor policy dynamics in research evaluation:
The introduction of international standards of excellence

Eleonora Dagienė*, Ludo Waltman** and Guus Dix***

* e.dagiene@cwts.leidenuniv.nl

https://orcid.org/0000-0003-0043-3837

Centre for Science and Technology Studies (CWTS), Leiden University, The Netherlands;
Institute of Communication, Mykolas Romeris University, Lithuania

** waltmanlr@cwts.leidenuniv.nl

https://orcid.org/0000-0001-8249-1752

Centre for Science and Technology Studies (CWTS), Leiden University, The Netherlands

*** g.dix@utwente.nl

https://orcid.org/0000-0003-1571-536X

Center for Higher Education Policy Studies (CHEPS), University of Twente, The Netherlands

We analyse the dynamics between national science policymakers and international research evaluation experts in the context of research assessment in Lithuania. We focus on the pressure to comply with international standards of excellence brought in by foreign experts and the attempt of national policymakers to translate the recommendations of these experts to the Lithuanian context. Analysis of national research assessment policies and interviews with politicians and civil servants reveal why Lithuanian policymakers predominantly opted for quantitative measures to assess institutions and researchers.

1. Introduction

When the Baltic state Lithuania regained its independence in 1990, the small Lithuanian research community was still predominantly geared towards former Soviet patrons, with few connections to other global research communities. Politicians made the internationalisation of science a primary goal for academia, mainly opted for quantitative measures (e.g. getting more papers in international journals), and introduced various instruments to achieve tangible results. Since 2001, decisions on which researchers to recruit, promote, reward or entrust with leading funded projects, and which research units to support, were based on the number of articles they produced, relying exclusively on author-based metrics and the Journal Impact Factor (JIF). With its emergent focus on internationalisation, Lithuania became part of a global development in which many countries fostered international cooperation to strengthen their research systems (Crăciun & Orosz, 2018).

In Lithuania, the quantification of research assessment for individuals and institutions did not run smoothly. Instead of a linear process of policy-driven internationalisation, it ignited complex policy dynamics between key actors internal and external to the Lithuanian science system. These dynamics led civil servants, foreign research evaluation experts, publishing firms and academic researchers to engage in relations of cooperation and conflict, with consequences for the future of science and science policy.

There is little research on the multi-actor policy dynamics that drive the emergence and development of quantitative research assessment in general and the measurement of internationalisation in particular. One reason is that a key strand in the literature understands science policy in terms of delegation, not dynamics (Rip & Van der Meulen, 1996). In that understanding, one actor group is referred to as the ‘principal’ (usually the government), who delegates to ‘agents’ (usually researchers) a ‘task’ – say, publishing articles in high-ranking journals – that they have to accomplish successfully to meet the principal’s demands (Borrás & Edquist, 2013; Braun, 2003; Potì & Reale, 2007). While there is a lot to gain from these studies, the subordination of dynamics to delegation makes it challenging to understand science-policy processes where multiple actor groups interact and where the definition of a reasonable task is exactly what is at stake.

A second reason for the lack of research on multi-actor policy dynamics around indicator development and use is the rather narrow methodological focus of the scientometric literature on internationalisation. That literature offers rich discussions of co-authored papers as a bibliometric indicator of international cooperation (Kehm & Teichler, 2007; Robinson-Garcia & Ràfols, 2020; van den Besselaar et al., 2012), with many studies pointing out that co-authorship is a complicated, even problematic, measure of internationalisation (Gazni, Sugimoto, & Didegah, 2012; Katz & Martin, 1997; Wagner, Brahmakulam, Jackson, Wong, & Yoda, 2001; Wagner, Park, & Leydesdorff, 2015). Though interesting, this discussion of co-authored publications as a scientometric indicator of internationalisation is relatively disconnected from concerns about real-world science policy. For all its methodological sophistication, the literature on bibliometric indicators largely leaves out the political dynamics of developing, using and adapting metrics.

Due to the emphasis on delegation in the science policy literature and the methodological focus of the scientometric literature on internationalisation, there has been surprisingly little discussion of the policy dynamics in which quantitative performance indicators are developed, used, contested, and altered. This paper takes a multi-actor approach to address that gap. With an empirical focus on Lithuanian science policies, we seek to understand where the political demand for quantification comes from and how it intersects with other developments in the science system. To answer that question, we provide an in-depth analysis of the first phase of quantitative assessment in Lithuanian science policy and the science system at large, roughly between 1996 and 2008.

Although we focus on a particular phase in the development of Lithuanian science policy, our analysis of these multi-actor policy dynamics may also be insightful for metric-based assessment in other research systems. Our work could therefore have ramifications for the development of science policies in other political contexts, especially now that the place of indicators in research evaluation is again up for debate, for instance in light of the recently established Coalition for Advancing Research Assessment (CoARA)1.

Three interrelated multi-actor dynamics in science policymaking stand out here: those relating to the establishment of quantitative requirements for researchers; to intermittent policy changes concerning the indexing of journals in the ISI database; and to academics who resisted quantitative measures overtly through the judiciary or covertly by gaming the system. This conference paper focuses on the first of these dynamics, between national science policymakers and international research evaluation experts. In particular, we focus on the pressure to comply with international standards of excellence brought in by foreign expert judgement and the attempt of national policymakers to translate that judgement to new local contexts. In the full paper, we also elaborate on (a) the dynamics between science policymakers and data providers, as the former make themselves dependent on databases of peer-reviewed journal publications, and (b) the dynamics between metrics-oriented policymakers and researchers who attempt to challenge the bibliometric assessment of their work.

2. Methodology

To analyse these three interrelated multi-actor dynamics in science policymaking, we combined interviews with politicians, civil servants and researchers with document analysis (Figure 1). We analysed national regulations containing requirements for research outputs, gathered from the Register of Legal Acts of the Republic of Lithuania (TAR)2.

Figure 1. Actors and types of data studied in our analysis of multi-actor dynamics in science policymaking

Seven politicians and six civil servants participated in semi-structured interviews on the development of research assessment policies, the requirements imposed, and the expectations and achievements involved.

3. Introducing international standards of excellence in research evaluation

When Lithuania became independent in 1990, policymakers sought to reorganise the entire Soviet-era science system. In the decade following the initial academic reforms, Lithuanian policymakers revealed bigger ambitions: to successfully integrate Lithuanian researchers into European and global scientific knowledge production. One of our interviewees took a strong stance supported by almost every Lithuanian politician: “National science doesn’t exist, so internationalisation is our priority” (Interview, politician 7), attesting to the desire to become part of the international scientific community. Nurturing such ambitions, policymakers began by assessing the national research institutions following international practices.

3.1. Trying to carry out qualitative evaluation from within

In 1994, the first national research assessment in Lithuania aimed to present its results to the government for use as a basis for funding allocation. As one politician explained, “Before then, the state funds for higher education and research institutions were distributed under unclear principles,” specifying that “interested parties, such as state-funded institutions, used to meet together and start such a tug-of-war” (Interview, politician 6). To develop new funding principles, Lithuanian academics were tasked with leading the first research assessment (Daujotis, Radžvilas, Sližys, & Stumbrys, 2002). To start with, faculties and institutions were asked to prepare self-analysis reports. Two groups of local experts then worked in parallel to evaluate the same research outputs – papers, patents and industry-academia cooperation – submitted by the institutions. As one civil servant mentioned, “These chosen indicators corresponded with globally recognised measures of excellence, but our local academics at that time had a lack of understanding of research papers” (Interview, civil servant 3). Most submitted articles had been published in local journals, some in Soviet journals, and only a few in Western ones (Daujotis et al., 2002, 176-177).

More importantly, isolation from the Western world under the Soviet regime had profoundly affected the culture of research assessment: the higher the academic rank of the authors, the better local experts scored the publications they were asked to assess. This assessment culture made it complicated for Lithuanian academics to apply the procedures, criteria, and output types accepted by the Western scientific community. Even when the expert groups working in parallel reached the same conclusion about the quality of the submitted works, policymakers considered the results in some disciplines unreliable simply because of the types of outputs submitted (Daujotis et al., 2002, 178).

The unsatisfactory dynamics between science policymakers and the national expert groups led the former to conclude that calling on people ‘from within’ the national academic community did not work. After this national-level consultation failed, an ‘outside’ perspective was needed to get things going. As one politician put it: “It was necessary to restructure that old science system. Everything required a specific look from the outside.” (Interview, politician 4).

3.2. Inviting foreign expert judgement

Lithuanian policymakers searched for that ‘look from the outside’ in three different directions. First, they decided to follow the lead of Latvia and Estonia, which, having recently regained independence too, had asked the Danish and Swedish research councils to evaluate their respective research systems. In 1995, foreign experts from the Norwegian Research Council were brought in for an external assessment. Second, in the same year, Lithuania applied for membership of the European Union. Seeking EU membership, Lithuanian policymakers began to engage with European policymakers to provide information on their national research system and its development. Finally, the country’s recently regained independence brought policymakers into contact with experts from the World Bank.

The foreign expert reports all came to a similar conclusion. The Norwegian experts, to start with, pointed to a feature of the publication practices of Lithuanian researchers that they deemed problematic for the database-driven bibliometric assessment they were looking for: “Too few research results are published in languages that allow communication with international academic communities. This hampers an international peer review of Lithuanian research” (Evaluation of Research in Lithuania, 1996, 18). In line with that verdict, the Norwegian experts argued that important measures were needed “to increase international contacts and cooperation substantially through publishing in international, peer-reviewed journals (when appropriate)” (ibid., p. 29). In a similar vein, EU policymakers (Daujotis et al., 2002, 169) and experts from the World Bank (2003, 70) advised the government to bring about significant changes in the stagnant research system through international standards in research evaluation and the promotion of international cooperation.

This advice was not lost on the Lithuanian government. One civil servant could still recall the verdict of EU-commissioned consultants: “The expert from Coopers & Lybrand told us: ‘Listen, annually, from all over Lithuania, you make three hundred articles [in ISI journals], and the rest – somewhere else.’” The civil servant added, “Obviously, we wanted to get more of those ISI articles.” (Interview, civil servant 1). Similarly, one politician stated that “after the World Bank indicated insufficient outputs and improvements needed, we immediately submitted their recommendations to the government for implementation” (Interview, politician 4).

3.3. Pressure to comply with international standards of excellence

From their interactions with the EU-commissioned and Norwegian experts, Lithuanian policymakers learned that their national output was insufficient. One politician explained that “we needed those ISI papers because the world was already taking data from the Institute for Scientific Information and looking at articles only in those journals.” (Interview, politician 1). A civil servant mentioned that they constantly monitored ISI indicators and reported to policymakers that “everyone was unhappy with national achievements,” adding that “the country’s results were miserable.” (Interview, civil servant 3). Both pointed to the EU report Key Figures 2001: Towards a European Research Area (2001).

To inform academia’s understanding of what should be done to improve Lithuania’s standing in the EU, the Lithuanian Academy of Sciences prepared a White Paper on Science and Technology. This document served as the basis for the Long-Term Research and Experimental Development Strategy, officially approved by the government in 2003. The White Paper stated that it was “necessary to pay more attention to the coordination of the country’s R&D policy with the EU” as “Lithuania is lagging behind its neighbours” (Lithuanian science and technology. White paper, 2002: 97). The White Paper most likely drew on the World Bank report, which stated that “the number of articles published in scientific periodicals per researcher in highly developed countries usually comes to 0.5 a year; according to data available for 2000, this indicator in Lithuania was as low as 0.05” (The World Bank, 2003: 64).

The dynamics between Lithuanian science policymakers and foreign experts on research assessment were consequential for subsequent academic reforms. Through their interaction with foreign experts, Lithuanian policymakers knew that they had to confront the low level of internationalisation of the national research system and the lack of development in research assessment. The White Paper confirms this aim: “The internationalisation problem in Lithuania should be transferred from the level of institutions or the Department of Science and Higher Education to the state level.” (Lithuanian science and technology. White paper, 2002: 97). A civil servant, reflecting on the experience of interacting with foreign experts, noted: “at all levels, national or institutional, eighty per cent of the research assessment impact is self-assessment ... what we always did ... not someone from outside coming in and saying what to do.” (Interview, civil servant 2). The foreign experts, for their part, reflected that “it is a strength of the Lithuanian system that policymakers are ready to acknowledge weaknesses in their system and consider a change.” (Edler, 2007: 6).

As one of the politicians said, “We received many different recommendations from foreign experts, but translating them to a local context does not mean strictly following every recommendation.” (Interview, politician 3). He added that “we didn’t expect to invent the wheel, but we needed to consider international trends”. This sentiment was seconded by a civil servant who said that “from the early beginning, we have made sure that not everyone from the West is a prophet” (Interview, civil servant 5). Despite these caveats, almost every interviewee mentioned the conviction of one civil servant that “we must follow European Union recommendations if we want our researchers to get used to the Western rules.” (Interview, civil servant 6). Politicians and civil servants wanted to improve the national position but were unwilling to implement everything that was recommended.

3.4. Attempts to shape national policies by borrowing foreign practices

Politicians and civil servants explored research assessment practices by travelling to Western countries to gain experience and learn informally how research assessment was done there. This close collaboration with peers in developed countries shaped the national research assessment system and set its direction.

Lithuanian policymakers predominantly opted for quantitative measures to assess institutions and researchers, for three reasons. First, because of the small size of the Lithuanian research community, “after the first attempt, finding unbiased experts for qualitative research assessment looked impossible” (Interview, politician 4; cf. Daujotis et al., 2002: 179). Second, Lithuania lacked research outputs in English, especially “internationally recognised academic works”; as one civil servant reflected, “the country looked sad, and we wanted to encourage them not to be lazy but publish in ISI journals.” (Interview, civil servant 1). Third, another civil servant stressed that “We had no research assessment culture at all, so quantitative measures seemed the only possible way to start” (Interview, civil servant 2).

The political orientation towards quantitative measures and Western journals started with the awarding of scientific degrees and academic titles, which required at least fifteen scientific papers published in recognised scientific outlets (see the sketch below). These ‘recognised scientific outlets’ covered three subcategories of publications: (1) articles in foreign peer-reviewed journals with prominent researchers in the relevant field on their editorial boards; (2) articles in proceedings of conferences organised by international scientific societies; or (3) articles in journals included in the National Journals List. According to one politician, “Institutions themselves suggested that research papers cannot be published just anywhere but should appear at least in some meaningful outlets.” (Interview, politician 2). A civil servant seconded this: “Researchers should get used to publishing their papers in typical peer-reviewed journals read by the broader scientific community, not only local folks” (Interview, civil servant 3).
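To make this threshold rule concrete, the following is a minimal sketch in Python, assuming a simplified representation in which each submitted paper is tagged with the subcategory of outlet it appeared in. The category names, function, and data structure are hypothetical illustrations, not the regulation’s actual terms.

```python
# Hypothetical sketch of the minimum-qualification rule described above:
# at least fifteen papers published in 'recognised scientific outlets'.
# Category labels are illustrative assumptions only.

# The three subcategories of 'recognised scientific outlets', (1)-(3) above.
RECOGNISED_OUTLETS = {
    "foreign_peer_reviewed_journal",      # (1)
    "international_society_proceedings",  # (2)
    "national_journals_list",             # (3)
}

MIN_PAPERS = 15  # minimum number of papers in recognised outlets

def meets_minimum_qualification(outlet_categories: list[str]) -> bool:
    """Return True if enough papers appeared in recognised outlets.

    `outlet_categories` holds one category label per submitted paper.
    """
    recognised = sum(1 for c in outlet_categories if c in RECOGNISED_OUTLETS)
    return recognised >= MIN_PAPERS
```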

In parallel with quantitative research assessment for individual researchers, algorithmic funding allocation for institutions – the Institutional Research Assessment – was developed and announced in 2004. From the very beginning, quantitative institutional research assessment accounted for research papers, patents, and applied research activities. Since then, state funding has always been distributed using complex calculations, of the kind sketched below. As one civil servant mentioned, “Not everyone easily understands those equations, and it’s tough to get through, but funding should be distributed honestly” (Interview, civil servant 4). Under these formulas, papers in ISI journals earned institutions the largest share of state funding in the Institutional Research Assessment.
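The following is a minimal sketch of a score-proportional allocation of this kind. The output categories, weights, and institution data are hypothetical assumptions, since the actual coefficients in the Lithuanian regulations are not reproduced here; the only element taken from the source is that papers in ISI journals carry the largest weight.

```python
# Hypothetical sketch of score-proportional institutional funding.
# Weights and categories are illustrative assumptions, not the actual
# coefficients of Lithuania's Institutional Research Assessment.

# Illustrative weights per output type; ISI journal papers are assumed
# to carry the largest weight, in line with the account above.
WEIGHTS = {
    "isi_paper": 10.0,
    "patent": 6.0,
    "applied_research": 3.0,
}

def institution_score(outputs: dict[str, int]) -> float:
    """Aggregate an institution's output counts into a weighted score."""
    return sum(WEIGHTS[kind] * count for kind, count in outputs.items())

def allocate_funding(budget: float,
                     institutions: dict[str, dict[str, int]]) -> dict[str, float]:
    """Split a fixed state budget in proportion to institutional scores."""
    scores = {name: institution_score(outs) for name, outs in institutions.items()}
    total = sum(scores.values())
    return {name: budget * score / total for name, score in scores.items()}

# Example: two hypothetical institutions competing for the same budget.
shares = allocate_funding(1_000_000, {
    "Institute A": {"isi_paper": 30, "patent": 2, "applied_research": 5},
    "Institute B": {"isi_paper": 10, "patent": 8, "applied_research": 20},
})
print(shares)  # Institute A receives the larger share, driven by ISI papers.
```

In a zero-sum scheme like this, raising the weight of ISI papers directly increases the budget share of institutions producing them, consistent with the paper’s observation that such papers earned the largest share of state funding.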

Papers in ISI journals thus became core research outputs in the two main national quantitative research assessment systems – the Minimum Qualifications for Researchers and the Institutional Research Assessment – developed in Lithuania in the first decade after the country regained independence. The interviewed politicians and civil servants all agreed that “Since science is international... assessment involving global publishers, editors, and databases is more reliable than national ones – simply a matter of credibility” (Interview, politician 5).

4. Conclusions

To improve research assessment practices, it is important to draw lessons from past research assessment reforms. Our study of policy dynamics in Lithuania provides an understanding of what happens when a dominant evaluation system is changed. Our findings can inform ongoing assessment reforms, for instance in the context of the recently established Coalition for Advancing Research Assessment (CoARA).

As our study revealed, the judgment of foreign experts was not simply imposed on Lithuania as a country that had recently gained independence. The Lithuanian government explicitly solicited a dynamic relationship with foreign experts to get an outside perspective on the formerly Soviet-oriented research system. That perspective provided input and legitimacy for national research system reforms when the results of the peer-review evaluation carried out ‘from within’ did not satisfy policymakers.

Lithuania’s research evaluation system has evolved as a result of dynamic pressures in the science system. As Lithuanian policymakers noticed, simply applying a model from elsewhere does not work, so they experimented with alternative approaches.

The substantial reform of research evaluation in Lithuania created resistance from researchers who had benefited from the previous evaluation system. This will be discussed in the full paper that is currently in preparation, which will also analyse the dynamics between national authorities and data providers (databases and peer-reviewed journals).

Open science practices

This paper is based on two types of data: Lithuanian research assessment policies and interviews with Lithuanian politicians and civil servants. We promised the interviewees that they would remain anonymous. The Lithuanian research assessment policies are freely available in the Register of Legal Acts managed by the Office of the Seimas of the Republic of Lithuania (see https://www.e-tar.lt/portal/en/index).

Author contributions

Conceptualisation: ED, GD. Investigation: ED, GD. Writing – original draft: ED. Writing – review & editing: GD, LW.

Competing interests

The authors have no competing interests.

References

Borrás, S., & Edquist, C. (2013). The choice of innovation policy instruments. Technological Forecasting and Social Change, 80(8), 1513–1522. https://doi.org/10.1016/j.techfore.2013.03.002

Braun, D. (2003). Lasting tensions in research policymaking - a delegation problem. Science and Public Policy, 30(5), 309–321. https://doi.org/10.3152/147154303781780353

Crăciun, D., & Orosz, K. (2018). Benefits and costs of transnational collaborative partnerships in higher education. EENEE Analytical Report No. 36. Retrieved from http://www.eenee.de/eeneeHome/EENEE/Analytical-Reports.html

Daujotis, V., Radžvilas, V., Sližys, R. P., & Stumbrys, E. (2002). Lietuvos mokslo politika Europos kontekste. Vilnius, Lithuania: Justitia.

Edler, J. (2007). OMC Policy Mix Review Report: Country Report Lithuania. Retrieved from https://ec.europa.eu/invest-in-research/pdf/download_en/omc_lt_review_report.pdf

Evaluation of Research in Lithuania. (1996). Oslo: The Research Council of Norway.

Gazni, A., Sugimoto, C. R., & Didegah, F. (2012). Mapping world scientific collaboration: Authors, institutions, and countries. Journal of the American Society for Information Science and Technology, 63(2), 323–335. https://doi.org/10.1002/asi.21688

Glänzel, W. (2001). National characteristics in international scientific co-authorship relations. Scientometrics, 51(1), 69–115. https://doi.org/10.1023/A:1010512628145

Katz, J. S., & Martin, B. R. (1997). What is research collaboration? Research Policy, 26(1), 1–18. https://doi.org/10.1016/S0048-7333(96)00917-1

Kehm, B. M., & Teichler, U. (2007). Research on internationalisation in higher education. Journal of Studies in International Education, 11(3–4), 260–273. https://doi.org/10.1177/1028315307303534

Key Figures 2001: Towards a European Research Area. (2001). European Commission. Retrieved from http://ec.europa.eu/research/era/pdf/benchmarking2001.pdf

Lithuanian Science and Technology. White Paper. (2002). Vilnius, Lithuania: Justitia. Retrieved from https://www.smm.lt/uploads/documents/White%20paper.pdf

Pohl, H. (2021). Internationalisation, innovation, and academic–corporate co-publications. Scientometrics, 126(2), 1329–1358. https://doi.org/10.1007/s11192-020-03799-6

Potì, B., & Reale, E. (2007). Changing allocation models for public research funding: an empirical exploration based on project funding data. Science and Public Policy, 34(6), 417–430. https://doi.org/10.3152/030234207X239401

Rip, A., & Van der Meulen, B. J. R. (1996). The post-modern research system. Science and Public Policy, 23(6), 343–352. https://doi.org/10.1093/spp/23.6.343

Robinson-Garcia, N., & Ràfols, I. (2020). The differing meanings of indicators under different policy contexts. The case of internationalisation. In Evaluative Informetrics: The Art of Metrics-Based Research Assessment (pp. 213–232). Cham: Springer International Publishing. https://doi.org/10.1007/978-3-030-47665-6_10

The World Bank. (2003). Lithuania - Aiming for a knowledge economy. Retrieved from https://documents.worldbank.org/en/publication/documents-reports/documentdetail/694971468753015485/lithuania-aiming-for-a-knowledge-economy

Wagner, C. S., Brahmakulam, I., Jackson, B., Wong, A., & Yoda, T. (2001). Science and Technology Collaboration: Building Capacity in Developing Countries? Santa Monica, CA: RAND.

Wagner, C. S., Park, H. W., & Leydesdorff, L. (2015). The Continuing Growth of Global Cooperation Networks in Research: A Conundrum for National Governments. PLOS ONE, 10(7), e0131816. https://doi.org/10.1371/journal.pone.0131816


  1. CoARA, Coalition for Advancing Research Assessment: https://coara.eu/agreement/the-agreement-full-text/ (accessed 11 April 2023).

  2. The Register of Legal Acts of the Republic of Lithuania (TAR): https://www.e-tar.lt/portal/en/index.
