
Understanding support for metrics: the role of orientation to Global Science in sociologists' perceptions of performance measurement

Katerina Guba*

*kguba@eu.spb.ru

0000-0002-4677-5050

Center for Institutional Analysis of Science and Education, European University at St. Petersburg, Russia

What can explain variation within disciplinary communities in support for metrics in the evaluation of research productivity? The aim of this study is to demonstrate that attitudes toward metrics vary between researchers depending on their academic cultural preferences. We conducted a survey of 1,850 Russian sociologists with a range of questions regarding the main divisions of sociology and the perceived legitimacy of metrics for evaluating academic work. We found that although the majority of sociologists do not support giving more value to publications that appeared in international databases, 30% of respondents expressed a positive attitude toward this policy. Researchers oriented to global science, as measured by both declarations and publication behavior, generally have a more positive attitude toward using metrics in research evaluation, while academics committed to the local path generally have a more negative attitude.

1. Introduction

The reliance on quantitative indicators of research productivity is perhaps the most important recent transformation of academic institutions (Auranen & Nieminen, 2010). During the last decades, many countries have launched performance-based research funding systems in which citation metrics are integral (Hicks, 2012). However, researchers question whether citations can accurately assess a scientist’s intellectual contribution: citations are not synonymous with quality, originality, or a high level of performance (Tahamtan & Bornmann, 2019). Criticism is especially widespread among social scientists, who argue that quantitative indicators cannot measure research merit in their fields (Glaeser, 1999). Despite this criticism, the phenomenon of ‘citizen bibliometrics’ has come into existence: citation metrics are used in the routine evaluation of individuals by groups other than professionals, including scientists and administrators (Hammarfelt & Rushforth, 2017). Moreover, studies of opinions on scientometric indicators show that some scientists think positively of problematic metrics even when their use is debated in the scientometric community (Buela-Casal & Zych, 2012).

What can explain the considerable variation within scientific communities in support for scientometrics in the evaluation of research productivity? In addition to national variation related to the place of metrics in the evaluation system, the role of disciplinary differences has received researchers’ attention (Söderlind & Geschwind, 2020). The social sciences have been shown to be generally less in favor of quantification, which is related to opposition to the pressure of quantification and neoliberal policy (Buela-Casal & Zych, 2012). In this paper, we draw attention to the observation that epistemic cultures differ not only between disciplines but also within them. Disciplinary communities have their own differences in missions, cultures and orientations towards research, which may interact with exogenous institutional pressures (Akbaritabar et al., 2018). However, whether sub-disciplinary cultures affect the variation in academics’ attitudes toward measuring research performance is a question that has rarely been studied empirically. The aim of this study is to demonstrate that attitudes toward metrics can vary between scientists depending on the specific standards and norms of their sub-communities.

Empirically, we focus on the case of sociologists, which is interesting because different epistemic communities co-exist within the discipline. Sociologists have expressed support for different missions regarding the purpose of social science: some believe that sociology is an activist discipline, while others consider it a science. Besides, sociology has a long-standing divide between quantitative and qualitative scholars, with one camp predominantly anti-bibliometric and publishing mainly in national journals or monographs, and the other familiar with bibliometric indicators and producing papers better suited for international outlets (Akbaritabar et al., 2018). Most important for this study, sociology in developing countries is divided in its attitudes towards globalization: in Chile, for example, one segment has a global orientation, with English as its language of publication, while the local segment has a regional or continental orientation towards the Spanish-speaking academic community (Koch & Vanderstraeten, 2018). To examine how contrasting academic cultures relate to support for metrics, we sent a survey to 1,850 Russian sociologists (506 responded) asking about the perceived legitimacy of using international bibliometric indicators for evaluating academic work.

2. Relevant Literature

Using an extensive bibliographic database, Mosbah-Natanson and Gingras (2014) examined the social sciences worldwide over a 30-year period to document the center–periphery effect on the production of papers, international collaboration, and citation patterns. They found that North America and Europe account for close to 90% of global production in the social sciences and humanities. The main obstacle to globalized social sciences is not so much the linguistic differences that hinder authors from publishing in English as national traditions and the historical legacy of disciplinary institutionalization. The recent trend of internationalization in the social sciences has revived the debate on the integration of local communities into global science. Some communities have fully absorbed the necessity to publish in international outlets: Finnish sociologists, for instance, consider writing in their local language as lacking real academic value, since only articles in international peer-reviewed journals are seen as meeting academic standards (Hokka, 2018). Other communities have divided attitudes towards globalization. For example, in Chile there are two segments: one with a global orientation, with English as its language of publication, and a local one, with a regional or continental orientation towards the Spanish-speaking academic community (Koch & Vanderstraeten, 2018). Gantman and Fernández (2016) examined the differences between articles published in the two corresponding segments of management and organizational studies in Spanish-speaking countries and found that these segments are separated from each other, with their own resources, publication channels, and patterns of esteem.

Post-Soviet sociology is another example of a discipline divided between those who identify themselves with international, global science and those who are oriented towards predominantly local debates and audiences (Sokolov, 2018: 5). Under communist government, publishing in international journals was virtually nonexistent, especially in the social sciences and humanities. Following the collapse of the Soviet Union, policy changed to encourage a greater number of international scientific publications. Western foundations supported the creation of a “global” sociology: they distributed research grants and scholarships, funded academic travel, and provided institutional support for Western-style private research centers and universities that espoused a mission opposite to that of the older public institutions (Sokolov, 2018). However, academic groups oriented toward local traditions are still powerful. They are the result of the rebranding of departments responsible for teaching historical and dialectical materialism, scientific Communism, and the political economy of socialism. A niche was thus created for local variants of the social sciences that operate in parallel to those based upon “Western” values.

Choosing a specific orientation to global science is related to a certain publication behavior: a preference for publishing in international English-language periodicals instead of publishing in one’s native language in domestic journals. We suggest that the orientation to global science was unintentionally supported by the increased reliance of Russian authorities on quantitative indicators as a solution to the problem of academic expertise. Starting from the 2010s, metrics began to permeate numerous official documents in science, research and higher education policy. The new benchmarks affected not only the two dozen institutions participating in the Russian Excellence Program (Project 5-100) but were also adopted by other universities. Universities started to encourage their faculty to publish more in exchange for bonuses and additional scores in so-called effective employment contracts. The core of the whole system was assigning the highest scores to publications that appeared in journals indexed in Scopus or Web of Science (WoS). Although the new evaluation system required a dramatic change in publication behavior within a short time, we expect that sociologists oriented to international science were in favor of the new rules, given the symbolic value of global science.

3. Sample and Methods

This study’s main method of collecting empirical information is an online survey of Russian sociologists designed to evaluate the degree of convergence between expert judgments and citation metrics. Selecting a sample of scientists is especially problematic for a fragmented discipline that is divided into practically isolated research groups or political camps (Sokolov, 2018). The sample should take into account the variety of sub-disciplines and, ideally, cover all scientists. We relied on the Russian Index of Science Citation (RISC), which covers more than 18,000 Russian journals. In addition, the RISC is integrated with a full-text platform, the Scientific Electronic Library (eLibrary.ru), which indexes even more periodicals than the RISC. We believe that the RISC reduces the problem of selecting scientists to participate in the survey, as it is difficult for active scholars to bypass RISC-indexed journals. In recent years, Russian universities have been required to register their entire faculty in the RISC to provide information for national reports, and many universities have created a special position responsible for entering this information into the database. We can therefore conclude that the RISC contains a relatively representative dataset of Russian scholars.

We selected authors who (a) have published at least three articles in the RISC during the past five years, because we strove to survey active scientists; (b) have published most of their articles in journals classified by the RISC as belonging to sociology, or receive the bulk of their citations from such sources; and (c) are registered in the RISC system and provided an email address in their profiles. Overall, we ended up with 1,850 scholar profiles, to which the survey was sent by email. Subsequently, we received 506 responses (over 30%), which can be considered a good response rate for an online survey. In general, the respondents accurately represent the population with regard to the organizations with which the scholars are affiliated. A sketch of the selection procedure is given below.
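To make the selection criteria concrete, the following is a minimal sketch of the procedure in Python with pandas. The file name and all column names (`author_id`, `journal_discipline`, etc.) are illustrative assumptions, not the actual RISC export format, and the sketch implements only the first variant of criterion (b):

```python
import pandas as pd

# Hypothetical RISC export: one row per author-publication, with the
# publication year, the journal's RISC discipline, and the author's
# profile e-mail (all names are illustrative).
pubs = pd.read_csv("risc_publications.csv")

recent = pubs[pubs["year"].between(2016, 2020)]  # the past five years

per_author = recent.groupby("author_id").agg(
    n_recent=("publication_id", "count"),
    share_sociology=("journal_discipline", lambda s: (s == "sociology").mean()),
    email=("email", "first"),
)

sample = per_author[
    (per_author["n_recent"] >= 3)             # (a) at least three recent articles
    & (per_author["share_sociology"] > 0.5)   # (b) mostly sociology journals
    & per_author["email"].notna()             # (c) e-mail available in the profile
]
print(len(sample))  # 1,850 profiles in the authors' data
```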

Attitudes toward performance measurement can be explored from a number of different perspectives, including ideas about the accuracy of performance measures, opinions on the consequences of performance measurement for the working environment, and the behaviour of the respondents (Söderlind & Geschwind, 2020). In this study, we focused on how respondents assess the formal policy of giving more scores to research results published in journals indexed in international citation databases (Web of Science and/or Scopus). The exact question is given below:

Since January 2020, the Ministry of Higher Education and Science has been considering the possibility of calculating a publication performance score for research organizations. The methodology involves taking into account the quality of publications, assigning each publication a score depending on its type. Do you agree with the initiative of the Ministry, suggesting that articles published in journals indexed in international citation databases (Web of Science and/or Scopus) should be assigned a higher score than articles in journals not indexed in them?

Unsurprisingly, more than half of our respondents answered that they do not support the policy (53%); however, 30% of respondents support the practice of giving more scores to indexed papers (17% found it difficult to answer). In our analysis, we used a binary variable coded 1 for those who supported the science policy relying on international citation databases and 0 for those who did not support it.
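As a minimal sketch, the recode might look as follows in pandas; the text does not state how the ‘difficult to answer’ responses were handled, so the sketch leaves them as missing:

```python
import pandas as pd

# Illustrative recode of the survey item into the binary outcome.
answers = pd.Series(["support", "not support", "difficult to answer"])

support_policy = answers.map({"support": 1, "not support": 0})
# "difficult to answer" maps to NaN here; the paper does not state
# how these responses were treated in the models.
print(support_policy)  # 1.0, 0.0, NaN
```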

The data allowed for the construction of two variables of localism and globalism, one attitudinal and one behavioural. To measure the segregation between those wholly oriented to global science and those oriented to national research, we used several survey items developed in an earlier study of Russian sociologists (Sokolov, 2018). Each item was evaluated on a four-point Likert scale (‘completely disagree’, ‘mostly disagree’, ‘mostly agree’, ‘completely agree’). Given that the items form a set of correlated variables, we combined them into a single index. Because the items are ordinal, the factor analysis has to be performed on a polychoric correlation matrix rather than on the raw variables. The factor analysis extracted one component with an eigenvalue exceeding 1; the loadings are reported in Table 1, and a sketch of the extraction step is given after the table.

Table 1 The Globalism–Localism scale (with factor loadings)

| Item | Statement | Percent (agree/disagree) | Rotated factor loading |
| Q1 | Russian sociologists should preserve and develop the national sociological tradition | 65/22 | 0.7073 |
| Q2 | Russian sociology has lagged behind the Western for decades and we should now learn from Western colleagues | 47/36 | -0.5678 |
| Q3 | Western theories do not explain much in Russian life. We need to work with native theoretical models | 62/22 | 0.6036 |
| Q4 | The existence of a special Russian theory of society is just as little justified as the existence of a special Russian physics or medicine | 51/33 | -0.477 |
| Q5 | The average methodological level of articles in leading English-language journals is much higher than in leading Russian ones, and young scientists should be taught to focus on it | 35/43 | -0.5928 |
| Q6 | Doing research, sociologists ought to think first of all about the interests of their country and their state | 47/36 | 0.6043 |
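To illustrate the construction of the index, here is a minimal sketch of the extraction step. Estimating the polychoric correlations themselves is usually delegated to specialized routines (e.g. psych::polychoric in R), so the sketch starts from an assumed, already-estimated matrix with purely illustrative values; principal-component extraction is used for simplicity, since the paper does not name the exact extraction method:

```python
import numpy as np

# Assumed, already-estimated 6x6 polychoric correlation matrix for
# items Q1-Q6; the values below are illustrative placeholders only.
R = np.array([
    [ 1.00, -0.35,  0.40, -0.30, -0.38,  0.42],
    [-0.35,  1.00, -0.28,  0.33,  0.36, -0.30],
    [ 0.40, -0.28,  1.00, -0.29, -0.31,  0.35],
    [-0.30,  0.33, -0.29,  1.00,  0.27, -0.26],
    [-0.38,  0.36, -0.31,  0.27,  1.00, -0.33],
    [ 0.42, -0.30,  0.35, -0.26, -0.33,  1.00],
])

# Eigendecomposition of the correlation matrix; keep components with
# eigenvalues above 1 (one component in the paper).
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_keep = int((eigvals > 1).sum())
loadings = eigvecs[:, :n_keep] * np.sqrt(eigvals[:n_keep])
print(n_keep, loadings.round(2))

# Respondents' factor scores on this component would then serve as the
# attitudinal Globalism-Localism index used in the regressions.
```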

In addition to the attitudinal measure, we characterized the academic culture to which the sociologists belong using non-survey information: a set of publication- and citation-based metrics. We assume that an orientation to global science is evident in authors’ publication behavior: if sociologists believe that Western science is more valued, they might prefer to publish in international journals. Data on respondents’ publications retrieved from the Scopus database allow us to examine the relation between the respondents’ publication output and their replies. Three indicators were calculated. First, the number of publications a respondent has published. Second, the average normalized journal score (SNIP) of the journals in which the respondents have published, which Scopus provides for all journals; high scores indicate that an author has published in high-impact journals. Third, the total number of citations that the respondents have acquired. Table 2 presents the descriptive statistics, and a sketch of the computation follows the table.

Table 2 Descriptive statistics of respondents’ bibliometric performance

| Variable | Obs | Mean | Std. Dev. | Min | Max |
| N indexed publications (by Scopus) | 504 | 3.7 | 5.9 | 0 | 53 |
| Total N of citations | 506 | 1.8 | 3.8 | 0 | 33 |
| Average journals’ SNIP | 466 | 0.4 | 0.4 | 0 | 2.6 |
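A minimal sketch of how the three indicators in Table 2 could be computed from a per-publication Scopus export; the file name and column names (`respondent_id`, `cited_by`, `journal_snip`) are hypothetical:

```python
import pandas as pd

# Hypothetical per-publication Scopus records for the matched respondents
# (column names are illustrative).
scopus = pd.read_csv("scopus_publications.csv")

metrics = scopus.groupby("respondent_id").agg(
    n_indexed_pubs=("scopus_id", "count"),  # N indexed publications
    total_citations=("cited_by", "sum"),    # total N of citations
    avg_snip=("journal_snip", "mean"),      # average SNIP of the outlets
)

# Respondents with no indexed publications do not appear in such an
# export and would need to be re-added with zero counts.
print(metrics.describe().round(1))  # compare with Table 2
```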

In addition, we included several controls: the respondent’s age and the type of institution with which the respondent is affiliated. Project 5-100 was launched in 2012 to improve the international competitiveness of a handful of Russian universities; consequently, we coded these as research-oriented “strong” organizations and the others as “weak.”

4. Results

Table 3 presents a series of logit regression models with support for the policy as the binary dependent variable and, correspondingly, the attitudinal Globalism–Localism scale and the behavioural indicators of globalism–localism as predictors. The models demonstrate that higher scores on the Globalism–Localism scale (i.e., a more localist orientation) generally decrease the odds of supporting the official policy of giving more credit to indexed publications than to domestic journals not covered by international databases. The models also demonstrate that all bibliometric indicators are associated with a positive attitude towards the use of international metrics in research evaluation. It is worth noting that higher academic productivity, as measured by the three variables, has a rather significant effect on the probability that a respondent will express support. For example, each additional indexed publication is associated with a 5% increase in the odds of supporting the reliance on international databases. Regarding the control variables, we find no evidence that age or affiliation with a top Russian university affects support for the policy. A sketch of the estimation follows Table 3.

Table 3 Logistic regression model results (odds ratios; standard errors in parentheses)

| VARIABLES | (1) | (2) | (3) | (4) | (5) | (6) |
| Age 30-39 | 1.527 (0.793) | 1.460 (0.781) | 1.532 (0.836) | 1.340 (0.757) | 1.675 (0.883) | 1.532 (0.836) |
| Age 40-49 | 1.348 (0.664) | 1.402 (0.731) | 1.334 (0.705) | 1.389 (0.738) | 1.330 (0.662) | 1.334 (0.705) |
| Age 50-59 | 1.281 (0.648) | 1.524 (0.807) | 1.377 (0.741) | 1.230 (0.676) | 1.191 (0.611) | 1.377 (0.741) |
| Age 60-69 | 0.882 (0.451) | 0.967 (0.521) | 0.874 (0.477) | 0.776 (0.431) | 0.840 (0.434) | 0.874 (0.477) |
| Age >70 | 0.990 (0.654) | 1.165 (0.799) | 1.117 (0.771) | 0.854 (0.608) | 0.982 (0.652) | 1.117 (0.771) |
| Top univ | 1.592 (0.510) | 1.346 (0.453) | 1.320 (0.465) | 1.617 (0.562) | 1.582 (0.531) | 1.320 (0.465) |
| G-L scale | | 0.762** (0.0848) | 0.762** (0.0877) | | | 0.762** (0.0877) |
| N publications | | | 1.049** (0.0247) | | | 1.049** (0.0247) |
| Avg SNIP | | | | 1.940** (0.622) | | |
| N citations | | | | | 1.066* (0.0390) | |
| Constant | 0.442* (0.196) | 0.553 (0.255) | 0.486 (0.226) | 0.316** (0.158) | 0.399** (0.178) | 0.486 (0.226) |
| Observations | 311 | 292 | 275 | 271 | 294 | 275 |
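The odds ratios in Table 3 are exponentiated logistic regression coefficients. Below is a minimal sketch of the estimation using statsmodels, assuming a prepared analysis frame; all variable names are illustrative, not the authors' actual code:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis frame: one row per respondent, combining the
# survey answers with the matched bibliometric indicators (all variable
# names are illustrative).
df = pd.read_csv("survey_with_metrics.csv")

# A model combining the controls with the attitudinal scale and the
# number of indexed publications, as in column (3) of Table 3.
model = smf.logit(
    "support_policy ~ C(age_group) + top_univ + gl_scale + n_publications",
    data=df,
).fit()

odds_ratios = np.exp(model.params)  # exponentiated coefficients
print(odds_ratios.round(3))
# An odds ratio of about 1.05 for n_publications means each additional
# indexed publication raises the odds of supporting the policy by ~5%.
```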

5. Conclusion

The aim of this study was to understand whether academics in general are complacent about the use of metrics for research evaluation and whether attitudes differ between subdisciplinary communities differentiated by their orientation to global versus local science. We found that although the majority of sociologists do not support giving more value to publications that appeared in international databases, 30% of respondents expressed a positive attitude towards this policy. Our main finding is that academics oriented to global science, as measured by both their declarations and their publication behavior, generally have a more positive attitude towards using metrics in research evaluation, while academics committed to the local path generally have a more negative attitude.

Our results contribute to research on understanding the variance in the support of metrics. Studying the legitimacy of rules is important because it affects the potential impact of proposed rules: “the more acceptance it has, the greater the possibility for effects” (Söderlind & Geschwind, 2020). We showed that the legitimacy of a formal policy is related to deep academic values that differ between sub-communities even within the boundaries of a single discipline. We demonstrated that not only do the rewards respondents receive by playing by the rules of the game matter (they support metrics if they are successful according to those metrics), but academic cultural attitudes also reinforce the use of metrics.

Open science practices

Data have not been made available to other researchers because of difficulties related to respondent anonymization: we used bibliometric information that would make it possible to identify respondents in the citation database.

Funding information

The paper has been prepared with the support of the Russian Science Foundation, No. 21-18-00519.

Competing interests

I have no competing interests.

References

Auranen, O., & Nieminen, M. (2010). University research funding and publication performance—An international comparison. Research Policy, 39(6), 822–834. https://doi.org/10.1016/j.respol.2010.03.003

Akbaritabar, A., Casnici, N., & Squazzoni, F. (2018). The conundrum of research productivity: A study on sociologists in Italy. Scientometrics, 114(3). https://doi.org/10.1007/s11192-017-2606-5

Buela-Casal, G. & Zych, I. (2012). What do the scientists think about the impact factor? Scientometrics, 92, 281–292. https://doi.org/10.1007/s11192-012-0676-y

Hammarfelt, B. & Rushforth, A. (2017). Indicators as judgment devices. An empirical study of citizen bibliometrics in research evaluation. Research Evaluation, 26(3), 169–180. https://doi.org/10.1093/reseval/rvx018

Hicks, D. (2012). Performance-based university research funding systems. Research Policy, 41(2). https://doi.org/10.1016/j.respol.2011.09.007

Hokka, J. (2018). What counts as ‘good sociology’? Conflicting discourses on legitimate sociology in Finland and Sweden. Acta Sociologica, 62(4), 357–371. https://doi.org/10.1177/0001699318813422

Gantman, E. R. & Fernández, R. (2016). Literature segmentation in management and organization studies: The case of Spanish-speaking countries (2000–10). Research Evaluation. https://doi.org/10.1093/reseval/rvv031

Koch, T. & Vanderstraeten, R. (2018). Internationalizing a national scientific community? Changes in publication and citation practices in Chile, 1976–2015. Current Sociology, 67(5). https://doi.org/10.1177/0011392118807514

Mosbah-Natanson, S. & Gingras, Y. (2014). The globalization of social sciences? Evidence from a quantitative analysis of 30 years of production, collaboration and citations in the social sciences (1980–2009). Current Sociology, 62(5), 626–646. https://doi.org/10.1177/0011392113498866

Sokolov, M. (2018). The sources of academic localism and globalism in Russian sociology: The choice of professional ideologies and occupational niches among social scientists. Current Sociology, 67(1). https://doi.org/10.1177/0011392118811392

Söderlind, J. & Geschwind, L. (2020). Disciplinary Differences in Academics' Perceptions of Performance Measurement at Nordic Universities. Higher Education Governance and Policy, 1(1). https://dergipark.org.tr/en/pub/hegp/issue/55277/758741
