
The Robustness analysis of CAS Journal Ranking against “Covidization” of research

Yahui Liu1,2 Liying Yang1 Zhesi Shen1*

*shenzhs@mail.las.ac.cn

1 National Science Library, Chinese Academy of Sciences, 100190, Beijing, China;

2 Department of Information Resources Management, School of Economics and Management, University of Chinese Academy of Sciences, 100190, Beijing, China

Abstract The covid-19 pandemic provides a natural dataset to examine the performance of evaluation indicators in times of crisis and to assess their ability to adapt to rapidly changing scientific landscapes. In this paper, we explore the robustness of the CAS Journal Ranking, one of the most widely used journal ranking systems in China, in response to the “Covidization” of research. The results show that the evaluation indicator of the CAS Journal Ranking is relatively robust, i.e., the FNCSI is robust to highly cited papers and the CWTS paper-level classification system helps to reduce the impact of the covid-19 pandemic. Future research could focus on how to evaluate public emergency-related publications classified into non-emergency categories, and analyze whether the high impact of these publications results from the academic value of the research itself or from the pursuit of research hotspots.

Keywords Covid-19, CAS Journal Ranking, Robustness, Journal Impact Factor, Field Normalized Citation Success Index, Category Normalized Citation Impact

1. Introduction

Balancing the robustness and sensitivity of evaluation indicators is one of the main issues in evaluation research, which in the past could only be analyzed and predicted with simulation data. Public emergencies, such as the covid-19 pandemic, provide a natural experiment to test the robustness of evaluation indicators. This historical public health emergency has significantly altered the global scientific ecosystem, with widespread implications for research priorities (Yong, 2021; Hill et al., 2021; Shapira, 2020), funding (Medical Research Council, 2020; National Natural Science Foundation of China, 2020; National Institutes of Health, 2020), collaboration (Lee & Haupt, 2021; Cai et al., 2021) and publishing patterns (Aviv-Reuven & Rosenfeld, 2021; Horbach, 2020). This shift of the research enterprise and the massive production and citation of related publications (the “Covidization” of research) has had substantial ramifications for quantitative evaluation in various contexts, including journal evaluation.

The Journal Impact Factor (JIF) is widely used in the evaluation of academic journals and in the assessment of the research productivity of individual researchers, departments and institutions. Based on the Journal Citation Reports (JCR, 2022 version according to Clarivate1), for a substantial fraction of journals the significant increase in JIF was due to the publication and citation of covid-19 papers (Zhang et al., 2022; Liu et al., 2022). Liu and Wang (2022) confirm the citation premium of covid-19 papers in a large-scale investigation using SCIE and SSCI. Mondal et al. (2022) analyze recent trends in standard bibliometrics (JIF, Eigenfactor, SNIP) of pediatric journals and find that the JIF and SNIP increased considerably in 2020, while the Eigenfactor remained stable between 2018 and 2020. Fassin (2021) finds that the covid-19 pandemic has a stronger promoting effect on the h-index and JIF of specialized journals, predicting that the impact will last at least until 2024. Considering these fluctuations, if the JIF continues to be used as a benchmark of excellence, it may take an unequal toll on researchers, worsening existing disparities and amplifying the inadequacy of research evaluation processes.

In this paper, we focus on the CAS Journal Ranking, one of the most widely used journal ranking systems in China: nearly 500 universities and research institutions subscribe to it as a reference for academic submission by researchers and for scientific evaluation by management departments (Huang et al., 2021). In addition, some scholars conduct scientific research with reference to its ranking results (Quan et al., 2017; Li et al., 2023). The ranking adopted the Citation Success Index (CSI) algorithm proposed by Milojevic et al. (2017) and further integrated a paper-level classification system, eventually forming the field normalized CSI (FNCSI). Tong et al. (2020) verify, using simulation data, that the evaluation indicator of the CAS Journal Ranking is robust against extremely highly cited publications and wrongly assigned document types. Given its prominent position and strong influence in China’s scientific research evaluation, it is crucial to investigate the performance of the CAS Journal Ranking in the context of the covid-19 pandemic and to assess its ability to adapt to rapidly changing scientific landscapes.

2. Method and Data

2.1 Journals and publication data

The CAS Journal Ranking covers the journals contained in Clarivate’s JCR. Based on the latest version of the JCR, the citation data comprise citations received in 2021 by each article and review published in 2019 and 2020. We collected 4,045,054 article and review papers published in 12,401 journals indexed in the Web of Science Core Collection (Science Citation Index Expanded and Social Sciences Citation Index) between 2019 and 2020. Among them are 36,916 covid-19 papers, identified by following the retrieval strategy of the World Health Organization COVID-19-Database2.
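The sketch below illustrates how such a publication set could be assembled in Python; the file name, column names and the simplified covid-19 pattern are illustrative assumptions, not the actual export schema or the full WHO search strategy.

```python
# Illustrative sketch of the data preparation step. Assumptions: Web of Science
# records have been exported to a local table, and the file/column names used
# here ("wos_records_2019_2020.csv", "doc_type", "pub_year", "title_abstract")
# are hypothetical.
import re
import pandas as pd

records = pd.read_csv("wos_records_2019_2020.csv")

# Keep only articles and reviews published in 2019-2020, whose citations
# received in 2021 form the citation window described above.
papers = records[
    records["doc_type"].isin(["Article", "Review"])
    & records["pub_year"].isin([2019, 2020])
].copy()

# Flag covid-19 papers with a simplified pattern inspired by the WHO COVID-19
# database search strategy (the real strategy is considerably more detailed).
covid_pattern = re.compile(
    r"covid[- ]?19|sars[- ]?cov[- ]?2|2019[- ]?ncov|novel coronavirus",
    flags=re.IGNORECASE,
)
papers["is_covid"] = papers["title_abstract"].fillna("").str.contains(covid_pattern)

print(len(papers), "article/review papers,", int(papers["is_covid"].sum()), "flagged as covid-19")
```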

2.2 The classification system data

We utilize the CWTS paper-level classification, in which each paper is assigned to a certain topic. This classification system has three levels of granularity: macro, meso and micro. Here we use the micro level, with about 4,000 topics. The JCR subject categories are used for comparison: based on the JCR journal categorization, all journals are grouped into 254 categories, and each journal may be assigned to one or more subjects.

2.3 Journal ranking indicators

The FNCSI is defined as the probability that the citation count of a paper from a journal is larger than that of a random paper on the same topic and with the same document type from other journals (Shen et al., 2018). To investigate the robustness of the FNCSI and the advantage of the CAS Journal Ranking system, we compare the performance of the JIF, FNCSI and CNCI in response to the “Covidization” of research under the CWTS paper-level classification system and the JCR subject categories. More details about the FNCSI and CNCI are given below, followed by a short computational sketch.

  • Field Normalized Citation Success Index (FNCSI)

    For journal A, the probability P(\(c_{a}\) > \(c_{o}\) | a∈A, o∈O) that the citation count of a paper from journal A is larger than that of a random paper on the same topic and with the same document type from other journals is defined as follows:

    \(S_{A} = P(c_{a} > c_{o} \mid a \in A, o \in O) = \sum_{t,d} P(A^{t,d}) \, P(c_{a} > c_{o} \mid a \in A^{t,d}, o \in O^{t,d})\)

    where \(c_{a}\) is the citation count of a paper from journal A, t∈{topic1, topic2, ...}, d∈{article, review}, \(A^{t,d}\) represents the publications of journal A in topic t with document type d, and \(O^{t,d}\) represents the publications in topic t with document type d from all other journals.

  • Category Normalized Citation Impact (CNCI)

    The Category Normalized Citation Impact (CNCI) uses the same classification system as the CSI but normalizes by average citations, i.e., each citation count is divided by the average citation count of papers in the same topic cluster and with the same document type. The CNCI of journal A is defined as:

    \(F_{A} = \frac{\sum_{t,d} \sum_{a \in A^{t,d}} c_{a} / \mu_{t,d}}{N_{A}}\)

    where \(\mu_{t,d}\) is the average citation count of papers in topic t with document type d, and \(N_{A}\) is the total number of papers published in journal A.

    A similar calculation is done for the journal-level classification system, the JCR subject categories, where each paper is assigned to the categories of journal A and then aggregated.
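As a rough illustration of the two definitions above, the following Python sketch computes the FNCSI and CNCI of a journal from a toy paper-level table; the column names are assumptions, and ties in the pairwise citation comparison are simply ignored, which may differ from the published FNCSI implementation.

```python
# Toy sketch of the FNCSI and CNCI formulas above. Assumes a DataFrame with
# columns "journal", "topic", "doc_type" and "cites"; tie handling in the
# pairwise comparison is simplified.
import pandas as pd

def fncsi(papers: pd.DataFrame, journal: str) -> float:
    own = papers[papers["journal"] == journal]
    rest = papers[papers["journal"] != journal]
    score = 0.0
    for (topic, dtype), group in own.groupby(["topic", "doc_type"]):
        others = rest[(rest["topic"] == topic) & (rest["doc_type"] == dtype)]
        if others.empty:
            continue
        weight = len(group) / len(own)            # P(A^{t,d})
        a = group["cites"].to_numpy()[:, None]
        o = others["cites"].to_numpy()[None, :]
        score += weight * float((a > o).mean())   # P(c_a > c_o | a in A^{t,d}, o in O^{t,d})
    return score

def cnci(papers: pd.DataFrame, journal: str) -> float:
    # mu_{t,d}: average citations of all papers in the same topic and document type.
    mu = papers.groupby(["topic", "doc_type"])["cites"].transform("mean")
    normalized = papers["cites"] / mu
    return float(normalized[papers["journal"] == journal].mean())

# Synthetic example.
toy = pd.DataFrame({
    "journal":  ["A", "A", "A", "B", "B", "C", "C"],
    "topic":    [1, 1, 2, 1, 2, 1, 2],
    "doc_type": ["article"] * 7,
    "cites":    [10, 3, 7, 2, 5, 1, 4],
})
print(fncsi(toy, "A"), cnci(toy, "A"))
```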

3. Results

3.1 Changes of indicators over time

Journal impact indicators should be stable over time, as a journal’s reputation and quality do not change dramatically. In this section, to measure the stability of the journal indicators, we calculate the JIF, FNCSI and CNCI for the previous edition as a comparison, based on citations received in 2020 by 3,573,997 article and review papers published in 2018 and 2019.

Figure 1 shows the changes in the JIF, FNCSI and CNCI of 12,051 overlapping journals from 2020 to 2021. The JIF clearly changes more substantially; when the FNCSI is used to measure journal impact, both the blue squares (based on the CWTS paper-level classification system) and the orange dots (based on the JCR subject categories) scatter close to the diagonal line. Figure 2 shows the increase rate of the FNCSI and CNCI; the variation range of the FNCSI is much smaller than that of the CNCI once outliers are ignored. The FNCSI is therefore relatively more robust and less affected by changes over time.
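A minimal sketch of this year-over-year comparison is given below, assuming the indicator values of the two editions are available per journal; the file and column names are illustrative.

```python
# Sketch of the stability check behind Figure 2: the increase rate of an
# indicator between the 2020 and 2021 editions. File/column names are
# hypothetical placeholders.
import pandas as pd

ind_2020 = pd.read_csv("fncsi_2020.csv", index_col="journal")["fncsi"]
ind_2021 = pd.read_csv("fncsi_2021.csv", index_col="journal")["fncsi"]

both = pd.concat({"y2020": ind_2020, "y2021": ind_2021}, axis=1).dropna()
both["increase_rate"] = (both["y2021"] - both["y2020"]) / both["y2020"]

# Summarize the distribution; extreme outliers can be judged from the percentiles.
print(both["increase_rate"].describe(percentiles=[0.05, 0.25, 0.5, 0.75, 0.95]))
```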

Figure 1: The JIF, FNCSI and CNCI of Journals for 2020 and 2021

Figure 2: The Increase Rate of FNCSI and CNCI from 2020 to 2021


3.2 Results of ranking changes

This section presents the ranking changes based on the FNCSI and CNCI under different classification systems. The robustness of an indicator reflects its sensitivity to changes in the set of publications on which it is calculated. To measure the extent to which the journal impact indicators are affected by covid-19, we calculate these indicators and then re-calculate them after removing covid-19 papers and the citations they contribute.

The comparison of the journal rankings based on the modified data with the original rankings is shown in Figure 3. Almost all the orange dots (FNCSI) lie close to the diagonal line, while the blue squares (CNCI) are spread much more broadly, which implies that the FNCSI reduces the effect of highly cited papers and is more robust than the CNCI against the “Covidization” of research. Moreover, the range of ranking changes based on the FNCSI is smaller under the CWTS paper-level classification system than under the JCR subject categories.

We further explore the extent of change in journal rankings when different indicators are used to measure journal impact. Figure 4 shows the distribution of ranking changes based on the FNCSI and CNCI under the CWTS paper-level classification system and the JCR subject categories. Journals are more affected by covid-19 papers in the “JCR-CNCI” condition (the green line), with about 25% of journals concentrated in the middle part of the distribution (ranking changes in the range of 100-250); in addition, journals whose rankings changed by more than 1,000 positions account for 4.5% in this condition. In the “CWTS-FNCSI” condition (the red line), the range of ranking changes is comparatively smaller and the distribution is skewed such that the rankings of over 80% of journals changed by less than 100 positions, which indicates that the FNCSI is most robust under the CWTS paper-level classification system.
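The procedure can be sketched as follows, reusing the `papers` table and the `fncsi` function from the earlier sketches; note that fully removing the citations contributed by covid-19 papers would also require the citing-cited link table, which is omitted here.

```python
# Sketch of the robustness comparison: recompute an indicator after dropping
# covid-19 papers, then measure how journal ranks move. Reuses `papers` (with
# its "is_covid" flag) and `fncsi` from the earlier sketches.
import pandas as pd

def rank_of(scores: pd.Series) -> pd.Series:
    return scores.rank(ascending=False, method="min")  # rank 1 = highest value

journals = papers["journal"].unique()

full_scores    = pd.Series({j: fncsi(papers, j) for j in journals})
reduced        = papers[~papers["is_covid"]]
reduced_scores = pd.Series({j: fncsi(reduced, j) for j in journals})

rank_change = (rank_of(full_scores) - rank_of(reduced_scores)).abs()
print("share of journals moving fewer than 100 positions:", (rank_change < 100).mean())
```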

Figure 3: Robustness against “Covidization” of Research for FNCSI and CNCI


Figure 4: Distribution of Ranking Changes based on FNCSI and CNCI


3.3 Outlier journals in the “CWTS-FNCSI” condition

Although the FNCSI is not easily affected by the “Covidization” of research, there are still some outlier journals, i.e., journals whose rankings change substantially after the removal of covid-19 papers. In this section we explore the causes of these outliers by analyzing the papers published in them.

Under the CWTS paper-level classification system, the vast majority (64.42%) of the 36,916 covid-19 papers published in 2019 and 2020 belong to category 42 (Table 1), which includes terms such as “severe covid; main protease; mers cov; severe acute respiratory syndrome coronavirus; spike protein”, so they do not affect the normalization of other groups. However, covid-19 papers classified into non-coronavirus categories may receive a disproportionate number of citations, to the extent that they dominate those categories. Therefore, we investigate the categories to which the papers published in the outlier journals belong and the FNCSI of these journals in each category.

Table 2 shows the ten journals with the largest decrease in FNCSI after removing covid-19 papers; their rankings drop by more than 3,000 positions. Most covid-19 papers published in these journals are classified into non-coronavirus categories. The FNCSI of covid-19 papers in these other categories is larger than that of covid-19 papers in category 42, and much larger than that of non-covid-19 papers in the same categories. In other words, the citation advantage gained by covid-19 papers in non-coronavirus categories makes a journal an outlier, with a significant drop in FNCSI after these papers are removed.
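This diagnosis can be reproduced with a short sketch like the one below, which tabulates the category distribution of covid-19 papers (as in Table 1) and, for a given journal, the share of its covid-19 papers falling outside the coronavirus category; the `meso_topic` column and the journal name are hypothetical.

```python
# Sketch of the outlier diagnosis. Assumes `papers` carries a hypothetical
# "meso_topic" column with the CWTS meso-level assignment of each paper.
import pandas as pd

covid = papers[papers["is_covid"]]

# Top 10 meso-level categories by share of covid-19 papers (cf. Table 1).
print(covid["meso_topic"].value_counts(normalize=True).mul(100).round(2).head(10))

def covid_share_outside(journal: str, corona_category: int = 42) -> float:
    """Share of a journal's covid-19 papers classified outside category 42."""
    own = covid[covid["journal"] == journal]
    return float((own["meso_topic"] != corona_category).mean())

# e.g. covid_share_outside("Some Outlier Journal")  # hypothetical journal name
```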

Table 1: Distribution of covid-19 papers among meso-level categories

Meso-level ID   Number of articles   Percentage
42              23,782               64.42%
360             741                  2.01%
15              289                  0.78%
274             278                  0.75%
45              253                  0.69%
510             247                  0.67%
114             238                  0.64%
141             210                  0.57%
184             196                  0.53%
103             188                  0.51%

4. Conclusion and Discussion

The covid-19 pandemic has caused unusual citation patterns and has had profound consequences for journal evaluation indicators, severely affecting the validity of impact indicators. In this paper, we assess the robustness and sensitivity of the CAS Journal Ranking in response to the covid-19 pandemic through a comprehensive scientometric analysis. The results indicate that the CAS Journal Ranking is relatively robust, i.e., the FNCSI is robust to highly cited papers and the CWTS paper-level classification system helps to reduce the impact of the covid-19 pandemic. The disproportionate citation advantage of covid-19 papers in non-coronavirus categories is the main reason for the significant decrease in FNCSI of the outlier journals after removing covid-19 papers.

This study has some limitations. Future research may focus on the following:

  • Investigate the stability of indicators by analyzing changes in indicators before and after public emergencies.

    This may include comparing the indicators of the same journals to determine whether there is a significant difference, and comparing changes in ranking or quartile distribution for different journals in the same field. Over several years, we could then confirm whether the changes in indicators are temporary or long-term.

  • Discuss how to evaluate public emergency-related publications classified into other categories.

    For example, it is possible to compare the field-normalized impact of covid-19 papers and non-covid-19 papers originally in the same category from the perspective of the citing literature, following the journal evaluation indicator Source Normalized Impact per Paper3 (SNIP). We could then verify whether the high impact results from the academic value of the research itself or from the pursuit of research hotspots, and whether a more reasonable classification of publications related to public emergencies is necessary.

  • Explore the robustness of indicators by measuring the effect of different public emergencies.

    The current study only reflects the robustness of the FNCSI in the context of the covid-19 pandemic. Future research could measure the changes of indicators under various public emergencies and thereby assess the advantage of the CAS Journal Ranking in journal evaluation.

New indicators may be proposed and existing indicators may be iteratively optimized in the future, all of which may have their own defects. It is the social responsibility of the journal ranking organization to disclose such defects openly and transparently and to guide the academic community to use evaluation indicators properly. We hope to provide valuable insights into the strengths and weaknesses of the CAS Journal Ranking system and to inform future improvements and refinements in the evaluation of scientific journals. This also supports the plea for responsible metrics of the European-wide Coalition for Advancing Research Assessment (CoARA).

Open science practices

The data used in our paper are publicly available. We believe that open science practices are essential for ensuring transparency and accountability in research. We therefore encourage other researchers to use our data and replicate our findings by collecting data from the Web of Science following the search strategy of the World Health Organization COVID-19-Database4 and Journal Citation Reports5. We hope to contribute to a more open and collaborative research environment.

Author contributions

Yahui Liu: formal analysis, investigation, visualization, writing - original draft preparation

Liying Yang: writing - review & editing

Zhesi Shen: conceptualization, investigation, writing - review & editing

Competing interests

All authors disclosed no relevant relationships.

References

    Aviv-Reuven, S., & Rosenfeld, A. (2021). Publication patterns’ changes due to the COVID-19 pandemic: a longitudinal and short-term scientometric analysis. Scientometrics, 126(8), 6761-6784.

    Cai, X., Fry, C. V., & Wagner, C. S. (2021). International collaboration during the COVID-19 crisis: Autumn 2020 developments. Scientometrics, 126(4), 3683-3692.

    Fassin, Y. (2021). Research on Covid-19: a disruptive phenomenon for bibliometrics. Scientometrics, 126(6), 5305-5319.

    Hill, R., Yin, Y., Stein, C., Wang, D., & Jones, B. F. (2021). Adaptability and the pivot penalty in science. arXiv preprint arXiv:2107.06476.

    Horbach, S. P. (2020). Pandemic publishing: Medical journals strongly speed up their publication process for COVID-19. Quantitative Science Studies, 1(3), 1056-1067.

    Huang, Y., Li, R., Zhang, L., & Sivertsen, G. (2021). A comprehensive analysis of the journal evaluation system in China. Quantitative Science Studies, 2(1), 300-326.

    Lee, J. J., & Haupt, J. P. (2021). Scientific globalism during a global crisis: Research collaboration and open access publications on COVID-19. Higher Education, 81(5), 949-966.

    Li, L., Yu, L., Ming, Y., Minhao, W., Fuyou, C., Zhesi, S., & Liying, Y. (2023). Influence of JIF and Journal Tier on Submission Behaviors in Different Countries——Based on Monthly Accepted Papers of NPG Journals. Data Analysis and Knowledge Discovery, 6(12), 43-52.

    Liu, W., & Wang, H. (2022). Citation premium: a much higher proportion of COVID-19 related publications become highly cited papers. arXiv preprint arXiv:2208.11991.

    Liu, Y., Zhang, J., Yang, L., et al. (2022). The Effect of Covid-related Publications on Journal Impact Factor (in Chinese). ChinaXiv. CSTR:32003.36.ChinaXiv.202211.00358.V1.

    Medical Research Council. (2020). COVID-19 Rapid Response Call. Retrieved 4 February 2020 https://webarchive.nationalarchives.gov.uk/ukgwa/20200419142045/https://mrc.ukri.org/funding/browse/2019-ncov-rapid-response-call/2019-ncov-rapid-response-call/.

    Milojevic, S., Radicchi, F., & Bar-Ilan, J. (2017). Citation success index - An intuitive pair-wise journal comparison metric. Journal of Informetrics, 11(1), 223-231. doi:10.1016/j.joi.2016.12.006

    Mondal, P., Mazur, L., Su, L., Gope, S., & Dell, E. (2022). The upsurge of impact factors in pediatric journals post COVID-19 outbreak: a cross-sectional study. Frontiers in Research Metrics and Analytics, 7.

    National Institutes of Health. (2020). Estimates of funding for various research, condition, and disease categories (RCDC). Retrieved February 2020 https://report.nih.gov/categorical_spending.aspx.

    National Natural Science Foundation of China. (2020). Special project guide of “Fundamental research on origin, pathopoiesis and prevention of 2019-nCov” (in Chinese). Retrieved 24 February 2020 https://www.nsfc.gov.cn/publish/portal0/tab440/info77422.htm.

    Quan, W., Chen, B., & Shu, F. (2017). Publish or impoverish: An investigation of the monetary reward system of science in China (1999-2016). Aslib Journal of Information Management, 69(5), 486-502.

    Shapira, P. (2020). Scientific publications and COVID-19 “research pivots” during the pandemic: An initial bibliometric analysis. bioRxiv, 2020-12.

    Shen, Z., Yang, L., & Wu, J. (2018). Lognormal distribution of citation counts is the reason for the relation between Impact Factors and Citation Success Index. Journal of Informetrics, 12(1), 153-157.

    Tong, S., Shen, Z., Chen, F., & Yang, L. (2020). The novel utilization of paper-level classification system on the evaluation of journal impact: An update in CAS Journal Ranking. arXiv e-prints, arXiv-2006.

    Yong, E. (2021). How science beat the virus. The Atlantic, 327(1), 48-58.

    Zhang, J., Liu, Y., & Shen, Z. (2022). Covid-related Papers Contribute 50% on the JIF of High Impact Medicine Journals. Journal of Data and Information Science.


  1. https://jcr.clarivate.com/jcr/browse-journals

  2. https://www.who.int/docs/default-source/coronaviruse/who-covid-19-database/who-covid-19-database-search-strategy.pdf?sfvrsn=113c354e_1

  3. https://www.journalindicators.com/

  4. https://www.who.int/docs/default-source/coronaviruse/who-covid-19-database/who-covid-19-database-search-strategy.pdf?sfvrsn=113c354e_1

  5. https://jcr.clarivate.com/jcr/browse-journals
