Broadening the conception of ‘what counts’ – example of a narrative CV in a university alliance

Research(er) assessment is currently under reform. For example, the CoARA initiative aims to build critical mass for changing assessment systems to rely more on qualitative assessment instead of metrics. There are also articulated needs to recognise the diversification of academic careers and of contributions in academic work. For example, despite the aims of Open Science, assessment systems rarely take researchers' Open Science contributions into account. This paper describes a practical approach that aims to help universities broaden the ways researchers are assessed, for example for the purposes of academic recruitment. It presents a narrative CV template covering diverse aspects of academic work. The paper is based on work conducted in YUFERING, a joint project of the European university alliance YUFE.


Introduction
University researchers, as a special kind of knowledge professional, possess a high degree of job autonomy. Their own values affect which academic activities and outputs they see as the most important and, hence, how they prioritise their work. Still, to be eligible for academic recruitment and promotion, researchers also have to consider how universities recognise and reward different outputs and activities. For researchers, universities' assessment systems produce incentives for how to prioritise work tasks (cf. Pietilä 2019). Dominant research assessment systems have long been criticised for the overemphasis and/or misuse of metrics, especially the journal impact factor and the h-index (Rice et al. 2020; Niles et al. 2020). There have been calls for putting more emphasis on the quality and content of researchers' contributions (e.g., CoARA 2022). In addition, the expectations that societal stakeholders place on the contributions of researchers (and universities) are today highly diverse (Pietilä & Pinheiro 2021). There is demand for more collaborative working patterns within academia (e.g., sharing of research data and code), for science-society interaction, and for mobility between academia and other sectors. Still, the dominant assessment systems remain one-sided, focusing predominantly on traditional research outputs. For example, despite policy efforts towards Open Science, universities' recruitment practices do not, on average, consider researchers' contributions to Open Science (Khan et al. 2022; Pontika et al. 2022).
Based on a review of central documents (declarations, recommendations, policy papers), examples of universities' assessment practices, research interviews, and existing narrative CVs for research assessment, the paper presents one practical example of how universities as research-performing organisations aim to respond to changes in their institutional environments. It presents a framework for assessing the diverse contributions and achievements of researchers, and a narrative portfolio template designed to help universities widen the set of dimensions researchers are assessed against. The work has been conducted jointly in the YUFERING project of the university alliance YUFE (Young Universities for the Future of Europe).

Based on universities' review, promotion, and tenure documents in seven countries, the study by Pontika et al. (2022) found that many universities' policies still rely on traditional criteria. Criteria related to engagement with stakeholders or the public, let alone policies acknowledging researchers' Open Science activity, were much rarer. In addition, many universities continue to apply problematic quantification of contributions. Another study by the same group (Ross-Hellauer et al. 2023) found a mismatch between the valuations of researchers and those of their institutions: researchers valued mentoring, peer review, collegiality, and Open Science activities more than their institutions did.
Universities' policies are partly a result of funding allocation models (Pruvot & Estermann 2022) and other national-level steering mechanisms, which in many cases emphasise quantitative indicators. This means it may be difficult for universities to change their assessment systems if national incentivisation systems remain untouched. Still, the imperative for change remains, as outlined by Pruvot and Estermann (2022, 6), who state that 'core funding must reflect the growing responsibilities of universities'.
It has been stated that because of the strong incentives towards certain kinds of activity, such as publishing in academic outlets (often behind paywalls), the tasks of researchers have been narrowed (de Rijcke et al. 2016) and do not necessarily reflect the needs of society (Rice et al. 2020). For example, contributions to teaching risk being under-recognised in career progression (Macfarlane 2005; Pietilä 2019). Because of the dominant incentives and the pressure to adapt to them, researchers' focus may shift from 'what is considered relevant to what pays off' (Robinson-Garcia et al. 2023). As a reaction, CoARA (the Coalition for Advancing Research Assessment), for example, aims, among other things, at recognising the diversity of contributions to and careers in research.

Design
When adhering to initiatives such as CoARA, universities need to consider what kinds of contributions they especially value at different career stages and in different academic fields, how they could assess these contributions, and what kinds of biases are likely to evolve. To do that, universities also need tools, such as frameworks and templates, to work with. This paper introduces one such tool: the YUFERING portfolio, developed in the YUFERING project. In addition, we reviewed the YUFE universities' recognition and reward structures related to academic recruitment and promotion (interviews and documentary analysis in 2021). The data collection was supplemented with visits to four YUFE universities in fall 2022 (supplementary interviews with HR staff and researchers). At the University of Eastern Finland, we conducted a more comprehensive round of interviews with researchers at different career levels and in different disciplinary fields (views on the central components of the current assessment system and how researchers would like to change it). Some of these interviews were aimed at testing and further developing the portfolio format.
We validated the different versions of the framework and the portfolio in consultations with experts from the YUFE universities. Special focus in the framework and the portfolio was put on researchers' contributions to Open Science, because promoting Open Science is a strategic focus of the university alliance.

Outcome
Table 1 presents the main areas identified as relevant in academic work within the YUFE universities. It also lists possible key outputs, examples, and contributions in each area. In the framework, research and teaching (including supervision) are the main areas of academic work. Community engagement and societal outreach represent aspects of researchers' interaction with societal groups and wider impact. Teamwork and leadership represent the researcher's contribution to the research group, the scholarly community, and the university community. A practical outcome, the so-called YUFERING portfolio, then represents a researcher-driven narrative CV, which is expected to be backed up with evidence (data) (see Table 2). The portfolio aims to help universities in the alliance diversify researcher assessment to acknowledge researchers' contributions and merits not only in research, but also in teaching and supervision; community engagement and societal interaction; as well as teamwork, leadership, and management. Universities in different countries face different regulatory environments (e.g., dependence on or autonomy from the state in employment issues), institutional strategies, and organisational traditions. In addition, the structure of the portfolio should be adapted to the specific case (e.g., the position to be recruited), with needs for certain competences or skills (e.g., when recruiting to a research-intensive position with no teaching obligations). Thus, the portfolio serves as a generic tool that considers different national and organisational contexts and allows for flexibility, local adaptations, and tailoring for individual assessment cases. If used in recruitment or promotion, the expected skills and competences should be openly and transparently communicated.
We acknowledge that the databases for the outputs and activities listed as relevant in the portfolio vary across national contexts and universities, often with serious deficiencies (apart from traditional publications). Thus, for many of the outputs and activities, the information depends on the researcher's own description, lowering the comparability and reliability of the data. Still, the structure of the portfolio, with its ready-made questions and the arguments required, provides universities with more comparable information than, for example, a completely free-form motivation letter.

Conclusions and Discussion
The paper presents one practical example through which universities and their alliances aim to broaden their assessment systems. Changes in incentivisation schemes that target organisational structures, such as recruitment and promotion processes, could potentially have long-term effects on individual behaviour. However, universities' recruitment and promotion processes are complex and involve multiple actors, including selection committees and external reviewers (Pietilä & Pinheiro 2021). Therefore, it is not easy to affect them, let alone the underlying values that underpin the selection processes (e.g., what kinds of contributions are seen as valuable in a particular field). Despite the focus on wider recognition of contributions in academic recruitment, what matters most is implementation. From the perspective of universities, implementation is affected by institutional and field-specific priorities (and realities, such as incentives created by national funding systems) and by what kinds of contributions are seen as legitimate for the final decisions (for example, the weight Open Science contributions are given in the overall assessment).
Part of the portfolio (the second section) is being piloted in 2023 in the selection of postdoctoral researchers for an MSCA COFUND program (YUFE4Postdocs). In the first call of the program, 29 three-year postdoctoral positions are opened in spring 2023. The questions from the portfolio are integrated into the structured curriculum vitae used in the call. Unfortunately, the timing of the selection process does not allow us to present data on experiences of using the portfolio. Still, piloting in a pan-European postdoctoral recruitment demonstrates the versatility of the generic and flexible portfolio model.

Open science practices
The project members discussed Open Science practices at the beginning of the project. It was decided jointly not to open the data collected in the project. For example, some data were gathered primarily from the perspective of developing the YUFE universities' current policies and practices. It would have been especially difficult to open the research interview data because of the sensitivity of the topic (which partly concerned the working conditions of researchers) and the need to protect the anonymity of participants. We do not use any openly available data (except for the openly available documents; see the references at the end). When publishing, we make the publications openly available whenever possible.
The research-based development of this novel, practical tool for researcher assessment draws on several datasets. These included a set of central documents dealing with principles of responsible research assessment (e.g., the DORA Declaration 2013; the Leiden Manifesto by Hicks et al. 2015), policy papers or agreements (e.g., CoARA 2022), central reports on Open Science indicators with a connection to career assessment (e.g., OS-CAM by O'Carroll et al. 2017 and the report by Wouters et al. 2019), national initiatives on responsible research assessment (e.g., Room for Everyone's Talent in the Netherlands by VSNU et al. 2019, and Good Practice in Researcher Evaluation: Recommendation for the Responsible Evaluation of a Researcher in Finland by the Committee for Public Information and the Federation of Finnish Learned Societies 2020), and examples of narrative CVs (the ACUMEN portfolio and the Royal Society's Résumé for Researchers) as practical tools for broadening assessment.

1A How did your interest in your research area begin, what kinds of questions have you been particularly interested in, and how have your interests been shaped over the course of your career?

1B Describe your own strengths and skills as a researcher and/or as a teacher. What do you want to improve?

1C What is your vision for your career in the coming 5-10 years?

1D The YUFE universities place great importance on responsible research. This includes supporting the objectives of Open Science. Describe how you have made research and/or education more open, and what your plans are for the future.

Table 1.
Main areas of academic work and possible key outputs, examples, and contributions.