Everyone wants to be a Nobel prizewinner, but Nobel prizewinners didn’t get there on their own. Alongside the support of family and friends, scientists rely on an army of technicians, librarians and other people who fill roles that contribute to research outputs — from the humble article to paradigm-shifting experiments.
These essential, hidden roles are rarely celebrated alongside research achievements, which can make it hard to convince graduates in science, technology, engineering and mathematics to consider them as career options. Recognizing the variety of roles that make up a functioning scientific culture is a challenge that research-evaluation processes around the world have failed to solve.
In 2021, in the shadow of the pandemic, we took part in an exercise that became a live experiment in how hidden contributions can be celebrated. In parallel to the Research Excellence Framework (REF), which periodically assesses the research quality of UK universities and directly awards funding accordingly, we ran the Hidden REF to expand what’s valued in science. In doing so, we uncovered a side effect of formalized frameworks: concentration on outputs.

The REF assesses research outputs across 22 categories, from journal publications to software, and the assessment of these outputs contributes 65% towards the total score received by a university. The last REF, in 2014, received 191,000 outputs, and 189,000 of these were related to publications. In other words, only around 1% of the outputs related to everything else that research produces, from data to compositions, devices and products. This is not to say publications aren’t important, but the health of a diverse research system should not be judged by such a narrow metric when there is so much more that merits recognition. There are contributions, roles and careers that publications do not represent.
The Hidden REF’s goals
For this reason, our aim for the Hidden REF was to use a similar model of evaluation to the one in the mainstream REF, to see whether we could evaluate overlooked aspects of research and research outputs.
The first round received more than 120 submissions across 21 categories, highlighting contributions that provided significant value but went formally unrecognized. Entries included: a crowd-sourcing platform for increasing access to collections at the British Library; an open-source project to make data science reproducible, ethical, collaborative and inclusive; a comic about experiences of dementia; and the work of countless research managers, administrators, software engineers and technicians.
We used panels of expert evaluators drawn from around the globe, along with two main guiding criteria that embodied the objectives of the Hidden REF: significance, the scale and importance of the contribution; and visibility, the degree to which the contribution would normally go unrecognized by standard research-evaluation measures. Individual evaluators first reviewed and scored submissions, then came together as a panel to decide on a winner. Our experience reinforced the reason for initiating the Hidden REF: concentrating such a large proportion of any assessment of scientific quality on publications alone does a disservice to the diversity of research culture.
Systematically overlooking such roles also has repercussions.
Cases in point
Take Kevin Atkins, an award winner in the ‘hidden role’ category. Atkins has worked as a site engineer at the Marine Biological Association in Plymouth, UK, for more than 30 years. He is responsible for, among other things, designing and constructing marine sampling tools and fixing laboratory equipment. He is the ‘go-to guy’ when things break or don’t work, or when researchers need to brainstorm ideas. The science couldn’t be done without him. Yet people in Atkins’s role are rarely named on outputs or journal publications emerging as a result of their expertise — diminishing the way in which their work is recognized, celebrated and promoted as a potential career path in science.
The question of how to measure contributions that are taken for granted is as yet unresolved. In the lab setting, scientists value people like Atkins and could not imagine life without them. Yet, by not recognizing them formally, their contributions go unvalued. Co-authorship is one option — but we must not limit our thinking to recognition through publication. Not everyone is motivated by the idea of co-authorship, so the research community should be open to more relevant ways to give credit where it’s due.
This is not restricted to technicians. Conventional outputs do not recognize the crucial help provided by librarians, by research managers who prepare budgets for grant applications, or by those who help to analyse our data and write our software.
Another example is our research participants. Their data are anonymized and used in our publications, but their contributions beyond being data points are often overlooked. Another Hidden REF winner, in the citizen-science category, involved 18 street children and young people from Ghana, the Democratic Republic of the Congo and Zimbabwe who, for more than three years, worked as researchers for a project hosted by the University of Dundee, UK. They provided the research team with weekly narrative accounts of what it was like to grow up on the street. They worked as activists by recruiting participants, and they created tools to disseminate study results. They even analysed their own data. Their work led to a REF-able publication. An impact case study was submitted to REF2021 (impact was an extra 25% of the overall REF2021 assessment, and these case studies describe how research has led to positive change or benefit beyond academia), but the young people were not formally recognized until their award from the Hidden REF. The Hidden REF panel was particularly taken with this submission: some evaluators admitted that it brought a tear to their eyes. These children were so important to the research, yet much of their contribution had been overlooked.
Evaluating such diverse contributions is complex. Some might argue that, owing to this complexity, these integral roles and outputs cannot be meaningfully celebrated. But the fact that these roles and contributions are not included in formal methods of rewarding research contribution can mean only one of two things: either these roles aren’t valued enough to be recognized and rewarded, or current evaluation frameworks are unable to extend their assessment criteria to these otherwise unmeasured, hidden outputs.
For the Hidden REF, the latter was shown to be incorrect. With very little guidance, panellists assessed the quality of entries and demonstrated that complex evaluations of disparate contributions are possible. Like all evaluations, the Hidden REF framework will improve over time as future iterations are run and its approaches, criteria and methods are adopted elsewhere. The Hidden REF will run again in 2023, and there is interest in the framework from around the world, so it will run at different times in different countries.
Atkins’s nominator emphasized that Atkins “hid his voluminous light under a bushel, so that most of the time people don’t realize what essential work he has carried out on a daily basis to keep the laboratory running smoothly”.
So to Atkins, and to all the others like him hidden in our complex research culture: we see you.