Written by Janie Ironside-Wood
In a new research report to be published today, Professor Daniel Neyland and Dr. Sveta Milyaeva from Goldsmiths Department of Sociology offer the first in-depth, qualitative analysis of the UK Higher Education Research Excellence Framework (REF).
As a mechanism for competitively allocating around £2 billion of research funding per annum through assessments of UK academic research publications, impact cases and research environments, the REF and its predecessors have always courted controversy.
Claims that the REF is involved in the marketization of Higher Education, that it is more focused on steering than measuring academic work, or that it is based on an inaccurate system of measurement have accompanied research assessment since its inception in the 1980s.
In their report ‘The Challenges of Research Assessment: A qualitative analysis of REF2014’, Daniel Neyland and Sveta Milyaeva present insiders’ accounts of the assessment process to explore and go beyond these critiques.
Drawing on in-depth, semi-structured interviews with REF managers, impact assessors, Main Panel and sub-panel members from a cross-section of disciplines, the report suggests that research assessment:
* has led to the professionalization of UK academia
* has, through recent changes to the REF, had a positive influence on equality and diversity
* has, through impact, enabled academia to demonstrate its value to the world beyond universities
At the same time, participants in the REF note that:
* universities and even individual academics have increasingly sought to ‘game’ the system
* the disciplinary sub-panels at times faced problems with representativeness, internal disputes and efforts to make scoring consistent
* the system of peer review central to the REF was not the same form of peer review experienced in other areas of academic life
* sub-panel members took part in the REF to ensure their institution’s income, to try to change the process from within, or because they were pressured by their institutions
For many REF sub-panellists, it seems that their participation provoked tensions between being part of an academic community in which they took civic pride and being disappointed by the make-up or marginalising effects of sub-panel work. At times there also appeared to be a deep inequity between institutions that not only got to take part in REF panels, but helped define its purposes and practices, and those institutions that were left on the outside, using guesswork as a basis for making their submissions. More detail on each of these concerns is presented in the report.
Professor Daniel Neyland said:
“An overview of the practices of the REF and a detailed comparison of the issues that arise in research assessment, and how these are resolved, have until now not been the subject of research. This report aims to address that gap, to provide greater insight into how the process works, and to look at the experiences of those involved in the process.”
Professor Neyland will be giving his Inaugural Lecture at the Ian Gulland Lecture Theatre, Goldsmiths, University of London, 17.30-19.30 on 21st February.
The report is available here from 12 noon Tuesday 21st February 2017: www.marketproblems.com
TO ATTEND or for further information:
Janie Ironside Wood, (Interim) Head of Communications
Goldsmiths, University of London,
M: +44 7730 047511 | E: email@example.com | www.gold.ac.uk
* The research was funded by the European Research Council (grant no: 313173; www.marketproblems.com) as part of a broader enquiry into the effects of interventions into public sector problems that are said to contain a market component.
* The Research Excellence Framework (REF) is the system introduced in 2014 for assessing the quality of research in UK higher education institutions. It replaced the UK Research Assessment Exercise (RAE), which took place in 1986, 1989, 1992, 1996, 2001 and 2008.