Research Assessment Exercise in 2001

To: Heads of HEFCE-funded institutions
    Heads of DENI-funded institutions
    Heads of HEFCW-funded institutions
    Heads of SHEFC-funded institutions
Of interest to those responsible for: Research, Funding
Response by: 16 October 1998
Publication date: July 1998
Enquiries to: HEFCE - David Pilsbury tel. 0117 931 7450
    NIHEC - John Coote tel. 01247 279333
    HEFCW - Ann Hughes tel. 01222 682221
    SHEFC - Morag Campbell tel. 0131 313 6567
1. This paper announces key decisions about the next Research
Assessment Exercise (RAE) and identifies issues for further consultation.
2. The funding bodies will jointly conduct a further RAE in 2001
for the whole UK, broadly similar in form to the 1996 RAE.
Our prime purpose will be to produce ratings of research quality
to inform allocations of funds for research from the academic
year 2002-03.
3. The census date for assigning research active staff to the
HEIs where they are in post will be 31 March 2001, and the submission
date 30 April 2001. The RAE in 2001 will assess submissions in
all subjects from all eligible UK HEIs.
4. A summary of consultation issues is in paragraphs 62-68. Responses
should be sent by 16 October 1998 to David Pilsbury at
the joint RAE team based at the HEFCE, Northavon House, Coldharbour
Lane, Bristol, BS16 1QD.
5. The four funding bodies for UK higher education published a
consultation paper on the future of research assessment in November
1997, 'Research Assessment: Consultation (RAE 2/97)'.
A list of respondents and a quantitative analysis of responses
from higher education institutions are at Annex A.
6. This publication summarises the issues, announces key decisions
and makes some further proposals about the conduct of the next
exercise based on the comments we received. Key decisions and
specific issues for consultation appear throughout the document
and are summarised in paragraphs 69-87. We welcome comments both
on those matters identified and on any other aspect of the RAE.
7. A separate document is published alongside this one concerning
the bodies to be invited to nominate members of assessment panels.
Responses to that document are required by 4 September 1998.
8. Some aspects of the RAE, such as the types of data we collect,
their definitions and the method of data collection are not discussed
in detail here. These will be the subject of a separate document
to be published in spring 1999.
9. We will announce in spring 1999 the number and definition of units of assessment (UOAs), number and membership of assessment panels, and data requirements and definitions. We expect panels to consult on draft criteria from summer 1999, and to publish final criteria by the end of that year. A draft timetable is at Annex D.
Timing and scope
10. Most respondents to the recent consultation believed that
there should be a further RAE along broadly similar lines to the
last. Most also acknowledged the positive effects of the RAE to
date on research quality and the management of research. They
believed that evolutionary changes should be made to a future
exercise to improve its value and efficiency. There was no support
for a rolling programme of assessments, and respondents felt that
an equitable and robust approach to an interim exercise could
not be devised. There was very little support for the Dearing
Report's recommendation to provide baseline funding for scholarship
to units opting out of the RAE: institutions would generally like
to see positive inducements to engage in activities other than
research, rather than an amendment to the RAE to effect this.
A UK-wide RAE rather than separate regional or national exercises
is still favoured, to facilitate comparability and achieve economies
of scale.
11. Most respondents believed that the period between assessment
exercises should be no less than four years and no more than six,
and the majority favoured five years. A five-year period seems
preferable to us to ensure that judgements affecting funding are
made on the basis of reasonably current data, while ensuring that
the act of participating in the RAE does not unduly affect research
activity by too frequent review. It should be noted that in order
to effect the changes to the assessment process discussed below,
panels may need a longer period between the receipt of submissions
and the publication of ratings. We have had to balance this requirement
with the need for panels to produce ratings in time to inform
funding allocations for 2002-2003, and with the desire expressed
by some HEIs not to bring forward the preparation of submissions
to the beginning of the academic year.
The funding bodies will jointly conduct a further RAE in 2001 for the whole UK. The census date for assigning research active staff to the HEIs where they are in post will be 31 March 2001, and the submission date 30 April 2001. The RAE in 2001 will assess submissions in all subjects from all eligible UK HEIs.
12. A significant number of HEIs, and some respondents from other
sections of the higher education research community, suggested
that although another RAE is desirable, a fundamental review is
needed now to determine whether it will be appropriate to assess
research quality differently after the next exercise. This review
should allow a sufficiently long lead-time to prevent views about
the RAE being driven by short-term thinking.
We will begin now to consider the future of the RAE beyond the next exercise in 2001.
Framework and data definitions
13. Respondents expressed strong support for a common framework
and data definitions for all subjects, provided that these are
not so rigid as to preclude appropriate weightings and interpretations
being applied in different subjects. That is, a common set of
data should be provided to all panels, but it should be possible
for panels to attach different weight to the evidence supplied
and interpret it as appropriate to a particular subject. Respondents
also felt that panels should be able to request specific information
to be provided in submissions. However, where there were variations
between panels, they should be reasonable, justifiable and made
explicit in criteria that had been set an appropriate time before
the submission date.
14. Most respondents who expressed an opinion believed that the
range of data collected for the 1996 RAE was appropriate to assess
research quality on a common and comparable basis for all subjects.
We will inform HEIs in spring 1999 about precise data definitions
and the census date for eligibility of all the other data that
we will collect, apart from the publications and other forms of
research output.
15. In the 1996 RAE, submissions recorded up to four items of
research output per active researcher. We thought this represented
a reasonable balance between the need to give panels a cross-section
of work to read or review, and the risk of overwhelming panel
members with material which they could not digest. Feedback from
panels confirms that they could reach the necessary judgements
on this basis.
16. Many respondents to the consultation urged us to maintain
the requirement for up to four pieces of public research output
per researcher, across the range of subjects assessed. Others
commented that producing far more than one item per year is the
norm in many areas, and that we should collect data that illustrate
this.
17. We have weighed the sector's desire to maintain a common framework
for all subjects, the ability of panels to digest the research
submitted, and concerns that the RAE should not influence researchers
to publish prematurely, or more prolifically than they would otherwise.
We propose that up to four pieces of research should be prescribed,
and that supplementary data, such as a list of publications, could
be collected for panels in different UOAs if they make a case
for doing so (see paragraph 19 below).
We will:
- require details of up to four pieces of research output per active researcher
- collect information on research income from all sources, on postgraduate student activity, and on research culture and plans
- allow the submission of work produced during the five years from 1 January 1996 to 31 December 2000 for science, medicine, engineering and social sciences UOAs; and during the seven years from 1 January 1994 to 31 December 2000 in arts and humanities UOAs.
18. Some output in medicine, sciences and social sciences published
between 1 January and 31 March 1996, and some arts and humanities
output published between 1 January 1994 and 31 March 1996, which
was submitted in 1996 may also be submitted in 2001.
Method of assessment
19. As noted in paragraph 13 above, virtually all respondents
agreed that peer review panels should operate within a common
basic framework, basing their judgements on a common set of data
and data definitions. There was some support for supplementing
the data that we collected in the 1996 RAE with some quantitative
measures, if panels considered this appropriate in their subject.
A full discussion of panels' criteria and methods of operation
is in paragraphs 43-44 below.
20. Many respondents thought that a form of self-assessment that
fell short of specifying an indicative rating would be a welcome
addition to the process of peer review. They felt that the parts
of submissions about research culture, context and future plans
could be developed to facilitate this. Returns structured more
explicitly against headings or criteria should help HEIs to present
their research most effectively to panels, and ensure that panels
assess them even-handedly. In turn, the self-assessment could
provide the basis for feedback, discussed in detail in paragraphs
21-23 below.
We will:
- ask panels to specify a structure to the textual parts of submissions (currently forms RA5 and 6) that facilitates self-assessment under a number of headings. Panels will make specifications on these forms within a basic common framework, after consultations with their subject communities
- allow panels to request specific, additional information on these forms, such as quantitative indicators of total research output and other contextual information.
Outputs of the RAE
21. Most respondents believed that panels should provide some kind of feedback. However, opinion was divided on the extent of this. Most respondents recognised that the more we move in the direction of formative feedback, the more work this will bring for panels. Some were opposed in principle to the RAE becoming formative if this implied panels making directive judgements that would unduly influence the nature or course of research in a unit or at the subject level.
22. Some respondents suggested that some feedback could be provided
at no extra cost, through more efficient recording of panels'
decisions. Others observed that any additional cost of providing
feedback would be offset by the savings achieved by moving to
a five-year cycle. A range of options has been proposed, whose
additional cost in time and money would vary.
23. We need to balance a number of factors: concerns that the
RAE should not become directive; the need for panels to focus
primarily upon producing ratings of research quality; the sector's
desire for the RAE to be as transparent as possible; and a desire
to make the exercise more cost effective.
We intend to ensure that all panels produce:
- a written report, published by the funding bodies, confirming their working methods and giving a brief account of their observations about the strengths, weaknesses and intensity of activity of the research areas falling within the UOA
- feedback on each submission summarising the reason for the rating with reference to the panel's published criteria. This feedback will be sent in all cases only to the head of the institution concerned.
The extent and content of generic reports on units of assessment will be decided as part of the panels' process for developing their criteria and working methods.
24. Publishing parts of submissions to increase transparency and provide a developmental resource to HEIs was supported by most of those HEIs who responded, although there was concern about the strategic parts of submissions being made public. After the 1992 and 1996 RAEs, the funding bodies published anonymised, summary data from the submissions. Some respondents to the recent consultation suggested that the decision to publish entire submissions should be at HEIs' discretion.
25. We see benefits to a coherent, rather than piecemeal approach
to publishing submissions. It would enable HEIs to undertake their
own analysis and research on a complete and unique source of data,
and to develop indicators against which to benchmark their performance.
However, although UK HEIs would find the information useful, the
publication of such information in so readily accessible a form
could affect the competitive positions of institutions internationally.
Furthermore, publishing the names of all staff selected for submission
by HEIs (and, by omission, the non-selected staff) will also have
an impact on the individuals concerned.
We could, in addition to publishing ratings for the RAE, publish on the Internet the parts of submissions that contain factual data and textual information about research environment and general observations (currently RA1-RA4, RA6), but not RA5 which concerns strategic research plans. Submissions would not be anonymised, so the names of selected staff and titles of their output would become public, but personal details such as dates of birth would be removed.
We invite HEIs to say whether or not they agree that we should publish all submissions on the Internet. If so, should we publish whole submissions or parts, as specified in the preceding paragraph?
Units of assessment
26. Respondents to the consultation agreed that UOAs based on
subjects were the best method of delineating the research map
rather than, for example, a thematic approach. However, they pointed
out that the UOAs for the 1996 RAE were not in all cases single
disciplines: a number of them combined subjects. Most respondents
agreed that the number and coverage of 1996 UOAs were broadly
right, but a minority believed we should have fewer, more broadly
defined UOAs.
27. We agree in principle with suggestions for a wide-ranging consultation to map the current UOAs to research activity in order to take account of emerging fields. However, although there may be changes in the configuration of higher education research in three years' time, we need to establish now UOAs for 2001, so that we can convene panels and begin to develop criteria. Our recent consultation elicited suggestions about particular UOAs; those cited most often are mentioned below. On the whole, comments on UOAs were not comprehensive, and we have been encouraged by many HEIs to retain those used in the 1996 RAE.
28. Therefore, our starting point for this round of consultation
is the list of UOAs and their coverage in 1996 (reproduced in
Annex B). We recognise that this map
does not mirror the configuration of research in all UK HEIs:
this would be virtually impossible to achieve. However, it should
enable the assessment of the whole range of research conducted
in UK HEIs, leaving no ground uncovered. It should also ensure
that closely related themes and subjects are submitted to the
same UOA and assessed by the same panel. Two principles will apply
in considering any suggestions to amend this research map:
These principles should be considered in the context of our proposals for joint working of panels, which is discussed in paragraph 46 below.
Apart from the UOAs identified in paragraph 29, and unless there are compelling reasons to make other changes, we propose to retain for the next RAE the 69 units of assessment listed in Annex B, and the definition of each one provided in the list of sub-areas of activity.
We seek comments on this proposal, and particularly on whether any definitions of UOAs need revision.
29. The following UOAs were highlighted in the recent consultation
as requiring reconfiguration.
30. There was no consensus about the consultation paper's proposals
for ensuring that interdisciplinary research is adequately assessed.
There was some support for requiring panels to incorporate into
their criteria some generic interdisciplinary criteria, but opinion
was divided over the proposal for a single panel which would be
responsible for overseeing the treatment of all interdisciplinary
research.
31. The funding bodies are jointly conducting a specific study
of interdisciplinary research in relation to the RAE. Its conclusions
will inform further debate from the autumn. In the meantime, we
highlight several other possible amendments to the RAE process
which were suggested in the recent consultation:
There are serious practical problems associated with the fourth
suggestion, but we are inclined to take forward the others.
We intend, subject to the outcome of the funding bodies' study of interdisciplinary research and the RAE, to implement suggestions a-c in paragraph 31 above.
Assessing collaborative research
32. Research collaboration between HEIs is important; it is essential
that arrangements for making submissions to the RAE do not deter
it. On the other hand, it would be wrong to try to make the RAE
a mechanism for promoting collaborative research by including
criteria to this effect. For the 1996 RAE, HEIs were able to make
joint submissions where collaborative research groups existed
between one or more HEIs. Moreover, HEIs were able to refer to
collaborations within and between institutions in the textual
parts of submissions.
We will retain these provisions but need to consider further how we could facilitate the submission of collaborative research in the RAE. We welcome comments about this.
Panel membership
33. There was extensive comment on the question of panel membership.
The most frequent messages were that panels need to be credible
and that their membership should comprise people able to judge
research excellence. Panels should also be sufficiently broadly
based to assess the whole range of research submitted to them,
and there should be appropriate turnover in panel membership.
34. Respondents felt that the previous method of appointing panel
chairs on the basis of nominations from outgoing chairs alone,
was likely to build orthodoxy and bias into the process, and was
one of the less satisfactory elements of the RAE. Opinion was
divided on whether we should seek nominations to the chair from
the entire outgoing panel or the incoming panel. There are strong
arguments in favour of the first option. The chair needs to be
in place early in the process to play a part in appointing the
panel members. Moreover, the outgoing panel will know from experience
who is capable of chairing and co-ordinating a complex and time-consuming
process. There is a strong feeling that we should establish panels
early, so that criteria can be set and published much earlier
than for the 1996 RAE.
We will appoint a chair for each panel on the basis of nominations from the entire outgoing panel from the 1996 exercise. Using this method, we will begin now to identify an indicative chair for each panel established in the 1996 RAE. Should the consultation on units of assessment lead us to establish new or different panels, chairs for these will be appointed following consultation with cognate panels and subject communities.
35. Many of those who responded felt that the present arrangements
- with up to two-thirds of panel members continuing from exercise
to exercise - were likely to lead to orthodoxy and conservatism.
A degree of continuity of panel membership was favoured, but this
should be reduced. We were advised to continue to consult a wide
range of bodies about nominations to a pool of potential panel
members. A separate document (RAE 2/98), published alongside this
one, seeks recommendations on the list of bodies we should invite
to make nominations to panels. Responses to that document are
required by 4 September 1998.
36. A number of respondents questioned the transparency of our
past practice in sifting from this pool of nominations to appoint
panel members. Many recommended an election from the pool to the
panel. However, we do not believe that an election process would
produce a coherent panel membership, qualified to assess excellence
across the range of themes and approaches that are likely to be
submitted to the panel. We believe that if panel chairs are themselves
elected by a wider constituency, then they should continue to
advise the funding bodies on the best balance of members to cover
the subject from a broadly based pool of nominations.
We will:
- continue to seek nominations to panels from a wide range of bodies with an interest in research
- continue to appoint panel members from the resulting pool of nominations, on the basis of recommendations from the panel chair
- publish the names of those appointed to the panel and of the bodies who nominated them
- allow panel members to serve only in two successive exercises, and chairs in a maximum of three - but only twice as chair, including on panels for any new or reconfigured UOAs
- roll forward no more than half the membership of any one panel from one RAE to the next.
37. Opinion was mixed about including users of research on panels.
A tension exists between those who think peer review requires
only academics to serve on panels, and others, including some
panels in the 1996 RAE, who believe that the impact of applied
research can be given full credit only if users are involved in
the assessment.
38. Many pointed to the wider community of users other than industry,
such as the voluntary sector, teachers and healthcare professionals.
The most frequent and pragmatic suggestions for increasing the
input of users were to approach them early in the process, involve
them in setting criteria, and thereafter enable panels to refer
to them as consultants where a submission flagged the applied
nature of its research. It is our intention to achieve wider membership of panels than in the past. Users of research may be nominated and
appointed to panels in the normal nominations process; alternatively,
sub-panels of users could be convened. We plan to
consult panels on this issue. In the meantime, we will also consult
with bodies that have successfully engaged users in peer review
exercises, and we are placing press advertisements inviting other
bodies to put themselves forward to nominate panel members.
We will consult panels, and other bodies that have successfully engaged users in peer review exercises, on the most appropriate method for involving users in the RAE.
39. Most respondents agreed that sub-panels could be used effectively
to assess specialist research areas within a UOA that do not justify
the creation of a separate UOA and panel. Respondents were keen
that sub-panels should be identified before the submission date,
rather than as an ad hoc response, once submissions are
received, to perceived shortcomings in the coverage of panels.
Their membership should be decided with the same degree of openness
as membership of the main panels; and their relationship to the
main panels, including the weight that their advice will carry,
should be explicit.
40. In most cases, panels will be able to decide before they receive
submissions whether they need to convene sub-panels. However,
in the 1996 RAE, a few panels received submissions that included
significant bodies of work in an area beyond the panels' collective
expertise - and which they had not anticipated. To ensure parity
of treatment in such cases, panels should retain the option to
propose setting up ad hoc advisory panels once they have
received submissions, to advise on significant bodies of work
that neither the panel nor its sub-panels is competent to assess.
41. A number of respondents believed that sub-panels should themselves
have executive powers, rather than an advisory role. However,
main panels view the whole range of submissions in a unit of assessment,
and we do not believe that sub-panels can carry responsibility
for ratings if they judge only a sub-set of submissions. We agree,
however, that all sub-panels must include at least one member
of the main panel to which they report, so that their advice can
be weighed by someone who has taken part in assessing the whole
range of submissions.
42. On a related point, many respondents recommended that where panels consult specialist advisers during the assessment process, the advisers' names should be published. In the main, panels will be encouraged to seek advice on specialist areas from the pool of nominations for each panel, discussed in paragraphs 35-36 above. Clearly, however, panels may need to look beyond this pool of expertise once submissions have been received. Panel criteria are discussed in paragraphs 43-44 below. We will ensure that panels state in their criteria how they intend to use specialist advisers.
We will:
- publish the membership of any sub-panels, established by panels before the receipt of submissions, to advise on the assessment of specialist areas
- require panels to state in their criteria the remit of any sub-panels convened, and their approach to using specialist advisers
- enable panels to propose setting up ad hoc advisory panels after the receipt of submissions, to advise on significant bodies of work that neither the panel nor its sub-panels feel competent to assess.
Panel criteria and operation
43. Opinion was divided on whether all panels should operate to
common criteria and working methods, or whether specific criteria
should apply in different subjects. The clearest messages were:
There was also a strong feeling that the funding bodies should
facilitate interaction between panels in both the planning and
assessment stages, to prevent unwarranted variations in the practices
of panels assessing similar work.
44. To balance these concerns, we believe that the process of
criteria-setting should begin much sooner than in 1996, with a
view to publishing draft criteria two years before the submission
date. We will facilitate consultation between panel chairs in
cognate groups when setting criteria and during the assessment
phase, to ensure that panels in similar subjects apply their criteria
and the rating scale consistently.
We will:
- convene panels by the end of 1998 and offer them a core framework for panel criteria and operation on which to hang subject specific variations
- convene forums of panel chairs in broadly similar areas to compare the specificity of criteria and working methods
- encourage panels to consult even-handedly across all subject associations and other appropriate bodies in setting specific criteria
- oversee the criteria-setting process and ensure that criteria are reasonable, defensible and consistent.
45. We have been encouraged by a number of institutions to increase
confidence in panels' operations by convening a special panel
with responsibility for attending one meeting of each main panel.
The panel could comprise senior academics, former RAE panel members
and officials of the funding bodies. It would ensure that panels
abide by their published criteria, and report on perceived variations
in the generosity of their marking.
We are considering establishing a 'roving' panel of experts to ensure the robustness and comparability of panel judgements, but will first seek the views of panel chairs.
46. The suggestion noted in paragraph 45 goes a step further than
having forums of panel chairs meet during the assessment process.
Whether or not it is implemented, we believe there will also be
merit in having some umbrella groups of panel chairs who meet
at the beginning and towards the end of the assessment phase.
These forums will ensure a common approach is taken in similar
subjects, both to problems or anomalies identified in submissions,
and to the interpretation of the rating scale and its descriptors.
Panel chairs will also be asked to explain the profile of ratings
awarded and to defend their 5 and 5* ratings in these forums.
We will establish a small number of umbrella groups of panel chairs, which will meet during the assessment stage to ensure consistency of approach and of marking standards. Once established, the panels will be consulted on the appropriate division of the UOAs into umbrella panels.
Validating judgements of international excellence
47. There was a mixed reception to the suggestion to include non-UK
researchers on all panels to validate the application of the rating
scale (which is premised on the notion of international excellence).
There are obvious limitations to this suggestion: the complete
international perspective on a unit of assessment and the broad
range of themes it covers could not be captured by bringing in
one or two experts from outside the UK. Some disagreed in principle
with the suggestion, claiming that if panel members are truly
excellent in their field, they will themselves have an international
perspective. The rating scale and issue of international excellence
are discussed fully in paragraph 52 below.
48. 'International' panel members may be desirable for subjects
where the UK research community is relatively small. In any case,
we will not prevent panels from including non-UK researchers.
An analysis of responses from HEIs with the largest numbers of
5 and 5* ratings in the 1996 RAE indicates support for panel judgements
at the top end of the scale being moderated or verified by correspondence
with a group of non-UK researchers. This would enable other ratings
within a UOA to be calibrated. These referees would have an advisory
role rather than a right of veto. Some respondents suggested that
such referees should sample across all grade bands. However, although
non-UK researchers will be competent to judge whether a submission
reaches international standards of excellence, we do not believe
that they would be able to validate the award of a rating of national
excellence.
We will require panels to consult a corresponding group of non-UK researchers as referees before awarding the top ratings to any submission. Panels will be invited to identify the corresponding referees before the receipt of submissions. Their names will be made public.
The definition of research
49. There was unanimous support in the consultation for a broad
and inclusive definition of research, encompassing all academic
research and work of direct relevance to the needs of commerce
and industry, public and voluntary sectors. Equally, however,
we are urged not to dilute the definition by including essentially
non-research activities. Analysis of responses that said the definition
of research excluded essential activities showed a dissatisfaction
not with the definition itself, but with its interpretation by
some panels in the 1996 RAE. In particular, respondents urged
that panels' criteria should specify exactly how they will treat
research outputs whose status may be unclear such as scholarly
translations and creative writing. It was also suggested that
panels should specify what credit they will give to activities
that contribute to long-term collaborative work to develop and
maintain the intellectual infrastructure of research.
50. A large majority agreed with the assertion that there should
be no additional credit given in the RAE to the utility
of research. However, many pointed out that in some subjects and
UOAs, the question of relevance or utility is integral to any
understanding of research quality. Therefore, these panels' criteria
should articulate what recognition will be given to usability
or application as an indicator of research quality.
51. A small minority wanted to see value for money (VFM) assessed, but most who commented explicitly on this question acknowledged the difficulty of establishing robust VFM indicators. Therefore, considerations of VFM will not be included in the rating scale or in panels' criteria.
In 2001, we will use the definition of research in Annex C, and require panels to specify in their criteria how they intend to apply it.
The rating scale
52. There was strong support in the consultation for retaining
a rating scale of seven points, but opinion was divided on whether
this should be renumbered from 1-7. Some convincing arguments
were made for retaining the present scale on grounds of continuity
and because it is widely understood, and we agree in principle
with these. Neither decimal scores nor scores identifying the
percentages of work at different levels were widely supported.
Most respondents believed HEIs should retain the right to reference
sub-areas in a submission in order to describe the spread of themes
and subjects it encompassed. However, most agreed that references
to sub-areas in the rating scale led to problems in interpreting
the scale. The most frequent suggestion was to review the definitions
of the points on the rating scale, and to clarify the meanings
of the terms referring to sub-areas and proportions as well as
the terms 'national' and 'international' quality.
We will:
- use for RAE2001 a seven-point scale, numbered from 1-5* as for the 1996 RAE
- convene a group of senior respected academics and representatives of learned societies to sharpen and clarify the terms and descriptors of the rating scale
- publish a rating scale based on these descriptors, once this group reports.
53. There was no significant support for suggestions that the
general rules of submission which apply to all UOAs should prescribe
a minimum critical mass for submissions awarded the highest rating.
The notion of minimum critical mass may be appropriate in some
UOAs, while in others lone researchers are the norm and can produce
excellent research without collaboration. Therefore, panels will
be required to make clear in their criteria how they will judge this issue.
54. There was some support for the view that in order to achieve
the highest ratings, a high proportion of staff - as much as 100
per cent - should be submitted. This was associated with the feeling
that a unit should not be able to present itself as a 5 or 5*
unit when only a relatively small proportion of staff had been
assessed as such. However, a number of those who believed it was
desirable could see practical problems in prescribing a minimum
percentage return for the award of the highest rating. Many HEIs
do not organise their research in traditional departmental structures.
For other HEIs, submissions to RAE UOAs do not accurately map
to their existing departmental or other structures. Therefore,
there are difficulties in defining the pool of potential researchers
from which the submitted staff have been drawn.
55. If the main problem is one of presentation of the ratings,
then some felt we should adopt a rating scale in which all elements
of the rating and submission are presented, and make sure that
it is quoted in full. For example, a rating of 5(A25) would denote
a 5 rating for a submission comprising 25 researchers, with more
than 95 per cent of staff in the unit returned. Suggestions for
ensuring that HEIs and others quote the ratings properly include
copyrighting them, or asking representative bodies such as the
CVCP to 'police' HEIs' use of ratings in marketing material. This
does not differ significantly from the combined number-letter
rating for the 1996 RAE, and we are not convinced that we could
ensure it was properly used.
56. However, we wish to consider further the issue of mapping
HEIs' total research effort to the RAE UOAs. We are still investigating
whether we can develop a robust basis on which to determine the
percentage of staff who have been submitted, so that we could
set a minimum percentage threshold for the award of the highest rating.
We will not adopt a rating scale which requires a minimum size.
We are still considering the question of requiring a minimum proportion of staff to be returned for the achievement of the highest ratings. We seek suggestions for a robust approach to determining what percentage of staff have been submitted, in view of the diverse nature of HEI structures for organising research.
We also seek further views - if such a measure can be determined - about whether a minimum percentage, up to 100 per cent, should be submitted in order to achieve the highest ratings.
Prospective judgements and staff transfers
57. The issue of staff transferring between HEIs in the census
period provoked much comment. There are two related questions:
whether we should take action to discourage the poaching of staff;
and whether there is some way to ensure that institutions that
have invested in developing a high quality research environment,
and will continue to do so, are not disadvantaged by staff leaving
immediately before the RAE. On the first question, some believed
that the RAE to date has promoted poaching by some HEIs of top
quality researchers, and that this should be stopped. Others concurred
with our conclusion - following quantitative analysis of movement
in 1992 and 1996 - that the RAE produces a small increase in the
transfer of staff between HEIs, against a background of movement
that is natural in a dynamic and growing sector. Recent research
showing that there is more of this sort of recruitment activity
in the United States suggests that this movement is not driven
only by the RAE. However, we recognise that this is an
area in which further work could usefully be done, and we will
examine the transfer market further as part of our longer term
review of the RAE.
As part of our longer term review of the RAE beyond 2001, we will examine the evidence for a 'transfer market' of researchers, promoted by the RAE, and consider its impact on HEIs' research.
58. The funding bodies could devise a rule that penalises, through
the assessment or funding process, an HEI that recruits a researcher
from another institution in the approach to an RAE. However, we
do not believe that this would discourage HEIs from aggressive
recruitment practices. If it had any effect at all, it would simply
move the 'transfer market', if there is one, to a different point
in the academic calendar, and might adversely affect the promotion
and career development prospects of individual researchers in
the years approaching the RAE.
59. We believe there are two important principles. First, in making
prospective judgements on research quality based on past performance,
the RAE is able to take account of institutions' investment in
developing their staff and research culture. Second, where key
individuals move on, but are replaced so that the environment
will flourish and outputs will continue, the institution should
not be disadvantaged. For the next RAE, we will ensure that HEIs
which lose staff in the run-up to the exercise will nevertheless
be able to benefit from their investment in their staff.
60. Our measure will not remove the incentive for recruitment,
but will provide some redress for the loser. In addition, as in
1996, our guidance to panels on their criteria will ask them to
consider the implications for the research culture of a submission
that includes a large number of recently appointed staff.
61. Any method that tries to apportion research output pro-rata
to the amount of time spent at different HEIs will have its problems.
First, the research underpinning a published output could have
been conducted some years before its publication; second, such
apportionment may be difficult to record and audit.
We will:
- allow HEIs that have lost a researcher up to 12 months before the submission date to return his/her research output as an indicator of quality on form RA2, and flag in the textual parts of submissions whether the researcher has been, or will be, replaced. Such researchers will not be included in the research volume for funding
- allow the 'new' institution in such cases to return the same researcher in the normal manner
- where a researcher moves HEIs more than 12 months before the census date, allow the 'losing' HEI to refer to their research in the textual parts of the submission only.
Summary of issues for consultation
62. We could, in addition to publishing ratings for the RAE, publish
on the Internet the parts of submissions that contain factual
data and textual information about research environment and general
observations (currently RA1-RA4, RA6), but not RA5 which concerns
strategic research plans. Submissions would not be anonymised,
so the names of selected staff and titles of their output would
become public, but personal details such as dates of birth would
be removed. We invite HEIs to say whether or not they agree that
we should publish all submissions on the Internet. If so, should
we publish whole submissions, or parts, as specified in the preceding
paragraph? (See paragraph 25.)
63. Apart from the UOAs identified in paragraph 27, and unless
there are compelling reasons to make other changes, we propose
to retain for the next RAE the 69 units of assessment listed in
Annex B, and the definition of each
one provided in the list of sub-areas of activity. We seek comments
on this proposal, and particularly on whether any definitions
of UOAs need revision. (See paragraphs 26-28.)
64. We invite comments on how sub-panels could be configured within UOAs 1-3. (See paragraph 29.)
65. We invite comments on how the range of sub-themes identified for UOAs 43 and 64 (see Annex B) could be divided between sub-panels to maintain academic coherence and facilitate a manageable distribution of work within panels. (See paragraph 29.)
66. We seek comments on recommendations that the Education UOA should be subdivided, with either: a separate UOA for research in compulsory education and one for post-compulsory education; or a separate UOA for research into education policy, and one for education practice including pedagogic research. (See paragraph 29.)
67. We will retain the provisions of the 1996 RAE for assessing
collaborative research but need to consider further how we could
facilitate the submission of collaborative research. We welcome
comments about this. (See paragraph 32.)
68. We are still considering the question of requiring a minimum
proportion of staff to be returned for the achievement of the
highest ratings. We seek suggestions for a robust approach to
determining what percentage of staff have been submitted, in view
of the diverse nature of HEI structures for organising research.
We also seek further views - if such a measure can be determined
- about whether a minimum percentage, up to 100 per cent, should
be submitted in order to achieve the highest ratings. (See paragraph
Summary of decisions
69. The funding bodies will jointly conduct a further RAE in 2001
for the whole UK. The census date for assigning research active
staff to the HEIs where they are in post will be 31 March 2001,
and the submission date 30 April 2001. The RAE in 2001 will assess
submissions in all subjects from all eligible UK HEIs. (See paragraph
70. We will begin now to consider the future of the RAE beyond
the next exercise in 2001. (See paragraph 12.)
71. We will:
72. We will:
73. We intend to ensure that all panels produce:
The extent and content of generic reports on units of assessment
will be decided as part of the panels' process for developing
their criteria and working methods. (See paragraph 23.)
74. We intend, subject to the outcome of the funding bodies' study
of interdisciplinary research and the RAE, to implement suggestions
a - c in paragraph 29 above. (See paragraph 31.)
75. We will appoint a chair for each panel on the basis of nominations
from the entire panel in 1996. Using this method, we will begin
now to identify an indicative chair for each panel established
in the 1996 RAE. Should the consultation on units of assessment
lead us to establish new or different panels, chairs for these
will be appointed following consultation with cognate panels and
subject communities. (See paragraph 34.)
76. We will:
77. We will consult panels, and other bodies that have successfully
engaged users in peer review exercises, on the most appropriate
method for involving users in the RAE. (See paragraph 38.)
78. We will:
79. We will:
80. We are considering establishing a 'roving' panel of experts
to ensure the robustness and comparability of panel judgements,
but will first seek the views of panel chairs. (See paragraph
81. We will establish a small number of umbrella groups of panel
chairs, which will meet during the assessment stage to ensure
consistency of approach and of marking standards. Once established,
the panels will be consulted on the appropriate division of the
UOAs into umbrella panels. (See paragraph 46.)
82. We will require panels to consult a corresponding group of
non-UK researchers as referees before awarding the top ratings
to any submission. Panels will be invited to identify the corresponding
referees before the receipt of submissions. Their names will be
made public. (See paragraph 48.)
83. In 2001 we will use the definition of research in Annex C,
and require panels to specify in their criteria how they intend
to apply it. (See paragraph 51.)
84. We will:
85. We will not adopt a rating scale which requires a minimum critical mass. (See paragraph 55.)
86. As part of our longer term review of the RAE beyond 2001, we will examine the evidence for a 'transfer market' of researchers, promoted by the RAE, and consider its impact on HEIs' research. (See paragraph 57.)
87. We will:
1 The Higher Education Funding Council for England, the Higher Education Funding Council for Wales, the Scottish Higher Education Funding Council and the Department of Education for Northern Ireland.
2 A consultation document arising from these discussions will be issued in summer 1998.
Annex A - Summary of Responses from HEIs
Annex B - Units of Assessment
Annex C - Definition of Research
Annex D - Timetable