
Criteria and Working Methods


1.1 The primary purpose of the 2001 RAE is to produce ratings of research quality (Annex C) which will be used by the higher education funding bodies in determining the main grant for research to the institutions which they fund with effect from 2002-03. The RAE is conducted jointly by the Higher Education Funding Council for England (HEFCE), the Scottish Higher Education Funding Council (SHEFC), the Higher Education Funding Council for Wales (HEFCW) and the Department of Education for Northern Ireland (DENI).

1.2 The assessment process is based on peer review and is not mechanistic. Panels will use their professional judgement to form a view about the overall quality of the research activity described in each submission in the round, taking into account all the evidence presented.

1.3 The other principles by which the RAE is governed are:

  1. Clarity: a large and complex body of regulations and procedures governs the RAE. In addition, anecdote, partial information and speculation surround the exercise. All written documents and statements about the RAE, therefore, will be clear and consistent, and misinformation about the RAE will be challenged wherever possible by unambiguous statements of the factual position. This principle will guide both the preparation of formal written documents such as the Guidance on Submissions and the advice and interpretation provided by the RAE Team. Similarly it will be a key element in preparing statements of criteria and working methods.
  2. Consistency: assessments made through the RAE should be consistent across cognate areas and in the calibration of quality ratings against international standards of excellence.
  3. Continuity: the RAE develops through an evolutionary process, building on and learning from previous exercises. With every successive exercise a balance has to be struck between continuity and development. In general changes are only made where they can bring demonstrable improvements which outweigh the cost of implementing them.
  4. Credibility: as was demonstrated through responses to the funding bodies' consultation on the conduct of the RAE, the methodology, format and processes employed in the exercise are credible to those being assessed. It is essential that this credibility be maintained.
  5. Efficiency: previous RAEs have been extremely cost-effective given the value of public funds distributed through their ratings (including the estimated cost to HEIs). The cost of the exercise should continue to be the minimum which is consistent with a robust and defensible process.
  6. Neutrality: the RAE exists to measure the quality of research in Higher Education Institutions. It should carry out that function without distorting that which it is measuring. In other words, the RAE should not encourage or discourage any particular type of activity or behaviour other than providing a general stimulus to the improvement of research quality overall.
  7. Parity: the RAE is concerned only with assessing the quality of research of participating HEIs regardless of its type, form or place of output.
  8. Transparency: the credibility of the RAE is reinforced by transparency about the process through which decisions are made. This principle has already been used throughout the consultation process and associated work. Except where there is a need to preserve confidentiality (for example when dealing with the names of nominees for panel membership or with the strategic research plans of institutions) all decisions and decision-making processes will be explained openly.

1.4 The exercise is managed by the RAE Team, based at HEFCE, on behalf of the four funding bodies. All UK HEIs receiving funding from these bodies are eligible to participate.

1.5 The same rating scale as in 1996 will be used. The definitions of the points have been amended, following consultation with a group of former panel Chairs, in order to give greater clarity (Annex C). As in previous exercises, panels will be able to indicate research groups identified as having particular excellence in research above the overall rating for a submission. Submissions which contain no evidence of research will not be awarded a rating.

1.6 Institutions will be invited to make submissions containing information on staff in post on the census date, 31 March 2001, on publications and other forms of assessable output which they have produced during the assessment period, on research support staff and students, research income, and a textual commentary. The assessment period is 1 January 1994 to 31 December 2000 in the case of Arts and Humanities subjects (UoA 45 to 67 inclusive) and 1 January 1996 to 31 December 2000 in the case of the other subjects (UoAs 1 to 44 and 68). All information will be submitted by 30 April 2001.

1.7 The results of the assessment will be published in December 2001. These will show the rating awarded to each submission and the number and proportion of staff submitted for assessment in each case. In addition, in order to improve the arrangements for feedback from the exercise, the funding bodies will produce:

  1. A published report by each panel confirming their working methods and giving a brief account of their observations about the strengths, weaknesses and intensity of activity of the research areas falling within the UoA.
  2. Feedback on each submission summarising the reason for the rating awarded with reference to the panel's published criteria. This feedback will be sent only to the head of the institution concerned.

1.8 The reports and feedback will be available in early 2002.

1.9 The funding bodies will also publish on the Internet those parts of submissions that contain factual data and textual information about the research environment. This will include the names of selected staff and the listings of their research output. Personal and contractual details will be removed and data on research students and research income will be presented in aggregate. Institutions will be given an opportunity to indicate any staff who should be omitted from the published list for reasons of security.

1.10 Panel members have been appointed from the list of nominations of experienced and well-regarded members of the research community, including users of research. The inclusion of user representatives on appropriate panels is intended to enhance the assessment process by making available a user perspective and expertise in assessing the quality of research expressed in forms of output which may be less familiar to academic panel members. All members play a full and equal part in the assessment process and share responsibility for reaching and justifying the panel's final decisions.


Last updated 4 June 1999