
RAE Briefing Paper - May 1999

Interdisciplinary Research and the Research Assessment Exercise

Supplement: Press interest in the report: Interdisciplinary Research and the RAE

Press Interest

1 The two main issues being focussed on by the press are drawn from the summary of Objective 3: Practices on p iv of the report. These are:

  1. Highly interdisciplinary researchers (67%+ interdisciplinary) received a slightly lower (0.05 point) rating than highly single disciplinary researchers (<=33% interdisciplinary) and mixed interdisciplinary/single disciplinary researchers (33-66% interdisciplinary).

  2. "A particular problem was the fit between departmental structures and the RAE. Nearly one quarter (23%) of departments split their researchers between panels, or sought cross-referral between panels.
  • Overall, boundary critical submissions receive a 0.5 point lower rating than non-critical submissions.
  • In the former polytechnics and colleges, boundary critical and non-critical submissions received almost equal ratings (3.3 and 3.2 respectively).
  • In established universities, boundary critical submissions received an average rating of 4.6 compared with 5.1 for non critical submissions.

Although small, these differences in ratings are of concern because they suggest that departments suffer during assessment if they are not organised in a way that approximates to the RAE panels.”

The press spin is that 23% of departmental submissions were down-rated by 0.5 of a point.

2 The objective of this analysis is to consider, first, whether the main body of the report actually supports this statement and, secondly, to offer explanations for any differences in ratings.

Report Content

3 Boundary critical submissions are those where departments have either split their researchers between UoAs or have asked for cross-referral.

  • 11% of departments split their researchers
  • 10% requested cross referral
  • 2% both split their submissions and asked for referral

Split submissions

4 Objective 3, paragraph 30 p21

"If we adopt an a priori hypothesis that departments that submitted to more than one UoA in RAE 1996 are of the same quality as those that are submitted to a single UoA, it appears that splitting a department across more than one UoA leads to:
  • a half point lower rating for mixed economy institutions (income >£10m, 80% for teaching
  • no significant impact for other institutions

Interpretation

5 13% of institutions split their submissions.

6 The splitting of submissions had no effect on:

  1. "Post 1992 institutions

  2. Institutions with small teaching economies: income <£10m, >80% for teaching

  3. Institutions with large teaching economies: income >£10m, >80% for teaching

  4. Specialist Institutions: income <£10m, >20% for research

7 Therefore the institutions shown to receive 0.5 point lower ratings were pre 1992 institutions with a mixed economy: income >£10m, >20% for research.

Cross Referred Submissions

8 Objective 3, paragraph 31 p23

"The average ratings of cross referred submissions were lower:
  • 4.7: not cross referred
  • 4.2: cross referred”

Interpretation

9 12% of institutions requested cross-referral.

10 The average rating for cross-referred submissions was 0.5 points lower than for those which were not cross-referred.

Conclusion

11 Objective 3: Practices in the summary (p iv) is misleading. When the report is read in detail it becomes apparent that the percentage of submissions shown to be receiving lower ratings is considerably less than 23%, but the actual figure is not calculated:
% of submissions with lower ratings = % of cross-referred submissions + % of split submissions from pre 1992 HEIs with a mixed economy
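Taken with paragraphs 3, 5 and 9 above, this gives a rough bound: cross-referred submissions account for about 12% (the 10% that requested cross-referral plus the 2% that both split and cross-referred), and of the remaining 11% that only split their researchers, only those from pre 1992 mixed economy institutions were shown to receive lower ratings. The actual figure therefore lies somewhere between 12% and 23%; the press figure of 23% is an upper bound that would be reached only if every split-only submission came from a pre 1992 mixed economy institution.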

Possible Reasons for Differences in Boundary Critical Submission Ratings

12 The differences shown in the ratings awarded to boundary critical submissions may be due to:

  1. The report methodology

  2. Genuine differences in the quality of interdisciplinary compared with single disciplinary research

  3. The 1996 RAE assessment process

  4. HEI submission errors/tactics.

Report Methodology

13 The information analysed in the report was obtained mainly from a survey of researchers in HEIs. This survey was carried out in three stages in an attempt to obtain enough replies for statistical analysis:

  1. A detailed questionnaire was sent to 10,998 researchers and 557 panel members.

  2. A sample of non-responders (2,420 researchers and 298 panel members) was sent an abbreviated questionnaire

  3. A short questionnaire was sent to 5,330 researchers

14 This methodology raises a number of questions:

  1. Surveying by questionnaire only gathers information from a self-selecting group which is unlikely to be representative of the whole, eg those who were satisfied with the outcome of RAE 1996 are less likely to have responded.
  2. The questionnaires used to obtain the data for this report were not all the same. In order to produce abbreviated and short questionnaires it was necessary to rephrase questions and/or omit some questions altogether.
  3. The percentage of responses was calculated from the total number of questionnaires sent (irrespective of type) minus the number of researchers and panel members who were defined as "having gone away, deceased etc". (The actual percentage response was 3% lower than quoted for both researchers and panel members; see the illustration below.)
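As a purely illustrative example (the figures here are hypothetical, not taken from the report): if 11,000 questionnaires were sent, 800 recipients had gone away or were deceased, and 4,300 replies were received, the quoted response rate would be 4,300 ÷ 10,200, roughly 42%, whereas the rate against all questionnaires sent would be 4,300 ÷ 11,000, roughly 39%, a difference of about 3 percentage points.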

Quality of Interdisciplinary Research

15 In many cases interdisciplinary research fields are new and may not yet have developed sufficiently to produce the same quality of output as the more established single disciplinary subject areas.

1996 RAE Assessment Process

16 In 1996 the assessment process may have unintentionally affected the ratings awarded, especially to cross-referred submissions:

  1. Each panel produced its assessment criteria in isolation, so disciplines in cognate areas were sometimes applying very different criteria.
  2. All submissions were sent to the lead panel (as indicated by the institution). This panel then passed the research outputs that were to be referred on to the referral panel.
  3. The referral panel assessed the outputs it received according to its own criteria and gave feedback to the lead panel without any discussion taking place between the panels.

17 The results of these procedures were that:

  1. Panels referred only the research outputs, which were then assessed in isolation from the rest of the submission and against different criteria.

  2. Panels often referred work late in the assessment process, which gave the receiving panel less time to make a full assessment than it had spent on its other submissions.

  3. The feedback sent to the referring panel often contained little explanation as to how the assessment was made and the referring panels then had difficulty in incorporating this feedback into the assessment of the whole submission.

RAE 2001 Assessment Process

18 RAE 2001 is addressing these issues by implementing the following processes:

  1. All panels will be provided with a template to help them form criteria and working methods which address the key assessment issues.

  2. "Umbrella" groupings of panel chairs in cognate areas will be set up. These groups will meet throughout the assessment process to increase communication between panels and thereby increase the consistency of the assessment process.

  3. When an HEI requests referral the RAE Team will send complete copies of the submission to both the lead and the referral panels at the start of the assessment period. This should enable panels to make a full assessment of referred work in the context of the whole submission.

  4. The feedback information from the referral to the lead panel will give details of how the assessment was made and will usually involve a meeting of members from each panel.

HEI Submission Errors/Tactics

19 Errors in institution submissions and some of the tactics employed in order to achieve high ratings may also affect the ratings awarded, for example:

  1. Institutions may ask for their work to be referred inappropriately, which can lead to a low assessment from the referral panel.

  2. The artificial splitting of researchers between UoAs, eg submitting those perceived to be producing high quality work to one UoA and the rest to another. This may actually result in lower ratings overall because of the difficulty in explaining the department's research strategy and staffing policy in the textual parts of the submission.




Last updated 20 August 1999
