'Mock' Teaching Excellence Framework

23 June 2016

In response to the Times Higher Education mock Teaching Excellence Framework (TEF), Russell Group Director General Dr Wendy Piatt said:

“Robust and credible measures of teaching quality will take time to develop through piloting and review. Attempts to predict or emulate the results prematurely through tables such as this show precisely why the Government has rightly recognised the difficulties of introducing a complicated teaching quality assessment system and will develop this new system over a longer period.

“The Department for Business, Innovation and Skills (BIS) asked the Office for National Statistics (ONS) to undertake a review of the data sources that will inform the TEF. It would be wrong to attach any weight to the results in these tables, which are based on only one year’s data (compared to the Government’s preferred three years of data) and on a methodology which does not meet ONS standards. Until the data are fit for purpose, any analysis is likely to be inaccurate and misleading.

“A huge amount of time, effort and resources has been devoted to improving the education and student experience our universities provide. This is reflected in feedback from employers and from our students, who year on year express above-average levels of overall satisfaction with the quality of their course. The latest National Student Survey showed 89% of students at Russell Group universities are satisfied with the teaching on their course and 90% found their course intellectually stimulating.

“There is always room for improvement but this is best delivered through an approach to teaching quality that protects the institutional autonomy, diversity and competitiveness that our system thrives on. We look forward to contributing to the TEF consultation to help develop a system that adds value and assesses teaching fairly and accurately.”

Notes to editors

  1. According to the QS World University Rankings 2015, 11 of the top 50 universities in the world, as ranked by employers, are Russell Group universities.
  2. BIS asked the ONS to undertake a review of the data sources that will inform the TEF. The ONS interim review can be found on their website. The interim report includes a number of recommendations concerning: the target population for the TEF and the extent to which the proposed data sources (NSS and HESA DLHE) match this target population; the extent of under-reporting of certain groups and over-coverage of others which could lead to bias in use of the data; and the identification and quantification of non-response bias.
  3. On the NSS the ONS say, “In examining the survey, there are certain issues around collection of the data which HEFCE should be aware of when considering the use of NSS data as part of quality indicators: under-reporting of certain groups and over-coverage of others is a matter of concern and could lead to bias in use of the data; the lack of a voice from those who did not complete their course is a potential weakness in the planned quality indicators; neither DLHE nor NSS have this element. As for the previous point, this could result in bias. While the NSS response rates are good by modern standards, an understanding of the non-responders would be of significant benefit.”
  4. On the validity of NSS for TEF the ONS said: “The NSS population is determined by a HESA target list, which identifies undergraduate students expected to be in the final year of their course in the period covered by the survey. While the coverage of these sources is clear, this is not necessarily the coverage required for the purpose of the TEF. It is possible that the surveys do not cover all students that are required (under coverage) and might include some students that are not in scope of TEF (over coverage). Any coverage error of this nature has the potential to introduce bias into TEF outcomes.”
  5. The ONS also comment that “during the consultations, respondents expressed reservations about wider issues related to the use of information from the NSS and the DLHE. Concerns included:
  • limited variation between institutions of the raw scores from the student responses
  • difficulty in trying to compare widely differing institutions
  • difficulty in capturing the wider benefits beyond academic results of attending a higher education institution”
  6. Of both DLHE and NSS the ONS comment: “The level of non-response to both surveys is easily high enough to suggest the possibility of non-response bias, especially given the lack of treatment to deal with non-response. The study on non-respondents to the NSS suggests that this may be exacerbated by a lower chance of being invited to take part in the survey for particular sub-groups of the population.”