The QS University Rankings 2010… The QS rankings are out (slippage has occurred, and measurement problems persist in ranking systems).
The QS rankings are now out, and there is coverage by Sean Flynn in The Irish Times (‘TCD HAS dropped out of the world’s top 50 universities and UCD has slipped from the top 100 in the latest world university rankings.’) and by John Walshe in the Irish Independent (‘TRINITY College Dublin (TCD) and University College Dublin (UCD) have slipped in the latest world rankings of leading universities — with both universities warning last night that cuts are making it impossible to remain at the top.’)
Trinity and UCD are now ranked at #52 and 114, respectively, by QS (see www.topuniversities.com for the rankings). Cambridge displaces the mighty Harvard from the number one slot, too.
Neither article makes it clear that the QS ranking was formerly the THE-QS ranking: the two organisations split up last year, and the THE will produce its own rankings on the 16th of September. Nor do the articles make it clear that this year’s rankings are therefore not strictly comparable with those from previous years.
Sean Flynn further notes: ‘With more than 2,000 universities surveyed, the QS rankings are regarded as the most reliable guide to performance. Universities are ranked on the basis of data gathered on peer review, employer review, international faculty ratio, international student ratio, student faculty ratio and citations per faculty.’ And John Walshe says: ‘This is the seventh year of QS rankings, which are regarded as the most accurate of the international league tables.’ The accuracy and prestige of these and other rankings have actually been the subject of vigorous debate, and should not be taken as read. There’s been a lot of comment (and a lot of skepticism) on this blog about the ‘meaning’ of university rankings. Issues regarding the reliability and validity (e.g. predictive, construct, face, internal validity) of measurement, and a lack of attention to basic measurement theory, have been particularly discussed.
The QS rankings are weighted as follows: Academic Peer Review 40%, Employer/Recruiter Review 10%, Student Faculty Ratio 20%, Citations per Faculty 20% and International Factors 5% each.
The THE rankings are going to be quite different – weightings will be assigned to 13 different subcategories under the following five headings:
• Teaching – the learning environment (30 per cent)
• Citations – research influence (32.5 per cent)
• Research – volume, income and reputation (30 per cent)
• International mix – staff and students (5 per cent)
• Industry income – innovation (2.5 per cent).
There is a fantastic graphic accompanying the THE article.
It should be obvious: change the weightings and you change the relative rankings – hence there is an element of a parlour game about the various ranking systems. The THE has reduced the peer review weighting:
In a significant departure from the rankings published over the previous six years, THE confirmed that it has reduced the weighting given to the results of subjective opinion polls. Some 50 per cent of the old world rankings (2004-09) were based on opinion – 40 per cent on an academic reputation survey and 10 per cent on a poll of employers.
The new system will give reputation measures a weighting of 34.5 per cent. To determine this, Ipsos MediaCT carried out a new Academic Reputation Survey, which attracted 13,388 responses from academics, representative of global scholarship. For the first time in global rankings, the survey also includes reputation measures of teaching quality. The results relating to research account for 19.5 per cent, and those relating to teaching account for 15 per cent, giving a total of 34.5 per cent.
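The parlour-game point above can be sketched with invented numbers: the same underlying indicator scores, re-weighted, reverse the order of two fictional universities. The scores and the weighting schemes below are simplified, hypothetical stand-ins — not the actual QS or THE methodologies — chosen only to show that the outcome is sensitive to the weights.

```python
# Hypothetical illustration: identical indicator scores, two weighting
# schemes, two different league-table orders. All numbers are invented.

# Invented indicator scores (0-100) for two fictional universities.
scores = {
    "University A": {"reputation": 95, "staff_ratio": 50, "citations": 55},
    "University B": {"reputation": 65, "staff_ratio": 80, "citations": 90},
}

# Two simplified weighting schemes (each sums to 1): one reputation-heavy,
# one citations-heavy. These are stand-ins, not the real QS/THE weights.
weights_rep_heavy = {"reputation": 0.6, "staff_ratio": 0.2, "citations": 0.2}
weights_cit_heavy = {"reputation": 0.3, "staff_ratio": 0.1, "citations": 0.6}

def composite(uni_scores, weights):
    """Weighted sum of indicator scores: the composite ranking score."""
    return sum(uni_scores[k] * w for k, w in weights.items())

for label, w in [("reputation-heavy", weights_rep_heavy),
                 ("citations-heavy", weights_cit_heavy)]:
    ranked = sorted(scores, key=lambda u: composite(scores[u], w), reverse=True)
    print(label, ranked)
# Under the reputation-heavy weights University A comes first;
# under the citations-heavy weights University B does.
```

Nothing about either institution changed between the two lines of output — only the weights did.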
Will this ranking be very prestigious if our universities show relative upward movement?
Previously, it was suggested here that there would be slippage in the relative positions of our universities; this is certainly the case, in a numerical ordering sense at least, for the ‘big two’.
These rankings should not be taken as some absolute measure of quality (however defined). As noted here previously, ‘the nature of the variables measured and the weightings applied to them to generate the overall composite ranking makes a huge difference to the outcomes generated’.
But, these caveats aside, here is a simple prediction based on the Employment Control Framework. This demands ‘A minimum 6% reduction in the number of overall core staff … will be required across the HE sector by end of 2010, as compared with the numbers in place at 31 December 2008.’
The QS methodology (2009) gave a 20% weighting to the student faculty ratio and a further 5% based on the proportion of international faculty. The loss of academic staff at all levels is palpable in my own institution; I suspect the situation is no different in the other Irish universities. Reductions in staffing, without corresponding reductions in student numbers, inevitably lead to an increase in the student:staff ratio. So, assuming nothing else changes, these reductions in staffing will result in a lower placement of our universities in international ranking systems that place a weight on student:staff ratios.
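The arithmetic here is straightforward and can be made concrete with a back-of-the-envelope sketch. The student and staff figures below are invented round numbers, not actual returns from any Irish university; only the 6% reduction comes from the Employment Control Framework.

```python
# Back-of-the-envelope sketch with invented numbers: a 6% cut in core
# staff, with student numbers held constant, mechanically worsens the
# student:staff ratio that the rankings reward.

students = 20_000                       # hypothetical student body (unchanged)
staff_2008 = 1_000                      # hypothetical staff at 31 December 2008
staff_2010 = staff_2008 * (1 - 0.06)    # after the required 6% reduction

ratio_before = students / staff_2008    # 20.0 students per staff member
ratio_after = students / staff_2010     # roughly 21.3 students per staff member

print(f"before: {ratio_before:.1f}, after: {ratio_after:.1f}")
```

In general, cutting staff by a fraction r while holding student numbers fixed multiplies the student:staff ratio by 1/(1 − r), so a 6% staff cut worsens the ratio by about 6.4%, before any other indicator moves at all.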
[Disclosure: I was a peer reviewer for the QS system and was not terribly happy with the system used, as it relied far too much on the availability heuristic to assess comparative university performance.]