Analysis

Combining five different rankings into a single “Rankometer” offers an opportunity to make some comparisons between the different ranking systems.

We use the Rankometer data to answer three questions:

  1. Are all rankings the same?
  2. Do certain ranking systems favour certain countries?
  3. Which universities do best in Rankometer compared to other ranking systems?

Are All Rankings the Same?

While there has been criticism that world university rankings basically all measure the same thing, namely research¹, there are some notable differences between the five rankings included in Rankometer. The following is a very brief summary; a full description of each methodology can be found on the respective ranking publisher’s website.

  • Reputation Surveys: QS and Times Higher Education (THE) both use reputation surveys as part of their ranking methodology, whereas the other three rankings don’t. The QS World University Rankings uses surveys to measure academic reputation (40%) and employer reputation (10%). THE uses teaching reputation (15%) and research reputation (18%) in its World University Rankings.

  • Institutional Data: QS, THE and the Academic Ranking of World Universities (ARWU) use data such as the number of academic staff in their rankings. While the ARWU only obtains this data from third parties, QS and THE sometimes use data directly published or submitted by the universities.

  • Strictly Research: The Leiden Ranking only uses data about scientific publications contained in the Web of Science database. In that sense it is the ‘purest’ research ranking. QS, THE and ARWU also use various sources of scientometric data.

  • Weblinks: Webometrics ranks universities primarily on their web presence, based on the number and quality of other websites that link to an institution’s site. This is similar to how search engines like Google rank search results. The assumption is that a good website with high-quality content says something about the quality of a university.
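
This is not the actual Webometrics calculation, but the underlying idea of link-based ranking can be illustrated with a toy PageRank computation (the link graph below is entirely made up):

```python
import networkx as nx

# Toy link graph between websites (made-up edges):
# an edge u -> v means site u links to site v.
web = nx.DiGraph([
    ("blog.example",   "uni-a.edu"),
    ("news.example",   "uni-a.edu"),
    ("uni-a.edu",      "uni-b.edu"),
    ("portal.example", "uni-b.edu"),
    ("uni-b.edu",      "uni-c.edu"),
])

# PageRank weighs a link by the importance of the linking site, which is
# the same intuition behind link-based visibility indicators; Webometrics'
# actual methodology differs in the details.
scores = nx.pagerank(web)
for site, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{site:16s} {score:.3f}")
```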

The table below shows the correlations between the different rankings (squared Pearson correlations, R², computed on ranks, which are equivalent to squared Spearman rank correlations). The lowest correlation is between the QS World University Rankings and the Leiden Ranking (R² = 0.27). The highest correlation is between the QS-WUR and the THE-WUR (R² = 0.66), which incidentally both use reputation surveys. The ARWU and Webometrics are also relatively highly correlated (R² = 0.61), which is somewhat surprising given how different their methodologies are.
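
Figures like these can be reproduced from a simple table of ranks; a minimal sketch (the ranks below are made up, not the actual Rankometer data):

```python
import pandas as pd

# Made-up ranks for five universities in each of the five systems.
ranks = pd.DataFrame({
    "QS-WUR":      [1, 5, 12, 40, 120],
    "ARWU":        [2, 9, 15, 55, 90],
    "THE-WUR":     [1, 4, 20, 35, 150],
    "Leiden":      [10, 3, 50, 22, 80],
    "Webometrics": [5, 8, 30, 60, 110],
})

# A Pearson correlation computed on ranks is the Spearman rank
# correlation; squaring it gives R-squared values like those below.
r_squared = ranks.corr(method="spearman") ** 2
print(r_squared.round(2))
```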

             QS-WUR  ARWU  THE-WUR  Leiden  Webometrics
QS-WUR         1.00  0.53     0.66    0.27         0.40
ARWU           0.53  1.00     0.47    0.35         0.61
THE-WUR        0.66  0.47     1.00    0.53         0.40
Leiden         0.27  0.35     0.53    1.00         0.30
Webometrics    0.40  0.61     0.40    0.30         1.00

¹ See: E. Hazelkorn & A. Gibson (2017), “Global science, national research, and the question of university rankings”, Palgrave Communications, 3(1), 1–11; and M.M. Vernon, E.A. Balas & S. Momani (2018), “Are university rankings useful to improve research? A systematic review”, PLoS One, 13(3), e0193762.

To illustrate how large the differences between ranking systems can be for individual universities, the following table lists the universities with the largest gaps between their best and worst placings. The largest gaps are observed for the University of Buenos Aires (935 places), Moscow State University (927 places) and the Humboldt University of Berlin (921 places). Note that 1001st place is the lowest placing possible within the Rankometer methodology; a short sketch showing how these gaps are computed follows the table.

Ranko  University                                  QS-WUR  ARWU  THE-WUR  Leiden  Webometrics
  438  University of Buenos Aires, Argentina           66   201     1001     937          378
  260  Moscow State University, Russia                 74    93      174    1001          211
  293  Humboldt University of Berlin, Germany         117  1001       80     304          226
  408  Yeshiva University, United States              341   201     1001      96          785
  378  National Autonomous University of Mexico       100   201      801    1001          144
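
Gaps like these follow directly from the component ranks; a minimal sketch using the first three rows of the table (with 1001 standing in for “unranked”):

```python
import pandas as pd

# Rows from the table above; 1001 marks the lowest possible placing.
unis = pd.DataFrame(
    {
        "QS-WUR":      [66, 74, 117],
        "ARWU":        [201, 93, 1001],
        "THE-WUR":     [1001, 174, 80],
        "Leiden":      [937, 1001, 304],
        "Webometrics": [378, 211, 226],
    },
    index=["University of Buenos Aires", "Moscow State University",
           "Humboldt University of Berlin"],
)

# Largest spread between a university's best and worst placing.
gap = unis.max(axis=1) - unis.min(axis=1)
print(gap.sort_values(ascending=False))
# University of Buenos Aires       935
# Moscow State University          927
# Humboldt University of Berlin    921
```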

The large differences show the value of having a composite ranking like Rankometer rather than relying on a single ranking system.

Do Certain Ranking Systems Favour Certain Countries?

There have been claims of rankings bias for as long as there have been university rankings, but Rankometer provides an opportunity to test whether certain rankings really do favour universities from particular countries.

Using regression analysis with country dummy variables, treating Rankometer as the ‘true benchmark’ of university quality (the independent variable) and each component ranking as the dependent variable, the following countries (those with 10 or more universities among Rankometer’s top 500 institutions) rank significantly higher or lower than their Rankometer position would predict.²

To interpret the results in the table, first look at the Rank coefficient (bottom row), which shows that in some ranking systems universities tend to score lower because those systems include many relatively small and more narrowly focused institutions. This effect is strongest in the Leiden Ranking (Rank coefficient of 1.4) and weakest in the QS-WUR (Rank coefficient of 1.0).

Turning to the influence of country, note that a “-” means that universities from that country are likely to have a higher (better) rank in a particular ranking system than in Rankometer, while a “+” means they are likely to have a lower (worse) rank (at a 95% significance level or higher). Thus United States universities tend to score lower in the QS-WUR and THE-WUR, higher in the Leiden and Webometrics rankings, and about the same in the ARWU, relative to their average (Rankometer) rank.
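
Before looking at the results, here is a minimal sketch of this kind of regression, assuming a table with one row per university containing its Rankometer rank, its country, and its rank in one component system (the data and column names below are illustrative, not the actual analysis):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical input: one row per top-500 university with its rank in one
# component system (here QS-WUR), its Rankometer rank, and its country.
df = pd.DataFrame({
    "qs_rank":    [5, 12, 40, 90, 150, 300, 320, 410],
    "ranko_rank": [4, 10, 45, 80, 160, 280, 330, 400],
    "country":    ["US", "UK", "US", "DE", "UK", "DE", "US", "UK"],
})

# Component rank as the dependent variable, Rankometer rank as the
# independent variable, plus country dummy variables. The slope on
# ranko_rank corresponds to the "Rank coefficient" in the table below;
# a significantly positive country dummy ("+") means that country's
# universities tend to place lower in this system than their Rankometer
# rank predicts, a negative one ("-") that they tend to place higher.
model = smf.ols("qs_rank ~ ranko_rank + C(country)", data=df).fit()
print(model.summary())
```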

Location            Rankometer Universities   QS-WUR  ARWU  THE-WUR  Leiden  Webometrics
United States 124 + +
United Kingdom 51 + + +
China (Mainland) 42 + +
Germany 37 +
Australia 28 + +
Italy 21 +
Canada 21 + +
France 15 + +
Spain 13 +
Netherlands 13 + +
Japan 11 +
Rank Coefficient 1.0 1.1 1.2 1.4 1.2

² Results to be presented in a forthcoming research paper. A link to the paper will be posted here.

It is interesting to note that most ranking systems rank their ‘own’ universities higher: this is true for the ARWU (China), the THE-WUR (UK) and the Leiden Ranking (Netherlands). It is also notable that, of the 11 countries in the table, 9 have a “+” for the QS-WUR and thus rank lower there on average than they do in Rankometer, with Japan being a notable exception.

Which Universities Do Best in Rankometer Compared to Other Ranking Systems?

Universities that score relatively well in all five rankings perform well in Rankometer, but poor performance in even one or two rankings can place a university significantly lower. Because of this, some “great all-rounders” actually have a higher Rankometer rank than they achieve in any single ranking system, in some cases improving on their best rank by more than 30 places! A short sketch showing how the “Increase” column is derived follows the table.

University                                     Ranko  Increase  Best Other Rank  System
University of Wollongong, Australia              166       +30              196  QS-WUR
University of Pavia, Italy                       371       +30              401  ARWU
University of Vienna, Austria                    126       +23              149  Webometrics
University of Aveiro, Portugal                   449       +23              472  Leiden
University of Adelaide, Australia                 83       +21              104  Webometrics
Renmin University of China, China (Mainland)     479       +21              500  Webometrics
Murdoch University, Australia                    480       +21              501  THE-WUR
Macquarie University, Australia                  175       +20              195  THE-WUR
University of Amsterdam, Netherlands              41       +14               55  Leiden
University of Nottingham, United Kingdom          85       +14               99  QS-WUR
University of Sheffield, United Kingdom           80       +13               93  QS-WUR
Cardiff University, United Kingdom               139       +12              151  ARWU
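
The “Increase” column is simply the difference between a university’s best placing in any single system and its Rankometer rank. A minimal sketch using the Wollongong row; note that only the QS-WUR value (196) and the Rankometer rank (166) come from the table, the other component ranks are made up for illustration:

```python
import pandas as pd

# Component ranks for one university; only the QS-WUR value is from the
# table above, the rest are made-up placeholders.
wollongong = pd.Series({
    "QS-WUR": 196, "ARWU": 301, "THE-WUR": 251,
    "Leiden": 400, "Webometrics": 350,
})
ranko_rank = 166

best_other = wollongong.min()        # best (numerically lowest) other rank
best_system = wollongong.idxmin()    # the system where it occurs
increase = best_other - ranko_rank   # places gained in Rankometer
print(best_system, best_other, increase)  # QS-WUR 196 30
```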
