The Leiden Ranking is special in that it works only with factual data on publication performance, while other global rankings use normalized data (for more details, see STABILITY AND DYNAMICS: SUMMARY OF GLOBAL RANKING INDICATORS 2021)
by Dóra Czirfusz
Comparing Leiden's results with other world rankings may therefore give us an interesting slice of the world of higher education rankings, since the information in the Leiden list offers a more objective picture of universities' performance.
As the number of ranked institutions and the form of the published data vary from ranking to ranking, we computed a derived score (similar to the overall scores published for several rankings) for each institution in every ranking, showing what percentage of institutions performs better than the given university in that list.[1] This score allows us to compare performance across different rankings and to answer our main question: how do other rankings relate to the Leiden Ranking?
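A minimal sketch of the derived-score computation described above (the function names and the example numbers are our own illustration, not taken from the study; the interval-averaging rule from footnote [1] is applied):

```python
def position_from_interval(lower, upper):
    """Rankings that publish position bands (e.g. 601-700) are replaced by the band's average."""
    return (lower + upper) / 2

def derived_score(position, n_ranked):
    """Percentage of ranked institutions placed above the given position."""
    return position / n_ranked * 100

# A hypothetical institution published in the 601-700 band of a ranking with 1000 institutions:
pos = position_from_interval(601, 700)   # 650.5
print(derived_score(pos, 1000))          # roughly 65: about 65% of institutions rank above it
```

A lower derived score therefore means a relatively better position, and the score stays comparable across rankings of different lengths.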
The present analysis includes only those universities that meet the following criteria:
- Hungarian universities or foreign competitors of Hungarian universities (European institutions only)
- listed in each of the examined rankings in 2019/2020 (ARWU, Leiden, QS, THE, U.S. News).
The sample consists of the following 30 universities:
| Austria | Croatia | Czech Republic | Finland | Hungary | Poland | Romania | Slovakia | Slovenia |
|---|---|---|---|---|---|---|---|---|
| Johannes Kepler University of Linz | University of Zagreb | Charles University | Aalto University | Budapest University of Technology and Economics | Adam Mickiewicz University | Babes-Bolyai University | Comenius University | University of Ljubljana |
| Technical University of Graz | | Czech University of Life Sciences Prague | University of Helsinki | Eotvos Lorand University | Jagiellonian University | | | |
| University of Innsbruck | | Czech Technical University in Prague | University of Jyväskylä | University of Debrecen | Nicolaus Copernicus University | | | |
| University of Vienna | | Masaryk University | University of Oulu | University of Szeged | University of Warsaw | | | |
| Vienna University of Technology | | Palacky University | University of Tampere | | Warsaw University of Technology | | | |
An easy way to check whether institutions perform similarly in different rankings is a correlation analysis, in which we expect strong positive correlations among all of the rankings.
The result is almost as expected: there are significant positive correlations among all of the rankings, but their strength varies. The two most similar rankings, judged by positions, are ARWU and THE, while the lowest correlation is between Leiden and QS. To answer our question about the Leiden Ranking, we can look at Leiden's column (in bold): ARWU is the closest to Leiden's ranking positions, with a high positive correlation (r=0.763), followed by U.S. News (r=0.738), which still counts as a high correlation. THE (r=0.585) and QS (r=0.540) also have significant relationships with Leiden, but only to a moderate degree.
| Rankings | ARWU | Leiden | QS | THE | U.S. News |
|---|---|---|---|---|---|
| ARWU | – | 0.763** | 0.770** | 0.840** | 0.825** |
| Leiden | 0.763** | – | 0.540** | 0.585** | 0.738** |
| QS | 0.770** | 0.540** | – | 0.805** | 0.660** |
| THE | 0.840** | 0.585** | 0.805** | – | 0.770** |
| U.S. News | 0.825** | 0.738** | 0.660** | 0.770** | – |

**. Correlation is significant at the 0.01 level (2-tailed).
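A correlation matrix like the one above can be computed directly from the derived scores. The sketch below uses Python's standard library; the scores are invented placeholders for a few institutions, not the study's actual data:

```python
from statistics import correlation  # Pearson's r (Python 3.10+)
from itertools import combinations

# Hypothetical derived scores (percent of institutions ranked above each university)
# for four example institutions; the real analysis used all 30 institutions
# and all five rankings.
scores = {
    "ARWU":   [12.0, 45.0, 78.0, 33.0],
    "Leiden": [15.0, 40.0, 70.0, 25.0],
    "QS":     [20.0, 55.0, 60.0, 41.0],
}

# Pairwise Pearson correlations, analogous to the matrix above.
for a, b in combinations(scores, 2):
    r = correlation(scores[a], scores[b])
    print(f"{a:7s} vs {b:7s}: r = {r:+.3f}")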
The result is not surprising if we consider that ARWU's indicators weight scientific activity (publications and citations) at a total of 60%, with data coming from the Web of Science database, which the Leiden Ranking uses as well. In contrast, QS relies on the Scopus database and gives only a 20% weight to institutional research output.
The same tendency is visible if we look at the examined institutions' ranking positions: ARWU's line deviates the least from Leiden's positions, while QS shows the largest differences.[2]
See our infographic: https://create.piktochart.com/output/42325622-hei-ranking
It is also visible that, except for the THE ranking, Leiden's published positions are mostly lower than the positions listed in ARWU, QS or U.S. News. While 17 of the 30 institutions have higher positions in THE than in Leiden, QS ranked only one third of the institutions higher than Leiden did.
However, when comparing the scores (which were also the basis of the correlation analysis), it can be seen that ARWU gives a higher score to every institution in our sample, while QS scored only one of the 30 institutions higher than Leiden did. Furthermore, it turns out that for most institutions, scores higher than Leiden's appear in all of the rankings examined.
In conclusion, we can say that ARWU's derived scores are the closest to Leiden's, while QS has the lowest correlation with Leiden. Positions and scores show different aspects of the analysis: when examining positions, THE rated most of the institutions (17 of the 30) better than Leiden, but when we take into account the number of institutions listed in each ranking and calculate with the scores, it turns out that Leiden's scores are mostly lower than the scores in any of the examined rankings. According to this analysis, multi-criteria indicators seem to help these institutions perform better than in the Leiden Ranking, where only publication performance is considered.
[1] For positions where an interval was published, we used its average (e.g. 601-700 becomes 650.5), and then applied the following formula: score = position / number of ranked institutions × 100.
[2] This chart shows the actual positions, not the derived scores.