EduRank's university ranking methodology

We significantly improved our ranking methodology in August 2021.

This is a temporary, abridged version of the methodology. A full review of the indicators and statistical methods used will follow soon.

We analyzed other rankings and the critical scientific literature to find ways to correct at least some of their shortcomings.

EduRank.org's approach to world university ranking:

  1. Wide selection of higher education institutions. The only selection criterion for inclusion in the ranking is the issuance of bachelor's degrees and above or analogs with 4+ years of study. We aim to provide equal opportunities for inclusion for institutions from all countries, including developing ones. We don't set minimum scores required for any ranking criteria nor do we exclude institutions with missing data.
  2. Self-collected data. For ranking purposes, we don't use any data provided by universities or data that a university can tamper with without actual improvements in the quality of its functions.
  3. Metric-based. We believe that ranking by metrics is the only feasible approach to ranking 14,160 universities in 183 countries - no surveys, no experts, no opinions.
  4. Transparent. We aim to be transparent about the choice of indicators, data processing, statistical methods used, and ranking limitations.
  5. Improvements over consistency. We see no value in keeping the methodology consistent for the sake of tracking yearly changes in the position of an individual university while sacrificing opportunities for improvement.

The final score of EduRank's overall ranking consists of 3 parts:

  1. 45% Research performance. We use the Microsoft Academic database as a proxy to retrieve scientific publications and links between them (citations). Rather than just summing them, we build a graph with publications as nodes and citations as edges to calculate the weight of each publication. Then we adjust that weight for the publication date and share of university representatives in the list of authors.
  2. 45% Non-academic prominence. We use the same approach that Google and other modern search engines use to calculate the reputation of individual web pages - backlinks to a university from other sites. We use data from Ahrefs, the source with the largest available index of pages and links.
  3. 10% Alumni score. The indicator reflects the combined number of page views that a university's graduates and other affiliated individuals have on all 43 language versions of Wikipedia.
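The citation-graph idea behind the research indicator can be sketched as follows. This is an illustrative toy, not EduRank's exact algorithm: it propagates weight through the citation graph PageRank-style, then applies an age discount and an author-share factor, both of which (half-life decay, linear share) are assumptions made for the example.

```python
import math

# citations[p] = publications that p cites (edges of the graph).
# Tiny hypothetical dataset for illustration.
citations = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": [],
    "D": ["A", "C"],
}

def citation_weights(citations, damping=0.85, iters=50):
    """PageRank-style power iteration: weight flows from citing
    publications to the publications they cite."""
    papers = list(citations)
    n = len(papers)
    weight = {p: 1.0 / n for p in papers}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in papers}
        for p, cited in citations.items():
            if cited:
                share = damping * weight[p] / len(cited)
                for c in cited:
                    new[c] += share
            else:  # dangling publication: spread its weight evenly
                for q in papers:
                    new[q] += damping * weight[p] / n
        weight = new
    return weight

def adjusted_weight(raw, years_old, author_share, half_life=5.0):
    """Discount older publications (exponential half-life decay) and
    scale by the university's share of the author list - both
    adjustment formulas are assumptions for this sketch."""
    decay = math.exp(-math.log(2) * years_old / half_life)
    return raw * decay * author_share

w = citation_weights(citations)
# "C" is cited by three papers, so it ends up with the largest weight.
score = adjusted_weight(w["C"], years_old=3, author_share=0.5)
```

The key difference from simply counting citations is that a citation from a highly cited publication transfers more weight than one from an obscure publication.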
Data sources:

  - Microsoft Academic: 51,654,013 publications; 1,384,130,695 citations
  - Ahrefs: 257,800,000,000 pages; 2,900,000,000,000 links
  - Wikidata: 168,471 alumni; 71,519,735,871 pageviews

These are only three indicators, but each is complex, is by itself a good predictor of a university's position in the ranking, and meets our requirements for resistance to manipulation.
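Combining the three indicators with the stated 45/45/10 weights can be sketched like this. The min-max normalization step is an assumption, since the page does not specify how raw indicator values are scaled before weighting; the university names and numbers are made up.

```python
# Stated weights: 45% research, 45% non-academic prominence, 10% alumni.
WEIGHTS = {"research": 0.45, "prominence": 0.45, "alumni": 0.10}

def minmax(values):
    """Scale a column of raw values to [0, 1] (assumed normalization)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def final_scores(universities):
    """universities: {name: {"research": x, "prominence": y, "alumni": z}}
    Returns the weighted sum of normalized indicator scores per university."""
    names = list(universities)
    normalized = {n: {} for n in names}
    for key in WEIGHTS:
        col = minmax([universities[n][key] for n in names])
        for n, v in zip(names, col):
            normalized[n][key] = v
    return {n: sum(WEIGHTS[k] * normalized[n][k] for k in WEIGHTS)
            for n in names}

# Hypothetical raw indicator values for two universities.
scores = final_scores({
    "Uni A": {"research": 1200, "prominence": 9000,  "alumni": 300},
    "Uni B": {"research": 800,  "prominence": 15000, "alumni": 120},
})
```

Because each indicator is normalized before weighting, no single source's raw magnitude (e.g. trillions of links vs. thousands of publications) dominates the final score.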

Other indicators we considered but did not use either lack resistance to manipulation or add no value, correlating weakly with the final score.