Do university rankings measure the quality of higher education?

Isabel Sagenmüller


The Times Higher Education (THE), the Academic Ranking of World Universities (ARWU, "Shanghai"), and the QS World University Rankings are benchmarks of what the international community considers the best universities in the world. Despite objections from parts of the academic community, they are reference points used by governments and funding programs to decide where to invest or where it is cost-effective to send students on scholarships. What are their methods, and what are their pros and cons?

On March 10, the Times Higher Education University Rankings revealed the top 200 universities in Europe, with the United Kingdom taking 46 places, nearly a quarter of the list. Germany follows with 36 institutions.

Overall, 22 countries are represented in the top 200 list, which draws on data from 800 universities across 70 countries.

THE measures institutions on their teaching environment, research environment, citations (research influence), industry income, and international outlook.

Top 20 Universities in Europe (Times Higher Education Rankings)

Rank | Institution | Country
1 | University of Oxford | UK
2 | University of Cambridge | UK
3 | Imperial College London | UK
4 | ETH Zurich – Swiss Federal Institute of Technology Zurich | Switzerland
5 | University College London (UCL) | UK
6 | London School of Economics and Political Science (LSE) | UK
7 | University of Edinburgh | UK
8 | King’s College London | UK
9 | Karolinska Institute | Sweden
10 | LMU Munich | Germany
11 | École Polytechnique Fédérale de Lausanne (EPFL) | Switzerland
12 | KU Leuven | Belgium
13 | Heidelberg University | Germany
14 | Wageningen University and Research Center | Netherlands
15 | Humboldt University of Berlin | Germany
16 | Technical University of Munich | Germany
17 | École Normale Supérieure | France
18 | University of Manchester | UK
19 | University of Amsterdam | Netherlands
20 | Utrecht University | Netherlands

Scholarship funding often relies on these rankings as references.

To evaluate the merit of applications for scholarships abroad funded by the Mexican government, the Consejo Nacional de Ciencia y Tecnología (Conacyt) considers "the position of the proposed university when it is among the best 200 worldwide, according to renowned international rankings, or when the corresponding program is one of the best 100 globally."

In Chile, the Comisión Nacional de Investigación Científica y Tecnológica (Conicyt) is more specific, recommending that applicants to its graduate scholarships abroad (Becas Chile) apply to "programs taught at a university or research center ranked within the first 150 places in the rankings of the best universities, according to the Times Higher Education (THE) or the Academic Ranking of World Universities (Shanghai, ARWU)" of the year, or else "positioned within the first 50 places in the rankings within the particular area of the program". If universities or programs are not listed in these rankings, applicants must demonstrate the excellence of the program using criteria similar to those the national or international rankings measure.

Opinions about these rankings, and about their standing as instruments for verifying and promoting the quality of universities, are divided. There is outrage in certain academic circles, approval in others, and reluctant acceptance from those who argue that rankings are one of the only ways to compare different educational institutions and to support funding decisions with evidence.

Despite the existence of specific regional rankings of the best universities in Latin America, in global rankings Latin American universities rank below the 400th position, as, for instance, the Times Higher Education list shows.

401-500

  • Federico Santa María Technical University, Chile
  • National Autonomous University of Mexico, Mexico
  • Pontifical Catholic University of Chile, Chile

501-600

  • University of the Andes, Colombia
  • University of Chile, Chile
  • Monterrey Institute of Technology and Higher Education, Mexico

601-800

  • University of Antioquia, Colombia
  • Austral University of Chile, Chile
  • National University of Córdoba, Argentina
  • Pontifical Catholic University of Valparaíso, Chile
  • University of Santiago (USACH), Chile

On the Times Higher Education website, readers submit their criticisms: for instance, the weight given to a high volume of peer-reviewed research and publications over other qualities. Others question the value of rankings in which the English language dominates.

Supporters reply that English is the lingua franca of teaching and research, and that universities all over the world are increasingly teaching and publishing in English. To be global, research-led universities need to be part of international communities with a common language.

THE representatives respond that the survey “goes to great lengths to overcome language bias. For example, our annual reputation survey is distributed in 15 languages, and we distribute the invitation-only survey on the basis of UN data on the true global distribution of scholars. We also apply an element of country normalization to recognize countries which publish research in languages other than English."

Nonetheless, others point to a bias against countries where research is organized differently: limited funding and a lack of networks for accessing peer review are barriers to accreditation, forcing teachers to undertake research and researchers to teach in the classroom even when they are not sufficiently qualified for one or the other.

Felipe Martínez Rizo, from the Universidad Autónoma de Aguascalientes, has looked at trends in higher education rankings and states that they show severe methodological limitations and thus cannot justify their claim of being reliable methods for evaluating universities.

"These rankings are examples of a poor method, combined with high media impact and leading to uses that cannot sustain improving quality," he says. He adds: How do we know whether Berkeley or UCLA, with eight other institutions that are part of the University of California, should be considered separately or as a whole? We can ask the same question treating faculties of professional studies at the Universidad Autónoma de México (UNAM), the units of the Universidad Autónoma Metropolitana, the University of Guadalajara or the different campuses of the Monterrey Technological Institute, or the various universities that came out of the University of Paris."

He concludes: "In the absence of evaluations they treat all institutions equally, which isn’t right; the different treatments have nothing to do with objective merits, only with illusions."

But how does THE construct its ranking?

According to its website, THE produces "the only global performance tables that judge research-intensive universities across all their core missions". Individual institutions provide and sign off their institutional data for use in the rankings. When data is not provided, THE enters a low estimate, at around the 25th percentile of the institution’s other indicators.

Teaching (the learning environment): 30%

  • Reputation survey: 15%
  • Staff-to-student ratio: 4.5%
  • Doctorate-to-bachelor’s ratio: 2.25%
  • Doctorates awarded-to-academic staff ratio: 6%
  • Institutional income: 2.25%

Research (volume, income and reputation): 30%

  • Reputation survey: 18%
  • Research income: 6%
  • Research productivity: 6%

Citations (research influence): 30%

International outlook (staff, students, research): 7.5%

  • International-to-domestic-student ratio: 2.5%
  • International-to-domestic-staff ratio: 2.5%
  • International collaboration: 2.5%

Industry income (knowledge transfer): 2.5%
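Taken together, these weights imply a simple weighted sum once each pillar has been scored. The sketch below, with invented pillar scores and function names of my own, illustrates only that final arithmetic step; THE additionally standardizes each indicator before combining them.

```python
# Illustration only: pillar scores are made up, and THE
# standardizes indicators before weighting. This shows
# just the final weighted-sum step of the methodology.
THE_WEIGHTS = {
    "teaching": 0.30,               # learning environment
    "research": 0.30,               # volume, income, reputation
    "citations": 0.30,              # research influence
    "international_outlook": 0.075, # staff, students, research
    "industry_income": 0.025,       # knowledge transfer
}

def overall_score(pillar_scores):
    """Weighted sum of per-pillar scores on a 0-100 scale."""
    return sum(THE_WEIGHTS[pillar] * score
               for pillar, score in pillar_scores.items())

example = {
    "teaching": 85.0,
    "research": 90.0,
    "citations": 95.0,
    "international_outlook": 80.0,
    "industry_income": 70.0,
}
print(round(overall_score(example), 2))  # 88.75
```

Note that the weights sum to 1.0, so a university scoring 100 on every pillar would receive an overall score of 100.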

However, THE excludes universities "if they do not teach undergraduates or if their research output amounted to fewer than 200 articles per year over the five-year period 2010-14. In exceptional cases, institutions below the 200-paper threshold are included if they have a particular focus on disciplines with generally low publication volumes, such as engineering or arts".

A particular concern raised by academics is standardization, as it is very hard to compare two universities that measure performance differently. "In order to match values that represent different data, they combine indicators in different proportions, based on the distribution of data within a particular indicator," THE states.
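One common way to make differently scaled indicators comparable, in the spirit of what THE describes, though not necessarily its exact procedure, is z-score standardization: each raw value is expressed as its distance from the mean in standard deviations. A minimal sketch, with hypothetical data and function names of my own:

```python
import statistics

def z_scores(values):
    """Standardize raw indicator values so that indicators
    measured on different scales become comparable:
    subtract the mean, divide by the standard deviation."""
    mean = statistics.mean(values)
    spread = statistics.stdev(values)
    return [(v - mean) / spread for v in values]

# Hypothetical research income per staff member, in thousands,
# for five institutions:
incomes = [120.0, 85.0, 300.0, 45.0, 150.0]
print([round(z, 2) for z in z_scores(incomes)])
```

After standardization every indicator has mean 0 and standard deviation 1, so the chosen weights, not the original measurement units, determine each indicator’s influence on the final score.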

The QS World University Rankings, for its part, uses six indicators:

  • 1. Academic reputation (40%)

Measured using a global survey in which academics are asked to identify the institutions where they believe the best work in their field is being done.

  • 2. Employer reputation (10%)

Based on a global survey of employers identifying the universities they perceive to produce the best graduates.

  • 3. Student-to-faculty ratio (20%)

Measures the number of academic staff employed relative to the number of students enrolled.

  • 4. Citations per faculty (20%)

Assesses the universities’ research impact, using Scopus data for the latest five years. The total citation count is assessed relative to the number of faculty members, so larger institutions don’t gain an unfair advantage.

  • 5. International faculty ratio (5%)

Assesses how successful a university has been in attracting academics from other nations, based on the proportion of international faculty members at the institution.

  • 6. International student ratio (5%)

Assesses how successful a university has been in attracting students from other nations, based on the proportion of international students at the institution.
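As with THE, the final QS score is a weighted combination, though the citations indicator is first derived from raw counts. The sketch below uses invented numbers and function names of my own; in the real ranking each indicator is also normalized against all other institutions before weighting:

```python
# Illustration only: scores and counts are invented, and QS
# normalizes every indicator across institutions before
# applying the weights shown here.
QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "student_to_faculty": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def citations_per_faculty(total_citations, faculty_count):
    """Five-year citation total divided by faculty headcount,
    so larger institutions gain no size advantage."""
    return total_citations / faculty_count

def qs_score(indicator_scores):
    """Weighted sum of the six indicator scores (0-100 scale)."""
    return sum(QS_WEIGHTS[name] * score
               for name, score in indicator_scores.items())

print(citations_per_faculty(50_000, 2_000))  # 25.0
print(qs_score({
    "academic_reputation": 90.0,
    "employer_reputation": 80.0,
    "student_to_faculty": 75.0,
    "citations_per_faculty": 85.0,
    "international_faculty": 60.0,
    "international_students": 70.0,
}))  # 82.5
```

The two reputation surveys alone account for half of the total weight, which is one concrete reason the academic criticisms quoted above focus on reputation-driven bias.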


Professor Martínez, on the other hand, says that "contrary to what the media sponsoring these initiatives claim, program rankings are not appropriate for orienting future students either. If we take into account that there is no best university or program in absolute terms, only programs that are more or less adequate for individual prospects, we understand that the most common rankings are no substitute for a good orientation system."

What do you think about these rankings? Are they a good assessment or benchmark for your institution? What are their pros and cons? We appreciate your comments below.