QS World University Rankings

QS World University Rankings is an annual publication of university rankings by Quacquarelli Symonds (QS). The QS system comprises three parts: the global overall ranking, the subject rankings (which name the world's top universities for the study of 51 different subjects and five composite faculty areas), and five independent regional tables—namely Asia, Latin America, Emerging Europe and Central Asia, the Arab Region, and BRICS.[1]

Editor: Ben Sowter (Head of Research)
Staff writers: Craig O'Callaghan
Categories: Higher education
Publisher: Quacquarelli Symonds Limited
First issue: 2004 (in partnership with THE); 2010 (on its own)
Country: United Kingdom

The QS ranking receives approval from the International Ranking Expert Group (IREG),[2] and is viewed as one of the most widely read university rankings in the world, along with the Academic Ranking of World Universities and the Times Higher Education World University Rankings.[3] According to Alexa Internet, it is the most widely viewed university ranking worldwide.[4] However, it has been criticized for its overreliance on subjective indicators and reputation surveys, which tend to fluctuate over time.[5][6][7][8][9] Concern also exists regarding the global consistency and integrity of the data QS uses to generate its rankings.[6][10][11][12]

The QS ranking was previously known as Times Higher Education–QS World University Rankings. The publisher had collaborated with Times Higher Education (THE) magazine to publish its international league tables from 2004 to 2009 before both started to announce their own versions. QS then chose to continue using the pre-existing methodology in partnership with Elsevier, while THE adopted a new methodology to create their rankings.


History

A perceived need for an international ranking of universities for UK purposes was highlighted in December 2003 in Richard Lambert's review of university-industry collaboration in Britain[13] for HM Treasury, the finance ministry of the United Kingdom. Amongst its recommendations were world university rankings, which Lambert said would help the UK to gauge the global standing of its universities.

The idea for the rankings was credited in Ben Wildavsky's book, The Great Brain Race: How Global Universities are Reshaping the World,[14] to then-editor of THE, John O'Leary. THE chose to partner with educational and careers advice company Quacquarelli Symonds (QS) to supply the data, appointing Martin Ince,[15] formerly deputy editor and later a contractor to THE, to manage the project.

Between 2004 and 2009, QS produced the rankings in partnership with THE. In 2009, THE announced they would produce their own rankings, the Times Higher Education World University Rankings, in partnership with Thomson Reuters. THE cited an asserted weakness in the methodology of the original rankings,[16] as well as a perceived favoritism in the existing methodology for science over the humanities,[17] as two of the key reasons for the decision to split with QS.

QS retained intellectual property in the prior rankings and the methodology used to compile them, and continues to produce rankings based on that methodology, which are now called the QS World University Rankings.[18]

THE created a new methodology with Thomson Reuters, and published the first Times Higher Education World University Rankings in September 2010.

Global rankings


Methodology of QS World University Rankings[19]

  • Academic peer review (40%): based on an internal global academic survey
  • Faculty/student ratio (20%): a measurement of teaching commitment
  • Citations per faculty (20%): a measurement of research impact
  • Employer reputation (10%): based on a survey of graduate employers
  • International student ratio (5%): a measurement of the diversity of the student community
  • International staff ratio (5%): a measurement of the diversity of the academic staff

QS publishes the rankings results in the world's media and has entered into partnerships with a number of outlets, including The Guardian in the United Kingdom, and Chosun Ilbo in Korea. The first rankings produced by QS independently of THE, and using QS's consistent and original methodology, were released on September 8, 2010, with the second appearing on September 6, 2011.

QS designed its rankings to assess performance according to what it believes to be key aspects of a university's mission: teaching, research, nurturing employability, and internationalisation.[20]
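As an illustration of how the six weighted indicators combine into an overall score, here is a minimal Python sketch. The weights mirror the methodology table above; the indicator scores, and the assumption that each indicator is pre-scaled to a 0-100 range, are hypothetical.

```python
# Sketch of combining QS's six indicator weights into an overall score.
# Weights come from the methodology table; the example scores are invented.

WEIGHTS = {
    "academic_reputation": 0.40,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "employer_reputation": 0.10,
    "international_students": 0.05,
    "international_staff": 0.05,
}

def overall_score(indicator_scores: dict) -> float:
    """Weighted sum of indicator scores (each assumed scaled 0-100)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must total 100%
    return sum(WEIGHTS[name] * indicator_scores[name] for name in WEIGHTS)

example = {
    "academic_reputation": 92.0,
    "faculty_student_ratio": 75.0,
    "citations_per_faculty": 88.0,
    "employer_reputation": 95.0,
    "international_students": 60.0,
    "international_staff": 70.0,
}
print(round(overall_score(example), 1))  # 85.4
```

Because reputation surveys alone carry half the weight (40% academic plus 10% employer), a swing in survey results moves the overall score more than any other single input, which is the arithmetic behind the volatility criticisms discussed later in this article.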

Academic peer review

This is the most controversial part of the methodology. Drawing on a combination of purchased mailing lists, applications, and suggestions, this survey asks active academics across the world to identify the top universities in their specialist fields. QS has published the job titles and geographical distribution of the participants.[21]

The 2017/18 rankings made use of responses from 75,015 people from over 140 nations for its academic reputation indicator, including votes from the previous five years rolled forward provided no more recent information was available from the same individual. Participants can nominate up to 30 universities, but are not able to vote for their own. They tend to nominate a median of about 20, which means that this survey includes over 500,000 data points. The average respondent possesses 20.4 years of academic experience, while 81% of respondents have over a decade of experience in the academic world.[22][21]

In 2004, when the rankings first appeared, academic peer review accounted for half of a university's possible score. In 2005, its share was cut to 40% because of the introduction of the Employer Reputation Survey.

Faculty/student ratio

This indicator accounts for 20% of a university's possible score in the rankings. It is a classic measure used in various ranking systems as a proxy for teaching commitment, but QS has admitted that it is less than satisfactory.[23]

Citations per faculty

Citations of published research are among the most widely used inputs to national and global university rankings. The QS World University Rankings used citations data from Thomson (now Thomson Reuters) from 2004 to 2007, and since then has used data from Scopus, part of Elsevier. The total number of citations for a five-year period is divided by the number of academics in a university to yield the score for this measure, which accounts for 20% of a university's possible score in the rankings.
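The calculation described above is a simple ratio; a minimal sketch, with invented figures:

```python
def citations_per_faculty(total_citations_5yr: float, faculty_count: int) -> float:
    """Five-year citation total divided by the number of academics,
    as the text above describes for the QS citations indicator."""
    return total_citations_5yr / faculty_count

# Hypothetical university: 150,000 citations over five years, 3,000 academics.
print(citations_per_faculty(150_000, 3_000))  # 50.0
```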

QS has explained that it uses this approach, rather than the citations per paper preferred for other systems, because it reduces the effect of biomedical science on the overall picture – biomedicine has a ferocious "publish or perish" culture. Instead, QS attempts to measure the density of research-active staff at each institution, but issues still remain about the use of citations in ranking systems, especially the fact that the arts and humanities generate comparatively few citations.[24]

However, since 2015, QS has made methodological enhancements designed to remove the advantage institutions specializing in the Natural Sciences or Medicine previously received. This enhancement is termed faculty area normalization, and ensures that an institution's citations count in each of QS's five key Faculty Areas is weighted to account for 20% of the final citations score.[25]
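A hedged sketch of the faculty area normalization described above: each of QS's five faculty areas contributes equally (20%) to the final citations score, so heavily cited fields such as biomedicine cannot dominate the indicator. The per-area scores below are hypothetical values assumed to be already scaled 0-100.

```python
# Faculty area normalization sketch: equal 20% weight per faculty area.
# The example per-area citation scores are invented for illustration.

FACULTY_AREAS = [
    "Arts & Humanities",
    "Engineering & Technology",
    "Life Sciences & Medicine",
    "Natural Sciences",
    "Social Sciences & Management",
]

def normalized_citations_score(area_scores: dict) -> float:
    """Equal-weighted mean over the five faculty areas (20% each)."""
    return sum(area_scores[a] for a in FACULTY_AREAS) / len(FACULTY_AREAS)

example = {
    "Arts & Humanities": 40.0,
    "Engineering & Technology": 70.0,
    "Life Sciences & Medicine": 95.0,  # strong biomedicine no longer dominates
    "Natural Sciences": 80.0,
    "Social Sciences & Management": 55.0,
}
print(normalized_citations_score(example))  # 68.0
```

Under this scheme a university's weak humanities citation record drags the score down by the same amount that an equally weak medical record would, which is the equalizing effect QS says it intended.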

QS has conceded the presence of some data-collection errors regarding citations per faculty in previous years' rankings.[26]

One interesting issue is the difference between the Scopus and Thomson Reuters databases. For major world universities, the two systems capture more or less the same publications and citations. For less mainstream institutions, Scopus includes more non-English-language and smaller-circulation journals. Because papers in those journals are less heavily cited, this can also mean fewer citations per paper for the universities that publish in them.[24] This measure has therefore been criticized for disadvantaging universities that do not use English as their primary language:[27] publications and citations in other languages are harder to access, and because English is the most internationalized language of scholarship, English-language work attracts the most citations.

Employer review

This part of the ranking is obtained by a similar method to the Academic Peer Review, except that it samples recruiters who hire graduates on a global or significant national scale. The numbers are smaller – 40,455 responses from over 130 countries in the 2016 rankings – and are used to produce 10% of any university's possible score. This survey was introduced in 2005 in the belief that employers track graduate quality, making this a barometer of teaching quality, a famously problematic factor to measure. University standing here is of special interest to potential students, and acknowledging this was the impetus behind the inaugural QS Graduate Employability Rankings, published in November 2015.[28][29]

International orientation

The final 10% of a university's possible score is derived from measures intended to capture their internationalism: half from their percentage of international students, and the other half from their percentage of international staff. This is of interest partly because it shows whether a university is putting effort into being global, but also because it indicates whether it is taken seriously enough by students and academics around the world for them to want to be there.[30]


Reception

In September 2015, The Guardian referred to the QS World University Rankings as "the most authoritative of their kind".[31][32] In 2016, Ben Sowter, Head of Research at the QS Intelligence Unit, was ranked in 40th position in Wonkhe's 2016 'Higher Education Power List'. The list enumerated what the organisation believed to be the 50 most influential figures in UK higher education.[33]

Several universities in the UK and the Asia-Pacific region have commented on the rankings positively. Vice-chancellor of New Zealand's Massey University, Professor Judith Kinnear, says that the THE-QS ranking is a "wonderful external acknowledgement of several university attributes, including the quality of its research, research training, teaching, and employability." She said the rankings are a true measure of a university's ability to fly high internationally: "The Times Higher Education ranking provides a rather more sophisticated, robust, and well rounded measure of international and national ranking than either New Zealand's Performance Based Research Fund (PBRF) measure or the Shanghai rankings."[34] In September 2012, British newspaper The Independent described the QS World University Rankings as being "widely recognised throughout higher education as the most trusted international tables".[35]

Angel Calderon, Principal Advisor for Planning and Research at RMIT University and member of the QS Advisory Board, spoke positively of the QS University Rankings for Latin America, saying that the "QS Latin American University Rankings has [sic] become the annual international benchmark universities use to ascertain their relative standing in the region". He further stated that the 2016/17 edition of this ranking demonstrated improved stability.[36]


Criticism

Certain commentators have expressed concern about the use or misuse of survey data. However, QS's Intelligence Unit, responsible for compiling the rankings, states that the size of its survey samples now makes them "almost impossible to manipulate and very difficult for institutions to 'game'". It also states that "over 62,000 academic respondents contributed to our 2013 academic results, four times more than in 2010. Independent academic reviews have confirmed these results to be more than 99% reliable". Since 2013, the number of respondents to QS's Academic Reputation Survey has increased again: the survey now draws on nearly 75,000 academic peer reviews, making it "to date, the world's largest aggregation of feeling in this [the global academic] community."[37][38][39]

The QS World University Rankings have been criticised by many for placing too much emphasis on peer review, which receives 40% of the overall score. Some people have expressed concern about the manner in which the peer review has been carried out.[40] In a report,[41] Peter Wills from the University of Auckland wrote of the THE-QS World University Rankings:

But we note also that this survey establishes its rankings by appealing to university staff, even offering financial enticements to participate (see Appendix II). Staff are likely to feel it is in their greatest interest to rank their own institution more highly than others. This means the results of the survey and any apparent change in ranking are highly questionable, and that a high ranking has no real intrinsic value in any case. We are vehemently opposed to the evaluation of the University according to the outcome of such PR competitions.

However, QS states that no survey participant, academic or employer, is offered a financial incentive to respond, and that no academic can vote for their own institution. If accurate, this undercuts the criticism, which rests on two premises QS disputes: (1) that academics are financially incentivized to participate, and (2) that conflicts of interest are created by academics being able to vote for their own institutions.

Academics have previously criticized the use of the citation database, arguing that it undervalues institutions that excel in the social sciences. Ian Diamond, former chief executive of the Economic and Social Research Council and now vice-chancellor of the University of Aberdeen and a member of the THE editorial board, wrote to Times Higher Education in 2007, saying:[42]

The use of a citation database must have an impact because such databases do not have as wide a cover of the social sciences (or arts and humanities) as the natural sciences. Hence the low position of the London School of Economics, caused primarily by its citations score, is a result not of the output of an outstanding institution but the database and the fact that the LSE does not have the counterweight of a large natural science base.

However, in 2015, QS's introduction of faculty area normalization ensured that QS's rankings no longer conferred an undue advantage or disadvantage upon any institution based on their particular subject specialisms. Correspondingly, the London School of Economics rose from 71st in 2014 to 35th in 2015 and 37th in 2016.[43]

Since the split from Times Higher Education in 2009, further concerns about the methodology QS uses for its rankings have been brought up by several experts.

In October 2010, criticism of the old system came from Fred L. Bookstein, Horst Seidler, Martin Fieder, and Georg Winckler, writing in the journal Scientometrics on the unreliability of QS's methods:

Several individual indicators from the Times Higher Education Survey (THES) data base—the overall score, the reported staff-to-student ratio, and the peer ratings—demonstrate unacceptably high fluctuation from year to year. The inappropriateness of the summary tabulations for assessing the majority of the "top 200" universities would be apparent purely for reason of this obvious statistical instability regardless of other grounds of criticism. There are far too many anomalies in the change scores of the various indices for them to be of use in the course of university management.[7]

In an article for the New Statesman entitled "The QS World University Rankings are a load of old baloney", David Blanchflower, a leading labour economist, said: "This ranking is complete rubbish and nobody should place any credence in it. The results are based on an entirely flawed methodology that underweights the quality of research and overweights fluff... The QS is a flawed index and should be ignored."[44]

However, Martin Ince,[15] chair of the Advisory Board for the Rankings, points out that volatility has been reduced since 2007 by the introduction of the Z-score calculation method, and that over time the quality of QS's data gathering has improved, reducing anomalies. In addition, the academic and employer reviews are now so large that even modestly ranked universities receive a statistically valid number of votes. QS has published extensive data[45] on who the respondents are, where they are located, and the subjects and industries to which the academics and employers respectively belong.
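The Z-score method Martin Ince refers to is standard statistical normalization: each raw indicator value is expressed in standard deviations from the cohort mean, which dampens the effect of outliers and scale changes between years. A minimal sketch, with invented raw values:

```python
import statistics

def z_scores(values: list[float]) -> list[float]:
    """Standardize each value as (x - mean) / population standard deviation."""
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    return [(v - mean) / sd for v in values]

# Hypothetical raw indicator values (e.g. faculty/student ratios) for a cohort.
raw = [12.0, 15.0, 9.0, 30.0, 14.0]
print([round(z, 2) for z in z_scores(raw)])
```

After standardization the scores sum to zero by construction, so an institution's position reflects its distance from the cohort average rather than the raw magnitude of any one indicator.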

The QS Subject Rankings have been dismissed as unreliable by Brian Leiter, who points out that programmes known to be high quality and ranked highly in the Blackwell rankings (e.g., the University of Pittsburgh) fare poorly in the QS ranking for reasons that are not at all clear.[46]

In an article titled The Globalisation of College and University Rankings and appearing in the January/February 2012 issue of Change, Philip Altbach, professor of higher education at Boston College and also a member of the THE editorial board, said: "The QS World University Rankings are the most problematical. From the beginning, the QS has relied on reputational indicators for half of its analysis … it probably accounts for the significant variability in the QS rankings over the years. In addition, QS queries employers, introducing even more variability and unreliability into the mix. Whether the QS rankings should be taken seriously by the higher education community is questionable."[47]

Simon Marginson, professor of higher education at the University of Melbourne and a member of the THE editorial board, in the article "Improving Latin American universities' global ranking" for University World News on 10 June 2012, said: "I will not discuss the QS ranking because the methodology is not sufficiently robust to provide data valid as social science".[48] QS's Intelligence Unit counter these criticisms by stating that "Independent academic reviews have confirmed these results to be more than 99% reliable".[38]

In 2021, research published by the Center for Studies in Higher Education at the University of California, Berkeley raised the possibility that institutions that employ QS's consulting services are rewarded with improved rankings. QS denied the possibility and stated that it had firm policies and practices to minimize potential conflicts of interest.[49]

Young universities

QS also releases the QS Top 50 under 50 Ranking annually to rank universities which have been established for under 50 years. These institutions are judged based on their positions on the overall table of the previous year.[50] From 2015, QS's "Top 50 Under 50" ranking was expanded to include the world's top 100 institutions under 50 years of age, while in 2017 it was again expanded to include the world's top 150 universities in this cohort. In 2020, the table was topped by Nanyang Technological University of Singapore for the seventh consecutive year. The table is dominated by universities from the Asia-Pacific region, with the top four places taken by Asian institutions.[51]

Faculties and subjects

QS also ranks universities by academic discipline, organized into five faculty areas: Arts & Humanities, Engineering & Technology, Life Sciences & Medicine, Natural Sciences, and Social Sciences & Management. The methodology is based on surveys of expert academics and global employers, together with research-performance data sourced from Elsevier's Scopus database. The 2018 QS World University Rankings by Subject named the world's best universities for the study of 48 different subjects. The two subject tables added in that edition were Classics & Ancient History and Library & Information Management.

In the 2020 edition, the institution with the most world-leading positions is the Massachusetts Institute of Technology, which is number one for 12 subjects. Its longtime rankings rival, Harvard University, is number one for eleven subjects.[52]

Categories of QS World University Rankings by faculty and subject[52]

  • Arts & Humanities: Archaeology; Architecture; Art & Design; Classics & Ancient History; English Language & Literature; History; Linguistics; Modern Languages; Performing Arts; Philosophy; Theology, Divinity & Religious Studies
  • Engineering & Technology: Chemical Engineering; Civil & Structural Engineering; Computer Science & Information Systems; Electrical & Electronic Engineering; Mechanical, Aeronautical & Manufacturing Engineering; Mineral & Mining Engineering; Petroleum Engineering
  • Life Sciences & Medicine: Agriculture & Forestry; Anatomy & Physiology; Biological Sciences; Dentistry; Medicine; Nursing; Pharmacy & Pharmacology; Psychology; Veterinary Science
  • Natural Sciences[note 1]: Chemistry; Earth & Marine Sciences; Environmental Sciences; Geography; Geology; Geophysics; Materials Science; Mathematics; Physics & Astronomy
  • Social Sciences & Management: Accounting & Finance; Anthropology; Business & Management Studies; Communication & Media Studies; Development Studies; Economics & Econometrics; Education & Training; Hospitality & Leisure Management; Law; Library & Information Management; Politics & International Studies; Social Policy & Administration; Sports-related Subjects; Statistics & Operational Research

Regional rankings and other tables

QS Graduate Employability Rankings

In 2015, in an attempt to meet student demand for comparative data about the employment prospects offered by prospective or current universities, QS launched the QS Graduate Employability Rankings. The most recent installment, released for the 2022 academic year, ranks 550 universities worldwide. It is led by Massachusetts Institute of Technology, and features five universities from the United States in the top 10.[53] The unique methodology consists of five indicators, with three that do not feature in any other ranking.[54]

Arab World Region

First published in 2014, the annual QS Arab Region University Rankings highlight 130 leading universities in the region. The methodology for this ranking was developed to reflect specific challenges and priorities for institutions in the region, drawing on 10 indicators. In 2020, King Abdulaziz University in Jeddah ranked first.


Asia

In 2009, QS launched the QS Asian University Rankings or QS University Rankings: Asia in partnership with The Chosun Ilbo newspaper in Korea to rank universities in Asia independently. The ninth instalment, released for the 2017/18 academic year, ranks the 350 best universities in Asia, and is led by Nanyang Technological University, Singapore.[55]

These rankings use some of the same criteria as the world rankings, but there are changed weightings and new criteria. One addition is the criterion of incoming and outgoing exchange students. Accordingly, the performance of Asian institutions in the QS World University Rankings and the QS Asian University Rankings released in the same academic year are different from each other.[1]

Emerging Europe and Central Asia

First published in 2015, QS Emerging Europe and Central Asia University Rankings ranks 350 universities from mostly Eastern Europe and Central Asia, with Russia's Lomonosov Moscow State University in the top spot since the first publishing of rankings.

Latin America

The QS Latin American University Rankings or QS University Rankings: Latin America were launched in 2011. They use academic opinion (30%), employer opinion (20%), publications per faculty member, citations per paper, academic staff with a PhD, faculty/student ratio, and web visibility (10% each) as measures.[56]

The 2021 edition of the QS World University Rankings: Latin America ranks the top 300 universities in the region. Chile's Pontificia Universidad Católica de Chile retained its status as the region's best university for the fourth straight year.[57]


Africa

The number of universities in Africa increased by 115 percent from 2000 to 2010, and enrollment more than doubled from 2.3 million to 5.2 million students, according to UNESCO. However, only one African university, the University of Cape Town, was among the world's 100 best according to the 2016 world university rankings.[58]


BRICS

This set of rankings adopts eight indicators to select the top 100 higher learning institutions in the BRICS countries. Institutions in Hong Kong, Macau, and Taiwan are not ranked here.

QS Best Student Cities Ranking

In 2012, QS launched the QS Best Student Cities ranking – a table designed to evaluate which cities were most likely to provide students with a high-quality student experience. Five editions of the ranking have been published thus far, with Paris taking the number-one position in four of them.[59][60][61] The 2017 edition was also the first one to see the introduction of student opinion as a contributory indicator.


Student recruitment events

QS Quacquarelli Symonds organizes a range of international student recruitment events throughout the year. These are generally oriented towards introducing prospective students to university admissions staff, while also facilitating access to admissions advice and scholarships. In 2019, over 360 events were hosted, attended by 265,000 candidates, in 100 cities across 50 countries. Separated into "tours", QS's event offerings typically comprise a series of university and business school fairs.

World MBA Tour

The QS World MBA Tour is the world's largest series of international business school fairs, attended by more than 60,000 candidates in 100 cities across 50 countries.

World MBA Tour Premium

QS World MBA Premium also focuses on MBA student recruitment, but invites only business schools ranked in the top 200 internationally, according to the QS World University Rankings. The event aims to provide a more holistic overview of an MBA degree, with enhanced focus on pre- and post-study processes and insights.

World Grad School Tour

The QS World Grad School Tour focuses on international postgraduate programs, particularly specialised master's degrees and PhDs in FAME (Finance, Accounting, Management and Economics) and STEM disciplines.

World University Tour

The QS World University Tour has an emphasis on undergraduate student recruitment, inviting undergraduate programs only.

Connect Events

QS Connect MBA and QS Connect Masters differ from other event series in that they do not follow an open fair format. Instead, candidates take part in pre-arranged one-to-one interviews with admissions staff, based on pre-submitted CVs and academic profiles.

QS Stars

QS also offers universities an auditing service that provides in-depth information about institutional strengths and weaknesses. Called QS Stars, this service is separate from the QS World University Rankings. It involves a detailed look at a range of functions which mark out a modern, global university. The minimum result that a university can receive is zero Stars, while truly exceptional, world-leading universities can receive '5*+', or 'Five Star Plus', status. The QS Stars audit process evaluates universities according to about 50 different indicators. By 2018, about 20 different universities worldwide had been awarded the maximum possible Five Star Plus rating.[62]

QS Stars[63] ratings are derived from scores in eight of 12 categories. Four categories are mandatory, and institutions choose four more from the remaining eight optional categories.[64] They are:

  • Teaching
  • Employability
  • Research
  • Internationalization
  • Facilities
  • Online/Distance Learning
  • Arts & Culture
  • Innovation
  • Inclusiveness
  • Social Responsibility
  • Subject Ranking
  • Program Strength[65]

Stars is an evaluation system, not a ranking. About 400 institutions had opted for the Stars evaluation as of early 2018. In 2012, fees to participate in this program were $9850 for the initial audit and an annual license fee of $6850.[66]


Notes

  1. The term "Natural Sciences" here actually refers to physical sciences since life sciences are also a branch of natural sciences.


  1. "Asian University Rankings - QS Asian University Rankings vs. QS World University Rankings™". Archived from the original on 2013-06-06. Retrieved 2013-06-10. The methodology differs somewhat from that used for the QS World University Rankings...
  2. "IREG Ranking Audit". IREG Observatory on Academic Ranking and Excellence. International Ranking Expert Group (IREG). Archived from the original on 2016-10-29. Retrieved 14 September 2016.
  3. "University rankings: which world university rankings should we trust?". The Telegraph. 2015. Archived from the original on 2015-01-26. Retrieved 27 January 2015. It is a remarkably stable list, relying on long-term factors such as the number of Nobel Prize-winners a university has produced, and number of articles published in Nature and Science journals. But with this narrow focus comes drawbacks. China's priority was for its universities to "catch up" on hard scientific research. So if you're looking for raw research power, it's the list for you. If you're a humanities student, or more interested in teaching quality? Not so much.
  4. "topuniversities.com Competitive Analysis, Marketing Mix and Traffic - Alexa". Archived from the original on 2020-07-28. Retrieved 2020-04-01.
  5. "Strength and weakness of varsity rankings". NST Online. 2016-09-14. Archived from the original on 2018-03-30. Retrieved 2018-03-29.
  6. "The State of the Rankings | Inside Higher Ed". Archived from the original on 2018-07-11. Retrieved 2018-03-29.
  7. Bookstein, F. L.; Seidler, H.; Fieder, M.; Winckler, G. (2010). "Scientometrics, Volume 85, Number 1". Scientometrics. SpringerLink. 85 (1): 295–299. doi:10.1007/s11192-010-0189-5. PMC 2927316. PMID 20802837.
  8. "Methodology of QS rankings comes under scrutiny". www.insidehighered.com. Archived from the original on 2016-07-01. Retrieved 2016-04-29.
  9. "Competition and controversy in global rankings - University World News". www.universityworldnews.com. Archived from the original on 2016-05-05. Retrieved 2016-04-29.
  10. Bekhradnia, Bahram. "International university rankings: For good or ill?" (PDF). Higher Education Policy Institute. Archived (PDF) from the original on 2017-02-15.
  11. "Academic Ethics: To Rank or Not to Rank?". The Chronicle of Higher Education. 2017-07-12. Archived from the original on 2018-03-30. Retrieved 2018-03-29.
  12. "QS ranking downright shady and unethical". The Online Citizen. 2017-06-09. Archived from the original on 2018-03-30. Retrieved 2018-03-29.
  13. Lambert Review of Business-University Collaboration Archived October 19, 2011, at the Wayback Machine (since archived)
  14. Princeton University Press, 2010
  15. "Martin Ince Communications". Archived from the original on 2014-12-20. Retrieved 31 May 2015.
  16. Mroz, Ann. "Leader: Only the best for the best". Times Higher Education. Archived from the original on 2010-08-07. Retrieved 2010-09-16.
  17. Baty, Phil (2010-09-10). "Views: Ranking Confession". Inside Higher Ed. Archived from the original on 2010-07-15. Retrieved 2010-09-16.
  18. Labi, Aisha (2010-09-15). "Times Higher Education Releases New Rankings, but Will They Appease Skeptics?". The Chronicle of Higher Education. London, UK. Retrieved 2010-09-16.
  19. "QS World University Rankings: Methodology". QS (Quacquarelli Symonds). 2014. Archived from the original on 2015-04-29. Retrieved 29 April 2015.
  20. "MS and MBA in USA". MS MBA in USA. 2015-01-17. Archived from the original on 2015-04-18. Retrieved 31 May 2015.
  21. "2011 Academic Survey Responses". Archived from the original on February 6, 2012. Retrieved 12 September 2013.
  22. "QS Intelligence Unit - 2018 Academic Survey Responses". www.iu.qs.com. Archived from the original on 2017-07-15. Retrieved 29 June 2017.
  23. QS Intelligence Unit | Faculty Student Ratio Archived October 12, 2011, at the Wayback Machine. Iu.qs.com. Retrieved on 2013-08-12.
  24. QS Intelligence Unit | Citations per Faculty Archived October 28, 2011, at the Wayback Machine. Iu.qs.com. Retrieved on 2013-08-12.
  25. "Archived copy" (PDF). Archived from the original on 2015-09-11. Retrieved 2016-09-09.
  26. Richard Holmes. "University Ranking Watch". Archived from the original on 2015-03-16. Retrieved 31 May 2015.
  27. "Global university rankings and their impact". European University Association. Archived 2012-08-26 at the Wayback Machine. Retrieved 3 September 2012.
  28. "QS Intelligence Unit | Employer Reputation". Archived August 24, 2016, at the Wayback Machine. Retrieved 2018-05-03.
  29. "QS Intelligence Unit - QS Graduate Employability Rankings". www.iu.qs.com. Archived from the original on 2017-07-12. Retrieved 29 June 2017.
  30. "QS Intelligence Unit | International Indicators". Iu.qs.com. Archived October 24, 2011, at the Wayback Machine. Retrieved 2013-08-12.
  31. Weale, Sally (2015-09-14). "British universities slip down in global rankings". The Guardian. Archived from the original on 2016-09-10. Retrieved 15 September 2016.
  32. Kich, Martin (2015-09-17). "U.S. Higher Education News for September 15, 2015". Academe Blog. Martin Kich. Archived from the original on 2016-02-22. Retrieved 15 September 2016.
  33. Leach, Mark. "Higher Education Power List - 2016". WonkHe. WonkHe. Archived from the original on 2016-09-24. Retrieved 19 September 2016.
  34. "Flying high internationally". Archived December 11, 2007, at the Wayback Machine.
  35. "Cambridge loses top spot to Massachusetts Institute of Technology". The Independent. 11 September 2012. Archived from the original on 2012-09-15. Retrieved 11 September 2012.
  36. Calderon, Angel. "How to boost your university's ranking position". University World News. University World News. Archived from the original on 2016-09-15. Retrieved 14 September 2016.
  37. "2016 Academic Survey Responses". QS Intelligence Unit. QS Quacquarelli Symonds. Archived from the original on 2016-08-24. Retrieved 14 September 2016.
  38. "Academic Reputation". QS Intelligence Unit. QS Quacquarelli Symonds. Archived from the original on 2016-09-20. Retrieved 14 September 2016.
  39. Moran, Jack (2016-09-05). "Top 200 universities in the world 2016: the global trends". The Guardian. Archived from the original on 2016-09-24. Retrieved 14 September 2016.
  40. Holmes, Richard (2006-09-05). "So That's how They Did It". Rankingwatch.blogspot.com. Archived from the original on 2010-08-08. Retrieved 2010-09-16.
  41. "Response to Review of Strategic Plan by Peter Wills" (PDF). Archived from the original (PDF) on 6 April 2008. Retrieved 29 June 2017.
  42. "Social sciences lose 1". Timeshighereducation.co.uk. 2007-11-16. Archived from the original on 2011-11-23. Retrieved 2010-09-16.
  43. "Faculty Area Normalization – Technical Explanation" (PDF). QS Quacquarelli Symonds. Archived (PDF) from the original on 2015-09-11. Retrieved 14 September 2016.
  44. "The QS World University Rankings are a load of old baloney". 5 September 2011. Archived from the original on 2013-10-16. Retrieved 31 May 2015.
  45. "QS Intelligence Unit - QS World University Rankings". Archived from the original on 2016-01-06. Retrieved 31 May 2015.
  46. "Guardian and 'QS Rankings' Definitively Prove the Existence of the 'Halo Effect'". Leiter Reports: A Philosophy Blog. Leiterreports.typepad.com (2011-06-05). Archived 2012-08-01 at the Wayback Machine. Retrieved 2013-08-12.
  47. Change Magazine - Taylor & Francis (13 January 2012). "Change Magazine - January-February 2012". Archived from the original on 2015-05-12. Retrieved 31 May 2015.
  48. "Improving Latin American universities' global ranking - University World News". Archived from the original on 2013-06-15. Retrieved 31 May 2015.
  49. Jaschik, Scott (April 27, 2021). "Buying Progress in Rankings?". Inside Higher Ed. Retrieved April 27, 2021.
  50. "QS Top 50 under 50". Quacquarelli Symonds. Archived from the original on 2013-06-15. Retrieved 2013-07-07.
  51. Symonds, Quacquarelli. "QS Top 50 Under 50". Top Universities. Quacquarelli Symonds. Archived from the original on 2017-07-25. Retrieved 19 July 2017.
  52. "QS World University Rankings by Subject 2020". Top Universities. QS Quacquarelli Symonds. Retrieved 27 December 2020.
  53. "QS Graduate Employability Rankings 2022". Top Universities. Retrieved 2022-08-05.
  54. "QS Graduate Employability Rankings 2020 Methodology". QS Top Universities. QS Quacquarelli Symonds. 2017-09-06. Archived from the original on 2017-09-21. Retrieved 21 September 2017.
  55. "QS University Rankings: Asia 2018". Top Universities. 2017-10-12. Archived from the original on 2016-06-16. Retrieved 2018-04-05.
  56. "Methodology (QS University Rankings – Latin America)". Quacquarelli Symonds. Archived from the original on 2014-07-29. Retrieved 12 August 2014.
  57. "QS Latin American University Rankings 2020". Top Universities. 2019-10-11. Retrieved 2020-04-11.
  58. "This matter cannot wait". D+C. Archived from the original on 2018-06-14. Retrieved 16 March 2018.
  59. "QS Best Student Cities 2016". Quacquarelli Symonds Limited. 30 November 2015. Archived from the original on 2017-07-05. Retrieved 29 June 2017.
  60. "QS Best Student Cities 2015". Quacquarelli Symonds Limited. 21 November 2014. Archived from the original on 2017-07-03. Retrieved 29 June 2017.
  61. "QS Best Student Cities 2014". Quacquarelli Symonds Limited. 14 November 2013. Archived from the original on 2017-08-28. Retrieved 29 June 2017.
  62. "QS Stars University Ratings". Top Universities. QS Quacquarelli Symonds. 2014-05-08. Archived from the original on 2016-09-14. Retrieved 2016-09-14.
  63. "QS Stars Methodology".
  64. "What is QS Stars?". 2016-10-12. Archived from the original on 2017-07-04.
  65. "QS Stars Methodology". 2012-11-04. Archived from the original on 2017-07-04.
  66. "Ratings at a Price for Smaller Universities". The New York Times. 30 December 2012. Archived from the original on 2013-04-15. Retrieved 10 September 2013.
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.