2010/12/19

GLOBAL: No rankings provide all the answers

According to the OECD, there are 150 million university students in the world, and 3.3 million of them are studying outside their home country. This means there is growing demand for information about universities on a global scale. One approach to this challenge is university ranking.

Survey data shows that rankings are important to students. In addition, governments all over the world want to see how well their higher education systems are doing, while academics and university managers need information to help them to place their institutions in a global context.

Universities existed for centuries without rankings, which grew up within individual nations in the late 20th century and internationally in the 21st. They were developed because of growing participation in higher education, combined with increasingly market-driven and competitive university systems where information is power.

In 2003, Shanghai Jiao Tong University produced the first Academic Ranking of World Universities. It looks at a specific aspect of university performance: the peak of scientific achievement, as recorded by measures such as Nobel Prizes and highly cited research papers.

In 2004 QS Quacquarelli Symonds began producing the World University Rankings, which take a more synoptic viewpoint. They use expert review by academics and employers as a key input, along with each institution's intake of international students and academic staff, its ratio of academic staff to students, and citations of its published research. The present writer is Chairman of the 21-person academic advisory board of these rankings.

One advantage of the QS system is its stability. It has appeared for seven years with only small changes in approach, and its volatility is lower than that of most other higher education rankings.

Another advantage is its completeness. In 2010, the academic and recruiter surveys, which contribute half of a university's possible score, drew on 185,000 votes from more than 20,000 academics and recruiters spread throughout the world and across a full range of subjects and industries.

The academics are asked to specify which of five broad areas of academic study they are expert in: the natural sciences, biomedicine, technology, the social sciences, and the arts and humanities. They are then asked to name up to 30 universities which are excellent in this area, although in practice the median number they nominate is nearer 15. They cannot vote for their own institution.
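
To make these voting rules concrete, here is a minimal sketch of how nominations of this kind could be tallied. It is illustrative only, not QS's actual processing pipeline, and the ballots shown are invented:

```python
from collections import Counter

def tally_nominations(ballots, cap=30):
    """Count valid nominations per university in one broad subject area.

    Each ballot is a pair: (the voter's own institution, a list of
    nominated universities). At most `cap` nominations are counted,
    and self-votes are discarded, mirroring the rules described above.
    """
    counts = Counter()
    for own_institution, nominations in ballots:
        for university in nominations[:cap]:
            if university != own_institution:  # no votes for one's own institution
                counts[university] += 1
    return counts

# Invented ballots from three academics in the natural sciences
ballots = [
    ("University A", ["University B", "University C", "University A"]),
    ("University B", ["University C", "University A"]),
    ("University C", ["University A", "University B"]),
]

print(tally_nominations(ballots).most_common())
# The self-vote on the first ballot is ignored, so each university
# ends up with two valid nominations.
```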

We also gather more detailed data on their subject specialisms, so we may know that someone from the natural sciences is a chemist. This data will be used in our forthcoming subject analysis mentioned below.

The employers are asked which industry or business area (including public or non-profit) they come from, and to specify which universities they like to recruit from.

In addition, QS's data on international staff and students, academic papers and other criteria is essentially complete for more than 600 universities on which it publishes information.

While most of the world's significant research comes from about 180 universities, a far larger number of excellent institutions around the world attract the ever-growing contingent of internationally mobile students, and QS is able to gather worthwhile data on these institutions too.

This material is used in compiling the other half of an institution's possible score. Data on papers and citations comes from Elsevier, one of the world's biggest academic publishers.
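
To show how the two halves might combine arithmetically, here is a minimal sketch of a weighted composite score. The article states only that the two surveys together carry half of the possible score; the individual indicator names, weights and scores below are illustrative assumptions, not QS's published methodology:

```python
def composite_score(indicators, weights):
    """Weighted sum of indicator scores, each normalised to a 0-100 scale."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1.0")
    return sum(weights[name] * indicators[name] for name in weights)

# Assumed weights: the two surveys carry 50% between them, as stated in
# the article; the split across the remaining indicators is invented.
weights = {
    "academic_survey": 0.40,
    "employer_survey": 0.10,
    "staff_student_ratio": 0.20,
    "citations": 0.20,
    "international_staff": 0.05,
    "international_students": 0.05,
}

# Hypothetical indicator scores for one institution
indicators = {
    "academic_survey": 92.0,
    "employer_survey": 88.0,
    "staff_student_ratio": 75.0,
    "citations": 81.0,
    "international_staff": 60.0,
    "international_students": 70.0,
}

print(round(composite_score(indicators, weights), 1))  # 83.3
```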

QS does not include in the ranking specialist institutions which work in only one of the five broad academic areas, or which do not teach undergraduates. This removes some big-name medical and business schools from an overall table in which they would be significantly disadvantaged.

As University World News readers may know, the QS rankings were published in Times Higher Education (THE) until 2009. They now appear in collaboration with media partners around the world, including US News & World Report, publisher of the most prestigious US national rankings; online at www.topuniversities.com; and in book form from QS.

THE launched its own rankings in 2010. They use five measures rather than QS's six, underpinned by 13 narrower measures whose values are not published. Three of the five groups of measures relate to research and together account for 65% of a university's possible score. The other two are intended to capture the teaching environment and international commitment.

Is THE right to claim that this system is the best comparison tool for the world's universities? I think not. For one thing, it omits major institutions such as Warwick University in the UK and the University of Texas at Austin in the US. Perhaps worse, it is short on information.

The only part of the THE rankings that shows how complete the underlying data is concerns industrial income. Here we see blanks for 59 of the top 200 institutions, including major universities such as Chicago, Tokyo and ETH Zurich. It is impossible to tell how complete the rest of THE's information is, but these gaps are ominous.

This may be why Jan Sadlak, President of the IREG Observatory on Academic Rankings and Excellence and perhaps the world's leading rankings expert, describes as "unproven" the claim by THE rankings editor Phil Baty, published in University World News last month, that THE's ranking is 'the only one that really counts'.

Bob Morse, editor of the US News and World Report Best Colleges, points to THE's attempts to measure the learning environment of universities as a further weakness.

Most of this measure is taken from a reputational survey in which an academic in Singapore could express an opinion about teaching at a university in Germany that she has never visited. This measure also uses data on a university's ratio of doctoral to bachelor-level students. This is an indicator of research intensity and has no connection to teaching.

QS has never said that its ranking is the best. We have immense confidence in the quality of our work, but no one system can tell you all you need to know about a complex body like a university.

David Eastwood, Vice-chancellor of Birmingham University in the UK and former head of the Higher Education Funding Council for England, says: "We greatly value QS for the clarity and quality of the data you use and for the stability which enables us to see and understand trends over time.

"This, we think, gives your rankings a comparative advantage and considerable authority."

The next stage in the development of world rankings will be the appearance of more information on specific subjects. QS plans to publish some of this information in 2011, and the OECD AHELO (Assessment of Higher Education Learning Outcomes) project will provide more. Students and other users of rankings have been asking for more subject-specific information for years, and they are bound to welcome this development.

QS will use citations, peer opinion, employer opinion and other relevant measures, specific to each subject, to develop these rankings.

AHELO grows out of OECD's expertise in assessing school learning, long honed in the PISA initiative. The preliminary stage involves 150 institutions voluntarily having their students' progress analysed in two subject areas, economics and engineering, where curricula are likely to be similar across the world.

Despite these developments, there will continue to be interest in overall world rankings of top institutions. Many universities include a higher world ranking among the goals of their strategic plans. But like existing rankings, none of these sources of information will provide all the answers.

* Martin Ince is chair of the Advisory Board of the QS World University Rankings. He is a UK-based science and education journalist and media adviser, and was deputy editor of THE: www.martinince.com
