One of the hottest issues still attracting worldwide attention is university rankings. The two highest-profile ranking systems are, of course, the Shanghai Jiao Tong and the Times Higher rankings, both of which focus on what might constitute a world-class university and, on that basis, who is ranked where. Rankings are also part of an emerging niche industry. All this generates a high level of institutional, national, and indeed supranational (if we count Europe) angst about who’s up, who’s down, and who has managed to hold their position. And whilst everyone points to the flaws in these ranking systems, both have nevertheless captured the attention and imagination of the sector as a whole. In an earlier blog entry this year GlobalHigherEd mused over why European-level actors had not managed to produce an alternative system of university rankings which might, on the one hand, counter the hegemony of the powerful Shanghai Jiao Tong ranking (which privileges US universities) and, on the other, act as a policy lever that Europe could pull to direct the emerging European higher education system.
Yesterday The Lisbon Council, an EU think-tank (see our entry here for a profile of this influential think-tank), released what might be considered a challenge to the Shanghai Jiao Tong and Times Higher ranking schemes – a University Systems Ranking (USR), presented in its report University Systems Ranking: Citizens and Society in the Age of Knowledge. The difference between this ranking system and the Shanghai and Times rankings is that it focuses on country-level data and change, not on individual institutions.
The USR has been developed by the Human Capital Center at The Lisbon Council, Brussels (with support from the European Commission’s Education, Audiovisual and Culture Executive Agency) and with advice from the OECD.
The report begins with the questions: why do we have university systems? What are these systems intended to do? And what do we expect them to deliver – to society, to individuals and to the world at large? The underlying message in the USR is that “a university system has a much broader mandate than producing hordes of Nobel laureates or cabals of tenure- and patent-bearing professors” (p. 6).
So how is the USR different, and what might we make of this difference for the development of universities in the future? The USR is based on six criteria:
- Inclusiveness – number of students enrolled in the tertiary sector relative to the size of its population
- Access – ability of a country’s tertiary system to accept and help advance students with a low level of scholastic aptitude
- Effectiveness – ability of a country’s education system to produce graduates with skills relevant to the country’s labour market (measured by wage premia)
- Attractiveness – ability of a country’s system to attract a diverse range of foreign students (using the top 10 source countries)
- Age range – ability of a country’s tertiary system to function as a lifelong learning institution (share of 30-39 year olds enrolled)
- Responsiveness – ability of the system to reform and change, measured by the speed and effectiveness with which the Bologna Declaration was adopted (15 of the 17 countries surveyed have adopted the Bologna criteria)
These criteria are then applied to 17 OECD countries (all but 2 of them signatories of the Bologna Process). A composite ranking is produced, as well as rankings on each of the criteria. So what were the outcomes for the higher education systems of these 17 countries?
Drawing upon all six criteria, a composite USR figure is then produced. Australia is ranked 1st, the UK 2nd and Denmark 3rd, whilst Austria and Spain are ranked 16th and 17th respectively (see Table 1 below). We can also see rankings based on specific criteria (Table 2 below).
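The mechanics of such a composite can be illustrated with a small sketch. The report does not spell out its exact aggregation formula, so the snippet below simply averages each country’s rank across the six criteria – a hypothetical method with invented scores, not the USR’s actual calculation or data.

```python
# Hypothetical sketch: build a composite ranking from per-criterion ranks.
# The per-country ranks below are invented for illustration; the USR's
# real data and weighting scheme may differ.

# rank on each of the six criteria (1 = best), in the order:
# inclusiveness, access, effectiveness, attractiveness, age range, responsiveness
ranks = {
    "Australia": [2, 1, 3, 1, 2, 4],
    "UK":        [3, 2, 1, 2, 5, 1],
    "Denmark":   [1, 4, 2, 5, 1, 3],
}

# composite score = mean rank across the six criteria (lower is better)
composite = {country: sum(r) / len(r) for country, r in ranks.items()}

# order countries by composite score to produce the final ranking
for place, country in enumerate(sorted(composite, key=composite.get), start=1):
    print(f"{place}. {country} (mean rank {composite[country]:.2f})")
```

With these invented numbers the ordering happens to match the report’s top three (Australia, UK, Denmark); any real reconstruction would need the underlying country-level data.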
There is much to be said for this intervention by The Lisbon Council – not least that it opens up debates about the role and purposes of universities. Over the past few months there have been numerous heated public interventions on this matter – from whether universities should be little more than giant patenting offices to whether they should be managers of social justice systems.
And though there are evident shortcomings (the lack of clarity about what might count as a university; the assumption that a university-based education is the form of education best suited to producing a knowledge-based economy and society; the absence of attention to the equity/access range within any one country; and so on), the USR does at least place issues like ‘lifelong learning’, ‘access’ and ‘inclusion’ on the reform agenda for universities across Europe. It also signals a set of values that The Lisbon Council would like to advance – values not currently reflected in the two key ranking systems.
However, the big question now is whether universities will see value in this kind of ranking system for its wider systemic, as opposed to institutional, possibilities – even if only as a basis for discussing what universities are for and how we might produce more equitable knowledge societies and economies.
Susan Robertson and Roger Dale
The European Commission (EC) has just released its annual report for 2007, Progress Towards the Lisbon Objectives in Education and Training: Indicators and Benchmarks. This 195-page document highlights the key messages about the main policy areas for the EC – from the rather controversial inclusion of schools (controversial because of issues of subsidiarity) to what has become more standard fare for the EC – the vocational education and higher education sectors.
As we explain below, while the Report gives the thumbs up to the numbers of Maths, Science and Technology (MST) graduates, it gives the thumbs down to the quality of higher education. We, however, think that the benchmarks are far too simplistic and the conclusions drawn not sufficiently rigorous to support good policymaking. Let us explain.
The Report is the fourth in a series of annual assessments examining performance and progress toward the Education and Training 2010 Work Programme. These reports work as a disciplinary tool for Member States as well as contributing to making the EU more globally competitive.
For those of you unfamiliar with EC ‘speak’: the EC’s Work Programme centers on the realization of 16 core indicators (agreed in May 2007 at the European Council and listed in the table below) and 5 benchmarks (also listed below), which emerged from the relaunch of the Lisbon Agenda in 2005.
Chapter 7 of this Report concentrates on progress toward modernizing higher education in Europe, though curiously enough there is no mention of the Bologna Process – the radical reorganization of the degree structure of European universities which has the US and Australia on the back foot. Instead, three key areas are identified:
- mathematics, science and technology graduates (MST)
- mobility in higher education
- quality of higher education institutions
With regard to MST, the EU is well on course to surpass the benchmark of an increase in the number of tertiary graduates in MST. However, the report notes that demographic trends (decreasing cohort size) will slow down growth in the long term.
While this progress is laudable, GlobalHigherEd notes that it is not so much the number of graduates produced that is the problem. Rather, there are not enough attractive opportunities for researchers in Europe, so a significant percentage move to the US (14% of US graduates come from Europe). The long-term attractiveness of Europe (see our recent entry) in terms of R&D therefore remains a major challenge.
With regard to mobility (see our earlier overview report), the EU has seen an increase in the percentage of students with foreign citizenship. In 2004, every EU country, with the exception of Denmark, Estonia, Latvia, Lithuania, Hungary and Slovakia, recorded an increase in the percentage of enrolled students with foreign citizenship. Austria, Belgium, Germany, France, Cyprus and the UK have the highest proportions, with foreign student populations of more than 10%.
Over the period 2000 to 2005 the number of students going to Europe from China increased more than fivefold (from 20,000 in 2000 to 107,000 in 2005; see our more detailed report on this), while numbers from India increased by 400%. While there is little doubt that the USA’s homeland security policy was a major factor, students also view the lower fees and moderate living costs in countries like France and Germany as particularly attractive. In the main:
- the non-European students studying in the EU come largely from former colonies of the European member states
- mobility occurs within the EU rather than from beyond it, with the exception of the UK. The UK is also a stand-out case because of the small number of its citizens who study in other EU countries.
Finally, concerning the quality of higher education, the Bologna reforms are nowhere to be seen. Instead the EC report uses the Shanghai Jiao Tong Academic Ranking of World Universities (ARWU) and the World University Rankings (WUR) of the Times Higher Education Supplement to discuss the issue of quality. The Shanghai Jiao Tong ranking uses Nobel awards and citation indexes (e.g. SCI, SSCI) – however, not only is a Nobel award a limited (some say false) proxy for quality, but the citation indexes systematically discriminate in favor of US-based institutions and journals. Only scientific output is included in each of these rankings; excluded are other kinds of university outputs which might have an impact, such as patents or policy advice.
While each ranking system is intended to be a measure of quality, it is difficult to know what we might learn when one (Times Higher) ranks an institution (for example, the London School of Economics) in 11th position while the other (Shanghai) ranks the same institution in 200th position. Such vast differences could only confuse potential students if they were using the rankings to choose a high-quality institution. Perhaps, however, that is not the main purpose, and the rankings serve a more important one: ratcheting up both competition and discipline through comparison.
League tables are now also being developed in more nuanced ways. In 2007 the Shanghai ranking introduced one by ‘broad subject field’ (see below). What is particularly interesting here is that the EU-27 does relatively well against the USA in Engineering/Technology and Computer Sciences (ENG), Clinical Medicine and Pharmacy (MED) and Natural Sciences and Mathematics (SCI), compared with the Social Sciences (where the USA outflanks it by a considerable degree). Are the social sciences in Europe really this poor in quality, and hence in serious trouble? GlobalHigherEd suggests that these differences more likely reflect the more internationalized, Anglicized publishing practices of the science, technology and medical fields, in comparison with the social sciences, which in many cases remain committed to publishing in national languages.
The somewhat dubious nature of these rankings as indicators of quality does not stop the EC from using them to show that, of the top 100 universities, 54 are located in the USA and only 29 in Europe. And again, the overall project of the EC is to set the agenda at the European scale for Member States by putting into place a set of European-level instruments – including the recently launched European Research Council – intended to help retain MST graduates as well as recruit the brightest talent from around the globe (particularly China and India) and keep them in Europe.
However, the EU’s capacity to produce MST graduates outruns its industry’s ability to absorb and retain them. It is clear that markets for students and brains are developing in different ways in different countries, but with clear ‘types’ of markets and consumers emerging. The question is: what would an EU ranking system achieve as a technology of competitive market making?
Susan Robertson and Peter Jones