Europe-based CHERPA network wins tender to develop alternative global ranking of universities


The decision on who has won the European Commission’s million-euro tender – to develop and test a global ranking of universities – has finally been announced.

The successful bidder – the CHERPA network (the Consortium for Higher Education and Research Performance Assessment) – is charged with developing a ranking system to overcome what the European Commission regards as the limitations of the Shanghai Jiao Tong and QS-Times Higher Education schemes. The final product is to be launched in 2011.

CHERPA comprises leading European institutions in the field; all have been developing and offering rather different approaches to ranking over the past few years (see our earlier stories here, here and here for some of the potential contenders).

Will this new European Commission-driven initiative set the proverbial European cat amongst the transatlantic alliance pigeons?

As we have noted in earlier commentary on university rankings, the different approaches tip the rankings playing field in the direction of different interests. Much to the chagrin of the continental Europeans, the high-status US universities do well on the Shanghai Jiao Tong University Ranking, whilst the QS-Times Higher Education ranking tends to see UK universities feature more prominently.

CHERPA will develop a design that follows the so-called ‘Berlin Principles on the Ranking of Higher Education Institutions’. These principles stress the need to take the linguistic, cultural and historical contexts of educational systems into account [something of an irony for those watching UK higher education developments last week, when, following a Cabinet reshuffle, the reference to ‘universities’ in the departmental name was dropped: the two-year-old Department for Innovation, Universities and Skills has been abandoned in favor of a mega-Department for Business, Innovation and Skills! (read more here)].

According to the website of one of the consortium members, CHE:

The basic approach underlying the project is to compare only institutions which are similar and comparable in terms of their missions and structures. Therefore the project is closely linked to the idea of a European classification (“mapping”) of higher education institutions developed by CHEPS. The feasibility study will include focused rankings on particular aspects of higher education at the institutional level (e.g., internationalization and regional engagement) on the one hand, and two field-based rankings for business and engineering programmes on the other hand.

The field-based rankings will each focus on a particular type of institution and will develop and test a set of indicators appropriate to these institutions. The rankings will be multi-dimensional and will – like the CHE ranking – use a grouping approach rather than simplistic league tables. In contrast to existing global rankings, the design will compare not only the research performance of institutions but will include teaching & learning as well as other aspects of university performance.

The different rankings will be targeted at different stakeholders: They will support decision-making in universities and especially better informed study decisions by students. Rankings that create transparency for prospective students should promote access to higher education.

University World News, in its report out today on the announcement, notes:

Testing will take place next year and must include a representative sample of at least 150 institutions with different missions in and outside Europe. At least six institutions should be drawn from the six large EU member states, one to three from the other 21, plus 25 institutions in North America, 25 in Asia and three in Australia.

There are multiple logics and politics at play here. On the one hand, a European ranking system may well give the European Commission more HE governance capacity across Europe, strengthening its steering of national systems in areas like ‘internationalization’ and ‘regional engagement’ – two key areas identified for work to be undertaken by CHERPA.

On the other hand, this new European ranking system – when realized – might also appeal to countries in Latin America, Africa and Asia that currently do not feature in any significant way in the two dominant systems. Like the Bologna Process, the CHERPA ranking system might well find itself generating ‘echoes’ around the globe.

Or will regions around the world prefer to develop and promote their own niche ranking systems, elements of which were evident in the recently launched QS.com Asia ranking? Whatever the outcome, as we have observed before, there is a thickening industry, with profits to be had, around this aspect of the emerging global higher education landscape.

Susan Robertson

Ranking – in a different (CHE) way?

GlobalHigherEd has been profiling a series of entries on university rankings as an emerging industry and technology of governance. This entry has been kindly prepared for us by Uwe Brandenburg. Since 2006 Uwe has been project manager at the Centre for Higher Education Development (CHE) and CHE Consult, a think tank and consultancy focusing on higher education reform. Uwe has an MA in Islamic Studies, Politics and Spanish from the University of Münster (Germany), and an MScEcon in Politics from the University of Wales at Swansea.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Talking about rankings usually means talking about league tables. Values are calculated from weighted indicators, turned into a figure, added up and formed into an overall value, often indexed to 100 for the best institution and counting down from there. Moreover, in many cases entire universities are compared and the scope of indicators is somewhat limited. We at the Centre for Higher Education Development (CHE) are highly sceptical about this approach. For more than ten years we have been running our own ranking system, which is so different that some experts have argued it might not be a ranking at all – which is actually not true. Just because the Toyota Prius uses a very different technology to produce energy does not exclude it from the species of automobiles. What, then, are the differences?


Firstly, we do not believe in the ranking of entire HEIs, mainly because such a ranking necessarily blurs the differences within an institution. For us, the target group has to be the starting point of any ranking exercise. One can fairly argue that it does not help a student looking for a physics department to learn that university A is average when in fact its physics department is outstanding, its sociology appalling and the rest mediocre. It is the old problem of the man with his head in the fire and his feet in the freezer: a doctor would diagnose that the man is in a serious condition, while a statistician might claim that overall he is doing fine.

So instead we always rank at the subject level. And the results of the first ExcellenceRanking – which focused on natural sciences and mathematics in European universities, with a clear target group of prospective Master’s and PhD students – prove the point: only four institutions excelled in all four subjects, another four in three, while most excelled in only one subject. And this was within a group of quite closely related subjects.


Secondly, we do not create values by weighting indicators and then calculating an overall value. Why is that? The main reason is that any weight is necessarily arbitrary – in other words, political. The person doing the weighting decides which weight to give each indicator and, in doing so, pre-decides the outcome of the ranking. It gets even worse when you then add the different values together into one overall value, because this blurs the differences between individual indicators.

Say a discipline publishes a lot but nobody reads it. If you give publications a weight of two and citations a weight of one, the department will look very strong. If you do it the other way around, it will look pretty weak. Adding the values makes things worse still, because it blurs the difference between the two performances. And those two indicators are rather closely related; if you aggregate results from research indicators with reputation indicators, the outcome becomes entirely meaningless.
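To make the arithmetic concrete, here is a minimal sketch of the weighted-sum logic being criticised. The departments, indicator values and weights are all invented for illustration:

```python
# Minimal sketch of the weighted-sum league table logic criticised above.
# Departments, indicator values and weights are invented for illustration.

departments = {
    "Dept A": {"publications": 90, "citations": 30},  # publishes a lot, rarely cited
    "Dept B": {"publications": 40, "citations": 80},  # publishes less, widely cited
}

def league_table(weights):
    """Weight each indicator, sum into one overall value, rank best-first."""
    overall = {
        name: sum(weights[ind] * val for ind, val in indicators.items())
        for name, indicators in departments.items()
    }
    return sorted(overall.items(), key=lambda item: item[1], reverse=True)

# Publications weighted two, citations one: Dept A tops the table.
print(league_table({"publications": 2, "citations": 1}))
# The reverse weighting: Dept B tops the table.
print(league_table({"publications": 1, "citations": 2}))
```

Nothing about the two departments changes between the two calls; only the ranker’s (political) choice of weights does.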

Instead, we let the indicator results stand on their own and let users decide what is important for their personal decision-making process. For example, in the classical ranking we allow users to create “my ranking”, choosing the indicators they want to look at and in which order.

Thirdly, we strongly object to the idea of league tables. If the values which create the table are technically arbitrary (because of the weighting and the aggregation), the league table positions create the even worse illusion of distinct and decisive differences between places. They foster the impression of a difference in quality (no time or space here to tackle the tricky issue of what quality might be) that is measurable to the percentage point – in other words, that there is a qualitative, objectively recognizable and measurable difference between place 12 and place 15. This is normally not the case.

Moreover, small mathematical differences can create huge differences in league table position. Take the THES-QS: even in the social sciences subject cluster there is a mere 4.3-point difference on a 100-point scale between league ranks 33 and 43. In the overall university rankings, there is a meagre 6.7-point difference between ranks 21 and 41, and further down the table a slim 15.3-point difference between ranks 100 and 200. That is to say, the league table positions of HEIs might differ on the strength of much less than a single point, or less than 1% (of an arbitrarily set figure), and thus tell us much less than the league position suggests.

Our approach, therefore, is to create groups (top, middle, bottom) that describe the performance of each HEI relative to the other HEIs.
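As a minimal sketch of how such a grouping could be computed – the cut-off rule (mean plus or minus half a standard deviation) and the scores below are my assumptions for illustration, not CHE’s actual methodology:

```python
# Minimal sketch of a grouping approach: each HEI is assigned to a top /
# middle / bottom group relative to the field, instead of a 1..n league
# position. Cut-off rule and scores are invented for illustration.

from statistics import mean, stdev

scores = {  # invented indicator values for six institutions
    "HEI 1": 71.2, "HEI 2": 70.8, "HEI 3": 69.5,
    "HEI 4": 55.0, "HEI 5": 54.1, "HEI 6": 31.7,
}

mu = mean(scores.values())
sigma = stdev(scores.values())

def group(value):
    """Place a score in a group relative to the whole field."""
    if value > mu + 0.5 * sigma:
        return "top"
    if value < mu - 0.5 * sigma:
        return "bottom"
    return "middle"

for name, value in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(name, group(value))
# HEIs 1-3, separated by less than two points, all land in "top" rather
# than in spuriously precise league positions 1, 2 and 3.
```

The point of the design is that differences smaller than the noise in the data can no longer manufacture differences in position.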


This means our rankings are not as easily read as the others. However, we strongly believe in the cleverness of the users. Moreover, we try to communicate at every possible level that every ranking (and therefore also ours) is based on indicators chosen by the ranking institution. Consequently, the results of a ranking can tell you something about how an HEI performs within the framework of what the ranker considers interesting, necessary, relevant, etc. Rankings therefore NEVER tell you who is the best, but maybe (depending on the methodology) who is performing best (or, in our case, better than average) in aspects considered relevant by the ranker.

A small but highly relevant aspect might be added here. Rankings (in the HE system as well as in other areas of life) might suggest that a result on an indicator proves that an institution is performing well in the area measured by that indicator. Well, it does not. All an indicator does is hint: provided the data is robust and relevant, the results give some idea of how small the gap is between the performance of the institution and the best possible result (if such a benchmark exists). The important word is “hint”, because “indicare” – from which the word “indicator” derives – means exactly this: a hint, not a proof. And in the case of many quantitative indicators, what counts as “best” or “better” is again a political decision if the indicator stands alone (e.g. are more international students better? Are more exchange agreements better?).

This is why we argue that rankings have a useful function in terms of creating transparency if they are properly used, i.e. if users are aware of the limitations, the purpose, the target groups and the agenda of the ranking organization, and if the ranking is understood as one instrument among others suited to informing whatever decision is to be made in relation to an HEI (study, cooperation, funding, etc.).

Finally, modesty is maybe what a ranker should have in abundance. Having run the ExcellenceRanking in three different phases (the initial round in 2007, a second phase with new subjects right now, and a repetition of the natural sciences just starting), I am certain of one thing: however strongly we aim at being sound and coherent, and however intensely we re-evaluate our efforts, there is always the chance of missing something – of not picking an excellent institution. For the world of ranking, Einstein’s conclusion holds a lot of truth:

Not everything that can be counted counts, and not everything that counts can be counted.

For further aspects see:
http://www.che-ranking.de/cms/?getObject=47&getLang=de
http://www.che-ranking.de/cms/?getObject=44&getLang=de
Federkeil, Gero (2008) ‘Rankings and Quality Assurance in Higher Education’, Higher Education in Europe, 33, pp. 209-218.
Federkeil, Gero (2008) ‘Ranking Higher Education Institutions – A European Perspective’, Evaluation in Higher Education, 2, pp. 35-52.
Other researchers specialising in this area (and often referring to our method) include, for example, Alex Usher, Marijk van der Wende and Simon Marginson.

Uwe Brandenburg

China: from ‘emerging contender’ to ‘serious player’ in cross-border student mobility

Last year we carried a series of reports (see here, here and here) on the global distribution of student mobility. While the US and the UK had the lion’s share of this market, with 22% and 12% respectively, we noted that China had made big gains. With 7% of the global market and in 6th place overall, it was an ‘emerging contender’ to be taken seriously, with trends suggesting it was becoming a serious player as both a net ‘exporter’ and an ‘importer’ of education services.

So it was with great interest that I read today’s Chronicle of Higher Education report by Mara Hvistendahl on China now being ranked in 5th place (behind the US, UK, France and Germany) as an ‘importer’ of foreign students. See this OECD chart, from its new Education at a Glance 2008 report, to situate this development trend and China’s current position [recall that China is not an OECD member country].

As the Chronicle report notes, this is a far cry from China’s 33 overseas students in 1950.

Given, too, that in 1997 there were only 39,000 foreign students whilst in 2007 there were some 195,000, this five-fold increase in ten years (Chinese Ministry of Education and the China Scholarship Council) represents a staggering achievement, and one that is likely to continue. So how has China achieved this? According to the Chronicle report:

To attract students, China offers competitive packages, replete with living stipends, health insurance, and, sometimes, travel expenses. In 2007 the China Scholarship Council awarded 10,000 full scholarships — at a cost of 360 million yuan ($52-million) — to international students. By 2010 the council aims to double the number of awards.

Two-fifths of the 2007 grants went to students in Asia. In a separate scholarship program that reflects its global political strategy, China is using its strengths in science and technology to appeal to students in the Middle East, Africa, and Central Asia, forming partnerships with governments in those regions to sponsor students in medicine, engineering, and agriculture.

But there are other factors as well pushing China up the ladder as an education destination. China is increasingly regarded as a strategic study-abroad destination by American students and the US government. Figures reported in the Institute of International Education’s fact sheet on student mobility to and from the US show a 38% increase in US students going to China in just one year (2005/06). This also represents a profound shift in Sino-American educational relations.

In sum, these figures reflect the outcome of an overall strategy by China (perversely aided by the US’s own global trade and diplomacy agenda):

  • to develop a world class higher education system;
  • to internationalize Chinese higher education;
  • to stem the tide of students flowing out of China;
  • to attract half a million students to China by 2020; and
  • to advance Chinese interests through higher education diplomacy.

If realized, this would put China at the top of the exporting nations alongside the US. It would also register China as a higher education player with truly global impact. Without doubt this will change the geopolitics of global higher education.

Susan Robertson

International students in the UK: interesting facts

Promoting and responding to the globalisation of the higher education sector is a myriad of newer actors and agencies on the scene, including the UK Higher Education International Unit. Set up in 2007, the UK HE International Unit aims to provide:

credible, timely and relevant analysis to those managers engaged in internationalisation across the UK HE sector, namely – Heads of institutions, pro-Vice Chancellors for research and international activities; Heads of research/business development offices and International student recruitment & welfare officers.

The UK HE International Unit both publishes and profiles (with download options) useful analytical reports, as well as providing synoptic comparative pictures of international student and staff recruitment for UK higher education institutions and their competitors. Its newsletter is well worth subscribing to.

Readers of GlobalHigherEd might find the following UK HE International Unit compiled facts interesting:

  • In 2004, 2.7 million students were enrolled in HEIs outside their countries of citizenship. In 2005-06, six countries hosted 67% of these students (23% in the US, 12% in the UK, 11% in Germany, 10% in France, 7% in Australia, and 5% in Japan). (UNESCO, 2006)
  • New Zealand’s share of the global market for international students increased more than fourfold between 2000 and 2006. Australia’s increased by 58% and the UK’s by 35%. (OECD, 2006)
  • There were 223,850 international students (excluding EU) enrolled at UK HEIs in 2005-06, an increase of 64% in just five years. There were a further 106,000 EU students in 2005-06. (HESA, 2006)
  • International students make up 13% of all HE students in the UK, third in proportion only to New Zealand and Australia. For those undertaking advanced research programmes, the figure is 40%, second only to Switzerland. The OECD averages are 6% and 16%, respectively. (OECD, 2006)
  • UK HEIs continue to attract new full-time undergraduates from abroad. The number of new international applicants for entry in 2007 was 68,500, an increase of 7.8% on the previous year. The number of EU applicants rose by 33%. (UCAS, 2007)
  • Students from China make up almost one-quarter of all international students in the UK. The fastest increase is from India: in 2007 there were more than 23,000 Indian students in the UK, a five-fold increase in less than a decade. (British Council, 2007)
  • The number of students in England participating in the Erasmus programme declined by 40% between 1995-96 and 2004-05 – from 9,500 to 5,500. Participation from other EU countries increased during this period. However, North American and Australian students have a lower mobility level than their UK counterparts. (CIHE, 2007).

Susan Robertson

Is 2008 a watershed for Europe’s ‘Lisbon Agenda’?

It’s all really good news for the EC, according to the report European Growth and Jobs Monitor: Indicators of Success in the Knowledge Economy 2008, released today by Allianz SE, one of Europe’s leading financial services providers, and the Brussels-based think tank the Lisbon Council. Indeed, the report goes on to claim that 2008 marks a watershed for Europe (see our earlier report on the EC’s assessment of Lisbon in 2007). At a time when some parts of the world are reeling from ever more bad news about economic slow-downs and rising debt, this claim surely needs to be looked at more closely.

According to Allianz SE, despite earlier setbacks, and following the significant policy reorientations and renovations undertaken since 2005 (see the Kok Review, 2004), the Lisbon Strategy is now believed to be achieving its goals.

The report notes:

…at the time of writing, Europe outpaces the United States in economic growth. And, for the first time in more than 10 years, productivity is growing faster on a quarterly average basis than in the US – an intriguing trend which, if it proves sustainable, could signal a real turning point in Europe’s decade long effort to establish itself as truly “the most competitive and dynamic knowledge-based economy in the world”, as the original Lisbon Agenda proposed. In other words… …Lisbon is working.


However, as Financial Times reporter Tony Barber notes:

It sounds almost too good to be true. The report’s tone would certainly surprise many political leaders and businessmen in the US and Asia, where Europe is often portrayed as a continent in relative economic decline. In fact, when you read the Lisbon Council report in full, you begin to suspect that its real message is that the European economy, though strong in many respects, has obvious weaknesses as well. For example, on research and development spending, there has been “limited progress” and “most countries have a lot of catching up to do”.


Looked at more closely, this upbeat report clearly hides what might be regarded as more disturbing facts.

For instance, spending on R&D, one of Europe’s big targets in realizing a knowledge-based economy, is still a long way off target. Add to this that a number of education systems in Europe are also off target, with high drop-out rates among young learners, while in countries like Germany only about one-fifth of 15-year-olds plan to go on to university, and the picture becomes less rosy.

Leaving aside for the moment the contentious matter of whether greater levels of participation in higher education automatically lead to a knowledge-based economy, it is evident that there are several ways of reading this ‘good news’.

As we can see from the table comparing the current rankings with those of a year ago, it is not so much a question of Lisbon now being realised – if we view it as a regional strategy – but of some economies across Europe currently performing much more strongly than others.

In other words, we are seeing the effects of the strong performance of some countries (Denmark, Finland, Ireland and Sweden) and the weak performance of others, especially Italy.

What is certain from the report is that higher education will continue to be a centrepiece of European policy, and that the 2007 agenda – to keep up the pace of change – is likely to continue to ‘shake up’ the sector in radical ways.

Susan Robertson


Freefall in the Australian higher education market?

Today’s report by Geoff Maslen for University World News (9th December) – on whether Australia’s A$11 billion-a-year education export market is facing a potentially catastrophic fall – must have Australian politicians and university managers shaking in their boots. The figures, it seems, are in something of a free-fall, and any spinning out of control is likely to leave a pretty large hole in the economy. As Maslen notes:

Foreign students now contribute $2.4 billion a year to university coffers. Yet the flow of new students arriving in Australia to undertake university courses has plummeted from double digit increases in the early 2000s to low single-digit increases.

In the first years to 2007, the number of overseas students undertaking university award courses on campuses in Australia jumped by more than 50% to hit 175,000 for the first time. But, over that period, annual enrolment growth fell successively from 17% to 12% to 8% and this year it is down to less than 4%.

Maslen goes on to suggest that a major reason contributing to the fall is the change in the visa processes tied to the skilled migration program. Large numbers of students come to Australia from India and China with the express purpose of gaining permanent residency once they have completed their studies. However, it seems that employers have been complaining about the poor levels of English competence amongst these students, making them unsuitable for much more than casual work. As a result, students who apply to stay on will face stricter tests of their English language competence following completion of their studies and as part of their application for permanent residence.

Maslen may well be right here. However, GlobalHigherEd can’t help but think that this isn’t the major reason, especially as it refers to students applying to stay on once they have completed their studies, rather than to those who are planning to come in the first place and who may, at that point, feel a lot more confident about their ability to learn and use English.

What must surely also be an important factor in this mix is the growing competitiveness of once-smaller players in the education export business – countries like France and Germany, for example – which are now regarded as potentially desirable destinations, given their move toward English-language instruction at the graduate level, their lower fees and moderate living costs, and their active wooing of Chinese students.

The USA, too, has had time to reflect on its own position, and is now reporting an increase in overseas student numbers after the post-September 11 period, when things were definitely heading in the wrong direction. US higher education institutions have invested in people and new processes in an attempt to turn around the decline in numbers and it seems that, at least for some institutions, this is paying off.

Finally, relative currency exchange trends are clearly not moving in Australia’s favour in comparison to the country’s competitors, especially the US.

What is clear is that, once in the game, there is absolutely no room for complacency – or the outcomes are potentially catastrophic, not only for the economy but for the institutions most directly affected.

Susan Robertson

Battling for market share 2: the ‘Middle Powers’ and international student mobility

Yesterday GlobalHigherEd ran the first of four in-depth reports on the battle for market share in higher education – on the Major Players. Our reports draw from a major study released this week by the Observatory on Borderless Higher Education (OBHE), International Student Mobility: Patterns and Trends. The Observatory report identifies four categories: (1) the Major Players; (2) the Middle Powers; (3) the Emerging Destinations; and (4) the Emerging Contenders.

Today we look at the ‘Middle Powers’.

The Middle Powers are Germany and France, which together host 20% of all foreign students (compared with the Major Players’ 45%). In contrast to the Major Players, who attract students from all over the world, the Middle Powers draw students from neighbouring European countries, or from countries with which they have strong cultural and historical ties – in the case of France, for instance, Morocco, Algeria and Senegal.

Sciences Po, Paris, France

However, both Germany and France have also managed to significantly increase their numbers of students from China – one of the two major target countries for recruitment – by around 500% over the past 8-10 years.

In 1997, Germany recruited 4,980 students from China; by 2006 it was recruiting 26,390. Similarly, France recruited a mere 1,374 students from China in 1999; by 2005 its numbers had increased to 15,963. Compare this trend with the US, which recruited 42,503 in 1997, increasing to only 62,583 in 2006. Only Australia and the UK have figures close to those of France and Germany in terms of an expanding share of the Chinese market. The OBHE also notes that these two Middle Powers have failed to target India, making them less strategic in their approach to the market.

However, the advantage these Middle Powers have, at least for the moment, is their value for money (low fees and affordable living costs) and their move to teaching in English (see our report last week). The question is how long that advantage might last: the European Commission is pressing its Member States to consider imposing or increasing student fees in order to augment flagging higher education budgets.

Given the above developments, GlobalHigherEd sees a tension emerging between (i) attracting talent for the knowledge economy (streamlined visa systems for retention, R&D infrastructures, etc.), (ii) being an attractive destination for higher education (low fees, low living costs, a high-quality product), and (iii) securing a higher return to the institution and the economy by way of fees.

The question for these Middle Powers is how to become major global players, and what might be the costs and benefits in doing so.

Source: OBHE (2007) International Student Mobility: Patterns and Trends.

Susan Robertson