Elephants in the room, and all that: more ‘reactions’ to the Bologna Process Leuven Communiqué

Editor’s Note: As those of you following GlobalHigherEd well know, the big news story of April on the higher education calendar was the release of the Leuven Communiqué following the 6th Bologna Ministerial Conference held in Leuven/Louvain-la-Neuve, 28-29 April 2009.

Ministers from the 46 Bologna countries gathered to review progress toward realizing the objectives of the Bologna Process by 2010, and to establish priorities for the European Higher Education Area. Prior to the meeting there was an avalanche of stocktaking reports, surveys, analyses and other kinds of commentary, all fascinating reading (see this blog entry for a listing of materials).

With the Communiqué released, and the ambition to take the Bologna Process into the next decade under the banner ‘The Bologna Process 2020’, GlobalHigherEd has invited leading European actors and commentators to ‘react’ to the Communiqué.


Last week we posted some initial ‘reactions’: Pavel Zgaga’s Bologna: beyond 2010 and over the Ocean – but where to? and Peter Jones’ Was there a student voice in Leuven? In this entry, we add more. We invited Per Nyborg, Roger Dale, Pauline Ravinet and Anne Corbett to comment briefly on one aspect they each felt warranted highlighting.

Per Nyborg was Head of the Bologna Process Secretariat (2003-2005). Roger Dale, Professor of Sociology of Education, University of Bristol, UK, has written extensively on the governance of the European Higher Education Area and the role of the Bologna Process within it. He recently published a co-edited volume, Globalisation and Europeanisation in Education (2009, Symposium Books). Pauline Ravinet is a postdoctoral researcher at the Université Libre de Bruxelles, Belgium. She completed her doctoral research at Sciences Po, Paris, and has published extensively on the Bologna Process. Anne Corbett is Visiting Fellow, European Institute, London School of Economics and Political Science (LSE). Dr Corbett is the author of Universities and the Europe of Knowledge: Ideas, Institutions and Policy Entrepreneurship in European Union Higher Education 1955-2005 (Palgrave Macmillan, 2005).

Susan Robertson

~~~~~~~~~~~~~~~~~~~~

“Bologna Toward 2010” – Per Nyborg

In 2005, halfway toward 2010, Ministers declared that they wished to establish a European Higher Education Area (EHEA) based on the principles of quality and transparency and our rich heritage and cultural diversity. They committed themselves to the principle of public responsibility for higher education. They saw the social dimension as a constituent part of the EHEA.

The three cycles were established, each level for preparing students for the labor market, for further competence building and for active citizenship. The overarching framework for qualifications, the agreed set of European standards and guidelines for quality assurance, and the recognition of degrees and periods of study, were seen as key characteristics of the structure of the EHEA.

What has been added four years later? Ministers have called upon European higher education institutions to further internationalize their activities and to engage in global collaboration for sustainable development. Competition on a global scale will be complemented by enhanced policy dialogue and cooperation based on partnership with other regions of the world. Global cooperation and global competition may have taken priority over solidarity between the 46 partner countries. But Bologna partners outside the European Economic Area region must not be left behind!

A clear and concise description of the EHEA and the obligations of the participating countries is what we should expect from the 2010 ministerial conference – at least if it is to be seen as the founding conference for the EHEA, and not only a Bologna anniversary on the way to 2020.

~~~~~~~~~~~~~~~~~~~~~~~~~~

“Elephants in the Room, and All That” – Roger Dale

Reading the Leuven Communiqué, we can’t help but be impressed by the continuing emphasis on the public nature of, and public responsibility for, higher education that has characterized BFUG statements over the years. Indeed, the word ‘public’ appears nine times.

However, at the same time we can’t help wondering about some other words and connotations that don’t appear.

The nature of the ‘fast evolving society’ to which the EHEA is to respond, as implied by the Communiqué, seems rather different from that implied by these elephants in the room.

Quite apart from ‘private’ (which, as Marek Kwiek has constantly reminded us, is indispensable to an understanding of HE in many of the newer member states), we may cite the following:

  • First and foremost, ‘Lisbon’, with its dominant focus on productivity and growth;
  • Second, the ‘European Commission’, the home and driver of Lisbon, and the indispensable paymaster and facilitator of the Bologna Process;
  • Third, the ‘European Research Area’; surely a report on European universities and the ERA would paint a rather different picture of the universities over the next decade from that presented here.

It is difficult to see how complete and accurate a picture of Bologna, as it goes into its second phase, this ‘more of the same’ Communiqué provides. Perhaps the most pregnant phrase in the document is “Liaise with experts in other fields, such as research, immigration, social security and employment” – a very mixed and interesting quartet, whose different demands may pose real problems of harmonization.

~~~~~~~~~~~~~~~~~~~~~~

“The Bologna Process – a New Institution?” – Pauline Ravinet

In my own research I have been particularly interested in the early phases and subsequent institutionalization of the Bologna Process. In this work I have tried to recompose and analyze what happened between 1998, the year when the process began with the unexpected Sorbonne declaration, and now, when the Bologna Process has become the central governance arena for higher education in Europe. This did not happen in one day; it was rather the progressive invention of a unique European structure for the coordination of national higher education policies.

Reading the Leuven Communiqué with the institutionalization question in mind is extremely interesting. Presenting the achievements of the 2000s and defining the priorities for the decade to come, this text states more explicitly than any Bologna document before it that the process has gone much further than a ten-year provisional arrangement for the attainment of common objectives.

The Bologna Process is becoming an institution. It is, first, an institution in the most formal meaning: the Bologna Process designates an original organizational structure, functioning according to specific rules, and equipped with innovative coordination tools, which will not perish but on the contrary enter a new life cycle in 2010. The Bologna Policy Forum, which met on 29th April, will be the formal group that engages with the globalization of Bologna. This represents a further new expression of the institutionalization of Bologna.

But it is also an institution in a more sociological sense. The Bologna arena has acquired value and legitimacy beyond the performance of specific tasks; it embeds and diffuses a policy vision which frames the representations and strategies of higher education actors from all over Europe, and catches the interest of students, academia and HE experts worldwide.

~~~~~~~~~~~~~~~~~~~~~~~~~

“Fit For Purpose?” – Anne Corbett

‘The European Higher Education Area in the New Decade’ has European ministers responsible for higher education declaring (para 24) that

‘[t]he present organisational structure of the Bologna Process is endorsed as being fit for purpose’.

You may think this a boring detail. However, as a political scientist, I’d argue that this is the most theoretically and politically interesting phrase in the Communiqué. In policy-making terms, the Bologna decade has been about framing issues, negotiating agendas, developing policies and testing out modes of cooperation which can be accepted throughout a Europe re-united for the first time in 50 years.

These are the sorts of activities typically carried out by experts and officials who are somewhat shielded from the political process. For a time they enjoy a policy monopoly (Baumgartner and Jones 1993 – see reference below).

In terms of policy effectiveness this is all to the good. The people who devote time and thought to these issues have to build up relations of trust and respect. They don’t need politicians to harry them over half-thought-out ideas.

The Bologna Follow-up Group, which devises and delivers on a work programme corresponding to the wishes of ministers, has produced an unprecedented degree of voluntary cooperation on instruments as well as aims (European Standards and Guidelines in Quality Assurance, qualifications frameworks, and stocktaking or national benchmarking), thanks to working groups which recruit quite widely, seminars and the like. Almost every minister at the Leuven conference started his or her 90-second speech with tributes to the BFUG.

But there comes a time in every successful policy process when political buy-in is needed. The EHEA-to-be does not have that. Institutionally, Bologna is run by ministers and their administrations, technocrats and lobbyists. Finance (never mentioned in any communiqué) is provided by the EU Commission, EU presidencies and the host countries of ministerial conferences (up to now EU). Records of the Bologna Process remain the property of the ministries providing the secretariat in a particular policy cycle. “It works, don’t disturb it” is the universal message of those insiders who genuinely want the process to advance.

Students in the streets (as opposed, as Peter Jones’ entry reminds us, to those in the Brussels-based European Students’ Union) are a sign that a comfortably informal process has its limits once an implementation stage is reached. This is such a well-known political phenomenon that it is astonishing that sophisticated figures in the BFUG are not preparing to open the door to the idea that an EHEA needs arenas at national and European level where ministers are answerable to the broad spectrum of political opinion. Parliamentarians could be in the front line here. Will either of the European assemblies or any of the 46 national parliaments take up the challenge?

Baumgartner, F. and B. Jones (1993) Agendas and Instability in American Politics, Chicago: University of Chicago Press.

‘Tuning USA’: reforming higher education in the US, Europe style

Many of us are likely to be familiar with the film An American in Paris (1951), at least by name. Somehow the romantic encounters of an ex-GI turned struggling American painter with an heiress in Paris, one of Europe’s most famous cities, seem like the way things should be.

So when the US-based Lumina Foundation announced it was launching Europe’s ‘Tuning Approach within the Bologna Process’ as an educational experiment in three American states (Utah, Indiana and Minnesota) to “…assure rigor and relevance for college degrees at various levels” (see Inside Higher Ed, April 8th, 2009), familiar refrains and trains of thought are suddenly thrown into reverse gear. A European in America? Tuning USA, Europe style?

For Bologna watchers, Tuning is no new initiative. According to its website profile, Tuning started in 2000 as a project:

…to link the political objectives of the Bologna Process and at a later stage the Lisbon Strategy to the higher education sector. Over time Tuning has developed into a Process: an approach to (re-)design, develop, implement, evaluate and enhance quality in first, second and third cycle degree programmes.

Given that the Bologna Process entails the convergence of 46 higher education systems across Europe and beyond (those countries which are signatories to the Process but operate outside Europe’s borders), the question of how the comparability of curricula can be assured, in terms of structures, programmes and actual teaching, was clearly a pressing issue.

Funded under the European Commission’s Erasmus Thematic Network scheme, Tuning Educational Structures in Europe emerged as a project that might address this challenge.

However, rather like the Bologna Process, Tuning has had a remarkable career. Its roll-out across Europe, and take-up in countries as far afield as Latin America and the Caribbean (LAC), has been nothing short of astonishing.

Currently 18 Latin American and Caribbean countries (181 LAC universities) are involved in Tuning Latin America across twelve subject groups, including Architecture, Business, Civil Engineering, Education, Geology, History, Law, Mathematics, Medicine, Nursing and Physics. The Bologna and Tuning Processes, it would seem, are considered a key tool for generating change across Latin America.

Similar processes are under way in Central Asia, the Mediterranean region and Africa. And while the Bologna promoters tend to emphasise the cultural and cooperative orientation of Tuning and Bologna, both are self-evidently strategies to reposition European higher education geostrategically. It is a market-making strategy, as well as, increasingly, a model for how to restructure higher education systems to produce greater resource efficiencies and, some might add, greater equity.


Similarly, the Tuning Process is regarded as a means for realizing one of the ‘big goals’ that Lumina Foundation President Jamie Merisotis had set for the Foundation soon after taking over the helm: to increase the proportion of the US population with degrees to 60% by 2025, so as to ensure the global competitiveness of the US.

According to the Chronicle of Higher Education (May 1st, 2009), Merisotis “gained the ear of the White House”  during the transition days of the Obama administration in 2008 when he urged Obama “to make human capital a cornerstone of US economic policy”.

Merisotis was also one of the experts consulted by the US Department of Education when it sought to determine the goals for education, and the measures of progress toward those goals.

By February 2009, President Obama had announced to Congress that he wanted America to attain the world’s highest proportion of graduates by 2020. So while the ‘big goal’ had now been set, the question was: how?

One of the Lumina Foundation’s responses was to initiate Tuning USA. According to the Chronicle, Lumina has been willing to draw on ideas generated by the education policy community in the US and internationally.

Clifford Adelman is one of those. A senior associate at the Institute for Higher Education Policy in Washington, Adelman was contracted by the Lumina Foundation to produce a very extensive report on Europe’s higher education restructuring. The report (The Bologna Process for U.S. Eyes: Re-learning Higher Education in the Age of Convergence) was released in early April, and was profiled by Anne Corbett in GlobalHigherEd. In the report Adelman sets out to redress what he regards as the omissions of the Spellings Commission review of higher education. As Adelman (2009: viii) notes:

The core features of the Bologna Process have sufficient momentum to become the dominant global higher education model within the next two decades. Former Secretary of Education, Margaret Spellings’ Commission on the Future of Higher Education paid no attention whatsoever to Bologna, and neither did the U.S. higher education community in its underwhelming response to that Commission’s report. Such purblind stances are unforgivable in a world without borders.

But since the first version of this monograph, a shorter essay entitled The Bologna Club: What U.S. Higher Education Can Learn from a Decade of European Reconstruction (Institute for Higher Education Policy, May 2008), U.S. higher education has started listening seriously to the core messages of the remarkable and difficult undertaking in which our European colleagues have engaged. Dozens of conferences have included panels, presentations, and intense discussions of Bologna approaches to accountability, access, quality assurance, credits and transfer, and, most notably, learning outcomes in the context of the disciplines. In that latter regard, in fact, three state higher education systems—Indiana, Minnesota, and Utah—have established study groups to examine the Bologna “Tuning” process to determine the forms and extent of its potential in U.S. contexts. Scarcely a year ago, such an effort would have been unthinkable.

Working with students, faculty members and education officials from Indiana, Minnesota and Utah, Lumina has now initiated Tuning USA as a year-long project:

The aim is to create a shared understanding among higher education’s stakeholders of the subject-specific knowledge and transferable skills that students in six fields must demonstrate upon completion of a degree program. Each state has elected to draft learning outcomes and map the relations between these outcomes and graduates’ employment options for at least two of the following disciplines: biology, chemistry, education, history, physics and graphic design (see report in InsideIndianabusiness).

The world has changed. The borders between US and European higher education are now somewhat leaky – for strategic purposes, to be sure.

A European in America is now somehow thinkable!

Susan Robertson

CRELL: critiquing global university rankings and their methodologies

This guest entry has been kindly prepared for us by Beatrice d’Hombres and Michaela Saisana of the EU-funded Centre for Research on Lifelong Learning (CRELL) and Joint Research Centre. This entry is part of a series on the processes and politics of global university rankings (see here, here, here and here).

Since 2006, Beatrice d’Hombres has been working in the Unit of Econometrics and Statistics of the Joint Research Centre of the European Commission. She is part of the Centre for Research on Lifelong Learning. Beatrice is an economist who completed a PhD at the University of Auvergne (France). She has particular expertise in the economics of education and applied econometrics.


Michaela Saisana works for the Joint Research Centre (JRC) of the European Commission in the Unit of Econometrics and Applied Statistics. She has a PhD in Chemical Engineering, and in 2004 she won the European Commission – JRC Young Scientist Prize in Statistics and Econometrics for her contribution to the robustness assessment of composite indicators and her work on sensitivity analysis.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The expansion of access to higher education, the growing mobility of students, the need for an economic rationale behind the allocation of public funds, together with the demand for greater accountability and transparency, have all contributed to the need for comparing university quality across countries.

Recognition of this fact has also been greatly stirred by the publication, since 2003, of the Shanghai Jiao Tong University Academic Ranking of World Universities (henceforth SJTU), which measures university research performance across the world. The SJTU ranking tends to reinforce the evidence that the US is well ahead of Europe in terms of cutting-edge university research.

Its rival is the ranking computed annually, since 2004, by the Times Higher Education Supplement (henceforth THES). Both of these rankings now receive worldwide attention and constitute an occasion for national governments to comment on the relative performance of their national universities.

In France, for example, the publication of the SJTU ranking is always associated with a surge of newspaper articles which either bemoan the poor performance of French universities or denounce the inadequacy of the SJTU ranking for properly assessing the attractiveness of the fragmented French higher education landscape (see Les Echos, 7 August 2008).

Whether intended by the rankers or not, university rankings have taken on a life of their own: they are used by national policy makers to stimulate debates about national university systems and can ultimately lead to specific education policy orientations.

At the same time, however, these rankings are subject to a plethora of criticism. Critics point out that the chosen indicators are mainly based on research performance, with no attempt to take into account the other missions of universities (in particular teaching), and are biased towards large, English-speaking and hard-science institutions. Whilst the limitations of the indicators underlying the THES and SJTU rankings have been extensively discussed in the relevant literature, there has been no attempt so far to examine in depth the volatility of the university ranks with respect to the methodological assumptions made in compiling the rankings.

The purpose of the JRC/Centre for Research on Lifelong Learning (CRELL) report is to fill this gap by quantifying how much university rankings depend on methodology, and to reveal whether the Shanghai ranking serves the purposes it is used for, and whether its immediate European alternative, the British THES, can do better.

To that end, we carry out a thorough uncertainty and sensitivity analysis of the 2007 SJTU and THES rankings under a plurality of scenarios in which we simultaneously activate different sources of uncertainty. The sources cover a wide spectrum of methodological assumptions (the set of selected indicators, the weighting scheme, and the aggregation method).

This implies that we deviate from the classic approach, also taken in the two university ranking systems, of building a composite indicator by a simple weighted summation of indicators. Subsequently, a frequency matrix of the university ranks is calculated across the different simulations. Such a multi-modeling approach, and the presentation of the frequency matrix rather than the single ranks, allows one to deal with the criticism, often made of league tables and ranking systems, that ranks are presented as if they were calculated under conditions of certainty, while this is rarely the case.
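To make the multi-modeling idea concrete, here is a minimal sketch in Python of how a rank frequency matrix of this kind can be assembled. All scores are invented and the three sources of uncertainty (indicator selection, weighting scheme, aggregation method) are randomized in a deliberately simplified way, so this illustrates the technique rather than reproducing the CRELL analysis itself:

```python
import numpy as np

# Invented indicator scores for 10 hypothetical universities on 4 indicators,
# each normalized to [0, 100]. The real CRELL analysis uses the 2007
# SJTU/THES indicator sets.
rng = np.random.default_rng(42)
n_unis, n_inds = 10, 4
scores = rng.uniform(0, 100, size=(n_unis, n_inds))

n_sims = 1000
# rank_freq[u, r] counts how often university u received rank r (0 = best).
rank_freq = np.zeros((n_unis, n_unis), dtype=int)

for _ in range(n_sims):
    # Uncertainty source 1: which indicators enter the composite (keep >= 2).
    keep = rng.random(n_inds) < 0.7
    if keep.sum() < 2:
        keep[:2] = True
    # Uncertainty source 2: a random, normalized weighting scheme.
    w = rng.uniform(0.5, 2.0, size=int(keep.sum()))
    w /= w.sum()
    sub = scores[:, keep]
    # Uncertainty source 3: aggregation method (linear vs. geometric).
    if rng.random() < 0.5:
        composite = sub @ w
    else:
        composite = np.prod(sub ** w, axis=1)
    ranks = (-composite).argsort().argsort()  # rank of each university
    rank_freq[np.arange(n_unis), ranks] += 1

# Each row is a university's rank distribution across scenarios: a tight row
# supports a meaningful single rank, a spread-out row does not.
print(rank_freq)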

The main findings of the report are the following. Both rankings are robust only in the identification of the top 15 performers on either side of the Atlantic, and unreliable on the exact ordering of all other institutions. Even when all twelve indicators are combined in a single framework, the space of inference is too wide for about 50 of the 88 universities we studied, and thus no meaningful rank can be estimated for those universities. Finally, the JRC report suggests that the THES and SJTU rankings should be improved along two main directions:

  • first, the compilation of university rankings should always be accompanied by a robustness analysis based on a multi-modeling approach. We believe that this could constitute an additional recommendation alongside the 16 existing Berlin Principles;
  • second, it is necessary to revisit the set of indicators, so as to enrich it with other dimensions that are crucial to assessing university performance and are currently missing.

Beatrice d’Hombres and Michaela Saisana

Ranking – in a different (CHE) way?

GlobalHigherEd has been profiling a series of entries on university rankings as an emerging industry and technology of governance. This entry has been kindly prepared for us by Uwe Brandenburg. Since 2006 Uwe has been project manager at the Centre for Higher Education Development (CHE) and CHE Consult, a think tank and consultancy focusing on higher education reform. Uwe has an MA in Islamic Studies, Politics and Spanish from the University of Münster (Germany), and an MScEcon in Politics from the University of Wales at Swansea.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Talking about rankings usually means talking about league tables. Values are calculated on the basis of weighted indicators, added up and turned into an overall figure, often indexed to 100 for the best institution and counting down from there. Moreover, in many cases entire universities are compared, and the scope of indicators is somewhat limited. We at the Centre for Higher Education Development (CHE) are highly sceptical about this approach. For more than ten years we have been running our own ranking system, one so different that some experts have argued it might not be a ranking at all – which is actually not true. Just because the Toyota Prius uses a very different technology to produce energy does not exclude it from the species of automobiles. What, then, are the differences?


Firstly, we do not believe in the ranking of entire HEIs. This is mainly because such a ranking necessarily blurs the differences within an institution. For us, the target group has to be the starting point of any ranking exercise. Thus, one can fairly argue that it does not help a student looking for a physics department to learn that university A is average overall when in fact its physics department is outstanding, its sociology appalling and the rest mediocre. It is the old problem of the man with his head in the fire and his feet in the freezer: a doctor would diagnose that the man is in a serious condition, while a statistician might claim that, on average, he is doing fine.

So instead we always rank at the subject level. And the results of the first ExcellenceRanking, which focused on the natural sciences and mathematics in European universities with a clear target group of prospective master’s and PhD students, prove the point: only four institutions excelled in all four subjects, another four in three, while most excelled in only one subject. And this was within a set of quite closely related fields.


Secondly, we do not create values by weighting indicators and then calculating an overall value. Why is that? The main reason is that any weight is necessarily arbitrary, or in other words political: the person doing the weighting decides which weight to give, and by doing so pre-decides the outcome of the ranking. You make it even worse when you then add the different values together into one overall value, because this blurs the differences between individual indicators.

Say a discipline publishes a lot but nobody reads it. If you give publications a weight of two and citations a weight of one, it will look as if the department is very strong. If you do it the other way around, it will look pretty weak. If you then add the values you make it even worse, because you blur the difference between the two performances. And those two indicators are even rather closely related. If you summarize results from research indicators together with reputation indicators, you make things entirely irrelevant.
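A tiny numerical sketch in Python, with made-up scores, shows how the choice of weights alone can reverse the order of two departments:

```python
# Made-up normalized scores: department A publishes a lot but is little read;
# department B publishes less but is widely cited.
publications = {"A": 90, "B": 50}
citations = {"A": 20, "B": 80}

def composite(dept, w_pub, w_cit):
    """Weighted-sum composite of the two indicators."""
    return w_pub * publications[dept] + w_cit * citations[dept]

for w_pub, w_cit in [(2, 1), (1, 2)]:
    a, b = composite("A", w_pub, w_cit), composite("B", w_pub, w_cit)
    winner = "A" if a > b else "B"
    print(f"pub weight {w_pub}, cit weight {w_cit}: A={a}, B={b} -> {winner} leads")

# pub weight 2, cit weight 1: A=200, B=180 -> A leads
# pub weight 1, cit weight 2: A=130, B=210 -> B leads
```

The weighting choice alone flips the league table, which is exactly the arbitrariness argued above.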

Instead, we let the indicator results stand on their own and let users decide what is important for their personal decision-making process. For example, in the classical ranking we allow users to create “my ranking”, so they can choose the indicators they want to look at, and in which order.

Thirdly, we strongly object to the idea of league tables. If the values which create the table are technically arbitrary (because of the weighting and the aggregation), the league table positions create the even worse illusion of distinct and decisive differences between places. They conjure up an impression of a difference in quality (no time or space here to argue the tricky issue of what quality might be) that is measurable to the percentage point; in other words, a qualitative, objectively recognizable, measurable difference between place number 12 and place number 15. This is normally not the case.

Moreover, small mathematical differences can create huge differences in league table positions. Take the THES QS: even in the social sciences subject cluster you find a mere difference of 4.3 points on a 100-point scale between league ranks 33 and 43. In the overall university rankings, there is a meagre 6.7-point difference between ranks 21 and 41, going down to a slim 15.3-point difference between ranks 100 and 200. That is to say, the league table positions of HEIs might differ on the underlying scale by less than a single point, or less than 1% (of an arbitrarily set figure); the table thus tells us much less than the league position suggests.

Our approach, therefore, is to create groups (top, middle, bottom) which refer to the performance of each HEI relative to the other HEIs.
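As a minimal illustration only (CHE’s actual group boundaries are derived differently, for example from the statistical reliability of the differences between institutions), such relative grouping can be sketched in Python with tercile cut-offs on invented scores:

```python
import numpy as np

# Invented indicator scores for eight hypothetical departments.
scores = np.array([78.0, 64.0, 91.0, 55.0, 60.0, 83.0, 47.0, 70.0])

# Illustrative tercile cut-offs; a department is grouped only relative to
# the other departments, never given an exact league position.
lo, hi = np.quantile(scores, [1 / 3, 2 / 3])
groups = np.where(scores >= hi, "top",
                  np.where(scores <= lo, "bottom", "middle"))
for s, g in zip(scores, groups):
    print(f"{s:5.1f} -> {g}")
```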


This means our rankings are not as easily read as the others. However, we strongly believe in the cleverness of the users. Moreover, we try to communicate at every possible level that every ranking (and therefore also ours) is based on indicators chosen by the ranking institution. Consequently, the results of a given ranking can tell you something about how an HEI performs within the framework of what the ranker thinks interesting, necessary, relevant, etc. Rankings therefore NEVER tell you who is the best, but maybe (depending on the methodology) who is performing best (or, in our case, better than average) in the aspects considered relevant by the ranker.

A small but highly relevant aspect might be added here. Rankings (in the HE system as well as in other areas of life) might suggest that a result on an indicator proves that an institution is performing well in the area measured by that indicator. Well, it does not. All an indicator does is hint that, provided the data are robust and relevant, the results give some idea of how close the gap is between the performance of the institution and the best possible result (if such a benchmark exists). The important word is “hint”, because “indicare” – from which the word “indicator” derives – means exactly this: a hint, not a proof. And in the case of many quantitative indicators, what counts as “best” or “better” is again a political decision if the indicator stands alone (e.g. are more international students better? Are more exchange agreements better?).

This is why we argue that rankings have a useful function in creating transparency if they are properly used, i.e. if users are aware of the limitations, the purpose, the target groups and the agenda of the ranking organization, and if the ranking is understood as one instrument among various others for making whatever decision is at hand in relation to an HEI (study, cooperation, funding, etc.).

Finally, modesty is perhaps what a ranker should have in abundance. Having run the ExcellenceRanking in three different phases (the initial round in 2007, a second phase with new subjects right now, and a repetition of the natural sciences just starting), I am certainly aware of one thing: however strongly we aim at being sound and coherent, and however intensely we re-evaluate our efforts, there is always the chance of missing something, of not picking an excellent institution. For the world of ranking, Einstein’s conclusion holds a lot of truth:

Not everything that can be counted counts, and not everything that counts can be counted.

For further aspects see:
http://www.che-ranking.de/cms/?getObject=47&getLang=de
http://www.che-ranking.de/cms/?getObject=44&getLang=de
Federkeil, Gero (2008) ‘Rankings and Quality Assurance in Higher Education’, Higher Education in Europe, 33, pp. 209-218.
Federkeil, Gero (2008) ‘Ranking Higher Education Institutions – A European Perspective’, Evaluation in Higher Education, 2, pp. 35-52.
Other researchers specialising in this area (and often referring to our method) include Alex Usher, Marijk van der Wende and Simon Marginson.

Uwe Brandenburg

‘University Systems Ranking (USR)’: an alternative ranking framework from EU think-tank

One of the hottest issues out there still continuing to attract world-wide attention is university rankings. The two highest profile ranking systems, of course, are the Shanghai Jiao Tong and the Times Higher rankings, both of which focus on what might constitute a world-class university and, on the basis of that, who is ranked where. Rankings are also part of an emerging niche industry. All this, of course, generates a high level of institutional, national, and indeed supranational (if we count Europe in this) angst about who’s up, who’s down, and who’s managed to secure a holding position. And whilst everyone points to the flaws in these ranking systems, the two have nevertheless managed to capture the attention and imagination of the sector as a whole. In an earlier blog entry this year GlobalHigherEd mused over why European-level actors had not managed to produce an alternative system of university rankings which might counter the hegemony of the powerful Shanghai Jiao Tong ranking (which privileges US universities) on the one hand, and act as a policy lever that Europe could pull to direct the emerging European higher education system on the other.

Yesterday The Lisbon Council, an EU think-tank (see our entry here for a profile of this influential think-tank), released what might be considered a challenge to the Shanghai Jiao Tong and Times Higher ranking schemes: a University Systems Ranking (USR), presented in its report University Systems Ranking: Citizens and Society in the Age of Knowledge. The difference between this ranking system and the Shanghai and Times rankings is that it focuses on country-level data and change, not on individual institutions.

The USR has been developed by the Human Capital Center at The Lisbon Council, Brussels (produced with support from the European Commission’s Education, Audiovisual and Culture Executive Agency), with advice from the OECD.

The report begins with the questions: why do we have university systems? What are these systems intended to do? And what do we expect them to deliver – to society, to individuals and to the world at large? The underlying message of the USR is that “a university system has a much broader mandate than producing hordes of Nobel laureates or cabals of tenure- and patent-bearing professors” (p. 6).

So how is the USR different, and what might we make of this difference for the development of universities in the future? The USR is based on six criteria:

  1. Inclusiveness – the number of students enrolled in the tertiary sector relative to the size of the population;
  2. Access – the ability of a country’s tertiary system to accept and help advance students with a low level of scholastic aptitude;
  3. Effectiveness – the ability of a country’s education system to produce graduates with skills relevant to the country’s labour market (the wage premium is the measure);
  4. Attractiveness – the ability of a country’s system to attract a diverse range of foreign students (using the top 10 source countries);
  5. Age range – the ability of a country’s tertiary system to function as a lifelong learning institution (the share of 30-39 year olds enrolled);
  6. Responsiveness – the ability of the system to reform and change, measured by the speed and effectiveness with which the Bologna Declaration has been accepted (15 of the 17 countries surveyed have accepted the Bologna criteria).

These are then applied to 17 OECD countries (all but two of them signatories of the Bologna Process). A composite ranking is produced, as well as rankings on each of the criteria. So what were the outcomes for the higher education systems of these 17 countries?

Drawing upon all six criteria, a composite USR figure is then produced. Australia is ranked 1st, the UK 2nd and Denmark 3rd, whilst Austria and Spain are ranked 16th and 17th respectively (see Table 1 below). We can also see rankings based on the specific criteria (Table 2 below).

[Table 1: composite University Systems Ranking]

[Table 2: rankings on the individual USR criteria]
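The Policy Brief passage above does not spell out how the six criterion rankings are aggregated into the composite, so the following Python sketch is only a guess at the simplest rank-based composite: average each country’s per-criterion rank and re-rank on that mean. Every number in it is invented for illustration:

```python
# Invented per-criterion ranks for three of the 17 countries across the six
# USR criteria (inclusiveness, access, effectiveness, attractiveness,
# age range, responsiveness). Averaging and re-ranking is an assumption,
# not the report's documented method.
criterion_ranks = {
    "Australia": [2, 1, 4, 1, 3, 2],
    "UK": [5, 3, 1, 2, 6, 1],
    "Denmark": [1, 6, 3, 5, 1, 4],
}

mean_rank = {c: sum(r) / len(r) for c, r in criterion_ranks.items()}
for pos, country in enumerate(sorted(mean_rank, key=mean_rank.get), start=1):
    print(pos, country, round(mean_rank[country], 2))

# 1 Australia 2.17
# 2 UK 3.0
# 3 Denmark 3.33
```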

There is much to be said for this intervention by The Lisbon Council – not least that it opens up debates about the role and purposes of universities. Over the past few months there have been numerous heated public interventions on this matter – from whether universities should be little more than giant patenting offices to whether they should be managers of social justice systems.

And though there are evident shortcomings (such as the lack of clarity about what might count as a university; the assumption that a university-based education is the most suitable form of education for producing a knowledge-based economy and society; the absence of attention to the equity/access range within any one country, and so on), the USR does at least place issues like ‘lifelong learning’, ‘access’ and ‘inclusion’ on the reform agenda for universities across Europe. It also signals a set of values that are not currently reflected in the two key ranking systems, and that The Lisbon Council would like to advance.

However, the big question now is whether universities will see value in this kind of ranking system for its wider systemic, as opposed to institutional, possibilities – even if only as a basis for discussing what universities are for and how we might produce more equitable knowledge societies and economies.

Susan Robertson and Roger Dale

Strategic actors in the Eurolandscape: meet ‘The Lisbon Council’

Earlier this week we posted an entry on a new European Commission ‘Communication’ – a Strategic Framework for International Science and Technology Cooperation.

In working up this entry it became clear to us that some of the state-crafting language used to describe different stages of the policy process in the construction of Europe needed decoding to enable the reader to assess the relative importance of particular initiatives. For example, what is a Communication? What is its status? Who is it addressed to? And so on. While the point seems obvious, namely that the lexicon used to describe aspects of the policy process differs around the globe, finding a web-link with an adequate explanation was quite a different matter.

So when today’s Policy Brief on the University Systems Ranking from The Lisbon Council hit cyberspace (we’ll profile the briefing tomorrow), it seemed that here, too, was another instance when names and terms could be rather confusing. The tight linking of the idea of ‘Lisbon’ to ‘Council’ tends to suggest that this organisation is one of a number of European bodies that make up the official governing structure of Europe. However, this is not the case.

So, who are they, and how does The Lisbon Council fit into the Eurolandscape of policymaking? This is the first in a series of posts where we introduce key strategic actors involved in constituting and governing higher education within Europe and beyond.

The Lisbon Council – or, more properly, The Lisbon Council for Economic Competitiveness and Social Renewal – is an independent think-tank and policy network created in 2003 to advance the now famous Lisbon 2000 Agenda of making Europe “…the most dynamic, globally competitive, knowledge-based economy in the world…”.

According to their website, The Lisbon Council, whose tag line is ‘making Europe fit for the future’, is committed to

…defining and articulating a mature strategy for managing current and future challenges. Above all, we are seeking strategies based on inclusion, opportunity and sustainability that will make the benefits of modernisation available to all our citizens.

Our network – concerned citizens, top economists, public figures, NGO leaders, business strategists and leading-edge thinkers – lends its energy, brain power and dedication to solving the great economic and social challenges of our times. At the centre of our activities are solution-oriented seminars, thought-provoking publications, media appearances and public advocacy.

We can get a sense of the kind of strategic thinking The Lisbon Council advocates to realize a globally competitive Europe by looking at its projects (including the Human Capital Center), publications, Founding Fathers Lecture Series, and YouTube presence.

Four ‘founding fathers’ are identified for the Lecture Series as representing Europe’s innovative, visionary past: the Robert Schuman Lecture (the French politician regarded as a founder of the EU), the Ludwig Erhard Lecture (the German politician who presided over the post-war German recovery), the Jean-Jacques Rousseau Lecture (the Enlightenment philosopher), and the Guglielmo Marconi Lecture (the Italian inventor).

This year the Guglielmo Marconi Lecture, which we feature below, was delivered by Charlie Leadbeater – well known for his work with the UK-based think-tank Demos. Leadbeater’s lecture engages with the Commission’s 2009 theme, creativity and innovation.

Now, the important thing to point out is that The Lisbon Council’s think-tank agenda articulates closely with the ‘new Lisbon Agenda’, launched in 2005 to reorient and reinvigorate the Lisbon 2000 agenda. It is at this point that we see the European Commission’s engagement with globalization as an outward-looking strategy, the move toward supply-side economics, the prioritization of human capital strategies, greater questioning of Social Europe policies, and a commitment to press ahead with the reform of Member States’ higher education systems to make a European higher education system. These commitments have been repeatedly reinforced by European Commission President José Manuel Barroso, as we see in his speech to The Lisbon Council earlier this year.

In following European policymaking in higher education, it is therefore important to look closely at organizations like The Lisbon Council and the kind of futures-thinking/policy-shaping work they are engaged in as part of the wider governance of European higher education.

Susan Robertson

US/Turkish collaborations: bringing vocational schools into the global education sector

In the past three years I’ve had the great opportunity to give invited lectures, teach a graduate summer school course, and run research workshops at Bogazici University in Istanbul, Turkey.

This has been a wonderful occasion for me to listen to, and engage with, lively and committed scholars and students around processes of globalization, Turkey’s application to the EU for accession, and the geo-strategic role of Turkey situated as it is between Asia and Europe.

So it was with great interest that I read, in the Observatory on Borderless Higher Education’s (OBHE) latest bulletin, that Turkey had signed a deal with the US-based Community Colleges for International Development, Inc. – An Association for Global Education – to put into place an exchange between US and Turkish vocational schools.

The OBHE report was based on a lead article carried in the World Bulletin. For the Turkish Higher Education Council (YÖK), these collaborative partnerships will be instituted in 7-8 Turkish vocational schools in an attempt to improve their curricula.

According to the Chairperson of YÖK, Professor Yusuf Ziya Ozcan:

Vocational schools are the engines of our economy. If these schools train the work force needed by our economy and industry, most of the problems in Turkey will be solved. If we can guide some of our high-school graduates to get further education at vocational schools instead of universities, this will diminish the crowds waiting at the doors of universities as well.

Operationalizing the program means that Turkish students would spend their first year in Turkey and get their second-year education at a U.S. vocational school, whilst US students would have a chance to spend a year in Turkey.

But, why the US and not Europe, as a model for vocational education? And why build student mobility into a vocational school program?

According to Professor Ozcan:

…the best thing to do on this issue was to get support from a country where vocational education system functioned smoothly, and therefore, they decided to pay a visit to USA.

This move by the Turkish Higher Education Council to collaborate on vocational education might be read in a number of ways. For instance, Turkey’s education system has historically had close links to the US, particularly through its (former) private schools and universities. This is thus business as usual, only applied to a different sector – vocational schools.

Turkey is also a popular destination for US students studying abroad as part of their undergraduate program (see Kavita Pandit’s entry on dual degree programs between Turkey and SUNY/USA). The university residence where I stayed whilst teaching at Bogazici in 2007 was buzzing with undergraduate students from the US. Thus, this new exchange initiative might be viewed as further strengthening existing ties along already established channels.

Adding a component of student mobility to vocational education in Turkey might make the sector more attractive to prospective students, whilst generating the kind of knowledge and demeanour that global firms think important in their intermediary labor force. This would give Turkey’s intermediate-level workers a competitive advantage in the churn for flexible skilled labor in the global economy.

This deal can also be read as the outcome of an ambivalence on the part of Turkey and its education institutions toward Europe and its regionalizing project, and vice versa. And while there are serious moves in Turkish universities toward implementing Europe’s Bologna Process in higher education, it seems Turkey – like a number of countries around the world – is weighing up its response to the globalizing education models in circulation so as to keep a foot in both camps: the USA and Europe.

Susan Robertson