Deterritorializing academic freedom: reflections inspired by Yale-NUS College (and the London Eye)

To what degree is academic freedom being geographically unsettled – deterritorialized, more accurately – in the context of the globalization of higher education? This was one of the issues I was asked about a few days ago when I spoke to a class of New York-based Columbia University students about the globalization of higher education, with a brief case study about Singapore’s global higher education hub development agenda. Some of the students were intrigued by the debate erupting (again) about Yale’s involvement in Yale-NUS College.

Given that we only had limited time to discuss these issues, I’ve outlined elements of the comments I would have added had there been a little more time during the Q&A session. And clearly there is much more to say about these issues than what is outlined below, but I’ve got grading to attend to, so this entry will have to suffice. If any of you (the students) have more questions, please email me anytime. I’m obviously making this follow-up public on a weblog as well, as it fits into the broad themes covered in GlobalHigherEd.

The first point I wanted to reinforce is that it is important to recognize that Singapore’s attempt to become the ‘Boston of the East’ is underlain by structural change in Singapore’s economy, and a related perception that a ‘new breed’ of Singaporean is needed.* Implementation of the global education hub development agenda is therefore dependent upon the exercise of statecraft and the utilization of state largesse. For example, nothing would be happening in Singapore regarding the presence of foreign universities were it not for shifts in how the state engages with foreign actors, including universities like Yale and Chicago. The opening up of territory to commercial presence (to use GATS parlance) as well as ‘deep partnerships,’ and the myriad ideological/regulatory/policy shifts needed to draw in foreign universities, have been evident since 1998. In an overall sense, this development agenda is designed to help reshape society and economy, while discursively branding (it is hoped) Singapore as one of the ‘hotspots’ in the city-region archipelago which fuels, and profits from, the global knowledge economy.

Second, and as I noted on Thursday in my lecture, one of the three key post-1998 realignments is an enhanced acceptance of academic freedom in Singapore (in comparison to the 1980s and early 1990s). This is a point that was made in a 2005 chapter* I co-authored with Nigel Thrift and the same conditions exist today as far as I am concerned (though I do not speak for Nigel Thrift here).

In Singapore over the last decade-plus, local universities have acquired considerably more autonomy regarding governance matters; faculty now have historically unprecedented freedom to shape curricula and research agendas; and students have greater freedom to express themselves, even taking on ruling politicians in campus fora from time to time. I personally believe that the practice of academic freedom is alive and well in Singapore for the most part, and that critics of (for example) the Yale-NUS venture would be fools to assume this is a Southeast Asian Soviet-era Czechoslovakia: Singapore is far more sophisticated, advanced, and complex than this. Universities like the National University of Singapore and Singapore Management University are full of discussion, politically tinged banter, illuminating discussions in classes, and vigorous debates mixed with laughter over lunches at the many campus canteens. There is no difference between the debates I had about politics in Singapore when I worked there for four years (1997-2001), and when visiting since, and those I have had at the ‘Berkeley of the Midwest’ (UW-Madison) from 2001 on, or at my alma maters in Canada (University of British Columbia) and England (University of Bristol). The relatively cosmopolitan and young (age-wise) nature of the faculty base in Singapore also indirectly engenders some forms of open-mindedness that are absent from more established (and sometimes smug) centers of scholarship.

This said, Singapore is a highly charged ‘soft authoritarian’ political milieu: if certain conditions come together regarding the focus and activities of a faculty member (or indeed anyone else in Singapore, be they expatriates, permanent residents, or long-term citizens), a strong state guided by political elites has much room to maneuver – legally, administratively, procedurally, symbolically – in comparison to most other developed countries. In this kind of context, a focused form of ‘calibrated coercion’ can be exercised, if so desired, and an analyst’s life can be made very difficult despite the general practice of academic freedom on an average day-to-day basis. There are discussions about ‘OB markers’ (out-of-bounds markers) regarding certain topics, some forms of self-censorship regarding work on select themes, and perhaps a raised eyebrow when CVs come in with Amnesty International volunteer experience listed on them. And at a broader scale, the Public Order Act regulates ‘cause-related’ activities that “will be regulated by permit regardless of the number of persons involved or the format they are conducted in.” Even before the rules were tightened via the 2009 Public Order Act, I recall stumbling upon a ruckus (while desperately searching for a post-lunch coffee, circa 1999-2000) when I witnessed police removing the leader of the opposition from the grounds of the National University of Singapore after he attempted to give an impromptu (I assume) speech to students below the main library.

In short, Singapore is a complicated place, and one needs to work hard to understand the complexities and nuances that exist. Blind naïveté (often facilitated by temptingly high salaries and a lack of regional knowledge) is as bad as cynical critique: like all places (Singapore and the US included), there are many shades of grey in our actually existing world. On this point it is worth quoting the ever insightful Cherian George:

Singapore is not for everyone. Compared with countries at a similar income level, it is backward in the inclusiveness it offers to people with disabilities. It is a relatively safe country for families – but an innocent person who is wrongly suspected of a crime has more reason to fear in Singapore than in countries that treat more seriously the rights of the accused. And those who care enough for their society to stand up and criticise it have to be prepared to be treated as an opponent by an all-powerful government, enduring harassment and threats to their livelihoods. Being a writer immersed in Singapore has not blinded me to the system’s faults. But, one common form of critique in which I find myself unable to indulge is caricature, reducing Singapore to a society ruled by a monolithic elite, served by a uniformly pliant media, and populated by lobotomised automatons. Such essentialised accounts of government, media and people may satisfy the unengaged, but they generate too much cognitive dissonance for me. The Singapore I know – like any human society – is diverse and complex…

So, in my view, the practice of academic freedom in Singapore is alive and well on a number of levels, but there are always significant political sinkholes that might open up; you just never know…

Academic freedoms & the Singapore Eye

But what are the many foreign universities with a presence in this Southeast Asian city-state doing about academic freedom? In particular, if there is a lack of clarity about the nature of academic freedom given that guidelines are not codified, rules are unclear, and there appear to be no formalized procedures for dealing with serious contests, do foreign universities just accept the same conditions local faculty and students cope with?

The answer is a clear and resolute “no,” at least for highly respected universities like Chicago, Cornell, Duke, and Yale. Rather, what they do about academic freedom depends upon the outcome of negotiations between each of these foreign universities and the Singaporean state (sometimes in conjunction with local partner universities).

One of the more intriguing things about the development process is that most of the foreign universities that have engaged with the Singaporean state have developed what are effectively bilateral understandings of academic freedom. As I noted in 2005:**

Yet despite the influx of a significant number of American and European universities in response to the emergence of these new socio-economic development objectives, the concept of academic freedom, one of the underlying foundations of world class university governance systems, has received remarkably limited discussion and debate. The discussions and negotiations about the nature of academic freedom vis a vis Singapore’s global schoolhouse have been engaged with in a circumscribed and opaque manner. Deliberations have primarily taken the form of closed negotiation sessions between senior administrators representing foreign universities, and officials and politicians representing the Singaporean state. The agreements that have been made are verbal for the most part, though they have also been selectively inscribed in the confidential Memorandum of Understandings (MOUs) and Agreements that have been signed between the Government of Singapore or local universities and the foreign universities in question. Strands of the concept have been brought over by the foreign universities, reworked during negotiations, and constituted in verbal and sometimes confidential textual form. The development of a series of case-by-case conceptualizations of academic freedom is hybridizing in effect. Through verbal agreements of unique forms, and through MOUs and Agreements of unique forms, foreign universities and the Singaporean state have splintered academic freedom in unique ways, unsettling previous notions of academic freedom in quite significant and hitherto unexamined ways.

See, for example, pages 6-8 of a Yale-issued summary of the agreement it reached with the Singaporean state.

Leaving aside the content of this message from Yale’s president, it is important to stand back and reflect on what is going on. In my opinion what we’re seeing is the creation of a strategically delineated understanding of academic freedom; one specified by just two parties in this case, and one that applies in a narrowly circumscribed geographic context (the Yale-NUS campus).

But think about the patterns here. Who is at the center of this aspect of the development process? It is the Singaporean state, including senior politicians such as the minister of education, the deputy prime minister, the prime minister, and in the Yale case senior leadership at NUS.

Much like the London Eye, a myriad of universities work with the Singaporean state on this issue at a bilateral (case by case) level. Given this, no one knows more about how academic freedom can be negotiated and framed than the people at the center of the negotiation dynamic. A Singapore Eye of sorts (if this admittedly awkward analogy makes sense!) regarding academic freedom exists. A less obtuse analogy might be a hub (the state) and spoke (multiple foreign universities) one. And the outcome is a plethora of differentially shaped academic freedoms in Singapore, scattered across the city-state in association with the foreign universities, shorn as far as I can tell from much of the context local universities (and their academics) are embedded in.

In the global higher ed context, this pattern is not unique to Singapore. The same case could be made regarding Qatar, Abu Dhabi, Dubai, and China (albeit to a lesser degree). What is noteworthy is that the current experts regarding the globalization of academic freedom are monarchs and political elites associated with ruling regimes, not people in the higher education sector, who remain too focused on their own institutional agendas.

Another interesting aspect of this development process is the absence of any form of collective representation regarding academic freedom in these hubs. Universities (e.g., Yale, Cornell, MIT, NYU, Texas A&M) active in global higher education hubs informally share information, to be sure, but their capacity to share information, and model practices, depends on proactive and savvy administrators who know what to think about, what to ask about, and where the ‘non-negotiable’ line should be drawn.

Once they forge their agreements with the state in these hubs, they move on to the implementation phase. And then one, three, or five years later in comes a new university, and the pattern starts afresh (and a new spoke is added). But the lines connecting the foreign universities are thin. For example, it is worth considering how many of the recent negotiations about academic freedom in Singapore have been informed by a critical analysis of the pros and cons of the University of Warwick’s deliberations about opening a branch campus in Singapore circa 2005, including Dr. Thio Li-ann‘s substantial report about academic freedom in Singapore.

In conclusion: on absence vs presence

Well, I’ve gone on longer than I expected. But I want to close by asking you (Columbia U students) to think about absence as much as presence. I’ve often encouraged my own students to think about this aspect of development, for while we can recognize and focus on presence, absence matters just as much. Absence is itself a phenomenon associated with the development process: it is often desired, or it can exist as an outcome of a lack of capability, planning, and resources.

One thing that appears to be absent in Singapore as a whole is a set of codified rules and guidelines about academic freedom: how it is defined, what its limitations are, and what its value is to higher education institutions. Interestingly, you find all sorts of statements about the presence of academic freedom in Singapore, much like the ones I made above, or the ones put forward yesterday by Simon Chesterman (see ‘Academic freedom in New Haven and S’pore,’ The Straits Times, 30 March 2012). [Professor Chesterman is Dean of the NUS Law School. Given that he is a law professor, and also son-in-law of the architect of Singapore’s global schoolhouse development strategy (Dr. Tony Tan Keng Yam, Singapore’s current President), his views are worth taking note of.] But statements and op-eds are just that: they do not, on their own, create formally demarcated and secure spaces for researchers and students. What arguably helps realize and ground academic freedom are legible guidelines, codified procedures in case of contest, laws, and symbolic affirmations of value such as this plaque I walked past this morning.

Statements, even by important officials and members of the elite, do not beget confidence about the importance of academic freedom; hence the desire of universities like Yale and Cornell to act, and to codify, on a bilateral basis.

One of the more curious aspects of this ongoing debate about academic freedom is that Singapore has a reputation as a place that respects the rule of law, and it has a formidable reputation for the quality and clarity of regulation regarding key industries (e.g., finance). Yet the guidelines and regulations associated with the space to produce and circulate information and knowledge via universities situated in Singaporean territory remain limited, in my opinion. Why, especially when academic freedom keeps emerging as a concern of global actors like Yale (circa 2011-2012), Warwick (circa 2005), etc.? And why, when a knowledge economy and society is just that: one dependent upon the sometimes unruly production of valuable forms of knowledge?

Is bilateralism regarding academic freedom really the most effective approach? I’m not so sure, for what it appears to do is provide fuel for debates, such as the one unfolding at Yale right now. Absence on this core issue (academic freedom) in Singapore as a whole is arguably providing fuel for the fire. Thus while some Yalies (is that what they are called?) seem to be disseminating remarkably unsophisticated understandings of how academia and politics work in Singapore, I would argue that the Singaporean authorities have created an opaque regulatory and discursive context regarding academic freedom vis-à-vis the production and circulation of knowledge. And as anyone who works on economic development knows, uncertainty is a problematic factor that can inhibit or skew the development process, partly because of misinterpretations.

A second absence is a global-scale mechanism to ensure that the core principles associated with academic freedom are protected and realized, as best they can be, for the global community of scholars of which we are all (in Singapore, in New Haven, in Qatar, in Madison) a part. The long history of academic freedom is intertwined with the emergence of enlightenment(s), modernity(ies), and the associated development of societies and economies. Academic freedom helps create the space for the search for truth, the unfettered production and circulation of knowledge, and socio-economic innovation. But academic freedom has to be practiced, protected, codified, and realized, including while it is being globalized. The bilateralism evident in places like Singapore is inadequate in that the ‘foreign’ universities engaged in it are thinking only of themselves, not of the global ecumene and community of universities. It is surely time for them to exercise some global leadership on such a core and foundational principle; one that has helped these universities become what they are.

Great to meet you all last Thursday. Be sure to think hard about this issue, gather diverse perspectives as any good student should, and feel 100% free to disagree with me. And I hope next week’s discussions go well!

Kris Olds

* Olds, K., and Thrift, N. (2005) ‘Cultures on the brink: reengineering the soul of capitalism – on a global scale’, in A. Ong and S. Collier (eds.) Global Assemblages: Technology, Politics and Ethics as Anthropological Problems, Oxford: Blackwell, pp. 270-290.

** Olds, K. (2005) ‘Articulating agendas and traveling principles in the layering of new strands of academic freedom in contemporary Singapore’, in B. Czarniawska and G. Sevón (eds.) Where Translation is a Vehicle, Imitation its Motor, and Fashion Sits at the Wheel: How Ideas, Objects and Practices Travel in the Global Economy, Malmö: Liber AB, pp. 167-189.

CHERPA-network based in Europe wins tender to develop alternative global ranking of universities


Finally, the decision on who has won the European Commission’s million-euro tender – to develop and test a global ranking of universities – has been announced.

The successful bidder – the CHERPA network (the Consortium for Higher Education and Research Performance Assessment) – is charged with developing a ranking system to overcome what the European Commission regards as the limitations of the Shanghai Jiao Tong and the QS-Times Higher Education schemes. The final product is to be launched in 2011.

CHERPA comprises a consortium of leading European institutions in the field; all have been developing and offering rather different approaches to ranking over the past few years (see our earlier stories here, here and here for some of the potential contenders):

Will this new European Commission-driven initiative set the proverbial European cat amongst the Transatlantic alliance pigeons?

As we have noted in earlier commentary on university rankings, the different approaches tip the rankings playing field in the direction of different interests. Much to the chagrin of the continental Europeans, the high-status US universities do well on the Shanghai Jiao Tong University Ranking, whilst Britain’s QS-Times Higher Education tends to see UK universities feature more prominently.

CHERPA will develop a design that follows the so-called ‘Berlin Principles on the ranking of higher education institutions‘. These principles stress the need to take the linguistic, cultural and historical contexts of educational systems into account [something of an irony for those watchers following UK higher education developments last week, after a Cabinet reshuffle in which the reference to ‘universities’ in the departmental name was dropped. The two-year-old Department for Innovation, Universities and Skills has now been abandoned in favor of a mega-Department for Business, Innovation and Skills! (read more here)].

According to the website of one of the consortium members, CHE:

The basic approach underlying the project is to compare only institutions which are similar and comparable in terms of their missions and structures. Therefore the project is closely linked to the idea of a European classification (“mapping”) of higher education institutions developed by CHEPS. The feasibility study will include focused rankings on particular aspects of higher education at the institutional level (e.g., internationalization and regional engagement) on the one hand, and two field-based rankings for business and engineering programmes on the other hand.

The field-based rankings will each focus on a particular type of institution and will develop and test a set of indicators appropriate to these institutions. The rankings will be multi-dimensional and will – like the CHE ranking – use a grouping approach rather than simplistic league tables. In contrast to existing global rankings, the design will compare not only the research performance of institutions but will include teaching & learning as well as other aspects of university performance.

The different rankings will be targeted at different stakeholders: They will support decision-making in universities and especially better informed study decisions by students. Rankings that create transparency for prospective students should promote access to higher education.

University World News, in its report out today on the announcement, notes:

Testing will take place next year and must include a representative sample of at least 150 institutions with different missions in and outside Europe. At least six institutions should be drawn from the six large EU member states, one to three from the other 21, plus 25 institutions in North America, 25 in Asia and three in Australia.

There are multiple logics and politics at play here. On the one hand, a European ranking system may well give the European Commission more HE governance capacity across Europe, strengthening its steering of national systems in areas like ‘internationalization’ and ‘regional engagement’ – two key areas that have been identified for work to be undertaken by CHERPA.

On the other hand, this new European ranking system – when realized – might also appeal to countries in Latin America, Africa and Asia that currently do not feature in any significant way in the two dominant systems. Like the Bologna Process, the CHERPA ranking system might well find itself generating ‘echoes’ around the globe.

Or will regions around the world prefer to develop and promote their own niche ranking systems, elements of which were evident in the recently launched QS.com Asia ranking? Whatever the outcome, as we have observed before, there is a thickening industry, with profits to be had, on this aspect of the emerging global higher education landscape.

Susan Robertson

Ranking – in a different (CHE) way?

GlobalHigherEd has been profiling a series of entries on university rankings as an emerging industry and technology of governance. This entry has been kindly prepared for us by Uwe Brandenburg. Since 2006 Uwe has been a project manager at the Centre for Higher Education Development (CHE) and CHE Consult, a think tank and consultancy focusing on higher education reform. Uwe has an MA in Islamic Studies, Politics and Spanish from the University of Münster (Germany), and an MScEcon in Politics from the University of Wales at Swansea.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Talking about rankings usually means talking about league tables. Values are calculated from weighted indicators, turned into figures, summed, and formed into an overall value, often indexed to 100 for the best institution and counting down from there. Moreover, in many cases entire universities are compared, and the scope of indicators is somewhat limited. We at the Centre for Higher Education Development (CHE) are highly sceptical about this approach. For more than 10 years we have been running our own ranking system, which is so different that some experts have argued it might not be a ranking at all – which is actually not true. Just because the Toyota Prius uses a very different technology to produce energy does not exclude it from the species of automobiles. What, then, are the differences?
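To make the league-table arithmetic concrete, here is a minimal sketch of the conventional approach described above, written in Python. Every institution, indicator, and weight below is invented for illustration; the point is only how a single composite figure, indexed to 100 for the best performer, is manufactured from arbitrary weights.

```python
# Sketch of conventional league-table arithmetic: weighted indicators are
# summed into a composite score, then indexed to 100 for the best institution.
# All institutions, indicators, and weights below are invented.

indicators = {"publications": 0.4, "citations": 0.4, "reputation": 0.2}

institutions = {
    "Univ A": {"publications": 80, "citations": 90, "reputation": 70},
    "Univ B": {"publications": 95, "citations": 60, "reputation": 85},
    "Univ C": {"publications": 70, "citations": 75, "reputation": 90},
}

def composite(scores):
    # Weighted sum across all indicators.
    return sum(weight * scores[name] for name, weight in indicators.items())

raw = {inst: composite(scores) for inst, scores in institutions.items()}
best = max(raw.values())

# Rescale so the top institution indexes at 100, "counting down" from there.
indexed = {inst: round(100 * value / best, 1) for inst, value in raw.items()}

for inst, value in sorted(indexed.items(), key=lambda kv: -kv[1]):
    print(inst, value)
```

Note that the ordering produced here is entirely an artifact of the chosen weights, which is precisely the objection developed in the paragraphs that follow.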


Firstly, we do not believe in the ranking of entire HEIs. This is mainly because such a ranking necessarily blurs the differences within an institution. For us, the target group has to be the starting point of any ranking exercise. Thus, one can fairly argue that it does not help a student looking for a physics department to learn that university A is average when in fact the physics department is outstanding, the sociology appalling, and the rest mediocre. It is the old problem of the man with his head in the fire and his feet in the freezer: a doctor would diagnose that the man is in a serious condition, while a statistician might claim that overall he is doing fine.

So instead we always rank at the subject level. And given the results of the first ExcellenceRanking, which focused on natural sciences and mathematics in European universities with a clear target group of prospective Master’s and PhD students, we think this proves the point: only four institutions excelled in all four subjects; another four in three; while most excelled in only one subject. And this was in a quite closely related field.


Secondly, we do not create values by weighting indicators and then calculating an overall value. Why is that? The main reason is that any weight is necessarily arbitrary – in other words, political. The person doing the weighting decides which weight to give, and by doing so pre-decides the outcome of any ranking. You make it even worse when you then add the different values together and create one overall value, because this blurs the differences between individual indicators.

Say a discipline is publishing a lot but nobody reads it. If you give publications a weight of two and citations a weight of one, the department will look very strong. If you weight them the other way around, it will look pretty weak. If you then add the values, you make it even worse, because you blur the difference between the two performances. And those two indicators are even rather closely related. If you summarize results from research indicators together with reputation indicators, you make things entirely irrelevant.
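The weighting problem just described can be demonstrated in a few lines. In this sketch (hypothetical departments, invented figures) the same two departments swap rank order purely because the analyst swaps the weights on publications and citations:

```python
# Two hypothetical departments: X publishes a lot but is rarely cited,
# Y publishes less but is widely read. All figures are invented.
departments = {
    "Dept X": {"publications": 90, "citations": 40},
    "Dept Y": {"publications": 50, "citations": 80},
}

def score(dept, w_pub, w_cit):
    # Weighted sum of the two indicators for one department.
    d = departments[dept]
    return w_pub * d["publications"] + w_cit * d["citations"]

# Weight publications twice as heavily as citations: X comes out on top.
ranking_a = sorted(departments, key=lambda d: -score(d, 2, 1))

# Swap the weights: Y comes out on top.
ranking_b = sorted(departments, key=lambda d: -score(d, 1, 2))

print(ranking_a)
print(ranking_b)
```

Nothing about either department changed between the two rankings; only the analyst's weights did, which is exactly the sense in which any weight is a political decision.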

Instead, we let the indicator results stand on their own and let the user decide what is important for his or her personal decision-making process. For example, in the classical ranking we allow users to create “my ranking”, so they can choose the indicators they want to look at, and in which order.

Thirdly, we strongly object to the idea of league tables. If the values which create the table are technically arbitrary (because of the weighting and the accumulation), the league table positions create the even worse illusion of distinctive and decisive differences between places. They create the impression of a difference in quality (no time or space here to argue the tricky issue of what quality might be) which is measurable to the percentage point – in other words, of a qualitative, objectively recognizable, measurable difference between place number 12 and place number 15. This is normally not the case.

Moreover, small mathematical differences can create huge differences in league table positions. Take the THES-QS: even in the SocSci subject cluster you find a mere difference of 4.3 points on a 100-point scale between league ranks 33 and 43. In the overall university rankings, there is a meager 6.7-point difference between ranks 21 and 41, going down to a slim 15.3-point difference between ranks 100 and 200. That is to say, the scores of HEIs might differ by much less than a single point, or less than 1% (of an arbitrarily set figure); the table thus tells us much less than the league position suggests.

Our approach, therefore, is to create groups (top, middle, bottom) which reflect the performance of each HEI relative to the other HEIs.
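As a rough illustration of this grouping idea (a simple tercile split on invented scores, not CHE's actual statistical method, which is more sophisticated), one can sort an indicator's results into groups instead of assigning league positions:

```python
# Invented indicator scores for six hypothetical HEIs; note how small
# some of the gaps between adjacent institutions are.
scores = {
    "HEI 1": 71.2, "HEI 2": 70.9, "HEI 3": 65.0,
    "HEI 4": 64.7, "HEI 5": 55.1, "HEI 6": 54.8,
}

ordered = sorted(scores, key=scores.get, reverse=True)
n = len(ordered)

groups = {}
for i, hei in enumerate(ordered):
    if i < n / 3:
        groups[hei] = "top"
    elif i < 2 * n / 3:
        groups[hei] = "middle"
    else:
        groups[hei] = "bottom"

# HEIs separated by a fraction of a point share a group instead of
# occupying visibly different league ranks.
print(groups)
```

HEI 1 and HEI 2, 0.3 points apart, land in the same top group rather than on distinct ranks that would suggest a decisive difference in quality.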


This means our rankings are not as easily read as the others. However, we strongly believe in the cleverness of the users. Moreover, we try to communicate at every possible level that every ranking (and therefore also ours) is based on indicators which are chosen by the ranking institution. Consequently, the results of the respective ranking can tell you something about how an HEI performs in the framework of what the ranker thinks interesting, necessary, relevant, etc. Rankings therefore NEVER tell you who is the best, but maybe (depending on the methodology) who is performing best (or, in our case, better than average) in aspects considered relevant by the ranker.

A small but highly relevant aspect might be added here. Rankings (in the HE system as well as in other areas of life) might suggest that a result in an indicator proves that an institution is performing well in the area measured by that indicator. Well, it does not. All an indicator does is hint that, provided the data are robust and relevant, the results give some idea of how close the gap is between the performance of the institution and the best possible result (if such a benchmark exists). The important word is “hint”, because “indicare” – from which the word “indicator” derives – means exactly this: a hint, not a proof. And in the case of many quantitative indicators, what counts as “best” or “better” is again a political decision if the indicator stands alone (e.g., are more international students better? Are more exchange agreements better?).

This is why we argue that rankings have a useful function in terms of creating transparency if they are properly used, i.e. if the users are aware of the limitations, the purpose, the target groups and the agenda of the ranking organization, and if the ranking is understood as one instrument among various others fit to inform whatever decision is to be made in relation to an HEI (study, cooperation, funding, etc.).

Finally, modesty is maybe what a ranker should have in abundance. Having run the ExcellenceRanking through three different phases (the initial round in 2007, a second phase with new subjects right now, and a repetition of the natural sciences just starting), I am certain of one thing: however strongly we aim at being sound and coherent, and however intensely we re-evaluate our efforts, there is always the chance of missing something, of not picking an excellent institution. For the world of ranking, Einstein’s conclusion holds a lot of truth:

Not everything that can be counted counts, and not everything that counts can be counted.

For further aspects see:
http://www.che-ranking.de/cms/?getObject=47&getLang=de
http://www.che-ranking.de/cms/?getObject=44&getLang=de
Federkeil, Gero (2008) ‘Rankings and Quality Assurance in Higher Education’, Higher Education in Europe, 33, pp. 209-218
Federkeil, Gero (2008) ‘Ranking Higher Education Institutions – A European Perspective’, Evaluation in Higher Education, 2, pp. 35-52
Other researchers specialising in this area (and often referring to our method) include Alex Usher, Marijk van der Wende and Simon Marginson.

Uwe Brandenburg

University institutional performance: HEFCE, UK universities and the media

This entry has been kindly prepared by Rosemary Deem, Professor of Sociology of Education, University of Bristol, UK. Rosemary’s expertise and research interests are in the areas of higher education, managerialism, governance, globalization, and organizational cultures (student and staff).

Prior to her appointment at Bristol, Rosemary was Dean of Social Sciences at the University of Lancaster. Rosemary has served as a member of ESRC Grants Board 1999-2003, and Panel Member of the Education Research Assessment Exercise 1996, 2001, 2008.

GlobalHigherEd invited Rosemary to respond to one of the themes (understanding institutional performance) in the UK’s Higher Education Debate aired by the Department for Innovation, Universities and Skills (DIUS) over 2008.

~~~~~~~~~~~~~~

Institutional performance of universities and their academic staff and students is a very topical issue in many countries – for potential students and their families and sponsors, for governments and for businesses. As well as numerous national rankings, two annual international league tables in particular are the focus of much government and institutional interest: the Shanghai Jiao Tong, developed for the Chinese government to benchmark its own universities, and the commercial Times Higher top international universities listing. Universities vie with each other to appear in the top rankings of so-called world-class universities, even though the quest for world-class status has negative as well as positive consequences for national higher education systems (see here).

International league tables often build on metrics that are themselves international (e.g. publication citation indexes) or use proxies for quality such as the proportions of international students or staff/student ratios, whereas national league tables tend to develop their own criteria, as the UK Research Assessment Exercise (RAE) has done and as its planned replacement, the Research Excellence Framework, is intended to do.

In March 2008, John Denham, Secretary of State for (the Department of) Innovation, Universities and Skills (or DIUS), commissioned the Higher Education Funding Council for England (HEFCE) to give some advice on measuring institutional performance. Other themes on which the Minister commissioned advice, and which will be reviewed on GlobalHigherEd over the next few months, were: On-Line Higher Education Learning; Intellectual Property and research benefits; the demographic challenge facing higher education; Research Careers; Teaching and the Student Experience; Part-time Studies and Higher Education; Academia and public policy making; and International Issues in Higher Education.

Denham identified five policy areas for the report on ‘measuring institutional performance’ that is the concern of this entry, namely: research, enabling business to innovate and engagement in knowledge transfer activity, high quality teaching, improving work force skills and widening participation.

This list could be seen as a predictable one since it relates to current UK government policies on universities and strongly emphasizes the role of higher education in producing employable graduates and relating its research and teaching to business and the ‘knowledge economy’.

Additionally, HEFCE already has quality and success measures, as well as surveys such as the National Student Survey of all final-year undergraduates, for everything except workforce development. The five areas are a powerful indicator of what government thinks the purposes of universities are, which is part of a much wider debate (see here and here).

On the other hand, the list is interesting for what it leaves out: higher education institutions and their local communities (which is not just about servicing business), universities’ provision for supporting the learning of their own staff (since they are major employers in their localities), and the relationship between teaching and research.

The report makes clear that HEFCE wants to “add value whilst minimising the unintended consequences” (p. 2), would like to introduce a code of practice for the use of performance measures, and does not want to introduce more official league tables in the five policy areas. There is also a discussion of why performance is measured: it may be for funding purposes, to evaluate new policies, to inform universities so they can make decisions about their strategic direction, to improve performance, or to inform the operation of markets. The report also considers the disadvantages of performance measures, the tendency for some measures to be proxies (which will be a significant issue if plans to use metrics and bibliometrics as proxies for research quality in the new Research Excellence Framework are adopted), and the tendency to measure activity and volume but not impact.

However, what is not emphasized enough is that the consequences, once a performance measure is made public, are not within anyone’s control. Both the internet and the media ensure that this is a significant challenge. It is no good saying that “Newspaper league tables do not provide an accurate picture of the higher education sector” (p. 7) and then taking action which invalidates this point.

Thus in the RAE 2008, detailed cross-institutional results were made available by HEFCE to the media last week before they were available to the universities themselves, just so that newspaper league tables could be constructed.

Now isn’t this an example of the tail wagging the dog, and being helped by HEFCE to do so? Furthermore, market and policy incentives may conflict with each other. If an institution’s student market is led by middle-class students with excellent exam grades, then urging it to engage in widening participation can fall on deaf ears. Also, whilst UK universities are still in receipt of significant public funding, many also generate substantial private funding, and some institutional heads are increasingly irritated by tight government controls over what they do and how they do it.

Two other significant issues are considered in the report. One is value-added measures, on which HEFCE feels it is not yet ready to pronounce. Constructing these for schools has been controversial, and the period over which value-added measures should be collected is problematic, since HEFCE’s measures would look only at what is added for recent graduates, not at what happens to them over the life course as a whole.

The other issue is about whether understanding and measuring different dimensions of institutional performance could help to support diversity in the sector.  It is not clear how this would work for the following three reasons:

  1. Institutions will tend to do what they think is valued and has money attached, so if the quality of research is more highly valued and better funded than quality of teaching, then every institution will want to do research.
  2. University missions and ‘brands’ are driven by a whole multitude of factors and importantly by articulating the values and visions of staff and students and possibly very little by ‘performance’ measures; they are often appealing to an international as well as a national audience and perfect markets with detailed reliable consumer knowledge do not exist in higher education.
  3. As the HEFCE report points out, there is a complex relationship between research, knowledge transfer, teaching, CPD and workforce development in terms of economic impact (and surely social and cultural impact too?). Given that this is the case, it is not evident that encouraging HEIs to focus on only one or two policy areas would be helpful.

There is a suggestion in the report that web-based spidergrams, based on a seemingly agreed set of performance indicators, might be developed, which would allow users to drill down into more detail if they wished. Whilst this might well be useful, it will not replace or address the media’s current dominance in compiling league tables based on a whole variety of official and unofficial performance measures and proxies. Nor will it really address the ways in which the “high value of the UK higher education ‘brand’ nationally and internationally” is sustained.

Internationally, the web and word of mouth are more critical than what now look like rather old-fashioned performance measures and indicators.  In addition, the economic downturn and the state of the UK’s economy and sterling are likely to be far more influential in this than anything HEFCE does about institutional performance.

The report, whilst making some important points, is essentially introspective, fails to sufficiently grasp how some of its own measures and activities are distorted by the media, does not really engage with the kinds of new technologies students and potential students are now using (mobile devices, blogs, wikis, social networking sites, etc) and focuses far more on national understandings of institutional performance than on how to improve the global impact and understanding of UK higher education.

Rosemary Deem

The UK India Education and Research Initiative (UKIERI): reflections on ‘the complexities of global partnerships in higher education’

This entry has been kindly prepared by Tim Gore, Director of The Centre for Indian Business, University of Greenwich, London, UK. Tim has worked closely with educationalists, institutions, companies and governments to improve bilateral and multilateral educational links in Hong Kong, Singapore, the United Arab Emirates, Jordan and India over a 23-year period. His most recent role was Director, Education at the British Council in India, where he was responsible for growing the knowledge partnership between India and the UK. Tim also led the establishment of the UK-India Education and Research Initiative (UKIERI) that is profiled in this blog entry.

~~~~~~~~~~~~~~~~~~

Building sustainable global partnerships

Partnership is a word that is often used but difficult to define. Many claim to have meaningful partnerships, but in reality I suspect good partnerships are rare. Partnerships between academic institutions across national and cultural frontiers are especially challenging. In the first place, the institutions themselves are complex, multi-dimensional and resistant to being led in the traditional sense. Then there are language, the subtle nuances of unspoken cultural expectations, and distance! UKIERI – the UK India Education and Research Initiative – was established with the aim of rebuilding the lapsed educational relationship between the UK and India. It was to focus on building academic partnerships that were meaningful and sustainable.

India and the UK

India emerged from its colonial period, according to some commentators, with a newfound national pride, as the growth of its economy and its nuclear and space sciences established its national credibility (see Mohan, 2006). Since the economic reforms of 1991, India has opened its doors and witnessed dizzying growth. But to fuel this growth, education became more important, and with it an interest in partnership with, amongst others, the UK. The UK likewise recognised the need for knowledge to fuel its growth and set up several institutions, such as the Science and Innovation Council, to achieve this. India and China were obvious partners, with their rapidly growing academic and research capabilities.

The UK government put the initial funds into UKIERI to start it up, closely followed by industry sponsors and later, as trust was built, by the Indian Government. A number of consultations in India and the UK gathered views from the sector about how to achieve the goals. The result was a carefully balanced funding mechanism that encouraged competitive bids across a range of academic collaborations, judged against common criteria of impact, relevance, high quality standards and sustainability. The funding was mainly mobility money, intended to break down the difficulty of distance and encourage partners to spend time together. Bids needed to demonstrate that the activities of the partnership were of strategic importance to the institutions involved and that matching funding was available.

The concept of ‘strategic alliances’ has evolved quickly over the last few decades, from a position where they were little mentioned in strategy textbooks. Michael Porter, for example, in his work on market forces in the seventies and eighties, was more concerned with firms as coherent entities in themselves, made up of strategic business units but conceptually sealed from competing firms in the market. Since then, alliances have become crucially important, to the extent that a product such as the iPod is the outcome of a very complex set of strategic relationships in which its brand owner, Apple, does not directly produce any part of the iPod or its content.

A variety of writers have looked at alliances from different perspectives. Economic and managerial perspectives see alliances as ways of reducing risk or exerting power and influence in a market. However, social capital and network analyses are far more subtle and see alliances as ways of accessing complex tacit knowledge that is not easy to build or acquire in other ways. Here, the concept of trust plays a big role and we come back to human interaction.

Academic institutions could be concerned with market share and can definitely be concerned about costs, so an analysis such as ‘resource-based theory’ or ‘transaction cost analysis’ may describe their motivations for partnership well. However, such institutions are complex and exhibit complex goals.

Studies in Norway (see Frølich, 2006) have shown that academic ambition and status is the main driver for researchers seeking overseas links rather than financial or institutional inducements which are merely facilitative. In this analysis, knowledge is power. Knowledge is difficult to acquire and especially those parts of knowledge that are not easily coded and where even the questions are difficult to frame let alone the answers that are sought. Trading in knowledge of this type is done only under conditions of trust.

However, this is only part of the picture. Institutions do have a role. In studies of the success of innovation in the Cambridge innovation cluster, the success was attributed to two sorts of social capital – structural and relational. The individual researchers can easily create the relational capital at conferences and other academic encounters but the structural capital comes by virtue of institutional links such as shared governors on a board. If we can create conditions of both structural and relational capital we can expect a more robust and productive alliance. It is this that UKIERI was trying to achieve.

Buying a stake in the process

UKIERI insisted that institutions buy a stake in the process at the same time as encouraging academics to create their partnerships. Funding was deliberately limited so that the institution had to contribute or find extra funding from a third party. This ensured that the strategic interests of the institution were taken into account. Many universities asked all their staff with an interest in India to attend a working group and prioritise their own bids into UKIERI. At the same time, UKIERI looked for evidence of synergy within the teams and evidence that the partnership would yield more than the sum of its parts. UKIERI arranged a two-stage process: peer review to assess the academic strengths, followed by a panel review to look holistically at the partnership.

Trust was built at many levels in the Initiative. The Indian Government demonstrated their trust by co-funding the second year after having satisfied themselves that there was genuine mutuality. Many partnerships had to deal with trust issues especially over funding which was channelled through the UK partner in the first year according to UK audit requirements. In a few cases trust broke down and partnerships did not work out but in the overwhelming majority the partnerships are doing well and producing strong research and academic outputs. The Initiative has been favourably reviewed by a number of institutions including the UK’s National Audit Office and a Parliamentary Select Committee.

‘Good’ communication sustains partnerships

In my experience, many partnerships run into difficulties because there is not enough contact between the partners, communications are sparse and often responses are slow or do not happen at all. Universities can give the appearance of being rather fragmented in their approach to partnerships as authority for the various components lies in different parts of the university.

Additionally, very often aspects of the partnership are agreed but then need to be ratified by academic councils or other internal quality processes and this again can cause delays. Very often, the partner is not told about the reason for delays and from the outside it is hard to understand why responses are so slow. This is accentuated when we are dealing across cultures and delays can be interpreted as lack of interest or even a lack of respect. In some cultures, it is not normal to say ‘no’ and a lack of response is the way of communicating lack of interest! All these communication issues erode the trust in the relationship and can be damaging.

I would recommend that each partnership always has a clear lead person who leads on communications and keeps in touch with all the processes on both sides of the partnership. It is important to be transparent about internal mechanisms and how long processes are really likely to take as well as what the processes are. The lead person can also coordinate visits to and fro and ensure that these are fairly regular. If there is a gap, there may be a relevant academic in the area who could take an extra day visiting the partner and keeping the relationship ‘warm’.

We often forget in our efforts to be both effective managers and academics that human relationships are at the core of all our enterprise and that these relationships need nurturing. Without this basic trust effective management of a project and high quality standards will not be enough.

Additional Reading

Frølich, N. (2006) Still academic and national – internationalisation in Norwegian research and higher education, Higher Education, 52 (3), pp. 405-420.

Gore, T. (2008) Global Research Collaboration: Lessons from Practice for Sustainable International Partnerships, October, London: Observatory of Borderless Higher Education.

Heffernan, T. and Poole, D. (2005) In search of the ‘vibe’: creating effective international education partnerships, Higher Education, 50 (2), pp. 223-45.

Mohan, C.R. (2006) India and the balance of power, Foreign Affairs, 85 (4), pp. 17-32.

Muthusamy, S. K. and White, M. A. (2007) An empirical examination of the role of social exchanges in alliance performance, Journal of Management Issues, 19 (1), pp. 53-75.

Myint, Y., Vyakarnam, S. et al. (2005) The Effect of Social Capital in New Venture Creation: the Cambridge High Technology Cluster.

Tim Gore

US/Turkish collaborations: bringing vocational schools into the global education sector

In the past three years I’ve had the great opportunity to give invited lectures, teach a graduate summer school course, and run research workshops at Bogazici University in Istanbul, Turkey.

This has been a wonderful occasion for me to listen to, and engage with, lively and committed scholars and students around processes of globalization, Turkey’s application to the EU for accession, and the geo-strategic role of Turkey situated as it is between Asia and Europe.

So it was with great interest that I read in the Observatory on Borderless Higher Education’s (OBHE) latest bulletin that Turkey had signed a deal with the US-based Community Colleges for International Development, Inc. An Association for Global Education, to put into place an exchange between US and Turkish vocational schools.

The OBHE report was based on a lead article carried in the World Bulletin. According to the Turkish Higher Education Council (YÖK), these collaborative partnerships will be instituted in 7-8 Turkish vocational schools in an attempt to improve their curricula.

According to the Chairperson of YÖK, Professor Yusuf Ziya Ozcan:

Vocational schools are the engines of our economy. If these schools train the work force needed by our economy and industry, most of the problems in Turkey will be solved. If we can guide some of our high-school graduates to get further education at vocational schools instead of universities, this will diminish the crowds waiting at the doors of universities as well.

Operationalizing the program means that Turkish students would spend their first year in Turkey and get their second-year education at a U.S. vocational school, whilst US students would have a chance to spend a year in Turkey.

But, why the US and not Europe, as a model for vocational education? And why build student mobility into a vocational school program?

According to Professor Ozcan:

…the best thing to do on this issue was to get support from a country where vocational education system functioned smoothly, and therefore, they decided to pay a visit to USA.

This move by the Turkish Higher Education Council to collaborate on vocational education might be read in a number of ways. For instance, Turkey’s education system has historically had close links to US, particularly through its (former) private schools and universities. This is thus business as usual, only applied to a different sector – vocational schools.

Turkey is also a popular destination for US students studying abroad as part of their undergraduate program (see Kavita Pandit’s entry on dual degree programs between Turkey and SUNY/USA). The university residence where I stayed whilst teaching at Bogazici in 2007 was buzzing with undergraduate students from the US. Thus, this new exchange initiative might be viewed as further strengthening already existing ties along channels that are already established.

Adding a component of student mobility to vocational education in Turkey might make that sector more attractive to prospective students, whilst generating the kind of knowledge and demeanor global firms think is important in their intermediary labor force. This would give Turkey’s intermediary labor a competitive advantage in the churn for flexible skilled workers in the global economy.

This deal can also be read as the outcome of an ambivalence on the part of Turkey and its education institutions toward Europe and its regionalizing project, and vice versa. And while there are serious moves in Turkish universities toward implementing Europe’s Bologna Process in higher education, it seems Turkey – like a number of countries around the world – is weighing up its response to the globalizing education models that are circulating, so as to keep a foot in both camps – the USA and Europe.

Susan Robertson

OECD’s Education at a Glance 2008: a ‘problem/solution toolkit’ with problems?

Last week – or, to be precise, on the 9th of September at 11.00 Paris time – the Organisation for Economic Co-operation and Development (OECD) launched its ‘annual snapshot’ of the sector, Education at a Glance 2008. Within hours, the wheels of the media industry around the globe were pouring out stories of shame, fame, defeat and victory, whilst politicians in their respective countries were galvanized into action – either defending their own decisions or blaming a previous regime.

As previous entries in GlobalHigherEd (see here and here and here, as examples) argue, global indicators increasingly matter, not because they are always able to tell us much that is useful, but because they work as a powerful disciplinary tool on nations. This, in turn, provides the issuing agent – in this case the OECD, ostensibly a ‘collective learning machinery’ – with an important mechanism for influencing the form and scope of education policies and programs around the globe. This is the tangible stuff of globalization – but this problem/solution toolkit is not without its own epistemological problems. Let’s take a look at two countries reported on this week, and at how the media headlined the OECD’s report.

In the UK, the BBC and the Telegraph focused on the graduate league table, and the fact that the UK has not fared particularly well. The evidence? In 2000, the UK ranked 4th in the world in the number of school-leavers going to university. By 2006, this had plummeted to 12th.

Graeme Paton of the Telegraph reported on an interview with Andreas Schleicher, the OECD’s architect of Education at a Glance. According to Dr Schleicher, the UK has major problems in producing school leavers with a sufficient quality of credentials, whilst other countries have managed to sort out these problems and are already in the fast lane, leaving the UK behind.

Ministers canvassed by the Telegraph, however, insist that they were tackling the shortfall by encouraging more pupils to go to university and by pointing out the OECD good news story for the UK, that university graduates in the UK aged 25-64 earned 59 per cent more than other people – well above the national average.

In Canada, the influential Macleans magazine reported that in the OECD Education at a Glance comparisons, Canada was one of the few countries with the highest percentage of its population having completed post-secondary education. However, we are also given another statistic, and that is that the earnings advantage gained from completing post-secondary education in Canada had decreased in recent years and was quite low compared to other OECD countries. This is reflected in the lower average private rate-of-return on investment in post-secondary education relative to other nations in the OECD.

Let’s dwell, and not just ‘glance’, at these figures for a moment, and ask what is being reported here by the OECD:

  • competitive economies need a more highly educated workplace to perform more demanding work;
  • all countries need to encourage their young people to go to university and complete a degree; and
  • the incentives for this expenditure (which is increasingly being paid by families) are that there will be a higher rate-of-return to the student than if the student had not gone to university.

However, as we can see from our example above, countries with high levels of graduation (which the OECD says is good) report increasingly lower returns to graduates (ah…and is this not bad?).

Now, this is where the underlying human capital/homo economicus rationale underpinning the OECD’s Education at a Glance begins to falter – for it cannot explain why following the OECD’s prescription of high enrolment in higher education reduces the overall earnings of the individual rather than increasing them.

While this is not acknowledged in the repertoire of the OECD’s ‘problem/solution toolkit’ approach, it is where a sociological analysis is particularly helpful. As sociologists of education (see Phil Brown and Simon Marginson) have shown, using Fred Hirsch’s insights on ‘positional goods’ tied to social status in his book Social Limits to Growth, an advantage only has economic value when others do not have it. That is, its value depends on its scarcity. In other words, if we all have a graduate degree, then its value in the marketplace is diminished compared with when only half of us have one. This is part of the dynamic, for example, underlying degree inflation.
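Hirsch’s positional-good logic can be sketched as a toy model (all numbers and the functional form below are invented for illustration, not empirical estimates): if the wage premium attached to a degree shrinks as the graduate share of the workforce grows, then each additional graduate erodes the private return that motivated enrolment in the first place.

```python
# Toy positional-goods sketch (invented numbers): the wage premium of a
# degree is assumed to decline as the graduate share of the workforce
# grows, because the credential's value rests partly on its scarcity.

def graduate_premium(grad_share, base_premium=0.6):
    """Hypothetical wage premium for graduates, given the share of the
    workforce holding a degree (both between 0 and 1).

    The premium is assumed to decline linearly with the graduate share --
    a stylized stand-in for Hirsch's scarcity argument, not an estimate.
    """
    return base_premium * (1.0 - grad_share)

# When few hold degrees, the premium is high; as participation rises,
# the same credential buys less in the labor market.
print(round(graduate_premium(0.1), 3))  # 0.54
print(round(graduate_premium(0.5), 3))  # 0.3
```

On this stylized logic, a policy that succeeds in raising participation can simultaneously report higher graduation rates and lower private returns, which is exactly the Canadian pattern noted above.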

There’s also another issue: the assumption that jobs in the ‘new knowledge economy’ will require us all to have graduate qualifications. However, the Confederation of British Industry (reported in the UK Guardian newspaper on the 17th of September) disagrees, arguing that universities are producing far too many graduates, leaving more than a million people in jobs for which they are overqualified. It argues that there are currently 10.1 million graduates in the UK, but only 9 million graduate jobs.

The deeper, and more tricky, question for policymakers now becomes: do we encourage everyone to hop onto the same credential treadmill, with fewer and fewer returns and potentially higher levels of indebtedness? To be sure, there are important outcomes for individuals of a university education. However, this experience is becoming more and more expensive, and the promised lifetime earnings are likely to be less and less. And who will shoulder the cost? Families? Employers? The State? And how might the state and international organizations, like the OECD, legitimate more and more credential inflation when the current ‘knowledge economy’ discourse is showing it to be somewhat hollow?

Or, ought we not think through what a range of trajectories might be that distributes talent/skills/training and investments over a wider portfolio of education/training/career options than is currently being presented to us?

Susan Robertson

Changing higher education and the claimed educational paradigm shift – sobering up educational optimism with some sociological scepticism

While there is a consensus that higher education governance and organization are being transformed, the same cannot be said of the impact of that transformation on the ‘educational’ dimension of higher education.

Under the traveling influence of the diverse versions of New Public Management (NPM), European public sectors are being molded by market-like and client-driven perspectives. Continental higher education is no exception. Austria and Portugal, to mention only these two countries, have recently re-organized their higher education systems explicitly along these lines. The basic assumptions are that the more autonomous institutions are, the more responsive they are to changes in their organizational environment, and that academic collegial governance must be replaced by managerial expertise.

Simultaneously, the EU is enforcing discourses and developing policies based on the competitive advantages of a ‘Europe of knowledge’. ‘Knowledge societies’ are presented as depending on the production of new knowledge, its transmission through education and training, its dissemination through ICT, and its use in new industrial processes and services.

By means of ‘soft instruments’ such as the European Qualifications Framework (EQF) and the Tuning I and II projects (see here and here), the EU is inducing an educational turn or, as some argue, an emergent educational paradigm. The educational concepts of ‘learning’, ‘knowledge’, ‘skills’, ‘competences’, ‘learning outcomes’ and ‘qualifications’ re-emerge in the framework of the EHEA, this time as core educational perspectives.

From the analysis of the documents of the European Commission and its diverse agencies and bodies, one can see that a central educational role is now attributed to the concept of ‘learning outcomes’ and to the ‘competences’ students are supposed to possess at the end of the learning process.

In this respect, the EQF is central to advancing the envisaged educational change. It claims to provide common reference levels for describing learning, from basic skills up to PhD level. The 2007 European Parliament recommendation defines “competence” as “the proven ability to use knowledge, skills and personal, social and/or methodological abilities, in work or study situations and in professional and personal development”.

The shift from ‘knowledge content’ to ‘competences’ as the organizer of learning, with a focus on the capacity to use knowledge(s) to know and to act technically, socially and morally, moves knowledge away from a formative role based on ‘traditional’ approaches to subjects and mastery of content, toward a model in which the primary interest is what the learner achieves as an outcome of the learning process. In this new model, knowledge content is mediated by competences and translated into learning outcomes, linking together ‘understanding’, ‘skills’ and ‘abilities’.

However, the issue of knowledge content is passed over and left aside, as if the educational goal of competence building could be pursued without discussing the need to develop procedural competencies based more on content than on ‘learning styles’. Indeed, it can be argued that the knowledge content carried along in the process of competence building is somehow neutralized in its educational role.

In higher education, “where learning outcomes are considered as essential elements of ongoing reforms” (CEC: 8), few data sources are available on the educational impact of implementing competence-based perspectives. And while it is too early to draw conclusions about the real impact on higher education students’ experiences of the so-called ‘paradigm shift’ brought about by the competence-based educational approach, an analysis of the educational concepts involved is, nonetheless, an interesting starting point.

The founding educational idea of Western higher education was based on the transforming potential of knowledge at both the individual and the social level. Educational categories (teaching, learning, students, professors, classes, etc.) were grounded in the formative role attributed to knowledge, and so were the curriculum and the teaching and learning processes. Reconfiguring the educational role of knowledge, from this formative role to one of mobilizing the potential to act socially (in particular in the world of work), induces important changes in these educational categories.

As higher education institutions are expected to be sensitive and responsive to social and economic change, the need to design ‘learning outcomes’ on the basis of internal and external stakeholders’ perceptions (as we see with Tuning: 1) grows in proportion. The ‘student’ appears simultaneously as an internal stakeholder, a client of educational services, a person moving from education to the labor market, and a ‘learner’ of competences. The professor, rather than vanishing, is being reinvented as a provider of learning opportunities. Illuminated by the new educational paradigm and pushed by the diktat of efficiency in a context of mass higher education, he/she is no longer the ‘center’ of knowledge flux and delivery but a provider of learning opportunities for ‘learners’. Moreover, as an academic, he/she is giving up his/her ultimate responsibility to exercise quality judgments on teaching-learning processes in favor of managerial expertise.

As ‘learning outcomes’ are what a learner is expected to know, understand and/or be able to demonstrate on completion of learning, and given that these can be represented by indicators, assessment of the educational process can move from inside higher education institutions to assessment by external evaluation technicians. With regard to the lecture theater as the educational locus par excellence, ICT instruments de-localize classes to the ether of the ‘www’, with ‘face-to-face’ teaching-learning becoming a minor proportion of the ‘learner’s’ activities. E-learning is not the ‘death’ of the professor but his/her metamorphosis into a ‘learning monitor’. Additionally, the rise of virtual campuses introduces a new kind of academic life whose educational consequences are still to be identified.

The learner-centered model that is emerging has the educational potential foreseen by many educationalists (e.g. John Dewey, Paulo Freire and Ivan Illich, among others) to deal with the needs of post-industrial societies and with new forms of citizenship. The new educational paradigm promises a lot: the empowerment of the student, the enhancement of his/her capacity and responsibility to express his/her difference, teamwork, mutual help, learning by doing, etc.

One might underline the emancipatory potential that this perspective assumes, and some educationalists are quite optimistic about it. However, education does not occur in a social vacuum, as some sociologists rightly point out. In a context where HEIs are increasingly assuming the features of ‘complete organizations’ and where knowledge is held up as the major competitive factor in the world-wide economy, educational optimism should be tempered with some sociological scepticism.

In fact, the risk is that knowledge, by evolving from a central ‘formative’ input into a series of competencies, may simply pass, like money, through individuals without transforming them (see the work of Basil Bernstein for an elaboration of this idea). By blurring the frontiers between academic and work competencies, and between education and training, higher education runs the risk of sacrificing too much to the gods of relevance and to (short-term) labor market needs. Contemporary labor markets require competencies that can be easily recognized by employers and that have the potential to be continuously reformed. The educational risk is that of reducing the formation of the ‘critical self’ of the student to the ‘corporate self’ of the learner.

António M. Magalhães

New UK report: ‘How Knowledge is Reshaping the Economic Life of Nations’

The London-based Work Foundation released a new 92-page report on 11 March titled The Knowledge Economy: How Knowledge is Reshaping the Economic Life of Nations.

Report highlights include:

  • Work: knowledge-based industries and knowledge-related occupations have provided most of the new jobs over the past decade.
  • Trade: the UK has emerged as a world leader in trade in knowledge services, with the biggest trade surplus of the major OECD economies. While the City of London and financial services remain important, two thirds of this trade comes from business services, high tech, and education and cultural services.
  • Innovation: innovation in the knowledge economy comes both from the successful exploitation of R&D undertaken in the UK and overseas, and from wider forms of innovation, including design and development, marketing and organisational change.

The report includes this fascinating map of the public vs private geographies of the knowledge economy (or economies, to be more precise) in the UK.

[Map: the public vs private geographies of the knowledge economy in the UK]

More food for thought in the ongoing attempt to understand exactly what the ‘knowledge economy’ is, and what its development means for economy, society and space.

Kris Olds