Editor’s note: this guest entry was kindly written by Gavin Moodie, principal policy adviser at Griffith University in Australia. Gavin is most interested in the relations between vocational and higher education. His book From Vocational to Higher Education: An International Perspective was published by McGraw-Hill last year. Gavin’s entry sheds light on a new ranking initiative that needs to be situated within the broad wave of contemporary rankings – and bibliometrics more generally – that are being used to analyze, legitimize, critique, promote, and extract revenue from universities. Our thanks to Gavin for the illuminating contribution below.
It has been a busy time for world institutional rankings watchers recently. Shanghai Jiao Tong University’s Institute of Higher Education published its academic ranking of world universities (ARWU) for 2009. The institute’s 2009 rankings include its by now familiar ranking of 500 institutions’ overall performance and the top 100 institutions in each of five broad fields: natural sciences and mathematics, engineering/technology and computer sciences, life and agriculture sciences, clinical medicine and pharmacy, and social sciences. This year Dr. Liu and his colleagues have added rankings of the top 100 institutions in each of five subjects: mathematics, physics, chemistry, computer science and economics/business.
Times Higher Education announced that over the next few months it will develop a new method for its world university rankings which in future will be produced with Thomson Reuters. Thomson Reuters’ contribution will be guided by Jonathan Adams (Adams’ firm, Evidence Ltd, was recently acquired by Thomson Reuters).
And a new ranking has been published, SCImago institutions rankings: 2009 world report. This is a league table of research institutions by various factors derived from Scopus, the database of the huge multinational publisher Elsevier. SCImago’s institutional research rank is distinctive in including with higher education institutions government research organisations such as France’s Centre National de la Recherche Scientifique, health organisations such as hospitals, and private and other organisations. Only higher education institutions are considered here. The ranking was produced by the SCImago Research Group, a Spain-based research network “dedicated to information analysis, representation and retrieval by means of visualisation techniques”.
SCImago’s rank is very useful in not cutting off at the top 200 or 500 universities, but in including all organisations with more than 100 publications indexed in Scopus in 2007. It therefore includes 1,527 higher education institutions in 83 countries. But even so, it is highly selective, including only 16% of the world’s estimated 9,760 universities, 76% of US doctoral-granting universities, 65% of UK universities and 45% of Canada’s universities. In contrast, all of New Zealand’s universities and 92% of Australia’s universities are listed in SCImago’s rank. Some 38 countries have seven or more universities in the rank.
SCImago derives five measures from the Scopus database: total outputs, cites per document (which are heavily influenced by field of research as well as research quality), international collaboration, normalised SCImago journal rank and normalised citations per output. This discussion will concentrate on total outputs and normalised citations per output.
Together these measures show that countries have been following two broad paths to supporting their research universities. One group of countries in northern continental Europe, centred on Germany, has supported a reasonably even development of its research universities, while another group of countries influenced by the UK and the US has developed its research universities much more unevenly. Both approaches seem to be successful in supporting research volume and quality, at least as measured by publications and citations.
Volume of publications
Because a reasonable number of countries have several higher education institutions listed in SCImago’s rank, it is possible to consider countries’ performance rather than concentrating on individual institutions, as the smaller rankings encourage. I do this by taking the average of the performance of each country’s universities. The first measure of interest is the number of publications each university has indexed in Scopus over the five years from 2003 to 2007, which is an indicator of the volume of research. The graph in figure 1 shows the mean number of outputs for each country’s higher education research institutions. It shows only countries which have more than six universities included in SCImago’s rank, which leaves out 44 countries and thus much of the tail in institutions’ performance.
Figure 1: mean of universities’ outputs for each country with > 6 universities ranked
These data are given in table 1. The first column gives the number of higher education institutions each country has ranked in SCImago institutions rankings (SIR): 2009 world report. The second column shows the mean number of outputs indexed in Scopus for each country’s higher education research institutions from 2003 to 2007. The next column shows the standard deviation of the number of outputs for each country’s research universities.
The fourth column in table 1 shows the coefficient of variation, which is the standard deviation divided by the mean and multiplied by 100. This is a measure of the evenness of the distribution of outputs amongst each country’s universities. Thus, the five countries whose universities had the highest average number of outputs indexed in Scopus from 2003 to 2007 – the Netherlands, Israel, Belgium, Denmark and Sweden – also had a reasonably low coefficient of variation, below 80. This indicates that research volume is spread reasonably evenly amongst those countries’ universities. In contrast, Canada, which had the sixth highest average number of outputs, also has a reasonably high coefficient of variation of 120, indicating an uneven distribution of outputs amongst Canada’s research universities.
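The evenness measure is straightforward to compute. The sketch below uses made-up output counts for a hypothetical country’s ranked universities; the real figures come from the Scopus-derived SCImago data discussed above.

```python
from statistics import mean, pstdev

# Hypothetical 2003-2007 output counts for one country's ranked universities
# (illustrative numbers only, not taken from the SCImago report).
outputs = [9500, 8200, 7800, 7100, 6400, 5900, 5200]

m = mean(outputs)
sd = pstdev(outputs)   # population standard deviation
cv = sd / m * 100      # coefficient of variation, as a percentage

print(f"mean = {m:.1f}, sd = {sd:.1f}, cv = {cv:.1f}")
```

A coefficient below 80, as for this invented country, would indicate a fairly even spread of research volume across its universities; a value like Canada’s 120 would indicate a much more uneven spread.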
The final column in table 1 shows the mean of SCImago’s international collaboration score, which scores the proportion of an institution’s outputs jointly authored with someone from another country. The US’s international collaboration is rather low because US authors more often collaborate with authors at other institutions within their own country.
Table 1: countries with > 6 institutions ranked by institutions’ mean outputs, 2007
Source: SCImago Research Group (2009) SCImago institutions rankings (SIR): 2009 world report.
Citations per paper by field
We next examine citations per paper by field of research, which is an indicator of the quality of research. This is the ratio between the average citations per publication of an institution and the world average citations per publication over the same time frame and subject area. SCImago says it computed this ratio using the method established by Sweden’s Karolinska Institutet, called the ‘item oriented field normalized citation score average’. A score of 0.8 means the institution is cited 20% below the world average and 1.3 means the institution is cited 30% above the world average.
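The idea behind item-oriented field normalisation can be sketched as follows: each paper’s citation count is divided by the world citation rate for its own field and period, and the institution’s score is the average of those per-paper ratios. The per-field world rates below are invented for illustration.

```python
# Illustrative sketch of item-oriented field normalisation.
# Each paper is compared with the world citation rate for its own field
# and period; all numbers here are made up.
papers = [
    {"cites": 12, "world_avg": 10.0},  # e.g. a clinical medicine paper
    {"cites": 3,  "world_avg": 4.0},   # e.g. a mathematics paper
    {"cites": 8,  "world_avg": 5.0},
]

# Normalise each item first, then average the normalised scores.
score = sum(p["cites"] / p["world_avg"] for p in papers) / len(papers)
print(f"normalised citation score = {score:.2f}")
```

Averaging the ratios item by item, rather than dividing one grand total by another, stops a single highly cited paper in a high-citation field from swamping the institution’s score. A result of 1.18 here would mean the institution is cited 18% above the world average for its mix of fields.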
Figure 2 shows mean normalised citations per paper for each country’s higher education research institutions from 2003 to 2007, again showing only countries which have more than six universities included in SCImago’s rank. The graph for an indicator of research quality in figure 2 is similar in shape to the graph of research volume in figure 1.
Figure 2: mean of universities’ normalised citations per paper for each country with > 6 universities ranked
Table 2 shows countries with more than six higher education research institutions ranked by their institutions’ mean normalised citations. This measure distinguishes more sharply between institutions than volume of outputs – the coefficients of variation for countries’ mean institutional normalised citations are higher than for number of publications. Nonetheless, several countries with high mean normalised citations have an even performance amongst their universities on this measure – Switzerland, the Netherlands, Sweden, Germany, Austria, France, Finland and New Zealand.
Finally, I wondered whether countries which had a reasonably even performance of their research universities by volume and quality of publications reflected a more equal society. To test this I obtained from the Central Intelligence Agency’s (2009) World Factbook the Gini index of the distribution of family income within a country. A country with a Gini index of 0 would have perfect equality in the distribution of family income whereas a country with perfect inequality in its distribution of family income would have a Gini index of 100. There is a modest correlation of 0.37 between a country’s Gini index and its coefficient of variation for both publications and citations.
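The correlation used here is an ordinary Pearson correlation between two country-level series. A minimal sketch, with made-up Gini and coefficient-of-variation pairs rather than the real data:

```python
from statistics import mean

# Hypothetical (Gini index, coefficient of variation) pairs for six countries;
# the post reports a correlation of about 0.37 on the real data.
gini = [28, 33, 41, 45, 32, 38]
cv   = [85, 70, 120, 90, 110, 95]

mg, mc = mean(gini), mean(cv)
cov = sum((g - mg) * (c - mc) for g, c in zip(gini, cv)) / len(gini)
sg = (sum((g - mg) ** 2 for g in gini) / len(gini)) ** 0.5
sc = (sum((c - mc) ** 2 for c in cv) / len(cv)) ** 0.5
r = cov / (sg * sc)  # Pearson correlation coefficient

print(f"r = {r:.2f}")
```

A value of this size indicates only a modest linear association: more unequal societies tend, weakly, to have more unevenly performing research universities.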
Table 2: countries with > 6 institutions ranked by institutions’ normalised citations per output
Sources: SCImago Research Group (2009) SCImago institutions rankings (SIR): 2009 world report; Central Intelligence Agency (2009) The world factbook.
SCImago’s institutions research rank is sufficiently comprehensive to support comparisons between countries’ higher education research institutions. It finds two patterns amongst countries whose research universities have a high average volume and quality of research publications. One group of countries has a fairly even performance across its research universities, presumably because they have had fairly even levels of government support. This group is in northern continental Europe and includes Switzerland, Germany, Sweden, the Netherlands, Austria, Denmark and Finland. The other group of countries also has a high average volume and quality of research publications, but spread much more unevenly between universities. This group includes the US, the UK and Canada.
This finding is influenced by the measure I chose to examine countries’ performance: the average of their research universities’ performance. Different results might have been found using another measure, such as the number of universities a country has in the top 100 or 500 research universities normalised by gross domestic product. But such a measure would reflect not the overall performance of a country’s research universities, only the performance of its champions. Whether one is interested in a country’s overall performance or just the performance of its champions depends on whether one believes more benefit is gained from a few outstanding performers or from several excellent performers. That would usefully be the subject of another study.
Central Intelligence Agency (2009) The world factbook (accessed 29 October 2009).
SCImago institutions rankings (SIR): 2009 world report (revised edition accessed 20 October 2009).