Citation indices, bibliometrics, and audit culture: new research findings

The use of citation indices (and bibliometrics more broadly, which includes article counting as well as citation counts) is sweeping much of the world of higher education governance. Citation indices, including the Thomson Scientific–produced Social Science Citation Index and Science Citation Index, and even Google Scholar, are undeniably part of modern academic life, though they are highly contested, reflective of (and disproportionately supportive of) the Anglo-American world (including English as lingua franca), and little understood. Like ranking schemes (which witnessed a flurry of activity this week: see the Times Higher Education Supplement global ranking results, Beerkens’ Blog on the geographies of the new THES rankings, and the Maclean’s results at a Canadian scale), bibliometrics are used to analyze scholarly activity and to frame the distribution of resources (from the individual up to the institutional scale). They are increasingly used to govern academic life, for good and for ill, and they produce a myriad of impacts that need to be much more fully explored.

The UK makes particularly heavy use of bibliometrics, spurred on by its Research Assessment Exercise (RAE). UK higher education institutions should therefore (one would hope!) have a more advanced understanding of the uses and abuses of this analytical-cum-governance tool. It is thus worth noting that Universities UK (UUK) released a new report today on the topic – The use of bibliometrics to measure research quality in UK higher education institutions – to generate discussion about how to reform the much maligned RAE process. The report was produced by Evidence Ltd., a UK “knowledge-based company specializing in data analysis, reports and consultancy focusing on the international research base”.

Evidence is led by Jonathan Adams, who wrote an illuminating chapter in the Rand Corporation report (Perspectives on U.S. Competitiveness in Science and Technology) we recently profiled.

Evidence also has a “strategic alliance” with Thomson Scientific (previously known as Thomson ISI) that produces the citation indices noted above.

As the UUK press release notes:

The report assesses the use of bibliometrics in both STEM (science, technology, engineering and mathematics) and non-STEM subjects, and the differences in citation behaviour among subject disciplines.

Professor Eric Thomas, chair of Universities UK’s Research Policy Committee, said: “It is widely anticipated that bibliometrics will be central to the new system, but we need to ensure it is technically correct and able to inspire confidence among the research community.”

The report highlights that:

  • Bibliometrics are probably the most useful of a number of variables that could feasibly be used to measure research performance.
  • There is evidence that bibliometric indices do correlate with other, quasi-independent measures of research quality – such as RAE grades – across a range of fields in science and engineering.
  • A range of bibliometric variables could serve as quality indicators. There are strong arguments against the use of (i) output volume, (ii) citation volume, (iii) journal impact, and (iv) frequency of uncited papers.
  • ‘Citations per paper’ is a widely accepted index in international evaluation, and highly-cited papers are recognised as identifying exceptional research activity (a minimal sketch of these measures follows this list).
  • The accuracy and appropriateness of citation counts are a critical factor.
  • There are differences in citation behaviour between STEM and non-STEM subjects, as well as among different subject disciplines.
  • Metrics do not take into account contextual information about individuals, which may be relevant. They also do not always take into account research from across a number of disciplines.
  • The definition of the broad subject groups and the assignment of staff and activity to them will need careful consideration.
  • Bibliometric indicators will need to be linked to other metrics on research funding and on research postgraduate training.
  • There are potential behavioural effects of using bibliometrics which may not be picked up for some years.
  • There are data limitations where researchers’ outputs are not comprehensively catalogued in bibliometrics databases.
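To make the arithmetic behind these indicators concrete, here is a minimal, hypothetical sketch (in Python) of how ‘citations per paper’, the share of uncited papers, and a crude ‘highly cited’ flag might be computed for a unit’s outputs. The data, the threshold, and the lack of field normalisation are illustrative assumptions on my part, not the method used in the UUK report or in Thomson Scientific’s indices.

```python
# Illustrative sketch of the bibliometric indicators discussed above.
# All data and thresholds are hypothetical, not drawn from the UUK report.

from statistics import mean

# Hypothetical records: (paper_id, citation_count) for one unit of assessment.
papers = [
    ("paper-01", 3),
    ("paper-02", 41),
    ("paper-03", 0),
    ("paper-04", 12),
    ("paper-05", 7),
]

citation_counts = [count for _, count in papers]

# Citations per paper: total citations divided by number of papers.
citations_per_paper = mean(citation_counts)

# Frequency of uncited papers (one of the indicators the report cautions against).
uncited_share = sum(1 for c in citation_counts if c == 0) / len(papers)

# A crude 'highly cited' flag: at least twice the unit's own average.
# Real exercises use field- and year-normalised percentile thresholds instead.
highly_cited = [pid for pid, c in papers if c >= 2 * citations_per_paper]

print(f"Citations per paper: {citations_per_paper:.2f}")
print(f"Share of uncited papers: {uncited_share:.0%}")
print(f"Highly cited papers: {highly_cited}")
```

Even this toy example shows why the report flags citation behaviour across disciplines: averages and thresholds only mean something relative to an appropriate comparator group.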

See here for one early reaction from the Guardian‘s Donald Macleod. This report, and subsequent reactions to it, are further fodder for ongoing debates about the global higher ed audit culture that is emerging, like it or not…

Kris Olds
