Thomson Scientific, the private firm fueling the bibliometrics drive in academia, is positioning itself as the anchor point for data on intellectual property (IP) and research. Following tantalizers in the form of free reports such as World IP Today: A Thomson Scientific Report on Global Patent Activity from 1997-2006 (from which the two images below are taken), Thomson Scientific is rolling out Thomson Innovation in phases. When complete, the service will provide:
- Comprehensive prior art searching with the ability to search patents and scientific literature simultaneously
- Expanded Asian patent coverage, including translations of Japanese full-text and additional editorially enhanced abstracts of Chinese data
- A fully integrated searchable database combining Derwent World Patent Index® (DWPI℠) with full-text patent data to provide the most comprehensive patent records available
- Support of strategic intellectual property decisions through:
  - powerful analysis and visualization tools, such as charting, citation mapping and search result ranking
  - integration of business and news resources
- Enhanced collaboration capabilities, including customizable folder structures that enable users to organize, annotate, search and share relevant files.
Speaking of bibliometrics, Evidence Ltd., the private firm that is shaping some of the debates about the post-Research Assessment Exercise (RAE) system of evaluating research quality and impact in UK universities, recently released the UK Higher Education Research Yearbook 2007. This £255 (for higher education customers) report:
[P]rovides the means to gain a rapid overview of the research strengths of any UK Higher Education institution, and compare its performance with that of its peers. It is an invaluable tool for those wishing to assess their own institution’s areas of relative strength and weakness, as well as [a] versatile directory for those looking to invest in UK research. It will save research offices in any organisation with R&D links many months of work, allowing administrative and management staff the opportunity to focus on the strategic priorities that these data will help to inform….
It sets out in clear diagrams and summary tables the research profile for Universities and Colleges funded for research. Research Footprints® compare each institution’s performance to the average for its sector, allowing strengths and weaknesses to be rapidly identified by research managers and by industrial customers.
See below for one example of how a sample university (in this case the University of Warwick) has its “Research Footprint®” graphically represented. This image is included in a brief article about Warwick by Vice-Chancellor Nigel Thrift, and is available on Warwick’s News & Events website.
Given the metrics that are utilized, it is clear, even if the data is not published, that individual researchers’ footprints will be available for systematic and comparative analysis, thereby enabling the governance of faculty with the back-up of ‘data’, and the targeted recruitment of the ‘big foot’ wherever s/he resides (though Sasquatches presumably need not apply!).
In last week’s conference on Realising the Global University, organised by the Worldwide Universities Network (WUN), Professor David Eastwood, Chief Executive of the Higher Education Funding Council for England (HEFCE), spoke several times about the role of funding councils in governing universities and academics to enhance England’s standing in the global higher education sphere (‘market’ is perhaps a more appropriate term given the tone of discussions). One interesting dimension of Eastwood’s position was HEFCE’s uneasy yet dependent relationship with bibliometrics and globally-scaled university ranking schemes to frame the UK’s position, given HEFCE’s influence among the funding councils of England, Scotland, Wales and Northern Ireland (which together make up the UK). Eastwood expressed satisfaction with the UK’s relative standing, yet also (a) concern about emerging ‘Asian’ countries (really just China, and to a lesser degree Singapore), (b) the need to compete with research powerhouses (especially the US), and (c) the need to forge linkages with research powerhouses and emerging ‘contenders’ (ideally via joint UK-US and UK-China research projects, which are likely to lead to more jointly written papers; papers that are posited to generate relatively higher citation counts). These comments help us better understand the opening of a Research Councils UK (RCUK) office in China on 30 October 2007.
In this context, and further to our 9 November entry on bibliometrics and audit culture, it is worth noting that HEFCE launched a consultation process today about just this – bibliometrics as the core element of a new framework for assessing and funding research, especially with respect to “science-based” disciplines. HEFCE notes that “some key elements in the new framework have already been decided” (i.e., get used to the idea, and quick!), and that the consultation process is instead focused on “how they should be delivered”. Elements of the new framework include (but are not limited to):
- Subject divisions: within an overarching framework for the assessment and funding of research, there will be distinct approaches for the science-based disciplines (in this context, the sciences, technology, engineering and medicine with the exception of mathematics and statistics) and for the other disciplines. This publication proposes where the boundary should be drawn between these two groups and proposes a subdivision of science-based disciplines into six broad subject groups for assessment and funding purposes.
- Assessment and funding for the science-based disciplines will be driven by quantitative indicators. We will develop a new bibliometric indicator of research quality. This document builds on expert advice to set out our proposed approach to generating a quality profile using bibliometric data, and invites comments on this.
- Assessment and funding for the other disciplines: a new light touch peer review process informed by metrics will operate for the other disciplines (the arts, humanities, social sciences and mathematics and statistics) in 2013. We have not undertaken significant development work on this to date. This publication identifies some key issues and invites preliminary views on how we should approach these.
- Range and use of quantitative indicators: the new funding and assessment framework will also make use of indicators of research income and numbers of research students. This publication invites views on whether additional indicators should be used, for example to capture user value, and if so on what basis.
- Role of the expert panels: panels made up of eminent UK and international practising researchers in each of the proposed subject groups, together with some research users, will be convened to advise on the selection and use of indicators within the framework for all disciplines, and to conduct the light touch peer review process in non science-based disciplines. This document invites proposals for how their role should be defined within this context.
- Next steps: the paper identifies a number of areas for further work and sets out our proposed workplan and timetable for developing and introducing the new framework, including further consultations and a pilot exercise to help develop a method for producing bibliometric quality indicators.
- Sector impact: a key aim in developing the framework will be to reduce the burden on researchers and higher education institutions (HEIs) created by the current arrangements. We also aim for the framework to promote equal opportunities. This publication invites comments on where we need to pay particular attention to these issues in developing the framework and what more can be done.
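The consultation’s central notion of a bibliometric “quality profile” can be sketched, very loosely, in code. The function below is a hypothetical illustration, not HEFCE’s actual method: it normalises each paper’s citation count by an assumed field average and bins the results into four bands, in the spirit of RAE-style quality profiles. All thresholds and the sample figures are invented for the example.

```python
def quality_profile(citations, field_average, bands=(0.5, 1.0, 2.0)):
    """Return the share of papers in four illustrative quality bands.

    Each paper's citation count is divided by the average for its field,
    then binned: below 0.5x the field average, 0.5-1x, 1-2x, and 2x or
    above. These thresholds are assumptions made for this sketch only.
    """
    if not citations or field_average <= 0:
        raise ValueError("need at least one paper and a positive field average")
    counts = [0, 0, 0, 0]
    for cites in citations:
        ratio = cites / field_average
        band = sum(ratio >= t for t in bands)  # index 0 (lowest) to 3 (highest)
        counts[band] += 1
    total = len(citations)
    return [round(n / total, 2) for n in counts]

# A hypothetical department with papers cited 2, 5, 10 and 24 times,
# in a field averaging 8 citations per paper:
profile = quality_profile([2, 5, 10, 24], field_average=8.0)
# profile -> [0.25, 0.25, 0.25, 0.25]: one paper in each band
```

Even this toy version makes visible why the field-normalisation step matters: the same raw citation count would land in different bands depending on the disciplinary average, which is precisely the kind of parameter choice the consultation invites comment on.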
This process is worth following even if you are not working for a UK institution, for it sheds light on the emerging role of bibliometrics as a governing tool (evident in more and more countries), especially with respect to the global (re)positioning of national higher education systems vis-à-vis particular understandings of ‘research quality’ and ‘productivity’. Over time, of course, it will also transform the behaviour of many UK academics, perhaps spurring everything from heightened competition to get into high citation impact factor (CIF) journals, to greater international collaborative work (if such work indeed generates more citations), to the possible creation of “citation clubs” (much more easily done, perhaps, than HEFCE realizes), to less commitment to high-quality teaching, and a myriad of other unknown impacts, for good and for bad, by the time the new framework is “fully driving all research funding” in 2014.