Introduction
This blog post outlines some of the most important Jisc activities in the area of research metrics and indicators and, in particular, the data infrastructure underpinning them. That data falls into two kinds: data about the objects of research, which need identifiers, and data about research activity.
Identifiers
Identifiers for the key entities in research – researchers, funders, universities, grants and outputs – enable systems to manage and track records of those entities, ensuring that the data on which metrics are built is reliable.
* Researchers: ORCID is an international, interdisciplinary, open, not-for-profit organisation that provides researchers with a unique digital identifier, enabling a wide range of improvements to the scholarly communications ecosystem (including research management, grant applications and research discoverability).
- After building a UK consensus that resulted in the endorsement of ORCID by ARMA UK, HEFCE, HESA, RCUK, UCISA, the Wellcome Trust, RLUK and SCONUL, Jisc has enabled further practical exploration of the ORCID system in UK universities through a pilot study.
- Organisations that have adopted ORCID expect to see measurable efficiency improvements within two years of implementation, especially in internal data quality, streamlined publications management and enhanced reporting to funders, with benefits continuing to accrue over the following three to four years.
- In 2015 Jisc negotiated a national consortium arrangement that brings UK institutions reduced ORCID membership costs and enhanced technical support. Nearly 50 universities, as well as the Research Councils, have joined.
* Organisations: Several national bodies (e.g. HESA and the Research Councils) identify universities in different ways, each significant in different scenarios.
- A cross-sector working group convened by Jisc concluded that no single organisational identifier candidate would fulfil all potential use cases. It recommended the international ISNI as the baseline identifier, linking the others together, with both commercial and public registration agencies issuing ISNIs. (ORCID iDs use the same check-digit scheme as ISNIs; see the first sketch after this list.)
- These recommendations have informed the thinking of both the Research Councils and HEFCE. A statement of agreement is being prepared for key organisations, such as RCUK, Jisc and HEFCE, to sign up to.
* Funders: The de facto standard identifier for research funders is FundRef, operated by CrossRef and backed by Elsevier. Since a funder is just another kind of organisation, efforts are being made to align FundRef with ISNI. Jisc is in discussions with CrossRef to exploit this.
* Outputs: The de facto standard identifier for outputs is the Digital Object Identifier (DOI). DOIs are issued for publications by CrossRef (on behalf of publishers), and for datasets by DataCite (via the British Library). Jisc has close working relations with both, on behalf of UK higher education. (The second sketch after this list shows how DOI and FundRef data can be queried programmatically.)
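Both ORCID iDs and ISNIs are sixteen-character identifiers whose final character is an ISO 7064 MOD 11-2 check digit, which lets systems catch mistyped identifiers before they corrupt the records that metrics are built on. A minimal validation sketch in Python (the sample iD is the one used in ORCID's own documentation):

```python
def mod11_2_check_char(base_digits: str) -> str:
    """ISO 7064 MOD 11-2 check character for the first 15 digits
    of an ORCID iD or ISNI."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_identifier(identifier: str) -> bool:
    """Validate a 16-character ORCID iD or ISNI, ignoring hyphens and spaces."""
    chars = identifier.replace("-", "").replace(" ", "").upper()
    if len(chars) != 16 or not chars[:15].isdigit():
        return False
    return mod11_2_check_char(chars[:15]) == chars[15]

print(is_valid_identifier("0000-0002-1825-0097"))  # True: a sample ORCID iD
```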
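Because both CrossRef and DataCite DOIs resolve through doi.org and support content negotiation, and FundRef is exposed through CrossRef's public REST API, joining outputs to funders takes only a few lines. A sketch assuming Python's requests library and the current public endpoints; the DOI and query strings are illustrative only:

```python
import requests

def doi_metadata(doi: str) -> dict:
    """Fetch machine-readable metadata for a DOI via content negotiation.
    Works for both CrossRef (publication) and DataCite (dataset) DOIs."""
    resp = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.citationstyles.csl+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

def search_funders(query: str) -> list:
    """Look up funders in the FundRef registry via CrossRef's REST API."""
    resp = requests.get(
        "https://api.crossref.org/funders",
        params={"query": query},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["message"]["items"]

for funder in search_funders("wellcome"):        # illustrative query
    print(funder["id"], funder["name"])
# meta = doi_metadata("10.1234/example")         # substitute a real DOI
```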
Activity
Activity covers any events related to research, such as an article or dataset being used or cited. As research becomes increasingly digital, more data about such activity is potentially available.
* Research data: Interest in reliable metrics for research data is growing rapidly as academic practice changes and funders and journals increasingly require the sharing of research data. Because data sharing has only recently become mainstream, very little is known about its patterns and the expectations around it, so well-designed indicators will be invaluable for understanding resourcing implications as well as measuring impact.
- Jisc sits at the forefront of this work in the UK, with a download metrics service already at the pilot stage, being tested with numerous institutional and subject-area data repositories. Alongside this, we play a part in a global effort to understand the demand for other forms of data usage metrics.
- Work by a range of international organisations has identified downloads and data citations as the two metrics most in demand from researchers and repository managers. Using metrics such as these would allow researchers to support the case for the impact of their research, institutions to accurately plan future data storage needs, and policy makers to understand the impact of their data policies and make alterations as required.
- Jisc is working in partnership with international standards bodies such as NISO and COUNTER to standardise approaches to these metrics globally. Early Jisc work has focused on download metrics but, as work on citation metrics matures, these will be included in our ongoing project, as will “alternative” metrics, drawing on social media.
* Research publications: The UK has world-leading services counting the usage of academic journals and the articles therein, based on (and contributing to) international standards such as COUNTER.
- One service provides information on journal usage, the other on articles downloaded from UK repositories. In each case, the information is cleaned to remove known distortions such as those caused by web robots (a simplified filtering sketch follows this list).
- In addition, Jisc and the Open University operate the world’s most comprehensive and advanced aggregation of open access research papers, CORE, which provides a range of analytic services based on that corpus. As open access grows, this will become an ever more useful source of data to support research indicators, giving the UK a singular advantage.
- Drawing on several previous projects, Jisc is experimenting with an open, semantic citation dataset derived from CORE, which aims to provide transparent article-level citation indicators (a toy illustration follows this list). This will be followed by a review and evaluation of the validity and reliability of article-level indicators, and of the degree to which they simplify the complexity of research assessment, to inform their use by research managers.
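The cleaning step mentioned under research publications is, at heart, log filtering: COUNTER publishes a list of user-agent patterns for known robots, and download events matching any of them are excluded before counting. A minimal sketch using a hand-picked, illustrative subset of such patterns and a hypothetical event format:

```python
import re

# Illustrative subset of robot user-agent patterns; a real service would
# load the full list that COUNTER publishes.
ROBOT_PATTERNS = [re.compile(p, re.IGNORECASE)
                  for p in (r"bot", r"crawler", r"spider", r"wget", r"curl")]

def is_robot(user_agent: str) -> bool:
    return any(p.search(user_agent) for p in ROBOT_PATTERNS)

# Hypothetical (item, user-agent) download events.
events = [
    ("10.1234/article-a", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"),
    ("10.1234/article-a", "Googlebot/2.1 (+http://www.google.com/bot.html)"),
]

clean = [item for item, ua in events if not is_robot(ua)]
print(len(clean))  # 1: the robot download has been removed
```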
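Similarly, the article-level citation indicators that the CORE experiment aims at can be thought of as counts over open (citing, cited) pairs; because the underlying pairs are open, anyone can recompute and audit the numbers. A toy illustration with made-up DOIs:

```python
from collections import Counter

# Hypothetical open citation data: (citing DOI, cited DOI) pairs.
citations = [
    ("10.1234/a", "10.1234/c"),
    ("10.1234/b", "10.1234/c"),
    ("10.1234/b", "10.1234/d"),
]

# Article-level indicator: how many works cite each DOI.
counts = Counter(cited for _citing, cited in citations)
print(counts.most_common())  # [('10.1234/c', 2), ('10.1234/d', 1)]
```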
Conclusion
The recent Metric Tide report proposed the notion of responsible metrics as a way of framing appropriate uses of quantitative indicators in the governance, management and assessment of research. One of the crucial dimensions of responsible metrics is transparency: data collection and analytical processes, including those behind university rankings, should be kept open and transparent so that those being evaluated can test and verify the results. Furthermore, those processes should be based on data that adequately represents the important dimensions of research, not just those that have traditionally been easy to count. Jisc’s work is providing the digital bedrock on which such processes can be built.