Research analytics service: From defining the problem to refining the solution

Introduction

In my previous post (Research analytics service: Defining the problems and refining potential solutions) I gave an overview of the project, some background to research analytics at Jisc, and details on the discovery phase for a Research Analytics Service. In this post I’m going to update you on the progress made in the first half of this discovery phase and what we plan to achieve in the next.

Discovery Phase – Defining the Problem

The main purpose of the discovery phase has been to better define the problems in research analytics, working with our members and funders, prior to refining potential solutions that might be developed as a future research analytics service. The problem definition phase is now complete, and we've narrowed our focus to one area to investigate in the solution refining phase.

Throughout this project we have been engaging with UKRI, Research England, the Forum for Responsible Research Metrics (FFRRM) and institutional leaders to:

  • Help us define the problems in the research landscape where research analytics could help
  • Refine them down to smaller, more manageable problems that can act as the starting point for possible solutions
  • Understand other trends and developments in the landscape that may be relevant to this work
  • Clarify the role that Jisc might play in addressing these problems and trends

We employed Research Consulting to look at the broad context of the problem, consult stakeholders and produce case studies from five HEIs. The case studies, which involved detailed interviews and site visits, covered Liverpool Hope University, Middlesex University, the University of Manchester, Ulster University and the Royal Central School of Speech and Drama. The case studies and interviews with stakeholders led to the following conclusions about the level of maturity of research analytics:

  • Research analytics remains immature in most HEIs
  • ‘Research intelligence’ is primarily qualitative in nature
  • Analytics expertise generally resides in the Planning & Management Information functions, and is focussed on teaching and learning
  • Analytics maturity is linked to size and data availability, but also internal leadership
  • Most opportunities (>70%) rely on the use of HEIs' internal data rather than third-party data alone

We’ve also been looking at what Jisc’s role should be in this area as well as perceptions around research analytics. This has led to the following conclusions:

  • There is low awareness and uptake of research analytics at present, but there are clear indicators of emerging interest
  • Funder and HEI priorities differ, and there is no ‘one-size-fits-all’ solution
  • Any Jisc solution will need to complement or co-exist with existing analytics tools and services in use by institutions

User stories

Over 160 user stories were gathered from the case studies, workshops and interviews. These have been extremely helpful in deciding where we should focus our effort in the next phase. An analysis of the data grouped these into themes:

  • Research strategy and planning
  • Researcher development and careers
  • Financial performance and sustainability
  • Collaboration, impact and knowledge exchange
  • Understanding and evaluating research performance
  • Compliance with legal, regulatory or funder requirements
  • Scholarly communication

A graphical representation of this thematic analysis of the user stories is included in my recent presentation at the ARMA study tour.

We made a preliminary assessment of which of these themes could be taken into a solution phase. This assessment combined the level of interest identified within the sector with the potential barriers to developing a solution, taking account of the likely availability of relevant data sources, the specificity of user requirements and the presence of existing commercial solutions.
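For illustration, a prioritisation like this could be expressed as a simple weighted scoring exercise. The sketch below uses entirely hypothetical scores and weights; it shows the shape of such an assessment, not the actual data or criteria weightings used in the discovery phase.

```python
# Illustrative only: hypothetical scores and weights, not the actual
# assessment data used in the discovery phase.
THEMES = {
    # theme: (sector interest, data availability, requirement specificity,
    #         absence of existing commercial solutions), each scored 1-5
    "Research strategy and planning": (5, 4, 3, 3),
    "Researcher development and careers": (4, 3, 4, 4),
    "Understanding and evaluating research performance": (5, 3, 3, 2),
}

WEIGHTS = (0.4, 0.25, 0.15, 0.2)  # hypothetical weighting of the criteria

def priority(scores, weights=WEIGHTS):
    """Weighted sum of criterion scores for one theme."""
    return sum(s * w for s, w in zip(scores, weights))

# Rank the themes from most to least promising.
for theme, scores in sorted(THEMES.items(), key=lambda kv: -priority(kv[1])):
    print(f"{priority(scores):.2f}  {theme}")
```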

The main areas identified as the most promising for Jisc to pursue were researcher development and careers, research strategy and planning (particularly improving application success rates) and evaluating research performance (particularly facilitating external benchmarking).

Design Sprint

Taking all the data collected in the problem definition phase, together with the recommendations on areas to pursue, we ran a design sprint to review the list of possible solutions to be developed in the next phase. We refined the thematic areas down to Grant Applications (including Artificial Intelligence (AI) and Text and Data Mining (TDM) techniques) and Career Development. At this stage it was felt that we should focus on Grant Applications alone, although the other areas remain important.

We also reviewed the user stories for areas that our Analytics Lab team could take forward and develop as analytical dashboards. A number of these clustered around measuring the success rates of grant applications, and this will be the theme of an internally run Analytics Lab.
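As a flavour of what such a dashboard might compute, below is a minimal sketch of deriving success rates from application records. The record structure and sample data are hypothetical, not an actual Jisc or funder schema.

```python
from collections import defaultdict

# Hypothetical application records; field names are illustrative,
# not an actual Jisc or funder schema.
applications = [
    {"institution": "HEI A", "funder": "UKRI", "outcome": "awarded"},
    {"institution": "HEI A", "funder": "UKRI", "outcome": "rejected"},
    {"institution": "HEI B", "funder": "UKRI", "outcome": "awarded"},
    {"institution": "HEI B", "funder": "EU",   "outcome": "rejected"},
]

def success_rates(records, key="institution"):
    """Return awarded/total ratios grouped by the given field."""
    totals = defaultdict(int)
    awarded = defaultdict(int)
    for rec in records:
        totals[rec[key]] += 1
        if rec["outcome"] == "awarded":
            awarded[rec[key]] += 1
    return {k: awarded[k] / totals[k] for k in totals}

print(success_rates(applications))            # by institution
print(success_rates(applications, "funder"))  # by funding source
```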

The ideas related to grant applications that we could explore and develop in the next phase included the following:

  • Automated analysis of grant submissions, with tools to support the submission process.
  • A Text and Data Mining service covering all grant applications received by funders, providing information to help refine future calls and surface insights (a minimal sketch of this idea follows the list).
  • Artificial Intelligence to improve the quality of grant applications, making it easier to match the requirements of a call.
  • A dashboard comparing HEIs' grant application success rates.
  • A breakdown of funding to HEIs from different sources (e.g. UK, EU, international).
  • A knowledge graph, including research outputs, showing current trends and opportunities for innovation.
  • Application analysis, providing information on key topics and players, drawn from grant submissions.
  • Partner matching and consortium building using grant application data.
  • Above all, the main priority in providing grant application metrics is to support institutions with their research strategy and the planning of their research portfolios.
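To make the TDM idea above a little more concrete, here is a minimal sketch of surfacing key topics from grant application text using TF-IDF. This is illustrative only: the sample abstracts are invented, and scikit-learn is an assumed library choice rather than part of any planned Jisc service.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Invented sample abstracts -- stand-ins for real grant application text.
abstracts = [
    "Machine learning methods for analysing clinical trial outcomes.",
    "Community engagement and impact in regional heritage projects.",
    "Deep learning for protein structure prediction in drug discovery.",
]

# Weight terms by TF-IDF to surface the key topics in each application.
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(abstracts)
terms = vectorizer.get_feature_names_out()

for i, row in enumerate(matrix.toarray()):
    top = sorted(zip(terms, row), key=lambda tw: -tw[1])[:3]
    print(f"Application {i + 1}:", ", ".join(t for t, w in top if w > 0))
```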

Next steps: Solution Refining

In this second phase of the discovery process we need to test the hypotheses and assumptions made in the area of grant applications, including how improvements could be made using AI and TDM techniques. The work has been split into three parts:

  • Desk research and defining product features – this will identify the ways stakeholders in HEIs and funding organisations currently analyse grant application data. The findings from this work will feed into a survey and interview questionnaire to gather views from prospective users. This will include initial solution flowcharts to help us narrow down the options for a service, and questions to gauge the appetite for analytics on grant applications.
  • Prototyping and user testing – circulating the survey and interviewing stakeholders, including funders, HEI representatives and experts in AI and TDM, to determine their views and test service prototypes.
  • Solution pitch – the objective of the whole discovery phase is to deliver a solution pitch to Jisc's investment committee for the funds and resources to develop a research analytics service. This will include the evidence collected in this phase and will resemble the elements you'd find in a business case: the results from the desk research, surveys and interviews; the rationale for the service; prototypes; the level of demand from the sector; business models; and the skills required to build the service.

We are interested in hearing from institutions who would like to get involved in this stage of the project. We’d especially like to hear from those willing to be interviewed and to help us test the assumptions and hypotheses of the grant application solution. You can find my contact details here.

Once the discovery phase is complete, I will post the results, including the plans for developing a research analytics service, on this blog.

By Christopher Brown

Product Manager (Research) at Jisc.