Categories
Research at Risk, Research Data Management, Business Case, Research data metrics

FAIR in Practice, a reality check indeed!

To follow up my previous blog on the FAIR in practice work Jisc is taking on, I’d like to report back on a lively panel session on FAIR principles that took place as part of the Jisc Research Data Management event in York, on 28th June 2017.

Panellists Cameron Neylon, Ingrid Dillo and Ingeborg Verheul

Expert panellists Cameron Neylon (Curtin University, Western Australia), Ingeborg Verheul (SURFsara Netherlands) and Ingrid Dillo (DANS, Netherlands), and around 50 people in the audience took part in a dialogue about the omnipresent FAIR principles.

Opening statements by the panellists set the scene and can (in my words) be summarised as follows: ‘The FAIR principles are not new; they are a re-arrangement of principles already being considered for data and repositories ten years ago’; ‘The FAIR principles are a useful marketing tool: they set an objective on the horizon that everybody wants to work towards, unencumbered by technological detail’; ‘The FAIR principles are presented as the solution but prove very hard to implement in a repository or in data practice, so let’s not celebrate too early.’

Two statements were put before the panel and participants for comment, the first being

The biggest strength of the FAIR principles is that they are well-defined, compact, clear and a concise brand.

In the discussion that followed, panellists and audience put forward various views on the narrative power of FAIR and the usefulness of separating the ultimate goal from the practical work and problematic issues involved in achieving it. FAIR is perceived as a strong brand, but that does not make the underlying issues (‘what is true accessibility?’) any clearer. To many researchers FAIR may feel less scary than ‘open’; but what are the implications if FAIR becomes a substitute for ‘open’? And yes, achieving FAIR is hard, but we said the same about Open Access fifteen years ago, and look where we are now.

The second statement for discussion was:

The biggest weakness of the FAIR principles is the lack of implementation guidance, particularly across disciplines, and the lack of metrics to measure FAIR.

A lively discussion touched on the dependencies between the letters F, A, I and R, and the uncertainty over which of these you would actually measure. And what about openness? Long-term access? Discipline-specific factors? At DANS in the Netherlands a simple measuring tool has been created as a starting point: begin measuring, look at the results and take it from there. Perhaps FAIR is more applicable to data management than to data sharing? A disciplinary approach and perspective was advocated to avoid misunderstanding. Some feel that FAIR is just a framework.

Panel and audience, FAIR in practice session

Much more was said, and the consultants working for Jisc on the ‘FAIR in practice’ project will no doubt incorporate the views and warnings expressed into their analysis of the state of FAIR in the UK. The variety of opinions provided and issues raised certainly underlines the need for a reality check!

In the coming months a range of interviews and group discussions will be held with experts, researchers, policy makers, Jisc staff and other stakeholders. After analysing the outcomes we’ll report back with a factual, useful overview of FAIR in practice and suggestions on how to get the best out of the principles.

Audience at the FAIR session