Facilitate Open Science Training for European Research
Open Data Strategies and Research Data Realities
Martin Donnelly
Digital Curation Centre
University of Edinburgh
NCP Academy Webinar
31 October 2017
The Digital Curation Centre (DCC)
• UK national centre of expertise in digital preservation
and data management, est. 2004
• Principal audience is the UK higher education sector, but
we increasingly work further afield (continental Europe,
North America, South Africa, Asia…)
• Provide guidance, training, tools (e.g. DMPonline) and
other services on all aspects of research data
management and Open Science
• Tailored consultancy/training
• Organise national and international events and webinars
(International Digital Curation Conference, Research
Data Management Forum)
• Phase 1 (2014-2016): Spread
the Seeds of Open Science and
Open Access
• Creation of Open Science
Taxonomy
• 2000+ training materials,
categorized in the FOSTER
Portal
• More than 100 face-to-face training
events in 28 countries and 25
online courses, totalling more
than 6300 participants
The project
http://fosteropenscience.eu
• Phase 2 (2017-2019): Let the Flowers of Open Science Bloom
• Focus on:
• Training for the practical implementation of Open Science (face to face
and online) including RDM and Open Data
• Developing intermediate/advanced level/discipline-specific training
resources in collaboration with three disciplinary communities (and
related RIs): Life Sciences (ELIXIR), Social Sciences (CESSDA) and
Humanities (DARIAH)
• Update the FOSTER Portal to support moderated learning, badges and
gamification
• In concrete terms:
• 150 new training resources
• Over 50 training events (outcome-oriented, providing participants with
tangible skills) and 20 e-learning courses
• Multi-module Open Science Toolkit
• Trainers Network, Open Science Bootcamp, Open Science Training
Handbook, and more…
The project
http://fosteropenscience.eu
OVERVIEW
1. Context: Open Data and openness in general
2. Benefits of an Open approach, and risks of getting it
wrong
3. Emerging high-level consensus, e.g. H2020 policy and
FAIR data principles
4. Clashes between the ideal world and the real world
5. RDM and Open Data in practice: key points and rules of
thumb, plus reflections on assessing H2020 DMPs
6. Contacts and links
Open Access + Open Data = Open Science
• Openness in research is situated within a context of ever
greater transparency, accessibility and accountability
• As Open Access to publications became normal (if not yet
ubiquitous), the scholarly community turned its attention to the
data which underpins the research outputs, and eventually to
consider it a first-class output in its own right. The development
of the OA and research data management (RDM) agendas are
closely linked as part of a broader trend in research, sometimes
termed ‘Open Science’ or ‘Open Research’
• “The European Commission is now moving beyond open access towards
the more inclusive area of open science. Elements of open science will
gradually feed into the shaping of a policy for Responsible Research and
Innovation and will contribute to the realisation of the European
Research Area and the Innovation Union, the two main flagship
initiatives for research and innovation”
http://ec.europa.eu/research/swafs/index.cfm?pg=policy&lib=science
Growing momentum and ubiquity…
Data management
is a part of good
research practice.
- RCUK Policy and Guidelines on
Governance of Good
Research Conduct
Good practice in RDM
RDM is “the active
management and appraisal
of data over the lifecycle of
scholarly and scientific
interest”
Core activities include:
- Planning and describing data-
related work before it takes place
- Documenting your data (and processing/workflows) so that
others can find and understand it (see the sketch after this slide)
- Choosing open (or at least
standardised) file formats where
possible
- Storing data safely during a project
- Depositing it in a trusted archive
at the end of the research
- Linking publications to the
datasets that underpin them… and
increasingly code/scripts too
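To make the "documenting your data" point above concrete, here is a minimal, illustrative Python sketch that builds a simple manifest (file names, sizes, checksums, free-text descriptions) for a project's data folder; the paths and field names are hypothetical placeholders, not a standard.

```python
# A minimal, illustrative sketch of describing data as it is collected:
# it walks a (hypothetical) data/raw/ folder and writes a manifest with
# file names, sizes, checksums and free-text descriptions. Paths and
# field names are placeholders, not a standard.
import hashlib
import json
from datetime import date
from pathlib import Path

DATA_DIR = Path("data/raw")            # hypothetical location of collected data
MANIFEST = Path("data/MANIFEST.json")

def sha256(path: Path) -> str:
    """Checksum so future users (including you) can verify file integrity."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

records = [
    {
        "file": str(path.relative_to(DATA_DIR)),
        "bytes": path.stat().st_size,
        "sha256": sha256(path),
        "recorded": date.today().isoformat(),
        "description": "TODO: what this file contains and how it was produced",
    }
    for path in sorted(DATA_DIR.rglob("*"))
    if path.is_file()
]

MANIFEST.parent.mkdir(parents=True, exist_ok=True)
MANIFEST.write_text(json.dumps(records, indent=2))
print(f"Described {len(records)} files in {MANIFEST}")
```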
Benefits of Openness
• IMPACT and LONGEVITY: Open data (and publications) receive
more citations, over longer periods
• SPEED: The research process becomes faster
• ACCESSIBILITY: Interested third parties can (where
appropriate) access and build upon publicly-funded research
outputs with minimal barriers to access
• EFFICIENCY: Data collection can be funded once, and used
many times for a variety of purposes
• TRANSPARENCY and QUALITY: The evidence that underpins
research can be made open for anyone to scrutinise, and
attempt to replicate findings. This leads to a more robust
scholarly record and, for example, helps to reduce academic fraud
• DURABILITY: Simply put, fewer important datasets will be lost
Risks of not doing this, or getting this wrong
• LEGAL – sensitive data is protected by law (and
contracts) and needs to be protected
• FINANCIAL – non-compliance with funder policies can
lead to reduced access to income streams
• SCIENTIFIC – potential discoveries may be hidden away
in drawers or on USB sticks
• OPPORTUNITY COST – reduced visibility for research leads to
lost opportunities for collaboration
• QUALITY – the scholarly record becomes less robust
• REPUTATIONAL – responsible data management is
increasingly considered a core element of good scholarly
practice in the 21st century
Sounds good, right?!
• So why don’t we live in an Open Data utopia?
• Five main reasons…
• Issues of ownership / privacy / ethics / security
• Issues around reward and recognition for researchers
• Lack of joined-up thinking within institutions, countries,
internationally… (this is being addressed, slowly but surely)
• Technical/financial/organisational limitations, including the
need for selection and appraisal of data
• And a bonus one… researchers don’t always relate to
the terminology we use!
Emerging global consensus?
• Disciplinary quirks and differences notwithstanding, the
past decade has seen great progress in shared best
practice and common expectations…
• RCUK Common Principles
• National Open Data and Open Science strategies*
• Establishment of the Research Data Alliance
• EC data pilot, leading to adoption of the FAIR Principles
• “As open as possible; as closed as necessary.”
(* DCC and SPARC-Europe are currently revising our joint list
and analysis of national open data/open science policies. Get in
touch if you have anything to add!)
Case study: the European Commission and
FAIR data
• The EC has adopted FORCE11’s ‘FAIR’ approach to
research data management.
• These principles state that “One of the grand challenges
of data-intensive science is to facilitate knowledge
discovery by assisting humans and machines in their
discovery of, access to, integration and analysis of, task-
appropriate scientific data and their associated
algorithms and workflows.”
• To help achieve this, (meta)data should be…
• Findable
• Accessible
• Interoperable
• Reusable
The FAIR Data Principles (1/4)
To be Findable:
F1. (meta)data are assigned a globally unique and
eternally persistent identifier.
F2. data are described with rich metadata.
F3. (meta)data are registered or indexed in a
searchable resource.
F4. metadata specify the data identifier.
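As an illustration of F1 and F2, the sketch below assembles a "rich metadata" record loosely modelled on core DataCite-style properties; the DOI and every other value are placeholders, not an exact schema.

```python
# An illustrative metadata record, loosely modelled on core DataCite-style
# properties. Every value below is a placeholder.
import json

record = {
    # F1: a globally unique, persistent identifier (placeholder DOI)
    "identifier": "10.1234/example-doi",
    "identifierType": "DOI",
    # F2: rich descriptive metadata
    "title": "Survey of widget usage, 2016-2017 (dataset)",
    "creators": [{"name": "Surname, Given", "affiliation": "Example University"}],
    "publisher": "Example Data Repository",
    "publicationYear": 2017,
    "resourceTypeGeneral": "Dataset",
    "subjects": ["open science", "research data management"],
    "description": "What was collected, when, how, and with which instruments.",
    # F4: the metadata record itself carries the data identifier (above)
}

# F3: depositing this record with a repository or registry (e.g. DataCite,
# or a re3data-listed archive) indexes it in a searchable resource.
print(json.dumps(record, indent=2))
```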
The FAIR Data Principles (2/4)
To be Accessible:
A1. (meta)data are retrievable by their
identifier using a standardized communications
protocol.
A1.1. the protocol is open, free, and universally
implementable.
A1.2. the protocol allows for an authentication and
authorization procedure, where necessary.
A2. metadata are accessible, even when the data are
no longer available.
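A1 in practice: the hedged sketch below retrieves metadata for a (placeholder) DOI over plain HTTP using DOI content negotiation, one of the retrieval routes documented at crosscite.org; treat it as an illustration rather than the only option.

```python
# A hedged sketch of A1: (meta)data retrievable by identifier over a
# standard, open protocol (HTTP), via DOI content negotiation. The DOI is
# a placeholder; the media type is one listed in the crosscite.org
# content-negotiation documentation for DataCite-registered DOIs.
import requests

doi = "10.1234/example-doi"  # placeholder identifier
response = requests.get(
    f"https://doi.org/{doi}",
    headers={"Accept": "application/vnd.datacite.datacite+json"},
    timeout=30,
)
response.raise_for_status()
metadata = response.json()
print(metadata.get("titles"), metadata.get("creators"))
```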
The FAIR Data Principles (3/4)
To be Interoperable:
I1. (meta)data use a formal, accessible, shared, and
broadly applicable language for knowledge
representation.
I2. (meta)data use vocabularies that follow FAIR
principles.
I3. (meta)data include qualified references to other
(meta)data.
The FAIR Data Principles (4/4)
To be Re-usable:
R1. meta(data) have a plurality of accurate and
relevant attributes.
R1.1. (meta)data are released with a clear and
accessible data usage license.
R1.2. (meta)data are associated with
their provenance.
R1.3. (meta)data meet domain-relevant community
standards.
H2020 Data Management Plan
• The DMP should include information on:
• the handling of research data during and after the end of the
project
• what data will be collected, processed and/or generated
• which methodology and standards will be applied
• whether data will be shared/made open access, and
• how data will be curated and preserved (including after the end
of the project)
• DMPs are submitted as deliverables – first version due at the
six-month stage
• Template and guidance are given in the Guidelines document
Reflections on assessing H2020 DMPs
• It would be better if everyone followed the same template –
the EC does provide one, but its use isn’t (yet) mandatory
• A DMP doesn’t need to tell everything there is to know about
a project: brevity is a plus!
• Areas of frequent weakness: security (access and storage),
ethical restrictions for data sharing, appraisal of long-term
value/interest, quality assurance processes, costs
• Advice:
• Be clear about the difference between in-project and post-project
data storage and archiving;
• Don’t just regurgitate the H2020 guidelines – reviewers pick up on
that really quickly;
• Try not to confuse publications and data (I have seen projects
describe archived data as ‘gold Open Access’, which doesn’t make
much sense)
Strategies for success: a three-step guide
Step 1. Be clear about who is involved
• RDM is a hybrid activity, involving multiple stakeholder groups…
• The researchers themselves
• Research support personnel
• Partners based in other institutions, funders, data centres, commercial
partners, etc
• No single person does everything, and it makes no sense to duplicate
effort or reinvent wheels
• Data Management Planning (DMP) underpins and pulls together
different strands of data management activities. DMP is the process
of planning, describing and communicating the activities carried
out during the research lifecycle in order to…
• Keep sensitive data safe
• Maximise data’s re-use potential
• Support longer-term preservation
• Data Management Plans are a means of communication, with
contemporaries and future re-users alike
Step 2. Write things down
• In a data management plan / record
• In metadata to describe the data and help others to
understand it
• In workflows and README files
• In version management
• In justifying decisions re. access, embargo, selection
and appraisal… the list can be very long…
Communication is crucial…
…and plans can and do change!
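One possible way to "write things down" in a lightweight, machine-readable form is sketched below: a plain JSON log of data-related decisions (access, embargo, selection and appraisal). The file name and fields are purely illustrative.

```python
# A lightweight, machine-readable way to record data-related decisions
# (access, embargo, selection/appraisal) as the project runs. The file
# name and fields are purely illustrative.
import json
from datetime import date
from pathlib import Path

LOG = Path("data_decisions.json")

def record_decision(topic: str, decision: str, justification: str) -> None:
    """Append one dated, justified decision to a plain JSON log."""
    entries = json.loads(LOG.read_text()) if LOG.exists() else []
    entries.append({
        "date": date.today().isoformat(),
        "topic": topic,            # e.g. "access", "embargo", "appraisal"
        "decision": decision,
        "justification": justification,
    })
    LOG.write_text(json.dumps(entries, indent=2))

record_decision(
    topic="embargo",
    decision="12-month embargo on interview transcripts",
    justification="Consent forms permit sharing only after anonymisation review.",
)
```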
Step 3. Don’t try to do everything yourself
• See Step 1 ;)
A few do’s and don’ts for RDM
• DO: Have a plan for your data
  DON’T: Make it up as you go along
• DO: Keep backups. Make this easy with automated syncing services like Dropbox, provided your data isn’t too sensitive
  DON’T: Carry the only copy around on a memory card, your laptop, your phone, etc
• DO: Describe your data as you collect it. This makes it possible for others to interpret it, and for you to do the same a few years down the line
  DON’T: Leave this till the end. The quality of metadata decreases with time, and the best metadata is created at the moment of data capture
• DO: Save your work in open file formats, where possible, and use accepted metadata standards to enable like-with-like comparison
  DON’T: Invent new ‘standards’ where community norms already exist
• DO: Deposit your data in a data centre or repository, and link it to your publications
  DON’T: Be afraid to ask for help. This will exist both within your institution, and via national / European support organisations
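As a small illustration of the "open file formats" advice above, the hedged sketch below converts a spreadsheet to CSV with pandas; it assumes pandas and an Excel reader such as openpyxl are installed, and the file names are placeholders.

```python
# A hedged sketch only: converting a spreadsheet to an open format (CSV)
# with pandas. Assumes pandas plus an Excel reader (e.g. openpyxl) are
# installed; the file names are placeholders.
import pandas as pd

df = pd.read_excel("survey_responses.xlsx")     # source in a proprietary format
df.to_csv("survey_responses.csv", index=False)  # open, widely readable copy
print(f"Wrote {len(df)} rows to survey_responses.csv")
```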
RDM / Open Data in practice: key points
1. Understand your funder’s policies (and perhaps national policy
initiatives – see recent SPARC-Europe reports)
2. Create a data management plan (e.g. with DMPonline)
3. Decide which data to preserve (e.g. using the DCC How-To
guide and checklist, “Five Steps to Decide what Data to Keep”)
4. Identify a long-term home for your data (e.g. via re3data.org)
5. Link your data to your publications with a persistent identifier
(e.g. via DataCite)
• N.B. Many archives, including Zenodo, will do this for you
6. Investigate EU infrastructure services and resources
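To illustrate points 4 and 5, here is a hedged sketch of depositing a dataset in Zenodo and linking it to the underpinning publication, based on Zenodo's publicly documented REST API; the access token, file name and DOIs are placeholders, so check the current Zenodo developer documentation before relying on it.

```python
# A hedged sketch of depositing a dataset in Zenodo and linking it to the
# publication it underpins, based on Zenodo's publicly documented REST API
# (https://developers.zenodo.org). The access token, file name and DOIs are
# placeholders; field names may change, so check the current documentation.
import requests

API = "https://zenodo.org/api/deposit/depositions"
params = {"access_token": "YOUR-ZENODO-TOKEN"}  # personal token (placeholder)

# 1. Create an empty deposition
created = requests.post(API, params=params, json={}, timeout=30)
created.raise_for_status()
dep = created.json()

# 2. Upload the data file to the deposition's file bucket
with open("survey_responses.csv", "rb") as fh:
    up = requests.put(f"{dep['links']['bucket']}/survey_responses.csv",
                      data=fh, params=params, timeout=300)
    up.raise_for_status()

# 3. Describe the dataset and link it to the underpinning article by DOI
metadata = {"metadata": {
    "title": "Survey of widget usage, 2016-2017",
    "upload_type": "dataset",
    "description": "Anonymised survey responses underpinning the linked article.",
    "creators": [{"name": "Surname, Given", "affiliation": "Example University"}],
    "license": "cc-by-4.0",  # an open licence identifier accepted by Zenodo
    "related_identifiers": [
        {"relation": "isSupplementTo", "identifier": "10.1234/example-article-doi"}
    ],
}}
meta = requests.put(f"{API}/{dep['id']}", params=params, json=metadata, timeout=30)
meta.raise_for_status()

# 4. Publish: Zenodo mints a DataCite DOI for the dataset
pub = requests.post(f"{API}/{dep['id']}/actions/publish", params=params, timeout=30)
pub.raise_for_status()
print("Dataset DOI:", pub.json()["doi"])
```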
And finally, a few RDM rules of thumb
• Without intervention, data + time = no data
• See Vines et al. (2014) on the rapid decline of research data availability with article age
• Prioritise: could anyone die or go to jail?
• Legal issues (e.g. protecting vulnerable subjects) are the most
important
• Storage is not the same as management
• Think of data as plants and the servers as a greenhouse
• The plants still need to be fed, watered, pruned, etc… and
sometimes disposed of
• Management is not the same as sharing
• Not all data should be shared
• Approach: “As open as possible, as closed as necessary”
• Remember that plans are just that – they are not contracts!
Contact details
• For more information about the
FOSTER project:
• Website: www.fosteropenscience.eu
• Principal investigator: Eloy Rodrigues
(eloy@sdum.uminho.pt)
• General enquiries: Gwen Franck
(gwen.franck@eifl.net)
• Twitter: @fosterscience
• My contact details:
• Email: martin.donnelly@ed.ac.uk
• Twitter: @mkdDCC
• Slideshare: http://www.slideshare.net/martindonnelly
This work is licensed under the
Creative Commons Attribution
2.5 UK: Scotland License.


Editor's Notes

  • #4 The FOSTER project – what and how. FOSTER’s training strategy uses a combination of methods and activities, from face-to-face training to e-learning, blended and self-learning, as well as the dissemination of training materials/contents/curricula via a dedicated training portal, plus a helpdesk. Face-to-face training targets graduate schools in European universities and will in particular train trainers/teachers/multipliers who can conduct further training and dissemination activities in their institution, country and disciplinary community. FOSTER combines experiences and materials to showcase best practices, setting the scene for an active learning and teaching community for open access practices across Europe. The main outcomes of the project are: the FOSTER portal to host training courses and curricula; facilitating the organisation of FOSTER training events and the creation of training content across Europe; and identifying existing content that can be reused in the training activities, developing/creating/enhancing content if/where needed.
  • #5 Partners: University of Minho, University of Göttingen, Open University, Stichting eIFL.net, Digital Curation Centre (University of Edinburgh and University of Glasgow), Danmarks Tekniske Universitet, Stichting LIBER, Spanish National Research Council, GESIS – Leibniz Institute for the Social Sciences, Centre for Genomic Regulation. Associated partners: DARIAH EU, TIB Hannover
  • #6 Context: Open Knowledge Foundation, Creative Commons, G8 statement. Open Science principles are an essential part of knowledge creation, sharing and innovation. They directly support researchers’ need for greater impact and optimum dissemination of research, while also enabling the engagement of citizen scientists and society at large on societal challenges. FOSTER aims to set in place sustainable mechanisms for EU researchers to integrate Open Science in their daily workflow, supporting researchers in optimising their research visibility and impact and facilitating the adoption of EU open access policies.
  • #7 Intro: Open Science, Open Research, Science 2.0. Who’s involved? OS is a horizontal topic, relevant to all stakeholders of the research cycle; learning objectives and methods must address the various needs of each group.
  • #10 Note that even if data is not suitable for sharing/publication, it still needs active management!
  • #13 Global north perhaps – ref. South African webinars last week, which were an eye opener
  • #18 Note that the European Commission has established an Expert Group on Turning FAIR Data into Reality (E03464) which will run until Spring 2018.
  • #25 A DMP is a basic statement of how you will create, manage, share and preserve your data. Funders expect decisions to be justified, particularly where they are not in line with their policy (e.g. limits on data sharing).