INSPIRE 2014 conference


Exploring crowdsourcing and citizen science within the context of EU INSPIRE directive


  1. Crowdsourcing, Citizen Science & INSPIRE – Muki Haklay & Claire Ellul, Extreme Citizen Science (ExCiteS) research group, UCL. @mhaklay / @UCL_ExCiteS
  2. Outline • Three eras of environmental information: – By experts, for experts (1969-1992) – By experts, for experts & the public (1992-2012) – By experts & the public, for experts & the public (2012 on) • Crowdsourced geographic information & citizen science • Challenges within the INSPIRE framework – case study
  3. First era: 1969-[1987-92] [diagram: information flows between experts, decision makers and the public]
  4. First era: 1969-[1987-92] • Experts responsible for creating environmental information and using it to advise government • Top-down attitude to environmental decision making • ‘Information Deficit’ model towards the public • Environmental information by experts, for experts
  5. Second era: 1992 – [2005-12]
  6. Second era: 1992 – [2005-12] • Rio Principle 10, Aarhus Convention • Public access to environmental information is a prerequisite to participation; civil society organisations as intermediaries • The Web as the dissemination medium • Information by experts, for experts and the public (but in expert form)
  7. Third era: since 2012
  8. Third era & INSPIRE
  9. Crowdsourcing / VGI, 2006 (image: Nick Black)
  10. Mapping parties (image: Nick Black)
  11. 2014
  12. (image: Billy Brown)
  13. More than maps • Prof. Jacquie McGlade, head of the European Environment Agency, 2008 (Aarhus + 10): ‘Often the best information comes from those who are closest to it, and it is important we harness this local knowledge if we are to tackle climate change adequately… people are encouraged to give their own opinion on the quality of the beach and water, to supplement the official information.’
  14. EEA WaterWatch, 2008
  15. Citizen science • While citizen science has a long history, new forms have emerged in recent decades, facilitated by the web – ‘citizen cyberscience’ • Types: – biodiversity/conservation observation recording; – volunteer computing; – volunteer thinking; – Do It Yourself (DIY) science; – community/civic science. See Haklay, M., 2013, Citizen Science and Volunteered Geographic Information – overview and typology of participation, in Crowdsourcing Geographic Knowledge
  16. More information at
  17. Air Quality (source: West Wiltshire, 2008)
  18. 2008
  19. Mapping for Change, 2012
  20. June 2012
  21. June 2013
  22. EEA Work Programme 2014-18 • As part of Strategic Area 3 activities: ‘to widen and deepen the European knowledge base by developing communities of practice and engaging in partnerships with stakeholders beyond Eionet, such as business and research communities, Civil Society Organisations (CSO), and initiatives concerning lay, local and traditional knowledge and citizen science’
  23. Data Quality Assurance • Crowdsourcing – the number of people that edited the information • Social – gatekeepers and moderators • Geographic – broader geographic knowledge • Domain knowledge – the knowledge domain of the information • Instrumental observation – technology-based calibration • Process oriented – following a procedure
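The quality-assurance mechanisms listed on this slide could, in principle, be combined into a single indicator. The sketch below is purely illustrative (not from the presentation): the function, its field names, and the weights are invented, and a real scheme would need domain-specific calibration.

```python
# Illustrative only: combine some of the QA signals from the slide into a crude
# 0..1 confidence score for a crowdsourced observation. Weights are arbitrary.

def confidence_score(editors: int, moderator_approved: bool,
                     calibrated_instrument: bool, followed_protocol: bool) -> float:
    """Return a rough 0..1 confidence estimate from simple QA signals."""
    score = 0.0
    # Crowdsourcing: more independent editors -> more trust (capped at 5).
    score += min(editors, 5) / 5 * 0.4
    # Social: a gatekeeper/moderator has reviewed the record.
    score += 0.2 if moderator_approved else 0.0
    # Instrumental observation: the sensor was calibrated.
    score += 0.2 if calibrated_instrument else 0.0
    # Process oriented: a defined procedure was followed.
    score += 0.2 if followed_protocol else 0.0
    return round(score, 2)
```

A record edited by five people, moderated, instrument-calibrated and protocol-driven would score 1.0 under these assumed weights; an unreviewed single-editor reading would score far lower.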
  24. Citizen Science & Metadata • The challenge… – Increasing creation and user base of spatial data – open data movement, FOSS4G software – Lack of expertise – users may come from a variety of backgrounds, and don’t have GIS training or an understanding of spatial data quality aspects • Limitations (production) – Metadata standards are producer-centric: complex; no guidelines as to the amount of detail required; difficult to understand; held in non-standard forms – e.g. PDF, website, wiki
  25. Citizen Science & Metadata • Limitations (data and metadata de-coupled) – Metadata not updated automatically when data changes – Metadata capture not integrated into the workflow – Some citizen science projects do capture metadata, but by accident rather than design, to meet a specific research aim – Metadata not used by end users of data, with a consequent lack of understanding of data quality & fitness for purpose • Limitations (use) – Metadata presentation not use-focused: what do people need to know to re-use data and combine it with other sources? – Does not keep track of derived data, or of record-, attribute- or geometry-level updates. But is this level of detail useful?
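The de-coupling problem above – metadata drifting out of sync with the data – can be sketched in code. This is a hypothetical toy, not part of the presentation: the `ObservationSet` class and its metadata fields are invented to show the idea of refreshing metadata in the same step that changes the data.

```python
# Sketch of "integrate metadata capture into the workflow": a tiny dataset
# wrapper (hypothetical) that refreshes its own metadata on every update,
# so data and metadata cannot drift apart.
from datetime import datetime, timezone

class ObservationSet:
    def __init__(self, title: str):
        self.records = []          # each record: (lon, lat, value)
        self.metadata = {"title": title, "record_count": 0,
                         "bbox": None, "last_updated": None}

    def add(self, lon: float, lat: float, value: float) -> None:
        self.records.append((lon, lat, value))
        self._refresh_metadata()   # metadata updated in the same step as the data

    def _refresh_metadata(self) -> None:
        lons = [r[0] for r in self.records]
        lats = [r[1] for r in self.records]
        self.metadata.update(
            record_count=len(self.records),
            bbox=(min(lons), min(lats), max(lons), max(lats)),
            last_updated=datetime.now(timezone.utc).isoformat(),
        )
```

A production system would map such fields onto a real standard (e.g. ISO 19115 elements), but the design point is the same: capture metadata as a side effect of the workflow rather than as a separate document.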
  26. Case Study – Noise Data • Scenario: you are an environmental consultant wanting noise information about Deptford in South London for a project in the area. You have no specific GIS expertise or training • You identify three maps of noise – all online – and ideally you’d like to re-use the data to save capturing more • Do you choose one, or take information from many? How do you make the choice?
  27. Dataset A
  28. Dataset B
  29. Dataset C
  30. Integrating Noise Data • Just by looking at the maps: – Is there data for the areas I am interested in? • There is some data for Deptford in all three maps – Does the data cover the entire area of interest, or are there gaps in the data? • Maps C and A both have gaps – What is the dB(A) range covered by the maps – does it cover really loud noise? • Yes, all datasets cover noise over 70 dB(A), although the resolution differs (but what about the underlying data?)
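The visual checks on this slide amount to a fitness-for-purpose filter over dataset descriptions. A minimal sketch, assuming invented dataset records (the attribute names and values below are illustrative, not the real metadata of datasets A, B and C):

```python
# Hypothetical fitness-for-purpose check: does a dataset's description claim to
# cover the study area and the required noise range? Attribute values invented.

def fit_for_purpose(ds: dict, area: str, min_db: float) -> bool:
    """True if the dataset covers the area and reaches the required dB(A) level."""
    return area in ds["areas_covered"] and ds["max_db"] >= min_db

datasets = {
    "A": {"areas_covered": ["Deptford"], "max_db": 75, "has_gaps": True},
    "B": {"areas_covered": ["Deptford"], "max_db": 80, "has_gaps": False},
    "C": {"areas_covered": ["Deptford"], "max_db": 72, "has_gaps": True},
}
candidates = [name for name, ds in datasets.items()
              if fit_for_purpose(ds, "Deptford", 70)]
```

Under these assumed descriptions all three datasets pass the coarse filter, which is exactly the slide's point: the map alone cannot tell you which one to trust, so the hidden metadata matters.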
  31. Integrating Noise Data • Things you can learn from the websites – Dataset B:
  32. Integrating Noise Data • Things you can learn from the websites – Dataset C:
  33. Hidden Information
  34. Crowdsourced Geographic Information in Government (images: Kathmandu Living Labs, Jo Somerfield) • 29 case studies from across the world • Success factors, challenges and lessons
  35. Summary • Citizen-produced environmental information is on the rise and will continue to increase • Characteristics – heterogeneity, temporal & spatial variability, varied sources • Different procedures, organisational structures and practices demand consideration of data management and curation