
Developing an Australian phenology monitoring network, Tim Brown, ACEAS Grand 2014

Developing an Australian phenology monitoring network using near-surface remote sensing. Tim Brown and Trevor Keenan. ACEAS Grand 2014

  1. Building an Australian Phenocam Network
     A report from the 2014 ACEAS Phenology Monitoring Working Group
     Tim Brown (Australian National University)
     ACEAS workshop participants – March 2014: Trevor Keenan (co-organizer), Alison Specht, Mike Liddell, Natalia Restrepo Coupe, Ivan Hanigan, Jeff Taylor, Yu Liu, Eva Van Gorsel, Albert Van Dijk, Remko Duursma, Caitlin Moore, Stefan Meier, Grant Thorpe, Andrew Richardson, Oliver Sonnentag
  2. Complex Systems
     • Complex systems solve problems at a rate set by the speed of information exchange, the complexity of their connective networks, and the quality of the information available
  3–10. Research as a complex system
     Current ecological questions are too complex to answer with existing data and methods. Barriers to knowledge discovery in ecology:
     • Rate of data discovery: the tools available and their ease of use
     • How easy it is to find existing data: journal paywalls
     • Ease of using existing data: unavailable or non-interoperable data
     • Data quality: time period monitored; precision; spatial resolution
     • Data analysis: how hard it is to work with data, ask new questions and answer them (compare Wikipedia/Google/smartphones)
     The bottom line: how hard (and costly) is it to answer a given question? Reducing these barriers increases knowledge discovery.
  11–18. Traditionally, field ecology has had very limited capacity
     • Low spatial and temporal resolution data
     • Limited sensors other than weather stations
     • Sampling is manual and subjective
     • Observations are not interoperable: little or no data sharing; often proprietary
     • Repeat experiments to verify results are often done at a different site by different observers
     What % of data from the last century of ecology is available for reuse? (even your own data)
  19–23. “Next-Gen” ecology
     • Large, long-term field projects with standardized instruments and data products (“Big Data”): TERN, NEON, FluxNet
     • NEON: 106 sites around the US, running for 30 years; each site has the same suite of hundreds of types of calibrated sensors, coupled with on-the-ground measurements and annual aerial overflights; billions of data points per year; all data public
     • Requires: public data, data standards and synthesis
     • Enables: data sharing and collaboration; increases our ability to solve complex problems
  24. Big Data is not just for “big” projects: the National Arboretum Phenomic & Environmental Sensor Array (Canberra, ACT)
  25–34. National Arboretum Sensor Array
     • 20-node wireless mesh sensor network (Temp, Hg, PAR, Soil T/H)
     • High-resolution dendrometers on 20 trees
     • Three gigapixel time-lapse cameras: leaf phenology for > 1,000 trees
     • Sequence every tree on site for < $50 per tree
     • 5 full weather stations
     • All data live online in real time
     • Annual LiDAR
     • UAV overflights (monthly?): georectified Google Earth/GIS image layers; 5 mm resolution DEM/3D point cloud of the site in time series
     Total cost: ~$200K
  35–37. Who cares about data standards anyway?
     • Data management and synthesis are essential to doing science in the 21st century
     • New technology lets us measure the world in unprecedented detail, but it creates so much data that we have to organize it better
  38. Phenocams: phenocam at Capitol Reef Field Station, Utah, USA (live view: bit.ly/CRFS2)
  39–41. What is a phenocam?
     • Phenology: the study of periodic plant and animal life-cycle events and how these are influenced by seasonal and interannual variations in climate (source: Wikipedia)
     • Phenocam: a low-cost, automated, consumer digital camera for capturing environmental change in the field (A. Richardson, US PhenoCam Network)
     Richardson et al. (2009), Near-surface remote sensing of spatial and temporal variation in canopy phenology, Ecological Applications, 19(6), 1417–1428.
  42–45. What can you measure with phenocams?
     Color-based vegetation change driven by biological responses to the environment or to disturbance events (drought, herbivory, fire, etc.):
     • Flowering, leaf and bark color changes, canopy color variation, grassland and understory green-up (see the greenness-index sketch below)
     • Phenocams are mostly used for monitoring temperate deciduous forest and for satellite validation
     • Less work has been done on vegetation types that don’t show strong canopy-wide seasonal change, such as Australian dry/wet sclerophyll forests and grasslands
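The color-based measurements above are typically quantified with a green chromatic coordinate, GCC = G / (R + G + B), averaged over a fixed region of interest (ROI) in each image, as in the Richardson et al. phenocam literature. The sketch below is a minimal illustration rather than this project's code; it assumes numpy and Pillow are available, and the file name and ROI coordinates are placeholders.

```python
# Minimal sketch: mean green chromatic coordinate (GCC) over a rectangular
# region of interest (ROI) in one phenocam image. GCC = G / (R + G + B).
# The image path and ROI below are placeholders, not real project values.
import numpy as np
from PIL import Image

def roi_gcc(image_path, roi):
    """Mean GCC over an ROI given as (top, bottom, left, right) pixel bounds."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=float)
    top, bottom, left, right = roi
    r, g, b = (img[top:bottom, left:right, i] for i in range(3))
    total = r + g + b
    total[total == 0] = np.nan            # guard against all-black pixels
    return float(np.nanmean(g / total))

# e.g. the canopy of a single tree occupying part of the frame
# print(roi_gcc("cam01_2014-04-25_12-00.jpg", (100, 400, 50, 500)))
```

Repeating this per image yields a GCC time series for each ROI, which is how canopy green-up and senescence curves are usually derived from phenocam archives.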
  46–55. Phenocam Science
     • Characterize the relationship between environmental drivers and plant response (using data from co-located sensors: CO2 flux, microclimate, etc.)
     • Influence of seasonal plant cycles on ecosystem carbon budgets
     • Scaling of ground-based phenology to satellite (and back)
     • Fractional cover of green vegetation (nadir cameras)
     • Leaf area index (upward-facing cameras)
     • Flowering phenology
     • Snow cover monitoring
     • Capturing rare events, e.g. fire (occurrence and recovery)
     • Identifying outlier individuals for analysis/sequencing
  56. Richardson, Klosterman and Toomey (2013), Near-Surface Sensor-Derived Phenology. In Phenology: An Integrative Environmental Science, Springer Netherlands, 413–430.
  57–59. Fall in Canberra, April 25, 2014. See the full panorama online here: bit.ly/CBR-TS
  60–66. Why a working group? Everyone is installing cameras now because they are cheap
     • US standards are being developed (US PhenoCam and NEON), but there are no global standards and no global network
     • Non-US networks exist, but little of their data is online
     • Many cameras at field sites around the world are not indexed anywhere
     • No standards or data management plans exist for Australia (yet)
     • Non-deciduous forests are not well represented
  67–71. 2014 Workshop: Goals
     • Summarize cameras and existing data sets: preliminary analysis of 15 cameras around Australia, a mix of fixed internet-enabled cameras (flux towers) and “game cameras”. Primary contributors (camera data): Alfredo Huete (Natalia Restrepo Coupe), Jason Beringer (Caitlin Moore), SuperSite network (myself, Mike Liddell)
     • What data products can we create from Australian phenocams?
     • Metadata standards and long-term storage: Ivan Hanigan (SuperSite manager) is the primary lead on this (a sketch of a possible per-image metadata record follows below)
     • Collect all existing camera data: email me if you have cameras
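Because the metadata standard is still an open task, the record below is only an illustration of the kinds of fields such a standard might capture per image; the field names and values are hypothetical, not an agreed TERN/SuperSites or PhenoCam convention.

```python
# Illustrative only: a possible per-image metadata record for a phenocam
# archive. Field names and values are hypothetical, not an agreed standard.
import json

record = {
    "site_id": "example_supersite",            # placeholder site code
    "camera_id": "cam01",
    "camera_make_model": "StarDot NetCam SC",  # placeholder camera model
    "image_file": "example_supersite_cam01_2014-04-25_12-00-00.jpg",
    "timestamp_utc": "2014-04-25T02:00:00Z",
    "camera_timezone": "Australia/Sydney",
    "latitude": -35.0,                         # placeholder coordinates
    "longitude": 149.0,
    "view_azimuth_deg": 180,
    "field_of_view_id": "FOV-001",             # increment whenever the FOV changes
    "license": "CC-BY-4.0",
    "contact": "tim.brown@anu.edu.au",
}

print(json.dumps(record, indent=2, sort_keys=True))
```

A field-of-view identifier of this kind is one simple way to make the later recommendation (“the image is the data; the FOV is your sample plot”) queryable in an archive.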
  72–78. (Preliminary) Results
     • Whole-forest analysis doesn’t yield much, but looking at individual trees is promising
     • Track individual trees or groups of species (requires new tools)
     • Green-up of understory
     • Fire monitoring
     • Grasslands, fields and other open areas
     • We need better data:
       • The imagery is difficult enough to analyse; low-quality images and a moving field of view (FOV) make it harder
       • Timestamps are inconsistent (GMT vs local vs daylight-saving time; see the normalization sketch below)
       • Large gaps in the data
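The timestamp problem noted above is mostly bookkeeping once each camera's clock convention is known. A minimal sketch, assuming Python 3.9+ (zoneinfo) and a hypothetical per-camera timezone table:

```python
# Minimal sketch: convert camera wall-clock timestamps recorded in mixed
# conventions (UTC vs local time with or without daylight saving) onto a
# single UTC axis. Camera IDs and the timezone table are hypothetical.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

CAMERA_TZ = {
    "cam_top_end": ZoneInfo("Australia/Darwin"),    # no daylight saving
    "cam_southeast": ZoneInfo("Australia/Sydney"),  # observes daylight saving
    "cam_flux_logger": timezone.utc,                # logger already records UTC
}

def to_utc(camera_id: str, stamp: str) -> datetime:
    """Parse 'YYYY-MM-DD HH:MM' as recorded by the camera and return UTC."""
    naive = datetime.strptime(stamp, "%Y-%m-%d %H:%M")
    return naive.replace(tzinfo=CAMERA_TZ[camera_id]).astimezone(timezone.utc)

print(to_utc("cam_southeast", "2014-04-25 12:00"))  # 2014-04-25 02:00:00+00:00
print(to_utc("cam_top_end", "2014-04-25 12:00"))    # 2014-04-25 02:30:00+00:00
```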
  79–82. Outcomes: Recommendations
     • The image is the data! The field of view (FOV) is your sample plot: you wouldn’t move your study plot at random every six months
     • We need protocols and standards for hardware, data capture and how data get harvested
     • Pay attention to how equipment is replaced and how FOVs are matched: simple things like how you replace a camera’s batteries can greatly affect data quality (see the FOV-drift check sketch below)
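One way to act on the FOV recommendation is an automated drift check: compare each new image against a stored reference frame and flag the camera if the estimated pixel shift exceeds a tolerance. The sketch below uses phase correlation from scikit-image; it is one possible QC approach, not a working-group standard, and the file names are placeholders.

```python
# Possible QC check (not a working-group standard): estimate how far the
# field of view has shifted relative to a reference frame, e.g. after a
# battery change, using phase correlation. File names are placeholders.
import numpy as np
from skimage.color import rgb2gray
from skimage.io import imread
from skimage.registration import phase_cross_correlation

def fov_shift(reference_path, new_path):
    """Return the (row, col) pixel shift of the new frame vs. the reference."""
    ref = rgb2gray(imread(reference_path))
    new = rgb2gray(imread(new_path))
    shift, error, _ = phase_cross_correlation(ref, new)
    return shift, error

shift, error = fov_shift("cam01_fov_reference.jpg", "cam01_2014-05-01_12-00.jpg")
if np.hypot(*shift) > 5:  # flag drift larger than ~5 pixels (arbitrary tolerance)
    print(f"FOV drift detected: shift={shift}, registration error={error:.3f}")
else:
    print("FOV within tolerance")
```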
  83–89. Outcomes: Moving forward
     • Data standards
     • Tools for managing the images (thousands of images; a harvest sketch follows below)
     • Tools for analysing the data: what data products can we provide from phenocam images? TERN/SuperSites/flux sites are good for this since they have many co-located measurements; such products are less well characterized for Australian environments
     • Persistent storage
     • Results sharing, and tools that can synthesize the wider network’s data and results
     • Collaborate with NEON and the US PhenoCam Network; also get the EU, India, China, etc. on board
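As a concrete example of the image-management task flagged above, the sketch below files raw camera downloads into a consistent site/camera/year/month layout using each image's EXIF timestamp. The directory layout and naming convention are assumptions for illustration only, not an agreed network standard.

```python
# Illustrative only: file raw camera downloads into
# <archive_root>/<site>/<camera>/<YYYY>/<MM>/ using the EXIF DateTime tag.
# The directory layout and naming convention are assumptions, not a standard.
from datetime import datetime
from pathlib import Path
import shutil

from PIL import Image

EXIF_DATETIME = 306  # EXIF "DateTime" tag id in the primary image directory

def harvest(raw_dir, archive_root, site, camera):
    """Copy *.jpg files from raw_dir into a dated archive tree."""
    for src in sorted(Path(raw_dir).glob("*.jpg")):
        stamp = Image.open(src).getexif().get(EXIF_DATETIME)  # e.g. "2014:04:25 12:00:00"
        if not stamp:
            continue  # skip files without an embedded capture time
        taken = datetime.strptime(stamp, "%Y:%m:%d %H:%M:%S")
        dest_dir = Path(archive_root) / site / camera / f"{taken:%Y}" / f"{taken:%m}"
        dest_dir.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest_dir / f"{site}_{camera}_{taken:%Y-%m-%d_%H-%M-%S}.jpg")

# e.g. harvest("downloads/cam01", "/data/phenocam", "national_arboretum", "cam01")
```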
  90–93. Data sharing: embargoes and other challenges
     • TERN, NEON, etc.: the data are designed to be public. This is very important because it gives everyone a level playing field for examining questions together
     • But for many other sources of data there are (or may be) good reasons for it to remain proprietary: data cost money, and people need to get credit for creating quality data sets (initial credit in publications; citation credit for the data itself if you create a good dataset)
     • Providing resources and tools that create standardized data and streamline analysis benefits everyone: private datasets can easily be pushed to the public when they are ready
  94. Thanks to all the workshop participants. Photo: Yu Liu (CERN)
  95. Credits & Thanks
     Justin Borevitz and the Borevitz Lab, ANU; 2014 ACEAS Phenocam Workshop participants
     Funding:
     • ACEAS: Phenocam workshop
     • FNQ camera (TERN)
     • NEON/NSF: US PhenoCam workshop
     • NCRIS: phenocam camera server
     • ANU Major Equipment Grant: National Arboretum sensor array
     Email me your phenocam info: tim.brown@anu.edu.au
     Find me here: bit.ly/Tim_ANU (or Google: Tim Brown ANU)
     Borevitz Lab: borevitzlab.anu.edu.au
