Relevance to themes of the meeting: The abstract is most relevant to the themes surrounding managing, publishing and finding data, in that the generation of standards for field spectroscopy will benefit the searching, sharing and reuse of the data so collected. The formulation of metadata standards also leads to the establishment of better practice for field data collection.
The role of spectroradiometers and related surface-based optical instrumentation is well recognised:
• Increasing availability of spectral data (more instrumentation)
• Moves to continuous monitoring of reflectances
• Increased focus on cal/val
Data from spectroradiometers and other optical instrumentation plays an important role in calibration and validation of image data from remote sensing missions. In Australia and internationally there is increased interest in archives for the exchange and reuse of spectral datasets and reference spectral data. However, there are no national or international standards for in situ spectral measurement or management of such data in environmental applications, without which quality, a critical issue, cannot be universally assessed.

Purpose of the workshop

Data from spectroradiometers and related surface-based optical instrumentation plays an important role in the calibration and validation (cal/val) of image data from remote sensing missions and will play a key role in the validation efforts of the Terrestrial Ecosystem Research Network AusCover Facility (TERN AusCover). Stimulated by TERN and related international initiatives (e.g. FluxNet, SpecNet, NEON), there has been increased interest in archives for the reuse and exchange of spectral datasets and reference spectral data. However, whilst ‘spectral libraries’ or ‘spectral databases’ place an emphasis on organisation, storage and retrieval, they also raise issues of quality and the representativeness of the data they contain for subsequent re-use. Field measurement practices vary and, in the absence of agreed best practice guides, user experience is often a key guide to data quality.
Well documented metadata are fundamental both to the establishment of best practice and to criteria for quality assessment; these metadata document the conditions (meteorological, geometrical, physical) under which the measurements were obtained, details about the object measured, and instrument performance (radiometric and spectral checks), as well as any subsequent processing undertaken on the data.

To date there are no national or international standards for in situ spectral measurement or management of such data in environmental applications, without which quality cannot be universally assessed. The time is right to propose such international standards. This workshop will bring together key Australian and international experts in field spectroscopy, cal/val and data warehousing to assess current practice, establish future best practice, and establish long-term directions for the exchange of bio-optical and related metadata. The key outcome will be an international journal paper which proposes a new, tested standard for international adoption.
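A minimal sketch of how such a metadata record might be structured. The field names and grouping here are illustrative assumptions for discussion, not the proposed standard:

```python
from dataclasses import asdict, dataclass, field

@dataclass
class SpectrumMetadata:
    """Illustrative (non-standard) metadata record for one field spectrum."""
    # Conditions under which the measurement was obtained
    illumination: str = ""        # e.g. "direct sun, clear sky"
    view_geometry: str = ""       # e.g. "nadir, 1 m above canopy"
    meteorology: str = ""         # e.g. "wind < 2 m/s, 24 degC"
    # Details about the object measured
    target: str = ""
    target_photo: str = ""        # path to a photograph, if taken
    # Instrument performance
    instrument: str = ""
    radiometric_check: bool = False
    spectral_check: bool = False
    # Subsequent processing applied to the data
    processing_steps: list = field(default_factory=list)

record = SpectrumMetadata(
    illumination="direct sun, clear sky",
    target="pasture canopy",
    instrument="ASD FieldSpec 3",
    radiometric_check=True,
    processing_steps=["dark-current subtraction", "white-reference ratio"],
)
print(asdict(record))
```

A structure like this makes each of the conditions the workshop identified an explicit, queryable field rather than free text in a lab notebook.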
In short, quality relies less on the spectral data themselves and more on the associated metadata. The existence of extensively documented metadata ultimately determines the long-term usability of spectral data:
• Assists searching and selection
• Assists assessment of suitability for other research projects
• Critical if those data are obtained in the field
Metadata require:
• An organised, logical, consistent structure
• Comprehensive description
• The correct level of detail
• Coverage of both collections and individual data items
The need to acquire high quality spectral signatures and to document the conditions under which the measurements were made is paramount to the use of bio-optical data to address high quality ecosystem science questions. The workshop is aimed at driving best practice in field measurement and at laying the foundations of an international standard for the exchange of such data. The workshop will address the following fundamental questions:
1. What are the key criteria for assessing the quality and robustness of a spectral signature obtained in the field?
2. What are the key components that then lead to the acquisition of high quality spectra in the field?
3. What is the best means by which the spectrum and its metadata can be exchanged to preserve its quality and robustness?
Through four and a half days of presentations, breakout groups and general discussion, the meeting focused initially on agenda setting, then on metadata and informatics solutions to arrive at a synthesis and identification of next steps. Metadata were recognized as the key to long-term usage and sharing of field spectral data. However, standards to ensure quality in data collection are required to facilitate accurate cross comparison of data from different studies; currently there is no international backbone that ensures this. It was recognized that a careful balance needed to be struck between three key factors: the diversity of studies undertaken using field spectroscopy, the diversity of instrumentation used and the need for some strictness in standards to ensure data quality and legacy value. Any standards developed also need to build in flexibility to cope with new innovations in the technology. The meeting was highly successful in forming an outline of best practice to improve data collection in the field. Through breakout sessions, the group began the identification of core metadata requirements for a number of different applications (soils, underwater spectra and vegetation). A variety of methods to both exchange and store spectral data were presented and discussed as were novel tools to assist in summarising the completeness and quality of such datasets. It was recognized that tools are required to help ease the burden of input of metadata. The role of peer review in determining quality was also discussed. The need for care in the preparation of field protocols and of recording data in the field was widely recognized.
Along with the identification of core metadata elements to establish best-practice quality assessment, a spectral database system based on the already operational SPECCHIO system was proposed to meet international objectives. The planned outcomes are being implemented through ANDS sponsorship to enhance SPECCHIO to ensure the long-term storage of data and support scientists in data analysis activities, with the intention of harmonising TERN and other Australian spectral data.
To develop a national system to house spectral libraries and associated metadata:
• Intended to allow the Australian RS community to collate, share and discover existing spectral libraries, and to facilitate the capture of new datasets as they are formed
• In particular, it will provide consistently recorded metadata and a consistent method for publishing, discovering and assessing this information
• Aligns with ANDS objectives by creating descriptions of the spectral libraries and sharing those descriptions in ARDC; there will be an automatic feed from the system to ARDC, and data from the libraries will also be available for reuse by other researchers
- By providing a point of contact and coordination for CSIRO Earth observation capability assessment, inter-agency coordination, science planning, more effective sharing of data, tools and infrastructure, and support in the development of national policy and the pursuit of EO-related proposals under Federal and State funding initiatives;
- And by providing the means to more efficiently coordinate, support and enhance linkages to key international bodies like CEOS, GEO and GEOSS, and to other federal agencies, including DSTO, DIGO, GA, BoM and DCCEE.

The TCP proposal provides a preliminary guide as to how funding may be divided between these three objectives in the first year of the TCP. This profile would be expected to change in the later years of the TCP. Feedback on this funding profile as we develop our suggestions for the science plan during this workshop is welcome.

Now, to what we are hoping to achieve in this workshop. Today we will review the status of Earth observation in a national context, and then review our capabilities and needs across CSIRO from both input and output perspectives. We have 10 Divisions, 4 Flagships and 3 other TCPs represented here today, so I must emphasise that today’s presentations must necessarily be kept brief, but hopefully the information we share today will place us in good stead for the subsequent discussions. Our purpose in being here is to try to develop a draft science plan for the TCP. So when we begin the discussion and break-out sessions, we will be seeking input from you all to:
- Identify areas of need the TCP should address
- Develop project concepts addressing these needs, in terms of developing capability, leveraging infrastructure or developing efficient networks.

With regard to these project concepts, we will ask you to discuss very approximate timelines, project costs and required resources, and what sort of contribution might be made towards these projects from existing Divisional resources.
As I’ve already explained, co-contributions from Divisions will be essential. These proposals will be presented to the Workshop tomorrow. Finally, we will also discuss the relative priority of the project proposals, and the relative split of funds across the three TCP aspects: capability development, leveraging infrastructure, and developing efficient networks.
The challenge to make good measurements in the field:
• Spectral data collections are most often project (campaign) based, obtained for different purposes (unique?)
• Different methods, different instruments
• Of highly variable (unknown) quality
The need for comparison, comparability, evaluation:
• How to store and easily exchange such data
• Implications for data quality and assessment
• Coping with single spectra, nested data from projects, replicates, related targets, campaigns
• Efficiency in metadata entry
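The nesting problem above (campaigns containing targets, targets containing replicate spectra) can be sketched with a simple in-memory layout. The structure and names here are illustrative assumptions, not SPECCHIO's actual data model:

```python
# Hypothetical nesting of field spectra: campaign -> target -> replicate spectra.
campaign = {
    "name": "2012 wetland survey",
    "targets": {
        "sedge_plot_A": [
            {"replicate": 1, "wavelengths_nm": [450, 550, 650],
             "reflectance": [0.03, 0.08, 0.05]},
            {"replicate": 2, "wavelengths_nm": [450, 550, 650],
             "reflectance": [0.04, 0.09, 0.05]},
        ],
    },
}

def mean_reflectance(replicates):
    """Average the replicate spectra of one target, band by band."""
    n = len(replicates)
    bands = zip(*(r["reflectance"] for r in replicates))
    return [sum(band) / n for band in bands]

# Band-wise mean of the two replicate spectra for one target
print(mean_reflectance(campaign["targets"]["sedge_plot_A"]))
```

Even this toy layout shows why a shared schema matters: any cross-study comparison code has to agree on where replicates live and what the band axis means.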
Towards standards for the exchange of field spectral datasets
Tim Malthus | Research Program Leader, CSIRO Land and Water
Laurie Chisholm, Andreas Hueni, Barbara Rasaiah, Simon Jones
CSIRO LAND AND WATER
TERN Symposium 2013
Australian context
3 | Algorithms for water quality | Tim Malthus
Context
• The challenge to make good measurements in the field
• Data produced are often of high value but not easily discoverable
• Needs for data preservation, legacy value, lineage
• There is increased interest in the storage, reuse and exchange of spectral data, but users need reassurance on data quality and representativeness
• National or international standards are lacking
Metadata
• Critical if data are obtained in the field
• The key to determining quality in field spectroscopic data
• The existence of extensively documented metadata ultimately determines long-term usability and quality
• Assists searching and selection
• Assists assessment of suitability for other research projects
Four strands
• TERN AusCover
• ACEAS Workshop
• PhD research, RMIT
• Australian National Data Service DC10 project
Bio-optical data: Best practice and legacy datasets workshop
18 – 22 June 2012, University of Queensland, Brisbane
Report: http://www.aceas.org.au
Aims of the Workshop
• To drive best practice in field measurement and to lay the foundations of an international standard for the exchange of spectral datasets.
1. What are the key criteria for assessing the quality and robustness of a spectral signature obtained in the field?
2. What are the key components that then lead to the acquisition of high quality spectra in the field?
3. What is the best means by which the spectrum and its metadata can be exchanged to preserve its quality and robustness?
Key conclusions
• The role of metadata
• Standards to ensure quality in data collection are required to facilitate accurate cross comparison of data from different studies
• Standards developed need to build in flexibility to cope with:
  • The diversity of studies undertaken using field spectroscopy
  • The diversity of instrumentation used
  • New innovations in the technology
• Thorough discussion on the methods to both exchange and store spectral data
• Tools are required to help ease the burden of input of metadata
• The need for care in the preparation of field protocols and of recording data in the field
Measuring quality
• One approach is quality analysis algorithms to quantify the completeness and quality of a metadata set, for the purpose of informing a data user’s decision to use it
• Amalgamation of data completeness + quality assurance
[Diagram: critical metadata fields (highest impact), non-critical metadata fields (lowest impact) and quality-assurance errors (mid impact) are combined into an optimal/suboptimal quality measure]
Metadata completeness
• Critical metadata: 100% required — general project information, campaign metadata, instrument, viewing geometry, illumination information, …
• Non-critical metadata: (variable) — additional campaign-specific metadata that would enhance re-usability and fusibility with other datasets, e.g. vegetation, mineral exploration, substratum targets, …
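The amalgamation of completeness and quality assurance described above could look something like the following sketch. The field lists, weights and the optimal/suboptimal rule are assumptions for illustration, not the algorithm developed in the PhD research:

```python
def metadata_quality(record, critical, noncritical):
    """Toy metadata-completeness score: all critical fields must be present
    (100%); non-critical fields contribute a variable fraction.  The 0.7/0.3
    weighting is an arbitrary illustrative choice, not an agreed standard."""
    def filled(fields):
        return sum(1 for f in fields if record.get(f))

    critical_ok = filled(critical) == len(critical)
    noncritical_frac = filled(noncritical) / len(noncritical) if noncritical else 1.0
    # A record with any critical field missing scores zero outright.
    score = (0.7 + 0.3 * noncritical_frac) if critical_ok else 0.0
    return {"optimal": critical_ok and noncritical_frac == 1.0, "score": score}

# Critical fields filled, the one non-critical field (photo) empty:
rec = {"instrument": "ASD", "illumination": "direct sun", "photo": ""}
print(metadata_quality(rec,
                       critical=["instrument", "illumination"],
                       noncritical=["photo"]))
```

A score like this is what could let a prospective re-user rank candidate spectra in a library before reading any documentation in detail.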
Outcomes
• With modifications, SPECCHIO should be proposed as the international tool for storage and exchange of spectral datasets
• Draft minimum metadata requirements for key applications (soils, vegetation, aquatic habitats)
• Novel tools to assist in summarising the completeness and quality of spectral datasets
• Outlines of best practice to improve data collection in the field
ANDS Data Capture Project DC-10
Establishing a Spectral Database for the Australian Remote Sensing Community
Aim: to develop a national spectral information system to house spectral libraries and associated metadata, providing a consistent research support platform.
• collation, sharing, publishing collections;
• facilitate capture of new spectroscopy data;
• enhance discoverability
This project is supported by the Australian National Data Service (ANDS)
SPECCHIO (v3)
System architecture reconfigured to a web-based application, yet allowing administrative control via desktop. Functionality enhancements based upon extensive community consultation.
DC-10 Status
• Upgrade of the SPECCHIO system — Nov 2012
• User testing, consultation — April/May 2013
• Development, deployment — May/July 2013
The enhanced SPECCHIO software will move to open source, hosted by UoW.
Thank you
CSIRO Land and Water
Tim Malthus - Research Program Leader
t +61 2 6246 5732
E email@example.com
www.csiro.au
Spectral Information System — appendix
[Figure: geometry of incident (dE) and reflected (dL) beams for the reflectance factor, from Nicodemus, F.E., et al., 1977. Geometrical Considerations and Nomenclature for Reflectance. Institute for Basic Standards, National Bureau of Standards, Washington D.C., USA, 67 pp.]
Possible Ontology of Spectral Information Systems (SIS) in Australia
[Diagram: single-user SPECCHIO installations (e.g. on a field laptop) exchange XML over intranet/internet with an internal SPECCHIO server at CSIRO, a central SPECCHIO server at TERN and a server at ANDS, serving single users, CSIRO users and the wider spectroscopy community]
General Dataflow, Scientific Data Analysis and Processing
[Diagram: spectroradiometer data are ingested from the file system into the SPECCHIO spectral database (RDBMS) via the SPECCHIO Java environment, which provides metadata editing, basic processing and visualisation, plus data export back to the file system; scientific programming environments (Matlab et al.) access data and processing components for analysis through the SPECCHIO API]
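The ontology slide shows spectra moving between installations as XML. A minimal sketch of what such an exchange payload might look like follows; the element and attribute names are hypothetical illustrations, not SPECCHIO's real schema:

```python
import xml.etree.ElementTree as ET

def spectrum_to_xml(wavelengths, values, metadata):
    """Serialise one spectrum and its metadata to an XML string for exchange.
    Element names ("spectrum", "point", etc.) are illustrative assumptions."""
    root = ET.Element("spectrum")
    meta = ET.SubElement(root, "metadata")
    for key, val in metadata.items():
        ET.SubElement(meta, key).text = str(val)
    data = ET.SubElement(root, "data")
    for wl, v in zip(wavelengths, values):
        point = ET.SubElement(data, "point")
        point.set("wavelength_nm", str(wl))
        point.set("reflectance", str(v))
    return ET.tostring(root, encoding="unicode")

xml_text = spectrum_to_xml([450, 550], [0.03, 0.08],
                           {"instrument": "ASD FieldSpec 3"})
print(xml_text)
```

Keeping metadata and measurements in one self-describing document is what lets the spectrum survive the trip from field laptop to central server without losing its context.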
Abstract
• Relevance to themes of the meeting: The abstract is most relevant to the themes surrounding managing, publishing and finding data, in that the generation of standards for field spectroscopy will benefit the searching, sharing and reuse of the data so collected. The formulation of metadata standards also leads to the establishment of better practice for field data collection.
• Notes — the 2013 TERN Symposium abstract guidelines:
  • Demonstration of the power of the network approach – cooperation between TERN facilities, and/or collaboration with stakeholders (researchers, managers, industry, community)
  • Explicit reliance upon hard or soft infrastructure, capabilities or outputs that were not possible/accessible/feasible prior to TERN
  • Clear alignment with one or more of the major TERN themes (carbon, sustainable land use/management, biodiversity, fire, research data management, successful delivery of research infrastructure, capacity-building, understanding variability)
  • Including clear statements of the implications for stakeholders
  • Descriptions of how TERN infrastructure/products/activities are actually being used by stakeholders
But… a number of questions, challenges, issues
• The challenge to make good measurements in the field
  • Has best practice been followed?
• Spectral data collections are most often project (campaign) based, obtained for different purposes
  • single spectra, nested data from projects, replicates, related targets, campaigns
• Need to assess quality of spectra obtained
• Incompatible, often internal, data formats, from different instruments, separated from metadata
• How to store to facilitate exchange
• Efficiency in metadata entry
What determines quality? Suggestions…
• Quality of conditions under which a spectrum was obtained
• Quality of the instrument and its calibration
• Design of the study
• Experience of the user
• Visual quality of the spectrum obtained
• Quality of the documentation obtained with the measurement, including information on the properties of the target (even a photograph)
• Quality flags?
• …
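Some of the suggestions above (notably quality flags and visual quality) lend themselves to automated checks. A sketch of what simple flag logic might look like; the specific checks and thresholds are arbitrary examples, not community-agreed values:

```python
def quality_flags(reflectance, saturation_level=1.0):
    """Illustrative automated checks that could back simple quality flags.
    The saturation threshold of 1.0 is an arbitrary example value."""
    flags = []
    if any(v != v for v in reflectance):          # NaN check (NaN != NaN)
        flags.append("missing_values")
    if any(v < 0 for v in reflectance):
        flags.append("negative_reflectance")
    if any(v >= saturation_level for v in reflectance):
        flags.append("possible_saturation")
    return flags

# A spectrum containing one physically impossible negative value:
print(quality_flags([0.05, -0.01, 0.98]))
```

Flags like these cannot replace the experience of the user, but they can catch the errors that are cheap to detect mechanically and leave the reviewer's attention for the rest.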
Architecture being implemented