OGC Sept 2010 Meta-propagation of uncertainties within workflows
To begin with let us quote the QA4EO (Quality Assurance for Earth Observation)1:
“If the vision of GEOSS is to be achieved, Quality Indicators (QIs) should be ascribed to data and, in particular, to delivered information products, at each stage of the data processing chain - from collection and processing to delivery. A QI should provide sufficient information to allow all users to readily evaluate a product’s suitability for their particular application, i.e. its “fitness for purpose”. To ensure that this process is internationally harmonised and consistent, the QI needs to be based on a documented and quantifiable assessment of evidence demonstrating the level of traceability to internationally agreed (where possible SI) reference standards. Such standards may be manmade, natural or intrinsic in nature. The documented evidence should include a description of the processes used, together with an uncertainty budget (or other appropriate quality performance measure). The guidelines of QA4EO provide a template and guidance on how to achieve this in a harmonised and robust manner.”
For interoperability purposes, each dataset and process registered within EuroGEOSS carries appropriate metadata elements. The metadata description and the semantics attached to each component of a workflow (datasets and processing services) allow these components to be updated or swapped. As the quality of the components varies, however, the quality of the workflow outputs can become unreliable. Given the level of uncertainty in each input dataset and the sensitivity of each processing step, the quality of a workflow and the uncertainty of its outputs can be quantified by error propagation principles.
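As a minimal sketch of the error propagation principle mentioned above: given input uncertainties (from dataset quality metadata) and the sensitivities of a processing step (its partial derivatives), first-order Gaussian propagation yields an uncertainty for the step's output. The `propagate_uncertainty` helper and the NDVI-like example below are illustrative assumptions, not part of any EuroGEOSS specification.

```python
import numpy as np

def propagate_uncertainty(f, x, sigma, eps=1e-6):
    """First-order (Gaussian) error propagation for a scalar workflow step.

    f     : callable mapping an array of inputs to a scalar output
    x     : nominal input values
    sigma : standard uncertainties of the inputs (assumed independent)

    Returns (output, output standard uncertainty) using
    sigma_out^2 = sum_i (df/dx_i)^2 * sigma_i^2.
    """
    x = np.asarray(x, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    y0 = f(x)
    # Finite-difference sensitivities df/dx_i of the processing step
    grads = np.empty_like(x)
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps * max(abs(x[i]), 1.0)
        grads[i] = (f(x + dx) - y0) / dx[i]
    return y0, float(np.sqrt(np.sum((grads * sigma) ** 2)))

# Hypothetical step: an NDVI-like band ratio with per-band uncertainties
ndvi = lambda b: (b[0] - b[1]) / (b[0] + b[1])
value, u = propagate_uncertainty(ndvi, [0.5, 0.1], [0.02, 0.02])
```

Chaining such estimates step by step through a workflow is what produces an uncertainty budget for the final product.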
Reusing a model encapsulated in a scientific workflow implies running the workflow either with the same datasets, though not necessarily from the same sources, or with different datasets, which may not match the scale specified by the workflow. From error propagation principles and the quality metadata of the workflow components, the effect on workflow quality of using datasets from different sources or at different scales can be assessed. As part of the integrated modelling activity, this assessment helps the modeller choose appropriate datasets or refine the workflow model, for example by introducing data assimilation, downscaling, or multiple-scale integration steps within the scientific model and its associated workflow. The workflow quality assessment also helps the modeller swap or refine the processing steps themselves. Under these modelling activities, the workflow is thus seen as the concrete support of a conceptual model, and it evolves as the conceptual model does.
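One way such a comparison between candidate data sources could be carried out is by Monte Carlo propagation: sample each input from a distribution described by the dataset's quality metadata, run the processing step on each sample, and compare the spread of the outputs. The slope-from-DEM step and the uncertainty figures below are hypothetical, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def monte_carlo_uncertainty(f, means, sigmas, n=20_000):
    """Monte Carlo propagation: draw inputs from independent normal
    distributions (parameters taken from quality metadata) and observe
    the spread of the workflow step's output."""
    samples = rng.normal(means, sigmas, size=(n, len(means)))
    out = np.apply_along_axis(f, 1, samples)
    return out.mean(), out.std()

# Hypothetical step: terrain slope from two elevation posts 30 m apart
slope = lambda z: (z[1] - z[0]) / 30.0

# Two candidate DEM sources with different vertical uncertainties
_, u_coarse = monte_carlo_uncertainty(slope, [100.0, 103.0], [2.0, 2.0])
_, u_fine = monte_carlo_uncertainty(slope, [100.0, 103.0], [0.5, 0.5])
# The source yielding the smaller propagated output uncertainty is
# the better fit for purpose for this particular workflow step
```

This kind of automated comparison is what would let a modeller decide whether a swapped-in dataset keeps the workflow outputs within an acceptable uncertainty budget.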
In addition to the quality descriptors defined in ISO 19157, the present document describes the requirements for uncertainty analysis within scientific workflows.