Can ISO 19157 support current NASA data quality metadata? – Ted Habermann
ISO 19157 provides a powerful framework for describing quality of Earth science datasets. As NASA migrates towards using that standard, it is important to understand whether and how existing data quality content fits into the ISO 19157 model. This talk demonstrates that fit and concludes that ISO 19157 can include all existing content and also includes new capabilities that can be very useful for all kinds of NASA data users.
ISO Metadata Improvements - Questions and Answers – Ted Habermann
The ISO standards for describing geospatial data, services, and other resources are changing. These slides describe a few of these changes in terms of documentation needs and how the new standards address those needs. I presented these slides at a recent webinar, available at https://www.youtube.com/watch?v=un-PtJLclIM&feature=youtu.be
We are interested in developing a standard method for writing ISO TC211 compliant metadata into HDF data files. This presentation shows some initial workflows for this using the HDF Product Designer.
The NASA Earth Science Data and Information System (ESDIS) is migrating documentation for their data and products towards International Standards developed by ISO Technical Committee 211 (ISO/TC211). In order to do this effectively, NASA must understand and participate in the ISO process. This presentation was given at a NASA ISO Seminar during November 2012. It outlines the ISO standards process and describes some extensions to the ISO standards that are being proposed to address ESDIS requirements not addressed in the original standard.
PhD: Maintainability of transformations in evolving MDE ecosystems – Jokin García Pérez
- Co-evolving transformations in response to metamodel evolution
- An adapter-based approach to co-evolve generated SQL in model-to-text transformations
- Testing model-to-text transformations
“Cadastral Maps for Socio-Economic Data Visualization and Integration for Lan... – irjes
The impact of mining and mineral extraction activities on the surrounding land, water, and air can be significant in any operational area. The environmental degradation ranges from localized surface and ground water contamination to the damaging effects of airborne pollutants on the regional ecosystem, which calls for a properly designed geospatial database. Monitoring these environmental impacts requires a user-friendly and cost-effective method to quantify land cover changes over long time periods. Remote sensing techniques, combined with cadastral maps, have become essential for regular monitoring of these environmental hazards in and around mining areas. This paper provides a case study on the use of geospatial techniques for environmental monitoring in mining areas.
Building Spatial Data Infrastructures for Spatial Planning in Africa: Lagos e... – Samuel Dekolo
Lagos is the fastest growing megacity in Sub-Saharan Africa; with its population estimated to double in the first quarter of this century, it is expected to be the third largest urban agglomeration in the world. This growth is not without challenges, as the city is grappling with myriad urban management problems. City planners lack the most important ingredient of land use management: information. In spite of huge investment in spatial data infrastructures at the national and state levels of government, most land use planners in state and local government agencies are unaware of existing geospatial technology portals and fail to unlock the full potential of information and communication technologies. A statewide survey of the spatial data infrastructures of the city’s urban and land use management ministry and agencies reveals their poor state, creating an information void between urban development and intelligent management. The result has been sporadic growth of slums and unplanned settlements, which now account for over 60% of the city. To avoid an impasse, it is necessary to review the level of geospatial technologies used at the local level and recommend effective means of integrating them into the decision-making process. This paper examines the level of geospatial technology and Spatial Data Infrastructure use in spatial planning agencies, and the barriers to implementation, in the 20 local governments of Lagos State, and suggests a way forward.
Software Product Measurement and Analysis in a Continuous Integration Environ... – Gabriel Moreira
Presentation of a paper given at the International Conference ITNG 2010 about a framework for a software internal-quality measurement program with automatic metrics extraction, implemented at a software factory.
Multi-modal sources for predictive modeling using deep learning – Sanghamitra Deb
Using vision-language models: is it possible to prompt them like LLMs? When to use them out of the box and when to pre-train? General multi-modal models and deep learning; machine learning metrics, feature engineering, and setting up an ML problem.
Classification on multi-label dataset using rule mining technique – eSAT Publishing House
IJRET: International Journal of Research in Engineering and Technology is an international peer-reviewed online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching, and research in the fields of Engineering and Technology. We bring together scientists, academicians, field engineers, scholars, and students of related fields of Engineering and Technology.
Building a mind map for test data management.
Overview
1. Test data source
2. Extract or create data
3. Transform data
4. Provision
5. Target
Source: http://debasishbhadra.blogspot.com/2013/12/create-your-own-mindmap-for-test-data.html
If you are translating metadata between dialects, do you know what you are losing? There is a way to identify it and quantitatively characterize the lossiness of the translation.
The HDF Product Designer – Interoperability in the First Mile – Ted Habermann
Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team, for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term interoperability in the first mile means: enabling generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing a team approach to file design as well as easy transfer of best practices as they are being developed. The current state of the tool and the plans for future development will be presented. Constructive input from interested parties is always welcome.
HDF Augmentation: Interoperability in the Last Mile – Ted Habermann
Science data files are generally written to serve well-defined purposes for small science teams. In many cases, the organization of the data and the metadata is designed for custom tools developed and maintained by and for the team. Using these data outside of this context often involves restructuring, re-documenting, or reformatting the data. This expensive and time-consuming process usually prevents data reuse and thus decreases the total life-cycle value of the data considerably. If the data are unique or critically important to solving a particular problem, they can be modified into a more generally usable form, or metadata can be added in order to enable reuse. This augmentation process can be done to enhance data for the intended purpose or for a new purpose, to make the data available to new tools and applications, to make the data more conventional or standard, or to simplify preservation of the data. The HDF Group has addressed augmentation needs in many ways: by adding extra information, by renaming objects or moving them around in the file, by reducing complexity of the organization, and sometimes by hiding data objects that are not understood by specific applications. In some cases these approaches require re-writing the data into new files; in other cases the augmentation can be done externally, without affecting the original file. We will describe and compare several examples of each approach.
The ISO Metadata Standards include the capability to add citations to many kinds of external resources. This is very important for providing complete documentation required to understand and reproduce scientific results.
Communities use many different dialects to document their data. We need to be able to translate between these dialects and to understand how much is lost in translation.
Wikis, Rubrics and Views: An Integrated Approach to Improving Documentation – Ted Habermann
For many years scientists and data managers have focused on creating metadata that supports the discovery of available data. This is important, but once data sets are discovered, users need metadata that supports use and understanding of those data. This talk describes a system developed to support the required metadata improvements using wikis, rubrics, and metadata views. The wikis provide a mechanism for the community to record experiences and lessons learned and to provide high-quality examples. Rubrics provide a mechanism for consistent and clear quantitative evaluation of the completeness of metadata records, and the result displays include integrated links to the wiki. Views provide targeted presentations of the metadata with connections to the wiki, supporting ongoing interactive learning. These tools can be used with metadata from any standard and can facilitate translation of metadata between multiple standards.
The HDF format is the foundation for sharing data in many communities that have created domain-specific conventions on top of HDF. This presentation was given at the Winter meeting of the Earth Science Information Partnership (ESIP).
For many years metadata development activities have focused on developing and sharing metadata for discovering data. This is important. Once data are discovered, metadata supporting use and understanding become important. Efforts to encourage scientists and data providers to create those metadata have had limited success. This talk describes some approaches and tools for supporting the organizational change efforts required to integrate use and understanding metadata into organizational cultures. These approaches are described in terms of the ideas presented in Switch: How to Change Things When Change is Hard.
New data access paradigms support a variety of human and machine access paths with data servers (THREDDS, https://www.unidata.ucar.edu/software/thredds/current/tds/ and Hyrax, http://opendap.org) that support multiple services for a given dataset. We need metadata that can describe those services and unambiguously differentiate between access paths for humans and for machines. The ISO 19115 metadata standard includes service metadata and allows data and services for that data to be described in the same record. I propose that we use the service metadata for machine access and the more traditional distribution information for human access. This talk was presented at the ESIP (esipfed.org) meeting during January 2014.
NASA's Earth Observing System (EOS) archive includes data collected over many years by many satellite instruments. These data are stored in the HDF format, which includes data and metadata. The content of the metadata was examined for compliance with a set of conventions developed by the NASA science community at the beginning of the EOS Project (the HDF-EOS conventions). The initial results show that ~50% of the data files and 76% of the datasets have metadata that allows them to be used easily in standard tools. This talk was presented at the ESIP (esipfed.org) meeting during January 2014.
Science platforms are made up of (at least) four planks: data formats, services, tools and conventions. I focus here on formats and conventions, specifically the HDF5 format, already used in many disciplines, and the Climate-Forecast and HDF-EOS Conventions. Many science disciplines have already agreed on HDF as the preferred format for storing and sharing data. It is well established in high performance computing and supports arbitrary grouping and annotation. Community conventions are critical for useful data on top of the format. The Climate-Forecast (CF) conventions were created for relatively simple gridded data types while the HDF-EOS conventions originally considered more complex data (swaths). Making simple conventions more complex makes adoption more difficult. Community input and the need for stable data processing systems must be balanced in governance of conventions.
Observation of Io’s Resurfacing via Plume Deposition Using Ground-based Adapt... – Sérgio Sacani
Since volcanic activity was first discovered on Io from Voyager images in 1979, changes on Io’s surface have been monitored from both spacecraft and ground-based telescopes. Here, we present the highest spatial resolution images of Io ever obtained from a ground-based telescope. These images, acquired by the SHARK-VIS instrument on the Large Binocular Telescope, show evidence of a major resurfacing event on Io’s trailing hemisphere. When compared to the most recent spacecraft images, the SHARK-VIS images show that a plume deposit from a powerful eruption at Pillan Patera has covered part of the long-lived Pele plume deposit. Although this type of resurfacing event may be common on Io, few have been detected due to the rarity of spacecraft visits and the previously low spatial resolution available from Earth-based telescopes. The SHARK-VIS instrument ushers in a new era of high resolution imaging of Io’s surface using adaptive optics at visible wavelengths.
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V... – Wasswaderrick3
In this book, we use conservation of energy techniques on a fluid element to derive the Modified Bernoulli equation of flow with viscous or friction effects. We derive the general equation of flow/velocity and then from this we derive the Poiseuille flow equation, the transition flow equation, and the turbulent flow equation. In situations where there are no viscous effects, the equation reduces to the Bernoulli equation. From experimental results, we are able to include other terms in the Bernoulli equation. We also look at cases where pressure gradients exist. We use the Modified Bernoulli equation to derive equations of flow rate for pipes of different cross-sectional areas connected together. We also extend our techniques of energy conservation to a sphere falling in a viscous medium under the effect of gravity. We demonstrate Stokes' equation of terminal velocity and the turbulent flow equation. We look at a way of calculating the time taken for a body to fall in a viscous medium. We also look at the general equation of terminal velocity.
Slide 1: Title Slide
Extrachromosomal Inheritance
Slide 2: Introduction to Extrachromosomal Inheritance
Definition: Extrachromosomal inheritance refers to the transmission of genetic material that is not found within the nucleus.
Key Components: Involves genes located in mitochondria, chloroplasts, and plasmids.
Slide 3: Mitochondrial Inheritance
Mitochondria: Organelles responsible for energy production.
Mitochondrial DNA (mtDNA): Circular DNA molecule found in mitochondria.
Inheritance Pattern: Maternally inherited, meaning it is passed from mothers to all their offspring.
Diseases: Examples include Leber’s hereditary optic neuropathy (LHON) and mitochondrial myopathy.
Slide 4: Chloroplast Inheritance
Chloroplasts: Organelles responsible for photosynthesis in plants.
Chloroplast DNA (cpDNA): Circular DNA molecule found in chloroplasts.
Inheritance Pattern: Often maternally inherited in most plants, but can vary in some species.
Examples: Variegation in plants, where leaf color patterns are determined by chloroplast DNA.
Slide 5: Plasmid Inheritance
Plasmids: Small, circular DNA molecules found in bacteria and some eukaryotes.
Features: Can carry antibiotic resistance genes and can be transferred between cells through processes like conjugation.
Significance: Important in biotechnology for gene cloning and genetic engineering.
Slide 6: Mechanisms of Extrachromosomal Inheritance
Non-Mendelian Patterns: Do not follow Mendel’s laws of inheritance.
Cytoplasmic Segregation: During cell division, organelles like mitochondria and chloroplasts are randomly distributed to daughter cells.
Heteroplasmy: Presence of more than one type of organellar genome within a cell, leading to variation in expression.
Slide 7: Examples of Extrachromosomal Inheritance
Four O’clock Plant (Mirabilis jalapa): Shows variegated leaves due to different cpDNA in leaf cells.
Petite Mutants in Yeast: Result from mutations in mitochondrial DNA affecting respiration.
Slide 8: Importance of Extrachromosomal Inheritance
Evolution: Provides insight into the evolution of eukaryotic cells.
Medicine: Understanding mitochondrial inheritance helps in diagnosing and treating mitochondrial diseases.
Agriculture: Chloroplast inheritance can be used in plant breeding and genetic modification.
Slide 9: Recent Research and Advances
Gene Editing: Techniques like CRISPR-Cas9 are being used to edit mitochondrial and chloroplast DNA.
Therapies: Development of mitochondrial replacement therapy (MRT) for preventing mitochondrial diseases.
Slide 10: Conclusion
Summary: Extrachromosomal inheritance involves the transmission of genetic material outside the nucleus and plays a crucial role in genetics, medicine, and biotechnology.
Future Directions: Continued research and technological advancements hold promise for new treatments and applications.
Slide 11: Questions and Discussion
Invite Audience: Open the floor for any questions or further discussion on the topic.
Earliest Galaxies in the JADES Origins Field: Luminosity Function and Cosmic ... – Sérgio Sacani
We characterize the earliest galaxy population in the JADES Origins Field (JOF), the deepest imaging field observed with JWST. We make use of the ancillary Hubble optical images (5 filters spanning 0.4−0.9 µm) and novel JWST images with 14 filters spanning 0.8−5 µm, including 7 medium-band filters, and reaching total exposure times of up to 46 hours per filter. We combine all our data at >2.3 µm to construct an ultradeep image, reaching as deep as ≈31.4 AB mag in the stack and 30.3−31.0 AB mag (5σ, r = 0.1″ circular aperture) in individual filters. We measure photometric redshifts and use robust selection criteria to identify a sample of eight galaxy candidates at redshifts z = 11.5−15. These objects show compact half-light radii of R_1/2 ∼ 50−200 pc, stellar masses of M⋆ ∼ 10^7−10^8 M⊙, and star-formation rates of SFR ∼ 0.1−1 M⊙ yr^−1. Our search finds no candidates at 15 < z < 20, placing upper limits at these redshifts. We develop a forward-modeling approach to infer the properties of the evolving luminosity function, without binning in redshift or luminosity, that marginalizes over the photometric redshift uncertainty of our candidate galaxies and incorporates the impact of non-detections. We find a z = 12 luminosity function in good agreement with prior results, and that the luminosity function normalization and UV luminosity density decline by a factor of ∼2.5 from z = 12 to z = 14. We discuss the possible implications of our results in the context of theoretical models for evolution of the dark matter halo mass function.
Salas, V. (2024) "John of St. Thomas (Poinsot) on the Science of Sacred Theol... – Studia Poinsotiana
I Introduction
II Subalternation and Theology
III Theology and Dogmatic Declarations
IV The Mixed Principles of Theology
V Virtual Revelation: The Unity of Theology
VI Theology as a Natural Science
VII Theology’s Certitude
VIII Conclusion
Notes
Bibliography
All the contents are fully attributable to the author, Doctor Victor Salas. Should you wish to get this text republished, get in touch with the author or the editorial committee of the Studia Poinsotiana. Insofar as possible, we will be happy to broker your contact.
THE IMPORTANCE OF MARTIAN ATMOSPHERE SAMPLE RETURN – Sérgio Sacani
The return of a sample of near-surface atmosphere from Mars would facilitate answers to several first-order science questions surrounding the formation and evolution of the planet. One of the important aspects of terrestrial planet formation in general is the role that primary atmospheres played in influencing the chemistry and structure of the planets and their antecedents. Studies of the martian atmosphere can be used to investigate the role of a primary atmosphere in its history. Atmosphere samples would also inform our understanding of the near-surface chemistry of the planet, and ultimately the prospects for life. High-precision isotopic analyses of constituent gases are needed to address these questions, requiring that the analyses are made on returned samples rather than in situ.
Professional air quality monitoring systems provide immediate, on-site data for analysis, compliance, and decision-making.
Monitor common gases, weather parameters, particulates.
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep... – University of Maribor
Slides from:
11th International Conference on Electrical, Electronics and Computer Engineering (IcETRAN), Niš, 3-6 June 2024
Track: Artificial Intelligence
https://www.etran.rs/2024/en/home-english/
19157 Questions and Answers
1. ISO 19157 – Questions and Answers
Ted Habermann
Director of Earth Science
The HDF Group
thabermann@hdfgroup.org
The ISO data quality metadata standards are evolving. Data quality metadata has moved from ISO 19115 to ISO 19157. Evolution is good.
2. The Big Picture
ISO 19157 is a conceptual model of data quality that was recently approved as an international standard. It combines concepts from three older standards into a unified conceptual model for describing data quality. Many of the principal elements of this conceptual model are abstract and can be implemented in several ways.
3. The Big Picture
ISO 19157 is a conceptual model of data quality that was recently approved as an international standard. It combines concepts from three older standards into a unified conceptual model for describing data quality. Many of the principal elements of this conceptual model are abstract; they can be implemented in several ways. When only the abstract concepts are considered, the model is very simple.
4. Data Quality Scope
“The quality of my data varies in time and space, and different parameters have different quality measures and results.”
ISO quality reports all include descriptions of the temporal and spatial extents and the elements of the dataset that they pertain to. You can say things like:
Between 2001 and 2002 the quality of the data in the northern hemisphere …
or
The data collected by this sensor degraded during June 2011 because…
or
Quality information for this parameter is in this variable…
<<DataType>>
DQ_Scope
+ level : MD_ScopeCode
+ extent [0..1] : EX_Extent
+ levelDescription [0..*] : MD_ScopeDescription
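To make the scope idea concrete, here is a minimal sketch of how such a scope might be encoded in XML. The element names follow the UML above; the namespace prefixes (mdq, gex, gco) and the rendering of DQ_Scope are assumptions based on the ISO 19157-2 / 19115-3 encodings (some published schemas render the class as MD_Scope), so details will vary by toolchain.
<!-- Minimal sketch only: prefixes and element renderings are assumed
     from the ISO 19157-2 / 19115-3 encodings -->
<mdq:scope>
  <mdq:DQ_Scope>
    <!-- level: the kind of resource the quality statement applies to -->
    <mdq:level>
      <mdq:MD_ScopeCode codeList="#MD_ScopeCode" codeListValue="dataset"/>
    </mdq:level>
    <!-- extent: when and where the statement applies -->
    <mdq:extent>
      <gex:EX_Extent>
        <gex:description>
          <gco:CharacterString>Northern hemisphere, 2001–2002</gco:CharacterString>
        </gex:description>
      </gex:EX_Extent>
    </mdq:extent>
  </mdq:DQ_Scope>
</mdq:scope>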
5. Stand Alone Quality Reports
“There are papers and web pages that describe the quality of my data.”
Papers and reports that describe data quality are stand-alone quality reports. Metadata can include brief descriptions of the results (abstracts) and references to any number of these (citations).
DQ_StandaloneQualityReportInformation
+ reportReference : CI_Citation
+ abstract : CharacterString
Abstract: The fire training-set may also have been biased against savanna and savanna woodland fires since their detection is more difficult than in humid, forest environments with cool background temperatures [Malingreau, 1990]. There may, therefore, be an under-sampling of fires in these warmer background environments.
Citation: Malingreau, J.P., 1990, The contribution of remote sensing to the global monitoring of fires in tropical and subtropical ecosystems. In: Fire in Tropical Biota (J.G. Goldammer, editor), Springer Verlag, Berlin: 337-370.
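A hedged sketch of how this abstract/citation pair might be encoded, following the UML above; the mdq/cit/gco prefixes and the property that attaches the report to DQ_DataQuality are assumptions from the ISO 19157-2 / 19115-3 encodings.
<!-- Sketch only: prefixes and attachment point assumed -->
<mdq:standaloneQualityReport>
  <mdq:DQ_StandaloneQualityReportInformation>
    <mdq:reportReference>
      <cit:CI_Citation>
        <cit:title>
          <gco:CharacterString>The contribution of remote sensing to the global monitoring of fires in tropical and subtropical ecosystems</gco:CharacterString>
        </cit:title>
      </cit:CI_Citation>
    </mdq:reportReference>
    <mdq:abstract>
      <gco:CharacterString>The fire training-set may have been biased against savanna fires, under-sampling fires in warmer background environments.</gco:CharacterString>
    </mdq:abstract>
  </mdq:DQ_StandaloneQualityReportInformation>
</mdq:standaloneQualityReport>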
6. What is a Data Quality Element?
A data quality element connects a measure, an evaluation method, and a result. For example:
Measure: QA_PercentMissingData
Method: number of pixels with missing flags / total number of pixels
Result: 15%
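Put together, an encoded element might look roughly like this. DQ_CompletenessOmission is one concrete subclass of the abstract DQ_Element; the prefixes, the unit reference, and the exact result encoding are assumptions, not a definitive serialization.
<!-- Sketch: one concrete DQ_Element carrying the measure and the 15% result -->
<mdq:report>
  <mdq:DQ_CompletenessOmission>
    <mdq:measure>
      <mdq:DQ_MeasureReference>
        <mdq:nameOfMeasure>
          <gco:CharacterString>QA_PercentMissingData</gco:CharacterString>
        </mdq:nameOfMeasure>
      </mdq:DQ_MeasureReference>
    </mdq:measure>
    <mdq:result>
      <mdq:DQ_QuantitativeResult>
        <mdq:value><gco:Record>15</gco:Record></mdq:value>
        <!-- unit URI is a placeholder, not a registered identifier -->
        <mdq:valueUnit xlink:href="https://example.org/uom/percent"/>
      </mdq:DQ_QuantitativeResult>
    </mdq:result>
  </mdq:DQ_CompletenessOmission>
</mdq:report>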
7. What Are Quality Measures?
“My metadata already include data quality measures.”
ECHO includes two types of quality measures.
8. What Are Quality Measures?
“I use consistent Quality Measures across many products.”
QAStats – Standard measures for all products:
QAPercentMissingData – Granule-level % missing data. This attribute can be repeated for individual parameters within a granule.
QAPercentOutOfBoundsData – Granule-level % out-of-bounds data. This attribute can be repeated for individual parameters within a granule.
QAPercentInterpolatedData – Granule-level % interpolated data. This attribute can be repeated for individual parameters within a granule.
QAPercentCloudCover – This attribute is used to characterize the cloud cover amount of a granule. This attribute may be repeated for individual parameters within a granule. (Note: there may be more than one way to define a cloud or its effects within a product containing several parameters; i.e., this attribute may be parameter specific.)
ECHO includes two types of quality measures.
9. What Are Quality Measures?
“I use consistent types of Quality Measure across many products.”
QAFlags – Classes of quality measures with product-specific implementations:
AutomaticQualityFlag – The granule-level flag applying generally to the granule and specifically to parameters at the granule level. When applied to a parameter, the flag refers to the quality of that parameter for the granule (as applicable). The parameters determining whether the flag is set are defined by the developer and documented in the Automatic Quality Flag Explanation.
AutomaticQualityFlagExplanation – A text explanation of the criteria used to set the automatic quality flag, including thresholds or other criteria.
OperationalQualityFlag – The granule-level flag applying both generally to a granule and specifically to parameters at the granule level. When applied to a parameter, the flag refers to the quality of that parameter for the granule (as applicable). The parameters determining whether the flag is set are defined by the developers and documented in the Operational Quality Flag Explanation.
OperationalQualityFlagExplanation – A text explanation of the criteria used to set the operational quality flag, including thresholds or other criteria.
ScienceQualityFlag – Granule-level flag applying to a granule, and specifically to parameters. When applied to a parameter, the flag refers to the quality of that parameter for the granule (as applicable). The parameters determining whether the flag is set are defined by the developers and documented in the Science Quality Flag Explanation.
ScienceQualityFlagExplanation – A text explanation of the criteria used to set the science quality flag, including thresholds or other criteria.
10. Data Quality Measures
“My data quality measures are consistently described in a database.”
ISO 19157 includes a DQ_MeasureReference designed to provide a connection to a detailed description of the quality measure.
<<Abstract>>
DQ_Element
+ measure [0..1] : DQ_MeasureReference
DQ_MeasureReference
+ measureIdentification [0..1] : MD_Identifier
+ nameOfMeasure [0..*] : CharacterString
+ measureDescription [0..1] : CharacterString
(if measureIdentification is not provided, then nameOfMeasure shall be provided)
DQM_Measure
+ measureIdentifier : MD_Identifier
+ name : CharacterString
+ alias [0..*] : CharacterString
+ sourceReference [0..*] : CI_Citation
+ elementName [1..*] : TypeName
+ definition : CharacterString
+ description [0..1] : DQM_Description
+ valueType : TypeName
+ valueStructure [0..1] : DQM_ValueStructure
+ example [0..*] : DQM_Description
DQM_BasicMeasure
+ name : CharacterString
+ definition : CharacterString
+ example [0..1] : DQM_Description
+ valueType : TypeName
<<CodeList>>
DQM_ValueStructure
+ bag
+ set
+ sequence
+ table
+ matrix
+ coverage
DQM_Parameter
+ name : CharacterString
+ definition : CharacterString
+ description [0..1] : DQM_Description
+ valueType : TypeName
+ valueStructure [0..1] : DQM_ValueStructure
DQM_Description
+ textDescription : CharacterString
+ extendedDescription [0..1] : MD_BrowseGraphic
11. Data Quality Measures
“I need to clearly and consistently explain how I measure quality.”
The ISO model for quality measures includes identifiers, definitions, descriptions, references, and illustrations.
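For illustration, a register entry for the measure used earlier might be encoded along the lines below, following the DQM_Measure UML on the previous slide. The dqm and mcc prefixes, the identifier value, and the definition text are all assumptions; this is a sketch, not the standard's normative encoding.
<!-- Sketch of a measure-register entry; prefixes and values assumed -->
<dqm:DQM_Measure>
  <dqm:measureIdentifier>
    <mcc:MD_Identifier>
      <mcc:code><gco:CharacterString>QA_PercentMissingData</gco:CharacterString></mcc:code>
    </mcc:MD_Identifier>
  </dqm:measureIdentifier>
  <dqm:name><gco:CharacterString>Percent missing data</gco:CharacterString></dqm:name>
  <dqm:definition>
    <gco:CharacterString>Number of pixels flagged as missing divided by the total number of pixels, expressed as a percentage.</gco:CharacterString>
  </dqm:definition>
  <dqm:valueType>
    <gco:TypeName><gco:aName><gco:CharacterString>Real</gco:CharacterString></gco:aName></gco:TypeName>
  </dqm:valueType>
</dqm:DQM_Measure>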
12. Modular Data Quality Information
“My data quality information exists in databases or web services.”
Major elements of the 19157 conceptual model are separate components that can be independently connected to the metadata and reused in multiple records, as sketched below:
Results
Measures
Methods
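One way to realize this modularity in XML is to reference shared components by xlink:href instead of repeating them inline. In this sketch the register URLs are hypothetical placeholders, and the property names are assumed from the 19157 UML.
<!-- Sketch: shared measure and method components referenced, not repeated -->
<mdq:DQ_CompletenessOmission>
  <mdq:measure xlink:href="https://example.org/measures/QA_PercentMissingData"/>
  <mdq:evaluationMethod xlink:href="https://example.org/methods/missing-pixel-count"/>
  <mdq:result>
    <mdq:DQ_QuantitativeResult>
      <mdq:value><gco:Record>15</gco:Record></mdq:value>
    </mdq:DQ_QuantitativeResult>
  </mdq:result>
</mdq:DQ_CompletenessOmission>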
13. Data Quality Results
“My metadata currently includes descriptions of the quality of my data.”
These descriptions can be included in 19157 metadata as descriptive reports.
<Quality>
Due to the lack of high resolution data available over the region for 1993-94, it has been hard to validate the product. However the maps of burnt areas correspond well with active fire maps for the region. Where large [>3km] scars are found, the detection is more reliable. In areas of small scars more problems are involved. It is hoped that the 1994-95 data set will cover the whole of the study area and be calibrated by high resolution data.
</Quality>
DQ_DescriptiveResult
+ statement : CharacterString
<gco:CharacterString>
Due to the lack of high resolution data available over the region for 1993-94, it has been hard to validate the product. However the maps of burnt areas correspond well with active fire maps for the region. Where large [>3km] scars are found, the detection is more reliable. In areas of small scars more problems are involved. It is hoped that the 1994-95 data set will cover the whole of the study area and be calibrated by high resolution data.
</gco:CharacterString>
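For completeness, the gco:CharacterString above would sit inside a DQ_DescriptiveResult statement, roughly as sketched here. The class and property names come from the UML on this slide; the mdq prefix and wrapper structure are assumptions.
<!-- Sketch: descriptive text carried as a DQ_DescriptiveResult -->
<mdq:result>
  <mdq:DQ_DescriptiveResult>
    <mdq:statement>
      <gco:CharacterString>Due to the lack of high resolution data available over the region for 1993-94, it has been hard to validate the product. ...</gco:CharacterString>
    </mdq:statement>
  </mdq:DQ_DescriptiveResult>
</mdq:result>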
14. Data Usage
“Users increase our understanding of data quality. We need to keep them in the loop.”
MD_Usage
+ specificUsage : CharacterString
+ usageDateTime [0..1] : DateTime
+ userDeterminedLimitations [0..1] : CharacterString
+ userContactInfo [1..*] : CI_ResponsibleParty
+ response [0..*] : CharacterString
+ additionalDocumentation [0..*] : CI_Citation
+ identifiedIssues [0..1] : CI_Citation
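A sketch of a user-feedback entry following this UML. The class and property names are taken from the slide; the mri/cit prefixes and the example values are assumptions (newer encodings may also render CI_ResponsibleParty as CI_Responsibility).
<!-- Sketch: one usage report; prefixes and values assumed -->
<mri:MD_Usage>
  <mri:specificUsage>
    <gco:CharacterString>Burnt-area mapping over the study region</gco:CharacterString>
  </mri:specificUsage>
  <mri:userDeterminedLimitations>
    <gco:CharacterString>Scars smaller than ~3 km are not reliably detected.</gco:CharacterString>
  </mri:userDeterminedLimitations>
  <mri:userContactInfo>
    <cit:CI_ResponsibleParty>
      <!-- contact details for the user reporting the limitation -->
    </cit:CI_ResponsibleParty>
  </mri:userContactInfo>
  <mri:response>
    <gco:CharacterString>Confirmed by the data provider; added to the known-issues documentation.</gco:CharacterString>
  </mri:response>
</mri:MD_Usage>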
16. Summary
“There are papers and web pages that describe the quality of my data.”
“My data quality information exists in databases or web services.”
“I use consistent Quality Measures across many products.”
“I use consistent types of Quality Measure across many products.”
“I need to clearly and consistently explain how I measure quality.”
“My metadata currently includes descriptions of the quality of my data.”
“Users increase our understanding of data quality. We need to keep them in the loop.”
“The quality of my data varies in time and space, and different parameters have different quality measures and results.”
“Users all over the world need to use and understand the quality of my data.”
18. Acknowledgements
This work was partially supported by contract number NNG10HP02C from NASA. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the author and do not necessarily reflect the views of NASA or The HDF Group.