This document describes a quality assurance workflow authoring tool for citizen science and crowd-sourced data. The tool aims to integrate authoritative and crowd-sourced data by bringing together a structured, standards-based institutional approach with a citizen-focused, timely crowd-sourced approach. The tool uses a BPMN-based workflow to chain OGC Web Processing Services for quality control processes. This allows stakeholders to design customizable QA workflows by selecting from a repository of generic quality control processes.
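The chaining idea can be illustrated with a small sketch: quality-control processes applied one after another to a citizen observation, the way a BPMN sequence would invoke successive WPS processes. The process names and observation fields below are hypothetical illustrations, not COBWEB's actual services.

```python
# Minimal sketch of chaining generic quality-control (QC) processes, in the
# spirit of a BPMN workflow invoking OGC WPS processes in sequence. The
# process names and observation fields are invented for illustration.

def position_check(obs):
    """Flag observations reported outside a plausible bounding box."""
    lon, lat = obs["lon"], obs["lat"]
    # Hypothetical study area, roughly the UK and Ireland.
    obs["position_ok"] = -11.0 <= lon <= 2.0 and 49.0 <= lat <= 61.0
    return obs

def attribute_check(obs):
    """Flag observations missing a species attribute."""
    obs["attribute_ok"] = bool(obs.get("species"))
    return obs

def run_workflow(obs, processes):
    """Execute QC processes in order, as a BPMN sequence chains WPS calls."""
    for process in processes:
        obs = process(obs)
    return obs

result = run_workflow({"lon": -3.7, "lat": 52.4, "species": "Erica tetralix"},
                      [position_check, attribute_check])
print(result["position_ok"], result["attribute_ok"])  # True True
```

Because each process only reads and annotates the observation, stakeholders can reorder or swap processes from a repository without changing the workflow engine, which is the point of the authoring-tool design.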
COBWEB technology platform and future development needs, ISPRA 2016 (COBWEB Project)
COBWEB is a European Commission-funded research project that developed a generic crowdsourcing infrastructure called the Citizen Observatory Web. The COBWEB framework allows for co-design of mobile applications to collect environmental data from citizens. It provides tools for quality assurance of citizen-sourced data and publishing data using open standards. The project is working to open source components of the COBWEB framework and synchronize work with other groups to promote adoption of geospatial standards for citizen science data.
COBWEB: Towards an Optimised Interoperability Framework for Citizen Science (COBWEB Project)
Presented by Ingo Simonis and Rob Atkinson (OGC-Europe) at the COBWEB Summit, a side event of the Open Geospatial Consortium's (OGC) 99th Technical & Planning Committee (TC/PC) Meeting held at University College Dublin, 2016.
This document outlines several key components of an educational technology system including:
1) Moodle and Fedena software for managing the learning management system and student life cycle on AWS cloud infrastructure.
2) Features like online applications, registration, examinations, programs, profiles, and employment assistance for students.
3) Enterprise applications including a portal, organization structure, users, messaging, grievances, HR, documents, and financials.
4) A university connect framework including a campus website, news, forums, and connections to the LMS, CMS, forums and blogs.
Lighting in cinematography is determined by three factors: the light source (whether natural or artificial), the quality of light (ranging from high to low contrast, and soft to hard), and the direction of the light. Proper lighting is crucial for analyzing other cinematography elements such as framing, focal depth, and how the subject is presented in the shot.
On Data Quality Assurance and Conflation Entanglement in Crowdsourcing for En... (Greenapps&web)
Volunteered geographical information (VGI), whether in the context of citizen science, active crowdsourcing, or even passive crowdsourcing, has proven useful in various societal domains such as natural hazards, health status, disease epidemics and biological monitoring. Nonetheless, the variable or unknown quality that results from crowdsourcing settings is still an obstacle to fully integrating these data sources into environmental studies and, potentially, into policy making. The data curation process, in which quality assurance (QA) is needed, is often driven by the direct usability of the collected data within a data conflation or data fusion (DCDF) process, which combines the crowdsourced data into one view, potentially using other data sources as well. Using two examples, namely land cover validation and inundation extent estimation, this paper discusses the close links between QA and DCDF in order to determine whether disentangling them can benefit the understanding of the data curation process and its methodology with respect to crowdsourced data. Far from rejecting the usability quality criterion, the paper advocates decoupling the QA process from the DCDF step as much as possible, while still integrating them within an approach analogous to a Bayesian paradigm.
ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume II-3/W5, 2015
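The Bayesian-style integration that the abstract alludes to can be sketched in a few lines: QA assigns each volunteered report a reliability, and conflation then updates a posterior belief from those weighted reports. The reliabilities and the flooding scenario below are illustrative assumptions, not the paper's actual data.

```python
# Hedged sketch of combining QA outputs (per-report reliabilities) with a
# conflation step via Bayesian updating. All numbers are made up.

def update(prior, report_says_flooded, reliability):
    """One Bayesian update of P(flooded) given a report.

    reliability = P(report is correct), assumed symmetric for both classes.
    """
    if report_says_flooded:
        lik_flooded, lik_dry = reliability, 1 - reliability
    else:
        lik_flooded, lik_dry = 1 - reliability, reliability
    evidence = lik_flooded * prior + lik_dry * (1 - prior)
    return lik_flooded * prior / evidence

p = 0.5  # uninformative prior that a grid cell is inundated
for says_flooded, rel in [(True, 0.9), (True, 0.7), (False, 0.6)]:
    p = update(p, says_flooded, rel)
print(round(p, 3))  # 0.933
```

Keeping the reliability estimation (QA) separate from the update rule (DCDF), as the paper advocates, means either part can be replaced without touching the other.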
Authenticating Location Based Skyline Queries in Mobile Environment (Editor IJCATR)
With the boom in smartphones and mobile devices, location-based services have grown massively in recent years. Outsourcing data processing to cloud service providers has also become common, with the provider serving clients on the data owner's behalf. However, such services cannot always be trusted: they may return dishonest results to clients. Authentication techniques are therefore required to guarantee correct results. In this paper, we study authentication techniques for location-based arbitrary-subspace skyline queries (LASQs), which represent an essential class of LBS applications. We propose a basic Merkle Skyline R-tree method and a novel Partial S4-tree method to authenticate LASQs, allowing clients to verify results themselves even while moving and contacting the server frequently.
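The core verification idea behind Merkle-tree-based authentication can be shown in a generic form: the data owner publishes a signed root digest, and the client recomputes the root from the returned records and compares. This is a plain Merkle root over a flat record list, not the paper's Merkle Skyline R-tree itself; the records are invented.

```python
import hashlib

# Generic Merkle-root sketch of the authentication idea: any tampering with
# the returned records changes the recomputed root, so the client detects it.

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Fold leaf hashes pairwise up to a single root digest."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

records = [b"p1:(3,7)", b"p2:(5,4)", b"p3:(8,2)"]
signed_root = merkle_root(records)          # published/signed by the data owner
assert merkle_root(records) == signed_root  # honest result: verification passes
assert merkle_root([b"p1:(3,7)", b"tampered", b"p3:(8,2)"]) != signed_root
```

In the paper's setting the tree is built over an R-tree index so that only the sibling hashes along the query path, rather than the whole dataset, need to be shipped to the client.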
The COBWEB Summit was held as a side event, chaired by Chris Higgins, at the Open Geospatial Consortium's (OGC) 99th Technical and Planning Committee (TC/PC) Meeting at University College Dublin.
Workflows, provenance and reporting: a lifecycle perspective at BIH 2013, Rome (Carole Goble)
Workflow systems support the design, configuration and execution of repetitive, multi-step pipelines and analytics, well established in many disciplines, notably biology and chemistry, but less so in biodiversity and ecology. From an experimental perspective, workflows are a means to handle the work of accessing an ecosystem of software and platforms, manage data and security, and handle errors. From a reporting perspective, they are a means to accurately document methodology for reproducibility, comparison, exchange and reuse, and to trace the provenance of results for review, credit, workflow interoperability and impact analysis. Workflows operate in an evolving ecosystem and are assemblages of components in that ecosystem; their provenance trails are snapshots of intermediate and final results. Taking a lifecycle perspective, what are the challenges in workflow design and use with different stakeholders? What needs to be tackled in evolution, resilience, and preservation? And what are the “mitigate or adapt” strategies adopted by workflow systems in the face of changes in the ecosystem/environment, for example when tools are deprecated or datasets become inaccessible in the face of funding shortfalls?
Digital supply chain quality management (Martin Geddes)
We've figured out how to send physical goods around the world: aggregate them into containers. We're still struggling with how to do the same for digital goods, which we disaggregate into packets. Here's the answer.
The document introduces COBWEB, a European Commission-funded project that develops a crowdsourcing infrastructure for collecting and analyzing environmental data. It summarizes the goals of the project, its partners which include UNESCO biosphere reserves, methods for co-designing use cases, and the development of quality assurance processes and mobile/web apps. Key components under development include workflows, services, sensor networks, and tools for customizing data collection and ensuring data quality.
Context-aware QoE modelling, measurement, and prediction in mobile computing systems (ieeeprojectschennai)
The document proposes a context-aware Quality of Experience (QoE) modeling method called CaQoEM. CaQoEM uses Bayesian networks and utility theory to measure and predict user QoE under uncertainty, accounting for factors like location, delay, jitter, packet loss and user satisfaction. The method was validated through experiments and simulations, achieving 98.93% accuracy in predicting QoE across different network conditions like handoffs and congestion.
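The combination of Bayesian inference and utility theory that CaQoEM rests on can be sketched with a toy discrete model: infer a distribution over QoE levels from a context observation, then report the expected utility as a MOS-like score. Every probability below is invented for illustration; the paper learns its model from data.

```python
# Toy sketch of the Bayesian-plus-utility idea behind a CaQoEM-style method.
# QoE levels carry MOS-style utilities 1..5; the likelihood table is made up.

QOE_LEVELS = [1, 2, 3, 4, 5]
PRIOR = {q: 0.2 for q in QOE_LEVELS}        # uniform prior over QoE levels

# P(high packet loss observed | QoE level): worse QoE, likelier loss.
P_LOSS_GIVEN_QOE = {1: 0.9, 2: 0.7, 3: 0.4, 4: 0.2, 5: 0.05}

def posterior(prior, likelihood):
    """Bayes rule over a discrete QoE variable given one evidence node."""
    unnorm = {q: prior[q] * likelihood[q] for q in prior}
    z = sum(unnorm.values())
    return {q: v / z for q, v in unnorm.items()}

def expected_qoe(dist):
    """Expected utility of the QoE distribution (a MOS-like score)."""
    return sum(q * p for q, p in dist.items())

post = posterior(PRIOR, P_LOSS_GIVEN_QOE)
print(round(expected_qoe(post), 2))  # 2.02, down from the prior mean of 3.0
```

Observing packet loss shifts probability mass toward the low QoE levels, so the expected score drops; the real method conditions on several context factors (location, delay, jitter) at once.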
An Enhanced Framework for Improving Spatio-Temporal Queries for Global Positi... (IJORCS)
This document proposes a framework to improve the processing of spatio-temporal queries for global positioning systems. The framework employs a new indexing algorithm built on SQL Server 2008 that avoids the overhead of R-Tree indexing. It utilizes dynamic materialized views and an adaptive safe region to reduce communication costs and update loads. Caching is used to enhance performance. The notification engine processes concurrent queries using publish/subscribe to group similar queries. Experiments showed the framework outperformed R-Tree indexing.
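The safe-region mechanism the summary mentions can be sketched simply: the server answers a query and hands the client a region within which the answer cannot change, so the client stays silent until it leaves that region. The circular region and the coordinates below are illustrative assumptions; the framework's actual regions are adaptive.

```python
import math

# Hedged sketch of the safe-region idea for cutting client-server update load.

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

class SafeRegion:
    """A circular region inside which the cached query answer stays valid."""

    def __init__(self, center, radius):
        self.center, self.radius = center, radius

    def needs_update(self, position):
        """True once the client moves outside the region (re-query needed)."""
        return distance(position, self.center) > self.radius

region = SafeRegion(center=(0.0, 0.0), radius=5.0)
print(region.needs_update((3.0, 3.0)))  # False: still inside, no message sent
print(region.needs_update((4.0, 4.0)))  # True: left the region, re-query
```

The fewer clients cross their region boundaries, the fewer location updates and query re-evaluations the server must process, which is where the communication-cost saving comes from.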
Recently, with the increasing deployment of distributed computer systems (DCSs) in networked industrial and manufacturing applications on the World Wide Web (WWW), including service-oriented architectures and QoS-aware Web of Things systems, predicting Web performance has become important. In this paper, we present Web performance prediction over time by forecasting the download time of a Web resource using the Efficient Turning Bands (TB) geostatistical simulation method. Real-life data for the research were obtained from our own website, "Distributed forecasting system", by generating its log files and monitoring a group of Web clients on a connected LAN. We apply a spatio-temporal prediction method to the download time of a particular file, compute forecasts with the Turning Bands method, and then improve forecasting accuracy with the Efficient Turning Bands method, which incorporates a Naive Bayes algorithm. Comparing the two, the Efficient Turning Bands method shows good forecasting quality for Web performance prediction.
This document provides an overview of a web performance boot camp. The goals of the class are to provide an understanding of web performance, empower attendees to identify and resolve performance problems, and demonstrate common tools and techniques. The class structure includes sections on what performance is, performance basics, the MPPC model of web performance, and tools and testing. Key topics that will be covered include metrics like response time, statistical distributions, Little's Law, the response time equation, and the dimensions that impact performance like geography, network, browser/device, and page composition.
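Little's Law, one of the metrics the boot camp covers, is simple enough to verify with a worked example: mean concurrency equals arrival rate times mean response time. The traffic numbers below are invented for illustration.

```python
# Worked example of Little's Law: L = lambda * W.
# L = mean number of requests in flight, lambda = arrival rate,
# W = mean response time. Numbers are illustrative.

arrival_rate = 50.0        # requests per second hitting the site
response_time = 0.2        # mean seconds per request
concurrency = arrival_rate * response_time
print(concurrency)         # 10.0 requests in flight on average

# The law also works in reverse: given observed concurrency and arrival
# rate, it yields the mean response time.
inferred_response_time = concurrency / arrival_rate
print(inferred_response_time)  # 0.2 seconds
```

This is why, at a fixed arrival rate, any growth in observed concurrency on a server directly implies growing response times.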
IJRET: International Journal of Research in Engineering and Technology is an international peer-reviewed online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together Scientists, Academicians, Field Engineers, Scholars and Students of related fields of Engineering and Technology.
On Developing and Operating of Data Elasticity Management Process (Hong-Linh Truong)
The Data-as-a-Service (DaaS) model enables data analytics providers to provision and offer data assets to their consumers. To achieve quality of results for these data assets, we need to enable DaaS elasticity by trading off quality against the cost of resource usage. However, most current work on DaaS focuses on infrastructure elasticity, such as scaling data nodes and virtual machines in and out based on performance and usage, without considering the data assets' quality of results. In this talk, we introduce an elastic data asset model for provisioning data enriched with quality of results. Based on this model, we present techniques to generate and operate a data elasticity management process that is used to monitor, evaluate and enforce expected quality of results. We develop a runtime system to guarantee the quality of the resulting data assets provisioned on-demand, and present several experiments to demonstrate the usefulness of our proposed techniques.
QoE++: Shifting from Ego- to Eco-System? QCMan 2015 Keynote (Tobias Hoßfeld)
QoE++: Shifting from Ego- to Eco-System?
QoE research has advanced significantly in recent years, with a focus on the QoE ego-system. Different facets have been addressed by the research community: subjective user studies to identify QoE influence factors for particular applications such as video streaming; QoE models to capture the effects of those influence factors on concrete applications; and QoE monitoring approaches, both at the end-user site and within the network, to assess QoE during service consumption and to provide means for QoE management and improved QoE.
However, in order to progress in the area of QoE, new research directions have to be taken. There is a need for QoE++. The application of QoE in practice needs to consider the entire QoE eco-system and the stakeholders along the service delivery chain to the end user. In comparison to the traditional QoE ego-system thinking, the QoE eco-system addresses among others the following research topics: in-session vs. global system perspective, short- vs. long-time scales when considering QoE, single vs. multi-user QoE, single vs. concurrent usage of applications and services, user vs. business perspective by addressing all key stakeholder goals.
QoE++ requires (a) to extend current QoE models by the different perspectives of the QoE eco-system including the service provider perspective, (b) to incorporate user behavior as part of the model, (c) and to identify and include relevant internal and external context factors including physical, cultural, social, economic context. QoE++ faces several major challenges.
(1) Can we utilize QoE for network & service management? Or is it more appropriate to consider user engagement or user behavior? Which context factors are relevant, or are such context factors even more important for network & service management, e.g. in order to foresee and react to flash crowds?
(2) How to realize cross-layer optimization between applications and their demands and the network capabilities, and thus a network-wide QoE optimization? Is SDN the right technology to cope with those challenges?
(3) Can we transform QoE into business models, SLAs, etc.? Or is it possible to 'trade' QoE? For example, by offering WiFi sharing at home, a user may get improved service delivery and QoE from their ISP.
(4) Do we understand fundamental models and natural relationships of QoE++? How can we extend existing QoE models to take into account the service provider's perspective? How to quantify QoE fairness? What is the relationship between QoE and user behavior?
Pursuing QoE++ will shift the field from ego- to eco-systems and help answer those questions. In this talk, we will discuss QoE++ and highlight some of the challenges above.
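Challenge (4) asks how to quantify QoE fairness. One proposal from the QoE literature, sketched here under the assumption of MOS scores on a fixed 1-to-5 scale, defines a fairness index F = 1 - 2*sigma / (H - L), where sigma is the standard deviation of per-user scores and [L, H] is the rating scale; F = 1 means all users experience identical QoE.

```python
import statistics

# Hedged sketch of a QoE fairness index across users: F = 1 - 2*sigma/(H - L).
# The scale bounds and score samples are illustrative.

def qoe_fairness(scores, low=1.0, high=5.0):
    """Fairness in [0, 1]: 1 = identical QoE, 0 = maximally unequal QoE."""
    sigma = statistics.pstdev(scores)       # population std. dev. of MOS scores
    return 1.0 - 2.0 * sigma / (high - low)

print(qoe_fairness([4.0, 4.0, 4.0]))  # 1.0: every user has the same QoE
print(qoe_fairness([1.0, 5.0]))       # 0.0: scores at opposite scale ends
```

Unlike throughput-based fairness measures, this index is bounded by the rating scale itself, so it stays comparable across applications that use the same MOS scale.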
The document introduces COBWEB, a research project that develops a crowdsourcing infrastructure for collecting and analyzing environmental data provided by citizens. The project aims to address data quality issues and support policy decisions. It has several pilot sites and partners, including UNESCO biosphere reserves. The framework includes mobile apps, QA processes, and a portal to view and analyze citizen-submitted data. It uses open standards and aims to be customizable for different use cases involving topics like biological monitoring and flooding.
Context, Quality and Relevance: Dependencies and Impacts on RESTful Web Servi... (Hong-Linh Truong)
The document discusses the importance of context and data quality for RESTful web services that provide access to large datasets. It notes that existing approaches do not provide adequate support for specifying or accessing quality and context information at the level of individual data resources. The authors propose conventions and representations to describe quality and context for data resources in RESTful service documentation (WADL) and to access such information through REST APIs, in order to support composition of data-intensive services.
Presented by Barnard Kroon (University College Dublin) at the COBWEB Summit, a side event of the Open Geospatial Consortium's (OGC) 99th Technical & Planning Committee (TC/PC) Meeting held at University College Dublin, 2016.
Presented by Dr. Andreas Matheus, 21st June 2016, during the COBWEB Summit at the Open Geospatial Consortium's (OGC) 99th Technical & Planning Committee (TC/PC) Meeting.
Workflows, provenance and reporting: a lifecycle perspective at BIH 2013, RomeCarole Goble
Workflow systems support the design, configuration and execution of repetitive, multi-step pipelines and analytics, well established in many disciplines, notably biology and chemistry, but less so in biodiversity and ecology. From an experimental perspective workflows are a means to handle the work of accessing an ecosystem of software and platforms, manage data and security, and handle errors. From a reporting perspective they are a means to accurately document methodology for reproducibility, comparison, exchange and reuse, and to trace the provenance of results for review, credit, workflow interoperability and impact analysis. Workflows operate in an evolving ecosystem and are assemblages of components in that ecosystem; their provenance trails are snapshots of intermediate and final results. Taking a lifecycle perspective, what are the challenges in workflow design and use with different stakeholders? What needs to be tackled in evolution, resilience, and preservation? And what are the “mitigate or adapt” strategies adopted by workflow systems in the face of changes in the ecosystem/environment, for example when tools are depreciated or datasets become inaccessible in the face of funding shortfalls?
Digital supply chain quality managementMartin Geddes
We've figured out how to send physical goods around the world: aggregate them into containers. We're still struggling how to do digital good, which we disaggregate into packets. Here's the answer.
The document introduces COBWEB, a European Commission-funded project that develops a crowdsourcing infrastructure for collecting and analyzing environmental data. It summarizes the goals of the project, its partners which include UNESCO biosphere reserves, methods for co-designing use cases, and the development of quality assurance processes and mobile/web apps. Key components under development include workflows, services, sensor networks, and tools for customizing data collection and ensuring data quality.
Context aware qo e modelling, measurement, and prediction in mobile computing...ieeeprojectschennai
Context aware qo e modelling, measurement, and prediction in mobile computing systems
+91-9994232214,8144199666, ieeeprojectchennai@gmail.com,
www.projectsieee.com, www.ieee-projects-chennai.com
IEEE PROJECTS 2015-2016
-----------------------------------
Contact:+91-9994232214,+91-8144199666
Email:ieeeprojectchennai@gmail.com
Support:
-------------
Projects Code
Documentation
PPT
Projects Video File
Projects Explanation
Teamviewer Support
Context aware qo e modelling, measurement, and prediction in mobile computing...ieeeprojectschennai
The document proposes a context-aware Quality of Experience (QoE) modeling method called CaQoEM. CaQoEM uses Bayesian networks and utility theory to measure and predict user QoE under uncertainty, accounting for factors like location, delay, jitter, packet loss and user satisfaction. The method was validated through experiments and simulations, achieving 98.93% accuracy in predicting QoE across different network conditions like handoffs and congestion.
An Enhanced Framework for Improving Spatio-Temporal Queries for Global Positi...IJORCS
This document proposes a framework to improve the processing of spatio-temporal queries for global positioning systems. The framework employs a new indexing algorithm built on SQL Server 2008 that avoids the overhead of R-Tree indexing. It utilizes dynamic materialized views and an adaptive safe region to reduce communication costs and update loads. Caching is used to enhance performance. The notification engine processes concurrent queries using publish/subscribe to group similar queries. Experiments showed the framework outperformed R-Tree indexing.
Recently with the increasing development of distributed computer systems (DCSs) in networked
industrial and manufacturing applications on the World Wide Web (WWW) platform, including service-oriented
architecture and Web of Things QoS-aware systems, it has become important to predict the Web performance.
In this paper, we present Web performance prediction in time by making a forecast of a Web resource
downloading using the Efficient Turning Bands (TB) geostatistical simulation method. Real-life data for the
research were obtained from our own website named "Distributed forecasting system". Generation of log file
form website and performing monitoring of a group of Web clients from connected LAN. For better web
prediction we used spatio temporal prediction method with time utility for downloading particular file from
website and calculate forecasting result using Turning bands method but improving more forecasting
accuracy use the efficient turning band method basically efficient turning band use Naive bays algorithm and
calculate efficient result and that result is compared with Turning band and efficient turning band method.
The efficient turning band method result show good forecasting quality of Web performance prediction and
forecasting.
To Get any Project for CSE, IT ECE, EEE Contact Me @ 09849539085, 09966235788 or mail us - ieeefinalsemprojects@gmail.com-Visit Our Website: www.finalyearprojects.org
To Get any Project for CSE, IT ECE, EEE Contact Me @ 09849539085, 09966235788 or mail us - ieeefinalsemprojects@gmail.co¬m-Visit Our Website: www.finalyearprojects.org
To Get any Project for CSE, IT ECE, EEE Contact Me @ 09849539085, 09966235788 or mail us - ieeefinalsemprojects@gmail.com-Visit Our Website: www.finalyearprojects.org
This document provides an overview of a web performance boot camp. The goals of the class are to provide an understanding of web performance, empower attendees to identify and resolve performance problems, and demonstrate common tools and techniques. The class structure includes sections on what performance is, performance basics, the MPPC model of web performance, and tools and testing. Key topics that will be covered include metrics like response time, statistical distributions, Little's Law, the response time equation, and the dimensions that impact performance like geography, network, browser/device, and page composition.
IJRET : International Journal of Research in Engineering and Technology is an international peer reviewed, online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together Scientists, Academician, Field Engineers, Scholars and Students of related fields of Engineering and Technology
On Developing and Operating of Data Elasticity Management ProcessHong-Linh Truong
The Data-as-a-Service (DaaS) model enables data analytics
providers to provision and offer data assets to their consumers. To achieve quality of results for the data assets, we need to enable DaaS elasticity by trading off quality and cost of resource usage. However, most of the current work on DaaS is focused on infrastructure elasticity, such as scaling
in/out data nodes and virtual machines based on performance and usage, without considering the data assets' quality of results. In this talk, we introduce an elastic data asset model for provisioning data enriched with quality of results. Based on this model, we present techniques to generate and operate data elasticity management process that is used to
monitor, evaluate and enforce expected quality of results. We develop a runtime system to guarantee the quality of resulting data assets provisioned on-demand. We present several experiments to demonstrate the usefulness of our proposed techniques.
QoE++: Shifting from Ego- to Eco-System? QCMan 2015 KeynoteTobias Hoßfeld
QoE++: Shifting from Ego- to Eco-System?
QoE research has advanced significantly in recent years with a focus on the QoE ego-system. Thereby, different facets have been addressed by the research community like subjective user studies to identify QoE influence factors for particular applications like video streaming, QoE models to capture the effects of those influence factors on concrete applications, QoE monitoring approaches at the end user site but also within the network to assess QoE during service consumption and to provide means for QoE management for improved QoE.
However, in order to progress in the area of QoE, new research directions have to be taken. There is a need for QoE++. The application of QoE in practice needs to consider the entire QoE eco-system and the stakeholders along the service delivery chain to the end user. In comparison to the traditional QoE ego-system thinking, the QoE eco-system addresses among others the following research topics: in-session vs. global system perspective, short- vs. long-time scales when considering QoE, single vs. multi-user QoE, single vs. concurrent usage of applications and services, user vs. business perspective by addressing all key stakeholder goals.
QoE++ requires (a) to extend current QoE models by the different perspectives of the QoE eco-system including the service provider perspective, (b) to incorporate user behavior as part of the model, (c) and to identify and include relevant internal and external context factors including physical, cultural, social, economic context. QoE++ faces several major challenges.
(1) Can we utilize QoE for network & service management? Or is it more appropriate to consider user engagement or user behavior? Which context factors are relevant or are such context-factors even more important for network & service management, e.g. in order to foresee and react on flash crowds?
(2) How to realize cross-layer optimization between applications and their demands and the network capabilities, and thus a network-wide QoE optimization? Is SDN the right technology to cope with those challenges?
(3) Can we transform QoE into business models, SLAs, etc.? Or is it possible to 'trade' QoE? For example, by offering WiFi sharing at home, a user may get improved service delivery and QoE from their ISP.
(4) Do we understand fundamental models and natural relationships of QoE++? How can we extend existing QoE models to take into account the service provider's perspective? How to quantify QoE fairness? What is the relationship between QoE and user behavior?
Following QoE++, research will shift from ego- to eco-systems and give answers to those questions. In this talk, we will discuss QoE++ and highlight some of the challenges above.
The document introduces COBWEB, a research project that develops a crowdsourcing infrastructure for collecting and analyzing environmental data provided by citizens. The project aims to address data quality issues and support policy decisions. It has several pilot sites and partners, including UNESCO biosphere reserves. The framework includes mobile apps, QA processes, and a portal to view and analyze citizen-submitted data. It uses open standards and aims to be customizable for different use cases involving topics like biological monitoring and flooding.
Context, Quality and Relevance: Dependencies and Impacts on RESTful Web Servi... - Hong-Linh Truong
The document discusses the importance of context and data quality for RESTful web services that provide access to large datasets. It notes that existing approaches do not provide adequate support for specifying or accessing quality and context information at the level of individual data resources. The authors propose conventions and representations to describe quality and context for data resources in RESTful service documentation (WADL) and to access such information through REST APIs, in order to support composition of data-intensive services.
Presented by Barnard Kroon (University College Dublin) at the COBWEB Summit, a side event of the Open Geospatial Consortium's (OGC) 99th Technical & Planning Committee (TC/PC) Meeting held at University College Dublin, 2016.
Presented by Dr. Andreas Matheus, 21st June 2016.
During the COBWEB Summit at the Open Geospatial Consortium's (OGC) 99th Technical & Planning Committee (TC/PC) Meeting.
Presented by Dr Andreas Matheus, June 1st 2016 at the 10th GEO European Projects Workshop.
This was part of the session 'Citizens' Observatories for environmental policy monitoring and development'.
Wide access to spatial Citizen Science data - ECSA Berlin 2016 - COBWEB Project
Authors: Paul van Genuchten, Lieke Verhelst, Clemens Portele
Presented at the European Citizen Science Association conference Berlin, May 2016.
One of the objectives of COBWEB is to publish citizen science data to GEOSS, the Global Earth Observation System of Systems. GEOSS has a focus on spatial standards (CSW, SensorWeb, WMS/WFS). However, a major part of the citizen science community is not aware of these standards; average users rely on search engines to discover data and on common formats to analyse it. So how do we bridge the gap between services in GEOSS and search engines?
A Standardized Encoding to Exchange Citizen Science Data - ECSA 2015 - COBWEB Project
With more and more citizen observatories and sampling campaigns, all sorts of data are being collected, each using different formats and techniques. This is not great for re-use and sharing of the data, which is where standardization comes in to improve the situation. Dr Ingo Simonis discusses how the OGC has tackled this challenge.
COBWEB Smart Technology = Smart Data? Citizen Science in the Dyfi Biosphere R... - COBWEB Project
COBWEB Smart Technology = Smart Data? Citizen Science in the Dyfi Biosphere Reserve, Welsh Wildlife Centre, Cilgerran, 7th March 2015. Dr Crona Hodges.
COBWEB RDA Plenary 5 - Joint meeting of IG Geospatial & IG Big Data - Didier... - COBWEB Project
Didier Leibovici & Mike Jackson, University of Nottingham (COBWEB partner)
Geospatial Data Curation & interoperability in the COBWEB project - citizen science and crowdsourcing for environmental policy
COBWEB presentation given at the Citizens' Observatories: Empowering European Society Open Conference, which took place on 4th December 2014, Brussels, Belgium.
COBWEB - Existing Work and Future Plans - Presentation by James Hodges of the... - COBWEB Project
"COBWEB - Existing Work and Future Plans". Presentation given by James Hodges, Outward Bound Trust at the Gweithdy COBWEB yn Machynlleth / COBWEB Workshop in Machynlleth on 20th May 2014.
Find out more about this event (in Welsh or English/yn Cymraeg neu Saesneg) on the COBWEB Project website:
http://bit.ly/1nMjmUP
Coetiroedd Dyfi Woodlands Presentation by Kirsten Manley from COBWEB Workshop... - COBWEB Project
Presentation given by Kirsten Manley on the Coetiroedd Dyfi Woodlands group and work at the Gweithdy COBWEB yn Machynlleth / COBWEB Workshop in Machynlleth on 20th May 2014.
Find out more about this event (in Welsh or English/yn Cymraeg neu Saesneg) on the COBWEB Project website:
http://bit.ly/1nMjmUP
COBWEB: helping to map vegetation - work with Aberystwyth University - Crona ... - COBWEB Project
Hodges, C. 2014. COBWEB: helping to map vegetation - work with Aberystwyth University. Presentation as part of the COBWEB Workshop, 22nd May 2014, y Plas, Dyfi, UK.
WP6 Demonstrators Estimating inundation extent from a distance - Brewar, Evan... - COBWEB Project
Brewar, P., Evans, B., Hodges, C., Macklin, M. and Williams, R. 2014. WP6 Demonstrators Estimating inundation extent from a distance. Presentation as part of the COBWEB Workshop, 22nd May 2014, y Plas, Dyfi, UK.
Attention Citizens! Presentation as part of the Citizen Science Workshop - Ni... - COBWEB Project
This document provides tips for communicating Citizen Science projects and using social media engagement. It recommends targeting key audiences and engaging citizens early in the design process. Planning social media content should make the project aims and calls to action clear, and explain why citizens should participate and how their contributions will be used. Popular social media channels like Facebook, Twitter, YouTube and Google+ should be used consistently to support engagement with project communities over the long term. Images, video, guest posts, and live events can help build trust and encourage participation and sharing.
Ensuring the Citizen is at the heart of the COBWEB - Citizen Observatory Web ... - COBWEB Project
"Ensuring the Citizen is at the heart of the COBWEB - Citizen Observatory Web" presentation by Jamie Williams, Environment Systems as part of the European Commission Speakers' Corner programme at GEO X, Geneva, Switzerland, 15th January 2014.
COBWEB (presentation from Citizens’ Science and Smart Cities Summit) - Chris ... - COBWEB Project
The COBWEB Project is a 4-year research project that aims to explore how crowdsourced environmental data from citizen science projects can be made interoperable and reusable across projects and data infrastructures. It will develop mobile applications to collect and analyze crowdsourced environmental data from biosphere reserves in the UK, Germany, and Greece to support decision making. The project is currently in the 16th month of its 48-month duration and is working to implement its data platform and develop its first demonstrator in Wales by the next milestone.
COBWEB: Citizen Observatories Web Ecology meets the crowd - Crona Hodges - COBWEB Project
Presentation given at the 33rd CEN (European Committee for Standardisation) Workshop, part of the Joint CEN/TC 287 and OGC Workshop which took place on 30th September 2013, Frascati, Italy.
Find out more about the COBWEB Project at:
http://cobwebproject.eu/dissemination/
access management, citizen observatory, cobweb, cobwebfp7, european union, fp7, geoss, saml
Product development covers both the modification of an existing product and the formulation of a new product to fill a newly identified market niche or customer need. This study developed aramang baked products enriched with malunggay and evaluated their acceptability in terms of taste, texture, flavor, odor, and color, as well as their overall acceptability. Frequency distributions of evaluator ratings were used to determine acceptability. The sensory evaluation conducted by the researchers showed that the aramang baked products enriched with malunggay were acceptable in terms of odor, taste, flavor, color, and texture, and that all three treatments were highly acceptable across these variables.
Optimizing Post Remediation Groundwater Performance with Enhanced Microbiolog... - Joshua Orris
Results of geophysics and pneumatic injection pilot tests during 2003 – 2007 yielded significant positive results for injection delivery design and contaminant mass treatment, resulting in permanent shut-down of an existing groundwater Pump & Treat system.
Accessible source areas were subsequently removed (2011) by soil excavation and treated with the placement of emulsified vegetable oil (EVO) and zero-valent iron (ZVI) to accelerate treatment of impacted groundwater in overburden and weathered fractured bedrock. Post-pilot-test and post-remediation groundwater monitoring has included analyses of CVOCs, organic fatty acids, dissolved gases and QuantArray®-Chlor to quantify key microorganisms (e.g., Dehalococcoides, Dehalobacter) and functional genes (e.g., vinyl chloride reductase, methane monooxygenase) to assess the potential for reductive dechlorination and aerobic cometabolism of CVOCs.
In 2022, the first commercial application of MetaArray™ was performed at the site. MetaArray™ uses statistical analyses, such as principal component analysis and multivariate analysis, to provide evidence that reductive dechlorination is active, or even that it is slowing. This creates actionable data, allowing users to save money by making important site management decisions earlier.
The support vector machine (SVM) in the MetaArray™ analysis identified, with 80% confidence, groundwater monitoring wells characterized as having either limited or high reductive dechlorination potential. The results of MetaArray™ will be used to further optimize the site’s post-remediation monitoring program for monitored natural attenuation.
Kinetic studies on malachite green dye adsorption from aqueous solutions by A... - Open Access Research Paper
Water polluted by dyestuff compounds is a global threat to health and the environment. Accordingly, we prepared a novel green sorbent system from algae, chitosan, chitosan nanoparticles and an algae-chitosan nanocomposite for the sorption of malachite green dye from water. The algae-chitosan nanocomposite was prepared by a simple method and used as a recyclable and effective adsorbent for the removal of malachite green dye from aqueous solutions. Algae, chitosan, chitosan nanoparticles and the algae-chitosan nanocomposite were characterized using different physicochemical methods; their functional groups and chemical compounds were identified using FTIR, SEM and TGA/DTA-DTG techniques. The optimal adsorption conditions - adsorbent dosage, pH and temperature - were determined. At optimized conditions, in batch equilibrium studies, more than 99% of the dye was removed. Kinetic analysis showed that the reaction order for the dye varied between pseudo-first order and pseudo-second order. Furthermore, the maximum adsorption capacity of the algae-chitosan nanocomposite toward malachite green dye reached 15.5 mg/g. Finally, the reusability of the nanocomposite over multiple cycles and its ability to remove dye from real wastewater make it a promising and attractive option for further practical applications.
Monitor indicators of genetic diversity from space using Earth Observation data - Spatial Genetics
Genetic diversity within and among populations is essential for species persistence. While targets and indicators for genetic diversity are captured in the Kunming-Montreal Global Biodiversity Framework, assessing genetic diversity across many species at national and regional scales remains challenging. Parties to the Convention on Biological Diversity (CBD) need accessible tools for reliable and efficient monitoring at relevant scales. Here, we describe how Earth Observation satellites (EO) make essential contributions to enable, accelerate, and improve genetic diversity monitoring and preservation. Specifically, we introduce a workflow integrating EO into existing genetic diversity monitoring strategies and present a set of examples where EO data is or can be integrated to improve assessment, monitoring, and conservation. We describe how available EO data can be integrated in innovative ways to support calculation of the genetic diversity indicators of the GBF monitoring framework and to inform management and monitoring decisions, especially in areas with limited research infrastructure or access. We also describe novel, integrative approaches to improve the indicators that can be implemented with the coming generation of EO data, and new capabilities that will provide unprecedented detail to characterize the changes to Earth’s surface and their implications for biodiversity, on a global scale.
Download the Latest OSHA 10 Answers PDF : oyetrade.com - Narendra Jayas
Latest OSHA 10 Test Questions and Answers PDF for the Construction and General Industry exams.
Download the full set of 390 MCQ-type questions and answers - https://www.oyetrade.com/OSHA-10-Answers-2021.php
To help OSHA 10 trainees pass their pre-test and post-test, we have prepared a set of 390 questions and answers, called OSHA 10 Answers, in downloadable PDF format. The OSHA 10 Answers question bank was prepared by our in-house, highly experienced safety professionals and trainers, and consists of 390 MCQ-type questions and answers updated for the 2024 exams.
COBWEB A quality assurance workflow authoring tool for citizen science and crowdsourced data
1. A Quality Assurance workflow Authoring Tool
for citizen science and crowd-sourced data.
Didier Leibovici,
Julian Rosser, Mike Jackson and the COBWEB project
Nottingham Geospatial Institute
University of Nottingham, UK
2. Research Objective - to integrate (with QA) authoritative and crowd-sourced data
• The aim is to bring together a precise, structured, top-down and formal standards-based institutional approach with the low-cost, relevant, rich and timely citizen-focussed approach of the crowd, where, however, there are shortcomings of completeness, precision, interoperability and often minimal direction.
• This is not straightforward - the two perspectives on what constitutes useful, QA’d, fit-for-use data are very different.
3. A clash of paradigms and market dynamics: Crowd Sourcing vs Authoritative Government Data
• Non-systematic, incomplete coverage vs systematic, comprehensive coverage
• Near ‘real-time’ and ongoing data collection allowing trend analysis vs ‘historic’ and ‘snap-shot’ map data
• Free, ‘un-calibrated’ data, but often hi-res and up-to-the-minute vs quality-assured, ‘expensive’ data
• ‘Unstructured’, mass consumer-driven metadata and mash-ups vs ‘structured’ and defined metadata, but often in rigid ontologies
• Unconstrained capture and distribution from ‘ubiquitous’ mobile devices vs ‘controlled’ licensing, access policies and digital rights
• ‘Simple’, consumer-driven web services for data collection and processing vs ‘complex’ institutional survey and GIS applications
Jackson M, Rahemtulla H and Morley J (2010) The Synergistic Use of Authenticated and Crowd-Sourced Data for Emergency Response. Proc. 2nd Int. Workshop on Validation of Geo-Information Products for Crisis Management (VALgEO), 11-13 October 2010, Ispra, Italy, pp 91-99. http://globesec.jrc.ec.europa.eu/workshops/valgeo-2010/proceedings
5. Aspects of Quality
When considering the use of crowd-sourced GI data we need to quality assure it from:
1. A Spatial (geometric) perspective
2. A Thematic (domain attribution) perspective
3. A Temporal (time-related attribution) perspective
And in terms of data quality “Elements” we have to consider:
• Completeness - by area, by class
• Consistency - e.g. topological, semantic, temporal
• Accuracy - relative, absolute
• Usability - fitness for purpose for a particular application or requirement
6. Solution adopted (i)
• “Internal” quality metrics <completeness, positional accuracy, consistency, etc.> defined by ISO 19157
• “External” consumer quality <fitness for purpose> metrics based on GeoViQua [www.geoviqua.org]
• Stakeholder model QA <data collector’s judgement, trust, reliability> [Meek et al. 2014]
7. Metadata on Data Quality - three models
• ISO 19157 (producer model), where DQ_Scope will be “feature”:
  DQ_Usability
  DQ_Completeness: DQ_CompletenessCommission, DQ_CompletenessOmission
  DQ_ThematicAccuracy: DQ_ThematicClassificationCorrectness, DQ_NonQuantitativeAttributeAccuracy, DQ_QuantitativeAttributeAccuracy
  DQ_LogicalConsistency: DQ_ConceptualConsistency, DQ_DomainConsistency, DQ_FormatConsistency, DQ_TopologicalConsistency
  DQ_TemporalAccuracy: DQ_AccuracyOfATimeMeasurement, DQ_TemporalConsistency, DQ_TemporalValidity
  DQ_PositionalAccuracy: DQ_AbsoluteExternalPositionalAccuracy, DQ_GriddedDataPositionalAccuracy, DQ_RelativeInternalPositionalAccuracy
• Simplified GeoViQua model (consumer model), where DQ_Scope will be “external data”:
  GVQ_PositiveFeedback, GVQ_NegativeFeedback
• COBWEB Stakeholder Quality Model, where DQ_Scope will be “volunteer”:
  CSQ_Vagueness, CSQ_Ambiguity, CSQ_Judgement, CSQ_Reliability, CSQ_Validity, CSQ_Trust, CSQ_NoContribution
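As a rough illustration of how the three scopes above could travel together on a single observation record (this is not the COBWEB encoding, which uses the XML schemas of the standards; the Python structure and field names are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class QualityElement:
    name: str    # e.g. "DQ_TopologicalConsistency" or "CSQ_Trust"
    scope: str   # "feature", "external data" or "volunteer"
    value: float # quality measure in [0, 1] assigned by a QC

@dataclass
class Observation:
    obs_id: str
    producer_quality: list = field(default_factory=list)     # ISO 19157 (DQ_*)
    consumer_quality: list = field(default_factory=list)     # GeoViQua (GVQ_*)
    stakeholder_quality: list = field(default_factory=list)  # COBWEB (CSQ_*)

# A citizen observation carrying one element from two of the models
obs = Observation("obs-001")
obs.producer_quality.append(
    QualityElement("DQ_ThematicClassificationCorrectness", "feature", 0.5))
obs.stakeholder_quality.append(
    QualityElement("CSQ_Trust", "volunteer", 0.7))
```

Keeping the three lists separate mirrors the three DQ_Scope values, so a QC can update producer, consumer and stakeholder elements independently.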
8. Solution adopted (ii)
• The OGC WPS standard allows access to a repository of processes and services from compliant clients
• A key aspect of the standard is the provision to chain disparate processes and services to form a reusable workflow
• BPMN is used rather than BPEL for the workflow engine - it excels at modelling processes visually, allowing non-domain experts to communicate and mutually understand their models
• Configurable workflows - stakeholders are able to design a solution to fit their use case from a generic set of WPS processes
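The chaining idea can be sketched in miniature: a BPMN sequence flow over WPS quality controls reduces, conceptually, to applying QC tasks in order to an observation's quality elements. The QC functions and update rules below are hypothetical placeholders, not COBWEB processes:

```python
# Each QC task takes the current dict of quality-element values and
# returns an updated dict; a sequence flow just applies them in order.
def qc_position_accuracy(q):
    # hypothetical rule: a GPS fix within tolerance raises positional quality
    q["DQ_PositionalAccuracy"] = min(1.0, q.get("DQ_PositionalAccuracy", 0.5) + 0.2)
    return q

def qc_attribute_check(q):
    # hypothetical rule: attribute values pass a domain check
    q["DQ_ThematicClassificationCorrectness"] = 0.8
    return q

def run_workflow(quality, tasks):
    # the conceptual core of a BPMN sequence flow: chain tasks in order
    for task in tasks:
        quality = task(quality)
    return quality

result = run_workflow({}, [qc_position_accuracy, qc_attribute_check])
```

Swapping, reordering or re-parameterising the task list is what makes the workflow configurable per use case.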
9. Solution adopted (iii)
• GitHub is used for the code repository and the open-source evolution of the solution
• Built on open-source implementations: the WPS and client libraries from 52°North; the BPMN implementation is jBPM, maintained by JBoss; the WPS runs on Apache Tomcat and jBPM is deployed on JBoss WildFly
• Full details in “A BPMN solution for chaining OGC services to quality assure location-based crowd-sourced data”, Meek, Jackson, Leibovici (2015), submitted to Computers & Geosciences
Mike Jackson, 4-5 Nov. 2015, China
10. The COBWEB QAQC: the 7+ pillars of Quality Controls (QC) - 7 pillars of QC, plus the ‘7+’, a cross-pillar QC
11.
12. QAQC: workflow of QCs as WPS
• QAwAT - workflow authoring tool (BPMN encoding)
• QAwOnt - composition support (SKOS encoding)
• QAwWPS - repository of QCs as WPS
14. Qualifying the Observations, the Volunteers and the Authoritative data
• Quality elements generated and evolving through the workflow
• QC examples
• Example of a QA workflow - design and composition using a graphical tool
15. QC examples - QAQC workflow Authoring Tool (QAwAT)
• Design and composition in Eclipse
• Design and composition in the jBPM web editor
16. Some results on the Japanese knotweed co-design
[Figure: density plot of DQ_ClassificationCorrectness & DQ_Usability before QA, with values initialised at 0.5 plus an artificial sd of 0.0001]
18. Rosser J, Pourabdollah A, Brackin R, Jackson MJ and Leibovici DG (2016) Full Meta Objects for Flexible Geoprocessing Workflows: profiling WPS or BPMN? 19th AGILE Conference, 14-17 June 2016, Helsinki, Finland
Leibovici DG, Williams J, Rosser JF, Hodges C, Scott D, Chapman C, Higgins C and Jackson MJ (2016) The COBWEB Quality Assurance System in Practice: Example for an Invasive Species Study. ECSA Conference, 19-21 May 2016, Berlin, Germany
Meek S, Jackson MJ and Leibovici DG (2016) A BPMN solution for chaining OGC services to quality assure location-based crowdsourced data. Computers & Geosciences, 87: 76-83
Leibovici DG, Meek S, Rosser J and Jackson MJ (2015) DQ in the citizen science project COBWEB: extending the standards. Data Quality DWG, OGC/TC Nottingham, September 2015, UK
Leibovici DG, Evans B, Hodges C, Wiemann S, Meek S, Rosser J and Jackson MJ (2015) On Data Quality Assurance and Conflation Entanglement in Crowdsourcing for Environmental Studies. ISSDQ 2015 - The 9th International Symposium on Spatial Data Quality, 29-30 September 2015, La Grande Motte, France
Meek S, Jackson MJ and Leibovici DG (2014) A flexible framework for assessing the quality of crowdsourced data. AGILE Conference, 3-6 June 2014, Castellon, Spain
Leibovici DG and Jackson MJ (2013) Copula metadata est. AGILE Conference, 14-17 May 2013, Leuven, Belgium
Leibovici DG, Pourabdollah A and Jackson MJ (2013) Which Spatial Data Quality can be meta-propagated? Journal of Spatial Sciences, 58(1): 3-14
Leibovici DG, Pourabdollah A and Jackson M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. EGU 2011, European Geosciences Union General Assembly, Vienna, Austria, April 2011
Pawlowicz S, Leibovici DG, Haines-Young R, Saull R and Jackson M (2011) Dynamical Surveying Adjustments for Crowd-sourced Data Observations. EnviroInfo 2011, Ispra, Italy
Leibovici DG and Pourabdollah A (2010) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, 20-24 September 2010, Toulouse, France
Jackson M, Rahemtulla H and Morley J (2010) The synergistic use of authenticated and crowd-sourced data for emergency response. International Workshop on Validation of Geo-Information Products for Crisis Management (VALgEO), Ispra, Italy, pp 91-99
19. Quality Assurance workflow Authoring Tool
(QAwAT)
Didier G. Leibovici,
Julian Rosser, Mike Jackson and the COBWEB project
Nottingham Geospatial Institute
University of Nottingham, UK
Email: firstname.secondname@nottingham.ac.uk
Thank you!
Editor's Notes
1/ QAwAT is the quality assurance tool designed by the University of Nottingham within the FP7 project COBWEB.
It is used for citizen science by the stakeholder who designed the survey campaign using the data capture tool.
2/ Quality assurance via the QAQC processing comes during or after the capture of observations from each volunteering citizen,
and aims at producing metadata on data quality for each captured observation.
3/ The COBWEB platform and the Quality Assurance tool are based on interoperability standards for geoprocessing, workflow and quality information encoding.
4/ In addition to the ISO standard for quality (the producer model), the QAQC uses a quality model to qualify the volunteers (the stakeholder model),
as well as a very simplified version of the consumer model designed by GeoViQua.
The Stakeholder quality model evaluates or calibrates the properties of the volunteer, seen as a sensor, using six dimensions related to accuracy, consistency and trust.
5/ Each Quality Control produces or updates a number of quality elements from the three models,
and is seen as a single task within the workflow, evaluating quality via different types of controls: the pillars.
The whole QA workflow is composed of a series of QCs belonging to these 7 pillars.
The ‘7+’ pillar deals with security and privacy when necessary for a specific QC that would otherwise belong to one of the 7 pillars.
6/ The categorisation of the QCs in 7 pillars is to help in the development of geoprocesses and in the composition of the workflow.
The 7 pillars represent the top of an ontology of the QCs for VGI, citizen science and crowdsourcing.
A similar QC can exist in different pillars, but the quality elements generated, or the rules assigning their values, will differ according to the semantics of the pillars.
7/ Any code wrapped up into a WPS is registered as part of a particular pillar and the workflow web editor is the tool used to compose the workflow.
In its algorithm, each QC has a processing or geoprocessing part followed by a logic-reasoning part that depends on the results of the first part and on the semantics attached to it and to the pillar description.
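That two-part structure - a (geo)processing step followed by a logic-reasoning step - can be sketched as follows; the functions, field names and the 50 m threshold are illustrative assumptions only:

```python
# Hypothetical sketch of a single QC split into a (geo)processing part
# and a logic-reasoning part, as described in the notes.
def processing_part(observation):
    # e.g. compute a distance measure from the observation geometry
    # (1-D stand-in for a real geoprocessing computation)
    return abs(observation["obs_x"] - observation["feature_x"])

def reasoning_part(distance, threshold=50.0):
    # assign a quality decision from the processing result,
    # according to the rule attached to the pillar
    return "pass" if distance <= threshold else "fail"

def quality_control(observation):
    distance = processing_part(observation)
    return {"measure": distance, "decision": reasoning_part(distance)}

result = quality_control({"obs_x": 120.0, "feature_x": 100.0})
```

Separating the two parts lets the same geoprocessing code be reused under different pillar semantics, with only the reasoning rule changing.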
8/ From the list of QCs in each pillar, any tool to compose a BPMN workflow can be used. The conceptual aspect of the workflow captured by the graphical analytic of the BPMN can be annotated and shared within the different stakeholders. QAwAT is to compose and execute the workflow using the workflow engine linked to the WPS.
9/ As an example, we have here a Quality Control in pillar 1 helping to assess the position of the aimed point when taking a photo with a smartphone.
Besides reporting the observation point as the line-of-sight (LoS) point, one can also, from the distance to that point and the position of the observer, estimate a Topological Consistency: whether the volunteer can properly identify what he/she is observing.
From the uncertainties of the phone’s GPS, of the DEM and of the bearing parameters, the LoS point, the distance to it and its uncertainty can be computed, and then the probability for a Normal distribution to be less than a stakeholder-given threshold of a reasonable distance for making a proper observation.
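Under the Normal assumption stated here, the probability that the true distance falls below the stakeholder's threshold is the Normal CDF evaluated at that threshold. A minimal sketch (the 45 m distance, 10 m uncertainty and 50 m threshold are made-up numbers):

```python
import math

def prob_within_threshold(distance, sigma, threshold):
    """P(D < threshold) for D ~ Normal(distance, sigma^2): the probability
    that the observer was within a 'reasonable' distance of the LoS point."""
    z = (threshold - distance) / (sigma * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))  # Normal CDF via the error function

# e.g. estimated LoS distance 45 m, uncertainty 10 m, threshold 50 m
p = prob_within_threshold(45.0, 10.0, 50.0)
```

The resulting probability can then feed a quality element such as DQ_TopologicalConsistency for that observation.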
10/ We used the jBPM suite, which has a workflow engine and an editor, available either within Eclipse or as a web editor.
The workflow engine has been modified to accept WPS services as tasks.
11/ Using the online editor, one drags and drops the tasks; then the input URL and output names are filled in.
The QA workflow can be run from the web interface or later from a WPS interface once the whole workflow is wrapped into a process of the WPS.
12/ The QAQC starts by postulating 50% quality uncertainty for Classification Correctness (for example)
13/ The quality elements evolve through the workflow as the different QCs update their values.
The results therefore depend on the choice of QCs and the set of parameters used in that workflow.
The BPMN of the workflow is stored as a metaquality element, thereby encoding the provenance of the metadata values on data quality.
This is a qualifying step, not a validating or verifying step: it indicates how uncertain we are about each observation given the rules put in the QA workflow.
Low and high quality values are less uncertain - here, of being a true Japanese knotweed observation or not.
After verification using the ground truth, even though the tendency is that the QAQC helps to identify correct observations,
you can see that some high quality values were given to wrong JKW observations and vice versa.
Quite a number of the citizens’ observations still have uncertainty attached, i.e. still around 50%.
14/ ... and here are the results for the Snowdonia National Park initial survey for Japanese knotweed, performed during May-July 2015, with some of the data sources used in the pillars shown as well.
In summary, the observations close to managed lands or woods, but not to the areas identified from EO as at risk of having JKW, have a lower quality (higher uncertainty).
15/ Some of the research on data quality and workflows at the Nottingham Geospatial Institute