IJERA (International Journal of Engineering Research and Applications) is an international online, ... peer-reviewed journal. For more details or to submit your article, please visit www.ijera.com
Illumination-robust Recognition and Inspection in Carbide Insert Production (IDES Editor)
In processes of the production chain of carbide inserts, such as unloading or packaging, the conformity test of the insert type is performed manually, which causes a statistical increase of errors due to the monotony and fatigue of workers as well as the wide variety of insert types. A measuring method is introduced that automatically inspects the chipformer geometry, the most significant quality feature of inserts. The proposed recognition approach is based on the local energy model of feature perception and concatenates the phase congruency across local filter orientations into a compact spatial histogram. The method has been tested on several inserts of different types. Test results show that prevalent insert types can be inspected and classified robustly under illumination variations.
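As a loose illustration of the descriptor idea only (not the paper's algorithm), the sketch below pools per-orientation filter responses over a coarse spatial grid and concatenates them into one normalized vector; Gabor filter energy stands in for true phase congruency, and the frequency, grid size, and test image are arbitrary choices.

```python
# Loose illustration: per-orientation filter energy pooled into a
# compact spatial histogram. Gabor magnitude is used here as a simple
# stand-in for the phase congruency measure described in the abstract.
import numpy as np
from skimage.filters import gabor
from skimage.data import camera

image = camera().astype(float)
orientations = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
grid = 4  # pool energy over a 4x4 grid of spatial cells

features = []
for theta in orientations:
    real, imag = gabor(image, frequency=0.2, theta=theta)
    energy = np.hypot(real, imag)  # magnitude of the complex response
    h, w = energy.shape
    cells = energy[: h - h % grid, : w - w % grid]
    # Average the energy inside each spatial cell.
    cells = cells.reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))
    features.append(cells.ravel())

descriptor = np.concatenate(features)      # 4 orientations x 16 cells = 64
descriptor /= np.linalg.norm(descriptor)   # normalization against illumination
print(descriptor.shape)
```

Normalizing the concatenated histogram is one common way such descriptors gain robustness to global illumination changes, which is the property the abstract emphasizes.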
Presented at the 2nd International Conference on Earth Observation for Global Changes, Chengdu, China.
Abdulhakim Abdi & Anand Nandipati
http://www.geospatialtechnologist.com/
OGC Sept 2010: Meta-propagation of Uncertainties within Workflows (Didier G. Leibovici)
To begin with, let us quote the QA4EO (Quality Assurance for Earth Observation) guidelines:
“If the vision of GEOSS is to be achieved, Quality Indicators (QIs) should be ascribed to data and, in particular, to delivered information products, at each stage of the data processing chain - from collection and processing to delivery. A QI should provide sufficient information to allow all users to readily evaluate a product’s suitability for their particular application, i.e. its “fitness for purpose”. To ensure that this process is internationally harmonised and consistent, the QI needs to be based on a documented and quantifiable assessment of evidence demonstrating the level of traceability to internationally agreed (where possible SI) reference standards. Such standards may be manmade, natural or intrinsic in nature. The documented evidence should include a description of the processes used, together with an uncertainty budget (or other appropriate quality performance measure). The guidelines of QA4EO provide a template and guidance on how to achieve this in a harmonised and robust manner.”
For interoperability purposes, each dataset and process registered within EuroGEOSS possesses appropriate metadata elements. The metadata description and the semantics attached to each component of a workflow (datasets and processing services) allow these components to be updated or swapped. When the quality of the workflow's components varies, the quality of the workflow's outputs can become unreliable. With knowledge of the level of uncertainty in each dataset involved and of the sensitivity of the processing steps, it is possible to define the quality of a workflow and the level of uncertainty of its outputs by error propagation principles.
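As a minimal sketch of this idea, the following Python fragment propagates the stated uncertainty of an input dataset through a hypothetical two-step workflow by Monte Carlo simulation; the dataset, the processing steps, and the uncertainty figures are invented for illustration and are not taken from EuroGEOSS.

```python
# Minimal sketch: propagating dataset uncertainty through a two-step
# workflow by Monte Carlo simulation. The dataset, processing steps
# and uncertainty figures are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=42)
N = 10_000  # number of Monte Carlo realizations

# Input dataset with a known uncertainty budget from its quality
# metadata (e.g. rainfall in mm, stated as mean +/- standard deviation).
rainfall = rng.normal(loc=52.0, scale=4.0, size=N)

# Step 1: a processing service (here a toy calibration), applied to
# every realization instead of once to the nominal value.
runoff = 0.35 * rainfall - 2.1

# Step 2: a second service consuming the first step's output.
discharge = np.maximum(runoff, 0.0) * 1.8

# The spread of the final realizations is the propagated uncertainty,
# a candidate Quality Indicator for the workflow output.
print(f"output: {discharge.mean():.2f} +/- {discharge.std():.2f}")
```

Swapping in a dataset with a larger stated uncertainty simply changes the input distribution, and its effect on the output quality indicator falls out of the same simulation.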
Reusing a given model encapsulated in a scientific workflow implies running the workflow using either the same datasets, though not necessarily from the same sources, or different datasets, which do not necessarily have the scale required or desired by the workflow. From error propagation principles and the quality metadata of the workflow's components, the effect on workflow quality of using datasets from different sources or at different scales can be assessed. As part of the integrated modelling activity, this assessment will help the modeller choose appropriate datasets or refine the workflow model, for example by considering data assimilation, downscaling, or multiple-scale integration steps within the scientific model and its associated workflow. The workflow quality assessment will also help the modeller swap or refine the processing steps. Under these modelling activities, the workflow is then seen as the concrete support of a conceptual model, and it evolves as the conceptual model does.
On top of the quality descriptors existing in ISO 19157, the present document describes the requirements for uncertainty analysis within scientific workflows.
This session will guide participants through the various types of content they can offer students in a live synchronous learning session to increase student interaction in the Wimba Live Classroom. Presented during Development Week 2012.
Local Commerce Monitor, Wave 16 - Franchisees of National Companies (BIA/Kelsey)
Data from BIA/Kelsey's Local Commerce Monitor, Wave 16 (Q3 2012) survey of SMBs (small and medium-sized businesses) on their marketing and advertising behaviors. The LCM survey tracks SMB ad and marketing spending, Web footprint, media performance assessments, and opinions about key topics like emerging media and sales channels. This deck spotlights franchisees of national companies and chains.
Digital Adoption by SMBs: A Preview of BIA/Kelsey’s Latest SMB Research - Loc... (BIA/Kelsey)
During the webinar, BIA/Kelsey's Steve Marshall and Abid Chaudhry shared five initial takeaways from our Local Commerce Monitor (LCM) survey, which tracks the marketing and advertising behaviors of small and medium-sized businesses (SMBs):
Takeaway 1: Spend on advertising media has plateaued.
Takeaway 2: Spend for online presence and engagement is increasing strongly.
Takeaway 3: SMBs are integrating their online properties.
Takeaway 4: Discounts and promotions are growing and evolving.
Takeaway 5: Social media has become a pivotal platform.
This deck includes a full analysis of these takeaways. LCM Wave 18 will publish on September 15, 2014 and be available for purchase from the BIA/Kelsey website: http://www.biakelsey.com/Research-and-Analysis/SMB-and-Consumer-Research/Local-Commerce-Monitor/ or by emailing info@biakelsey.com.
A Vision-Based Mobile Platform for Seamless Indoor/Outdoor Positioning (Guillaume Gales)
The emergence of smartphones equipped with Internet access, high-resolution cameras, and positioning sensors opens up great opportunities for visualising geospatial information within augmented reality applications. While smartphones are able to provide geolocalisation, the inherent uncertainty in the estimated position, especially indoors, does not allow for completely accurate and robust alignment of the data with the camera images.
In this paper we present a system that exploits computer vision techniques in conjunction with GPS and inertial sensors to create a seamless indoor/outdoor vision-based positioning platform. The vision-based approach estimates the pose of the camera relative to the façade of a building and recognises the façade from a georeferenced image database. This permits the insertion of 3D widgets into the user’s view with a known orientation relative to the façade. For example, in Figure 1 (a) we show how this feature can be used to overlay directional information on the input image. Furthermore, we provide an easy and intuitive interface for non-expert users to add their own georeferenced content to the system, encouraging volunteered GI. Indeed, to achieve this, users only need to drag and drop predefined 3D widgets into a reference view of the façade, see Figure 1 (b). The infrastructure is flexible in that we can add different layers of content on top of the façades, which opens many possibilities for different applications. Furthermore, the system provides a representation suitable for both manual and automatic content authoring.
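To make the façade-relative pose step concrete, here is an illustrative sketch (not the authors' implementation) that recovers the camera pose from four matched façade corner points with OpenCV's PnP solver; the façade size, pixel coordinates, camera intrinsics, and widget position are all made up for the example.

```python
# Illustrative sketch: camera pose relative to a planar building façade
# from four matched corner points, then projection of a 3D widget
# anchored on the façade into the camera view. All numbers are invented.
import numpy as np
import cv2

# Façade corners in a façade-local metric frame (z = 0 plane, metres).
facade_3d = np.array([[0, 0, 0], [10, 0, 0], [10, 6, 0], [0, 6, 0]],
                     dtype=np.float64)
# Corresponding pixel positions detected in the camera image.
image_2d = np.array([[210, 840], [1650, 780], [1600, 150], [260, 190]],
                    dtype=np.float64)
# Hypothetical intrinsics of the smartphone camera.
K = np.array([[1400, 0, 960], [0, 1400, 540], [0, 0, 1]], dtype=np.float64)

ok, rvec, tvec = cv2.solvePnP(facade_3d, image_2d, K, None)
if ok:
    # rvec/tvec give the façade-to-camera transform; with them a 3D
    # widget placed on the façade can be drawn at the right spot.
    widget_3d = np.array([[5.0, 3.0, 0.0]])
    pts, _ = cv2.projectPoints(widget_3d, rvec, tvec, K, None)
    print("widget projects to pixel", pts.ravel())
```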
With the increasing need for intelligent and autonomous systems to sense, move, and react to their surroundings, there is a clear necessity to train such systems with as much relevant data as can be obtained. However, there are many challenges in obtaining real-world data, particularly in a 3D environment. In this talk, I will cover some of the recent advances in graphics and computing techniques for 3D processing and their possible application in dynamic settings for autonomous systems. A vision of how synthetic data could be relevant to the future of intelligent systems is presented, along with the challenges. Backup material covers the latest papers on the subject.
For the full video of this presentation, please visit:
http://www.embedded-vision.com/platinum-members/qualcomm/embedded-vision-training/videos/pages/may-2016-embedded-vision-summit-mangen
For more information about embedded vision, please visit:
http://www.embedded-vision.com
Michael Mangen, Product Manager for Camera and Computer Vision at Qualcomm, presents the "High-resolution 3D Reconstruction on a Mobile Processor" tutorial at the May 2016 Embedded Vision Summit.
Computer vision has come a long way. Use cases that were previously not possible in mass-market devices are now more accessible thanks to advances in depth sensors and mobile processors. In this presentation, Mangen provides an overview of how we are able to implement high-resolution 3D reconstruction – a capability typically requiring cloud/server processing – on a mobile processor. This is an exciting example of how new sensor technology and advanced mobile processors are bringing computer vision capabilities to broader markets.
3D perception is crucial for understanding the real world. It offers many benefits and new capabilities over 2D across diverse applications, from XR and autonomous driving to IoT, camera, and mobile. 3D perception with machine learning is creating the new state of the art (SOTA) in areas such as depth estimation, object detection, and neural scene representation. Making these SOTA neural networks feasible for real-world deployment on mobile devices constrained by power, thermal, and performance limits has been a challenge. Qualcomm AI Research has developed not only novel AI techniques for 3D perception but also full-stack AI optimizations to enable real-world deployments and energy-efficient solutions. This presentation explores the latest research that is enabling efficient 3D perception while maintaining neural network model accuracy. You’ll learn about:
- The advantages of 3D perception over 2D and the need for 3D perception across applications
- Advancements in 3D perception research by Qualcomm AI Research
- Our future 3D perception research directions
Data Usability Assessment for Remote Sensing Data: Accuracy of Interactive Da... (Beniamino Murgante)
Data Usability Assessment for Remote Sensing Data: Accuracy of Interactive Data Quality Interpretation
Erik Borg, Bernd Fichtelmann - German Aerospace Center, German Remote Sensing Data Center
Hartmut Asche - Department of Geography, University of Potsdam
15. Sächsisches GI/GIS/GDI Forum
Dresden, 15. September 2015
GI2015 – INTRODUCTION TO OPEN DATA MANAGEMENT IN EUROPE OF REGIONS
Doz. Dr. Frank HOFFMANN, CSc – Chairman of the Board, IGN e.V.
Academician of International Eurasian Academy of Sciences (IEAS)
15. Sächsisches GI/GIS/GDI Forum und Club of Ossiach Workshops,
Dresden: 15. September 2015
CLUB OF OSSIACH & GI2015 WORKSHOPS
PROGRAMME & PROCEEDINGS
Edited by F. HOFFMANN (IGN)
CoO + GI2015 ppt_charvat ICT for a sustainable agriculture – public support n... (IGN Vorstand)
15. Sächsisches GI/GIS/GDI Forum und Club of Ossiach Workshops,
Dresden: 15. September 2015
CLUB OF OSSIACH RECOMMENDATION FOR ICT FOR FAMILY FARMING
Karel CHARVAT, Club of Ossiach / CCSS (CZ)
CoO + GI2015 ppt_mayer ICT for a sustainable agriculture - status and missing (IGN Vorstand)
15. Sächsisches GI/GIS/GDI Forum und Club of Ossiach Workshops,
Dresden: 15. September 2015
ICT FOR A SUSTAINABLE AGRICULTURE AND FORESTRY STATUS AND MISSING
Walter H. MAYER, CEO PROGIS / Treasurer of CoO
15. Sächsisches GI/GIS/GDI Forum und Club of Ossiach Workshops
COPERNICUS PROGRAMME AND SENTINEL DATA FOR AGRICULTURE AND FORESTRY
Lenka Hladíková, CENIA, Czech Environmental Information Agency (CZ)
How to Get CNIC Information System with Paksim Ga.pptx (danishmna97)
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Pushing the limits of ePRTC: 100ns holdover for 100 days (Adtran)
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to release software to market, along with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
DevOps and Testing slides at DASA Connect (Kari Kakkonen)
Slides by me and Rik Marselis at the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We ended with a lovely workshop in which the participants tried to find different ways to think about quality and testing in different parts of the DevOps infinity loop.
UiPath Test Automation using UiPath Test Suite series, part 5 (DianaGray10)
Welcome to UiPath Test Automation using UiPath Test Suite series, part 5. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of a CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
zkStudyClub - Reef: Fast Succinct Non-Interactive Zero-Knowledge Regex Proofs (Alex Pruden)
This paper presents Reef, a system for generating publicly verifiable succinct non-interactive zero-knowledge proofs that a committed document matches or does not match a regular expression. We describe applications such as proving the strength of passwords, the provenance of email despite redactions, the validity of oblivious DNS queries, and the existence of mutations in DNA. Reef supports the Perl Compatible Regular Expression syntax, including wildcards, alternation, ranges, capture groups, Kleene star, negations, and lookarounds. Reef introduces a new type of automata, Skipping Alternating Finite Automata (SAFA), that skips irrelevant parts of a document when producing proofs without undermining soundness, and instantiates SAFA with a lookup argument. Our experimental evaluation confirms that Reef can generate proofs for documents with 32M characters; the proofs are small and cheap to verify (under a second).
Paper: https://eprint.iacr.org/2023/1886
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0! (SOFTTECHHUB)
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
Building RAG with self-deployed Milvus vector database and Snowpark Container... (Zilliz)
This talk will give hands-on advice on building RAG applications with an open-source Milvus database deployed as a docker container. We will also introduce the integration of Milvus with Snowpark Container Services.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor... (Neo4j)
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Essentials of Automations: The Art of Triggers and Actions in FME (Safe Software)
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Generative AI Deep Dive: Advancing from Proof of Concept to Production (Aggregage)
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
GridMate - End to end testing is a critical piece to ensure quality and avoid... (ThomasParaiso2)
End to end testing is a critical piece to ensure quality and avoid regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
1. USE OF THE DATA UNCERTAINTY ENGINE (DUE) BY NATIONAL MAPPING AND CADASTRAL AGENCIES
Dipl.-Ing. Tomas Cajthaml
2. Agenda
1. Introduction
2. State of the art of the Czech cadastre
3. DUE software
4. Estimation of positional accuracy of points
5. Estimation of areas
6. Conclusions
Terminology note: in this presentation the terms uncertainty and accuracy are treated as identical.
3. Introduction
Data quality is still a marginal topic, but an important one in the process of SDI building.
NMCAs have their own systems for data production (Quality Management Systems), including data quality.
INSPIRE is trying to improve the quality standards that have to be established in the SDI, because of the SDI's growing usage and improvement.
Quality awareness is rising with INSPIRE (data specifications, GCM, technical guidelines).
4. Quality standards in production
[Diagram: quality standards along the production chain. Internal quality covers data capture (ISO 19158), production (ISO 19131, ISO 19157), and output (licencing policy, GeoRM, metadata); external quality covers selection (metadata catalogues, ISO 19115, OGC) and usage by users and clients on computers, tablets, and PDAs. Each stage is subject to audits, certification, and accreditation; access is governed by access control and SLAs.]
Edited according to: Y. Bedard – Geospatial Data Quality + Risk Management + Legal Liability = Evolving Professional Practices
5. State of the art of the Czech cadastre
◦ DKM (digital cadastral map) – the map with the highest positional accuracy, with most points in the range of up to 14 cm. This cadastral map is created by new cadastral mapping using accurate field surveying techniques.
◦ KMD (cadastral map digitized by readjustment) – a cadastral map created by reprocessing the available cadastral evidence. Cadastral parcels are digitized over transformed raster images (digitized points are identified from new and old survey sketches, documentation of detailed surveys of changes, etc.).
◦ Analogue cadastral map – old cadastral maps scanned as raster images. As the KMD progresses slowly and is costly, analogue cadastral maps are nowadays digitized into the UKM (simplified goal-directed cadastral map). The COSMC complied with requests from the Ministry of the Interior and municipalities to maintain the UKM as a simple vector image without the attribute values and techniques of the KMD.
6. Quality of cadastral maps

Quality code (previous classes of positional uncertainty) | Characteristic (standard coordinate error, with lineage of the point) | Lineage (source of measured points), in relation to old positional classes and mapping technology
3 | standard coordinate error < 0.14 m | field surveying with agreement of land owners
4 | standard coordinate error < 0.26 m | photogrammetry
6 | - | digitized points from maps at 1:1000
7 | - | digitized points from maps at 1:2000
8 | digitized points from old maps at 1:5000 and smaller scales, plus high positional-uncertainty points without agreement of land owners | other digitalization; surveying with agreement of land owners
7. Data Uncertainty Engine
Gerard B. M. Heuvelink – professor, Wageningen University and Research Centre, Netherlands
James D. Brown – Institute for Biodiversity and Ecosystem Dynamics, University of Amsterdam, Netherlands
Created within HarmoniRiB: www.harmonirib.com
DUE is software for the estimation of:
◦ Positional accuracy (uncertainty)
◦ Temporal accuracy (uncertainty)
◦ Attribute accuracy (uncertainty)
Data attributes:
◦ Numerical variables (e.g. rainfall)
◦ Discrete numerical variables (e.g. bird counts)
◦ Categorical variables (e.g. land cover)
Supported file formats:
◦ ESRI shapefiles *.shp
◦ Simplified GeoEAS *.eas
◦ ASCII raster *.asc
◦ ASCII file for simple time series *.tsd
8. Sources of uncertainty
Basic cycle – 5 stages / basic steps:
1. Importing (saving) data as objects with attributes
2. Describing the sources of uncertainty
3. Defining an uncertainty model, aided by the description
4. Evaluating the quality or goodness of the uncertainty model
5. Generating realizations of uncertain data for use in MCS (Monte Carlo Simulation) with models
[Diagram: model description, parameters, and uncertainty states feed the model definition; input data, model structure, and output are linked as Data ± U -> Model ± U -> Output ± U]
In: Brown J. – Results on assessing uncertainties in data and models
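The cycle can be made concrete with a minimal Python sketch; it assumes a simple Gaussian positional error model and invented coordinates, and it is not the DUE code itself.

```python
# Minimal sketch of the five-step cycle, assuming a simple Gaussian
# positional error model; this is not the DUE code itself.
import numpy as np

rng = np.random.default_rng(1)

# 1. Import data as objects with attributes (here: 2D point coordinates).
points = np.array([[1020.4, 2311.9], [1044.1, 2298.3], [1061.7, 2330.0]])

# 2.-3. Describe the uncertainty source and define the model:
# independent zero-mean Gaussian noise per axis. The 1.33 m standard
# deviation anticipates the figure estimated later in the deck (slide 11).
sigma = 1.33

# 4. The goodness of the model would be judged against validation data;
# here it is simply assumed.

# 5. Generate realizations of the uncertain data for Monte Carlo
# simulation with downstream models.
n_real = 500
noise = rng.normal(0.0, sigma, size=(n_real,) + points.shape)
realizations = points[None, :, :] + noise
print(realizations.shape)  # (500, 3, 2): 500 equally probable datasets
```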
9. Positional accuracy of point estimation
Positional accuracy of surveyed points; an analogue cadastral map as an example.
Evaluation and comparison of two data sets:
◦ Digitized analogue cadastral map
◦ Universe of discourse = laser scanning data
-> A probability distribution function is created, based on the differences between the coordinates of identical points.
10. Step by step approach
1. digitization of the analogue cadastral map,
2. acquisition of samples of spatial data in the test area by mobile laser scanning (establishing the universe of discourse of the data set),
3. point cloud digitization – obtaining corner points of buildings identical with the cadastral map content in 3D; they will be used to determine/derive the probabilistic error model,
4. creation of a 2D digitized design file – MicroStation Bentley SELECT series 2 was used to digitize the 3D design file (this is a simple step: converting the 3D file into 2D),
5. evaluation of systematic error (bias) – calculation of the systematic error, or spatial statistics (geostatistics), or evaluation of its variogram,
6. determination of the probability model parameters,
7. generation of realizations by the Monte Carlo method (steps 5 and 6 are sketched in code below).
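Steps 5 and 6 can be sketched as follows, assuming the matched corner points are available as (x, y) coordinate arrays; the numbers are invented for illustration.

```python
# Sketch of steps 5-6: bias and probability-model parameters from
# identical points. Coordinates are invented for illustration.
import numpy as np

# Identical points: digitized cadastral map vs. laser-scan reference.
digitized = np.array([[500.1, 120.3], [512.8, 131.0], [498.7, 140.9]])
reference = np.array([[498.9, 121.5], [511.2, 132.6], [497.0, 142.2]])

d = digitized - reference           # per-point coordinate differences
bias = d.mean(axis=0)               # systematic error (step 5)
d0 = d - bias                       # residuals after bias removal
var = (d0 ** 2).sum(axis=1).mean()  # positional variance (step 6)
sigma = np.sqrt(var)                # parameter of the probability model
print(f"bias = {bias}, sigma = {sigma:.2f} m")
```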
11. Probability Distribution Function
[Histogram: position deviation classes in m, with frequency counts and cumulative %; sample = building corners from laser scanning = universe of discourse]
Position deviation: $O_{xy} = \sqrt{dx^2 + dy^2} = 2.41\ \mathrm{m}$
Variance: $\sigma^2(X) = \mathrm{var}(X) = D(X) = \frac{1}{n}\sum_{i=1}^{n}\left(x_i - E[x]\right)^2 = 1.78\ \mathrm{m}^2$
Standard deviation: $\sigma = \sqrt{D(X)} = \sqrt{\mathrm{var}(X)} = 1.33\ \mathrm{m}$
12. Area of a lot estimation
Use of the same data sets.
Calculate the areas of lots from the laser scanning data -> compare them with the digitized areas, to improve the area values (see the Monte Carlo sketch after this slide).
Calculate global or local marginal deviations to flag the need to recheck/resurvey/recalculate areas.
Important for purposes of:
◦ Taxation
◦ Subsidies (e.g. for farmers)
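A minimal sketch of the Monte Carlo area estimation, assuming the Gaussian point error model from the previous slides (sigma taken from slide 11) and an invented parcel:

```python
# Sketch: uncertainty of a parcel area by Monte Carlo, perturbing the
# vertices with the Gaussian point error model. Vertices are invented.
import numpy as np

rng = np.random.default_rng(7)

def shoelace_area(xy):
    """Area of a simple polygon given its vertices in order."""
    x, y = xy[:, 0], xy[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

parcel = np.array([[0.0, 0.0], [42.0, 3.0], [45.0, 30.0], [2.0, 28.0]])
sigma = 1.33  # per-coordinate standard deviation, in metres (slide 11)

areas = np.array([
    shoelace_area(parcel + rng.normal(0.0, sigma, size=parcel.shape))
    for _ in range(2000)
])
print(f"area = {areas.mean():.0f} m^2 +/- {areas.std():.0f} m^2")
```

The spread of the simulated areas gives the marginal deviation directly, which is the quantity used above to flag parcels for rechecking.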
13. Conclusions
Calculating tolerances for control measurements of geographic databases – good for checking new survey sketches and detecting problematic areas.
Calculating complicated areas with Monte Carlo simulation is easier than with other methods.
Improve or confirm the estimation of data quality – testing the quality codes of points with samples and with realizations from DUE -> output in metadata.
It would also be easy to present positional accuracy this way for INSPIRE purposes.
14. USE OF THE DATA UNCERTAINTY ENGINE (DUE) BY NATIONAL MAPPING AND CADASTRAL AGENCIES
Thank you very much for your attention
Dipl.-Ing. Tomas Cajthaml
Many thanks to:
• GEOVAP Pardubice – for laser scanning data and trial software
• Bentley Systems – for MicroStation and Descartes trial software