Automated LiDAR Data Quality Control

Conference presentation for the 2013 International LiDAR Mapping Forum (ILMF), held at the Hyatt Regency Denver at Colorado Convention Center in Denver, Colorado, February 11-13, 2013.



  1. Automated LiDAR Data Quality Control
     February 12, 2013
     Engineering | Architecture | Design-Build | Surveying | GeoSpatial Solutions
  2. Presenter
     Matt Bethel, GISP
     Director of Technology for Merrick & Company
     Development Manager for Merrick’s Advanced Remote Sensing (MARS) software
  3. Merrick & Company Office Locations
     500 employees at 13 national and 4 international offices
  4. Merrick’s International Project Experience
  5. Presentation Objective
     This presentation will review an automated approach to airborne LiDAR quality analysis and quality control (QA/QC) that is based on the USGS National Geospatial Program Lidar Base Specification Version 1.0. It will showcase a fully automated process for analyzing LiDAR data in its entirety to verify and report compliance with a project’s acceptance criteria.
  6. USGS NGP Lidar Base Specification Version 1.0
     - http://pubs.usgs.gov/tm/11b4/TM11-B4.pdf
     - Intended to create consistency across all USGS National Geospatial Program (NGP) funded LiDAR data collections, in particular those undertaken in support of the National Elevation Dataset (NED)
     - Unlike most other “LiDAR specs”, which focus on the derived bare-earth digital elevation model (DEM) product, this specification places unprecedented emphasis on the handling of the source LiDAR point cloud data
  7. Who should have LiDAR QA/QC concerns?
     - Data providers: to ensure that data meets project specifications prior to delivery
     - Client/end users (commercial entities; local, state, and federal organizations): to ensure that they are receiving the products that they purchased and require for their specific needs
     - Any purchaser of LiDAR data that requires a reliable process to determine whether final payment should be authorized
  8. The Problem – Client Side
     - RFPs and project scopes of work state accuracy requirements but…
       - rarely say anything about how those requirements will be tested
       - usually address absolute accuracy but not always relative accuracy
       - sometimes contradict themselves (“+/-15 cm RMSEz at the 95% C.I.”; see the sketch below)
       - are often copied from other documents, leaving the client not really knowing what they are asking for or understanding what they are getting
     - Almost everyone is asking for something slightly different
     - USGS Lidar Base Specification Version 1.0:
       - “We want that”
       - “We want pieces of that”
       - “We want to refer to that but ask for this”
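A note on why “+/-15 cm RMSEz at the 95% C.I.” is self-contradictory: RMSEz and 95% confidence are two different statistics. Under the NSSDA convention, vertical accuracy at the 95% confidence level is 1.9600 × RMSEz (assuming normally distributed error), so a spec quoting both in one number is ambiguous by roughly a factor of two. A minimal sketch with made-up checkpoint residuals:

```python
import numpy as np

# Hypothetical checkpoint residuals (LiDAR z minus surveyed z), in meters.
dz = np.array([0.05, -0.08, 0.11, -0.03, 0.07, -0.12, 0.02, 0.09])

rmse_z = np.sqrt(np.mean(dz**2))   # root-mean-square vertical error
acc_z95 = 1.9600 * rmse_z          # NSSDA vertical accuracy at 95% confidence
                                   # (assumes normally distributed error)

print(f"RMSEz          = {rmse_z:.3f} m")
print(f"Accuracy (95%) = {acc_z95:.3f} m")
# "15 cm RMSEz" and "15 cm at the 95% C.I." are requirements that differ
# by roughly a factor of two, so quoting them together is ambiguous.
```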
  9. The Problem – Vendor Side
     When contracted to QA LiDAR projects, we have seen a rise in poor-quality data as a trade-off to push the bidding price down.
     - Data providers vary the procedure, frequency, and extent of their LiDAR calibration
     - Many vendors use automated boresight tools, which can have negative outcomes:
       - Effective enough to be dangerous
       - Most do not consider all aspects of an error budget
       - They do not always find and flag flight planning or acquisition issues, sensor malfunctions, or human mistakes
       - A lower skill level is required
     - Oftentimes, little to no QA/QC procedures
     - Some “cheat” to get around proper calibration and other QC tasks:
       - Clipping off or reclassifying edge lap to avoid dealing with LiDAR boresight
       - Shifting tiles to a custom geoid derived from the vertical error to ground control
     - Some vendors can hide error through other creative techniques, especially if they discover problems after the plane has left the jobsite
     - These practices can be caught and/or avoided (see the sketch after this slide)
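One way such practices can surface is in checkpoint residual patterns. The sketch below is illustrative only (made-up numbers, not Merrick’s method): a near-constant vertical bias with very little spread across every tile can indicate data that was shifted to fit the control rather than properly calibrated.

```python
import numpy as np

# Hypothetical per-checkpoint vertical residuals, grouped by delivery tile.
residuals = {
    "tile_001": np.array([0.041, 0.039, 0.043]),
    "tile_002": np.array([0.040, 0.042, 0.038]),
    "tile_003": np.array([0.044, 0.037, 0.041]),
}

for tile, dz in residuals.items():
    print(f"{tile}: mean={dz.mean():+.3f} m  std={dz.std(ddof=1):.3f} m")

# A nearly identical mean bias in every tile with a tiny standard deviation
# is a red flag: honest, calibrated data shows scatter, not a block shift.
```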
  10. The Problem – Quality
      - QA/QC methodologies ranged from…
        - None
        - Checking a representative sample (what happens everywhere else?)
        - Checking some things but not others (e.g., absolute accuracy but not relative calibration)
        - Throwing many people and a lot of time at projects to manually check as much data as possible (or as much as the budget will allow)
        - Contracting it out; typically it is done right, but at added cost and delay
      - Clients rarely know how to properly review LiDAR data, nor do they have the tools to do so
      - We needed more automated tools to get quality answers quickly and accurately about our LiDAR data
  11. Our Goals
      - To check all airborne LiDAR data in an automated fashion:
        - Make it customizable
        - Make it usable
        - Make it accurate
        - Make it work across sensor platforms
        - Make it fast
      - Provide quantitative and qualitative results whenever possible
      - When this is impossible, create derivative products during the automated process that will help the user QC the data as quickly and thoroughly as possible
      - Create tools that catch problems before it is too late
      - Create links to supplemental data that can assist with the QC process
      - Create reports that the end user can understand
      - Deliver these reports to the client, or empower them to perform automated QA/QC analysis on their own data
      - Provide this tool to end users that have these challenges
  12. MARS Tool Development
      - We developed many stand-alone tools in MARS to analyze and report on many aspects of LiDAR QA/QC:
        - Control reporting tools (absolute accuracy)
        - Flightline vertical separation rasters (relative accuracy)
        - Point density reporting
        - Spatial distribution verification
        - Hillshades to check the LiDAR filter
        - LAS statistics
        - Intensity/range analysis
        - Void detection
        - Others
      - These tools run on the entire dataset and often produce a report or a single, manageable output raster, compressed to JPEG2000 format for fast display and small file size
      - Excluding control point reporting, the products of these tools report on all of the data, not a representative sample (a point density sketch follows this slide)
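As an illustration of the “run on the entire dataset” idea, point density reporting amounts to binning every return into a regular grid and counting. A minimal sketch, assuming the laspy library and a hypothetical input file (this is not the MARS implementation):

```python
import laspy          # pip install laspy (assumed library choice)
import numpy as np

CELL = 1.0  # grid cell size in map units (meters assumed)

las = laspy.read("swath_001.las")             # hypothetical input file
x, y = np.asarray(las.x), np.asarray(las.y)

# Bin every return into the grid and count points per cell.
cols = ((x - x.min()) / CELL).astype(int)
rows = ((y - y.min()) / CELL).astype(int)
counts = np.zeros((rows.max() + 1, cols.max() + 1))
np.add.at(counts, (rows, cols), 1)
density = counts / CELL**2                    # points per square map unit

covered = density[density > 0]                # ignore cells with no returns
print(f"mean density over covered cells: {covered.mean():.2f} pts/m^2")
print(f"covered cells below 2 pts/m^2:   {(covered < 2).sum()}")  # sample threshold
```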
  13. Modularization and Automation
      - We built a module in MARS that combines our stand-alone tools into an automated process that tests for the 29 USGS Lidar Base Specification V1.0 items
      - This creates two PDF reports (detailed and summary) plus subsequent derivative products
      - It is batched, and performance has been optimized to run on large datasets:
        - Multi-threaded
        - Effective RAM utilization
        - Temporary local disk caching for slower network processing needs
      - It is customizable, so some or all of the tests can be processed, depending on the need or the available input data
      - The output report and derivative data are both thematically rendered and statistically reported
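The batched, multi-threaded pattern described above can be pictured as a thread pool dispatching user-selected tests across tiles. A minimal sketch with illustrative stand-in test functions and tile names (hypothetical, not the MARS API):

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative stand-ins for individual QC tests (not the MARS API).
def check_density(tile):
    return (tile, "density", "pass")

def check_voids(tile):
    return (tile, "voids", "pass")

TESTS = {"density": check_density, "voids": check_voids}

def run_qc(tiles, selected):
    """Run only the selected tests over every tile, in parallel."""
    with ThreadPoolExecutor(max_workers=8) as pool:
        futures = [pool.submit(TESTS[name], tile)
                   for tile in tiles for name in selected]
        return [f.result() for f in futures]

print(run_qc(["tile_001.las", "tile_002.las"], ["density", "voids"]))
```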
  14. Results
      - A comprehensive and automated approach to checking the quality of all LiDAR point file deliverables in their ENTIRETY – no representative-sample testing
      - A tool that saves an enormous amount of manual QC labor hours and dollars
      - A workflow addition that eliminates costly rework and project delays
      - A process for data providers to deliver better products (first-time delivery acceptance) and invoice the customer sooner
      - A tool for end users to understand what level of data quality they are receiving and to provide proof of required rework; this also educates the client about their data investment
      - A mechanism for clients to decrease the delivery acceptance period and start using the data sooner
  15. Performance Benchmarks
      Run times depend on:
      - Data
        - Flightline overlap
        - Project boundary complexity
        - Number of project boundaries
        - Number of delivery tiles
        - LiDAR density
        - Land cover
        - LiDAR flightline distribution
      - Processing computer hardware
        - Number of CPUs
        - Amount of available RAM
        - Disk / network speed
      - Settings
        - All tests run versus selected tests
        - Optional derivative data produced
      A very rough processing speed (ratio of data size to processing time) is ~3 GB per hour on a high-end processing computer (8-16 CPUs and 12-48 GB of RAM).
      [Chart: MARS QC Module Benchmark Results – Runtime (hours, 0-45) vs. LiDAR Data Size (GB, 0-120)]
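Taking the slide’s ~3 GB per hour figure at face value, runtime planning reduces to one line. A trivial sketch (a rough estimate only; the factors listed above can swing it widely):

```python
def estimated_runtime_hours(data_size_gb, gb_per_hour=3.0):
    """Rough QC module runtime from the ~3 GB/hour figure on this slide."""
    return data_size_gb / gb_per_hour

print(f"{estimated_runtime_hours(120):.0f} h for a 120 GB project")  # ~40 h
```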
  16. Report Demo
  17. Future Developments
      - Workflow staged processing
        - Coverage check
        - Boresight
        - Filter
        - Delivery
      - Distributed processing
      - More user-definable LiDAR QA/QC tests
      - Additional LiDAR specifications
      - Horizontal accuracy measurement and reporting capabilities
  18. Thank you
      Matt Bethel
      Director of Technology
      Merrick & Company - Booth #45
      matt.bethel@merrick.com
      303-353-3662
      http://www.merrick.com/Geospatial
      http://www.merrick.com/Geospatial/Services/MARS-Software
