* This process was initiated almost a year ago, and only recently have tools been launched to do this, e.g. LASTools. It only required a list of the key LiDAR tile names and used an ArcView licence and SAGA (open source). This reduced operator time to approx. 5 minutes per project, with processing run overnight. It is quicker in the new LASTools, as the SAGA/DOS/Python approach required multiple conversions.
* OK, so we have created high-density outputs from the LiDAR, such as contours and DEMs. Now we can't do any large-scale analysis due to system issues…
Low-cost solutions for LiDAR and GIS analysis
Low Cost Solutions for LiDAR Point Cloud Analysis
Utilizing Open Source and Low Cost Software
LiDAR Technologies 2012, Cairns, 7th June 2012
Rahman Schionning and George Corea
Atherton Tablelands Geographic Information Services
Overview
• Where we were a year ago
• The search for new software
• SAGA demo – LAS capabilities
• Python integration
• Dealing with outputs and derivative products
• The 32-bit Windows shortfall
• More tools we are exploring
• A future option
So... you've gone and gotten yourself a point cloud! Now what?
What software have we already got?
• ArcGIS and MapInfo could handle derivative products like the 1 m ASCII DEM, 12.5 cm imagery and the 25 cm contours (in small portions)
• Erdas Imagine could import .las for rasterization and subsequent analysis
• Global Mapper worked well, but we only had one licence at the time
• Mars FreeView
Customized Tools
• SAGA/Python-based extraction of ".las" files
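A minimal sketch of driving SAGA's LAS import from Python by assembling a `saga_cmd` call. The tool library name (`io_shapes_las`), tool id, and the `-FILES`/`-POINTS` flags are assumptions for illustration; check `saga_cmd io_shapes_las` on your own install for the exact names.

```python
# Hypothetical wrapper around saga_cmd for importing one .las tile.
# Tool ids and flags below are assumed -- verify against your SAGA version.
import subprocess

def build_las_import_cmd(las_path, out_points):
    """Assemble the saga_cmd call for importing a .las tile."""
    return [
        "saga_cmd", "io_shapes_las", "1",   # tool id assumed; verify locally
        "-FILES", las_path,
        "-POINTS", out_points,
    ]

def import_tile(las_path, out_points):
    # subprocess.run raises CalledProcessError if saga_cmd exits non-zero
    subprocess.run(build_las_import_cmd(las_path, out_points), check=True)

cmd = build_las_import_cmd("tiles/tile_001.las", "out/tile_001_points")
print(" ".join(cmd))
```

Looping `import_tile` over the list of key LiDAR tile names is what lets a batch run overnight unattended.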
Dealing with outputs of LiDAR – ArcGIS-centred approaches
Even 10 km² blocks of 25 cm data (usually) crashed in vector processing.
The solution is to "chunk" the datasets and then merge after analysis.
The issue is that for the area of TRC this means…
Hundreds of files = high operator time for running processes = increased risk of operator error :-(
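The chunk-process-merge pattern above can be sketched in plain Python. The grid bucketing and the trivial `process` stand-in are illustrative only; in practice the chunks were ArcGIS tiles run through the real vector analysis.

```python
# Sketch of chunking a dataset spatially, processing each chunk,
# then merging the results -- a stand-in for the ArcGIS workflow.
from collections import defaultdict

def chunk_by_grid(points, tile_size):
    """Bucket (x, y) points into square tiles of side tile_size."""
    tiles = defaultdict(list)
    for x, y in points:
        tiles[(int(x // tile_size), int(y // tile_size))].append((x, y))
    return tiles

def process(tile_points):
    # stand-in for the real (expensive, crash-prone) vector analysis
    return len(tile_points)

points = [(0.5, 0.5), (1.5, 0.2), (0.1, 1.9), (1.1, 1.1)]
tiles = chunk_by_grid(points, tile_size=1.0)
merged = sum(process(p) for p in tiles.values())   # the merge step
print(len(tiles), merged)   # 4 4
```

The "hundreds of files" problem is exactly this loop done by hand; scripting it is what removes the operator time and the operator error.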
Complex models
Simplified to require minimal inputs… with Python scripting
Other complex "point clouds"
ArcGIS couldn't process datasets of ~1 million points at 30 m spacing (the same was true with MapInfo and QGIS).
We utilized…
• the UNIX sed command to reformat the txt dataset to a geocodable format,
• then Python to "chunk" the dataset into segments of 500k points,
• finally models to rejoin all the processed data.
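The Python chunking step above can be sketched as follows. The chunk size is shrunk to a toy value here; the actual sed reformatting command is not reproduced, since it depends on the source format.

```python
# Sketch of splitting an already-reformatted text dataset into
# fixed-size segments (500 000 points in our case) that ArcGIS
# could digest one at a time, then rejoining them.
from itertools import islice

def chunks(lines, size):
    """Yield successive lists of at most `size` lines."""
    it = iter(lines)
    while True:
        block = list(islice(it, size))
        if not block:
            return
        yield block

rows = [f"{i},{i * 2}" for i in range(7)]   # stand-in for the txt dataset
parts = list(chunks(rows, size=3))          # size=500_000 in production
rejoined = [row for part in parts for row in part]
print(len(parts), rejoined == rows)   # 3 True
```

Because the segments are written and processed independently, a crash in one segment costs minutes rather than the whole run.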
• Some hard lessons…
Dataset size: spatially splitting data significantly reduces processing time, even when multiple parts have to be merged at the completion of the process. Models including Python components have helped significantly.
"[If] there were originally 65,000 * 1000^2 = 6.5 E10 cells in the DEM. To represent each of these requires at least four ordered pairs of either 4-byte integer or 8-byte floating coordinates, or 32-64 bytes. That's a 1.3 E12 - 2.6 E12 byte (1.3 - 2.5 TB) requirement. We haven't even begun to account for file overhead (a feature is stored as more than just its coordinates), indexes, or the attribute values, which themselves could need 0.6 TB (if stored in double precision) or more (if stored as text), plus storage for identifiers. Oh, yes -- ArcGIS likes to keep two copies of each intersection around, thereby doubling everything. You might need 7-8 TB just to store the output. Even if you had the storage needed, (a) you might use twice this (or more) if ArcGIS is caching intermediate files and (b) it's doubtful that the operation would complete in any reasonable time, anyway."
A response to my query – http://gis.stackexchange.com/questions/16110/issues-with-large-datasets – which succinctly describes the issues with large datasets.
• Some hard lessons…
For example, a dataset which took 7 hours to process (and sometimes crashed) processed in about 100 minutes when split into 6 parts, and then took 10 minutes to merge.
We need to investigate multi-core, multi-threaded operations, but the currently available modules are too cumbersome to easily integrate into our models.
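A minimal sketch of the kind of multi-threaded chunk processing worth investigating: each split part goes to a worker, then the results are merged. The work function is a trivial stand-in, not the real ArcGIS step.

```python
# Sketch: run the split parts concurrently, then merge.
# process_part is a stand-in for one ~100-minute run on a split dataset.
from concurrent.futures import ThreadPoolExecutor

def process_part(part):
    return sum(part)

parts = [[1, 2], [3, 4], [5, 6]]          # the "6 parts" pattern, shrunk
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(process_part, parts))
merged = sum(results)                     # the quick merge step
print(results, merged)   # [3, 7, 11] 21
```

Threads only help when the heavy lifting releases the GIL (e.g. it happens in an external process); otherwise a process pool would be the analogous sketch.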
• Added "intelligence" to models so that unnecessary processes aren't run "brute force" in ArcGIS ModelBuilder
• Some steps took ~12 h to run, so this quick (~10 min) check saved ~10 h per model on models that took 30+ hours to complete
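The "intelligence" amounts to a cheap precondition before each expensive step. The real version was an ArcGIS ModelBuilder precondition; this stand-in just checks whether a step has anything to do before running it, and the function and path names are illustrative.

```python
# Sketch of a ~10-minute pre-check guarding a ~12-hour step:
# skip when there are no inputs or the output already exists.
import os

def run_step(step_name, inputs, output_path, expensive_fn):
    """Run expensive_fn only when there is work and no existing output."""
    if not inputs:
        print(f"{step_name}: no inputs, skipping")
        return None
    if os.path.exists(output_path):
        print(f"{step_name}: output already exists, skipping")
        return output_path
    return expensive_fn(inputs, output_path)

result = run_step("contours", [], "out/contours.shp", lambda i, o: o)
print(result)   # None -- the 12-hour step never ran
```

Chaining these guards through a 30-hour model is what turns reruns after a crash from days into hours.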
More Tools
• Global Mapper (~$350) – great for conversion and clipping; 3D viewing exists but is not as good as Mars FreeView; flood and viewshed analysis
• MeshLab (~$0) – only useful for visualization of small areas and objects in high resolution
• Google SketchUp (~$0-$500) – similar to MeshLab in usefulness for GIS
• Commercial (> $2,500)
  o Makai Voyager – resamples .las files to a proprietary format; highly functional for detailed analysis, but the commercial version is not yet launched
  o Point Tools – resamples .las files to a proprietary format; highly functional for detailed analysis
  o LASTools –
  o Mars Pro –