DTM Quality Assessment

THE NATIONAL TECHNICAL-SCIENTIFIC CONFERENCE "Modern Technologies for the 3rd Millennium" – ORADEA, 2009

DTM QUALITY ASSESSMENT

DROJ Gabriela¹

ABSTRACT: The usage of Geographic Information Systems (GIS) has increased rapidly, and GIS has become the main tool for analyzing spatial data in many engineering applications and decision-making activities. These applications and activities increasingly require a three-dimensional reference surface, known as a Digital Terrain Model (DTM). Quality assessment of data is a critical parameter for DTM production, and it relies heavily on statistical methods. In contrast, visual methods are generally neglected despite their potential for improving DTM quality. In this paper, several enhanced visual techniques for quality assessment are described and illustrated with areas and datasets selected from the hills of Oradea.

KEYWORDS: GIS, spatial data quality, DTM

1. INTRODUCTION

The change from paper maps to digital data, in various kinds of geographical data analysis and applications, has made it easy to use the same spatial data for different applications and to combine several layers into quite complex spatial models. Using GIS technology, contemporary maps have taken radically new forms of display, beyond the usual 2D planimetric paper map. Today it is expected that spatial information can be draped over a 3D view of the terrain. This 3D view of the terrain is called a Digital Terrain Model (DTM) or Digital Elevation Model (DEM) [2].

Digital terrain models are frequently used to support important decisions, such as answering hydrological questions concerning flooding. In order to judge these decisions, the quality of the DTM must be known. Unfortunately, the quality of a DTM is rarely checked: while the development of GIS advances, DTM quality research has so far been neglected. Quality assessment of data is a critical parameter for DTM production, and it relies heavily on statistical methods.
In contrast, visual methods are generally neglected despite their potential for improving DTM quality. In this paper, several enhanced visual techniques for quality assessment are described and illustrated with areas and datasets selected from the hills of Oradea.

¹ Lecturer, PhD, University of Oradea, Faculty of Architecture and Constructions, e-mail: g_abyg@yahoo.ro

2. DIGITAL TERRAIN MODELING FOR GIS APPLICATIONS

A DTM is a digital representation of the ground surface, topography or terrain. It is also known as a Digital Elevation Model (DEM). The DTM can
be represented as a raster (a grid of squares), as contour lines, or as a triangular irregular network (TIN) [3,5,8].

SECTION: Architecture, Construction, Cadastral Survey, Sanitary Engineering and Environmental Protection

In GIS applications, and according to the relevant bibliography, the following two methodologies are frequently used for 3D DTM modelling in GIS: (i) the cartographic interpolation-based digitizing method, and (ii) the image-based automatic measurements method [6].

(i) The cartographic interpolation-based digitizing method is widely used because topographic maps are usually available. The input data, consisting of points, form the basis for the computation of the DTM; the computation itself consists of spatial interpolation. Several interpolation methods are frequently used in GIS. The following eight widely used methods are compared and studied in this paper: Inverse Distance Weighted (IDW), biquadratic spline interpolation, bicubic spline interpolation, B-spline interpolation, nearest neighbours (Voronoi diagrams), Delaunay triangulation, quadratic Shepard interpolation and Kriging interpolation [1,6].

(ii) The image-based automatic measurements method is based on close-range photogrammetry or airborne laser scanning and outputs bulk points with a high density. The DTM for 3D GIS modelling applications is then produced in a post-processing phase, usually by creating a TIN or by interpolation.

3. DTM QUALITY ASSESSMENT

The quality of spatial analysis depends on data quality, on data model relevance and on the way they interact. The model is a conceptualization, representation and abstraction of the real world: a selected representation of space, time or attributes. Model relevance is a semantic quality of the representation, by which a complex reality is captured.
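Of the eight interpolation methods enumerated above, Inverse Distance Weighted is the simplest to state: each unknown elevation is a weighted average of the known elevations, with weights decaying with distance. A minimal sketch follows; the sample coordinates, elevations and the power parameter p = 2 are illustrative assumptions, not values from the paper:

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse Distance Weighted interpolation: a weighted average of the
    known values, with weights 1/d**power so that nearby samples dominate."""
    xy_known = np.asarray(xy_known, dtype=float)
    z_known = np.asarray(z_known, dtype=float)
    out = np.empty(len(xy_query))
    for i, q in enumerate(np.asarray(xy_query, dtype=float)):
        d = np.linalg.norm(xy_known - q, axis=1)
        if d.min() < eps:                 # query coincides with a sample point
            out[i] = z_known[d.argmin()]
            continue
        w = 1.0 / d**power
        out[i] = np.dot(w, z_known) / w.sum()
    return out

# Illustrative spot elevations (x, y in metres; z is the altitude):
pts = [(0, 0), (500, 0), (0, 500), (500, 500)]
z = [120.0, 140.0, 130.0, 150.0]
# The grid centre is equidistant from all four samples, so the weights
# are equal and IDW reduces to the plain mean of the known values:
print(idw(pts, z, [(250, 250)]))   # [135.]
```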
Data quality refers to the performance of the dataset, given the specification of the data model [7,10]. One of the goals of DTM quality assessment is to fulfil the requirements of spatial data standards. ISO distinguishes five elements of data quality: completeness, logical consistency, and three types of accuracy (positional, temporal and thematic). This paper is concerned with accuracy, defined as the difference between the value of a variable as it appears in a dataset and the value of the variable in the data model (or "reality"); more specifically, we refer to positional accuracy. Depending on the nature of the data, we can distinguish between absolute and relative accuracy. The term precision is considered a component of accuracy, related to the scale, resolution and generalization of datasets.
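The distinction between accuracy and precision can be made concrete: the RMSE of check-point residuals measures the total error, including any systematic offset, while their standard deviation measures only the spread. A small sketch with invented residuals carrying a deliberate bias of about +2 m:

```python
import numpy as np

# Hypothetical residuals (interpolated minus true altitude, in metres):
residuals = np.array([1.5, 2.3, 1.8, 2.6, 2.1, 1.7])

bias = residuals.mean()                  # systematic component
sigma = np.std(residuals, ddof=1)        # precision: spread of the residuals
rmse = np.sqrt(np.mean(residuals**2))    # accuracy: bias and spread together

print(f"bias={bias:.2f}  sigma={sigma:.2f}  RMSE={rmse:.2f}")
# With a non-zero bias, RMSE is clearly larger than sigma, which is why
# both figures are needed to characterize a DTM.
```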
The term error is used for lack of quality, that is, little or no accuracy. In addition to mistakes, it also refers to the statistical concept of variation. Variation corresponds to random errors; incorrect spatial variation can therefore be considered systematic or gross error. According to these definitions, a level of accuracy (or error) can be described with a root mean square error (RMSE), and precision with a standard deviation or standard error (σ).

The proposed procedure for quality assessment of spatial datasets, and especially of a DTM, comprises the following steps: (1) preparing the datasets; (2) processing with statistical or visual methods; (3) obtaining results as numbers, thematic maps, graphs, etc.; (4) analysis (comparison with expected results); and (5) obtaining metadata or corrected datasets (see Figure 1) [7].

Figure 1: The five-step procedure for quality assessment of a DTM [7]

4. DTM QUALITY ASSESSMENT: EXPERIMENTS WITH REAL-WORLD DATA

DTMs are the most popular results of interpolation. In the following, we test different methods and algorithms for the creation of a DTM in order to establish a minimum set of parameters for comparing and evaluating the quality of the resulting data. To test and compare the methods with real data, we selected an area in the northern hills of the Oradea municipality. For the first DTM we used photogrammetric measurements of spot elevations from an orthorectified airborne image of the area. The TIN of the area was generated using ArcGIS Desktop 9.1; the resulting 3D model is shown below.
Figure 2: Oradea 3D model

This model is considered the reference for testing and evaluating the most popular interpolation algorithms used for DTM creation: Inverse Distance Weighted (IDW), biquadratic spline interpolation, bicubic spline interpolation, B-spline interpolation, nearest neighbours (Voronoi diagrams), Delaunay triangulation, quadratic Shepard interpolation and Kriging interpolation.

In the first step, we created a regular grid with a step of 500 m, for a total of 30 points, and tested the algorithms listed above on this set of points; the results are shown in Figure 3. In the second step, we created a regular grid with a step of 250 m, for a total of 121 points, and tested the same algorithms; the results are shown in Figure 4.
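The sampling grids described above can be generated programmatically. The sketch below assumes a hypothetical square extent of 2500 m × 2500 m, chosen only because a 250 m step then yields 11 × 11 = 121 nodes, matching the second test case; the paper does not state the actual bounding box:

```python
import numpy as np

def regular_grid(xmin, ymin, xmax, ymax, step):
    """Nodes of a regular grid with the given step covering the extent."""
    xs = np.arange(xmin, xmax + step / 2, step)
    ys = np.arange(ymin, ymax + step / 2, step)
    gx, gy = np.meshgrid(xs, ys)
    return np.column_stack([gx.ravel(), gy.ravel()])

grid_250 = regular_grid(0, 0, 2500, 2500, 250)
print(len(grid_250))   # 121
```

Each grid node would then be assigned an elevation sampled from the reference model before running the interpolators.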
Figure 3: DTM models generated from 30 known points: a) Inverse Distance Weighted; b) biquadratic spline interpolation; c) bicubic spline interpolation; d) B-spline interpolation; e) nearest neighbours (Voronoi diagrams); f) Delaunay triangulation; g) quadratic Shepard interpolation; h) Kriging interpolation
Figure 4: DTM models generated from 121 known points: a) Inverse Distance Weighted; b) biquadratic spline interpolation; c) bicubic spline interpolation; d) B-spline interpolation; e) nearest neighbours (Voronoi diagrams); f) Delaunay triangulation; g) quadratic Shepard interpolation; h) Kriging interpolation
The analysis of the results was made by direct observation, through visual comparison of the models, and by using statistical parameters. The visual comparison shows a high similarity to the reference for Delaunay triangulation and Shepard interpolation in both test cases.

In order to evaluate the generated surfaces with statistical methods, it is necessary to test the quality of the surfaces against an independent set of data that was not used in the interpolation. In the following, we evaluate the generated surfaces using a random set of points for which the true altitude was determined. For this independent random set we computed the following statistical parameters: variance, median absolute deviation, standard deviation and root mean square error. The determined values are presented in the table below (columns give the 500 m and 250 m grid cases).

Table 1: Statistical analysis

Method        Variance         Median abs. dev.   Std. deviation    RMSE
              500 m   250 m    500 m   250 m      500 m   250 m     500 m   250 m
IDW           231     167      15.21   15.21      15.21   12.29     13.48   10.93
Bi-quadratic  390     426      19.75   19.75      19.75   20.64     15.35   13.84
Bi-cubic      947     1044     30.77   30.77      30.77   32        19.14   17.14
B-spline      166.2   145      12.89   12.89      12.89   12        12.54   10.37
Voronoi       246     94.5     15.71   15.71      15.71   9.75      13.79   9.27
Delaunay      218     36.63    14.78   14.78      14.78   6.05      13.34   7.99
Shepard       164     71.78    12.81   12.81      12.81   8.47      11.97   8.49
Kriging       160.06  78.97    12.65   12.65      12.65   8.88      11.75   8.94

5. RESULTS

The results obtained in both cases show that the most accurate surfaces are generated, in the first case (500 m grid), by the Kriging, Shepard and B-spline algorithms, and in the second case (250 m grid) by Delaunay triangulation, followed by Shepard and Kriging. Evaluating all the statistical data, we notice that Delaunay triangulation represents the optimal method.
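The four parameters of Table 1 can be computed from an independent check set in a few lines. The check-point values below are invented for illustration; in the paper's setting, `z_pred` would come from each of the eight interpolators in turn:

```python
import numpy as np

def dtm_check_stats(z_true, z_pred):
    """Variance, median absolute deviation, standard deviation and RMSE of
    the errors at independent check points not used in the interpolation."""
    err = np.asarray(z_pred, dtype=float) - np.asarray(z_true, dtype=float)
    return {
        "variance": np.var(err, ddof=1),
        "mad": np.median(np.abs(err - np.median(err))),
        "std": np.std(err, ddof=1),
        "rmse": np.sqrt(np.mean(err**2)),
    }

# Invented check points: true vs. interpolated altitudes (metres):
z_true = [112.0, 125.5, 131.0, 140.2, 118.7]
z_pred = [110.5, 128.0, 129.4, 143.0, 117.9]
stats = dtm_check_stats(z_true, z_pred)
print({k: round(v, 2) for k, v in stats.items()})
```

Note that the RMSE is taken over the raw errors (accuracy), while the variance and standard deviation are taken about the mean error (precision), matching the distinction drawn in Section 3.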
Similar results can be obtained with Kriging and Shepard interpolation; these methods are sometimes even more efficient than Delaunay triangulation. Nevertheless, the Delaunay algorithm is recommended because it needs less computing time
and does not change the original values of the points. The B-spline algorithm also gives good results, but its computing time is much higher and it smooths the surface, which makes the method inadequate for surfaces with large altitude differences.

6. CONCLUSIONS

The first conclusion, supported by the values presented in Table 1 and by the analysis of the visual results in both cases, is that the input data form the basis for the computation of the DTM: the density of the known points is a more important factor in increasing DTM quality than the algorithm used in surface creation.

A second conclusion is that no algorithm is optimal in every situation; the results given by the different computational methods are influenced by several factors, such as the conformation of the terrain, the density of the initial points, the quality of the known values and, not least, the algorithm used.

The last conclusion is that the quality of the DTM can be improved by using the Delaunay algorithm together with a high density of known points. Regarding the computation of the Delaunay triangulation, at least three points are necessary for each hill or valley.

BIBLIOGRAPHY

1. ASIM M. R., GHULAM M. & BRODLIE K. (2004), Constrained Visualization of 2D Positive Data using Modified Quadratic Shepard Method, WSCG'2004, Plzen, Czech Republic.
2. BENWELL G. (1996), A Land Speed Record? Some Management Issues to Address, Int'l Conference on Managing GIS for Success, Melbourne, Australia, pp. 70-75, ISBN 0 7325 1359 6.
3. DU CHONGJIANG (1996), An Interpolation Method for Grid-Based Terrain Modeling, The Computer Journal, Vol. 39, No. 10.
4. ESRI (1999), Environmental Systems Research Institute, Inc., Arc/Info 8.0.1 software.
5. DE FLORIANI L., PUPPO E., MAGILLO P. (1999), Applications of Computational Geometry to Geographic Information Systems, Handbook of Computational Geometry, Elsevier Science.
6. KAREL W., PFEIFER N., BRIESE C. (2006), DTM Quality Assessment, ISPRS Technical Commission II Symposium 2006, XXXVI/2, pp. 7-12.
7. PODOBNIKAR T. (2009), Methods for visual quality assessment of a digital terrain model, http://sapiens.revues.org, September 2009.
8. VAN KREVELD M. (1997), Algorithms for Triangulated Terrains, Conference on Current Trends in Theory and Practice of Informatics.