
GOAR: GIS Oriented Mobile Augmented Reality for Urban Landscape Assessment


This slide deck was presented at CMC2012 (the 4th International Conference on
Communications, Mobility, and Computing).
Abstract. This research presents the development of a mobile AR system for urban
landscape assessment that realizes geometric consistency using GIS, a gyroscope,
and a video camera mounted in a smartphone. A low-cost AR system with high
flexibility is developed, and geometric consistency between a video image and
3DCG is verified. In conclusion, the proposed system was evaluated as feasible
and effective.



  1. 4th International Conference on Communications, Mobility, and Computing
     (CMC2012), Guilin, China
     GOAR: GIS ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT
     TOMOHIRO FUKUDA, TIAN ZHANG, AYAKO SHIMIZU, MASAHARU TAGUCHI, LEI SUN and
     NOBUYOSHI YABUKI
     Division of Sustainable Energy and Environmental Engineering, Graduate
     School of Engineering, Osaka University, Japan
  2. Outline
     1. Introduction
     2. System Development
        2.1 Development Environment of the System
        2.2 System Flow
     3. Verification of the System
        3.1 Consideration of allowable residual error
        3.2 Accuracy of geometric consistency between a video image and 3DCG
     4. Conclusion
  3. Outline (repeated; section divider)
  4. 1.1 Motivation -1 (1. Introduction)
     In recent years, the need for landscape simulation has been growing.
     Review meetings on a future landscape are held on the planned
     construction site as well as in a conference room. It is difficult for
     stakeholders to concretely imagine an object that is three-dimensional
     and does not yet exist. Landscape visualization methods using Computer
     Graphics (CG) and Virtual Reality (VR) have been developed; however,
     they require much time and expense to make a 3D model. Moreover, since
     consistency with real space is not achieved when VR is used on a planned
     construction site, a reviewer cannot get an immersive experience.
     [Figures: a landscape study on site; VR capture of Kobe city]
  5. 1.1 Motivation -2 (1. Introduction)
     In this research, the authors focus on Augmented Reality (AR), which can
     superimpose 3DCG on an actual landscape acquired with a video camera.
     When AR is used, the landscape assessment object is embedded in the
     present surroundings; thereby, a drastic reduction of the time and
     expense involved in 3DCG modeling of the present surroundings can be
     expected. Smartphones are also now widely available on the market.
     [Figures: Sekai Camera Web (http://sekaicamera.com/); the smartphone
     market in Japan]
  6. 1.2 Previous Study (1. Introduction)
     In AR, realizing geometric consistency between a video image of an
     actual landscape and CG is an essential feature.
     Approach 1: use of physical sensors such as GPS (Global Positioning
     System) and a gyroscope. To realize highly precise geometric
     consistency, expensive special hardware is required.
     [Figure: evolution of mobile landscape AR; Image Sketch (2005), 2006.
     ©2012 Tomohiro Fukuda, Osaka-U]
  7. 1.2 Previous Study (1. Introduction)
     Approach 2: use of an artificial marker. Since the artificial marker
     must always be visible to the AR camera, the movable span of the user is
     limited. Moreover, to realize high precision, a large artificial marker
     is necessary.
     Yabuki, N., et al.: 2011, An invisible height evaluation system for
     building height regulation to preserve good landscapes using augmented
     reality, Automation in Construction, 20(3), 228-235.
     [Figure: artificial marker]
  8. 1.3 Aim (1. Introduction)
     In this research, the GOAR (GIS Oriented Mobile AR) system is developed.
     It realizes geometric consistency using GIS to obtain position data
     (instead of GPS, whose location information is of low accuracy),
     together with the gyroscope and video camera mounted in a smartphone. A
     low-cost AR system with high flexibility is realizable.
     [Figure: virtual object for landscape simulation]
  9. Outline (repeated; section divider)
  10. 2.1 Development Environment of the System (2. System Development)
      Smartphone: GALAPAGOS 003SH (Softbank Mobile Corp.), a standard-spec
      device
      Development languages: OpenGL ES (Ver. 2.0), Java (Ver. 1.6)
      Development environment: Eclipse Galileo (Ver. 3.5)
      Location estimation technology: GIS, comprising the Google Maps API
      and a Digital Elevation Model (DEM) with a 10 m mesh size
      Spec of the 003SH:
        OS:      Android 2.2
        CPU:     Qualcomm MSM8255 Snapdragon, 1 GHz
        Memory:  ROM 1 GB, RAM 512 MB
        Weight:  ≒140 g
        Size:    ≒W62×H121×D12 mm
        Display: 3.8 inch, 480×800 pixels
  11. 2.2 System Flow -1 (2. System Development)
      While the CG model achieves ideal rendering by the perspective drawing
      method, the rendering of a video camera produces lens distortion, so
      the video camera is calibrated using Android NDK and OpenCV (a
      simplified sketch of such a correction follows this slide).
      System flow: calibration of the video camera → definition of the
      landscape assessment 3DCG model → activation of the AR system →
      selection of the 3DCG model → (starting of Google Maps / activation of
      the gyroscope / activation of the video camera) → (input of the DEM /
      angle information acquisition / capture of the live video image) →
      position information acquisition → definition of the position and
      angle information of the CG virtual camera → superposition of the live
      video image and the 3DCG model → display of the AR image → saving of
      the AR image
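The deck states only that the camera is calibrated with Android NDK and OpenCV.
As a rough illustration of what the distortion-correction step involves, here
is a minimal Java sketch of the standard radial pinhole-camera model; the
intrinsic and distortion values are invented for illustration (they are not the
003SH's calibration results), and the one-step inversion is only a first-order
approximation.

```java
// Minimal sketch of radial lens-distortion correction (illustrative only;
// the actual system calibrates via Android NDK + OpenCV).
public final class Undistort {
    // Assumed intrinsics for a 480x800 image; NOT measured values.
    static final double FX = 700.0, FY = 700.0;  // focal length (px)
    static final double CX = 240.0, CY = 400.0;  // principal point (px)
    static final double K1 = -0.12, K2 = 0.03;   // radial distortion coeffs

    /** Approximates the ideal (undistorted) pixel for a distorted (u, v). */
    static double[] undistort(double u, double v) {
        double x = (u - CX) / FX, y = (v - CY) / FY; // normalized coordinates
        double r2 = x * x + y * y;
        double s = 1.0 + K1 * r2 + K2 * r2 * r2;     // radial model
        // First-order inverse: divide out the forward distortion scale.
        return new double[] { CX + FX * (x / s), CY + FY * (y / s) };
    }
}
```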
  12. 2.2 System Flow -2 (2. System Development)
      The landscape assessment 3DCG model is defined by:
      - 3DCG model: geometry, texture, unit
      - 3DCG model allocation file: 3DCG model name, file name, position
        data (longitude, latitude, orthometric height), degree of rotation
        angle, and zone number of the rectangular plane coordinate system
      - 3DCG model arrangement information file: number of the 3DCG model
        allocation information files, and each file's name
      (An illustrative sketch of the allocation record follows this slide.)
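The slide lists the contents of the allocation file without showing its format.
A minimal sketch of how those fields might be represented in the deck's Java
1.6 environment; the class and field names are assumptions, not the authors'
code.

```java
// Illustrative record for one entry of the 3DCG model allocation file.
final class ModelAllocation {
    String modelName;         // 3DCG model name
    String fileName;          // file name of the model data
    double longitude;         // position data
    double latitude;
    double orthometricHeight; // orthometric height (m)
    double rotationDegrees;   // degree of rotation angle
    int zoneNumber;           // zone number of the rectangular plane system
}
```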
  13. 2.2 System Flow -3 (2. System Development)
      [Figure: GUI of the developed system, annotated on the system flow
      diagram]
  14. 2.2 System Flow -4 (2. System Development)
      The angle of the CG virtual camera is expressed as yaw, pitch, and
      roll (a sketch of composing this rotation follows this slide).
      [Figure: coordinate system of the developed AR system]
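To make the yaw/pitch/roll convention concrete, the following is a minimal
sketch of composing the CG virtual camera's rotation with android.opengl.Matrix
from the deck's OpenGL ES / Java environment. The rotation order and axis
mapping are illustrative assumptions; the deck does not state them.

```java
import android.opengl.Matrix;

public final class ArCameraRotation {
    /** Builds a 4x4 column-major rotation matrix for OpenGL ES. */
    static float[] viewRotation(float yawDeg, float pitchDeg, float rollDeg) {
        float[] m = new float[16];
        Matrix.setIdentityM(m, 0);
        // Assumed order: roll about the view axis, then pitch, then yaw.
        Matrix.rotateM(m, 0, rollDeg, 0f, 0f, 1f);
        Matrix.rotateM(m, 0, pitchDeg, 1f, 0f, 0f);
        Matrix.rotateM(m, 0, yawDeg, 0f, 1f, 0f);
        return m;
    }
}
```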
  15. 2.2 System Flow -5 (2. System Development)
      Position information acquisition:
      1. The user taps the current location on Google Maps.
      2. The position data (longitude, latitude) of the current location is
         obtained.
      3. The altitude is derived from the position data (longitude,
         latitude) and the DEM (see the interpolation sketch after this
         slide).
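Slide 19 notes that altitudes between mesh vertices are linearly interpolated,
so step 3 can be read as bilinear interpolation over the 10 m DEM grid. A
minimal sketch under that assumption; the grid layout, projected coordinates,
and names are illustrative, not the system's actual DEM interface.

```java
// Illustrative bilinear interpolation over a 10 m mesh DEM.
public final class Dem {
    final double originE, originN;   // grid origin (projected metres)
    static final double CELL = 10.0; // 10 m mesh size
    final double[][] height;         // height[row][col] in metres

    Dem(double originE, double originN, double[][] height) {
        this.originE = originE; this.originN = originN; this.height = height;
    }

    /** Altitude at projected point (e, n); assumes (e, n) lies inside the grid. */
    double altitude(double e, double n) {
        double gx = (e - originE) / CELL, gy = (n - originN) / CELL;
        int c = (int) Math.floor(gx), r = (int) Math.floor(gy);
        double fx = gx - c, fy = gy - r;   // fractions within the cell
        double hx0 = height[r][c] + (height[r][c + 1] - height[r][c]) * fx;
        double hx1 = height[r + 1][c] + (height[r + 1][c + 1] - height[r + 1][c]) * fx;
        return hx0 + (hx1 - hx0) * fy;     // interpolate along x, then y
    }
}
```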
  16. 2.2 System Flow -6 (2. System Development)
      [Figure: the complete system flow diagram]
  17. Evolution of Mobile Landscape AR
  18. Outline (repeated; section divider)
  19. 3.1 Consideration of Allowable Residual Error (3. Verification of
      System)
      A residual error in position (longitude, latitude) arises from the gap
      between the point the user taps on Google Maps and the actual
      position. When the digital map on Google Maps is at maximum zoom,
      123 m of distance in real space corresponds to a 78 mm screen; that
      is, 1 mm on the screen is about 1.6 m in real space. Since a tap is
      operated with a finger, a residual error of up to the width of the
      fingertip may occur; as finger width differs between individuals, it
      was set to 5 mm in this research. Therefore, taking the map scale and
      the finger width into consideration, the error in specifying latitude
      and longitude is set to less than 8 m (5 mm on screen ≒ 8 m in real
      space; 1 mm on screen = 1.6 m in real space). A worked check of this
      arithmetic follows this slide.
      Regarding the residual error of altitude, the 10 m mesh DEM cannot
      respond to changes of altitude that occurred after the model was
      created, and a difference from reality may also occur because the
      altitudes between mesh vertices are linearly interpolated.
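A worked check of the slide's arithmetic, using only the numbers given there
(123 m of real space per 78 mm of screen, 5 mm finger width):

```java
// Worked check of the allowable-error estimate (values from the slide).
public final class AllowableError {
    public static void main(String[] args) {
        double mapSpanM = 123.0;  // real-space span shown on screen (m)
        double screenMm = 78.0;   // screen size at maximum zoom (mm)
        double fingerMm = 5.0;    // assumed finger width for a tap (mm)
        double mPerMm = mapSpanM / screenMm;                   // ~1.58 m per mm
        System.out.printf("1 mm on screen = %.1f m%n", mPerMm);            // 1.6 m
        System.out.printf("5 mm tap error = %.1f m%n", fingerMm * mPerMm); // ~7.9 m, i.e. < 8 m
    }
}
```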
  20. 3.2 Accuracy of Geometric Consistency between a Video Image and 3DCG
      -1 (3. Verification of System)
      Experimental methodology:
      - The parameters for realizing geometric consistency are position
        (latitude, longitude, altitude, obtained by GIS) and angle (yaw,
        pitch, roll, obtained by the gyroscope).
      - The accuracy of geometric consistency is determined by the combined
        residual errors of these parameters.
      - A known building and a known viewpoint place are set up.
      - In each experiment, only one group of parameters is acquired from a
        device; the remaining parameters are fixed at known values.
      - The residual error between the live video image and the CG is
        calculated at the same points.
  21. 3.2 Accuracy of Geometric Consistency -2 (3. Verification of System)
      Known building target:
      - GSE Common East Building at Osaka University Suita Campus
      - W 29.6 m, D 29.0 m, H 67.0 m
      - Latitude, longitude, orthometric height: 34.823026944,
        135.520751389, 60.15
      [Figures: photo, dimensioned drawing, and outlined 3D model of the
      building]
  22. 3.2 Accuracy of Geometric Consistency -3 (3. Verification of System)
      Known viewpoint place:
      - Reference point No.14-563; the distance from the reference point to
        the center of the building target was 203 m.
      - The AR system was installed on a tripod at a level height of 1.5 m.
      - Latitude, longitude, altitude of the reference point: 34.82145699,
        135.519612, 53.1
      [Figure: site plan showing the reference point, the building target,
      the measuring points of residual error (A-D), and site altitudes
      (maximum 53.5 m, reference point 53.1 m, minimum 51.0 m)]
  23. 3.2 Accuracy of Geometric Consistency -4 (3. Verification of System)
      Parameter settings of the eight experiments (S: static value = known
      value; D: dynamic value = acquired from a device):

      Experiment | Latitude | Longitude | Altitude | Yaw | Pitch | Roll
      No.1       | S        | S         | S        | S   | S     | S
      No.2       | D (GIS)  | D (GIS)   | D (GIS)  | S   | S     | S
      No.3       | D (GIS)  | D (GIS)   | D (GIS)  | D   | D     | D
      No.4 1)    | D (GPS)  | D (GPS)   | D (GPS)  | D   | D     | D

      1) T. Fukuda, T. Zhang, A. Shimizu, M. Taguchi, L. Sun and N. Yabuki,
      "SOAR: Sensor Oriented Mobile Augmented Reality for Urban Landscape
      Assessment", Proceedings of the 17th International Conference on
      Computer Aided Architectural Design Research in Asia (CAADRIA),
      pp. 387-396, 2012.
  24. 3.2 Accuracy of Geometric Consistency (3. Verification of System)
      Calculation procedure of residual error (a sketch follows this slide):
      1. Pixel error: the difference between the live image and the CG model
         in the horizontal and vertical directions at each of the four
         measured points, in pixels (Δx, Δy).
      2. Distance error: from the acquired (Δx, Δy), the differences in the
         horizontal and vertical directions are converted to metres by
         formulas (1) and (2):
           ΔX = (W / x) · Δx   (1)
           ΔY = (H / y) · Δy   (2)
         where W is the actual width of the object (m), H the actual height
         of the object (m), x the width of the 3DCG model on the AR image
         (px), and y the height of the 3DCG model on the AR image (px).
      [Figure: calculation image of the residual error between the live
      video image and CG]
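A minimal sketch of formulas (1) and (2) as reconstructed above; the sample
values in main are illustrative, not measurements from the experiment.

```java
// Converting pixel errors to distance errors per formulas (1) and (2).
public final class ResidualError {
    /** Formula (1): horizontal distance error, dX = (W / x) * dx, in metres. */
    static double dX(double wMetres, double xPixels, double dxPixels) {
        return wMetres / xPixels * dxPixels;
    }
    /** Formula (2): vertical distance error, dY = (H / y) * dy, in metres. */
    static double dY(double hMetres, double yPixels, double dyPixels) {
        return hMetres / yPixels * dyPixels;
    }
    public static void main(String[] args) {
        // Building target W = 29.6 m; on-image width and pixel error assumed.
        System.out.println(dX(29.6, 250.0, 10.0)); // ~1.18 m for a 10 px error
    }
}
```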
  25. 3.2 Accuracy of Geometric Consistency, Results: No.1 (3. Verification
      of System)
      Position (latitude, longitude, altitude) and angle (yaw, pitch, roll)
      all static.
      [Figure: AR image (0.12 m/pixel); charts of pixel error and distance
      error (max./mean/min.) for No.1]
  26. 3.2 Accuracy of Geometric Consistency, Results: No.2
      Position dynamic (GIS); angle static.
      [Figure: AR image (0.12 m/pixel); charts of pixel error and distance
      error (max./mean/min.) for No.2]
  27. 3.2 Accuracy of Geometric Consistency, Results: No.3
      Position dynamic (GIS); angle dynamic.
      [Figure: AR image (0.12 m/pixel); charts of pixel error and distance
      error (max./mean/min.) for No.3]
  28. 3.2 Accuracy of Geometric Consistency, Results: No.4
      Position dynamic (GPS); angle dynamic.
      [Figure: AR image (0.12 m/pixel); charts of pixel error and distance
      error (max./mean/min.) for No.4]
  29. 3.2 Accuracy of Geometric Consistency, Discussion (3. Verification of
      System)
      - The allowable residual error of longitude and latitude is 8 m at the
        maximum.
      - In the result of No.3, the maximum residual error was 6.5 m and the
        mean distance error 2.2 m, i.e. smaller than anticipated.
      - Comparing the mean distance error of No.3 with that of No.4, it was
        0.7 m larger horizontally and 5 m smaller vertically.
      - The proposed GIS technique therefore obtains position data with
        higher accuracy than GPS, especially in the vertical direction.
      [Chart: distance error (max./mean/min.) for experiments No.1-No.4]
  30. Outline (repeated; section divider)
  31. 4.1 Conclusion (4. Conclusion)
      - The developed AR system realizes geometric consistency using GIS and
        the gyroscope with which the smartphone is equipped; a user can
        therefore use it easily, and it can be described as a system with
        high flexibility.
      - In the GOAR system, a residual error of longitude and latitude is
        expected because the user specifies the current position on Google
        Maps, and a residual error of altitude is expected from the use of
        the 10 m mesh DEM. As a result of the experiment, the maximum
        residual error of longitude and latitude was 6.5 m with a mean
        distance error of 2.2 m, and the maximum residual error of altitude
        was 2.6 m with a mean distance error of 1.3 m. Every result was
        smaller than assumed.
      - Consequently, the proposed GOAR system was evaluated as feasible and
        effective.
  32. 4.2 Future Work (4. Conclusion)
      - Future work should attempt to reduce the residual error included in
        the dynamic values acquired with the gyroscope.
      - It is also necessary to verify the accuracy of the residual error
        for objects farther than 200 m away, as well as usability.
  33. Thank you for your attention!
      E-mail: fukuda@see.eng.osaka-u.ac.jp
      Twitter: fukudatweet
      Facebook: Tomohiro Fukuda
      LinkedIn: Tomohiro Fukuda
