SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT
This slide deck was presented at CAADRIA 2012 (The 17th International Conference on Computer Aided Architectural Design Research in Asia).
Abstract. This research presents the development of a sensor-oriented mobile AR system for urban landscape assessment which realizes geometric consistency using the GPS, gyroscope, and video camera mounted in a smartphone. A low-cost AR system with high flexibility is realized. The consistency of the viewing angles of the video camera and the CG virtual camera, and the geometric consistency between the video image and 3DCG, are verified. In conclusion, the proposed system was evaluated as feasible and effective.

Presentation Transcript

  • CAADRIA 2012, Chennai, India. SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT. TOMOHIRO FUKUDA, TIAN ZHANG, AYAKO SHIMIZU, MASAHARU TAGUCHI, LEI SUN and NOBUYOSHI YABUKI. Division of Sustainable Energy and Environmental Engineering, Graduate School of Engineering, Osaka University, Japan
  • Contents: 1. Introduction; 2. System Development (2.1 Development Environment of the System, 2.2 System Flow); 3. Verification of System (3.1 Consistency of the viewing angle of the video camera and the CG virtual camera, 3.2 Accuracy of geometric consistency between the video image and 3DCG); 4. Conclusion
  • 1. Introduction / 1.1 Motivation -1. In recent years, the need for landscape simulation has been growing. Review meetings on a future landscape are held on the planned construction site as well as in a conference room. It is difficult for stakeholders to concretely imagine an image that is three-dimensional and does not yet exist. Landscape visualization methods using Computer Graphics (CG) and Virtual Reality (VR) have been developed, but they require much time and expense to make a 3D model. Moreover, since consistency with real space is not achieved when VR is used on a planned construction site, a reviewer cannot get an immersive experience. [Figures: a landscape study on site; VR capture of Kobe city]
  • 1. Introduction / 1.1 Motivation -2. In this research, the authors focus on Augmented Reality (AR), which can superimpose 3DCG on an actual landscape acquired with a video camera. When AR is used, the landscape assessment object is shown within the present surroundings, so a drastic reduction of the time and expense of 3DCG modelling of the present surroundings can be expected. Smartphones are now widely available on the market. [Figures: Sekai Camera web site, http://sekaicamera.com/; the smartphone market in Japan]
  • 1. Introduction / 1.2 Previous Study (evolution of mobile landscape AR). In AR, realizing geometric consistency between a video image of the actual landscape and CG is an important feature. Approach 1: use of physical sensors such as GPS (Global Positioning System) and a gyroscope. To realize highly precise geometric consistency, expensive special hardware is required. [Figures: image sketch (2005); system of 2006. ©2012 Tomohiro Fukuda, Osaka-U]
  • 1. Introduction / 1.2 Previous Study. Approach 2: use of an artificial marker. Since the artificial marker must always be visible to the AR camera, the movable span of the user is limited. Moreover, to realize high precision, a large artificial marker is necessary. [Figure: artificial marker] Yabuki, N., et al.: 2011, An invisible height evaluation system for building height regulation to preserve good landscapes using augmented reality, Automation in Construction, 20(3), 228-235.
  • 1. Introduction / 1.3 Aim. In this research, SOAR (Sensor Oriented Mobile AR), a system which realizes geometric consistency using the GPS, gyroscope, and video camera mounted in a smartphone, is developed. A low-cost AR system with high flexibility is realizable.
  • Contents: 1. Introduction; 2. System Development (2.1 Development Environment of the System, 2.2 System Flow); 3. Verification of System (3.1 Consistency of the viewing angle of the video camera and the CG virtual camera, 3.2 Accuracy of geometric consistency between the video image and 3DCG); 4. Conclusion
  • 2. System Development / 2.1 Development Environment of the System.
    Smartphone: GALAPAGOS 003SH (Softbank Mobile Corp.), a standard-spec device with a video camera.
    Development language: OpenGL ES 2.0, Java 1.6. Development environment: Eclipse Galileo 3.5.
    Location estimation technology: A-GPS (Assisted Global Positioning System).
    Spec of 003SH: OS Android 2.2; CPU Qualcomm MSM8255 Snapdragon 1 GHz; memory ROM 1 GB, RAM 512 MB; weight approx. 140 g; size approx. W62 × H121 × D12 mm; display 3.8 inch, 480 × 800 pixels.
  • 2. System Development / 2.2 System Flow -1. While the CG model realizes ideal rendering by the perspective drawing method, the rendering of a video camera produces distortion, so the video camera is calibrated using Android NDK-OpenCV; a sketch of this step follows below. [Figure: system flow chart (calibration of the video camera → definition of the landscape assessment 3DCG model → activation of the AR system → selection of the 3DCG model → activation of GPS, gyroscope, and video camera → acquisition of position information, angle information, and the live video image → definition of the position and angle of the CG virtual camera → superposition of the 3DCG model on the live video image → display of the AR image → saving of the AR image), with the calibration step highlighted; distortion calibration image]
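    The slides name Android NDK-OpenCV but show no code. Below is a minimal sketch of the usual OpenCV chessboard calibration and undistortion flow, written against OpenCV's Java bindings; the class and helper names are illustrative assumptions, not the authors' implementation.

      import org.opencv.calib3d.Calib3d;
      import org.opencv.core.*;
      import org.opencv.imgproc.Imgproc;
      import java.util.ArrayList;
      import java.util.List;

      public class CameraCalibration {

          // Estimates the camera matrix and lens distortion coefficients from
          // several grayscale frames of a printed chessboard pattern.
          static Mat[] calibrate(List<Mat> grayFrames, Size boardSize, float squareSize) {
              // Chessboard corner positions in the board's own plane (z = 0).
              List<Point3> corners3d = new ArrayList<>();
              for (int r = 0; r < (int) boardSize.height; r++)
                  for (int c = 0; c < (int) boardSize.width; c++)
                      corners3d.add(new Point3(c * squareSize, r * squareSize, 0));
              MatOfPoint3f objectCorners = new MatOfPoint3f();
              objectCorners.fromList(corners3d);

              List<Mat> objectPoints = new ArrayList<>();
              List<Mat> imagePoints = new ArrayList<>();
              for (Mat gray : grayFrames) {
                  MatOfPoint2f found = new MatOfPoint2f();
                  if (Calib3d.findChessboardCorners(gray, boardSize, found)) {
                      imagePoints.add(found);
                      objectPoints.add(objectCorners);
                  }
              }

              Mat cameraMatrix = new Mat();  // intrinsics: focal lengths, principal point
              Mat distCoeffs = new Mat();    // radial and tangential distortion
              List<Mat> rvecs = new ArrayList<>(), tvecs = new ArrayList<>();
              Calib3d.calibrateCamera(objectPoints, imagePoints, grayFrames.get(0).size(),
                                      cameraMatrix, distCoeffs, rvecs, tvecs);
              return new Mat[] { cameraMatrix, distCoeffs };
          }

          // Removes lens distortion from a live frame before compositing with CG.
          static Mat undistort(Mat frame, Mat cameraMatrix, Mat distCoeffs) {
              Mat corrected = new Mat();
              Imgproc.undistort(frame, corrected, cameraMatrix, distCoeffs);
              return corrected;
          }
      }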
  • 2. System Development / 2.2 System Flow -2. Input data of the system: (1) the 3DCG model itself (geometry, texture, unit); (2) a 3DCG model allocation file, holding the 3DCG model name, file name, position data (longitude, latitude, orthometric height), rotation angle in degrees, and zone number of the plane rectangular coordinate system; (3) a 3DCG model arrangement information file, holding the number of 3DCG model allocation files and each of their names. A hypothetical example of an allocation entry follows below. [Figure: system flow chart with the 3DCG model definition step highlighted]
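    The slide lists the allocation file's fields but not its concrete syntax. The entry below is a hypothetical layout: the field order follows the slide, the position values are the building coordinates quoted later in the deck, and everything else (key names, file extension, rotation, zone) is invented for illustration.

      # 3DCG model allocation file (hypothetical layout)
      modelName   = TargetBuilding          # 3DCG model name
      fileName    = target_building.obj     # geometry and texture file
      longitude   = 135.520751389           # position of the model origin
      latitude    = 34.823026944
      orthoHeight = 60.15                   # orthometric height in metres
      rotationDeg = 15.0                    # rotation angle in degrees
      zoneNumber  = 6                       # zone of the plane rectangular coordinate system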
  • 2. System Development / 2.2 System Flow -3. After the AR system is activated and a 3DCG model is selected, the GPS, gyroscope, and video camera are activated. [Figures: system flow chart; GUI of the developed system]
  • 2. System Development / 2.2 System Flow -4. The angle of the CG virtual camera is described by three rotations: yaw, pitch, and roll; a sketch of reading them on Android follows below. [Figure: coordinate system of the developed AR system, showing the yaw, pitch, and roll axes]
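    The slides do not show how the yaw, pitch, and roll values are obtained. On Android 2.2 the standard route is SensorManager's rotation matrix and orientation helpers, sketched below from accelerometer and magnetometer readings; the listener wiring is an assumption, and the exact sensor fusion used by SOAR (e.g. how the gyroscope is blended in) is not stated on the slides.

      import android.hardware.Sensor;
      import android.hardware.SensorEvent;
      import android.hardware.SensorEventListener;
      import android.hardware.SensorManager;

      // Minimal sketch: derives yaw (azimuth), pitch, and roll for the CG
      // virtual camera from the accelerometer and magnetometer.
      public class OrientationReader implements SensorEventListener {
          private final float[] gravity = new float[3];
          private final float[] geomagnetic = new float[3];
          public volatile float yaw, pitch, roll;  // radians

          @Override
          public void onSensorChanged(SensorEvent event) {
              if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
                  System.arraycopy(event.values, 0, gravity, 0, 3);
              else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
                  System.arraycopy(event.values, 0, geomagnetic, 0, 3);

              float[] rotation = new float[9];
              float[] orientation = new float[3];
              if (SensorManager.getRotationMatrix(rotation, null, gravity, geomagnetic)) {
                  SensorManager.getOrientation(rotation, orientation);
                  yaw = orientation[0];    // azimuth, relative to magnetic north
                  pitch = orientation[1];
                  roll = orientation[2];
              }
          }

          @Override
          public void onAccuracyChanged(Sensor sensor, int accuracy) { }
      }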
  • 2. System Development / 2.2 System Flow -4 (continued). The position data from GPS is converted into the coordinates (x, y) of the plane rectangular coordinate system. Orthometric height is obtained by subtracting the geoid height from the ellipsoidal height. The yaw angle points to magnetic north; in order to use a true-north value in the AR system, the magnetic declination is acquired and corrected for, as sketched below. [Figure: system flow chart with the definition of the CG virtual camera's position and angle highlighted]
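    A minimal sketch of the two corrections named on the slide. GeomagneticField is a real Android API; the geoid height source is not named on the slides, so it is passed in here as a plain parameter, and the map projection to plane rectangular (x, y) coordinates is omitted.

      import android.hardware.GeomagneticField;

      public class PoseCorrection {

          // Orthometric height = ellipsoidal height (from GPS) - geoid height.
          // Where the geoid height comes from (e.g. a geoid model grid for
          // Japan) is not stated on the slides.
          static double orthometricHeight(double ellipsoidalHeight, double geoidHeight) {
              return ellipsoidalHeight - geoidHeight;
          }

          // Converts a magnetic-north yaw into a true-north yaw by adding the
          // local magnetic declination for the current position and time.
          static float trueYawDegrees(float magneticYawDeg, double latitude,
                                      double longitude, double altitudeMeters,
                                      long timeMillis) {
              GeomagneticField field = new GeomagneticField(
                      (float) latitude, (float) longitude,
                      (float) altitudeMeters, timeMillis);
              return magneticYawDeg + field.getDeclination();
          }
      }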
  • 2. System Development / 2.2 System Flow -5. Finally, the 3DCG model is superposed on the live video image, and the AR image is displayed and saved. [Figure: system flow chart with the superposition, display, and save steps highlighted]
  • Evolution of mobile landscape AR, 2011. [Figure slide]
  • Contents: 1. Introduction; 2. System Development (2.1 Development Environment of the System, 2.2 System Flow); 3. Verification of System (3.1 Consistency of the viewing angle of the video camera and the CG virtual camera, 3.2 Accuracy of geometric consistency between the video image and 3DCG); 4. Conclusion
  • 3. Verification of System / 3.2 Accuracy of geometric consistency between the video image and 3DCG. Known building target: GSE Common East Building, Osaka University Suita Campus; W 29.6 m, D 29.0 m, H 67.0 m. Latitude, longitude, orthometric height: 34.823026944, 135.520751389, 60.15. [Figures: photo, dimensioned drawing, and outlined 3D model of the building]
  • 3.2 Known viewpoint place: reference point No.14-563; the distance from the reference point to the center of the building was 203 m; the AR system was installed on a tripod at a level height of 1.5 m. Latitude, longitude, orthometric height of the viewpoint: 34.82145699, 135.519612, 53.1. [Figure: map of the building target and viewpoint, with the four measuring points A-D used for the residual error]
  • 3.2 Parameter settings of the eight experiments (S: static value = known value; D: dynamic value = value acquired from a device). Latitude, longitude, and altitude are the position information of the CG virtual camera; yaw, pitch, and roll are its angle information.

    Experiment  Latitude  Longitude  Altitude  yaw  pitch  roll
    No.1           S          S         S       S     S     S
    No.2           D          D         D       D     D     D
    No.3           D          S         S       S     S     S
    No.4           S          D         S       S     S     S
    No.5           S          S         D       S     S     S
    No.6           S          S         S       D     S     S
    No.7           S          S         S       S     D     S
    No.8           S          S         S       S     S     D
  • 3.2 Calculation procedure of residual error. 1. Pixel error: for each of the four measuring points, the difference between the live image and the CG model in the horizontal and vertical directions, measured in pixels (Δx, Δy). 2. Distance error: from the acquired values (Δx, Δy), the differences in the horizontal and vertical directions are converted to metres (ΔX, ΔY) by formulas (1) and (2):

    ΔX = (W / x) · Δx   (1)
    ΔY = (H / y) · Δy   (2)

    where W is the actual width of the object (m), H the actual height of the object (m), x the width of the 3DCG model on the AR image (px), and y the height of the 3DCG model on the AR image (px). [Figure: calculation image of the residual error between the live video image and CG]
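    As a concrete check of formulas (1) and (2), a short sketch; the 247 px width is a hypothetical value chosen to match the roughly 0.12 m/pixel scale noted on the result slides.

      // Converts pixel residuals to distance residuals, formulas (1) and (2).
      public class ResidualError {

          static double distanceErrorX(double widthMeters, double widthPixels, double dxPixels) {
              return (widthMeters / widthPixels) * dxPixels;  // formula (1)
          }

          static double distanceErrorY(double heightMeters, double heightPixels, double dyPixels) {
              return (heightMeters / heightPixels) * dyPixels;  // formula (2)
          }

          public static void main(String[] args) {
              // Hypothetical: the 29.6 m wide building imaged 247 px wide
              // (29.6 / 247 ~ 0.12 m/pixel), with a 20 px horizontal error.
              double dX = distanceErrorX(29.6, 247, 20);
              System.out.printf("20 px horizontal error ~ %.2f m%n", dX);  // ~2.40 m
          }
      }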
  • 3.2 Results: No.1 (latitude S, longitude S, altitude S; yaw S, pitch S, roll S; scale 0.12 m/pixel). [Figures: AR image; bar charts of pixel error and distance error, max/mean/min, for experiments No.1-No.8]
  • 3.2 Results: No.2 (latitude D, longitude D, altitude D; yaw D, pitch D, roll D; 0.12 m/pixel). [Figures: AR image; pixel error and distance error charts]
  • 3.2 Results: No.3 (latitude D, longitude S, altitude S; yaw S, pitch S, roll S; 0.12 m/pixel). [Figures: AR image; pixel error and distance error charts]
  • 3.2 Results: No.4 (latitude S, longitude D, altitude S; yaw S, pitch S, roll S; 0.12 m/pixel). [Figures: AR image; pixel error and distance error charts]
  • 3.2 Results: No.5 (latitude S, longitude S, altitude D; yaw S, pitch S, roll S; 0.12 m/pixel). [Figures: AR image; pixel error and distance error charts]
  • 3.2 Results: No.6 (latitude S, longitude S, altitude S; yaw D, pitch S, roll S; 0.12 m/pixel). [Figures: AR image; pixel error and distance error charts]
  • 3.2 Results: No.7 (latitude S, longitude S, altitude S; yaw S, pitch D, roll S; 0.12 m/pixel). [Figures: AR image; pixel error and distance error charts]
  • 3.2 Results: No.8 (latitude S, longitude S, altitude S; yaw S, pitch S, roll D; 0.12 m/pixel). [Figures: AR image; pixel error and distance error charts]
  • 3.2 Results in No.1 (all known static values used): pixel error less than 1.5 pixels; mean distance error less than 0.15 m. The accuracy of the AR system was found to be high; an object 200 m away can be evaluated when known static values are used. Results in No.2 (general SOAR): pixel error less than 20 pixels (horizontal) and 55 pixels (vertical); mean distance error less than 2.3 m (horizontal) and 6.3 m (vertical). Results through No.2-8: although No.2 uses all dynamic inputs, its X residual error is smaller than that of No.6. This is the result of offsetting: within No.2, the X residual error of No.6 (toward the + side, from the angle) and the X residual error of No.4 (toward the - side) cancel each other. [Figures: distance error charts for No.1 and No.2]
  • Contents: 1. Introduction; 2. System Development (2.1 Development Environment of the System, 2.2 System Flow); 3. Verification of System (3.1 Consistency of the viewing angle of the video camera and the CG virtual camera, 3.2 Accuracy of geometric consistency between the video image and 3DCG); 4. Conclusion
  • 4. Conclusion / 4.1 Conclusion. When known values were used for the position and angle information of the CG virtual camera at a distance of 200 m, the pixel error was less than 1.5 pixels and the mean distance error was less than 0.15 m; this is within the tolerance level of landscape assessment. When dynamic values acquired with the GPS and gyroscope were used, the pixel error was less than 20 horizontal pixels and 55 vertical pixels, and the mean distance error was 2.3 m horizontally and 6.3 m vertically. The developed SOAR achieves geometric consistency using the GPS and gyroscope with which the smartphone is equipped.
  • 4. Conclusion / 4.2 Future Work. Future work should attempt to reduce the residual error included in the dynamic values acquired with the GPS and gyroscope. It is also necessary to verify the accuracy of the residual error for objects more than 200 m away, and to verify usability.
  • Thank you for your attention! E-mail: fukuda@see.eng.osaka-u.ac.jp / Twitter: fukudatweet / Facebook: Tomohiro Fukuda / LinkedIn: Tomohiro Fukuda