2013 Lecture 8: Mobile AR

A lecture on Mobile Augmented Reality, given by Mark Billinghurst at the University of Canterbury on Friday, September 13th, 2013, as part of the COSC 426 graduate course on Augmented Reality.

Transcript

  • 1. COSC 426: Augmented Reality – Lecture 8: Mobile AR   Mark Billinghurst   mark.billinghurst@hitlabnz.org   Sept 13th 2013
  • 2. 1983 – Star Wars
  • 3. 1999 - HIT Lab US
  • 4. 1998: SGI O2 – CPU: 300 MHz, HDD: 9 GB, RAM: 512 MB, Camera: VGA 30 fps, Graphics: 500K poly/sec. 2008: Nokia N95 – CPU: 332 MHz, HDD: 8 GB, RAM: 128 MB, Camera: VGA 30 fps, Graphics: 2M poly/sec
  • 5. Mobile Phone AR   Mobile Phones   camera   processor   display   AR on Mobile Phones   Simple graphics   Optimized computer vision   Collaborative Interaction
  • 6. 2005: Collaborative AR   AR Tennis   Shared AR content   Two user game   Audio + haptic feedback   Bluetooth networking
  • 7. Mobile AR History
  • 8. Evolution of Mobile AR: 1995 – handheld AR displays; 1997 – wearable computers (wearable AR); 2001 – PDAs, thin-client AR; 2003 – PDAs, self-contained AR and camera phone, thin-client AR; 2004 – camera phone, self-contained AR
  • 9. Handheld Displays Tethered Applications   Fitzmaurice Chameleon (1994)   Rekimoto’s Transvision (1995)   Tethered LCD   PC Processing and Tracking
  • 10. Handheld AR Display - Tethered 1995, 1996 Handheld AR   ARPad, Chameleon   Rekimoto’s NaviCam, Transvision   Tethered LCD   PC Processing and Tracking
  • 11. AR Pad (Mogilev 2002) Handheld AR Display   LCD screen   Camera   SpaceOrb 3 DOF controller   Peripheral awareness   Viewpoint awareness
  • 12. Mobile AR: Touring Machine (1997)   Columbia University   Feiner, MacIntyre, Höllerer, Webster   Combines   See-through head mounted display   GPS tracking   Orientation sensor   Backpack PC (custom)   Tablet input
  • 13. MARS View   Virtual tags overlaid on the real world   “Information in place”
  • 14. Backpack/Wearable AR 1997 Backpack AR   Feiner’s Touring Machine   AR Quake (Thomas)   Tinmith (Piekarski)   MCAR (Reitmayr)   Bulky, HMD based
  • 15. Mobile AR – Hardware: the Columbia Touring Machine, an example self-built working solution with PCI-based 3D graphics. Components: CPU, PCI 3D graphics board, hard drive, serial ports, PC104 sound card, PC104 PCMCIA, GPS + antenna, GPS RTK correction radio, tracker controller, DC-to-DC converter, battery
  • 16.   1997 Philippe Kahn invents the camera phone   1999 First commercial camera phone: Sharp J-SH04
  • 17. Millions of Camera Phones
  • 18. Handheld AR – Thin Client 2001 BatPortal (AT&T Cambridge)   PDA used as I/O device   Wireless connection to workstation   Room-scale ultrasonic tracking (Bat) 2001 AR-PDA (C Lab)   PDA thin graphics client   Remote image processing   www.ar-pda.com
  • 19. 2003 ARphone (Univ. of Sydney)   Transfer images via Bluetooth (slow – 30 sec/image)   Remote processing – AR Server     Mobile Phone AR – Thin Client
  • 20. Early Phone Computer Vision Apps 2003 – Mozzies Game - Best mobile game Optical motion flow detecting phone orientation Siemens SX1 – Symbian, 120Mhz, VGA Camera 2005 – Marble Revolution (Bit-Side GmbH) Winner of Nokia's Series 60 Challenge 2005 2005 – SymBall (VTT)
  • 21. Computer Vision on Mobile Phone   Cameras and Phone CPU sufficient for computer vision applications   Pattern Recognition (Static Processing)   QR Code   Shotcode (http://www.shotcode.com/)   Motion Flow (2D Image Processing)   GestureTek -  http://www.gesturetekmobile.com/   TinyMotion   3D Pose Calculation   Augmented Reality
  • 22. Handheld AR – Self Contained 2003 PDA-based AR   ARToolKit port to PDA   Studierstube ported to PDA   AR Kanji Educational App.   Mr Virtuoso AR character   Wagner’s Invisible Train -  Collaborative AR
  • 23. Mobile Phone AR – Self Contained 2004 Mobile Phone AR   Moehring, Bimber   Henrysson (ARToolKit)   Camera, processor, display together
  • 24. AR Advertising   Txt message to download AR application (200K)   See virtual content popping out of real paper advert   Tested May 2007 by Saatchi and Saatchi
  • 25. 2008 - Location Aware Phones: Nokia Navigator, Motorola Droid
  • 26. Real World Information Overlay   Tag real world locations   GPS + Compass input   Overlay graphics data on live video (see the sketch after this slide)   Applications   Travel guide, Advertising, etc   Eg: Mobilizy Wikitude (www.mobilizy.com)   Android based, Public API released   Other companies   Layar, AcrossAir, Tonchidot, RobotVision, etc
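A minimal sketch of how the overlay step above can work on Android, assuming GPS supplies the device position and the compass supplies the view direction (azimuth). The class name, coordinates and field-of-view handling are illustrative, not from the lecture:

    // Where, horizontally, should a geo-tagged point of interest (POI) be drawn over the live video?
    import android.location.Location;

    public class PoiOverlay {
        // Returns the POI's angle (degrees) left/right of the screen centre, given the
        // device position and the compass azimuth (degrees from north).
        public static float horizontalAngleToPoi(Location device, Location poi, float azimuthDeg) {
            float bearing = device.bearingTo(poi);   // bearing to POI, degrees east of true north
            float delta = bearing - azimuthDeg;      // angle between view direction and POI
            while (delta > 180f) delta -= 360f;      // normalise to -180..180
            while (delta < -180f) delta += 360f;
            return delta;
        }
    }

If the returned angle is within roughly half the camera's horizontal field of view (e.g. ±30°), it can be mapped linearly to an x pixel position over the video; otherwise a browser typically shows an off-screen indicator.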
  • 27. Layar – www.layar.com
  • 28. HIT Lab NZ Android AR Platform   Architectural Application   Loads 3D models   in OBJ/MTL format   Positions content in space   GPS, compass   Intuitive user interface   toolkit to modify the model   Connects to back end model database
  • 29. Architecture: Android application; database server (Postgres); web application (Java and PHP server) with a web interface for adding models (a client-side sketch follows this slide)
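To make the client/server split concrete, here is a small hedged sketch of the Android application asking the web application for models near the current position; the URL, query parameters and response format are hypothetical, not taken from the lecture:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ModelClient {
        // Ask the (hypothetical) web application for models near a GPS position.
        // Server side, the web application would query Postgres and return model URLs + anchors.
        public static String fetchNearbyModels(double lat, double lon) throws Exception {
            URL url = new URL("http://example.org/models?lat=" + lat + "&lon=" + lon);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
                StringBuilder body = new StringBuilder();
                String line;
                while ((line = in.readLine()) != null) body.append(line).append('\n');
                return body.toString();   // e.g. a list of OBJ/MTL download URLs with GPS anchors
            } finally {
                conn.disconnect();
            }
        }
    }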
  • 30. History of Handheld and Mobile AR: 1995 – handheld display (NaviCam, AR-PAD, Transvision); 1997 – wearable AR (Touring Machine, AR Quake); 2001 – handheld AR, thin client (AR-PDA, BatPortal); 2003 – handheld AR, self-contained (Invisible Train); 2003 – mobile phone, 2D vision (Mozzies, SymBall); 2003 – mobile phone, thin client (ARphone); 2004 – mobile phone, self-contained (Möhring, Symbian)
  • 31. Mobile AR by Weight (1996, 2003, 2007)   Backpack + HMD: 5-8 kg   Scale it down – Vesp'R [Kruijff, ISMAR '07]: Sony UMPC 1.1 GHz, 1.5 kg, still >$5K   Scale it down more – smartphone: all-in-one, 0.1 kg, ~$500, billions of units
  • 32. 2013 State of the Art   Handheld hardware available: PDAs, mobile phones, external cameras; sensors: GPS, accelerometer, compass   Software tools available: tracking (ARToolKitPlus, stbTracker, Vuforia), graphics (OpenGL ES), authoring (Layar, Wikitude, Metaio Creator)   What is needed: high-level authoring tools, content development tools, novel interaction techniques, user evaluation and usability
  • 33. Mobile AR Companies   Mobile AR   GPS + compass   Many Companies   Layar   Wikitude   Acrossair   PressLite   Yelp   Robot vision   Etc..
  • 34. $2 million USD in 2010 $732 million USD in 2014
  • 35. Qualcomm   Acquired Imagination   October 2010 - Releases free Android AR SDK   Computer vision tracking - marker, markerless   Integrated with Unity 3D renderer   http://developer.qualcomm.com/ar
  • 36. Rock-em Sock-em   Shared AR Demo   Markerless tracking
  • 37. Mobile AR Technology
  • 38. Technology Components   Software Platform   Eg Studierstube platform   Developer Tools   iOS, Android   Tracking Technology   Computer vision, sensor based   Mobile Graphics   OpenGL ES   Interaction Methods   Handheld Interaction
  • 39. iPhone 4   Apple iOS   Faster CPU (1.2GHz)   High screen resolution   3.5”, 960x640   camera API   Multi-touch   Hardware 3D   GPS, compass, accelerometer and gyroscope
  • 40. Hardware Sensors   Camera (resolution, fps)   Marker-based/markerless tracking   Video overlay   GPS (resolution, update rate)   Outdoor location (see the GPS sketch after this slide)   Compass   Indoor/outdoor orientation   Accelerometer   Motion sensing, relative tilt
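As a concrete example of reading one of these sensors, the snippet below is a minimal sketch of subscribing to GPS fixes on Android for outdoor location; the class name is illustrative, the update interval and distance threshold are arbitrary example values, and a real application also needs the ACCESS_FINE_LOCATION permission:

    import android.content.Context;
    import android.location.Location;
    import android.location.LocationListener;
    import android.location.LocationManager;
    import android.os.Bundle;

    public class GpsSource implements LocationListener {
        private Location lastFix;   // most recent GPS position for the AR overlay

        public void start(Context context) {
            LocationManager lm = (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
            // 1000 ms / 1 m are example values; the GPS update rate limits outdoor AR accuracy
            lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 1, this);
        }

        @Override public void onLocationChanged(Location location) { lastFix = location; }
        @Override public void onStatusChanged(String provider, int status, Bundle extras) {}
        @Override public void onProviderEnabled(String provider) {}
        @Override public void onProviderDisabled(String provider) {}
    }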
  • 41. Studierstube ES Framework   Typical AR application framework   Developed at TU Graz
  • 42. Software stack: Hardware → OS / low-level API → programming libraries → end-user application
  • 43. The Studierstube ES framework: tracking, platform, graphics, content, and user interface / application layers
  • 44. Mobile Graphics
  • 45. Computer Graphics on Mobile Phones   Small screen, limited input options   Limited support for accelerated graphics   Most phones have no GPU   Mobile Graphics Libraries   OpenGL ES (1.0, 2.0) -  Cross platform, subset of OpenGL -  C/C++ low level library   Java M3G -  Mobile 3D graphics API for J2ME platform -  Object importer, scene graph library -  Support from all major phone manufacturers
  • 46. OpenGL ES   Small-footprint subset of OpenGL   OpenGL is too large for embedded devices!   Powerful, low-level API, full functionality for 3D games   Can do almost everything OpenGL can   Available on all key platforms   Software and hardware implementations available   Fully extensible   Extensions like in OpenGL
  • 47. OpenGL ES on mobile devices
  • 48. Versions   Two major tracks   Not compatible, parallel rather than competitive   OpenGL ES 1.x   Fixed function pipeline   Suitable for software implementations   All 1.x versions are backwards compatible   OpenGL ES 2.x   Vertex and pixel shaders using GLSL ES   All 2.x versions will be backwards compatible (a shader-setup sketch follows this slide)
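To illustrate what the 2.x programmable pipeline means for application code, here is a minimal Android (Java, GLES20) sketch that compiles and links a trivial GLSL ES shader pair; the shader source and class name are illustrative, and production code would also check compile and link status:

    import android.opengl.GLES20;

    public class Es2Pipeline {
        // Trivial GLSL ES shaders: transform each vertex, colour every fragment red.
        private static final String VERTEX_SRC =
                "attribute vec4 aPosition;\n" +
                "uniform mat4 uMvp;\n" +
                "void main() { gl_Position = uMvp * aPosition; }\n";
        private static final String FRAGMENT_SRC =
                "precision mediump float;\n" +
                "void main() { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }\n";

        // Build a shader program (must run on a thread with a GL context,
        // e.g. inside a GLSurfaceView.Renderer callback).
        public static int buildProgram() {
            int vs = compile(GLES20.GL_VERTEX_SHADER, VERTEX_SRC);
            int fs = compile(GLES20.GL_FRAGMENT_SHADER, FRAGMENT_SRC);
            int program = GLES20.glCreateProgram();
            GLES20.glAttachShader(program, vs);
            GLES20.glAttachShader(program, fs);
            GLES20.glLinkProgram(program);     // real code: check GL_LINK_STATUS here
            return program;
        }

        private static int compile(int type, String source) {
            int shader = GLES20.glCreateShader(type);
            GLES20.glShaderSource(shader, source);
            GLES20.glCompileShader(shader);    // real code: check GL_COMPILE_STATUS here
            return shader;
        }
    }

Under OpenGL ES 1.x the equivalent transform and colouring would instead be configured through fixed-function state calls, which is why the two tracks are not compatible.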
  • 49. Fixed Function (1.x) http://www.khronos.org/opengles/2_X/
  • 50. Programmable (2.x) http://www.khronos.org/opengles/2_X/
  • 51. OpenGL ES 1.x vs 2.0
  • 52. Tracking
  • 53. Mobile Augmented Reality’s goal Create an affordable, massively multi-user, widespread platform © Tinmith, U. of South Australia
  • 54. How to not do it… Ka-Ping Yee: Peephole Displays, CHI’03
  • 55. Tracking on mobile phones   Vision-based tracking   Marker-based tracking   Model-based natural feature tracking   Natural feature tracking in unknown environments   Sensor tracking   GPS, inertial compass, gyroscope
  • 56. Backpack-based: Höllerer et al. (1999), Piekarski & Thomas (2001), Reitmayr & Schmalstieg (2003)   Laptop, HMD   Enhanced GPS (DGPS / RTK) + inertial sensor for viewpoint tracking   Hand tracking w/ fiducial markers
  • 57. Tracking for Handheld AR – Backpack-based: Kalkusch et al., 2002   Video see-through HMD w/ camera   Viewpoint tracking w/ inside-out computer vision using markers   ARToolKit markers on walls installed and surveyed manually
  • 58. Tablet PC / UMPC-based: Schall et al., 2006   Hybrid tracking on UMPC   Camera → fiducial marker tracking   When no marker in view → inertial sensor + UWB tracking
  • 59. PDA-based: BatPortal (Newman et al., 2001)   PDA as thin client (rendering & tracking on server + VNC)   Ultrasonic tracking. SHEEP (MacWilliams et al., 2003)   Tracking by ART (external IR cameras + retroreflective target)   Projection-based AR environment
  • 60. Non-AR Tracking on Phones   AR-PDA (2003): model-based tracking, PDA as thin client → tracking on server, not real-time   Mosquito Hunt (2003), Marble Revolution (2004), Pingis (VTT, 2006): game control w/ optical flow techniques   TinyMotion (2006): GUI control & input on cell phones w/ image differencing & block correlation
  • 61. History of AR Tracking on Phones (1)   2003   ARToolKit on PDA   Wagner et al.   2004   3D Marker on Phone   Möhring et al.   2005   ARToolKit on Symbian   Henrysson et al.
  • 62. Tracking for Handheld AR – Fiducial marker tracking on handhelds: Wagner et al., 2003; Möhring et al., 2004; Bucolo et al., 2005; Henrysson et al., 2006; Rohs, 2006
  • 63. History of AR Tracking on Phones (2)   2005   Visual Codes   Rohs et al.   2008   Advanced Marker Tracking   Wagner et al.   2008   Natural Feature Tracking   Wagner et al.
  • 64. What can we do on today‘s mobile phones?   Typical specs   600+ MHz   ~5MB of available RAM   160x120 - 320x240 at 15-30 Hz camera   Possible to do   Marker tracking in 5-15ms   Natural feature tracking in 20-50ms
  • 65. Other Mobile Sensors   Orientation   Compass   Relative movement/rotation   Accelerometer, gyroscope (see the orientation sketch after this slide)   Context   Audio, light sensor, proximity   Location   GPS, A-GPS, WiFi positioning, cell tower triangulation
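A minimal sketch of how the compass (magnetometer) and accelerometer readings listed above are typically combined into a device orientation on Android; the class name and sensor rate are illustrative:

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    public class OrientationSource implements SensorEventListener {
        private final float[] accel = new float[3];
        private final float[] magnet = new float[3];
        private final float[] rotation = new float[9];
        private final float[] orientation = new float[3];   // azimuth, pitch, roll (radians)

        public void start(SensorManager sm) {
            sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                    SensorManager.SENSOR_DELAY_GAME);
            sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
                    SensorManager.SENSOR_DELAY_GAME);
        }

        @Override public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                System.arraycopy(event.values, 0, accel, 0, 3);
            } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
                System.arraycopy(event.values, 0, magnet, 0, 3);
            }
            // Fuse gravity + magnetic field into a rotation matrix, then Euler angles
            if (SensorManager.getRotationMatrix(rotation, null, accel, magnet)) {
                SensorManager.getOrientation(rotation, orientation);
                // orientation[0] (azimuth) is the compass heading a GPS+compass AR overlay needs
            }
        }

        @Override public void onAccuracyChanged(Sensor sensor, int accuracy) {}
    }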
  • 66. Handheld AR Interfaces
  • 67. Handheld HCI   Consider your user   Follow good HCI principles   Adapt HCI guidelines for handhelds   Design to device constraints   Rapid prototyping   User evaluation
  • 68. Sample Handheld AR Interfaces   Clean   Large Video View   Large Icons   Text Overlay
  • 69. Handheld Display vs Fixed Display   Experiment comparing a moving handheld display to handheld button input, a small fixed display, a desktop display, and a large plasma display   Users performed (1) a navigation task and (2) a selection task   The moving handheld display provided a greater perceived FOV, a higher degree of presence, and faster completion times. J. Hwang, J. Jung, G. Kim. Hand-held Virtual Reality: A Feasibility Study. In Proceedings of VRST 2006
  • 70. Search Task Completion Time
  • 71. FOV, Presence and Immersion
  • 72. HMD vs Handheld AR Interface   Wearable AR: the HMD is output (display) only, with separate input   Handheld AR: the device provides both input and output
  • 73. Handheld Interface Properties   Handheld interface vs. HMD interface   Display is handheld rather than headworn   Much greater peripheral view of real world   Display and input device connected   Can move device independent of view   Phone keypad: one-handed input, keypad only, limited number of buttons, bimanual interaction, object-based interaction   Touch screen: two-handed input, stylus/touch screen, screen-based input/selection, large screen
  • 74. Handheld Interface Metaphors   Tangible AR Lens Viewing   Look through screen into AR scene   Interact with screen to interact with AR content -  Eg Invisible Train   Tangible AR Lens Manipulation   Select AR object and attach to device   Use the motion of the device as input -  Eg AR Lego
  • 75. Translation Study Conditions A: Object fixed to the phone (one handed) B: Button and keypad input C: Object fixed to the phone (bimanual) - one hand for rotating tracking pattern
  • 76. Results – Translation   9 subjects, within-subject design   Timing: tangible fastest, twice as fast as keypad   Survey: tangible easiest (Q1), keypad most accurate (Q2), tangible quickest (Q3), tangible most enjoyable (Q4)   Ranking: tangible favored   Mean rank: A = 1.44, B = 2.56, C = 2.0
  • 77. Conditions A: Arcball B: Keypad input for rotation about the object axis C: Object fixed to the phone (one handed) D: Object fixed to the phone (bimanual) Rotation Study
  • 78. Results – Rotation   Timing: keypad (B) and Arcball (A) fastest   No significant survey differences   Mean rank: A = 3.0, B = 2.3, C = 2.4, D = 2.2
  • 79. Collaborative AR   AR Tennis   Virtual tennis court   Two user game   Audio + haptic feedback   Bluetooth messaging
  • 80. TAT Augmented ID
  • 81. Design Guidelines Apply handheld HCI guidelines for on-screen content - large buttons, little text input, etc Design physical + virtual interface elements Pick appropriate interface metaphor - “handheld lens” approach using handheld motion - Tangible AR for AR overlay Build prototypes Continuously evaluate application
  • 82. Mobile AR Browsing
  • 83. AR Browsers   AR equivalent of web browser   Request and serve up content   Commercial outdoor AR applications   Junaio, Layar, Wikitude, etc   All have their own language specifications   Wikitude – ARML   Junaio – XML, AREL
  • 84. AR Browsers   Commercial outdoor AR applications   Junaio, Layar, Wikitude, etc   All have their own language specifications   Wikitude – ARML   Junaio - XML   Need for common standard   Based on existing standards for geo-located content etc   Support for dynamic/interactive content   Easier to author mobile AR applications   Easy to render on AR browsers
  • 85. Architecture
  • 86. Common AR Browsers   Layar   http://www.layar.com/   Wikitude   http://www.wikitude.com/   Junaio   http://www.junaio.com   TagWhat   http://www.tagwhat.com/   Sekai Camera   http://sekaicamera.com/
  • 87. Nokia City Lens   More recent AR Browser
  • 88. Junaio - www.junaio.com
  • 89. Key Features   Content provided in information channels   Over 2,000 channels available   Two types of AR channels   GLUE channels – visual tracking   Location based channels – GPS, compass tracking   Simple to use interface with multiple views   List, map, AR (live) view   Point of Interest (POI) based   POIs are geo-located content
  • 90. QR Code Launch
  • 91. Glue Tracking - Markerless   Search for “instant tracker”
  • 92. Junaio Interface
  • 93. Interface   List View, Map View, AR View
  • 94. Back-end Servers
  • 95. Data Flow
  • 96. Search.php
        <?xml version="1.0" encoding="UTF-8"?>
        <results>
          <poi id="1" interactionfeedback="none">
            <name><![CDATA[Hotel Hello World]]></name>
            <description><![CDATA[This is a beautiful, family hotel and restaurant, just around the corner. Special Dinner and Rooms available.]]></description>
            <l>37.776685,-122.422771,0</l>
            <mime-type>text/plain</mime-type>
            <icon>http://dev.junaio.com/publisherDownload/tutorial/icon_map.png</icon>
            <thumbnail>http://dev.junaio.com/publisherDownload/tutorial/thumb.jpg</thumbnail>
            <phone>555/1234567</phone>
            <homepage>http://www.hotelaroundthecorner.com</homepage>
          </poi>
        </results>
  • 97. AR Outcome
  • 98. Limitations of Plain XML   No interactivity   Only simple pop-ups   No user interface customizations   Can only use Junaio GUI elements   No local interactivity   Always needs a remote server connection
  • 99. Junaio AREL
  • 100. AREL   Augmented Reality Environment Language   Overcomes limitations of XML by itself   Based on web technologies: XML, HTML5, JavaScript   Core Components 1.  AREL XML: static file, specifies scene content 2.  AREL JavaScript: handles all interactions and animation; any user interaction sends an event to the AREL JS 3.  AREL HTML5: GUI elements – buttons, icons, etc   Advantages   Scripting on device, more functionality, GUI customization
  • 101. Adding Interactivity
  • 102. Basic Interactivity   Add a button on screen to move virtual character   Use the following   HTML: button specification   Javascript: Interaction   PHP/XML: 3D model   Junaio Tutorial 5   http://www.junaio.com/develop/quickstart/advanced-interactions-and-location-based-model-3ds/
  • 103. Server File Structure   Main index   HTML – GUI   JavaScript – interactivity   PHP – content
  • 104. search.php – specify Lego Man
        if (!empty($_GET['l']))
            $position = explode(",", $_GET['l']);   // use local position
        …
        // return the lego man
        $oLegoMan = ArelXMLHelper::createLocationBasedModel3D(
            "1",                                              // id
            "lego man",                                       // title
            WWW_ROOT . "/resources/walking_model3_7fps.md2",  // main resource (Lego model)
            WWW_ROOT . "/resources/walking_model.png",        // resource (texture)
            $position,                                        // location
            array(0.2, 0.2, 0.2),                             // scale
            new ArelRotation(ArelRotation::ROTATION_EULERRAD, array(1.57, 0, 1.57)) // rotation
        );
        …
  • 105. styles.css – HTML GUI
        /* button location */
        #buttons { position: absolute; bottom: 44px; right: 44px; }
        .ipad div { width: 104px; height: 106px; }
        /* button style */
        #buttons div {
            background-image: url("../images/button.png");
            background-repeat: no-repeat;
            background-size: 100%;
            float: left;
        }
  • 106. Logic_LBS5.js - JavaScript   Create an event listener   setEventListener();   Add functionality to the model object   Load model from scene   Add model behaviours   Add functionality to GUI objects   Define the event listener   Bind model behaviours to GUI objects
  • 107. Result
  • 108. Authoring Tools
  • 109. Metaio Creator   Drag and drop Junaio authoring
  • 110. BuildAR – buildar.com
  • 111. Developing Mobile AR Experiences (image © Sony CSL, 2004)
