CAMAR2009

CAMAR 2.0: Context-aware Mobile Augmented Reality 2.0; R&D activities at GIST U-VR Lab, 2009; slides presented at the 12th MobileWebAppsCamp (Mobile UX and Mobile AR) in Seoul, Korea.


Transcript

  • 1. Supported by GIST, KIST and KOCCA of MCST, Korea. Mobile AR Core Technology R&D Trends and Outlook (Mobile AR 핵심 기술 R&D 동향 및 전망): CAMAR 2.0, Context-aware Mobile AR Applications. Mobile AR Core-Technology R&D Activities and its Applications. Woontack Woo (우운택), Ph.D., GIST CTI U-VR Lab. Twitter: @wwoo_ct. Dec. 4, 2009 @ MobileWebAppsCamp
  • 2. GIST CTI/U-VR Lab Overview: GIST U-VR Lab (since 2001), GIST CTRC (since 2005), GIST CTI (since 2007), supported by KOCCA. Members: 1 director, 2 post-docs, 14 Ph.D. and 4 MS students, 4 research staff, 1 admin staff. Alumni: 3 Ph.D., 9 MS.
  • 3. GIST CTI/U-VR Lab International Collaboration. Germany, U. of München: Frederike Otto (2005.10-2006.1). China, MS Research Asia: Yoosoo Oh (2005.10-2006.2), Hyseok Yoon (2009.8-2010.2), Jonghyun Han (2009.8-2010.2). France, INRIA Rhône-Alpes: Kiyoung Kim (2005.10-2006.3), Wonwoo Lee (2007.1-2007.6). Japan, Univ. of Nagoya: Dr. Seiie Jang (2005.10-2007.9). USA, GATECH: Dr. Jaeseok Yun (07.4-09.08), Youngjung Suh (07.9-08.2); CMU HCII: Dr. Seungjun Kim (06.9-), Choonsung Shin (07.9-08.2); UCSB: Dr. Sehwan Kim (06.10-), Kyungdahm Yun (2007.12-2008.2); USC ICT: Sejin Oh (06.10-07.3), Ahyoung Choi (09.7-10.1); UIC EVL: Youngho Lee (2002.3-6), Minkyung Lee (2005.2-2005.6). Switzerland, EPFL CVLab: Youngmin Park (2007.9-2008.2), Dr. Vincent Lepetit (2010.1). Austria, UASU: Woonhyuk Baek (2009.12-2010.2). New Zealand, HIT Lab NZ: Taejin Ha (2007.1-2; 2010.2-8), Minkyung Lee (2009.11-12). Singapore, MR Lab: Sewon Na (2007.6-8; 2008.1-2), Hyeongmook Lee (2007.12-2008.2), Dongpyo Hong (2008.3-5), Dr. Raphael Grasset (2009.3-4).
  • 4. Outline Introduction: Why Augmented Reality? Paradigm Shift & Ubiquitous VR (U-VR) AR R&D Trend & Applications •AR@GIST (2005-2007 & 2008-2009) & @ISMAR09 •Desktop AR •ARtalet for Digilog Book (CTI by KOCCA, 2007-2010) •Miniature AR (GIST, 2009) •Mobile AR •CAMAR (with UCN by IITA, 2006-2008) •CAMAR2.0 (with KIST by KOCCA, 2009-2012) What’s Next?
  • 5. Why Augmented Reality?
  • 6. Why Augmented Reality? MIT's annual review: "10 Emerging Technologies 2007". Gartner: top 10 disruptive technologies 2008-2012: multicore and hybrid processors; virtualization and fabric computing; social networks and social software; cloud computing and cloud/Web platforms; Web mashups; user interface; ubiquitous computing; contextual computing; augmented reality; semantics. VR vs. AR @ Google.
  • 7. Hype Cycle of AR 2009
  • 8. Hype Cycle of AR (stage captions: "I might be able to make money", "I hate AR", "Nope, no money here", "I am wondering how I lived without AR"). CAMAR enablers: mobile IT infra, CV HW/SW tech, AI (context) tech. What is Augmented Vision? Modified by Woo @ GIST U-VR Lab. Milestones: NaviCam (95), MARS (97), Tinmith (98), ARQuake (00), AR-PDA (01), AR-Phone (04), SLAM (09).
  • 9. What’s Augmented Reality? What’s AR? A live direct or indirect view of a physical real-world environment whose elements are merged with (or augmented by) virtual CG imagery. Azuma's definition [97]: combines real and virtual; is interactive in real time; is registered in 3D. Milgram's Reality-Virtuality Continuum [94].
  • 10. History of AR. History and trend of AR: estimated 180M+ users by 2012; major brands are already taking keen interest; consumers are hungry for apps. Timeline: 1966 HMD by Ivan Sutherland; 1992 phrase coined by T. Caudell @ Boeing; 1994 Continuum; 1998 1st IWAR; 1999 ARToolkit; 2001 U-VR Lab; 2002 1st ISMAR; 2007 Sony "Eye of Judgment" on PS3; 2008 Wikitude AR Guide; 2009 Sony "EyePet", AR goes portable on iPhone and Android phones. https://www.icg.tugraz.at/~daniel/HistoryOfMobileAR/
  • 11. Driving Force of AR. Driving the future of AR: IT infra (ubiComp, cloud computing); SW (web? platform, computer vision & AI (context-aware)); smart platform (from mobile to wearable by NanoTech); processor power (Moore's Law continues to hold up: multicore CPU, GPU, memory, GPS, compass, WiFi, RFID & battery life); display (HMD, handheld, spatial, holographic); UI & UX (multimodal interaction/collaboration); filtering & SNS apps; authoring tools.
  • 12. AR Applications. Possible AR applications: consumer products, furniture, decoration; construction, houses, apartments; augmented ads, packages, products; entertainment, infotainment, games; fashion, trends, digital arts; personal digital items; medical, training; sharing by communities.
  • 13. Ubiquitous Virtual Reality? Bringing down all the resources out of cyberspace and into ubiquitous VR
  • 14. Paradigm Shift. Computing in the next 5-10 years is: • Ubiquitous (smart space) => intelligence -> wisdom • Wearable (nomadic human) => personal -> social • Content (responsive content) => emotion -> fun • How to exploit the emotion, imagination, and creativity of humans? Eras: Mainframe Computer (1960s): text; sharing a computer; information. Personal Computer (1980s): CG/image; individual usage; knowledge. Networked Computers (1990s): multimedia; sharing over the Internet; intelligence. Ubiquitous Computing (2000s): u-Media; human-centered computing; wisdom. U-VR Computing (2010s): u-Contents; community-centered contents; emotion, fun.
  • 15. Paradigm Shift. Metaverse: a virtual world, in Neal Stephenson's SF novel Snow Crash. In the Metaverse, humans, as avatars, interact with each other in a 3D space that uses the metaphor of the real world. An enhancement of our perception of the outside world; towards a digital memory of one's existence; a digital reproduction of our reality; social ties, relations with real life.
  • 16. Paradigm Shift. Tangible User Interface [1]: use physical spaces, surfaces, and objects as both controls and representations. Tangible Space Initiative [2]: new conceptual space where human, virtual and real world are connected; collaborative. Anywhere Augmentation [3]: acquiring and presenting content for mobile AR; mobility, collaboration, interactive visualization, immersion; wearable. Ubiquitous VR [4]: augment the real world with a smart virtual world; context-aware MR. [1] Ishii, H. and Ullmer, B., "Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms," In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, New York, NY, pp. 234-241, 1997. [2] Myunwoong Park, H. Ko, and S. Park, "Bridging Virtual Reality and Reality - Tangible Space Initiative," The 7th Multi-conference on Systemics, Cybernetics and Informatics, 2003. [3] Tobias Hollerer, "Anywhere Augmentation: Acquiring and Presenting Content for Ubiquitous Virtual Reality," Proceedings of ISUVR 2007, 2007. [4] Youngho Lee, S. Oh, C. Shin and W. Woo, "Survey on Ubiquitous Virtual Reality," ISUVR 2008.
  • 17. What’s U-VR Space? Dual Reality: real and virtual influencing each other. The theory of Ideas, Plato (5th century BC): Idea -> World. Integrated dual reality influencing each other. U-VR: SPACE 2.0: augment the real space (RS) with the virtual space (VS); sensors link RS and VS; contexts and contents are shared in VS and RS.
  • 18. Ubiquitous VR. Augmented Reality (AR): to enhance the user's sensory perception of the real world by seamlessly augmenting reality with additional information; real-time tracking, registration, rendering, interaction. Examples: Digilog Book with AR agent, Miniature AR. Sejin Oh, Woontack Woo, "Context-aware Mobile Augmented Reality Technology in Smart Spaces," Journal of the Korean Society for Next Generation Computing, Vol. 5, No. 1, pp. 15-23, 2009. Dongpyo Hong, Woontack Woo, "Research Trends in Mobile Augmented Reality Systems," Communications of KIISE, Vol. 26, No. 1, pp. 88-97, 2008. Minkyung Lee, Woontack Woo, "Research Trends and Prospects of Augmented Reality Technology," KIPS Review, Vol. 11, No. 1, pp. 29-40, 2004.
  • 19. Ubiquitous VR 2008. U-VR Simulator: links real and virtual environments; bidirectional control: light, TV, air-conditioner, lamp. C. Kang, Y. Oh, W. Woo, "Widget-based Simulator for Testing Smart Space," Transactions on Edutainment III, LNCS 5940, pp. 59-69 (accepted). C. Kang, Y. Oh, W. Woo, "An Architecture for Flexible Entity Configuration in a Simulation Environment," Edutainment 2009, LNCS 5670, pp. 38-48, Aug. 2009. Y. Oh, C. Kang, W. Woo, "U-VR Simulator Linking Real and Virtual Environments based on Context-Awareness," IWUVR, pp. 52-55, 2009. Yoosoo Oh, Changgu Kang, Woontack Woo, "Context-aware U-VR Simulator Linking Real and Virtual Environments," KHCI 2009, pp. 310-314, 2009. Changgu Kang, Yoosoo Oh, Woontack Woo, "ubiHome 3D Simulator: A Smart Home 3D Simulator for Rapid Prototyping of Service Applications in Ubiquitous Computing Environments," Proceedings of the 2008 Joint Conference on Signal Processing (KSPC), Vol. 21, No. 1, p. 86, 2008.
  • 20. Ubiquitous VR 2008. UVR Simulator: link real and virtual with smart sensors/services; bidirectional control: light, TV, air-conditioner, lamp. C. Kang, Y. Oh and W. Woo, "ubiHome 3D Simulator: Smart Home 3D Simulator for Rapid Prototyping of Service Applications in Ubiquitous Computing Environments," KSPC 2008.
  • 21. Ubiquitous VR 2008. Interaction with AR contents using contextual cues (motion, light, sound) via uPart Particle sensors and a USB bridge (http://particle.teco.edu; idle: 16 hours). Dongpyo Hong, J. Looser, H. Seichter, M. Billinghurst, W. Woo, "A Sensor-based Interaction for Ubiquitous Virtual Reality Systems," In Proceedings of the International Symposium on Ubiquitous Virtual Reality (ISUVR 2008), pp. 75-78, 2008.
  • 22. Ubiquitous Virtual Reality. Ubiquitous VR (U-VR): a concept of realizing VR in a ubiComp-enabled smart environment, i.e., making VR pervasive in our daily lives. Key dimensions: Reality (Reality-Virtuality Continuum); Context (Static-Dynamic Context Continuum); Activity (Personal-Social Activity Continuum). Youngho Lee, S. Oh, C. Shin and W. Woo, "Ubiquitous Virtual Reality and Its Key Dimension," IWUVR, pp. 5-8, 2009. Bruce Thomas, "Roadblocks: Current Technology Challenges for Ubiquitous Virtual Reality," IWUVR, pp. 1-4, 2009. Youngho Lee, S. Oh, C. Shin, and W. Woo, "Recent Trends in Ubiquitous Virtual Reality," ISUVR, pp. 33-36, 2008. Youngjung Suh, K. Kim, J. Han, and W. Woo, "Virtual Reality in Ubiquitous Computing Environment," ISUVR, pp. 1-2, 2007. Sehwan Kim, Y. Suh, Y. Lee and W. Woo, "Toward Ubiquitous VR: When VR Meets ubiComp," ISUVR, pp. 1-4, 2006. Sehwan Kim, Y. Lee, W. Woo, "How to Realize Ubiquitous VR?," Pervasive: TSI Workshop, pp. 493-504, 2006.
  • 23. What’s U-VR Space? U-VR: socially wise mediated reality. Three axes: Reality-Virtuality (Milgram's continuum), Static-Dynamic Context (level of smartness), Personal-Social Activity (size of group, social relationship between members, and cultural background of the community). Diagram labels include Real World, AR, MR, Mirror World, Virtual World, UbiComp, Life-logging, Context-aware AR, Collaborative AR, Social AR, Ubiquitous Mashup AR, Leisure/Social Activity, and U-VR. Youngho Lee, Sejin Oh, Choonsung Shin, "Ubiquitous Virtual Reality and Its Key Dimension," International Workshop on Ubiquitous Virtual Reality 2009, pp. 5-8, 2009.
  • 24. How to realize/experience U-VR? U-VR = Socially Wise Mediated Reality. Socially: to support social activities by sharing u-contents; a shared sense of goal, time, space, presence, feeling. Wise: to provide intelligent services just-in-time using wisdom, according to the user's explicit request or implicit intention; multimodal, intimate, invisible context-aware UI. Mediated Reality: to filter out things we do not wish to have thrust upon us against our will. Youngho Lee, Sejin Oh, Choonsung Shin, "Ubiquitous Virtual Reality and Its Key Dimension," IWUVR 2009, pp. 5-8, 2009. Y. Suh, K. Kim, J. Han, W. Woo, "Virtual Reality in Ubiquitous Computing Environment," ISUVR07, vol. 260, pp. 1-2, 2007. S. Kim, Y. Suh, Y. Lee, W. Woo, "Toward Ubiquitous VR: When VR Meets ubiComp," ISUVR06, pp. 1-4, 2006.
  • 25. What’s U-Contents? U-Contents • Digilog contents in Ubiquitous Smart Space (or UbiComp-enabled Space) Properties • u-Realism: seamless integration and multimodal feedback • u-Intelligence: collective context and intelligent response • u-Mobility: selective sharing in both VE and RE Kiyoung Kim, S.Oh, J.Han, W.Woo, u-Contents: Description and Representation of Contents in Ubiquitous VR, IWUVR 2009, pp. 9-12, 2009. K.Kim, D.Hong, Y.Lee, W.Woo, "Realization of u-Contents: u-Realism, u-Mobility and u-Intelligence," ISUVR07, vol.260, pp. 3-4, 2007
  • 26. Building U-VR Experience
  • 27. AR R&D Trend: AR Tracking, Rendering, Interaction
  • 28. AR @ GIST U-VR Lab (05-07). BilliARd (2005): Haptic AR for billiards. AR Design (2005): Immersive modeling using AR. AR Design (2006): AR-based product design. VR@Home (2006): AR-based user created content. AR@Home (2007): 2D picture: markerless AR@Home. AR@Home (2007): 3D: markerless AR for AR@Home. AR@Home (2007): 3D: markerless AR for AR@Home. MR (2007): Marker concealment for mediated reality.
  • 29. AR @ ISMAR 2009. ISMAR 2009: Modeling (V.D. Hengel), City Alive (K. Kim), PTAM (G. Klein), multiTrack (D. Wagner), Loc&Track (R. Castle), ProFORMA.
  • 30. AR @ U-VR Lab 2008-9. U-VR Lab 2008-9: Tracking. Tracking (W. Baek), Tracking MO (W. Baek), Tracking (W. Baek), multiTrack (Y. Park), Rendering w/M (Y. Park), Tracking (K. Kim).
  • 31. AR @ U-VR Lab 2008-9. U-VR Lab 2008-9: Authoring, Interaction, Agent. Panorama (W. Baek), Page Recognition (K. Kim), Layer Authoring (J. Park), Cube Tracking (W. Baek), AR Agent (S. Oh), AR Agent (S. Oh).
  • 32. AR R&D Trend: AR Applications @ GIST U-VR Lab
  • 33. Digilog Book 2007. Why Digilog Book? 1st generation, e-Book: XML, PDF (copy of a paper book). 2nd generation, u-Book: multimedia (text, sound, animation, flash). 3rd generation, Digilog Book: analog (emotional) + digital contents (experience); DIGILOG = DIGITAL + ANALOG; an immersive analog + digital book combining analog emotion and digital sense; OSMU/MSMU, Culture Contents DB. Y. Lee, T. Ha, H. Lee, K. Kim, W. Woo, "Digilog Book: Convergence of Analog Books and Digital Contents," Joint Conference of Information and Communication Societies, Vol. 14, pp. 186-189, 2007. T. Ha, Y. Lee, W. Woo, "Digilog Book for Temple Bell Tolling Experience based on Interactive Augmented Reality with Culture Technology," International Journal of Virtual Reality (accepted).
  • 34. Digilog Book 2009. Digilog Book: "Hong, Kildong". Animation Authoring (T. Ha), Layer Authoring (J. Park), Agent Authoring (S. Oh), 3D Authoring (H. Choi), Haptic Authoring (S. Park).
  • 35. Miniature AR 2009. 3D reconstruction-based tracking: setting AR coordinates and computing camera poses in real time; multi-core programming; re-localization/tracking; minimize the reprojection error E = \sum_i \sum_j \| x_{i,j} - K [R_i | t_i] X_j \|^2; 3D reconstruction & AR coordinates; real-time 3D camera tracking (a sketch of this cost follows below). Kiyoung Kim, Youngmin Park, Woonhyuk Baek, Woontack Woo, "Miniature AR: An Augmented Reality-based Next-Generation Digilog-type Content Experience Exhibition System," Next Generation PC Spring Conference, pp. 000-000, 2009.
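The reprojection error above is the standard bundle-adjustment cost: x_{i,j} is the measured image point of 3D point X_j in camera i, K the intrinsic matrix, and [R_i | t_i] the pose of camera i. The following C++/OpenCV fragment is only a minimal sketch of evaluating that cost; it is not the U-VR Lab implementation, and the View struct and function names are illustrative assumptions.

// Minimal sketch of the reprojection-error cost E = sum_i sum_j ||x_ij - K [R_i | t_i] X_j||^2
// used by 3D-reconstruction-based tracking. Not the U-VR Lab code; it only shows the
// quantity that a pose-refinement / bundle-adjustment step would minimize.
#include <opencv2/opencv.hpp>
#include <vector>

// One camera (keyframe) i: its pose (rvec, tvec) and the 2D observations x_ij
// of the 3D points X_j visible in it.
struct View {
    cv::Mat rvec, tvec;                     // camera pose R_i (Rodrigues vector), t_i
    std::vector<cv::Point3f> points3d;      // observed subset of the reconstructed X_j
    std::vector<cv::Point2f> observations;  // measured image points x_ij
};

double reprojectionError(const std::vector<View>& views,
                         const cv::Mat& K, const cv::Mat& dist) {
    double e = 0.0;
    for (const View& v : views) {
        std::vector<cv::Point2f> projected;
        cv::projectPoints(v.points3d, v.rvec, v.tvec, K, dist, projected);  // K [R_i | t_i] X_j
        for (size_t j = 0; j < projected.size(); ++j) {
            cv::Point2f d = v.observations[j] - projected[j];
            e += d.x * d.x + d.y * d.y;     // squared residual of one (i, j) term
        }
    }
    return e;
}

An optimizer adjusts the camera poses, and in full bundle adjustment also the points X_j, to drive this value down; the slide's mention of multi-core programming suggests such reconstruction work runs in parallel with real-time tracking, in the style of the PTAM system cited on slide 29.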
  • 36. Miniature AR 2009. Building marker-less AR on a miniature. Model Matching (CTI), AR Authoring (Y. Park), Modeling (K. Kim), Modeling (K. Kim), Interaction (W. Baek).
  • 37. Mobile AR. Why mobile? Integrating your life with the mobile phone. Wearable AR with HMD vs. handheld AR (UMPC, PDA, mobile phone). History of handheld and mobile AR: 1995 handheld display: NaviCam, AR-PAD, Transvision; 1997 wearable AR: Touring Machine, AR Quake; 2001 PDA: thin client (Bat Portal); 2003 PDA: self-contained (Invisible); 2003 mobile phone: CVision (Mozzies, Symball); 2003 mobile phone: thin client (ARphone); 2004 mobile phone: self-contained (Moehring, AR Tennis).
  • 38. Mobile AR. Camera for physical interaction: linking the physical to the virtual world; the environment as part of the interface; camera phones as "bridging" devices; integration with the user's activities.
  • 39. Mobile AR. But… just switching from the PC to the mobile phone won't by itself solve all current problems: there is no comprehensive toolkit supporting the development of applications based on physical mobile interactions. New concepts and SW have to be developed that make best use of the new platform (smaller operating system; weaker hardware: CPU, memory, storage) and take advantage of new possibilities (ultra mobile devices; new I/O such as touchscreen, sound, vibrator, …): an in-situ content authoring/mashup toolkit, and new UI & UX with digilog, SNS, and authoring for an AR eco-system. 2001-2009 © Woo, GIST CTI & U-VR Lab., Gwangju 500-712, S. Korea
  • 40. Mobile AR Mobile AR Applications: Great Green opportunities! Layar2.0 Bionic Eye Robot vision Fairy Trails cAR Locator Cheap Gas Wikitude Cyclopedia Worksnug buUuk TAT Aug-ID AR Compass FirePower Acrossair SREngine
  • 41. CAMAR (06-08). CAMAR: Context-Aware Mobile AR; personalization, interaction, sharing. CAMAR: context-aware mobile AR for personalized interaction and selective sharing of u-contents. NVUI: context-aware mobile AR for visualizing invisible information of the u-environment. Mobile AR for automatic discovery of services and personalization of controllers. Sejin Oh, Woontack Woo, "Context-aware Mobile Augmented Reality Technology in Smart Spaces," Journal of the Korean Society for Next Generation Computing, Vol. 5, No. 1, pp. 15-23, 2009. Sejin Oh, Woontack Woo, "CAMAR: Context-aware Mobile Augmented Reality in Smart Space," IWUVR, pp. 48-51, 2009. Woontack Woo et al., "CAMAR: Context-aware Mobile Augmented Reality Technology," Jinhan M&B, 2009.
  • 42. CAMAR 2.0. Copyright © GIST CTI. All rights reserved.
  • 43. CAMAR 2.0. CAMAR 2.0: using integrated context, including participants' location/place, social relations, and cultural background, to seamlessly augment social u-contents onto indoor and outdoor objects in real time, and to build a sustainable content ecosystem through in-situ MR authoring and selective sharing. Software stack (top to bottom): CAMAR Tour Guide App, CAMAR Mashup App; CAMAR App layer: CAMAR Authoring, CAMAR Widget; CAMAR Renderer, CAMAR Manager (contents search, adaptation, recommendation), CAMAR Tracker (ypTracker, wlTracker, isTracker); UCAM 2.0 Core (communicator, context integrator); APIs (GPS, Wi-Fi, light, OpenCV, Winsock, OpenGL ES, Direct3D Mobile, etc.); Hardware (GPS, Wi-Fi, light, CPU, GPU, FPU) & OS (Windows Mobile, Windows CE, Symbian). A hypothetical interface sketch of this layering follows below.
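The layer names above (CAMAR App, CAMAR Authoring, CAMAR Widget, CAMAR Renderer, CAMAR Manager, CAMAR Tracker, UCAM 2.0 Core) come from the slide. The fragment below is only a hypothetical C++ sketch of how such a stack could be composed; it is not the actual UCAM/CAMAR code, and the types (Frame, Pose, Context) and all method names are assumptions made for illustration.

// Hypothetical sketch of the CAMAR 2.0 layering (tracker / manager / renderer under an app).
// For illustration only; not the actual UCAM 2.0 / CAMAR implementation.
#include <memory>
#include <string>
#include <vector>

struct Frame   { std::vector<unsigned char> pixels; int width = 0, height = 0; }; // camera image (assumed type)
struct Pose    { float R[9]; float t[3]; bool valid = false; };                    // 6-DOF camera pose (assumed type)
struct Context { std::string location, place, socialRelation, culture; };          // integrated context from the slide

// Tracker layer: the slide's ypTracker / wlTracker / isTracker would sit behind this interface.
class Tracker {
public:
    virtual ~Tracker() = default;
    virtual Pose track(const Frame& frame) = 0;
};

// Manager layer: contents search, adaptation and recommendation driven by context.
class ContentManager {
public:
    virtual ~ContentManager() = default;
    virtual std::vector<std::string> recommend(const Context& ctx) = 0;  // content IDs (assumed)
};

// Renderer layer: draws the selected u-contents registered to the tracked pose.
class Renderer {
public:
    virtual ~Renderer() = default;
    virtual void draw(const Pose& pose, const std::vector<std::string>& contentIds) = 0;
};

// Application layer (e.g. a tour-guide or mashup app) composes the layers once per frame.
class CamarApp {
public:
    CamarApp(std::unique_ptr<Tracker> t, std::unique_ptr<ContentManager> m, std::unique_ptr<Renderer> r)
        : tracker_(std::move(t)), manager_(std::move(m)), renderer_(std::move(r)) {}

    void onFrame(const Frame& frame, const Context& ctx) {
        Pose pose = tracker_->track(frame);           // vision / sensing layer
        if (!pose.valid) return;
        auto contents = manager_->recommend(ctx);     // context / content layer
        renderer_->draw(pose, contents);              // rendering layer
    }

private:
    std::unique_ptr<Tracker> tracker_;
    std::unique_ptr<ContentManager> manager_;
    std::unique_ptr<Renderer> renderer_;
};

The benefit of such layering is that the application only depends on abstract roles, so different trackers (the slide lists ypTracker, wlTracker and isTracker) or renderers (OpenGL ES vs. Direct3D Mobile) can be swapped without touching the application code.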
  • 44. CAMAR 2.0 - mUCARF (2009.11.30)
  • 45. CAMAR 2.0. AR on a mobile phone: indoor. Panels: Concept, Interaction, Mashup, Integrated Visual Tag, 2D Tagging, 3D Tagging.
  • 46. CAMAR 2.0. AR on a mobile phone: location-aware. Bluetooth MCU with compass (compass mounted on the back of the phone), compass visualization, indoor localization, CAMAR Guider, annotation, outdoor tour. A generic sketch of compass/GPS-based annotation placement follows below.
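For the outdoor, location-aware case, mobile AR browsers typically compare the compass heading of the camera with the geographic bearing from the user's GPS fix to a point of interest (POI) and place the annotation accordingly. The fragment below is a generic C++ sketch of that computation only; it is not the CAMAR Guider implementation, and the function names and field-of-view handling are assumptions.

// Generic sketch of compass/GPS-based annotation placement for an outdoor AR view.
// Not the CAMAR Guider code; just the common bearing-vs-heading computation.
#include <cmath>

static const double kPi = 3.14159265358979323846;
static const double kDegToRad = kPi / 180.0;

// Great-circle bearing (degrees, 0 = north, clockwise) from the user to a POI.
double bearingToPoi(double userLat, double userLon, double poiLat, double poiLon) {
    double phi1 = userLat * kDegToRad, phi2 = poiLat * kDegToRad;
    double dLambda = (poiLon - userLon) * kDegToRad;
    double y = std::sin(dLambda) * std::cos(phi2);
    double x = std::cos(phi1) * std::sin(phi2) - std::sin(phi1) * std::cos(phi2) * std::cos(dLambda);
    return std::fmod(std::atan2(y, x) / kDegToRad + 360.0, 360.0);
}

// Horizontal pixel position of the annotation, given the compass heading of the camera,
// its horizontal field of view and the screen width. Returns -1 if the POI is out of view.
double annotationX(double bearingDeg, double headingDeg, double hFovDeg, int screenWidth) {
    double delta = std::fmod(bearingDeg - headingDeg + 540.0, 360.0) - 180.0;  // wrap to [-180, 180)
    if (std::fabs(delta) > hFovDeg * 0.5) return -1.0;                         // outside the camera view
    return (delta / hFovDeg + 0.5) * screenWidth;
}

Vertical placement can be handled the same way using the device's pitch, and indoors the position fix would presumably come from the slide's Bluetooth/indoor-localization module rather than GPS.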
  • 47. CAMAR 2.0 Mashup. Context-based in-situ MR mashup: technology that lets ordinary end users in a mobile environment combine, add, and transform contexts around the in-situ services, sensors, and tags of the real environment through MR, creating new kinds of services/contents, and that builds a sustainable content ecosystem through sharing and participation among many users.
  • 48. Mobile Platform Comparison (2009.05.29). Hardware spec, in the order T*OMNIA / Nokia N95 / iPhone 3G / iPhone 3GS. CPU: Marvell PXA320 (800MHz) / Dual ARM11 (332MHz) / ARM 1176 620MHz (limited to work at 412MHz) / ARM Cortex-A8 600MHz. Pixel formats: YUV420, RGB565 or RGB24 / RGB24 / RGBA32 / RGBA32. 3D CG HW accelerator: X / O / O / O. Floating point unit: X / O (ARM 11) / O (VFP) / O (NEON). Touch screen: O / X (Nokia N97: O) / O / O. Camera sensor: O / O / O / O. Location sensor: △ (no Wi-Fi API) / GPS, Wi-Fi, Bluetooth / GPS, Wi-Fi, Bluetooth / O (GPS, Bluetooth, Wi-Fi). Digital compass: X / X (Nokia N97: O) / X / O. Light sensor: △ (no light API) / △ (API not confirmed) / X / O (no API?). Accelerometer sensor: O / O / O / O.
  • 49. CAMAR 2.0 IV Tag. Barcode tracking (320x240, RGB 565), per phone (CPU, OS): binary conv. / barcode detec. / decoding / pose estim. / total: M480 (624 MHz, Win Mo.): 3.69 / 2.82 / 5.728 / 34.01 / 46.256. M490 (800 MHz, Win Mo.): 3.722 / 2.1 / 5.5 / 27.11 / 38.432. iPhone 3G (412 MHz, iPhone OS): 7.467 / 1.5 / 0.817 / 7.842 / 17.626. iPhone 3GS (600 MHz, iPhone OS): 3.752 / 1.0 / 0.498 / 2.935 / 8.185. TEGRA (650 MHz, Win CE): 2.77 / 3.15 / 6.176 / 22.900 / 35.01. Natural feature tracking (320x240, ~70 pts, 300 F), per platform (CPU): FAST detection / F2F matching / pose update / total: M480 (624 MHz): 3.25 / 8.89 / 91.8 / 103.94. M490 (800 MHz): 2.63 / 7.17 / 75.95 / 85.75. iPhone 3G (412 MHz): 4.05 / 13.1 / 26.9 / 44.05. iPhone 3GS (600 MHz): 2.18 / 7.39 / 19.7 / 29.27. (2009.07.24) A sketch of this detect/match/update loop follows below.
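The natural-feature numbers above split one tracking iteration into FAST corner detection, frame-to-frame (F2F) matching and a pose update; on all four devices the pose update dominates the per-frame cost. The fragment below is a minimal C++/OpenCV sketch of such a loop, not the benchmarked tracker: pyramidal LK optical flow stands in for its F2F matching, solvePnP for its pose update, and the association of features with reconstructed 3D points is assumed to happen elsewhere.

// Minimal sketch of a FAST-detect / frame-to-frame-match / pose-update tracking loop.
// Not the benchmarked tracker; LK optical flow and solvePnP are stand-ins for its
// F2F matching and pose-update stages.
#include <opencv2/opencv.hpp>
#include <vector>

struct TrackerState {
    cv::Mat prevGray;                      // previous grayscale frame
    std::vector<cv::Point2f> prevPts;      // 2D features tracked in the previous frame
    std::vector<cv::Point3f> modelPts;     // corresponding reconstructed 3D points
    cv::Mat rvec, tvec;                    // current camera pose
};

void trackFrame(TrackerState& s, const cv::Mat& gray, const cv::Mat& K, const cv::Mat& dist) {
    if (s.prevGray.empty()) {                           // bootstrap: FAST corner detection
        std::vector<cv::KeyPoint> kps;
        cv::FAST(gray, kps, /*threshold=*/20, /*nonmaxSuppression=*/true);
        for (const cv::KeyPoint& kp : kps) s.prevPts.push_back(kp.pt);
        // (associating these corners with 3D model points is omitted in this sketch)
        gray.copyTo(s.prevGray);
        return;
    }
    if (s.prevPts.empty() || s.prevPts.size() != s.modelPts.size()) return;  // need 2D-3D pairs

    // Frame-to-frame matching via pyramidal Lucas-Kanade optical flow.
    std::vector<cv::Point2f> currPts;
    std::vector<unsigned char> status;
    std::vector<float> err;
    cv::calcOpticalFlowPyrLK(s.prevGray, gray, s.prevPts, currPts, status, err);

    std::vector<cv::Point2f> goodPts;                   // keep only successfully tracked features
    std::vector<cv::Point3f> goodModel;
    for (size_t i = 0; i < status.size(); ++i)
        if (status[i]) { goodPts.push_back(currPts[i]); goodModel.push_back(s.modelPts[i]); }

    // Pose update: fit [R | t] to the surviving 3D-2D correspondences.
    if (goodPts.size() >= 6)
        cv::solvePnP(goodModel, goodPts, K, dist, s.rvec, s.tvec, /*useExtrinsicGuess=*/!s.rvec.empty());

    s.prevPts = goodPts; s.modelPts = goodModel;
    gray.copyTo(s.prevGray);
}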
  • 50. What’s Next? The knowledge of joining a crowd and knowing: Who do I know here? Where do I know them from? Who should I get to know? Who here is single? Are there any naked pictures of them on the internet?
  • 51. What’s Next? Mobile Web in the U-VR era? Networked [knowledge -> intelligence -> wisdom]? A continuously growing ecosystem of communication, sharing, and delivery. Web4.0 (Seth Godin): ubiquity, identity, connection. Web3.0 (Tim Berners-Lee, who coined WWW): semantic web. Web2.0 vs. Web2 (Squared; Tim O'Reilly): social web (user generated).
  • 52. Twitterverse
  • 53. CAMAR 2.0 for Galleries
  • 54. Summary Introduction: Why Augmented Reality? Paradigm Shift & Ubiquitous VR (U-VR) AR R&D Trend & Applications •AR@GIST (2005-2007 & 2008-2009) & @ISMAR09 •Desktop AR •ARtalet for Digilog Book (CTI by KOCCA, 2007-2010) •Miniature AR (GIST, 2009) •Mobile AR •CAMAR (with UCN by IITA, 2006-2008) •CAMAR2.0 (with KIST by KOCCA, 2009-2012) What’s Next?
  • 55. Q&A “The future is already here. It is just not uniformly distributed” by William Gibson (SF writer) More Information Woontack Woo, Ph.D. Twitter: @wwoo_ct Mail: wwoo@gist.ac.kr Web: http://cti.gist.ac.kr 22nd CTI Workshop @ GIST, Dec. 5, 2009 3rd CTI Tutorial on AR Tracking @ GIST, Jan. 11-22, 2010 IWUVR 2010 @ Finland, May 17-20, 2010 ISWC 2010 & ISMAR 2010 @ Seoul, Oct.10-16, 2010
