Woontack Woo (禹雲澤), Ph.D.
KAIST UVR Lab.
Joint WS@ [July 30, 2015]
Director: Woontack Woo
International Collaborators
Tae-Kyun Kim @ Imperial College, London
Vincent Lepetit @ TU Graz, Austria
Antonis Argyros @ U. of Crete, Greece
M. Billinghurst @ USA, S. Feiner @ Columbia U., S. Kim @ USC
CTRI AHRC/UVR Lab. Members
2 Postdocs & 7 Researchers
4 Ph.D. & 4+2 MS students
3 Interns
1 Admin Staff
1 Visitor
The Ubiquitous VR research aims at the development of new computing paradigms for "DigiLog Life in Smart Spaces": UVR is Augmented P.I.E.
Crossing Boundaries for AH in UVR 2.0
Augmented Perception, Intelligence, Experience
UVR:
• 3D Vision for User Localization
• Quantified Self for Personalized Augmentation
• I3 Visualization for Organic UI
• 3D Interaction for Hand-based Collaboration
Augmented Reality as a New Medium
UVR, Real-Virtual Symbiosis
AR2.0 Augmented Human for UVR
Current UVR Lab Projects
Summary, Q&A
New Media Trends
Smaller, cheaper, faster, smarter, more intimate
Desktop/Laptop
Mobile
Wearable?
What’s Next?
1980s • IBM: the Personal Computer
1990s • MS: Personal Computing
2000s • Google: Information Sharing over the Internet
2010s • Apple: Mobile Computing
2020s • S, N, D?: new value over IoT/IoE
UI Paradigm Shift in New Media
Type & Click:
Desktop, Laptop
Point & Touch:
Mobile
What’s Next?
Gaze & Gesture:
Wearable
http://goo.gl/sFiJAo
Preferred location for wearables
https://youtu.be/EvyfHuKZGXU
VR/AR will be “the next mega tech
theme” through 2030.
- Gene Munster (Piper Jaffray)
http://goo.gl/XCPkE0
Milgram's Reality-Virtuality Continuum [94]
Azuma's definition of AR [97]:
combines real and virtual
is interactive in real time
is registered in real 3D world
R. Azuma, "A Survey of Augmented Reality," Presence, Vol. 6, No. 4, Aug. 1997, pp. 355-385.
P. Milgram and F. Kishino, "A Taxonomy of Mixed Reality Visual Displays," IEICE Trans. on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
Possible Applications
Enterprise
Maintenance & Repair
Boeing @ AWE 2015: workers 90% better, 30% faster
Medical, training
Construction, houses, apartments
Consumer products, furniture, decoration
Augmented ads, packages, products
Individual
Entertainment, games
Infotainment
Digital arts, fashion
Digital Heritage
1968
Head-Mounted 3D Display (I. Sutherland @ Utah)
1992
Augmented Reality (T. Caudell @ Boeing)
1994
WearComp (S. Mann)
Taxonomy (P. Milgram)
1995
NaviCam (J. Rekimoto @ Sony)
1997
A survey of AR (R.Azuma)
MARS (S. Feiner @ Columbia)
Wearable AR (T.Starner)
1998
Tinmith (B. Thomas)
1999
ARToolKit (H. Kato @ HC)
WorldBoard (J. Spohrer @ IBM)
2000
AR Quake (B. Thomas @ SA)
BARS (S. Julier)
2001
mCollaAR (G. Reitmayr)
MagicBook (M. Billinghurst)
Ubiquitous VR (UVR Lab.)
2003
Human Pacman (A.D. Cheok)
iLamp (R. Raskar @ MERL)
PDA-AR (D. Wagner @ TUGraz)
2004
Phone-AR (M. Mohring @ BU)
2007
PTAM (G. Klein)
SONY "The Eye of Judgment"
DigiLog Book (UVR Lab.)
CAMAR (UVR Lab)
2008
Wikitude (Mobilizy)
DigiLog Miniature (UVR Lab.)
2009
Layar (SPRXmobile)
Arhrrrr! (GATECH)
SLAM on iPhone (G. Klein)
SONY “EyePet”
2012
Trans-Space (KAIST UVR Lab)
SONY PS Vita (mAR)
Wearable AR/VR Industries
• Meta
• Google + Magic Leap
• Microsoft HoloLens
• Apple + Metaio
• Intel + Recon
• Facebook + Oculus Rift
• Samsung Gear VR + FOVE
• Sony Morpheus
Components of an AR System
Fusing CG models with the real environment:
Get video from camera → Recognize object of interest → Estimate camera position and orientation → Render the augmented scene → Process the interaction → Update the status
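To make the dataflow concrete, here is a minimal sketch of the loop in Python; every function is an illustrative stub (none of these names come from a real API or from the lab's code), standing in for the components named above.

```python
# Skeleton of the AR loop described above. Each stage is a stub standing
# in for the real component; function names are illustrative assumptions.
import numpy as np

def get_video_frame():                 # camera capture
    return np.zeros((480, 640, 3), np.uint8)

def recognize_object(frame):           # detect the object of interest
    return {"id": "marker_0"}

def estimate_camera_pose(frame, obj):  # tracking: rotation R, translation t
    return np.eye(3), np.zeros(3)

def render_augmented_scene(frame, pose):  # draw CG on top of the frame
    return frame

def process_interaction(state):        # handle user input, update status
    return state

state = {}
for _ in range(3):                     # the real loop runs once per frame
    frame = get_video_frame()
    obj = recognize_object(frame)
    if obj is not None:
        pose = estimate_camera_pose(frame, obj)
        frame = render_augmented_scene(frame, pose)
    state = process_interaction(state)
```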
Required Math: 3D Tracking for AR
3D geometry, linear algebra, signal/image processing, machine learning, robust numerical optimization, etc.
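As a hedged illustration of the pose-estimation stage and the math it leans on, the sketch below recovers the camera's rotation and translation from four known marker corners with OpenCV's solvePnP; the marker size, intrinsics, and pixel coordinates are invented values.

```python
# Minimal sketch of the pose-estimation stage: recover camera rotation and
# translation from known 3D marker corners and their detected 2D projections.
# Marker size, intrinsics, and pixel coordinates are illustrative values.
import numpy as np
import cv2

# 3D corners of an 8 cm square marker in its own coordinate frame (meters).
object_points = np.array([
    [-0.04, -0.04, 0.0],
    [ 0.04, -0.04, 0.0],
    [ 0.04,  0.04, 0.0],
    [-0.04,  0.04, 0.0],
], dtype=np.float64)

# Corresponding corner locations detected in the image (pixels).
image_points = np.array([
    [300.0, 260.0],
    [380.0, 258.0],
    [384.0, 338.0],
    [298.0, 342.0],
], dtype=np.float64)

# Pinhole intrinsics from a prior calibration step (fx, fy, cx, cy).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(4)          # assume an undistorted image

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist_coeffs)
R, _ = cv2.Rodrigues(rvec)         # rotation vector -> 3x3 rotation matrix
print("Camera pose: R =\n", R, "\nt =", tvec.ravel())
# R, t place the marker in the camera frame; invert the transform to get
# the camera in the world frame, then hand the pose to the renderer.
```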
Range of Techniques
• 2D picture tracking (K. Kim)
• 3D vase tracking (Y. Park)
• Multiple-object tracking (W. Baek, Y. Park)
• 3D object tracking, ISMAR 2008 (K. Kim)
• Motion blur in 3D tracking, ISMAR 2009 (Y. Park)
• Scalable tracking, VC 2010 (K. Kim)
• Modeling & tracking, ISMAR 2010 (K. Kim)
• Depth-assisted tracking, ISMAR 2011 (Y. Park)
• Depth-assisted detection, ICAT 2011 (W. Lee)
Ubiquitous VR for DigiLog Life (Woo)
• 3D link between dual (real & virtual) spaces with additional information/content
• CoI (Context-of-Interest) augmentation, not just sight: sound, haptics, smell, taste, etc.
• Bidirectional interaction for H2H/O2O communication in dual spaces
[Diagram: real space (private/3rd skin, social, general) and virtual space (U-Content) linked seamlessly through CoI; space social networks; how to link seamlessly?]
From AR toward Ubiquitous VR
Enhance Experience, Engage, Edutainment
UVR @ UVR Lab 2008-09
• UVR Simulator, Augmented Room
• DigiLog Miniature "Kildong", DigiLog Agents: storytelling applications integrated with Virtools*
UVR with 'Internet of Everything (Cisco)'
IoE, Internet of Things That Think
Brings together people, process, data, and things to make networked connections more relevant and valuable:
• P2P Collaboration (People to People)
• P2M Analytics (People to Machine)
• M2M (Machine to Machine)
• P2K Analytics (People to Knowledge)
[Diagram: process links people, things (IoT), and data/information/knowledge/wisdom]
http://goo.gl/Fz2BNp
Metaverse
[Diagram: physical space → (sensing) → measured space → (modeling) → virtual space; augmentation & interaction close the loop over networking; axes: time, longitude (spatial), latitude (spatial)]
Map as a New Platform with Holistic Layers
• CoI-aware: environmental context; user context (knowledge, experience, preference)
• Just-in-time visualization
• Interaction
• Mash-up authoring
• SNS
• Extension of human perception
[Keys: user engagement, context-awareness, realistic 3D augmentation]
What are the Keys for the UVR Ecosystem?
Augmented content is the king, context is the queen controlling the king, and the user is God!
• Multicore/GPGPU
• Object recognition/tracking
• Light-source estimation
• Physics engine
• AI engine
How to avoid the fate of VR? "Look for DigiLog, the intersections between the digital world (3D, SNS) and the analog world (CoI)."
Future Human in UVR era?
AR (Push): extended space-time
AH (Pull): how to extend humans' physical, intellectual, and social abilities?
http://goo.gl/yJPDtC
… After the Singularity? (Ray Kurzweil)
From AR to Augmented Human
Augmented Human means
Augmented Perception, Intelligence, Experience
Enhancing the 5+ senses and perception
Offering wisdom with QS and Linked Open Data
Improving spatio-temporal-social ability
How to achieve AH?
Quantified Self
Holistic QS for Qualified Social Life
Current Projects
Trans-Space : Hand-based Collaboration with 3D Glasses
Supported by KIST for 2010-2019
ARtalet 2.0: AR authoring with 3D Glasses
Supported by NRF for 2014-2017
K-Culture Time Machine : AR for Historical Seoul Tour
Supported by MoCTS for 2014-2017
QS from Emotion-Mining: Smile Coach in Smart Phone
Supported by KAIST for 2014-2015
Interaction & Collaboration in Trans Spaces
Goal
• Support bare-hand interaction and collaboration with virtual objects through high-DoF hand tracking
Needs & Values
• Interaction with virtual objects in AR without additional interfaces
• Remote collaboration in the same Trans-Space
Approach
• Video see-through HMD-based augmented reality
• Hand tracking with an egocentric RGB-D camera
[Diagram: local user A interacts bare-handed with a virtual/physical object of interest in Space #1; 2-way networking projects A's avatar into Space #2]
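As a hedged sketch of a common first step for the egocentric RGB-D approach (an assumption, not necessarily the lab's pipeline), the snippet below segments the hand as the closest connected region in the depth map; the depth range and the synthetic frame are illustrative.

```python
# Sketch of a typical first step for egocentric RGB-D hand tracking:
# segment the hand as the closest connected region in the depth map.
# The depth thresholds and the toy frame are illustrative assumptions.
import numpy as np
import cv2

def segment_hand(depth_m, max_reach_m=0.6, band_m=0.15):
    """Return a binary mask of the closest surface within arm's reach."""
    valid = (depth_m > 0.1) & (depth_m < max_reach_m)
    if not valid.any():
        return np.zeros(depth_m.shape, np.uint8)
    near = depth_m[valid].min()                  # closest valid depth
    mask = (valid & (depth_m < near + band_m)).astype(np.uint8)
    # Keep only the largest connected component (the hand blob).
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if n <= 1:
        return mask * 255
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return np.where(labels == largest, 255, 0).astype(np.uint8)

# Toy depth frame: background at 2 m, a "hand" patch at ~0.4 m.
depth = np.full((240, 320), 2.0, np.float32)
depth[100:160, 140:200] = 0.4
print(segment_hand(depth).sum() / 255, "hand pixels")   # -> 3600.0
```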
Finger Tracking in Trans Spaces
Y. Jang, S-T. Noh, H.J. Chang, T-K. Kim, W. Woo, "3D Finger CAPE: Clicking Action and Position Estimation under Self-Occlusions in Egocentric Viewpoint," IEEE TVCG, vol. 21, no. 4, 2015 (presented at IEEE VR 2015).
(Video comparison: Ours / HSKL / FORTH)
Sensor-assisted Hand Tracking
Goal
• Egocentric hand & arm pose estimation with visual-inertial sensor fusion
Needs & Values
• Assist low-frame-rate visual tracking with high-frame-rate inertial tracking
Approach
• Calibration for motion capture with an on-body sensor network
• Arm pose estimation and direct … (in progress)
[Setup: IMUs on the upper arm and forearm plus a camera + IMU; shoulder and elbow poses estimated]
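A minimal sketch of the stated idea, high-rate inertial prediction corrected by low-rate visual measurements, reduced here to a 1D orientation complementary filter; the rates, noise levels, and blend gain are assumptions, and the real system would fuse full 6-DoF poses.

```python
# Sketch of visual-inertial fusion: integrate the gyro at high rate, and
# blend in the camera's absolute orientation whenever a (slow) frame
# arrives. Rates, noise levels, and the blend gain are assumptions.
import numpy as np

rng = np.random.default_rng(0)
imu_hz, cam_hz, alpha = 200, 20, 0.1    # IMU rate, camera rate, blend gain
dt = 1.0 / imu_hz
true_rate = 0.5                          # rad/s, constant turn for the demo

est = 0.0
truth = 0.0
for k in range(imu_hz):                  # simulate one second
    truth += true_rate * dt
    gyro = true_rate + rng.normal(0, 0.05) + 0.02   # noise + bias
    est += gyro * dt                     # high-rate inertial prediction
    if k % (imu_hz // cam_hz) == 0:      # a camera pose is available
        cam = truth + rng.normal(0, 0.01)            # visual orientation
        est = (1 - alpha) * est + alpha * cam        # drift correction
print(f"truth {truth:.3f} rad, fused estimate {est:.3f} rad")
```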
Hand Gesture Tracking in Trans Spaces
(Video comparison: Ours / HSKL / FORTH)
Y. Jang, S-T. Noh, H.J. Chang, T-K. Kim, W. Woo, "3D Finger CAPE: Clicking Action and Position Estimation under Self-Occlusions in Egocentric Viewpoint," IEEE TVCG, vol. 21, no. 4, 2015 (presented at IEEE VR 2015).
Collaboration in Trans Spaces
S-T. Noh, H-S. Yeo, W. Woo, "An HMD-based Mixed Reality Framework for Avatar-Mediated Remote Collaboration Using Bare-Hand Interaction," ICAT-EGVE 2015, under review.
AiRSculpt: Wearable AR 3D Sculpting
Goal
• Quickly create and manipulate 3D virtual content directly with bare hands in a real-world setting
Needs & Values
• High barrier to entry of 3D graphics tools
• Need for a bare-hand user interface for HMDs
Approach
• Video see-through HMD with an RGB-D camera
• 3D tracking module and sculpting system
Sung-A Jang, Hyung-il Kim, Woontack Woo, Graham Wakefield, "AiRSculpt: A Wearable Augmented Reality 3D Sculpting System," HCII 2014, Vol. 8530, pp. 130-141, Jun. 2014.
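A hedged sketch of what a sculpting core could look like: a voxel grid that accretes material wherever the tracked fingertip dwells. The grid resolution and brush radius are invented, and the actual AiRSculpt representation may differ.

```python
# Sketch of a sculpting core: a boolean voxel grid that gains material
# wherever the tracked fingertip passes. Grid size, world-to-grid scale,
# and brush radius are illustrative assumptions, not AiRSculpt's values.
import numpy as np

N = 64
voxels = np.zeros((N, N, N), bool)          # sculpting volume

def sculpt_at(p, radius=2, scale=32, offset=32):
    """Fill voxels within `radius` cells of fingertip position p (meters)."""
    idx = np.clip((np.asarray(p) * scale + offset).astype(int), 0, N - 1)
    x, y, z = np.ogrid[:N, :N, :N]
    ball = (x - idx[0])**2 + (y - idx[1])**2 + (z - idx[2])**2 <= radius**2
    voxels[ball] = True

for t in np.linspace(0, 1, 20):             # fingertip sweeping a small arc
    sculpt_at([0.2 * np.cos(t), 0.2 * np.sin(t), 0.0])
print(voxels.sum(), "voxels filled")        # the mesh would be extracted
                                            # from this volume for display
```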
Wearable UI/UX
Jooyeun Ham, Jonggi Hong, Youngkyoon Jang, Woontack Woo, “Smart Glasses’ Augmented Wearable
Interface based on Wristband-type Motion-aware Touch Panel,” in Proc. IEEE 3DUI, 2014. (Accepted)
ARtalet2.0: Augmented Space Authoring
Goal
• Geometry-aware interactive AR authoring using a smartphone in a 3D-glasses environment
Needs & Values
• An authoring method that lets users easily build an AR world in-situ and manipulate 3D virtual content within it
Approach
• Obtain 3D image features from an unknown 3D space, analyze its geometry, and interactively align local reference coordinates
Concept Figures
하태진, 우운택, "Geometry Recognition-based Registration Coordinate Correction for Wearable Augmented Reality Authoring" (in Korean), Korea Computer Graphics Society (KCGS), pp. 57-58, Jul. 2014.
ARtalet2.0: Subspace Selection
Goal
• Remote subspace selection using bare-hand input for invoking subspaces in an augmented reality space
Needs & Values
• Natural space selection under physical boundary conditions
• Space-of-Interest as an extended unit of Object-of-Interest
Approach
• Subspace defined by a 3DoF freehand pinch-tip pointer
• 4 progressive selection techniques (RSC / RCO / TSC / TSL)
H. Lee, S. Noh, and W. Woo, "Remote and Progressive Space Selection with Freehand Pinches for Augmented Reality Space," in preparation for IEEE TVCG.
[Figures: egocentric subspace selection leads to the user's view with the invoked subspace]
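To give a feel for pinch-based remote selection, here is a hedged sketch (the RSC/RCO/TSC/TSL techniques above are more elaborate): cast a ray from the eye through the pinch tip and select whatever falls inside a sphere placed along that ray; depth and radius would be refined progressively.

```python
# Hedged sketch of remote subspace selection with a pinch-tip pointer:
# cast a ray from the eye through the pinched fingertips and collect the
# objects inside a sphere placed along that ray. The ray/sphere model and
# all positions are assumptions; the published techniques differ.
import numpy as np

def select_subspace(eye, pinch_tip, objects, depth=2.0, radius=0.5):
    """Return objects within `radius` of the point `depth` meters along
    the eye->pinch ray; (depth, radius) would be varied progressively."""
    ray = pinch_tip - eye
    ray = ray / np.linalg.norm(ray)
    center = eye + depth * ray               # candidate subspace center
    return [name for name, pos in objects.items()
            if np.linalg.norm(pos - center) <= radius]

eye = np.array([0.0, 1.6, 0.0])              # head position (illustrative)
pinch = np.array([0.2, 1.4, 0.5])            # pinched fingertip position
objs = {"vase": np.array([0.7, 1.0, 1.8]),
        "lamp": np.array([2.0, 1.0, 0.3])}
print(select_subspace(eye, pinch, objs))     # -> ['vase']
```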
ARtalet2.0: Realistic Rendering
[Diagram (translated): analyze objects' light properties → estimate real light sources → render virtual objects; otherwise the mismatch with reality reduces user immersion]
Related work, use cases, and limitations: "Augmented Reality Virtual Fitting Room"
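One standard way to realize the "estimate real light sources" step, shown here as a stand-in rather than the lab's actual method, is least-squares inverse lighting under a Lambertian model: given surface normals and observed intensities on a known object, recover the dominant light direction.

```python
# Sketch of directional light estimation under a Lambertian assumption:
# observed intensity I ~ albedo * max(0, n . l), so with known unit
# normals n we can solve for the light vector l by least squares.
# The data here is synthetic; a real system samples a known scene object.
import numpy as np

rng = np.random.default_rng(1)
# Random unit normals on the visible surface of a known object.
n = rng.normal(size=(500, 3))
n /= np.linalg.norm(n, axis=1, keepdims=True)

l_true = np.array([0.3, 0.5, 0.81])          # hidden light direction
l_true /= np.linalg.norm(l_true)
I = np.clip(n @ l_true, 0, None) + rng.normal(0, 0.01, 500)

lit = I > 0.05                                # drop shadowed samples
l_est, *_ = np.linalg.lstsq(n[lit], I[lit], rcond=None)
l_est /= np.linalg.norm(l_est)
print("estimated light direction:", np.round(l_est, 3))
# The renderer then lights virtual objects with l_est so they blend in.
```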
K-Culture Time Machine
Goal
• Development of creation and provision technology for time•space-connected cultural contents
Needs & Values
• A new fusion of cultural contents that connects the varied holdings of various agencies in time and space
Approach
• Semantic modeling and analysis of cultural contents
• AR/VR visualization of time•space-connected cultural contents
"K-Culture Time Machine: Development of Creation and Provision Technology for Time•Space-connected Cultural Contents," HIMI 2015.
[Figures: virtual events, virtual map]
Space-Telling for AR
A representation methodology reflecting the characteristics of AR technology: space-driven augmented reality representation based on mashed-up hyperlinks to the spatio-temporally related contents of heterogeneous databases, with an authoring tool.
[Diagram: a 3D physical space anchored by spatial and temporal coordinates (15c, 16c, 17c), hyperlinked in virtual space to a 2D map, a 3D CG model DB, an image DB, a video DB, and a cultural heritage info DB]
E. Kim, W. Woo, "Augmented Reality Based Space-telling Framework for Archeological Site Tours" (in Korean), Journal of HCI Korea, 2015 (to appear).
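A hedged sketch of what a space-telling hyperlink record could look like: each content node carries a spatial coordinate, a temporal coordinate, and links into the heterogeneous databases named above. All field and database names are illustrative assumptions.

```python
# Illustrative data model for space-telling: content anchored at a spatial
# and temporal coordinate, hyperlinked across heterogeneous databases.
# All field names, database names, and values are assumptions.
from dataclasses import dataclass, field

@dataclass
class SpaceTellingNode:
    name: str
    lat: float                  # spatial coordinate (latitude)
    lon: float                  # spatial coordinate (longitude)
    century: int                # temporal coordinate, e.g. 15 for 15c
    links: dict = field(default_factory=dict)   # db name -> record ids

palace = SpaceTellingNode(
    name="Gyeongbokgung main gate", lat=37.5759, lon=126.9769, century=15,
    links={"3d_cg_model_db": ["gate_15c.glb"],
           "image_db": ["img_0042", "img_0188"],
           "heritage_info_db": ["H-0001"]})

def contents_at(nodes, century):
    """Hyperlink traversal for one time slice of the tour."""
    return [n for n in nodes if n.century == century]

print([n.name for n in contents_at([palace], 15)])
```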
User Localization for Outdoor AR
Goal
• Enable robust camera tracking & localization in outdoor AR environments
Needs & Values
• Various conditional changes in outdoor environments (e.g., non-static objects, lighting conditions, scalability)
Approach
• Keyframe-based 3D point registration
• Real-time camera tracking and localization
[Pipeline: off-line processing extracts and reconstructs 3D feature points from keyframes (R|t), keypoints, and sensor info into a 3D visual data management store; on-line processing performs user self-localization (camera detection & tracking)]
User Localization for Outdoor AR
3D feature point extraction & reconstruction, captured by drone and manually
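A minimal sketch of the on-line stage under the stated approach: match the query frame's descriptors against descriptors attached to the reconstructed 3D points, then estimate the camera pose with RANSAC PnP. The feature type, ratio threshold, and error bound are assumptions.

```python
# Sketch of keyframe-based self-localization: match query descriptors to
# the reconstructed 3D point database, then solve PnP with RANSAC.
# Descriptor choice, Lowe ratio, and reprojection bound are assumptions.
import numpy as np
import cv2

def localize(query_desc, query_kpts, db_desc, db_points3d, K):
    """query_desc: NxD float32 descriptors with 2D keypoints query_kpts;
    db_desc: MxD float32, one descriptor per registered 3D point."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = matcher.knnMatch(query_desc, db_desc, k=2)
    good = [m for m, n in pairs if m.distance < 0.7 * n.distance]
    if len(good) < 6:
        return None                        # not enough 2D-3D matches
    pts2d = np.float64([query_kpts[m.queryIdx] for m in good])
    pts3d = np.float64([db_points3d[m.trainIdx] for m in good])
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        pts3d, pts2d, K, None, reprojectionError=3.0)
    return (rvec, tvec) if ok else None
# Per frame: extract features, call localize(), and render from the pose.
```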
City Experience with AR
Goal
• Design an AR experience of 'Seoul' based on Walter Benjamin's interpretation of the city
Needs & Values
• Benjamin's perspective, which analyzed 'Paris' along its spatial and temporal dimensions, can expand the layers of urban experience offered through augmented reality
Approach
• Three AR UX designs for the urban flâneur: reading by image / reading by trace / reading by experience
[Diagram: characteristics of the urban flâneur: analyzing everyday life and everyday experience; viewing the city as a stroller; reading dialectical images of the city]
Location-based Film Experience with AR
Goal
• Build an AR video service that provides location-based film experiences in augmented places
Needs & Values
• The partial functionality of existing context-awareness has prevented interactive and intelligent multimedia services
Approach
• Use a 5W1H (Who, When, Where, What, How, Why) metadata schema to interpret the contexts of places, users, and videos
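To make the 5W1H schema concrete, here is a hedged example of how one scene's context might be encoded and queried; every field value is invented for the sketch.

```python
# Illustrative 5W1H metadata record for one film scene, plus a crude
# proximity query for the viewer's current place. Values are invented.
scene_metadata = {
    "who":   ["protagonist", "street vendor"],
    "when":  {"shot": "2014-10-03", "depicted_era": "1970s"},
    "where": {"lat": 37.5796, "lon": 126.9770, "place": "palace gate"},
    "what":  "farewell scene",
    "how":   "handheld tracking shot",
    "why":   "establishes the character's departure from Seoul",
}

def matches_place(meta, lat, lon, radius_deg=0.002):
    """Is the user near the place this scene is anchored to?"""
    w = meta["where"]
    return (abs(w["lat"] - lat) < radius_deg
            and abs(w["lon"] - lon) < radius_deg)

print(matches_place(scene_metadata, 37.5797, 126.9771))   # -> True
```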
Visualization of PoI for Collaborative AR
Goal
• Visualize others' AI (Attention Information) to enhance the sense of presence
Needs & Values
• A compelling shared experience through the absence of mediation
• Specification of permissible attention information
Approach
• Visualized information that guides the user's attention (transitional view)
• Delineation of a location and a direction for the camera
김재인, 우운택, & 시정곤 (2014). "Design of an Augmented Reality Viewer for Visualizing Location Information of Objects of Interest" (in Korean). Proc. HCI Korea, pp. 90-91.
[Figure label: user location]
Augmented Organic UI for KCTM
Goal
• Develop an augmented UI that presents space-telling and time-machine metadata in real space for tour guidance
Needs & Values
• Metadata relationships need to be visualized over real space
• Information structure and interaction methods suited to the AR context are needed
Approach
• Data presentation that reflects real-world conditions such as screen movement, POI recognition, and field of view
• Interface composition suited to tourists' usage contexts and needs
Concept Figures
Smart Mirror
Goal
• Track facial landmarks efficiently under occlusion, brightness changes, and deformations
Needs & Values
• Face alignment plays an important role in many applications, such as face recognition, facial expression detection, face synthesis, and 3D face modelling
Approach
• We propose 1) a new set of facial landmarks and 2) a novel random regression forest designed to achieve real-time face alignment
[Figures: landmark re-definition; RF-based landmark detection]
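As a simplified stand-in for the proposed random regression forest (the actual landmark set, features, and forest design differ), the sketch below trains a scikit-learn forest to regress a landmark position from raw pixel intensities on synthetic data.

```python
# Simplified stand-in for regression-forest face alignment: regress a 2D
# landmark coordinate directly from image intensities. The real method
# uses purpose-built features and a custom forest; this data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
n, size = 400, 16                        # 400 toy "faces" of 16x16 pixels
landmarks = rng.uniform(4, 12, (n, 2))   # one (x, y) landmark per image

images = np.zeros((n, size, size))
for img, (x, y) in zip(images, landmarks):
    img[int(y), int(x)] = 1.0            # bright dot marks the landmark
    img += rng.normal(0, 0.05, (size, size))

X = images.reshape(n, -1)                # flatten pixels into features
forest = RandomForestRegressor(n_estimators=50, random_state=0)
forest.fit(X[:300], landmarks[:300])     # train on 300, test on 100

pred = forest.predict(X[300:])
err = np.linalg.norm(pred - landmarks[300:], axis=1).mean()
print(f"mean landmark error: {err:.2f} px")
```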
Multi-device Interaction (collaboration with J. Lee)
Goal
• Context-aware interaction in multi-device, multi-user situations
Needs & Values
• Diverse smart devices exist
• Support better, generally applicable interaction across all devices
Approach
• Use field theory as the representative model
• Lay out all devices and people as objects, each with its own field, in an ideal interaction space, using their essential and behavioral information
Concept Figures
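A hedged reading of the field-theory idea: give every device a scalar field over the shared interaction space and route input to the device whose field dominates at the user's position. The Gaussian field shape and capability weights are assumptions for the sketch.

```python
# Sketch of field-theory device selection: each device emits a scalar
# field over the interaction space (a Gaussian falloff scaled by a
# capability weight), and input routes to the strongest field at the
# user's position. Field shape and all values are illustrative.
import math

devices = {                 # name: (x, y, capability weight)
    "tv":      (0.0, 3.0, 2.0),
    "tablet":  (1.0, 0.5, 1.0),
    "speaker": (3.0, 2.0, 0.5),
}

def field(dev, x, y, sigma=1.5):
    dx, dy, w = devices[dev][0] - x, devices[dev][1] - y, devices[dev][2]
    return w * math.exp(-(dx * dx + dy * dy) / (2 * sigma * sigma))

def route(x, y):
    """Pick the device whose field is strongest where the user stands."""
    return max(devices, key=lambda d: field(d, x, y))

print(route(0.8, 0.8))   # near the tablet -> 'tablet'
print(route(0.5, 2.8))   # close to the heavily weighted tv -> 'tv'
```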
Mirror Mirror (collaboration with D. Saakes)
Goal
• A personal clothing design system for use at home
Needs & Values
• Select/design items with hand gestures in front of a mirror
• Have them fabricated on the spot with a projector
Approach
• Combine spatial AR with a mirror to achieve 3D feedback
• Edit properties of patterns, color, density, and layering with gestures
D. Saakes, H. Yeo, S. Noh, G. Han, W. Woo, "Mirror Mirror: …," ACM SIGGRAPH 2015 Studio.
Augmented Reality as a New Medium
UVR, Real-Virtual Symbiosis
AR2.0 Augmented Human for UVR
Current UVR Lab Projects
Summary, Q&A
More Information
Woontack Woo, Ph.D.
FB @wtwoo Twitter @wwoo_ct
wwoo@kaist.ac.kr http://uvr.kaist.ac.kr
11th ISUVR 2016 @ YUST, China, Jun. 29 – Jul. 5, 2016
“The future is already here. It is just not uniformly distributed”
by William Gibson (SF writer)
