Virtual Teratak: A 3D Home Control and Entertainment System
2010 Taiwan Display Conference
An Interactive Display System Design Using Image Feedback
Localization Techniques
Yen-Ju Tao¹, Chih-Yung Cheng¹, and Shih-Mim Liu²
¹ Department of Electrical Engineering, National Taiwan Ocean University
² Department of Electrical Engineering, National United University
Contact Address: No.2, Pei-Ning Rd., Keelung, Taiwan, 202, Tel: +886-2-24622192 ext. 6256,
Fax: +886-2-24635408, E-mail: ccheng@ntou.edu.tw
ABSTRACT
The system uses image feedback to localize an active light element that is operated directly on a projected surface, replacing the functionality of a mouse or keyboard. In this way, the manipulation becomes much more intuitive and interactive. A projector is chosen as the display device, and optoelectronic elements (such as an infrared LED or a laser diode) are used as localization points. After the captured image is processed, the light position is sent as a mouse cursor event to perform movement or selection operations. Furthermore, a tracking control system is added to allow better user mobility and to enlarge the operating range. Several applications of the system are demonstrated in this paper.
Keywords: interactive device, display, image processing,
localization, tracking, optoelectronic elements
ABSTRACT (IN CHINESE, TRANSLATED)
This system lets the user operate with optoelectronic elements directly on a preset projected surface, which is more intuitive, allows more interaction, and requires no mouse or keyboard. A commonly used projector serves as the display interface, and optoelectronic elements (an infrared LED or a laser spot) serve as the points to be localized. Visual feedback techniques are applied for image processing and analysis, after which the desired commands are issued to the mouse cursor to perform movement, selection, and further functions. Equipping the system with motors adds degrees of freedom, allowing it to track the user, extend the working area, and improve mobility. Several applications of the system are described in the paper.
Keywords: interactive device, display, image processing, localization, tracking, optoelectronic elements
1. INTRODUCTION
The mouse and keyboard are the most commonly used input devices. Nowadays, more humanized and intuitive input methods are needed and remain to be developed. For instance, it would be convenient and natural for people who have to write or take notes on an electronic document to be able to work directly on the screen, or even on a table, using pen-like devices. Although products such as tablets and touch screens exist [1], their price is too high for them to become widespread.
This paper aims to implement a low-budget interactive environment and to make input devices handier. Under this framework, optoelectronic elements are combined with a camera for localization. Compared with a traditional mouse, this localization provides the absolute position instead of a relative increment. Several different display surfaces are then used and tested in various applications to show the system's versatility. In addition, a motor-controlled tracking system is equipped to track the user's position and to increase the operating range and flexibility.
2. SYSTEM ARCHITECTURE
2.1 Hardware Configuration
The system can be roughly divided into two parts. The main platform in Fig. 1 is responsible for image capture and interactive display, and the hand-held part shown in Fig. 2 is a pen-like optoelectronic device used for localization. More precisely, the main platform includes a projector for the interactive image display, a CMOS camera that captures the user's image for the user-tracking functionality, and a Wiimote, which can capture the infrared LED light point [3]. Several AI motors were added and controlled to provide pan-and-tilt tracking capability. The hand-held device in Fig. 2 uses an infrared transmitter (i.e., an infrared LED), which can be detected by the Wiimote and thus naturally serves as a wireless pointing device.
Fig. 1. Main system configuration
Fig. 2. Localization point device
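Since the Wiimote's IR camera reports blob positions on its own 1024x768 sensor grid, the raw reading must be rescaled, and mirrored horizontally because the sensor faces the projected surface from the user's side, before it can drive the cursor. The following is a minimal Python sketch of this mapping; the 1920x1080 display resolution is an illustrative assumption, not a value stated by the authors.

```python
# Map a raw Wiimote IR camera reading to display coordinates.
# The Wiimote IR camera reports blob positions on a 1024x768 grid;
# the 1920x1080 display resolution is an assumption for illustration.

WIIMOTE_W, WIIMOTE_H = 1024, 768

def ir_to_screen(ir_x, ir_y, screen_w=1920, screen_h=1080):
    """Linearly rescale an IR blob position to screen pixels.

    The IR camera's x axis is mirrored relative to the screen when
    the Wiimote views the surface from the user's side, so x is
    flipped here.
    """
    sx = (WIIMOTE_W - 1 - ir_x) * (screen_w - 1) / (WIIMOTE_W - 1)
    sy = ir_y * (screen_h - 1) / (WIIMOTE_H - 1)
    return int(round(sx)), int(round(sy))
```

The resulting screen position can then be submitted to the operating system as a mouse cursor event, as described in the abstract.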
2.2 Tracking Control Systems
Considering that a user may move or walk during a presentation or a lecture, the system provides a user-tracking mode that follows the user's position. A camera is therefore mounted on the platform to capture the user's image, and the RGB and HSV color models are used to extract the information needed for tracking [4]. Fig. 3 shows that the image of the user can be separated from the background, which enables the motor control system to move the whole main platform to lock onto the user. In the meantime, by using open-source Wiimote software from the web, the information from the Wiimote can be scaled and adjusted to find the position of the infrared transmitter.
Fig. 3. Tracking the user
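The tracking step above can be sketched in a few lines: threshold the captured frame in HSV space, take the centroid of the matching pixels, and convert the centroid's horizontal offset from the image centre into a proportional pan command for the motors. The colour range and gain below are illustrative assumptions, not the authors' actual values.

```python
import colorsys

def segment_centroid(pixels, width, hue_range=(0.55, 0.70), min_sv=(0.4, 0.3)):
    """Return the centroid (x, y) of pixels whose HSV values fall in the
    target range, or None if nothing matched.

    `pixels` is a flat, row-major list of (r, g, b) tuples in 0..255 with
    `width` columns per row. The hue/saturation/value thresholds are
    illustrative; a real system would tune them to the user's clothing.
    """
    xs, ys, n = 0, 0, 0
    for i, (r, g, b) in enumerate(pixels):
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        if hue_range[0] <= h <= hue_range[1] and s >= min_sv[0] and v >= min_sv[1]:
            xs += i % width
            ys += i // width
            n += 1
    if n == 0:
        return None
    return xs / n, ys / n

def pan_command(cx, img_width, gain=0.1):
    """Proportional pan step (degrees) that re-centres the user."""
    return gain * (cx - img_width / 2)
```

Running this per frame and feeding the pan step to the motors keeps the tracked blob near the image centre, which is the lock-on behaviour shown in Fig. 3.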
2.3 Application Systems Design
With the above-mentioned hardware and software architecture, several applications are proposed. They can be roughly divided into static and dynamic experimental environments.
Fig. 4 shows the case where the display device is fixed. This type of setting can be applied to certain static environments, such as an interactive reading or working desk. Using the pointing device, the user can operate directly on the projected screen. The light point's position and trajectory can be captured and computed from the image feedback, and further interactive reactions depend on how the user clicks on the objects.
Fig. 4. Static environment sketch
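The "click" decision mentioned above can be made from the trajectory alone, without any button: if the localized light point dwells within a small radius for enough consecutive frames, it is treated as a click. The following is a minimal sketch of such a detector; the radius and frame-count thresholds are illustrative assumptions.

```python
import math

class DwellClicker:
    """Turn a stream of localized light-point positions into click events.

    A click fires when the point stays within `radius` pixels of its
    anchor for `frames` consecutive frames; the thresholds here are
    illustrative, not the authors' values.
    """

    def __init__(self, radius=8.0, frames=15):
        self.radius, self.frames = radius, frames
        self.anchor, self.count = None, 0

    def update(self, point):
        """Feed one localized point (x, y) or None; return True on a click."""
        if point is None:                       # light not visible this frame
            self.anchor, self.count = None, 0
            return False
        if self.anchor is None or math.dist(point, self.anchor) > self.radius:
            self.anchor, self.count = point, 1  # moved: restart the dwell
            return False
        self.count += 1
        if self.count >= self.frames:
            self.count = 0                      # fire once, then re-arm
            return True
        return False
```

Dragging (as in the pendulum demo of Fig. 8) can reuse the same stream by holding the virtual button down while the point keeps moving after a click.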
In Fig. 5, a view of the desk from the user's perspective is shown. Users can mark or write on the projected document just as if using a pen-like mouse, which is more intuitive than a traditional mouse.
Fig. 5. A closer look on the projected surface
At a later stage, the system was developed to track a user making large movements, as in Fig. 6. This type of design provides a more flexible presentation space and greater convenience [5]. For example, the user can move or walk around between multiple display screens and still use the pointing device.
Fig. 6. System can rotate and track the user
Fig. 7 shows another possibility: the user can play dynamically in an interactive game environment with the IR LED feedback information.
Fig. 7. The movable system allows the user to interact in a game
3. EXPERIMENTS AND DEMONSTRATIONS
Based on the proposed framework and technique,
several applications were implemented and demonstrated
in the following subsections.
3.1 Laser and Infrared LED Dual System
Fig. 8(A) demonstrates a scenario in which, under a static setup, a laser pointer and an infrared LED can be used interchangeably: the laser pointer indicates the target object, and the infrared LED is used as a mouse. Fig. 8(B) shows the traditional use of a laser pointer to emphasize the focused element on the projected screen. Figs. 8(C) and 8(D) show how the infrared LED pointer can control the pendulum swing directly on the projected screen.
(A)
(B)
(C)
(D)
Fig. 8. (A) Demonstration of a presentation environment
(B) Point and focus an object on the screen (C) Drag the
pendulum to an initial condition (D) Write equation by
infrared LED
3.2 User Tracking and Multi-Screen Control
A user-tracking scenario is shown in Fig. 9. The main
display platform can use visual servoing techniques to
locate and track the user if needed [6].
Fig. 9. Tracking user’s location
Fig. 10 shows that the user can walk around between three different screens. The captured user image is processed to judge whether the projector needs to change its projecting direction. In the meantime, the user retains control of all three screens with the pointing device.
Fig. 10. Direct screen control on three different screens by
using infrared LED
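The projecting-direction decision described above reduces to choosing one of the three screens from the pan angle at which the tracked user is currently centred. A minimal sketch follows; the angular zone boundaries are illustrative assumptions.

```python
# Decide which of three screens the projector should face, given the
# pan angle (in degrees) of the tracked user relative to the platform's
# home direction. The +/-30 degree boundaries are assumptions.

def select_screen(pan_deg, boundaries=(-30.0, 30.0)):
    """Return 0, 1, or 2 for the left, centre, or right screen."""
    if pan_deg < boundaries[0]:
        return 0
    if pan_deg <= boundaries[1]:
        return 1
    return 2
```

In practice a small hysteresis band around each boundary is worth adding, so the projector does not flicker between two screens when the user stands near a zone edge.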
3.3 Applications on Different Screens
This experiment is designed to show the applicability of the method to different kinds of surfaces and materials. We first tested it on a clean table surface, as shown in Fig. 11. One can not only highlight sentences but also write and type words on an on-screen keyboard.
Fig. 11. Highlighting sentences, taking notes and typing
words on a table surface
Furthermore, Fig. 12 shows that the system can also work on car windows. Here the pointing device directs the cursor on the window to react to the fast-moving balls in the game.
Fig. 12. Playing games on car windows
The final experiment uses the material shown in Fig. 13: a transparent acrylic board covered with thin paper. When the projector displays an image on it, the image is visible from both sides. Since infrared light can also penetrate the board, the localization functionality can likewise be performed on both sides. Figs. 14 and 15 are snapshots of the two sides during a chess game.
Fig. 13. Image display and IR pointing can be performed
on both sides of an acrylic board
Fig. 14. Placing one chess piece on one side of the board
Fig. 15. Placing a second chess piece on the other side of the board
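Two-sided play as described above implies one coordinate fix: a light point seen from the back of the board is mirrored horizontally relative to the projected image, so it must be flipped before being used as a cursor position. A minimal sketch, assuming the active side of the board is known:

```python
def localized_to_cursor(x, y, screen_w, back_side=False):
    """Mirror a back-side light point into front-side screen coordinates.

    Front-side points pass through unchanged; back-side points are
    flipped about the vertical centre line of the projected image.
    """
    return (screen_w - 1 - x, y) if back_side else (x, y)
```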
4. CONCLUSION
It is indeed a trend for electronic products to offer humanized operating interfaces and interactive access. Adding more interactive capability to common display devices such as LCDs and projectors is what motivated this research. A low-budget solution was discussed and implemented in this paper, and several applications of the framework were designed to provide a more interactive environment for presentations and game playing.
So far, this system and technology still have several flaws to be overcome. For instance, when the projector moves, the focus of the projected image is easily lost, so auto-focus (AF) technology is an important issue to consider [7]. In addition, the light emitted by the projector is too strong for the eyes when one stares at the projected image for a long time. Meanwhile, the size of the main platform should be greatly reduced to enable more mobile applications. These problems can hopefully be addressed by upgrading the hardware [8].
The future work we expect to accomplish is to control the panel or display with the hands or fingers alone, eliminating the hand-held controller so that commands can be issued more intuitively and conveniently.
5. ACKNOWLEDGEMENT
This research was partially supported by NSC
through Grant 98-2221-E-019-035.
6. REFERENCES
[1] B. G. Blundell and A. J. Schwarz, Creative 3-D Display and Interaction Interfaces: A Trans-Disciplinary Approach, Wiley-Interscience, USA, 2006
[2] R. C. Gonzalez and P. Wintz, Digital Image Processing, 2nd ed., Prentice Hall, USA, 2006
[3] J. Lee, S. Hudson and P. Dietz, "Hybrid Infrared and Visible Light Projection for Location Tracking", ACM Symposium on User Interface Software and Technology (UIST), pp. 57-60, October 2007
[4] R. Hirsch, Exploring Color Photography: A Complete Guide, Laurence King Publishing, USA, 2004
[5] D. Avrahami and S. E. Hudson, "Forming Interactivity: A Tool for Rapid Prototyping of Physical Interactive Products", ACM Symposium on Designing Interactive Systems (DIS '02), pp. 141-146, 2002
[6] H. Ma, H. Lu and M. Zhang, "A Real-Time Effective System for Tracking Passing People Using a Single Camera", 7th World Congress on Intelligent Control and Automation (WCICA 2008), pp. 6173-6177, June 2008
[7] A. Gardel, J. L. Lazaro and J. M. Lavest, "Camera Auto-Calibration with Virtual Patterns", Emerging Technologies and Factory Automation, pp. 566-572, September 2003
[8] J. Duan, Y. Deng and K. Liang, "Development of Image Processing System Based on DSP and FPGA", Electronic Measurement and Instruments, pp. 2-791 - 2-794, October 2007