1. FLEXPAD
Guided By: Prof. Shyma Kareem, Assistant Professor, Department of Computer Applications
Submitted By: Pranav Prakash, S4 MCA, Reg. No: MCK21MCA-2025
2. ABSTRACT
• Flexpad is an interactive system that combines a depth camera and a projector to
transform sheets of plain paper or foam into flexible, highly deformable, and
spatially aware handheld displays.
• We present a novel approach for tracking deformed surfaces from depth images in
real time.
• It captures deformations in high detail, is very robust to occlusions created by the
user’s hands and fingers, and does not require any kind of markers or visible
texture.
3. INTRODUCTION
• Flexpad is a system that supports highly flexible bending interactions for projected
handheld displays.
• A Kinect depth camera and a projector form a camera projection unit that lets
people use blank sheets of paper, foam or acrylic of different sizes and shapes as
flexible displays.
• It contributes an algorithm for capturing even complex deformations in high detail
and in real time.
• It also contributes a novel, robust method for detecting hands and fingers with a
Kinect camera using optical analysis of the surface material.
4. FLEXPAD OVERVIEW
Components used
• Kinect camera
• Projector
• Sheet of paper
• Foam or acrylic
Flexible display material
• Any blank sheet can be used as a passive display
• Two kinds of sheet material are used: paper, and foam or acrylic
5. FLEXPAD IMPLEMENTATION
• Flexpad's tracking approach requires only image data from a Kinect depth sensor.
• It allows an entirely blank deformable projection surface to be tracked in high
detail.
• To account for occlusion by the user's hands and fingers, it fits a parameterized,
deformable object model to the depth image data using a semi-global search
algorithm. In the following, we explain
1) the removal of hands and fingers from the input data,
2) the global deformation model, and
3) how a global optimization method fits the deformation model to the
input data in real time.
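The three steps above can be sketched end to end on a synthetic depth frame. This is a minimal illustration, not Flexpad's actual algorithm: the two-parameter bend model, the brightness threshold, and the coarse grid search are all simplified stand-ins for the paper's much richer deformation model and optimization.

```python
import numpy as np

def remove_hand_pixels(brightness, brightness_threshold=0.5):
    # Step 1: mask out pixels whose brightness suggests skin rather
    # than the projection surface (the threshold is a made-up constant).
    return brightness < brightness_threshold  # True = surface pixel

def deformed_surface(params, xs, ys):
    # Step 2: a toy parameterized deformation model -- a sheet bent
    # along two axes; the real deformation model is far richer.
    bend_x, bend_y = params
    return bend_x * xs**2 + bend_y * ys**2

def fit_model(depth, mask, xs, ys, grid=None):
    # Step 3: semi-global search -- evaluate the model over a coarse
    # parameter grid and keep the best fit on unoccluded pixels only.
    if grid is None:
        grid = np.linspace(-1.0, 1.0, 21)
    best_params, best_err = None, np.inf
    for bx in grid:
        for by in grid:
            pred = deformed_surface((bx, by), xs, ys)
            err = np.mean((pred[mask] - depth[mask]) ** 2)
            if err < best_err:
                best_params, best_err = (bx, by), err
    return best_params, best_err

# Synthetic frame: a sheet bent with parameters (0.5, -0.3),
# partially occluded by a bright "hand" region.
xs, ys = np.meshgrid(np.linspace(-1, 1, 32), np.linspace(-1, 1, 32))
depth = deformed_surface((0.5, -0.3), xs, ys)
brightness = np.zeros_like(depth)
brightness[10:20, 10:20] = 1.0       # hand pixels appear brighter
mask = remove_hand_pixels(brightness)
params, err = fit_model(depth, mask, xs, ys)
```

Because the occluded pixels are excluded from the error term, the grid search still recovers the true bend parameters despite the simulated hand covering part of the sheet.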
6. Optical Surface Material Analysis for Hand and Finger
Detection
• When a finger or the flat hand is touching the surface, the depth resolution is
insufficient to differentiate it reliably from the underlying surface.
• Flexpad therefore introduces optical surface material analysis as a novel method of
distinguishing between skin and the projection surface using the Kinect sensor.
• It is based on the observation that different materials have different reflectivity and
translucency properties.
• The surface reflectivity of an object with diffuse material can be determined by
looking at the local point brightness.
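The reflectivity observation above can be sketched as a per-pixel classifier on the Kinect's infrared image. The depth-squared normalization and the threshold value are assumptions for illustration, not calibration constants from the paper, and the sketch assumes skin appears brighter than the projection surface at the sensor's exposure.

```python
import numpy as np

def classify_skin(ir_intensity, depth_m, reflectivity_threshold=0.35):
    # Estimate apparent reflectivity from local point brightness.
    # IR intensity falls off with distance, so we normalize by
    # depth^2 before thresholding (hypothetical calibration).
    reflectivity = ir_intensity * depth_m**2
    return reflectivity > reflectivity_threshold  # True = skin

# Toy frame: a bright patch of skin over a dimmer paper surface.
ir = np.full((8, 8), 0.2)           # paper: low IR reflectivity
ir[2:5, 2:5] = 0.6                  # skin: higher IR reflectivity
depth = np.full((8, 8), 1.0)        # everything about 1 m away
skin_mask = classify_skin(ir, depth)
```

The resulting mask can then be used to exclude skin pixels from the surface-tracking stage, which is what makes the tracking robust to occlusion by hands and fingers.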
7. Animating Virtual Paper Characters
Flexpad Demonstration:
• An application for children that uses deformation for animating paper characters in
an interactive picture.
• By deforming the display and moving it in space, the creature becomes animated.
• High-resolution deformation allows very individualized and varied animation
patterns.
8. Cont…
• Once animated, the creature is released into the virtual world of the animated
picture. For example, fish may move at different speeds and move their fins.
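As a rough sketch of how a measured deformation might drive such an animation, the toy mapping below makes a stronger bend produce a faster swim and a larger fin flap. The function, names, and constants are illustrative and are not Flexpad's animation code.

```python
import math

def creature_motion(bend_amount, t, base_speed=0.5, gain=2.0):
    # Map the measured sheet deformation to a paper creature's
    # swim speed and fin angle (hypothetical mapping).
    # A stronger bend makes the fish swim faster and flap harder.
    speed = base_speed + gain * abs(bend_amount)
    fin_angle = 30.0 * bend_amount * math.sin(2 * math.pi * speed * t)
    return speed, fin_angle

speed, fin = creature_motion(bend_amount=0.4, t=0.0)
```

Because the deformation is captured in high resolution, each region of the sheet could feed its own such mapping, which is what enables the individualized and varied animation patterns mentioned above.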
11. Limitations
• Because the deformable surface is tracked live, there may be some delay in the
projected output.
• Due to the low resolution of the Kinect depth sensor it is difficult to perform a
shape-based classification of fingers and hands at larger distances.
12. Conclusions
Summary of study findings
• Flexpad provides a highly flexible and interactive display.
• For touch input on the deformable display, the desired touch point is at the center.
• Deformable and stretchable materials are used as the display.
• The performance evaluation showed low error and high accuracy.