This document summarizes a project to control a virtual human using gestures recognized by the Kinect sensor. The Kinect is used to track joint locations and recognize gestures like moving the hands up and down to change the virtual human's heart rate or blood pressure. It can also detect a CPR gesture by measuring the distance between the wrists and shoulder center. The virtual human then displays different animations and reactions based on its mathematically modeled health conditions and whether the user is performing CPR.
2. Introduction
• Build a virtual human using Unity
• Recognize gestures with the Kinect sensor
• Control the virtual human with gestures
• Enable the user to interact with the virtual human
5. Gesture Recognition
• The Kinect only reports the locations of joints and bones; it does not recognize gestures directly
• Gestures are recognized from the relative locations of various joints
• Left hand moving up/down: compare the location of the Left Wrist joint with the Shoulder Left and Hip Center joints
• Right hand moving up/down: compare the location of the Right Wrist joint with the Shoulder Right and Hip Center joints (see the sketch below)
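A minimal sketch of this comparison in C# (the language of Unity scripts), assuming Y-up Kinect skeleton coordinates; the HandGesture enum and the exact rule (wrist above shoulder = up, wrist below hip center = down) are illustrative, as the project's precise thresholds are not given.

    using System.Numerics;

    enum HandGesture { Neutral, Up, Down }

    static class GestureRecognizer
    {
        // Classify a hand gesture from the relative vertical positions of the
        // wrist, the same-side shoulder, and the hip center (Y points up).
        public static HandGesture Classify(Vector3 wrist, Vector3 shoulder, Vector3 hipCenter)
        {
            if (wrist.Y > shoulder.Y) return HandGesture.Up;     // wrist raised above the shoulder
            if (wrist.Y < hipCenter.Y) return HandGesture.Down;  // wrist dropped below the hip center
            return HandGesture.Neutral;                          // wrist between hip and shoulder
        }
    }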
6. Control Virtual Human
• The Unity program and the Kinect program communicate via a socket (sketched below)
• Change heart rate: move the left hand up or down
• Change blood pressure: move the right hand up or down
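A minimal sketch of the socket link in C#, assuming a line-based text protocol; the message names ("LEFT_UP" and so on), the 5-unit step sizes, and the class names are illustrative, since the project's wire format is not given.

    using System.IO;
    using System.Net.Sockets;

    // Kinect side: send one recognized gesture per text line over TCP.
    class GestureSender
    {
        readonly StreamWriter writer;

        public GestureSender(string host, int port)
        {
            var client = new TcpClient(host, port);  // connect to the Unity program
            writer = new StreamWriter(client.GetStream()) { AutoFlush = true };
        }

        public void Send(string gesture) => writer.WriteLine(gesture);  // e.g. "LEFT_UP"
    }

    // Unity side: map incoming gesture messages to vital-sign changes.
    class VitalSigns
    {
        public int HeartRate = 80;       // bpm
        public int BloodPressure = 110;  // systolic, mmHg

        public void Apply(string gesture)
        {
            switch (gesture)
            {
                case "LEFT_UP":    HeartRate += 5; break;      // left hand up -> raise heart rate
                case "LEFT_DOWN":  HeartRate -= 5; break;      // left hand down -> lower heart rate
                case "RIGHT_UP":   BloodPressure += 5; break;  // right hand up -> raise blood pressure
                case "RIGHT_DOWN": BloodPressure -= 5; break;  // right hand down -> lower blood pressure
            }
        }
    }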
8. Doing CPR
• Move both hands in and out
• The Kinect sends the locations of the Left Wrist, Right Wrist, and Shoulder Center joints to the Unity program
• The Unity program measures the distance between each wrist and the Shoulder Center joint
• When these distances repeatedly shrink and grow, the Unity program knows the user wants to perform CPR (see the sketch below)
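A minimal sketch of the detection in C#, assuming metre-scale skeleton coordinates; the distance thresholds and the three-cycle requirement are assumptions, since the project only states that the wrist-to-Shoulder-Center distance is measured.

    using System.Numerics;

    class CprDetector
    {
        const float CloseThreshold = 0.25f;  // wrists pulled in toward the shoulder center (metres; illustrative)
        const float FarThreshold   = 0.40f;  // wrists pushed back out (illustrative)

        bool handsClose;
        int cycles;

        // Feed one skeleton frame; returns true once enough in/out cycles
        // have been seen to treat the motion as CPR.
        public bool Update(Vector3 leftWrist, Vector3 rightWrist, Vector3 shoulderCenter)
        {
            float d = (Vector3.Distance(leftWrist, shoulderCenter)
                     + Vector3.Distance(rightWrist, shoulderCenter)) / 2f;

            if (!handsClose && d < CloseThreshold)
            {
                handsClose = true;           // hands moved in
            }
            else if (handsClose && d > FarThreshold)
            {
                handsClose = false;          // hands moved back out
                cycles++;                    // one complete in/out cycle
            }
            return cycles >= 3;              // a few repeated cycles => the user is doing CPR
        }
    }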
9. Virtual Human Animations
• Mathematically Driven Virtual Human
• Human reactions to low (<60), normal (60-100), and high (100-150 with BP > 120) heart rates, heart attack (>150), and the CPR procedure (threshold logic sketched below)
• Motion captured from a real human with the MS Kinect
• Heart beating animation
• CPR animation
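A sketch in C# of the thresholds listed above; the source does not say how a heart rate of 100-150 with BP <= 120 is treated, so falling back to Normal there is an assumption. In Unity, the result could drive the Animator, for example animator.Play(condition.ToString()), assuming animation states named after each condition.

    enum HeartCondition { Low, Normal, High, HeartAttack }

    static class ConditionModel
    {
        // Thresholds from the project: low < 60, normal 60-100,
        // high 100-150 with BP > 120, heart attack > 150.
        public static HeartCondition Classify(int heartRate, int bloodPressure)
        {
            if (heartRate > 150) return HeartCondition.HeartAttack;
            if (heartRate > 100 && bloodPressure > 120) return HeartCondition.High;
            if (heartRate >= 60) return HeartCondition.Normal;  // includes 100-150 with BP <= 120 (unspecified in the source)
            return HeartCondition.Low;
        }
    }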
10. CPR posture identification
• Identified from MS Kinect skeleton-tracking data
• Bending both arms and then releasing them counts as one effective electric shock with the defibrillator
• Three shocks bring the patient back to life (see the sketch below)
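A minimal sketch of the shock counting in C#; how "both arms bent" is derived from the skeleton data (presumably from elbow angles) is not specified in the source, so it is taken here as a boolean input.

    class DefibrillatorLogic
    {
        int shocks;
        bool armsWereBent;

        public bool PatientRevived { get; private set; }

        // Feed whether both arms are currently bent in the tracked skeleton.
        public void Update(bool bothArmsBent)
        {
            if (bothArmsBent)
            {
                armsWereBent = true;
            }
            else if (armsWereBent)                       // arms released after a bend
            {
                armsWereBent = false;
                shocks++;                                // one effective electric shock
                if (shocks >= 3) PatientRevived = true;  // three shocks revive the patient
            }
        }
    }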