This document summarizes a project that controls a virtual human through gestures recognized by the Kinect sensor. The Kinect tracks the user's joint positions and recognizes gestures such as raising or lowering the hands to change the virtual human's heart rate or blood pressure. It can also detect a CPR gesture by measuring the distances between the user's wrists and shoulder center. The virtual human then plays different animations and reactions based on its mathematically modeled health state and on whether the user is performing CPR.
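The CPR detection described above can be illustrated with a minimal geometric sketch. This is not the project's actual code: it assumes joint positions arrive as (x, y, z) tuples in meters (as the Kinect skeleton stream provides), and the function name and distance thresholds are hypothetical tuning values.

```python
import math


def dist(a, b):
    """Euclidean distance between two 3-D joint positions (x, y, z) in meters."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))


def is_cpr_pose(wrist_left, wrist_right, shoulder_center,
                hands_together=0.15, reach=(0.3, 0.7)):
    """Heuristic CPR-pose check based on wrist/shoulder-center distances.

    hands_together and reach are hypothetical thresholds: the wrists must be
    close to each other, and each wrist must sit a plausible arm's-length
    distance from the shoulder center (arms extended for compressions).
    """
    # Hands stacked together: wrists within `hands_together` meters.
    if dist(wrist_left, wrist_right) > hands_together:
        return False
    # Arms extended: each wrist-to-shoulder-center distance inside the band.
    lo, hi = reach
    return all(lo <= dist(w, shoulder_center) <= hi
               for w in (wrist_left, wrist_right))


# Example: hands together, about half a meter below the shoulder center.
print(is_cpr_pose((0.00, -0.50, 0.20),
                  (0.05, -0.50, 0.20),
                  (0.00, 0.00, 0.00)))
```

In a real pipeline this per-frame check would typically be combined with a timer or state machine, so that brief accidental poses are not counted as compressions.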