This project develops a natural user interface for interacting with 3D environments using the Microsoft Kinect. Two Kinect devices are placed in a virtual reality space to track a user's full-body movements and gestures. The tracking data drives a digital avatar that mirrors the user's position, letting the user interact with virtual objects directly by reaching out. Gesture recognition provides additional controls for navigation and selection. The goal is to make interacting with complex 3D data more intuitive by mirroring natural physical interactions.
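To make the pipeline concrete, the sketch below shows one plausible way the pieces could fit together: joint positions reported by the two Kinect sensors are fused into a single estimate (assuming both have already been calibrated into a shared world frame), and a simple gesture test is run on the fused skeleton. The `Joint` type, the averaging-based fusion, and the "hand raised above head" selection gesture are all illustrative assumptions, not part of any Kinect SDK.

```python
# Hypothetical sketch: fusing one joint as seen by two Kinect sensors,
# then detecting a simple "hand raised" selection gesture.
# Joint layout, fusion rule, and threshold are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Joint:
    x: float  # metres, common world frame
    y: float  # vertical axis, metres
    z: float  # depth, metres


def fuse_joints(a: Joint, b: Joint) -> Joint:
    """Average the same joint as reported by the two sensors.

    Assumes both sensors are calibrated into one world frame; a real
    system might instead weight readings by tracking confidence."""
    return Joint((a.x + b.x) / 2, (a.y + b.y) / 2, (a.z + b.z) / 2)


def hand_raised(hand: Joint, head: Joint, margin: float = 0.10) -> bool:
    """Selection gesture: hand held at least `margin` metres above the head."""
    return hand.y > head.y + margin


# Usage: the two sensors see the right hand at slightly different positions.
hand = fuse_joints(Joint(0.31, 1.82, 2.00), Joint(0.29, 1.80, 2.10))
head = Joint(0.00, 1.65, 2.05)
print(hand_raised(hand, head))  # fused hand is ~0.16 m above the head
```

In practice the fused skeleton would update the avatar every frame, and gesture tests like `hand_raised` would gate navigation and selection commands; averaging two viewpoints also helps when one sensor's view of a joint is occluded.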