Level-Ups are telescopic shoes that you can wear in a virtual environment in combination with a head-mounted display.
As you step on virtual objects, they expand, giving you the sense of elevation.
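At its core, this behavior is a mapping from the virtual floor height under each foot to a target extension of that telescopic stilt. A minimal sketch of that mapping follows; the maximum travel value and all names here are illustrative assumptions, not taken from the actual system:

```python
# Minimal sketch: map the virtual terrain height under a foot to a
# target extension for that telescopic stilt.

MAX_EXTENSION_CM = 25.0  # assumed travel range of the telescopic mechanism


def target_extension(virtual_floor_height_cm: float) -> float:
    """Clamp the virtual step height to what the stilt can physically render."""
    return max(0.0, min(virtual_floor_height_cm, MAX_EXTENSION_CM))
```

Steps taller than the mechanism's travel would be clamped, so only part of the virtual elevation could be rendered physically in that case.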
1. Level-Ups
Motorized Stilts that Simulate Stair Steps in Virtual Reality
Dominik Schmidt, Robert Kovacs, Vikram Mehta,
Udayan Umapathi, Sven Köhler, Lung-Pan Cheng, Patrick Baudisch
27. Lifting a person would require even bigger mechanics and a heavy device
Instead, we drive the mechanism while the foot is in the air, allowing a light device
That's what the pressure sensors do
50. Next steps
Multiple steps at a time
(right now we need to re-arm in between)
Faster actuation; running?
51. Level-Ups
Motorized Stilts that Simulate Stair Steps in Virtual Reality
Dominik Schmidt, Robert Kovacs, Vikram Mehta,
Udayan Umapathi, Sven Köhler, Lung-Pan Cheng, Patrick Baudisch
Editor's Notes
LevelUps are telescopic shoes that you can wear in a virtual environment, in combination with a head-mounted display.
As you step on virtual objects they expand, giving you a sense of elevation.
Why are we doing this?
Obviously VR is happening right now, with lots of devices like the Oculus, the Gear VR, and so on. But in VR research, the question for a while has been how we can improve on this visual immersion by giving more physical feedback.
So people have had lots of ideas, from haptic gloves to modeling whole virtual rooms…
But in this paper we focused on one very specific case: stepping on and off virtual objects.
Traditionally, walking simulation has been solved with locomotion devices.
These are stationary devices that can give you the feeling of walking around while you are actually stepping in place.
So adding elevation to these devices is presumably straightforward.
At the same time, there is another trend in virtual reality: people argue that locomotion devices do not give the full experience, and that we should let users walk around freely.
So people have tried different combinations, for example these mobile devices that move along with you so that you can step on them.
But we wanted to go a step further:
to allow elevation in real-walking environments, with a wearable design.
So here is what we made; these are the LevelUps.
This is the video I have already shown, where the user is stepping on virtual objects.
The main mechanism is this golden component, called a lift table or scissor table.
We also added a couple more components, the actuation and the sensing systems, but I'll walk you through all of these in a minute.
Here is just another scene.
Primarily the LevelUps are designed to simulate elevation, but there is another scenario where you could use them: simulating depth.
For example, while you are balancing on a beam they are fully extended, and when you put your feet down to the left and right they retract, giving you the feeling of a void.
But on the whole, the main contribution is that the LevelUps can simulate elevation in real-walking environments, while the device itself is wearable and wireless.
Now let me tell you about the mechanical design.
Presumably, the hardware design was simple and straightforward…
Except that it was not…
We tried lots of other designs, as you can see
And made lots of prototypes.
But let me just walk you through our final solution.
The primary component is the lift table, which was originally designed for manual actuation with a knob.
So we needed to motorize it.
The key challenge here was this blue carriage, which allows the motor to move back and forth together with the actuation thread while holding the motor itself fixed.
And to give the electronics a sense of how high the boot currently is, we added a linear potentiometer that measures the height of the lift table.
But there is a little bit more to the control…
There are pressure sensors integrated into the toes and heels.
Let me tell you why.
Our original idea was to create a design that could actually lift a person, but it turned out that this would require a lot of power and a heavy mechanism. Instead, the LevelUps change height only while the foot is in the air and unloaded, and the pressure sensors tell us when that is.
These sensors are connected to an Arduino Nano, which also drives the motor system and runs the control logic.
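The control rule described above can be sketched as a small loop: hold position while the foot is loaded, and drive the lift table toward the target height only while the foot is in the air. This is a minimal Python sketch under stated assumptions; the real firmware runs on the Arduino Nano, and every name, threshold, and interface here (`read_pressure`, `read_height_mm`, `Motor`) is hypothetical, chosen only to illustrate the logic.

```python
# Hypothetical sketch of the stilt control loop: actuate only while the
# foot is unloaded, since the device cannot lift the wearer's weight.
# All names and calibration values are illustrative, not the real firmware.

FOOT_LOADED_THRESHOLD = 50   # raw pressure units (assumed calibration)
HEIGHT_TOLERANCE_MM = 2      # dead band around the target height

class StiltController:
    def __init__(self, motor, read_pressure, read_height_mm):
        self.motor = motor                    # drives the lift-table thread
        self.read_pressure = read_pressure    # combined toe + heel sensors
        self.read_height_mm = read_height_mm  # linear potentiometer
        self.target_mm = 0

    def set_target(self, height_mm):
        """Target height, e.g. received over Bluetooth from the VR host."""
        self.target_mm = height_mm

    def update(self):
        """One iteration of the control loop, called continuously."""
        if self.read_pressure() > FOOT_LOADED_THRESHOLD:
            # Foot is on the ground: hold the current height.
            self.motor.stop()
            return
        error = self.target_mm - self.read_height_mm()
        if abs(error) <= HEIGHT_TOLERANCE_MM:
            self.motor.stop()
        elif error > 0:
            self.motor.extend()
        else:
            self.motor.retract()
```

The key design point this captures is the one from the slides: by refusing to actuate under load, the mechanism never has to lift a person, which is what keeps the device light.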
We also had to design an artificial foot…
Because our initial design was completely rigid, and that did not feel good.
So we decided to add a little bit of flexibility with these red fiberglass sticks on the bottom,
which we actually took from a kid's bow.
Ok, the next problem was stabilizing the ankle.
The human ankle is designed to work reasonably close to the ground, but since we now operate it at a height of 30 cm, we needed to add ankle support.
Initially we tried to completely immobilize the foot, which is not a good experience.
Then we realized that what we really need is to prevent left-right movement while still allowing the foot to flex back and forth.
And that is exactly what a regular roller-skate boot does.
As you can see…
We used the boot from a normal rollerblade skate.
So this is our final design.
They are operated in a typical VR environment.
Here Sven is wearing the Oculus and a MacBook on his chest, where the actual VR processing runs.
As you can see, everything is wireless; the stilts are controlled via Bluetooth.
To find out where the user is in the virtual world, we used a motion-capture system, in this case OptiTrack.
The software system was implemented in Unity3D.
To give the LevelUps a chance to actuate in time, we do a simple ray cast in front of the user, measure the intersection with the ground, and send the command to the LevelUps to go up or down.
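The look-ahead step just described can be sketched in a few lines: sample the virtual ground a short distance ahead of the user, and send that height to the stilts before the foot lands. The actual system does this with Unity3D ray casts against the scene geometry; the Python below is only an illustration in which virtual boxes are axis-aligned footprints, and the function names and the 0.5 m look-ahead distance are assumptions, not values from the paper.

```python
# Illustrative sketch of the look-ahead logic. The real system ray-casts in
# Unity3D; here virtual boxes are modeled as axis-aligned footprints.

def ground_height_at(x, z, boxes):
    """Height of the tallest virtual box covering point (x, z); 0 = floor.
    Each box is (min_x, min_z, max_x, max_z, height) in metres."""
    h = 0.0
    for (min_x, min_z, max_x, max_z, height) in boxes:
        if min_x <= x <= max_x and min_z <= z <= max_z:
            h = max(h, height)
    return h

def stilt_command(user_pos, heading, boxes, lookahead=0.5):
    """Sample the ground a short distance ahead of the user and return the
    stilt height (in metres) to send over Bluetooth."""
    ux, uz = user_pos
    dx, dz = heading  # unit vector of the walking direction
    return ground_height_at(ux + dx * lookahead, uz + dz * lookahead, boxes)
```

Sampling ahead of the user rather than underfoot is what buys the stilts the time they need to extend or retract before the step actually happens.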
Finally, to find out whether the stilts are any good,
we conducted a brief evaluation in which participants stepped on and off virtual boxes.
We tested three conditions: the LevelUps turned on, against two control conditions where the LevelUps were turned off and where the user was wearing regular shoes.
As you can see, participants enjoyed the experience more when the LevelUps were turned on,
and also judged it as more realistic.
So in conclusion, our main contribution was bringing the concept of elevation to real-walking environments, implemented in a wearable, wireless design.
At the same time, the device also has certain limitations. First of all, it has a limited length, so it can't simulate multiple stair steps in a row.
But what we can do is slowly rewind the device over time, after which it can be actuated again.
The second constraint is the speed of actuation: our LevelUps are good for walking, but certainly not for running.
Thank you for your attention, and I'm happy to take your questions.