Multimodal Vision-Based Approach for Robotic Arm Control for Individuals with Upper Level Spinal Cord Injuries
Hairong Jiang1, Juan P. Wachs1, Martin Pendergast2, Bradley S. Duerstock1,3
School of Industrial Engineering1, School of Electrical and Computer Engineering2, Weldon School of Biomedical Engineering3
Purdue University, West Lafayette, Indiana

Problem Statement
Recent advances in computers and robotics make it possible for people with spinal cord injuries (SCIs) and other upper limb mobility impairments to perform activities of daily living and other tasks more independently through the assistance of a robotic arm. However, operating a robotic arm has always been challenging, particularly for individuals with upper extremity mobility impairments. Some form of human-computer interface (HCI) must be employed to initiate and orchestrate each task. Multiple methods have been proposed for manipulating a robotic arm with sufficient dexterity to accomplish basic tasks such as picking up items, drinking from a glass, or self-feeding, yet very few human-computer interfaces are designed specifically for individuals with upper extremity mobility impairments (Collinger et al., 2009).
A multimodal vision-based assistive robotic system dedicated to assisting individuals with quadriplegia due to SCI is presented (Figure 2). The system not only assists people with quadriplegia in activities of daily living, such as eating, drinking, and dressing, but also provides students and scientists with quadriplegia an alternative way to perform "hands-on" laboratory procedures more independently.

Methodology
1. Two modalities were adopted to control the system: a Bluetooth keyboard and a three-dimensional (3D) joystick. Offering both gave the user more flexibility and, in turn, made the human-computer interface more adaptable. The JACO™ robot manipulator from Kinova Technology was adopted as the actuator for the multimodal vision-based assistive robotic system. All robotic control functions were mapped to a compact Bluetooth keyboard to achieve wireless control, and a 3D joystick designed for haptic video game playing was remodeled and reprogrammed as a 3D controller for the actuator. Two control modes were employed (Figure 1).

Figure 1. 3D Joystick Control Diagram
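
The poster does not detail the key bindings or the two control modes, but the modality-to-command idea can be illustrated with a minimal, hypothetical Python sketch. The bindings, mode names, and the send_velocity() stub below are assumptions for illustration, not the actual JACO implementation.

```python
# Hypothetical sketch of routing discrete keyboard (or joystick-button)
# input to arm commands under a switchable control mode. Key bindings,
# mode names, and the arm stub are assumed, not the poster's implementation.
from dataclasses import dataclass

@dataclass
class ArmCommand:
    axis: str      # 'x', 'y', or 'z'
    delta: float   # signed velocity step

# Two switchable control modes, as in Figure 1 (the exact modes are assumed):
# 'translate' moves the end effector; 'orient' rotates the wrist.
MODES = ('translate', 'orient')

KEY_BINDINGS = {
    'w': ArmCommand('y', +0.1), 's': ArmCommand('y', -0.1),
    'a': ArmCommand('x', -0.1), 'd': ArmCommand('x', +0.1),
    'q': ArmCommand('z', +0.1), 'e': ArmCommand('z', -0.1),
}

class _PrintArm:
    """Stand-in for the real arm driver; prints instead of moving."""
    def send_velocity(self, mode: str, axis: str, delta: float) -> None:
        print(f'{mode}: {axis} {delta:+.1f}')

class KeyboardController:
    def __init__(self, arm):
        self.arm = arm
        self.mode = 0  # index into MODES

    def on_key(self, key: str) -> None:
        if key == 'm':  # assumed mode-switch key
            self.mode = (self.mode + 1) % len(MODES)
        elif key in KEY_BINDINGS:
            cmd = KEY_BINDINGS[key]
            # The same spatial input drives translation or wrist rotation,
            # depending on the active mode.
            self.arm.send_velocity(MODES[self.mode], cmd.axis, cmd.delta)

controller = KeyboardController(_PrintArm())
for key in ['w', 'w', 'm', 'q']:   # translate twice, switch mode, rotate
    controller.on_key(key)
```

Mapping every function to keys and buttons in this style keeps both modalities interchangeable, which is one way the interface could remain adaptable across users.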
2. A computer vision system using a Kinect® camera was adopted to obtain feedback from the user and to supervise the performance of the actuator (Jiang et al., 2012). The Kinect sensor, which captures both color and depth information, was calibrated and integrated into the system to provide vision-based information for robotic control. Two main objectives were achieved with the Kinect sensor: assistance and supervision. The user's body was tracked to assist individuals with quadriplegia with activities of daily living; for instance, the face was tracked to enable automatic drinking and shaving services. The end effector of the robotic system was also supervised by the Kinect sensor to provide additional information for object grasping.

Figure 2. System Architecture for Multimodal Vision-Based Assistive Robotic System
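
As a rough illustration of the assistance role described above, the following Python sketch locates the user's face in the Kinect color stream and back-projects it through the depth map to a 3D target, e.g. for positioning a cup during the drinking task. The OpenCV face detector, the camera intrinsics, and a depth frame pre-registered to the color frame are all assumptions; the poster does not specify the tracking method.

```python
# Minimal sketch: detect the face in the color image, read the depth at its
# center, and back-project through a pinhole model to camera coordinates.
import numpy as np
import cv2

# Approximate Kinect color-camera intrinsics (assumed values).
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')

def face_target_3d(color_bgr: np.ndarray, depth_m: np.ndarray):
    """Return the face center in camera coordinates (meters), or None."""
    gray = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.3,
                                           minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    u, v = x + w // 2, y + h // 2   # face center in pixels
    z = float(depth_m[v, u])        # depth (m) at that pixel
    if z <= 0.0:                    # Kinect reports 0 where depth is unknown
        return None
    # Pinhole back-projection from pixel (u, v) and depth z to 3D.
    return np.array([(u - CX) * z / FX, (v - CY) * z / FY, z])
```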

Future Directions
• Perform feasibility and preliminary testing with able-bodied subjects and subjects with quadriplegia.
• Compare the multimodal system (keyboard and 3D joystick combined), unimodal systems (default joystick, keyboard, or 3D joystick alone), and control (OEM) methods.
• Conduct simple task performance tests, measuring average completion time and false manipulations during the participant study (a metrics sketch follows this list).
• Conduct laboratory procedure performance tests with participants with upper extremity impairments in ABIL.
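
A minimal sketch of how the planned metrics might be computed from trial logs; the log format and the sample values below are purely hypothetical, not collected data.

```python
# Illustrative computation of the planned performance metrics: average
# completion time and false-manipulation count per interface condition.
from statistics import mean

# (interface, completion time in seconds, false manipulations) per trial;
# hypothetical placeholder values.
trials = [
    ('multimodal',   42.1, 1),
    ('multimodal',   38.7, 0),
    ('3d_joystick',  55.4, 3),
    ('oem_joystick', 61.0, 4),
]

def summarize(trials):
    by_interface = {}
    for interface, seconds, errors in trials:
        by_interface.setdefault(interface, []).append((seconds, errors))
    for interface, rows in sorted(by_interface.items()):
        times, errors = zip(*rows)
        print(f'{interface}: mean time {mean(times):.1f} s, '
              f'mean false manipulations {mean(errors):.1f}')

summarize(trials)
```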

ACKNOWLEDGMENTS
This work was established through the Institute for Accessible Science by the NIH Director's Pathfinder Award to Promote Diversity in the Scientific Workforce, funded by the American Recovery and Reinvestment Act and administered by the National Institute of General Medical Sciences (grant no. 1DP4GM096842-01).

Contact Info
Brad Duerstock, IAS Director
Susan M. Mendrysa, IAS Assistant Director
E-mail: ias@purdue.edu
URL: iashub.org

References
[1] Collinger JL, Wang W, Degenhart AD, Vinjamuri R, Sudre GP, Weber DJ, Tyler-Kabara EC (2009) Towards a Direct Brain Interface for Controlling Assistive Devices. In: The 1st International Symposium on Quality of Life Technologies, Pittsburgh, PA.
[2] Jiang H, Wachs JP, Duerstock BS (2012) Facilitated Gesture Recognition Based Interfaces for People with Upper Extremity Physical Impairments. In: Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications, Lecture Notes in Computer Science, vol 7441, pp 228-235.