Medical Hands-Free System
Authors
Avraham Levi, Guy Peleg
Supervisors
Ron Sivan, PhD, Yael Einav, PhD
Abstract - Contemporary Human-Computer Interaction (HCI) is based on the use of devices such as the keyboard, mouse, or touchscreen, all of which require manual contact. This imposes a serious limitation on medical equipment and applications where sterility is required, such as in the operating room. Studies report that keyboards and mice harbor more bacteria than lavatories. Our project focuses on the application of a hands-free motion detector to surgical procedures. The purpose of the application in this setting is to enable a surgeon to account for the placement of surgical items, so that none is forgotten inside the patient by mistake. The application makes use of a hand motion detector from Leap Motion, Inc. We hope it will solve this issue and, furthermore, introduce the concept of a hands-free controller to the medical world.
Keywords - Human computer interaction (HCI), Leap Motion (Leap), Infra-red (IR), Gesture, Image moments, Retained foreign object (RFO).
1. INTRODUCTION
This project addresses the problem of recognizing hand and finger movement. Using a 3D motion controller and a computer vision method, image moments, we hope to solve the problem of defining hand motion patterns. In practice, we will need to expand the SDK that the manufacturer of the device provides, to enable developers to record and create their own custom-made gestures. With our SDK we will develop an application that we hope will help surgeons in the operating room by keeping track of the instruments and materials they use during surgery, so that nothing is forgotten inside the patient. Our purpose is to introduce the possibility of hands-free control of a computer to the medical world, which is usually very conservative and slow to adopt new technologies.
2. THEORY
2.1. Background and related work
2.1.1. Gesture
Gestures are a method of non-verbal communication, such as the sign language used by the deaf (see Figure 1). A gesture involves movement of body parts (hands, fingers) or of various implements, such as flags, pens, etc.
Figure 1 : Hand gestures used in sign language
2.1.2. The Leap Motion controller
The Leap Motion controller [1] is a computer hardware sensor designed to detect hand and finger movement while requiring no contact or touch. The main components of the device are three IR LEDs for illumination and two monochromatic IR cameras. The device is sensitive to hand and finger movements up to a distance of about 1 meter. The LEDs generate a 3D pattern of dots of IR light, and the cameras capture 200 frames of data per second. This data is sent over USB to a host computer. The device comes with a software development kit (SDK) capable of recognizing a few simple gestures. The main part of our project consists of expanding the SDK to enable users to define their own, more complex gestures. In particular, we plan to use that environment to create applications based on customized gestures to control medical equipment used in sterile conditions, where forms of control that require touch are precluded.
Figure 2 : The inside components of the Leap Motion controller
2.1.3. Other products
Other products in the 3D motion controller domain are the Kinect [2] and the Intel Perceptual Computing Camera [3]. Compared with the Leap Motion device, the Kinect does not offer finger recognition; its main purpose is to capture the whole body, so it is much slower at capturing hand and finger movement, too slow for gesture recognition. Because the Kinect camera captures the surroundings as well, computing time is much longer. Moreover, the Kinect is much larger and requires more space. The Kinect, however, has better range and can capture the surroundings from a distance.
The Intel Perceptual Computing Camera is also available for purchase, although it is rather more expensive than the Leap Motion ($200 for the device and $150 for the Intel SDK). The device is not designed specifically for gesture recognition, it works in visible light (not IR), and it depends on ambient lighting. It supports voice and various other technologies that are immaterial to our application.
2.1.4. The health care industry
Research [4] [5] strongly supports that computer keyboards and other input devices are a source of bacteria and cross-contamination that can lead to hospital-acquired infections. Washable keyboards, mice, TV remotes and mobile products with anti-microbial protection should therefore be put in place, along with proper disinfection protocols, to reduce the risk of infection and cross-contamination.
2.1.5. Surgical inventory count
Surgical instruments are sometimes accidentally left behind in a surgical cavity, causing, in the worst case, severe infection or even death [6]. These events (called retained foreign object, RFO) are considered "never events", that is, preventable medical errors. Unlike other medical errors, RFO errors were declared "touch zero" errors: the goal is to reach zero events, since this type of error is considered easy to prevent [7].
Over the history of surgical procedures a strict surgical inventory count protocol was developed, and it is now obligatory in surgical settings around the world. In most settings two nurses are responsible for counting all surgical sponges, needles and instruments. However, there are some surgical procedures in which the count protocol is not performed on a regular basis. These are "small", "simple" procedures in which no nurse is continuously aiding the surgeon (e.g., episiotomy). Naturally, where no formal count protocol is performed, or no count at all, the chance of a retained surgical sponge rises dramatically [6]. In these procedures the surgeon has to rely on his or her memory to recall how many items were used or how many items were in the set that was opened (for example 10 pads, 2 needles and 3 gauzes). To account for all items, the surgeon actually needs to compare the numbers in memory (e.g., "there were 5 pads in the set and then I opened a package of 5 more, and there were 3 needles in the set") to the number of items found at the end of the procedure. Keeping these numbers in mind for the whole procedure is a considerable burden, and since short-term memory is so vulnerable, there is a good chance that the surgeon will make a mistake.
In the past 10-15 years a few technologies were developed to support the counting protocol; "SURGICOUNT Medical" and "SmartSponge System" are two examples. These systems require hand contact to operate and are controlled mostly by the nurses responsible for the count. Our solution is designed to help surgeons in cases where no nurse is available to write down (or type) the item count. Using the Leap Motion device for this purpose will give the surgeon a means to document the usage of items without the need to write or type anything. It is, so far, the only solution that lets surgeons document anything themselves during surgery. It is expected to dramatically reduce memory load and promote safer care for patients.
Figure 3 : Intra-operative radiograph performed because of an incorrect sponge count in a 54-year-old woman undergoing urethral suspension. The radio-opaque marker (arrow) of a 4 x 4 inch surgical sponge is visible in the pelvis. The sponge was identified.
2.2. Detailed description
2.2.1. Gesture recognition
Gesture recognition refers to the interpretation of human gestures via mathematical algorithms. The IR cameras in the Leap Motion device read the movements of the human hand and communicate the data to a computer, which uses the gesture input to control devices or applications. Using mathematical algorithms the computer can analyze the captured data and categorize it as one of several predefined patterns.
2.2.2. Preprocessing
For the moment we confine our attention to planar gestures: gestures in which the index finger traces a path in a plane. We assume such gestures are easier for humans to reproduce and hope they are also simpler to recognize, while not restricting the repertoire of possible gestures too severely.
Obviously, free hand motion cannot be constrained to be completely planar: planarity will only be approximated. As a first step in interpreting gesture data, we therefore find the plane whose distance from the captured gesture trace is minimal, using Singular Value Decomposition (see 2.2.3).
2.2.3. Singular value decomposition
Singular Value Decomposition (SVD) is a matrix factorization with many useful applications in signal processing and statistics.
One application that serves our purpose is fitting planes and lines by orthogonal distance regression [8]. Say we want to find the plane that is as close as possible to the set of n 3-D points (p_1, ..., p_n) captured by the device; the points are first translated so that their centroid lies at the origin, since the best-fitting plane passes through it.
Let the matrix A of size n x 3 hold one point p_i = (x_i, y_i, z_i) per row:

$$A = \begin{bmatrix} x_1 & y_1 & z_1 \\ \vdots & \vdots & \vdots \\ x_n & y_n & z_n \end{bmatrix}$$

Its transpose is the 3 x n matrix

$$A^T = \begin{bmatrix} x_1 & \cdots & x_n \\ y_1 & \cdots & y_n \\ z_1 & \cdots & z_n \end{bmatrix}$$

Multiplying the two yields the 3 x 3 matrix B:

$$B = A^T \cdot A$$
Solving the eigenvalue equation for matrix B:

$$\det(B - \lambda I) = 0$$

This equation is a polynomial of degree 3, and hence has 3 solutions. Since B is symmetric and positive semi-definite they are real and non-negative, and for data captured in practice they are expected to be positive and distinct:

$$\lambda_1, \lambda_2, \lambda_3 \in \mathbb{R}^+$$

For each value $\lambda_i$ we compute the corresponding eigenvector:

$$(B - \lambda_i I)\,\bar{u}_i = 0$$
The eigenvectors form the columns of the matrix U:

$$U = \begin{bmatrix} u_{1x} & u_{2x} & u_{3x} \\ u_{1y} & u_{2y} & u_{3y} \\ u_{1z} & u_{2z} & u_{3z} \end{bmatrix}$$
The eigenvector $\bar{u}_i$ that belongs to the minimal eigenvalue $\lambda_i$ is the normal of our working plane.
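To make the procedure concrete, here is a minimal sketch in Python with NumPy (the project itself targets C++, so this is an illustration rather than the project's code; the function name fit_plane is ours). numpy.linalg.svd applied to the centered data matrix returns right singular vectors, and the one with the smallest singular value is exactly the smallest eigenvector of B, i.e., the plane normal.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a cloud of 3-D gesture points.

    points: (n, 3) array. Returns (centroid, unit normal).
    """
    centroid = points.mean(axis=0)
    # Center the points so the best-fitting plane passes through the origin.
    A = points - centroid
    # Rows of vt are the right singular vectors of A; the one with the
    # smallest singular value is the eigenvector of B = A^T A with the
    # smallest eigenvalue -- the plane normal.
    _, _, vt = np.linalg.svd(A, full_matrices=False)
    return centroid, vt[-1]
```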
2.2.4. Projecting points onto the new plane
Having found the plane closest to all the given points, we project each point onto that plane to obtain a set of n coplanar points. We use the canonical form of the plane equation to compute the distance of each point $P_i(x_i, y_i, z_i)$ from the plane, then move the point by that distance along the unit normal $\hat{n}$:

$$d_i = \frac{Ax_i + By_i + Cz_i + D}{\sqrt{A^2 + B^2 + C^2}}, \qquad \vec{P_i'} = \vec{P_i} - d_i\,\hat{n}$$
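Continuing the sketch, the projection is the vector form of the two formulas above: the signed distance along the unit normal is subtracted from each point (fit_plane is the helper defined in 2.2.3).

```python
import numpy as np

def project_to_plane(points, centroid, normal):
    """Project points onto the fitted plane: P' = P - d * n."""
    d = (points - centroid) @ normal        # signed distances d_i
    return points - np.outer(d, normal)     # coplanar points P_i'
```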
2.2.5. Reducing dimensions
Now that we have moved all points into one plane, we want to reduce the number of coordinates of each point from 3 to 2. Let $\{P_i(x_i, y_i, z_i)\}$ be that set of points, and let $M$ ($Ax + By + Cz + D = 0$) be the plane. In general, the plane $M$ need not be $M_0$ ($z = 0$), the XY plane, so the $z$ component of the points $P_i$ need not vanish. We therefore construct a Cartesian system on the plane $M$ by choosing two perpendicular lines on $M$, namely $L_X$ and $L_Y$. $L_X$ ($Ax + By + D = 0$) is the intersection line between $M$ and $M_0$. As the origin we choose the point $O(0, -D/B, 0)$ on $L_X$. $L_Y$ is then the line lying in $M$ that passes through the origin $O$ and is perpendicular to $L_X$. The distances $x'_i$ and $y'_i$ of every point $P_i$ from $L_Y$ and $L_X$ respectively will act as the coordinates of the points for further analysis.
Considering Figure 4, we find the distance $d$ on $M_0$ between the projection of $P_i$ on $M_0$ and $L_X$, which by definition also lies on $M_0$. This distance $d$ and the $z$ coordinate of the point $P_i$ form a right triangle whose hypotenuse is the distance of $P_i$ from $L_X$, hence $y'_i$. Defining the point $Q$ as the foot of the perpendicular from $P_i$ to $L_X$, the distance from $Q$ to the origin $O$ is the distance on $M$ from $P_i$ to $L_Y$, and hence $x'_i$.
Figure 4 : Reducing the number of coordinates
Working out the geometry, we get:

$$x_i' = \frac{\left|\,B x_i - A\left(y_i + \tfrac{D}{B}\right)\right|}{\sqrt{A^2 + B^2}}, \qquad
y_i' = \sqrt{\frac{(A x_i + B y_i + D)^2}{A^2 + B^2} + z_i^2}$$
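In code it is convenient to replace the explicit $L_X$/$L_Y$ construction (which assumes $B \neq 0$) with an arbitrary orthonormal basis of the fitted plane; the sketch below does so, and is our simplification rather than the paper's exact construction. Because the moments computed later are invariant to translation and rotation, the particular choice of in-plane basis does not affect recognition.

```python
import numpy as np

def plane_coordinates(points3d, normal):
    """Express coplanar 3-D points as 2-D coordinates on their plane."""
    # Pick a helper axis that is not parallel to the normal.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(normal @ helper) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    e1 = np.cross(normal, helper)
    e1 /= np.linalg.norm(e1)               # first in-plane axis
    e2 = np.cross(normal, e1)              # second in-plane axis
    rel = points3d - points3d.mean(axis=0)
    return np.stack([rel @ e1, rel @ e2], axis=1)
```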
2.2.6. Building the image
Once a planar shape is obtained, we find the bounding rectangle of its points. We form a matrix M with the dimensions of that rectangle and initialize it according to this formula:

$$M(x, y) = \begin{cases} 1 & (x, y) \text{ is a data point} \\ 0 & \text{otherwise} \end{cases}$$

yielding a binary matrix representing the image.
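A minimal rasterization sketch; the paper does not fix a grid resolution, so the 64 x 64 size below is our assumption:

```python
import numpy as np

def rasterize(points2d, size=64):
    """Map 2-D gesture points into a size x size binary image M."""
    mins = points2d.min(axis=0)
    extent = points2d.ptp(axis=0)          # bounding-rectangle dimensions
    extent[extent == 0] = 1.0              # guard against a degenerate axis
    idx = ((points2d - mins) / extent * (size - 1)).astype(int)
    img = np.zeros((size, size))
    img[idx[:, 1], idx[:, 0]] = 1.0        # M(x, y) = 1 at data points
    return img
```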
2.2.7. Image moments
In order to distinguish between patterns we compute image moments [9] [10]. Image moments, each a real number, are weighted averages of the image pixel intensities that capture increasingly detailed information about the pixel distribution, such as centroid, area, and orientation. Central moments are invariant to translation, and certain combinations of them are invariant to rotation as well; we limit our attention to those.
We have used the following mapping function:

$$f(x, y) = \begin{cases} 0 & \text{if the pixel is white} \\ 1 & \text{if the pixel is black} \end{cases}$$

We first calculate the raw moments and the central moments:

$$M_{pq} = \sum_x \sum_y x^p y^q f(x, y)$$

$$\mu_{pq} = \sum_x \sum_y (x - \bar{x})^p (y - \bar{y})^q f(x, y), \quad \text{where } \bar{x} = \frac{M_{10}}{M_{00}} \text{ and } \bar{y} = \frac{M_{01}}{M_{00}}$$
We have chosen to use 9 moments for now, but this number may grow if good recognition turns out to need more. To make the moments scale-invariant we normalize them:

$$\eta_{ij} = \frac{\mu_{ij}}{\mu_{00}^{\,1 + (i + j)/2}}$$
Finally, we want moments that are also invariant under rotation. The best-known such set is Hu's [11] set of invariant moments, also known as the 7 moments of Hu:

$$\phi_1 = \eta_{20} + \eta_{02}$$
$$\phi_2 = (\eta_{20} - \eta_{02})^2 + 4\eta_{11}^2$$
$$\phi_3 = (\eta_{30} - 3\eta_{12})^2 + (3\eta_{21} - \eta_{03})^2$$
$$\phi_4 = (\eta_{30} + \eta_{12})^2 + (\eta_{21} + \eta_{03})^2$$
$$\phi_5 = (\eta_{30} - 3\eta_{12})(\eta_{30} + \eta_{12})[(\eta_{30} + \eta_{12})^2 - 3(\eta_{21} + \eta_{03})^2] + (3\eta_{21} - \eta_{03})(\eta_{21} + \eta_{03})[3(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2]$$
$$\phi_6 = (\eta_{20} - \eta_{02})[(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2] + 4\eta_{11}(\eta_{30} + \eta_{12})(\eta_{21} + \eta_{03})$$
$$\phi_7 = (3\eta_{21} - \eta_{03})(\eta_{30} + \eta_{12})[(\eta_{30} + \eta_{12})^2 - 3(\eta_{21} + \eta_{03})^2] - (\eta_{30} - 3\eta_{12})(\eta_{21} + \eta_{03})[3(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2]$$
The 9 moments of a gesture are taken as the coordinates of the gesture in an abstract 9-dimensional feature space. It is assumed that the representations of similar gestures will congregate into "clouds" whose extent, in the Euclidean metric, is small compared to the distance between cloud centroids. A centroid is calculated and saved for each cloud, and with each insertion of new moments the centroid is updated; the use of this data is described in the next section.
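The centroid update can be done incrementally with a running mean, so the whole cloud need not be stored; a minimal sketch (the function name is ours):

```python
def update_centroid(centroid, count, new_vector):
    """Fold one new moment vector into a cloud centroid.

    Running mean: c_{n+1} = c_n + (x - c_n) / (n + 1).
    """
    return centroid + (new_vector - centroid) / (count + 1), count + 1
```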
2.2.8. Minimum distance algorithm
The minimum distance algorithm is a basic method for comparing the ongoing gesture with the clouds of previously computed moments. The idea is to calculate the distance between the moment vector of the ongoing gesture and each stored centroid; with high probability, the cloud whose centroid is nearest to the ongoing gesture represents the same gesture.
We have considered two distance functions. Let $X, Y$ be vectors in a $k$-dimensional space:

$$\text{Euclidean}: \sqrt{\sum_{i=1}^{k} (X_i - Y_i)^2} \qquad \text{Manhattan}: \sum_{i=1}^{k} |X_i - Y_i|$$

We assume that the number of recorded gestures is finite and fairly small, so we can afford to compute every distance: the process is linear in the number of stored centroids, i.e., O(|S|).
The algorithm:
Let $S = \{c_1, ..., c_n\}$ be the set of centroids of the gestures $\{g_i\}$ in feature space, and let $p$ be the point representing a new gesture to be recognized.
• Compute the distances $d(p, c_i)$.
• Find the minimum distance.
• Identify the new gesture as the gesture $i$ with the minimal distance.
• Return its id.
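A direct sketch of this classifier (the dictionary layout and the metric switch are our choices):

```python
import numpy as np

def classify(p, centroids, metric="euclidean"):
    """Return the id of the stored gesture whose centroid is nearest to p.

    p: moment vector of the ongoing gesture; centroids: dict mapping
    gesture id -> cloud centroid. Runs in O(|S|) distance evaluations.
    """
    def dist(a, b):
        if metric == "manhattan":
            return np.abs(a - b).sum()
        return np.sqrt(((a - b) ** 2).sum())

    return min(centroids, key=lambda gid: dist(p, centroids[gid]))
```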
2.3. Expected results
The final product could replace hand contact in any device that currently requires it, especially for counting procedures in the operating room. This would reduce the number of retained-sponge cases.
Our SDK will give developers around the world the opportunity to implement applications based on custom-made gestures.
3. PRELIMINARY SOFTWARE ENGINEERING DOCUMENTATION
3.1. Requirements (Use Cases)
3.2. GUI
In this section we introduce our UI prototype for the project. The prototype demonstrates the intended usability of the SDK and of the medical application.
3.2.1. SDK expansion
We have divided the toolkit into four steps that together give the user all the functionality needed to create a custom-made gesture. Each step has its own screen and purpose, and a wizard gives the user information about the current step.
Step 1 (Figure 5): The user records the custom-made gesture he or she would like to create.
Step 2 (Figure 6): The user trains the computer to recognize the new gesture by recording additional examples, preferably by different people.
Step 3 (Figure 7): The user tests the system by presenting the gesture to see whether it is recognized.
Step 4: The user gets the information about the new gesture just created.
Each screen has the following components (see the figures below):
A) Wizard – a UI component that presents the user with a sequence of screens leading through a series of well-defined steps.
B) Gesture grid panel – a real-time panel that tracks the movement of the user's finger and displays it.
C) Coordinates table – a table filled with the coordinates of each finger position detected in the current frame.
D) Operation buttons – the four control buttons that can be clicked to perform an action.
E) Status bar – a status line that gives information about the Leap controller connection and about the action being performed.
F) Help button – a control button that offers help to the user.
G) Match rate bar – a bar that presents how closely the gesture the user is performing matches the custom-made gesture saved in the record step (applies to steps 2 and 3).
Figure 5 : Record Step
Figure 6 : Train Step
Figure 7 : Exercise Step
3.2.2. Items counting application
We designed our medical application to be simple and intuitive for the surgeons who will eventually use it. We focused on the episiotomy procedure kit, since it demonstrates the use of our system well.
As can be seen in Figure 8 below, there are 4 components:
A) Item panel – this panel includes three elements: gesture icon, item image and counter.
B) Undo panel – this panel shows the gesture icon the surgeon needs to make in order to undo the last gesture action.
C) More panel – this panel shows the gesture icon the surgeon needs to make in order to add a different item to the surgery.
D) Status bar – a status line that gives information about the Leap controller connection and about the action being performed.
Using these components, the surgeon makes the gesture corresponding to the item introduced into the surgical environment; the counter that belongs to this gesture is incremented, and feedback is shown on the screen.
3.3. Program structure – Architecture, Design (UML diagrams)
3.3.1. Software architecture
The SDK is divided into two main parts. The core of the program, containing all the logic, will be a C++ application. The GUI will be built with Qt or WPF/WinForms and will communicate with the C++ core, which raises an event for each gesture that has been made. The medical application will be written entirely in WPF and will communicate with the SDK.
The following API is an initial prototype of our SDK interface:
Init() – initializes the settings of the device.
SetupListener() – connects to the infrastructure of the system.
CallBackFunc() – an interface the user implements; it is passed to the device as a delegate and invoked when a gesture is recognized.
OnClose() – cleanup performed when the application closes.
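To show how these four entry points might fit together, here is a hypothetical skeleton (in Python for brevity; the real SDK core is C++, and the class name, signatures and bodies here are illustrative only, not the project's API):

```python
class GestureSDK:
    """Hypothetical wrapper mirroring the prototype above."""

    def init(self):
        # Init(): initialize device settings and internal state.
        self.callback = None

    def setup_listener(self, callback):
        # SetupListener(): connect to the device infrastructure and
        # register the user's delegate (CallBackFunc).
        self.callback = callback

    def _on_gesture(self, gesture_id):
        # Invoked internally when the recognizer matches a gesture;
        # forwards the id to the user's delegate.
        if self.callback is not None:
            self.callback(gesture_id)

    def on_close(self):
        # OnClose(): release the device when the application closes.
        self.callback = None
```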
Figure 8 : Instruments counting application
SDK Class Diagram
Medical Counter Class Diagram
3.4. Testing plan
3.4.1. Testing plan for the SDK

| Test name | Scenario | Expected result |
|---|---|---|
| Record input data | We record input from the device | A file with the recorded data appears in the DB |
| Call SVD function over input data file | The SVD function gets the raw data from the file | The function returns the normal of the closest plane |
| Call image moments function | Image moments are extracted from the bitmap matrix | All moments are recorded to the DB |
| Record and test 3 different gestures, repeating with different people | We record 3 different gestures and test them | The system distinguishes between the 3 different gestures |

3.4.2. Testing plan for the medical application

| Test name | Scenario | Expected result |
|---|---|---|
| Open application | The user starts the application | The application opens on the main screen with no errors |
| User imitates gesture shown on the screen | The user follows the gesture shape | The counter of the instrument is increased and the shape is highlighted |
| The user chooses to undo the action | The user imitates the undo gesture | The last counter is decreased |
| Open surgery log | The user presses the log button | A log of the surgeon's actions is shown, e.g., "you made X, pad 4x4 was incremented by 1" |
| Open surgery inventory report | The user clicks the inventory report button | A report of all the required equipment is shown, e.g., "two pairs of scalpels" |
REFERENCES
[1] Leap Motion Inc., "Leap Motion Specs," 2013. [Online]. Available: https://www.leapmotion.com/product.
[2] "Kinect," Wikipedia. [Online]. Available: http://en.wikipedia.org/wiki/Kinect.
[3] Intel Corp., "Intel Developer Guide for Perceptual Computing." [Online]. Available: http://software.intel.com/en-us/vcsource/tools/perceptual-computing-sdk.
[4] A. K. Al-Ghamdi, S. M. A. Abdelmalek, A. M. Ashshi, H. Faidah, H. Shukri and A. A. Jiman-Fatani, "Bacterial contamination of computer keyboards and mice, elevators and shopping carts," African Journal of Microbiology Research, no. 5(23), 2011.
[5] ABC News Medical Unit, "Your Keyboard: Dirtier Than a Toilet," 5 May 2008. [Online]. Available: http://abcnews.go.com/Health/Germs/story?id=4774746.
[6] A. A. Gawande et al., "Risk Factors for Retained Instruments," The New England Journal of Medicine, 2003.
[7] C. W. Kaiser et al., "The Retained Surgical Sponge," Annals of Surgery, vol. 224.
[8] "Singular value decomposition," Wikipedia. [Online]. Available: http://en.wikipedia.org/wiki/Singular_value_decomposition.
[9] J. Flusser, "On the independence of rotation moment invariants," Pattern Recognition, no. 33, 1999.
[10] J. Flusser and T. Suk, "Rotation Moment Invariants for Recognition of Symmetric Objects," IEEE Transactions on Image Processing, vol. 15, 2006.
[11] Z. Huang and J. Leng, "Analysis of Hu's Moment Invariants on Image Scaling and Rotation," ECU Publications, 2011.
13

More Related Content

What's hot

Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
IJMER
Β 
Niknewppt
NiknewpptNiknewppt
Niknewppt
Nikith Kumar Reddy
Β 
Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...
NidhinRaj Saikripa
Β 
Nikppt
NikpptNikppt
IRJET- Convenience Improvement for Graphical Interface using Gesture Dete...
IRJET-  	  Convenience Improvement for Graphical Interface using Gesture Dete...IRJET-  	  Convenience Improvement for Graphical Interface using Gesture Dete...
IRJET- Convenience Improvement for Graphical Interface using Gesture Dete...
IRJET Journal
Β 
Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...
NidhinRaj Saikripa
Β 
Gesture recognition
Gesture recognitionGesture recognition
Gesture recognition
Mariya Khan
Β 
Gesture Based Interface Using Motion and Image Comparison
Gesture Based Interface Using Motion and Image ComparisonGesture Based Interface Using Motion and Image Comparison
Gesture Based Interface Using Motion and Image Comparison
ijait
Β 
Hand Gesture Recognition System for Human-Computer Interaction with Web-Cam
Hand Gesture Recognition System for Human-Computer Interaction with Web-CamHand Gesture Recognition System for Human-Computer Interaction with Web-Cam
Hand Gesture Recognition System for Human-Computer Interaction with Web-Cam
ijsrd.com
Β 
IRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET- Survey Paper on Vision based Hand Gesture RecognitionIRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET Journal
Β 
gesture-recognition
gesture-recognitiongesture-recognition
gesture-recognition
Venkat RAGHAVENDRA REDDY
Β 
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Kalle
Β 
human activity recognization using machine learning with data analysis
human activity recognization using machine learning with data analysishuman activity recognization using machine learning with data analysis
human activity recognization using machine learning with data analysis
Venkat Projects
Β 
Augmented Reality for Robotic Surgical Dissection - Final Report
Augmented Reality for Robotic Surgical Dissection - Final ReportAugmented Reality for Robotic Surgical Dissection - Final Report
Augmented Reality for Robotic Surgical Dissection - Final Report
Milind Soman
Β 
Human activity recognition
Human activity recognitionHuman activity recognition
Human activity recognition
Randhir Gupta
Β 
Blue eyes technology New Version 2017
Blue eyes technology New Version 2017Blue eyes technology New Version 2017
Blue eyes technology New Version 2017
Ajith Kumar Ravi
Β 
Vertical Fragmentation of Location Information to Enable Location Privacy in ...
Vertical Fragmentation of Location Information to Enable Location Privacy in ...Vertical Fragmentation of Location Information to Enable Location Privacy in ...
Vertical Fragmentation of Location Information to Enable Location Privacy in ...
ijasa
Β 
Mouse Simulation Using Two Coloured Tapes
Mouse Simulation Using Two Coloured Tapes Mouse Simulation Using Two Coloured Tapes
Mouse Simulation Using Two Coloured Tapes
ijistjournal
Β 
Hand Segmentation for Hand Gesture Recognition
Hand Segmentation for Hand Gesture RecognitionHand Segmentation for Hand Gesture Recognition
Hand Segmentation for Hand Gesture Recognition
AM Publications,India
Β 
Introduction
IntroductionIntroduction
Introduction
raminenihemu418
Β 

What's hot (20)

Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
Β 
Niknewppt
NiknewpptNiknewppt
Niknewppt
Β 
Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...
Β 
Nikppt
NikpptNikppt
Nikppt
Β 
IRJET- Convenience Improvement for Graphical Interface using Gesture Dete...
IRJET-  	  Convenience Improvement for Graphical Interface using Gesture Dete...IRJET-  	  Convenience Improvement for Graphical Interface using Gesture Dete...
IRJET- Convenience Improvement for Graphical Interface using Gesture Dete...
Β 
Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...
Β 
Gesture recognition
Gesture recognitionGesture recognition
Gesture recognition
Β 
Gesture Based Interface Using Motion and Image Comparison
Gesture Based Interface Using Motion and Image ComparisonGesture Based Interface Using Motion and Image Comparison
Gesture Based Interface Using Motion and Image Comparison
Β 
Hand Gesture Recognition System for Human-Computer Interaction with Web-Cam
Hand Gesture Recognition System for Human-Computer Interaction with Web-CamHand Gesture Recognition System for Human-Computer Interaction with Web-Cam
Hand Gesture Recognition System for Human-Computer Interaction with Web-Cam
Β 
IRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET- Survey Paper on Vision based Hand Gesture RecognitionIRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET- Survey Paper on Vision based Hand Gesture Recognition
Β 
gesture-recognition
gesture-recognitiongesture-recognition
gesture-recognition
Β 
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Β 
human activity recognization using machine learning with data analysis
human activity recognization using machine learning with data analysishuman activity recognization using machine learning with data analysis
human activity recognization using machine learning with data analysis
Β 
Augmented Reality for Robotic Surgical Dissection - Final Report
Augmented Reality for Robotic Surgical Dissection - Final ReportAugmented Reality for Robotic Surgical Dissection - Final Report
Augmented Reality for Robotic Surgical Dissection - Final Report
Β 
Human activity recognition
Human activity recognitionHuman activity recognition
Human activity recognition
Β 
Blue eyes technology New Version 2017
Blue eyes technology New Version 2017Blue eyes technology New Version 2017
Blue eyes technology New Version 2017
Β 
Vertical Fragmentation of Location Information to Enable Location Privacy in ...
Vertical Fragmentation of Location Information to Enable Location Privacy in ...Vertical Fragmentation of Location Information to Enable Location Privacy in ...
Vertical Fragmentation of Location Information to Enable Location Privacy in ...
Β 
Mouse Simulation Using Two Coloured Tapes
Mouse Simulation Using Two Coloured Tapes Mouse Simulation Using Two Coloured Tapes
Mouse Simulation Using Two Coloured Tapes
Β 
Hand Segmentation for Hand Gesture Recognition
Hand Segmentation for Hand Gesture RecognitionHand Segmentation for Hand Gesture Recognition
Hand Segmentation for Hand Gesture Recognition
Β 
Introduction
IntroductionIntroduction
Introduction
Β 

Viewers also liked

II Jornadas I+D+i Promalaga - Rodolfo Carpintier - DAD
II Jornadas I+D+i Promalaga - Rodolfo Carpintier - DADII Jornadas I+D+i Promalaga - Rodolfo Carpintier - DAD
II Jornadas I+D+i Promalaga - Rodolfo Carpintier - DAD
PromΓ‘laga
Β 
IUL English language Units (1-4)
IUL English language Units (1-4)IUL English language Units (1-4)
IUL English language Units (1-4)
Maurizio Naso
Β 
Home medical
Home medicalHome medical
Home medical
Modupe Sarratt
Β 
Cohen & co Trasporti e sviluppo territoriale
Cohen & co Trasporti e sviluppo territorialeCohen & co Trasporti e sviluppo territoriale
Cohen & co Trasporti e sviluppo territoriale
Marco Percoco
Β 
Resume PDF
Resume PDFResume PDF
Nurgivo Alfajri (11353103824)
Nurgivo Alfajri (11353103824)Nurgivo Alfajri (11353103824)
Nurgivo Alfajri (11353103824)
Nurgivo Alfajri
Β 
mm Bagali...... start up...... just go for that.........Start up....the taste...
mm Bagali...... start up...... just go for that.........Start up....the taste...mm Bagali...... start up...... just go for that.........Start up....the taste...
mm Bagali...... start up...... just go for that.........Start up....the taste...
dr m m bagali, phd in hr
Β 
Electrical Survey Document
Electrical Survey DocumentElectrical Survey Document
Electrical Survey Document
Darren Pfau
Β 
Habeck Resume 021715
Habeck Resume 021715Habeck Resume 021715
Habeck Resume 021715
Bill Habeck
Β 
Prezentacja siewierz
Prezentacja siewierzPrezentacja siewierz
Prezentacja siewierz
Magdalena Skula
Β 
The Status of Soil Resources in Mozambique, Jacinto Mirione Mafalacusser
The Status of Soil Resources in Mozambique, Jacinto Mirione MafalacusserThe Status of Soil Resources in Mozambique, Jacinto Mirione Mafalacusser
The Status of Soil Resources in Mozambique, Jacinto Mirione Mafalacusser
FAO
Β 
Day 2: Innovation in parliaments #2, Mr. Xavier Armendariz, ICT coordinator, ...
Day 2: Innovation in parliaments #2, Mr. Xavier Armendariz, ICT coordinator, ...Day 2: Innovation in parliaments #2, Mr. Xavier Armendariz, ICT coordinator, ...
Day 2: Innovation in parliaments #2, Mr. Xavier Armendariz, ICT coordinator, ...
wepc2016
Β 
CV
CVCV
Voluntariado SMM
Voluntariado SMMVoluntariado SMM
Voluntariado SMM
mixiricas
Β 
Achieve3000
Achieve3000Achieve3000
Achieve3000
Gabriel Castro
Β 
Xogos tradicionais de Galicia
Xogos tradicionais de GaliciaXogos tradicionais de Galicia
Xogos tradicionais de Galicia
mixiricas
Β 
JuliÑn López YÑñez: Cómo liderar el cambio en instituciones educativas
JuliÑn López YÑñez: Cómo liderar el cambio en instituciones educativasJuliÑn López YÑñez: Cómo liderar el cambio en instituciones educativas
JuliÑn López YÑñez: Cómo liderar el cambio en instituciones educativas
EncuentroEducacion
Β 
Atlantic 2.0 2013-2016 : VISION - STRATEGIE - MISE EN OEUVRE
Atlantic 2.0 2013-2016 : VISION - STRATEGIE - MISE EN OEUVREAtlantic 2.0 2013-2016 : VISION - STRATEGIE - MISE EN OEUVRE
Atlantic 2.0 2013-2016 : VISION - STRATEGIE - MISE EN OEUVRE
Atlantic 2.0
Β 
Software Libero e LibreOffice
Software Libero e LibreOfficeSoftware Libero e LibreOffice
Software Libero e LibreOffice
LibreItalia
Β 
O paxaro bobo
O paxaro boboO paxaro bobo

Viewers also liked (20)

II Jornadas I+D+i Promalaga - Rodolfo Carpintier - DAD
II Jornadas I+D+i Promalaga - Rodolfo Carpintier - DADII Jornadas I+D+i Promalaga - Rodolfo Carpintier - DAD
II Jornadas I+D+i Promalaga - Rodolfo Carpintier - DAD
Β 
IUL English language Units (1-4)
IUL English language Units (1-4)IUL English language Units (1-4)
IUL English language Units (1-4)
Β 
Home medical
Home medicalHome medical
Home medical
Β 
Cohen & co Trasporti e sviluppo territoriale
Cohen & co Trasporti e sviluppo territorialeCohen & co Trasporti e sviluppo territoriale
Cohen & co Trasporti e sviluppo territoriale
Β 
Resume PDF
Resume PDFResume PDF
Resume PDF
Β 
Nurgivo Alfajri (11353103824)
Nurgivo Alfajri (11353103824)Nurgivo Alfajri (11353103824)
Nurgivo Alfajri (11353103824)
Β 
mm Bagali...... start up...... just go for that.........Start up....the taste...
mm Bagali...... start up...... just go for that.........Start up....the taste...mm Bagali...... start up...... just go for that.........Start up....the taste...
mm Bagali...... start up...... just go for that.........Start up....the taste...
Β 
Electrical Survey Document
Electrical Survey DocumentElectrical Survey Document
Electrical Survey Document
Β 
Habeck Resume 021715
Habeck Resume 021715Habeck Resume 021715
Habeck Resume 021715
Β 
Prezentacja siewierz
Prezentacja siewierzPrezentacja siewierz
Prezentacja siewierz
Β 
The Status of Soil Resources in Mozambique, Jacinto Mirione Mafalacusser
The Status of Soil Resources in Mozambique, Jacinto Mirione MafalacusserThe Status of Soil Resources in Mozambique, Jacinto Mirione Mafalacusser
The Status of Soil Resources in Mozambique, Jacinto Mirione Mafalacusser
Β 
Day 2: Innovation in parliaments #2, Mr. Xavier Armendariz, ICT coordinator, ...
Day 2: Innovation in parliaments #2, Mr. Xavier Armendariz, ICT coordinator, ...Day 2: Innovation in parliaments #2, Mr. Xavier Armendariz, ICT coordinator, ...
Day 2: Innovation in parliaments #2, Mr. Xavier Armendariz, ICT coordinator, ...
Β 
CV
CVCV
CV
Β 
Voluntariado SMM
Voluntariado SMMVoluntariado SMM
Voluntariado SMM
Β 
Achieve3000
Achieve3000Achieve3000
Achieve3000
Β 
Xogos tradicionais de Galicia
Xogos tradicionais de GaliciaXogos tradicionais de Galicia
Xogos tradicionais de Galicia
Β 
JuliÑn López YÑñez: Cómo liderar el cambio en instituciones educativas
JuliÑn López YÑñez: Cómo liderar el cambio en instituciones educativasJuliÑn López YÑñez: Cómo liderar el cambio en instituciones educativas
JuliÑn López YÑñez: Cómo liderar el cambio en instituciones educativas
Β 
Atlantic 2.0 2013-2016 : VISION - STRATEGIE - MISE EN OEUVRE
Atlantic 2.0 2013-2016 : VISION - STRATEGIE - MISE EN OEUVREAtlantic 2.0 2013-2016 : VISION - STRATEGIE - MISE EN OEUVRE
Atlantic 2.0 2013-2016 : VISION - STRATEGIE - MISE EN OEUVRE
Β 
Software Libero e LibreOffice
Software Libero e LibreOfficeSoftware Libero e LibreOffice
Software Libero e LibreOffice
Β 
O paxaro bobo
O paxaro boboO paxaro bobo
O paxaro bobo
Β 

Similar to Medical Handsfree System - Project Paper

Virtual surgery [new].ppt
Virtual surgery [new].pptVirtual surgery [new].ppt
Virtual surgery [new].ppt
Sreeraj Rajendran
Β 
Media Control Using Hand Gesture Moments
Media Control Using Hand Gesture MomentsMedia Control Using Hand Gesture Moments
Media Control Using Hand Gesture Moments
IRJET Journal
Β 
Real time human-computer interaction
Real time human-computer interactionReal time human-computer interaction
Real time human-computer interaction
ijfcstjournal
Β 
Virtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand GesturesVirtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand Gestures
IRJET Journal
Β 
FUSION OF GAIT AND FINGERPRINT FOR USER AUTHENTICATION ON MOBILE DEVICES
FUSION OF GAIT AND FINGERPRINT FOR USER AUTHENTICATION ON MOBILE DEVICESFUSION OF GAIT AND FINGERPRINT FOR USER AUTHENTICATION ON MOBILE DEVICES
FUSION OF GAIT AND FINGERPRINT FOR USER AUTHENTICATION ON MOBILE DEVICES
vasim hasina
Β 
Design a System for Hand Gesture Recognition with Neural Network
Design a System for Hand Gesture Recognition with Neural NetworkDesign a System for Hand Gesture Recognition with Neural Network
Design a System for Hand Gesture Recognition with Neural Network
IRJET Journal
Β 
Hand gesture recognition using ultrasonic sensor and atmega128 microcontroller
Hand gesture recognition using ultrasonic sensor and atmega128 microcontrollerHand gesture recognition using ultrasonic sensor and atmega128 microcontroller
Hand gesture recognition using ultrasonic sensor and atmega128 microcontroller
eSAT Publishing House
Β 
40120140503005 2
40120140503005 240120140503005 2
40120140503005 2
IAEME Publication
Β 
Gestures Based Sign Interpretation System using Hand Glove
Gestures Based Sign Interpretation System using Hand GloveGestures Based Sign Interpretation System using Hand Glove
Gestures Based Sign Interpretation System using Hand Glove
IRJET Journal
Β 
Design and Development ofa Mirror Effect Control Prosthetic Hand with Force S...
Design and Development ofa Mirror Effect Control Prosthetic Hand with Force S...Design and Development ofa Mirror Effect Control Prosthetic Hand with Force S...
Design and Development ofa Mirror Effect Control Prosthetic Hand with Force S...
TELKOMNIKA JOURNAL
Β 
IRJET= Air Writing: Gesture Recognition using Ultrasound Sensors and Grid-Eye...
IRJET= Air Writing: Gesture Recognition using Ultrasound Sensors and Grid-Eye...IRJET= Air Writing: Gesture Recognition using Ultrasound Sensors and Grid-Eye...
IRJET= Air Writing: Gesture Recognition using Ultrasound Sensors and Grid-Eye...
IRJET Journal
Β 
virtual surgery
virtual surgeryvirtual surgery
virtual surgery
Makka Vasu
Β 
using the Leap.pdf
using the Leap.pdfusing the Leap.pdf
using the Leap.pdf
ThontadharyaThontadh
Β 
Controlling Computer using Hand Gestures
Controlling Computer using Hand GesturesControlling Computer using Hand Gestures
Controlling Computer using Hand Gestures
IRJET Journal
Β 
14 561
14 56114 561
14 561
Chaitanya Ram
Β 
Gesture recognition for computers
Gesture recognition for computersGesture recognition for computers
Gesture recognition for computers
jaimin_m_raval
Β 
Mouse Simulation Using Two Coloured Tapes
Mouse Simulation Using Two Coloured TapesMouse Simulation Using Two Coloured Tapes
Mouse Simulation Using Two Coloured Tapes
ijistjournal
Β 
A Survey Paper on Controlling Computer using Hand Gestures
A Survey Paper on Controlling Computer using Hand GesturesA Survey Paper on Controlling Computer using Hand Gestures
A Survey Paper on Controlling Computer using Hand Gestures
IRJET Journal
Β 
C0321012018
C0321012018C0321012018
C0321012018
theijes
Β 
Smartphone based wearable sensors for cyborgs using neural network engine
Smartphone based wearable sensors for cyborgs using neural network engineSmartphone based wearable sensors for cyborgs using neural network engine
Smartphone based wearable sensors for cyborgs using neural network engine
Dayana Benny
Β 

Similar to Medical Handsfree System - Project Paper (20)

Virtual surgery [new].ppt
Virtual surgery [new].pptVirtual surgery [new].ppt
Virtual surgery [new].ppt
Β 
Media Control Using Hand Gesture Moments
Media Control Using Hand Gesture MomentsMedia Control Using Hand Gesture Moments
Media Control Using Hand Gesture Moments
Β 
Real time human-computer interaction
Real time human-computer interactionReal time human-computer interaction
Real time human-computer interaction
Β 
Virtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand GesturesVirtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand Gestures
Β 
FUSION OF GAIT AND FINGERPRINT FOR USER AUTHENTICATION ON MOBILE DEVICES
FUSION OF GAIT AND FINGERPRINT FOR USER AUTHENTICATION ON MOBILE DEVICESFUSION OF GAIT AND FINGERPRINT FOR USER AUTHENTICATION ON MOBILE DEVICES
FUSION OF GAIT AND FINGERPRINT FOR USER AUTHENTICATION ON MOBILE DEVICES
Β 
Design a System for Hand Gesture Recognition with Neural Network
Design a System for Hand Gesture Recognition with Neural NetworkDesign a System for Hand Gesture Recognition with Neural Network
Design a System for Hand Gesture Recognition with Neural Network
Β 
Hand gesture recognition using ultrasonic sensor and atmega128 microcontroller
Hand gesture recognition using ultrasonic sensor and atmega128 microcontrollerHand gesture recognition using ultrasonic sensor and atmega128 microcontroller
Hand gesture recognition using ultrasonic sensor and atmega128 microcontroller
Β 
40120140503005 2
40120140503005 240120140503005 2
40120140503005 2
Β 
Gestures Based Sign Interpretation System using Hand Glove
Gestures Based Sign Interpretation System using Hand GloveGestures Based Sign Interpretation System using Hand Glove
Gestures Based Sign Interpretation System using Hand Glove
Β 
Design and Development ofa Mirror Effect Control Prosthetic Hand with Force S...
Design and Development ofa Mirror Effect Control Prosthetic Hand with Force S...Design and Development ofa Mirror Effect Control Prosthetic Hand with Force S...
Design and Development ofa Mirror Effect Control Prosthetic Hand with Force S...
Β 
IRJET= Air Writing: Gesture Recognition using Ultrasound Sensors and Grid-Eye...
IRJET= Air Writing: Gesture Recognition using Ultrasound Sensors and Grid-Eye...IRJET= Air Writing: Gesture Recognition using Ultrasound Sensors and Grid-Eye...
IRJET= Air Writing: Gesture Recognition using Ultrasound Sensors and Grid-Eye...
Β 
virtual surgery
virtual surgeryvirtual surgery
virtual surgery
Β 
using the Leap.pdf
using the Leap.pdfusing the Leap.pdf
using the Leap.pdf
Β 
Controlling Computer using Hand Gestures
Controlling Computer using Hand GesturesControlling Computer using Hand Gestures
Controlling Computer using Hand Gestures
Β 
14 561
14 56114 561
14 561
Β 
Gesture recognition for computers
Gesture recognition for computersGesture recognition for computers
Gesture recognition for computers
Β 
Mouse Simulation Using Two Coloured Tapes
Mouse Simulation Using Two Coloured TapesMouse Simulation Using Two Coloured Tapes
Mouse Simulation Using Two Coloured Tapes
Β 
A Survey Paper on Controlling Computer using Hand Gestures
A Survey Paper on Controlling Computer using Hand GesturesA Survey Paper on Controlling Computer using Hand Gestures
A Survey Paper on Controlling Computer using Hand Gestures
Β 
C0321012018
C0321012018C0321012018
C0321012018
Β 
Smartphone based wearable sensors for cyborgs using neural network engine
Smartphone based wearable sensors for cyborgs using neural network engineSmartphone based wearable sensors for cyborgs using neural network engine
Smartphone based wearable sensors for cyborgs using neural network engine
Β 

Medical Handsfree System - Project Paper

  • 1. 1 Medical Hands-Free System Authors Avraham Levi, Guy Peleg Supervisors Ron Sivan, PhD , Yael Einav, PhD Abstract - Contemporary Human Computer Interaction (HCI) is based on usage of multiple devices such as keyboard, mouse or touchscreen, all of which require manual contact. This imposes a serious limitation when it comes to deal some medical equipment and applications where sterility is required, such as in the operating room. Statistical reports show that keyboard and mice contain more bacteria than lavatories. We focus in our project on the application of a hands free motion detector in surgical procedures. The purpose of the application in this setting is to enable a surgeon to account for the placement of surgical items to prevent forgetting any inside the patient by mistake. The application makes use of a hand motion detector from Leap Motion, Inc. We hope it will give us a good opportunity to solve this issue, and, furthermore, introduce the concept of hands free controller to the medical world. Keywords - Human computer interaction (HCI), Leap motion (Leap) , Infra-red (IR) ,Gesture, Image moments, Retained foreign object (RFO). 1. INTRODUCTION This project addresses the issue of recognizing hand and finger movement. Using a 3D motion controller, and utilizing computer vision method which is image moments. we hope to solve the issue of defining the hand motion patterns. In practice, we will need to expand the SDK the manufacturer of the device provides, to enable developers to record and create their own custom made gestures with our SDK we will develop an application that we hope will help surgeons in the operating room by assisting them in keeping track of instruments and materials they use during surgery so they wouldn't forget anything inside the patient. Our purpose is to introduce the possibility of hands-free control of a computer to the medical world, which is usually very conservative and is slow in adopting new technologies. 2. THEORY 2.1. Background and related work 2.1.1. Gesture Gestures are a method of communication using only hand movement, such as sign language used by the deaf (See Figure 1). A gesture includes movement of body parts (hands, fingers) or various implements, such as flags, pens, etc. Figure 1 : Hand gestures used in sign language
  • 2. 2 2.1.2.The leap motion controller The Leap Motion controller [1] is a computer hardware sensor designed to detect hand and finger movement requiring no contact or touch. The main components of the device are three IR LEDs for illumination and two monochromatic IR cameras. The device is sensitive to hand and finger movements to a distance of about 1 meter. The LEDs generate a 3d pattern of dots of IR light, and the cameras generate 200 frames per second of captured data. This data is sent via USB to a host computer. The device comes with a development kit (SDK) capable of recognizing a few simple gestures. The main part of our project will consist of expanding the SDK for enabling users to define their own, more complex gestures. In particular, we plan to use that environment to create applications based on customized gestures to control medical equipment used in sterile conditions, where forms of control which require touch are precluded. Figure 2 : The inside components of the Leap Motion controller 2.1.3.Other products Other products in the 3D motion controller domain are Kinect [2] and Intel Perception Camera [3] . Compared with the Leap Motion device, the Kinect does not come with finger recognition and its main purpose is to capture the whole body, therefore it is much slower in capturing hand and fingers movement, too slow for gesture recognition. Because the Kinect camera captures the soundings as well, computing time is much longer. Moreover, the Kinect is much larger and requires more space. The Kinect, however, has better range and can capture the surroundings from a distance. Another comparable device is the Intel Perceptual Computing Camera. However, It is available for purchase, although it rather more expensive than the Leap Motion ($200 for the device and $150 for Intel SDK). The device is not designed specifically for gesture recognition, it works in visible light (not IR), and depends on ambient lighting. It supports voice and various other technologies that are immaterial to our application. 2.1.4. The health care industry Research [4] [5] strongly supports that computer keyboards and other input devices are source of bacteria and cross-contamination that can lead to hospital acquired infections. Therefore washable keyboards, mice, TV remotes and mobile products with anti- microbial product protection should be put in place along with proper disinfection protocols to reduce the risk of infection and cross-contamination. 2.1.5. Surgical inventory count Oftentimes surgical instruments may accidentally be left behind in a surgical cavity causing, in the worst case, severe infection or even death [6]. These types of events (called retained foreign object – RFO) are considered as "never events" , namely – they are preventable medical errors. Unlike other medical errors, RFO errors were declared as "touch zero" errors – namely, the goal is to reach zero events, since this type of error is considered easy to prevent [7].
  • 3. 3 Over the history of surgical procedures a strict surgical inventory count protocol was developed and is now obligatory in all surgical settings around the world. Two nurses are responsible for counting all surgical sponges, needles and instruments in most surgical settings. However, there are some surgical procedures in which the count protocol is not performed on a regular basis. These are "small", "simple" procedures in which there is no nurse continuously aiding the surgeon (e.g., episiotomy). Naturally, where there is no formal count protocol performed or no count at all, the chances for retained surgical sponge rises dramatically [6] . In these procedures the surgeon has to rely on his or her memory to recall how many items were used or how many items were in the set that was opened (for example 10 pads, 2 needles and 3 gauzes). To account for all items the surgeon actually needs to compare the number in his memory (e.g., "there were 5 pads in the set and then I opened a package of 5 more, and there were 3 needles in the set.") To the number of items found at the end of the procedure. However, to keep these numbers in mind for the whole procedure is much of a burden and since short term memory is so vulnerable, there is a good chance that the surgeon may make a mistake. In the past 10-15 years there were few technological developed to support the counting protocol: "SURGICOUNT Medical" and "SmartSponge System" are two examples. These systems require hand contact to operate and are controlled mostly by the nurses responsible for the count. Our solution is designed to help surgeons in cases where there is no nurse available to write down (or type) items' count. Using the Leap Motion device for this purpose will provide the surgeon means to document the usage of items without the need to write down or type anything. This solution is the only solution (until now) that surgeons have to document anything themselves during surgery. It is expected to dramatically reduce memory load and promote a safer care for patients. Figure 3 : Intra-operative radiograph performed because of an incorrect sponge count in a 54 year old woman undergoing urethral suspension. The radio-opaque marker (arrow) of a 4 x 4 inch surgical sponge is visible in the pelvis. The sponge was identified. 2.2. Detailed description 2.2.1. Gesture recognition Gesture recognition refers to interpretation of human gestures via mathematical algorithms. The IR cameras in the Leap Motion device reads the movements of the human hand and communicates the data to a computer that uses the gesture input to control devices or applications. Using mathematical algorithms the computer can analyze the captured data and categorize it to one of several predefined patterns.
  • 4. 4 2.2.2. Preprocessing For the moment we confine our attention to planar gestures – gestures in which the index finger traces a path in a plane. We assume it will be easier for humans to reproduce and hope it will also be simpler to recognize, while not restricting the repertoire of possible gestures too severely. It is obvious that any free hand motion cannot be constrained to be completely planar: planarity will be only approximated. We therefore find, as a first step in interpreting gesture data, the plane whose distance from the captured gesture trace is minimal, using Singular Vector Decomposition (see 2.2.3). 2.2.3. Singular vector decomposition Singular Vector Decomposition (SVD) is a factorization method with many useful applications in signal processing and statistics. One useful application that can be used for our purpose is fitting planes and lines by orthogonal distance regression [8] .Say we want to find the plane that are as close as possible to set of 𝑛 3-D points (𝑝1, … , 𝑝 𝑛) which we captured by the device. Let the matrix A of size n x 3 holds the 𝑝𝑖 = (π‘₯𝑖, 𝑦𝑖, 𝑧𝑖) in each row : 𝐴 = [ π‘₯1 𝑦1 𝑧1 : : : π‘₯ 𝑛 𝑦𝑛 𝑧 𝑛 ] Using the Transpose Matrix operation we create the 3 x n AT : 𝐴 𝑇 = [ π‘₯1 … π‘₯ 𝑛 𝑦1 … 𝑦𝑛 𝑧1 … 𝑧 𝑛 ] Building matrix B by Matrix Multiplication between A and AT yielding a 3 x 3 matrix : 𝐡 = 𝐴 βˆ™ 𝐴 𝑇 Solving the eigenvalue equation for matrix B: det(𝐡 βˆ’ πœ† βˆ™ 𝐼 ) = 0 This equation is a polynomial of degree 3, and hence has 3 solutions. Under the conditions of the problem at hand, they are expected to be real, positive and distinct. Ξ»1, Ξ»2, Ξ»3 ∈ ℝ+ For each of the Ξ» values we compute the corresponding eigenvector: (𝐡 βˆ’ πœ†π‘– βˆ™ 𝐼 )𝑒̅ 𝑖 = 0 Using the eigenvalues we have calculated, we need to create the U matrix that built from each of the eigenvectors. U = [ 𝑒1π‘₯ 𝑒1𝑦 𝑒1𝑧 𝑒2π‘₯ 𝑒2𝑦 𝑒2𝑧 𝑒3π‘₯ 𝑒3𝑦 𝑒3𝑧 ] ⏟ u1 u2 u3
  • 5. 5 The eigenvector that belongs to the minimal value πœ†π‘– is the normal for our working plane. 2.2.4. Projecting point on the new plane After we have found the closest plane to all the given points, we project each point onto the plane to be given set of N coplanar. We use the canonical form of the plane equation to compute the distance of each point 𝑃𝑖(π‘₯𝑖, 𝑦𝑖, 𝑧𝑖)from the plane: 𝐴π‘₯𝑖 + 𝐡𝑦𝑖 + 𝐢𝑧𝑖 + 𝐷 √𝐴2 + 𝐡2 + 𝐢2 = 𝑑𝑖 𝑃𝑖 βƒ—βƒ— βˆ’ 𝑑𝑖 𝑛⃗ = 𝑃𝑖′⃗⃗⃗⃗ 2.2.5. Reducing dimensions Now that we have moved all points into one plane, we want to reduce the number of coordinates of each point from 3 to 2. Let {𝑃𝑖(π‘₯𝑖, 𝑦𝑖, 𝑧𝑖)} be that set of points, and let 𝑀(𝐴π‘₯ + 𝐡𝑦 + 𝐢𝑧 + 𝐷 = 0) be that plane. In general, the plane 𝑀 need not be 𝑀0(𝑧 = 0), the XY plane, and therefore the 𝑧 component of the points 𝑃𝑖 need not vanish. We therefore construct a Cartesian system on plane 𝑀 by choosing two perpendicular lines on 𝑀, namely LX Let and LY. 𝐿 𝑋(𝐴π‘₯ + 𝐡𝑦 + 𝐷 = 0) is the intersection line between 𝑀 and 𝑀0. As the origin we choose the point 𝑂 (0, βˆ’ 𝐷 𝐡 , 0) on 𝐿 𝑋. 𝐿 π‘Œ will then be the line lying in 𝑀0 that passes through the origin 𝑂 and is perpendicular to 𝐿 𝑋. The distances π‘₯′𝑖 and 𝑦′𝑖 of every point 𝑃𝑖 from 𝐿 𝑋 and 𝐿 π‘Œ correspondingly will act as the coordinates of the points for further analysis. Considering Figure 4, we find the distance d on 𝑀0 between the projection on 𝑃𝑖 on 𝑀0 and 𝐿 𝑋, which by definition also lies on 𝑀0. This distance d and the z coordinate of point 𝑃𝑖 form a right-angle triangle whose hypotenous is the distance of 𝑃𝑖from LX, hence is y'i. Defining point Q as the intersection of y'i and 𝐿 𝑋, the distance from Q to the origin O is the distance on 𝑀 from 𝑃𝑖 to 𝐿 𝑋, and hence is x'i. Figure 4 : Reducing the number of coordinates Developing the math we get: 22 2222 ' ))(())(( BA ABx B D yAxB B D yAB x iiii i   ο€½ 2 22 2 ' )( i ii i z BA DByAx y    ο€½
2.2.6. Building the image

Once a planar shape is obtained, we find a bounding rectangle for the points inside it. Forming a matrix M with the dimensions of that rectangle, we initialize the matrix according to this formula:

    M(x, y) = \begin{cases} 1 & (x, y) \text{ is a data point} \\ 0 & \text{otherwise} \end{cases}

yielding a matrix representing the image.

2.2.7. Image moments

In order to distinguish between patterns we compute image moments [9][10]. Image moments, each being a real number, are various weighted averages of the image pixel intensities, representing increasing detail of the pixel distribution, such as centroid, area, and information about orientation. Central moments are invariant to translation, and suitable combinations of them are invariant to rotation as well; we limit our attention to those. We use the following mapping function:

    f(x, y) = \begin{cases} 0 & \text{if the pixel is white} \\ 1 & \text{if the pixel is black} \end{cases}

We first calculate the raw moments

    M_{pq} = \sum_x \sum_y x^p y^q f(x, y)

and then the central moments

    \mu_{pq} = \sum_x \sum_y (x - \bar{x})^p (y - \bar{y})^q f(x, y), \quad \text{where } \bar{x} = \frac{M_{10}}{M_{00}} \text{ and } \bar{y} = \frac{M_{01}}{M_{00}}

We have chosen to use 9 moments for now, but this number may increase if good recognition turns out to need more. To make the moments scale invariant as well, we normalize them:

    \eta_{ij} = \frac{\mu_{ij}}{\mu_{00}^{\,1 + \frac{i+j}{2}}}

Finally, we want moments that are invariant under rotation. The best-known such set is Hu's [11] set of invariant moments, also known as the seven moments of Hu:

    \phi_1 = \eta_{20} + \eta_{02}
    \phi_2 = (\eta_{20} - \eta_{02})^2 + 4\eta_{11}^2
    \phi_3 = (\eta_{30} - 3\eta_{12})^2 + (3\eta_{21} - \eta_{03})^2
    \phi_4 = (\eta_{30} + \eta_{12})^2 + (\eta_{21} + \eta_{03})^2
    \phi_5 = (\eta_{30} - 3\eta_{12})(\eta_{30} + \eta_{12})[(\eta_{30} + \eta_{12})^2 - 3(\eta_{21} + \eta_{03})^2] + (3\eta_{21} - \eta_{03})(\eta_{21} + \eta_{03})[3(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2]
    \phi_6 = (\eta_{20} - \eta_{02})[(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2] + 4\eta_{11}(\eta_{30} + \eta_{12})(\eta_{21} + \eta_{03})
    \phi_7 = (3\eta_{21} - \eta_{03})(\eta_{30} + \eta_{12})[(\eta_{30} + \eta_{12})^2 - 3(\eta_{21} + \eta_{03})^2] - (\eta_{30} - 3\eta_{12})(\eta_{21} + \eta_{03})[3(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2]
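If an image-processing library is available, this whole stage collapses to a few calls. The sketch below uses OpenCV – an assumption, since the text does not name such a library – whose cv::moments and cv::HuMoments functions compute the raw, central and scale-normalized moments and the seven Hu invariants defined above.

```cpp
// Sketch only: rasterizes the planar gesture points into the binary matrix M
// of section 2.2.6 and computes Hu's invariants with OpenCV (an assumed
// dependency). Points are assumed to already lie within the bounding rectangle.
#include <opencv2/imgproc.hpp>
#include <vector>

std::vector<double> gestureMoments(const std::vector<cv::Point>& pts,
                                   int width, int height) {
    // M(x, y) = 1 where a data point falls, 0 elsewhere.
    cv::Mat image = cv::Mat::zeros(height, width, CV_8UC1);
    for (const auto& p : pts)
        image.at<uchar>(p.y, p.x) = 1;

    // cv::moments computes raw, central and scale-normalized moments;
    // cv::HuMoments derives the seven rotation invariants from them.
    cv::Moments m = cv::moments(image, /*binaryImage=*/true);
    double hu[7];
    cv::HuMoments(m, hu);
    return std::vector<double>(hu, hu + 7);
}
```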
The 9 moments of a gesture are taken as the coordinates of the gesture in an abstract 9-dimensional feature space. It is assumed that the representations of similar gestures will congregate into "clouds" whose extent, in the Euclidean metric, is small compared to the distance between "cloud" centroids. The centroid of each "cloud" is calculated and saved, and with each insertion of new moments the centroid is updated; the use of this data is described in the next section.

2.2.8. Minimum distance algorithm

The minimum distance algorithm is a basic method for comparing our ongoing gesture against the "clouds" of recorded gestures. The main idea is to calculate the distance between the centroid of the ongoing gesture and each stored centroid; with high probability, the "cloud" at minimum distance from the ongoing gesture represents the same gesture. We have considered two distance functions. Let X, Y be vectors in k-dimensional space:

    \text{Euclidean:} \quad \sqrt{\sum_{i=1}^{k} (X_i - Y_i)^2}

    \text{Manhattan:} \quad \sum_{i=1}^{k} |X_i - Y_i|

We assume that the number of recorded gestures is finite and fairly small, so we can afford to compute every distance. Since the set S of stored gestures is small, the complexity of this process is low, namely O(|S|).

The algorithm: let S = {c_1, ..., c_n} be the set of centroids of the gestures {g_i} in feature space, and let p be the point representing a new gesture to be recognized in feature space.
- Compute the distance d(p, c_i) for every centroid c_i.
- Find the minimum distance.
- Identify the new gesture as the gesture i with the minimal distance.
- Return its id.

A code sketch of this procedure is given after section 2.3 below.

2.3. Expected results

The final product could be deployed on any device that currently requires hand contact, especially for counting procedures in the operating room; this would reduce the number of retained-sponge cases. Our SDK will give developers around the world the opportunity to implement applications based on custom-made gestures.
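Returning to the minimum distance algorithm of section 2.2.8, the sketch below implements nearest-centroid recognition together with the running centroid update mentioned in 2.2.7. All names here (Features, Centroid, recognize) are illustrative assumptions, not the project's actual API.

```cpp
// Sketch only: nearest-centroid recognition in the 9-D moment space.
// Types and names are illustrative, not the project's actual API.
#include <array>
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

constexpr std::size_t kDims = 9;           // number of moments per gesture
using Features = std::array<double, kDims>;

struct Centroid {
    int gestureId;
    Features mean;
    std::size_t count;                     // samples folded into the mean

    // Running update: fold one more recorded sample into the centroid.
    void add(const Features& f) {
        ++count;
        for (std::size_t i = 0; i < kDims; ++i)
            mean[i] += (f[i] - mean[i]) / static_cast<double>(count);
    }
};

double euclidean(const Features& a, const Features& b) {
    double s = 0.0;
    for (std::size_t i = 0; i < kDims; ++i)
        s += (a[i] - b[i]) * (a[i] - b[i]);
    return std::sqrt(s);
}

// O(|S|) scan over the stored centroids; returns the id of the closest one.
int recognize(const Features& p, const std::vector<Centroid>& clouds) {
    int best = -1;
    double bestDist = std::numeric_limits<double>::max();
    for (const auto& c : clouds) {
        double d = euclidean(p, c.mean);
        if (d < bestDist) { bestDist = d; best = c.gestureId; }
    }
    return best;
}
```

The Manhattan variant differs only in the distance function; which of the two better separates the gesture "clouds" is an empirical question.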
3. PRELIMINARY SOFTWARE ENGINEERING DOCUMENTATION

3.1. Requirements (Use Cases)

3.2. GUI

In this section we introduce our UI prototype for the project. The prototype demonstrates the usability of the SDK and of the medical application.

3.2.1. SDK expansion

We have divided the toolkit into four major steps that together give the user all the functionality needed to create a custom-made gesture. Each step has a screen of its own, and a wizard gives the user information about the current step.

Step 1 (Figure 5): The user records the custom-made gesture he or she would like to create.

Step 2 (Figure 6): The user trains the computer to recognize the new gesture by recording additional examples, preferably performed by different people.

Step 3 (Figure 7): The user tests the system by presenting the gesture to see whether it is recognized.

Step 4: The user receives information about the new gesture he or she has just created.

Each screen has the following components (see the figures below):

A) Wizard – a UI component that presents the user with a sequence of screens leading through a series of well-defined steps.
B) Gesture grid panel – a real-time panel that tracks the movement of the user's finger and displays that movement.
C) Coordinates table – a table filled with the coordinates of each finger position detected in the current frame.
D) Operation buttons – four control buttons that can be clicked to perform an action.
E) Status bar – a status line giving information about the Leap controller connection and about the action being performed.
F) Help button – a control button that provides help to the user.
G) Match rate bar – a bar presenting the likelihood that the gesture the user is performing matches the custom-made gesture saved in the record step (applies to steps 2 and 3).

Figure 5 : Record Step

Figure 6 : Train Step

Figure 7 : Exercise Step
3.2.2. Items counting application

We designed our medical application to be simple and intuitive for the surgeons who will eventually use it. We focused on the episiotomy procedure kit, since it demonstrates the usage of our system well. As Figure 8 below shows, there are 4 components:

A) Item panel – this panel includes three elements: gesture icon, item image and counter.
B) Undo panel – this panel shows the gesture icon the surgeon needs to perform in order to undo the last gesture action.
C) More panel – this panel shows the gesture icon the surgeon needs to perform in order to add a different item to the surgery.
D) Status bar – a status line giving information about the Leap controller connection and about the action being performed.

Using these components, the surgeon makes the gesture corresponding to the item that entered the surgical environment; the counter belonging to that gesture is incremented, and feedback is shown on the screen.

Figure 8 : Instruments counting application

3.3. Program structure – Architecture, Design (UML diagrams)

3.3.1. Software architecture

The SDK divides into two main parts. The core of the program, containing all the logic, will be a C++ application. The GUI will be built with Qt or WPF/WinForms and will communicate with the C++ core, which will raise an event for each gesture that has been made. The medical application will be written entirely in WPF and will communicate with the SDK.

The following API is an initial prototype of the interface of our SDK:

Init() – initializes the settings of the device.
SetupListener() – connects to the infrastructure of the system.
CallBackFunc() – the interface the user implements and passes as a delegate function to the device.
OnClose() – operations to be performed when the application closes.
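A minimal C++ sketch of how this prototype interface might be shaped follows; the GestureEvent type, the callback signature and the stub bodies are illustrative assumptions, not the finalized API.

```cpp
// Sketch only: one possible shape for the prototype SDK interface above.
// GestureEvent, the callback signature and the stub bodies are assumptions.
#include <functional>
#include <string>
#include <utility>

struct GestureEvent {
    int gestureId;        // id returned by the minimum-distance recognizer
    std::string name;     // human-readable gesture name
    double matchRate;     // likelihood, as shown in the match rate bar
};

using GestureCallback = std::function<void(const GestureEvent&)>;

class GestureSdk {
public:
    // Init() -- initializes the settings of the device.
    bool Init() { return true; /* stub: configure the Leap controller */ }

    // SetupListener() -- connects to the infrastructure of the system and
    // starts streaming frames from the controller.
    bool SetupListener() { return true; /* stub */ }

    // CallBackFunc() -- registers the user's delegate; it would be invoked
    // once for every recognized gesture.
    void CallBackFunc(GestureCallback cb) { callback_ = std::move(cb); }

    // OnClose() -- operations to be performed when the application closes.
    void OnClose() { /* stub: release the device */ }

private:
    GestureCallback callback_;
};
```

A client would typically call Init() and SetupListener(), register its delegate with CallBackFunc(), and rely on OnClose() for cleanup when the application exits.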
SDK Class Diagram

Medical Counter Class Diagram
3.4. Testing plan

3.4.1. Testing plan for the SDK

Test name | Scenario | Expected result
Record input data | We record input from the device. | A file with the recorded data appears in the DB.
Call SVD function over input data file | The SVD function gets the raw data from the file. | The function returns the normal of the closest plane.
Call image moments function | Image moments are extracted from the bitmap matrix. | All moments are recorded to the DB.
Record and test 3 different gestures, repeated with different people | We record 3 different gestures and test them. | The system distinguishes between the 3 different gestures.

3.4.2. Testing plan for the medical application

Test name | Scenario | Expected result
Open application | The user starts the application. | The application opens in the main screen with no errors.
Imitate gesture shown on the screen | The user follows the gesture shape. | The counter of the instrument is increased, and the shape is highlighted.
Undo the action | The user imitates the undo gesture. | The last counter is decreased.
Open surgery log | The user presses the log button. | A log of the user's actions is shown, e.g. "you made X, pad 4X4 was incremented by 1".
Open surgery inventory report | The user clicks the inventory report button. | A report of all the required equipment is shown, e.g. "two pairs of scalpels".

REFERENCES

[1] Leap Motion Inc., "Leap Motion Specs," 2013. [Online]. Available: https://www.leapmotion.com/product.
[2] "Kinect," Wikipedia. [Online]. Available: http://en.wikipedia.org/wiki/Kinect.
[3] Intel Corp., "Intel Developer Guide for Perceptual Computing." [Online]. Available: http://software.intel.com/en-us/vcsource/tools/perceptual-computing-sdk.
[4] A. K. Al-Ghamdi, S. M. A. Abdelmalek, A. M. Ashshi, H. Faidah, H. Shukri and A. A. Jiman-Fatani, "Bacterial contamination of computer keyboards and mice, elevator buttons and shopping carts," African Journal of Microbiology Research, vol. 5, no. 23, 2011.
[5] ABC News Medical Unit, "Your Keyboard: Dirtier Than a Toilet," 5 May 2008. [Online]. Available: http://abcnews.go.com/Health/Germs/story?id=4774746.
[6] A. A. Gawande, D. M. Studdert, E. J. Orav, T. A. Brennan and M. J. Zinner, "Risk Factors for Retained Instruments and Sponges after Surgery," The New England Journal of Medicine, 2003.
[7] C. W. Kaiser, S. Friedman, K. P. Spurling, T. Slowick and H. A. Kaiser, "The Retained Surgical Sponge," Annals of Surgery, vol. 224.
[8] "Singular value decomposition," Wikipedia. [Online]. Available: http://en.wikipedia.org/wiki/Singular_value_decomposition.
[9] J. Flusser, "On the independence of rotation moment invariants," Pattern Recognition, vol. 33, 1999.
[10] J. Flusser and T. Suk, "Rotation Moment Invariants for Recognition of Symmetric Objects," IEEE Transactions on Image Processing, vol. 15, 2006.
[11] Z. Huang and J. Leng, "Analysis of Hu's Moment Invariants on Image Scaling and Rotation," ECU Publications, 2011.