Wii Sensor Bar Positioning in 3D Space
Computational Geometry Course Project
Tomer Cagan and Yoav Francis, The Interdisciplinary Center, Herzliya
Abstract
Once introduced, the WiiMote and Sensor Bar became a very common form of user input for many applications, mainly for entertainment but also for medical and other fields. Most commonly the technology is used for control in 2D space (up/down and left/right), relying on the WiiMote camera-tracking mechanism, and offers limited support for the "Z" axis (front/back) by means of accelerometer measurement. In our project we explore the possibility of tracking the Sensor Bar in 3D space with a setup that includes two WiiMotes.
Introduction
The WiiMote was introduced during 2006 along with the Wii system from Nintendo. In addition to the common buttons and sticks, the WiiMote provides motion sensing with two main mechanisms: an accelerometer, which we will not discuss here, and an optical sensor, which is the main interest of this small research.
The optical sensor is a simple grayscale digital camera installed behind an IR filter, with a built-in chip for tracking up to 4 IR lights. Along with the WiiMote comes a Sensor Bar, which is actually a simple construction that includes several IR lamps positioned a known length apart that can be tracked by the WiiMote. The actual camera is a 100 Hz, 128x96 pixel camera that tracks the lights and interpolates the readings to give a 1024x768 pixel resolution.
The WiiMote communicates via a Bluetooth interface which, along with the many available libraries/APIs, makes it a very convenient and popular user input device.
First amongst the application developers for the WiiMote tracking capabilities is Johnny Chung Lee from Google, who researches Human-Computer Interaction (HCI) [3] and developed several interesting applications, including a whiteboard, finger tracking, surface identification and head tracking.
While many of the applications use the WiiMote as an input device to operate a computer or control specific aspects/programs [7], other works with the WiiMote include a physics teaching kit that uses the WiiMote and Sensor Bar [1], and research on rehabilitation using the WiiMote to track limbs and give feedback [8].
Related Work
As we've seen in the previous chapter, while there are many applications with the WiiMote and Sensor Bar setup, most of them only deal with 2D space, and may also use the accelerometer to give a notion of back/front movement.
A "3D orientation" is demonstrated in Johnny Chung Lee's project for head tracking [4], in which the Sensor Bar is fixed to the user's head and tracked. This allows rendering a 3D scene with orientation relative to the user's position, thus achieving a 3-dimensional virtual space that changes with the user's perspective.
An application that addresses a real 3D space uses basic trigonometric calculations to determine the distance of the Sensor Bar from the WiiMote [1].
The Wii Physics project uses the WiiMote to teach physics. In one of its applications the WiiMote is suspended from a spring facing the Sensor Bar, which is positioned statically below it (on the floor). In this setup, since the Sensor Bar is parallel to the camera field of view (FOV), one can deduce a calculation for the distance of the WiiMote to the Sensor Bar [2]:
Figure 1: Basic setup for “Z” (Distance) measurement
The calculation is based on the setup depicted above.

Define: HFOV = 41°, VFOV = 31°

Angular field of view per pixel:

$\Theta_{FOV} = \frac{1}{2}\left(\frac{HFOV}{1024} + \frac{VFOV}{768}\right)$

Distance between the two dots on the camera:

$r = \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2}$

Total angle subtended by the two LEDs at the WiiMote:

$2\alpha = r\,\Theta_{FOV} \;\Rightarrow\; \alpha = \frac{r\,\Theta_{FOV}}{2} = \frac{1}{4}\left(\frac{HFOV}{1024} + \frac{VFOV}{768}\right)\sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2}$

Given d (the actual distance between the LEDs, i.e. the size of the Sensor Bar):

$z = \frac{d}{2\tan\alpha} = \frac{d}{2\tan\left(\frac{1}{4}\left(\frac{HFOV}{1024} + \frac{VFOV}{768}\right)\sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2}\right)}$
For a given Sensor Bar (or other similar light source), the distance between the two extreme lights (d in the calculation above) is known, and thus one can find the distance z between the WiiMote and the Sensor Bar. This approach is limited to the situation where the Sensor Bar is parallel to the camera's field of view. As we will see in the next chapter, a problem arises when the Sensor Bar is not parallel to the camera's field of view, in which case d is not a known quantity.
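The distance calculation above can be sketched in a few lines of Python. This is only an illustration of the formula from [2], not the project's own code (which was written in C#/XNA); the constant and function names are ours:

```python
import math

# Camera parameters from the text above
HFOV_DEG = 41.0   # horizontal field of view, degrees
VFOV_DEG = 31.0   # vertical field of view, degrees

# Angular field of view per pixel, averaged over both axes, in radians
THETA_FOV = math.radians((HFOV_DEG / 1024 + VFOV_DEG / 768) / 2)

def distance_to_sensor_bar(p1, p2, d):
    """Distance z from the WiiMote to a Sensor Bar held parallel to the
    camera plane.  p1 and p2 are the (x, y) pixel readings of the two IR
    lights; d is the real-world spacing of the lights (the result is in
    the same unit as d)."""
    x1, y1 = p1
    x2, y2 = p2
    r = math.hypot(x1 - x2, y1 - y2)   # pixel distance between the dots
    alpha = r * THETA_FOV / 2          # half the subtended angle
    return d / (2 * math.tan(alpha))
```

For example, with a 20 cm Sensor Bar whose lights appear 100 pixels apart, the formula places the bar roughly 2.85 m away; moving the bar closer makes r grow and z shrink, as expected.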
Our Project
In this chapter we present our project. We first introduce the difficulty faced with only one remote and then present the calculations and the actual work we did in building a small demo.
The problem
As we saw above, with a predetermined orientation (parallel) of the Sensor Bar relative to the WiiMote camera, one can calculate the distance between the two. The calculation is based on knowing the camera field of view and the actual size/distance between the Sensor Bar IR lamps. The problem arises when the orientation of the Sensor Bar is not fixed as in the above calculation, in which case using one WiiMote will not suffice. In the following images we can see some cases in which the Sensor Bar is not oriented in parallel to the camera's field of view.
Figure 2: Sensor Bar in Diagonal Orientation
Figure 3: Sensor Bar in Reverse Diagonal Orientation
In Figure 2: Sensor Bar in Diagonal Orientation, the Sensor Bar is some distance away from the WiiMote and is positioned at some angle relative to the camera FOV, thus it has some reading {{x1,y1},{x2,y2}}. In Figure 3: Sensor Bar in Reverse Diagonal Orientation, we see the Sensor Bar at the same distance from the camera but at a reverse orientation; still, since the WiiMote merely tracks the lights on the sensor bar, the same reading {{x1,y1},{x2,y2}} is observed – it cannot infer the rotation. The same occurs in Figure 4: Sensor Bar Further Away. Here the Sensor Bar is actually further away and at a different angle relative to the WiiMote camera, but still the same reading {{x1,y1},{x2,y2}} is observed.
In any of the setups depicted in the figures above, one cannot use the knowledge of the distance (d) between the Sensor Bar's lights, as the bar does not lie parallel to the camera's FOV. This voids the calculation as it was presented in [2].
In our project we wanted to see how a setup that includes two WiiMotes can be used to overcome this problem, and whether it can give an accurate 3D space mapping.
Basic Setup
For the project we decided to start with a basic setup, as depicted in the following diagram:
As we can see, the two remotes lie perpendicular to one another, and each imposes a side or wall of a 3-dimensional cube. The Sensor Bar is moved within this space created by the WiiMote cameras' FOVs.

Figure 4: Sensor Bar Further Away
Figure 5: Basic Setup

The reading of each WiiMote corresponds to the position in space of the Sensor Bar. Note that this reading is in some arbitrary space imposed by the FOV. We will discuss its meaning later.
Development Phases
In the following subsections we will slowly explore the setup and come up with the required calculations.
Point in 3D
The first phase was merely a simple "test" of seeing a point at the location of the Sensor Bar by combining readings from the two WiiMotes.
In Figure 6 we can see the three axes x, y and z (green, blue and red) imposed by the 2 WiiMotes' FOVs. Each reading gives x and y coordinates that should be interpreted as a reading on one of the axes. The actual readings from remotes r1 and r2 (cyan and magenta, respectively) are given and interpreted as the x,y (w1) and y,z (w2) planes respectively. Thus, if we have readings {x1,y1} and {x2,y2} from the two WiiMotes respectively, they are interpreted in the 3D coordinates {x,y,z} as x = x1, y = y1, z = x2. Note that we arbitrarily selected to use y1 as the y-coordinate reading; y2 could have been used instead, or an average of both.
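The mapping just described is trivial to express in code. The following sketch (function name and the optional y-averaging flag are ours) turns one {x,y} reading from each remote into a point in the pixel-space cube:

```python
def combine_readings(r1, r2, average_y=False):
    """Combine one (x, y) reading from each WiiMote into a point in the
    arbitrary pixel-space cube.  r1 is interpreted on the x/y wall (w1)
    and r2 on the y/z wall (w2): x = x1, y = y1, z = x2.  Optionally the
    two y readings can be averaged instead of using y1 alone."""
    x1, y1 = r1
    x2, y2 = r2
    y = (y1 + y2) / 2 if average_y else y1
    return (x1, y, x2)
```

For instance, readings {512, 384} from r1 and {100, 380} from r2 map to the 3D point (512, 384, 100).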
Figure 6: Axes and Interpreting Readings (legend: world x/y/z axes, WiiMote center lines, FOV axis projections)
As we can see in the above picture, the reading from each WiiMote is the dark square on the corresponding wall. The black box in the center is the position of the Sensor Bar (averaging all the separate {x,y} readings and interpreting as above).
Sensor Bar in Arbitrary 3D
Once we had a basic setup and readings, we went on to show the actual Sensor Bar within our (arbitrary) 3D space. Given two sets of up to 4 {x,y} measurements, corresponding to the 4 lights a WiiMote can track, we needed to draw the position and orientation of the Sensor Bar within our space.
Pitch
Calculating the pitch is not possible – the WiiMote will see the light sources in the exact same way when changing the pitch (up to the point that it won't see them at all). Having 2 WiiMotes doesn't help in this case.
Roll
Calculating roll is straightforward and can actually be achieved with just one WiiMote. The roll of the sensor bar is the angle between the vector of the Sensor Bar and the "floor" imposed by the remote's FOV.
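From a single remote's two dot readings, that angle is just the arctangent of the dots' vertical offset over their horizontal offset. A minimal sketch (our own naming and sign convention, not the project's code):

```python
import math

def roll_from_reading(p1, p2):
    """Roll of the Sensor Bar as seen by a single WiiMote: the angle
    between the vector connecting the two tracked lights and the 'floor'
    (x axis) of the camera's field of view, in degrees."""
    x1, y1 = p1
    x2, y2 = p2
    return math.degrees(math.atan2(y2 - y1, x2 - x1))
```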
Yaw
To calculate yaw one must have readings from two WiiMotes. Once you have readings of at least 2 lights from the 2 WiiMotes, the 2 sets of readings give a delta along the world x and z axes:
Figure 7: Point in 3D Demo
From these quantities and the relation between them, we can deduce that the angle relative to the world, α, is the inverse tangent of the ratio of dx1 to dx2.
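As a sketch (the function name and which delta goes in the numerator are our assumptions; the sign convention would need to match the actual axis setup):

```python
import math

def yaw_from_deltas(dx1, dx2):
    """Yaw angle of the Sensor Bar, given the delta dx1 along the world
    x axis (seen by remote R1) and the delta dx2 along the world z axis
    (seen by remote R2).  Returns alpha in degrees."""
    return math.degrees(math.atan2(dx1, dx2))
```

Using atan2 rather than a plain arctangent of the ratio keeps the result well-defined even when dx2 is zero (the bar faces one remote edge-on).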
Having the roll and yaw, along with the position, we can now draw the actual position and orientation of the Sensor Bar within the virtual space imposed by the FOVs of the WiiMotes.
Figure 8: Relations for Finding Yaw
Figure 9: Sensor Bar in Arbitrary 3D Space Demo
In Figure 9: Sensor Bar in Arbitrary 3D Space Demo above, we can see a screenshot from the demo showing the Sensor Bar position and orientation. Note the Roll and Yaw values displayed, corresponding to the orientation of the Sensor Bar.
Sensor Bar in 3D Space
So far we have only dealt with an arbitrary position and with the orientation of the Sensor Bar. In a sense we were working in a 3D space that is a cube of 1024 x 768 x 1024 pixels. The actual dimension of a pixel is not known and depends on how far the Sensor Bar is from each WiiMote – the closer it is, the smaller the physical interpretation of a pixel movement.
In order to give a pixel an actual physical interpretation, we have to determine how far away each WiiMote is. Then a movement of some pixels in the virtual world actually corresponds to a distance in the real world (and vice versa).
Having calculated the actual orientation of the Sensor Bar, we can get back to the calculations described in [2] and listed above. Having the 2 WiiMotes' readings, we can see the relations between them, and we can also derive the actual distance between the lights that each of the WiiMotes can observe.
It is important to note the inverse relation between the 2 FOVs. Thus, to calculate the distance di exposed to each of the remotes (R1 and R2), we have to use a different trigonometric function:
Projected distance on WiiMote 1: $d_1 = d \sin\alpha$
Projected distance on WiiMote 2: $d_2 = d \cos\alpha$
Having this actual distance, we can go back and plug it into the equation above and get the distances z1 and z2 from R1 and R2 respectively.
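The projection step is a one-liner per remote: the yaw α splits the real LED spacing d into the component each camera actually sees, and each component would then be fed into the earlier z formula in place of d. A sketch (our own naming):

```python
import math

def projected_spacings(d, alpha_deg):
    """Split the real LED spacing d into the components seen by each
    WiiMote, given the yaw alpha in degrees: d1 = d*sin(alpha) toward R1
    and d2 = d*cos(alpha) toward R2."""
    a = math.radians(alpha_deg)
    return d * math.sin(a), d * math.cos(a)
```

At α = 0° the bar is fully visible to R2 (d2 = d) and edge-on to R1 (d1 = 0); at α = 90° the roles swap, reflecting the inverse relation between the two FOVs.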
[Figure: projection geometry – the yaw angle α and its complement 90° − α split the LED spacing into d1 (seen by R1) and d2 (seen by R2) within each camera's HFOV]
As we can see, we get an average offset in the calculation of around 30% – which is rather poor. This is probably due to lighting interference, frequent and un-smooth changes in the readings, and the usage of a rather simple, not-bright-enough Sensor Bar.
The Demo Program
As part of our project we created a simple demo program that demonstrates the different phases and related goals. Following is a brief discussion of the libraries we used and the code we wrote.
Development Environment
The mechanism of working with the WiiMote Bluetooth protocols, and explanations of what it does and how it works, can be found in [9]. There are many implementations of the WiiMote API. These libraries give a simple yet complete mechanism to communicate with the WiiMote and read sensor and button information from it.
For convenience and ease of development we selected to use WiimoteLib [5], a managed code library, which seems to be complete and easy to use. In addition to the availability of the library, we have found several resources that help with starting Wii development. Foremost of these are the resources found in [6].
Due to the nature of the demos – a continuously updating rendered scene – we selected to use XNA as our development platform. XNA is a managed code wrapper around DirectX, mostly used as a game development framework; it supports 3D and 2D and provides simple mechanisms for game development that lend themselves easily to our purpose. In addition we used [11] to get up to speed with developing 3D scenes.
Development Project
We created a simple multi-screen XNA project to demonstrate the above phases. This section lists the important components:
- IWiiMotesService.cs and WiiMotesServiceImpl.cs – an interface and implementation of the functions required for interacting with the WiiMotes.
- WsbpDemo.cs (and Program.cs) – entry point to the demo.
- Demos folder – includes the actual demo code for each of the demos.
- ScreenManager folder – screen management facilities, developed by Microsoft.
- Screens folder – miscellaneous menu screens and assets.
The project is available for general use at http://code.google.com/p/wsbp/
Future Work
In our project we only demonstrated what can be done with a setup of 2 WiiMotes and a light source similar to the Sensor Bar. The demos we created are relatively crude, but we believe that they capture the essence of the work that can be done with a similar setup.
The foremost issue that needs to be addressed is the stability of the readings. That can be solved by smoothing the readings using an appropriate digital filter. It remains to determine the best-fitting filter (a simple running average? a more complex digital filter (DSP)?).
In addition to selecting a filter, the strategy of applying it should be determined – the WiiMote tracks up to four lights and reports them in some order. Moving the Sensor Bar around shows/hides the lights; further away, only 2 lights are perceived. This implies that the input to the filter should be carefully determined.
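The simplest of the candidate filters, a per-light running average, could look something like the sketch below (our own class; the reset-on-loss behavior illustrates the "carefully determined input" point – averaging across a gap where the light was lost would smear stale positions into fresh ones):

```python
from collections import deque

class RunningAverageFilter:
    """Per-light smoothing sketch: a running average over the last n
    (x, y) readings.  A None reading (light not visible) clears the
    window so stale positions are not averaged with new ones."""
    def __init__(self, n=5):
        self.window = deque(maxlen=n)

    def update(self, reading):
        if reading is None:        # light lost: reset the window
            self.window.clear()
            return None
        self.window.append(reading)
        xs = [p[0] for p in self.window]
        ys = [p[1] for p in self.window]
        return (sum(xs) / len(xs), sum(ys) / len(ys))
```

In a real setup one filter instance would be kept per tracked light per WiiMote, with the light-ordering problem mentioned above solved before readings reach the filters.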
In this project we only explored the use of the IR camera with its tracking capabilities. Since the Sensor Bar is what is actually being moved within the covered space, it is possible to mount an IR light on an additional WiiMote, and then one can have inputs from its additional built-in sensors. In such a setup one can use these additional inputs for more accurate tracking and manipulation of the data.
Once the above modifications are implemented, it is still to be seen what actual applications can be developed using this setup. We believe this can be taken in two directions – manipulating a physical object in the real world based on accurate movement and orientation tracking (and other inputs) within the virtual space (e.g. flying a drone in a room according to the movement of a WiiMote in the imposed VR space), and mapping the physical world to VR, such as games that take into account the volume of a room or the physical movement of a tracked object relative to some virtual object (augmented reality).
Conclusion
In this project we demonstrated the ability to provide a 3-dimensional human-computer interface input/interaction mechanism using a simple setup comprised of 2 fixed WiiMotes and a moving light source. This can be further improved by adding additional sensors to the moving light source to give a rich input mechanism within virtual or real 3D space. While the work we did does not give a robust implementation, with relatively simple techniques it can be extended to create an accurate and responsive 3D input setup at relatively low cost (about $21 per WiiMote). It remains to be seen what applications could be created for such a setup utilizing the techniques discussed here.
Bibliography
[1] Wii Physics, Physics with a WiiMote, http://wiiphysics.site88.net/
[2] Wii Physics, Distance Measurement with the WiiMote, http://wiiphysics.site88.net/physics.html
[3] Johnny Chung Lee, Wii Projects, http://johnnylee.net/projects/wii/
[4] Johnny Chung Lee, Head Tracking for Desktop VR Displays using the Wii Remote, http://www.youtube.com/watch?v=Jd3-eiid-Uw
[5] WiimoteLib, Managed Library for Nintendo's WiiMote, http://wiimotelib.codeplex.com/
[6] Wii@ESU, http://brannigan.emporia.edu/projects/WII/
[7] Wii@ESU Projects page, http://brannigan.emporia.edu/projects/WII/wiiprojects/index.htm