Improving Dead Reckoning by M2M Communication
EECS 221 Final Report
Abhishek Madav (amadav@uci.edu), Saurebh Raut (rauts@uci.edu), Suhas Tikoo (stikoo@uci.edu)
Electrical Engineering and Computer Science Department, University of California, Irvine, CA 92697
Abstract— Today’s mobile devices host advanced GPS systems
for efficient localization. However, these systems have several
limitations, such as dependence on GPS signal availability and the
error sources present in CDMA signals. While GPS solves the
problem of accurate localization in outdoor environments, indoor
localization is still an area of active research. Most methods rely on
PDR (pedestrian dead reckoning), a technique that provides position
information for a pedestrian. This project describes an efficient way
for a smartphone to communicate with a BLE sensor tag to
approximate position and motion, extending the smartphone’s
capability as a personal device to track navigation information in an
indoor space.
I. INTRODUCTION
We started with the idea of determining a path in an area
with no network or GPS connectivity. The goal is to provide
indoor navigation details to a user who is inside a maze-like
space, say a shopping mall. This position estimation can be
done using dead reckoning: the method of determining the
current position of an entity from a previously obtained
position together with known speed and direction over time.
With the introduction of sensors such as accelerometers,
compasses and gyroscopes in the latest mobile devices, dead
reckoning has become an obvious choice for indoor
localization. There are numerous techniques for implementing
dead reckoning [1], broadly classified into:
A. Systems with Smart Environment
Systems called “fiducial systems” fall under this category.
Fiducial systems require installation prior to use, which takes a
substantial amount of time. These systems can provide accurate
knowledge of absolute position and orientation, but at the cost
of expensive installation. Fiducial systems are based on sources
such as ultrasound, infrared light and magnetic fields. Such
sources are unsuitable for security-related applications, as they
are affected by external interference. The main advantage of
fiducial systems, however, is their accuracy: they do not allow
errors to accumulate over time.
B. Systems with Unstructured Environment
Systems in this category do not require a smart environment
for computing absolute position and orientation. Numerous
positioning systems fall into this category.
1) Computer vision can be used to estimate absolute
position. This approach requires no modifications to the
environment; its drawback is that it requires a large
database.
2) The pedometer is another simple system that can be used
for position estimation. It counts steps based on body
movements and must be calibrated to the average stride
length of the person using it. Positioning with this
technique is not accurate, as error accumulates over
time, mainly due to variation in the person’s stride
length caused by fatigue or irregular terrain.
3) Absolute position estimation can also be achieved using
a 2D accelerometer. The high-frequency signal
generated by the accelerometer is passed through a
Kalman filter, which performs band filtering to remove
high-frequency components; the filtered signal is then
used to count steps. Like the pedometer technique, this
approach also accumulates error due to the person’s
varying striding style.
4) To overcome the limitations of the previous techniques,
ultrasonic sensors were used to measure stride length,
but they require line-of-sight orientation of the sensors,
which is difficult to achieve over uneven terrain. An RF
phase-change technique was also presented for counting
steps, by computing the RF phase change between
signals from a waist pack and the foot. The drawback of
all the techniques mentioned so far is that they are
restricted to 2D positioning; none of them can compute
altitude changes.
5) To extend positioning to 3D, the IMU (Inertial
Measurement Unit) was created and used. An IMU has 6
degrees of freedom and provides information about 3D
orientation and position. IMUs are electronic devices
implemented using a combination of accelerometers,
gyroscopes and magnetometers.
6) More recently, companies have developed new sensor
nodes (motes). These motes can collect information
through their sensors and distribute it over wireless
networks. The TI Sensor Tag is an example of such a
sensor node.
With sensors such as accelerometers, compasses and
gyroscopes now available in the latest wireless sensor nodes,
dead reckoning has become an obvious choice for indoor
localization. The wireless communication capability of these
sensor nodes lets them exchange information with personal
mobile phones to implement indoor localization systems.
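The dead-reckoning update underlying all of these systems can be illustrated with a minimal sketch (JavaScript, since this project's application logic is written in JavaScript; the function and names are hypothetical):

```javascript
// Minimal dead-reckoning update: the next position is computed from a
// previously obtained position plus known speed and heading over a
// time step (names hypothetical).
function deadReckonStep(pos, speedMps, headingRad, dtSec) {
  return {
    x: pos.x + speedMps * dtSec * Math.cos(headingRad),
    y: pos.y + speedMps * dtSec * Math.sin(headingRad),
  };
}

// Walk east at 1.5 m/s for two 1-second steps.
let p = { x: 0, y: 0 };
p = deadReckonStep(p, 1.5, 0, 1.0);
p = deadReckonStep(p, 1.5, 0, 1.0);
console.log(p.x); // 3 m east of the start
```

Every update depends on the previous estimate, which is exactly why errors in speed or heading accumulate over time.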
II. IMPLEMENTATION
We have implemented indoor navigation tracking, which
requires collecting information from motion sensors such as
the accelerometer and gyroscope and transmitting it to a
personal mobile device. This can be accomplished with a
sensor node that supports a BLE interface and measures the
distance and direction of the path it follows. EcoBT is one
such device: it works over BLE and has a tri-axial
accelerometer on board. However, due to the drift error that
accumulates over time, an accelerometer alone does not
suffice to determine distance.
Since each new position is calculated from the previously
calculated position and the measured acceleration and angular
velocity, these errors accumulate roughly in proportion to the
time since the initial position was set. The position must
therefore be corrected periodically with input from some other
type of sensor data.
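To see the scale of the problem, consider a constant accelerometer bias double-integrated into position: the error grows quadratically with time. A back-of-the-envelope sketch (bias value illustrative):

```javascript
// Position error from double-integrating a constant accelerometer
// bias: e(t) = 0.5 * b * t^2, so the error grows quadratically.
function driftError(biasMps2, tSec) {
  return 0.5 * biasMps2 * tSec * tSec;
}

// Even a 0.05 m/s^2 bias, well within typical MEMS accelerometer
// offsets, produces metres of error within a minute.
console.log(driftError(0.05, 10)); // ~2.5 m after 10 s
console.log(driftError(0.05, 60)); // ~90 m after 60 s
```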
We therefore shifted our attention to a different device, the
TI Sensor Tag. This device also uses a BLE interface and
carries six on-board sensors: accelerometer, magnetometer,
gyroscope, IR temperature, barometer and humidity. We
connected this sensor tag to the user’s smartphone. Since
network constraints make conventional navigation systems
infeasible indoors, the sensor tag measures the user’s distance
and direction as they move and transmits these readings to the
mobile phone, which calculates the path covered. These
calculations are done in an Android app using the algorithm
elaborated below. Texas Instruments provides an open-source
Android app that communicates efficiently with the TI Sensor
Tag; we extended this app to achieve our goal of
implementing dead reckoning in a no-GPS zone with minimal
errors.
We faced a few challenges during the implementation
phase. One was changing the limit on the sampling rate to
obtain more frequent readings: initially, the accelerometer
profile allowed at most 10 samples per second. Using the IAR
Embedded Workbench, we modified the sensor tag’s
accelerometer profile and increased the sampling rate by a
factor of 5, to 50 samples per second. The other challenge was
integrating data from two sensors, the accelerometer and the
gyroscope. This was necessary because the distance travelled
cannot be calculated from accelerometer readings alone due to
error accumulation, so we had to use the gyroscope as well.
Including the gyroscope limited the overall sampling rate,
since it could not be raised as high as the accelerometer’s.
The first step in combining the accelerometer and
gyroscope is aligning their coordinate systems. Generally this
is done by setting the accelerometer’s initial reference axes;
monitoring the change in the gyroscope readings then allows
its axes to be mapped onto the accelerometer’s reference axes.
The acceleration vector from the accelerometer points in the
direction of displacement, and from it a unit vector can be
obtained. The gyroscope readings give the orientation angles
about the x, y and z axes, from which we can compute a
second unit vector in the direction of displacement. These two
unit vectors should be identical, but drift error introduces a
deviation between them. The final unit vector is therefore
obtained by assigning weights to the two unit vectors based on
the reliability of the accelerometer and gyroscope.
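The weighted combination described above can be sketched as follows, with an explicit renormalization step added so the fused result remains a unit vector; the weights shown are illustrative, not the project's tuned values:

```javascript
// Fuse the accelerometer-derived and gyroscope-derived direction unit
// vectors using reliability weights w1 and w2, then renormalize the
// weighted average back to unit length (renormalization added here;
// weights are illustrative).
function fuseDirections(accUnit, gyroUnit, w1, w2) {
  const x = (w1 * accUnit.x + w2 * gyroUnit.x) / (w1 + w2);
  const y = (w1 * accUnit.y + w2 * gyroUnit.y) / (w1 + w2);
  const z = (w1 * accUnit.z + w2 * gyroUnit.z) / (w1 + w2);
  const norm = Math.sqrt(x * x + y * y + z * z);
  return { x: x / norm, y: y / norm, z: z / norm };
}

// The gyroscope is trusted ~15x more than the accelerometer here.
const fused = fuseDirections(
  { x: 1, y: 0, z: 0 },       // direction from accelerometer
  { x: 0.97, y: 0.24, z: 0 }, // direction from gyroscope
  1, 15
);
```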
Flowchart: (figure not reproduced in this version)
Pseudo code for computing distance and direction:
While( SensorTag != Disconnected){
Read(Accelerometer from SensorTag.Accelerometer )
Begin
AACCX = AACC.X
AACCY = AACC.Y
AACCZ = AACC.Z – g //Compensate gravity
End
/* We need to smooth the values we get from the accelerometer. But
there is a trade-off in the number of values used for averaging: the
more values we average, the less able we are to detect a drastic
change in acceleration. */
Compute_Average()
Begin
ACCAVGX = (AACCX1 + AACCX2 + … + AACCX10) / 10
ACCAVGY = (AACCY1 + AACCY2 + … + AACCY10) / 10
ACCAVGZ = (AACCZ1 + AACCZ2 + … + AACCZ10) / 10
End
/* To find the unit vector pointing in the direction of displacement,
we divide the averaged accelerometer values by their magnitude, which
is computed as follows */
mod = Math.sqrt((ACCAVGX)*(ACCAVGX) + (ACCAVGY)*(ACCAVGY) + (ACCAVGZ)*(ACCAVGZ))
AACCunit = ACCAVGX/mod * i + ACCAVGY/mod * j + ACCAVGZ/mod * k
Read(Gyroscope from SensorTag.Gyroscope)
Begin
GyroX = Gyroscope.X
GyroY = Gyroscope.Y
GyroZ = Gyroscope.Z
End
Compute_Average()
Begin
GyroAVGx = (GyroX1+ GyroX2 ) / 2
GyroAVGy = (GyroY1+ GyroY2 ) / 2
GyroAVGz = (GyroZ1+ GyroZ2 ) /2
End
Compute_Angles()
Begin
AngleXZ = GyroAVGy *sampling-time
AngleXY = GyroAVGz *sampling-time
AngleYZ = GyroAVGx *sampling-time
End
/* AngleXZ, AngleXY and AngleYZ are the orientation angles of the
gyroscope in 3D space. From these orientation angles we compute the
components of the unit vector pointing along the displacement in the
three directions. The gyroscope’s sampling rate is 10 samples per
second, hence the sampling time is 100 milliseconds. */
Rgx = Math.sin(AngleXZ) / Math.sqrt((Math.cos(AngleXZ)*Math.cos(AngleXZ)) + (Math.tan(AngleYZ)*Math.tan(AngleYZ)) + 1)
Rgy = Math.sin(AngleYZ) / Math.sqrt((Math.cos(AngleYZ)*Math.cos(AngleYZ)) + (Math.tan(AngleXZ)*Math.tan(AngleXZ)) + 1)
Rgz = Math.sqrt(1 - (Rgx*Rgx) - (Rgy*Rgy))
/* Now we have two unit vectors pointing in the direction of the
displacement, one computed from the accelerometer and one from the
gyroscope. We implement sensor fusion to overcome the drift error in
the accelerometer readings. xc, yc and zc are the fused values. w1
and w2 are weights set according to the reliability of the sensors;
generally w2 is 10 to 20 times w1 for good results. */
xc = (ACCAVGX*w1/mod + Rgx*w2) / (w1 + w2)
yc = (ACCAVGY*w1/mod + Rgy*w2) / (w1 + w2)
zc = (ACCAVGZ*w1/mod + Rgz*w2) / (w1 + w2)
/* Using the kinematics equation s = ut + 0.5*a*t^2 (with the initial
velocity u treated as zero over each sample), we update the
displacement along every axis as follows. */
sx = sx + (0.5)*xc*(sampling-time^2)*(9.8)*mod
sy = sy + (0.5)*yc*(sampling-time^2)*(9.8)*mod
sz = sz + (0.5)*zc*(sampling-time^2)*(9.8)*mod
/* Computing the direction: a change in direction is detected by
checking the gyroscope readings. */
If (GyroAVG > threshold) Then Direction Changed
End If
/* The TI Sensor Tag has no compass, so we set the default direction
to North when the sensor tag starts. Another limitation is that the
TI Sensor Tag does not provide signed gyroscope values: rotation in
either the clockwise or anticlockwise direction yields a positive
value. For proper direction orientation, the user must therefore
travel in the clockwise direction only. */
End While
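The gyroscope portion of the pseudocode can be written out as runnable JavaScript; the expressions mirror the Rgx/Rgy/Rgz formulas above, with orientation angles in radians:

```javascript
// Convert integrated gyroscope orientation angles (radians) into a
// unit vector along the displacement, mirroring the pseudocode's
// Rgx/Rgy/Rgz expressions.
function gyroUnitVector(angleXZ, angleYZ) {
  const rgx = Math.sin(angleXZ) / Math.sqrt(
    Math.cos(angleXZ) * Math.cos(angleXZ) +
    Math.tan(angleYZ) * Math.tan(angleYZ) + 1);
  const rgy = Math.sin(angleYZ) / Math.sqrt(
    Math.cos(angleYZ) * Math.cos(angleYZ) +
    Math.tan(angleXZ) * Math.tan(angleXZ) + 1);
  const rgz = Math.sqrt(1 - rgx * rgx - rgy * rgy);
  return { x: rgx, y: rgy, z: rgz };
}

// With zero tilt the vector points straight along the z axis.
const v = gyroUnitVector(0, 0);
console.log(v); // { x: 0, y: 0, z: 1 }
```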
The sensor tag hosts a rich set of sensors, including a
gyroscope and accelerometer. We efficiently utilize readings
from this pair of sensors; combined with knowledge of
velocity and position, such readings enable indoor dead-
reckoning applications. We used the open-source toolkit
Evothings with Apache Cordova to develop the native
application that interfaces the sensor tag with the mobile
application.
The application takes care to reset the cumulative distance
travelled in a particular direction: upon a change in direction,
the previously travelled distance is plotted on a planar graph
and a new value starts accumulating.
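A simplified sketch of this reset-and-plot behaviour (names hypothetical; the real app draws segments on a canvas rather than storing them):

```javascript
// Accumulate distance along the current heading; when the direction
// changes, close the current segment and start a new one (a
// simplified sketch of the app's plotting logic; names hypothetical).
function makeTracker() {
  const segments = [];
  let current = 0;
  let heading = 'N'; // the sensor tag has no compass, so default North
  return {
    step(distance, newHeading) {
      if (newHeading !== heading) {
        segments.push({ heading, distance: current });
        heading = newHeading;
        current = 0;
      }
      current += distance;
    },
    segments,
  };
}

const t = makeTracker();
t.step(1.2, 'N');
t.step(0.8, 'N');
t.step(1.0, 'E'); // turn: the ~2.0 m northward segment is closed
```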
Demonstration video
https://www.youtube.com/watch?v=AbVquIfA90c
IV. USE OF ACCELEROMETER IN STEP
DETECTION AND STEP LENGTH ESTIMATION
The method we used accumulates significant error in 3D
space, and step detection using the accelerometer is a good
alternative to our approach. The sensor tag carries a KXTJ9
tri-axis accelerometer, used mainly in mobile applications. It
offers an internal voltage regulator, I2C digital
communication, and up to 14-bit resolution, with low current
consumption: 10 µA in standby, 10 µA at low resolution, and
325 µA at high resolution. The accelerometer provides
acceleration values along each of the three axes of motion.
A plot of acceleration against time shows unique
characteristics for different gaits of a person, such as walking,
climbing stairs or standing still. These characteristics vary,
however, with the location of the device on the body: a mobile
phone is commonly carried in a trouser pocket or shirt pocket,
held to the ear, or held in the hand, and the accelerometer
patterns differ accordingly. The algorithm should therefore
nullify such variations in the readings and should not depend
on the actual amplitude of the acceleration for detecting steps
and estimating step length. For robust detection regardless of
device position, the norm Aeff3D of the 3-axis acceleration
values (ax, ay, az) can be used:
Aeff3D[t] = sqrt(ax[t]^2 + ay[t]^2 + az[t]^2)
where ax, ay and az are the acceleration components along the
x, y and z axes of the device. The acceleration values obtained
from the device sensors include the Earth’s gravity component
(g), which must be subtracted from the effective value to
obtain the resultant acceleration, i.e. the acceleration resulting
from the user’s movement:
Aresult = Aeff3D - g
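A minimal sketch of step detection built on this norm: count a step each time Aresult crosses a threshold upward. The threshold and the sample trace below are illustrative, not values from the report:

```javascript
// Count steps as upward crossings of a threshold on the gravity-
// compensated acceleration norm Aresult = |a| - g (threshold and
// sample data are illustrative).
const G = 9.81;
function countSteps(samples, threshold) {
  let steps = 0;
  let above = false;
  for (const { ax, ay, az } of samples) {
    const aResult = Math.sqrt(ax * ax + ay * ay + az * az) - G;
    if (aResult > threshold && !above) { steps += 1; above = true; }
    if (aResult <= threshold) above = false;
  }
  return steps;
}

// Two impact peaks in an otherwise quiet trace yield two steps.
const trace = [
  { ax: 0, ay: 0, az: 9.81 },
  { ax: 0, ay: 0, az: 13.0 }, // heel strike
  { ax: 0, ay: 0, az: 9.81 },
  { ax: 0, ay: 0, az: 12.5 }, // next heel strike
  { ax: 0, ay: 0, az: 9.81 },
];
console.log(countSteps(trace, 2.0)); // 2
```

Because only the norm is used, the count does not depend on how the device is oriented in the pocket or hand.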
Since the readings from the sensor are quite noisy, with
transient noise that usually has a high-frequency component,
we smooth the sensor output with a band-pass filter such as a
Butterworth filter, which is known for its maximally flat
response in the pass-band. After vector transformation,
velocity and position can be determined by integrating the
acceleration with compensation for gravity and the Coriolis
force [2]. In the above technique, errors are reduced by ZUPT
(Zero Velocity Update), which in turn uses a complementary
filter to smooth the sensor output.
Fig. 1: Sensor fusion algorithm using a complementary filter [3]. (figure not reproduced in this version)
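For a single tilt angle, the complementary filter can be sketched as follows; the coefficient alpha = 0.98 is a commonly used value assumed here, not taken from the report:

```javascript
// One-axis complementary filter: trust the integrated gyroscope rate
// at short time scales and the accelerometer-derived angle at long
// time scales (alpha = 0.98 is an assumed, commonly used value).
function complementaryFilter(angle, gyroRate, accelAngle, dt, alpha = 0.98) {
  return alpha * (angle + gyroRate * dt) + (1 - alpha) * accelAngle;
}

// With zero rotation rate, the estimate decays toward the
// accelerometer angle, washing out accumulated gyroscope drift.
let angle = 0.5; // radians, current (drifted) estimate
for (let i = 0; i < 200; i++) {
  angle = complementaryFilter(angle, 0.0, 0.0, 0.1);
}
console.log(angle.toFixed(4)); // close to 0
```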
V. CONCLUSION
This project uses accelerometer and gyroscope values from
the TI Sensor Tag to calculate the direction and distance
travelled, and plots this movement on a planar graph. The
sampling rate of the sensors imposes limitations to a certain
extent, but this can be mitigated by updating the firmware to
increase the sampling rate. The project could be improved by
making it work in 3D space. This would require additional
hardware such as a compass, and, since the accelerometer in
the sensor tag is not stable enough to provide correct values in
3D space, a proper filtering algorithm such as a Kalman filter
or Butterworth filter to remove the drift components from the
sensor readings.
VI. TASK ASSIGNMENT
The project was implemented through the collective efforts of
the authors, and tasks were assigned in no exclusive order. The
project started with a survey of existing dead-reckoning
models, and data collected from various sources was carefully
considered when choosing the implementation technique for
this project.
Saurebh Raut:
Updated the firmware of TI sensor tag for improving
the sampling rate for sensor readings.
Studied the working of Bluetooth Low Energy (BLE).
Worked upon the function of calculating distance
using accelerometer and then combined it with
gyroscope for better precision.
Worked upon the detection of change in direction
using gyroscope.
Midterm and Final project reports.
Suhas Tikoo:
Worked on the function capturing the direction of
motion.
Applied the mathematical formulas in the application
using JavaScript knowledge.
Developed and improved the aesthetics of the mobile
application.
Survey of EcoBT and its shortcomings in this project.
Midterm and Final project reports.
Abhishek Madav:
Implemented BLE communication protocol for the
Android application.
Worked upon the application architecture to support
and read the Sensor Tag data for effective
manipulation using the algorithm described.
Developed the canvas module of the application to
plot the distances travelled using reference direction
from gyroscope.
Survey of various techniques of step detection and
step estimation.
Midterm and Final project reports.
VII. REFERENCES
[1] Lauro Ojeda and Johann Borenstein (2007), “Personal Dead-
reckoning System for GPS-denied Environments”, Proceedings
of the 2007 IEEE International Workshop on Safety, Security
and Rescue Robotics, pp. 1-6.
[2] D. Pai (June 2012), “Padati: A Robust Pedestrian Dead
Reckoning System on Smartphones”. Available: goo.gl/XSQG4g
[3] Yun-Ki Kim (October 2012), “Performance improvement
and height estimation of pedestrian dead-reckoning system
using a low cost MEMS sensor”. Available: goo.gl/2NloLs
[4] Yunye Jin (March 2011), “A Robust Dead-Reckoning
Pedestrian Tracking System with Low Cost Sensors”, 2011
IEEE International Conference on PerCom, pp. 222-230.
[5] Pai H. Chou, “EcoBT: Miniature, Versatile Mote Platform
Based on Bluetooth Low Energy Technology”.
[6] Valentina Marotto (2013), “Orientation Analysis through a
Gyroscope Sensor for Indoor Navigation Systems”,
SENSORDEVICES 2013: The Fourth International Conference
on Sensor Device Technologies and Applications.