used to predict and estimate the target location and velocity information. The predicted target location will then be used to obtain the predicted target detection probability based on its Monte Carlo sampling.

The paper is organized as follows. The joint sensing mechanism is introduced in Section II. The EKF for fusing sensor measurements and calculating the IQ for target tracking is described in Section III. The collaborative sensing scheme, including the adaptive sensor scheduling, the target detection model, and the target detection probability with its corresponding Monte Carlo calculation method, is detailed in Section IV. Simulation results are reported in Section V. Finally, conclusions and future work are presented in Section VI.

II. JOINT SENSING

In this paper, we assume that each ultrasonic sensor is equipped with both a sound wave emitter and a receiver, and that all the sensors in the network are homogeneous and time synchronized.

Normally an ultrasonic sensor adopts the active sensing mechanism: the sensor emits a sound wave and measures the echo reflected from the target. The time of flight (TOF) is converted into range information towards the target.

In this paper, we adopt a simplified cone-shaped detection region model for a typical ultrasonic sensor, where one ultrasonic sensor i is characterized by its location (x_si, y_si), orientation θ_i, detection angle α, and detection range d. The TOF equals the round-trip time of the wave from the emitting sensor to the target and back to the emitting sensor, which corresponds to a round-trip distance of the sound wave bounded by 2d.

As shown in Fig. 1, only when the target is within the detection region (e.g., at location A) of emitting ultrasonic sensor 3 can this sensor obtain its measurement individually. Sensor 3 can never detect the target when the target is outside its detection region (e.g., at location B), although the sound wave can still reach ultrasonic sensor 5 after being reflected from the target (because the total trip distance is less than 2d). This signal received by ultrasonic sensor 5 is simply discarded in [5, 8]. However, we found that such a signal can be very useful as a sensor measurement; we therefore say that ultrasonic sensor 5 can jointly sense the target with the emitting sensor 3. Fig. 1 also shows that when the target is located in the detection region of the emitting sensor 3 (e.g., at location A), joint sensing can also be done by sensor 2. Note that sensor 6 can never jointly sense any target with sensor 3, as the sound wave from ultrasonic sensor 3 can always reach ultrasonic sensor 6 directly, whether or not there are targets in the network.

In this paper, we assume that the target can be jointly sensed by two sensors if the following joint sensing conditions are satisfied:
1. The target is within the detection angles of both sensors;
2. The sum of distances from the target to the sensors is less than 2d;
3. The two sensors are not within line of sight of each other (i.e., not within the detection angle of each other).

Figure 1. Joint sensing

According to the above joint sensing conditions, no matter which of the two sensors is the emitter, the signal can be received by the other sensor.

As an example, Fig. 2 shows the joint sensing region of sensors 1 and 2 when sensor 1 is the emitting sensor and sensor 2 is the receiving sensor. The ellipse consists of all points whose sum of distances to sensor 1 and sensor 2 is 2d. The target must be inside this ellipse if sensor 1 and sensor 2 can jointly sense the target.

Figure 2. Joint sensing region

Figure 3. Joint sensing region of sensors 1, 2 and 3

Authorized licensed use limited to: Sethu Institute of Technology. Downloaded on July 08, 2010 at 05:21:03 UTC from IEEE Xplore. Restrictions apply.
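The three joint sensing conditions can be expressed directly in code. The following is a minimal Python sketch (not from the paper): sensors are assumed to be given as (x, y, orientation) tuples with half detection angle `alpha` and detection range `d`, and we read condition 3 as requiring that neither sensor lie within the other's detection angle.

```python
import math

def _angle_ok(sensor, point, alpha):
    """True if `point` lies within ±alpha of the sensor's orientation."""
    xs, ys, theta = sensor
    bearing = math.atan2(point[1] - ys, point[0] - xs)
    # wrap the angular offset into [-pi, pi] before comparing with alpha
    diff = (bearing - theta + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= alpha

def can_jointly_sense(s1, s2, target, alpha, d):
    """Check joint sensing conditions 1-3 for sensors s1, s2 = (x, y, orientation)."""
    # Condition 1: target within the detection angles of both sensors.
    if not (_angle_ok(s1, target, alpha) and _angle_ok(s2, target, alpha)):
        return False
    # Condition 2: sum of distances to the two sensors less than 2d
    # (i.e., the target lies inside the ellipse with the sensors as foci).
    d1 = math.hypot(target[0] - s1[0], target[1] - s1[1])
    d2 = math.hypot(target[0] - s2[0], target[1] - s2[1])
    if d1 + d2 >= 2 * d:
        return False
    # Condition 3: neither sensor lies within the other's detection angle.
    if _angle_ok(s1, (s2[0], s2[1]), alpha) or _angle_ok(s2, (s1[0], s1[1]), alpha):
        return False
    return True
```

For instance, with a sensor at (70, 0) facing 90° and one at (300, 190) facing 180° (half angle 22°, range 300), a target at (100, 160) satisfies all three conditions, while one at (250, 160) falls outside the first sensor's detection angle.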
In Fig. 2, areas a and b can only be sensed by sensor 1 individually: any point in area a is not in the detection angle of sensor 2 (i.e., it does not satisfy joint sensing condition 1), and the sum of the distances from any point in area b to sensor 1 and sensor 2 is larger than 2d (i.e., it does not satisfy joint sensing condition 2). Areas c, d, and e can be jointly sensed by sensor 1 and sensor 2, as any point in them satisfies all three joint sensing conditions. Similarly, areas f and g can not be jointly sensed by sensor 1 and sensor 2.

The original detection region of sensor 1 by individual sensing is the union of areas a, b, c and d. With joint sensing, a target located in area e can also be jointly sensed, which indicates that joint sensing can increase the detection region of individual sensors. In addition, if the target is located in area c or d, we can obtain two sensor measurements: one is the distance from sensor 1 to the target, and the other is the sum of the distances from sensor 1 to the target and from the target to sensor 2.

As another example, Fig. 3 shows the joint sensing region of sensors 1, 2 and 3 when sensor 1 is the emitting sensor. The detection region of joint sensing is irregular and much larger than the original individual sensor detection region.

III. EKF TRACKING ALGORITHM AND IQ

The EKF algorithm is adopted to fuse the joint sensing measurements and the measurements taken at different time steps.

The following constant velocity target motion model is used in this paper:

\[
X(k+1) = F(k)\,X(k) + G(k)\,U(k) \qquad (1)
\]

with

\[
X(k) = \begin{bmatrix} x(k) \\ x_v(k) \\ y(k) \\ y_v(k) \end{bmatrix},\quad
F(k) = \begin{bmatrix} 1 & \Delta t_k & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & \Delta t_k \\ 0 & 0 & 0 & 1 \end{bmatrix},\quad
G(k) = \begin{bmatrix} \Delta t_k^2/2 & 0 \\ \Delta t_k & 0 \\ 0 & \Delta t_k^2/2 \\ 0 & \Delta t_k \end{bmatrix},\quad
U(k) = \begin{bmatrix} u_x(k) \\ u_y(k) \end{bmatrix}
\]

where x(k) and y(k) are the x- and y-coordinates of the target at time step k; x_v(k) and y_v(k) are respectively the velocities of the target along the x- and y-directions at time step k; and Δt_k is the time difference between the measurement times at steps k and k+1. Here U(k) is the Gaussian white acceleration noise with zero mean and covariance matrix Q.

Suppose at time step k, sensor s(k) is the emitting sensor, and the target is jointly sensed by s(k) and sensors s_1(k), s_2(k), ..., s_m(k); then the nonlinear observation model is

\[
Z(k) = h_k\big(X(k), X_s(k), X_{s_1}(k), \ldots, X_{s_m}(k)\big) + V(k)
= \begin{bmatrix} h\big(X(k), X_s(k), X_{s_1}(k)\big) \\ h\big(X(k), X_s(k), X_{s_2}(k)\big) \\ \vdots \\ h\big(X(k), X_s(k), X_{s_m}(k)\big) \end{bmatrix} + V(k) \qquad (2)
\]

with

\[
h\big(X(k), X_s(k), X_{s_j}(k)\big) = \sqrt{\big(x(k)-x_s(k)\big)^2 + \big(y(k)-y_s(k)\big)^2} + \sqrt{\big(x(k)-x_{s_j}(k)\big)^2 + \big(y(k)-y_{s_j}(k)\big)^2} \qquad (3)
\]

where X_s(k) = (x_s(k), y_s(k)) is the known location of sensor s(k), and X_{s_j}(k) = (x_{s_j}(k), y_{s_j}(k)) is the location of sensor s_j(k). Here V(k) is the Gaussian white measurement noise of the receiving sensors with zero mean and variance R(k).

To be consistent, when the target can be sensed by the emitting sensor s(k) itself, the measurement equation (3) also applies, with the emitting sensor and the receiving sensor being the same.

Given the estimate X(k|k) of X(k) at time step k, the EKF algorithm consists of a prediction phase, which calculates the predicted state X(k+1|k) and its corresponding prediction error covariance P(k+1|k), and a measurement update phase, which obtains the new state estimate X(k+1|k+1) and its corresponding error covariance matrix P(k+1|k+1) from X(k+1|k) and P(k+1|k) using the sensor measurements Z(k+1). The detailed EKF operations can be found in [11] and are omitted here due to space limitations.

Based on the state estimation, various measures can be defined for the IQ, i.e., the tracking accuracy, such as the trace or the determinant of the covariance matrix, the eigenvalues of the difference between the desired and predicted covariance matrices, and the entropy of the state distribution. In this paper, the IQ φ(k) at time step k is defined as the trace of the covariance matrix, i.e.,

\[
\phi(k) = \mathrm{Trace}\big(P(k \mid k)\big). \qquad (4)
\]

IV. COLLABORATIVE SENSING AND ADAPTIVE SENSOR SCHEDULING

A. Inter-Sensor Interference and Sensor Scheduling

A serious problem in a WSN of active sensors is the inter-sensor interference (ISI) that arises when nearby ultrasonic
sensors emit sound waves simultaneously. Such interference will result in erroneous sensor readings and must be dealt with properly. ISI also introduces a new technological constraint in the design and implementation of a WSN. In this paper, we assume the WSN is deployed in a small area where the sensor nodes are in the interference range of each other, and only single-target tracking is considered. To avoid ISI, at each time step only one emitting sensor is scheduled, and the other sensors participate in joint sensing with the scheduled emitting sensor.

Periodic sensor scheduling is used in [6, 9], where time is divided into periodic cycles. Within a cycle, a predefined duration (called a time slot) is assigned to each ultrasonic sensor for sensing, during which it can work properly without interference from other sensors.

A critical drawback of periodic sensor scheduling is that a detection may be missed when a scheduled sensor can not generate effective joint sensing measurements, which results in lower tracking accuracy. This problem is shown in Fig. 4 for a WSN with six ultrasonic sensors. The discretized target trajectory is displayed as circles with the scheduled sensors indicated beside the trajectory points; a trajectory point is displayed as a shaded circle if it can be successfully detected by joint sensing of the sensors, and as a non-shaded circle otherwise. In scheduling cycle i, identified by the solid ellipse, the sensors are scheduled from sensor 1 to sensor 6. However, only scheduled sensors 1, 3, 5, and 6 generate effective detections, whereas sensors 2 and 4 generate empty detections. For example, the first trajectory point with scheduled emitting sensor 1 is associated with two measurements, by sensors 1 and 6. The second trajectory point with scheduled emitting sensor 2 can not be detected, as it is outside the detection angle of sensor 2. The third trajectory point can be detected by joint sensing of sensor 3 and sensor 5. Similarly, in scheduling cycle i+1, only scheduled sensors 2 and 6 generate effective detections, whereas sensors 1, 3, 4, and 5 generate empty detections.

To overcome the above drawback of periodic sensor scheduling, we introduce adaptive sensor scheduling, which selects the emitting sensor for the next time step according to the predicted target location and the sensing regions of the sensors.

Figure 4. Effective and missing detections in periodic sensor scheduling

B. Collaborative Sensing

In this paper, collaborative sensing stands for joint sensing together with joint sensing enabled adaptive sensor scheduling.

Either a centralized or a distributed target tracking structure can be adopted, depending on whether the fusion centre is the centralized management centre or the scheduled sensor. At each time step, the scheduled sensor emits the sound wave, and all other sensor nodes that can perform joint sensing with the emitting sensor collect the measurements and forward them to the fusion centre. The fusion centre runs the EKF to update the state estimate using the new measurements and schedules the emitting sensor for the next time step. It then informs the scheduled sensor to perform the emitting operation in the next time step, together with the state estimate and covariance matrix information in the distributed structure. We assume that the fusion centre knows the location and orientation of each sensor. In the distributed structure, this means that each sensor knows such information of every node, because each sensor may become the fusion centre.

Different measures can be used as the performance indices to select the emitting sensor, including the joint sensing detection probability, tracking accuracy, and energy efficiency. However, calculating these performance indices under the joint sensing mechanism is not an easy task. For simplicity, and for ease of comparison with the individual sensing scheme, in this paper we schedule the emitting sensor according to the individual sensor detection probability.

Due to the uncertainties in the target motion model, such as target maneuvering, even with adaptive sensor scheduling it is still possible that the scheduled sensor can not detect the target. If this happens, the fusion centre uses the predicted state and its covariance matrix as the estimation result.

C. Detection Probability for Individual Sensing

Suppose s_j is the emitting sensor. For a given target location X = (x, y), the target can be detected by s_j individually if it is in the detection region of s_j. Mathematically, the following target detection model is used:

\[
p_{s_j}(x, y) = \begin{cases} 1, & \text{if } (x, y) \text{ is in the detection region of } s_j \\ 0, & \text{otherwise.} \end{cases} \qquad (5)
\]

Without loss of generality, in this paper we suppose the target location is 2-dimensional. For emitting sensor selection, the prediction of the detection probability of a given emitting sensor is required. Denote the prediction of the target location and its covariance as μ and Σ, which are the sub-vector of X(k+1|k) and the sub-matrix of P(k+1|k) respectively. Then the probability density function (PDF) of the location X = (x, y) is

\[
f(x, y) = \frac{1}{2\pi\,|\Sigma|^{1/2}} \exp\!\left(-\frac{1}{2}(X-\mu)^T \Sigma^{-1} (X-\mu)\right). \qquad (6)
\]
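The detection model (5) and the location PDF (6) can be sketched in a few lines of Python. This is an illustrative sketch, not the paper's implementation: the sensor tuple layout and parameter names are ours, and the final lines preview the sampling-based estimate of the predicted detection probability that Section IV-D develops.

```python
import math
import numpy as np

def p_detect(sensor, x, y, alpha, d):
    """Detection model (5): 1 if (x, y) lies inside the cone-shaped
    detection region of `sensor` = (x_s, y_s, orientation), else 0."""
    xs, ys, theta = sensor
    if math.hypot(x - xs, y - ys) > d:
        return 0
    bearing = math.atan2(y - ys, x - xs)
    # wrap the angular offset into [-pi, pi] before comparing with alpha
    diff = (bearing - theta + math.pi) % (2 * math.pi) - math.pi
    return 1 if abs(diff) <= alpha else 0

def location_pdf(x, y, mu, Sigma):
    """Bivariate Gaussian PDF (6) of the predicted target location."""
    v = np.array([x, y]) - mu
    return math.exp(-0.5 * v @ np.linalg.inv(Sigma) @ v) / (
        2 * math.pi * math.sqrt(np.linalg.det(Sigma)))

# Sampling-based estimate of the predicted detection probability:
# draw K samples from N(mu, Sigma), each carrying PMF weight 1/K,
# and average the detection indicator over them.
rng = np.random.default_rng(0)
mu, Sigma = np.array([150.0, 0.0]), 25.0 * np.eye(2)
samples = rng.multivariate_normal(mu, Sigma, size=2000)
p_hat = sum(p_detect((0.0, 0.0, 0.0), sx, sy, math.radians(22), 300)
            for sx, sy in samples) / len(samples)
```

With the predicted location centred well inside the sensor's cone, as here, the estimate `p_hat` is close to 1; a sensor facing away from the predicted location would yield an estimate close to 0.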
Figure 5. Predicted detection probability for joint sensing

Figure 6. Approximating the Gaussian distribution by random samples

Fig. 5 shows the current state estimation at step k and the state prediction for step k+1 by 3σ ellipses, as well as the sensing region of sensor s_j. In general, the prediction of the joint sensing detection probability by using s_j for the next time step will be

\[
\bar{p}_{s_j} = \iint f(x, y)\, p_{s_j}(x, y)\, dx\, dy. \qquad (7)
\]

D. Monte Carlo Method for Approximation of Detection Probability

Unfortunately, as shown in Fig. 5, it is difficult to calculate the joint sensing detection probability analytically for a given emitting sensor. Therefore, we adopt the Monte Carlo simulation method and generate random samples to approximate the Gaussian distribution in equation (6) of the target location prediction. Fig. 6 illustrates an example of approximating a two-dimensional location Gaussian distribution by random samples. Suppose the total number of samples used is K; then the Gaussian distribution is approximated by a discrete probability mass function (PMF) P(x_i, y_i) with value 1/K for each sample location. Accordingly, the detection probability of s_j is approximated as

\[
\bar{p}_{s_j} \approx \frac{1}{K} \sum_{i=1}^{K} p_{s_j}(x_i, y_i).
\]

E. Emitting Sensor Selection

In this paper, the emitting sensor is selected as the sensor with the maximal detection probability for individual sensing. After the emitting sensor is selected and activated for emitting the wave, in the individual sensing scheme only the emitting sensor can take the measurement, whereas in the joint sensing scheme multiple sensors can take measurements simultaneously.

V. SIMULATION RESULTS

Simulation experiments are conducted to compare the joint sensing and individual sensing schemes, both based on adaptive sensor scheduling. As shown in Fig. 7, the monitored field is a 300 cm × 300 cm square area. The bottom-left corner of the field has coordinate (0, 0), whereas the upper-right corner has coordinate (300, 300). Each ultrasonic sensor has a maximal sensing range of 300 cm and a maximal measurement angle of ±22°. There are eight ultrasonic sensors located along the edges of the area, with coordinates (70, 0), (190, 0), (300, 70), (300, 190), (230, 300), (110, 300), (0, 230) and (0, 110) respectively. The orientations of the sensors are respectively 90°, 90°, 180°, 180°, 270°, 270°, 0° and 0°, such that the field is mostly covered and no two sensors are in the detection region of each other. In this setup, each sensor is in the ISI region of every other sensor. We assume the duration of a time slot is 100 ms.

The target moves along a straight line as shown in Fig. 7 with speed 100 cm/second. Q is set to 1.57 × 10^6 I, where I is the identity matrix. Periodic sensor scheduling is used for initial detection of the target and to initiate the tracking procedure. The initial location estimate of the target is set to the point along the central line of the beam pattern of the detecting sensor, with the distance to the detecting sensor equal to the initial measurement. The initial velocity estimate of the target is set to 0. The initial covariance can be set heuristically according to the orientation and measurement of the detecting sensor.

Typical estimated trajectories of adaptive sensor scheduling using individual sensing and joint sensing are shown in Fig. 7. Clearly, the estimated trajectory by joint sensing is much closer to the true target trajectory than that of the individual sensing scheme.

The evolutions of the tracking errors, i.e., the IQ, are shown in Fig. 8, and the gain of joint sensing is observed. The maximal and averaged tracking errors of individual sensing are about 25 cm and 10.01 cm respectively, whereas those of joint sensing are about 10 cm and 3.66 cm respectively, a significant improvement.

The improvement of the IQ of joint sensing over individual sensing is due to the increased detection region and the additional simultaneous measurements. Fig. 9 shows the number of sensors used for taking measurements at each time step. Because individual sensing is a specific joint sensing scenario in which the emitting sensor is the same as the receiving sensor, the number of sensors of the individual sensing scheme (being one by default) is always smaller or
equal to that of the joint sensing scheme. We can find that for most of the time steps, the joint sensing scheme obtains more than two simultaneous measurements, except for time step 22 and time step 25, with no measurement and one measurement respectively. However, the individual sensing scheme does not obtain any measurements from time steps 16-24, except at time step 23, although one emitting sensor is scheduled at each time step. For time steps 16-21 and time step 22, joint sensing has more than two measurements whereas individual sensing has none, which demonstrates that joint sensing can increase the detection region significantly.

VI. CONCLUSIONS

A novel collaborative sensing scheme is proposed for the target tracking application in WSNs through joint sensing and adaptive sensor scheduling. The proposed scheme can increase the detection region of an individual sensor and introduce more simultaneous sensor measurements for a single sensing operation. It is shown by simulations that the IQ of the WSN can be improved significantly using joint sensing. Future research issues include sensor scheduling for joint sensing in large-scale WSNs, adaptive tracking algorithms for highly maneuvering targets, joint sensing for multi-target tracking, as well as real test-bed development.

Figure 7. Estimated trajectories by adaptive sensor scheduling

Figure 8. Evolutions of tracking errors for adaptive sensor scheduling

Figure 9. Number of simultaneous measurements obtained by adaptive sensor scheduling

REFERENCES

[1] A. Tolstikov, W. Xiao, J. Biswas, S. Zhang, and C. K. Tham, "Information Quality Management in Sensor Networks based on the Dynamic Bayesian Network Model," 3rd International Conference on Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP 2007), Dec. 2007, pp. 751-756.
[2] C. Bisdikian, "On Sensor Sampling and Quality of Information: a Starting Point," in Proc. IEEE PERCOM Workshops, Mar. 2007, pp. 279-284.
[3] J. Wang, Y. Liu, and S. K. Das, "Improving Information Quality of Sensory Data through Asynchronous Sampling," 1st International Workshop on Information Quality and Quality of Service for Pervasive Computing (IQ2S 2009) in PerCom 2009, Mar. 2009, pp. 1-6.
[4] F. Zhao, J. Liu, J. Liu, L. Guibas, and J. Reich, "Collaborative Signal and Information Processing: an Information Directed Approach," Proc. IEEE, vol. 91, Aug. 2003, pp. 1199-1209.
[5] W. Xiao, S. Zhang, J. Lin, and C. K. Tham, "Energy-Efficient Adaptive Sensor Scheduling for Target Tracking in Wireless Sensor Networks," Journal of Control Theory and Applications, vol. 8, Jan. 2010, pp. 86-92.
[6] W. Xiao, J. Wu, L., and L. Dong, "Sensor Scheduling for Target Tracking in Networks of Active Sensors," Acta Automatica Sinica, vol. 32, Nov. 2006, pp. 922-928.
[7] J. Lin, W. Xiao, F. Lewis, and L. Xie, "Energy Efficient Distributed Adaptive Multi-Sensor Scheduling for Target Tracking in Wireless Sensor Networks," IEEE Transactions on Instrumentation and Measurement, vol. 58, Jun. 2009, pp. 1886-1896.
[8] L. Chen, B. K. Szymanski, and J. W. Branch, "Quality-Driven Congestion Control for Target Tracking in Wireless Sensor Networks," 5th IEEE International Conference on Mobile Ad Hoc and Sensor Systems (MASS 2008), Sept.-Oct. 2008, pp. 766-771.
[9] W. Xiao, J. K. Wu, L. Shue, Y. Li, and L. Xie, "A Prototype Ultrasonic Sensor Network for Tracking of Moving Targets," 1st IEEE Conference on Industrial Electronics and Applications (ICIEA 2006), May 2006, pp. 1511-1516.
[10] Y. K. Toh, W. Xiao, and L. Xie, "A Wireless Sensor Network Target Tracking System with Distributed Competition based Sensor Scheduling," 3rd International Conference on Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP 2007), Dec. 2007, pp. 257-262.
[11] Y. Bar-Shalom, X. R. Li, and T. Kirubarajan, Estimation with Applications to Tracking and Navigation. New York: John Wiley & Sons, 2001.