
[UbiComp'15] SakuraSensor: Quasi-Realtime Cherry-Lined Roads Detection through Participatory Video Sensing by Cars

SakuraSensor is a system that senses and shares information about roads lined with flowering cherries by leveraging car-mounted smartphones.
Honorable Mention Award at UbiComp 2015.


  1. SakuraSensor: Quasi-Realtime Cherry-Lined Roads Detection through Participatory Video Sensing by Cars
     Shigeya Morishita†, Shogo Maenaka†, Daichi Nagata†, Morihiko Tamai†, Keiichi Yasumoto†, Toshinobu Fukukura‡, Keita Sato‡
     †Nara Institute of Science and Technology, ‡DENSO CORPORATION
  2. Latest car navigation systems
     • Help drivers search for comfortable & efficient routes
     • Criteria
       – Traveling distance
       – Traveling time
       – Toll / toll-free
       – Fuel efficiency
       – Scenic beauty
     [Figure: example routes © NAVITIME (http://products.navitime.co.jp/function/2519.html): toll-free, fuel-efficient, minimum-distance, toll, and scenic]
  3. Scenic route search
     Problems of existing services
     • Information is edited manually
       – Small number of scenic spots
       – Low update frequency
     • Scenery information consists only of text and images
       – Insufficient for users
     Our approach
     • Use participatory sensing by cars
     • Collect and share videos of scenic spots
     [Figure: example of scenic spot information]
  4. Related work
     Comparison with ParkNet [12], SignalGuru [15], and Nericell [3]:
     • Participatory sensing: ○ ○
     • Cooperative sensing: ○
     • Real-time: ○ for all four methods
     • Information detection from videos: proposed method ○; ParkNet × (ultrasound signals); SignalGuru △ (traffic signals); Nericell × (horn sounds)
     [12] ParkNet: Drive-by Sensing of Road-Side Parking Statistics, MobiSys ’10
     [15] SignalGuru: Leveraging Mobile Phones for Collaborative Traffic Signal Schedule Advisory, MobiSys ’11
     [11] Nericell: Rich Monitoring of Road and Traffic Conditions using Mobile Smartphones, SenSys ’08
     There are many existing studies on participatory sensing (PS) by cars, but none uses both PS and real-time video sensing.
  5. SakuraSensor: automatically identifies the locations of scenic spots and collects their videos using PS
     • We target cherry-lined roads
     • Automatically collects and updates scenic information
     • Gathers videos of scenic locations
     The best viewing period of flowering cherries is short and uncertain, varying from year to year and from place to place.
  6. SakuraSensor app for iOS devices
     Full-size video: https://youtu.be/2pRfDS7DeAc
     Demo at Hall C, No. 20
  7. Key idea
     Naive approach: each car with a smartphone records video and uploads the whole recording to the cloud, which analyzes it and shares the videos with cherries
     → too much cost in cellular bandwidth and in computation resources at the cloud
     SakuraSensor: each car records and analyzes the video on the smartphone, uploads only the videos with flowering cherries, and the cloud simply shares them
  8. Technical challenges
     TC1: Real-time flowering cherry detection by smartphone
     TC2: Efficient load distribution among cars
  9. TC1: Real-time cherry detection
     • Employ simple computer vision techniques
       – A smartphone has lower computation power than a PC or the cloud
     Basic approach
     • Count cherry-like color pixels in each image
     • Quantify the amount of flowering cherry as a "cherry intensity"
     Problem to solve
     • Artificial objects with a similar color must be removed
  10. Step 1: Removing artificial objects
      • Employ fractal analysis
        – Note: natural objects have higher fractal dimensions than artificial ones
      [Figure: an input image → binary image after edge detection → box-counting method [5] → fractal dimensions]
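The slides give no code for this step; the following is a minimal sketch of per-region fractal-dimension estimation by box counting over an edge map. The Canny thresholds and box sizes are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (assumptions noted above), not the authors' implementation.
import cv2
import numpy as np

def fractal_dimension(gray_region, box_sizes=(2, 4, 8, 16, 32)):
    """Box-counting dimension of the edge map of a grayscale image region."""
    edges = cv2.Canny(gray_region, 100, 200)          # binary edge image
    counts = []
    for s in box_sizes:
        h, w = edges.shape
        n = 0
        # count s x s boxes containing at least one edge pixel
        for y in range(0, h, s):
            for x in range(0, w, s):
                if edges[y:y + s, x:x + s].any():
                    n += 1
        counts.append(max(n, 1))                       # guard against log(0)
    # slope of log(count) vs. log(1/size) estimates the fractal dimension
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope
```

Regions whose estimated dimension exceeds a chosen threshold would be kept as natural; the threshold itself is not given on the slides.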
  11. Real-time fractal dimension calculation
      [Figure: red regions show natural objects]
  12. Step 2: Detecting cherry by color analysis
      • Created a color histogram of flowering cherry in the HSV color space
      • Used 148 cherry regions extracted from various scenes
  13. HSV color space
      • H (hue) and S (saturation) characterize the color
      • V (value of brightness) varies significantly depending on the lighting condition
      → Our approach uses only the H-S color space
      [Figure: HSV color space, from http://en.wikipedia.org/wiki/HSL_and_HSV]
  14. H-S histogram for flowering cherry
      • Created from a total of 148 cherry regions
      • The value at each (H, S) coordinate is normalized between 0 and 1
      [Figure: H-S histogram, H axis 0–179, S axis 0–255]
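As a concrete illustration, here is a minimal sketch of building such a normalized H-S histogram with OpenCV; the function name and bin counts are our assumptions (OpenCV's H range of 0–179 and S range of 0–255 match the axes above).

```python
# Minimal sketch, not the released SakuraSensor code.
import cv2
import numpy as np

def build_hs_histogram(cherry_patches_bgr, h_bins=180, s_bins=256):
    """Accumulate an H-S histogram over BGR patches of flowering cherry."""
    hist = np.zeros((h_bins, s_bins), dtype=np.float32)
    for patch in cherry_patches_bgr:
        hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)
        # 2-D histogram over H and S only; V is dropped because it
        # varies strongly with the lighting condition
        hist += cv2.calcHist([hsv], [0, 1], None, [h_bins, s_bins],
                             [0, 180, 0, 256])
    # assumes at least one non-empty patch; scale every bin into [0, 1]
    return hist / hist.max()
```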
  15. Step 3: Calculating the cherry intensity of an image
      • Use the back-projection method [6]: each pixel of the input image is replaced by the histogram value at its (H, S) coordinate
        – e.g., a pixel with (H, S) = (30, 20) gets the value 0.816
      • Cherry intensity = average value over all pixels
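A minimal sketch of this step, assuming the normalized histogram from the previous sketch; the direct table lookup below is equivalent to back-projection (cv2.calcBackProject performs the same operation when the histogram is scaled to 0–255).

```python
# Minimal sketch, not the authors' implementation.
import cv2
import numpy as np

def cherry_intensity(frame_bgr, hs_hist):
    """hs_hist: 180x256 H-S histogram with values normalized to [0, 1]."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    h = hsv[:, :, 0].astype(np.intp)       # hue channel, 0-179
    s = hsv[:, :, 1].astype(np.intp)       # saturation channel, 0-255
    # back-projection: each pixel is replaced by the histogram value at
    # its (H, S) coordinate, e.g. (30, 20) -> 0.816 on the slide
    responses = hs_hist[h, s]
    return float(responses.mean())         # average over all pixels
```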
  16. Real-time cherry intensity calculation
      [Figure: red boxes show regions with high cherry intensity]
  17. TC2: Load distribution among cars
      • If all cars always conduct image analysis & uploads → too much cost (battery consumption, bandwidth, etc.)
      • Possible approach: each car senses at a fixed interval → may miss PoIs (cherry locations)
  18. k-stage sensing
      Narrows the sensing interval step by step when a new PoI is found
      [Figure: the preceding car senses at a fixed interval (1st stage) at marked locations until a PoI is detected]
  19. k-stage sensing (cont.)
      Narrows the sensing interval step by step when a new PoI is found
      [Figure: within a radius around the PoI detected by the preceding car, a following car traveling the same road senses at a shorter interval (2nd stage)]
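One way to read the rule on these two slides is sketched below: a car uses the fixed first-stage interval by default and, near a PoI shared by a preceding car, moves to the next, shorter stage (up to k stages). The stage intervals follow the simulation setting on slide 27; the 500 m radius, the flat-earth distance, and the idea that shared PoIs carry their detection stage are all our assumptions.

```python
# Sketch of one possible k-stage sensing rule (our assumptions, not the paper's code).
import math

STAGE_INTERVALS_M = [300.0, 150.0, 50.0]   # k = 3 stages, as in the simulation
POI_RADIUS_M = 500.0                        # assumed radius around a shared PoI

def distance_m(p, q):
    """Rough planar distance in metres between two (lat, lon) points."""
    dy = (p[0] - q[0]) * 111_000.0
    dx = (p[1] - q[1]) * 111_000.0 * math.cos(math.radians(p[0]))
    return math.hypot(dx, dy)

def sensing_interval(position, shared_pois):
    """shared_pois: (lat, lon, stage) tuples received from the cloud.
    Returns the distance interval (metres) at which this car should sense."""
    stage = 0                               # 1st stage: fixed interval
    for lat, lon, poi_stage in shared_pois:
        if distance_m(position, (lat, lon)) <= POI_RADIUS_M:
            # near a PoI found by a preceding car: advance to the next
            # stage, but never beyond the last (k-th) stage
            stage = max(stage, min(poi_stage + 1, len(STAGE_INTERVALS_M) - 1))
    return STAGE_INTERVALS_M[stage]
```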
  20. Evaluation of SakuraSensor
      • Investigate the effectiveness of cherry intensity
        – Compare the results of manual classification and automatic classification by cherry intensity
      [Figure: videos → manual classification (used as ground truth) and classification by cherry intensity → compute accuracy by comparison]
  21. Videos used for experiments
      • Recorded videos in 8 different scenes (routes) with the SakuraSensor app for iOS, using multiple cars
      Scene | Date    | Vehicle | Area        | Length (min.)
      S1    | Mar. 31 | V1      | Aichi Pref. | 17
      S2    | Apr. 5  | V2      | Nara Pref.  | 12
      S3    | Apr. 10 | V2      | Nara Pref.  | 66
      S4    | Apr. 10 | V3      | Nara Pref.  | 261
      S5    | Apr. 10 | V4      | Nara Pref.  | 186
      S6    | Apr. 11 | V1      | Gifu Pref.  | 72
      S7    | Apr. 12 | V2      | Osaka Pref. | 137
      S8    | Apr. 18 | V1      | Aichi Pref. | 89
      • Extracted 1-second videos at random starting times from each scene
  22. Manual classification of 1-second videos
      Class | Criteria
      C1    | cherry ratio (in image) < 5%
      C2    | 5% ≤ cherry ratio < 25%
      C3    | 25% ≤ cherry ratio

      Scene | C1   | C2  | C3
      S1    | 79   | 17  | 10
      S2    | 93   | 10  | 17
      S3    | 372  | 43  | 3
      S4    | 1613 | 96  | 45
      S5    | 1167 | 6   | 0
      S6    | 261  | 47  | 72
      S7    | 888  | 1   | 0
      S8    | 521  | 10  | 7
      Total | 4994 | 230 | 154

      • Only videos for which two persons made the same classification decision were used
  23. Videos of each class
      [Figure: example frames for C1 (ratio < 5%), C2 (5% ≤ ratio < 25%), and C3 (25% ≤ ratio)]
  24. Evaluation methodology
      • The set of 1-second videos is manually classified by humans into classes C1, C2, and C3
      • The videos of each class are divided into halves: a training set and a test set
  25. Evaluation methodology (cont.)
      • From the training set, compute the median cherry intensity of each class: M1 = 0.00033 (C1), M2 = 0.00791 (C2), M3 = 0.03326 (C3)
      • A test video V_i with cherry intensity D(V_i) is classified into the class whose median is closest to D(V_i), i.e. the class minimizing |D(V_i) − M_j|
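This nearest-median rule is small enough to state directly; the sketch below uses the median values from the slide (the function name is ours).

```python
# Minimal sketch of the nearest-median classification rule on this slide.
CLASS_MEDIANS = {"C1": 0.00033, "C2": 0.00791, "C3": 0.03326}  # training medians

def classify_video(d_vi):
    """Return the class whose training median is closest to D(V_i)."""
    return min(CLASS_MEDIANS, key=lambda c: abs(d_vi - CLASS_MEDIANS[c]))

# Example: classify_video(0.025) -> "C3" (closest to M3 = 0.03326)
```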
  26. Classification accuracy (1-second videos)
      • C1 and C3: good results
      • C2: not good enough
      [Chart: precision and recall for each of C1, C2, C3; values shown include 0.97, 0.90, 0.74, 0.83, 0.24, 0.65]
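For completeness, a minimal sketch of how per-class precision and recall can be computed from the manual (ground-truth) and automatic labels of the test videos; this is standard bookkeeping, not the authors' evaluation script.

```python
# Illustrative sketch of the accuracy computation, not the authors' script.
from collections import Counter

def per_class_precision_recall(truth, predicted, classes=("C1", "C2", "C3")):
    """truth/predicted: equal-length lists of class labels for the test videos."""
    pairs = Counter(zip(truth, predicted))
    scores = {}
    for c in classes:
        tp = pairs[(c, c)]                                   # correctly labelled c
        n_predicted = sum(v for (t, p), v in pairs.items() if p == c)
        n_actual = sum(v for (t, p), v in pairs.items() if t == c)
        scores[c] = {"precision": tp / n_predicted if n_predicted else 0.0,
                     "recall": tp / n_actual if n_actual else 0.0}
    return scores
```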
  27. Evaluation of k-stage sensing
      • Simulation with 600 cars (k = 3, intervals 300 m → 150 m → 50 m)
      • Result: fewer sensing operations with a similar PoI discovery rate compared to fixed-interval sensing
  28. Conclusions
      • SakuraSensor
        – A participatory video sensing system by cars
        – Consists of two key techniques
      • Flowering cherry detection by in-vehicle smartphone
        – Color histogram analysis for identifying cherry blossoms
        – Fractal dimension analysis for removing artificial objects other than flowering cherries
        – Cherry detection accuracy (C3): precision of 0.7 and recall of 0.8
      • k-stage sensing
        – Distributes the sensing load among cars
        – Achieves a similar PoI discovery rate with about half the sensing operations of the fixed-interval sensing method
  29. Thank you!
      Demonstration at Hall C, No. 20
