Answering Complex Location-Based Queries with Crowdsourcing
CollaborateCom 2013 presentation

  1. ANSWERING COMPLEX LOCATION-BASED QUERIES WITH CROWDSOURCING
     Karim Benouaret, Raman Valliyur-Ramalingam, François Charoy
     Inria – Université de Lorraine – CNRS, LORIA, Service and Cooperation Team
  2. Nancy
  3. Ask the crowd to contribute
  4. How to do that cost-effectively?
     • Express the problem (the query)
     • Transform it into something executable
     • Manage the execution
     • Evaluate the result
  5. A Query
     • < Object = roads,
         Context = need repair,
         Location = Nancy,
         Assessment = {not damaged, damaged, very damaged},
         Start Date = 10/19/2013,
         End Date = 10/25/2013,
         Strategy = Deadline >
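A query like this maps naturally onto a small structured record. Here is a minimal Python sketch of one possible representation; the class and field names are assumptions drawn from the tuple on the slide, not the authors' actual schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CrowdQuery:
    """One location-based crowdsourcing query, mirroring the tuple on the slide."""
    obj: str                # what to photograph, e.g. "roads"
    context: str            # condition of interest, e.g. "need repair"
    location: str           # target area
    assessment: list[str]   # allowed labels for the voting step
    start_date: date
    end_date: date
    strategy: str           # "Deadline", "Buffer", or "FIFO"

query = CrowdQuery(
    obj="roads",
    context="need repair",
    location="Nancy",
    assessment=["not damaged", "damaged", "very damaged"],
    start_date=date(2013, 10, 19),
    end_date=date(2013, 10, 25),
    strategy="Deadline",
)
```

A record like this is what the system would then transform into an executable process (the Collect → Select → Assess activities of the following slides).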
  6. Collect
  7. Clustering
  8. Select
  9. Assess
  10. A Process
      • 3 crowdsourcing activities
  11. Strategies
      • Deadline
        • One after the other
      • Buffer
        • Start the voting activities when k photos are available
      • FIFO
        • Start the voting activity as soon as a photo is available
        • Wait for k/2 votes
      [Diagram: C → S → A (Collect → Select → Assess) pipeline for each strategy]
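The three strategies differ only in *when* collected photos are handed to the voting activities. A hedged sketch of that trigger logic (the function and parameter names are mine, not from the presentation):

```python
def voting_batches(photos, strategy, k):
    """Yield batches of photos to send to the voting activities.

    Deadline: one batch with everything, after collection closes.
    Buffer:   a batch each time k new photos have accumulated.
    FIFO:     a batch per photo, as soon as it arrives.
    """
    if strategy == "Deadline":
        yield list(photos)
    elif strategy == "Buffer":
        buf = []
        for p in photos:
            buf.append(p)
            if len(buf) == k:
                yield buf
                buf = []
        if buf:  # flush the remainder when the deadline is reached
            yield buf
    elif strategy == "FIFO":
        for p in photos:
            yield [p]

# 5 photos with buffer size k=2 -> batches of sizes 2, 2, 1
print([len(b) for b in voting_batches(range(5), "Buffer", 2)])  # [2, 2, 1]
```

The trade-off the experiments explore falls directly out of this: Deadline maximizes information per voting round but delays results, while FIFO produces results early at the cost of more (and smaller) voting rounds.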
  12. Experimentation
      • Understand the behavior of each strategy
      • Subset of the Gowalla data set
        • Check-ins at different places
      • The ground truth is generated
        • Participants have a probability of giving a wrong answer
      • Variables
        • Number of days of the experiment
        • Number of votes (k) required for each place and photo to be selected
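The voter model on this slide can be sketched concretely: each simulated participant answers correctly except with some error probability, and the assessed label is the majority over k votes. This is a minimal illustration of that setup, with parameter names of my own choosing:

```python
import random
from collections import Counter

def simulate_assessment(truth, labels, k, p_err, rng):
    """Collect k votes on one photo; each voter is wrong with probability
    p_err, picking a uniformly random wrong label in that case.
    Returns the majority label."""
    votes = []
    for _ in range(k):
        if rng.random() < p_err:
            votes.append(rng.choice([l for l in labels if l != truth]))
        else:
            votes.append(truth)
    return Counter(votes).most_common(1)[0][0]

rng = random.Random(42)
labels = ["not damaged", "damaged", "very damaged"]
answers = [simulate_assessment("damaged", labels, k=7, p_err=0.2, rng=rng)
           for _ in range(1000)]
accuracy = answers.count("damaged") / len(answers)
print(accuracy)  # typically well above 0.9 for k=7, p_err=0.2
```

Varying k and p_err in a loop like this reproduces the shape of the quality-vs-number-of-votes trade-off the following slides plot.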
  13. Number of results vs. number of days
  14. Evolution of the number of results
  15. Quality vs. duration
  16. Quality vs. number of votes
  17. Conclusion
      • Promising preliminary results
      • Interpretation of context-aware crowdsourcing queries requires more work
      • Crowdsourcing process orchestration is difficult
        • Large scale
        • Not sequential?
      • Different strategies lead to different results
        • Quality vs. number of results
      • Evaluating the results remains an open issue
  18. Current work
      • Implementation of the process on a real BPM system
      • Deployment on AWS EC2 and S3
      • Prepare experimentation with the Lorraine Smart City Living Lab
  19. Questions?
  20. Current structure of the system
      [Diagram: Network service, Data Quality service, Mobile app service, Crowd Management service, Task Management, Orchestration Engine, Data Production]
  21. Collection
  22. Selection
  23. Assessment
