3. Project Objective
• JPL has developed a hand-held methane sniffer that can be deployed on a flying robot.
• Goal: develop an automated system that guides the robot, or the person carrying the sniffer, from first detection to the leak source.
• Input: methane concentration and wind vector at the detection location
• Output: direction to the leak source
4. Foundational Approach
• The approach is a Bayesian model.
• First, define a set of hypotheses (leak location, leak concentration).
• Use the collocated data to update the probability of each hypothesis.
10. Build a virtual world for each hypothesis
• Decompose the map into cells tagged as building, air, and ground.
• Each hypothesis is a leak-location cell together with its leak concentration.
• Spread methane from cell to cell with a Gaussian distribution driven by the calculated wind.
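The cell decomposition and wind-driven Gaussian spread can be sketched as below; the function name, the plume-widening rule, and the grid are illustrative assumptions, not the actual JPL dispersion model:

```python
import math

def predicted_concentration(cell, leak_cell, leak_rate, wind, sigma=1.0):
    """Toy forward model: Gaussian spread of methane from the leak cell,
    advected along the wind vector (all parameters are illustrative)."""
    dx = cell[0] - leak_cell[0]
    dy = cell[1] - leak_cell[1]
    wx, wy = wind
    speed = math.hypot(wx, wy) or 1.0
    # Project the offset onto downwind / crosswind directions.
    downwind = (dx * wx + dy * wy) / speed
    crosswind = (dx * wy - dy * wx) / speed
    if downwind < 0:                      # cell is upwind of the leak
        return 0.0
    spread = sigma * (1.0 + downwind)     # plume widens as it travels downwind
    return (leak_rate / (speed * spread)) * math.exp(
        -crosswind**2 / (2.0 * spread**2))

# One hypothesis = (leak cell, leak rate); evaluate it over a small 5x5 map.
cells = [(x, y) for x in range(5) for y in range(5)]
field = {c: predicted_concentration(c, (0, 2), 10.0, (1.0, 0.0)) for c in cells}
```

Each hypothesis gets its own concentration field like `field`, computed once in its virtual world.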
12. Calculated methane concentration under each hypothesis
[Figure: concentration plots for the virtual worlds of 4 different hypotheses]
• At the initialization stage we normalize the probability of all N hypotheses: p(x_k) = 1/N.
• In the virtual world of each hypothesis, calculate the methane concentration in every cell of the map.
• From these values we can draw the concentration-vs-coordinate plot shown on the left.
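Initializing the uniform prior p(x_k) = 1/N is a one-liner; the four (leak cell, leak concentration) hypotheses below are made up for illustration:

```python
# N hypotheses, each a (leak-cell, leak-concentration) pair; values are illustrative.
hypotheses = [((0, 2), 5.0), ((0, 2), 10.0), ((4, 4), 5.0), ((4, 4), 10.0)]
prior = {h: 1.0 / len(hypotheses) for h in hypotheses}  # p(x_k) = 1/N
```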
14. Model the likelihood using a gamma distribution
• mean_k = concentration under the k-th hypothesis
• x = concentration detected in the real world
• Likelihood_k = Gamma.pdf(mean_k, x)
• Probability_k = Probability_k × Likelihood_k, renormalized over all hypotheses
• Repeat these steps for each detection.
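One detection's Bayesian update can be sketched as follows; the gamma density is hand-rolled with an assumed shape parameter (the deck does not specify one), standing in for a library Gamma.pdf, and the two hypotheses are toys:

```python
import math

def gamma_pdf(mean, x, shape=2.0):
    """Gamma density with the given mean (scale = mean/shape);
    the shape value is an illustrative assumption."""
    if x <= 0 or mean <= 0:
        return 0.0
    scale = mean / shape
    return (x ** (shape - 1) * math.exp(-x / scale)
            / (math.gamma(shape) * scale ** shape))

def bayes_update(prior, predicted, detected):
    """Probability_k ∝ Probability_k · Gamma.pdf(mean_k, x), renormalized."""
    unnorm = {k: p * gamma_pdf(predicted[k], detected) for k, p in prior.items()}
    total = sum(unnorm.values()) or 1.0
    return {k: v / total for k, v in unnorm.items()}

prior = {"A": 0.5, "B": 0.5}          # two toy hypotheses
predicted = {"A": 2.0, "B": 8.0}      # mean_k: model concentration per hypothesis
posterior = bayes_update(prior, predicted, detected=7.5)
```

A detection of 7.5 is far more probable under hypothesis B (mean 8.0) than A (mean 2.0), so B's posterior mass grows.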
17. Entropy for the set of hypotheses
• Treat the probability of each hypothesis as a variable, marked x_i.
• Mark the set of all hypotheses as S.
• The entropy of the set of hypotheses is:
• H(S) = −Σ_i p(x_i) · log_b p(x_i)
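The entropy formula translates directly; base 2 gives bits (the slide leaves the log base b unspecified):

```python
import math

def entropy(probs, base=2.0):
    """H(S) = -Σ_i p(x_i) · log_b p(x_i); zero-probability terms are dropped
    (0 · log 0 is taken as 0)."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)
```

A uniform distribution over four hypotheses gives H = 2 bits; a fully certain one gives 0, so entropy measures how undecided the hypothesis set still is.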
18. Update probability in the future under a specific assumption
• Prepare a location c as a candidate detection point in the future.
• Assume the k-th hypothesis is true for the next detection.
• Calculate the new probability of every hypothesis i under that assumption:
• Likelihood_{k,c}^i = Gamma.pdf(mean_c^i, mean_c^k)
• Probability_{k,c}^i = Probability^i × Likelihood_{k,c}^i
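The hypothetical update can be sketched as below, reusing a hand-rolled gamma density (the shape parameter and the two toy hypotheses are assumptions): when hypothesis k is assumed true, the simulated measurement at c is mean_c^k, and every hypothesis i is scored against it.

```python
import math

def gamma_pdf(mean, x, shape=2.0):
    """Gamma density with the given mean (scale = mean/shape); a stand-in
    for the slide's Gamma.pdf with an assumed shape."""
    if x <= 0 or mean <= 0:
        return 0.0
    scale = mean / shape
    return (x ** (shape - 1) * math.exp(-x / scale)
            / (math.gamma(shape) * scale ** shape))

def hypothetical_posterior(prior, mean_at_c, k):
    """Assume hypothesis k is true at candidate location c, so the simulated
    detection is mean_at_c[k]; Likelihood_{k,c}^i = Gamma.pdf(mean_c^i, mean_c^k)."""
    unnorm = {i: p * gamma_pdf(mean_at_c[i], mean_at_c[k])
              for i, p in prior.items()}
    total = sum(unnorm.values()) or 1.0
    return {i: v / total for i, v in unnorm.items()}

prior = {"A": 0.5, "B": 0.5}          # toy hypotheses
mean_at_c = {"A": 2.0, "B": 8.0}      # model concentrations at candidate c
posterior_if_B = hypothetical_posterior(prior, mean_at_c, "B")
```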
19. Expected Information Gain under a specific assumption
• The new entropy of the updated hypothesis probabilities:
• H(S_{k,c}) = −Σ_i P(x_{k,c}^i) · log_b P(x_{k,c}^i)
• Expected Information Gain:
• IG_{k,c} = H(S) − H(S_{k,c})
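Given the prior and a hypothetical posterior from the previous step, IG_{k,c} = H(S) − H(S_{k,c}) is a short computation (the example distributions here are made up):

```python
import math

def entropy(probs, base=2.0):
    """H(S) = -Σ_i p(x_i) · log_b p(x_i), skipping zero terms."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def info_gain(prior, posterior):
    """IG_{k,c} = H(S) - H(S_{k,c}): entropy reduction from the update."""
    return entropy(prior.values()) - entropy(posterior.values())

prior = {"A": 0.5, "B": 0.5}
posterior = {"A": 0.1, "B": 0.9}    # hypothetical posterior at candidate c
gain = info_gain(prior, posterior)  # positive: the detection sharpened our belief
```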
20. Sum the Expected Information Gain for each assumption
• Assume each hypothesis is true, one hypothesis at a time.
• Sum the Expected Information Gain under each true-hypothesis assumption, weighted by that hypothesis's original probability:
• IG_c = Σ_k IG_{k,c} × p(x_k)
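The weighted sum IG_c = Σ_k IG_{k,c} × p(x_k) then reads as follows (the per-assumption gains are invented numbers):

```python
def expected_info_gain(ig_per_assumption, prior):
    """IG_c = Σ_k IG_{k,c} · p(x_k): average over which hypothesis is
    assumed true, weighted by its current probability."""
    return sum(ig_per_assumption[k] * prior[k] for k in prior)

# Gains computed under "A is true" and "B is true", weighted by the prior.
ig_c = expected_info_gain({"A": 0.2, "B": 0.6}, {"A": 0.5, "B": 0.5})
```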
21. Choose candidates by Expected Information Gain
• By comparing IG_c across candidate locations, we can decide which one to move to next in order to gain the most information.
• This is a starting point for path planning.
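Candidate selection is then an argmax over IG_c; the candidate names and scores below are purely illustrative:

```python
# Expected information gain per candidate next location (made-up values).
ig_by_candidate = {"north": 0.42, "east": 0.17, "rooftop": 0.31}

# Move toward the candidate expected to tell us the most about the leak.
best = max(ig_by_candidate, key=ig_by_candidate.get)
```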
23. A Visualization Demo
• Thanks to David for all his intelligent ideas and the algorithm framework.
• Thanks to Lance for the real-world knowledge and practical experience that helped build our model.
• Here is a demo running on real data collected by Lance.