
The Ludic Fallacy Applied to Automated Planning


This is a short talk I gave to the Strathclyde Planning Group on deficiencies I can see in the way we think and reason about planning in non-deterministic environments. PPDDL - the accepted standard - is overly simplistic and can get us into hot water, because we focus on solving the PPDDL problem rather than the real-world problem it models.

The breakout session that followed was very useful for generating a lot of ideas about different threads we could use to attack the weaknesses of PPDDL and work being done around the edges, which I hope to summarise at some point.


The Ludic Fallacy
SPG, 11th Feb 2011

• Ludic - of or pertaining to games of chance.
• Fallacy - an argument which seems to be correct but which contains at least one error.

Example
• Suppose you flip a coin. What is the chance it comes up heads?
• 50/50.
• Suppose you flip the coin 100 times and the first 99 were tails. What is the chance of the final flip giving heads?
• The flips are independent variables, so still 50/50.
• ...or is it?

Origins
• Originally postulated by Nassim Nicholas Taleb in "The Black Swan".
• Broadly: the ability to describe the outcomes of events gives an impression of control. It does not give ACTUAL control of the events.
• A complex but inaccurate model is, most importantly, inaccurate.

"Gambling With the Wrong Dice"
• A case study based on a Las Vegas casino.
• Extensive and sophisticated systems and models to account for potential cheating.
• The aim was to manage risk.
• But the vast majority of losses came from non-gambling activity: a disgruntled ex-employee, onstage accidents, failure to file correct paperwork, and a kidnap ransom.

Blinded By Probability
• Because we see numbers as solvable, we focus on solving them.
• We lose sight of the broader picture.
• The "game" becomes our main focus rather than the world it represents.

Back to Coins
• We flip 99 times, all tails.
• 0.5^99 ≈ 1.6x10^-30
• Which is more likely: that this highly improbable event is happening, or that the assumptions we used to build the model don't hold true?
• Is the coin fair?
• What actually is the probability of getting heads next?

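The slide's arithmetic can be pushed one step further with a quick Bayesian check. The sketch below shows how decisively 99 tails should destroy confidence in the fair-coin assumption; the 0.999 prior and the "always tails" alternative hypothesis are invented for illustration, not from the talk.

```python
# How strongly should 99 tails shift belief about the model itself?
# Priors and the biased-coin hypothesis are illustrative assumptions.

p_fair_prior = 0.999            # start almost certain the coin is fair
p_biased_prior = 1 - p_fair_prior

p_data_given_fair = 0.5 ** 99   # ~1.6e-30: 99 tails from a fair coin
p_data_given_biased = 1.0       # a two-tailed coin produces this every time

# Bayes' rule: posterior probability that the coin is fair
evidence = (p_data_given_fair * p_fair_prior
            + p_data_given_biased * p_biased_prior)
p_fair_posterior = p_data_given_fair * p_fair_prior / evidence

print(p_fair_posterior)  # vanishingly small: the "fair coin" assumption is gone
```

Even a near-certain prior is obliterated by the data, which is exactly the slide's point: the sensible response to 99 tails is to question the model, not to keep quoting 50/50.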
Off-model Consequences
• When we have a model, we risk getting blinkered into thinking about the model instead of the world.
• But models are abstract representations.
• No PDDL model describes the effect of a meteorite hitting a robot, yet it is an (unlikely) possibility.
• The outcomes of actions, or events, cannot be fully enumerated. There exist "off-model consequences".

Coins Again
• We talk about coins having a head side and a tail side, with a 50/50 chance of either.
• This isn't strictly true - there's a third possibility we don't model:
  • Edge
• This is Taleb's "Black Swan": highly unlikely but theoretically possible events that are ignored.
  • A true Black Swan must also be "high impact".

What Am I Driving At?

Probabilistic Planning
• PPDDL is a prime example of "doing it wrong".
• It extends PDDL by applying probabilities to sets of effects: with probability P(X=i) effect set I occurs, with P(X=j) effect set J occurs, and so on.
• Is the world really so cut and dried? Or is this simply shoehorning probabilities into PDDL in the most obvious way possible?

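For concreteness: a PPDDL `(probabilistic ...)` effect amounts to a categorical distribution over effect sets. A minimal simulation of one such action step might look like the sketch below; the gripper scenario and its probabilities are invented for illustration.

```python
import random

def apply_probabilistic_effect(state, outcomes):
    """Pick one effect set according to its probability and apply it.

    `outcomes` is a list of (probability, effect_function) pairs, mirroring
    PPDDL's `(probabilistic p1 e1 p2 e2 ...)` construct. Probabilities may
    sum to less than 1; leftover mass means "no change". Note the talk's
    point, though: any outcome the world can produce that is missing from
    this list simply cannot be reasoned about by the planner.
    """
    r = random.random()
    cumulative = 0.0
    for probability, effect in outcomes:
        cumulative += probability
        if r < cumulative:
            return effect(state)
    return state  # leftover probability mass: nothing happens

# Invented example: a gripper that drops the block 20% of the time.
outcomes = [
    (0.8, lambda s: {**s, "holding": True}),
    (0.2, lambda s: {**s, "holding": False, "dropped": True}),
]
state = apply_probabilistic_effect({"holding": False}, outcomes)
```

The fixed, exhaustive-looking list of outcomes is exactly what the slide questions: the world's possible effects are rarely this cut and dried.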
Summary
• Models are typically incomplete.
• Models are frequently wrong.
• Probabilistic models make even more assumptions!
• We allow ourselves to be deceived by numbers into believing we can quantify the unquantifiable.
• As a result, we get bogged down solving a problem that isn't necessarily reflective of the real world.

So What Can We Do?

Introduce Noise
• The most basic approach is to add noise to the probabilistic model.
• If the model has P(x) = 0.2, test generated plans at, say, P(x) = 0.2 ± 0.05.
• This allows for a rudimentary "what happens if these values are not spot on?" check.

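The noise check can be sketched as a small Monte Carlo experiment: estimate plan success at the model's nominal probability and at perturbed values. The three-step linear plan below is an invented stand-in for whatever plans the planner actually produces.

```python
import random

def plan_success_probability(p_step_works, trials=10_000, steps=3, seed=0):
    """Monte Carlo estimate of success for a linear plan in which every
    step must succeed independently. Plan shape and length are invented."""
    rng = random.Random(seed)
    successes = sum(
        all(rng.random() < p_step_works for _ in range(steps))
        for _ in range(trials)
    )
    return successes / trials

nominal = 0.8  # the value the probabilistic model asserts
for p in (nominal - 0.05, nominal, nominal + 0.05):
    est = plan_success_probability(p)
    print(f"P(step works)={p:.2f} -> estimated plan success {est:.3f}")
```

A plan that looks acceptable at the nominal value may degrade sharply at the low end of the band, which is the rudimentary robustness signal the slide asks for.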
Epsilon-separation of States
• A similar concept to that used in temporal actions.
• In this case, epsilon denotes a marginal probability of transitioning between any pair of states.
• Still not ideal, but it at least captures the possibility of events changing the state in an undetermined way.
• Somewhat analogous to van der Waals forces.

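One way to read the epsilon idea, for a model expressed as a state-transition matrix, is uniform smoothing: blend the model's distribution with a uniform one so every pair of states keeps a tiny transition probability. The blending scheme below is one plausible interpretation, not a definition from the talk.

```python
def epsilon_separate(transition_matrix, epsilon=1e-3):
    """Blend each row of a transition matrix with a uniform distribution.

    `transition_matrix[i][j]` is P(next=j | current=i); each row sums to 1,
    and the blended rows still do: (1-eps)*1 + n*(eps/n) = 1. After
    smoothing, no transition is ever strictly impossible.
    """
    n = len(transition_matrix)
    return [
        [(1 - epsilon) * p + epsilon / n for p in row]
        for row in transition_matrix
    ]

# A deterministic 2-state model: the smoothed version still allows the
# "impossible" cross transitions, each with probability epsilon/2.
model = [[1.0, 0.0],
         [0.0, 1.0]]
smoothed = epsilon_separate(model, epsilon=0.01)
```

This is the van der Waals analogy in miniature: a weak, ever-present coupling between all states that the nominal model declares disconnected.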
State Charts
• In the FSM family, State Charts are frequently used to represent interruptible processes, e.g. in embedded systems.
• One process interrupts the other, acts, and then the first can resume from its previous state.
• Can we use this model to capture the consequences of unmodelled events?

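The interrupt-and-resume pattern can be sketched with a toy process: execution is suspended mid-plan, a handler acts, and the process resumes from the step it was suspended at. The class name and the "recharge" event are invented for illustration.

```python
class InterruptibleProcess:
    """Toy state-chart-style process: it can be interrupted mid-plan by a
    handler and then resumes from the step it was suspended at."""

    def __init__(self, steps):
        self.steps = steps    # remaining plan steps, in order
        self.position = 0     # index of the next step to execute
        self.log = []         # record of everything that actually ran

    def run(self, interrupt_at=None, handler=None):
        while self.position < len(self.steps):
            if self.position == interrupt_at and handler is not None:
                self.log.append(handler())  # off-model event handled here...
                handler = None              # ...then resume where we left off
            self.log.append(self.steps[self.position])
            self.position += 1
        return self.log

plan = ["pick", "move", "place"]
proc = InterruptibleProcess(plan)
trace = proc.run(interrupt_at=1, handler=lambda: "recharge")
# trace: ["pick", "recharge", "move", "place"]
```

The question the slide raises is whether this saved-state-and-resume discipline could let a planner absorb an unmodelled event without discarding the rest of the plan.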
Abstract / Anonymous Actions
• In Prolog, _ represents the anonymous variable.
• There is nothing analogous to this in PDDL.
• Would introducing one give us the flexibility to patch plans when off-model events occur?
• Could abstract actions (perhaps based on DTG clusterings) be useful for this?

Brainstorm!