Automation, with Humans in Mind
Making Complex Systems Predictable, Reliable and Humane
Hi, folks.
I do things to/with computers.
I build real-time systems.
I build fault-tolerant systems.
The whole survives.
I build critical systems.
Failure is catastrophic.
Complex Systems
• Non-linear feedback
• Coupled to external systems
• Difficult to model, understand
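The slides do not include code, but the non-linear feedback bullet can be made concrete with a toy model. The sketch below (my illustration, not from the talk) uses the logistic map, about the simplest system with non-linear feedback, to show why such systems are difficult to predict: two nearly identical starting states diverge completely after a few dozen steps.

```python
def logistic_steps(x, r=3.9, n=50):
    """Iterate the logistic map x' = r * x * (1 - x) for n steps.

    With r = 3.9 the map is in its chaotic regime: tiny differences
    in the starting state are amplified on every iteration.
    """
    for _ in range(n):
        x = r * x * (1 - x)
    return x

a = logistic_steps(0.200000)
b = logistic_steps(0.200001)  # perturbed by one part in 200,000

# The initial gap of 1e-6 has grown enormously; the two runs are
# no longer meaningfully related.
print(abs(a - b))
```

This is the sense in which a complex system is "difficult to model": even a one-line model defeats long-range prediction once feedback is non-linear.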
AdRoll
Let’s talk about the future.
Let’s talk automation.
Let’s talk human cooperating with machine.
Apollo 13
A complex craft.
It wasn’t clear how to orient the system.
Rocket with a tin can and some humans on top?
More elaborate rocket plane?
A matter of significant debate.
A balance was struck.
Saturn V was a big, completely automatic rocket.
With a space plane on top.
A wee problem with the Service Module.
No fuel, no O2 and a dead boat.
What to do?
Improvise.
Mission Control puzzled out new budgets.
Used the Lunar Module rocket for main propulsion.
Bridged incompatible systems with available materials.
Tools help experts overcome catastrophic failure.
Automation, done right, relieves tedium.
Automation, done right, reduces errors.
Automation, done right, liberates.
Let’s talk human versus machine.
Chernobyl
Graphite-moderated boiling-water reactor.
Requires active cooling.
Worse, unstable at low power levels.
Even worse, very high positive void coefficient.
Worst of all, Soviet political dynamics.
During a test of a backup system, the reactor was driven into a failure-prone state.
Warning signs were ignored.
Boom
In the immediate aftermath, vital equipment is not available.
It’s all locked in a safe.
The sole man with a key is dead, crushed under rubble.
There’s nothing to be done.
The reactor fails according to its nature.
Much is irradiated.
Many die.
An entire region of Ukraine is abandoned.
Automation, done wrong, mechanizes humans.
Automation, done wrong, misdirects.
Automation, done wrong, entraps.
Every system carries the potential for its own destruction.
“Normal Accidents” (Charles Perrow)
Failure is inevitable.
The design of any system must include failure as a first-class concern.
Otherwise, system failure happens in completely arbitrary ways.
How do you design for failure?
Cyborg it up a little.
Don’t do it alone.
Have resources you’re willing to sacrifice.
Accept failure. Learn from it.
Study the accidents of others.
Some things aren’t worth building.
Understand what you build.
Thanks! <3 @bltroutwine
I believe that our current approach to designing software systems is driving society in a bad direction. In particular, I believe we are creating a society predicated on automation which is oriented to be serviced by humans or, requiring no service, is simply in control of humans. Ignoring the dystopian overtones of this, I argue that this is a technically flawed approach, that such automation is less reliable, less flexible and less robust through time than a system designed with humans as the controlling party in mind. I will argue--with a mix of personal experience, reference to academic literature and historical examples--that complex systems designed with human control in mind are more lasting through time, more technically excellent and just generally more useful. I will further argue that a re-orientation toward human supremacy in computer systems is especially important as we begin to tightly couple western civilization's technology to the internet, being the Internet of Things. I'll talk a bit about the political and social implications, as well, after I've made a purely technical argument.

Published in: Software

