Recognizing which technologies will be useful prior to prototyping is error-prone, resulting in higher-than-acceptable rejection rates during development. MIT Lincoln Laboratory (MIT LL) has been using serious games to aid in technology assessment programs. This approach combines economic game theory with rapid-play, rapidly developed digital simulations to collect quantitative data, improve qualitative feedback, and crowdsource the ingenuity of human experts.
2. HIVELET Connections - 2
Rob Seater 2017
LL History of Games Development
[Timeline graphic, 2001–2016: LL game development spanning Missile Defense, ISR & Intelligence, Cyber Operations, Chem-Bio Defense, Emergency Management, and Air Traffic Control (e.g., holding flights vs. airborne flights).]
3. HIVELET Connections - 3
Virtual Human-in-the-Loop Experimentation Purposes
[Diagram of purposes: Expert Decision Analysis, Experiential Learning, Crowdsourced Discovery, Diagnostic / Evaluation, Training Analysis, Integrated Human-Machine Performance, Impact of Future Technology & Discovery, Guided Brainstorming (Red and Blue), Machine Learning from Human Experts, Requirements Prediction & Estimation.]
Image Source: LL screenshot (bottom left), Shutterstock.com (others)
MIT LL uses serious games for a wide range of purposes. Here, our focus is on future technology & discovery.
4. HIVELET Connections - 4
Serious Games for Technology Exploration
A game is a sensor for measuring human decision making.
Serious games are a tool for system analysis that addresses the role of human decisions, failings, and creativity.
Technology exploration games crowdsource discovery of new tactics, clever combinations of technologies, and requirements estimates based on iterative experiential play.
5. HIVELET Connections - 5
Technology Concept Lifecycle
[Lifecycle diagram: a seed concept, emerging technology, or threat feeds brainstorming over 2^N combinations of concepts & con-ops; triage based on anticipated utility and acceptance yields ideas worth pursuing with more costly R&D; focused, immersive, detailed evaluation drives leading design trade-offs; field testing and refinement lead to deployment & training.]
Goals:
- Catch issues as early as possible
- Be as systematic and quantitative as possible
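The "2^N combinations" step can be made concrete with a small sketch: given N candidate capabilities, enumerate every subset up to a size cap and triage by a naive anticipated-utility score. All capability names and scores below are illustrative assumptions, not values from the study.

```python
from itertools import combinations

# Illustrative capability set with assumed utility scores (not from the study).
CAPABILITIES = {"uav_sensor": 3, "flight_control": 2, "battery": 1,
                "targeting": 4, "autonomy": 5}

def triage(max_size=3):
    """Enumerate capability subsets up to max_size and rank them by a
    naive additive utility -- a stand-in for the 'triage based on
    anticipated utility and acceptance' step of the lifecycle."""
    names = sorted(CAPABILITIES)
    ranked = []
    for r in range(max_size + 1):
        for combo in combinations(names, r):
            ranked.append((sum(CAPABILITIES[c] for c in combo), combo))
    ranked.sort(reverse=True)
    return ranked
```

The subsets near the top of the ranking are the "ideas worth pursuing with more costly R&D"; everything below a cutoff is discarded before any expensive prototyping.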
6. HIVELET Connections - 6
Analyzing User-Facing Future Technology is Hard
- Requirements are unknown
- Experts are bad at theorizing
- Doctrine & tactics change
- Everyone is a novice user
…plus everyone is waiting on you before the project can start.
Image Source: Shutterstock.com
7. HIVELET Connections - 7
Rapid-Play Serious Games
Short: 1-2 minute turns (see trends, shift priorities); 10-15 minute games (experiment with feedback, measure rate of learning); 1-2 hour sessions (low burden on participants, can be paired with other materials).
Scale: many participants x many scenarios x many plays.
Accessible and Flexible:
• No special equipment
• Play remotely
• Easy to generate scenarios
• Easy to modify objectives
• Minutes to hours
• High level
Also, they can be created quickly by adapting existing templates.
Image Source: Shutterstock.com
8. HIVELET Connections - 8
Homeland Security Applications
Rapid-Play Games Informing Conventional Analysis
Image Source: LL Screenshots
- Naval Missile Defense
- Public Health Response to Bio-Terrorism
- Standoff Security for Transportation Hubs
- Robotic Support for First Responders
- Citizen Preparedness for Radiological Fallout
9. HIVELET Connections - 9
Defense Applications
Rapid-Play Games Informing Conventional Analysis
Image Source: LL Screenshots
- Naval Fleet Composition
- Drone-Based Aerial Refueling
- ISR Team Workflows & Tool Suites
- Infantry Squad Tactics Using Small UAVs
10. HIVELET Connections - 10
HIVELET Approach
Combine Game Theory with Rapid-Play Digital Simulations
• Rapid-Play Digital Simulations ('Games')
– provide concrete intuition about potential benefit
– many iterations allow for more exploration and data collection
• Game Theory / Auction Theory
– players must think critically about what is most beneficial
– players can choose custom combinations to execute a novel strategy
Players rapidly alternate between player-driven capability selection and mission simulation as a digital game, building intuition, collecting data, and exploring novel tactics.
HIVELET: Human-Interactive Virtual Exploration for Lightweight Evaluation of Technologies
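The auction-theoretic half can be sketched as priced loadout selection under a point budget: prices force players to reveal which capabilities they actually value. The capability names, prices, and budget below are invented for illustration.

```python
# Hypothetical per-capability prices; the pricing forces the critical
# trade-offs that the game/auction-theory layer relies on.
PRICES = {"uav_sensor": 4, "flight_control": 3, "battery": 2,
          "targeting": 5, "autonomy": 6}

def select_loadout(choices, budget=10):
    """Validate a player-driven capability selection against the budget,
    returning what the player paid and how much budget was left over."""
    cost = sum(PRICES[c] for c in choices)
    if cost > budget:
        raise ValueError(f"loadout costs {cost}, budget is {budget}")
    return {"choices": sorted(choices), "cost": cost, "slack": budget - cost}
```

Each play cycle alternates between this selection mode and the mission simulation, so the log of paid-for loadouts doubles as a preference measurement.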
11. HIVELET Connections - 11
General Framework
Modular Components Support Different Experiments
- Available Capabilities
- Selection / Pricing Mechanism
- Mission Simulator
- Scenario / Domain
- Data Collection
- Data Analysis
Image Source: LL Screenshot
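One way to read the modular framework is as a pipeline whose components are swappable callables. The class and field names below are assumptions for illustration, not LL's actual software.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Experiment:
    """Wires the framework's modular components together."""
    capabilities: List[str]                        # available capabilities
    select: Callable[[List[str]], List[str]]       # selection / pricing mechanism
    simulate: Callable[[List[str], str], float]    # mission simulator
    scenarios: List[str]                           # scenario / domain
    log: List[Dict] = field(default_factory=list)  # data collection

    def run(self) -> List[Dict]:
        """Run every scenario and log results for later data analysis."""
        for scenario in self.scenarios:
            loadout = self.select(self.capabilities)
            score = self.simulate(loadout, scenario)
            self.log.append({"scenario": scenario, "loadout": loadout,
                             "score": score})
        return self.log
```

Swapping the `select` callable changes the pricing mechanism without touching the simulator, which is what lets the same framework support different experiments.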
12. HIVELET Connections - 12
Case Study: UAV Integration into Infantry Squads
Mission: find a downed Predator drone in an urban environment, retrieve its data, destroy the Predator, and extract safely
Capabilities: personal UAV sensors, control/flight systems, battery, targeting support, intelligent behaviors
Research Question: Which capabilities or combinations of capabilities are most valuable to this mission, and how do they alter tactics and doctrine?
(10 minutes per full cycle)
13. HIVELET Connections - 13
Experimental Trials
• 36 participants over 5 trials
– 25 lab researchers
– 9 military fellows
– 3 admin/support
• 2.5 hour play session
– 1:00 Training
– 1:00 Score competition
– 0:30 Discussion
• Quantitative & qualitative results
– Positive feedback from experienced
military fellows & lab UAV researchers
– Data and discussion supported the
primary experimental questions
– Written report in progress
Image Source: LL Photo
Goal: Demonstrate the value of data collected by this technique, not the
realism of the technologies modeled or environment used for that validation.
14. HIVELET Connections - 14
Alternate Missions & Environments
Five terrains: Ruined Desert City at Dawn, Arctic Expanse, Rocky Desert at Night, Island City, Urban Sprawl.
Each of the 5 terrains can be paired with either of 2 mission objectives:
search for a crashed asset / clear a convoy's path of threats.
Image Source: LL Screenshot
15. HIVELET Connections - 15
Example HIVELET Output
Quantitative:
- Player technology selection preferences (player preferences correlate with performance and change after gameplay)
- Consequence by architecture
- Information required for player to take action
- Player performance relative to benchmarks
Qualitative:
- Changes in technology utility perception
- Player perception of risk tolerance vs. scoring features
- Player rationale for observed strategy evolution
[Matrix graphic: strategies classified by whether they were anticipated pre-game and whether they were used in game, with effectiveness rated High / Moderate / Ineffective / Unobserved; the payoff quadrant is unanticipated, effective strategies.]
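The quadrant chart amounts to a cross-tabulation of strategies by anticipated-pre-game versus used-in-game. The records below are invented examples, not data from the trials.

```python
from collections import Counter

# Invented example records; each strategy is tagged with whether analysts
# anticipated it pre-game, whether players used it, and its effectiveness.
STRATEGIES = [
    {"name": "overwatch_relay", "anticipated": True,  "used": True,  "effect": "High"},
    {"name": "battery_swap",    "anticipated": True,  "used": False, "effect": "Unobserved"},
    {"name": "decoy_swarm",     "anticipated": False, "used": True,  "effect": "High"},
]

def quadrants(strategies):
    """Tally strategies into the anticipated x used 2x2 matrix."""
    return Counter((s["anticipated"], s["used"]) for s in strategies)

def surprises(strategies):
    """Unanticipated yet effective strategies -- the crowdsourced payoff."""
    return [s["name"] for s in strategies
            if not s["anticipated"] and s["used"] and s["effect"] == "High"]
```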
16. HIVELET Connections - 16
Key Questions for Method Validation
• Convergence. Do players quickly form and express consistent opinions?
• Accuracy. Are player opinions worth listening to? Do they match success?
• Transferability. Do lessons in the game map to the real world? (e.g. risk)
• User Impact. Does this approach affect the opinions players express?
• Novel Lessons. Do we learn anything new, or just confirm prior knowledge?
17. HIVELET Connections - 17
Experimental Results
• Convergence
– Do players learn the game and technologies quickly?
– Does the game change their opinions?
– Was 1-2 hours enough play time?
Lesson: We can collect consistent data in a short time.
Image Source: LL Graphic
18. HIVELET Connections - 18
Experimental Results
• Accuracy / Consistency
– Do players make good choices for themselves?
– Can we trust their selections to reflect utility?
Lesson: Players make good choices for themselves.
Image Source: LL Graphic
19. HIVELET Connections - 19
Experimental Results
• Score Sensitivity / Realism
– Should we believe that behaviors in the game match behaviors in the world?
– Will players be appropriately risk averse in a virtual environment?
– Can we control their level of risk taking?
Averages from Post-Game Survey (1 = low, 5 = high)
Lesson: We can control player risk aversion with scoring.
Image Source: LL Graphic
20. HIVELET Connections - 20
Experimental Results
• Improve the Quality of Qualitative Feedback
– Does playing a game change player opinions and ability to discuss the topic?
– Does making in-game choices about capability selection change opinions?
Lesson: Customizing loadouts makes players more critical than just experiencing the virtual model.
Image Source: LL Graphic
21. HIVELET Connections - 21
Experimental Results
• Crowdsourcing Ingenuity
– Do players discover novel strategies?
– Do players discover surprises in the value (or lack of value) of technologies?
Lesson: Players discovered novel (effective) strategies.
Image Source: LL Graphic
22. HIVELET Connections - 22
Lessons Learnt for Rapid-Play Games
• Design
– Design the game around the data to be collected (sensor vs. simulation)
– Focus on key decisions, abstract away detail when possible
– Balance experimental design with realistic models (are rare events rare?)
• Feedback
– Use multiple scoring axes or variable scoring weights
– Clarify best outcome vs. best practice
• Scenarios
– Procedurally generate scenarios (faster, more insight)
– Use a ‘detective’ style game when possible (cleaner analysis)
– Use turn-based games when possible (clearer intent)
• Complementary Techniques
– Precede with open-ended tabletop exercises to discover relevant info & actions
– Follow with more traditional detailed / immersive evaluations
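The "procedurally generate scenarios" lesson can be sketched by sampling terrain-objective pairs. The terrains and objectives come from the alternate-missions slide; the generator itself is a toy assumption.

```python
import random

# Terrains and objectives from the alternate-missions slide.
TERRAINS = ["Ruined Desert City at Dawn", "Arctic Expanse",
            "Rocky Desert at Night", "Island City", "Urban Sprawl"]
OBJECTIVES = ["search for a crashed asset", "clear a convoy's path of threats"]

def generate_scenarios(n, seed=0):
    """Seeded sampling keeps sessions reproducible across participants
    while still giving each short play a fresh terrain-objective pairing."""
    rng = random.Random(seed)
    return [(rng.choice(TERRAINS), rng.choice(OBJECTIVES)) for _ in range(n)]
```

Because generation is cheap, each 10-minute cycle can use a new scenario without hand-authoring, which is what makes high-N data collection practical.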
24. HIVELET Connections - 24
Alternative Quantitative Analysis & Evaluation Mechanisms
More Virtual / Early Phase → More Physical / Late Phase:
Behavioral Model + Simulation; Interactive Digital Simulation; Immersive Chamber Simulation; Augmented Indoor Exercise; Large Scale Outdoor Exercise.
[Comparison table: each mechanism rated weak / moderate / strong on Technical Feasibility, High-N Experiments, Realistic Detail, Low Cost / Burden, New Uses/Exploits, Req'ts Identification, and Current Availability.]
25. HIVELET Connections - 25
→ Challenges and Gaps
→ Approach & Results
→ Other Applications
Outline
Image Source: LL Screenshot
28. HIVELET Connections - 28
Experimental Validation
• Key Questions Supported Experimentally
– Convergence. Do players quickly form and express consistent opinions?
– Accuracy. Are player opinions worth listening to? Do they match success?
– Transferability. Do lessons in the game map to the real world? (e.g. risk)
– User Impact. Does this approach affect the opinions players express?
– Novel Lessons. Do we learn anything new, or just confirm prior knowledge?
Convergence on Successful Strategies
Gameplay Shatters Rosy Speculation