Visualization of security data has not advanced significantly since the days of the WOPR in WarGames. Other tech industries have embraced modern user interfaces to facilitate and expedite data search, analysis, and discovery, which has significantly helped users in those industries gain insights from big data environments. In contrast, the security industry prefers to relegate everyone to command-line prompts and clunky interfaces with minimal functionality and an inability to scale to the volume, velocity, and variety of security data. I’ll address the core challenges and impact of the industry’s failure to take data visualization and user experience seriously, and provide recommendations on key areas that would most benefit from modern data visualization. Through the use of attack timelines, I’ll demonstrate how we, as an industry, must move beyond familiar visualization conventions (which tend to break at scale) and provide functional data visualization that is usable by analysts and operators across all levels of expertise.
My name is Matthew Park, and I lead the UX team at Endgame. Endgame is a player in the EPP/EDR space, building endpoint protection for commercial enterprises.
What our team (the UX team) does and cares about is building thoughtful and practical workflows, visualizations, and experiences for cyber security analysts. If you don’t understand the image to the right, you’re not supposed to; it’s one of my favorite jokes about how people define UX vs. UI.
So I don’t want to be bold, but this is probably one of the greatest scenes in film history. For those not familiar with the film WarGames: during the latter portion of the Cold War, the United States built a war simulator called the WOPR to run possible war scenarios, hoping to learn from each one so we’d have the upper hand in any nuclear engagement. Here, the WOPR can no longer tell the difference between a game and reality as it brute-forces its way into NORAD for nuclear launch codes. The final scene culminates in Matthew Broderick screaming “Learn, DAMMIT, LEARN” as he tries to get the WOPR to grasp the futility of war through tic-tac-toe. The movie is 50% siren horns and compilation shots of whizzing lights of early-1980s technology, and 50% Matthew Broderick being a more tech-savvy Ferris Bueller. It’s as perfect as 80s movies get. And it’s absolutely ridiculous.
Now, it’s ridiculous because it seems like the main driver of these SOC analysts’ and military generals’ decisions was the WOPR’s visualizations. I can count on two fingers the number of times I saw a NORAD analyst with an open Unix shell, versus the many times they are eyes-glued to the visual boards of missile strikes. And although it offends my designer (at least, visual) sensibilities, the WOPR’s map visualizations were minimal, clean, relatively easy to understand, and clearly built with NORAD analysts in mind. Hollywood fiction at its finest. The title of this talk is “Data Visualizations in Cyber Security: Still Home of the WOPR?” And let’s face it: visualizations today in no way dictate, or have the workflow influence, that the WOPR had. It was never home of the WOPR.
Our analysts today instead sift through streams of alerts in list-heavy views; their habits naturally form around those views, and they do not want to be bogged down by time-consuming or useless visualizations.
So let’s step back into the reality of what a real SOC organization is like:
The cyber security space is naturally a very difficult domain to explore and understand. On the Defensive Cyber Operations (DCO) side, streams of new exploit types and malicious attacks constantly threaten an array of different network environments. A typical defensive analyst’s job is to maintain their knowledge of these attacks; they need to know what patterns to look for and what to spot, essentially finding that needle in a haystack in a very short amount of time. Of course, once that needle is found, these security analysts are tasked with finding where other corresponding problem areas exist, exposing and remediating other parts of the network the attacker could have manipulated. It’s a classic cat-and-mouse game that keeps analysts constantly on their toes, searching for or reacting to malicious events.
Naturally, these analysts form habits in searching their network environment and do not want to be bogged down by time-consuming or useless visualizations.
So why do security analysts generally regard new types of visualization introduced to them as “useless eye candy” and disruptive to their workflow? Raffael Marty (author of Applied Security Visualization) talks about this general problem plaguing security visualizations today: “they are either the work of designers with no background in security, or of security professionals who don’t understand data visualizations.” One is beautiful but not practical for getting work done; the other is effective but clunky and non-intuitive to use, and requires the average analyst to do more work piecing together stories. Basic “out-of-the-box” visualizations traditionally fall into the latter category, as our industry and products currently lean that way because it is easier to build. We tend to lean on familiarity rather than make a focused effort toward usability.
But done right, visualizations can be a powerful tool for an analyst. Let’s walk through a quick example:
We generally know analysts are more comfortable (or at least have more experience) sifting through lists and lines of data, whether through a security enterprise platform or the command line. But this can be quite time-consuming, especially if you don’t know what to look for. And when you’re streaming through lists of event or alert data, it’s very tough to spot patterns or recognize changes at a larger scale. This is where visualizations should come in.
Image example: these data sets have the same mean, variance, regression lines, and error rates. However, plotting them as charts makes their unique patterns immediately obvious. And this is a single, simple example of how we can give our users greater visibility into their data.
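The slide image isn’t reproduced here, but the classic data set with exactly this property is Anscombe’s quartet (which may well be what the chart shows), and the effect is easy to verify yourself. A minimal sketch:

```python
from statistics import mean, pvariance

# Anscombe's quartet: four data sets with near-identical summary
# statistics but wildly different shapes when plotted.
x123 = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
quartet = {
    "I":   (x123, [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]),
    "II":  (x123, [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]),
    "III": (x123, [7.46, 6.77, 12.74, 7.11, 7.81, 8.84, 6.08, 5.39, 8.15, 6.42, 5.73]),
    "IV":  ([8, 8, 8, 8, 8, 8, 8, 19, 8, 8, 8],
            [6.58, 5.76, 7.71, 8.84, 8.47, 7.04, 5.25, 12.50, 5.56, 7.91, 6.89]),
}

# For every set: mean(x), mean(y), and population variance of x agree,
# even though plots of the four sets look nothing alike.
summary = {
    name: (round(mean(x), 2), round(mean(y), 2), round(pvariance(x), 2))
    for name, (x, y) in quartet.items()
}
for name, stats in summary.items():
    print(name, stats)
```

Plot the four sets and you get a line, a curve, an outlier-skewed line, and a vertical cluster: the table of statistics alone would never tell you that.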
Now, while we can talk abstractly about the pitfalls and ideals of visualizations as a whole, this talk is primarily about attack timelines. At the end I will discuss and show a couple of concepts we are currently prototyping. But these new designs shouldn’t be the key takeaway. My hope is that as I discuss our approach and process in concepting our ideal attack timeline visualizations, you can take some of these key principles back and broaden your perspective on how to approach your own visualizations (whether your own attack timelines or anything else). In order to introduce a new set of functional and usable visualizations into the security space that go beyond familiarity, the people building them need to research proper design patterns and gain a better understanding of their users.
So, with that in mind, let’s discuss how we initially approached our problem with attack timelines. It should be no surprise, coming from someone who lives in the user experience space, that we took a decidedly user-centric approach to the problem.
The goal of an attack timeline or alert triage visualization is to allow an analyst to quickly assess the relative severity of an alert so that they can dismiss it, remediate it, or escalate it to another analyst. It serves as a means to communicate the story of an attack (or supposed attack) and should be used as a platform for data manipulation and exploration.
Coming from an organization with prior experience with these types of tools, we had a couple of biases going into our initial designs:
There are large groups of users who lack security and platform domain experience. This makes a lot of current visualizations difficult to navigate, as they are overly complicated and too expansive, compounded by the fact that users do not have the proper training to make an informed decision.
Users lack time: these analysts will typically have 5 to 10 minutes to make a decision on an alert. As illustrated before, the queue of alerted information can be never-ending and can stack up when not dealt with in a timely manner.
In order to differentiate ourselves, we wanted to provide value to our users, particularly around enhancing the analyst’s workflow. What are the appropriate pivot points for:
• Response actions
• Gathering more data
• Launching an investigation
• Context on actions already taken
• Collaboration/commenting
Knowing our biases, we wanted to confirm or deny them by capturing user data through different types of user testing and research. Studying our users also gave us the opportunity to redefine our user roles by creating new personas specifically around alert triage.
Describe the user testing groups
Describe user groups
In general, there was concern about opening the attack timeline visualization up to accommodate a lot of variables: can a visualization accurately represent the amount of data returned on an alert? People with prior experience with attack timelines noted they only saw high-level event connection patterns, which wasn’t enough. Time is an abstract concept and not inherently visual; timelines either try to force time into a linear perspective or remove it completely. What each tier needed:
• Tier 1: convey meaning
• Tier 1: compensate for lack of time
• Tier 3 (building for Tier 1s): increase working memory, enhance detection and recognition
• Tier 3 (for Tier 3s): facilitate discovery and search
We can divide the purpose of a data visualization into two primary categories:
1. Visualizations should be used as a tool to enhance the typical analyst workflow by providing high- to low-level visibility and context into granular data. When created purposefully, they shine when mapping large quantities of data at scale and through time. 2. Visualizations should be used as a tool for collaboration and reporting. Clear visual representations of data do not require you to be a (in this case, security) domain expert; anyone can understand trends or content distribution. Humans are naturally visual and process visual data at a faster, at-a-glance rate, which opens the content up to a larger array of users.
With our personas redefined, workflow habits captured, and design requirements identified, we chose to revisit the basic foundations any type of visualization should follow. Ben Shneiderman (a very distinguished scholar in the field of visualization) found, in his information-seeking mantra, that the most powerful visualizations share the same traits:
Overview (dashboard) first. It’s the first thing a user will see, and it should guide them to other parts of the product for further exploration. It should be carefully planned to highlight the important parts of the story and give lesser weight to the not-so-critical parts. Creating the overview is a process of constant refining and experimenting, and in that sense the overview section benefits most from continual testing to arrive at the right dashboard design.
Zoom and filter. Once all the data is presented in the overview, the user will want to focus on particular areas of interest. From a design perspective, you should aim to give the user plenty of control for zooming and filtering data from the overview, using the visualization’s interactive features: zooming, scrolling, panning, drill-down, legends, and range selectors. This is VERY important in complex visualizations: zoom and filter functionality should be designed in a way that doesn’t get your user lost in the visualization. Done correctly, this yields maximum insight and action from the information at hand.
Details on demand. You want to give the viewer access to the minutiae, bringing them as close as possible to the raw data and equipping them to find what they started looking for. This third layer of data should be less visual and more text-heavy, with a focus on accurate information rather than trends. This way the analyst gets what he or she needs, in a way that drives action.
By using the three steps of the information-seeking mantra, you can avoid information overload, analyze data more easily, and find solutions faster.
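As an illustration only (the alert fields, hosts, and values below are invented, not from any real product), the three steps of the mantra map naturally onto three operations over the same alert store:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical alert records; field names are illustrative.
@dataclass
class Alert:
    id: int
    severity: str   # "low" | "medium" | "high"
    host: str
    technique: str
    raw: str        # the full payload an analyst reads last

ALERTS = [
    Alert(1, "high", "web-01", "credential-dumping", "lsass.exe read by ..."),
    Alert(2, "low", "web-01", "recon", "whoami invoked by ..."),
    Alert(3, "high", "db-02", "credential-dumping", "lsass.exe read by ..."),
    Alert(4, "medium", "db-02", "lateral-movement", "psexec connection ..."),
]

# 1. Overview first: summarize the whole data set at a glance.
def overview(alerts):
    return Counter(a.severity for a in alerts)

# 2. Zoom and filter: narrow to an area of interest without losing context.
def zoom(alerts, severity=None, host=None):
    return [a for a in alerts
            if (severity is None or a.severity == severity)
            and (host is None or a.host == host)]

# 3. Details on demand: the raw, text-heavy record, fetched last.
def details(alerts, alert_id):
    return next(a.raw for a in alerts if a.id == alert_id)

print(overview(ALERTS))
print([a.id for a in zoom(ALERTS, severity="high")])
print(details(ALERTS, 3))
```

The point of the sketch is the ordering: the aggregate view comes first, the filtered subset second, and the raw record only on demand, which is the opposite of dropping an analyst straight into a raw event stream.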
Our next step was finding a structure that could accommodate the flexibility of all the user data we acquired. These were two examples (of MANY ideas) from our first iterations of the timeline; you can see that both hint at other areas of the network environment while still pulling back that initial timeline snapshot of parent process events.
The problem with both of these iterations is that we were confining our data to a very rigid (linear) view, still constraining time to a single axis. We had the overview section outlined well, but it was difficult to zoom and filter down to the granular level.
As we worked through our initial ideas, we were receiving more event data back from our sensors than we initially anticipated. Beyond process, network, and user events, we were also beginning to show changes of events in real time at the event level.
“I like to present narratives with sprawling information-rich panoramas. Yet these diagrams are radical reductions of written sources I’ve researched. I have had to choose who and what to include, who and what not. Because the variables I have to work with are extremely limited, the people and events I use are reduced to symbols that are plotted in relationships to each other in the diagrams. Even within such limitations, it is possible to tell a compelling story.”
Ward Shelley, “Addendum to Alfred Barr, ver. 2”
These diagrams are used to address the problem of representing the fluidity of time in space, especially in a static form.
Timelines have traditionally been used and represented in multiple ways across different branches of geography:
• Historical geography: what happened where in past times.
• Cultural geography: where events happened in time.
• Time geography: how much time it took for events to happen in space.
• Quantitative geography: encompasses spatial diffusion and time-series analysis; what occurred where in known periods of time.
Types: Andrienko and colleagues identify two temporal aspects that are crucial when dealing with spatio-temporal data: temporal primitives and the structural organization of temporal dimensions. Primitives: time points (a point in time) or time intervals (an extent of time). Structures: ordered time, branching time, and multiple perspectives.
Time points (point in time)– branching time.
This view freed up a variable axis, but was poor for comparison.
Ordered time (most commonly used) is broken into two types: linear time and cyclic time. Linear time provides a continuous sequence of temporal primitives from past to future (timelines), while cyclic time organizes primitives into recurrent finite sets (times of day). Branching time represents alternative scenarios. Multiple perspectives represent more than one point of view.
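To make the primitives-versus-structures distinction concrete, here is a small sketch (alert names and timestamps are invented) that organizes the same time points both linearly and cyclically:

```python
from datetime import datetime

# Hypothetical events; each is a (name, time point) pair.
events = [
    ("alert-a", datetime(2017, 7, 25, 9, 14)),
    ("alert-b", datetime(2017, 7, 25, 23, 2)),
    ("alert-c", datetime(2017, 7, 26, 9, 40)),
]

# Linear time: a continuous past-to-future sequence of time points (a timeline).
linear = sorted(events, key=lambda e: e[1])

# Cyclic time: the same points organized into a recurrent finite set
# (hour of day), which surfaces daily rhythms a linear view hides.
cyclic = {}
for name, ts in events:
    cyclic.setdefault(ts.hour, []).append(name)

print([name for name, _ in linear])
print(cyclic)
```

Note how the cyclic view groups alert-a and alert-c together (both around 9 a.m. on different days), a pattern the linear ordering keeps two days apart.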
Time intervals (extent of time): we originally represented these in a point-in-time view, but it was poor for comparisons.
Data Visualizations in Cyber Security: Still Home of the WOPR?
Bsides Las Vegas 2017
Who Am I?
User Experience Lead
Background in Big Data and Video Games Design
The Life of an Analyst
5/70 alerts, 30/250 alerts, 150/8,173 alerts
How Amazing is WarGames?
The General Problems with Visualizations in Security
Let’s Talk Attack Timelines
Discovery / Recognizing our Biases / User Testing
Persona Creation / Design Concepting / Basics of Design
Prototyping / Looking Ahead
End of the appetizer, time for the entrée.
Understanding our users, capturing user data. Creating a basic foundation from known design patterns; creating new design requirements from our users. Prototyping and user testing. Feature creation and taking it back into the ‘wild’.
Experience
• Lack of security domain experience
• Lack of platform domain experience
Time
• Limited time to review alerts and make decisions
• Forced to make quick decisions
Differentiation
• Forces conformity
• Requires a level of expertise to extract insights
User-centric Design Study
GOAL: Capture team dynamics and worker roles within a security organization to identify challenges common across security teams.
Team | Type | Environment | Collection
A | Traditional SOC Individuals | Day-to-day use | User interviews
B | Novice Training Team | Mock Scenario | Side-by-side
C | Internal Red vs. Blue | Mock Scenario | Mirrored Scenario as User Group B
Tier 1 analyst: has little to no prior experience (average of 1 year) in the cyber security space. First line of defense in a Security Operations Center. Main responsibility is to initially triage alerts and determine if escalation (to a higher tier) is required. Primarily relies on a platform’s GUI.
Tier 3 analyst: intimately understands network and platform architecture. Seen as the domain expert on the SOC team and more comfortable working through the command line. Investigates escalated alerts, determining root causes and the extent needed to remediate problems.
Expert in EDR platforms and sophisticated … Uses command line and scripting languages to bypass the UI and collect large data feeds using 3rd-party …
SOC manager: skilled security practitioner, not necessarily a subject matter expert. Extensive management experience; oversees day-to-day ops. Sets schedules, assigns prioritization, generates reports.
Findings: Security Work Roles
Findings: A Day in the Life of a Security Analyst
Prototyping: Spatio-Temporal Structures
Existence Changes: Changes in instant events, such as the appearing or disappearing of objects
Spatial Changes: Change in spatial properties of objects such as location, size and shape
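A toy sketch (invented object IDs and coordinates) of how these two change types fall out of comparing consecutive snapshots of tracked objects:

```python
# Hypothetical snapshots at two instants: object id -> (x, y) position.
before = {"proc-1": (0, 0), "proc-2": (5, 5)}
after = {"proc-2": (7, 5), "proc-3": (1, 1)}

# Existence changes: objects appearing or disappearing between instants.
appeared = set(after) - set(before)
disappeared = set(before) - set(after)

# Spatial changes: surviving objects whose spatial properties changed.
moved = {oid for oid in set(before) & set(after) if before[oid] != after[oid]}

print(appeared, disappeared, moved)
```

The same diff logic extends to other spatial properties (size, shape) by comparing those fields instead of position.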
The only path is
• Adding workflow enhancements
• More user testing and refinement – we are trying to poke as many holes as possible
• Scaling past a singular endpoint