Info-Tech Research Group Inc. is a global leader in providing IT research and advice.
Info-Tech’s products and services combine actionable insight and relevant advice
with ready-to-use tools and templates that cover the full spectrum of IT concerns.
© 1997-2020 Info-Tech Research Group Inc.
Secure Your High-Risk
Data
Develop a comprehensive data security
plan.
Info-Tech Research Group | 2
Table of Contents
4 Analyst Perspective
5 Executive Summary
6 Executive Brief
20 Phase 1: Review Data Security Methodologies
21 Step 1.1: Understand the Role of Data Security
40 Step 1.2: Document Current Data Security Controls
73 Step 1.3: Assess Compliance and Regulatory Frameworks
81 Phase 2: Develop the Data Security Roadmap
82 Step 2.1: Develop the Data Security Roadmap
92 Phase 3: Implement Data Security Roadmap
93 Step 3.1: Prepare Your People
98 Step 3.2: Establish Metrics
105 Summary of Accomplishment
107 Research Contributors
110 Bibliography
Secure Your High-Risk
Data
Develop a comprehensive data security
plan.
E X E C U T I V E B R I E F
Info-Tech Research Group | 4
Analyst
Perspective
Secure your data to help
secure your business.
Cassandra Cooper
Senior Research Analyst, Security,
Risk, and Compliance
Info-Tech Research Group
Gone are the days of simply protecting the network perimeter, or securing your primary assets, and
resting comfortably at night knowing that your business’ most important data would not be at risk.
An expanded global economy brings with it an expectation of collaboration – not just internally, but
between different organizations in different areas of the world. Data has become an asset, and
unlike the devices through which it travels, data is not static or stationary but dynamic and fluid in nature.
Throughout its lifecycle, data will live in a multitude of repositories and move through various
sources. A business’ data sources no longer lie within the confines of the office or
primary workspace, a set of easily controlled devices, or even a physical data center – organizations
increasingly keep high volumes of sensitive, valuable data in the cloud. While accessible and
convenient, this poses its own set of data security risks and questions.
As a result, business and IT leaders must map the flow of data throughout each stage of its
lifecycle. In order to properly account for information that poses a risk to the organization should it
be lost, corrupted, or used for malicious intent, the IT leader’s mindset must switch from securing
the assets and the network, to securing the data itself.
Info-Tech Research Group | 5
Executive Summary
Your Challenge
Securing data is no longer as simple as implementing a singular set of controls.
Sensitive and high-risk data now lives in various repositories both in and out of the organization, in both on-prem and cloud environments.
Layer in the process of exchanging data and ensuring secure transfer of this information while still making it accessible to the intended group of end users, and the challenge becomes increasingly complex.
Common Obstacles
To implement appropriate security controls, InfoSec leaders must first understand:
• What data they need to keep safe
• Where it exists within the organization
Additionally, organizations must understand their current compliance and regulatory obligations based on location and industry.
Finally, InfoSec leaders must select a combination of technical and process controls that fit the business environment and reduce user friction.
Info-Tech’s Approach
Info-Tech’s Secure Your High-Risk Data takes a multi-faceted approach to the problem that incorporates foundational technical elements, compliance considerations, and supporting processes and policies.
• Assess what technical controls currently exist within the organization and consider additional controls.
• Review compliance obligations and information security frameworks (NIST, CIS) for guidance.
• Develop a set of data security initiatives that involve both technical and supporting procedural controls.
Info-Tech Insight
A modern data security strategy must protect data through the entire data lifecycle. Data security efforts must be business-focused, with
multi-layered defense, while extending to all data sources.
Info-Tech Research Group | 6
Your challenge
This research is designed to help
organizations that need to:
• Identify the set of technical controls involved in adequately securing the
organization’s high-risk, or highly sensitive data.
• Develop a better understanding of the current sources and repositories for
the organization’s data and how they are currently protected and secured.
• Implement an approach to effective data security that takes into account
the relevant technical and process-based controls.
• Ensure that compliance and regulatory obligations are met and protect all
retained sources of personal data.
This research is intended for organizations looking to protect highly sensitive or high-
risk data that have previously undergone data classification, using either a tool or
solution or Info-Tech’s Discover and Classify Your Data research.
Data sources include applications/systems, data warehouses, data lakes, data marts, flat files, and devices/endpoints, with data-in-transit moving between them.
Info-Tech Research Group | 7
The Modern Data Breach Landscape
The impact and frequency of data
breaches are nothing new for CIOs and
CISOs. And while even the tightest data
security programs can still result in a
breach, a comprehensive plan that covers
all vectors of potential attack
significantly reduces both compliance and
business risk.
Making the Case for End-to-End Data
Security
70% of data breaches were perpetrated by external actors, while 45% of breaches featured hacking as the attack tactic.
58% of data-breach victims report having had their personal data compromised.
Info-Tech Insight
A multi-vector defense involves protection of data-
in-use, at-rest, and in-transit, through a combination
of both technical and process controls aligned to
specific control categories.
Source: Verizon Data Breach Investigations Report, 2020
Info-Tech Research Group | 8
Common obstacles
34%
Of an organization’s
Of organizations have
in the past year.
49%
These barriers make this
challenge difficult to address
for many organizations:
The Data Security Dilemma
• A changing technological environment characterized by an increase in
cloud computing, rapid proliferation of IoT and connected devices, and a
vast number of data sources that become targets for malicious attacks.
• The traditional approach of securing the network perimeter no longer
ensures security of all data; organizations must take a multi-faceted
approach to securing data.
• Global privacy regulations are increasing, adding to an already complex
environment characterized by multi-jurisdictional differences in controls
to implement.
• Most organizations will be faced with multiple compliance obligations
and must begin looking at data security holistically as opposed
to piecemeal.
• Additionally, industry-specific compliance regulations often present IT
leaders with a host of instructions but lack specific examples of
technical controls to be put into place in order to ensure data is
secured.
40%
Of attacks were a result of
Source: 2020 Thales Data Threat Report
Info-Tech Research Group | 9
Data Lifecycle Stages: Creation, Storage, Usage, Archiving, Destruction
Info-Tech’s approach: End-to-End Data Security
Effective data security is more than just a single layer or set of controls around your applications – it’s a comprehensive approach to both data-at-rest and data-in-transit. Successful end-to-end data security calls for an exhaustive strategy that:
• Applies technical and process controls within the seven core categories of data security;
• Secures data through all stages of the data lifecycle, from data creation through data destruction;
• Encompasses both data-at-rest and data-in-transit.
The Info-Tech difference:
Through both technical and process controls, Info-Tech’s research equips you with the knowledge and tools for an effective approach to data security.
Data Security Control Categories: Access Control, Data Operations, Authentication, Data Integrity, Data Loss Management, Encryption, Obfuscation & Data Minimization – applied to data-at-rest, data-in-use, and data-in-transit, underpinned by data classification, across all data lifecycle stages.
Info-Tech Research Group | 10
Info-Tech Research Group | 11
Info-Tech’s methodology for Secure Your
High-Risk Data
1. Review Data Security
Methodologies
2. Develop the Data
Security Roadmap
3. Implement Data
Security Roadmap
Phase Steps
1.1 Understand the Role of Data Security
1.2 Review Current Security Controls
1.3 Assess Compliance and Regulatory
Frameworks
2.1 Develop the Data Security Roadmap
3.1 Prepare Your People
3.2 Establish Metrics
Phase Outcomes
Comprehensive overview of:
• Access Control
• Data Operations
• Authentication
• Data Integrity
• Data Loss Management
• Encryption
• Obfuscation and Data Minimization
Alignment to governing compliance
regulations.
• Understanding of where the gaps lie in
current data security framework and
control implementation.
• Prioritized set of technical and process-
based data security initiatives based on
specific criteria.
• Strategy for implementing both
technical and process-based data
security controls.
• Metrics for evaluating the success
of data security framework within
the organization.
Info-Tech Research Group | 12
Data Security Has Layers
A modern data security strategy protects data throughout its entire lifecycle.
Data security efforts must be prioritized, business-focused, with multi-layered defense, and extend to all data sources.
Start with the
technical, finish with
the process.
For every technical control (think
encryption, MFA), there is an
adjacent set of supporting controls
that must be put into action. Think
technical, but follow up by ensuring
that the process is developed and
documented.
Get your priorities
right.
Before you can present a set of
controls to your senior leadership
team, you need to identify which
ones matter most in securing what is
most important to the business.
Prioritize using criteria that
consider cost and effort as well as
compliance and business risk
reduction.
Secure your people.
During the implementation phase,
ensure that any employees or end-
users that will be impacted by
changes in process, or additional
controls in place are properly
informed, trained, and supported.
Secure data fails to remain secure
when those managing it are not
informed and educated.
Nothing in isolation.
With this exhaustive set of data security controls, you
need to evaluate the organization’s full operational
landscape. Assess your gap-closing initiatives as part of a
comprehensive, defense-in-depth data security roadmap.
Sometimes more is better.
Instead of focusing on one specific data security
technique or technology, examine how multiple controls
can work cohesively in order to facilitate a multi-faceted
approach to securing your data.
Info-Tech Research Group | 13
Each step of this blueprint is accompanied by supporting
deliverables to help you accomplish your goals:
Blueprint deliverables
Key deliverable:
Data Security Matrix
tool
An assessment tool, based on the current
state and desired future state, that leverages
a set of targeted initiatives to close gaps and
provide a data security roadmap.
Data Security Technical Report
A comprehensive outlook on the
current state of data security. The
Data Security Roadmap Report
also provides guidance on the
implementation of technology,
processes, and training to properly
secure your high-risk, high-value
data.
Data Security
Executive Report
template
Review and present your findings to your
executive leadership team by highlighting
core gaps and identified initiatives to ensure
sufficient data security standards are in
place.
Info-Tech Research Group | 14
Blueprint benefits
IT/InfoSec Benefits Business Benefits
• A comprehensive understanding of where sensitive data exists within
the organization.
• A method of mapping the flow of sensitive or high-risk data between
sources and while in transit.
• An overview of technical controls to evaluate and implement in order
to enhance data security.
• An overview of process or administrative controls to evaluate and
implement in order to enhance data security.
• A prioritized set of data security initiatives based on cost, effort, and
compliance and business risk values.
• Detailed set of initiatives to help effectively secure the organization’s
highly sensitive or high-risk data.
• An understanding of what the compliance obligations for the
organization are, and how the security controls in place ensure full
adherence.
• An understanding of the best-practice data security frameworks, and
the measures in place to meet framework standards.
• Increased end-user trust and enhanced organizational reputation
through improved data security practices.
Info-Tech Research Group | 15
Measure the value
of this blueprint
Effective data security saves
more than just money…
• Data breaches have become increasingly prevalent, with associated costs rising 83%
between 2018 and 2019 alone.
• Legal fines and costs associated with customer damages and compensation are a small
portion of the losses incurred as a result of a data breach; reputational damage to the
business and brand can have an irreversible long-term impact on the organization.
In Phase 1 of this blueprint, we will help
you establish current and target state of
data security.
In Phase 3, we will help you develop a set
of relevant and achievable metrics to
support data security initiatives.
Info-Tech Project Value
$116,383 Average annual salary of a Senior
Cybersecurity Consultant
Average total time/cost to
completion for the following high-
priority data security projects:
• Map out full set of data sources
for all high-sensitivity data
• Complete and revise Data Matrix
tool
• Map all compliance obligations
(regulatory and industry-specific)
to current data security controls
• Complete gap analysis of data
security controls
• Develop gap-closing initiatives
and create implementation
roadmap
• Establish metrics and
monitoring schedule
200 hours (initial)
$69,875 (internal project cost)
$3.92 million (average cost of a data breach)
29.6% chance of a data breach occurring
Total Dollars Saved: $1,230,195
Source: Indeed, Ponemon Institute
Info-Tech Research Group | 16
Executive Brief
Case Study: OneClass
INDUSTRY: Education
SOURCE: SC Media, vpnMentor
Remote learning platform OneClass was the recent victim of an extensive data breach, where
personal information of over one million North American students was exposed. Information left
exposed included records that were linkable to minors (under the age of 13) as well as other
OneClass users. The harvested data (account credentials, personal information) gave hackers
the potential to run social engineering scams and inflict financial damage through
obtained payment card information.
The database breach was detected by researchers at vpnMentor, who notified the vendor shortly
after the initial date of discovery. Upon notification, OneClass took down the unsecured database,
claiming that the data within it was purely test data and was not linkable to individuals
associated with OneClass. However, vpnMentor noted that the database contained over 27 GB of
data and 8.9 million records, directly linked to over one million OneClass customers.
Researchers confirmed the contradiction between OneClass’ statement and vpnMentor’s
findings, as a sample of the exposed data was easily matched with public information. A key
issue is that many of OneClass’ users are young and would be unaware of the
fraudulent activities that could result from their information being stolen by malicious actors.
The Data Security Impact
• The AWS-hosted Elasticsearch database used by OneClass was
left unsecured.
• Incorrectly configured or unsecured cloud databases have been
a prominent issue during the transition from a network security
approach to a data-specific approach.
Solution
• Organizations, like OneClass, need to adopt a more
comprehensive approach to data secured off-prem through
additional applied controls:
• Higher degree of server security.
• Strengthening of access control measures through
additional access rules.
• Ensuring systems that contain large amounts of personal or
sensitive data have appropriate authentication measures.
Info-Tech Research Group | 17
Diagnostic and consistent frameworks are used throughout all four options.
DIY Toolkit
“Our team has already made this
critical project a priority, and we
have the time and capability, but
some guidance along the way
would be helpful.”
Guided
Implementation
“Our team knows that we need to fix
a process, but we need assistance
to determine where to focus. Some
check-ins along the way would help
keep us on track.”
Workshop
“We need to hit the ground
running and get this project
kicked off immediately. Our team
has the ability to take this over
once we get a framework and
strategy in place.”
Consulting
“Our team does not have the time or
the knowledge to take this project
on. We need assistance through the
entirety of this project.”
Info-Tech offers various levels of
support to best suit your needs
Info-Tech Research Group | 18
Guided Implementation
What does a typical GI on this topic look like?
Phase 1 Phase 2 Phase 3
Call #1: Scope
requirements,
objectives, and
your specific
challenges.
Call #2: Review
and complete
data security
controls matrix.
Call #3: Review and
align compliance
framework matrix.
Call #4: Identify
gap closing
initiatives.
Call #5: Prioritize all gap-closing initiatives based on criteria.
Call #6: Evaluate cost and effort table and implementation timeline.
Call #7: Identify user groups and assess appropriate training plans.
Call #8: Define metrics and establish monitoring schedule.
A Guided Implementation (GI) is a series of calls with an Info-Tech analyst to help implement our best practices in your organization.
A typical GI is six to eight calls over the course of four to six months.
Info-Tech Research Group | 19
Workshop Overview
Day 1 Day 2 Day 3 Day 4 Day 5
Activities
Identify Internal and
External Drivers
Assess the Current State Identify Gaps and Define
Initiatives
Develop Implementation
Plan and Define Metrics
Next Steps and
Wrap-Up (offsite)
1.1 Review the business context.
1.2 Review compliance drivers
and relevant regulatory
frameworks.
1.3 Discuss current drivers from
both the InfoSec and business
context.
1.4 Define the scope; which data
sources will be examined, and
what is the tier of data
(classification standard) in
focus for this project?
2.1 Map the flow of all high-risk,
highly sensitive data in-scope
for project workshop (at-rest
and in-transit); document and
compare with Data Inventory.
2.2 Review current set of data
security controls.
2.3 Identify and list future or
potential adjustments to the
organization’s data security
landscape.
2.4 Complete Data Security Matrix
and recommended action
items.
3.1 Review data security gaps.
3.2 Identify gap-closing initiatives
in-scope for each of the seven
areas of data security.
3.4 Allocate cost and effort
values, and prioritize
initiatives.
3.5 Compare against compliance
obligations and security
frameworks.
3.6 Define execution waves and
initial timeline for
implementation.
4.1 Finalize implementation
roadmap for data security
initiatives.
4.2 Identify scope of employees or
end-users impacted by
changes to data security
landscape.
4.3 Develop training plan for any
process or administrative-
based initiatives.
4.4 Define metrics for
measurement of data security
program success.
4.5 Outline schedule for
monitoring and reporting.
5.1 Complete in-progress
deliverables from previous
four days.
5.2 Set up review time for
workshop deliverables and to
discuss next steps.
Deliverables
1. Set of data security objectives
2. Mapped compliance matrix
3. Defined project scope
1. Revised Data Inventory
2. Completed Data Security Matrix
(compliance frameworks and
security controls)
1. Finalized Data Security Matrix
2. Completed Data Security
Initiatives list
1. Finalized Data Security
Roadmap
2. Data Security Metrics
3. Outline for Data Security
Technical report
1. Completed and delivered Data
Security Technical Report and
Executive Presentation
2. Additional recommendations
based on days one to four of
workshop
Contact your account representative for more
information.
workshops@infotech.com 1-888-670-8889
Info-Tech Research Group | 20
Phase 1
Review Data Security Methodologies
This phase will walk you through the
following activities:
• Understand the Role of Data Security
• Review Current Security Controls
• Assess Compliance Frameworks
This phase involves the following
participants:
• CISO/InfoSec lead
• InfoSec managers
• InfoSec team
• CDO/ Data Lead
• IT Team (optional)
• Data and Analytics team
• DBAs
Info-Tech Research Group | 21
Step 1.1
Understand the role of data security
This step will walk you through the following
activities:
• Define your Drivers
• Review and Update Data Classification
Standard
This step involves the following
participants:
• CISO/InfoSec lead
• InfoSec managers
• InfoSec team (optional)
• CDO/ Data Lead
• Data and Analytics team
• DBAs
Activities
1.1.1 Define Your Business Drivers
1.1.2 Review and Update Data Classification Standard
Outcomes of this step
• Identification of the full set of business and
security drivers behind the
implementation of data security controls.
Info-Tech Research Group | 22
NIST Cybersecurity Framework
Know the Role of Data
Security
What part does data play in the
Information Security ecosystem?
Source: Adapted from NIST.gov, 2018
• Identify: Asset Management; Business Environment; Governance; Risk Assessment; Risk Management
• Protect: Access Control; Awareness and Training; Data Security; Info Protection and Procedures; Maintenance; Protective Tech
• Detect: Anomalies and Events; Security Continuous Monitoring; Detection Process
• Respond: Response Planning; Communications; Analysis; Mitigation; Improvements
• Recover: Recovery Planning; Improvements; Communications
• An organization’s security must adapt to the dynamic global landscape and can no longer rely
on static, one-dimensional measures.
• End-to-end security of the organization’s data relies on appropriate controls around the
systems, applications, sources, and methods of transmission of the data.
• By taking a defense-in-depth approach, data security is layered in. This ensures that multiple
methods are in place to cover all potential vectors for data loss, as well as external and
internal threats.
• While security of data is paramount, overall availability and access to the data must be at the
forefront of the control framework. This necessitates an understanding of core users and data
owners within the business.
Info-Tech Insight
An organization’s approach to data security must be as dynamic as the data it
protects. This depends on a clear understanding of what data will be protected, why
it’s protected (compliance, customer expectations, business drivers), and where it
exists throughout its lifecycle.
Info-Tech Research Group | 23
Data Security and the
CIA Triad
Effective data security covers all three
elements of the CIA triad.
Confidentiality
Availability
Integrity
Effective
Security
Program
• The CIA Triad is a common, accepted best-practice set of security objectives that guide the
development of a strong security program.
• Confidentiality: “The prevention of unauthorized disclosure of information.” For example, any
sensitive or high-risk data that must be handled or viewed only by specific parties must be treated
in a way that ensures its confidentiality throughout the lifecycle.
• Integrity: “Ensures that information is protected from unauthorized or unintentional alteration,
modification, or deletion.” Data that is transferred between parties and/or sources must retain its
original format without interception resulting in changes to the structure of the data itself.
• Availability: “Information is readily accessible to authorized users.” This entails ensuring that access
control is appropriately defined and carried out, so that users are leveraging the appropriate data
based on their competencies and requirements.
Source: IAPP Privacy Program Management
Defense-in-Depth
A comprehensive, multi-tiered, multi-layered approach to information security that mitigates risk through
a varied set of security controls. This research applies a defense-in-depth framework to the set of data
security controls and their application within the organization.
Info-Tech Insight
Consider the pillar of non-repudiation as an integral
fourth component of the security program. Assurance of
a statement’s or information’s validity is a key facet of
ensuring data security.
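The integrity element of the triad can be illustrated with a minimal sketch in plain Python using the standard hashlib module. The record contents and field names below are hypothetical; the point is only that any unauthorized alteration of data changes its digest, which lets a receiver detect tampering.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest used as an integrity fingerprint."""
    return hashlib.sha256(data).hexdigest()

# The sender records a fingerprint before storage or transmission.
record = b"patient_id=1042,diagnosis=confidential"
expected = fingerprint(record)

# The receiver recomputes and compares; any alteration changes the digest.
assert fingerprint(record) == expected
assert fingerprint(b"patient_id=1042,diagnosis=public") != expected
```

A plain hash only detects accidental or unauthenticated change; a keyed HMAC would additionally prevent an attacker who can modify the data from simply recomputing the digest.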
Info-Tech Research Group | 24
Data Security: An Overview
A perspective on the global data security landscape.
279 days is the average time it took to identify a data breach in 2019, up from 197 days in 2018.
41% of individuals do not believe that companies care or take measures toward securing their personal data.
84% of these same individuals say they are more likely to remain loyal to companies with strong security controls in place.
A hacker attack occurs, on average, every 39 seconds.
Source: DataPrivacyManager
Info-Tech Research Group | 25
Case Study: Sina Weibo
INDUSTRY: Technology, Social Media
SOURCE: ZDNet, Security Boulevard, Business & Human Rights Resource Centre
Why did it happen?
Chinese social media platform Sina Weibo, similar to Twitter in terms of functionality,
suffered one of the largest data breaches of the decade in mid-2019. Account
information of over 530 million users was exposed, including phone numbers from
170 million of these users.
Details around the breach surfaced in 2020, as the stolen personal information was
discovered for sale for the mere price of $250 for the entire set of credentials, which did not
include account passwords. This in itself is quite shocking, and a testament to the
fact that data breaches have become so commonplace that unique identifiers are
unable to fetch a moderate price tag on the dark web.
The breach occurred as a result of an external hacker gaining access to Weibo and
completing a full-scale sweep of the company’s user database.
A lack of clarity still exists around how the data was obtained: it was originally said
to have come from a SQL database dump, but the company put forth a
contradictory statement indicating that data was swept by “matching contacts
against [Weibo]’s API.”
Data Source
• Database
Data Security
• Weibo claimed one-way encryption was used
• No passwords stored in plaintext
• Data separation/segregation
Resolution
• Weibo’s Cyber API detected posts where the stolen data was advertised for sale
• The Ministry of Industry and Information Technology (MIIT) encouraged Weibo to focus on privacy policies to align with cybersecurity regulations, as well as self-evaluation of the data security of new services
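The “one-way encryption” Weibo claimed refers to password hashing: storing only a salted, one-way derivation so plaintext passwords are never kept. A minimal sketch, assuming Python's standard hashlib and hmac modules (the function names and iteration count are illustrative, not Weibo's actual scheme):

```python
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 200_000) -> tuple[bytes, bytes]:
    """Derive a one-way PBKDF2 hash; only the salt and digest are stored."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes,
                    *, iterations: int = 200_000) -> bool:
    """Recompute the derivation and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("guess", salt, digest)
```

The random per-user salt defeats precomputed rainbow tables, and the high iteration count slows offline brute-force attempts against a stolen database.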
Info-Tech Research Group | 26
Evaluate the Intersection of Data Privacy
and Information Security
Information
Security
Information security aims to ensure
confidentiality, availability, and integrity of
information throughout the data’s lifecycle.
Common functions of information security
include:
Data Privacy
Data Privacy ensures that the rights of
individuals are upheld with respect to
control over how their respective
information is collected, used, and
processed. Data privacy emphasizes the
following principles:
Source: IAPP Privacy Program Management
• Risk Management
• Vulnerability Management
• Strategy and Governance
• Data Protection
• Incident Response
• Identity and Access Management
• Accuracy of information
• Access to information
• Accountability
• Confidentiality of information
Info-Tech Research Group | 27
Input
• Optional: Ask core team members to brainstorm a list of key data security program drivers as well as objectives
Output
• Identified set of data security drivers and objectives
• Grouped themes around data security drivers
Materials
• Whiteboard/flip charts
• Sticky notes
• Pen/marker
Participants
• CISO/InfoSec lead
• InfoSec managers
• InfoSec team (optional)
• CDO/Data Lead
• Data and Analytics team
• DBAs
1.1.1 Define your Business
Drivers
1 hour
1. Bring together relevant stakeholders from the organization. This should include members of
your InfoSec team, as well as DBAs and any other relevant members from IT that support
security procedures.
2. Using sticky notes, have each stakeholder write one driver for data security per sticky note.
• These may vary from concerns about customers, to the push of regulatory obligations.
3. Collect these and group together similar themes as they arise. Themes may include:
• Access Control and Management
• Data Lifecycle
• Data Integrity
4. Discuss with the group what is being put on the list and clarify any unusual or unclear drivers.
5. Determine the priority of the drivers. While they are all undoubtedly important, it will be crucial
to understand which are critical to the organization and need to be dealt with right away.
• For most, any obligation relating to an external regulation or governing framework (e.g.,
NIST SP 800-53) will become top priority. Non-compliance can result in serious fines
and reputational damage.
6. Review the final priority of the drivers and confirm current status.
Info-Tech Research Group | 28
Define the Data Lifecycle
During each of the six stages of the data lifecycle – Creation, Storage, Usage, Transmission, Archiving, and Destruction – there is an opportunity to embed security controls to protect the data.
Touchpoints across the stages include data capture, data entry, and data acquisition; flat files, databases, and data warehouses; analytics, reporting, and identity verification; secure FTP, email, and mobile communications; backups and duplications; and digital and physical destruction.
Info-Tech Research Group | 29
From creation through to
destruction, ensure your
data is protected and
secured.
Securing Data
through the Data
Lifecycle
Creation
Consider the starting point of data within the organization. This
includes collection of data from different sources and inputs
including computers, via email or secure file transfer, and
through mobile devices.
Storage
Data will be stored in various repositories, both on and off prem,
and increasingly in the cloud. Consider your current spectrum of
systems, applications, data warehouses and lakes, as well as
devices on which data may be at rest.
Usage
Evaluate how data is protected when it’s in use, both internally
and externally, by employees, contractors, and customers or
other types of end-users. How is data protected when in use, and
what parameters are in place around data usage?
Transmission
What methods does the organization currently use to transfer
data? How is data-in-transit protected, and are there current
technical controls as well as policies around data-in-transit?
Archiving
How does the organization currently archive data? Where is data-
at-rest being kept, and are there supporting retention periods and
documented policies around archiving procedures? Evaluate the
current security controls in place to protect data archives.
Destruction
When data reaches the end of its life, how is data permanently
disposed of? Consider both physical destruction of data
(shredding, melting, etc.) as well as digital destruction of data.
At the same time, consider potential future needs for data you may destroy –
for example, information requested as part of a legal proceeding.
• The ability to ensure data is secured throughout each
stage within the Data Lifecycle is a core component of
an effective approach and strategy to data security.
• This means leveraging both technical and process-
based controls that ensure end-to-end data security.
• Info-Tech’s research takes a multi-tiered approach, first
examining how data is secured within each of its
respective sources throughout each of the six stages
within the lifecycle.
Source: University of Western Ontario, Information Security Data Handling Standards
Info-Tech Research Group | 30
Data Security within the Five Tier Data
Architecture
[Figure: Info-Tech's Five-Tier Data Architecture]
• Tier 1 – Sources: App 1, App 2, Excel and other documents, Access database(s),
external data feed(s) & social media, flat files, IoT devices.
• Tier 2 – Integration: solutions include SOA, point-to-point, manual loading, ESB,
ETL, ODS, and data hub; functions include scrambling, masking, encryption,
tokenizing, aggregation, transformation, migration, and modeling.
• Tier 3 – Data Warehouse Environment (Protected Zone): data lakes and
warehouse(s) holding raw and derived data, data marts, EIM, ECM, DAM, MDM.
• Tier 4 – Reporting and Analytics: data marts, data cubes, BI tools, thought
models, formulas, derived data (from analytics activities).
• Tier 5 – Derived Content and Presentation: reports, dashboards, presentations.
This research will focus on securing both , as well as based
on Info-Tech’s Five-Tier Data Architecture.
Info-Tech Research Group | 31
Data Security Extends to All Data Sources
The purpose of an end-to-end data security program is to ensure that data is protected
in each location it exists in through each stage of its lifecycle.
Applications and Systems
Devices
Data Lakes
Data Warehouses
Data Marts
Databases
Flat Files
While most enterprise applications will
have built-in security controls, consider
focusing on process around user access
and privilege.
Includes both physical devices, as well as IoT-
enabled devices on which corporate information
may be accessed or stored. Devices are a difficult
challenge, as they necessitate complex, multi-tiered
data security controls.
Identify any databases that do not sit
behind enterprise applications and
evaluate the current set of controls in
place.
Sensitive data is often found in flat files as
well as relational databases; however, flat
files often lack adequate security controls.
Raw data is still sensitive data.
Consider who has access to data
lakes, and the types of information
stored.
Access to data warehouses should
be limited, and data housed inside
should be encrypted and properly
protected.
Often considered a segment of a data
warehouse, data marts contain
accessible client-facing data, and as
such require separate consideration
for data security controls.
For additional insight, download Info-Tech's research on how to
Build a Data Architecture Roadmap.
Info-Tech Research Group | 32
Scoping the Internal Accessibility of
Sensitive Data

With an increase in the volume of organizational data comes questions around
who is accessing the most sensitive and high-risk data, and why. An integral
part of data security is ensuring that the organization does not fall victim
to internal overexposure of data. Internal or insider threats pose a major
risk with respect to potential breaches.

53% of organizations leave 1,000 or more files with
sensitive data open to all employees.

17% of all sensitive files at these same organizations
were found to be accessible to every employee.

534,465 files at the average company contain sensitive data.

Info-Tech Insight
Data security isn't just a technical
process; it involves a thorough
assessment of the business from an
organizational perspective.

Source: Varonis 2019 Global Data Risk Report
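One way to begin scoping internal overexposure is a simple scan for world-readable files. The sketch below is a minimal, assumed approach using POSIX permission bits; production discovery tooling would also inspect ACLs, network-share permissions, and file contents.

```python
import os
import stat

def find_world_readable(root: str) -> list[str]:
    """Walk a directory tree and return files whose POSIX mode grants
    read access to every user (the 'other' read bit is set)."""
    exposed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mode = os.stat(path).st_mode
            except OSError:
                continue  # skip files that vanished or can't be stat'ed
            if mode & stat.S_IROTH:
                exposed.append(path)
    return exposed
```

Running this against a file share gives a first, rough inventory of files "open to all employees" in the sense the Varonis statistics describe.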
Info-Tech Research Group | 33
Securing Data Within the Governance
Framework
Data Governance can be
thought of as the engine
that enables your data to be
transformed into the power
needed to drive your
organization up the data
value chain.
Data Security plays a key
role in the ecosystem of
data governance. The need
for more secure data drives
the adoption of security
policies that adapt to
changing requirements and
control access to sensitive
data.
[Figure: the organization feeds business needs, people, and data in
("Organization Fuel-In"); the governance engine transforms data into
information, knowledge, and shared insight up the data value chain
("Info-Tech Power-Out"). Governance components include:]
• Data policies and procedures (PnP) and a communication plan
• Data security and audit
• Data risk management
• Data integration and data modeling
• Data storage & ops
• Data warehousing & BI
• Documents and content
• Metadata
• Master and reference data management
• Data architecture
Info-Tech’s Data Governance Framework
For additional insight, download Info-Tech’s Establish
Data Governance.
Info-Tech Research Group | 34
Define data classification
in the context of your
organization
Build out a data classification scheme that fits the
operating and regulatory environment of your
organization.
With the increase in data and digital advancements in communication and storage (e.g. cloud),
it becomes a challenge for organizations to know what data exists and where it lives. A
classification scheme must be properly implemented and socialized to help ensure appropriate
security measures are applied to protect that data.
What is data classification?
Data classification is the process of identifying and classifying data on the basis of sensitivity
and the impact the information could have on the company if the data is breached. The
classification initiative outlines proper handling procedures for the creation, usage, storage,
disclosure, and removal of data.
Why do we need it?
Structured
• Highly organized data, often in a relational, easily searchable
database.
• E.g. employee numbers stored in a spreadsheet
Unstructured
• Data that is not pre-defined in format and content; the majority
of data in most organizations.
• E.g. free text, images, videos, audio files
Semi-structured
• Information not stored in a traditional database but containing
some organizational properties.
• E.g. email, XML
Types of data
Without data
classification, an
organization treats all
information the same.
Sensitive data may
have too little
protection.
Less sensitive data
may have too much
protection.
Strategically
classifying data will
allow an organization
to implement proper
controls where
necessary.
Info-Tech Research Group | 35
Appropriate classification of data is the first step to
securing the organization’s data.
The Role of Data
Classification in Data Security
• You can’t secure something without first knowing what it is you’re securing. This blueprint requires data
classification as a pre-requisite, in order to focus in on the organization’s most high-value, sensitive, or high-
risk data.
• Info-Tech’s five tiers of data classification are outlined below.
• This research takes a three-phase approach to developing a repeatable data
classification program:
o Formalize the Classification Program
o Discover the Data
o Classify the Data
If you do not currently have a data inventory in place,
or lack a classification standard, refer to this research
in order to identify data in-scope for the Secure Your
High-Risk Data blueprint. This includes tier 4 and/or
tier 5 classified data.
Level 5 (Top-Secret): Data intended for use only by the authorized personnel whose unauthorized
disclosure could be expected to cause exceptional damage to corporate or national security.
Level 4 (Confidential): Data that is kept private under federal, local, or state laws or
contractual agreements or to protect its proprietary worth.
Level 3 (Internal): Data intended for use within the organization. Unauthorized external disclosure
could adversely impact the organization/customers/partners.
Level 2 (Limited): Data that is not openly published but can be made available via open record
requests. Direct access available only through authenticated and authorized employees.
Level 1 (Public): Data that is readily available to the public with anonymous access.
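The five levels above can be expressed as a simple lookup from classification tier to minimum handling controls. This is an illustrative sketch: the tier names follow the standard above, but the specific controls and the helper function are assumptions for illustration, not Info-Tech requirements.

```python
from enum import IntEnum

class Classification(IntEnum):
    """The five classification tiers, lowest to highest sensitivity."""
    PUBLIC = 1
    LIMITED = 2
    INTERNAL = 3
    CONFIDENTIAL = 4
    TOP_SECRET = 5

# Illustrative minimum handling controls per tier (assumed, not from the source).
HANDLING = {
    Classification.PUBLIC: {"encrypt_at_rest": False, "access": "anonymous"},
    Classification.LIMITED: {"encrypt_at_rest": False, "access": "authenticated"},
    Classification.INTERNAL: {"encrypt_at_rest": True, "access": "employees"},
    Classification.CONFIDENTIAL: {"encrypt_at_rest": True, "access": "need-to-know"},
    Classification.TOP_SECRET: {"encrypt_at_rest": True, "access": "named-individuals"},
}

def in_scope_for_blueprint(level: Classification) -> bool:
    """Tier 4 and tier 5 data is in scope for Secure Your High-Risk Data."""
    return level >= Classification.CONFIDENTIAL
```

Encoding the standard this way makes the in-scope rule (tiers 4 and 5) explicit and testable rather than tribal knowledge.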
Download Info-Tech’s Discover and Classify
Your Data research.
Info-Tech Research Group | 36
Evaluate
Software
Solutions for
Data Discovery
and
Classification
We collect and analyze the most detailed reviews on enterprise
software from real users to give you an unprecedented view into the
product and vendor before you buy.
Learn from the collective knowledge of real IT professionals
• Know the products and features available
• Explore module and detailed feature-level data
• Quickly understand the market
Evaluate market leaders through vendor rankings and awards
• Convince stakeholders with professional reports
• Avoid pitfalls with unfiltered data from real users
• Confidently select your software
Cut through misleading marketing material
• Negotiate contracts based on data
• Know what to expect before you sign
• Effectively manage the vendor
Info-Tech Research Group | 37
Your organization’s most sensitive data will most likely fall into one of the following five
categories:
Know the Archetypes of your High-Risk
Data
Payment Card
Industry (PCI)
Data obtained as a part of the
financial transaction process.
This includes cardholder
name, expiration date, PIN,
and magnetic stripe contents.
Intellectual
Property (IP)
Department of
Defense (DoD)
For organizations that deal
with DoD clients, certain
categories of top secret or
classified information may
necessitate a separate
classification tier and set of
handling procedures.
Personal Health
Information (PHI)
Personally
Identifiable
Information (PII)
An organization’s secret sauce.
IP refers to data including
trade secrets, merger and
acquisition plans, and
upcoming product plans.
Covered by governing
regulations such as HIPAA or
PHIPA, this data includes
medical history, health
insurance numbers, and
biometric identifiers.
Certain types of PII can be
classified as highly sensitive,
dependent on the governing
privacy regulation. These
include social security
numbers, driver’s licenses, as
well as financial account
information.
Info-Tech Research Group | 38
Input Output
• Current data classification
standard
• Current data classification
policy
• Info-Tech’s Data Classification
Standard
• Revised data classification
standard
Materials Participants
• Laptop/computer
• Whiteboard
• Sticky notes
• Pen/paper
• Markers
• Data Classification Standard
• Data Classification Policy
• User Data Handling
Requirements tool
• CISO/InfoSec lead
• InfoSec managers and team
• IT Director
• IT team (optional)
• CDO/Data Lead
• Data and Analytics team
1.1.2 Review and Update
Data Classification
Standard
1 – 2 hours
1. Meet with all relevant stakeholders from InfoSec and data teams and review the organization’s
current data classification standard.
• If no standard currently exists, review any documentation around data handling policies
and procedures. This may include any compliance-related documentation that outlines
data classification requirements.
2. In small groups, without relying on your current data inventory (if in place), write down all of the
data types that the respective group members believe are considered or classified within Levels 4
and 5.
3. Discuss each smaller group’s findings and identify any data types that were omitted or not
identified amongst the larger group.
4. Write these data types on sticky notes and group based on the five sensitive data archetypes
identified on the previous slide.
Download Info-Tech’s Data Classification Standard
and Data Classification Policy documents.
Info-Tech Research Group | 39
Input Output
• Current data classification
standard
• Current data classification
policy
• Info-Tech’s Data Classification
Standard
• Revised data classification
standard
• Revised data classification
supporting documents
Materials Participants
• Laptop/computer
• Whiteboard
• Pen/paper
• Markers
• Data Classification Standard
• Data Classification Policy
• User Data Handling
Requirements tool
• CISO/InfoSec lead
• InfoSec managers and team
• IT Director
• IT team (optional)
• CDO/ Data Lead
• Data and Analytics team
1.1.2 Review and Update
Data Classification
Standard (Cont.)
5. Identify whether or not changes need to be made at the
classification level (e.g., adding a separate level based on the
archetypes or additional changes in the organization’s
environment). Document the outcome.
6. As a group, determine whether the current set of descriptions for
each level encompasses all high-risk, highly sensitive, or high-value
data within the organization.
• Be forward-looking. Are there types of information that aren’t
collected or processed today, but may be in the near future, and
aren’t yet included within the classification levels?
7. Reword and update the current Data Classification Standard document
as necessary and note the date of the changes.
8. Update any additional supporting documents, including Data
Classification Policy, Steering Committee Charter, and relevant
categories within the Data inventory.
Info-Tech Research Group | 40
Step 1.1 Step 1.2 Step 1.3
Step 1.2
Review Current Security Controls
This step will walk you through the following
activities:
• Document Current Data Security Controls
• Review and Validate Data Inventory
This step involves the following
participants:
• CISO/InfoSec lead
• InfoSec managers and team
• Data team/DBAs
• IT Director
• IT team (optional)
Activities
Outcomes of this step
• Current State Assessment of
comprehensive set of data security
controls based on Info-Tech’s seven
control categories.
Review Data Security Methodologies
1.2.1 Document Current Data Security Controls
1.2.2 Review and Validate the Data Inventory
Info-Tech Research Group | 41
Access Control’s Role in Securing Data
Establish appropriate mechanisms, processes, and techniques to create a
comprehensive access control framework for the organization’s sensitive data.
Protect the Physical Environment
Ensure protection of sensitive data and valuable organizational information
by putting in place specific controls around access to the physical location,
which extends to all components of the facilities and physical environment.
This includes mechanisms to secure ingress and egress points, as well as
access to high-risk assets that house sensitive data.
Protect Your Information Systems
A comprehensive set of access controls should address the
security of the organization’s information systems as well
as networks on which the systems operate. Preventing
unauthorized access here upholds security and integrity of
sensitive data.
Manage Personnel/Employee Access
Internal access control measures are an integral component of a
strong approach to access management. Ensure that the right
people have the right level of access to relevant systems and
information to prevent inappropriate access while enabling internal
collaboration.
Info-Tech Research Group | 42
A user-centric approach to access control.
• A model for access control that centers on the capabilities and role-specific functions or tasks of an
employee within the organization.
• The access controls established are built around the employee job function so that each specific job
function merits a set of privileges based on the functional requirements, which directly aligns the
organization with Identity and Access Management – a business-centric approach to IAM.
• Before moving to implement RBAC within the organization or integrate RBAC tools, you will need to map
out an exhaustive list of the organization’s systems, applications, and data stores/sources, and fully
understand each of the roles or individual employees and employee groups.
• It’s important that the organization document all roles and their corresponding functions for reference,
and an RBAC policy should exist.
WHY RBAC?
• Ensures that employees access the right information and information systems in order to complete their
roles, and do not have unlimited or global access.
• Easily tracked, monitored, and maintained and is role vs. person-specific.
Leverage Role-Based
Access Control (RBAC)
Full RBAC Management Model

[Figure: Role A granting access to Application 1, Application 2, and Application 3]

Source: Official Guide to CISSP CBK

For more information, leverage Info-Tech’s
research Simplify Identity and Access
Management and Mature Your IAM Program.
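The role-to-application mapping behind an RBAC model can be sketched in a few lines: permissions attach to roles, and users acquire access only through the roles they hold. The role, user, and application names below are hypothetical examples, not prescribed by this research.

```python
# Minimal RBAC sketch: permissions attach to roles, users acquire roles.
# Role and application names are hypothetical examples.
ROLE_PERMISSIONS = {
    "finance_analyst": {"erp", "reporting"},
    "hr_admin": {"hris", "reporting"},
}

USER_ROLES = {
    "alice": {"finance_analyst"},
    "bob": {"hr_admin"},
}

def can_access(user: str, application: str) -> bool:
    """A user may access an application only if one of their roles grants it."""
    return any(
        application in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )
```

Because access is role-specific rather than person-specific, revoking or reassigning a role updates every affected user at once, which is what makes RBAC easy to track, monitor, and maintain.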
Info-Tech Research Group | 43
RuBAC, MAC, and DAC and their roles in shaping
the organization’s access control strategy
An RBAC model is one of multiple approaches to access control frameworks for an organization. Consider
the following and their applicability based on your organization’s structure from both a human resources
standpoint, as well as the collective employee collaboration needs. Use the considerations on the left-hand
side of this slide to ensure all contributing factors are considered when looking at alternate approaches to
access control.
• Rule-Based Access Control (RuBAC): This approach builds on RBAC, but layers in a set of rules that define the
specific privileges allocated to an employee or end-user, as dictated by the system administrator. These rules
extend to dictating what the individual can do to, or with, the document vs. just enabling access.
• Discretionary Access Control (DAC): Discretionary Access Control is a data-centric approach to access
control that places the controls on the data vs. the user, as with role-based approaches. Data is assigned
a data owner, and he/she has the capability to change and amend access to the data as needed – an
approach that centers around effective data governance and ownership.
• Mandatory Access Control (MAC): Centered on the system’s management of access controls, and as
such is controlled entirely by the administrator, who sets permissions, policies, and manages the
process. This access control approach is a good fit for high-sensitivity systems and/or data as it tends to
provide more robust security around access to data.
Leverage Alternate
Access Control Methods
Considerations for Access
Control Strategy
Implementation
• Do we have a large number of
different roles that require specific
access to data types and information
systems?
• Who is going to manage this
process, and what level of complexity
will each of these methodologies add
to our existing process?
• Do we have a strong data
classification and data ownership
model in place already?
• Do we anticipate significant re-
training efforts and/or user friction?
Source: Official Guide to CISSP CBK
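The distinguishing feature of DAC above — a data owner who amends access directly — can be illustrated with a toy object that carries its own access list. This is a minimal sketch with hypothetical names, not a reference implementation of any access control product.

```python
# Minimal DAC sketch: each object has an owner who manages its ACL directly.
# User names are hypothetical examples.
class Document:
    def __init__(self, owner: str):
        self.owner = owner
        self.acl: set[str] = {owner}  # the owner always has access

    def grant(self, requester: str, user: str) -> None:
        """Only the data owner may amend access (discretionary control)."""
        if requester != self.owner:
            raise PermissionError("only the owner can change the ACL")
        self.acl.add(user)

    def can_read(self, user: str) -> bool:
        return user in self.acl
```

Contrast this with MAC, where the same `grant` decision would sit with a central administrator and policy engine rather than with the object's owner.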
Info-Tech Research Group | 44
When working with high-risk data, PAM plays a
vital role.
• PAM enables organizations to manage and restrict employees’ and end-users’ privileged access.
• The foundation of PAM rests in the security principle of least privilege and focuses entirely on the
access and controls granted to the organization’s most privileged users.
• Why is PAM integral in data security? Malicious attackers often leverage the organization’s privileged
accounts, as they are a targetable entry point that allows access to an organization’s critical systems
and highly-sensitive data. With access to privileged account credentials, these attackers become
potentially malicious insiders, and pose a significant threat to the organization’s sensitive data.
• Privileged accounts are those accounts with extensive capabilities and control over specific systems
or applications.
• PAM solutions mitigate risk around over-provisioning of user privileges, automate and centralize the
process of credential management, and provide an increased visibility into the organization’s gamut of
privileged accounts and credentials.
Assess Privileged Access
Management (PAM)
Source: Microsoft PAM for Active Directory

[Figure: the four stages of the PAM lifecycle]
• Prepare: identify privileged users/groups
• Protect: establish authentication requirements
• Operate: approved requests obtain temporary access
• Monitor: view history, audit, review alerts and reports
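The "Operate" stage — approved requests obtaining temporary access — can be illustrated with a toy time-boxed grant. This is a sketch of the concept only, not the API of Microsoft PAM or any other product.

```python
import time

class TemporaryGrant:
    """Toy time-boxed privilege: an approved request confers access that
    expires automatically, mirroring PAM's temporary-access model."""

    def __init__(self, user: str, system: str, ttl_seconds: float):
        self.user = user
        self.system = system
        self.expires_at = time.monotonic() + ttl_seconds

    def is_active(self) -> bool:
        """Access is valid only until the grant's expiry time passes."""
        return time.monotonic() < self.expires_at

# Hypothetical example: a DBA gets one hour of access to a production database.
grant = TemporaryGrant("dba_jane", "prod-db", ttl_seconds=3600)
```

Automatic expiry is the point: standing privileged credentials are replaced with short-lived ones, shrinking the window an attacker can exploit.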
Info-Tech Research Group | 45
Adopt appropriate measures to secure the
physical environment.
• In order to secure the organization’s high-risk, valuable assets, a layer of physical access control is
needed.
• This is the first layer in access control, and tangibly separates employees and end-users from the
environment in which systems or assets in question are stored.
• Effective access control involves a combination of physical and logical access control, which includes
authentication and authorization of an employee or end-user following clearance of physical controls.
• Physical access control can be heightened or elevated based on the importance of the environment it
protects. Examples for physical facility ingress and egress points include:
o Biometrics such as retinal scanning or fingerprints
o RFID key fobs
o Swipe cards
o Motion detectors
o Alarms
o Fences or barriers
Considerations for
Physical Access Control
The Role of Physical
Access Control
• How are visitors monitored and registered when
entering and exiting our facilities?
• How do we monitor physical maintenance
workers (cleaners, etc.) that may not be
employees of the organization?
• If equipment is moved off-site for maintenance,
are we removing sensitive (confidential or
restricted) data in preparation?
• What is the process for when employees lose,
misplace, or have stolen physical access
devices?
• In a remote work environment, how are we
promoting physical security of our employees’
workspaces?
Info-Tech Research Group | 46
Data Operations’ Role in Securing Data
End-to-end data security means that all stages within the data lifecycle must
include a layer of security controls, from creation to archiving and deletion.
Recovery Backups Deletions and Destruction
• An effective recovery strategy aims to
achieve target RTO and RPOs that
support the organization’s operating
cadence.
• Recovery strategies should be informed
by risk and include security services as a
core component.
• Vendor negotiations are often an
important part of an organization’s
recovery strategy, in order to provide
support for services or existing
technology supplied by the third party.
• Data security extends to ensuring data’s
availability (through backups) for users
and employees.
• Should a system crash occur, backups
enable the organization to leverage data
that may have seemingly been lost and
are key in maintaining operations.
• Effective backups rely on both an
established process and policy
documents, as well as technical controls
to add a layer of end-to-end protection for
the organization’s sensitive data.
• Data security does not end when the data
lifecycle does; it continues through to the
methods leveraged for deletion or
destruction of the data itself.
• Improper destruction of sensitive data
opens the organization to potential
breach opportunities.
• Consider your cloud environment(s) and
how they will change your approach to
data destruction and deletion.
• Data destruction includes physical
destruction methods and digital deletion.
Info-Tech Research Group | 47
• The process of effective data backup is derived from a set of controls meant to mitigate risk and ensure
compliance. Backups are enabled by various tactics applied to data in order to ensure operations can still be
carried out in the event of a loss scenario.
• Mirroring: Synchronous writing of a collection of data to two or more targets. This differs slightly from
the backup in the traditional sense in that it’s often used from one system to another.
• Caching: Technique that is used at remote sites to keep a local copy of the dataset in synch with a
master copy. This enables the user to access the data through a hash of the data on the cache being
sent to the master node, which then sends back to the cache.
• Replication: Asynchronous writing of a collection of data to two or more targets, generally leveraged
from one system to another. This control defines the number of additional but similar systems data is
written into.
• Continuous Data Protection: Technique or process of saving a parallel record of all write operations to
a journal log file as well as executing the corresponding file change. This process allows for extremely
granular RPOs with very little system overhead.
• Protection Algorithm: Controls (RAID, Reed–Solomon error correction) that define how data is written
to the media in order to safeguard against physical fault; generally at the storage medium level.
• Snapshot: A point-in-time image at the block level of the entirety of a given media set. The input/output
overhead of a snapshot remains fixed based on the media set and does not scale with the amount of
data being stored, exerting much lower impact on performance than a traditional backup.
• Backup: A true copy of data onto non-volatile media. Defines the nature and number of copies of data
on non-production volumes.
Full Backups
Incremental
Differential
Synthetic Fulls
Demystify Data Backups
Types of Backups
A full second copy of the data is made to another
media type (e.g., disk). Full backups provide a
complete copy of the data and are generally
conducted at a periodic cadence.
Backup type that only takes the changes in the
dataset that have occurred since the previous
incremental backup. This backup is generally
conducted at a more frequent cadence.
Backup type that captures all changes to the dataset
since the last full backup, regardless of any
intervening incremental backups.
A secure index of all changes against one or more
full restore versions of the dataset.
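The practical difference between incremental and differential backups comes down to the reference point used when selecting changed files. A minimal sketch, using file modification timestamps as the selection criterion:

```python
def incremental_set(files: dict[str, float], last_backup_time: float) -> set[str]:
    """Incremental: only files modified since the previous backup of any kind."""
    return {path for path, mtime in files.items() if mtime > last_backup_time}

def differential_set(files: dict[str, float], last_full_time: float) -> set[str]:
    """Differential: files modified since the last FULL backup, so each
    differential grows until the next full backup resets the baseline."""
    return {path for path, mtime in files.items() if mtime > last_full_time}
```

For example, with a full backup at t=0 and an incremental at t=10, a file changed at t=5 is skipped by an incremental taken at t=20 but still captured by a differential taken at t=20 — which is why restoring from differentials needs only the last full plus one differential.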
Info-Tech Research Group | 48
Effective recovery strategies hinge on
relevant backup protocols.
Implement Recovery
Techniques
• Recovery of sensitive data gives organizations a fail-safe in the case of physical damage or physical
drive failure, such as damaged assets or hardware, or logical drive failure, which includes corruption of
files or data.
• While organizations tend to invest significant time, money, and effort into backup solutions, a supporting
recovery strategy is often overlooked and underdeveloped.
• End-to-end data security necessitates a comprehensive recovery strategy and set of supporting
techniques to ensure critical or sensitive data is not unintentionally lost or destroyed.
• Common recovery site strategies include the following1:
o Dual Data Center: Applications are split between two separate physical data center locations.
o Internal Hot Site: Standby site that possesses necessary infrastructure to run identical
environments.
o External Hot Site: Recovery site provided by a third party that lacks exact replication of infrastructure.
o Warm Site: Facility leased by the organization that has some of the necessary equipment.
o Cold Site: Empty data center recovery site that is not equipped with necessary assets/hardware.
• Do the organization’s current disaster recovery
and business continuity plans include recovery
of security services?
• How is sensitive or high-risk data addressed
within the organization’s current recovery
processes?
• Are specific applications or information systems
with sensitive, critical, or high-risk data
prioritized within the recovery strategy?
• How often are we checking that the designated
recovery site meets requirements from a security
and operational perspective?
• Do we test our recovery process on a regular
basis (annually) and make changes as needed?
Considerations for Secure
Recovery
1Source: Official Guide to CISSP CBK
Info-Tech Research Group | 49
End-of-life sensitive and high-risk data
requires specific destruction and deletion
processes.
• For both digital forms as well as physical forms of sensitive data, ensure that appropriate
destruction, disposal, and deletion procedures are documented and carried out.
• Ensure that your end-of-life destruction methods align with any governing compliance
regulations and standards. Many regulatory laws will have limitations or guidelines in place
for data destruction.
o Physical: Disk shredding, burning, melting, degaussing
o Digital: Wiping, deletion, electronic shredding, solid state shredding,
overwriting/erasure/sanitization
• It is recommended that you keep a record of the destruction through a logbook and
document step-by-step processes that outline appropriate destruction techniques.
• Opportunities exist to employ a third-party service for data destruction; however, prior to
doing so, ensure that the company holds appropriate certificates of sanitization, complies
with relevant regulations, and documents its processes.
Establish Data
Destruction Processes
Data Destruction
Considerations
Time
Resourcing
Cost
While methods such as overwriting may seem
preferable to physical forms of destruction, consider
the time spent when evaluating high-capacity data
sources.
Are you planning to outsource data destruction, or will
the process be carried out in-house? If the latter is true,
do assigned members of staff have the knowledge and
capabilities to ensure proper data destruction?
For physical methods of destruction, there are often
high capital expenses associated with the process. This
extends to costs associated with degaussing products
as well as additional forensic methods that may be
leveraged as a part of the validation process.
Info-Tech Research Group | 50
Establish Appropriate Authentication
within the Organization
Managing the identity access controls around data extends to putting in place
appropriate and aligned methods for authentication.
Effective authentication often requires a combination of multiple techniques, as well as significant process redesign in order to safely
validate user identity. When including authentication as a part of the data security access control layer, consider both the strength of the
technical controls, as well as the time and effort it will take to implement and change behaviors amongst your employees.
Security Considerations
• Is the organization consistently taking on more sensitive or high-risk
information?
• Are the clearance and general security requirements from our
customers and partners increasing?
• What tools (identity, location, keys) can we use to help validate how
our employees are accessing corporate information?
Employee/User Considerations
• Will implementing this significantly impact employee experience?
• Will the challenges in implementing and changing behaviors
create resistance amongst employees to adhere to new
authentication standards?
• How will organization-wide authentication methods better
facilitate a remote-work environment?
Info-Tech Research Group | 51
Improve your chances of fending off external attacks
through MFA.
• Multi-Factor Authentication (MFA) has evolved from two-factor authentication and requires the use of two
or more different methods of authenticating a user’s identity.
• The intention behind effective MFA is that the different authentication mechanisms originate from separate
or independent channels, complicating the infiltration route for malicious attackers.
• MFA provides increased protection against phishing attacks, keyloggers, credential stuffing, and man-in-
the-middle attacks.
• When setting up MFA, it is integral to align the types of authentication mechanisms with the sensitivity
level and risk of the information or assets being accessed.
• In addition to the four primary authentication factors, more evolved versions of MFA,
called Adaptive MFA, incorporate (generally with the assistance of AI) contextual and behavioral elements
including:
o Time of access
o Device being used for access
o Type of network being accessed
MFA Credentials and
Authentication
Mechanisms
Evaluate the Layers of Multi-
Factor Authentication
Biometrics
Passwords
SMS / Email Code Verification
Tokens
Knowledge or Security Questions
Risk Score/Risk-Based Authentication
Social Media Platform Login
Info-Tech Research Group | 52
SSO extends authentication controls outside the realm
of the organization.
• Through SSO, an employee can access multiple different applications or systems within the
organization’s environment without signing in to each one individually.
• The process involves the Identity Provider authenticating user access to the various service providers
(think applications) through one singular use of credentials and log-in process.
• SSO significantly reduces user friction in how employees access information systems, applications, and
corporate data.
• Both SSL (Secure Sockets Layer) and S/MIME certificates are integral components of SSO. They
provide the employee or end-user the ability to access any SSL-enabled server that the individual has
been granted access to without needing to submit any additional verification steps.
Using Single-Sign On
for User Authentication
Source: IAPP Introduction to Privacy for Technology Professionals
SSO Website
Website App
Website/App
Source: Adapted from OneLogin
Trusted partnership
All Pages
Trusted
partnership
Identity
Provider
Info-Tech Research Group | 53
Completely Automated Public Turing test to
tell Computers and Humans Apart.
• CAPTCHAs differ slightly from the previous forms and mechanisms of authentication: rather than
validating a specific user’s identity, they have the singular purpose of verifying whether or not a user is
human, making them a type of Human Interaction Proof (HIP).
• Over their two-decade lifespan, the effectiveness of traditional CAPTCHAs has weakened. The
advent of AI technology has made the traditional anti-robot approach of CAPTCHAs relatively
easy to bypass.
• Additionally, CAPTCHAs often diminish the end-user experience; think overly complex characters,
or grids of images each containing only a tiny fragment of the object to identify.
• Google’s risk-based reCAPTCHA, which leverages AI and machine learning, provides a solution to
the dated CAPTCHAs of the past and helps inform techniques for effective CAPTCHA creation.
• While reCAPTCHA is currently used on over 4.5 million sites globally, there are significant
concerns around the user privacy implications of the system, including previously-installed
Google browser cookies.
CAPTCHAs Role in
Authentication
SI CAPTCHA
Math Captcha
Are You a Human
Geetest
WP reCAPTCHA
Tencent
Waterproof Wall
CAPTCHA
Solutions
Source: BuiltWith
Info-Tech Research Group | 54
Protect Data’s Integrity throughout its
Organizational Use
In order to ensure data’s accuracy and consistency from initial creation
through until destruction, controls around integrity and quality become
valuable security mechanisms.
Accuracy
Consistency
• Is the data in each system or source in which it appears, accurate and a true representation of its original format?
• During the process of transmitting data, has it been altered or modified in some way that it no longer retains its intent and accuracy?
• Data that retains its accuracy tends to be increasingly reusable and maintainable within the context of the organization’s data uses.
• Data accuracy can be compromised during transfers, due to either intentional or unintentional alterations made to the data.
• Data that is consistent drives its ability to be easily recovered, searchable, reliable, and traceable.
• Inconsistencies in data can be driven by a multitude of factors, including human error, compromised or impacted assets, as well as malicious external actors
(hackers).
• In implementing mechanisms and controls to ensure the consistency of sensitive data, organizations protect themselves from the harmful interception and
modification of their high-risk, high-value information.
Info-Tech Research Group | 55
Understand the role of hash functions in techniques
that support data integrity.
• Hash functions or algorithms secure data through the transmission or transfer process by taking an
input (data) and producing a small output (digest) that is dependent on and effectively unique to the input.
• If the input cannot be identified or deduced from the output, the hashing function is strong. Hashing
is known as a one-way (unidirectional) algorithm.
• Hashing detects whether data has been altered or tampered with while it is being transmitted from
the creator or sender to the recipient.
• Hash functions can be used and applied as a data security control in the following ways:
• Identification of versions of documents
• Password verification (authentication)
• Comparison of files to ensure equality and integrity
• Digitally signing documents (we will explore this in the following Digital Integrity section)
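The file-comparison use case above can be sketched with Python’s standard hashlib; the file contents here are illustrative:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # One-way: the digest cannot feasibly be reversed to recover the input.
    return hashlib.sha256(data).hexdigest()

original = b"Quarterly report v3 (final)"
received = b"Quarterly report v3 (final)"
tampered = b"Quarterly report v3 (FINAL)"

assert fingerprint(received) == fingerprint(original)  # transfer left data intact
assert fingerprint(tampered) != fingerprint(original)  # any change is detected
```

Comparing digests rather than full files is what makes hashing practical for integrity checks: the sender publishes a short fingerprint, and the recipient recomputes it locally.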
Data Hashing’s Role in
Ensuring Integrity
Hashing
Source: IAPP Introduction to Privacy for Technology Professionals
Hashing Algorithm
Plain text Hashed Text
Info-Tech Research Group | 56
Ensure accuracy and consistency of data from
sender to receiver.
• Digital signatures derive from eSignatures (electronic signatures) and serve as a digital certification or
fingerprint, ensuring that a document has not been altered since it was signed.
• Digital signatures can be grouped in to three separate Digital Signature Certificates:
o Class 1: Equipped with a baseline level of security, used only for low-risk, low-sensitivity data and
documents. Does not hold legal standing and confirms only name and email address.
o Class 2: Can be used for electronic filing of specific documents, such as tax filing purposes.
Equipped with a level of security that enables both business and personal use. Compares
information from sender with data in a recognized consumer database.
o Class 3: Highest degree of certificates for digital signatures and applies to the exchange of highly
sensitive or high-value data and information. Can be leveraged for patent and trademark filing,
procurement, etc.
• Digital signatures promote increased trust and transparency to better validate online interactions
between an organization and its customers, as well as third-parties and other external entities.
• Digital signatures help to provide proof of the origin, identity, status, and accuracy of an electronic
document, message, or transaction.
Validating Data Integrity
through Digital Signatures
$1,534.8
Million is the size of the
digital signatures market as
of 2019
APAC
Is the fastest-growing
market by region for digital
signatures
Elimination of paperwork
(and)
Government policies
supporting market growth
are the two key drivers for
the Digital Signature
market.
Source: PSMarket Research
Info-Tech Research Group | 57
Digital signatures leverage Public
Key Infrastructure (PKI) in order to
validate integrity of the document.
This requires the use of a
mathematical algorithm along with
one public and one private key.
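The sign-with-private-key, verify-with-public-key flow above can be illustrated with a deliberately tiny, insecure textbook RSA example. The constants are toy values for illustration only; real keys are 2048+ bits and come from a cryptographic library, never hand-picked numbers:

```python
import hashlib

# Textbook toy RSA parameters (illustration only).
p, q = 61, 53
n = p * q            # 3233: public modulus, shared with everyone
e = 17               # public exponent
d = 2753             # private exponent: (e * d) % ((p - 1) * (q - 1)) == 1

def digest(message: bytes) -> int:
    # Sign the hash of the document, not the document itself.
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    return pow(digest(message), d, n)               # sender uses the private key

def verify(message: bytes, signature: int) -> bool:
    return pow(signature, e, n) == digest(message)  # receiver uses the public key

document = b"Document/Data"
signature = sign(document)
assert verify(document, signature)
# Any alteration to the document changes its hash, so verification
# would (with overwhelming probability) fail for a modified copy.
```

The design point mirrors the diagram: only the holder of the private key can produce a signature that the freely distributed public key will accept.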
From Digital Signatures to Secure Data –
A Journey
Info-Tech Insight
The ways in which you secure your data evolves
to match the external landscape, and so should
you. Should digital signatures not currently be
used in the organization, understand how they
can fit within the landscape of your
organization as we see an increased prevalence
of remote work.
As remote work environments become the norm, an
uptake in the acceptance of digital signatures is
anticipated.
Public Key (Receiver)
Private Key (Sender)
Document/Data Digital Signature Document
Hash Algorithm Hash Value
Document/Data Recipient
Document/Data
Creator
Info-Tech Research Group | 58
An increasingly digital economy brings with it the
advent of a new form of digital integrity.
• Though a seeming outlier in traditional data security, blockchain upholds each pillar of the CIA triad.
• A blockchain keeps record of a series of transactions, each one of which generates a hash. This hash is
dependent on both the transaction itself, as well as the hash of the previous transaction.
o Nonce: “Number used only once” refers to a random whole number that is added to a hashed block
within a blockchain.
o Nodes: The spectrum of computers that contribute to a blockchain.
o Block: Refers to one individual block of records within a blockchain – the entire group of blocks
composes the blockchain.
• Blockchain is built on the following pillars:
o Decentralization: There is no central body governing access to the data contained within a blockchain,
due to its distributed ledgers; this ensures a level of data security, as there is no single surface or
point for attacks.
o Transparency: Although privacy is a paramount consideration of blockchain, the ledger lends itself to
the principle of transparent visibility by providing an essentially anonymized view of a full host of
transactions conducted within the blockchain.
o Integrity: The cryptographic hash functions of a blockchain ensure that the data entered into the
blockchain itself cannot be altered or modified.
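The hash-chaining that underpins this integrity pillar can be sketched in a few lines; the transaction strings and helper names are illustrative:

```python
import hashlib
import json

def block_hash(data: str, previous_hash: str, nonce: int = 0) -> str:
    """Hash a block's contents together with the previous block's hash."""
    payload = json.dumps({"data": data, "prev": previous_hash, "nonce": nonce}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def make_block(data: str, previous_hash: str) -> dict:
    return {"data": data, "previous_hash": previous_hash,
            "hash": block_hash(data, previous_hash)}

# Each block's hash depends on the previous block's hash, chaining them together.
genesis = make_block("genesis", "0" * 64)
b1 = make_block("A pays B 5", genesis["hash"])
b2 = make_block("B pays C 2", b1["hash"])
chain = [genesis, b1, b2]

def chain_is_valid(chain: list) -> bool:
    return all(curr["previous_hash"] == prev["hash"] and
               curr["hash"] == block_hash(curr["data"], curr["previous_hash"])
               for prev, curr in zip(chain, chain[1:]))

assert chain_is_valid(chain)

# Tampering with an earlier block invalidates every block after it.
b1["data"] = "A pays B 500"
b1["hash"] = block_hash(b1["data"], b1["previous_hash"])
assert not chain_is_valid(chain)
```

Because each hash embeds the previous one, rewriting any historical block forces recomputing every subsequent block, which is what makes the ledger effectively immutable.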
Blockchain Transactions
The Blockchain Buzz
Transaction is
requested
A representational
“Block” is created
“Block” sent to all
“Nodes” in the
network
“Nodes” approve
the transaction,
“Block” added to
Blockchain
Transaction
Complete
Source: Adapted from MLSDev
Info-Tech Research Group | 59
Blockchain vs. the GDPR
Evaluate the regulatory implications of blockchain
technology.
Although transparency is one of the key pillars of blockchain, the distributed ledger format of blockchain has raised questions around its viability for organizations in-
scope for the GDPR.
• Additionally, GDPR allocates to data subjects a set of specific rights with
respect to data correction, deletion, and access.
• Blockchain, however, as a result of the framework that upholds its pillar
of integrity, is immutable, and as such data can not be amended,
modified, or deleted once the transaction’s “block” has been added to the
blockchain.
• Article 5 (2) as well as Article 24 within the GDPR denote the
responsibilities of an identifiable data controller with respect to
processing of personal data.
• Blockchain, by nature, does not have a singular controlling body, but
instead is comprised of many individual contributors (Articles 12- 23).
Data Controller
Data Subject Rights
Lawful Bases
Data Minimization
• Of the six lawful bases for processing data (Article 6 – GDPR), legitimate interest is the most
tenuous, and the one that often necessitates further validation through a data protection
impact assessment (DPIA).
• When considered under the scope of the GDPR, legitimate interest is likely the only lawful basis
that applies to blockchain, calling into question the potential need to validate how and why
personal data is collected through a DPIA.
• This guiding principle of data protection ensures that the controller collects
the minimum amount of data necessary to perform the intended purpose and
retain it only for the necessary period.
• This conflicts with a distributed network’s framework of redundant data
storage – a key component of blockchain.
Info-Tech Research Group | 60
Prevent Secure Data from Leaving the
Organization with Data Loss Management
Within each stage of the data lifecycle, there is potential for high-value, high-
risk data to unintentionally leave the organization – a key problem that data
loss management controls address.
Data-at-Rest Data-in-Transit Data-in-Use
• Identify and track where information is
stored throughout the enterprise (data
warehouses, data marts, data lakes).
• Once located, assessment can
subsequently be conducted to ensure
that the locations in which the data sits
are adequately protected and align with
the compliance (external) and
classification (internal) rules.
• Data loss management also helps to
ensure data-at-rest stored outside of the
organization is protected.
• As data moves dynamically throughout its
lifecycle, its movement across the network
can be tracked, monitored, and assessed.
• As with data-at-rest, traditional DLP (data
loss prevention) solutions can analyze
network file transmission to identify any
concerns with respect to the data types
being transferred over the network.
• Appropriate controls (encryption, hashing)
can be placed around data-in-transit as a
result of this analysis, to protect data both
inside and outside of the organization.
• Data being leveraged on endpoints is an
integral part of data loss management, as it
aims to restrict improper use of data by
employees or other end-users.
• While traditional DLP solutions provide a
technical set of controls through rule sets
and limitations that prevent certain actions
from being taken, this must also be
supported through policy creation and
education of end-users and data owners.
Info-Tech Research Group | 61
Identify where DLP can add value.
• Data Leakage Prevention and Data Loss Prevention are often used interchangeably; both focus on
discovering and documenting all sensitive information stored within an environment while
subsequently controlling its movement in and out of that environment.
• NIST outlines data loss as a result of data leakage, data disappearance, or data damage.
• DLP solutions can be viewed as cumbersome, as they require significant planning and governance prior to
implementation and continuous monitoring and maintenance from a process owner.
• An integral component of effectively implementing a DLP solution, be it standalone or leveraged as
part of existing technology, is a strong understanding of what types of data the organization wants to
prevent from exfiltration – what is our most sensitive information, and how can we characterize it
through policies?
• Key features of DLP vendors include:
o Data discovery
o Policy creation capabilities and pre-set policies
o Environment scanning
o Endpoint support
o Centralized reporting
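A hypothetical, much-simplified version of the policy-based scanning such tools perform; the policy names and patterns are illustrative, and real DLP engines layer on many more detection techniques (fingerprinting, exact data matching, machine learning):

```python
import re

# Hypothetical rule set: patterns for data types the organization
# wants to stop from leaving the environment.
POLICIES = {
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan(text: str) -> list:
    """Return the names of the policies an outbound text violates."""
    return [name for name, pattern in POLICIES.items() if pattern.search(text)]

assert scan("Card on file: 4111 1111 1111 1111") == ["credit_card"]
assert scan("SSN 123-45-6789 attached") == ["us_ssn"]
assert scan("Meeting moved to 3 pm") == []
```

This is the characterize-through-policies step in miniature: the hard organizational work is deciding what belongs in the rule set, not running the match.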
DLP’s Role within the
Data Security Landscape
Global DLP Market
DRIVERS
• Data breaches
• Compliance and complexity of regulatory
demands
• Cloud computing
• Protection of sensitive data
• Opportunities for DLP in cloud instances
INHIBITORS
• Deployment complexity
• Business driver for external collaboration
• Requires comprehensive data inventory and
mapping of organization’s data
Expected Growth Rate by 2025
$4.7 billion
Source: BusinessWire
Info-Tech Research Group | 62
The Cloud Access Security Broker advantage.
• CASBs provide organizations with the elusive security layer sought after as the advent of cloud solutions
has enabled operations to become increasingly agile, scalable, and readily available.
• A CASB solution helps extend infrastructure security measures to the cloud environment, serving a different
purpose than IDaaS, web app and enterprise firewalls, and secure web gateways.
• CASBs can function either as on-premises or cloud-based security policy enforcement mechanisms, and
are generally a result of one of four specific use-cases:
o Increased Visibility
o Regulatory Compliance
o Data Security
o Threat Protection
• CASBs can be deployed in one of four ways:
o Log Collection
o Forward Proxy
o Reverse Proxy
o API
Secure Data in the Cloud
with a CASB Solution
Source: Adapted from McAfee
Source: SANS, Technical Approach to Securing SaaS CASBs
Simplified CASB Model
CASB
Endpoints Third
Parties
Remote
Users
On-
Premises
Info-Tech Research Group | 63
Revisit your network security to mitigate loss of
data-in-transit.
• Network segregation is the process of segregating or splitting the organization’s critical networks from
less-critical internal networks.
• Network segmentation involves creating smaller sub-networks from the larger corporate network.
• Both can play a vital role in the organization’s efforts to reduce the risk of external attack or infiltration
and are potential mechanisms for network security. Network segregation emphasizes safeguarding the
organization’s most critical information while it’s in transit.
• How? Both network segmentation and segregation mitigate risk by housing separate networks, effectively
enabling specific sensitive datasets to remain untouched and uncompromised in the case of a network
breach and creating additional hoops for malicious attackers.
• PCI DSS-compliant organizations are required to incorporate network segmentation or segregation as a
method of securing cardholder data.
• Effective network segregation efforts cannot take place in isolation. Instead of the InfoSec team owning the
process, IT and the organization’s networking team, as well as any other impacted departments and end-
users, need to be involved in order to ensure that the appropriate foresight and architectural considerations
are a part of the project process.
Segregate Your Networks
to Mitigate Data Loss
Source: Advisera
ISO 27002 Guidance on
Network Segregation
1. Divide large networks into separate
network domains
2. Consider physical and logical
segregation
3. Define domain perimeters
4. Define traffic rules between domains
5. Use authentication, encryption, and
user-level network access controls
6. Consider network integration with
business partners
Info-Tech Research Group | 64
Ensure that audit logs help monitor the effectiveness of
technical controls.
• Audit logs or audit trails ensure that any events or incidents that take place can be attributed to a point in
time and a specific individual or user.
• Software applications and systems, including your DLP solution, IDS, servers, and firewalls, should each
have a respective attributed audit log.
Effective Audit Logging Practices:
• Ensure that audit logs are kept for key systems and applications within the organization and are
aggregated in a central log management system.
• Logs should be regularly reviewed and analyzed, including alerts attributed to security incidents.
• Assign ownership over the audit log review process to an individual within your IT or InfoSec team.
• Establish and document a retention strategy for logs.
• Log capacity requirements should be identified and managed.
• Ensure log files are treated as sensitive, as the information they contain can be an open book for potential
malicious intruders.
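As an illustration of the practices above, a dedicated audit logger built on Python’s standard logging module might look like this; the logger name, file path, and event strings are all assumptions for the sketch:

```python
import logging

# A dedicated audit logger whose entries tie each event to a point in
# time and a specific user, written to a file that a central
# log-management system could then aggregate.
audit = logging.getLogger("audit")
audit.setLevel(logging.INFO)
handler = logging.FileHandler("audit.log")
handler.setFormatter(logging.Formatter("%(asctime)s user=%(user)s action=%(message)s"))
audit.addHandler(handler)

audit.info("read:customer_db", extra={"user": "jsmith"})
audit.info("export:salary_table", extra={"user": "jdoe"})
```

Each line now carries a timestamp, an accountable user, and the action taken, which is the minimum needed to trace an incident back to its start point.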
Audit your Data Security
Controls
Compliance
Accountability
Analysis
Risk Management
Audit logs are a key requirement of various
compliance and regulatory obligations, including
HIPAA, NERC-CIP, and ISO 27001.
By tracing the start point of specific issues or
incidents, audit logs help enforce user
accountability within the organization.
Tracking of event details enables an organization
to better understand individual risk elements
within the landscape of the operating
environment.
Assessment of events post-occurrence provides
insight on how to improve risk management and
general security practices.
Info-Tech Research Group | 65
Encryption in its many forms is an effective mechanism for securing data
throughout its lifespan within the organization and upholding the CIA triad.
Encryption as a Data-Specific Technique
to Secure Sensitive Information
Confidentiality
Encryption preserves confidentiality by ensuring that sensitive or high-risk
data is not visible when sent from one party to another, until it is decrypted by
the appropriate party. Encryption can be used for transmission of data but
also as a part of storage of data both internal to the organization, as well as in
external locations.
Availability
Encryption makes the secure transmission of data possible within
organizations, but also outside of organizations when there are fewer
protective controls that the data creators or owners can place on the data.
Secure availability and authorized access are key benefits of leveraging
encryption practices.
Integrity
The Integrity pillar within the CIA triad is upheld through the respective
processes of encrypting and decrypting messages by intended recipients so
that the information contained is not tampered with or changed in the
process of transmission, or while it is stored.
Info-Tech Research Group | 66
Protect data as it moves both within and
outside of the organization.
S/MIME and PGP (Email)
Public Key
Implement Encryption
for Data-in-Transit
SSL and TLS (Web Browser/Server)
SSH (Secure Remote Systems)
• S/MIME ensures that information sent via email verifies the sender through digital signature and
is encrypted, ensuring that only the intended recipient sees the information sent out.
• Provides end-to-end encryption, using asymmetric cryptography.
• Pretty Good Privacy (PGP) provides users the ability to make their own PGP public keys vs.
leveraging certificates.
• Secure Sockets Layer has evolved into Transport Layer Security, used to securely send data over
the internet.
• TLS is commonly used to encrypt data communicated between a web application and a server.
Source: IAPP An Introduction to Privacy for Technology Professionals
Private Key
• Secure Shell (SSH) enables safe remote access (think servers and off-prem computers).
• SSH uses public-key cryptography to authenticate the parties and symmetric-key algorithms to
encrypt the session itself.
• Asymmetric encryption
• Publicly available information
• In public-key infrastructure (PKI), one of the two keys
leveraged in the exchange of data is kept secret
• Sender and receiver are as such not required to
share the same key; one used for encryption, one
used to decrypt.
• Symmetric encryption
• In private-key cryptography, only one key is used in
the exchange of data (private key)
• Sender and receiver must exchange the same key
• Tends to be faster than public key
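As a small illustration of TLS in practice, Python’s standard ssl module builds a client context that enforces certificate validation and a modern protocol floor:

```python
import ssl

# Build a client-side TLS context with secure defaults: the server's
# certificate chain is validated and must match the hostname, and
# legacy SSL/early-TLS protocol versions are refused.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname
```

A socket wrapped with `context.wrap_socket(sock, server_hostname=...)` then carries application data over the encrypted channel, giving data-in-transit the protections described above.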
Info-Tech Research Group | 67
Stored sensitive data should be encrypted to ensure
end-to-end data security.
Implement Encryption
for Data-at-Rest
Hardware/Software
Storage-Level
Database
• Hardware encryption occurs at the device level, is self-contained, and leverages the device’s on-
board security, which means it does not impact the host system and as such tends to result in
improved performance.
• It is a less flexible and more costly form of encryption, as it cannot be easily extended to an
entire system.
• Software encryption involves public or symmetric key encryption and tends to be more
cost-effective than its hardware counterpart.
• Feature of storage security that encrypts data during the archival stage of its lifecycle.
• This includes encryption of data at the disk level (think hard drive – full disk encryption or FDE),
file level (file-based encryption or FBE), on tape drives, and arrays.
• Consists of the following encryption levels: cell-level, column-level, tablespace or database level,
and file-level.
• Encryption of databases is a feature of Generally-Accepted Privacy Principles for data privacy
and protection.
Transparent Data Encryption
(TDE)
This enables the data to remain protected and
secured as it is encrypted while at rest, but
transparent and decrypted while in use only by
these select authorized users.
How does this secure data? Disincentivizes hackers
from stealing data as it becomes unreadable when
not in use. In order to access or use the data, users
must have access to the encryption certificate and
“master key.”
Occurs at the database encryption level and allows
for sensitive data stored in tables/tablespaces to be
encrypted. Data can then be decrypted for use by
authorized users and transparently displayed.
Info-Tech Research Group | 68
A core concept explored within data privacy, data masking offers a valuable set
of potential options for retaining and protecting highly sensitive information.
Mask Data for a Privacy-Centric
Approach to Data Security
Obfuscation and Data Minimization
Data Masking Deidentification/Pseudonymization Anonymization
• The processes of making datasets
unrecognizable or untraceable through the use of
specific techniques.
• These include scrambling, substitution, shuffling,
nulling, redaction, deletion, and introducing
variances of numbers/dates/numerical
complexes.
• Data masking can be static, meaning it is masked
within the original environment prior to extraction
and transfer.
• Dynamic data masking occurs in real-time, within
the production database.
• Privacy regulations such as the GDPR reinforce
the importance of minimization techniques such
as deidentification and pseudonymization.
• Deidentified data is data that has had specific
values that attribute to an individual, replaced or
removed, so that it is no longer directly
identifiable to a data subject.
• Pseudonymized data is one step back, as it
removes personal identifiers from a dataset;
however, it does not eliminate the possibility of
the data being reidentified should another piece
of personal data be added to the mix.
• When data has no reasonable way of being
reattributed to its relevant data subject, it is
considered anonymized.
• All personal identifiers, in this case, have been
removed, and can no longer be re-associated in
hopes of reidentifying the data subject.
• Data that has been fully anonymized often
encounters issues around usability from the
perspective of analytics.
• It is difficult to guarantee full anonymization of
data, and anonymization is often confused with
deidentification in practice.
Source: IAPP Privacy Program Management, PrivSector Report
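A toy sketch of masking and pseudonymization on a single record; the field names, salt, and truncation length are illustrative choices for the sketch, not recommendations:

```python
import hashlib

# Hypothetical record for illustration.
record = {"name": "Ada Lovelace", "email": "ada@example.com", "amount": 120}

def pseudonymize(value: str, salt: str = "org-secret") -> str:
    # Replaces a direct identifier with a consistent pseudonym. Because
    # re-identification is possible given the salt and a mapping, the
    # output is still personal data under the GDPR.
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def mask_email(email: str) -> str:
    # Static redaction: keep only the first character of the local part.
    local, _, domain = email.partition("@")
    return local[0] + "***@" + domain

safe = {
    "name": pseudonymize(record["name"]),
    "email": mask_email(record["email"]),
    "amount": record["amount"],  # non-identifying values pass through
}

assert safe["email"] == "a***@example.com"
assert safe["name"] != record["name"]
assert pseudonymize("Ada Lovelace") == safe["name"]  # consistent pseudonym
```

The consistency of the pseudonym is what keeps the data usable for analytics, and also why it remains re-identifiable, unlike full anonymization.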
Info-Tech Research Group | 69
Data privacy laws fuel the need to mask sensitive
data.
• The role of data obfuscation is to make data unreadable, uninterpretable, useless or unusable to anyone
other than the intended recipient or user.
• Data masking and tokenization are examples of data obfuscation.
• Tokenization refers to replacing sensitive data with tokens acting as placeholders, while the
original data is held in a secure storage environment. The token(s) do not have an algorithmic
or mathematical attachment to the original input data, so they cannot be reversed without
access to that storage.
• This form of obfuscation is popular in payment systems and financial services (think card
numbers and bank account numbers).
• Deidentification and pseudonymization are examples of data minimization, both of which are employed
in the context of personal data, PII, and privacy regulations.
• Privacy regulations and industry standards, including GDPR and PCI DSS, have specific instructions
around the appropriate use of data obfuscation and minimization techniques as a part of data security.
• The GDPR, for example, does NOT consider fully anonymized data to be personal data, as it is
not attributable to an individual, whereas pseudonymized and deidentified data can be
reattributed and is therefore still considered personal data.
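A toy in-memory token vault illustrating the idea; a real vault is a hardened, access-controlled service, not a Python dict:

```python
import secrets

# Toy vault. The key property: the token carries no algorithmic or
# mathematical relationship to the original value.
_vault = {}

def tokenize(card_number: str) -> str:
    token = "tok_" + secrets.token_hex(8)   # random, not derived from input
    _vault[token] = card_number             # original kept in secure storage
    return token

def detokenize(token: str) -> str:
    return _vault[token]                    # restricted to authorized systems

token = tokenize("4111111111111111")
assert token.startswith("tok_")
assert "4111111111111111" not in token
assert detokenize(token) == "4111111111111111"
```

Because the token is random, a stolen token reveals nothing about the card number; only a lookup against the secured vault can recover it.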
Apply Obfuscation and
Minimization Techniques
Fully-identifiable
personal data
Pseudonymized
personal data
Deidentified
personal data
Anonymized
personal data
Sources: IAPP Privacy Program Management, Tokenex
Info-Tech Research Group | 70
Input Output
• Review of Section 1.2 slides
and information
• Data Security Matrix
• Completed Tabs 3 to 10 in the
Data Security Matrix
• Overview of all data security
controls in place within the
organization
Materials Participants
• Laptop
• Sticky notes
• Markers
• CISO/InfoSec lead
• InfoSec managers and team
• Privacy Officer/Privacy
Program Manager
• IT Director
• IT team (optional)
1.2.1 Document Current
Data Security Controls
2 - 6 hours
1. Using the Data Security Matrix tool, complete tabs 3 to 9 for all relevant data sources.
• Note that prior to completing this Matrix tool, you will need to decide on the scope of the applications and
systems evaluated. For example, you may choose to evaluate ALL applications and systems in which data
is stored (see instructions on Tab 3 for this option), or scope down to only a specific application or
system.
2. In column E, select the Framework or Industry Standard your organization currently aligns with, wishes to align to,
or is in scope for.
3. For each control item listed in column D, identify whether or not the control is in place for the data source in
question, if it is partially deployed, or select N/A for controls that do not fit the particular data source or stage of
the lifecycle. Note that selecting N/A will not impact the overall data security average.
• Use columns F to K to match the control with the stage within the data lifecycle.
• Note that “N/A” has been pre-selected for many of the controls that do not logically apply.
4. Review the “Security Average” in column L, and revisit controls that have listed “No” for more than three of the
stages within the lifecycle. Identify whether or not these control areas require immediate or future focus.
5. Define and document a gap-closing initiative for data security controls in column M and/or include any relevant
comments around why the control is or is not in place.
6. As a group, discuss outcomes and compare with the set of compliance or standard requirements for each control.
Download Info-Tech’s Data Security Matrix
Info-Tech Research Group | 71
Reference the Organization’s Data
Inventory
After completing the Data Security
Matrix for each of the relevant data
sources within the organization,
validate that the data sources
evaluated during Activity 1.2.1 include
all of the data locations within the
organization.
• Review
• Validate
• Revise and Update
• Document
Info-Tech Insight
The data inventory should not be viewed as a static repository or compliance requirement, but as a dynamic control itself within the data security process. Ensure it
is regularly updated, revised, and assessed in order to build out the process surrounding effective data security within the organization.
Info-Tech Research Group | 72
Input Output
• Inventory of all data within the
organization
• Outputs from Activity 1.2.1
• Updated data inventory
• Comprehensive and validated
list of applications, systems,
and physical locations that
currently house highly-
sensitive or high-risk data (tier
4/5)
Materials Participants
• Optional: Info-Tech’s Data
Classification Inventory tool.
• Alternatively, leverage the
organization’s data
classification inventory for
reference
• CISO/InfoSec lead
• InfoSec managers and team
• Data team/DBAs
• IT Director
• IT team (optional)
1.2.2 Review and Validate
the Data Inventory
1 hour
1. As a group, review the current data inventory. If using Info-Tech’s Data Inventory, open Tab 3
– Data Inventory and filter for any data considered in the Sensitive or Classified level (tier 4/5).
• If you have completed your own data inventory, ensure you filter to focus on only the
highest and/or second highest tier(s) of data.
2. Review Column I – Repository, and Column J – Physical Location, which details where the data
resides. Ensure that any additional locations in which the data exists while in-use are noted
within the inventory, as these will be mapped to help validate the Data Security Matrix.
3. Document the full list of repositories. Include any data sources captured in Activity 1.2.1 that
may have been omitted.
4. In Tab 4 – Repository Classification, cross-reference repository controls listed in Columns F to AK
with outputs from the Data Security Matrix tool, to ensure that both the Inventory and Matrix
include controls in place for applications and systems.
Assess and update the current data inventory for tier 4/5 sensitive or
confidential data.
Download Info-Tech’s Data Classification Inventory, or
leverage the organization’s current data inventory tool
Info-Tech Research Group | 73
Step 1.1 Step 1.2 Step 1.3
Step 1.3
Assess Compliance and Regulatory
Frameworks
This step will walk you through the following
activities:
• Identify Compliance Concerns
• Review the Compliance Frameworks
This step involves the following
participants:
• CISO/InfoSec lead
• InfoSec managers and team
• Privacy Officer/Privacy Program
Manager
• Internal Audit
• IT Director
• IT team (optional)
Activities
Outcomes of this step
• Mapped comparison of the compliance
obligations across all data sources
• Identification of current data security
compliance gaps
Review Data Security Methodologies
1.3.1 Identify Compliance Concerns
1.3.2 Review the Compliance Frameworks
Info-Tech Research Group | 74
A perspective on the global data privacy and compliance landscape.
Data Privacy Compliance –
An overview
107
Countries in total have established data protection
or data privacy laws, while only 18% of countries
have no data protection laws in place.
79%
Of individuals state that they are very or somewhat
concerned about how companies are using the data
they collect about them.
21%
Of individuals believe that companies (vs.
government or users) are responsible for
maintaining data privacy.
Info-Tech Insight
Effective privacy and compliance help drive
consumer confidence. Effective data privacy
practices can give you a competitive advantage
through transparency.
Source: DataPrivacyManager
Info-Tech Research Group | 75
1.3.1 Identify Compliance Concerns
45 minutes

Input
• (Optional) Stakeholders involved can prepare a list of current compliance obligations.
• Printed or digital document version of all relevant compliance and industry standards or frameworks.

Output
• Collective understanding of key compliance concerns and drivers within the organization.
• Prioritized list of compliance concerns.

Materials
• Laptop
• Sticky notes
• Markers

Participants
• CISO/InfoSec lead
• InfoSec managers and team
• Privacy Officer/Privacy Program Manager
• Internal Audit
• IT Director
• IT team (optional)
1. Bring together relevant stakeholders from the organization with significant knowledge around
compliance obligations and governing regulations.
2. Using sticky notes, have each stakeholder write down one key concern or question around
compliance adherence per sticky note.
3. Collect these and group together similar themes as they arise. Themes may include:
• Access Control and Management
• Data Lifecycle (deletion, archiving, retention periods)
• Disclosure and Open Access of Data
4. Discuss with the group what is being put on the list and clarify any unusual or unclear
obligations.
5. Determine the priority of concern, or severity, for each of the compliance adherence questions.
Ensure any high-concern or high-severity questions are actioned first.
6. Discuss and document for future reference.
Discuss and document the primary compliance and regulatory
obligations of the organization.
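Teams capturing the sticky-note exercise digitally can sketch the grouping and prioritization in steps 3 and 5 as below; the themes, questions, and severity scores are sample values for illustration, not prescribed ones.

```python
from collections import defaultdict

# Illustrative sketch: group each captured concern under its theme, then rank
# themes by their most severe question so high-concern items are actioned
# first. All values below are sample data.
concerns = [
    {"theme": "Access Control and Management",
     "question": "Who approves privileged access?", "severity": 3},
    {"theme": "Data Lifecycle",
     "question": "Are retention periods enforced on archives?", "severity": 5},
    {"theme": "Disclosure and Open Access of Data",
     "question": "Is third-party disclosure logged?", "severity": 4},
    {"theme": "Data Lifecycle",
     "question": "Is deleted data purged from backups?", "severity": 2},
]

grouped = defaultdict(list)
for concern in concerns:
    grouped[concern["theme"]].append(concern)

# Severity 5 = highest concern; themes sort by their worst open question.
ranked = sorted(grouped.items(),
                key=lambda item: max(c["severity"] for c in item[1]),
                reverse=True)

for theme, items in ranked:
    worst = max(c["severity"] for c in items)
    questions = [c["question"] for c in sorted(items, key=lambda c: -c["severity"])]
    print(f"{theme} (max severity {worst}): {questions}")
```

Here "Data Lifecycle" surfaces first because it contains the severity-5 retention question, which matches the activity's instruction to action high-severity items before the rest.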
Secure Your High Risk Data
  • 4. Info-Tech Research Group | 4 Analyst Perspective Secure your data to help secure your business. Cassandra Cooper Senior Research Analyst, Security, Risk, and Compliance Info-Tech Research Group Gone are the days of simply protecting the network perimeter, or securing your primary assets, and resting comfortably at night knowing that your business’ most important data would not be at risk. An expanded global economy brings with it an expectation of collaboration – not just internally, but between different organizations in different areas of the world. Data has become an asset, and unlike the devices to which it travels, data is not static or stationary, but dynamic and fluid in nature. Throughout its lifecycle, data will live in a multitude of repositories and move through various sources. The extent of a business’ data sources no longer lie within the confines of the office or primary workspace, a set of easily-controlled devices, or even a physical data center – organizations increasingly keep high volumes of sensitive, valuable data in the cloud. While accessible and convenient, this poses its own set of data security risks and questions. As a result, business and IT leaders must map the flow of data throughout each stage of its lifecycle. In order to properly account for information that poses a risk to the organization should it be lost, corrupted, or used for malicious intent, the IT leader’s mindset must switch from securing the assets and the network, to securing the data itself.
  • 5. Info-Tech Research Group | 5 Executive Summary Your Challenge Common Obstacles Info-Tech’s Approach Securing data is no longer as simple as implementing a singular set of controls. Sensitive and high-risk data now lives in various repositories both in and out of the organization, in both on-prem and cloud environments. Layer in the process of exchanging data and ensuring secure transfer of this information while still making it accessible to the intended group of end-users, and the challenge becomes increasingly complex. To implement appropriate security controls, InfoSec leaders must first understand: • What data they need to keep safe • Where it exists within the organization Additionally, organizations must understand the current compliance and regulatory obligations based on location and industry. Finally, InfoSec leaders must select a combination of technical and process controls that fit the business environment and reduce user friction. Info-Tech’s Secure Your High-Risk Data takes a multi-faceted approach to the problem that incorporates foundational technical elements, compliance considerations, and supporting processes and policies. • Assess what technical controls currently exist within the organization and consider additional controls. • Review compliance obligations and information security frameworks (NIST, CIS) for guidance. • Develop a set of data security initiatives that involve both technical and supporting procedural controls. Info-Tech Insight A modern data security strategy must protect data through the entire data lifecycle. Data security efforts must be business-focused, with multi-layered defense, while extending to all data sources.
  • 6. Info-Tech Research Group | 6 Info-Tech Research Group | 6 Info-Tech Research Group | 6 Your challenge This research is designed to help organizations who are facing these challenges/looking to/need to: • Identify the set of technical controls involved in adequately securing the organization’s high-risk, or highly sensitive data. • Develop a better understanding of the current sources and repositories for the organization’s data and how they are currently protected and secured. • Implement an approach to effective data security that takes into account the relevant technical and process-based controls. • Ensure that compliance and regulatory obligations are met and protect all retained sources of personal data. This research is intended for organizations looking to protect highly sensitive or high- risk data, that have previously undergone data classification using either a tool or solution or Info-Tech’s Discover and Classify Your Data research. Data-In-Transit Applications / Systems Data Warehouses Data Lakes Data Marts Flat Files Devices / Endpoints
  • 7. Info-Tech Research Group | 7 The Modern Data Breach Landscape The impact and frequency of data breaches is nothing new for CIOs and CISOs. And while even the tightest data security programs can still result in a breach, a comprehensive plan that covers all vectors of potential attack significantly reduces both compliance and business risk. Making the Case for End-to-End Data Security Of data breaches were perpetrated by external actors, while 45% of breaches featured hacking as the attack tactic. Of data-breach victims report having had their personal data compromised. Info-Tech Insight A multi-vector defense involves protection of data- in-use, at-rest, and in-transit, through a combination of both technical and process controls aligned to specific control categories. 58% 70% Source: Verizon Data Breaches 2020
  • 8. Info-Tech Research Group | 8 Common obstacles 34% Of an organization’s Of organizations have in the past year. 49% These barriers make this challenge difficult to address for many organizations: The Data Security Dilemma • A changing technological environment characterized by an increase in cloud computing, rapid proliferation of IoT of connected devices, and a vast number of data sources that become targets for malicious attacks. • The traditional approach of securing the network perimeter no longer ensures security of all data; organizations must take a multi-faceted approach to securing data. • Global privacy regulations are increasing, adding to an already complex environment characterized by multi-jurisdictional differences in controls to implement. • Most organizations will be faced with multiple compliance obligations and must begin to start looking at data security holistically as opposed to piecemeal. • Additionally, industry-specific compliance regulations often present IT leaders with a host of instructions but lack specific examples of technical controls to be put into place in order to ensure data is secured. 40% Of attacks were a result of Source: 2020 Thales Data Threat Report
  • 9. Info-Tech Research Group | 9 Creation Storage Usage Archiving Destruction Info-Tech’s approach: End-to-End Data Security The Info-Tech difference: Effective data security is more than just a single layer or set of controls around your applications – it’s a comprehensive approach to both data-at-rest and data-in-transit. Successful end-to-end data security calls on an exhaustive strategy that: • Includes technical and process controls within the seven core categories of data security; • Secures data through all stages of the data lifecycle, from data creation through to data destruction; • Encompasses both data-at-rest and data-in-transit. Through both technical and process controls, Info-Tech’s research equips you with the knowledge and tools for an effective approach to data security. Data-in-Transit Access Control Data Operations Authentication Data Integrity Data Loss Management Encryption Obfuscation & Data Minimization Data-at-Rest Data-in-Use Data Classification Data Security Control Categories Data Lifecycle Stages
  • 11. Info-Tech Research Group | 11 Info-Tech’s methodology for Secure Your High-Risk Data 1. Review Data Security Methodologies 2. Develop the Data Security Roadmap 3. Implement Data Security Roadmap Phase Steps 1.1 Understand the Role of Data Security 1.2 Review Current Security Controls 1.3 Assess Compliance and Regulatory Frameworks 2.1 Develop the Data Security Roadmap 3.1 Prepare Your People 3.2 Establish Metrics Phase Outcomes Comprehensive overview of: • Access Control • Data Operations • Authentication • Data Integrity • Data Loss Management • Encryption • Obfuscation and Data Minimization Alignment to governing compliance regulations. • Understanding of where the gaps lie in current data security framework and control implementation. • Prioritized set of technical and process- based data security initiatives based on specific criteria. • Strategy for implementing both technical and process-based data security controls. • Metrics for evaluating the success of data security framework within the organization.
  • 12. Info-Tech Research Group | 12 Data Security Has Layers A modern data security strategy protects data throughout its entire lifecycle. Data security efforts must be prioritized, business-focused, with multi-layered defense, and extend to all data sources. Start with the technical, finish with the process. For every technical control (think encryption, MFA), there are an adjacent set of supporting controls that must be put into action. Think technical, but follow-up by ensuring that the process is developed and documented. Get your priorities right. Before you can present a set of controls back to your senior leadership team, you need to identify which ones are most important in securing what’s most important to the business. Prioritize using controls that consider cost, effort, as well as compliance and business risk reduction. Secure your people. During the implementation phase, ensure that any employees or end- users that will be impacted by changes in process, or additional controls in place are properly informed, trained, and supported. Secure data fails to remain secure when those managing it are not informed and educated. Nothing in isolation. With this exhaustive set of data security controls, you need to evaluate the organization’s full operational landscape. Assess your gap-closing initiatives as part of a comprehensive, defense-in-depth data security roadmap. Sometimes more is better. Instead of focusing on one specific data security technique or technology, examine how multiple controls can work cohesively in order to facilitate a multi-faceted approach to securing your data.
  • 13. Info-Tech Research Group | 13 Each step of this blueprint is accompanied by supporting deliverables to help you accomplish your goals: Blueprint deliverables Key deliverable: Data Security Matrix tool An assessment tool based on the current state and desired future state, that leverages a set of targeted initiatives to close gaps and provide a data security roadmap. A comprehensive outlook on the current state of data security. The Data Security Roadmap Report also provides guidance on the implementation of technology, processes, and training to properly secure your high-risk, high-value data. Data Security Technical and Report Data Security Executive Report template Review and present your findings to your executive leadership team by highlighting core gaps and identified initiatives to ensure sufficient data security standards are in place.
  • 14. Info-Tech Research Group | 14 Blueprint benefits IT/InfoSec Benefits Business Benefits • A comprehensive understanding of where sensitive data exists within the organization. • A method of mapping the flow of sensitive or high-risk data between sources and while in transit. • An overview of technical controls to evaluate and implement in order to enhance data security. • An overview of process or administrative controls to evaluate and implement in order to enhance data security. • A prioritized set of data security initiatives based on cost, effort, and compliance and business risk values. • Detailed set of initiatives to help effectively secure the organization’s highly sensitive or high-risk data. • An understanding of what the compliance obligations for the organization are, and how the security controls in place ensure full adherence. • An understanding of the best-practice data security frameworks, and the measures in place to meet framework standards. • Increased end-user trust and enhanced organizational reputation through improved data security practices.
  • 15. Info-Tech Research Group | 15 Info-Tech Research Group | 15 Info-Tech Research Group | 15 Measure the value of this blueprint Effective data security saves more than just money… • Data breaches have become increasingly prevalent, costing organizations an extra 83% between 2018 and 2019 alone. • Legal fines and costs associated with customer damages and compensation are a small portion of the losses incurred as a result of a data breach; reputational damage to the business and brand can have an irreversible long-term impact on the organization. In Phase 1 of this blueprint, we will help you establish current and target state of data security. In Phase 3, we will help you develop a set of relevant and achievable metrics to support data security initiatives. Info-Tech Project Value $116,383 Average annual salary of a Senior Cybersecurity Consultant Average total time/cost to completion for the following high- priority data security projects: • Map out full set of data sources for all high-sensitivity data • Complete and revise Data Matrix tool • Map all compliance obligations (regulatory and industry-specific) to current data security controls • Complete gap analysis of data security controls • Develop gap-closing initiatives and create implementation roadmap • Establish metrics and monitoring schedule 200 hours (initial) $ 69,875 (internal project cost) $3.92 million (data breach cost) $ 1,230,195 Source: Indeed, Ponemon Institute 29.6% chance of a data breach occurring Total Dollars Saved
  • 16. Info-Tech Research Group | 16 Executive Brief Case Study Education INDUSTRY SCMedia, vpnMentor SOURCE OneClass Remote learning platform OneClass was the recent victim of an extensive data breach, in which the personal information of over one million North American students was exposed. Information left exposed included records linkable to minors (under the age of 13) as well as other OneClass users. The harvested data (account credentials, personal information) gave hackers the potential to leverage it in social engineering scams and inflict financial damage through obtained payment card information. The database breach was detected by researchers at vpnMentor, who notified the vendor shortly after the initial date of discovery. Upon notification, OneClass took down the unsecured database, claiming that the data within it was purely test data and was not linkable to individuals associated with OneClass. However, vpnMentor noted that the database contained over 27GB of data and 8.9 million records, directly linked to over one million OneClass customers. Researchers discovered this contradiction between OneClass’ statement and vpnMentor’s findings because a sample set of the exposed data was easily matched with public information. A key issue is that many of OneClass’ users are young, and as such would be unaware of the fraudulent activities that could result from their information being stolen by malicious actors. The Data Security Impact • The AWS-hosted Elasticsearch database used by OneClass was left unsecured. • Incorrectly configured or unsecured databases in the cloud have been a prominent issue as organizations transition from a network security approach to a data-specific approach. Solution • Organizations like OneClass need to adopt a more comprehensive approach to data secured off-prem through additional applied controls: • Higher degree of server security.
• Strengthening of access control measures through additional access rules. • Ensuring systems that contain large amounts of personal or sensitive data have appropriate authentication measures.
  • 17. Info-Tech Research Group | 17 Diagnostic and consistent frameworks are used throughout all four options. DIY Toolkit “Our team has already made this critical project a priority, and we have the time and capability, but some guidance along the way would be helpful.” Guided Implementation “Our team knows that we need to fix a process, but we need assistance to determine where to focus. Some check-ins along the way would help keep us on track.” Workshop “We need to hit the ground running and get this project kicked off immediately. Our team has the ability to take this over once we get a framework and strategy in place.” Consulting “Our team does not have the time or the knowledge to take this project on. We need assistance through the entirety of this project.” Info-Tech offers various levels of support to best suit your needs
  • 18. Info-Tech Research Group | 18 Guided Implementation What does a typical GI on this topic look like? A Guided Implementation (GI) is a series of calls with an Info-Tech analyst to help implement our best practices in your organization. A typical GI is between six to eight calls over the course of four to six months. Phase 1: Call #1: Scope requirements, objectives, and your specific challenges. Call #2: Review and complete data security controls matrix. Call #3: Review and align compliance framework matrix. Phase 2: Call #4: Identify gap-closing initiatives. Call #5: Prioritize all gap-closing initiatives based on criteria. Call #6: Evaluate cost and effort table and implementation timeline. Phase 3: Call #7: Identify user groups and assess appropriate training plans. Call #8: Define metrics and establish monitoring schedule.
  • 19. Info-Tech Research Group | 19 Workshop Overview Day 1 Day 2 Day 3 Day 4 Day 5 Activities Identify Internal and External Drivers Assess the Current State Identify Gaps and Define Initiatives Develop Implementation Plan and Define Metrics Next Steps and Wrap-Up (offsite) 1.1 Review the business context. 1.2 Review compliance drivers and relevant regulatory frameworks. 1.3 Discuss current drivers from both the InfoSec and business context. 1.4 Define the scope: which data sources will be examined, and what is the tier of data (classification standard) in focus for this project? 2.1 Map the flow of all high-risk, highly sensitive data in scope for the project workshop (at-rest and in-transit); document and compare with Data Inventory. 2.2 Review current set of data security controls. 2.3 Identify and list future or potential adjustments to the organization's data security landscape. 2.4 Complete Data Security Matrix and recommended action items. 3.1 Review data security gaps. 3.2 Identify gap-closing initiatives in scope for each of the seven areas of data security. 3.4 Allocate cost and effort values, and prioritize initiatives. 3.5 Compare against compliance obligations and security frameworks. 3.6 Define execution waves and initial timeline for implementation. 4.1 Finalize implementation roadmap for data security initiatives. 4.2 Identify scope of employees or end-users impacted by changes to data security landscape. 4.3 Develop training plan for any process or administrative-based initiatives. 4.4 Define metrics for measurement of data security program success. 4.5 Outline schedule for monitoring and reporting. 5.1 Complete in-progress deliverables from previous four days. 5.2 Set up review time for workshop deliverables and to discuss next steps. Deliverables 1. Set of data security objectives 2. Mapped compliance matrix 3. Defined project scope 1. Revised Data Inventory 2. Completed Data Security Matrix (compliance frameworks and security controls) 1. 
Finalized Data Security Matrix 2. Completed Data Security Initiatives list 1. Finalized Data Security Roadmap 2. Data Security Metrics 3. Outline for Data Security Technical report 1. Completed and delivered Data Security Technical Report and Executive Presentation 2. Additional recommendations based on days one to four of workshop Contact your account representative for more information. workshops@infotech.com 1-888-670-8889
  • 20. Info-Tech Research Group | 20 Secure Your High-Risk Data Phase 1: Review Data Security Methodologies (1.1 Understand the Role of Data Security; 1.2 Review Current Security Controls; 1.3 Assess Compliance and Regulatory Frameworks). Phase 2: 2.1 Develop the Data Security Roadmap. Phase 3: 3.1 Prepare Your People; 3.2 Establish Metrics. This phase will walk you through the following activities: • Understand the Role of Data Security • Review Current Security Controls • Assess Compliance Frameworks This phase involves the following participants: • CISO/InfoSec lead • InfoSec managers • InfoSec team • CDO/Data Lead • IT Team (optional) • Data and Analytics team • DBAs
  • 21. Info-Tech Research Group | 21 Step 1.1 Understand the Role of Data Security This step will walk you through the following activities: • 1.1.1 Define your Business Drivers • 1.1.2 Review and Update Data Classification Standard This step involves the following participants: • CISO/InfoSec lead • InfoSec managers • InfoSec team (optional) • CDO/Data Lead • Data and Analytics team • DBAs Outcomes of this step: • Identification of the full set of business and security drivers behind the implementation of data security controls.
  • 22. Info-Tech Research Group | 22 NIST Cybersecurity Framework Know the Role of Data Security What part does data play in the Information Security ecosystem? Source: Adapted from NIST.gov, 2018 Identify Protect Detect Respond Recover • Asset Management • Business Environment • Governance • Risk Assessment • Risk Management • Access Control • Awareness and Training • Data Security • Info Protection Processes and Procedures • Maintenance • Protective Tech • Anomalies and Events • Security Continuous Monitoring • Detection Processes • Response Planning • Communications • Analysis • Mitigation • Improvements • Recovery Planning • Improvements • Communications • An organization's security must adapt to the dynamic global landscape and can no longer rely on static, one-dimensional measures. • End-to-end security of the organization's data relies on appropriate controls around the systems, applications, sources, and methods of transmission of the data. • By taking a defense-in-depth approach, data security is layered in. This ensures that multiple methods covering all potential vectors for data loss, as well as external and internal threats, are covered. • While security of data is paramount, overall availability and access to the data must be at the forefront of the control framework. This necessitates an understanding of core users and data owners within the business. Info-Tech Insight An organization's approach to data security must be as dynamic as the data it protects. This depends on a clear understanding of what data will be protected, why it's protected (compliance, customer expectations, business drivers), and where it exists throughout its lifecycle.
  • 23. Info-Tech Research Group | 23 Data Security and the CIA Triad Effective data security covers all three elements of the CIA triad. Confidentiality Availability Integrity Effective Security Program • The CIA triad is a common, accepted best-practice set of security objectives that guide the development of a strong security program. • Confidentiality: "The prevention of unauthorized disclosure of information." For example, any sensitive or high-risk data that must be handled or viewed only by specific parties must be treated in a way that ensures its confidentiality throughout the lifecycle. • Integrity: "Ensures that information is protected from unauthorized or unintentional alteration, modification, or deletion." Data that is transferred between parties and/or sources must retain its original format without interception resulting in changes to the structure of the data itself. • Availability: "Information is readily accessible to authorized users." This entails ensuring that access control is appropriately defined and carried out, so that users are leveraging the appropriate data based on their competencies and requirements. Source: IAPP Privacy Program Management Defense-in-Depth A comprehensive, multi-tiered, multi-layered approach to information security that mitigates risk through a varied set of security controls. This research applies a defense-in-depth framework to the set of data security controls and their application within the organization. Info-Tech Insight Consider the pillar of non-repudiation as an integral fourth component of the security program. Assurance of a statement's or information's validity is a key facet of ensuring data security.
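To make the integrity objective concrete, here is a minimal Python sketch (the file contents and function name are invented for illustration): comparing cryptographic digests before and after a transfer reveals any alteration of the bytes in transit.

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Hex digest of the data; any change to the bytes changes the digest."""
    return hashlib.sha256(data).hexdigest()

# The sender computes a digest before transmission; the receiver recomputes
# it over what arrived. Matching digests mean integrity held in transit.
sent = b"account_id,balance\n1001,2500.00\n"
received = sent  # in practice, the bytes read on the receiving end
assert sha256_digest(sent) == sha256_digest(received)
```

A mismatch would indicate unauthorized or unintentional modification; production systems typically pair such digests with digital signatures, which also supports the non-repudiation pillar noted above.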
  • 24. Info-Tech Research Group | 24 A perspective on the global data security landscape. Data Security: An Overview Source: DataPrivacyManager • 279 days: the average time it took to identify a data breach in 2019, up from 197 days in 2018. • 41% of individuals do not believe that companies care about or take measures toward securing their personal data. • 84% of these same individuals say that they are more likely to remain loyal to companies with strong security controls in place. • 39 seconds: the average time between hacker attacks each day.
  • 25. Info-Tech Research Group | 25 Case Study: Sina Weibo Technology, Social Media INDUSTRY ZDNet, Security Boulevard, Business & Human Rights Resource Centre SOURCE Why did it happen? Sina Weibo Chinese social media platform Sina Weibo, similar to Twitter in terms of functionality, suffered one of the largest data breaches of the decade in mid-2019. Account information of over 530 million users was exposed, including phone numbers from 170 million of these users. Details around the breach surfaced in 2020, as the stolen personal information was discovered for sale for the mere price of $250, not including account passwords. This is a testament to the fact that data breaches have become so commonplace that even unique identifiers are unable to fetch a moderate price tag on the dark web. The breach occurred as a result of an external hacker gaining access to Weibo and completing a full-scale sweep of the company's user database. A lack of clarity still exists around how the data was obtained: originally it was said to have come from a SQL database dump, but the company put forth a contradictory statement indicating that data was swept by "matching contacts against [Weibo]'s API." Data Source: • Database. Data Security: • Weibo claimed one-way encryption was used • No passwords stored in plaintext • Data separation/segregation. Resolution: • Weibo's Cyber API detected posts where the stolen data was advertised as being sold • The Ministry of Industry and Information Technology (MIIT) encouraged Weibo to focus on privacy policies to align with cybersecurity regulations, as well as self-evaluation of data security of new services.
  • 26. Info-Tech Research Group | 26 Evaluate the Intersection of Data Privacy and Information Security Information Security Information security aims to ensure confidentiality, availability, and integrity of information throughout the data's lifecycle. Common functions of information security include: • Risk Management • Vulnerability Management • Strategy and Governance • Data Protection • Incident Response • Identity and Access Management Data Privacy Data privacy ensures that the rights of individuals are upheld with respect to control over how their respective information is collected, used, and processed. Data privacy emphasizes the following principles: • Accuracy of information • Access of information • Accountability • Confidentiality of information Source: IAPP Privacy Program Management
  • 27. Info-Tech Research Group | 27 Input Output • Optional: Ask core team members to brainstorm a list of key privacy program drivers as well as objectives • Identified set of data security drivers and objectives • Grouped themes around data security drivers Materials Participants • Whiteboard/flip charts • Sticky notes • Pen/marker • CISO/InfoSec lead • InfoSec managers • InfoSec team (optional) • CDO/Data Lead • Data and Analytics team • DBAs 1.1.1 Define your Business Drivers 1 hour 1. Bring together relevant stakeholders from the organization. This should include members of your InfoSec team, as well as DBAs and any other relevant members from IT that support security procedures. 2. Using sticky notes, have each stakeholder write one driver for data security per sticky note. • These may vary from concerns about customers to the push of regulatory obligations. 3. Collect these and group together similar themes as they arise. Themes may include: • Access Control and Management • Data Lifecycle • Data Integrity 4. Discuss with the group what is being put on the list and clarify any unusual or unclear drivers. 5. Determine the priority of the drivers. While they are all undoubtedly important, it will be crucial to understand which are critical to the organization and need to be dealt with right away. • For most, any obligation relating to an external regulation or governing framework (e.g., NIST SP 800-53) will become top priority. Non-compliance can result in serious fines and reputational damage. 6. Review the final priority of the drivers and confirm current status.
  • 28. Info-Tech Research Group | 28 Define the Data Lifecycle During each of the six stages of the data lifecycle, there is an opportunity to embed security controls to protect the data. The six stages, with examples: • Creation: data capture, data entry, data acquisition • Storage: flat files, databases, data warehouses • Usage: analytics, reporting, identity verification • Transmission: secure FTP, email, mobile communications • Archiving: backups, duplications • Destruction: digital destruction, physical destruction
  • 29. Info-Tech Research Group | 29 From creation through to destruction, ensure your data is protected and secured. Securing Data through the Data Lifecycle Creation Consider the starting point of data within the organization. This includes collection of data from different sources and inputs including computers, via email or secure file transfer, and through mobile devices. Storage Data will be stored in various repositories, both on and off prem, and increasingly in the cloud. Consider your current spectrum of systems, applications, data warehouses and lakes, as well as devices on which data may be at rest. Usage Evaluate how data is protected when it's in use, both internally and externally, by employees, contractors, and customers or other types of end-users. How is data protected when in use, and what parameters are in place around data usage? Transmission What methods does the organization currently use to transfer data? How is data-in-transit protected, and are there current technical controls as well as policies around data-in-transit? Archiving How does the organization currently archive data? Where is data-at-rest being kept, and are there supporting retention periods and documented policies around archiving procedures? Evaluate the current security controls in place to protect data archives. Destruction When data reaches the end of its life, how is data permanently disposed of? Consider both physical destruction of data (shredding, melting, etc.) as well as digital destruction of data. Conversely, consider the potential future use of data you may destroy; for example, information requested as part of a legal procedure. • The ability to ensure data is secured throughout each stage within the data lifecycle is a core component of an effective approach and strategy to data security. • This means leveraging both technical and process-based controls that ensure end-to-end data security. 
• Info-Tech’s research takes a multi-tiered approach, first examining how data is secured within each of its respective sources throughout each of the six stages within the lifecycle. Source: University of Western Ontario, Information Security Data Handling Standards
  • 30. Info-Tech Research Group | 30 Data Security within the Five-Tier Data Architecture This research will focus on securing both raw data and derived data, based on Info-Tech's Five-Tier Data Architecture: 1. Sources: App 1, App 2, Excel and other documents, Access database(s), external data feed(s) and social media, flat files, IoT devices. 2. Integration: solutions include SOA, point-to-point, manual loading, ESB, ETL, ODS, and data hub; functions include scrambling, masking, encryption, tokenizing, aggregation, transformation, migration, and modeling. 3. Data Warehouse Environment: data lakes and warehouse(s) holding raw data; EIM, ECM, DAM, MDM; protected zone. 4. Reporting and Analytics: data lakes and warehouse(s) holding derived data, data marts, data cubes, BI tools, thought models, formulas, and derived data from analytics activities. 5. Derived Content and Presentation: reports, dashboards, presentations.
  • 31. Info-Tech Research Group | 31 Data Security Extends to All Data Sources The purpose of an end-to-end data security program is to ensure that data is protected in each location it exists in through each stage of its lifecycle. Applications and Systems Devices Data Lakes Data Warehouses Data Marts Databases Flat Files While most enterprise applications will have built-in security controls, consider focusing on process around user access and privilege. Includes both physical devices, as well as IoT-enabled devices on which corporate information may be accessed or stored. Devices are a complex challenge as they necessitate complex, multi-tiered data security controls. Identify any databases that do not sit behind enterprise applications and evaluate the current set of controls in place. Sensitive data is often found in flat files as well as relational databases; however, flat files often lack adequate security controls. Raw data is still sensitive data. Consider who has access to data lakes, and the types of information stored. Access to data warehouses should be limited, and data housed inside should be encrypted and properly protected. Often considered a segment of a data warehouse, data marts contain accessible client-facing data, and as such require separate consideration for data security controls. For additional insight, download Info-Tech's research on how to Build a Data Architecture Roadmap.
  • 32. Info-Tech Research Group | 32 Scoping the Internal Accessibility of Sensitive Data With an increase in the volume of organizational data come questions around who is accessing the most sensitive and high-risk data, and why. An integral part of data security is ensuring that the organization does not fall victim to internal overexposure of data. Internal or insider threats pose a major risk with respect to potential breaches. • 53% of organizations leave 1,000 or more files with sensitive data open to all employees. • 17% of all sensitive files at these same organizations were found to be accessible to every employee. • 534,465 files at the average company contain sensitive data. Source: Varonis 2019 Global Data Risk Report Info-Tech Insight Data security isn't just a technical process; it involves a thorough assessment of the business from an organizational perspective.
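As a rough illustration of how overexposure can be detected, the hypothetical Python sketch below flags files whose POSIX permission bits allow any user on the system to read them. Real assessments (such as the Varonis report cited above) use far more sophisticated share- and ACL-aware tooling; this only checks one permission bit.

```python
import os
import stat

def world_readable_files(root: str) -> list:
    """Walk a directory tree and flag files readable by 'other' users."""
    flagged = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            if mode & stat.S_IROTH:  # the 'other' read bit is set
                flagged.append(path)
    return flagged
```

Running this against a share that should be need-to-know gives a quick, if coarse, inventory of files exposed beyond their intended audience.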
  • 33. Info-Tech Research Group | 33 Securing Data Within the Governance Framework Data governance can be thought of as the engine that enables your data to be transformed into the power needed to drive your organization up the data value chain. Data security plays a key role in the ecosystem of data governance: the need for more secure data drives the adoption of security policies that adapt to changing requirements and control access to sensitive data. [Info-Tech's Data Governance Framework diagram: business needs, organization, and people fuel the data value chain (data to information to knowledge), supported by data policies and procedures, a communication plan, data security and audit, data risk management, master and reference data management, data architecture, data integration, data modeling, data storage and operations, data warehousing and BI, documents and content, and metadata.] For additional insight, download Info-Tech's Establish Data Governance.
  • 34. Info-Tech Research Group | 34 Define data classification in the context of your organization Build out a data classification scheme that fits the operating and regulatory environment of your organization. With the increase in data and digital advancements in communication and storage (e.g. cloud), it becomes a challenge for organizations to know what data exists and where the data lives. A classification scheme must be properly implemented and socialized to help ensure appropriate security measures are applied to protect that data appropriately. What is data classification? Data classification is the process of identifying and classifying data on the basis of sensitivity and the impact the information could have on the company if the data is breached. The classification initiative outlines proper handling procedures for the creation, usage, storage, disclosure, and removal of data. Why do we need it? Structured • Highly organized data, often in a relational, easily searchable database. • E.g. employee numbers stored in a spreadsheet Unstructured • Data that is not pre-defined in format and content; majority of data in most organizations. • E.g. free text, images, videos, audio files Semi-structured • Information not in traditional database but contains some organizational properties. • E.g. email, XML Types of data Without data classification, an organization treats all information the same. Sensitive data may have too little protection. Less sensitive data may have too much protection. Strategically classifying data will allow an organization to implement proper controls where necessary.
  • 35. Info-Tech Research Group | 35 Appropriate classification of data is the first step to securing the organization's data. The Role of Data Classification in Data Security • You can't secure something without first knowing what it is you're securing. This blueprint requires data classification as a prerequisite, in order to focus on the organization's most high-value, sensitive, or high-risk data. • Info-Tech's five data classification levels are outlined below. This research takes a three-phase approach to developing a repeatable data classification program: • Formalize the Classification Program • Discover the Data • Classify the Data If you do not currently have a data inventory in place, or lack a classification standard, refer to this research to identify data in scope for the Secure Your High-Risk Data blueprint. This includes tier 4 and/or tier 5 classified data. Level 5 (Top-Secret): Data intended for use only by authorized personnel, whose unauthorized disclosure could be expected to cause exceptional damage to corporate or national security. Level 4 (Confidential): Data that is kept private under federal, local, or state laws or contractual agreements, or to protect its proprietary worth. Level 3 (Internal): Data intended for use within the organization. Unauthorized external disclosure could adversely impact the organization/customers/partners. Level 2 (Limited): Data that is not openly published but can be made available via open record requests. Direct access available only through authenticated and authorized employees. Level 1 (Public): Data that is readily available to the public with anonymous access. Download Info-Tech's Discover and Classify Your Data research.
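The five tiers can be encoded directly in tooling so that handling rules follow the classification automatically. The sketch below is a hypothetical Python illustration: the tier names come from the levels above, and the in-scope rule reflects this blueprint's focus on tier 4 and tier 5 data.

```python
from enum import IntEnum

class Classification(IntEnum):
    """Info-Tech's five classification tiers, encoded for programmatic checks."""
    PUBLIC = 1        # readily available, anonymous access
    LIMITED = 2       # available via open record requests, authenticated access
    INTERNAL = 3      # intended for use within the organization
    CONFIDENTIAL = 4  # kept private under law, contract, or proprietary worth
    TOP_SECRET = 5    # authorized personnel only; exceptional damage if disclosed

def in_scope_for_blueprint(level: Classification) -> bool:
    """Tier 4 and tier 5 data is in scope for the high-risk data project."""
    return level >= Classification.CONFIDENTIAL
```

Because the levels are ordered, downstream controls (encryption, retention, access review) can be gated on simple comparisons rather than per-dataset exceptions.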
  • 36. Info-Tech Research Group | 36 Evaluate Software Solutions for Data Discovery and Classification We collect and analyze the most detailed reviews on enterprise software from real users to give you an unprecedented view into the product and vendor before you buy. Learn from the collective knowledge of real IT professionals • Know the products and features available • Explore module and detailed feature-level data • Quickly understand the market Evaluate market leaders through vendor rankings and awards • Convince stakeholders with professional reports • Avoid pitfalls with unfiltered data from real users • Confidently select your software Cut through misleading marketing material • Negotiate contracts based on data • Know what to expect before you sign • Effectively manage the vendor
  • 37. Info-Tech Research Group | 37 Know the Archetypes of your High-Risk Data Your organization's most sensitive data will most likely fall into one of the following five categories: • Payment Card Industry (PCI): Data obtained as a part of the financial transaction process. This includes cardholder name, expiration date, PIN, and magnetic strip contents. • Intellectual Property (IP): An organization's secret sauce. IP refers to data including trade secrets, merger and acquisition plans, and upcoming product plans. • Department of Defense (DoD): For organizations that deal with DoD clients, certain categories of top-secret or classified information may necessitate a separate classification tier and set of handling procedures. • Personal Health Information (PHI): Covered by governing regulations such as HIPAA or PHIPA, this data includes medical history, health insurance numbers, and biometric identifiers. • Personally Identifiable Information (PII): Certain types of PII can be classified as highly sensitive, dependent on the governing privacy regulation. These include social security numbers, driver's licenses, and financial account information.
  • 38. Info-Tech Research Group | 38 Input Output • Current data classification standard • Current data classification policy • Info-Tech's Data Classification Standard • Revised data classification standard Materials Participants • Laptop/computer • Whiteboard • Sticky notes • Pen/paper • Markers • Data Classification Standard • Data Classification Policy • User Data Handling Requirements tool • CISO/InfoSec lead • InfoSec managers and team • IT Director • IT team (optional) • CDO/Data Lead • Data and Analytics team 1.1.2 Review and Update Data Classification Standard 1 – 2 hours 1. Meet with all relevant stakeholders from InfoSec and data teams and review the organization's current data classification standard. • If no standard currently exists, review any documentation around data handling policies and procedures. This may include any compliance-related documentation that outlines data classification requirements. 2. In small groups, without relying on your current data inventory (if in place), write down all of the data types that the respective group members believe are classified within Levels 4 and 5. 3. Discuss each smaller group's findings and identify any data types that were omitted or not identified amongst the larger group. 4. Write these data types on sticky notes and group them based on the five sensitive data archetypes identified on the previous slide. Download Info-Tech's Data Classification Standard and Data Classification Policy documents.
  • 39. Info-Tech Research Group | 39 Input Output • Current data classification standard • Current data classification policy • Info-Tech's Data Classification Standard • Revised data classification standard • Revised data classification supporting documents Materials Participants • Laptop/computer • Whiteboard • Pen/paper • Markers • Data Classification Standard • Data Classification Policy • User Data Handling Requirements tool • CISO/InfoSec lead • InfoSec managers and team • IT Director • IT team (optional) • CDO/Data Lead • Data and Analytics team 1.1.2 Review and Update Data Classification Standard (Cont.) 5. Identify whether changes need to be made at the classification level (e.g., adding a separate level based on the archetypes, or additional changes in the organization's environment). Document these. 6. As a group, determine whether the current set of descriptions for each level encompasses all high-risk, highly sensitive, or high-value data within the organization. • Be forward-looking. Is there a chance that certain types of information that aren't collected or processed today, but may be in the near future, aren't included within the classification levels? 7. Reword and update the current Data Classification Standard document as necessary and note the date of the changes. 8. Update any additional supporting documents, including the Data Classification Policy, Steering Committee Charter, and relevant categories within the Data Inventory.
  • 40. Info-Tech Research Group | 40 Step 1.2 Review Current Security Controls This step will walk you through the following activities: • 1.2.1 Document Current Data Security Controls • 1.2.2 Review and Validate the Data Inventory This step involves the following participants: • CISO/InfoSec lead • InfoSec managers and team • Data team/DBAs • IT Director • IT team (optional) Outcomes of this step: • Current state assessment of a comprehensive set of data security controls based on Info-Tech's seven control categories.
  • 41. Info-Tech Research Group | 41 Access Control’s Role in Securing Data Establish appropriate mechanisms, processes, and techniques to create a comprehensive access control framework for the organization’s sensitive data. Protect the Physical Environment Ensure protection of sensitive data and valuable organizational information by putting in place specific controls around access to the physical location, which extends to all components of the facilities and physical environment. This includes mechanisms to secure ingress and egress points, as well as access to high-risk assets that house sensitive data. Protect Your Information Systems A comprehensive set of access controls should address the security of the organization’s information systems as well as networks on which the systems operate. Preventing unauthorized access here upholds security and integrity of sensitive data. Manage Personnel/Employee Access Internal access control measures are an integral component of a strong approach to access management. Ensure that the right people have the right level of access to relevant systems and information to prevent inappropriate access while enabling internal collaboration.
  • 42. Info-Tech Research Group | 42 A user-centric approach to access control. • A model for access control that centers on the capabilities and role-specific functions or tasks of an employee within the organization. • The access controls established are built around the employee job function so that each specific job function merits a set of privileges based on the functional requirements, which directly aligns the organization with Identity and Access Management – a business-centric approach to IAM. • Before moving to implement RBAC within the organization or integrate RBAC tools, you will need to map out an exhaustive list of the organization's systems, applications, and data stores/sources, and fully understand each of the roles or individual employees and employee groups. • It's important that the organization document all roles and their corresponding functions for reference, and an RBAC policy should exist. WHY RBAC? • Ensures that employees access the right information and information systems in order to complete their roles, and do not have unlimited or global access. • Easily tracked, monitored, and maintained and is role vs. person-specific. Leverage Role-Based Access Control (RBAC) Full RBAC Management Model Role A Application 1 Application 2 Application 3 Source: Official Guide to CISSP CBK For more information, leverage Info-Tech's research Simplify Identity and Access Management and Mature Your IAM Program.
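A minimal sketch of the RBAC idea in Python (the roles, users, and permission strings are invented examples): privileges attach to roles, and users gain them only through role membership, never directly.

```python
# Hypothetical role-to-permission and user-to-role mappings.
ROLE_PERMISSIONS = {
    "dba":     {"db:read", "db:write", "db:admin"},
    "analyst": {"db:read", "report:create"},
    "hr":      {"hr:read"},
}

USER_ROLES = {
    "alice": {"dba"},
    "bob":   {"analyst", "hr"},
}

def has_permission(user: str, permission: str) -> bool:
    """Grant access only through roles; users never hold permissions directly."""
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )
```

Adjusting what a job function may do then means editing one role definition rather than every individual account, which is what makes RBAC easy to track, monitor, and maintain.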
  • 43. Info-Tech Research Group | 43 RuBAC, MAC, and DAC and their roles in shaping the organization's access control strategy An RBAC model is one of multiple approaches to access control frameworks for an organization. Consider the following and their applicability based on your organization's structure from both a human resources standpoint, as well as the collective employee collaboration needs. Use the considerations on the left-hand side of this slide to ensure all contributing factors are considered when looking at alternate approaches to access control. • Rule-Based Access Control (RuBAC): This approach builds on RBAC but layers in a set of rules, dictated by the system administrator, that define the specific privileges allocated to an employee or end-user. This extends to dictating what the individual can do to or with a document, vs. just enabling access. • Discretionary Access Control (DAC): Discretionary Access Control is a data-centric approach that places the controls on the data vs. the user, as with role-based approaches. Data is assigned a data owner, who has the capability to change and amend access to the data as needed; an approach that centers around effective data governance and ownership. • Mandatory Access Control (MAC): Centered on the system's management of access controls, and as such controlled entirely by the administrator, who sets permissions and policies and manages the process. This access control approach is a good fit for high-sensitivity systems and/or data as it tends to provide more robust security around access to data. Leverage Alternate Access Control Methods Considerations for Access Control Strategy Implementation • Do we have a large number of different roles that require specific access to data types and information systems? • Who is going to manage this process, and what level of complexity will each of these methodologies add to our existing process? 
• Do we have a strong data classification and data ownership model in place already? • Do we anticipate significant re-training efforts and/or user friction? Source: Official Guide to CISSP CBK
  • 44. Info-Tech Research Group | 44 When working with high-risk data, PAM plays a vital role. • PAM enables organizations to manage and restrict employees’ and end-users’ privileged access. • The foundation of PAM rests in the security principle of least privilege and focuses entirely on the access and controls granted to the organization’s most privileged users. • Why is PAM integral in data security? Malicious attackers often leverage the organization’s privileged accounts, as they are a targetable entry point that allows access to an organization’s critical systems and highly-sensitive data. With access to privileged account credentials, these attackers become potentially malicious insiders, and pose a significant threat to the organization’s sensitive data. • Privileged accounts are those accounts with extensive capabilities and control over specific systems or applications. • PAM solutions mitigate risk around over-provisioning of user privileges, automate and centralize the process of credential management, and provide increased visibility into the organization’s gamut of privileged accounts and credentials. Assess Privileged Access Management (PAM) Source: Microsoft PAM for Active Directory Prepare: Identify privileged users/groups. Protect: Establish authentication requirements. Operate: Approved requests obtain temporary access. Monitor: View history, audit, review alerts and reports.
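The "Operate: approved requests obtain temporary access" step can be sketched as a time-boxed grant that expires on its own. This is a toy in-memory model under assumed names (`grants`, `domain-admin`); real PAM products do this centrally with vaulted credentials, approval workflows, and session recording.

```python
# Sketch of just-in-time privileged access: an approved request yields a
# grant with an expiry, after which access silently lapses (least privilege).
# The account names and TTLs here are illustrative assumptions.
import time

grants = {}  # (user, privileged_account) -> expiry timestamp

def grant_temporary_access(user: str, priv_account: str, ttl_seconds: int) -> None:
    """Record an approved, time-limited elevation."""
    grants[(user, priv_account)] = time.time() + ttl_seconds

def has_access(user: str, priv_account: str) -> bool:
    """Access holds only while an unexpired grant exists."""
    expiry = grants.get((user, priv_account))
    return expiry is not None and time.time() < expiry

grant_temporary_access("alice", "domain-admin", ttl_seconds=3600)
print(has_access("alice", "domain-admin"))  # True while the grant is live
print(has_access("bob", "domain-admin"))    # False: no approved request
```

The design point: privilege is the exception with a deadline, not a standing property of the account — which shrinks the window a stolen credential is useful for.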
  • 45. Info-Tech Research Group | 45 Adopt appropriate measures to secure the physical environment. • In order to secure the organization’s high-risk, valuable assets, a layer of physical access control is needed. • This is the first layer in access control, and tangibly separates employees and end-users from the environment in which systems or assets in question are stored. • Effective access control involves a combination of physical and logical access control, which includes authentication and authorization of an employee or end-user following clearance of physical controls. • Physical access control can be heightened or elevated based on the importance of the environment it protects. Examples for physical facility ingress and egress points include: o Biometrics such as retinal scanning or fingerprints o RFID key fobs o Swipe cards o Motion detectors o Alarms o Fences or barriers Considerations for Physical Access Control The Role of Physical Access Control • How are visitors monitored and registered when entering and exiting our facilities? • How do we monitor physical maintenance workers (cleaners, etc.) that may not be employees of the organization? • If equipment is moved off-site for maintenance, are we removing sensitive (confidential or restricted) data in preparation? • What is the process for when employees lose, misplace, or have stolen physical access devices? • In a remote work environment, how are we promoting physical security of our employees’ workspaces?
  • 46. Info-Tech Research Group | 46 Data Operations’ Role in Securing Data End-to-end data security means that all stages within the data lifecycle must include a layer of security controls, from creation to archiving and deletion. Recovery Backups Deletions and Destruction • An effective recovery strategy aims to achieve target RTOs and RPOs that support the organization’s operating cadence. • Recovery strategies should be informed by risk and include security services as a core component. • Vendor negotiations are often an important part of an organization’s recovery strategy, in order to provide support for services or existing technology supplied by the third party. • Data security extends to ensuring data’s availability (through backups) for users and employees. • Should a system crash occur, backups enable the organization to leverage data that may have seemingly been lost and are key in maintaining operations. • Effective backups rely on both an established process and policy documents, as well as technical controls to add a layer of end-to-end protection for the organization’s sensitive data. • Data security does not end when the data lifecycle does; it continues through to the methods leveraged for deletion or destruction of the data itself. • Improper destruction of sensitive data opens the organization to potential breach opportunities. • Consider your cloud environment(s) and how they will change your approach to data destruction and deletion. • Data destruction includes physical destruction methods and digital deletion.
  • 47. Info-Tech Research Group | 47 • The process of effective data backup is derived from a set of controls meant to mitigate risk and ensure compliance. Backups are enabled by various tactics applied to data in order to ensure operations can still be carried out in the event of a loss scenario. • Mirroring: Synchronous writing of a collection of data to two or more targets. This differs slightly from the backup in the traditional sense in that it’s often used from one system to another. • Caching: Technique that is used at remote sites to keep a local copy of the dataset in synch with a master copy. This enables the user to access the data through a hash of the data on the cache being sent to the master node, which then sends back to the cache. • Replication: Asynchronous writing of a collection of data to two or more targets, generally leveraged from one system to another. This control defines the number of additional but similar systems data is written into. • Continuous Data Protection: Technique or process of saving a parallel record of all write operations to a journal log file as well as executing the corresponding file change. This process allows for extremely granular RPOs with very little system overhead. • Protection Algorithm: Controls (RAID, Reed–Solomon error correction) that define how data is written to the media in order to safeguard against physical fault; generally at the storage medium level. • Snapshot: A point-in-time image at the block level of the entirety of a given media set. The input/output overhead of a snapshot remains fixed based on the media set and does not scale with the amount of data being stored, exerting much lower impact on performance than a traditional backup. • Backup: A true copy of data onto non-volatile media. Defines the nature and number of copies of data on non-production volumes. 
Demystify Data Backups Types of Backups • Full Backups: A complete second copy of the data is made to another media type (e.g., disk). Full backups provide a complete copy of the data and are generally conducted at a periodic cadence. • Incremental: A backup type that captures only the changes in the dataset that have occurred since the previous incremental backup. This backup is generally conducted at a more frequent cadence. • Differential: A backup type that captures all changes in the dataset that have occurred since the last full backup. • Synthetic Fulls: A full restore version of the dataset synthesized from a reference full backup plus the index of all incremental changes recorded against it.
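The incremental-vs-differential distinction comes down to which reference point the selection logic measures change against. A minimal sketch, using illustrative filenames and epoch-style timestamps:

```python
# Contrast incremental and differential selection by modification time.
# Filenames and timestamps are illustrative assumptions.
files = {
    "a.txt": 100,  # each value is the file's last-modified time
    "b.txt": 250,
    "c.txt": 400,
}
LAST_FULL = 200          # when the last full backup ran
LAST_INCREMENTAL = 300   # when the previous incremental ran

# Differential: everything changed since the last FULL backup
# (grows over time, but restores need only full + latest differential).
differential = {f for f, mtime in files.items() if mtime > LAST_FULL}

# Incremental: only changes since the previous INCREMENTAL
# (small and fast, but restores must replay the whole chain).
incremental = {f for f, mtime in files.items() if mtime > LAST_INCREMENTAL}

print(sorted(differential))  # ['b.txt', 'c.txt']
print(sorted(incremental))   # ['c.txt']
```

The trade-off the comments note is the usual one: incrementals minimize backup-window cost, differentials minimize restore complexity, and a synthetic full consolidates the chain without re-reading the production data.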
  • 48. Info-Tech Research Group | 48 Effective recovery strategies hinge on relevant backup protocols. Implement Recovery Techniques • Recovery of sensitive data gives organizations a fail-safe in the case of physical damage or physical drive failure, such as damaged assets or hardware, or logical drive failure, which includes corruption of files or data. • While organizations tend to invest significant time, money, and effort into backup solutions, a supporting recovery strategy is often overlooked and underdeveloped. • End-to-end data security necessitates a comprehensive recovery strategy and set of supporting techniques to ensure critical or sensitive data is not unintentionally lost or destroyed. • Common recovery site strategies include the following1: o Dual Data Center: Applications are split between two separate physical data center locations. o Internal Hot Site: Standby site that possesses necessary infrastructure to run identical environments. o External Hot Site: Recovery site provided by a third-party, lacks exact replication of infrastructure. o Warm Site: Facility leased by the organization that has some of the necessary equipment. o Cold Site: Empty data center recovery site that is not equipped with necessary assets/hardware. • Do the organization’s current disaster recovery and business continuity plans include recovery of security services? • How is sensitive or high-risk data addressed within the organization’s current recovery processes? • Are specific applications or information systems with sensitive, critical, or high-risk data prioritized within the recovery strategy? • How often are we checking that the designated recovery site meets requirements from a security and operational perspective? • Do we test our recovery process on a regular basis (annually) and make changes as needed? Considerations for Secure Recovery 1Source: Official Guide to CISSP CBK
  • 49. Info-Tech Research Group | 49 End-of-life sensitive and high-risk data requires specific destruction and deletion processes. • For both digital forms as well as physical forms of sensitive data, ensure that appropriate destruction, disposal, and deletion procedures are documented and carried out. • Ensure that your end-of-life destruction methods align with any governing compliance regulations and standards. Many regulatory laws will have limitations or guidelines in place for data destruction. o Physical: Disk shredding, burning, melting, degaussing o Digital: Wiping, deletion, electronic shredding, solid state shredding, overwriting/erasure/sanitization • It is recommended that you keep a record of the destruction through a log-book and document step-by-step processes that outline appropriate destruction techniques. • Opportunities exist to employ a third-party service for data destruction; however, prior to doing so, ensure that the company has appropriate certificates of sanitization, complies with relevant regulations, and documents its processes. Establish Data Destruction Processes Data Destruction Considerations Time Resourcing Cost While methods such as overwriting may seem preferable to physical forms of destruction, consider the time spent when evaluating high-capacity data sources. Are you planning to outsource data destruction, or will the process be carried out in-house? If the latter is true, do assigned members of staff have the knowledge and capabilities to ensure proper data destruction? For physical methods of destruction, there are often high capital expenses associated with the process. This extends to costs associated with degaussing products as well as additional forensic methods that may be leveraged as a part of the validation process.
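The digital "overwriting/erasure" method above can be sketched as: overwrite the file's bytes with random data, flush to disk, then delete. This is an illustrative sketch only — on SSDs, wear-leveled flash, and journaling or copy-on-write filesystems, overwrite-in-place is not guaranteed to reach the original blocks, which is why certified sanitization tooling and certificates of destruction matter.

```python
# Sketch of overwrite-before-delete for digital data destruction.
# NOT a substitute for certified sanitization (NIST SP 800-88 style);
# storage hardware may retain copies this method cannot reach.
import os
import tempfile

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with random bytes
            f.flush()
            os.fsync(f.fileno())       # push the overwrite to stable storage
    os.remove(path)

# Demonstrate on a throwaway file holding fake "sensitive" content.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"restricted: draft M&A terms")
overwrite_and_delete(path)
print(os.path.exists(path))  # False: contents overwritten, file removed
```

A production process would also append an entry to the destruction log-book the slide recommends, recording what was destroyed, when, how, and by whom.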
  • 50. Info-Tech Research Group | 50 Establish Appropriate Authentication within the Organization Managing the identity access controls around data extends to putting in place appropriate and aligned methods for authentication. Effective authentication often requires a combination of multiple techniques, as well as significant process redesign in order to safely validate user identity. When including authentication as a part of the data security access control layer, consider both the strength of the technical controls, as well as the time and effort it will take to implement and change behaviors amongst your employees. Security Considerations • Is the organization consistently taking on more sensitive or high-risk information? • Are the clearance and general security requirements from our customers and partners increasing? • What tools (identity, location, keys) can we use to help validate how our employees are accessing corporate information? Employee/User Considerations • Will implementing this significantly impact employee experience? • Will the challenges in implementing and changing behaviors create resistance amongst employees to adhere to new authentication standards? • How will organization-wide authentication methods better facilitate a remote-work environment?
  • 51. Info-Tech Research Group | 51 Improve your chances of fending off external attacks through MFA. • Multi-Factor Authentication (MFA) has evolved from two-factor authentication and requires the use of two or more independent methods of authenticating a user’s identity. • The intention behind effective MFA is that the different authentication mechanisms originate from separate or independent channels, complicating the infiltration route for malicious attackers. • MFA provides increased protection against phishing attacks, keyloggers, credential stuffing, and man-in-the-middle attacks. • When setting up MFA, it is integral to align the types of authentication factors with the sensitivity level and risk of the information or assets being accessed. • In addition to the four primary authentication factors, more evolved versions of MFA, called Adaptive MFA, (generally with the assistance of AI) incorporate contextual and behavioral elements including: o Time of access o Device being used for access o Type of network being accessed MFA Credentials and Authentication Mechanisms Evaluate the Layers of Multi- Factor Authentication Biometrics Passwords SMS / Email Code Verification Tokens Knowledge or Security Questions Risk Score/Risk-Based Authentication Social Media Platform Login
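The "code verification" factor listed above is typically an HOTP/TOTP one-time code. HOTP (RFC 4226) is the building block: an HMAC over a moving counter, dynamically truncated to a short digit code; TOTP simply derives the counter from the clock. A minimal stdlib sketch, checkable against the RFC's published test vectors:

```python
# Minimal HOTP (RFC 4226) sketch -- the algorithm behind authenticator-app
# one-time codes. TOTP (RFC 6238) is hotp(secret, int(time.time()) // 30).
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    # HMAC-SHA1 over the 8-byte big-endian counter
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 Appendix D test vectors for the ASCII secret "12345678901234567890"
print(hotp(b"12345678901234567890", 0))  # 755224
print(hotp(b"12345678901234567890", 1))  # 287082
```

Because the server and the user's device compute the code independently from a shared secret, the code itself never transits a channel an attacker could passively harvest in advance — the "separate channel" property the slide describes.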
  • 52. Info-Tech Research Group | 52 SSO extends authentication controls outside the realm of the organization. • Through SSO, an employee can access multiple different applications or systems within the organization’s environment without signing in to each one individually. • The process involves the Identity Provider authenticating user access to the various service providers (think applications) through one singular use of credentials and log-in process. • SSO significantly reduces user friction in how employees access information systems, applications, and corporate data. • SSL (Secure Sockets Layer) and S/MIME certificates are integral components of SSO. They provide the employee or end-user the ability to access any SSL-enabled servers that the individual has been granted access to without needing to submit any additional verification steps. Using Single Sign-On for User Authentication Source: IAPP Introduction to Privacy for Technology Professionals SSO Website Website App Website/App Source: Adapted from OneLogin Trusted partnership All Pages Trusted partnership Identity Provider
  • 53. Info-Tech Research Group | 53 Completely Automated Public Turing test to tell Computers and Humans Apart. • CAPTCHAs differ slightly from the previous forms and mechanisms of authentication, as they have a singular purpose of verifying whether or not a user is human vs. validating actual user identities, known as a type of Human Interaction Proof (HIP). • Over their two-decade lifespan, the effectiveness of traditional CAPTCHAs has weakened. The advent of AI technology has made the traditional anti-robot approach of CAPTCHAs relatively easy to bypass. • Additionally, CAPTCHAs often diminish the end-user experience; think overly-complex characters, or multiple images that have a tiny fragment contained within them of the object to identify. • Google’s risk-based reCAPTCHA, which leverages AI and machine learning, provides a solution to the dated CAPTCHAs of the past. • While reCAPTCHA is currently used on over 4.5 million sites globally, there are significant concerns around the user privacy implications of the system, including previously-installed Google browser cookies. CAPTCHAs’ Role in Authentication SI CAPTCHA Math Captcha Are You a Human Geetest WP reCAPTCHA Tencent Waterproof Wall CAPTCHA Solutions Source: BuiltWith
  • 54. Info-Tech Research Group | 54 Protect Data’s Integrity throughout its Organizational Use In order to ensure data’s accuracy and consistency from initial creation through until destruction, controls around integrity and quality become valuable security mechanisms. Accuracy Consistency • Is the data in each system or source in which it appears, accurate and a true representation of its original format? • During the process of transmitting data, has it been altered or modified in some way that it no longer retains its intent and accuracy? • Data that retains its accuracy tends to be increasingly reusable and maintainable within the context of the organization’s data uses. • Data accuracy can be compromised during transfers, due to either intentional or unintentional alterations made to the data. • Data that is consistent drives its ability to be easily recovered, searchable, reliable, and traceable. • Inconsistencies in data can be driven by a multitude of factors, including human error, compromised or impacted assets, as well as malicious external actors (hackers). • In implementing mechanisms and controls to ensure the consistency of sensitive data, organizations protect themselves from the harmful interception and modification of their high-risk, high-value information.
  • 55. Info-Tech Research Group | 55 Understand the role of hash functions in techniques that support data integrity. • Hash functions or algorithms secure data through the transmission or transfer process by taking an input (data) and producing a small output that is dependent or unique to the input. • If the input cannot be identified or deduced from the output, it means that the hashing function is strong. Hashing is known as a one-way (unidirectional) algorithm. • Hashing prevents data from being altered or tampered with when it is being transmitted from the creator or sender to the recipient while it is in transit. • Hash functions can be used and applied as a data security control in the following ways: • Identification of versions of documents • Password verification (authentication) • Comparison of files to ensure equality and integrity • Digitally signing documents (we will explore this in the following Digital Integrity section) Data Hashing’s Role in Ensuring Integrity Hashing Source: IAPP Introduction to Privacy for Technology Professionals Hashing Algorithm Plain text Hashed Text
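The "comparison of files to ensure equality and integrity" use case can be shown in a few lines with Python's standard `hashlib`: two inputs are considered identical exactly when their digests match, and even a one-character alteration in transit produces a completely different digest. The message contents below are invented for illustration.

```python
# Integrity check via SHA-256 digests: matching digests => unaltered data.
# The message strings are illustrative, not from any real document.
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"Quarterly results: net revenue $4.2M"
received = b"Quarterly results: net revenue $4.2M"   # faithful copy
tampered = b"Quarterly results: net revenue $9.2M"   # one character changed

print(digest(original) == digest(received))  # True: integrity preserved
print(digest(original) == digest(tampered))  # False: alteration detected
```

Note that a bare hash only detects accidental or unauthenticated alteration; an attacker who can replace the data can replace the hash too. Binding the hash to a key (HMAC) or a signature — as the following digital-signature slides describe — closes that gap.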
  • 56. Info-Tech Research Group | 56 Ensure accuracy and consistency of data from sender to receiver. • Digital signatures derive from eSignatures (electronic signatures) and serve as a digital certification or fingerprint, ensuring that a document has not been altered since it was signed. • Digital signatures can be grouped into three separate Digital Signature Certificate classes: o Class 1: Equipped with a baseline level of security, used only for low-risk, low-sensitivity data and documents. Do not hold legal tenure and confirm only name and email address. o Class 2: Can be used for electronic filing of specific documents, such as tax filing purposes. Equipped with a level of security that enables both business and personal use. Compares information from sender with data in a recognized consumer database. o Class 3: Highest degree of certificates for digital signatures and applies to the exchange of highly sensitive or high-value data and information. Can be leveraged for patent and trademark filing, procurement, etc. • Digital signatures promote increased trust and transparency to better validate online interactions between an organization and its customers, as well as third-parties and other external entities. • Digital signatures help to provide proof of the origin, identity, status, and accuracy of an electronic document, message, or transaction. Validating Data Integrity through Digital Signatures $1,534.8 Million is the size of the digital signatures market as of 2019 APAC Is the fastest-growing market by region for digital signatures Elimination of paperwork (and) Government policies supporting market growth are the two key drivers for the Digital Signature market. Source: PSMarket Research
  • 57. Info-Tech Research Group | 57 Digital signatures leverage Public Key Infrastructure (PKI) in order to validate integrity of the document. This requires the use of a mathematical algorithm along with one public and one private key. From Digital Signatures to Secure Data – A Journey Info-Tech Insight The ways in which you secure your data evolves to match the external landscape, and so should you. Should digital signatures not currently be used in the organization, understand how they can fit within the landscape of your organization as we see an increased prevalence of remote work. As remote work environments become the norm, an uptake in the acceptance of digital signatures is anticipated. Public Key (Receiver) Private Key (Sender) Document/Data Digital Signature Document Hash Algorithm Hash Value Document/Data Recipient Document/Data Creator
  • 58. Info-Tech Research Group | 58 An increasingly digital economy brings with it the advent of a new form of digital integrity. • Though a seeming outlier in traditional data security, blockchain upholds each pillar of the CIA triad. • A blockchain keeps record of a series of transactions, each one of which generates a hash. This hash is dependent on both the transaction itself, as well as the hash of the previous transaction. o Nonce: “Number used only once” refers to a random whole number that is added to a hashed block within a blockchain. o Nodes: The spectrum of computers that contribute to a blockchain. o Block: Refers to one individual block of records within a blockchain – the entire group of blocks composes the blockchain. • Blockchain is built on the following pillars: o Decentralization: There is no central governing access to the data contained within a blockchain due to its distributed ledgers; therefore, it ensures a level of data security as there is no singular surface or point for attacks. o Transparency: Although privacy is a paramount consideration of blockchain, the ledger lends itself to the principle of transparent visibility by providing an essentially anonymized view of a full host of transactions conducted within the blockchain. o Integrity: The cryptographic hash functions of a blockchain ensure the data entered into the blockchain itself cannot be altered or modified. Blockchain Transactions The Blockchain Buzz Transaction is requested A representational “Block” is created “Block” sent to all “Nodes” in the network “Nodes” approve the transaction, “Block” added to Blockchain Transaction complete Source: Adapted from MLSDev
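The integrity pillar — each block's hash depending on the previous block's hash — can be demonstrated with a toy hash chain. This is a minimal sketch of the chaining idea only (the transaction records and nonces are invented); it omits consensus, distribution across nodes, and proof-of-work.

```python
# Toy hash-chained ledger: each block's hash covers the previous block's
# hash, so editing any record invalidates every block after it.
import hashlib

def block_hash(prev_hash: str, data: str, nonce: int) -> str:
    return hashlib.sha256(f"{prev_hash}|{data}|{nonce}".encode()).hexdigest()

chain = []
prev = "0" * 64  # conventional all-zero "genesis" predecessor
for i, record in enumerate(["pay A 5", "pay B 3", "pay C 7"]):
    h = block_hash(prev, record, nonce=i)
    chain.append({"data": record, "prev": prev, "nonce": i, "hash": h})
    prev = h

def verify(chain) -> bool:
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev or block_hash(prev, b["data"], b["nonce"]) != b["hash"]:
            return False
        prev = b["hash"]
    return True

print(verify(chain))           # True: untouched chain validates
chain[1]["data"] = "pay B 30"  # tamper with the middle block
print(verify(chain))           # False: the alteration is detected
```

In a real blockchain the tampered copy would also disagree with every other node's ledger, which is how decentralization and integrity reinforce each other.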
  • 59. Info-Tech Research Group | 59 Blockchain vs. the GDPR Evaluate the regulatory implications of blockchain technology. Although transparency is one of the key pillars of blockchain, the distributed ledger format of blockchain has raised questions around its viability for organizations in-scope for the GDPR. • Additionally, GDPR allocates to data subjects a set of specific rights with respect to data correction, deletion, and access. • Blockchain, however, as a result of the framework that upholds its pillar of integrity, is immutable, and as such data cannot be amended, modified, or deleted once the transaction’s “block” has been added to the blockchain. • Article 5 (2) as well as Article 24 within the GDPR denote the responsibilities of an identifiable data controller with respect to processing of personal data. • Blockchain, by nature, does not have a singular controlling body, but instead is comprised of many individual contributors (Articles 12- 23). Data Controller Data Subject Rights Lawful Bases Data Minimization • Of the six lawful bases for processing data (Article 6 – GDPR), the most tenuous, and the one that often necessitates further validation through a data protection impact assessment (DPIA), is legitimate interest. • When considered under the scope of the GDPR, this is likely the only lawful basis that applies to blockchain, calling into question the potential need for validation of how and why personal data is collected through a DPIA. • This guiding principle of data protection ensures that the controller collects the minimum amount of data necessary to perform the intended purpose and retain it only for the necessary period. • This conflicts with a distributed network’s framework of redundant data storage – a key component of blockchain.
  • 60. Info-Tech Research Group | 60 Avoid Secure Data Leaving the Organization with Data Loss Management Within each stage of the data lifecycle, there is potential for high-value, high-risk data to unintentionally leave the organization – a key problem that data loss management controls address. Data-at-Rest Data-in-Transit Data-in-Use • Identify and track where information is stored throughout the enterprise (data warehouses, data marts, data lakes). • Once located, assessment can subsequently be conducted to ensure that the locations in which the data sits are adequately protected and align with the compliance (external) and classification (internal) rules. • Data loss management also helps to ensure data-at-rest stored outside of the organization is protected. • As data moves dynamically throughout its lifecycle, its movement across the network can be tracked, monitored, and assessed. • As with data-at-rest, traditional DLP (data loss prevention) solutions can analyze network file transmission to identify any concerns with respect to the data types being transferred over the network. • Appropriate controls (encryption, hashing) can be placed around data-in-transit as a result of this analysis, to protect data both inside and outside of the organization. • Data being leveraged on endpoints is an integral part of data loss management, as it aims to restrict improper use of data by employees or other end-users. • While traditional DLP solutions provide a technical set of controls through rule sets and limitations that prevent certain actions from being taken, this must also be supported through policy creation and education of end-users and data owners.
  • 61. Info-Tech Research Group | 61 Identify where DLP can add value. • Data Leakage Prevention and Data Loss Prevention are often used interchangeably and focus on discovering and documenting all sensitive information that is stored within an environment while subsequently controlling its movement in and out of the environment. • NIST outlines data loss as a result of data leakage, data disappearance, or data damage. • DLP solutions can be viewed as cumbersome, as they require significant planning and governance prior to implementation and continuous monitoring and maintenance from a process owner. • An integral component of effective implementation of a DLP solution, be it standalone or leveraged as a part of existing technology, is a strong understanding of what types of data the organization wants to prevent from exfiltration – what is our most sensitive, or more sensitive, information, and how can we characterize that through policies? • Key features of DLP vendors include: o Data discovery o Policy creation capabilities and pre-set policies o Environment scanning o Endpoint support o Centralized reporting DLP’s Role within the Data Security Landscape Global DLP Market DRIVERS • Data breaches • Compliance and complexity of regulatory demands • Cloud computing • Protection of sensitive data • Opportunities for DLP in cloud instances INHIBITORS • Deployment complexity • Business driver for external collaboration • Requires comprehensive data inventory and mapping of organization’s data Expected Growth Rate by 2025 $4.7 billion Source: BusinessWire
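The "characterize sensitive data through policies" step boils down to content inspection: matching outbound text against patterns that describe data types the organization must not leak. A minimal sketch — the regexes below are simplified illustrations, far cruder than a commercial DLP engine's detectors (which add checksums, keyword proximity, and fingerprinting):

```python
# Sketch of DLP content inspection: flag text matching sensitive-data
# patterns. The policy names and regexes are simplified illustrations.
import re

POLICIES = {
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan(text: str) -> list[str]:
    """Return the names of policies the text appears to violate."""
    return [name for name, pattern in POLICIES.items() if pattern.search(text)]

print(scan("Customer SSN is 078-05-1120"))  # ['us_ssn']
print(scan("Meeting moved to 3 pm"))        # []
```

In a full DLP deployment this check would run at egress points (email gateway, web proxy, endpoint agent), and a hit would trigger the policy's action — block, quarantine, encrypt, or alert — plus the centralized reporting the slide lists.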
  • 62. Info-Tech Research Group | 62 The Cloud Access Security Broker advantage. • CASBs provide organizations with the elusive security layer sought after as the advent of cloud solutions has enabled operations to become increasingly agile, scalable, and readily available. • A CASB solution helps extend infrastructure security measures to the cloud environment, serving a different purpose than IDaaS, web app and enterprise firewalls, and secure web gateways. • CASBs can function either as on-premises or cloud-based security policy enforcement mechanisms, and are generally a result of one of four specific use-cases: o Increased Visibility o Regulatory Compliance o Data Security o Threat Protection • CASBs can be deployed in one of four ways: o Log Collection o Forward Proxy o Reverse Proxy o API Secure Data in the Cloud with a CASB Solution Source: Adapted from McAfee Source: SANS, Technical Approach to Securing SaaS CASBs Simplified CASB Model CASB Endpoints Third Parties Remote Users On-Premises
  • 63. Info-Tech Research Group | 63 Revisit your network security to mitigate loss of data-in-transit. • Network segregation is the process of segregating or splitting the organization’s critical networks from less-critical internal networks. • Network segmentation involves creating smaller sub-networks from the larger corporate network. • Both can play a vital role in the organization’s efforts to reduce risk of external attack or infiltration and are potential mechanisms for network security. Network segregation emphasizes safeguarding of the organization’s most critical information while it’s in transit. • How? Both network segmentation and segregation mitigate risk by housing separate networks, effectively enabling specific sensitive datasets to remain untouched and uncompromised in the case of a network breach and creating additional hoops for malicious attackers. • PCI DSS-compliant organizations are required to incorporate network segmentation or segregation as a method of securing cardholder data. • Effective network segregation efforts cannot take place in isolation. Instead of the InfoSec team owning the process, IT and the organization’s networking team, as well as any other impacted departments and end- users, need to be involved in order to ensure that the appropriate foresight and architectural considerations are a part of the project process. Segregate Your Networks to Mitigate Data Loss Source: Advisera ISO 27002 Guidance on Network Segregation 1. Divide large networks into separate network domains 2. Consider physical and logical segregation 3. Define domain perimeters 4. Define traffic rules between domains 5. Use authentication, encryption, and user-level network access controls 6. Consider network integration with business partners
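Step 1 of the ISO 27002 guidance — dividing a large network into separate domains — can be sketched with Python's standard `ipaddress` module. The address ranges and domain names below are illustrative assumptions, not a recommended addressing plan:

```python
# Sketch of carving a flat corporate /16 into /24 segments and assigning
# them to security domains. Ranges and domain names are illustrative.
import ipaddress

corporate = ipaddress.ip_network("10.0.0.0/16")
segments = list(corporate.subnets(new_prefix=24))  # 256 possible /24s

domains = {
    "cardholder-data": segments[0],  # isolated, per PCI DSS scoping
    "corporate-users": segments[1],
    "guest-wifi": segments[2],
}

print(domains["cardholder-data"])  # 10.0.0.0/24
# Membership checks are how firewall/traffic rules between domains
# (step 4) would classify a packet's source or destination:
print(ipaddress.ip_address("10.0.2.9") in domains["guest-wifi"])  # True
```

The segmentation itself is enforced by VLANs, routers, and firewall policy, not by the addressing plan alone; the plan is what makes the traffic rules between domains expressible and auditable.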
  • 64. Info-Tech Research Group | 64 Ensure that audit logs help monitor the effectiveness of technical controls. • Audit logs or audit trails ensure that any events or incidents that take place can be attributed to a point in time and a specific individual or user. • Software applications and systems, including your DLP solution, IDS, servers, and firewalls, should each have a respective attributed audit log. Effective Audit Logging Practices: • Ensure that audit logs are kept for key systems and applications within the organization and are aggregated in a central log management system. • Logs should be regularly reviewed and analyzed, with alerts attributed to security incidents. • Assign ownership over the audit log review process to an individual within your IT or InfoSec team. • Establish and document a retention strategy for logs. • Log capacity requirements should be identified and managed. • Ensure log files are treated as sensitive, as the information they contain can be an open book for potential malicious intruders. Audit your Data Security Controls Compliance Accountability Analysis Risk Management Audit logs are a key requirement of various compliance and regulatory obligations, including HIPAA, NERC-CIP, and ISO 27001. By tracing the start point of specific issues or incidents, audit logs help enforce user accountability within the organization. Tracking of event details enables an organization to better understand individual risk elements within the landscape of the operating environment. Assessment of events post-occurrence provides insight on how to improve risk management and general security practices.
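The core property above — every event tied to a time, a user, and an outcome — can be sketched with Python's standard `logging` module. The field names (`user`, `action`, `resource`, `outcome`) are an illustrative convention, not a standard; a central log management system would ingest these lines for the review and alerting practices listed.

```python
# Sketch of structured audit-log entries: timestamped, attributable events.
# Field names are an illustrative convention chosen for this example.
import logging

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s AUDIT %(message)s")
audit = logging.getLogger("audit")

def log_event(user: str, action: str, resource: str, outcome: str) -> str:
    """Emit one audit record and return it for inspection."""
    msg = f"user={user} action={action} resource={resource} outcome={outcome}"
    audit.info(msg)
    return msg

log_event("alice", "read", "hr_db/salaries", "allowed")
log_event("mallory", "export", "hr_db/salaries", "denied")
```

Treating the resulting log files as sensitive (as the slide advises) matters doubly here: the records name users, resources, and denied attempts — a map of exactly what an intruder would want to probe.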
  • 65. Info-Tech Research Group | 65 Encryption in its many forms is an effective mechanism for securing data throughout its lifespan within the organization and upholding the CIA triad. Encryption as a Data-Specific Technique to Secure Sensitive Information Confidentiality Encryption preserves confidentiality by ensuring that sensitive or high-risk data is not visible when sent from one party to another, until it is decrypted by the appropriate party. Encryption can be used for transmission of data but also as a part of storage of data both internal to the organization, as well as in external locations. Availability Encryption makes the secure transmission of data possible within organizations, but also outside of organizations when there are fewer protective controls that the data creators or owners can place on the data. Secure availability and authorized access are key benefits of leveraging encryption practices. Integrity The integrity pillar within the CIA triad is upheld through the respective processes of encrypting and decrypting messages by intended recipients so that the information contained is not tampered with or changed in the process of transmission, or while it is stored.
  • 66. Info-Tech Research Group | 66 Protect data as it moves both within and outside of the organization. S/MIME and PGP (Email) Public Key Implement Encryption for Data-in-Transit SSL and TLS (Web Browser/Server) SSH (Secure Remote Systems) • S/MIME ensures that information sent via email verifies the sender through a digital signature and is encrypted, ensuring that only the intended recipient sees the information sent out. • End-to-end encryption provided; uses asymmetric cryptography. • Pretty Good Privacy (PGP) provides users the ability to make their own PGP public keys vs. leveraging certificates. • Secure Sockets Layer (SSL) has evolved into Transport Layer Security (TLS), used to securely send data over the internet. • TLS is commonly used to encrypt data communicated between a web application and a server. Source: IAPP An Introduction to Privacy for Technology Professionals Private Key • Secure Shell (SSH) enables safe remote access (think servers and off-prem computers). • SSH uses public-key (asymmetric) cryptography to authenticate and establish the session, then symmetric-key encryption to protect the data exchanged. • Asymmetric encryption • Publicly available information • In public-key infrastructure (PKI), one of the two keys leveraged in the exchange of data is kept secret. • Sender and receiver are as such not required to share the same key; one is used to encrypt, the other to decrypt. • Symmetric encryption • In private-key cryptography, only one key is used in the exchange of data (the private key). • Sender and receiver must exchange the same key. • Tends to be faster than public key.
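For TLS specifically, Python's standard ssl module makes the properties that protect data-in-transit visible. This sketch only builds a client-side context and inspects it; no network connection is made:

```python
import ssl

# Sketch: a client-side TLS context using Python's standard library.
# create_default_context() enables certificate validation and hostname
# checking by default, which is what makes a TLS channel trustworthy
# for data-in-transit. Pinning a minimum version refuses legacy protocols
# (SSL and early TLS), per current best practice.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

print(context.check_hostname)                    # True
print(context.verify_mode == ssl.CERT_REQUIRED)  # True
```

A context like this would then be passed to an HTTPS or socket library when opening the connection; the symmetric session keys TLS actually encrypts with are negotiated via the asymmetric handshake described above.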
  • 67. Info-Tech Research Group | 67 Stored sensitive data should be encrypted to ensure end-to-end data security. Implement Encryption for Data-at-Rest Hardware/Software Storage-Level Database • Hardware encryption occurs at the device level, is self-contained, and leverages the device’s on-board security, which means it does not impact the host system and as such tends to result in improved performance. • It is a less-flexible and more costly form of encryption, as it cannot be easily extended to an entire system. • Software encryption is cost-effective, involves public or symmetric key encryption, and tends to carry a lower price tag than its hardware counterpart. • A feature of storage security that encrypts data during the archival stage of its lifecycle. • This includes encryption of data at the disk level (think hard drive – full disk encryption or FDE), file level (file-based encryption or FBE), on tape drives, and arrays. • Consists of the following encryption levels: cell-level, column-level, tablespace or database level, and file-level. • Encryption of databases is a feature of the Generally Accepted Privacy Principles for data privacy and protection. Transparent Data Encryption (TDE) Occurs at the database encryption level and allows for sensitive data stored in tables/tablespaces to be encrypted. Data can then be decrypted for use by authorized users and transparently displayed. In order to access or use the data, users must have access to the encryption certificate and “master key.” This enables the data to remain protected and secured as it is encrypted while at rest, but transparent and decrypted while in use only by these select authorized users. How does this secure data? It disincentivizes hackers from stealing data, as it becomes unreadable when not in use.
  • 68. Info-Tech Research Group | 68 A core concept explored within data privacy, data masking offers a valuable set of potential options for retaining and protecting highly sensitive information. Mask Data for a Privacy-Centric Approach to Data Security Obfuscation and Data Minimization Data Masking Deidentification/Pseudonymization Anonymization • The process of making datasets unrecognizable or untraceable through the use of specific techniques. • These include scrambling, substitution, shuffling, nulling, redaction, deletion, and introducing variances of numbers/dates/numerical complexes. • Data masking can be static, meaning the data is masked within the original environment prior to extraction and transfer. • Dynamic data masking occurs in real time, within the production database. • Privacy regulations such as the GDPR reinforce the importance of minimization techniques such as deidentification and pseudonymization. • Deidentified data has had specific values attributable to an individual replaced or removed, so that it is no longer directly identifiable to a data subject. • Pseudonymized data is one step back, as it removes personal identifiers from a dataset; however, it does not eliminate the possibility of the data being reidentified should another piece of personal data be added to the mix. • When data has no reasonable way of being reattributed to its relevant data subject, it is considered anonymized. • All personal identifiers, in this case, have been removed and can no longer be re-associated in hopes of reidentifying the data subject. • Data that has been fully anonymized often encounters issues around usability from the perspective of analytics. • It is difficult to guarantee full anonymization of data, and anonymization is often confused with deidentification in practice. Source: IAPP Privacy Program Management, PrivSector Report
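Several of the static masking techniques named above (redaction, partial masking, shuffling) can be sketched in a few lines of Python. The records and field names are invented for illustration:

```python
import random

# Sketch of static data masking applied before a dataset leaves its original
# environment: redaction, partial masking of an identifier, and shuffling a
# column to break row-level linkage. Records are illustrative.
records = [
    {"name": "Ada Lovelace", "ssn": "123-45-6789", "city": "London"},
    {"name": "Alan Turing",  "ssn": "987-65-4321", "city": "Wilmslow"},
]

def mask(records):
    rng = random.Random(0)   # fixed seed so the sketch is repeatable
    cities = [r["city"] for r in records]
    rng.shuffle(cities)      # shuffling: values survive, linkage does not
    return [
        {
            "name": "REDACTED",                # redaction
            "ssn": "***-**-" + r["ssn"][-4:],  # partial masking
            "city": city,                      # shuffled value
        }
        for r, city in zip(records, cities)
    ]

masked = mask(records)
```

Note that partial masking of a low-entropy field (like the last four SSN digits) still carries reidentification risk, which is why the slide distinguishes masking from true anonymization.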
  • 69. Info-Tech Research Group | 69 Data privacy laws fuel the need to mask sensitive data. • The role of data obfuscation is to make data unreadable, uninterpretable, useless, or unusable to anyone other than the intended recipient or user. • Data masking and tokenization are examples of data obfuscation. • Tokenization refers to replacing sensitive data with tokens that act as placeholders, while the original data is stored in a secure storage environment. The token(s) have no algorithmic or mathematical attachment to the original input data, so the substitution cannot be reversed without access to that secure store. • This form of obfuscation is popular in payment systems and financial services (think card numbers and bank account numbers). • Deidentification and pseudonymization are examples of data minimization, both of which are employed in the context of personal data, PII, and privacy regulations. • Privacy regulations and industry standards, including the GDPR and PCI DSS, have specific instructions around the appropriate use of data obfuscation and minimization techniques as part of data security. • The GDPR, for example, does not consider fully anonymized data to be personal data, as it is not attributable to an individual, whereas pseudonymized and deidentified data can be reattributed and is therefore still considered personal data. Apply Obfuscation and Minimization Techniques Fully-identifiable personal data Pseudonymized personal data Deidentified personal data Anonymized personal data Sources: IAPP Privacy Program Management, Tokenex
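The contrast between tokenization and pseudonymization described above can be sketched in Python. The in-memory dict standing in for a secure token vault and the hard-coded key are illustrative simplifications only; in production both would live in hardened, access-controlled storage:

```python
import hmac
import hashlib
import secrets

_vault = {}                      # stands in for a secure token vault
_pseudonym_key = b"demo-secret"  # assumption: a securely stored secret key

def tokenize(value: str) -> str:
    # Random token: no algorithmic link to the input, so the original
    # value is recoverable only through the vault lookup.
    token = secrets.token_hex(8)
    _vault[token] = value
    return token

def detokenize(token: str) -> str:
    return _vault[token]

def pseudonymize(value: str) -> str:
    # Keyed and deterministic: the same subject always maps to the same
    # pseudonym, so the data stays analytically useful but, as the slide
    # notes, can be re-linked if the key (another piece of data) leaks.
    return hmac.new(_pseudonym_key, value.encode(), hashlib.sha256).hexdigest()[:16]

token = tokenize("4111 1111 1111 1111")
alias = pseudonymize("jsmith")
```

The design difference is the point: tokenization trades reversibility for a vault dependency, while pseudonymization trades linkability risk for repeatability.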
  • 70. Info-Tech Research Group | 70 Input Output • Review of Section 1.2 slides and information • Data Security Matrix • Completed Tabs 3 to 10 in the Data Security Matrix • Overview of all data security controls in place within the organization Materials Participants • Laptop • Sticky notes • Markers • CISO/InfoSec lead • InfoSec managers and team • Privacy Officer/Privacy Program Manager • IT Director • IT team (optional) 1.2.1 Document Current Data Security Controls 2 - 6 hours 1. Using the Data Security Matrix tool, complete tabs 3 to 9 for all relevant data sources. • Note that prior to completing this Matrix tool, you will need to decide on the scope of the applications and systems evaluated. For example, you may choose to evaluate ALL applications and systems in which data is stored (see instructions on Tab 3 for this option), or scope down to only a specific application or system. 2. In column E, select the framework or industry standard your organization currently aligns with, wishes to align to, or is in scope for. 3. For each control item listed in column D, identify whether or not the control is in place for the data source in question, if it is partially deployed, or select N/A for controls that do not fit the particular data source or stage of the lifecycle. Note that selecting N/A will not impact the overall data security average. • Use columns F to K to match the control with the stage within the data lifecycle. • Note that “N/A” has been pre-selected for many of the controls that do not logically apply. 4. Review the “Security Average” in column L, and revisit controls that have listed “No” for more than three of the stages within the lifecycle. Identify whether or not these control areas require immediate or future focus. 5. Define and document a gap-closing initiative for data security controls in column M and/or include any relevant comments around why the control is or is not in place. 6. As a group, discuss outcomes and compare with the set of compliance or standard requirements for each control. Download Info-Tech’s Data Security Matrix
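As an illustration of steps 3 and 4, a per-control "security average" that excludes N/A entries could be computed as follows. The Yes/Partial/No weights are an assumption for the sketch, not taken from Info-Tech's Data Security Matrix tool:

```python
# Sketch: compute a security average for one control across the lifecycle
# stages, skipping N/A entries so they do not affect the result (as the
# Matrix's column L behaves). Weights are illustrative assumptions.
SCORES = {"Yes": 1.0, "Partial": 0.5, "No": 0.0}

def security_average(lifecycle_statuses):
    scored = [SCORES[s] for s in lifecycle_statuses if s != "N/A"]
    return sum(scored) / len(scored) if scored else None

# One control assessed across six lifecycle stages:
avg = security_average(["Yes", "Partial", "No", "N/A", "Yes", "Yes"])
print(round(avg, 2))  # 0.7
```

A control averaging low across most stages would then be a candidate for the gap-closing initiatives documented in column M.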
  • 71. Info-Tech Research Group | 71 Reference the Organization’s Data Inventory After completing the Data Security Matrix for each of the relevant data sources within the organization, validate that the data sources evaluated during Activity 1.2.1 include all of the data locations within the organization. • Review • Validate • Revise and Update • Document Info-Tech Insight The data inventory should not be viewed as a static repository or compliance requirement, but as a dynamic control itself within the data security process. Ensure it is regularly updated, revised, and assessed in order to build out the process surrounding effective data security within the organization.
  • 72. Info-Tech Research Group | 72 Input Output • Inventory of all data within the organization • Outputs from Activity 1.2.1 • Updated data inventory • Comprehensive and validated list of applications, systems, and physical locations that currently house highly sensitive or high-risk data (tier 4/5) Materials Participants • Optional: Info-Tech’s Data Classification Inventory tool. • Alternatively, leverage the organization’s data classification inventory for reference • CISO/InfoSec lead • InfoSec managers and team • Data team/DBAs • IT Director • IT team (optional) 1.2.2 Review and Validate the Data Inventory 1 hour 1. As a group, review the current data inventory: using Info-Tech’s Data Inventory tool, open Tab 3 – Data Inventory and filter for any data considered to be at the Sensitive or Classified level (tier 4/5). • If you have completed your own data inventory, ensure you filter to focus on only the highest and/or second-highest tier(s) of data. 2. Review Column I – Repository, and Column J – Physical Location, which detail where the data resides. Ensure that any additional locations in which the data exists while in use are noted within the inventory, as these will be mapped to help validate the Data Security Matrix. 3. Document the full list of repositories. Include any data sources captured in Activity 1.2.1 that may have been omitted. 4. In Tab 4 – Repository Classification, cross-reference repository controls listed in Columns F to AK with outputs from the Data Security Matrix tool, to ensure that both the Inventory and Matrix include controls in place for applications and systems. Assess and update the current data inventory for tier 4/5 sensitive or confidential data. Download Info-Tech’s Data Classification Inventory, or leverage the organization’s current data inventory tool
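Step 1's filter — isolating tier 4/5 data and collecting the repositories that must appear in the Data Security Matrix — can be sketched in Python. The inventory rows and field names are hypothetical stand-ins for the tool's columns:

```python
# Sketch: filter a data inventory for tier 4/5 (sensitive/classified) records
# and collect the distinct repositories housing them, which should match the
# data sources evaluated in Activity 1.2.1. Rows are illustrative.
inventory = [
    {"dataset": "payroll",         "tier": 5, "repository": "HR-DB", "location": "DC-1"},
    {"dataset": "marketing_list",  "tier": 2, "repository": "CRM",   "location": "Cloud"},
    {"dataset": "patient_records", "tier": 4, "repository": "EHR",   "location": "DC-2"},
]

high_risk = [row for row in inventory if row["tier"] >= 4]
repositories = sorted({row["repository"] for row in high_risk})
print(repositories)  # ['EHR', 'HR-DB']
```

Any repository appearing here but missing from the Matrix outputs of Activity 1.2.1 flags a gap to document in step 3.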
  • 73. Info-Tech Research Group | 73 Step 1.1 Step 1.2 Step 1.3 Step 1.3 Assess Compliance and Regulatory Frameworks This step will walk you through the following activities: • Identify Compliance Concerns • Review the Compliance Frameworks This step involves the following participants: • CISO/InfoSec lead • InfoSec managers and team • Privacy Officer/Privacy Program Manager • Internal Audit • IT Director • IT team (optional) Activities Outcomes of this step • Mapped comparison of the compliance obligations across all data sources • Identification of current data security compliance gaps Review Data Security Methodologies 1.3.1 Identify Compliance Concerns 1.3.2 Review the Compliance Frameworks
  • 74. Info-Tech Research Group | 74 A perspective on the global data privacy and compliance landscape. Data Privacy Compliance – An overview 107 countries in total have established data protection or data privacy laws, while only 18% of countries have no data protection laws in place. 79% of individuals state that they are very or somewhat concerned about how companies are using the data they collect about them. 21% of individuals believe that companies, vs. government or users, are responsible for maintaining data privacy. Info-Tech Insight Effective privacy and compliance help drive consumer confidence. Effective data privacy practices can give you a competitive advantage through transparency. Source: DataPrivacyManager
  • 75. Info-Tech Research Group | 75 Input Output • (Optional) Stakeholders involved can prepare a list of current compliance obligations. • Printed or digital document version of all relevant compliance and industry standards or frameworks. • Collective understanding of key compliance concerns and drivers within the organization. • Prioritized list of compliance concerns. Materials Participants • Laptop • Sticky notes • Markers • CISO/InfoSec lead • InfoSec managers and team • Privacy Officer/Privacy Program Manager • Internal Audit • IT Director • IT team (optional) 1.3.1 Identify Compliance Concerns 45 minutes 1. Bring together relevant stakeholders from the organization with significant knowledge around compliance obligations and governing regulations. 2. Using sticky notes, have each stakeholder write down one key concern or question around compliance adherence per sticky note. 3. Collect these and group together similar themes as they arise. Themes may include: • Access Control and Management • Data Lifecycle (deletion, archiving, retention periods) • Disclosure and Open Access of Data 4. Discuss with the group what is being put on the list and clarify any unusual or unclear obligations. 5. Determine the priority of concern, or severity, for each of the compliance adherence questions, and assign action items for any high-concern or high-severity questions raised. 6. Discuss and document for future reference. Discuss and document the primary compliance and regulatory obligations of the organization.