Level Up your Cloud Data
Security with Tokenization
Advantages and implementation models
eBook
Introduction
A handful of “tokens” used to be as good as gold at the local arcade. It
meant a chance to master skee-ball or prove yourself a pinball wizard.
Today, “tokenization” is a data security technology for protecting
sensitive data. By substituting a “token” with no intrinsic value for
sensitive data that does have value, often quite a lot of it, such as Social
Security numbers or birth dates, companies can keep the original
data safe in a secure vault while moving tokenized data throughout the
business.
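The substitution described above can be sketched as a toy vault. The class and method names here are invented for illustration; a production system would add access control, persistence, and audited detokenization.

```python
import secrets

class TokenVault:
    """Toy token vault (invented API, not any real product's implementation)."""

    def __init__(self):
        self._by_token = {}   # token -> original value
        self._by_value = {}   # original value -> token, so repeats reuse one token

    def tokenize(self, value: str) -> str:
        if value in self._by_value:
            return self._by_value[value]
        token = secrets.token_hex(8)      # random: no mathematical link to the value
        self._by_token[token] = value
        self._by_value[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # only systems with access to the vault can recover the original
        return self._by_token[token]

vault = TokenVault()
t = vault.tokenize("123-45-6789")         # e.g. a Social Security number
assert t != "123-45-6789"                 # the token carries no intrinsic value
assert vault.detokenize(t) == "123-45-6789"
```

The tokenized value can travel through the business; the vault stays behind your access controls.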
And today, one of the places just about every business is moving data is
to the cloud. Companies may be using cloud storage to replace legacy
onsite hardware or to consolidate data from across the business to
enable BI and analysis without affecting performance of operational
systems. To get the most out of this analysis, companies often need to
include sensitive data.
Three Risk-based Models for
Tokenizing Sensitive Data in your
Cloud Data Warehouse
But where do you start tokenizing data?
We see three main models for tokenization
on your cloud data journey. The best
choice for your company will depend on
the risks you’re facing:
Level 1: Tokenization just before
moving to the cloud data
warehouse
Level 2: Tokenization before
moving through the ETL Process
Level 3: Full end-to-end
tokenization
Why Tokenization is an Ideal Solution for the Cloud Data Warehouse
Companies that know they need to protect data in the cloud warehouse
may be considering options like encryption or anonymization.
Tokenization has clear, specific advantages over those two options:
#1 Tokens have no mathematical relationship to the original data, which
means unlike encrypted data, they can’t be broken or returned to their
original form. While many of us might think encryption is one of the
strongest ways to protect stored data, it has a few weaknesses, including
this big one: the encrypted information is simply a version of the original
plain text data, scrambled by math. If a hacker gets their hands on a set
of encrypted data and the key, they essentially have the source data.
That means breaches of sensitive PII, even of encrypted data, require
reporting under state data privacy laws. Tokenization, on the other hand,
replaces the plain text data with a completely unrelated “token” that
has no value if breached. Unlike encryption, there is no mathematical
formula or “key” that unlocks the data – the real data remains secure in
a token vault.
#2 Tokens can be made to match the relationships and distinctness of
the original data so that meta-analysis can be performed on tokenized
data. When one of the main goals of moving data to the cloud is to
make it available for analytics, tokenizing the data delivers a distinct
advantage: actions such as counts of new users, lookups of users in
specific locations, and joins of data for the same user from multiple
systems can be done on the secure, tokenized data. Analysts can gain
insight and find high-level trends without requiring access to the plain
text sensitive data. Standard encrypted data, on the other hand, must
be decrypted to operate on, and once the data is decrypted there’s
no guarantee it will be deleted and not be forgotten, unsecured, in
the user’s download folder. As companies seek to comply with data
privacy regulations, demonstrating to auditors that access to raw
PII is as limited as possible is also a huge bonus. Tokenization allows
you to feed tokenized data directly from Snowflake into whatever
application needs it, without requiring the data to be decrypted and
potentially exposed inadvertently to privileged users.
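One common way to make tokens preserve the distinctness of the original values, so counts and joins still line up, is deterministic tokenization. The keyed-hash sketch below is only a stand-in for a vault-backed token service; the key and email values are invented.

```python
import hmac, hashlib

SECRET = b"vault-side-key"  # illustrative; a real service keeps this (or a vault) private

def token(value: str) -> str:
    # deterministic: equal inputs yield equal tokens, so counts and joins still work
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()[:16]

crm     = {"alice@x.com", "bob@y.com"}       # users known to one system
billing = {"bob@y.com", "carol@z.com"}       # users known to another

crm_t     = {token(u) for u in crm}
billing_t = {token(u) for u in billing}

# distinct counts and joins on tokens match the plain text results
assert len(crm_t) == len(crm)
assert len(crm_t & billing_t) == len(crm & billing) == 1
```

The analyst learns that exactly one user appears in both systems without ever seeing an email address.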
#3 Tokens maintain a connection to the original data, so analysis
can be drilled down to the individual as needed. Anonymized data
is a security alternative that removes the personally identifiable
information by grouping data into ranges. It can keep sensitive data
safe while still allowing for high-level analysis. For example, you may
group customers by age range or general location, removing the
specific birth date or address. Analysts can derive some insights
from this, but if they want to change the cut or drill in, for example
comparing users aged 20 to 25 with users aged 20 to 30, they can't.
Anonymized data is limited by its original parameters, which may not
provide enough granularity or flexibility. And once the data
has been analyzed, if a user wants to send a marketing offer to the
group of customers, they can’t, because there’s no relationship to the
original, individual PII.
Tokenization essentially provides the best of both worlds: the
strong at-rest protection of encryption and the analysis opportunity
provided by anonymization. It delivers tough protection for sensitive
data while allowing flexibility to utilize the data down to the
individual. Tokenization allows companies to unlock the value of
sensitive data in the cloud.
Level 1: Tokenization just before moving to the cloud
data warehouse
If you’re concerned about protecting sensitive data in your destination
database, often a cloud data warehouse like Snowflake, you’re not
alone. While you may feel confident in the security measures in
place in your own datacenter, storing sensitive data in the cloud is a
different ballgame.
The first issue might be that you're consolidating sensitive data spread
across multiple databases, each with its own siloed security and log-in
requirements, into one central repository. This means a bad actor,
dishonest employee, or hacker needs to break into only one location to
access all your sensitive data, which creates a much bigger risk. And
this leads to the second issue: as more and more data
is stored in prominent cloud data warehouses, they have become a
top target for bad actors and nation states. Why should they target
Salesforce or Workday separately when all the same data can be found
in one place? The third concern might be about privileged access from
Snowflake employees or your Snowflake admins who could, but really
shouldn’t, have access to the sensitive data in your offsite database.
In these cases, it makes sense for you to choose “Level 1 Tokenization”:
tokenize data just before it goes into the cloud. By tokenizing data
before it is stored in the database, you ensure that only the people you
authorize have access to the plain text data.
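A Level 1 step can be sketched as a small transform applied to each row just before the warehouse load. The column names and toy tokenizer below are hypothetical, not a real pipeline's API.

```python
SENSITIVE_COLUMNS = {"ssn", "email"}   # hypothetical schema

def tokenize_row(row: dict, tokenize) -> dict:
    """Swap sensitive columns for tokens just before loading to the warehouse."""
    return {col: tokenize(val) if col in SENSITIVE_COLUMNS else val
            for col, val in row.items()}

# stand-in tokenizer: a real one would store the mapping in a secure vault
toy_vault = {}
def toy_tokenize(value):
    return toy_vault.setdefault(value, f"tok_{len(toy_vault)}")

row = {"id": 1, "ssn": "123-45-6789", "region": "TX"}
loaded = tokenize_row(row, toy_tokenize)
assert loaded["region"] == "TX"        # non-sensitive columns pass through
assert loaded["ssn"] == "tok_0"        # sensitive column replaced by a token
```

Only the `loaded` rows ever reach the cloud warehouse; the plain text stays on-premises with the vault.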
[Level 1 diagram: Source Database → ETL → Tokenization Stage → Destination Database (tokens)]
Level 2: Tokenization before moving through the
ETL Process
As you’re planning your path to the cloud, you may be concerned about
data as soon as it leaves the secure walls of your datacenter. This is
especially challenging for CISOs who’ve spent years hardening the security
perimeter only to have control wrested away as sensitive data is moved to
cloud data warehouses they don’t own. If you’re working with an outside
ETL (extract, transform, load) provider to help you prepare, combine, and
move your data, that will be the first step outside your perimeter to
raise concern. Even though you hired the provider, without years of built-up trust,
you may not want them to have access to sensitive data. Or it may even
be out of your hands—you may have agreements or contracts with your
customers that specify you can’t let any vendor or other entity have access
without written consent.
In this case, “Level 2 Tokenization” is appropriate. This takes one step
back in the data transfer path and tokenizes sensitive data before it even
reaches the ETL. Instead of direct connection to the source database,
the provider connects through the tokenization software which returns
tokens. ALTR partners with SaaS-based ETL providers like Matillion to make
this seamless for enterprises.
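The proxy idea can be sketched as a cursor-style wrapper the ETL tool reads through instead of the source database. The class and its API are invented for illustration; real drivers sit at the connection layer.

```python
import secrets

class TokenizingCursor:
    """Hypothetical read path: the ETL tool fetches through this wrapper
    instead of connecting directly to the source database, so it only
    ever sees tokens in sensitive columns."""

    def __init__(self, source_rows, sensitive_columns, tokenize):
        self._rows = source_rows
        self._sensitive = sensitive_columns
        self._tokenize = tokenize

    def fetchall(self):
        # tokenize sensitive values on the way out, before the ETL sees them
        return [
            {col: self._tokenize(val) if col in self._sensitive else val
             for col, val in row.items()}
            for row in self._rows
        ]

source = [{"name": "Ada", "ssn": "111-22-3333"}]
cur = TokenizingCursor(source, {"ssn"}, lambda v: "tok_" + secrets.token_hex(4))
rows = cur.fetchall()
assert rows[0]["name"] == "Ada"            # non-sensitive data passes through
assert rows[0]["ssn"] != "111-22-3333"     # the ETL provider never sees plain text
```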
[Level 2 diagram: Source Database → Tokenization Proxy or Driver → ETL → Destination Database (tokens)]
Level 3: Full end-to-end tokenization
If you’re a very large financial institution classified as a “critical vendor” by
the US government, you’re familiar with the rigorous security required. This
includes ensuring that ultra-sensitive financial data is highly secure – no
unauthorized users, inside or outside the enterprise, can have access to that
data, no matter where it is. You already have this nailed down on-prem, but
we’re living in the 21st century and everyone from marketing to operations
is saying “you have to go to the cloud.” In this case, you’ll need “Level 3
Tokenization”: full end-to-end tokenization of all your onsite databases all the
way through to your cloud data warehouse.
As you can imagine, this can be a complex task. It requires tokenization across
multiple on-premises systems before even starting the data transfer process.
The upside is that it can also shine a light on who’s accessing your data,
wherever it is. You’ll quickly hear from people throughout the company who
relied on sensitive data to do their jobs when the next report they run
returns nothing but tokens. This turns into a benefit by stopping “dark access”
to sensitive data.
Our customers found this so valuable that ALTR created a tool called
“the observer,” which can be hooked up to a source database to report every
connection, every user, every IP address, and every query run against it.
The process forces companies to understand the full lifecycle
of their data – where it’s being used and by whom – and helps ensure sensitive
data is truly secure.
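The kind of report such a tool produces can be illustrated in a few lines over a hypothetical access log; the field names and sample entries below are invented.

```python
from collections import Counter

# hypothetical access-log entries, one per query against the source database
access_log = [
    {"user": "jsmith",     "ip": "10.0.0.4", "query": "SELECT ssn FROM customers"},
    {"user": "jsmith",     "ip": "10.0.0.4", "query": "SELECT * FROM orders"},
    {"user": "report_svc", "ip": "10.0.1.9", "query": "SELECT ssn FROM customers"},
]

# summarize who touches the source data, and from where
queries_per_user = Counter(entry["user"] for entry in access_log)
distinct_ips = {entry["ip"] for entry in access_log}

assert queries_per_user["jsmith"] == 2
assert len(distinct_ips) == 2
```

A report like this makes "dark access" visible: every user and connection touching sensitive data is accounted for.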
[Level 3 diagram: Source Database (tokens) → ETL → Destination Database (tokens)]
Use the power of ALTR’s tokenization platform
ALTR’s tokenization platform provides unique data security benefits across your entire path to the cloud. Our SaaS-based approach means we can cover
data wherever it’s located: on-premises, in the cloud or even in other SaaS-based software like Salesforce. This also allows us to deliver innovations like new
token formats or new security features more quickly, with no need to upgrade. Our tokenization solutions also range from the most fundamental level
all the way up to PCI Level 1 compliant, allowing companies to choose the best balance of speed, security, and cost for their business. We’ve also invested
heavily in IP that enables our database driver to connect transparently and keep data usable while tokenized. The drivers can, for example, perform the
lookups and joins needed to keep applications that aren't built for tokenization running.
With tokenization from ALTR, you can bring sensitive data safely into the cloud to get full analytic value from it, while helping meet contractual security
requirements or the steepest regulatory challenges.
Complete data control
and protection
ALTR simplifies and unifies data governance and security, allowing anyone
to confidently store, share, analyze, and use their data. With ALTR, customers gain
unparalleled visibility into how sensitive data is used across their organization. This
intelligence can be used to create advanced policies to control data access. Through
ALTR’s cloud platform, customers can implement data governance and security in
minutes instead of months.
Get started for free at get.altr.com/free
More Related Content

Similar to eBook: Level Up Your Data Security with Tokenization

Global Security Certification for Governments
Global Security Certification for GovernmentsGlobal Security Certification for Governments
Global Security Certification for GovernmentsCloudMask inc.
 
Isaca journal - bridging the gap between access and security in big data...
Isaca journal  - bridging the gap between access and security in big data...Isaca journal  - bridging the gap between access and security in big data...
Isaca journal - bridging the gap between access and security in big data...Ulf Mattsson
 
F018133640.key aggregate paper
F018133640.key aggregate paperF018133640.key aggregate paper
F018133640.key aggregate paperIOSR Journals
 
Carrying out safe exploration short of the actual data of codes and trapdoors
Carrying out safe exploration short of the actual data of codes and trapdoorsCarrying out safe exploration short of the actual data of codes and trapdoors
Carrying out safe exploration short of the actual data of codes and trapdoorsIaetsd Iaetsd
 
Data security to protect pci data flow ulf mattsson - insecure-mag-40
Data security to protect pci data flow   ulf mattsson - insecure-mag-40Data security to protect pci data flow   ulf mattsson - insecure-mag-40
Data security to protect pci data flow ulf mattsson - insecure-mag-40Ulf Mattsson
 
Security issues in big data
Security issues in big data Security issues in big data
Security issues in big data Shallote Dsouza
 
Research Report on Preserving Data Confidentiality & Data Integrity in ...
Research Report on Preserving  Data  Confidentiality  &  Data  Integrity  in ...Research Report on Preserving  Data  Confidentiality  &  Data  Integrity  in ...
Research Report on Preserving Data Confidentiality & Data Integrity in ...Manish Sahani
 
Cashing in on the public cloud with total confidence
Cashing in on the public cloud with total confidenceCashing in on the public cloud with total confidence
Cashing in on the public cloud with total confidenceCloudMask inc.
 
IRJET- Exchanging Secure Data in Cloud with Confidentiality and Privacy Goals
IRJET- Exchanging Secure Data in Cloud with Confidentiality and Privacy GoalsIRJET- Exchanging Secure Data in Cloud with Confidentiality and Privacy Goals
IRJET- Exchanging Secure Data in Cloud with Confidentiality and Privacy GoalsIRJET Journal
 
Is data sovereignty the answer to cloud computing risks
Is data sovereignty the answer to cloud computing risksIs data sovereignty the answer to cloud computing risks
Is data sovereignty the answer to cloud computing risksCloudMask inc.
 
International Journal of Engineering Research and Development
International Journal of Engineering Research and DevelopmentInternational Journal of Engineering Research and Development
International Journal of Engineering Research and DevelopmentIJERD Editor
 
Gdpr ccpa steps to near as close to compliancy as possible with low risk of f...
Gdpr ccpa steps to near as close to compliancy as possible with low risk of f...Gdpr ccpa steps to near as close to compliancy as possible with low risk of f...
Gdpr ccpa steps to near as close to compliancy as possible with low risk of f...Steven Meister
 
10 Tips for CIOS Data Security in the Cloud
10 Tips for CIOS Data Security in the Cloud10 Tips for CIOS Data Security in the Cloud
10 Tips for CIOS Data Security in the CloudIron Mountain
 
10.1.1.436.3364.pdf
10.1.1.436.3364.pdf10.1.1.436.3364.pdf
10.1.1.436.3364.pdfmistryritesh
 
Extending Information Security to Non-Production Environments
Extending Information Security to Non-Production EnvironmentsExtending Information Security to Non-Production Environments
Extending Information Security to Non-Production EnvironmentsLindaWatson19
 
COST-EFFECTIVE AUTHENTIC AND ANONYMOUS DATA SHARING WITH FORWARD SECURITY
COST-EFFECTIVE AUTHENTIC AND ANONYMOUS DATA SHARING WITH FORWARD SECURITYCOST-EFFECTIVE AUTHENTIC AND ANONYMOUS DATA SHARING WITH FORWARD SECURITY
COST-EFFECTIVE AUTHENTIC AND ANONYMOUS DATA SHARING WITH FORWARD SECURITYShakas Technologies
 

Similar to eBook: Level Up Your Data Security with Tokenization (20)

Global Security Certification for Governments
Global Security Certification for GovernmentsGlobal Security Certification for Governments
Global Security Certification for Governments
 
Isaca journal - bridging the gap between access and security in big data...
Isaca journal  - bridging the gap between access and security in big data...Isaca journal  - bridging the gap between access and security in big data...
Isaca journal - bridging the gap between access and security in big data...
 
F018133640.key aggregate paper
F018133640.key aggregate paperF018133640.key aggregate paper
F018133640.key aggregate paper
 
Carrying out safe exploration short of the actual data of codes and trapdoors
Carrying out safe exploration short of the actual data of codes and trapdoorsCarrying out safe exploration short of the actual data of codes and trapdoors
Carrying out safe exploration short of the actual data of codes and trapdoors
 
Data security to protect pci data flow ulf mattsson - insecure-mag-40
Data security to protect pci data flow   ulf mattsson - insecure-mag-40Data security to protect pci data flow   ulf mattsson - insecure-mag-40
Data security to protect pci data flow ulf mattsson - insecure-mag-40
 
Security issues in big data
Security issues in big data Security issues in big data
Security issues in big data
 
Research Report on Preserving Data Confidentiality & Data Integrity in ...
Research Report on Preserving  Data  Confidentiality  &  Data  Integrity  in ...Research Report on Preserving  Data  Confidentiality  &  Data  Integrity  in ...
Research Report on Preserving Data Confidentiality & Data Integrity in ...
 
Cashing in on the public cloud with total confidence
Cashing in on the public cloud with total confidenceCashing in on the public cloud with total confidence
Cashing in on the public cloud with total confidence
 
IRJET- Exchanging Secure Data in Cloud with Confidentiality and Privacy Goals
IRJET- Exchanging Secure Data in Cloud with Confidentiality and Privacy GoalsIRJET- Exchanging Secure Data in Cloud with Confidentiality and Privacy Goals
IRJET- Exchanging Secure Data in Cloud with Confidentiality and Privacy Goals
 
Wp security-data-safe
Wp security-data-safeWp security-data-safe
Wp security-data-safe
 
Encrypt-Everything-eB.pdf
Encrypt-Everything-eB.pdfEncrypt-Everything-eB.pdf
Encrypt-Everything-eB.pdf
 
Is data sovereignty the answer to cloud computing risks
Is data sovereignty the answer to cloud computing risksIs data sovereignty the answer to cloud computing risks
Is data sovereignty the answer to cloud computing risks
 
International Journal of Engineering Research and Development
International Journal of Engineering Research and DevelopmentInternational Journal of Engineering Research and Development
International Journal of Engineering Research and Development
 
Gdpr ccpa steps to near as close to compliancy as possible with low risk of f...
Gdpr ccpa steps to near as close to compliancy as possible with low risk of f...Gdpr ccpa steps to near as close to compliancy as possible with low risk of f...
Gdpr ccpa steps to near as close to compliancy as possible with low risk of f...
 
10 Tips for CIOS Data Security in the Cloud
10 Tips for CIOS Data Security in the Cloud10 Tips for CIOS Data Security in the Cloud
10 Tips for CIOS Data Security in the Cloud
 
Ijcatr04051015
Ijcatr04051015Ijcatr04051015
Ijcatr04051015
 
10.1.1.436.3364.pdf
10.1.1.436.3364.pdf10.1.1.436.3364.pdf
10.1.1.436.3364.pdf
 
Data Sovereignty and the Cloud
Data Sovereignty and the CloudData Sovereignty and the Cloud
Data Sovereignty and the Cloud
 
Extending Information Security to Non-Production Environments
Extending Information Security to Non-Production EnvironmentsExtending Information Security to Non-Production Environments
Extending Information Security to Non-Production Environments
 
COST-EFFECTIVE AUTHENTIC AND ANONYMOUS DATA SHARING WITH FORWARD SECURITY
COST-EFFECTIVE AUTHENTIC AND ANONYMOUS DATA SHARING WITH FORWARD SECURITYCOST-EFFECTIVE AUTHENTIC AND ANONYMOUS DATA SHARING WITH FORWARD SECURITY
COST-EFFECTIVE AUTHENTIC AND ANONYMOUS DATA SHARING WITH FORWARD SECURITY
 

More from Kim Cook

White Paper: The Age of Data
White Paper: The Age of DataWhite Paper: The Age of Data
White Paper: The Age of DataKim Cook
 
Customer Story: Indian Farm Equipment
Customer Story: Indian Farm EquipmentCustomer Story: Indian Farm Equipment
Customer Story: Indian Farm EquipmentKim Cook
 
Customer Story: University of Munster
Customer Story: University of MunsterCustomer Story: University of Munster
Customer Story: University of MunsterKim Cook
 
ALTR Company Overview 2023
ALTR Company Overview 2023ALTR Company Overview 2023
ALTR Company Overview 2023Kim Cook
 
ALTR Redwood Logistics Case Study
ALTR Redwood Logistics Case StudyALTR Redwood Logistics Case Study
ALTR Redwood Logistics Case StudyKim Cook
 
E2open Corporate Overview
E2open Corporate OverviewE2open Corporate Overview
E2open Corporate OverviewKim Cook
 

More from Kim Cook (6)

White Paper: The Age of Data
White Paper: The Age of DataWhite Paper: The Age of Data
White Paper: The Age of Data
 
Customer Story: Indian Farm Equipment
Customer Story: Indian Farm EquipmentCustomer Story: Indian Farm Equipment
Customer Story: Indian Farm Equipment
 
Customer Story: University of Munster
Customer Story: University of MunsterCustomer Story: University of Munster
Customer Story: University of Munster
 
ALTR Company Overview 2023
ALTR Company Overview 2023ALTR Company Overview 2023
ALTR Company Overview 2023
 
ALTR Redwood Logistics Case Study
ALTR Redwood Logistics Case StudyALTR Redwood Logistics Case Study
ALTR Redwood Logistics Case Study
 
E2open Corporate Overview
E2open Corporate OverviewE2open Corporate Overview
E2open Corporate Overview
 

Recently uploaded

AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAndrey Devyatkin
 
Advantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessAdvantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessPixlogix Infotech
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Drew Madelung
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsMaria Levchenko
 
Real Time Object Detection Using Open CV
Real Time Object Detection Using Open CVReal Time Object Detection Using Open CV
Real Time Object Detection Using Open CVKhem
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdfhans926745
 
Scaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationScaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationRadu Cotescu
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century educationjfdjdjcjdnsjd
 
HTML Injection Attacks: Impact and Mitigation Strategies
HTML Injection Attacks: Impact and Mitigation StrategiesHTML Injection Attacks: Impact and Mitigation Strategies
HTML Injection Attacks: Impact and Mitigation StrategiesBoston Institute of Analytics
 
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024The Digital Insurer
 
Developing An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilDeveloping An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilV3cube
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationMichael W. Hawkins
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...Martijn de Jong
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?Antenna Manufacturer Coco
 
Tech Trends Report 2024 Future Today Institute.pdf
Tech Trends Report 2024 Future Today Institute.pdfTech Trends Report 2024 Future Today Institute.pdf
Tech Trends Report 2024 Future Today Institute.pdfhans926745
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsJoaquim Jorge
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonAnna Loughnan Colquhoun
 

Recently uploaded (20)

AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of Terraform
 
Advantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessAdvantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your Business
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed texts
 
Real Time Object Detection Using Open CV
Real Time Object Detection Using Open CVReal Time Object Detection Using Open CV
Real Time Object Detection Using Open CV
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf
 
Scaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationScaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organization
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century education
 
HTML Injection Attacks: Impact and Mitigation Strategies
HTML Injection Attacks: Impact and Mitigation StrategiesHTML Injection Attacks: Impact and Mitigation Strategies
HTML Injection Attacks: Impact and Mitigation Strategies
 
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
 
Developing An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilDeveloping An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of Brazil
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day Presentation
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?
 
Tech Trends Report 2024 Future Today Institute.pdf
Tech Trends Report 2024 Future Today Institute.pdfTech Trends Report 2024 Future Today Institute.pdf
Tech Trends Report 2024 Future Today Institute.pdf
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and Myths
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt Robison
 

eBook: Level Up Your Data Security with Tokenization

  • 1. Level Up your Cloud Data Security with Tokenization Advantages and implementation models eBook
  • 2. Introduction A handful of “tokens” used to be as good as gold at the local arcade. It meant a chance to master skee-ball or prove yourself a pinball wizard. Today, “tokenization” is a data security technology for protecting sensitive data. By substituting a “token” with no intrinsic value in place of sensitive data, such as social security numbers or birth dates that do have value, usually quite a lot of it, companies can keep the original data safe in a secure vault, while moving tokenized data throughout the business. And today, one of the places just about every business is moving data is to the cloud. Companies may be using cloud storage to replace legacy onsite hardware or to consolidate data from across the business to enable BI and analysis without affecting performance of operational systems. To get the most of this analysis, companies often need to include sensitive data. 2 Three Risk-based Models for Tokenizing Sensitive Data in your Cloud Data Warehouse But where do you start tokenizing data? We see three main models for tokenization on your cloud data journey. The best choice for your company will depend on the risks you’re facing: Level 1: Tokenization just before moving to the cloud data warehouse Level 2: Tokenization before moving through the ETL Process Level 3: Full end-to-end tokenization
  • 3. Why Tokenization is an Ideal Solution for the Cloud Data Warehouse

Companies that know they need to protect data in the cloud warehouse may be considering options like encryption or anonymization. Tokenization has some clear, specific advantages over both:

#1 Tokens have no mathematical relationship to the original data, which means that unlike encrypted data, they can’t be broken or returned to their original form. While many of us might think encryption is one of the strongest ways to protect stored data, it has a few weaknesses, including this big one: the encrypted information is simply a version of the original plain text data, scrambled by math. If a hacker gets their hands on a set of encrypted data and the key, they essentially have the source data. That means breaches of sensitive PII, even of encrypted data, require reporting under state data privacy laws. Tokenization, on the other hand, replaces the plain text data with a completely unrelated “token” that has no value if breached. Unlike encryption, there is no mathematical formula or “key” that unlocks the data; the real data remains secure in a token vault.

#2 Tokens can be made to match the relationships and distinctness of the original data, so meta-analysis can be performed on tokenized data. When one of the main goals of moving data to the cloud is to make it available for analytics, tokenizing the data delivers a distinct advantage: actions such as counting new users, looking up users in specific locations, and joining data for the same user from multiple systems can be done on the secure, tokenized data. Analysts can gain insight and find high-level trends without requiring access to the plain text sensitive data. Standard encrypted data, on the other hand, must be decrypted to operate on, and once the data is decrypted there’s no guarantee it will be deleted and not forgotten, unsecured, in a user’s download folder.

As companies seek to comply with data privacy regulations, demonstrating to auditors that access to raw PII is as limited as possible is also a huge bonus. Tokenization allows you to feed tokenized data directly from Snowflake into whatever application needs it, without requiring data to be decrypted and potentially exposed inadvertently to privileged users.

#3 Tokens maintain a connection to the original data, so analysis can be drilled down to the individual as needed. Anonymization is a security alternative that removes personally identifiable information by grouping data into ranges. It can keep sensitive data safe while still allowing high-level analysis. For example, you may group customers by age range or general location, removing the specific birth date or address. Analysts can derive some insights from this, but if they wish to change the cut or focus in, for example looking at users aged 20 to 25 versus 20 to 30, there’s no ability to do so. Anonymized data is limited by its original parameters, which might not provide enough granularity or flexibility. And once the data has been analyzed, if a user wants to send a marketing offer to a group of customers, they can’t, because there’s no relationship back to the original, individual PII.

Tokenization essentially provides the best of both worlds: the strong at-rest protection of encryption and the analysis opportunities of anonymization. It delivers tough protection for sensitive data while allowing the flexibility to utilize the data down to the individual. Tokenization allows companies to unlock the value of sensitive data in the cloud.
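Advantages #2 and #3 can be shown together with a small sketch. This is illustrative Python, not ALTR's implementation: making tokenization deterministic (the same value always maps to the same random token) preserves equality, so counts and joins work on tokens, while the vault still allows authorized drill-down to the individual.

```python
import secrets

VALUE_TO_TOKEN = {}  # lives inside the tokenization service
VAULT = {}           # token -> original, for authorized drill-down only

def tokenize(value: str) -> str:
    # Deterministic: a value seen before returns its existing token,
    # so distinctness and relationships in the data are preserved.
    if value not in VALUE_TO_TOKEN:
        token = "tok_" + secrets.token_hex(8)
        VALUE_TO_TOKEN[value] = token
        VAULT[token] = value
    return VALUE_TO_TOKEN[value]

# Two systems tokenize their user emails independently.
crm_users     = {tokenize(e) for e in ["a@x.com", "b@x.com"]}
billing_users = {tokenize(e) for e in ["b@x.com", "c@x.com"]}

# Analysts can join and count on tokens without seeing any email...
overlap = crm_users & billing_users
assert len(overlap) == 1

# ...and authorized users can still drill down to the individual.
assert {VAULT[t] for t in overlap} == {"b@x.com"}
```

Anonymized age ranges could never support that last step; deterministic tokens keep the path back to the individual open for those with vault access.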
  • 4. Level 1: Tokenization just before moving to the cloud data warehouse

If you’re concerned about protecting sensitive data in your destination database, often a cloud data warehouse like Snowflake, you’re not alone. While you may feel confident in the security measures in place in your own datacenter, storing sensitive data in the cloud is a different ballgame.

The first issue might be that you’re consolidating sensitive data spread across multiple databases, each with its own siloed and segmented security and login requirements, into one central repository. This means a bad actor, dishonest employee, or hacker needs to sneak into just one location to have access to all your sensitive data. It creates a much bigger risk. And this leads to the second issue: as more and more data is stored in prominent cloud data warehouses, they have become a top target for bad actors and nation states. Why should they target Salesforce or Workday separately when all the same data can be found in one place? The third concern might be privileged access from Snowflake employees or your own Snowflake admins, who could, but really shouldn’t, have access to the sensitive data in your offsite database.

In these cases, it makes sense to choose “Level 1 Tokenization”: tokenize data just before it goes into the cloud. By tokenizing the data stored in the database, you ensure that only the people you authorize have access to the plain text data.

[Diagram: Source Database → ETL → Tokenization Stage → Destination Database (tokens), Level 1]
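A Level 1 pipeline can be sketched as below. This is a hypothetical illustration: the column names and the warehouse load step are placeholders, not ALTR's API, but it shows the key idea that tokenization is the last stop before the cloud.

```python
import secrets

SENSITIVE = {"ssn", "birth_date"}
VAULT = {}  # stays on-premises; only tokens travel to the warehouse

def tokenize(value):
    token = "tok_" + secrets.token_hex(6)
    VAULT[token] = value
    return token

def tokenize_row(row):
    # Swap out sensitive columns just before the load step.
    return {k: tokenize(v) if k in SENSITIVE else v for k, v in row.items()}

def load_to_warehouse(rows):
    # Placeholder for a real warehouse load, e.g. a Snowflake COPY INTO.
    return list(rows)

extracted = [{"user_id": 1, "ssn": "123-45-6789", "city": "Austin"}]
loaded = load_to_warehouse(tokenize_row(r) for r in extracted)
assert loaded[0]["ssn"].startswith("tok_")  # warehouse stores only tokens
assert loaded[0]["city"] == "Austin"        # non-sensitive data unchanged
```

Even if the warehouse account is breached, the attacker sees only tokens; the vault mapping them back to real values never left the datacenter.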
  • 5. Level 2: Tokenization before moving through the ETL process

As you’re planning your path to the cloud, you may be concerned about data as soon as it leaves the secure walls of your datacenter. This is especially challenging for CISOs who’ve spent years hardening the security perimeter, only to have control wrested away as sensitive data is moved to cloud data warehouses they don’t own.

If you’re working with an outside ETL (extract, transform, load) provider to help you prepare, combine, and move your data, that will be the first step outside your perimeter causing concern. Even though you hired them, without years of built-up trust you may not want them to have access to sensitive data. Or it may even be out of your hands: you may have agreements or contracts with your customers specifying that you can’t let any vendor or other entity have access without written consent.

In this case, “Level 2 Tokenization” is appropriate. This takes one step back in the data transfer path and tokenizes sensitive data before it even reaches the ETL provider. Instead of connecting directly to the source database, the provider connects through the tokenization software, which returns tokens. ALTR partners with SaaS-based ETL providers like Matillion to make this seamless for enterprises.

[Diagram: Source Database → Tokenization Proxy or Driver → ETL → Destination Database (tokens), Level 2]
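The proxy-or-driver idea can be illustrated with a small sketch. All names here are hypothetical, and a real tokenizing driver would wrap an actual database connection; the point is that every result set is tokenized before the ETL provider ever sees it.

```python
import secrets

class TokenizingProxy:
    """Sits between the source database and the ETL provider."""

    def __init__(self, source_rows, sensitive_fields):
        self._rows = source_rows        # stands in for the real DB driver
        self._sensitive = sensitive_fields
        self.vault = {}                 # stays inside your perimeter

    def fetch_all(self):
        out = []
        for row in self._rows:
            clean = dict(row)
            for field in self._sensitive:
                token = "tok_" + secrets.token_hex(6)
                self.vault[token] = clean[field]
                clean[field] = token    # the ETL tool only ever sees this
            out.append(clean)
        return out

proxy = TokenizingProxy([{"email": "a@x.com", "plan": "pro"}], {"email"})
rows_for_etl = proxy.fetch_all()
assert rows_for_etl[0]["email"].startswith("tok_")  # ETL sees only tokens
assert rows_for_etl[0]["plan"] == "pro"
```

The ETL provider can still prepare, combine, and move the rows as usual; it simply never holds plain text PII, which is what the customer contracts above require.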
  • 6. Level 3: Full end-to-end tokenization

If you’re a very large financial institution classified as a “critical vendor” by the US government, you’re familiar with the rigorous security required. This includes ensuring that ultra-sensitive financial data is highly secure: no unauthorized users, inside or outside the enterprise, can have access to that data, no matter where it is. You already have this nailed down on-prem, but we’re living in the 21st century, and everyone from marketing to operations is saying “you have to go to the cloud.”

In this case, you’ll need “Level 3 Tokenization”: full end-to-end tokenization of all your onsite databases, all the way through to your cloud data warehouse. As you can imagine, this can be a complex task. It requires tokenization across multiple on-premises systems before even starting the data transfer process.

The upside is that it can also shine a light on who’s accessing your data, wherever it is. You’ll quickly hear from people throughout the company who relied on sensitive data to do their jobs when, the next time they run a report, all they get back is tokens. This turns into a benefit by stopping “dark access” to sensitive data. Our customers found this so valuable that ALTR created a tool called “the observer,” which they can hook up to a source database to get a report of every connection, every user, every IP address, and every query run against it. The process forces companies to understand the full lifecycle of their data, where it’s being used and by whom, and helps ensure sensitive data is truly secure.

[Diagram: Source Database (tokens) → ETL → Destination Database (tokens), Level 3]
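The kind of report an observer-style tool produces can be sketched as a simple aggregation over query logs. This is a hypothetical illustration of the idea, not ALTR's "observer" itself; the log entries and field names are invented for the example.

```python
from collections import Counter

# Illustrative connection log for a source database.
query_log = [
    {"user": "etl_svc", "ip": "10.0.0.5", "query": "SELECT ssn FROM users"},
    {"user": "jdoe",    "ip": "10.0.9.2", "query": "SELECT * FROM users"},
    {"user": "etl_svc", "ip": "10.0.0.5", "query": "SELECT email FROM users"},
]

def access_report(log):
    # Count queries per (user, source IP) pair to surface every
    # connection touching the database before cutover.
    return Counter((entry["user"], entry["ip"]) for entry in log)

report = access_report(query_log)
assert report[("etl_svc", "10.0.0.5")] == 2
assert report[("jdoe", "10.0.9.2")] == 1
```

Surfacing every (user, IP, query) combination before tokenizing end to end is what lets a company find the "dark access" described above, rather than discovering it when reports start returning tokens.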
  • 7. Use the power of ALTR’s tokenization platform

ALTR’s tokenization platform provides unique data security benefits across your entire path to the cloud. Our SaaS-based approach means we can cover data wherever it’s located: on-premises, in the cloud, or even in other SaaS-based software like Salesforce. This also allows us to deliver innovations, like new token formats or new security features, more quickly, with no need to upgrade. Our tokenization solutions range from the most fundamental level all the way up to PCI Level 1 compliance, allowing companies to choose the best balance of speed, security, and cost for their business.

We’ve also invested heavily in IP that enables our database driver to connect transparently and keep data usable while tokenized. The drivers can, for example, perform the lookups and joins needed to keep applications that weren’t built for tokenization running. With tokenization from ALTR, you can bring sensitive data safely into the cloud to get full analytic value from it, while helping meet contractual security requirements or the steepest regulatory challenges.
  • 8. Complete data control and protection

ALTR simplifies and unifies data governance and security, giving anyone the ability to confidently store, share, analyze, and use their data. With ALTR, customers gain unparalleled visibility into how sensitive data is used across their organization. This intelligence can be used to create advanced policies to control data access. Through ALTR’s cloud platform, customers can implement data governance and security in minutes instead of months.

Get started for free at get.altr.com/free