Tokenization: What's Next After PCI?
Executive Summary
Until recently, the central theme for IT security has been: “Protect sensitive data wherever it resides.” With the growing adoption of tokenization solutions, primarily in the payment card industry (PCI), a second principle is gaining equally wide acceptance: “Remove sensitive data wherever it’s not required.”
This paper examines the factors that have driven rapid adoption of tokenization among retailers and other merchants, and it offers lessons from the PCI experience that can be applied to other industries and use cases. Most notably, tokenization has helped reduce business risk and ease the compliance burden for securing credit card data. Looking beyond PCI, the paper explores where the next big wave of tokenization is likely to occur: in key vertical industries that need to safeguard personally identifiable information (PII) and protected health information (PHI).
The First Wave of Tokenization Was All About Payment Card Data
If necessity is the mother of invention, PCI compliance is the mother of tokenization. First published in 2004, the Payment Card Industry Data Security Standard (PCI DSS) has imposed an enormous compliance burden on retailers, e-tailers, payment processors, and banks. It also affects any “merchant” that accepts credit cards as payment for goods and services, including businesses, schools, educational and healthcare institutions, and nonprofit organizations.
PCI DSS defines 12 major requirements and over 200 sub-requirements for protecting cardholder data. These must be applied across the entire Card Data Environment (CDE), meaning any system that accepts or stores payment card data plus any systems that access the data. Unfortunately, credit card numbers have long been used as a primary identifier for systems, applications and business processes that have no intrinsic need to access the number itself. (In many industries the same thing has happened with Social Security Numbers.) The staggering compliance burden this places on merchants becomes apparent in this description by Securosis of a typical retail environment:
“As the standard reference key, credit card numbers are stored in billing, order management, shipping, customer care, business intelligence, and even fraud detection systems. They are used to cross-reference data from third parties to gather intelligence on consumer buying trends. Large retail organizations typically store credit card data in every critical business processing system. It is incredibly expensive to audit network, platform, application, user, and data security across all these systems, and then to document usage and security policies sufficiently to demonstrate compliance with PCI-DSS.” [1]

[1] Tokenization vs. Encryption: Options for Compliance, Securosis, July 2011, page 3. https://securosis.com/research/publication/tokenization-vs.-encryption-options-for-compliance
The Greatest Risk Is In the Application Layer
According to a study conducted by the Verizon RISK team, 92% of all data breaches are the work of external agents, who target servers and applications most of the time. Drilling down further in the Verizon data, servers accounted for 80% of breaches and 95% of compromised records, with POS and web servers leading both metrics. [2] Consequently, any organization that wants to prevent data breaches or meet compliance requirements must protect sensitive data in the application layer, where the majority of threats reside. To date, encryption, along with strong key management, has been the preferred method of enforcing data protection in applications.

[2] 2011 Data Breach Investigations Report, Verizon, June 2011. http://www.verizonbusiness.com/resources/reports/rp_data-breach-investigations-report-2011_en_xg.pdf
However, tokenization has rapidly gained acceptance as an attractive alternative due to its compelling value proposition. The primary benefit of tokenization is that rather than trying to protect cardholder data that is widely dispersed across the environment, a tokenization solution removes it altogether from any systems and applications that don’t specifically require it. This is a major game changer: thieves can’t steal what isn’t there, and organizations don’t need to protect what they no longer store. The result is a dramatic reduction in security and compliance requirements and costs.
Tokenization offers another significant advantage over encryption. Encrypting data often requires system software and business applications to be recoded so they can handle the added length of an encrypted value. Tokenization, by contrast, can be deployed with only minor application changes. This means data removal can proceed at a faster pace, and far more cost-effectively, than encrypting the same data would allow.
How Tokenization Works
In a typical tokenization scenario, card data is encrypted at the point of capture and transmitted to a secure, central repository, which may be operated by the merchant or a third-party service provider. (See Figure 2.) The system provides the merchant with a randomly generated substitute value, called a token, which cannot be traced back to the original. Because the token retains the same length and format as the original number, it can be seamlessly passed between applications, databases and business processes without risk.
The encrypted credit card data is vaulted in a highly secure facility, with multiple layers of protection and appropriate redundancy for disaster recovery and business continuity purposes. Only applications that require the actual card number are authorized to access the vaulted data; this is the only point in the CDE where tokens and account numbers are correlated.
Like encryption, tokenization can be performed at the database layer, on the network, or at the application layer. Tokenizing or encrypting data at the point of capture, in the application layer, provides the best protection because data exposure is minimized.
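To make these mechanics concrete, the sketch below implements a toy token vault in Python. It is illustrative only, not RSA Data Protection Manager or any vendor’s API: it assumes tokens are random digit strings of the same length as the PAN, an in-memory dictionary stands in for the secure central repository, and authorization is reduced to a single flag.

```python
import secrets

class TokenVault:
    """Toy stand-in for the secure, central token repository."""

    def __init__(self):
        self._token_to_pan = {}  # the vault: token -> original card number
        self._pan_to_token = {}  # lets a repeated PAN map to the same token

    def tokenize(self, pan: str) -> str:
        """Return a random token with the same length and format as the PAN."""
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        while True:
            # Random digits, same length as the original: downstream systems
            # (billing, shipping, analytics) can store the token unchanged.
            token = "".join(secrets.choice("0123456789") for _ in pan)
            if token != pan and token not in self._token_to_pan:
                break
        self._token_to_pan[token] = pan
        self._pan_to_token[pan] = token
        return token

    def detokenize(self, token: str, authorized: bool) -> str:
        """Only applications that require the actual card number may call this."""
        if not authorized:
            raise PermissionError("application not authorized to access PANs")
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                                     # e.g. 2938475601827364
print(vault.detokenize(token, authorized=True))  # 4111111111111111
```

Because the token is drawn at random rather than derived mathematically from the PAN, it cannot be reversed without access to the vault’s mapping, which is exactly the property described above.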
Tokenization Reduces PCI Compliance Costs and Business Risk
One of the major benefits of tokenization is risk consolidation, according to Sam Curry, Chief Technology Officer of RSA’s Identity and Data Protection Division. “In essence, tokenization enables a merchant to consolidate sensitive data, and the related risk, from dozens or hundreds of systems, databases and networks to just a handful of points,” says Curry. “These points include the card processing infrastructure, primarily point-of-sale systems and the store network, and the secure vault. Companies can then focus security resources on safeguarding those high-risk points, making it easier to protect against intrusions.”
Figure 2: Tokenization as a Service

In the payment card world, tokenization can be implemented as an on-premise solution deployed by a merchant or as a third-party service offered by a payment processor, as illustrated here. In this scenario, card data is encrypted at the POS using public/private key encryption to ensure safe transmission. The data is decrypted at the processor switch so the transaction can be authorized, and a token is returned to the merchant along with the authorization. The card number is then encrypted and centrally vaulted for maximum protection.

[The diagram shows the merchant environment, the processor switch, the acquirer datacenter (transaction log, settlement data warehouse, analytics, anti-fraud) and the issuer, connected by the following numbered flow.]

1. Credit card is swiped at the merchant’s POS.
2. Primary account number, track data and expiration dates are encrypted using a public key in the POS device and sent to the acquirer.
3. The encrypted transaction is decrypted using the private key.
4. The card number is passed to the bank for authorization and to the RSA SafeProxy server for tokenization.
5. The authorization and token are returned to the merchant.
6. The token is stored in place of the card number in all places.
7. Adjustments, refunds, “card not present” transactions and settlement use the token in place of the card number.
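The sketch below compresses Figure 2’s round trip into a few functions. It is a simplified, assumption-laden illustration, not First Data’s or RSA’s actual service: it assumes Python’s third-party cryptography package for the public-key step, omits bank authorization and token-collision handling, and uses hypothetical names such as pos_encrypt and processor_switch.

```python
# Requires: pip install cryptography
import secrets
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The processor holds the private key; its public key is provisioned
# into merchants' POS devices.
processor_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pos_public_key = processor_key.public_key()

vault = {}  # stands in for the processor's centrally vaulted card data

def pos_encrypt(pan: str) -> bytes:
    # Step 2: the PAN is encrypted inside the POS device before transmission.
    return pos_public_key.encrypt(pan.encode(), OAEP)

def processor_switch(encrypted_pan: bytes) -> str:
    # Step 3: only the processor switch can decrypt the transaction.
    pan = processor_key.decrypt(encrypted_pan, OAEP).decode()
    # Step 4: bank authorization is omitted here; the card number is
    # vaulted under a random token (collision handling omitted).
    token = "".join(secrets.choice("0123456789") for _ in pan)
    vault[token] = pan
    # Step 5: the token is returned to the merchant with the authorization.
    return token

# Steps 1 and 6: card swiped at the POS; the merchant stores only the token.
token = processor_switch(pos_encrypt("4111111111111111"))
print("merchant stores:", token)
# Step 7: refunds and settlement reference the token, never the PAN.
```

Note the design consequence the paper draws: the PAN exists in the clear only inside the POS device and the processor, so everything in between, and everything the merchant stores, falls outside the high-risk zone.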
Once an organization identifies which applications and business processes don’t require use of the actual card number, tokenization can shrink the Card Data Environment significantly. In turn, this can greatly reduce PCI compliance scope and costs. For example, when a $5 billion global technology company outsourced its payment processing to a third party that tokenized cardholder data, the firm only had to comply with a few questions on the PCI Self-Assessment Questionnaire rather than the complete set of 200-plus questions. In turn, the company saved more than $3 million in PCI-related costs and months of internal development time. Similarly, an RSA customer in the government sector recently implemented tokenization with RSA Data Protection Manager and reduced its PCI scope and time spent on PCI compliance by 33%.
Although compliance has been the driving force behind tokenization, the result has been to enhance security and reduce business risk by greatly reducing the footprint of sensitive information across the enterprise. History shows that where only a small number of well-defended targets exist, criminals tend to move on to more vulnerable environments. Even if a data breach occurs and token values are stolen or exposed, the information is useless in perpetrating identity fraud and similar crimes.
Aberdeen Group, which has been tracking PCI DSS compliance efforts for several years, reported in 2010 that “the current use of tokenization is strongly correlated with Best-in-Class results” in protecting cardholder data via PCI DSS implementation. “The top performers were 2-times more likely than all others to indicate current use of a tokenization solution.... The average performance of tokenization users was even higher than that of the average Best-in-Class company in Aberdeen’s study as measured by the number of known incidents of data loss, data exposure or audit deficiencies within the last 2 years.” [3]

[3] Avoiding a Kick in the Head: The Value of Tokenization for Protecting Cardholder Data, Aberdeen Group, February 2010.
Industry Validation Speeds Adoption
Beyond these well-established benefits of tokenization, two other factors have contributed to widening adoption: industry validation and the emergence of third-party services. The principle of tokenizing credit card data was first demonstrated in 2005, but the technology did not gain traction for several years. In October 2009 tokenization got a big boost when Visa, a perennial leader in defining and enforcing best practices for card data security, published guidelines for encrypting card data and recommended the use of tokens to replace the primary account number (PAN) in payment-related business functions. Visa followed up in 2010 by publishing best practices for tokenization, stating that:
“Entities that properly implement and execute a tokenization process to support their payment functions may be able to reduce the scope, risks and costs associated with ongoing compliance with the Payment Card Industry Data Security Standards (PCI DSS).”
Confirming that guidance, the PCI Security Standards Council in August 2011 issued its own guidelines for developing, evaluating or implementing a tokenization solution, offering this advice:
“Storing tokens instead of PANs is one alternative that can help to reduce the amount of cardholder data in the environment, potentially reducing the merchant’s effort to implement PCI DSS requirements...”
“Tokenization solutions do not eliminate the need to maintain and validate PCI DSS compliance, but they may simplify a merchant’s validation efforts by reducing the number of system components for which PCI DSS requirements apply.”
Secure Payment Services Shift the Risk to Providers
In the earliest days of tokenization, there were two basic deployment models for tokenization solutions: merchants could build a homegrown system, or they could buy, deploy and operate a vendor solution such as RSA Data Protection Manager. A third option emerged when RSA partnered with First Data, the largest payment processor in the payment card industry, to create a secure payment solution that offered both encryption and tokenization of cardholder data as a hosted service.

This new model offered two compelling benefits. First, it freed merchants from the significant complexity and cost of building and maintaining an on-premise payment processing infrastructure. Second, by removing cardholder data from the enterprise environment and vaulting it in a vendor’s secure repository, an outsourced solution shifted much of the risks and burdens of PCI compliance to trusted third parties with proven capabilities for securing card data. Merchants’ security obligations didn’t vanish completely; for instance, they’re still responsible for securing the in-house payment processing environment. However, the bulk of their PCI DSS scope is transferred to service providers.
Interest was immediate and enthusiastic, especially among Level 3 and Level 4 merchants who, due to their smaller size, typically lack the resources to implement and maintain a payment processing infrastructure on their own. Because First Data’s embrace of tokenization conferred instant legitimacy, these merchants jumped at the chance to gain the cost, compliance and ease-of-deployment benefits of a hosted offering from a leading provider.
The First Data model of tokenization is not for everyone. Many Tier 1 and Tier 2 retailers tend to work with multiple payment processors and thus need an on-premise solution that is vendor agnostic. Even so, a number of larger merchants opted for a hosted solution. Within three months of availability, First Data had more than 100,000 merchants using the service, a number that has since more than doubled.
Finding the Industry Sweet Spot
Because payment processors’ core business places them directly in the stream of
payment transactions—and because they already had the infrastructure in place to
handle billions of transactions annually—they were the most logical place to implement
tokenization on a large scale and as an add-on to their existing services. By recognizing
that fact early on, First Data gained a first-mover advantage over competitors.
RSA believes that similar opportunities will emerge in other industries as thought leaders
leverage existing infrastructure to profitably deliver tokenization services, both to
targeted markets and to a general business audience.
The Next Wave of Tokenization Will Be About PII, PHI and the Cloud
An RSA customer in the government sector recently implemented tokenization with RSA Data Protection Manager and reduced its PCI scope and time spent on PCI compliance by 33%.
The payment card industry has been a giant proving ground for tokenization. The external
pressures of PCI compliance created an urgent market need and shaped how the
technology has evolved and matured. Because PCI compliance initiatives have been
pervasive in businesses, government and institutions—which all accept credit cards as
payment for goods, services and fees—PCI projects have also created a small army of
IT and security professionals across all sectors who know firsthand how effective
tokenization is at removing high-value data from the environment.
That group is starting to explore how to leverage and extend its investments in tokenization to safeguard other types of structured data, such as Social Security Numbers, motor vehicle license numbers, and banking and investment account numbers. The ultimate goal is to secure the transactions and communications that rely on that data, including healthcare and employment records, financial transactions, and government records related to voting, taxes, and the criminal justice system.
This next wave of adoption is already under way and will touch multiple industries—led
by the insurance, healthcare and hospitality industries—and focus on a broader range
of personally identifiable information (PII) and protected health information (PHI).
Tokenizing SSNs in the Insurance Industry
RSA is currently seeing significant activity in the insurance industry where, for decades,
Social Security Numbers were used as the primary identifier for policyholders. SSNs
became so deeply embedded in business processes and application infrastructures that
IT systems could not function without them, even if there was no other reason to retain
such sensitive data.
With the advent of data disclosure laws and other regulations protecting PII, insurers
have strong reason to remove SSN data from the network environment. A number of
insurers that RSA works with are doing just that by tokenizing SSNs that previously were
stored as clear text in protected databases. One of the largest U.S. insurers has already deployed tokenization within its service-oriented architecture (SOA) to support PCI compliance. Now, with RSA’s assistance, it is rolling tokenization out to other parts of the business to protect PII, starting with a large-scale project to tokenize hundreds of millions of SSNs that are accessed by hundreds of applications across the organization. A standard component of the insurer’s business process evaluation has been to determine where SSNs are used simply for lookup purposes and can therefore be replaced by tokens with no impact.
The insurer is leveraging a centralized, web services model for distributing tokens to
requesting applications. The centralized infrastructure is more efficient to operate and
easier to defend than a distributed model, and it provides a platform for tokenizing other
types of PII, such as enrollees’ policy numbers or health record numbers, if the company
decides to do so.
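To make the web-services model concrete, the sketch below shows what a requesting application’s interaction with such a centralized token service might look like. The endpoint URL, paths, and payload fields are illustrative assumptions, not the insurer’s or RSA’s actual interfaces.

```python
# Hypothetical client for a centralized tokenization web service.
# All endpoint names and fields are assumptions for illustration.
import requests

TOKEN_SERVICE = "https://tokens.example.internal/v1"  # assumed internal service

def tokenize_ssn(ssn: str) -> str:
    """Exchange an SSN for a token; the clear value lives only in the vault."""
    resp = requests.post(f"{TOKEN_SERVICE}/tokenize",
                         json={"type": "ssn", "value": ssn},
                         timeout=5)
    resp.raise_for_status()
    return resp.json()["token"]

def detokenize(token: str) -> str:
    """Retrieve the original value; only vetted applications should be authorized."""
    resp = requests.post(f"{TOKEN_SERVICE}/detokenize",
                         json={"token": token},
                         timeout=5)
    resp.raise_for_status()
    return resp.json()["value"]
```

Applications that use SSNs purely as lookup keys can store and query by the token itself and never need to call detokenize at all, which is what makes the no-impact replacement described above possible.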
Tokenizing PHI in the Health Care World
Health care is another area where tokenization is being explored as an alternative to
encryption or inaction. As of March 2011, more than 8 million people in the U.S. had
been affected by breaches of protected health information (PHI),4 with medical and
healthcare groups accounting for 16 percent of all identity records exposed nationwide.
Stolen healthcare records contain identity and financial data that can be used to commit
financial fraud. Information on the patient’s health status, insurer, and medical providers
allows an imposter to pose as the patient in order to obtain “free” health care. Security executives are not only concerned about intrusions by external hackers; insider threats from negligent or malicious employees, partners, and contractors, as well as process breakdowns, are also major causes of data breaches.
4 As reported by the U.S. Health and Human Services Office for Civil Rights, which is responsible for privacy and security enforcement under HIPAA and certain provisions of the Health Information Technology for Economic and Clinical Health (HITECH) Act.
When a $5 billion global technology company outsourced its payment processing to a third party that tokenized cardholder data, the firm only had to comply with a few questions on the PCI Self-Assessment Questionnaire rather than the complete set of 200-plus questions. In turn, the company saved more than $3 million in PCI-related costs and months of internal development time.
Addressing these kinds of concerns, the Health Information Technology for Economic and
Clinical Health (HITECH) Act of 2009 increased privacy and security requirements
introduced under HIPAA.5 For example, breaches involving unencrypted PHI now require affected individuals to be notified within 60 days of discovery, and incidents involving
more than 500 records must be reported to the Department of Health and Human
Services and publicly posted. In contrast, if breached data is encrypted or tokenized,
notification requirements do not apply because the data is considered unreadable.
Avoiding the notification requirement spares organizations the financial costs and brand damage associated with public disclosure of a breach.
The data security challenges
In health care, the data security challenges are far more complex than in the payment
card industry. Under current laws, medical providers, insurers, and other stakeholders are
accountable for safeguarding PHI, defined as any information about health status,
provision of health care, or payment that can be linked to a particular individual. HIPAA
specifies 18 categories of identifiers, but an actual electronic health record may include
hundreds of data points relating to medical conditions, medications, provider certificate
or license numbers, medical device identifiers and serial numbers, and so forth.
Securosis points out a particularly thorny problem in securing PHI:
“Many different groups need access to different subsets (or all) of [a patient’s health record]: doctors, hospitals, insurance providers, drug companies, clinics, health maintenance organizations, state and federal governments, and so on. And each audience needs a different slice of the data — but must not see the rest of the data.”6
Early experiences with tokenization
Some firms have applied tokenization to health records in a limited way, typically using a
single token to represent an individual’s name, address and SSN while other data in the
health record is stored in the clear. In the long run, this is likely to prove insufficient since
it has been demonstrated that a patient’s identity may be deduced by correlating
as few as two or three key identifiers that have been left unprotected.
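The difference between the two approaches can be illustrated with a small sketch. The tokenize() helper below is a hypothetical stand-in for a call to a token vault, and the record fields are invented for illustration.

```python
import secrets

def tokenize(value: str) -> str:
    # Hypothetical stand-in for a vault call; a real service would persist the
    # mapping so authorized systems can recover the original value.
    return "tok_" + secrets.token_hex(8)

record = {
    "name": "Jane Doe",
    "address": "100 Main St, Springfield",
    "ssn": "000-12-3456",
    "provider_npi": "1234567890",   # provider license/certificate number
    "diagnosis": "E11.9",           # clinical data, not an identifier
}

# Limited approach: one token covers name, address, and SSN, but remaining
# identifiers stay in the clear and can still be correlated.
limited = {k: v for k, v in record.items() if k not in ("name", "address", "ssn")}
limited["patient_token"] = tokenize(
    record["name"] + record["address"] + record["ssn"])

# Per-field approach: each identifier is tokenized independently, leaving no
# residual combination of clear-text fields that re-identifies the patient.
identifiers = {"name", "address", "ssn", "provider_npi"}
per_field = {k: (tokenize(v) if k in identifiers else v)
             for k, v in record.items()}
```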
Martin Sizemore, an enterprise architect with the IT management consulting firm Perficient,
has argued that tokenization of health records, implemented within an SOA environment,
is the right technology to address the exchange of PHI over public and private networks.
“The big question is how to implement the tokenization of protected healthcare information? The short answer is make it a ‘service’ in a service-oriented architecture that talks to a tokenization server.... The tokenization server would contain the 18 or more key protected items and their corresponding tokens. The service would retrieve the protected information temporarily for healthcare applications and updates, but would prevent local storage of the information to maintain control.”7
Whether this or another model emerges as the leading deployment scenario for
tokenization of PHI, RSA expects to see a good deal of activity and progress in the next
two to four years.
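Under Sizemore’s description, a minimal server-side sketch might look like the following. The class and method names are assumptions for illustration; a production service would add authentication, authorization, auditing, and a hardened vault.

```python
import secrets

class PhiTokenServer:
    """Minimal sketch of a tokenization server that holds protected items
    (the 18 HIPAA identifier categories) and their corresponding tokens."""

    def __init__(self) -> None:
        self._vault: dict[str, str] = {}  # token -> protected value

    def tokenize(self, value: str) -> str:
        # Issue an opaque token for a protected item and vault the original.
        token = "phi_" + secrets.token_hex(12)
        self._vault[token] = value
        return token

    def retrieve(self, token: str) -> str:
        # Return the clear value for transient use by an authorized healthcare
        # application; callers are expected not to store it locally.
        return self._vault[token]
```

The key design point in the quote is the last one: applications receive protected values only transiently and are prevented from persisting them, so control over the data remains with the tokenization server.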
5 Health Insurance Portability and Accountability Act
6 Tokenization vs. Encryption: Options for Compliance, Securosis, July 2011, page 7
7 Is Tokenization the Solution for Protected Healthcare Information (PHI)?, Martin Sizemore, Perficient blog post, February 2011 [http://blogs.perficient.com/healthcare/blog/2011/02/22/is-tokenization-the-solution-for-protected-healthcare-information-phi/]
Protecting the Hospitality Sector
RSA is also seeing increased interest in tokenization among customers in the hospitality
industry. This may be in response to a recent rise in smaller attacks on companies in
the hospitality and retail sectors. Verizon’s 2011 data breach report notes that such
organizations represent smaller, softer, and less reactive targets than, for instance, financial
institutions, and speculates that criminals may be deciding to “play it safe” in light of recent
arrests and prosecutions following large-scale intrusions in financial services.8
PCI-compliant Cloud Computing
With several years of experience under its belt, the payment card industry will continue to
be a bellwether for other industries, pushing the boundaries of what tokenization can do.
One promising area is PCI-compliant cloud computing.
For the most part, cloud architectures have proven to be impractical for applications
using payment card data. The high investment needed to build a PCI DSS-compliant
environment in the cloud exceeds the cloud’s efficient resource allocation benefits in
most cases. Consequently, merchants have been limited in the types of business
processes and applications they can move to the cloud, because many nonpayment-related systems use card numbers as lookup values.
Tokenization promises to open up new business models for PCI-compliant cloud
computing, enabling merchants to shift IT-based services to the cloud by allowing
applications that previously used PANs (and were thus governed by PCI DSS) to use
tokens instead. The ability to take many business processes and IT systems out of PCI
scope enables merchants to better leverage the cloud to achieve significant advantages
in IT efficiency, cost and flexibility.
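As a hedged illustration of that descoping effect, the hypothetical cloud application below keys customer purchase history on a token rather than a PAN. Because no cardholder data ever enters the system, it can run in the cloud without falling under PCI DSS.

```python
# Hypothetical non-payment application (e.g., loyalty or analytics) that
# previously used the PAN as a lookup value and now uses the token instead.
purchase_history: dict[str, list[float]] = {}

def record_purchase(card_token: str, amount: float) -> None:
    # Only the opaque token is stored; the application never sees a PAN.
    purchase_history.setdefault(card_token, []).append(amount)

def lifetime_value(card_token: str) -> float:
    return sum(purchase_history.get(card_token, []))

record_purchase("tok_4f9c21ab77d0e356", 42.50)
print(lifetime_value("tok_4f9c21ab77d0e356"))  # 42.5
```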
The RSA Approach to Tokenization
Given its proven effectiveness, tokenization should be an important component of any
layered strategy for protecting high-value, structured data from end to end. With this
principle in mind, RSA incorporated comprehensive tokenization functionality into the RSA
Data Protection Manager platform, combining tokenization with industry-leading application
encryption, data-at-rest encryption, and comprehensive lifecycle key management.
Figure 1: Protecting Credit Card Data (RSA Data Protection Manager)
Encryption and tokenization are complementary technologies that use different mechanisms to protect structured data—in this case a credit card number—and are suitable for different scenarios. With tokenization, the original card number is replaced with a substitute value created by a random number generator that preserves the 16-digit data format. This allows applications to handle tokens without any special coding. A token can optionally include an element of the original value for identification purposes, for example the first 4 digits of the credit card number, as shown here.
Original value: 6011-2548-5246-7563
Encryption: KJaSa^)(#EHLghrS$Lja(*gfbe$%634Hdc
Tokenization: 6011-2548-6325-5564
The ability to offer “hybrid” deployments of both encryption and tokenization is a core differentiator for RSA Data Protection Manager.
8 2011 Data Breach Investigations Report, Verizon, June 2011, page 12
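As a rough sketch of the token generation Figure 1 describes (and only a sketch: commercial products add vault-backed collision checks and other safeguards), a format-preserving token might be produced as follows, assuming the first four digits are retained for identification.

```python
import secrets

def make_token(pan: str, keep_prefix: int = 4) -> str:
    """Generate a random, format-preserving token for a 16-digit card number,
    optionally retaining the leading digits of the original value."""
    digits = pan.replace("-", "")
    prefix = digits[:keep_prefix]
    random_part = "".join(secrets.choice("0123456789")
                          for _ in range(16 - keep_prefix))
    token = prefix + random_part
    # Re-apply the original grouping so existing applications handle the
    # token without any special coding.
    return "-".join(token[i:i + 4] for i in range(0, 16, 4))

print(make_token("6011-2548-5246-7563"))  # e.g. 6011-xxxx-xxxx-xxxx
```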