The document discusses data breach protection from a DB2 perspective. It provides an overview of data breach legislation and compliance issues. It discusses examples of recent data breaches and resources for tracking breaches. It also covers the significant costs associated with data breaches for organizations. The document recommends several best practices for protecting data, including data masking, database security and encryption, data access auditing, database archiving, and metadata management.
Data breach protection from a DB2 perspective
1. Session: H08
Data Breach Protection
From a DB2 Perspective
Craig S. Mullins
NEON Enterprise Software, Inc.
May 20, 2008 • 04:00 p.m. – 05:00 p.m.
Platform: Multi-platform
2. Objectives
• Understand the various laws that have been enacted to combat
data breaches and the trends toward increasing legislation
• Learn how to calculate the cost of a data breach based on
industry best practices and research from leading analysts
• Gain knowledge of several best practices for managing data with
the goal of protecting the data from surreptitious or nefarious
access (and/or modification)
• Learn about techniques for securing, encrypting, and masking
data to minimize exposure of critical data
• Uncover new data best practices for auditing access to database
data and for protecting data stored for long-term retention
3. Agenda
• Objectives
• Data Breach Overview
• Legislation and Compliance Issues
• Examples and Resources
• The Cost of a Data Breach
• Best Practices for Data Protection
• Data Masking
• Database Security & Encryption
• Data Access Auditing
• Database Archiving
• Metadata Management
• Synopsis
5. Legislation & Compliance
The Gramm-Leach-Bliley Act (GLB) is a federal law enacted
in the United States to control the ways that financial
institutions deal with the private information of individuals. The
Act regulates the collection and disclosure of private financial
information; stipulates safeguards requiring security programs;
and prohibits the practice of pretexting.
HIPAA (Health Insurance Portability and Accountability Act)
creates national standards to protect individuals' medical
records & personal health information. The Privacy Rule
provides that, in general, a covered entity may not use or
disclose an individual’s healthcare information without
permission except for treatment, payment, or healthcare
operations.
6. Legislation & Compliance
Basel II is an international banking regulation the goal of which is to
produce uniformity in the way banks and banking regulators
approach risk management across national borders.
The Sarbanes-Oxley Act (SOX) establishes standards for all U.S.
public company boards, management, and public accounting firms.
The Public Company Accounting Oversight Board (PCAOB) was
created by SOX to oversee auditors of public companies. The
primary goals of SOX were to strengthen and restore public
confidence in corporate accountability and to improve executive
responsibility.
The California Security Breach Notification Law (CA SB 1386)
requires companies to notify California customers if PII maintained in
computerized data files has been compromised by unauthorized
access.
8. Impact of The Personal Data
Privacy and Security Act
• Defines regulatory requirements for data brokers, defined as any
company that is "collecting, transmitting, or otherwise providing
personally identifiable information" (PII) of 5,000 or more people that
are not customers or employees.
• New penalties for database intrusions.
• Fines and 10 years in prison for trespassing in a "data broker's" system;
• Five years in prison for "willfully" concealing certain types of breaches.
• Mandates a "comprehensive personal data privacy and security
program" for most businesses and individuals acting as sole
proprietors (similar to what the Gramm-Leach-Bliley Act required).
• Requires notification if a computer security breach "impacts more
than 10,000 individuals."
• Requires review of federal sentencing guidelines for misuses of PII,
and provides grants to states for enforcement of ID fraud crimes.
• Creates additional "privacy impact assessments" when a federal
agency relies on a commercial database consisting "primarily" of
information on U.S. citizens.
9. Other Regulations & Issues?
• And, there are more regulations to consider, for example:
• the USA Patriot Act
• CAN-SPAM Act of 2003
• Telecommunications Act of 1996
• The Data Quality Act
• Federal Information Security Mgmt Act
• Different regulations to contend with based upon your industry,
location, etc.
• And new regulations will continue to be written by government
and imposed over time.
• These regulations have brought to light big problems
• More on this later…
11. Regulatory Compliance is International
Country: Examples of Regulations
Australia: Commonwealth Government’s Information Exchange Steering Committee, Evidence Act 1995, more than 80 acts governing retention requirements
Brazil: Electronic Government Programme, EU GMP Directive 1/356/EEC-9
Canada: Bill 198, Competition Act
France: Model Requirements for the Management of Electronic Records, EU Directive 95/46/EC
Germany: Federal Data Protection Act, Model Requirements for the Management of Electronic Records, EU Directive 95/46/EC
Japan: Personal Data Protection Bill, J-SOX
Switzerland: Swiss Code of Obligations articles 957 and 962
United Kingdom: Data Protection Act, Civil Evidence Act 1995, Police and Criminal Evidence Act 1984, Employment Practices Data Protection Code, Combined Code on Corporate Governance 2003
17. Privacy Rights Clearinghouse
• The Chronology of Data Breaches is a very useful
web resource for tracking just how pervasive this
problem is:
• http://www.privacyrights.org/ar/ChronDataBreaches.htm
18. Total Number of Records Breached
• According to the Privacy Rights
Clearinghouse, the total number of
records containing sensitive
personal information involved in
security breaches in the U.S. is:
218,621,856
As of February 29, 2008
19. How Prevalent is this Problem?
• 68% of companies are losing sensitive data or
having it stolen out from under them six times a
year
• An additional 20%
are losing sensitive
data 22 times or
more per year.
Sources: eWeek, March 3, 2007
IT Policy Compliance Group
http://www.eweek.com/c/a/Desktops-and-Notebooks/Report-Some-Companies-Lose-Data-Six-Times-a-Year/
20. What is the Payoff?
How Stolen Data is Used
• Once breached, there
are many potential
“uses” (and misuses) of
personally identifiable
information (PII).
21. The Cost of a Data Breach
• According to Forrester Research, the
average cost per record is between
$90 and $305
• According to the Ponemon Institute,
the average cost per lost customer
record is $182
• Average cost per incident: $5 million
• Or use the Data Loss Calculator, free on the web
at http://www.tech-404.com/calculator.html
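The per-record figures quoted above lend themselves to a quick back-of-the-envelope estimate. The sketch below is purely illustrative: the constants come from the Forrester and Ponemon numbers on this slide, but the simple records-times-rate model and the function names are my own, not any analyst's actual methodology.

```python
# Rough breach-cost estimator based on the per-record figures quoted above.
# The constants come from the Forrester and Ponemon numbers on this slide;
# the multiplication model itself is illustrative only.

FORRESTER_LOW = 90      # USD per record (Forrester lower bound)
FORRESTER_HIGH = 305    # USD per record (Forrester upper bound)
PONEMON_AVG = 182       # USD per lost customer record (Ponemon, 2006)

def breach_cost_range(records: int) -> tuple[int, int]:
    """Return the (low, high) estimated cost for a breach of `records` records."""
    return records * FORRESTER_LOW, records * FORRESTER_HIGH

def breach_cost_average(records: int) -> int:
    """Point estimate using the Ponemon per-record average."""
    return records * PONEMON_AVG

low, high = breach_cost_range(163_000)   # e.g. the ChoicePoint breach size
print(f"Estimated cost: ${low:,} - ${high:,}")
print(f"Ponemon point estimate: ${breach_cost_average(163_000):,}")
```

Even the low end of the range for a ChoicePoint-sized breach runs into the tens of millions of dollars, which is consistent with the $5 million average per-incident figure cited above.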
22. Data Breach Cost Breakdown
• Tangible costs
• Lost employee productivity
• Stock price
• Lost customers
• Studies have shown that most customers
would take their business elsewhere
if they received two or more security
breach notices.
• Regulatory fines
23. A Costly Example: ChoicePoint
• February 2005: Bogus accounts established by ID thieves. The
initial number of affected records was estimated at 145,000; later
revised to 163,000.
• January 2006: The Federal Trade Commission assessed a $10
million civil penalty against the company in January 2006 for
violations of the Fair Credit Reporting Act.
• May 2007: ChoicePoint reached an agreement with the attorneys
general in 43 states and DC. It promised to make changes in the
way it screens and authenticates new customers. The company
also agreed to pay a total of $500,000 to the states to cover legal
fees and costs.
• January 2008: ChoicePoint agreed to pay $10 million to settle the
last remaining class-action lawsuit filed against the company in
connection with a data breach disclosed in early 2005 in which the
personal information of more than 160,000 people was exposed.
24. So What’s Next?
Compliance Drives Security Initiatives
• Compliance requirements, such as Sarbanes-
Oxley, HIPAA, CA SB 1386, and the Gramm-
Leach-Bliley Act (GLBA), are driving interest for all
enterprises to take strong security measures to
protect private data. These include initiatives
around data-at-rest encryption, granular auditing,
intrusion detection and prevention, and end-to-end
security measures.
Source: Forrester Research (Trends 2006: Database Management Systems)
25. Security Best Practices
For Preventing Data Breaches
• Data Masking
• Database Security & Encryption
• Data Access Auditing
• Database Archiving
• Metadata Management
27. The Problem
• Data is required to test application designs.
• Data (or reports) may also need to be sent to other individuals
or organizations on an ongoing basis for various purposes.
• Some of the data is sensitive and should not be
accessible by programmers:
• PII such as SSN, salary, credit card details, etc.
• Referential integrity must be maintained in test,
even as data values are changed.
28. The Solution
• Data masking is the process of protecting
sensitive information in non-production databases
from inappropriate visibility.
• Valid production data is replaced with usable,
referentially intact, but fictitious and invalid data.
• After masking, the test database is usable just like
production; but the information content is secure.
• May need to create and populate test data from
scratch, as well as sanitize existing information.
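The key requirement above, that masked values stay referentially intact, can be sketched in a few lines. This is an illustrative example only, not any masking product's actual algorithm; the class name and SSN format rule are hypothetical. The essential idea is that each real value maps to one consistent fake value, so joins between tables still work after masking.

```python
# Illustrative data-masking sketch (not a specific product's algorithm).
# Each sensitive value is replaced by a consistent fake value, so a masked
# SSN appears identically everywhere it occurs, preserving referential
# integrity across tables while hiding the real data.
import random

class Masker:
    def __init__(self, seed: int = 42):
        self._rng = random.Random(seed)      # seeded for repeatable test runs
        self._map: dict[str, str] = {}       # real SSN -> fake SSN

    def mask_ssn(self, ssn: str) -> str:
        if ssn not in self._map:
            # Fake SSNs start with 900-999, so they never collide with real ones
            fake = (f"{self._rng.randint(900, 999)}-"
                    f"{self._rng.randint(10, 99)}-"
                    f"{self._rng.randint(1000, 9999)}")
            self._map[ssn] = fake
        return self._map[ssn]

masker = Masker()
employees = [{"ssn": "289-46-8832", "name": "Craig S. Mullins"}]
payroll   = [{"ssn": "289-46-8832", "salary": 95000}]

masked_emps = [{**row, "ssn": masker.mask_ssn(row["ssn"])} for row in employees]
masked_pay  = [{**row, "ssn": masker.mask_ssn(row["ssn"])} for row in payroll]

# The join key still matches after masking, but the real SSN is gone.
assert masked_emps[0]["ssn"] == masked_pay[0]["ssn"]
assert masked_emps[0]["ssn"] != "289-46-8832"
```

A production tool would of course cover many more data types (names, addresses, card numbers) and persist the mapping so that repeated test refreshes stay consistent.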
29. Data Masking in action
• Masking must replace names, credit card and telephone numbers,
e-mail addresses, and postal addresses (including street names, zip
codes, and postal codes) with invalid values, substitute false
company names, and so on.
• For example: 191-64-9180 becomes 275-99-0194, and Craig S. Mullins
of Sugar Land, TX 77478 becomes John Q. Public of Pittsburgh, PA 15230.
33. Database Security in a Nutshell
• Authentication
• Who is it?
• Authorization
• Who can do it?
• Encryption
• Who can see it?
• Audit
• Who did it?
34. Data Encryption Considerations
• California's SB 1386 protects personally identifiable
information; its notification requirement does not apply if the
breached data was encrypted, since encrypted data is nearly
impossible to read without the key.
• Types of encryption
• At Rest
• In Transit
• Source: “Encryption May Help Regulatory Compliance,”
Edmund X. DeJesus, SearchSecurity.com
• Issues
• Performance
• Encrypting and decrypting data consumes CPU
• Access paths
• Applications may need to be changed
• See next slide for DB2 V8 encryption functions
35. DB2 V8: Encryption / Decryption
• Encryption: to encrypt the data for a column
ENCRYPT_TDES(string, password, hint)
• ENCRYPT_TDES [can use ENCRYPT() as a synonym]
• Triple DES cipher block chaining (CBC) encryption algorithm
• Not the same algorithm used by DB2 on other platforms
• 128-bit secret key derived from password using MD5 hash
INSERT INTO EMP (SSN)
VALUES(ENCRYPT('289-46-8832','TARZAN','? AND JANE'));
• Decryption: to decrypt the encrypted data for a column
DECRYPT_BIT(), DECRYPT_CHAR(), DECRYPT_DB()
• Can only decrypt expressions encrypted using ENCRYPT_TDES
• Can have a different password for each row if needed
• Without the password, there is no way to decrypt
SELECT DECRYPT_BIT(SSN,'TARZAN') AS SSN FROM EMP;
36. DB2 9 for z/OS: Encryption in Transit
• DB2 9 supports SSL by implementing z/OS Communications
Server IP Application Transparent Transport Layer Security
(AT-TLS)
• AT-TLS performs transport layer security on behalf of DB2 for
z/OS by invoking the z/OS system SSL in the TCP layer of
the TCP/IP stack
• When acting as a requester, DB2 for z/OS can request a
connection using the secure port of another DB2 subsystem
• When acting as a server, and from within a trusted context,
SSL encryption can be required for the connection
38. Database Security Issues
• Most organizations have access control policies…
but how do you enforce them?
• Privileged users have unfettered (and often anonymous)
access
• Transaction logs don’t provide sufficient visibility (e.g., read
operations)
• Trace logs impose high overhead & require changes to
database schemas
• Web-facing applications expose corporate data in
new ways
• SQL injection, vulnerabilities, non-current patches, scripting,
etc.
39. Levels of Database Auditing
• An audit is an evaluation of an organization, system,
process, project or product.
• Database Control Auditing
• Who has the authority to…
• Database Object Auditing
• DCL: GRANT, REVOKE
• DDL: CREATE, DROP
• Data Access Auditing
• INSERT, UPDATE, DELETE
• SELECT
41. How to Audit Database Access?
1. DBMS traces
2. Log based
3. Network sniffing
4. Capture requests at the server
42. Why Is Server-based Capture Important?
• Audit within the DBMS (traces)
• Must start performance trace
• DDL changes required to audit tables
• Overhead as trace records are written by DB2
• Audit over the network
• Capture SQL requests as they are sent over the network
• What about non-network requests?
• CICS w/DB2, IMS w/DB2, SPUFI
• Audit from the database transaction log files
• Modifications are on the log anyway, so…
• What about reads? Non-logged utilities?
43. Auditing Has to be at the Server Level
• Requires a software tap to “capture” relevant SQL at the
DBMS/server level to review for auditing.
• If you are not capturing all pertinent access requests at the
server level, nefarious users can sneak in and not be caught.
45. More Data, Stored for Longer
Durations
Data Retention Issues:
• Volume of data (125% CAGR)
• Length of retention requirement
• Varied types of data
• Security issues
[Chart: the amount of data requiring compliance protection grows with the length of the retention requirement, from 0 to 30+ years]
46. Retention Regulations Drive
Database Archiving Needs
Data Retention Requirements refer to the length of time you need
to keep data.
• Determined by laws (regulatory compliance): over 150 state and
federal laws dramatically increase retention periods for corporate data
• Determined by business needs:
• Reduce operational costs: large volumes of data interfere
with operations (performance, backup/recovery, etc.)
• Isolate content from changes: protect archived data from
modification
48. Solution: Database Archiving
Database Archiving: The process of removing selected data records
from operational databases that are not expected to be referenced
again and storing them in an archive data store where they can be
retrieved if needed.
[Diagram: archive policies drive data extract from operational databases, followed by a purge; metadata is captured, designed, and maintained; the archive data store holds archive data, metadata, and history, and supports data recall, archive query and access, and archive administration]
49. Needs for Database Archiving
• Policy-based archiving: logical selection
• Keep data for very long periods of time
• Store very large amounts of data in the archive
• Maintain archives for ever-changing operational systems
• Become independent from applications/DBMS/systems
• Become independent from operational metadata
• Protect the authenticity of the data
• Access data directly in the archive when needed
• Discard data after the retention period
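The extract, store, and purge flow behind policy-based archiving can be sketched with SQLite standing in for both the operational database and the archive data store. This is a minimal illustration under assumed names: the `orders` table, the `order_year` column, and the cutoff policy are all hypothetical, and real archiving products additionally carry metadata, enforce immutability, and handle schema evolution.

```python
# Minimal policy-based archiving sketch. SQLite stands in for an
# operational database and a separate archive store; table and column
# names are hypothetical.
import sqlite3

operational = sqlite3.connect(":memory:")
archive = sqlite3.connect(":memory:")

operational.execute("CREATE TABLE orders (id INTEGER, order_year INTEGER)")
operational.executemany("INSERT INTO orders VALUES (?, ?)",
                        [(1, 1999), (2, 2001), (3, 2007)])
archive.execute("CREATE TABLE orders_archive (id INTEGER, order_year INTEGER)")

CUTOFF_YEAR = 2003   # policy: archive orders older than this

# Step 1: extract the rows that satisfy the archive policy
rows = operational.execute(
    "SELECT id, order_year FROM orders WHERE order_year < ?",
    (CUTOFF_YEAR,)).fetchall()

# Step 2: store them in the archive data store
archive.executemany("INSERT INTO orders_archive VALUES (?, ?)", rows)

# Step 3: purge them from the operational database
operational.execute("DELETE FROM orders WHERE order_year < ?", (CUTOFF_YEAR,))

print(len(rows), "rows archived")
```

After the purge, the archived rows are simply no longer in the operational database, which is exactly why archiving narrows the exposure of a breach against that database.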
50. Why Archiving Can Help to Prevent
Data Breaches
• Data is removed from the database
• If the database is breached, the archived data is
not, because it is no longer there
• Data cannot be accessed by SYSADM
• …or any other database users
• Data is protected against modification
• Archived data cannot be changed
52. Compliance Requires Metadata
• As data volume expands and more regulations hit the
books, metadata will increase in importance
• Metadata: data about the data
• Metadata characterizes data. It is used to provide
documentation such that data can be understood and
more readily consumed by your organization.
Metadata answers the who, what, when, where,
why, and how questions for users of the data.
• Data without metadata is meaningless
• Consider: 27, 010110, JAN
53. Data Categorization
• Data categorization is critical
• Metadata is required to place the data into
proper categories for determining which
regulations apply
• Financial data SOX
• Health care data HIPAA
• Etc.
• Some data will apply to multiple regulations
• Who does this now at your company? Anyone?
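The category-to-regulation mapping described above is easy to express as a metadata lookup. The sketch below uses only the examples from this slide (financial data falls under SOX, health care data under HIPAA); the category names, the GLBA and SB 1386 entries, and the function name are illustrative assumptions, and a real implementation would be far more granular.

```python
# Sketch of metadata-driven data categorization: map each data category
# to the regulations that govern it. Mappings follow the slide's examples
# and are illustrative, not exhaustive.
CATEGORY_REGULATIONS = {
    "financial": ["SOX", "GLBA"],
    "healthcare": ["HIPAA"],
    "personal": ["CA SB 1386"],
}

def applicable_regulations(categories: list[str]) -> set[str]:
    """A data element may fall into several categories, and therefore
    under several regulations at once."""
    regs: set[str] = set()
    for category in categories:
        regs.update(CATEGORY_REGULATIONS.get(category, []))
    return regs

# A customer's payment record is both financial and personal data:
print(sorted(applicable_regulations(["financial", "personal"])))
```

The point of the exercise is the union: because some data applies to multiple regulations, categorization has to be computed per data element, not per regulation.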
55. Synopsis
• Regulations are requiring more stringent protection
of data
• Data breaches occur frequently and are costly
• Implementation of best practices can
mitigate the occurrence of data breaches
56. Session H08
Data Breach
Protection From a
DB2 Perspective
Craig S. Mullins
NEON Enterprise Software, Inc.
craig.mullins@neonesoft.com
www.CraigSMullins.com
Anyone who has been paying attention lately knows at least something about the large number of data breaches that have been in the news. Data breaches and the threat of lost or stolen data will continue to plague organizations until comprehensive plans are enacted to combat them. Although many of these breaches have not been at the database level, some have, and more will be unless better data protection policies and procedures are enacted on operational databases. This presentation provides an overview of the problem, with examples of data breaches, their associated costs, and a series of best practices for protecting your valuable production data. It will help you avoid having your company's name splashed on the front page because you did not adequately protect your databases.
A data breach is an unauthorized disclosure of information that compromises the security, confidentiality, or integrity of personally identifiable information. In other words, it is when your personal information is allowed to be stolen or accessed in unauthorized ways.
GLB regulates financial institutions and the manner in which financial data is collected, managed, and used. Pretexting is accessing private information under false pretenses. HIPAA regulates the privacy of health care information.
Basel II regulates the banking industry to improve risk and asset management. The goal of the regulation is to avoid financial disasters by imposing minimum capital requirements, requiring supervisory review, and promoting market discipline (to achieve greater stability in the financial system). SOX is the big one. The goal of SOX is to use the full authority of the government to expose corruption, punish wrongdoers, and defend the rights & interests of American workers & investors. Section 404 is the largest driver of SOX projects. It is the most important section for IT because the processes and internal controls are implemented primarily in IT systems, and much of the data is stored in a DBMS. CA SB 1386 has resulted in many disclosures of breached data that otherwise would probably never have come to light. This law was a pioneer in helping to expose this big problem.
The Personal Data Privacy and Security Act, pending in the US Senate, is a proposed response from the federal government to all of the high-profile data breaches in the news. The bill, at 91 pages, is aggressive and regulation-laden. "Reforms like these are long overdue," Sen. Patrick Leahy, a Vermont Democrat, said in a floor speech. "This issue and our legislation deserve to become a key part of this year’s domestic agenda so that we can achieve some positive changes in areas that affect the everyday lives of Americans." The text of this act can be read at http://i.i.com.com/cnwk.1d/pdf/ne/2005/Specter-Leahy.pdf?tag=st.nl We’ll look at the highlights on the next slide.
There are many additional regulations, and more continue to be written and passed.

The CAN-SPAM Act (Controlling the Assault of Non-Solicited Pornography and Marketing) took effect on January 1, 2004. It allows courts to set damages of up to $2 million when spammers break the law, and federal district courts may send spammers to jail and/or triple the damages if the violation is found to be willful. The Act requires that businesses: clearly label commercial e-mail as advertising; use a truthful and relevant subject line; use a legitimate return e-mail address; provide a valid physical address; provide a working opt-out option; and process opt-out requests within ten business days.

The Telecommunications Act of 1996, enacted by the U.S. Congress on February 1, 1996, and signed into law by President Bill Clinton on February 8, 1996, made major changes to laws affecting cable TV, telecommunications, and the Internet. Its main purpose was to stimulate competition in telecommunication services. The law specifies how local telephone carriers can compete, how and under what circumstances local exchange carriers (LECs) can provide long-distance services, and the deregulation of cable TV rates.

The Federal Information Security Management Act (FISMA) basically states that federal agencies, contractors, and any entity that supports them must maintain security commensurate with potential risk. Part of the E-Government Act, it was passed in 2002.

The Data Quality Act was written by a lobbyist and slipped into a giant appropriations bill in 2000 without congressional discussion or debate. It basically consists of two sentences directing the OMB to ensure that all information disseminated by the federal government is reliable.
Bananas.com was caught off guard last year. The musical instrument sales site suffered a data breach that was followed swiftly by a double whammy of consequences. Roughly 250 customer records were exposed, likely after an individual stole an administrative password by accessing systems remotely. (Site owner Bananas at Large has since put additional security procedures in place to prevent a recurrence.) After the breach, the 25-person company scrambled to comply with the many state laws requiring customer notification. It alerted only the affected customers, either by mail or e-mail. Because its own resources were limited, Bananas referred victims to large credit-reporting agencies to monitor for subsequent financial damage from the breach. Despite its efforts, Bananas apparently failed to meet all the various state notification requirements and was subsequently slammed with fines and fees by major credit companies. “They did not specifically provide a reason for the fees other than saying that we had not met all of the terms in our agreements with them,” says Bananas President J.D. Sharp. “They’ll fine the pants off you,” he adds.
Regulatory compliance is not just a USA “thing,” but is international. In fact, Europe was ahead of the USA in terms of passing data privacy-related regulations.
It can be a daunting task to keep track of all of the regulations that may apply to your business, and therefore your data. Fortunately, there is a helpful resource provided by the IT Compliance Institute: the Universal Compliance Project, or UCP. The UCP tracks all of the regulations and records which sections require what type of actions. This information is recorded in a series of spreadsheets on various topics, such as the one for Records Management shown here. The information is freely available to download in Adobe PDF format, or you can subscribe (for a fee) and get the information in spreadsheet format with regular updates. Use the link at the top of this slide to access the UCP information.
OK, we’ve learned about data breaches and looked at some of the regulations, but let’s turn now to look at some examples to see just how prevalent this problem currently is.
A senior DBA at Certegy, a subsidiary of Fidelity National Information Services Inc., who was responsible for defining and enforcing data access rights at the company, stole data belonging to about 2.3 million consumers and sold it to a data broker. The broker in turn sold a subset of the data to other marketing companies. The stolen data included names, addresses, birth dates, bank account and credit card information. The DBA, who has been terminated, agreed to pay restitution, cooperate with investigators, and forfeit over $105,000 of illicit profits. But this story is not over yet. Florida's attorney general is investigating the theft and resale of Certegy's consumer data. A criminal investigation by the U.S. attorney in Tampa also is continuing.
In filings with the U.S. Securities and Exchange Commission yesterday, the company said 45.6 million credit and debit card numbers were stolen from one of its systems over a period of more than 18 months by an unknown number of intruders. That number eclipses the 40 million records compromised in the mid-2005 breach at CardSystems Solutions and makes the TJX compromise the worst ever involving the loss of personal data. In addition, personal data provided in connection with the return of merchandise without receipts by about 451,000 individuals in 2003 was also stolen. The company is in the process of contacting individuals affected by the breach, TJX said in its filings. -------------------------- TJX has said that in the 12 months since the breach was disclosed, it has spent or set aside about $250 million in breach-related costs. That includes the costs associated with fixing the security flaws that led to the breach, as well as dealing with all of the claims, lawsuits and fines that followed the breach.
Sears Holdings is facing a class-action lawsuit after making the purchase history of its customers public on its Managemyhome.com Web site. The lawsuit seeks damages as well as an accounting by Sears to determine whether the Web site was misused by criminals. It was filed on Friday by New Jersey resident Christine Desantis, who is represented by KamberEdelson, a technology law firm. KamberEdelson is best known for its recent settlement with social networking site Facebook over its sending of unwanted text messages to recycled cell-phone numbers. "It's a pretty simple case," said Jay Edelson, a partner with the Chicago-based law firm. "Sears decided to put private information of its customers up on the Web site and make it publicly available. They did it without telling their customers that it was going to happen ... and they really did it for their own financial reasons." Manage My Home is a community portal where Sears shoppers can download product manuals, find product tips, and get home renovation ideas. The Web site had a feature called "Find your products" that ostensibly was designed to help users look up past purchases. Last Thursday, researchers at security vendor CA pointed out that the feature could be used to look up the purchase history of any Sears customer, an apparent violation of the company's privacy policy. Manage My Home could easily have been misused by criminals, Edelson said. For example, a robber could gain access to a victim's home by posing as a Sears repair person, using the information available on the site. That could be incredibly scary, he said. "They have a duty to keep that information away from the public."
The Privacy Rights Clearinghouse (PRC) is a nonprofit consumer organization with a two-part mission -- consumer information and consumer advocacy. It was established in 1992 and is based in San Diego, California. It is primarily grant-supported and serves individuals nationwide. One of the projects managed by the Privacy Rights Clearinghouse is the Chronology of Data Breaches (http://www.privacyrights.org/ar/ChronDataBreaches.htm), which documents the data breaches which have been reported. The total number of records reported documents the personal information compromised including data elements useful to identity thieves, such as Social Security numbers, account numbers, and driver's license numbers. Some breaches that do NOT expose such sensitive information are included in the chronology but not in the total number of records involved.
As of February 29, 2008 the total number of sensitive records breached since 2005 is: 218,621,856
TJX’s massive data loss is just the tip of the iceberg. Almost seven out of 10 companies (68 percent) are losing sensitive data or having it stolen out from under them six times a year, according to new research from the IT Policy Compliance Group. An additional 20 percent are losing sensitive data a whopping 22 times or more per year. The ITPCG is a security and compliance policy industry group that counts among its members the Institute of Internal Auditors, the Computer Security Institute, and Symantec.
Why would thieves want to steal data? Generally, to enrich themselves by fraudulent means. According to statistics provided by the Federal Trade Commission (FTC) in early 2006, credit card fraud and identity theft top the list of reasons for theft of data, but there are many “popular” reasons to steal PII. The same FTC statistics show a total of 93,938 identity theft complaints recorded in calendar year 2005, an increase over 2004, when there were 78,815 complaints.
Forrester Research recently conducted a survey of companies that had experienced a data breach (Calculating The Cost Of A Security Breach, Apr 10, 2007). The study, as reported in Information Week, concludes that the average security breach can cost a company between $90 and $305 per lost record, although coming up with an accurate figure is difficult because of the additional, extenuating circumstances surrounding data breaches. An additional data point comes from the Ponemon Institute's 2006 study on the cost of a data breach, which showed that the average cost per lost customer record is $182. Or you can use the data loss calculator, which will automatically generate an average cost, and a plus/minus 20% range, for expenses associated with internal investigation, notification/crisis management, and regulatory/compliance if the incident were to give rise to a class-action claim.
As you attempt to put a price tag on potential breaches of your data, keep these issues in mind: Tangible costs – attorney fees, cost of notification (certified mail), consultant fees, etc. Lost employee productivity – what could your employees be doing other than tracking down lost data and responding to problems created because of the breach Stock price – if the breach is public enough, and problematic enough, it could erode confidence in the company and impact the price of your stock Lost customers – if your data is breached often enough customers will lose confidence and stop doing business with your company Regulatory fines – in some cases regulatory bodies and government agencies will impose fines due to lack of compliance
A data breach of 163,000 records cost ChoicePoint over $20 million.
So, given the regulations, along with the prevalence and cost of data breaches, what can we, as DB2 professionals, do? Improve database security!
The first area that can help protect your organization from a data breach is data masking.
URL for the sourced quote above: http://searchsecurity.techtarget.com/originalContent/0,289142,sid14_gci991508,00.html
DB2 V8 offers functions that allow you to encrypt and decrypt data at the column level. Because you can specify a different password for every row that you insert, you can effectively encrypt data at the “cell” level in your tables. If you use these functions to encrypt your data, be sure to put a mechanism in place to manage the passwords used to encrypt the data: without the password, there is absolutely no way to decrypt the data. To help you remember the password, you have the option to specify a hint at the time you encrypt the data.

The SQL example in the middle of the screen shows an INSERT that encrypts the SSN (Social Security number) using a password and a hint. To retrieve the row, you must then use the DECRYPT function, supplying the correct password; this is shown in the SELECT statement at the very bottom of the slide. If you fail to supply a password, or supply the wrong password, the data is returned in an encrypted, unreadable format. The result of encrypting data using the ENCRYPT function is VARCHAR FOR BIT DATA.

When encrypting data, keep the following in mind. The encryption algorithm is an internal algorithm: for those who care to know, it uses Triple DES cipher block chaining (CBC) with padding, and the 128-bit secret key is derived from the password using an MD5 hash. When defining columns to contain encrypted data, the DBA must be involved because the data storage required is significantly different: the length of the column has to include the length of the non-encrypted data + 24 bytes + the number of bytes to the next 8-byte boundary + 32 bytes for the hint.

The decryption functions (DECRYPT_BIT, DECRYPT_CHAR, and DECRYPT_DB) return the decrypted value of the data. You must supply the encrypted column and the password required for decryption. The decryption functions can only decrypt values that were encrypted using the DB2 ENCRYPT function.
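The column-sizing rule above can be expressed as a small helper. This is a sketch of the formula exactly as stated in the text (plaintext length padded to the next 8-byte boundary, plus 24 bytes of overhead, plus 32 bytes when a hint is stored); consult the DB2 SQL Reference for the authoritative rule, particularly for lengths that are already exact multiples of 8, which this sketch leaves unpadded.

```python
def encrypted_varchar_length(data_len: int, with_hint: bool = False) -> int:
    """Estimate the VARCHAR FOR BIT DATA length needed to hold the
    output of DB2's ENCRYPT function for `data_len` bytes of plaintext."""
    padded = ((data_len + 7) // 8) * 8   # round up to an 8-byte boundary
    total = padded + 24                  # fixed encryption overhead
    if with_hint:
        total += 32                      # room for the password hint
    return total

# A 9-character SSN encrypted with a hint:
print(encrypted_varchar_length(9, with_hint=True))  # 72
```

The point for DBAs: a 9-byte column roughly octuples in size once encrypted with a hint, so encrypted columns must be sized up front, not retrofitted.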
DB2 9 for z/OS improves support for encryption of data in transit. DB2 9 supports the Secure Sockets Layer (SSL) protocol by exploiting the z/OS Communications Server IP Application Transparent Transport Layer Security (AT-TLS) function. The z/OS V1R7 Communications Server for TCP/IP introduces the AT-TLS function in the TCP/IP stack for applications that require secure TCP/IP connections. AT-TLS performs transport layer security on behalf of the application, in this case DB2 for z/OS, by invoking z/OS System SSL in the TCP layer of the TCP/IP stack. z/OS System SSL provides support for the TLS V1.0, SSL V3.0, and SSL V2.0 protocols. SSL encryption has been available on z/OS for a long time, but DB2 9 for z/OS now makes use of this facility and offers SSL encryption using a new secure port. When acting as a requester, DB2 for z/OS can request a connection using the secure port of another DB2 subsystem. When acting as a server, and from within a trusted context, SSL encryption can be required for the connection.
Auditing tools should not only capture and present audit data; they should also alert you to suspicious activities and quickly identify the records needed for an audit.
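To make the alerting requirement concrete, here is a toy Python filter over generic audit records. The record fields, table names, and alerting policy are all invented for illustration; a real auditing product works from DB2 trace data and applies far richer policies.

```python
from dataclasses import dataclass

@dataclass
class AuditRecord:
    user: str
    action: str   # e.g. "SELECT", "UPDATE", "GRANT"
    table: str

# Hypothetical policy: flag write or privilege actions on sensitive tables.
SENSITIVE_TABLES = {"CUSTOMER", "PAYROLL"}
ALERT_ACTIONS = {"INSERT", "UPDATE", "DELETE", "GRANT"}

def alerts(records):
    """Return the audit records that should raise an alert."""
    return [r for r in records
            if r.table in SENSITIVE_TABLES and r.action in ALERT_ACTIONS]

log = [AuditRecord("alice", "SELECT", "CUSTOMER"),
       AuditRecord("bob", "UPDATE", "PAYROLL"),
       AuditRecord("carol", "GRANT", "CUSTOMER")]
print([r.user for r in alerts(log)])  # ['bob', 'carol']
```

The value of a real tool lies in doing this continuously, at trace volume, and surfacing the flagged records without the DBA having to write the filter.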
1. DDL = Data Definition Language, AKA “schema changes” – manipulates database structure: CREATE, DROP, and ALTER objects such as tables
2. DML = Data Manipulation Language – manipulates and retrieves data: INSERT, UPDATE, DELETE, and SELECT
3. DCL = Data Control Language – controls access: GRANT and REVOKE privileges such as CONNECT to the database or schema; SELECT, INSERT, UPDATE, and DELETE on records; and USAGE of a database object such as a schema or a function
The audit trace doesn’t record everything:
- Auditing takes place only when the audit trace is on.
- The trace does not record old data after it is changed (the log records old data).
- If an agent or transaction accesses a table more than once in a single unit of recovery, the audit trace records only the first access.
- The audit trace does not record accesses if you do not start the audit trace for the appropriate class of events.
- The audit trace does not audit some utilities. The trace audits the first access of a table with the LOAD utility, but it does not audit access by the COPY, RECOVER, and REPAIR utilities. The audit trace does not audit access by stand-alone utilities, such as DSN1CHKR and DSN1PRNT.
- The trace audits only the tables that you specifically choose to audit.
- You cannot audit access to auxiliary tables.
- You cannot audit the catalog tables because you cannot create or alter catalog tables.
Organizations are processing and storing more and more data every year, at an average yearly growth rate of 125%.
- Large volumes of data interfere with operations (the more data in the operational database, the slower all processes may run).
And regulations (as well as business practices) dictate that data, once stored, be retained for longer periods of time.
- No longer months or years, but in some cases multiple decades.
More varied types of data are being stored in databases: not just (structured) characters, numbers, dates, and times, but also (unstructured) large text, images, video, and more.
- Unstructured data greatly expands storage needs.
The retained data must be protected from modification – it must represent the authentic business transactions at the time the business was conducted.
- Need for better protection from modification
- Need for isolation of content from changes
Taking a closer look at the requirements for data retention, the first thing to keep in mind is that laws NEVER say you have to archive data, just that you retain data. So if you can keep it in the operational database, OK. But if not, you have to archive it. Data Archiving is a process used to move data from the operational database to another data store to be kept for the duration of the retention period when it is unacceptable to keep the data in the operational database for that long. So, why might it be unacceptable? - large volumes of data interfering with operations (the more data in the operational database, the slower all processes may run) - need for better protection from modification - need for isolation of content from changes
Here we have some examples of regulations that impact data retention. This is just a sampling of the more than 150 different regulations (at the local, state, national, and international levels) that impact data retention. Regulations Drive Retention and Discovery. Source: www.domains.cio.com/symantec/wp/ebs_ediscovery_ev_wp.pdf
Database Archiving is part of a larger topic, namely Data Archiving. There are many types of data that need to be archived to fulfill regulatory, legal, and business requirements. Perhaps the biggest driver for archiving has been e-mail. At any rate, each type of data has different archival processing requirements due to its form and nature: what works to archive e-mail is not sufficient for archiving database data, and so on.

This diagram depicts the necessary components of a database archiving solution. Starting with the databases, down the left side is the extract portion, and up the right side is the data recall portion. Several sites have been very vocal about needing recall, but I think it is a dubious requirement: if you can query from the archive, recalling data into the operational database is not really a necessity, is it? And what if the operational database changes (e.g., Oracle to SAP)?

The whole process requires metadata to operate. You must capture, validate, and enhance the metadata to drive the archive process: you need to know the structure of the operational database and the structure of the archive. There is also the metadata about data retention for the archive. This policy-based metadata must be maintained and monitored against the archive, for example, to determine when data must be discarded.

And, as you can see off to the bottom left, we also need to be able to query the archived data. This will not necessarily be the most efficient access because of differences in the metadata over time, but for queries against archived data, performance is not a paramount concern. Finally, we will also have ongoing maintenance of the archive: security, access audit, administration of the structures (REORG), backup/recovery, etc.
OK, so let’s take a look at the actual functionality needed to support database archiving. This slide outlines the various features and functions needed. Additional summary points:
- Keeping data in operational systems is a bad idea; so is putting data in UNLOAD or backup files, putting data in a parallel reference database, and/or using a DBMS to store the archive.
- Database archiving requires a great deal of data design: establishing and maintaining metadata, designing how data looks in the archive, and achieving application independence.
- Database archives must be continuously managed: copying data for storage problems (e.g., media rot), copying data for system changes, and copying data for data-encoding standard changes.
- Logging, auditing, and monitoring must cover archive events, partition management, and accesses.
Archiving data from operational databases serves to protect that data from breaches that may befall a typical production system. Because the data is no longer in the database it cannot be accessed like typical database data, nor can it be accessed by privileged users, such as DB2 SYSADMs. Only those authorized to access the archive can access the data. And, of course, the archived data is protected against change, so the data cannot be changed.
Data must be categorized to ensure that it is treated appropriately for regulatory compliance. The who, what, where, when, and why for each piece of data is what determines how that data is governed: Audit, Protection, Retention, etc. So, metadata increases in importance!
Database security tasks are left mostly to DBAs, who rarely have the time or training to do them. Another reason that database security is lacking in many enterprises is … (the) "big disconnect" among DBAs: They know a lot about data, but their security knowledge is lacking. In response to the gap between the time enterprise DBAs have to devote to database security and an enterprise's database security needs, some companies have begun to take a more proactive approach, pulling DBAs out of their regular workgroups and inserting them within an IT security team. "We believe that the database security administrator role is going to evolve," (Noel) Yuhanna (of Forrester Research) said. "In some organizations, it already exists, but they're called database security professionals, database administrators or database specialists." This arrangement solves two dilemmas: IT security professionals who lack substantial database knowledge have people on hand to fill this gap, and DBAs receive the intense security focus and training needed to keep enterprise databases safe.