The document discusses the key requirements for managing enterprise data, including obtaining executive sponsorship, identifying stakeholders, classifying data, and understanding metadata. It outlines considerations for data classification like business needs, availability, and confidentiality. Regulatory compliance, security, and proper data destruction are also addressed as significant risk factors organizations must manage. The presentation provides an overview of the critical aspects of establishing an effective data management program.
SANS Tech Paper: Hardware vs. Software Encryption (harshadthakar)
This document compares software-based disk encryption and hardware-based disk encryption using Seagate Secure. It discusses barriers to adoption of encryption, how software-based encryption works by using the CPU for encryption/decryption, and how hardware-based encryption moves this functionality into the hard disk drive. A hands-on evaluation of software-based encryption and Seagate Secure found that hardware-based encryption had significantly better performance since it offloads encryption/decryption from the CPU.
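The CPU cost that hardware-based encryption offloads can be illustrated with a small timing sketch. This is a toy: the XOR transform below merely stands in for a real block cipher such as AES, and all names are illustrative.

```python
# Toy illustration of why software disk encryption costs CPU time: every
# byte must be transformed by the host processor, whereas hardware FDE
# does this work on the drive's own controller. The XOR "cipher" below is
# NOT secure; it only stands in for a real cipher such as AES.
import time

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """Software 'encryption' pass: touches every byte on the host CPU."""
    klen = len(key)
    return bytes(b ^ key[i % klen] for i, b in enumerate(data))

payload = b"x" * 1_000_000          # 1 MB of data headed "to disk"
key = b"secret-key-material"

start = time.perf_counter()
ciphertext = xor_encrypt(payload, key)
elapsed = time.perf_counter() - start

# Round trip: XOR with the same key restores the plaintext.
assert xor_encrypt(ciphertext, key) == payload
print(f"Encrypted 1 MB in software in {elapsed * 1000:.1f} ms of CPU time")
```

With hardware-based encryption this entire loop would run on the drive, leaving the host CPU free, which is the performance difference the evaluation observed.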
The impact of regulatory compliance on DBA (latest) (Craig Mullins)
The document discusses how increasing regulatory compliance is impacting database administration. It outlines several key regulations and how they influence data quality, long-term data retention, database security, auditing, and controls over database administration procedures. Compliance is driving the need for improved data management practices to ensure data is properly protected, retained, and accessible over time. Failure to comply can result in significant fines or prosecution.
This document discusses cybersecurity and provides guidance on developing a cybersecurity plan. It recommends taking four key steps: 1) understanding common cybersecurity issues, 2) evaluating organizational risks, 3) protecting the organization through measures like data encryption and training, and 4) developing an incident response plan to react to data breaches. The document then covers various components of a cybersecurity plan, including conducting a data inventory, assessing risks, and implementing technical, policy, and training controls.
The trends for data incidents continue to point upward, and 2013 is becoming a pacesetter. The shifting regulatory landscape promises to add further complications for companies struggling to prepare for and respond to data privacy incidents.
This webinar will feature two leading data breach experts who have performed a two-year trend analysis across hundreds of cases to offer a powerful, up-to-date perspective on what has happened and their predictions for the future. It will also cover how these factors are shaping regulations, which are in turn influencing decision-making in the C-suite.
Our featured speakers for this timely webinar will be:
-Bill Hardin, Director of Data Privacy Response & Investigations, Navigant
-Jennifer Coughlin, Privacy and Data Security Attorney, Nelson, Levine
-Gant Redmon, Esq. General Counsel and VP of Business Development, Co3 Systems
"We're all in this together" - educating users on the importance of cyber sec... (Jisc)
The document discusses educating university users on the importance of cyber security. It describes how the university initially struggled with frequent ransomware attacks until implementing mandatory cyber security training for staff and establishing an information security group. Regular security updates were sent to staff to raise awareness in a non-technical way. This helped reduce incidents and improved staff understanding of secure practices like not using personal passwords for work accounts. Educating users was found to be important for improving the overall security of the university's IT networks and data.
Encryption and Key Management: Ensuring Compliance, Privacy, and Minimizing t... (IBM Security)
Encryption and Key Management: Ensuring Compliance, Privacy, and Minimizing the Impact of a Breach
Encryption has been viewed as the ultimate way to protect sensitive data for compliance. But it has also been considered very complex to implement. Today, encryption is essential to meet compliance objectives, and has become much simpler to implement. The challenge is knowing when and where to use encryption, how it can simplify compliance, what controls need to be in place, and the options for good encryption key management. This session will cover the options for encryption and key management, what each provides, and their requirements. Encryption and key management topics include application-level encryption for data in use, network encryption of data in motion, and storage encryption for data at rest.
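The key-management options such a session describes usually rest on one pattern, envelope encryption: a master key-encryption key (KEK) wraps per-record data-encryption keys (DEKs). A minimal sketch, assuming a toy placeholder cipher rather than a real one such as AES-GCM; all names here are illustrative, not from any particular product's API.

```python
# Minimal sketch of envelope encryption, the pattern behind most key
# management schemes for data at rest: each object gets its own data
# encryption key (DEK), and only the small DEK is wrapped by a master
# key encryption key (KEK). The XOR routine is a placeholder transform,
# not a real cipher.
import secrets

def toy_cipher(data: bytes, key: bytes) -> bytes:
    # Placeholder; symmetric, so the same call encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

kek = secrets.token_bytes(32)                 # master key, ideally in a KMS/HSM

def encrypt_record(plaintext: bytes) -> dict:
    dek = secrets.token_bytes(32)             # fresh key per record
    return {
        "ciphertext": toy_cipher(plaintext, dek),
        "wrapped_dek": toy_cipher(dek, kek),  # only the key is wrapped
    }

def decrypt_record(record: dict) -> bytes:
    dek = toy_cipher(record["wrapped_dek"], kek)
    return toy_cipher(record["ciphertext"], dek)

rec = encrypt_record(b"cardholder data")
assert decrypt_record(rec) == b"cardholder data"
```

The design payoff: because bulk data is encrypted under DEKs, rotating the master key means re-wrapping a handful of small keys, not re-encrypting everything at rest.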
The document discusses data leakage prevention and demystifies DLP solutions. It begins with examples of major data breaches to illustrate the business case for DLP. It then covers key considerations for building a DLP program such as defining policies, selecting vendors, and addressing implementation challenges like user resistance and integration. The presentation concludes with recommendations for measuring the effectiveness of a DLP program over time through metrics like the reduction of incidents and policy violations.
Database Auditing Essentials... or... Who did what to which data when and how?
The combination of increasing government regulation and the need for securing corporate data has driven up the need to track who is accessing data in our corporate databases. This presentation discusses these drivers as well as presenting the requirements for auditing data access in corporate databases.
The goal of this presentation is to review the regulations impacting the need to audit, and then to discuss in detail the kinds of things that may need to be audited, along with the several ways of accomplishing this.
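One common way to capture "who did what to which data when" is an audit table populated by triggers. A minimal sketch using SQLite; the schema and trigger are illustrative, and production DBMSs offer richer native auditing (capturing the acting user, client, and statement, which SQLite triggers cannot see).

```python
# Trigger-based auditing sketch: every UPDATE on the audited table writes
# a before/after row to an audit log, with a timestamp.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
    CREATE TABLE audit_log (
        ts      TEXT DEFAULT CURRENT_TIMESTAMP,
        action  TEXT,
        row_id  INTEGER,
        old_val REAL,
        new_val REAL
    );
    CREATE TRIGGER audit_update AFTER UPDATE ON accounts
    BEGIN
        INSERT INTO audit_log (action, row_id, old_val, new_val)
        VALUES ('UPDATE', OLD.id, OLD.balance, NEW.balance);
    END;
""")
conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
conn.execute("UPDATE accounts SET balance = 250.0 WHERE id = 1")

rows = conn.execute(
    "SELECT action, row_id, old_val, new_val FROM audit_log").fetchall()
print(rows)   # [('UPDATE', 1, 100.0, 250.0)]
```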
2011 Hildebrandt Institute CIO Forum data privacy and security presentation... (David Cunningham)
The document discusses a presentation on leveraging IT in times of fiscal restraint to support evolving law firm business models, with specific focus on data privacy and security risk management and competitive advantage. Speakers include CISOs and IT risk managers from law firms who cover topics like data regulations, examples of regulated data, information security roles, ISO 27001 certification, audits, components of information security programs, service provider management, and contractual controls. The presentation then ends with a question and answer session.
Some basic security controls you can (and should) implement in your web apps. Specifically, this covers:
1 - Beyond SQL injection
2 - Cross-site Scripting
3 - Access Control
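The three controls above can be sketched with the standard library alone; real applications would normally lean on framework features (ORMs, template auto-escaping, authorization middleware), so treat these as minimal illustrations.

```python
# Minimal sketches of the three web-app controls listed above.
import html
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# 1. SQL injection: bind user input as a parameter, never by string concat.
user_input = "alice' OR '1'='1"
row = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)).fetchone()
assert row is None            # the malicious string matches no real user

# 2. Cross-site scripting: escape untrusted data before it reaches HTML.
comment = '<script>alert("xss")</script>'
safe = html.escape(comment)
assert "<script>" not in safe

# 3. Access control: check authorization server-side on every request,
#    denying by default.
def can_delete_user(actor_role: str) -> bool:
    return actor_role == "admin"

assert can_delete_user("admin") and not can_delete_user("guest")
```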
Big Data: Beyond the Hype - Why Big Data Matters to You (DATAVERSITY)
This document discusses big data and its importance. It notes that big data is more prevalent than many realize, with most companies and industries now dealing with large volumes of various types of data. It also explains that effectively managing big data provides competitive advantages, with data-savvy companies experiencing much stronger growth rates. Additionally, the document introduces DataStax Enterprise as a solution for easily and effectively managing big data at scale through its support for Apache Cassandra, analytics capabilities, visualization tools, and enterprise services.
The document summarizes key statistics about data loss incidents in 2013, including that over 2,000 incidents exposed over 800 million records. It outlines the typical stages companies go through after an incident and laws requiring preparation and response. The document provides a self-assessment for companies and best practices around security, forensics, communications, and international considerations for responding to a data breach. It emphasizes that companies should plan for an incident as regulatory requirements and costs can be significant for unprepared organizations.
Better to Ask Permission? Best Practices for Privacy and Security (Eric Kavanagh)
Hot Technologies with The Bloor Group and IDERA
If security was once a nice-to-have, those days are long gone. Between data breaches and privacy regulations, organizations today face immense pressure to protect their systems and their sensitive data. When giants like Yahoo! and Target can get hacked, so can any other company. What can you do about it? How can you protect your company and clients?
Register for this episode of Hot Technologies to hear Analysts Eric Kavanagh and Dr. Robin Bloor provide insights about the many ways that companies can buttress their defenses and stay ahead of the bad guys. They'll be briefed by Vicky Harp of IDERA who will demonstrate how to identify vulnerabilities, track sensitive data, successfully pass audits, and protect your SQL Server databases.
apsec 7 Golden Rules Data Leakage Prevention / DLP (andreasschuster)
The document outlines seven golden rules for data leakage prevention:
1. Accept that there is a risk of data breaches.
2. Provide endpoint security by identifying sensitive data and protecting it at its origin.
3. Take security into your own hands through centralized policy management and access controls.
4. Make security easy to reduce human errors through invisible encryption and easy administration.
5. Have emergency precautions like encryption key recovery to ensure data availability.
6. Prioritize security using the 80/20 rule to find an acceptable risk level.
7. Understand that security costs money but it is worth it to prevent data loss.
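Rule 2 above begins with identifying sensitive data at its origin. A pattern-based scan is a common first step; the regexes below are simplified illustrations, not a production DLP ruleset.

```python
# Toy sensitive-data scanner: report which categories of regulated data
# appear in a piece of text. Real DLP products add validation (e.g. Luhn
# checks for card numbers), context rules, and fingerprinting.
import re

PATTERNS = {
    "ssn":         re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
    "email":       re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan(text: str) -> dict:
    """Return the sensitive-data categories found, with their matches."""
    return {name: pat.findall(text) for name, pat in PATTERNS.items()
            if pat.search(text)}

sample = "Ship to jane@example.com, SSN 123-45-6789."
print(scan(sample))
```

Running such a scan at the endpoint, before data leaves its origin, is what lets the later rules (policy, encryption) attach protection to the right files.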
This document is a presentation about how CIOs and CSOs are becoming mission-critical business partners. The presentation covers how information is the lifeblood of organizations and how events involving data loss are rising. It discusses moving to an information-centric security approach and developing critical partnerships across organizations. The presentation emphasizes that security is not about checking boxes for compliance, but rather focusing on behavior change through education and building relationships.
Navigant is an expert services firm that assists clients with disputes, investigations, and critical business risks. They help organizations address data breaches and theft of trade secrets through a comprehensive process. This includes investigating security incidents, analyzing compromised data and systems to understand the full scope, and helping clients resolve issues by advising on notifications, regulatory responses, damages, and security improvements to prevent future incidents. Navigant leverages forensic and analytical expertise to swiftly determine the methods, extent, and implications of data breaches and trade secret theft.
What is in store for e-discovery in 2015? (Logikcull.com)
The document discusses four major trends in e-discovery for 2015:
1) Increased focus on litigation readiness and document preservation plans to defend against sanctions for e-discovery violations.
2) Growing use of data analytics and visualization tools to improve e-discovery efficiency and reduce costs.
3) Challenges of addressing foreign privacy laws and cross-border data transfers.
4) Difficulties preserving, collecting and processing data from mobile devices.
Defining a Legal Strategy ... The Value in Early Case Assessment (Aubrey Owens)
Early Case Assessment provides the framework for litigators to identify and analyze electronically stored information in response to a litigation hold and/or discovery request.
Information Security vs. Data Governance vs. Data Protection: What Is the Rea... (PECB)
This webinar will provide more information on the importance of information security, how you can take security well beyond compliance, an approach to building strong information security, privacy, and data governance programs, and the importance of strong data governance in relation to privacy and information security requirements.
The webinar covers:
• Information Security
• Importance Of Information Security Today
• Taking Information Security Beyond A Compliance-First Approach
• Importance Of Data Governance In Information Security
• Privacy
• Changing And Evolving Privacy Requirements
• Importance Of Data Governance In Privacy
• Data Governance And Data Privacy
• Data Privacy - Data Processing Principles
Presenters:
Moji is a Senior Business Process Analyst working with Gemalto (Thales), a leading firm in the IT industry. Moji has over fifteen years of experience leading projects that improve processes, create and implement processes that increase revenue, and eliminate redundancies.
She has a zeal for adding value and increasing revenue for organizations. Moji is very passionate about data privacy and its application in business and consumer rights.
Hardeep Mehrotara has 20+ years of senior leadership experience in Information Technology and Cyber Security, building security programs from the ground up for public and private organizations. He has been featured on Canadian television as a cyber expert and has advised various communities on implementing cybersecurity strategy, best practices, and controls. He has co-authored numerous leading industry security control frameworks, technical benchmarks, and industry best practice standards.
-------------------------------------------------------------------------------
Find out more about ISO training and certification services
Training: https://pecb.com/whitepaper/iso-27001-information-technology--security-techniques-information-security--management-systems---requirements
https://pecb.com/en/education-and-certification-for-individuals/iso-iec-27701
Webinars: https://pecb.com/webinars
Articles: https://pecb.com/article
Whitepapers: https://pecb.com/whitepaper
-------------------------------------------------------------------------------
For more information about PECB:
Website: https://pecb.com/
LinkedIn: https://www.linkedin.com/company/pecb/
Facebook: https://www.facebook.com/PECBInternational/
Slideshare: http://www.slideshare.net/PECBCERTIFICATION
YouTube video: https://youtu.be/aQcS5-RFIEY
This document discusses the challenges organizations face with litigation and electronic discovery. It notes that litigation can have widespread impact on employees, legal teams, IT, and operations. Challenges include a lack of processes for legal holds, unclear roles and responsibilities, and inability to effectively search large amounts of electronic data from various sources. The document then outlines implications like high costs of discovery without preparation. It proposes solutions such as assessing readiness, establishing a litigation response team, implementing an electronic discovery plan, classifying data, and providing training and records management.
The IPAS Project aims to enhance data security practices across Penn State through two phases. Phase I focused on PCI compliance to allow processing of credit cards. Phase II focuses on security and privacy initiatives for all institutional information, including data classification, scanning computers for personal information, and encrypting laptops. Ag IT is helping to guide the College of Agricultural Sciences through this ongoing process of implementation and security standardization across the university network and systems over several months and years.
Exploring Data Privacy - SQL Saturday Louisville 2011 (John Magnabosco)
This is the slide deck from the presentation given at SQL Saturday event in Louisville, October 2011. A modified version of this presentation was given at the Indianapolis SQL Saturday in May 2011.
Orange Legal Technologies Considering Meet And Confer ILTA Prod... (Rob Robinson)
This document provides an overview of understanding and preparing for meet and confer discussions as required by the Federal Rules of Civil Procedure for electronic discovery. It discusses key topics such as understanding meet and confer definitions and requirements, electronically stored information, electronic discovery tasks of collection, processing, review and production, and pre-meet and confer preparation including preservation, data scoping, cost estimation, and developing a discovery plan. The goal is to translate understanding of these concepts into effective execution of the meet and confer process.
This document summarizes a presentation given to the American Bar Association on securing critical infrastructures. It defines critical infrastructures as physical and digital systems essential to the economy and government. It notes that advances in IT have increased interdependence between infrastructures, creating new vulnerabilities. The presentation discusses issues like lack of cooperation between infrastructure owners, need for regular vulnerability assessments, and taking a holistic approach. It introduces SCADA and control systems, noting differences from conventional IT systems in prioritizing availability over security. The presentation covers legal and practical considerations for securing control systems and standards for control system security.
Electronic discovery, or eDiscovery, refers to the discovery of electronically stored information in legal proceedings. This includes information stored on devices like computers, phones, hard drives, and more. eDiscovery is more complex than paper discovery due to factors like the persistence, dynamic nature, metadata, and dispersion of electronic data. Managing eDiscovery effectively requires proactive information management policies to control electronic records and reduce costs when responding to legal requests.
This document discusses electronic discovery (eDiscovery), the discovery of electronically stored information in legal cases. It notes that eDiscovery costs are skyrocketing, averaging over $1.5 million per corporate lawsuit. The document outlines typical eDiscovery costs, including collecting, processing, and reviewing data, which can run from thousands to millions of dollars depending on the size of the case. It emphasizes that proactive information management is key to addressing eDiscovery by developing policies that help employees manage information and retain only necessary records.
Electronic discovery, or eDiscovery, refers to the discovery of electronically stored information in legal proceedings. This includes information stored on devices like computers, phones, hard drives, and more. eDiscovery is more complex than paper discovery due to factors like the persistence, dynamic nature, metadata, and dispersion of electronic data. Managing eDiscovery effectively requires proactive information management policies to control electronic records and reduce costs when responding to legal requests.
This document discusses electronic discovery (eDiscovery) which refers to the discovery of electronically stored information in legal cases. It notes that eDiscovery costs are skyrocketing, averaging over $1.5 million per corporate lawsuit. The document outlines typical eDiscovery costs including collecting, processing, reviewing data which can cost thousands or millions depending on the size of the case. It emphasizes that proactive information management is key to addressing eDiscovery by developing policies to help employees manage information and only retain necessary records.
2010 IQPC - Turning Risks into Rewards Developing a Comprehensive Records and...Keith Atteck C.Tech. ERMm
The thesis of this presentation is that good information management is the key to effective eDiscovery and early case assessment, and that poor information management leads to ineffective and costly discovery efforts. The presentation covers: implementing a comprehensive RIM framework and why it is the best defense in discovery; mitigating risk while maintaining record-keeping compliance; and applying Canadian General Standards Board standards to keep records with integrity.
10 Steps for Taking Control of Your Organization's Digital Debris Perficient, Inc.
Do you have too much old information, but not enough guidance to begin the task of cleaning out your data stores? Join Perficient to learn 10 tips for creating a strategic roadmap to take control of your information and uncover the technology that can support your efforts, including how to:
Stop keeping everything forever
Create an information governance and disposal policy before implementing technology
Automate information management to improve employee productivity
Prepare a discovery response plan
This document discusses big data and the importance of data quality for big data initiatives. It defines big data as large, diverse digital data sets that require new techniques to enable capture, storage, analysis and visualization. The key challenges of big data include integrating diverse structured and unstructured data sources and ensuring high quality data. The document emphasizes that poor data quality can undermine big data analytics efforts and lead to wrong insights. It promotes establishing a data quality framework including profiling, standardization, matching and enrichment to enable valid big data analytics.
[Webinar Slides] Data Privacy – Learn What It Takes to Protect Your InformationAIIM International
Follow along with these webinar slides as we take a close look at what it takes to prepare for all kinds of data privacy regulations – learn how to protect your data in order to be compliant with regulators or for healthy business practices in general.
Want to follow along with the webinar replay? Download it here for free: http://info.aiim.org/protect-your-information
Computer forensics is the science of investigating digital devices and media for legal evidence. It began in the late 1980s with the creation of organizations like IACIS. Investigators use methods to detect and recover hidden, deleted, or encrypted data through techniques like disk analysis, steganalysis, and data carving. They must have expertise in operating systems, file systems, data storage, and encryption to properly handle electronic evidence for criminal and civil cases. While computer forensics allows thorough searches and analysis of large amounts of data, it also faces challenges such as ensuring evidence integrity and overcoming costs.
Using Pattern-based design to Drive Disruptive Information SecurityRavila White
This document discusses using pattern-based design to drive disruptive information security. It begins by outlining competing priorities around complying with regulations while addressing evolving cyber threats. It then defines disruptive innovation as starting with simpler applications and moving up to displace established competitors. The document provides examples of past information security disruptions and outlines elements of patterns that could be applied to information security, including patterns of plans, compliance and threats. It concludes by thanking the audience and inviting questions.
This document summarizes a discussion on data protection for credit unions. It introduces the moderator and peer panelists, who work in credit union IT. The agenda includes discussing the increasing complexity of IT environments and data growth, compliance requirements, data loss threats, and methods to protect data. Technologies mentioned that can help include backups, determining recovery time objectives, onsite vs offsite storage, cloud storage, and site replication. The discussion emphasizes the need to identify all data to protect, have multiple recovery points, ensure data is encrypted and unattended, and out of region in accordance with compliance. Questions from attendees were taken at the end.
Idera live 2021: Database Auditing - on-Premises and in the Cloud by Craig M...IDERA Software
Hackers, thieves, and many cybercriminals are constantly on the lookout for ways to harvest your data and use it for their own nefarious purposes. And where do they look? Everywhere! However, your database systems are the most likely target because that is where the data is located! And increasingly, your data is not just on computers running in your data center, but also in the cloud. So, organizations must be ever-vigilant to see who is accessing the sensitive corporate data in their databases and protect it from unauthorized access.
Protecting your data for business reasons is a big enough reason to check your data access. In addition, many governmental and industry regulations exist that mandate you do so. Each regulation places different demands on what types of data access we must watch and audit.
Ensuring compliance can be difficult, especially when you need to follow multiple regulations. And you need to capture all relevant data access attempts while still maintaining the service levels for the performance and availability of your applications.
This webinar discusses these issues as well as presenting the requirements for auditing data access in relational databases. The goal of this presentation is to review the regulations affecting the need to audit at a high level. Then, the speaker will discuss the things that need to be audited, along with pros and cons of the various ways of accomplishing this.
The presenter, Craig Mullins, is president and principal consultant of Mullins Consulting, Inc. where he focuses on data management strategy and consulting. Craig writes the monthly DBA Corner column for Database Trends & Applications magazine. With over three decades of experience in all facets of database systems development, he has worked as a programmer and analyst, a database administrator, an industry analyst, a software executive, and a consultant.
The document provides an overview of the General Data Protection Regulation (GDPR) and its implications for data protection and privacy. In 3 sentences: The GDPR imposes new obligations on companies regarding how personal data is collected and processed in order to protect European citizens' privacy rights. It requires companies to implement privacy by design principles and conduct data protection impact assessments. The GDPR aims to give citizens more control over their personal data and holds companies accountable for any data breaches or violations of individuals' privacy rights.
The document discusses various methods for securely destroying data from retired electronic assets. It describes how data can still be retrieved from devices like hard drives even after being deleted unless properly destroyed. It then outlines different recognized methods for destroying data securely, including software-based wiping, hardware degaussing, and physical destruction. It discusses the advantages and disadvantages of each method and when they would be appropriately applied. It also provides an overview of Sims Recycling Solutions and why they are experts in electronic asset and data destruction management.
Emids Morning Security Virtual India V3techcouncil
The document discusses data security and privacy in offshore operations. It outlines potential risks like loss of data, intellectual property, and damage to brand. It then discusses ways to mitigate these risks, including understanding relevant laws and regulations, NASSCOM's role in promoting standards in the Indian IT market, vendor best practices around frameworks and audits, and drafting secure contracts.
Jason Christopher, Dragos Principal Cyber Risk Advisor, joins CyberWire for this podcast that discusses the evolution of ICS/OT ransomware, its impacts on the community, and cybersecurity best practices ICS/OT practitioners can implement to combat it. Listen to the full podcast here: https://dragos.com/resource/ransomware-in-an-industrial-world/
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices want to take full advantage of the features available on those devices, but many of those features trade security for convenience and capability. This best practices guide outlines steps users can take to better protect personal devices and information.
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfMalak Abu Hammad
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
Infrastructure Challenges in Scaling RAG with Custom AI modelsZilliz
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer's life and facilitate a rapid transition from concept to production-ready applications. He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
What do a Lego brick and the XZ backdoor have in common? Speck&Tech
ABSTRACT: At first glance, a Lego brick and the XZ backdoor might seem to have in common only the fact that both are building blocks, or dependencies, of creative and software projects. In reality, a Lego brick and the XZ backdoor case have much more in common than that.
Join the presentation to dive into a story of interoperability, standards, and open formats, and then discuss the important role that contributors play in a sustainable open source community.
BIO: Advocate of free software and of standard, open formats. She has been an active member of the Fedora and openSUSE projects and co-founded the LibreItalia Association, where she was involved in several events, migrations, and training activities related to LibreOffice. She previously worked on LibreOffice migrations and training courses for several public administrations and private organizations. Since January 2020 she has worked at SUSE as a Software Release Engineer for Uyuni and SUSE Manager, and when not pursuing her passion for computers and for Geeko, she cultivates her curiosity about astronomy (hence her nickname deneb_alpha).
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
How to Get CNIC Information System with Paksim Ga.pptxdanishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Full-RAG: A modern architecture for hyper-personalizationZilliz
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!SOFTTECHHUB
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
UiPath Test Automation using UiPath Test Suite series, part 5DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 5. In this session, we will cover CI/CD with devops.
Topics covered:
CI/CD with in UiPath
End-to-end overview of CI/CD pipeline with Azure devops
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfPaige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on:
Data Management - NA CACS 2009
1. SESSION 133
ENTERPRISE
DATA MANAGEMENT
REQUIREMENTS
Michael Berardi, MS-CIS, CISA
IT Audit Manager
Energizer Holdings, Inc.
Jeffrey Roth, CISA, CGEIT
Director, Technology Risk Management Services
RSM McGladrey
2. ACRONYMS TO KNOW
• ILM – Information Lifecycle Management
• ICM – Information Classification and Management
• FRCP – Federal Rules of Civil Procedure
3. ACRONYMS TO KNOW
• PII/PHI – Personally Identifiable Information / Personal Health Information
• FISMA – Federal Information Security Management Act
• MDM – Master Data Management
4. TERMINOLOGY AND FOUNDATION FOR RECORDS MANAGEMENT
• DISS destruction standards
  – Degaussing (NIST)
  – Physical destruction methods
• Records management
• Business records life cycle
• Active data
• eDiscovery
• Sedona Conference
7. THIS IS THE END GAME
It has been said that “information is power,” and they
who control the information control the power.
Whether the information is broadcast on the evening
news, printed in a newspaper, etched on stone
tablets, or published on a USENET newsgroup or
Internet Web page, we rely on information in our daily
lives, and trust that most of the information we
receive and process is accurate.
Information Warfare and Security, Dorothy E. Denning, ISBN 0-201-43303-6, Addison-Wesley, 1999. Originally published in Cisco's The Internet Protocol Journal, September 1999.
9. FLAWED DECISION SUPPORT
Origins of Master Data Management:
• Mainframe
• Personal Computer and RDBMS
• ERPs – SAP R/3
Visibility across applications and the organization:
• Financials
• Customers
• Employees
10. LEGAL EXPOSURE OR OVER-EXPOSURE
• "Wall Street crisis brings lax e-discovery law enforcement to light", January 14, 2009
• Only 10–15% of US corporations have electronic records retention systems in place, according to Gartner Inc. as quoted
• Debra Logan of Gartner went on to say, "We need to have people in charge of managing information for the entire company. Today, everyone's expected to manage their own data"
• Federal Rules of Civil Procedure (FRCP)
11. How Big is the Problem?
• Headlines tout compliance allegations
• FRCP: Intel/AMD
• FRCP: Morgan Stanley
• FRCP: General Motors
• SEC: UBS Securities
• SEC: Bank of America
• HIPAA: Providence Health & Services
• HIPAA: UCLA Health Systems
• SOX: New government whistle-blower's hotline
We must address our data at rest and in motion… The time for sitting on the side-lines has long past, and the solutions are readily available to both control and monitor data flow from our organization.
• Cost = several thousand dollars to millions
  – Providence Health & Services: $100,000 settlement
  – Morgan Stanley: $15 million fine
13. REGULATORY COMPLIANCE (Cont.)
• Massachusetts State Regulations
  – Encrypt personal data on portable devices or transmitted over public or wireless networks
  – Deploy secure user authentication and access control measures and conduct "reasonable" monitoring of systems in an effort to spot unauthorized activities
  – Develop a comprehensive data-security program that sets internal policies and specifies disciplinary action
  – Inventory all electronic and paper records to identify the ones that contain personal data
15. COST – STORAGE AND PERFORMANCE
• System performance?
• High-availability storage media?
• Other costs, anyone?
16. STORAGE
• Environmental considerations
– Light
– Temperature
– Humidity
– Location – Floods, Hurricanes, Earthquakes
• Storage containers
• Storage media
• Physical and logical security
17. DATA INTEGRITY
• At rest and in transit
  – Creation of data has intrinsic risks
    • Data entry error (yes, even hand-written documents)
    • Data garbling during on-line entry
  – Media degradation
    • Microfiche
    • Photographs
    • Documents
    • Tape
    • CDs
    • Flash memory
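One lightweight control against the media degradation and garbling risks above is to record a cryptographic checksum when a record is archived and re-verify it on a schedule. A minimal sketch using Python's standard hashlib; the function names here are illustrative, not from the presentation:

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: Path, recorded_digest: str) -> bool:
    """Compare a file's current digest against the one recorded at archive time."""
    return file_digest(path) == recorded_digest
```

Recording the digest in a separate inventory at archive time lets periodic re-verification distinguish silently degraded media from intact records.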
18. SECURITY – BREACH AND
DISCLOSURE LAWS
• List of security breaches – do you want to see your company's name on this list?
  http://www.insideidtheft.info/breaches09.aspx?gclid=CIxitu6BqZkCFREhDQodGBzApg
• Oregon law for Oregon employers of Oregon
residents
– Designate a security officer
– Conduct a risk assessment
– Assess safeguards to manage risks
• HIPAA – breach notification within 60 days
19. SO WE NEED IT, NOW WHAT? FIRST STEP – CLASSIFY DATA
20. CLASSIFICATION – YOU CANNOT MANAGE WHAT YOU DON'T KNOW
• Organizational critical
• Highly Confidential
• Proprietary
• Internal Use Only
• Public Documents
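Tiers like these are most useful when handling controls follow the label mechanically. A minimal Python sketch; the tier names come from the slide, but the control mappings are illustrative placeholders, not the presenters' policy:

```python
from enum import IntEnum

class Classification(IntEnum):
    # Ordered so that higher values demand stricter handling.
    PUBLIC = 0
    INTERNAL_USE_ONLY = 1
    PROPRIETARY = 2
    HIGHLY_CONFIDENTIAL = 3
    ORGANIZATIONAL_CRITICAL = 4

# Illustrative control mapping -- real controls come from your
# classification standard, not from code.
CONTROLS = {
    Classification.PUBLIC: {"encrypt_at_rest": False, "access": "anyone"},
    Classification.INTERNAL_USE_ONLY: {"encrypt_at_rest": False, "access": "employees"},
    Classification.PROPRIETARY: {"encrypt_at_rest": True, "access": "need-to-know"},
    Classification.HIGHLY_CONFIDENTIAL: {"encrypt_at_rest": True, "access": "named individuals"},
    Classification.ORGANIZATIONAL_CRITICAL: {"encrypt_at_rest": True, "access": "named individuals"},
}

def controls_for(level: Classification) -> dict:
    """Look up the handling controls mandated for a classification tier."""
    return CONTROLS[level]
```

Using an ordered enum lets policy code compare tiers directly (for example, "encrypt everything at Proprietary and above") instead of hard-coding label strings.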
21. TEN MOST CRITICAL REQUIREMENTS FOR MANAGING DATA
• Obtain executive management sponsorship
• Identify and interview the stakeholders
• Understand the business requirements
• Develop a Project Charter and RACI
• Governance of MDM
22. TEN MOST CRITICAL REQUIREMENTS FOR MANAGING DATA (CONT.)
• Metadata registry and management
• Assessment
• Integration of existing data
• Assurance
• Project Plan
23. CONSIDERATIONS IN CREATING
DATA CLASSIFICATIONS
• Multiple perspectives
• Business requirements
– Compliance
– Analysis
– Time to recovery
Advancing Storage & Information Technology – SNIA - Educational
http://www.snia.org/education/tutorials/2008/fall#data
24. CONSIDERATIONS IN CREATING
DATA CLASSIFICATIONS (CONT.)
• Tagging files by classification name
• Automated classification tools
• Availability, confidentiality, proprietary?
• National Institute of Standards and Technology Federal Information Processing Standards (FIPS) 199 and Special Publication SP 800-60 volumes I and II
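Automated classification tools of the kind mentioned above often start with simple content pattern matching. A hedged sketch assuming regex-based detection; the patterns are deliberately naive illustrations, and real tools add context analysis, checksums, and dictionaries:

```python
import re

# Illustrative patterns only -- production classifiers use far more
# robust detection than bare regular expressions.
PATTERNS = {
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def suggest_tags(text: str) -> set[str]:
    """Return the names of the PII patterns found in a block of text."""
    return {name for name, rx in PATTERNS.items() if rx.search(text)}
```

A scanner built on this idea would walk file shares, tag files whose contents match, and queue the hits for human review rather than classifying them automatically.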
26. DATA AND YOUR OPERATIONS
• Defined data requirements
  – Context of data
  – Syntax and format
  – Integrity
  – Classification
  – Availability
  – Presentation
  – Protection
  – Storage
  – Retention
  – Destruction
27. PROTECTION – POWER WITH NO
SHIELD
• If information is power, then do we
treat it as a key asset?
• Based on classification we can
implement incremental security
controls in line with data value.
• Regulatory drivers
(GLBA, HIPAA, EU Privacy
laws, etc.)
28. PROTECTION – POWER WITH NO
SHIELD (CONT.)
• What about hardcopy data?
• Locations of output/presentation
devices (printers, CRT/LCD screens,
logs, etc.)
• Protection in transit and at rest (cover
sheets, encryption, etc.)
• Brakes are what enable a race car to go fast
29. RETENTION SCHEDULES
• How long is long enough?
  – Federal agencies and their contractors must follow National Archives standards
  – Corporate regulations require varied retention periods
  – Investigations and litigation: however long it takes, and then some; courts and lawyers will set these requirements
30. RETENTION SCHEDULES
• Based on classification (internal and
regulatory) a records coordinator
position should be established to train the
management team, maintain policies
related to records management, and
monitor records retention activities
(creation through destruction).
• Part of Business Continuity and Disaster
Recovery Planning
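A retention schedule keyed to record class can be expressed directly in code, which gives the records coordinator something auditable to monitor. A minimal sketch; the record classes and retention periods below are hypothetical placeholders, since actual periods must come from counsel and the applicable regulations, and litigation holds always override the schedule:

```python
from datetime import date, timedelta

# Illustrative retention periods in days per record class -- NOT real
# requirements; substitute your organization's approved schedule.
RETENTION_DAYS = {
    "tax": 7 * 365,
    "hr": 5 * 365,
    "correspondence": 2 * 365,
}

def destruction_eligible_on(record_class: str, created: date) -> date:
    """Earliest date the record may be destroyed under the schedule."""
    return created + timedelta(days=RETENTION_DAYS[record_class])

def may_destroy(record_class: str, created: date, today: date,
                legal_hold: bool = False) -> bool:
    """Litigation and investigation holds always override the schedule."""
    if legal_hold:
        return False
    return today >= destruction_eligible_on(record_class, created)
```

Driving destruction from a single function like this also makes the "creation through destruction" monitoring role concrete: every disposal decision flows through one checkpoint that logs its inputs.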
31. DESTRUCTION
Many forget that hard drives must
be properly destroyed prior to
disposal (reference National
Association for Information
Destruction)
32. DESTRUCTION
• Expectations
  – Proper EPA permits and certifications
  – Hard drives are identified by serial number and stored in secure, uniquely numbered containers in a secure storage area prior to shredding.
  – Immediately prior to shredding, the number of hard drives in each container is counted and matched against the original physical inventory count.
  – The start and finish time of each shredding project is logged.
33. DESTRUCTION
• Expectations (continued)
  – The shredded particles are sent through a powerful degaussing station, providing the ultimate in data destruction security.
  – The shredded particles for each destruction project are weighed and placed in a uniquely numbered large recycling container.
  – Record the lot numbers and the weights contained in each recycling container.
  – The filled containers are weighed and sent to metal refineries. We receive a destruction certificate from the refiners listing the unique container number and its weight.
34. DESTRUCTION
• Do not forget: shredding of sensitive hard copy documents, photos, and other records must provide assurance that this data cannot be reconstructed by third parties.
• Tapes, CDs, floppy disks, and flash memory need to be addressed as well.
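For media that will be reused rather than shredded, a software overwrite is one recognized sanitization option, in the spirit of the NIST "clear" level. A simplified, hedged sketch of a single-pass random overwrite; note that it is not adequate on its own for flash memory or journaling filesystems, where remapped or copied blocks can survive and device-level sanitize commands or physical destruction are needed:

```python
import os

def overwrite_file(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents with random bytes, then delete it.

    Simplified illustration only: real sanitization must account for
    wear-leveled flash, filesystem journals, and backup copies.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace every byte with random data
            f.flush()
            os.fsync(f.fileno())       # force the overwrite to the device
    os.remove(path)
```

As the slides stress for shredding, the overwrite itself is only half the control; logging which assets were wiped, when, and by whom is what provides the assurance.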
35. AVAILABILITY – DAY LATE, A DOLLAR SHORT
• If data cannot be accessed in a timely manner, it is of little or no value.
• What controls are in place to ensure the following:
  – Ability to access required documents and electronic data feeds for month-end closing, sales meetings, and customer service activities
  – Infrastructure capable of providing data per service level agreements
  – Off-site storage services provide adequate access to archived documents, tapes, and other records
  – Legacy system data able to be accessed through software emulators
36. PRESENTATION
• This is an often forgotten part of data management.
• During development of data extract programs, end-user considerations are not adequately addressed, resulting in additional rework to design proper data formatting and summarization.
  – Would we give the same Trade Accounts Payable report to the CFO as to the AP clerk?
  – How about on-line display for customers and suppliers?
  – Do electronic and hardcopy reports have proper watermarking per data classification requirements?
37. SYNTAX AND FORMAT
• A corporate data dictionary captures the organization's data syntax rules, data classification scheme, and security levels.
• This process improves the quality of management decision making by ensuring that reliable and secure information is provided, and it enables rationalizing information systems resources to appropriately match business strategies.
PO2 Define the Information Architecture, CobiT 4.0
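A corporate data dictionary of this kind can start as a simple registry of element definitions. A minimal sketch; the field names below are assumptions for illustration, not a standard schema:

```python
from dataclasses import dataclass, field

@dataclass
class DictionaryEntry:
    """One element in a corporate data dictionary (illustrative shape)."""
    name: str
    definition: str
    syntax: str                # e.g. a regex or format rule
    classification: str        # label from the org's classification scheme
    security_level: int
    owner: str
    systems_of_record: list[str] = field(default_factory=list)

registry: dict[str, DictionaryEntry] = {}

def register(entry: DictionaryEntry) -> None:
    """Add an element, refusing duplicates so one definition stays canonical."""
    if entry.name in registry:
        raise ValueError(f"duplicate data element: {entry.name}")
    registry[entry.name] = entry
```

Rejecting duplicate names at registration is what keeps the dictionary authoritative: two business units cannot silently maintain conflicting definitions of the same element.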
38. UNDERSTANDING METADATA
Metadata is "data about data":
• Business definitions
• Reference metadata
• Data element metadata
• Information architecture
• Data governance management
• Service metadata
• Business metadata
39. SECURITY AND DATA CENTER
CONCERNS
• Do you know where your sensitive data is?
– In SAP R/3
– In Oracle
– In PeopleSoft
– In JD Edwards
– On the backup tape stolen or lost in transit
• What is being stored on laptops, memory
sticks and backup hard drives?
• Encryption
40. DATA MANAGEMENT SUMMARY
• Significant risk factors organizations face daily
• Qualitative and quantitative cases for data management being a full-time commitment
• The ten most critical requirements for managing data
• Considerations for creating data classifications
• Understanding metadata
• Regulatory requirements and data availability
• Security and environmental data concerns
41. SOURCES
• Master Data Management by David Loshin of Knowledge Integrity, Inc., Morgan Kaufmann OMG Press, copyright 2009
• InformationWeek: "Records Retention: Practice What You Preach" by Andrew Conry-Murray, June 7, 2008
• Computerworld: "Wall Street crisis brings lax e-discovery law enforcement to light" by Lucas Mearian, January 14, 2009
42. SOURCES
• Network World: "Data-classification best practices" by Bill Reed, January 18, 2007
• CIO Magazine
• CFO Magazine
• Sun Microsystems White Paper, "Best practices in data classification of information lifecycle management", October 2005
43. QUESTIONS AND COMMENTS?
JEFF ROTH, CGEIT, CISA
Director Technology Risk Management Services
RSM McGladrey
jeff.roth@rsmi.com
Michael Berardi, MS-CIS, CISA
IT Audit Manager
Energizer Holdings, Inc.
Michaela.berardi@energizer.com
Editor's Notes
MICHAEL – ILM & ICM?
ILM – Information Lifecycle Management is a sustainable storage strategy that balances the cost of storing and managing information with its business value. A well-executed ILM strategy will result in a more agile organization, reduce business risk, and drive down both storage unit and storage management costs.
ICM – Information Classification and Management – Implementing an information classification scheme is valuable for a number of reasons, as it allows enterprises to utilize content-based access policies, apply appropriate retention intervals to data, demonstrate comprehensive adherence to policy for compliance purposes, and potentially protect sensitive content when it leaves the enterprise. Tools offer advanced features such as file-path metadata parsing, in-file content visibility, context category classification, file-classification tagging, and policy-based management and tracking (Bill Reed, "Data-classification best practices", 1/18/2007).
THE GLOBAL STATE OF INFORMATION SECURITY by CIO and CSO magazines in partnership with PwC, 2008: Mark Lobel of PwC says, referring to security and data classification, "Doing this project is a lot of effort and unless there's a regulatory need for it, many don't do it." The survey goes on: only 24% report that classifying the business value of data is part of their security policies, 68% classify their data by risk level at least periodically, and 30% don't ever classify their data.
Continental Airlines has a three-tier classification scheme: Tier One is anything that keeps planes aloft or money coming in; Tiers Two and Three are data that is still important, but not critical to revenue or safety.
JEFF – FRCP
JEFF WILL TAKE PII/PHI
MICHAEL – FISMA AND MDM
FISMA – Federal Information Security Management Act. The FISMA Implementation Project was established in January 2003 to produce several key security standards and guidelines required by Congressional legislation.
MDM – Master Data Management. Organizations must understand that improving their data, and building the foundation for MDM, requires them to address internal disagreements and broken processes. Staff must agree on exactly what constitutes a "customer" or a "partner," and how to resolve any disagreements across business units. Departments and divisions need to agree on hierarchies of customers and products and how to resolve duplicate records across sources. Rather than a technology-focused effort, the project becomes one of political strategy and consensus building (Tony Fisher, "Demystifying Data Management", CIO Magazine, April 2007).
A key element of data management is tiered storage: placing the more current, valuable data on high-end, highly accessible storage solutions, while storing the lower-value, older data on lower-cost storage solutions:
Operational – Documents used for daily transactions.
Reference – Information occasionally checked for reference.
Archive – Info you don't need regularly.
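The Operational/Reference/Archive tiering above can be sketched as a simple policy function. This is a hypothetical illustration: the tier names follow the notes, but the age thresholds and the function name are assumptions, not part of the presentation.

```python
from datetime import date

# Hypothetical age thresholds; a real policy would come from the retention schedule.
REFERENCE_AFTER_DAYS = 90
ARCHIVE_AFTER_DAYS = 365

def storage_tier(last_accessed: date, today: date) -> str:
    """Map a record's last-access date to one of the three tiers in the notes."""
    age = (today - last_accessed).days
    if age >= ARCHIVE_AFTER_DAYS:
        return "Archive"      # info you don't need regularly: lowest-cost storage
    if age >= REFERENCE_AFTER_DAYS:
        return "Reference"    # occasionally checked: mid-tier storage
    return "Operational"      # daily transactions: high-end, highly accessible storage

print(storage_tier(date(2009, 1, 1), date(2009, 1, 15)))  # Operational
```

The point of the sketch is that tiering is a cheap rule to automate once data is classified; the expensive part is agreeing on the thresholds.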
JEFF – DISS standards for destruction
JEFF
JEFF – YES, this is definitely applicable to eDiscovery, but it is the basis for all information management and applicable to any business. Reduce costs through proper management of your information and its relevance to your business. This is a public-domain tool.
JEFF – Focus on trust and how data has been misused: from predicting weather events (trusting, on the strength of massive amounts of data, that a storm will not hit, and then it does) to markets, services, and the daily movement of information.
MICHAEL –
Flawed decision support brought about by the exclusion of certain data or information, such as from systems or applications at newly acquired organizations, or by duplication of data or information.
Legal exposure resulting from an opposing attorney uncovering email that should have been deleted and of whose existence your General Counsel had no knowledge.
What is the performance impact of not archiving data on your primary system? How about the duration and cost of the daily backup process? How do the costs of the different storage options differ, and do you have a strategy of storing the less frequently accessed data on the least costly storage medium?
Regulatory compliance – are you monitoring access to your sensitive data so you can identify a breach? California now has a 5-day breach disclosure requirement, and Massachusetts requirements include: encryption of personal data stored on portable devices and while transmitted; conducting reasonable monitoring of systems in an effort to spot unauthorized activities; installing firewalls, operating system patches, and client-level security tools that are reasonably up to date on all systems; developing a comprehensive data-security program that sets internal policies and specifies disciplinary measures for employees who violate them; and inventorying all electronic and paper records to identify the ones that contain personal data.
Has your organization classified its data, including the sensitive and critical data? Have provisions been made for resilience of the systems containing the critical data, such as provided by a DRP, and have standards and policies been enacted to ensure the protection of data classified as "sensitive"? Is there a Security Policy that more broadly requires and provides the resources for said standards, policies, and associated procedures?
MICHAEL –
The origins of master data management lie in the single computing resource known as the mainframe, supporting all the applications and data files. Then came relational databases, and the associated data redundancy predated data normalization. This was fairly minor until the introduction of the personal computer and distributed computing – the client/server environment. Everyone was their own administrator of their computer, and frequently of a relational database management system, or RDBMS as it was known. Multiple RDBMSs in multiple lines of business resulted in multiple instances of the same piece of data called by different names.
The first driver of MDM is the ability to rationalize the definitions and meanings of commonly used business terms and concepts, while needing to be able to differentiate when two seemingly similar terms mean different things. The move to ERP applications such as SAP R/3 seems to be a move back toward the centralized model that the mainframe represented in the 1980s.
Misconfigured data marts and warehouses.
Improperly constructed Crystal Reports and SQL queries.
eBOMs – Labor rates for labels on a CD led to a misstatement of cost.
MICHAEL – Computerworld article "Wall Street crisis brings lax e-discovery law enforcement to light" by Lucas Mearian, January 14, 2009. This slide basically tells us that the laws are on the books; they just need to be enforced. This will change as organizations continue to lose private and proprietary data.
MICHAEL – Addressing data at rest frequently involves encryption. FRCP: INTELL/AMD – If you put a policy in place, you had better be able to demonstrate compliance and enforcement when the policy is not followed. Let Jeff step in with more details. The cost of non-compliance is significant, averaging $50 per record by some estimates and up to $60 per record by others.
MICHAEL – Here is just a sample of the various regulations with which many of our organizations must comply. Consider that each state seems to have its own disclosure laws beyond the national and international regulations. While enforcement may have been lax in the past, recent system breaches and the economic crisis will likely lead to tougher enforcement of the existing laws. How can any organization accomplish compliance with the 44+ state and federal regulations/statutes without a data classification scheme that identifies where personal or private data resides – data on customers, vendors, and employees?
MICHAEL –
An example of the evolving regulatory landscape: we are no longer just talking about generalities, but about specific techniques and controls for managing these information systems. Periods of disclosure are shortening. Disclosure can lead to business closure.
Massachusetts law requirements include:
Reviewing the scope of security measures at least annually.
Regularly monitoring to ensure that the comprehensive information security program is operating in a manner reasonably calculated to prevent unauthorized access to or use of personal information.
Immediately terminating both the physical and electronic access of terminated employees.
Taking all reasonable steps to verify that any third-party service provider with access to personal information has the capacity to protect such personal information in the manner provided for in 201 CMR 17.00; and taking all reasonable steps to ensure that such third-party service providers are applying to such personal information protective security measures at least as stringent as those required to be applied to personal information under 201 CMR 17.00.
JEFF – Reason
JEFF – System performance is slowed by the vast amounts of data that have to be parsed to respond to queries. High-availability storage media, such as the kind used to store your most current and valuable data, are also the most expensive; costs can be reduced by archiving 'older' data to a less expensive medium such as tape.
JEFF –
Storage containers and media.
Storage media – Is it write-once only? Federal requirements under FISMA for moderate- and high-impact federal systems.
Security of data at rest?
Security of data backed up onto tape.
Environmental security of storage media, both in the data center and at the offsite storage facility – do you assess these controls at the offsite facility?
How are tapes secured while in transit to the offsite facility? Is it a carrier that specializes in tape transport? Is there an annual inventory of offsite tapes?
How are tapes controlled between locations?
What are the risks of employees transporting tapes offsite? Tapes and laptops stolen from employee vehicles…
MICHAEL –
Intrinsic value of data.
Garbage in, garbage out – GIGO. Data management can establish data standards and valid values that reduce GIGO.
What about the reliability of your storage media? How do you monitor and ensure you have a good backup – periodic testing of backups? Replacement of tapes periodically?
How do you manage flash memory and thumb drives: protecting data in case of loss, preventing viruses?
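The valid-value standards mentioned above can be enforced at data entry, which is where GIGO is cheapest to stop. A minimal sketch; the field names and value lists here are invented for illustration, not from the presentation:

```python
# Hypothetical valid-value lists; real standards would come from the data dictionary.
VALID_VALUES = {
    "country": {"US", "CA", "MX"},
    "status": {"active", "inactive", "pending"},
}

def validate_record(record: dict) -> list:
    """Return the fields whose values fall outside the agreed valid values."""
    errors = []
    for field, allowed in VALID_VALUES.items():
        if field in record and record[field] not in allowed:
            errors.append(field)
    return errors

print(validate_record({"country": "US", "status": "retired"}))  # ['status']
```

Rejecting or flagging bad values at the boundary is far cheaper than reconciling them later across multiple RDBMS instances.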
MICHAEL – So we've talked about protecting your data. Just in case there are any questions about the vulnerability of your data, or any organization's data, to a breach, take a look at the link on this page. Yet more disclosure laws – 44+ state laws, FISMA, HIPAA, etc.
POLL THE AUDIENCE ON WHO HAS CLASSIFIED DATA AND THE CLASSIFICATIONS THEY ARE USING
MICHAEL/JEFF –
Document/Data Classification Description – Bring up template – Information System Categorization (formula for creating the classifications).
Organization Critical – Highly sensitive internal documents, e.g. pending mergers or acquisitions, investment strategies, plans, or designs, that could seriously damage the organization if such information were lost or made public. Has very restricted distribution and must be protected at all times. Security at this level is the highest possible.
Highly Confidential – Information that, if made public or even shared around the organization, could seriously impede the organization's operations and is considered critical to its ongoing operations (accounting information, business plans, sensitive customer information of banks, solicitors, and accountants, patients' medical records, and similar highly sensitive data). Such information should not be copied or removed from the organization's operational control without specific authority. Security at this level should be very high.
Proprietary – Information of a proprietary nature: procedures, operational work routines, project plans, designs, and specifications that define the way in which the organization operates. Such information is normally for proprietary use by authorized personnel only. Security at this level is high.
Internal Use Only – Information not approved for general circulation outside the organization, where its loss would inconvenience the organization or management but where disclosure is unlikely to result in financial loss or serious damage to credibility. Examples would include internal memos, minutes of meetings, and internal project reports. Security at this level is controlled but normal.
Public Documents – Information in the public domain (annual reports, press statements, etc.) which has been approved for public use. Security at this level is minimal.
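The five-level template above maps naturally to an ordered enumeration in code. This is only a sketch; the handling rule shown paraphrases the descriptions ("should not be copied or removed ... without specific authority"), and the helper name is an assumption:

```python
from enum import IntEnum

class Classification(IntEnum):
    """Five levels from the template, ordered least to most restricted."""
    PUBLIC = 1                 # approved for public use; minimal security
    INTERNAL_USE_ONLY = 2      # memos, minutes; controlled but normal security
    PROPRIETARY = 3            # procedures, designs; authorized personnel only
    HIGHLY_CONFIDENTIAL = 4    # business plans, patient records; very high security
    ORGANIZATION_CRITICAL = 5  # pending mergers, strategies; highest possible security

def may_leave_organization(level: Classification) -> bool:
    """Only public documents circulate outside without specific authority."""
    return level == Classification.PUBLIC

print(may_leave_organization(Classification.PROPRIETARY))  # False
```

Using an ordered type lets access policies compare levels (e.g. "at least Proprietary") instead of hard-coding label strings throughout applications.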
MICHAEL –
Executive management must be on board to ensure you receive the support you need from those locations outside of HQ.
The business case is the productivity improvements realized through the associated initiatives and improving the organization's ability to quickly respond to business opportunities.
Selling the benefits – improving data quality, reducing the need for cross-system reconciliation, reducing operational complexity, and simplifying design and implementation: (1) Master metadata simplifies application development. A master metadata repository captures the whole story of a data element's use, instead of how it is used in a single application, such as how data elements are used for different business purposes. (2) Simplify or otherwise standardize the process for uniquely identifying a data record, instead of doing so per application. (3) Define and standardize across the enterprise many different kinds of master data services.
Identification of stakeholders, which will include senior management, clients, application owners, information architects, data governance and data quality practitioners, metadata analysts, system developers, and operations staff.
Understanding the business needs is required both to cost-justify MDM and to integrate it into the existing application-centric data management.
RACI – Responsible (those who do the work), Accountable (signs off on R), Consulted, and Informed.
Governance of MDM – Oversight of master data involves the testing and, where needed, re-establishment of data quality.
MICHAEL –
Metadata registry and management – All aspects of determining the need, planning, migration strategy, and future state require a clarified view of the information about the data that is used within the organization – its metadata. A metadata registry provides a control mechanism, or perhaps even a "clearing house," for unifying a master data view when possible, as well as helping to determine when that unification is not possible.
Assessment – Identify data sets, primary and foreign keys, implicit relational structure, and embedded business rules.
Integration – Integrate existing master data such as person names, addresses, telephone numbers, product descriptions, etc., using tools to resolve the variations in representation of specific entities from disparate data sources.
Assurance – MDM requires a high degree of confidence in the quality of the master data going forward. Auditing and monitoring compliance with defined data quality standards, coupled with effective issue response and tracking, along with strong stewardship within a consensus-based governance model, will ensure ongoing compliance with data quality objectives.
Project plan – RACI is the next step; identify task dependencies, interdependencies, and the order of work.
JEFF? DON'T FORGET TO USE HYPERLINKS.
Multiple perspectives
JEFF – Message Gate is one example of a tool that can be used to manage data leakage. Support tools that can be used to determine the classification of data are provided at no cost by the federal government through NIST.
NIST SPECIAL PUBLICATION 800-30 – RISK ASSESSMENT PROCESS DEFINED
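NIST SP 800-30 determines risk as a function of threat likelihood and impact. A hedged sketch of that lookup is below; the numeric scales follow the publication's illustrative tables, but this simplified function is an assumption for this presentation, not the full process (which also covers threat identification, vulnerability analysis, and control recommendations):

```python
# Illustrative likelihood and impact scales in the spirit of NIST SP 800-30's
# risk-determination tables; the exact values and cutoffs are assumptions here.
LIKELIHOOD = {"low": 0.1, "medium": 0.5, "high": 1.0}
IMPACT = {"low": 10, "medium": 50, "high": 100}

def risk_level(likelihood: str, impact: str) -> str:
    """Combine a likelihood rating and an impact rating into a risk level."""
    score = LIKELIHOOD[likelihood] * IMPACT[impact]
    if score > 50:
        return "high"
    if score > 10:
        return "medium"
    return "low"

print(risk_level("high", "high"))  # high
```

The value of even this crude matrix is consistency: two auditors rating the same system against the same scales should reach the same risk level.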
MICHAEL – Each organization knows how to use the data required for its business processes, but very few look beyond meeting their day-to-day activities. During the next few slides we will examine the requirements referenced here. We'll talk about things like the availability of data across the enterprise, and syntax and format in things like context-sensitive help, or a drop-down from which you must select a state if the country selected was the United States.
MICHAEL – We talked some already about the cost of disclosure of private data, and now we want to turn to the value of data itself. Does information empower us to do things differently? You bet. Data can identify fraud and abuse, or un-served or forgotten customers from which profit can be realized. Proper management of data in the form of classification enables us to minimize the cost of compliance: knowing where and in what forms private data is stored and transmitted minimizes the cost of regulatory drivers.
JEFF
JEFF
MICHAEL? – In the Ten Most Critical Requirements for Managing Data we talked about the importance of executive management buy-in and sponsorship. We mentioned having a team to establish Information Lifecycle Management, or ILM, but also on-going governance of data management. A records coordinator would be a critical member of such a team. Another key aspect of retention schedules is to incorporate those requirements, or said schedule, into the BCP and DRP to ensure continuance of regulatory and operational compliance.
JEFF
JEFF – PUBLIC SCHOOL EXAMPLE – IDENTITY THEFT VULNERABILITY. NASA BY OIG.
JEFF
JEFF
MICHAEL – If there were a disaster on the last day of the month, would your organization still be able to report its financials on time to the SEC? Does your infrastructure have the capacity to take on additional data, such as what might be required to integrate another organization into your own? Do you have a sufficient number of individuals designated as authorized to declare a disaster, so the loss of one does not preclude the company from restoring at the recovery site? The storage or archival of data needs to be in a format that will be readable for at least as long as the retention schedule specifies.
MICHAEL –
Views of data – Different views for different people based upon their responsibility.
Online – What the user sees online may depend on whether they are a third-party order taker, a vendor, or a customer; Amazon is an example.
Classification – Do electronic and hardcopy forms of documents and information clearly state how the document or information is classified?
MICHAEL – Does your organization have a central data dictionary for data across applications and locations, specifying characteristics such as length, type, and classification? What level of security is required, where can the data be stored (thumb drives?), and does it need to be encrypted when transmitted? Doing the above improves both efficiency and effectiveness, from both an operational and a regulatory perspective.
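A central data dictionary entry of the kind described might be modeled as follows. This is only a sketch: the field names, the example element, and its settings are assumptions derived from the characteristics listed in the notes (length, type, classification, storage restrictions, encryption in transit):

```python
from dataclasses import dataclass

@dataclass
class DataElement:
    """One entry in a central data dictionary, per the characteristics in the notes."""
    name: str
    data_type: str            # e.g. "char", "int"
    length: int
    classification: str       # e.g. "Internal Use Only"
    removable_media_ok: bool  # may it be stored on thumb drives?
    encrypt_in_transit: bool  # must it be encrypted when transmitted?

# Hypothetical example entry.
ssn = DataElement(
    name="social_security_number",
    data_type="char",
    length=9,
    classification="Highly Confidential",
    removable_media_ok=False,
    encrypt_in_transit=True,
)
print(ssn.encrypt_in_transit)  # True
```

With entries like this in one place, applications and auditors consult the same definition of each element instead of rediscovering its rules per system.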
JEFF –
Business definitions – Look at the business terms used across the organization and the associated meanings.
Reference metadata – Detail data domains (both conceptual domains and corresponding value domains) as well as reference data and mappings between codes and values.
Data element metadata – Focus on data element definitions, structures, nomenclature, and determination of existence along a critical path of a processing stream.
Information architecture – Consolidates the representations of data elements into cohesive entity structures, shows how those structures reflect real-world objects, and explores how those objects interact within business processes.
Data governance management – Concentrates on the data rules governing data quality, data use, access control, and the protocols for rule observance (and processes for remediation of rule violations).
Service metadata – Looks at the abstract functionality embedded in and used by the applications and the degree to which those functions can be described as stand-alone services, along with the mapping from service to client applications.
Business metadata, at the top of the stack – Captures the business policies that drive application design and implementation, the corresponding information policies that drive the implementation decisions inherent in the lower levels of the stack, and the management and execution schemes for the business rules that embody both business and information policies.