The document discusses data retention policies and handling of confidential and sensitive data. It provides details on:
1) Data retention policies - their purpose, requirements, scope and how they are managed. Different retention periods are defined depending on the type of data.
2) Laws and regulations around data retention in India, particularly for telecommunication companies. Specific requirements for retaining call detail records, network logs, and other subscriber information are outlined.
3) Types of sensitive data, including personal, business, and classified information. It also provides guidelines for properly handling sensitive data through access policies, authentication, training, and other security practices.
2. UNIT V INFORMATION LIFECYCLE MANAGEMENT
Data retention policies; Confidential and sensitive data handling; lifecycle management costs; Archive data using Hadoop; Testing and delivering big data applications for performance and functionality; Challenges with data administration
3. Introduction
Organizations analyze data to draw important conclusions and observations about the business.
Data is constantly updated, and older data may become obsolete.
Organizations must therefore decide which data is no longer required.
4. Data Retention Policies
A data retention policy, or records retention
policy, is an organization's established
protocol for retaining information for
operational or regulatory compliance needs.
5. Data Retention Policies
deal with the complex issues of protecting
business information for a pre-determined
length of time on a pre-determined storage
system.
These policies define different retention
periods, depending on the type of data.
6. Data Retention Policies
They describe procedures for archiving the information, guidelines for destroying it when the retention period expires, and special mechanisms for handling it.
7. Data Retention Policies
Purpose:
To maintain important records and documents for
future use or reference
To dispose of records or documents that are no longer
needed
To organize records so that they can be searched and
accessed easily at a later date.
8. Data Retention Policies
Purpose:
Email messaging has had a large impact on those who develop and enforce data retention policies.
Storing this business information can be expensive for organizations.
9. Data Retention Policies
Requirements:
Legal or legitimate requirements: information that must be retained for legal reasons
Business or commercial requirements: information retained from an operational perspective
Personal or private requirements: information retained from a personal perspective
10. Data Retention Policies
Scope: the kinds of data covered under data retention policies:
Legal records: contracts, trademarks, powers of attorney, press releases.
Final records: records of completed activities.
Permanent records: financial registers, patents, proposals.
11. Data Retention Policies
Scope: the kinds of data covered under data retention policies:
Accounting and corporate tax records: investments, audits, purchase and sales records.
Workplace records: day-to-day activities of employees, agreements, minutes of meetings.
Employment, employee and payroll records: job postings, job advertisements, recruitment procedures, performance reviews.
12. Data Retention Policies
Scope: the kinds of data covered under data retention policies:
Bank records: bank transactions, deposits, cheque details, cheque bounces.
Historic records: records that are no longer required by the organization.
Temporary records: documents that are not complete or finalized.
13. Data Retention Policies - Content
The policy should focus on the reason behind data retention and identify the criteria on which data needs to be retained.
Usually the decision is based on the date of creation, but it is also important to examine other criteria such as last accessed time, type of data, the time until which the data is valid, and data value.
14. Data Retention Policies - Content
The policy document should include details of the data that need to be archived or retained.
Dividing the data into categories helps in deciding the duration of retention and the destruction procedures; a sketch of such a decision rule follows.
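To make these criteria concrete, here is a minimal sketch in Python of how creation date, last-accessed time, and data type might be combined into a retention decision. The retention periods and the should_retain helper are purely illustrative assumptions, not values from any policy.

```python
from datetime import date, timedelta
from typing import Optional

# Illustrative retention periods per data type (in days); real values
# would come from the organization's policy document.
RETENTION_PERIODS = {
    "legal": 365 * 6,
    "payroll": 365 * 6,
    "temporary": 90,
}

def should_retain(data_type: str, created: date, last_accessed: date,
                  today: Optional[date] = None) -> bool:
    """Keep a record while its creation date or last access is inside the window."""
    today = today or date.today()
    window = timedelta(days=RETENTION_PERIODS.get(data_type, 365))
    return (today - created) <= window or (today - last_accessed) <= window

# A temporary record created and last touched over a year ago has expired.
print(should_retain("temporary", date(2020, 1, 1), date(2020, 2, 1),
                    today=date(2021, 6, 1)))  # False
```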
15. Managing Data Retention Policies
The first step is to identify the managing authority.
Managing retention policies is a combined effort involving storage administrators and application owners, along with executive support.
The policy document should be validated by the company's legal counsel and fully supported by management, so that it is presented as a company policy and not restricted to an IT best-practice document.
16. Data Retention in Telecommunication
Industry
Data retention in this industry covers the storage of call detail records of telephony and internet traffic and transaction data by government and commercial organizations.
Retention requirements for service providers are found in the ISP and UASL licenses, issued under the Indian Telegraph Act, 1885.
17. Data Retention in Telecommunication
Industry
Internet Service Provider (ISP) License:
According to ISP license, each ISP must
maintain:
Customers and services: A log of all
customers registered and all the services used
by them.
18. Data Retention in Telecommunication
Industry
Internet Service Provider (ISP) License:
Outward logins/ connections or telnet:
Every outward login or telnet through an ISP’s
computer must be logged.
Data Packets:
Copies of all packets originating from the
Customer/User Premises Equipment of the ISP must
be available.
19. Data Retention in Telecommunication
Industry
Internet Service Provider (ISP) License:
Subscribers:
The list of subscribers must be available on the ISP website with authenticated access and must be available to authorized Intelligence Agencies at any time.
Internet-leased line customers:
The complete list of customers and sub-customers must be maintained, with details like name, address of installation, IP address allotted, bandwidth provided, and a contact person with phone number/email.
21. Data Retention in Telecommunication
Industry
Internet Service Provider (ISP) License:
Network records and purpose:
A record of the complete network topology of the set-up at each internet-leased-line customer's premises, along with details of connectivity, must be made available at the site of the service provider.
22. Data Retention in Telecommunication
Industry
Internet Service Provider (ISP) License:
Commercial records:
All commercial records of communications exchanged on the network must be maintained for one year.
Site:
The ISP must maintain the geographic location of all its subscribers and should be able to provide it at any given point in time.
23. Data Retention in Telecommunication
Industry
Internet Service Provider (ISP) License:
Remote activities:
The network should have a complete audit trail of remote access activities, retained for a period of 6 months.
This information must be provided on request to the licensor or any other agency authorized by the licensor.
24. Data Retention in Telecommunication
Industry
Unified Access Service License (UASL)
Introduced by the DoT, the UASL allows an access service provider to offer fixed and/or mobile services using any technology under the same license.
Licensees must retain records pertaining to customer information and transactions for security purposes.
25. Data Retention in Telecommunication
Industry
Unified Access Service License (UASL)
Mobile numbers:
Called / Calling party mobile numbers when
required.
Capture / Interception records:
Time, date and duration of interception when
required.
26. Data Retention in Telecommunication
Industry
Unified Access Service License (UASL)
Site / location:
Location of target subscribers.
All call records:
All call data records handled by the system when
required.
Failed call records
Roaming subscriber records
27. Data Retention in Telecommunication
Industry
Unified Access Service License (UASL)
Commercial records:
All commercial records of communications exchanged on the network must be retained for 1 year.
Outgoing call records:
A record of checks made on outgoing calls completed by customers who make a large number of outgoing calls, day and night, to various customers.
28. Data Retention in Telecommunication
Industry
Unified Access Service License (UASL)
Calling line identification:
A list of subscribers, including addresses and details, using calling line identification should be kept on a password-protected website accessible to authorized government agencies.
Location:
The service provider must be able to provide the geographical location of any subscriber at any point in time.
Remote access activities:
The complete audit trails of remote access activities pertaining to the network operated in India must be retained for a period of 6 months.
29. Data Retention in Telecommunication
Industry
Sample retention records:
Accounting and finance (record type - retention period):
Accounts payable/receivable ledgers and schedules - 8 years
Financial statements and annual audit reports - Permanent
Annual audit records - 8 years after completion of the audit
Annual plans and budgets - 3 years
30. Data Retention in Telecommunication
Industry
Sample retention records:
Electronic documents:
Email:
Not all emails are retained.
Emails are deleted after a period of 12 months.
Emails deleted by a staff member are archived for 6 months, after which they are permanently deleted.
Staff must not send confidential or proprietary data to outsiders.
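As a worked example of this schedule, the deletion and final purge dates for an email might be computed as below. This is a sketch only, with months approximated in days; it is not the actual mail-system configuration.

```python
from datetime import date, timedelta

RETAIN = timedelta(days=365)   # ~12 months in the live mailbox
ARCHIVE = timedelta(days=182)  # ~6 months in the archive after deletion

def email_schedule(received: date) -> dict:
    deleted = received + RETAIN   # removed from the mailbox
    purged = deleted + ARCHIVE    # permanently destroyed
    return {"delete_on": deleted, "purge_on": purged}

print(email_schedule(date(2024, 1, 1)))
# {'delete_on': datetime.date(2024, 12, 31), 'purge_on': datetime.date(2025, 7, 1)}
```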
31. Data Retention in Telecommunication
Industry
Sample retention records:
Electronic documents:
Documents:
The maximum period of retention of documents is 6 years, depending on the content of the file.
Web pages:
Browsers should be scheduled to delete internet cookies once per month.
32. Data Retention in Telecommunication
Industry
Sample retention records:
Insurance records (record type - retention period):
Insurance policies and certificates - Permanent
Claims files - Permanent
Group insurance plans (current employees) - While the plan is active or until the employee is terminated
Group insurance plans (retirees) - Permanent, or until 5 years after the death of the last eligible candidate
33. Data Retention in Telecommunication
Industry
Sample retention records:
Legal files and papers (record type - retention period):
Legal memoranda and ideas - 6 years after close of matter
Court orders - Permanent
34. Data Retention in Telecommunication
Industry
Sample retention records:
Payroll documents (record type - retention period):
Payroll registers - 6 years
Time sheets - 2 years
Payroll deductions - Termination + 6 years
35. Data Retention in Telecommunication
Industry
Sample retention records:
Personnel records (record type - retention period):
Employee service book - Permanent
Employee medical records - Termination/retirement + 6 years
Bio-data of applicants not selected - 1 year
36. Data Retention in Telecommunication
Industry
Sample retention records:
Property records (record type - retention period):
Property deeds, licenses - Permanent
Purchase/sale/lease agreements - Permanent
Property insurance policies - Permanent
37. Data Retention in Telecommunication
Industry
Sample retention records:
Tax records (record type - retention period):
Tax returns (income, property) - Permanent
Sales/use tax records - 7 years
Tax exemption documents - Permanent
38. Data Retention in Telecommunication
Industry
Laws related to data retention policy in India:
License Agreement for Provision of Internet Services:
ISPs must maintain all commercial records with regard to the communications exchanged on the network.
ISPs are responsible for maintaining a history or log of all users connected through the ISP and the services they are using.
39. Data Retention in Telecommunication
Industry
Laws related to data retention policy in India:
Information Technology Act (ITA), 2008:
Section 67C ("Preservation and Retention of Information by Intermediaries") requires online service providers, and at least some access providers, to retain a specified amount of information for a specified period of time.
40. Data Retention in Telecommunication
Industry
Laws related to data retention policy in India:
Information Technology Act (ITA), 2008:
Section 79(2) protects intermediaries from liability for third-party content, provided that they exercise due diligence while discharging the notice-and-takedown requirements of the law.
41. Data Retention in Telecommunication
Industry
Laws related to data retention policy in India:
Rules issued by the Indian Department of Information Technology, Ministry of Communications and Information Technology, 2011:
Providers must store the traffic data and "history of websites accessed" for each user for 1 year.
Users must be identified by their government-issued ID number and photograph.
42. Confidential and Sensitive Data
Handling
Personal data: information about an individual through which that individual can easily be identified, either directly or indirectly.
Confidential data: personal data that is private and should not be disclosed to others.
Sensitive data: data that must be secured from unauthorized access to protect the privacy or security of an individual or organization.
43. Confidential and Sensitive Data
Handling
Types of sensitive information:
Personal information:
Sensitive personally identifiable information is data that can be traced back to an individual, thus revealing that person's identity.
Examples: biometric data, medical information and history, bank and credit card information.
44. Confidential and Sensitive Data
Handling
Types of sensitive information:
Business information:
Information that poses a risk to the company in question if discovered by a competitor or the general public.
Examples: trade secrets, contract details, financial data, and supplier and customer identification.
45. Confidential and Sensitive Data
Handling
Types of sensitive information:
Classified information:
Information that pertains to a government body and is restricted according to its level of sensitivity, in order to protect security.
46. Confidential and Sensitive Data
Handling
Handling of sensitive data:
Access policy:
Access decisions consider the availability of the data, the acceptability of access to the data, and whether non-sensitive information can be derived from sensitive data.
47. Confidential and Sensitive Data
Handling
Types of disclosures:
Displaying exact data
Displaying bounds: only a range between a high and a low value, e.g., a salary range
Displaying negative results: revealing only what a value is not
Displaying probable values
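A minimal sketch of these disclosure types, using a made-up salary value and arbitrary rounding and noise parameters chosen purely for illustration:

```python
import random

SALARY = 83_250  # a sensitive exact value (made up)

def exact():
    # Full disclosure of the exact value.
    return SALARY

def bounds(width=10_000):
    # Disclose only a low-high range containing the value.
    low = (SALARY // width) * width
    return low, low + width

def negative(claim):
    # Disclose only what the value is not.
    return SALARY != claim

def probable(noise=5_000):
    # Disclose a randomly perturbed (probable) value.
    return SALARY + random.randint(-noise, noise)

print(bounds())          # (80000, 90000)
print(negative(50_000))  # True: the salary is not 50,000
```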
48. Confidential and Sensitive Data
Handling
Handling data:
Create a security risk-aware culture (risk management).
Define data types: classify data as confidential or sensitive.
Clarify responsibilities and accountability for the protection of confidential or sensitive data.
Limit access to confidential or sensitive data (see the sketch below).
Provide training so that staff properly use the resources and follow the guidelines and rules.
Verify compliance with policies and procedures regularly.
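The access-limiting step might look like the following sketch; the clearance table and role names are assumptions made for illustration, not a prescribed scheme.

```python
# Roles explicitly cleared to read each classification level.
CLEARANCE = {
    "public": {"staff", "manager", "auditor"},
    "sensitive": {"manager", "auditor"},
    "confidential": {"auditor"},
}

def can_read(role: str, classification: str) -> bool:
    """Allow access only when the role is cleared for the classification."""
    return role in CLEARANCE.get(classification, set())

assert can_read("auditor", "confidential")
assert not can_read("staff", "sensitive")
```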
49. Confidential and Sensitive Data
Handling
Legal provisions in India defining sensitive data and its handling:
Information Technology Act:
Organizations must follow reasonable security practices and procedures while handling sensitive personal data or information.
50. Confidential and Sensitive Data
Handling
Legal provisions in India defining sensitive data and its handling:
Information Technology Act:
Criminal punishment is prescribed for a person who discloses sensitive personal information in violation of the relevant contract, with the intention of, or knowing that, the disclosure will cause wrongful loss or gain.
51. Confidential and Sensitive Data
Handling
Information Technology Act – Feature
Sensitive personal information: Sensitive Personal Data (SPD) includes passwords, financial and credit card details, physical, physiological and mental health conditions, and medical records and history.
SPD covers only information about individuals, not information about businesses.
52. Confidential and Sensitive Data
Handling
Information Technology Act – Feature
Privacy Policy: describe what information is
collected, what is the purpose of using the
information, to whom or how the information
might be disclosed and the sound security practices
followed to safeguard the information.
54. Confidential and Sensitive Data
Handling
Information Technology Act – Feature
Consent for collection:
Approval has to be provided by letter, fax, or email.
Prior to collecting the information, the collector must give the information provider an option not to provide it.
55. Confidential and Sensitive Data
Handling
Information Technology Act – Feature
Notification:
The business must ensure that the information provider is aware of the information being collected, the purpose of using the information, the recipients of the information, and the name and address of the agency collecting the information.
56. Confidential and Sensitive Data
Handling
Information Technology Act – Feature
Use and retention:
Use of the information is restricted to the purpose for which it was collected.
The business should not retain the SPD for longer than specified.
57. Confidential and Sensitive Data
Handling
Information Technology Act – Feature
Right of access, correction and withdrawal:
The business must permit the information provider to review the information, and should ensure that any information found to be inaccurate or deficient is corrected.
58. Confidential and Sensitive Data
Handling
Information Technology Act – Feature
Transactional transfer:
Information may be transferred if it is necessary for the performance of a lawful contract between the body corporate and the information provider, or where the information provider has consented to such transfer.
59. Confidential and Sensitive Data
Handling
Information Technology Act – Feature
Security procedures:
The security practices and procedures have to be audited on a regular basis by an independent auditor.
60. Lifecycle Management Costs
Data lifecycle management:
The process of handling the flow of business information throughout its lifespan, from its creation through active use and maintenance to its eventual disposal.
It involves automating the processes for organizing data into separate tiers according to specified policies, and automating data migration from one tier to another; a sketch of such a tiering policy follows.
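A minimal sketch of policy-driven tiering, where data migrates from primary storage to backup and then to archive purely by age. The thresholds are illustrative assumptions, not recommended values.

```python
from datetime import date, timedelta

TIERS = [  # (maximum age, tier name), checked in order
    (timedelta(days=90), "primary"),
    (timedelta(days=365), "backup"),
    (timedelta(days=365 * 7), "archive"),
]

def tier_for(created: date, today: date) -> str:
    """Pick the storage tier for a record based on its age."""
    age = today - created
    for max_age, tier in TIERS:
        if age <= max_age:
            return tier
    return "destroy"  # past the final retention window

print(tier_for(date(2019, 1, 1), date(2024, 1, 1)))  # 'archive'
```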
61. Lifecycle Management Costs
Data lifecycle management – Stages:
Data creation powers the enterprise:
When an employee creates and saves a file, that information becomes part of the organization's daily operations.
Organizations typically store this active data locally and on a network server.
62. Lifecycle Management Costs
Data lifecycle management – Stages:
Backups guard against data loss:
As data ages, the enterprise can move it from primary storage into less costly off-site tape vaults or to the cloud.
A well-rounded data backup and recovery strategy combines off-site tape storage with cloud backup and data restoration capabilities.
63. Lifecycle Management Costs
Data lifecycle management – Stages:
Archiving helps contain storage costs:
Archives retain older, inactive data in case of a legal, regulatory, or audit event.
Organizations may need to hold on to data for as long as seven years.
Off-site tape archives offer high security, quick access, and lower storage costs for such long-term data storage demands.
64. Lifecycle Management Costs
Data lifecycle management – Stages:
Ensuring secure data destruction:
The final stage of the data lifecycle requires secure destruction, which is typically governed by a schedule that defines when unneeded data must be destroyed.
Once data reaches its expiration date, secure media destruction can ensure its environmentally friendly disposal.
65. Lifecycle Management Costs
Data lifecycle management – Stages:
Put secure IT asset disposition to work:
Before storage media is discarded, the data on it needs to be completely destroyed.
66. Lifecycle Management Costs
Efficient data lifecycle management:
Storage needs to be scalable to accommodate growing data volumes.
Analytics applications in some cases require access to archived and unstructured data.
Storage can be optimized for maintenance and licensing costs by migrating rarely used data into a framework like Hadoop.
67. Lifecycle Management Costs
Objectives of Data lifecycle management:
Data trustworthiness
Both structured and unstructured data must be
managed effectively.
Data privacy and security must be protected at all
times.
68. Archive Data using Hadoop
Hadoop can store any type of data.
The ability to query Hadoop data with SQL makes Hadoop a prime destination for archival data.
A common tool for performing the archiving is Sqoop, which can move the data to be archived from the data warehouse into Hadoop, as sketched below.
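For illustration, a Sqoop import of cold warehouse rows into HDFS might be driven from Python as below. The JDBC URL, credentials, table, column, and target path are hypothetical placeholders; the Sqoop flags shown are standard Sqoop 1 import options.

```python
import subprocess

# Copy rows older than a cutoff from a warehouse table into HDFS via Sqoop.
cmd = [
    "sqoop", "import",
    "--connect", "jdbc:mysql://warehouse.example.com/sales",  # placeholder
    "--username", "archiver",                                 # placeholder
    "--table", "orders",                                      # placeholder
    "--where", "order_date < '2018-01-01'",   # archive only cold data
    "--target-dir", "/archive/sales/orders_pre_2018",
    "--as-parquetfile",                       # columnar format in Hadoop
]
subprocess.run(cmd, check=True)
```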
69. Archive Data using Hadoop
Features:
Schema preservation:
The archive must ensure that data values are archived without loss of precision.
Changes to the source schema, for example adding new columns or changing data types, should also be captured by the archive.
This allows the archive to grow organically over a long period of time while maintaining a continuous historical record of the changes to the schema and the data in the source EDW.
70. Archive Data using Hadoop
Features:
Control and Security:
Archived data generally inherits the same governance
requirements as the EDW.
The archive must provide access to data on a ‘need to
know’ basis; it must guarantee that sensitive data is
encrypted or masked, and that access is audited.
An archive must also integrate with the same
enterprise security infrastructure as the EDW.
71. Archive Data using Hadoop
Features:
SQL support:
Support for SQL access to the archived data is a must.
Applications may need the archived data to generate reports or to perform analysis.
The archive should support run-time interactive queries along with batch processing, though the SLAs for such queries can be relaxed or ignored.
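Assuming a HiveServer2 endpoint in front of the archive, SQL access could look like the sketch below. The host, database, and table names are hypothetical, and PyHive is just one of several client libraries that could be used.

```python
from pyhive import hive  # assumes the PyHive package is installed

# Query archived data in Hadoop with plain SQL through HiveServer2.
conn = hive.connect(host="hadoop-archive.example.com", port=10000,
                    database="archive")
cur = conn.cursor()
cur.execute("""
    SELECT customer_id, SUM(amount) AS total
    FROM orders_pre_2018
    GROUP BY customer_id
    ORDER BY total DESC
    LIMIT 10
""")
for customer_id, total in cur.fetchall():
    print(customer_id, total)
cur.close()
conn.close()
```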
72. Testing and Delivering Big Data
Applications for Performance and
Functionality
Big Data refers to huge sets of complex structured and unstructured data.
Testing Big Data involves many processes and techniques.
Big Data testing verifies that the data is handled correctly, rather than testing the tool itself.
In testing of data, performance and functional testing are the keys.
73. Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing:
Performance testing:
Big Data applications work together with existing data for real-time analytics; testing keeps this process running.
74. Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing:
Problems with scaling capacity:
A system starts with smaller sets of data and ends up with a very large quantity of data.
As the amount of data increases, the performance of the analytics may degrade.
75. Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing:
High amount of downtime:
A large number of problems with the data can result in downtime.
If downtime occurs continuously, users should be concerned and take it as a sign that it is time to test the Big Data analytics.
76. Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing:
Poor improvement:
Failure to handle data efficiently over a longer time span results in improper development.
Hence, to run the business appropriately and deliver proper results to clients, proper testing of the data is required.
77. Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing:
No proper control:
The business requires proper control of the information it works with.
Such control can be obtained only by frequently checking the data.
78. Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing:
Poor safety measures:
Big Data stores the organization's complete data, from credential sets to confidential reports, so safety and protection are a must; management has to make sure that the data stored in HDFS is secured to the fullest.
Otherwise, attackers may be able to steal confidential data from the company's storage.
79. Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing:
Problems with the proper running of the applications:
Before data is used in different applications, it should undergo a testing procedure to find out whether it is fit for the analysis.
To ensure the applications run properly, performing proper testing is a must.
80. Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing:
Proper output:
To get the best output in any project, proper input is necessary; correcting and testing the input must be ensured to obtain the best possible output.
81. Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing:
Unpredictable performance:
When the right data is used in the right way, the potential of an organization has no limit.
Only correct and timely testing can detect inconsistency and remove uncertainty.
82. Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing:
Insufficient value:
Many other factors need to be taken care of, such as strength, precision, replication, and stability.
To obtain proper data, all of these factors need to be checked, which leads to the requirement of performing testing on Big Data.
83. Testing and Delivering Big Data
Applications for Performance and
Functionality
A High-Level Overview Of Phases In Testing Big Data Applications:
Data stage validation:
The data collected from different sources needs to be verified as correct.
The source data and the input data loaded into the system need to match.
Make sure true and valid data is put into HDFS; a validation sketch follows.
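A minimal sketch of this validation step: compare an order-independent record count and checksum computed over the source rows against the rows read back from HDFS. The in-memory lists stand in for real readers, which would stream from the warehouse and from HDFS.

```python
import hashlib

def fingerprint(rows):
    """Order-independent record count plus XOR-of-hashes checksum."""
    count, digest = 0, 0
    for row in rows:
        count += 1
        digest ^= int.from_bytes(
            hashlib.sha256(repr(row).encode()).digest()[:8], "big")
    return count, digest

source = [("u1", 10), ("u2", 20)]   # stand-in for warehouse rows
loaded = [("u2", 20), ("u1", 10)]   # stand-in for rows read back from HDFS
assert fingerprint(source) == fingerprint(loaded), "source and HDFS copies differ"
```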
84. Testing and Delivering Big Data
Applications for Performance and
Functionality
A High-Level Overview Of Phases In Testing Big Data Applications:
MapReduce validation:
Verify that the data aggregation rules are applied to the data correctly.
Validate the processed output data; a local test sketch follows.
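One way to validate an aggregation rule is to run it locally on a tiny dataset with a known answer before trusting the cluster job. The sum-per-user rule here is an assumed example, implemented as a pure-Python map/shuffle/reduce:

```python
from collections import defaultdict

records = [("u1", 5), ("u2", 3), ("u1", 7)]

# Map: emit (key, value) pairs (identity here).
mapped = [(user, amount) for user, amount in records]

# Shuffle: group values by key.
grouped = defaultdict(list)
for key, value in mapped:
    grouped[key].append(value)

# Reduce: apply the aggregation rule (sum per user).
reduced = {key: sum(values) for key, values in grouped.items()}

assert reduced == {"u1": 12, "u2": 8}  # expected aggregation output
```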
85. Testing and Delivering Big Data
Applications for Performance and
Functionality
A High-Level Overview Of Phases In Testing Big Data Applications:
Output validation:
Verify that the transformation rules are implemented accurately.
Load the information into the target system.
Make sure that the output data and the data in HDFS match, with no corruption.
86. Testing and Delivering Big Data
Applications for Performance and
Functionality
Testing of the Architecture
Hadoop stores an immense set of data with a high standard of organization and security.
Testing should always occur in the Hadoop environment itself.
Performance testing covers job completion time, proper storage utilization, throughput, and the system components.
It must also be verified that data processing is flawless.
87. Testing and Delivering Big Data
Applications for Performance and
Functionality
ActionFlow Testing
Data ingestion and throughput:
The speed at which data arrives from different sources is determined.
Messages from different data frames arriving at different times are categorized.
88. Testing and Delivering Big Data
Applications for Performance and
Functionality
ActionFlow Testing
Data processing:
Here, how fast the data is processed is determined.
When the datasets are busy, testing of the data processing is done in an isolated environment.
89. Testing and Delivering Big Data
Applications for Performance and
Functionality
ActionFlow Testing
Checking the working of all the components:
Testing each and every component is a must.
The speed of message indexing, the consumption of those messages, the phases of the MapReduce process, and search support all come under this phase.
90. Testing and Delivering Big Data
Applications for Performance and
Functionality
Performance Testing Approach:
Performance testing involves huge volumes of structured and unstructured data, and it requires a specific testing approach to test such massive data.
Hadoop handles the storage and maintenance of large sets of data, including both structured and unstructured data.
91. Testing and Delivering Big Data
Applications for Performance and
Functionality
Performance Testing Approach:
92. Testing and Delivering Big Data
Applications for Performance and
Functionality
Performance Testing Approach:
Set up the application before the testing procedure begins.
Identify the required workloads and design the tests accordingly.
Prepare each individual client separately.
Execute the testing procedure and check the output carefully.
Optimize the configuration as far as possible; a toy harness is sketched below.
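A toy harness in the spirit of these steps, measuring throughput across several simulated clients. submit_query is a hypothetical stand-in for a real read or write against the system under test.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def submit_query(i: int) -> None:
    time.sleep(0.01)  # placeholder for a real operation

def run_load(clients: int, requests: int) -> float:
    """Run `requests` operations across `clients` workers; return req/s."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=clients) as pool:
        list(pool.map(submit_query, range(requests)))
    elapsed = time.perf_counter() - start
    return requests / elapsed

print(f"{run_load(clients=8, requests=200):.1f} req/s")
```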
93. Testing and Delivering Big Data
Applications for Performance and
Functionality
Factors For Performance Testing: various parameters to be verified are:
How the data will be stored
To what extent the commit logs can grow
The concurrency of the read and write operations
The values of the connection start and stop timeouts
94. Testing and Delivering Big Data
Applications for Performance and
Functionality
Factors For Performance Testing: various parameters to be verified are:
Configure the key and row caches properly
Consider the Java Virtual Machine settings as well
Tune the working of the processing part, MapReduce, including its sort and filter steps
Check the message rate and message sizes too
95. Testing and Delivering Big Data
Applications for Performance and
Functionality
Test Environment Requirements
The Hadoop cluster should have ample storage space, since it has to process a large set of data.
The cluster should contain a large number of nodes to handle the stored information.
The CPUs should be utilized properly.
96. Testing and Delivering Big Data
Applications for Performance and
Functionality
Challenges In Big Data Testing
Automation:
Automated testing requires highly skilled technical experts.
Automated tools cannot resolve the unexpected problems that arise during testing.
97. Testing and Delivering Big Data
Applications for Performance and
Functionality
Challenges In Big Data Testing
Virtualization:
Latency in virtual machines creates timing problems in real-time testing.
Managing virtual machine images is also a challenge here.
98. Testing and Delivering Big Data
Applications for Performance and
Functionality
Challenges In Big Data Testing
Large datasets:
Validating large amounts of data while keeping the process fast.
The number of tests needs to increase.
Testing has to be done across several areas.
99. Testing and Delivering Big Data
Applications for Performance and
Functionality
Performance Testing Challenges:
Variety of technologies:
The different components of Hadoop belong to different technologies, and each of them needs its own kind of testing.
100. Testing and Delivering Big Data
Applications for Performance and
Functionality
Performance Testing Challenges:
Unavailability of specific tools:
A large number of testing tools are required for the complete testing procedure.
Test scripting:
High-quality scripting is important and essential for designing the test scenarios.
101. Testing and Delivering Big Data
Applications for Performance and
Functionality
Performance Testing Challenges:
Test environment:
An ideal test environment is a must, and in most cases it is not possible to obtain one.
Monitoring solutions:
Controlling the complete environment requires a large number of monitoring solutions, which are not always available.
102. Challenges with Data Administration
Data administration is responsible for designing and maintaining data stores.
Data is monitored, maintained, and managed by a person and/or organization.
It allows an organization to keep track of its data resources, along with their processing and interactions with different applications and business processes.
103. Challenges with Data Administration
Data administration ensures that data usage and handling work towards the enterprise's objectives.
It also integrates data from multiple sources and provides it to various applications.
104. Challenges with Data Administration
Data administrators deal with designing the logical and conceptual models, treating the data at an organizational level.
Database administrators deal with the implementation of the databases required and in use.
105. Challenges with Data Administration
Responsibilities of data administrators:
Data policies, procedures and standards:
Data policy: defines who can interact with which data, how that data can be changed, and what the effect of the change is.
106. Challenges with Data Administration
Responsibilities of data administrators:
Data policies, procedures and standards:
Data procedures: a documented plan of actions to be taken to perform a certain activity, such as backup and recovery procedures.
107. Challenges with Data Administration
Responsibilities of data administrators:
Data policies, procedures and standards:
Data standards: conventions and behaviors that need to be followed so that maintenance becomes easy.
108. Challenges with Data Administration
Responsibilities of data administrators:
Planning:
Plan for effective administration of data and provide support for future needs.
109. Challenges with Data Administration
Responsibilities of data administrators:
Data conflict resolution:
Establish procedures for resolving any conflicts in data ownership.
If data administrators have the authority to mediate and enforce the resolution of a conflict, they can be very effective in this capacity.
110. Challenges with Data Administration
Responsibilities of data administrators:
Managing the data repository:
Repositories contain metadata that describes the data stored in the data stores.
They describe an organization's data and data processing resources.
111. Challenges with Data Administration
Responsibilities of data administrators:
Internal marketing of DA concepts:
established policies and procedures must be made
known to the internal staff.
112. Challenges with Data Administration
Responsibilities of data administrators:
Designing the database:
Defining and creating the logical data model, the physical database model, and prototypes.
Security and authorization:
Ensuring that there is no unauthorized access to the data.
Users may be granted permission to access only certain views and relations, as the sketch below illustrates.
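A sketch of view-based authorization against an assumed PostgreSQL database (all object and role names are hypothetical): expose only non-sensitive columns through a view and grant the analyst role access to the view alone, never the base table.

```python
import psycopg2  # assumes a PostgreSQL database is available

conn = psycopg2.connect("dbname=hr user=dba")  # placeholder DSN
cur = conn.cursor()
# Create a view exposing only non-sensitive columns of the base table.
cur.execute("""
    CREATE VIEW employee_directory AS
    SELECT name, department, extension   -- no salary or medical columns
    FROM employees
""")
# Grant the analyst role access to the view only, not the base table.
cur.execute("GRANT SELECT ON employee_directory TO analyst")
conn.commit()
cur.close()
conn.close()
```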
113. Challenges with Data Administration
Responsibilities of data administrators:
Data availability and recovery from failures:
Ensure that the data is made available to users in such a way that they are unaware of any failure.
Database tuning:
Modifying the database, including its conceptual and logical design, to improve performance.
114. Challenges with Data Administration
Creating the data repository: integrating data from many sources into a common data repository is challenging.
Emphasis is placed on the capability to build a database quickly, tune it for maximum performance, and restore it to production quickly when problems develop.
115. Challenges with Data Administration
Enforcing the data policies and standards, especially those related to security, is challenging.
Support should be provided to incorporate changes and to make provision for future needs.
With social media, the ownership of data must be clearly defined.
116. Challenges with Data Administration
The administrator is always expected to keep abreast of new technologies, and is usually involved in mission-critical applications.
Administrators need a comprehensive understanding of a wide variety of topics in order to improve business processes in their organization.