IT6701 – Information
Management
IV Year / VII Semester
UNIT V INFORMATION
LIFECYCLE MANAGEMENT
Data retention policies; Confidential and
Sensitive data handling, lifecycle management
costs. Archive data using Hadoop; Testing and
delivering big data applications for performance
and functionality; Challenges with data
administration
Introduction
 Data is analyzed to draw important conclusions and observations about the business.
 Data is constantly updated, and older data may become obsolete.
 Organizations must decide which data is no longer required.
Data Retention Policies
A data retention policy, or records retention
policy, is an organization's established
protocol for retaining information for
operational or regulatory compliance needs.
Data Retention Policies
 deal with the complex issues of protecting
business information for a pre-determined
length of time on a pre-determined storage
system.
 These policies define different retention
periods, depending on the type of data.
Data Retention Policies
 describe procedures for archiving the information, guidelines for destroying the information when the retention period expires, and special mechanisms for handling the information.
Data Retention Policies
 Purpose:
 To maintain important records and documents for
future use or reference
 To dispose of records or documents that are no longer
needed
 To organize records so that they can be searched and
accessed easily at a later date.
Data Retention Policies
 Purpose:
 Email messaging has had a large impact on those who develop and enforce data retention policies.
 Storing this business information can be expensive for organizations.
Data Retention Policies
 Requirements:
 Legal or legitimate requirements: information that must be retained to meet legal obligations
 Business or commercial requirements: information retained for operational purposes
 Personal or private requirements: information retained for personal purposes
Data Retention Policies
 Scope: what kind of data are covered under data
retention policies
 Legal record: contracts, trademark, power of attorney,
press release.
 Final records: records of completed activities.
 Permanent records: financial registers, patents,
proposals.
Data Retention Policies
 Scope: what kind of data are covered under data
retention policies
 Accounting and corporate tax records: investments, audits,
purchase, sales records
 workplace records: day to day activities of employees,
agreement, minutes of meetings
 Employment, employee and payroll records: job posting,
job advertisements, recruitment procedures, performance
reviews.
Data Retention Policies
 Scope: what kind of data are covered under data
retention policies
 Bank records: bank transactions, deposits, cheque details,
cheque bouncing
 Historic records: records that are no longer required by the
organization.
 Temporary records: documents that are not complete or finalized
Data Retention Policies - Content
 The policy should focus on the reason behind data retention.
 It should identify the criteria on which data needs to be retained.
 Usually the decision is based on the creation date, but it is also important to examine other criteria such as last accessed time, type of data, the time until which the data is valid, and data value.
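As an illustration of applying such criteria, the following minimal Python sketch flags files whose last use is older than the retention period defined for their data type. The file extensions, retention periods and root directory are hypothetical examples, not part of any real policy.

# Sketch: flag files whose age exceeds a per-type retention period.
# Extensions, periods and the root path below are hypothetical examples.
import os
import time

RETENTION_DAYS = {".log": 365, ".csv": 7 * 365, ".tmp": 30}   # policy per data type

def files_past_retention(root):
    now = time.time()
    expired = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            ext = os.path.splitext(name)[1].lower()
            limit_days = RETENTION_DAYS.get(ext)
            if limit_days is None:
                continue                       # type not covered by the policy
            stat = os.stat(path)
            # Use the later of last-modified and last-accessed time as "last use".
            last_use = max(stat.st_mtime, stat.st_atime)
            if (now - last_use) > limit_days * 86400:
                expired.append(path)
    return expired

if __name__ == "__main__":
    for path in files_past_retention("/data/archive"):   # hypothetical root
        print("candidate for archival or destruction:", path)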
Data Retention Policies - Content
 The policy document should include details of
the data that need to be archived or retained.
 This division of data helps in deciding the duration of retention and the destruction procedures.
Managing Data Retention Policies
 Identify the managing authority for the policy.
 Framing the policy is a combined effort, involving storage administrators and application owners, along with executive support.
 The policy document should be validated by the company’s legal counsel and should be fully supported by the management, so that it is presented as a company policy and not restricted to an IT best-practice document.
Data Retention in Telecommunication
Industry
 the storage of call detail records of telephony and
internet traffic and transaction data by
government and commercial organizations.
 Retention requirements for service providers are found in the ISP and UASL licenses, which are issued under the Indian Telegraph Act, 1885.
Data Retention in Telecommunication
Industry
 Internet Service Provider (ISP) License:
 According to ISP license, each ISP must
maintain:
 Customers and services: A log of all
customers registered and all the services used
by them.
Data Retention in Telecommunication
Industry
 Internet Service Provider (ISP) License:
 Outward logins/ connections or telnet:
 Every outward login or telnet through an ISP’s
computer must be logged.
 Data Packets:
 Copies of all packets originating from the
Customer/User Premises Equipment of the ISP must
be available.
Data Retention in Telecommunication
Industry
 Internet Service Provider (ISP) License:
 Subscribers:
 A list of subscribers must be available on the ISP website with authenticated access and must be available to authorized Intelligence Agencies at any time.
 Internet-leased line customers:
 the complete list of customers and sub customers.
 Details like name, address of installation, IP address allotted,
bandwidth provided and contact person with phone
number/email.
Data Retention in Telecommunication
Industry
 Internet Service Provider (ISP) License:
 Network records and purpose:
 A record of the complete network topology of the set-up at each internet-leased-line customer’s premises, along with details of connectivity, must be made available at the site of the service provider.
Data Retention in Telecommunication
Industry
 Internet Service Provider (ISP) License:
 Commercial records:
 Commercial records of the communications exchanged on the network must be maintained for one year.
 Site:
 The ISP must maintain the geographic location of all its subscribers and should be able to provide it at any given point of time.
Data Retention in Telecommunication
Industry
 Internet Service Provider (ISP) License:
 Remote activities:
 A complete audit trail of remote access activities on the network must be retained for a period of 6 months.
 This information must be provided on request to the licensor or any other agency authorized by the licensor.
Data Retention in Telecommunication
Industry
 Unified Access Service License (UASL)
 Introduced by DoT, the UASL allows an access service provider to offer fixed and/or mobile services using any technology under the same license.
 The licensee must retain records pertaining to customer information and transactions for security purposes.
Data Retention in Telecommunication
Industry
 Unified Access Service License (UASL)
 Mobile numbers:
 Called / Calling party mobile numbers when
required.
 Capture / Interception records:
 Time, date and duration of interception when
required.
Data Retention in Telecommunication
Industry
 Unified Access Service License (UASL)
 Site / location:
 Location of target subscribers.
 All call records:
 All call data records handled by the system when
required.
 Failed call records
 Roaming subscriber records
Data Retention in Telecommunication
Industry
 Unified Access Service License (UASL)
 Commercial records:
 the communication exchanged on the network must be
retained for 1 year.
 Outgoing call records:
 A record of the checks made on customers who make a large number of outgoing calls, day and night, to various numbers.
Data Retention in Telecommunication
Industry
 Unified Access Service License (UASL)
 Calling line identification
 A list of subscribers including address and details using line
identification should be kept in a password-protected website
accessible to authorized government agencies.
 Location
 The service provider must be able to provide the geographical location
of any subscriber at any point of time.
 Remote access activities:
The complete audit trails of the remote access activities pertaining to the network operated in India must be retained for a period of 6 months.
Data Retention in Telecommunication
Industry
 Sample retention Records:
 Accounting and finance:
Record type – Retention Period
Accounts payable/receivable ledgers and schedules – 8 years
Financial statements and annual audit reports – Permanent
Annual audit records – 8 years after the completion of the audit
Annual plans and budgets – 3 years
Data Retention in Telecommunication
Industry
 Sample retention Records:
 Electronic documents:
 Email:
 Not all emails are retained.
 Emails are deleted after a period of 12 months.
 Deleted emails are archived for 6 months after a staff member deletes them, after which the mails are permanently deleted.
 Staff must not send confidential / proprietary data to outsiders.
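A minimal Python sketch of how the 12-month rule above might be automated over IMAP; the server name, account, folder and credentials are assumptions, and a production system would archive messages before deleting them.

# Sketch: delete mails older than 12 months, mirroring the sample e-mail
# retention policy above. Server, account and folder names are assumptions.
import imaplib
from datetime import date, timedelta

CUTOFF = (date.today() - timedelta(days=365)).strftime("%d-%b-%Y")

mail = imaplib.IMAP4_SSL("imap.example.com")          # hypothetical server
mail.login("retention-bot@example.com", "app-password")
mail.select("INBOX")

# IMAP SEARCH with BEFORE returns messages whose internal date is older
# than the cut-off date.
status, data = mail.search(None, f'(BEFORE "{CUTOFF}")')
for num in data[0].split():
    mail.store(num, "+FLAGS", "\\Deleted")            # mark for deletion
mail.expunge()                                        # permanently remove
mail.logout()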
Data Retention in Telecommunication
Industry
 Sample retention Records:
 Electronic documents:
 Documents:
 The maximum period of retention of documents is 6
years, depending on the content of the file.
 Web pages:
 Browsers should be scheduled to delete internet
cookies once per month.
Data Retention in Telecommunication
Industry
 Sample retention Records:
Insurance records:
Record type – Retention Period
Insurance policies and certificates – Permanent
Claims files – Permanent
Group insurance plans (current employees) – While the plan is active or until the employee is terminated
Group insurance plans (retirees) – Permanent, or until 5 years after the death of the last eligible candidate
Data Retention in Telecommunication
Industry
 Sample retention Records:
Legal files and papers:
Record type – Retention Period
Legal memoranda and ideas – 6 years after close of matter
Court orders – Permanent
Data Retention in Telecommunication
Industry
 Sample retention Records:
Payroll documents:
Record type – Retention Period
Payroll registers – 6 years
Time sheets – 2 years
Payroll deductions – Termination + 6 years
Data Retention in Telecommunication
Industry
 Sample retention Records:
Personnel records:
Record type – Retention Period
Employee service book – Permanent
Employee medical records – Termination/Retirement + 6 years
Bio-data of applicants not selected – 1 year
Data Retention in Telecommunication
Industry
 Sample retention Records:
Property records:
Record type – Retention Period
Property deeds, licenses – Permanent
Purchase/Sale/Lease agreement – Permanent
Property insurance policy – Permanent
Data Retention in Telecommunication
Industry
 Sample retention Records:
Tax records:
Record type – Retention Period
Tax returns (income, property) – Permanent
Sales/use tax records – 7 years
Tax exemption documents – Permanent
Data Retention in Telecommunication
Industry
 Laws related to Data retention policy in India:
 License Agreement for Provision of Internet Services:
 ISPs must maintain all commercial records with regard to the communications exchanged on the network.
 ISPs are responsible for maintaining a history or log of all users connected through the ISP and the services they are using.
Data Retention in Telecommunication
Industry
 Laws related to Data retention policy in India:
Information Technology Act (ITA) - 2008:
 Section 67C (“Preservation and Retention of Information by Intermediaries”) requires online service providers, and at least some access providers, to retain specified information for a specified period of time.
Data Retention in Telecommunication
Industry
 Laws related to Data retention policy in India:
Information Technology Act (ITA) - 2008:
 Section 79(2), under which intermediaries are protected from liability for third-party content, provided that they exercise due diligence while discharging the notice-and-takedown requirements of the law.
Data Retention in Telecommunication
Industry
 Laws related to Data retention policy in India:
Rules issued in 2011 by the Department of Information Technology, Ministry of Communications and Information Technology:
 Providers must store traffic data and the “history of websites accessed” for each user for 1 year.
 Users must be identified by their government-issued ID number and photograph.
Confidential and Sensitive Data
Handling
 Personal Data: information about an individual through which that individual can be easily identified, either directly or indirectly.
 Confidential Data: personal data that is private and should not be disclosed to others.
 Sensitive Data: data that must be secured from unauthorized access to protect the privacy or security of an individual or organization.
Confidential and Sensitive Data
Handling
 Types of Sensitive information:
 Personal information:
 Sensitive personally identifiable information is data
that can be traced back to an individual, thus revealing
one’s identity.
 Examples: biometric data, medical information and history, bank and credit card information.
Confidential and Sensitive Data
Handling
 Types of Sensitive information:
 Business Information:
 Business information poses a risk to the company in question if discovered by a competitor or the general public.
 Examples: trade secrets, contract details, financial data, and supplier and customer identification.
Confidential and Sensitive Data
Handling
 Types of Sensitive information:
 Classified Information:
 Classified information pertains to a government body and is restricted according to the level of sensitivity.
 It is classified in order to protect security.
Confidential and Sensitive Data
Handling
 Handling of Sensitive data:
 Access policy
 Access decisions: based on the availability of the data, the acceptability of the access, and whether non-sensitive information can be derived from sensitive data.
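A minimal Python sketch of such an access decision, combining a data classification with the requesting user's role; the classifications, role table and field names are assumptions for illustration.

# Sketch: a simple access-decision check based on data classification and
# the requesting user's role. Classifications and role table are assumptions.
CLASSIFICATION = {"salary": "sensitive", "press_release": "public",
                  "medical_history": "sensitive", "office_location": "internal"}

ALLOWED = {                       # which classifications each role may read
    "hr_manager": {"public", "internal", "sensitive"},
    "employee":   {"public", "internal"},
    "guest":      {"public"},
}

def can_access(role, field):
    level = CLASSIFICATION.get(field, "sensitive")   # default to most restrictive
    return level in ALLOWED.get(role, set())

print(can_access("employee", "salary"))          # False - sensitive data
print(can_access("hr_manager", "salary"))        # True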
Confidential and Sensitive Data
Handling
Types of Disclosures:
 Displaying exact data
 Displaying bounds – a value between a high and a low bound, e.g. a salary range
 Displaying negative results
 Displaying probable values
Confidential and Sensitive Data
Handling
 Handling Data:
 Create a security risk-aware culture – risk management.
 Define data types – classify data as confidential or sensitive.
 Clarify responsibilities and accountability for the protection of confidential or sensitive data.
 Limit access to confidential or sensitive data.
 Provide training to properly use the resources and follow the guidelines and rules.
 Verify compliance with policies and procedures regularly.
Confidential and Sensitive Data
Handling
 Law provision in India defining Sensitive
Data and its handling:
 Information Technology Act:
 The Act requires reasonable security practices and procedures while handling sensitive personal data or information.
Confidential and Sensitive Data
Handling
 Law provision in India defining Sensitive
Data and its handling:
 Information Technology Act:
 Criminal punishment for a person who:
 discloses sensitive personal information
 in violation of the relevant contract
 with an intention of, or knowing that, the disclosure would cause wrongful loss or gain.
Confidential and Sensitive Data
Handling
 Information Technology Act – Feature
 Sensitive personal information: Sensitive Personal
Data (SPD) includes passwords, financial and
credit card details, physical, physiological and
mental health condition, medical records and
history.
 SPD deals only with information of individuals
and not information of businesses.
Confidential and Sensitive Data
Handling
 Information Technology Act – Feature
 Privacy Policy: describe what information is
collected, what is the purpose of using the
information, to whom or how the information
might be disclosed and the sound security practices
followed to safeguard the information.
Confidential and Sensitive Data
Handling
 Information Technology Act – Feature
 Consent for collection:
 Consent has to be provided by letter, fax or email.
 Prior to collecting the information, the information provider must be given an option not to provide such information.
Confidential and Sensitive Data
Handling
 Information Technology Act – Feature
 Notification:
 The business must ensure that the information provider is aware of the information being collected,
 the purpose of using the information,
 the recipients of the information, and the name and address of the agency collecting the information.
Confidential and Sensitive Data
Handling
 Information Technology Act – Feature
 Use and retention:
 Use of SPD is restricted to the purpose for which it was collected.
 The business should not retain the SPD for longer than specified.
Confidential and Sensitive Data
Handling
 Information Technology Act – Feature
Right of access, correction and withdrawal
 The business must permit the information provider the right to review the information, and should ensure that any information found to be inaccurate or deficient is corrected.
Confidential and Sensitive Data
Handling
 Information Technology Act – Feature
 Transactional transfer
 SPD may be transferred only if the transfer is necessary for the performance of a lawful contract between the body corporate and the information provider, or where the information provider has consented to the transfer.
Confidential and Sensitive Data
Handling
 Information Technology Act – Feature
 Security Procedures
 Security procedures have to be audited on a regular basis by an independent auditor.
Lifecycle Management Costs
 Data lifecycle management:
 the process of handling the flow of business information throughout its lifespan, from creation to the point at which it is archived or destroyed.
 automating the process involved in organizing
data into separate tiers according to specified
policies, and automating data migration from one
tier to another.
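A minimal Python sketch of policy-driven tier assignment by last-access age, as described above; the tier names, age thresholds and the /data directory are assumptions, and a real DLM tool would also move the data.

# Sketch: policy-driven tiering - classify files into storage tiers by the
# time since last access. Tier names and thresholds are assumptions.
import os
import time

TIER_POLICY = [            # (max age in days, tier name), checked in order
    (30,   "primary"),     # touched in the last month  -> fast primary storage
    (365,  "backup"),      # touched in the last year   -> cheaper backup tier
    (None, "archive"),     # anything older             -> archive (e.g. Hadoop/tape)
]

def choose_tier(path):
    age_days = (time.time() - os.stat(path).st_atime) / 86400
    for limit, tier in TIER_POLICY:
        if limit is None or age_days <= limit:
            return tier

# A real DLM tool would now migrate the file; here we only report the decision.
for name in os.listdir("/data"):                     # hypothetical data directory
    full = os.path.join("/data", name)
    if os.path.isfile(full):
        print(name, "->", choose_tier(full))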
Lifecycle Management Costs
 Data lifecycle management – Stages:
 Data creation powers the enterprise:
 When an employee creates and saves a file, that
information becomes part of the organization's
daily operations
 store this active data locally and on a network
server
Lifecycle Management Costs
 Data lifecycle management – Stages:
 Backups guard against data loss:
 Backup data can be moved from primary storage into less costly off-site tape vaults or to the cloud.
 A well-rounded data backup and recovery strategy
combines off-site tape storage with cloud backup and
data restoration capabilities
Lifecycle Management Costs
 Data lifecycle management – Stages:
 Archiving helps contain storage costs:
 Archiving retains older, inactive data in case of a legal, regulatory or audit event.
 Organizations may need to hold on to data for as long as seven years.
 Off-site tape archives offer high security, quick access and lower storage costs for such long-term data storage demands.
Lifecycle Management Costs
 Data lifecycle management – Stages:
 Ensuring secure data destruction:
 The final stage of the data lifecycle requires secure destruction, which is typically governed by a schedule that defines when unneeded data must be destroyed.
 Once data reaches its expiration date, secure media
destruction can ensure its environmentally friendly
disposal.
Lifecycle Management Costs
 Data lifecycle management – Stages:
 Put secure IT asset disposition to work:
 Before discarding storage media, the data on it needs to be completely destroyed.
Lifecycle Management Costs
 Efficient Data lifecycle management:
 The storage needs to be scalable to accommodate growing data volumes.
 Analytics applications in some cases require access to archived and unstructured data.
 The storage can be optimized for maintenance and licensing costs by migrating rarely used data into a framework like Hadoop.
Lifecycle Management Costs
 Objectives of Data lifecycle management:
 Data trustworthiness
 Both structured and unstructured data must be
managed effectively.
 Data privacy and security must be protected at all
times.
Archive Data using Hadoop
 Hadoop can store any type of data.
 The ability to query Hadoop data with SQL makes Hadoop a prime destination for archival data.
 A common tool used to perform archiving is Sqoop, which can move the data to be archived from the data warehouse into Hadoop.
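A minimal sketch of driving a Sqoop import from Python to copy old warehouse rows into HDFS; the JDBC URL, credentials, table, cut-off condition and HDFS target directory are assumptions for illustration.

# Sketch: invoke Sqoop from Python to copy "cold" warehouse rows into HDFS.
# The JDBC URL, credentials, table, cut-off column and HDFS path are all
# hypothetical; adjust them to the actual EDW being archived.
import subprocess

sqoop_cmd = [
    "sqoop", "import",
    "--connect", "jdbc:mysql://edw-host:3306/sales",   # source data warehouse
    "--username", "archiver",
    "--password-file", "/user/archiver/.sqoop.pwd",    # avoid passwords on the CLI
    "--table", "orders",
    "--where", "order_date < '2015-01-01'",            # only archive old rows
    "--target-dir", "/archive/sales/orders_pre2015",   # destination in HDFS
    "--num-mappers", "4",
]
subprocess.run(sqoop_cmd, check=True)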
Archive Data using Hadoop
 Features:
 Schema preservation:
 to ensure that data values will be archived without loss of
precision.
 Changes to the source schema, for example adding new columns
or changing data types, should also be captured by the archive.
 allows the archive to grow organically over a long period of
time while maintaining a continuous historical record of the
changes to the schema and the data in the source EDW.
Archive Data using Hadoop
Features:
 Control and Security:
 Archived data generally inherits the same governance
requirements as the EDW.
 The archive must provide access to data on a ‘need to
know’ basis; it must guarantee that sensitive data is
encrypted or masked, and that access is audited.
 An archive must also integrate with the same
enterprise security infrastructure as the EDW.
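A minimal Python sketch of masking sensitive fields before they land in the archive, as required above; the field names and salt are assumptions, and hashing is only one of several possible masking techniques.

# Sketch: mask or hash sensitive fields before they land in the archive.
# Field names and the salt are assumptions for illustration.
import hashlib

SENSITIVE_FIELDS = {"credit_card", "password", "medical_history"}
SALT = b"replace-with-a-secret-salt"

def mask_record(record):
    masked = {}
    for field, value in record.items():
        if field in SENSITIVE_FIELDS:
            # One-way hash keeps values joinable for analytics but unreadable.
            masked[field] = hashlib.sha256(SALT + str(value).encode()).hexdigest()
        else:
            masked[field] = value
    return masked

print(mask_record({"name": "A. Kumar", "credit_card": "4111111111111111"}))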
Archive Data using Hadoop
 Features:
 SQL support
 Support for SQL access to the archived data is a must.
 Applications require the archived data to generate reports or to perform analysis.
 The archive must support run-time interactive queries along with batch processing.
 Service-level agreements (SLAs) on query response time can be relaxed compared to the EDW.
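A minimal sketch of SQL access to archived data through Hive, here using the PyHive client as one possible option; the host, database and table names are assumptions.

# Sketch: SQL access to archived data in Hadoop via Hive (PyHive is one of
# several possible SQL-on-Hadoop clients). Host, database and table names
# are assumptions; response-time SLAs are looser than on the EDW.
from pyhive import hive

conn = hive.Connection(host="hadoop-gateway", port=10000, username="analyst")
cursor = conn.cursor()
cursor.execute(
    "SELECT order_year, SUM(amount) AS total "
    "FROM archive.orders_pre2015 "
    "GROUP BY order_year"
)
for year, total in cursor.fetchall():
    print(year, total)
cursor.close()
conn.close()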
Testing and Delivering Big Data
Applications for Performance and
Functionality
 A huge set of complex structured and unstructured data is called Big Data.
 Testing Big Data involves a lot of processes and techniques.
 Big Data testing verifies that the data is handled correctly, rather than testing the tool itself.
 Performance testing and functional testing are the keys in testing the data.
Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing
Are:
 Performance Issues:
Big Data applications work together with existing data for real-time analytics;
 testing keeps this process running smoothly.
Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing
Are:
 Problems With Expansion Capacity:
A project starts with smaller sets of data and ends up with a huge quantity of data.
 As the amount of data increases, the performance of the analytics may reduce.
Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing Are:
High Quantity Of Downtime:
Due to a large number of problems, the data faces certain issues resulting in downtime.
So if downtime occurs continuously, users should be concerned and take it as a sign that it is time for testing the Big Data analytics.
Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing Are:
Poor Improvement:
Failure to handle data efficiently over a longer time span results in improper development.
 Hence, for running the business appropriately, proper testing of data is required, because it ensures the delivery of proper results to clients.
Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing
Are:
No Proper Control:
The business requires proper control of the information it works with.
 This control can be obtained only by frequently checking the data.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Signs That Show We Should Go For Testing Are:
 Poor Safety Measures:
Big Data stores the organization’s complete data, from credential sets to all the confidential reports, so safety and protection in Big Data is a must; the management has to make sure that the data stored in the HDFS is secured to the fullest.
 Otherwise, attackers may try to steal confidential data from the company’s storage.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Signs That Show We Should Go For Testing Are:
 Problems With The Proper Running Of The Applications:
Before data is used in different applications, it should undergo a testing procedure to find out whether it is fit for the analysis.
 In order to assure the proper running of the applications, performing proper testing is a must.
Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing
Are:
Proper Output:
In order to get the best output in any project, proper input is necessary, and correction and testing of the input must be ensured to obtain the best possible output.
Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing Are:
Unpredictable Performance:
 When the right data is used in the right way, the potential of an organization finds no limit.
 Only correct and timely testing helps to detect inconsistency and remove uncertainty.
Testing and Delivering Big Data
Applications for Performance and
Functionality
Signs That Show We Should Go For Testing Are:
Insufficient Value:
A lot of other factors need to be taken care of, such as strength, precision, traditional values, replication, stability, etc.
 To obtain proper data, all these factors need to be checked, which leads to the requirement of performing testing on Big Data.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 A High-Level Overview Of Phases In Testing Big Data
Applications:
 The Testing Procedure Consists Of:
 Data Stage Validation:
 The data collected from different sources needs to be validated for correctness.
The source data and the input data loaded into HDFS need to match.
Make sure true and valid data is loaded into the HDFS.
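A minimal Python sketch of this validation step, comparing record count and a content checksum between a source extract and its copy in HDFS; the file paths are assumptions, and the HDFS copy is read back with the standard "hdfs dfs -cat" shell command.

# Sketch: data-stage validation - compare record count and a content checksum
# of the source extract with the copy loaded into HDFS. Paths are hypothetical.
import hashlib
import subprocess

def count_and_checksum(lines):
    digest = hashlib.md5()
    count = 0
    for line in sorted(lines):          # sort so row order does not matter
        digest.update(line)
        count += 1
    return count, digest.hexdigest()

with open("/staging/orders.csv", "rb") as f:            # source extract
    src = count_and_checksum(f.read().splitlines())

hdfs_cat = subprocess.run(
    ["hdfs", "dfs", "-cat", "/data/raw/orders/*"],      # copy inside HDFS
    capture_output=True, check=True,
)
dst = count_and_checksum(hdfs_cat.stdout.splitlines())

print("source    :", src)
print("hdfs copy :", dst)
print("MATCH" if src == dst else "MISMATCH - investigate the load")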
Testing and Delivering Big Data
Applications for Performance and
Functionality
 A High-Level Overview Of Phases In Testing
Big Data Applications:
 The Testing Procedure Consists Of:
 MapReduce Validation:
 Data aggregation rules are applied on the data.
 The processed output data is validated.
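A minimal Python sketch of this validation, re-applying an assumed aggregation rule (sum of amount per customer) on a sample of the raw input and comparing it with the MapReduce job's output; the file paths, column names and the rule itself are assumptions.

# Sketch: MapReduce validation - re-apply the aggregation rule on a small
# input sample and compare with the job's output for the same keys.
from collections import defaultdict
import csv

# 1. Recompute the expected aggregate from a sample of the raw input.
expected = defaultdict(float)
with open("/staging/sample_orders.csv") as f:
    for row in csv.DictReader(f):                 # columns: customer_id, amount
        expected[row["customer_id"]] += float(row["amount"])

# 2. Read the MapReduce output (tab-separated key/value) for those keys.
actual = {}
with open("/staging/mapreduce_output/part-00000") as f:
    for line in f:
        key, value = line.rstrip("\n").split("\t")
        actual[key] = float(value)

# 3. Compare the two, allowing a small floating-point tolerance.
for cust, total in expected.items():
    got = actual.get(cust)
    ok = got is not None and abs(got - total) < 1e-6
    print(cust, "OK" if ok else f"MISMATCH expected={total} got={got}")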
Testing and Delivering Big Data
Applications for Performance and
Functionality
 A High-Level Overview Of Phases In Testing Big Data
Applications:
 The Testing Procedure Consists Of:
 Output Validation:
 Check that the transformation rules are implemented accurately.
 The data is loaded into the target system.
 Make sure that there is no corruption between the data in the output and the data in HDFS.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Testing of the Architecture
 Hadoop stores an immense set of data with high-standard arrangement and security.
 The testing should always occur in the Hadoop environment.
 Performance testing covers job completion time, use of proper storage, throughput and system-level metrics.
 Data processing must be proved to be flawless.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 ActionFlow Testing
 Data Ingestion And Throughput:
 The rate at which data arrives from different sources is determined.
 Messages arriving from different data sources at different rates are identified and classified.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 ActionFlow Testing
 Data Processing:
 How fast the data is processed and queries are executed is determined.
 When the underlying datasets are populated, testing of the data processing is done in isolation.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 ActionFlow Testing
 Check The Working Of All The Components:
 A test of each and every component is a must.
 The speed of message indexing, consumption of those messages, the phases of the MapReduce procedure, and search queries all come under this phase.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Performance Testing Approach:
Performance testing involves testing of huge volumes of structured and unstructured data, and it requires a specific testing approach to test such massive data.
Hadoop handles the storage and maintenance of a large set of data, including both structured and unstructured data.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Performance Testing Approach:
The approach begins with setting up the Big Data application and cluster before the testing procedure starts.
 Identify the required workloads and design them accordingly.
 Prepare each individual client separately (for example, with custom scripts).
 Execute the test and check the output carefully.
 Determine the optimum configuration.
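A minimal Python sketch of the execution step above: run an assumed workload repeatedly and report throughput and latency percentiles. run_query() is a hypothetical stand-in for the real client call (for example a Hive query or HBase read).

# Sketch: a minimal load-generation step - run the designed workload
# repeatedly, record latencies, then report throughput and percentiles.
import random
import statistics
import time

def run_query():
    # Placeholder for the real workload; here we just sleep for a random
    # service time so the sketch runs stand-alone.
    time.sleep(random.uniform(0.01, 0.05))

N = 200
latencies = []
start = time.time()
for _ in range(N):
    t0 = time.time()
    run_query()
    latencies.append(time.time() - t0)
elapsed = time.time() - start

latencies.sort()
print(f"throughput : {N / elapsed:.1f} requests/sec")
print(f"median     : {statistics.median(latencies) * 1000:.1f} ms")
print(f"95th pct   : {latencies[int(0.95 * N) - 1] * 1000:.1f} ms")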
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Parameters For Performance Testing: Various parameters to be
verified for performance testing are
How the data will be stored
To what extent the commit logs can grow
The concurrency of the read and write operations
The values of the various timeouts
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Parameters For Performance Testing: Various parameters to be
verified for performance testing are
Configure the key cache and row cache properly
Also consider the parameters of the Java Virtual Machine
MapReduce performance: the sorting and merging done by the processing part
Check the message rate and the message sizes
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Test Environment Requirements
The Hadoop cluster should have ample storage space, since it has to process a large set of data.
The cluster should contain a large number of nodes to handle the stored information.
The CPU should be utilized properly.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Challenges In Big Data Testing
 Automation:
Automated testing of Big Data requires a high level of technical expertise.
Automated tools cannot handle the unexpected problems that arise during testing.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Challenges In Big Data Testing
 Virtualization:
Virtual machine latency creates timing problems in real-time testing.
Managing images in Big Data testing is also a challenge.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Challenges In Big Data Testing
 Large Dataset:
A large amount of data must be verified, and quickly.
The testing effort needs to be scaled up and automated.
 Testing has to be done across several platforms.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Performance Testing Challenges:
 Varieties In Technologies:
The different components of Hadoop belong to different technologies,
and each one of them needs a separate kind of testing.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Performance Testing Challenges:
 Unavailability Of Specific Tools:
No single tool can perform the complete end-to-end testing; a large number of testing components are required.
 Test Scripting:
 A high degree of scripting is important and essential to design the test scenarios and test cases.
Testing and Delivering Big Data
Applications for Performance and
Functionality
 Performance Testing Challenges:
 Test Environment:
A suitable test environment is a must, and in most cases it is not possible to obtain one because of the large data size.
 Monitoring Solutions:
 Only limited solutions exist that can monitor the complete environment, and they are not always available.
Challenges with Data Administration
 Data administrators are responsible for designing and maintaining data stores.
 Data is monitored, maintained and managed by a person and/or organization.
 Data administration allows an organization to track its data resources, along with their processing and communications with different applications and business processes.
Challenges with Data Administration
 It ensures that data usage and handling works towards the enterprise’s objectives.
 It integrates data from multiple resources and provides the data to various applications.
Challenges with Data Administration
 The Data Administrator deals with the design of the logical and conceptual models, treating data at an organizational level.
 Database Administrators deal with the
implementation of databases required and in
use.
Challenges with Data Administration
 Responsibilities of data administrators:
 Data policies, procedures and standards:
 Data policy: specifies who can interact with which data, how that data can be changed and what the effect of the change is.
Challenges with Data Administration
 Responsibilities of data administrators:
 Data policies, procedures and standards:
 Data Procedures: documented plan of actions to
be taken to perform a certain activity like backup
and recovery procedures.
Challenges with Data Administration
 Responsibilities of data administrators:
 Data policies, procedures and standards:
 Data standards: conventions and behaviors that
need to be followed so that the maintenance
becomes easy.
Challenges with Data Administration
 Responsibilities of data administrators:
 Planning:
 to plan for an effective administration of data and
also provide support for future needs.
Challenges with Data Administration
 Responsibilities of data administrators:
 Data conflict resolution:
 Establish procedures for resolving any conflicts in data ownership.
 When data administrators have the authority to mediate and enforce the resolution of a conflict, they can be very effective in this capacity.
Challenges with Data Administration
 Responsibilities of data administrators:
 Managing the data repository:
 The data repository contains metadata that holds descriptions of the data stored in the data stores.
 This metadata describes an organization’s data and data-processing resources.
Challenges with Data Administration
 Responsibilities of data administrators:
 Internal marketing of DA concepts:
 established policies and procedures must be made
known to the internal staff.
Challenges with Data Administration
 Responsibilities of data administrators:
 Designing the database:
 defining and creating the logical data model, the physical database model and prototyping.
 Security and authorization:
 ensures that there is no unauthorized access to the data.
 Users may be granted permission to access only certain views and relations.
Challenges with Data Administration
 Responsibilities of data administrators:
Data availability and recovery from failures:
 Ensure that the data is made available to its users in such a way that the users are unaware of the failure.
 Database tuning:
 modifying the database, the conceptual and logical
design.
Challenges with Data Administration
 Creating the data repository: integrating data from multiple sources to create a common data repository is challenging.
 Emphasis is placed on the capability to build a database quickly, tune it for maximum performance and restore it to production quickly when problems develop.
Challenges with Data Administration
 Enforcing the data policies and standards, especially those related to security, is a challenge.
 Support should be provided to incorporate changes and make provision for future scope.
 With social media data, defining the ownership of data is difficult.
Challenges with Data Administration
 The administrator is always expected to keep abreast of new technologies and is usually involved in mission-critical applications.
 The administrator has to have a comprehensive understanding of a wide variety of topics and improve business processes in the organization.
Soumen Santra
 

Recently uploaded (20)

Exception Handling notes in java exception
Exception Handling notes in java exceptionException Handling notes in java exception
Exception Handling notes in java exception
 
sieving analysis and results interpretation
sieving analysis and results interpretationsieving analysis and results interpretation
sieving analysis and results interpretation
 
DEEP LEARNING FOR SMART GRID INTRUSION DETECTION: A HYBRID CNN-LSTM-BASED MODEL
DEEP LEARNING FOR SMART GRID INTRUSION DETECTION: A HYBRID CNN-LSTM-BASED MODELDEEP LEARNING FOR SMART GRID INTRUSION DETECTION: A HYBRID CNN-LSTM-BASED MODEL
DEEP LEARNING FOR SMART GRID INTRUSION DETECTION: A HYBRID CNN-LSTM-BASED MODEL
 
Self-Control of Emotions by Slidesgo.pptx
Self-Control of Emotions by Slidesgo.pptxSelf-Control of Emotions by Slidesgo.pptx
Self-Control of Emotions by Slidesgo.pptx
 
ACEP Magazine edition 4th launched on 05.06.2024
ACEP Magazine edition 4th launched on 05.06.2024ACEP Magazine edition 4th launched on 05.06.2024
ACEP Magazine edition 4th launched on 05.06.2024
 
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单专业办理
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单专业办理一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单专业办理
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单专业办理
 
Advanced control scheme of doubly fed induction generator for wind turbine us...
Advanced control scheme of doubly fed induction generator for wind turbine us...Advanced control scheme of doubly fed induction generator for wind turbine us...
Advanced control scheme of doubly fed induction generator for wind turbine us...
 
Literature Review Basics and Understanding Reference Management.pptx
Literature Review Basics and Understanding Reference Management.pptxLiterature Review Basics and Understanding Reference Management.pptx
Literature Review Basics and Understanding Reference Management.pptx
 
bank management system in java and mysql report1.pdf
bank management system in java and mysql report1.pdfbank management system in java and mysql report1.pdf
bank management system in java and mysql report1.pdf
 
6th International Conference on Machine Learning & Applications (CMLA 2024)
6th International Conference on Machine Learning & Applications (CMLA 2024)6th International Conference on Machine Learning & Applications (CMLA 2024)
6th International Conference on Machine Learning & Applications (CMLA 2024)
 
Design and Analysis of Algorithms-DP,Backtracking,Graphs,B&B
Design and Analysis of Algorithms-DP,Backtracking,Graphs,B&BDesign and Analysis of Algorithms-DP,Backtracking,Graphs,B&B
Design and Analysis of Algorithms-DP,Backtracking,Graphs,B&B
 
RAT: Retrieval Augmented Thoughts Elicit Context-Aware Reasoning in Long-Hori...
RAT: Retrieval Augmented Thoughts Elicit Context-Aware Reasoning in Long-Hori...RAT: Retrieval Augmented Thoughts Elicit Context-Aware Reasoning in Long-Hori...
RAT: Retrieval Augmented Thoughts Elicit Context-Aware Reasoning in Long-Hori...
 
basic-wireline-operations-course-mahmoud-f-radwan.pdf
basic-wireline-operations-course-mahmoud-f-radwan.pdfbasic-wireline-operations-course-mahmoud-f-radwan.pdf
basic-wireline-operations-course-mahmoud-f-radwan.pdf
 
Unbalanced Three Phase Systems and circuits.pptx
Unbalanced Three Phase Systems and circuits.pptxUnbalanced Three Phase Systems and circuits.pptx
Unbalanced Three Phase Systems and circuits.pptx
 
PROJECT FORMAT FOR EVS AMITY UNIVERSITY GWALIOR.ppt
PROJECT FORMAT FOR EVS AMITY UNIVERSITY GWALIOR.pptPROJECT FORMAT FOR EVS AMITY UNIVERSITY GWALIOR.ppt
PROJECT FORMAT FOR EVS AMITY UNIVERSITY GWALIOR.ppt
 
在线办理(ANU毕业证书)澳洲国立大学毕业证录取通知书一模一样
在线办理(ANU毕业证书)澳洲国立大学毕业证录取通知书一模一样在线办理(ANU毕业证书)澳洲国立大学毕业证录取通知书一模一样
在线办理(ANU毕业证书)澳洲国立大学毕业证录取通知书一模一样
 
原版制作(unimelb毕业证书)墨尔本大学毕业证Offer一模一样
原版制作(unimelb毕业证书)墨尔本大学毕业证Offer一模一样原版制作(unimelb毕业证书)墨尔本大学毕业证Offer一模一样
原版制作(unimelb毕业证书)墨尔本大学毕业证Offer一模一样
 
Recycled Concrete Aggregate in Construction Part III
Recycled Concrete Aggregate in Construction Part IIIRecycled Concrete Aggregate in Construction Part III
Recycled Concrete Aggregate in Construction Part III
 
[JPP-1] - (JEE 3.0) - Kinematics 1D - 14th May..pdf
[JPP-1] - (JEE 3.0) - Kinematics 1D - 14th May..pdf[JPP-1] - (JEE 3.0) - Kinematics 1D - 14th May..pdf
[JPP-1] - (JEE 3.0) - Kinematics 1D - 14th May..pdf
 
Heap Sort (SS).ppt FOR ENGINEERING GRADUATES, BCA, MCA, MTECH, BSC STUDENTS
Heap Sort (SS).ppt FOR ENGINEERING GRADUATES, BCA, MCA, MTECH, BSC STUDENTSHeap Sort (SS).ppt FOR ENGINEERING GRADUATES, BCA, MCA, MTECH, BSC STUDENTS
Heap Sort (SS).ppt FOR ENGINEERING GRADUATES, BCA, MCA, MTECH, BSC STUDENTS
 

IT6701-Information Management Unit 5

  • 2. UNIT V INFORMATION LIFECYCLE MANAGEMENT Data retention policies; Confidential and Sensitive data handling, lifecycle management costs. Archive data using Hadoop; Testing and delivering big data applications for performance and functionality; Challenges with data administration
  • 3. Introduction  Data – for analyzing and for drawing important conclusions and observations about the business.  there are constant updates and older data might become obsolete.  Decide which data is no longer required.
  • 4. Data Retention Policies A data retention policy, or records retention policy, is an organization's established protocol for retaining information for operational or regulatory compliance needs.
  • 5. Data Retention Policies  deal with the complex issues of protecting business information for a pre-determined length of time on a pre-determined storage system.  These policies define different retention periods, depending on the type of data.
  • 6. Data Retention Policies  describe procedures for archiving the information, guidelines for destroying the information when the time limit exceeds and special mechanisms for handling the information.
  • 7. Data Retention Policies  Purpose:  To maintain important records and documents for future use or reference  To dispose of records or documents that are no longer needed  To organize records so that they can be searched and accessed easily at a later date.
  • 8. Data Retention Policies  Purpose:  Email messaging had a large impact on those who develop and enforce data retention policies.  This stored business information can be expensive for the business organizations.
  • 9. Data Retention Policies  Requirements:  Legal or legitimate requirements: information related to legal information  Business or commercial requirements: for operational perspective  Personal or private requirements: for personal perspective
  • 10. Data Retention Policies  Scope: what kind of data are covered under data retention policies  Legal record: contracts, trademark, power of attorney, press release.  Final records: records of completed activities.  Permanent records: financial registers, patents, proposals.
  • 11. Data Retention Policies  Scope: what kind of data are covered under data retention policies  Accounting and corporate tax records: investments, audits, purchase, sales records  workplace records: day to day activities of employees, agreement, minutes of meetings  Employment, employee and payroll records: job posting, job advertisements, recruitment procedures, performance reviews.
  • 12. Data Retention Policies  Scope: what kind of data are covered under data retention policies  Bank records: bank transactions, deposits, cheque details, cheque bouncing  Historic records: records that are no longer required by the organization. Temporary records: documents that are not complete or finalized
  • 13. Data Retention Policies - Content  to focus on the reason behind data retention.  identification of criteria on which data needs to be retained.  Usually the decision is based on the creation of date, but it is also important to examine other criteria such as last accessed time, type of data, time till which the data is valid, data value
  • 14. Data Retention Policies - Content  The policy document should include details of the data that need to be archived or retained.  the division of data would help in deciding the duration of retention and destruction procedures.
  • 15. Managing Data Retention Policies  to identify and managing authority  combined effort, involving storage administrators and application owners, along with executive support.  The policy document should be validated by the company’s legal counsel and should also be fully supported by the management to be presented as a company policy and not restricting it as an IT best practice document.
  • 16. Data Retention in Telecommunication Industry  the storage of call detail records of telephony and internet traffic and transaction data by government and commercial organizations.  Retention requirements for service providers are found in the ISP and UASL licenses, which are documented in the Indian Telegraph Act, 1885.
  • 17. Data Retention in Telecommunication Industry  Internet Service Provider (ISP) License:  According to ISP license, each ISP must maintain:  Customers and services: A log of all customers registered and all the services used by them.
  • 18. Data Retention in Telecommunication Industry  Internet Service Provider (ISP) License:  Outward logins/ connections or telnet:  Every outward login or telnet through an ISP’s computer must be logged.  Data Packets:  Copies of all packets originating from the Customer/User Premises Equipment of the ISP must be available.
  • 19. Data Retention in Telecommunication Industry  Internet Service Provider (ISP) License:  Subscribers:  available on the ISP website with authenticated access and must be available to authorized Intelligence Agencies at any time.  Internet-leased line customers:  the complete list of customers and sub customers.  Details like name, address of installation, IP address allotted, bandwidth provided and contact person with phone number/email.
  • 21. Data Retention in Telecommunication Industry  Internet Service Provider (ISP) License:  Network records and purpose:  A record of the complete network topology of the set-up at each internet leased-line customer's premises, along with details of connectivity, must be made available at the service provider's site.
  • 22. Data Retention in Telecommunication Industry  Internet Service Provider (ISP) License:  Commercial records:  communications exchanged on the network must be maintained for a year.  Site:  the ISP must maintain the geographic location of all its subscribers and should be able to provide it at any given point in time.
  • 23. Data Retention in Telecommunication Industry  Internet Service Provider (ISP) License:  Remote activities:  a complete audit trail of remote access to the network must be maintained and retained for a period of 6 months.  This information must be provided on request to the licensor or any other agency authorized by the licensor.
  • 24. Data Retention in Telecommunication Industry  Unified Access Service License (UASL):  introduced by DoT, through which an access service provider can offer fixed and/or mobile services using any technology under the same license.  The licensee is required to retain records pertaining to customer information or transactions for security purposes.
  • 25. Data Retention in Telecommunication Industry  Unified Access Service License (UASL)  Mobile numbers:  Called / Calling party mobile numbers when required.  Capture / Interception records:  Time, date and duration of interception when required.
  • 26. Data Retention in Telecommunication Industry  Unified Access Service License (UASL)  Site / location:  Location of target subscribers.  All call records:  All call data records handled by the system when required.  Failed call records  Roaming subscriber records
  • 27. Data Retention in Telecommunication Industry  Unified Access Service License (UASL)  Commercial records:  the communication exchanged on the network must be retained for 1 year.  Outgoing call records:  a record of checks made on outgoing calls completed by customers making a large number of outgoing calls day and night to various customers.
  • 28. Data Retention in Telecommunication Industry  Unified Access Service License (UASL)  Calling line identification:  A list of subscribers, including addresses and line-identification details, should be kept on a password-protected website accessible to authorized government agencies.  Location:  The service provider must be able to provide the geographical location of any subscriber at any point of time.  Remote access activities:  The complete audit trails of remote access activities pertaining to the network operated in India must be retained for a period of 6 months.
  • 29. Data Retention in Telecommunication Industry  Sample retention records:  Accounting and finance (record type: retention period):  Accounts payable/receivable ledgers and schedules: 8 years;  Financial statements and annual audit reports: permanent;  Annual audit records: 8 years after completion of the audit;  Annual plans and budgets: 3 years.
  • 30. Data Retention in Telecommunication Industry  Sample retention records:  Electronic documents:  Email:  Not all emails are retained.  All emails are deleted after a period of 12 months.  Emails deleted by a staff member are archived for 6 months, after which they are permanently deleted.  Staff must not send confidential / proprietary data to outsiders.
  • 31. Data Retention in Telecommunication Industry  Sample retention Records:  Electronic documents:  Documents:  The maximum period of retention of documents is 6 years, depending on the content of the file.  Web pages:  Browsers should be scheduled to delete internet cookies once per month.
  • 32. Data Retention in Telecommunication Industry  Sample retention records:  Insurance records (record type: retention period):  Insurance policies and certificates: permanent;  Claims files: permanent;  Group insurance plans (current employees): while the plan is active or until the employee is terminated;  Group insurance plans (retirees): permanent, or until 5 years after the death of the last eligible candidate.
  • 33. Data Retention in Telecommunication Industry  Sample retention records:  Legal files and papers (record type: retention period):  Legal memoranda and ideas: 6 years after close of matter;  Court orders: permanent.
  • 34. Data Retention in Telecommunication Industry  Sample retention records:  Payroll documents (record type: retention period):  Payroll registers: 6 years;  Time sheets: 2 years;  Payroll deductions: termination + 6 years.
  • 35. Data Retention in Telecommunication Industry  Sample retention records:  Personnel records (record type: retention period):  Employee service book: permanent;  Employee medical records: termination/retirement + 6 years;  Bio-data of applicants not selected: 1 year.
  • 36. Data Retention in Telecommunication Industry  Sample retention records:  Property records (record type: retention period):  Property deeds and licenses: permanent;  Purchase/sale/lease agreements: permanent;  Property insurance policies: permanent.
  • 37. Data Retention in Telecommunication Industry  Sample retention records:  Tax records (record type: retention period):  Tax returns (income, property): permanent;  Sales/use tax records: 7 years;  Tax exemption documents: permanent.
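Retention schedules like the samples above can be encoded directly in software so that disposal decisions are automated rather than manual. The following is a minimal Python sketch; the record-type keys, the mapping, and the function name are illustrative assumptions based on the sample tables, not part of any standard or product.

```python
from datetime import date, timedelta
from typing import Optional

# Illustrative retention schedule derived from the sample tables above.
# None means "permanent" (never destroyed automatically).
RETENTION_YEARS = {
    "payroll_register": 6,
    "time_sheet": 2,
    "annual_plan_budget": 3,
    "sales_use_tax_record": 7,
    "court_order": None,
    "insurance_policy": None,
}

def is_due_for_destruction(record_type: str, created_on: date,
                           today: Optional[date] = None) -> bool:
    """Return True if the record has outlived its retention period."""
    today = today or date.today()
    years = RETENTION_YEARS.get(record_type)
    if years is None:
        return False  # permanent records are never destroyed automatically
    return today >= created_on + timedelta(days=365 * years)  # simple approximation

# Example: a time sheet created in January 2020 has passed its 2-year period by mid-2024.
print(is_due_for_destruction("time_sheet", date(2020, 1, 15), date(2024, 6, 1)))  # True
```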
  • 38. Data Retention in Telecommunication Industry  Laws related to Data retention policy in India:  License Agreement for Provision of Internet Services.  maintain all commercial records with regard to the communications exchanged on the network.  ISPs are responsible for maintaining history or a log of all users connected through ISP and the services they are using.
  • 39. Data Retention in Telecommunication Industry  Laws related to Data retention policy in India:  Information Technology Act (ITA) - 2008.  Section 67C ("Preservation and Retention of Information by Intermediaries") requires online service providers, and at least some access point providers, to retain a specified amount of information for a specified period of time.
  • 40. Data Retention in Telecommunication Industry  Laws related to Data retention policy in India:  Information Technology Act (ITA) - 2008.  Section 79(2), under which intermediaries are protected from liability for third-party content provided that they exercise due diligence while discharging the notice-and-takedown requirements of the law.
  • 41. Data Retention in Telecommunication Industry  Laws related to Data retention policy in India:  The Indian Department of Information Technology, Ministry of Communications and Information Technology - 2011.  Service providers must store the traffic data and "history of websites accessed" for each user for 1 year.  Users must be identified by their government-issued ID number and photograph.
  • 42. Confidential and Sensitive Data Handling  Personal Data: information about an individual, through which that individual can be easily identified, either directly or indirectly.  Confidential Data: personal data that is private and should not be disclosed to others.  Sensitive Data: data that must be secured from unauthorized access to protect the privacy or security of an individual or organization.
  • 43. Confidential and Sensitive Data Handling  Types of Sensitive information:  Personal information:  Sensitive personally identifiable information is data that can be traced back to an individual, thus revealing one’s identity.  Information: biometric data, medical information and history, bank and credit card information.
  • 44. Confidential and Sensitive Data Handling  Types of Sensitive information:  Business Information:  poses a risk to the company in question if discovered by a competitor or the general public.  Trade secrets, contract details, financial data and supplier and customer identification.
  • 45. Confidential and Sensitive Data Handling  Types of Sensitive information:  Classified Information:  pertains to a government body and is restricted according to its level of sensitivity, in order to protect security.
  • 46. Confidential and Sensitive Data Handling  Handling of Sensitive data:  Access policy  Access decisions consider: availability of the data, acceptability of the access to the data, and non-sensitive information derived from sensitive data.
  • 47. Confidential and Sensitive Data Handling  Types of Disclosures:  Displaying exact data  Displaying bounds: a value between a high and a low limit, e.g. a salary range instead of the exact salary  Displaying negative results  Displaying probable values
  • 48. Confidential and Sensitive Data Handling  Handling Data:  Create a security risk-aware culture (risk management).  Define data types and classify them as confidential or sensitive.  Clarify responsibilities and accountability for protection of confidential or sensitive data.  Limit access to confidential or sensitive data.  Provide training to properly use the resources and follow the guidelines and rules.  Verify compliance with policies and procedures regularly.
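A short sketch can make the classification, access-limiting and disclosure ideas from the last two slides concrete. The Python below is illustrative only: the field names, the masking rule, and the salary-range step are assumptions chosen to show "exact data vs. bounds vs. masked value", not a prescribed scheme.

```python
# Minimal sketch: field-level classification plus two disclosure styles
# (masking exact values; returning a salary range instead of the exact figure).
SENSITIVE_FIELDS = {"credit_card", "medical_history", "salary"}

def mask(value: str, keep_last: int = 4) -> str:
    """Hide all but the last few characters of a sensitive value."""
    return "*" * max(len(value) - keep_last, 0) + value[-keep_last:]

def disclose(record: dict, authorized: bool) -> dict:
    """Return a view of the record appropriate to the caller's authorization."""
    out = {}
    for field, value in record.items():
        if authorized or field not in SENSITIVE_FIELDS:
            out[field] = value                      # exact data
        elif field == "salary":
            low = (value // 10000) * 10000          # bounds instead of exact value
            out[field] = f"{low}-{low + 9999}"
        else:
            out[field] = mask(str(value))           # masked value
    return out

employee = {"name": "A. Kumar", "salary": 54000, "credit_card": "4111111111111111"}
print(disclose(employee, authorized=False))  # salary shown as a range, card masked
```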
  • 49. Confidential and Sensitive Data Handling  Law provision in India defining Sensitive Data and its handling:  Information Technology Act:  reasonable security practices and procedures while handling sensitive personal data or information.
  • 50. Confidential and Sensitive Data Handling  Law provision in India defining Sensitive Data and its handling:  Information Technology Act:  Criminal punishment for a person who discloses sensitive personal information in violation of the relevant contract, with the intention of causing, or knowing that the disclosure would cause, wrongful loss or gain.
  • 51. Confidential and Sensitive Data Handling  Information Technology Act – Feature  Sensitive personal information: Sensitive Personal Data (SPD) includes passwords, financial and credit card details, physical, physiological and mental health condition, medical records and history.  SPD deals only with information of individuals and not information of businesses.
  • 52. Confidential and Sensitive Data Handling  Information Technology Act – Feature  Privacy Policy: describe what information is collected, what is the purpose of using the information, to whom or how the information might be disclosed and the sound security practices followed to safeguard the information.
  • 54. Confidential and Sensitive Data Handling  Information Technology Act – Feature  Consent for collection:  approval has to be provided by letter, fax or email.  prior to collecting the information, provide an option to the information provider to not provide such information.
  • 55. Confidential and Sensitive Data Handling  Information Technology Act – Feature  Notification:  the business must ensure that the information provider is aware of the information being collected, the purpose of using the information, the recipients of the information, and the name and address of the agency collecting the information.
  • 56. Confidential and Sensitive Data Handling  Information Technology Act – Feature  Use and retention:  restricted to the purpose for which it was collected.  The business should not maintain the SPD for longer than it is specified.
  • 57. Confidential and Sensitive Data Handling  Information Technology Act – Feature Right of access, correction and withdrawal  permit the information provider the right to review the information, and should ensure that any information found to be inaccurate or deficient be corrected.
  • 58. Confidential and Sensitive Data Handling  Information Technology Act – Feature  Transactional transfer:  SPD may be transferred if it is necessary for the performance of a lawful contract between the body corporate and the information provider, or where the information provider has consented to such transfer.
  • 59. Confidential and Sensitive Data Handling  Information Technology Act – Feature  Security Procedures  procedure has to be audited on a regular basis by an independent auditor
  • 60. Lifecycle Management Costs  Data lifecycle management:  the process of managing the flow of business information throughout its lifespan, from creation and initial storage until it becomes obsolete and is deleted.  It involves automating the organization of data into separate tiers according to specified policies, and automating data migration from one tier to another.
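The policy-driven tiering described above can be expressed as a small rule table that an automation job evaluates for each dataset. The sketch below is a hedged illustration: the tier names, age thresholds and the final "destroy" outcome are assumptions, not a product API.

```python
from datetime import date, timedelta

# Illustrative tier policy: data is assigned to a storage tier based on how
# recently it was accessed; anything past the last threshold becomes a
# candidate for secure destruction.
TIER_POLICY = [
    (timedelta(days=30), "primary"),       # hot: local / primary storage
    (timedelta(days=365), "backup"),       # warm: cheaper backup or cloud tier
    (timedelta(days=7 * 365), "archive"),  # cold: off-site tape or Hadoop archive
]

def choose_tier(last_accessed: date, today: date) -> str:
    """Pick the storage tier for a dataset given its last-access date."""
    age = today - last_accessed
    for threshold, tier in TIER_POLICY:
        if age <= threshold:
            return tier
    return "destroy"  # past all retention thresholds

print(choose_tier(date(2024, 5, 1), date(2024, 5, 20)))   # primary
print(choose_tier(date(2015, 1, 1), date(2024, 5, 20)))   # destroy
```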
  • 61. Lifecycle Management Costs  Data lifecycle management – Stages:  Data creation powers the enterprise:  When an employee creates and saves a file, that information becomes part of the organization's daily operations  store this active data locally and on a network server
  • 62. Lifecycle Management Costs  Data lifecycle management – Stages:  Backups guard against data loss:  enterprise can move it from primary storage into less costly off-site tape vaults or to the cloud.  A well-rounded data backup and recovery strategy combines off-site tape storage with cloud backup and data restoration capabilities
  • 63. Lifecycle Management Costs  Data lifecycle management – Stages:  Archiving helps contain storage costs:  older inactive data is retained in case of a legal, regulatory or audit event, and may need to be held for as long as seven years.  Off-site tape archives offer high security, quick access and lower storage costs for such long-term data storage demands.
  • 64. Lifecycle Management Costs  Data lifecycle management – Stages:  Ensuring secure data destruction:  The final stage of the data lifecycle requires secure destruction, which is typically governed by a schedule that defines when unneeded data must be destroyed.  Once data reaches its expiration date, secure media destruction can ensure its environmentally friendly disposal.
  • 65. Lifecycle Management Costs  Data lifecycle management – Stages:  Put secure IT asset disposition to work:  before discarding the storage media it needs to be completely destroyed.
  • 66. Lifecycle Management Costs  Efficient Data lifecycle management:  the storage needs to be scalable to accommodate growing data volumes.  Analytics applications in some cases require access to archived and unstructured data.  Storage can be optimized for maintenance and licensing costs by migrating rarely used data into frameworks like Hadoop.
  • 67. Lifecycle Management Costs  Objectives of Data lifecycle management:  Data trustworthiness  Both structured and unstructured data must be managed effectively.  Data privacy and security must be protected at all times.
  • 68. Archive Data using Hadoop  Hadoop can store any type of data, and the ability to query Hadoop data with SQL makes Hadoop a prime destination for archival data.  A common tool used to perform archiving is Sqoop, which can move the data to be archived from the data warehouse into Hadoop.
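The slide names Sqoop as the tool that moves warehouse data into Hadoop. As a comparable illustration (not Sqoop itself), the hedged PySpark sketch below shows the same movement: read a table over JDBC and write it to HDFS as Parquet, keeping a copy of the schema so that schema changes in the source stay visible. The connection URL, table name, partition column and paths are placeholders, and a suitable JDBC driver is assumed to be available.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("edw-archive").getOrCreate()

# Pull the (illustrative) warehouse table to be archived over JDBC.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:postgresql://edw-host:5432/warehouse")  # placeholder
      .option("dbtable", "sales_history")                          # placeholder
      .option("user", "archiver")
      .option("password", "secret")
      .load())

# Write to HDFS as Parquet, appending to the archive; "year" is an assumed
# partition column in the source table.
df.write.mode("append").partitionBy("year").parquet("hdfs:///archive/sales_history")

# Record the schema alongside each run so that schema evolution in the source
# EDW (new columns, changed types) is preserved in the archive's history.
print(df.schema.json())
```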
  • 69. Archive Data using Hadoop  Features:  Schema preservation:  to ensure that data values will be archived without loss of precision.  Changes to the source schema, for example adding new columns or changing data types, should also be captured by the archive.  allows the archive to grow organically over a long period of time while maintaining a continuous historical record of the changes to the schema and the data in the source EDW.
  • 70. Archive Data using Hadoop Features:  Control and Security:  Archived data generally inherits the same governance requirements as the EDW.  The archive must provide access to data on a ‘need to know’ basis; it must guarantee that sensitive data is encrypted or masked, and that access is audited.  An archive must also integrate with the same enterprise security infrastructure as the EDW.
  • 71. Archive Data using Hadoop  Features:  SQL support:  Support for SQL access to the archived data is a must.  Applications need to make use of the archived data to generate reports or to perform analysis, and to execute run-time interactive queries along with batch processing.  SLAs on queries against the archive can be relaxed or ignored.
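Assuming the archive was written as Parquet (as in the earlier sketch), one way to provide the SQL access this slide calls for is Spark SQL. Table and column names below are placeholders carried over from that sketch.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("archive-query").getOrCreate()

# Expose the archived Parquet data to SQL as a temporary view.
spark.read.parquet("hdfs:///archive/sales_history") \
     .createOrReplaceTempView("sales_history_archive")

# Reports and ad hoc analysis can then run ordinary SQL against the archive.
spark.sql("""
    SELECT year, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM sales_history_archive
    GROUP BY year
    ORDER BY year
""").show()
```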
  • 72. Testing and Delivering Big Data Applications for Performance and Functionality  A huge set of complex structured and unstructured data is called Big Data.  Testing Big Data involves a lot of processes and techniques.  Big Data testing verifies that the data is handled correctly, rather than testing the tool itself.  Performance testing and functional testing are the keys.
  • 73. Testing and Delivering Big Data Applications for Performance and Functionality  Signs That Show We Should Go For Testing Are:  Performance problems: Big Data applications work together with existing data for real-time analytics, and degraded performance slows the whole process down.
  • 74. Testing and Delivering Big Data Applications for Performance and Functionality  Signs That Show We Should Go For Testing Are:  Problems With Scalability: the system starts with smaller data sets and ends up with a huge quantity of data.  As the amount of data increases, the performance of the analytics may degrade.
  • 75. Testing and Delivering Big Data Applications for Performance and Functionality  Signs That Show We Should Go For Testing Are:  High Amount Of Downtime: a large number of problems with the data can result in increased downtime.  If downtime keeps occurring, users should be concerned and take it as a sign that it is time to test the Big Data analytics.
  • 76. Testing and Delivering Big Data Applications for Performance and Functionality  Signs That Show We Should Go For Testing Are:  Poor Growth: failure to handle data efficiently over a longer time span results in improper development.  Hence, for running the business appropriately, proper testing of the data is required, because it ensures delivery of the proper results to clients.
  • 77. Testing and Delivering Big Data Applications for Performance and Functionality  Signs That Show We Should Go For Testing Are:  No Proper Control: the business requires proper control of the information it works with, and this can be achieved only by checking the data frequently.
  • 78. Testing and Delivering Big Data Applications for Performance and Functionality  Signs That Show We Should Go For Testing Are:  Poor Safety Measures: Big Data stores the organization's complete data, from credential sets to confidential reports, so safety and protection are a must, and management has to make sure that the data stored in HDFS is secured to the fullest, since attackers may attempt to steal confidential data from the company's storage.
  • 79. Testing and Delivering Big Data Applications for Performance and Functionality  Signs That Show We Should Go For Testing Are:  Problems With The Proper Running Of The Applications: before data is used in different applications, it should undergo a testing procedure to find out whether it is fit for the analysis.  In order to assure that the applications run properly, proper testing is a must.
  • 80. Testing and Delivering Big Data Applications for Performance and Functionality  Signs That Show We Should Go For Testing Are:  Proper Output: to get the best output in any project, proper input is necessary, and correction and testing of the input must be ensured to obtain the best possible output.
  • 81. Testing and Delivering Big Data Applications for Performance and Functionality  Signs That Show We Should Go For Testing Are:  Unpredictable Performance:  When the right data is used in the right way, the potential of an organization has no limit.  Only correct and timely testing helps to detect inconsistencies and remove uncertainty.
  • 82. Testing and Delivering Big Data Applications for Performance and Functionality  Signs That Show We Should Go For Testing Are:  Insufficient Quality: many other factors need to be taken care of, such as robustness, precision, historical values, replication, stability, etc.  To obtain proper data, all of these factors need to be checked, which leads to the requirement of testing Big Data.
  • 83. Testing and Delivering Big Data Applications for Performance and Functionality  A High-Level Overview Of Phases In Testing Big Data Applications:  The Testing Procedure Includes  Data Phase Proofing (staging validation):  The data collected from different places needs to be verified as correct.  The source data and the input data loaded into the system need to match.  Make sure true and valid data is put into HDFS.
  • 84. Testing and Delivering Big Data Applications for Performance and Functionality  A High-Level Overview Of Phases In Testing Big Data Applications:  The Testing Procedure Includes  Proofing of MapReduce:  data aggregation rules are applied to the data, and the processed output data is verified.
  • 85. Testing and Delivering Big Data Applications for Performance and Functionality  A High-Level Overview Of Phases In Testing Big Data Applications:  The Testing Procedure Includes  Proofing of the Output:  verify that the transformation rules are implemented accurately, load the information into the target system, and make sure that the data in the output and in HDFS is not corrupted.
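The three phases just listed (staging validation, MapReduce validation, output validation) can be sketched with plain Python. In a real test the inputs would come from the source system, HDFS and the target store; here small in-memory samples stand in for them, and all names and values are illustrative.

```python
# Stand-ins for the real data sources used in each validation phase.
source_rows = [("c1", 100), ("c1", 250), ("c2", 75)]   # extract from the source system
staged_rows = [("c1", 100), ("c1", 250), ("c2", 75)]   # what landed in HDFS
aggregated  = {"c1": 350, "c2": 75}                    # MapReduce (aggregation) output
target_rows = {"c1": 350, "c2": 75}                    # rows loaded into the target system

# Phase 1: staging validation - counts (and ideally checksums) must match.
assert len(source_rows) == len(staged_rows), "row count mismatch at ingestion"

# Phase 2: MapReduce validation - recompute the aggregation rule independently.
expected = {}
for key, amount in staged_rows:
    expected[key] = expected.get(key, 0) + amount
assert expected == aggregated, "aggregation rule not applied correctly"

# Phase 3: output validation - the target system must match the processed output.
assert aggregated == target_rows, "output/target reconciliation failed"

print("all three validation phases passed")
```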
  • 86. Testing and Delivering Big Data Applications for Performance and Functionality  Testing of the Architecture  Hadoop stores an immense set of data with a highly standardized arrangement and security.  The testing should always occur in the Hadoop environment itself.  Performance testing includes job completion time, proper storage utilization, throughput and system-level metrics.  Data processing must be proved to be flawless.
  • 87. Testing and Delivering Big Data Applications for Performance and Functionality  ActionFlow Testing  Data Intake And Throughput:  the speed at which data arrives from different sources is determined, and the classification of messages from different data sources over different time windows is measured.
  • 88. Testing and Delivering Big Data Applications for Performance and Functionality  ActionFlow Testing  Dealing With Data:  here it is determined how fast the data is processed.  When the datasets are populated, the data processing is tested in isolation.
  • 89. Testing and Delivering Big Data Applications for Performance and Functionality  ActionFlow Testing  Check The Working Of All The Components:  each and every component must be tested.  The speed of message indexing, the consumption of those messages, the phases of the MapReduce procedure and search/query support all come under this phase.
  • 90. Testing and Delivering Big Data Applications for Performance and Functionality  Performance Testing Approach: involves testing huge volumes of structured and unstructured data, and it requires a specific testing approach to test such massive data.  Hadoop handles the storage and maintenance of large data sets, including both structured as well as unstructured data.
  • 91. Testing and Delivering Big Data Applications for Performance and Functionality  Performance Testing Approach:
  • 92. Testing and Delivering Big Data Applications for Performance and Functionality  Performance Testing Approach:  Set up the application before the testing procedure begins.  Identify the required workloads and design the tests accordingly.  Prepare each and every client separately.  Execute the testing procedure and check the output carefully.  Optimize the configuration based on the results.
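A minimal harness for the "prepare clients, run the workload, measure the output" steps might look like the Python sketch below. The operation under test is a placeholder; in a real run it would write to or query the cluster (for example an HDFS put or a Hive query) instead of sleeping.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def process_one_message(i: int) -> None:
    """Hypothetical operation under test; stands in for real cluster work."""
    time.sleep(0.001)

def run_workload(clients: int, messages: int) -> float:
    """Run `messages` operations across `clients` concurrent workers; return msgs/sec."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=clients) as pool:
        list(pool.map(process_one_message, range(messages)))
    elapsed = time.perf_counter() - start
    return messages / elapsed

# Step through increasing client counts and record throughput for each run.
for clients in (1, 4, 8):
    print(clients, "clients ->", round(run_workload(clients, 200), 1), "msgs/sec")
```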
  • 93. Testing and Delivering Big Data Applications for Performance and Functionality  Factors For Performance Testing: Various parameters to be verified for performance testing are  How the information will be stored  To what extent the commit logs can grow  The concurrency of the read and write operations  The values of the start and stop timeouts.
  • 94. Testing and Delivering Big Data Applications for Performance and Functionality  Factors For Performance Testing: Various parameters to be verified for performance testing are  Configure the key and row caches properly  Consider the Java Virtual Machine parameters as well  Verify the sort and merge behaviour of the processing part, MapReduce  Check the message rate and message sizes too.
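It can help to capture the parameters from the two slides above as a single configuration record, so each performance run documents exactly which settings it was executed against. The keys and values below are illustrative assumptions, not real Hadoop configuration names.

```python
# Illustrative checklist of performance-test tunables, recorded per test run.
perf_test_settings = {
    "storage_format": "parquet",        # how the information will be stored
    "commit_log_max_size_mb": 1024,     # how far the commit logs may grow
    "concurrent_reads": 32,             # read concurrency
    "concurrent_writes": 32,            # write concurrency
    "start_timeout_s": 60,              # start/stop timeouts
    "stop_timeout_s": 60,
    "key_cache_size_mb": 100,           # key and row cache settings
    "row_cache_size_mb": 0,
    "jvm_heap_gb": 8,                   # JVM sizing
    "mapreduce_sort_buffer_mb": 256,    # sort/merge behaviour of MapReduce
    "message_rate_per_s": 5000,         # messaging rate and size
    "message_size_kb": 2,
}
print(len(perf_test_settings), "parameters recorded for this test run")
```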
  • 95. Testing and Delivering Big Data Applications for Performance and Functionality  Test Environment Requirements  As always, the Hadoop setup should have ample space, since it has to process a large set of data.  The cluster should contain a large enough set of nodes to handle the stored information.  The CPU should be utilized properly.
  • 96. Testing and Delivering Big Data Applications for Performance and Functionality  Challenges In Big Data Testing  Automation: automated testing requires highly skilled technical experts, and automated tools cannot resolve unexpected problems.
  • 97. Testing and Delivering Big Data Applications for Performance and Functionality  Challenges In Big Data Testing  Virtualization: latency in virtual machines creates timing problems in real-time testing, and managing virtual machine images adds further overhead.
  • 98. Testing and Delivering Big Data Applications for Performance and Functionality  Challenges In Big Data Testing  Large Datasets: a large amount of data must be verified, and verified quickly; the number of tests needs to increase, and testing has to be done across several areas.
  • 99. Testing and Delivering Big Data Applications for Performance and Functionality  Performance Testing Challenges:  Variety Of Technologies: the different components of Hadoop are built on different technologies, and each of them needs a separate kind of testing.
  • 100. Testing and Delivering Big Data Applications for Performance and Functionality  Performance Testing Challenges:  Unavailability Of Specific Tools: a large number of testing tools are required for the complete testing procedure.  Test Scripting:  high-quality test scripting is therefore important and essential for these scenarios.
  • 101. Testing and Delivering Big Data Applications for Performance and Functionality  Performance Testing Challenges:  Test Environment: a suitable test environment is a must, and in most cases it is not easy to obtain.  Monitoring Solutions:  monitoring and controlling the complete environment requires a large number of solutions, which are not always available.
  • 102. Challenges with Data Administration  Data administration is responsible for designing and maintaining data stores.  Data is monitored, maintained and managed by a person and/or organization.  It allows an organization to keep track of its data resources, along with their processing and communications with different applications and business processes.
  • 103. Challenges with Data Administration  It ensures that data usage and handling work towards the enterprise's objectives.  It also integrates data from multiple sources and provides it to various applications.
  • 104. Challenges with Data Administration  Data Administrator deals with designing of the logical and conceptual models treating the data at an organizational level.  Database Administrators deal with the implementation of databases required and in use.
  • 105. Challenges with Data Administration  Responsibilities of data administrators:  Data policies, procedures and standards:  Data policy: defines who can interact with which data, how that data can be changed and what the effect of the change is.
  • 106. Challenges with Data Administration  Responsibilities of data administrators:  Data policies, procedures and standards:  Data Procedures: documented plan of actions to be taken to perform a certain activity like backup and recovery procedures.
  • 107. Challenges with Data Administration  Responsibilities of data administrators:  Data policies, procedures and standards:  Data standards: conventions and behaviors that need to be followed so that the maintenance becomes easy.
  • 108. Challenges with Data Administration  Responsibilities of data administrators:  Planning:  to plan for an effective administration of data and also provide support for future needs.
  • 109. Challenges with Data Administration  Responsibilities of data administrators:  Data conflict resolution:  establish procedures for resolving any conflicts in data ownership.  Since data administrators have the authority to mediate and enforce the resolution of a conflict, they can be very effective in this capacity.
  • 110. Challenges with Data Administration  Responsibilities of data administrators:  Managing the data repository:  the repository contains metadata that holds descriptions of the data stored in the data stores and describes the organization's data and data-processing resources.
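One way to picture "metadata describing the data stores" is a catalog entry per dataset. The sketch below is illustrative only: the dataset name and the fields shown (owner, schema, classification, retention) are assumptions meant to show what such metadata might hold.

```python
# Illustrative metadata catalog entry for one dataset in the data repository.
catalog = {
    "customer_orders": {
        "owner": "sales_ops",
        "schema": {"order_id": "bigint", "customer_id": "bigint", "amount": "decimal(10,2)"},
        "source_systems": ["billing_db"],
        "classification": "confidential",
        "retention_years": 7,
    }
}
print(catalog["customer_orders"]["classification"])  # -> confidential
```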
  • 111. Challenges with Data Administration  Responsibilities of data administrators:  Internal marketing of DA concepts:  established policies and procedures must be made known to the internal staff.
  • 112. Challenges with Data Administration  Responsibilities of data administrators:  designing the database:  defining and creating the logical data model, physical database model and prototyping.  Security and authorization:  ensures that there is no unauthorized access to the data.  users may be granted permission to access only certain views and relations.
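The "users may be granted permission to access only certain views and relations" point can be illustrated with a tiny authorization check. User names and view names below are hypothetical placeholders.

```python
# Minimal sketch of view-level authorization: each user may access only the
# views or relations explicitly granted to them.
permissions = {
    "analyst_a": {"sales_summary_view"},
    "hr_officer": {"employee_public_view", "payroll_summary_view"},
}

def can_access(user: str, relation: str) -> bool:
    """Return True only if the relation has been granted to the user."""
    return relation in permissions.get(user, set())

print(can_access("analyst_a", "sales_summary_view"))    # True
print(can_access("analyst_a", "payroll_summary_view"))  # False: unauthorized access denied
```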
  • 113. Challenges with Data Administration  Responsibilities of data administrators:  Data availability and recovery from failures:  ensure that the data is made available to its users in such a way that the users are unaware of the failure.  Database tuning:  modifying the database and its conceptual and logical design.
  • 114. Challenges with Data Administration  Creating the data repository: integrating data from multiple sources to create a common data repository is challenging.  Emphasis is on the capability to build a database quickly, tune it for maximum performance and restore it to production quickly when problems develop.
  • 115. Challenges with Data Administration  Enforcing data policies and standards, especially those related to security, is difficult.  Support should be provided to incorporate changes and make provision for future scope.  With social media data, the ownership of the data should be clearly defined.
  • 116. Challenges with Data Administration  The administrator is always expected to keep abreast of new technologies, and is usually involved in mission-critical applications.  Administrators need a comprehensive understanding of a wide variety of topics in order to improve business processes in their organization.