NIST Privacy Engineering Working Group - Risk Model
1. Privacy Engineering Objectives and Risk Model - Discussion Deck
Objective-Based Design for Improving Privacy in Information Systems
2. Purpose and Scope
• This discussion deck describes initial draft components of privacy engineering developed from the output of NIST’s first Privacy Engineering Workshop, April 9-10, 2014. It does not address a complete privacy risk management framework.
• The draft components include a definition for privacy engineering, a set of privacy engineering objectives, and a system privacy risk model. This deck will be used to facilitate discussions at the second Privacy Engineering Workshop on September 15-16, 2014 in San Jose, CA.
• The privacy engineering objectives and risk model are primarily focused on mitigating risks arising from unanticipated consequences of normal system behavior. Risks to privacy arising from malicious actors or attacks can continue to be mitigated by following standard security principles.
3. Risk Model
[Framework diagram: the Privacy Risk Management Framework connects Policy (Law, Regulation, FIPPs, etc.) and the Risk Model (Personal Information + Data Actions + Context = System Privacy Risk) to the privacy engineering components: Objectives (Predictability, Manageability, Confidentiality), Risk Requirements, Controls (derived from FIPPs, etc.), Metrics, and System Evaluation. Marked for discussion at the September workshop.]
4. Privacy Engineering
Privacy engineering is a collection of methods to support the mitigation of risks to individuals of loss of self-determination, loss of trust, discrimination, and economic loss by providing predictability, manageability, and confidentiality of personal information within information systems.
5. Key Terms
• Privacy Engineering Objectives: Outcome-based objectives that guide design requirements to achieve privacy-preserving information systems.
• Data Lifecycle: The data actions an information system performs.
• Data Actions: Normal information system operations that handle personal information.
• Problematic Data Actions: Problematic data actions occur when the data actions of an information system contravene the objectives of predictability, manageability, or confidentiality.
• Context: The circumstances surrounding a system’s collection, generation, processing, disclosure, and retention of personal information.
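The key terms above map naturally onto simple types. A minimal Python sketch, with class and field names that are illustrative rather than taken from the deck:

```python
from dataclasses import dataclass, field
from enum import Enum

class Objective(Enum):
    """The three privacy engineering objectives named in the deck."""
    PREDICTABILITY = "predictability"
    MANAGEABILITY = "manageability"
    CONFIDENTIALITY = "confidentiality"

@dataclass
class DataAction:
    """A normal information system operation that handles personal
    information, tagged with the lifecycle phase it belongs to."""
    name: str
    lifecycle_phase: str  # e.g. "Processing", "Disposal", "Retention"
    # Objectives this action contravenes, if any (empty for normal actions).
    contravenes: set = field(default_factory=set)

    @property
    def is_problematic(self) -> bool:
        # Per the deck: a data action is problematic when it contravenes
        # predictability, manageability, or confidentiality.
        return bool(self.contravenes)

aggregation = DataAction("Aggregation", "Processing",
                         contravenes={Objective.PREDICTABILITY})
print(aggregation.is_problematic)  # True
```

A `Context` here could be a plain string or a richer record of the circumstances of collection and use; the deck defines it only descriptively.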
6. Key Terms (cont’d)
Privacy Harms: Harms to individuals that result from problematic data actions. Harms can be grouped into four categories:
• Loss of Self-Determination: The loss of an individual’s personal sovereignty or ability to freely make choices. This includes the harms of Loss of Autonomy, Exclusion, and Loss of Liberty.
• Discrimination: The unfair or unequal treatment of individuals. This includes the harms of Stigmatization and Power Imbalance.
• Loss of Trust: The breach of implicit or explicit expectations or agreements about the handling of personal information.
• Economic Loss: Economic loss can include direct financial losses as the result of identity theft, as well as the failure to receive fair value in a transaction involving personal information.
9. Predictability
Enabling reliable assumptions about the rationale for the collection of personal information and other data actions to be taken with that personal information.

Example Violation of Predictability
Data Lifecycle Phase: Processing
Normal Data Action: Aggregation
Problematic Data Action: Unanticipated Revelation
Potential Harms: Stigmatization, Power Imbalance, Loss of Trust, Loss of Autonomy
10. Manageability
Providing the capability for authorized modification of personal information, including alteration, deletion, or selective disclosure of personal information.

Example Violation of Manageability
Data Lifecycle Phase: Disposal
Normal Data Action: Normal Account Deletion
Problematic Data Action: Unwarranted Restriction
Potential Harms: Exclusion, Economic Loss, Loss of Trust
11. Confidentiality
Preserving authorized restrictions on information access and disclosure, including means for protecting personal privacy and proprietary information. (NIST SP 800-53, rev. 4)

Example Violation of Confidentiality
Data Lifecycle Phase: Retention
Normal Data Action: Secure Storage
Problematic Data Action: Insecurity
Potential Harms: Economic Loss, Stigmatization
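The three example-violation tables above can be collected into one structure for lookup or reporting. A sketch using plain dictionaries, where each record reproduces one row from the deck (the helper function name is hypothetical):

```python
# One record per example-violation table in the deck (slides 9-11).
EXAMPLE_VIOLATIONS = [
    {
        "objective": "Predictability",
        "lifecycle_phase": "Processing",
        "normal_data_action": "Aggregation",
        "problematic_data_action": "Unanticipated Revelation",
        "potential_harms": ["Stigmatization", "Power Imbalance",
                            "Loss of Trust", "Loss of Autonomy"],
    },
    {
        "objective": "Manageability",
        "lifecycle_phase": "Disposal",
        "normal_data_action": "Normal Account Deletion",
        "problematic_data_action": "Unwarranted Restriction",
        "potential_harms": ["Exclusion", "Economic Loss", "Loss of Trust"],
    },
    {
        "objective": "Confidentiality",
        "lifecycle_phase": "Retention",
        "normal_data_action": "Secure Storage",
        "problematic_data_action": "Insecurity",
        "potential_harms": ["Economic Loss", "Stigmatization"],
    },
]

def harms_for(objective: str) -> list:
    """Look up the potential harms of the example violation of an objective."""
    for row in EXAMPLE_VIOLATIONS:
        if row["objective"] == objective:
            return row["potential_harms"]
    return []

print(harms_for("Confidentiality"))  # ['Economic Loss', 'Stigmatization']
```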
13. Assessing System Privacy Risk
• To understand the magnitude of privacy risk within an information system, the proposed model focuses on the risk or likelihood of problematic data actions occurring that could result in privacy harm to individuals.
• A key distinction between privacy and security is that privacy harms may arise even though the system is performing data actions in accordance with its operational purpose. Moreover, these harms may be difficult to assess, as they may occur externally to the system and beyond the system owner’s awareness.
• Thus, the risk model concentrates on the risk of data actions becoming problematic, which the system owner or designer can better recognize and control.
14. System Privacy Risk Equation
System privacy risk is the risk of problematic data actions occurring.
Personal Information Collected or Generated + Data Actions Performed on that Information + Context = System Privacy Risk
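The equation above is best read as an enumeration rather than arithmetic: the deck defines no quantitative scoring. This sketch (function name hypothetical) simply lists the combinations the equation says make up system privacy risk, so each can be assessed in turn:

```python
from itertools import product

def system_privacy_risk(personal_info, data_actions, contexts):
    """Enumerate the (personal information, data action, context)
    combinations that together constitute system privacy risk.
    Illustrative only; the deck provides no scoring formula."""
    return [
        {"personal_information": pi, "data_action": da, "context": ctx}
        for pi, da, ctx in product(personal_info, data_actions, contexts)
    ]

risk_surface = system_privacy_risk(
    personal_info=["email address", "purchase history"],
    data_actions=["Aggregation", "Retention"],
    contexts=["first-party analytics"],
)
print(len(risk_surface))  # 4 combinations to assess
```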
15. Context
“Context” means the circumstances surrounding a system’s collection, generation, processing, disclosure, and retention of personal information.
16. Examples of Contextual Factors
• The relationship between individuals and the organization that controls the system
• The extent and frequency of direct interactions between an individual and the system
• The nature and history of those interactions
• The range of goods or services that the system offers, and individuals’ use of them
• The types of personal information that are foreseeably necessary for the system to process or generate in order to provide the goods or services
• The level of understanding that reasonable individuals would have of how the system processes the personal information that it contains
• Information known by the system about the privacy preferences of individuals
• The extent to which personal information under the control of the system is exposed to public view
• General user experience with information technologies
17. Problematic Data Actions
Problematic data actions occur when the data actions of an information system contravene the objectives of predictability, manageability, or confidentiality.
18. Appropriation: Personal information is used in ways that exceed an individual’s expectation or authorization
Appropriation occurs when personal information is used in ways that an individual would object to or would have negotiated additional value for, absent an information asymmetry or other marketplace failure. Privacy harms that Appropriation can lead to include loss of trust, economic loss, or power imbalance.
19. Distortion: The use or dissemination of inaccurate or misleadingly incomplete personal information
Distortion can present users in an inaccurate, unflattering, or disparaging manner, opening the door for discrimination harms or loss of liberty.
20. Induced Disclosure: Pressure to divulge personal information
Induced disclosure can occur when users feel compelled to provide information disproportionate to the purpose or outcome of the transaction. Induced disclosure can include leveraging access or privilege to an essential (or perceived essential) service. It can lead to harms such as power imbalance or loss of autonomy.
21. Insecurity: Lapses in data security
Lapses in data security can result in a loss of trust, as well as exposing individuals to economic loss and stigmatization.
22. Surveillance: Tracking or monitoring of personal information that is disproportionate to the purpose or outcome of the service
The difference between the data action of monitoring and the problematic data action of surveillance can be very narrow. Tracking user behavior, transactions, or personal information may be conducted for operational purposes such as protection from cyber threats or to provide better services, but it becomes surveillance when it leads to harms such as power imbalance, loss of trust, or loss of autonomy or liberty.
23. Unanticipated Revelation: Non-contextual use of data reveals or exposes an individual or facets of an individual in unexpected ways
Unanticipated revelation can arise from aggregation and analysis of large and/or diverse data sets. Unanticipated revelation can give rise to stigmatization, power imbalance, and loss of trust and autonomy.
24. Unwarranted Restriction: The improper denial of access or loss of privilege to personal information
Unwarranted restriction of personal information includes not only blocking tangible access to personal information, but also limiting awareness of the existence of the information within the system or the uses of such information. Such restriction of access to systems or personal information stored within those systems can result in harms such as exclusion, economic loss, and loss of trust.
26. Loss of Self-Determination
Loss of autonomy: Loss of autonomy includes needless changes in behavior, including
self-imposed restrictions on freedom of expression or assembly.
Exclusion: Exclusion is the lack of knowledge about or access to personal information.
When individuals do not know what information an entity collects or can make use of, or
they do not have the opportunity to participate in such decision-making, it diminishes
accountability as to whether the information is appropriate for the entity to possess, or
whether the information will be used in a fair or equitable manner.
Loss of Liberty: Improper exposure to arrest or detainment. Even in democratic societies,
incomplete or inaccurate information can lead to arrest, or improper exposure or use of
information can contribute to instances of abuse of governmental power. More life-
threatening situations can arise in non-democratic societies.
27. Discrimination
Stigmatization: Personal information is linked to an actual identity in such a way as to
create a stigma that can cause embarrassment and discrimination. Sensitive
information such as health data or criminal records or merely accessing certain
services such as food stamps or unemployment benefits may attach to individuals and
be disclosed in unrelated contexts.
Power Imbalance: The acquisition of personal information that creates an inappropriate
power imbalance, or that takes unfair advantage of or abuses a power imbalance between
the acquirer and the individual. For example, collection of attributes or analysis of
behavior or transactions about individuals can lead to various forms of discrimination
or disparate impact, including differential pricing or redlining.
28. Loss of Trust
Loss of trust is the breach of implicit or explicit expectations or agreements about the
handling of personal information. For example, the disclosure of personal or other
sensitive data to an entity is accompanied by a number of expectations for how that
data is used, secured, transmitted, shared, etc. Breaches can leave individuals
reluctant to engage in further transactions.
29. Economic Loss
Economic loss can include direct financial losses as the result of identity theft, as well
as the failure to receive fair value in a transaction involving personal information.
30. Illustrative Mapping of Privacy Engineering Objectives to Problematic Data Actions

Objective       | Data Lifecycle Phase | Normal Data Action           | Problematic Data Action  | Potential Harms
Predictability  | Collection           | Service Initiation           | Induced Disclosure       | Power Imbalance, Loss of Autonomy
Predictability  | Processing           | Aggregation                  | Unanticipated Revelation | Stigmatization, Power Imbalance, Loss of Trust, Loss of Autonomy
Predictability  | Processing           | System Monitoring            | Surveillance             | Power Imbalance, Loss of Trust, Loss of Autonomy, Loss of Liberty
Manageability   | Disclosure           | Authorized Attribute Sharing | Distortion               | Stigmatization, Power Imbalance, Loss of Liberty
Manageability   | Disposal             | Normal Account Deletion      | Unwarranted Restriction  | Exclusion, Economic Loss, Loss of Trust
Confidentiality | Use                  | Authorized Use               | Appropriation            | Loss of Trust, Economic Loss, Power Imbalance
Confidentiality | Retention            | Secure Storage               | Insecurity               | Economic Loss, Stigmatization
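A mapping like this lends itself to a simple lookup structure, e.g. in a privacy risk assessment tool that flags the potential harms associated with a problematic data action. The sketch below is a hypothetical illustration only; the structure and function names are assumptions, not part of the NIST material.

```python
# Hypothetical sketch: the illustrative mapping encoded as a lookup table.
# Each row: (objective, lifecycle phase, normal action, problematic action, harms).
MAPPING = [
    ("Predictability", "Collection", "Service Initiation", "Induced Disclosure",
     ["Power Imbalance", "Loss of Autonomy"]),
    ("Predictability", "Processing", "Aggregation", "Unanticipated Revelation",
     ["Stigmatization", "Power Imbalance", "Loss of Trust", "Loss of Autonomy"]),
    ("Predictability", "Processing", "System Monitoring", "Surveillance",
     ["Power Imbalance", "Loss of Trust", "Loss of Autonomy", "Loss of Liberty"]),
    ("Manageability", "Disclosure", "Authorized Attribute Sharing", "Distortion",
     ["Stigmatization", "Power Imbalance", "Loss of Liberty"]),
    ("Manageability", "Disposal", "Normal Account Deletion", "Unwarranted Restriction",
     ["Exclusion", "Economic Loss", "Loss of Trust"]),
    ("Confidentiality", "Use", "Authorized Use", "Appropriation",
     ["Loss of Trust", "Economic Loss", "Power Imbalance"]),
    ("Confidentiality", "Retention", "Secure Storage", "Insecurity",
     ["Economic Loss", "Stigmatization"]),
]

def harms_for(problematic_action):
    """Return the potential harms listed for a given problematic data action."""
    for _objective, _phase, _normal, problem, harms in MAPPING:
        if problem.lower() == problematic_action.lower():
            return harms
    return []

print(harms_for("Surveillance"))
```

Encoding the mapping as data rather than prose makes it easy to extend with additional rows or to cross-reference harms back to the objectives they undermine.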
31. Discussion Questions
For discussion at the NIST Privacy Engineering Workshop on September 15-16, 2014:
• Privacy Engineering (slide 4): Is this definition helpful?
• Privacy Engineering Objectives (slides 8-10): Are these objectives actionable for organizations? Are
there any gaps?
• System Privacy Risk Model (slide 13): Is it constructive to focus on mitigating problematic data
actions?
• System Privacy Risk Equation (slide 14): Does this equation seem likely to be effective in identifying
system privacy risks? If not, how should system privacy risk be identified?
• Context (slide 16): Are these the right factors? Are there others?
• Problematic Data Actions (slides 18-24): Are these actions functional? Are there additional ones that
should be included?
• Harms (slides 26-29): Are these harms relevant? Are there additional ones that should be included?
Comments may also be sent to privacyeng@nist.gov until September 30, 2014.