Integrate the data, build warehouses and federations
Develop profiles of terrorists, activities/threats
Mine the data to extract patterns of potential terrorists and predict future activities and targets
Find the “needle in the haystack” - suspicious needles?
Data integrity is important
Techniques have to SCALE
Data Mining Needs for Counterterrorism: Real-time Data Mining
Nature of data
Data arriving from sensors and other devices
Continuous data streams
Breaking news, video releases, satellite images
Some critical data may also reside in caches
Rapidly sift through the data and discard unwanted data for later use and analysis (non-real-time data mining)
Data mining techniques need to meet timing constraints
Quality of service (QoS) tradeoffs among timeliness, precision and accuracy
Presentation of results, visualization, real-time alerts and triggers
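The sift-and-discard step under a timing constraint can be sketched as follows. This is a minimal illustration of the QoS tradeoff between timeliness and completeness; the keyword filter, time budget, and sample records are hypothetical stand-ins for a real stream processor:

```python
import time

def sift_stream(records, keywords, budget_s=0.05):
    """Rapidly sift incoming records: keep those matching a watch keyword,
    set the rest aside for later (non-real-time) analysis."""
    kept, deferred = [], []
    deadline = time.monotonic() + budget_s
    for i, rec in enumerate(records):
        if time.monotonic() > deadline:
            # QoS tradeoff: time budget exhausted, so defer the remainder,
            # trading completeness for timeliness
            deferred.extend(records[i:])
            break
        if any(k in rec.lower() for k in keywords):
            kept.append(rec)
        else:
            deferred.append(rec)
    return kept, deferred

kept, deferred = sift_stream(
    ["breaking news: suspect spotted", "weather update", "satellite image of site"],
    keywords=["suspect", "satellite"])
```

In a real system the filter would be a trained model and the budget would come from the application's real-time deadline.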
Data Mining for Real-time Threats
[Diagram: data sources with information about terrorists and terrorist activities; integrate data sources in real time; rapidly sift through data and discard irrelevant data; mine the data and build real-time models; examine results in real time; report final results]
What should be done: Form a Research Agenda
Immediate action (0 - 1 year)
We’ve got to know what our current capabilities are
Do the commercial tools scale? Do they work only on special data and limited cases? Do they deliver what they promise?
Need an unbiased objective study with demonstrations
At the same time, work on the big picture
What do we want? What are our end results for the foreseeable future? What are the criteria for success? How do we evaluate the data mining algorithms? What testbeds do we build?
Near-term (1 - 3 years)
Leverage current efforts
Fill the gaps in a goal-directed way; technology transfer
Long-term (3 - 5 years and beyond)
5-year R&D plan for data mining for counterterrorism
Data Mining is very useful to solve Security Problems
Data mining tools could be used to examine audit data and flag abnormal behavior
Much recent work in Intrusion detection (unit #18)
e.g., Neural networks to detect abnormal patterns
Tools are being examined to determine abnormal patterns for national security
Classification techniques, Link analysis
Credit cards, calling cards, identity theft etc.
BUT CONCERNS FOR PRIVACY
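A minimal stand-in for the abnormal-pattern detectors mentioned above (a simple z-score test rather than a neural network); the audit counts and threshold are made up for illustration:

```python
import statistics

def flag_abnormal(counts, threshold=2.5):
    """Flag indices whose value deviates from the mean by more than
    `threshold` population standard deviations."""
    mean = statistics.mean(counts)
    sd = statistics.pstdev(counts)
    return [i for i, x in enumerate(counts)
            if sd and abs(x - mean) / sd > threshold]

# Hypothetical audit data: login attempts per day; day 5 is a burst
audit = [12, 11, 13, 12, 10, 95, 12, 11]
flagged = flag_abnormal(audit)
```

A deployed detector would use richer features (link analysis, classification), but the flag-what-deviates idea is the same.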
What is Privacy?
Privacy is about a patient determining what information the doctor should release about him/her
A bank customer determines what financial information the bank should release about him/her
The FBI collects information about US citizens; however, the FBI determines what information about a US citizen it can release to, say, the CIA
Some Privacy concerns
Medical and Healthcare
Employers, marketers, or others knowing of private medical concerns
Allowing access to individual’s travel and spending data
Allowing access to web surfing behavior
Marketing, Sales, and Finance
Allowing access to individual’s purchases
Data Mining as a Threat to Privacy
Data mining gives us “facts” that are not obvious to human analysts of the data
Can general trends across individuals be determined without revealing information about individuals?
Combine collections of data and infer information that is private
Disease information from prescription data
Military action inferred from pizza deliveries to the Pentagon
Need to protect the associations and correlations between the data that are sensitive or private
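The prescription-to-disease inference above can be sketched as a simple join: combine a seemingly innocuous collection with public background knowledge and a private fact falls out. All records and the drug-to-condition lookup below are fabricated for illustration:

```python
# Hypothetical "public" prescription records
prescriptions = {"Alice": "insulin", "Bob": "salbutamol"}
# Public background knowledge: which drug treats which condition
drug_condition = {"insulin": "diabetes", "salbutamol": "asthma"}

def infer_conditions(prescriptions, drug_condition):
    """Join the two collections to infer each person's likely disease,
    a fact neither collection states on its own."""
    return {person: drug_condition[drug]
            for person, drug in prescriptions.items()
            if drug in drug_condition}

inferred = infer_conditions(prescriptions, drug_condition)
```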
Some Privacy Problems and Potential Solutions
Problem: Privacy violations that result due to data mining
Potential solution: Privacy-preserving data mining
Problem: Privacy violations that result due to the Inference problem
Inference is the process of deducing sensitive information from the legitimate responses received to user queries
Potential solution: Privacy Constraint Processing
Problem: Privacy violations due to un-encrypted data
Potential solution: Encryption at different levels
Problem: Privacy violation due to poor system design
Potential solution: Develop methodology for designing privacy-enhanced systems
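The inference problem above can be illustrated with a classic tracker-style attack: two individually legitimate aggregate queries combine to reveal a private value. Names and salaries are hypothetical:

```python
# Hypothetical salary database; individual salaries are sensitive
salaries = {"Ann": 60000, "Ben": 65000, "Carl": 91000}

def total(db, exclude=None):
    """A legitimate-looking aggregate query: sum of all salaries,
    optionally excluding one named employee."""
    return sum(v for k, v in db.items() if k != exclude)

# Two individually harmless responses...
q1 = total(salaries)                  # whole department
q2 = total(salaries, exclude="Carl")  # department minus Carl
# ...combined, deduce Carl's private salary
carls_salary = q1 - q2
```

A privacy constraint processor would have to recognize that releasing both responses is equivalent to releasing the individual value.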
Privacy as Inference: Privacy Constraint Processing
Privacy constraint/policy processing
Based on prior research in security constraint processing
Simple Constraint: an attribute of a document is private
Content-based constraint: If document contains information about X, then it is private
Association-based Constraint: Two or more documents taken together is private; individually each document is public
Release constraint: After X is released Y becomes private
Augment a database system with a privacy controller for constraint processing
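The constraint types above might be encoded as predicates that the privacy controller evaluates before release. The attribute names, document ids, and content test below are hypothetical, and the association-based constraint (which spans multiple documents) is omitted for brevity:

```python
def simple_constraint(doc, released):
    # Simple constraint: the 'ssn' attribute of a document is private
    return "ssn" in doc.get("attributes", [])

def content_constraint(doc, released):
    # Content-based constraint: if the document mentions project X, it is private
    return "project X" in doc.get("text", "")

def release_constraint(doc, released):
    # Release constraint: after document "X" is released, "Y" becomes private
    return doc.get("id") == "Y" and "X" in released

CONSTRAINTS = [simple_constraint, content_constraint, release_constraint]

def is_private(doc, released=()):
    """The privacy controller blocks a document if any constraint fires."""
    return any(c(doc, released) for c in CONSTRAINTS)
```

For example, document "Y" is public on its own but becomes private once "X" has been released.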
Architecture for Privacy Constraint Processing
[Diagram: a User Interface Manager sits atop a Constraint Manager holding the privacy constraints; the Query Processor applies constraints during query and release operations, the Update Processor during update operations, and the Database Design Tool during database design; all operate through the DBMS over the database]
Semantic Model for Privacy Control
[Diagram: a semantic net for patient John: "has disease" links to Cancer and Influenza; "travels frequently" links to England; "address" links to John's address; dark lines/boxes contain private information]
Privacy Preserving Data Mining
Prevent useful results from mining
Introduce “cover stories” to give “false” results
Only make a sample of data available so that an adversary is unable to come up with useful rules and predictive functions
Introduce random values into the data and/or results
Challenge is to introduce random values without significantly affecting the data mining results
Give range of values for results instead of exact values
Secure Multi-party Computation
Each party knows its own inputs; encryption techniques used to compute final results
Rules, predictive functions
Approach: Only make a sample of data available
Limits ability to learn good classifier
Cryptographic Approaches for Privacy Preserving Data Mining
Secure Multi-party Computation (SMC) for PPDM
Mainly used for distributed data mining
Provably secure under some assumptions
Learned models are accurate
Mainly semi-honest assumption (i.e., parties follow the protocols)
Malicious model is also explored recently (e.g., Kantarcioglu and Kardes paper in this workshop)
Many SMC-based PPDM algorithms share common sub-protocols (e.g., dot product, summation, etc.)
Still not efficient enough for very large datasets (e.g., petabyte-sized datasets?)
Semi-honest model may not be realistic
Malicious model is even slower
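The summation sub-protocol mentioned above can be sketched with additive secret sharing: each party splits its input into random shares that sum to it modulo a prime, so the published partial sums reveal only the total, not any individual input. This is a single-process simulation, not a networked protocol, and the prime and inputs are arbitrary:

```python
import random

P = 2**61 - 1  # a large prime; all arithmetic is mod P

def share(value, n_parties, rng):
    """Split a value into n additive shares that sum to it mod P."""
    shares = [rng.randrange(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def secure_sum(private_inputs, seed=0):
    """Each party shares its input across all parties; each party then
    publishes only the sum of the shares it holds. Summing those partial
    sums yields the total without exposing any single input."""
    rng = random.Random(seed)
    n = len(private_inputs)
    all_shares = [share(v, n, rng) for v in private_inputs]
    partials = [sum(all_shares[i][j] for i in range(n)) % P
                for j in range(n)]
    return sum(partials) % P

total = secure_sum([120, 45, 335])
```

This is secure only in the semi-honest model the slides describe; a malicious party could submit bogus shares.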
Possible new directions
New models that can trade off better between efficiency and security
Game-theoretic / incentive issues in PPDM
Combining anonymization and cryptographic techniques for PPDM
Perturbation Based Approaches for Privacy Preserving Data Mining
Goal: Distort data while still preserving some properties for data mining purposes
Goal: Achieve a high data mining accuracy with maximum privacy protection.
Our approach: privacy is a personal choice, so it should be adjustable per individual
(Liu, Kantarcioglu and Thuraisingham ICDM’06)
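A minimal sketch of the perturbation idea: add zero-mean noise so that individual records are distorted while aggregates such as the mean survive roughly intact. The ages, noise scale, and seed below are arbitrary, and real schemes (and the individually adjustable variant) are considerably more careful:

```python
import random

def perturb(values, scale=10.0, seed=0):
    """Add zero-mean uniform noise to each value: individual records are
    distorted, but aggregate statistics are approximately preserved."""
    rng = random.Random(seed)
    return [v + rng.uniform(-scale, scale) for v in values]

ages = [23, 35, 41, 29, 52, 47, 38, 31]
noisy = perturb(ages)
true_mean = sum(ages) / len(ages)
noisy_mean = sum(noisy) / len(noisy)
```

The challenge noted above is exactly this tension: enough noise to hide individuals, little enough that the mining results still hold.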
Perturbation Based Approaches for Privacy Preserving Data Mining
The trend is to make PPDM approaches reflect reality
We investigated perturbation based approaches with real-world data sets
We present an applicability study of the current approaches
Liu, Kantarcioglu and Thuraisingham, DKE 07
We found that the reconstruction of the original distribution may not work well with real-world data sets
We try to modify perturbation techniques and adapt some data mining tools (e.g., Liu, Kantarcioglu and Thuraisingham, novel decision tree, UTD technical report 06)
Platform for Privacy Preferences (P3P): What is it?
P3P is an emerging industry standard that enables web sites to express their privacy practices in a standard format
The format of the policies can be automatically retrieved and understood by user agents
It is a product of the W3C (World Wide Web Consortium)
When a user enters a web site, the privacy policies of the web site are conveyed to the user
If the privacy policies are different from the user's preferences, the user is notified
The user can then decide how to proceed
Several major corporations are working on P3P standards
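A P3P user agent's policy-versus-preference check might look like the following toy comparison; the purpose names and retention limits are hypothetical simplifications, not actual P3P vocabulary:

```python
def conflicts(site_policy, user_prefs):
    """Return the purposes for which the site retains data longer than
    the user allows, i.e., where the user agent should raise a notice."""
    return [purpose for purpose, days in site_policy.items()
            if days > user_prefs.get(purpose, 0)]

site_policy = {"contact": 30, "marketing": 365}  # site's retention, in days
user_prefs = {"contact": 90, "marketing": 0}     # user's acceptable limits

mismatches = conflicts(site_policy, user_prefs)
# The user agent would notify the user of the mismatched purposes
```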
Privacy for Assured Information Sharing
[Diagram: Agencies A, B, and C each maintain their own data/policy and an export data/policy component; the exported data/policies together form the data/policy for the federation]
Privacy Preserving Surveillance
[Diagram: raw video surveillance data feeds a face detection and face de-recognizing system (faces of trusted people are de-recognized to preserve privacy) and a suspicious event detection system; together with manual inspection of video data and reports of security personnel, the suspicious people and suspicious events found are combined into a comprehensive security report listing suspicious events and people detected]
Directions: Foundations of Privacy Preserving Data Mining
We proved in 1990 that the inference problem in general was unsolvable; therefore, the suggestion was to explore the solvability aspects of the problem
Can we do something similar for privacy?
Is the general privacy problem solvable?
What are the complexity classes?
What is the storage and time complexity?
We need to explore the foundation of PPDM and related privacy solutions
Directions: Testbed Development and Application Scenarios
There are numerous PPDM related algorithms. How do they compare with each other? We need a testbed with realistic parameters to test the algorithms
It is time to develop real world scenarios where these algorithms can be utilized
Is it feasible to develop realistic commercial products or should each organization adapt product to suit their needs?
1. There is no universal definition for privacy; each organization must define what it means by privacy and develop appropriate privacy policies
2. Technology alone is not sufficient for privacy; we need technologists, policy experts, legal experts, and social scientists to work on privacy
3. Some well-known people have said "Forget about privacy." Therefore, should we pursue research on privacy?
These are interesting research problems; we need to continue the research
Something is better than nothing
Try to prevent privacy violations and if violations occur then prosecute
4. We need to tackle privacy from all directions
Application Specific Privacy?
Examining privacy may make sense for healthcare and financial applications
Does privacy work for Defense and Intelligence applications?
Is it even meaningful to have privacy for surveillance and geospatial applications?
Once the image of my house is on Google Earth, then how much privacy can I have?
I may want my location to be private, but does it make sense if a camera can capture a picture of me?
If there are sensors all over the place, is it meaningful to have privacy preserving surveillance?
This suggests that we need application specific privacy
It is not meaningful to examine PPDM for every data mining algorithm and for every application
CPT: Confidentiality, Privacy and Trust
As a user of organization A, before sending data about myself to organization B, I read the privacy policies enforced by organization B
If I agree to the privacy policies of organization B, then I will send data about me to organization B
If I do not agree with the policies of organization B, then I can negotiate with organization B
Even if the web site states that it will not share private information with others, do I trust the web site?
Note: while confidentiality is enforced by the organization, privacy is determined by the user. Therefore, for confidentiality, the organization will determine whether a user can have the data; if so, the organization can further determine whether the user can be trusted
Confidentiality, Privacy and Trust
How can we ensure the confidentiality of the data mining processes and results?
Access control policies
How can we trust the data mining processes and results
Verification and validation
How can we integrate confidentiality, privacy and trust with respect to data mining?
Need to examine the research challenges and form a research agenda
Data Mining and Privacy: Friends or Foes?
They are neither friends nor foes
Need advances in both data mining and privacy
Need to design flexible systems
For some applications one may have to focus entirely on “pure” data mining while for some others there may be a need for “privacy-preserving” data mining
Need flexible data mining techniques that can adapt to the changing environments
Technologists, legal specialists, social scientists, policy makers and privacy advocates MUST work together