Example: Data Mining for the NBA

Slide notes

  • Interest measures: make sure that sensitive facts, if they exist, will be deemed uninteresting by the mining algorithms. Extra data: for example, a “phone book” padded with extra entries is still useful if the goal is to find a phone number given a name, but access to the complete book no longer allows determining facts such as department sizes. Performance: perhaps not an issue for small amounts of data, but on large (terabyte-scale), disk-limited data sets, exponential running time is a real concern. Note that we do not face the same problem as, for example, the GPS military/civilian accuracy encoding: there, the goal is to make one piece of information (position) known to all, just more precisely for some, whereas here the information to be revealed and the information to be hidden are completely different. A better analogy is deriving position from communications satellites (e.g. by measuring delay): introducing a small random delay wreaks havoc with that position estimate, but does not alter the information communicated.
  • Here I try to summarize the cryptographic approaches to PPDM. A great deal of work applies secure multi-party computation ideas to PPDM, generally in distributed data mining settings. Most recent work assumes that the adversaries are semi-honest (i.e. they follow the protocol correctly); only recently has the malicious model been discussed (including the Kantarcioglu and Kardes paper that will be presented in the workshop). It turns out that all these different solutions consist of a few common secure sub-protocols, such as dot product and summation.
  • Perturbation is a very important technique in PPDM: distort the data while keeping the properties of the data that will be used in the later data mining phase. Listed here are some perturbation techniques. The additive approach was first proposed by Agrawal and Srikant and now has many variants; a single perturbation step may not be enough to protect privacy, so we proposed a two-step model in ICDM 06. Multiplicative approaches, e.g. orthogonal transformations, rotate the data geometrically (e.g. Chen and Liu ICDM 05); this transformation preserves the Euclidean distance between any pair of data points, so distance-based tools such as the K-Nearest Neighbor classifier (KNN) and Support Vector Machines (SVM) can be applied directly (see the sketch after these notes). A later approach evaluates privacy preservation in more detail and proposes random projection to a lower-dimensional space (Liu and Kargupta TKDE 2006; Liu and Kargupta PKDD'06). Condensation and decomposition (Wang and Zhang ICDM 06) use properties of matrices; within decomposition, wavelet transformation is new. All these approaches are still in progress. Data swapping is a different approach, which transforms the data set by switching a subset of attributes between selected pairs (Fienberg et al. 2003).
  • Making PPDM approaches fit real-life situations is the trend in today’s research. We conducted intensive experiments with real-world data sets and give an applicability study in our DKE 07 paper. Reconstruction of the original data distribution does not work very well with real-life data: when the distribution of the original data set is easy, the method may work, but when the distribution is hard, the method does not work well. It depends on the distribution! So we suggest not using the reconstructed distribution as a middle step. In other work, we have tailored the data mining tools to fit the PPDM domain, that is, directly adapting the data mining functions to the noise-addition method. We believe this is a fruitful direction for PPDM.
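Below is a minimal numpy sketch of the multiplicative (rotation) perturbation idea described in these notes. The data and matrix names are invented for illustration: a random orthogonal transformation hides the original attribute values while preserving every pairwise Euclidean distance, which is why distance-based miners such as KNN or SVM can run directly on the perturbed data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data set: 5 records with 3 numeric attributes (illustrative only).
X = rng.normal(size=(5, 3))

# Random orthogonal matrix via QR decomposition of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))

# Publish the rotated records instead of the originals.
X_pert = X @ Q.T

def pairwise_dists(M):
    diff = M[:, None, :] - M[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

# Rotation hides individual attribute values but preserves all pairwise
# Euclidean distances, so KNN/SVM behavior is unchanged.
assert np.allclose(pairwise_dists(X), pairwise_dists(X_pert))
```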

Transcript

  • 1. Data Mining, Security and Privacy. Prof. Bhavani Thuraisingham, Prof. Murat Kantarcioglu, Ms. Li Liu (PhD student, completing December 2007). The University of Texas at Dallas, August 24, 2008
  • 2. Outline
    • Data Mining for Security Applications
    • Privacy Concerns
    • What is Privacy?
    • Why is data mining a threat to privacy?
    • Developments in Privacy
    • Directions for Privacy
    • Confidentiality, Privacy and Trust for Data Mining
  • 3. Data Mining Needs for Counterterrorism: Non-real-time Data Mining
    • Gather data from multiple sources
      • Information on terrorist attacks: who, what, where, when, how
      • Personal and business data: place of birth, ethnic origin, religion, education, work history, finances, criminal record, relatives, friends and associates, travel history, . . .
      • Unstructured data: newspaper articles, video clips, speeches, emails, phone records, . . .
    • Integrate the data, build warehouses and federations
    • Develop profiles of terrorists, activities/threats
    • Mine the data to extract patterns of potential terrorists and predict future activities and targets
    • Find the “needle in the haystack” - suspicious needles?
    • Data integrity is important
    • Techniques have to SCALE
  • 4. Data Mining Needs for Counterterrorism: Real-time Data Mining
    • Nature of data
      • Data arriving from sensors and other devices
        • Continuous data streams
      • Breaking news, video releases, satellite images
      • Some critical data may also reside in caches
    • Rapidly sift through the data, discarding unwanted data or setting it aside for later use and analysis (non-real-time data mining)
    • Data mining techniques need to meet timing constraints
    • Quality of service (QoS) tradeoffs among timeliness, precision and accuracy
    • Presentation of results, visualization, real-time alerts and triggers
  • 5. Data Mining for Real-time Threats: [Pipeline diagram: data sources with information about terrorists and terrorist activities; integrate data sources in real time; rapidly sift through data and discard irrelevant data; mine the data; build real-time models; examine results in real time; report final results]
  • 6. What should be done: Form a Research Agenda
    • Immediate action (0 - 1 year)
      • We’ve got to know what our current capabilities are
      • Do the commercial tools scale? Do they work only on special data and limited cases? Do they deliver what they promise?
      • Need an unbiased objective study with demonstrations
    • At the same time, work on the big picture
      • What do we want? What are our end results for the foreseeable future? What are the criteria for success? How do we evaluate the data mining algorithms? What testbeds do we build?
    • Near-term (1 - 3 years)
      • Leverage current efforts
      • Fill the gaps in a goal-directed way; technology transfer
    • Long-term (3 - 5 years and beyond)
      • 5-year R&D plan for data mining for counterterrorism
  • 7. IN SUMMARY:
    • Data Mining is very useful to solve Security Problems
      • Data mining tools could be used to examine audit data and flag abnormal behavior (a toy sketch follows this slide)
      • Much recent work in Intrusion detection (unit #18)
        • e.g., Neural networks to detect abnormal patterns
      • Tools are being examined to determine abnormal patterns for national security
        • Classification techniques, Link analysis
      • Fraud detection
        • Credit cards, calling cards, identity theft etc.
        • BUT CONCERNS FOR PRIVACY
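The slide mentions neural networks for detecting abnormal patterns; as a much simpler statistical stand-in, the sketch below flags audit records whose activity count deviates sharply from the mean. The data and threshold are invented for illustration.

```python
import numpy as np

def flag_abnormal(audit_counts, threshold=2.0):
    """Return indices of audit records more than `threshold` standard
    deviations from the mean (a crude statistical stand-in for the
    neural-network detectors the slide mentions)."""
    x = np.asarray(audit_counts, dtype=float)
    z = (x - x.mean()) / x.std()
    return np.where(np.abs(z) > threshold)[0]

# Hypothetical logins per user per day; user 3 is a clear outlier.
print(flag_abnormal([12, 9, 11, 480, 10, 13]))  # -> [3]
```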
  • 8. What is Privacy
    • Medical Community
      • Privacy is about a patient determining what information the doctor should release about him/her
    • Financial community
      • A bank customer determines what financial information the bank should release about him/her
    • Government community
      • The FBI collects information about US citizens. However, the FBI determines what information about a US citizen it can release to, say, the CIA
  • 9. Some Privacy concerns
    • Medical and Healthcare
      • Employers, marketers, or others knowing of private medical concerns
    • Security
      • Allowing access to individual’s travel and spending data
      • Allowing access to web surfing behavior
    • Marketing, Sales, and Finance
      • Allowing access to individual’s purchases
  • 10. Data Mining as a Threat to Privacy
    • Data mining gives us “facts” that are not obvious to human analysts of the data
    • Can general trends across individuals be determined without revealing information about individuals?
    • Possible threats:
      • Combine collections of data and infer information that is private
        • Disease information from prescription data
        • Military action from pizza deliveries to the Pentagon
    • Need to protect the associations and correlations between the data that are sensitive or private
  • 11. Some Privacy Problems and Potential Solutions
    • Problem: Privacy violations that result due to data mining
      • Potential solution: Privacy-preserving data mining
    • Problem: Privacy violations that result due to the Inference problem
      • Inference is the process of deducing sensitive information from the legitimate responses received to user queries
      • Potential solution: Privacy Constraint Processing
    • Problem: Privacy violations due to un-encrypted data
      • Potential solution: Encryption at different levels
    • Problem: Privacy violation due to poor system design
      • Potential solution: Develop methodology for designing privacy-enhanced systems
  • 12. Privacy as Inference: Privacy Constraint Processing
    • Privacy constraint/policy processing
      • Based on prior research in security constraint processing
      • Simple Constraint: an attribute of a document is private
      • Content-based constraint: If document contains information about X, then it is private
      • Association-based Constraint: Two or more documents taken together are private; individually each document is public
      • Release constraint: After X is released Y becomes private
    • Augment a database system with a privacy controller for constraint processing (a toy sketch follows this slide)
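Here is a minimal sketch of how a privacy controller might evaluate the constraint types listed above at release time. All document names and rules are hypothetical; a content-based constraint would additionally inspect document contents.

```python
# Hypothetical constraint store (illustrative only).
PRIVATE_DOCS = {"medical_record"}                # simple constraints
ASSOC_PRIVATE = [{"travel_log", "address"}]      # together private, alone public
RELEASE_RULES = {"budget": {"staffing_plan"}}    # after X is released, Y is private

released = set()

def can_release(doc):
    if doc in PRIVATE_DOCS:                      # simple constraint
        return False
    for pair in ASSOC_PRIVATE:                   # association-based constraint
        if pair <= released | {doc}:
            return False
    for trigger, now_private in RELEASE_RULES.items():
        if trigger in released and doc in now_private:  # release constraint
            return False
    released.add(doc)
    return True

print(can_release("travel_log"))   # True
print(can_release("address"))      # False: association with released travel_log
```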
  • 13. Architecture for Privacy Constraint Processing User Interface Manager Constraint Manager Privacy Constraints Query Processor: Constraints during query and release operations Update Processor: Constraints during update operation Database Design Tool Constraints during database design operation Database DBMS
  • 14. Semantic Model for Privacy Control Patient John Cancer Influenza Has disease Travels frequently England address John’s address Dark lines/boxes contain private information
  • 15. Privacy Preserving Data Mining
    • Prevent useful results from mining
      • Introduce “cover stories” to give “false” results
      • Only make a sample of data available so that an adversary is unable to come up with useful rules and predictive functions
    • Randomization
      • Introduce random values into the data and/or results
      • Challenge is to introduce random values without significantly affecting the data mining results (see the sketch after this slide)
      • Give range of values for results instead of exact values
    • Secure Multi-party Computation
      • Each party knows its own inputs; encryption techniques used to compute final results
      • Rules, predictive functions
    • Approach: Only make a sample of data available
      • Limits the ability to learn a good classifier
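A minimal sketch of the randomization idea above, with invented data: zero-mean noise masks individual values while the aggregates needed for mining survive, and results can be reported as ranges rather than exact values.

```python
import numpy as np

rng = np.random.default_rng(1)

ages = rng.integers(20, 70, size=10_000)     # true, private values (invented)
noise = rng.normal(0, 10, size=ages.shape)   # zero-mean random noise
published = ages + noise                     # what the miner actually sees

# Individual records are masked, but zero-mean noise leaves aggregate
# statistics such as the mean nearly intact.
print(f"true mean: {ages.mean():.2f}, published mean: {published.mean():.2f}")

# Report a range of values for the result instead of an exact value.
sem = published.std() / np.sqrt(len(published))
print(f"mean lies in [{published.mean() - 2*sem:.2f}, {published.mean() + 2*sem:.2f}]")
```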
  • 16. Cryptographic Approaches for Privacy Preserving Data Mining
    • Secure Multi-party Computation (SMC) for PPDM
      • Mainly used for distributed data mining
      • Provably secure under some assumptions
      • Learned models are accurate
      • Mainly semi-honest assumption (i.e. parties follow the protocols)
      • Malicious model is also explored recently (e.g. Kantarcioglu and Kardes paper in this workshop)
      • Many SMC-based PPDM algorithms share common sub-protocols (e.g. dot product, summation; a summation sketch follows this slide)
    • Drawbacks:
      • Still not efficient enough for very large (e.g. petabyte-sized) datasets
      • Semi-honest model may not be realistic
      • Malicious model is even slower
    • Possible new directions
      • New models that can trade off better between efficiency and security
      • Game theoretic / incentive issues in PPDM
      • Combining anonymization and cryptographic techniques for PPDM
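As an example of the common sub-protocols mentioned above, here is a toy version of secure summation under the semi-honest model: the initiator masks its input with a random value, each party adds its own input modulo N (seeing only a uniformly random running total), and the initiator removes the mask at the end. A real protocol passes messages between separate parties and must consider collusion; this single-function sketch only illustrates the arithmetic.

```python
import random

N = 2**61 - 1  # public modulus, chosen larger than any possible sum

def secure_sum(private_inputs):
    """Toy secure summation: every intermediate value is uniformly
    random mod N, so (absent collusion) no party learns another's input."""
    r = random.randrange(N)                  # initiator's secret mask
    running = (r + private_inputs[0]) % N    # initiator starts the round
    for v in private_inputs[1:]:             # each party adds its input
        running = (running + v) % N
    return (running - r) % N                 # initiator removes the mask

print(secure_sum([17, 42, 8]))  # -> 67, with no input revealed en route
```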
  • 17. Perturbation Based Approaches for Privacy Preserving Data Mining
    • Goal: Distort the data while still preserving some properties for data mining purposes.
    • Goal: Achieve a high data mining accuracy with maximum privacy protection.
    • Additive Based
    • Multiplicative Based
    • Condensation based
    • Decomposition
    • Data Swapping (a toy sketch follows this slide)
    • Our approach: Privacy is a personal choice, so it should be individually adaptable (Liu, Kantarcioglu and Thuraisingham ICDM’06)
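A minimal sketch of data swapping in the spirit of Fienberg et al. 2003, with invented records: values of one attribute are switched between randomly selected pairs of records, breaking record-level linkage while leaving that attribute's marginal distribution untouched.

```python
import random

def swap_attribute(records, attr, n_swaps, seed=0):
    """Swap `attr` values between randomly chosen pairs of records.
    Marginal statistics of `attr` are unchanged; record linkage is not."""
    rnd = random.Random(seed)
    out = [dict(r) for r in records]  # leave the input untouched
    for _ in range(n_swaps):
        i, j = rnd.sample(range(len(out)), 2)
        out[i][attr], out[j][attr] = out[j][attr], out[i][attr]
    return out

data = [{"zip": "75080", "salary": 55},
        {"zip": "75252", "salary": 90},
        {"zip": "75081", "salary": 70}]
print(swap_attribute(data, "salary", n_swaps=2))
```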
  • 18. Perturbation Based Approaches for Privacy Preserving Data Mining
    • The trend is to make PPDM approaches reflect reality
    • We investigated perturbation based approaches with real-world data sets
    • We give an applicability study of the current approaches
      • Liu, Kantarcioglu and Thuraisingham, DKE 07
    • We found that:
      • The reconstruction of the original distribution may not work well with real-world data sets
      • We try to modify perturbation techniques and adapt some data mining tools, e.g. Liu, Kantarcioglu and Thuraisingham, novel decision tree (UTD technical report 06)
  • 19. Platform for Privacy Preferences (P3P): What is it?
    • P3P is an emerging industry standard that enables web sites to express their privacy practices in a standard format
    • The format of the policies can be automatically retrieved and understood by user agents
    • It is a product of the W3C (World Wide Web Consortium)
    • www.w3c.org
    • When a user enters a web site, the privacy policies of the web site are conveyed to the user; if the privacy policies differ from the user’s preferences, the user is notified; the user can then decide how to proceed (see the sketch after this slide)
    • Several major corporations are working on P3P standards including
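Below is a toy sketch of the matching step a P3P user agent performs. The dictionary fields are invented for illustration and are not the real P3P vocabulary; an actual agent parses the site's machine-readable XML policy and evaluates it against the user's stated preferences.

```python
# Invented, simplified stand-ins for a site policy and user preferences.
site_policy = {"purposes": {"admin", "marketing"}, "retention": "indefinitely"}
user_prefs  = {"purposes": {"admin"}, "retention": "stated-purpose"}

def conflicts(policy, prefs):
    """Return human-readable mismatches between policy and preferences."""
    issues = []
    extra = policy["purposes"] - prefs["purposes"]
    if extra:
        issues.append(f"site also uses data for: {sorted(extra)}")
    if policy["retention"] != prefs["retention"]:
        issues.append(f"retention '{policy['retention']}' not accepted")
    return issues

# Notify the user of any differences; the user then decides how to proceed.
for issue in conflicts(site_policy, user_prefs):
    print("NOTIFY:", issue)
```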
  • 20. Privacy for Assured Information Sharing: [Diagram: component data/policies for Agency A, Agency B and Agency C, each with an export data/policy component feeding a shared data/policy for the federation]
  • 21. Privacy Preserving Surveillance: [Diagram: raw video surveillance data flows into a face detection and face derecognizing system (faces of trusted people are derecognized to preserve privacy) and a suspicious event detection system, alongside manual inspection of video data; suspicious people found, suspicious events found, and reports of security personnel feed a comprehensive security report listing suspicious events and people detected]
  • 22. Directions: Foundations of Privacy Preserving Data Mining
    • We proved in 1990 that the inference problem in general is unsolvable; the suggestion, therefore, was to explore the solvability aspects of the problem.
    • Can we do something similar for privacy?
      • Is the general privacy problem solvable?
      • What are the complexity classes?
      • What are the storage and time complexities?
    • We need to explore the foundation of PPDM and related privacy solutions
  • 23. Directions: Testbed Development and Application Scenarios
    • There are numerous PPDM related algorithms. How do they compare with each other? We need a testbed with realistic parameters to test the algorithms
    • It is time to develop real world scenarios where these algorithms can be utilized
    • Is it feasible to develop realistic commercial products, or should each organization adapt products to suit its needs?
  • 24. Key Points
    • 1. There is no universal definition of privacy; each organization must define what it means by privacy and develop appropriate privacy policies
    • 2. Technology alone is not sufficient for privacy; we need technologists, policy experts, legal experts and social scientists to work on privacy
    • 3. Some well-known people have said “Forget about privacy”. Therefore, should we pursue research on privacy?
      • These are interesting research problems; we need to continue the research
      • Something is better than nothing
      • Try to prevent privacy violations and if violations occur then prosecute
    • 4. We need to tackle privacy from all directions
  • 25. Application Specific Privacy?
    • Examining privacy may make sense for healthcare and financial applications
    • Does privacy work for Defense and Intelligence applications?
    • Is it even meaningful to have privacy for surveillance and geospatial applications?
      • Once the image of my house is on Google Earth, then how much privacy can I have?
      • I may want my location to be private, but does it make sense if a camera can capture a picture of me?
      • If there are sensors all over the place, is it meaningful to have privacy preserving surveillance?
    • This suggests that we need application specific privacy
    • It is not meaningful to examine PPDM for every data mining algorithm and for every application
  • 26. CPT: Confidentiality, Privacy and Trust
    • When I, as a user of organization A, send data about myself to organization B, I read the privacy policies enforced by organization B
      • If I agree to the privacy policies of organization B, then I will send data about me to organization B
      • If I do not agree with the policies of organization B, then I can negotiate with organization B
    • Even if the web site states that it will not share private information with others, do I trust the web site?
    • Note: while confidentiality is enforced by the organization, privacy is determined by the user. Therefore, for confidentiality, the organization will determine whether a user can have the data; if so, the organization can further determine whether the user can be trusted
  • 27. Confidentiality, Privacy and Trust
    • How can we ensure the confidentiality of the data mining processes and results?
      • Access control policies
    • How can we trust the data mining processes and results?
      • Verification and validation
    • How can we integrate confidentiality, privacy and trust with respect to data mining?
      • Need to examine the research challenges and form a research agenda
  • 28. Data Mining and Privacy: Friends or Foes?
    • They are neither friends nor foes
    • Need advances in both data mining and privacy
    • Need to design flexible systems
      • For some applications one may have to focus entirely on “pure” data mining while for some others there may be a need for “privacy-preserving” data mining
      • Need flexible data mining techniques that can adapt to the changing environments
    • Technologists, legal specialists, social scientists, policy makers and privacy advocates MUST work together