Example: Data Mining for the NBA


  • Interest measures – ensure that sensitive facts, if they exist, will be deemed uninteresting by the mining algorithms. Extra data – for example, a “phone book” padded with extra entries: it is still useful if the goal is to find a phone number given a name, but access to the complete book no longer allows determining facts such as department sizes. Performance – perhaps not an issue for small amounts of data, but on large (terabyte-scale) data sets, exponential performance is a problem (disk limited). Note that we do not face the same problem as, for example, the GPS military/civilian accuracy encoding. There, the goal is to make one piece of information (position) known to all, just more precisely for some. Here, the information to be made known and the information to be kept hidden are completely different. A better analogy is deriving position from communications satellites (e.g., by measuring delay): introducing a small random delay wreaks havoc with determining position by this method, but does not alter the information communicated.
  • Here I summarize the cryptographic approaches to PPDM. A great deal of work applies secure multi-party computation ideas to PPDM, generally in the distributed data mining setting. Most recent work assumes that the adversaries are semi-honest (i.e., they follow the protocol correctly); only recently (including the Kantarcioglu and Kardes paper to be presented in this workshop) has the malicious model been discussed. It turns out that all these different solutions are built from a few common secure sub-protocols, such as dot product and summation.
  • Perturbation is a very important technique in PPDM: distort the data while still preserving the properties that will be used in the later data mining phase. Several perturbation techniques are listed here. The additive approach was first proposed by Agrawal and Srikant and now has many variants; since a single perturbation step may not be enough to protect privacy, we proposed a two-step model in ICDM '06. Multiplicative approaches, e.g. orthogonal transformations, rotate the data geometrically (e.g. Chen and Liu, ICDM '05). Such a transformation preserves the Euclidean distance between any pair of data points, so some data mining tools can be applied directly: the k-nearest-neighbor classifier (KNN), support vector machines (SVM), and so on. Later work evaluated the privacy protection in more detail and proposed random projection to a lower-dimensional space (Liu and Kargupta, TKDE 2006; Liu and Kargupta, PKDD '06). Condensation and decomposition (Wang and Zhang, ICDM '06) exploit properties of matrices; within decomposition, wavelet transformation is new. All of these approaches are still in progress. Data swapping is a different approach, which transforms the data set by switching a subset of attributes between selected pairs of records (Fienberg et al., 2003).
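The distance-preserving property of the rotation approach described above can be checked directly. A minimal pure-Python sketch, rotating hypothetical 2-D records by a random angle (the data values are made up for illustration):

```python
import math
import random

random.seed(1)

# Toy multiplicative perturbation: rotate 2-D records by a random angle.
# A rotation is orthogonal, so pairwise Euclidean distances are preserved
# and distance-based miners such as KNN are unaffected by the perturbation.
theta = random.uniform(0, 2 * math.pi)

def rotate(p):
    x, y = p
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

data = [(1.0, 2.0), (3.0, -1.0), (-0.5, 4.0)]   # private records
perturbed = [rotate(p) for p in data]            # released records

# Every pairwise distance survives the rotation.
for i in range(len(data)):
    for j in range(len(data)):
        assert math.isclose(dist(data[i], data[j]),
                            dist(perturbed[i], perturbed[j]))
```

Because only distances (not coordinates) feed into KNN-style classifiers, mining results on the rotated data match those on the originals.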
  • Making PPDM approaches fit real-life situations is the trend in today’s research. We conducted intensive experiments with real-world data sets and gave an applicability study in our DKE '07 paper. Reconstructing the original data distribution does not work well with real-life data: when the distribution of the original data set is easy, the method may work, but when the distribution is hard, the method does not work well. It depends on the distribution! So we suggest not using the distribution as a middle step. In another work, we tailored the data mining tools to fit the PPDM domain, that is, directly adapting the data mining functions to the noise-addition method. We believe this is a fruitful direction for PPDM.

    1. 1. Data Mining, Security and Privacy Prof. Bhavani Thuraisingham Prof. Murat Kantarcioglu Ms Li Liu (PhD Student – completing December 2007) The University of Texas at Dallas August 24, 2008
    2. 2. Outline <ul><li>Data Mining for Security Applications </li></ul><ul><li>Privacy Concerns </li></ul><ul><li>What is Privacy? </li></ul><ul><li>Why is data mining a threat to privacy? </li></ul><ul><li>Developments in Privacy </li></ul><ul><li>Directions for Privacy </li></ul><ul><li>Confidentiality, Privacy and Trust for Data Mining </li></ul>
    3. 3. Data Mining Needs for Counterterrorism: Non-real-time Data Mining <ul><li>Gather data from multiple sources </li></ul><ul><ul><li>Information on terrorist attacks: who, what, where, when, how </li></ul></ul><ul><ul><li>Personal and business data: place of birth, ethnic origin, religion, education, work history, finances, criminal record, relatives, friends and associates, travel history, . . . </li></ul></ul><ul><ul><li>Unstructured data: newspaper articles, video clips, speeches, emails, phone records, . . . </li></ul></ul><ul><li>Integrate the data, build warehouses and federations </li></ul><ul><li>Develop profiles of terrorists, activities/threats </li></ul><ul><li>Mine the data to extract patterns of potential terrorists and predict future activities and targets </li></ul><ul><li>Find the “needle in the haystack” - suspicious needles? </li></ul><ul><li>Data integrity is important </li></ul><ul><li>Techniques have to SCALE </li></ul>
    4. 4. Data Mining Needs for Counterterrorism: Real-time Data Mining <ul><li>Nature of data </li></ul><ul><ul><li>Data arriving from sensors and other devices </li></ul></ul><ul><ul><ul><li>Continuous data streams </li></ul></ul></ul><ul><ul><li>Breaking news, video releases, satellite images </li></ul></ul><ul><ul><li>Some critical data may also reside in caches </li></ul></ul><ul><li>Rapidly sift through the data and discard unwanted data for later use and analysis (non-real-time data mining) </li></ul><ul><li>Data mining techniques need to meet timing constraints </li></ul><ul><li>Quality of service (QoS) tradeoffs among timeliness, precision and accuracy </li></ul><ul><li>Presentation of results, visualization, real-time alerts and triggers </li></ul>
    5. 5. Data Mining for Real-time Threats (diagram): data sources with information about terrorists and terrorist activities feed a pipeline that rapidly sifts through the data and discards irrelevant data, integrates the data sources in real time, mines the data, builds real-time models, examines the results in real time, and reports the final results
    6. 6. What should be done: Form a Research Agenda <ul><li>Immediate action (0 - 1 year) </li></ul><ul><ul><li>We’ve got to know what our current capabilities are </li></ul></ul><ul><ul><li>Do the commercial tools scale? Do they work only on special data and limited cases? Do they deliver what they promise? </li></ul></ul><ul><ul><li>Need an unbiased objective study with demonstrations </li></ul></ul><ul><li>At the same time, work on the big picture </li></ul><ul><ul><li>What do we want? What are our end results for the foreseeable future? What are the criteria for success? How do we evaluate the data mining algorithms? What testbeds do we build? </li></ul></ul><ul><li>Near-term (1 - 3 years) </li></ul><ul><ul><li>Leverage current efforts </li></ul></ul><ul><ul><li>Fill the gaps in a goal-directed way; technology transfer </li></ul></ul><ul><li>Long-term (3 - 5 years and beyond) </li></ul><ul><ul><li>5-year R&D plan for data mining for counterterrorism </li></ul></ul>
    7. 7. IN SUMMARY: <ul><li>Data Mining is very useful to solve Security Problems </li></ul><ul><ul><li>Data mining tools could be used to examine audit data and flag abnormal behavior </li></ul></ul><ul><ul><li>Much recent work in Intrusion detection (unit #18) </li></ul></ul><ul><ul><ul><li>e.g., Neural networks to detect abnormal patterns </li></ul></ul></ul><ul><ul><li>Tools are being examined to determine abnormal patterns for national security </li></ul></ul><ul><ul><ul><li>Classification techniques, Link analysis </li></ul></ul></ul><ul><ul><li>Fraud detection </li></ul></ul><ul><ul><ul><li>Credit cards, calling cards, identity theft etc. </li></ul></ul></ul><ul><ul><ul><li>BUT CONCERNS FOR PRIVACY </li></ul></ul></ul>
    8. 8. What is Privacy <ul><li>Medical Community </li></ul><ul><ul><li>Privacy is about a patient determining what information the doctor should release about him/her </li></ul></ul><ul><li>Financial community </li></ul><ul><ul><li>A bank customer determines what financial information the bank should release about him/her </li></ul></ul><ul><li>Government community </li></ul><ul><ul><li>FBI would collect information about US citizens. However FBI determines what information about a US citizen it can release to say the CIA </li></ul></ul>
    9. 9. Some Privacy concerns <ul><li>Medical and Healthcare </li></ul><ul><ul><li>Employers, marketers, or others knowing of private medical concerns </li></ul></ul><ul><li>Security </li></ul><ul><ul><li>Allowing access to individual’s travel and spending data </li></ul></ul><ul><ul><li>Allowing access to web surfing behavior </li></ul></ul><ul><li>Marketing, Sales, and Finance </li></ul><ul><ul><li>Allowing access to individual’s purchases </li></ul></ul>
    10. 10. Data Mining as a Threat to Privacy <ul><li>Data mining gives us “facts” that are not obvious to human analysts of the data </li></ul><ul><li>Can general trends across individuals be determined without revealing information about individuals? </li></ul><ul><li>Possible threats: </li></ul><ul><ul><li>Combine collections of data and infer information that is private </li></ul></ul><ul><ul><ul><li>Disease information from prescription data </li></ul></ul></ul><ul><ul><ul><li>Military action from pizza deliveries to the Pentagon </li></ul></ul></ul><ul><li>Need to protect the associations and correlations between the data that are sensitive or private </li></ul>
    11. 11. Some Privacy Problems and Potential Solutions <ul><li>Problem: Privacy violations that result due to data mining </li></ul><ul><ul><li>Potential solution: Privacy-preserving data mining </li></ul></ul><ul><li>Problem: Privacy violations that result due to the Inference problem </li></ul><ul><ul><li>Inference is the process of deducing sensitive information from the legitimate responses received to user queries </li></ul></ul><ul><ul><li>Potential solution: Privacy Constraint Processing </li></ul></ul><ul><li>Problem: Privacy violations due to un-encrypted data </li></ul><ul><ul><li>Potential solution: Encryption at different levels </li></ul></ul><ul><li>Problem: Privacy violation due to poor system design </li></ul><ul><ul><li>Potential solution: Develop methodology for designing privacy-enhanced systems </li></ul></ul>
    12. 12. Privacy as Inference: Privacy Constraint Processing <ul><li>Privacy constraint/policy processing </li></ul><ul><ul><li>Based on prior research in security constraint processing </li></ul></ul><ul><ul><li>Simple Constraint: an attribute of a document is private </li></ul></ul><ul><ul><li>Content-based constraint: If a document contains information about X, then it is private </li></ul></ul><ul><ul><li>Association-based Constraint: Two or more documents taken together are private; individually each document is public </li></ul></ul><ul><ul><li>Release constraint: After X is released, Y becomes private </li></ul></ul><ul><li>Augment a database system with a privacy controller for constraint processing </li></ul>
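The four constraint kinds above can be sketched as predicates that a privacy controller evaluates before releasing a document. A toy illustration only: the document fields (`id`, `text`, `private_attributes`) and the constraint contents are hypothetical, not from an actual system.

```python
# Documents already released to this user; release constraints consult it.
released_ids = set()

def simple_private(doc, attribute):
    # Simple constraint: a named attribute of a document is private.
    return attribute in doc.get("private_attributes", ())

def content_private(doc, sensitive_topics=("diagnosis",)):
    # Content-based constraint: private if the document mentions topic X.
    return any(t in doc["text"] for t in sensitive_topics)

def association_private(docs):
    # Association-based constraint: this pair is private when released
    # together, even though each document is public on its own.
    return {"travel_log", "address_book"} <= {d["id"] for d in docs}

def release_private(doc):
    # Release constraint: after X has been released, Y becomes private.
    return doc["id"] == "Y" and "X" in released_ids

doc_y = {"id": "Y", "text": "itinerary"}
assert not release_private(doc_y)   # Y is still public...
released_ids.add("X")
assert release_private(doc_y)       # ...until X goes out the door
```

A privacy controller in front of the DBMS would evaluate such predicates during query, release, and update operations, as the architecture on the next slide suggests.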
    13. 13. Architecture for Privacy Constraint Processing (diagram): a User Interface Manager sits above a Constraint Manager that holds the Privacy Constraints; a Query Processor enforces constraints during query and release operations, an Update Processor during update operations, and a Database Design Tool during database design; all operate over the DBMS and Database
    14. 14. Semantic Model for Privacy Control (diagram): the entity Patient John has “has disease” links to Cancer and Influenza, “travels frequently” to England, and an address link to John’s address; dark lines/boxes contain private information
    15. 15. Privacy Preserving Data Mining <ul><li>Prevent useful results from mining </li></ul><ul><ul><li>Introduce “cover stories” to give “false” results </li></ul></ul><ul><ul><li>Only make a sample of data available so that an adversary is unable to come up with useful rules and predictive functions </li></ul></ul><ul><li>Randomization </li></ul><ul><ul><li>Introduce random values into the data and/or results </li></ul></ul><ul><ul><li>Challenge is to introduce random values without significantly affecting the data mining results </li></ul></ul><ul><ul><li>Give range of values for results instead of exact values </li></ul></ul><ul><li>Secure Multi-party Computation </li></ul><ul><ul><li>Each party knows its own inputs; encryption techniques used to compute final results </li></ul></ul><ul><ul><li>Rules, predictive functions </li></ul></ul><ul><li>Approach: Only make a sample of data available </li></ul><ul><ul><li>Limits ability to learn good classifier </li></ul></ul>
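The randomization idea above can be sketched with zero-mean additive noise: each released value is heavily distorted, yet aggregate statistics that mining algorithms depend on survive. The value ranges and noise scale below are illustrative assumptions, not from the slides.

```python
import random
import statistics

random.seed(42)

# True (private) values, e.g. individual salaries.
data = [random.gauss(50_000, 10_000) for _ in range(10_000)]

# Release each value with zero-mean Gaussian noise added. No single
# released value can be trusted, but the noise cancels in aggregates.
NOISE_SD = 20_000
released = [x + random.gauss(0, NOISE_SD) for x in data]

# The released mean is an unbiased estimate of the true mean; its
# standard error from the noise is NOISE_SD / sqrt(n) = 200 here.
assert abs(statistics.mean(data) - statistics.mean(released)) < 1_000
```

The challenge the slide names, introducing randomness without significantly affecting mining results, shows up here as the trade-off between `NOISE_SD` (privacy) and the error in the recovered aggregates (utility).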
    16. 16. Cryptographic Approaches for Privacy Preserving Data Mining <ul><li>Secure Multi-party Computation (SMC) for PPDM </li></ul><ul><ul><li>Mainly used for distributed data mining; provably secure under some assumptions; learned models are accurate; mainly the semi-honest assumption (i.e. parties follow the protocols); the malicious model has also been explored recently (e.g. the Kantarcioglu and Kardes paper in this workshop); many SMC-based PPDM algorithms share common sub-protocols (e.g. dot product, summation, etc.) </li></ul></ul><ul><li>Drawbacks: </li></ul><ul><ul><li>Still not efficient enough for very large datasets (e.g. petabyte-sized datasets?); the semi-honest model may not be realistic; the malicious model is even slower </li></ul></ul><ul><li>Possible new directions </li></ul><ul><ul><li>New models that trade off better between efficiency and security; game-theoretic / incentive issues in PPDM; combining anonymization and cryptographic techniques for PPDM </li></ul></ul>
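One of the shared sub-protocols mentioned above, secure summation, can be sketched for semi-honest parties arranged in a ring; the field modulus is an illustrative choice, and the single function stands in for the message passing between parties.

```python
import random

FIELD = 2**61 - 1  # large prime modulus; an illustrative choice

def secure_sum(inputs):
    """Ring-based secure summation in the semi-honest model.

    Party 0 masks its value with a random field element, each later
    party adds its own input to the running total it receives, and
    party 0 finally strips the mask. Because every intermediate total
    is offset by the uniform mask, no party learns another's input.
    """
    rng = random.SystemRandom()
    mask = rng.randrange(FIELD)
    running = (inputs[0] + mask) % FIELD   # party 0 masks its value
    for x in inputs[1:]:                   # each party adds its input
        running = (running + x) % FIELD
    return (running - mask) % FIELD        # party 0 removes the mask

assert secure_sum([10, 20, 30]) == 60
```

This is the semi-honest picture: a malicious party could simply add a wrong value, which is why the malicious model the slide mentions requires heavier machinery and is slower.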
    17. 17. Perturbation Based Approaches for Privacy Preserving Data Mining <ul><li>Goal: Distort the data while still preserving the properties needed for data mining purposes. </li></ul><ul><li>Goal: Achieve high data mining accuracy with maximum privacy protection. </li></ul><ul><li>Additive Based </li></ul><ul><li>Multiplicative Based </li></ul><ul><li>Condensation based </li></ul><ul><li>Decomposition </li></ul><ul><li>Data Swapping </li></ul><ul><li>Our approach: Privacy is a personal choice, so it should be individually adaptable (Liu, Kantarcioglu and Thuraisingham ICDM’06) </li></ul>
    18. 18. Perturbation Based Approaches for Privacy Preserving Data Mining <ul><li>The trend is to make PPDM approaches reflect reality </li></ul><ul><li>We investigated perturbation based approaches with real-world data sets </li></ul><ul><li>We give an applicability study of the current approaches </li></ul><ul><ul><li>Liu, Kantarcioglu and Thuraisingham, DKE 07 </li></ul></ul><ul><li>We found that: </li></ul><ul><ul><li>The reconstruction of the original distribution may not work well with real-world data sets </li></ul></ul><ul><ul><li>We therefore modify perturbation techniques and adapt some data mining tools, e.g. Liu, Kantarcioglu and Thuraisingham, novel decision tree – UTD technical report 06 </li></ul></ul>
    19. 19. Platform for Privacy Preferences (P3P): What is it? <ul><li>P3P is an emerging industry standard that enables web sites to express their privacy practices in a standard format </li></ul><ul><li>The format of the policies can be automatically retrieved and understood by user agents </li></ul><ul><li>It is a product of the W3C (World Wide Web Consortium) </li></ul><ul><li>www.w3c.org </li></ul><ul><li>When a user enters a web site, the privacy policies of the web site are conveyed to the user; if the privacy policies differ from the user’s preferences, the user is notified and can then decide how to proceed </li></ul><ul><li>Several major corporations are working on P3P standards </li></ul>
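The user-agent check described above can be sketched as a comparison of a site's declared practices against the user's preferences. Real P3P policies are XML documents with a defined vocabulary (purposes, recipients, retention); the dictionary fields below are simplified stand-ins, not actual P3P syntax.

```python
# Hypothetical site policy and user preferences; field names and
# values are illustrative, not the P3P vocabulary itself.
site_policy = {"purposes": {"admin", "marketing"}, "retention": "indefinitely"}
user_prefs  = {"purposes": {"admin"},              "retention": "stated-purpose"}

def policy_conflicts(policy, prefs):
    # Collect every point where the site's declared practices exceed
    # what the user has agreed to accept.
    issues = []
    for p in sorted(policy["purposes"] - prefs["purposes"]):
        issues.append(f"unaccepted purpose: {p}")
    if policy["retention"] != prefs["retention"]:
        issues.append("retention period differs")
    return issues

if policy_conflicts(site_policy, user_prefs):
    print("Notify user: site policy differs from stated preferences")
```

On a conflict the user agent notifies the user, who then decides whether to proceed, negotiate, or leave, exactly the flow the slide describes.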
    20. 20. Privacy for Assured Information Sharing (diagram): the Data/Policy for Agencies A, B and C are each connected through an Export Data/Policy component to the Data/Policy for the Federation
    21. 21. Privacy Preserving Surveillance (diagram): raw video surveillance data flows into a Face Detection and Face Derecognizing system (faces of trusted people are derecognized to preserve privacy) and then into a Suspicious Event Detection System; the suspicious people and events found, combined with manual inspection of the video data and the reports of security personnel, yield a comprehensive security report listing the suspicious events and people detected
    22. 22. Directions: Foundations of Privacy Preserving Data Mining <ul><li>We proved in 1990 that the inference problem in general is unsolvable; the suggestion therefore was to explore the solvability aspects of the problem. </li></ul><ul><li>Can we do something similar for privacy? </li></ul><ul><ul><li>Is the general privacy problem solvable? </li></ul></ul><ul><ul><li>What are the complexity classes? </li></ul></ul><ul><ul><li>What are the storage and time complexities? </li></ul></ul><ul><li>We need to explore the foundations of PPDM and related privacy solutions </li></ul>
    23. 23. Directions: Testbed Development and Application Scenarios <ul><li>There are numerous PPDM related algorithms. How do they compare with each other? We need a testbed with realistic parameters to test the algorithms </li></ul><ul><li>It is time to develop real world scenarios where these algorithms can be utilized </li></ul><ul><li>Is it feasible to develop realistic commercial products or should each organization adapt product to suit their needs? </li></ul>
    24. 24. Key Points <ul><li>1. There is no universal definition of privacy; each organization must define what it means by privacy and develop appropriate privacy policies </li></ul><ul><li>2. Technology alone is not sufficient for privacy. We need technologists, policy experts, legal experts and social scientists to work on privacy </li></ul><ul><li>3. Some well known people have said “Forget about privacy.” Should we therefore pursue research on privacy? </li></ul><ul><ul><li>These are interesting research problems; we need to continue with the research </li></ul></ul><ul><ul><li>Something is better than nothing </li></ul></ul><ul><ul><li>Try to prevent privacy violations, and if violations occur, prosecute </li></ul></ul><ul><li>4. We need to tackle privacy from all directions </li></ul>
    25. 25. Application Specific Privacy? <ul><li>Examining privacy may make sense for healthcare and financial applications </li></ul><ul><li>Does privacy work for defense and intelligence applications? </li></ul><ul><li>Is it even meaningful to have privacy for surveillance and geospatial applications? </li></ul><ul><ul><li>Once the image of my house is on Google Earth, how much privacy can I have? </li></ul></ul><ul><ul><li>I may want my location to be private, but does that make sense if a camera can capture a picture of me? </li></ul></ul><ul><ul><li>If there are sensors all over the place, is privacy preserving surveillance meaningful? </li></ul></ul><ul><li>This suggests that we need application specific privacy </li></ul><ul><li>It is not meaningful to examine PPDM for every data mining algorithm and every application </li></ul>
    26. 26. CPT: Confidentiality, Privacy and Trust <ul><li>When I, as a user of organization A, send data about myself to organization B, I read the privacy policies enforced by organization B </li></ul><ul><ul><li>If I agree to the privacy policies of organization B, then I will send my data to organization B </li></ul></ul><ul><ul><li>If I do not agree with the policies of organization B, then I can negotiate with organization B </li></ul></ul><ul><li>Even if the web site states that it will not share private information with others, do I trust the web site? </li></ul><ul><li>Note: while confidentiality is enforced by the organization, privacy is determined by the user. Therefore, for confidentiality, the organization will determine whether a user can have the data; if so, the organization can further determine whether the user can be trusted </li></ul>
    27. 27. Confidentiality, Privacy and Trust <ul><li>How can we ensure the confidentiality of the data mining processes and results? </li></ul><ul><ul><li>Access control policies </li></ul></ul><ul><li>How can we trust the data mining processes and results? </li></ul><ul><ul><li>Verification and validation </li></ul></ul><ul><li>How can we integrate confidentiality, privacy and trust with respect to data mining? </li></ul><ul><ul><li>Need to examine the research challenges and form a research agenda </li></ul></ul>
    28. 28. Data Mining and Privacy: Friends or Foes? <ul><li>They are neither friends nor foes </li></ul><ul><li>Need advances in both data mining and privacy </li></ul><ul><li>Need to design flexible systems </li></ul><ul><ul><li>For some applications one may have to focus entirely on “pure” data mining while for some others there may be a need for “privacy-preserving” data mining </li></ul></ul><ul><ul><li>Need flexible data mining techniques that can adapt to the changing environments </li></ul></ul><ul><li>Technologists, legal specialists, social scientists, policy makers and privacy advocates MUST work together </li></ul>