Mina Deng PhD defense presentation

Presentation Transcript

  • Privacy Preserving Content Protection. PhD Defense, Mina Deng. Promoter: Prof. Bart Preneel. COSIC, ESAT/SCD, KU Leuven, July 2010.
  • Introduction
  • The age of privacy is over?
  • Privacy definitions
    Individual rights: “the right to be let alone” (Warren and Brandeis, 1890)
    Informational self-determination: “Privacy is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.” (Alan Westin, 1967)
    Access and control: “control access to oneself and to personal information about oneself” (Adam Moore, 1998)
    Pluralistic resemblance: “Privacy is a plurality of different things.” “It is a set of protections against a related cluster of problems.” (Daniel Solove, 2008)
    Privacy & data minimization: “Data controllers should collect only the personal data they really need, and should keep it only for as long as they need it.” (European Data Protection Directive 95/46/EC, 1995)
  • Debate: privacy vs. security
    Tradeoff: security & privacy
    • Get more of one at the expense of the other.
    • After 9/11, give up civil liberties & privacy for national security.
    • Popular response: “I have nothing to hide.”
    • “The nothing to hide argument is an argument that the privacy interest is generally minimal to trivial, thus making the balance against security concerns a foreordained victory for security.” – Daniel Solove (privacy scholar)
    Need to have both
    • “These two components of security – safety and privacy … I work from the assumption that you need to have both.” – Donald Kerr (US deputy director of national intelligence)
    • “Security and privacy are not opposite ends of a seesaw. There is no security without privacy. And liberty requires both security and privacy.” – Bruce Schneier (security commenter)
  • Content protection motivation
  • Industry interests
  • Content protection, core techniques. Encryption: first line of defense
    • + prevents unauthorized access
    • – no content protection after decryption
    [Diagram: symmetric encryption, where sender (Homer) and receiver (Marge) share one secret key; and asymmetric encryption, where senders (Lisa & Bart) encrypt with Homer’s public key and Homer decrypts with his private key.]
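To make the two modes in the diagram concrete, here is a minimal sketch (my own illustration, not code from the thesis) using the Python `cryptography` package: a shared Fernet key for the symmetric case, and an RSA key pair with OAEP padding for the asymmetric case.

```python
# pip install cryptography -- illustrative sketch of the two encryption modes.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Symmetric: Homer and Marge share one secret key.
shared_key = Fernet.generate_key()
ciphertext = Fernet(shared_key).encrypt(b"the content")
plaintext = Fernet(shared_key).decrypt(ciphertext)

# Asymmetric: Lisa & Bart encrypt with Homer's public key;
# only Homer's private key can decrypt.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
homer_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
ct = homer_private.public_key().encrypt(b"the content", oaep)
pt = homer_private.decrypt(ct, oaep)

assert plaintext == pt == b"the content"
# Once decrypted, nothing stops redistribution: hence watermarking
# as the second line of defense below.
```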
  • Content protection, core techniques. Digital watermarking: second line of defense
    • Embed information imperceptibly
    • e.g. to prove ownership
    [Diagram: watermark embedding into the original content with a secret watermarking key yields the watermarked content; after distribution, processing, or attack, detection/extraction with the same key recovers the watermark.]
  • Digital watermarking illustration
    [Figure: 512×512 original image and visually identical watermarked image; 64×64 watermark image and extracted watermark (correlation = 0.9997).]
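A minimal sketch of correlation-based detection (my toy illustration, not the exact scheme behind the figure): embed a secret ±1 pattern additively, then detect by correlating the residual with the pattern. The seed, strength `alpha`, and non-blind detection are all assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(seed=42)              # the seed plays the role of the secret key
image = rng.uniform(0, 255, size=(512, 512))      # stand-in for the 512x512 original image

watermark = rng.choice([-1.0, 1.0], size=(512, 512))  # secret pseudo-random pattern
alpha = 2.0                                       # embedding strength (imperceptibility trade-off)
watermarked = image + alpha * watermark           # additive embedding

def detect(content, original, pattern):
    """Normalized correlation between the residual and a candidate pattern."""
    residual = content - original                 # non-blind detection uses the original
    return residual.ravel() @ pattern.ravel() / (
        np.linalg.norm(residual) * np.linalg.norm(pattern))

wrong_key = np.random.default_rng(seed=7).choice([-1.0, 1.0], size=(512, 512))
print(f"right key: {detect(watermarked, image, watermark):.4f}")   # ~1.0: watermark present
print(f"wrong key: {detect(watermarked, image, wrong_key):.4f}")   # ~0.0: no false match
```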
  • Privacy issue in content protection
    [Diagram: the content lifecycle (creation, distribution, payment, use control, usage monitoring) is the user’s privacy nightmare.]
  • Research motivation
    Conflict between:
    • the content protection interests of the provider
    • the privacy rights of the user
    Can we reconcile privacy with protection of content?
  • Outline
    Introduction
    Overview of contributions
    Privacy threat analysis framework
    Anonymous buyer-seller watermarking protocols
    Conclusion
  • Overall structure: privacy preserving content protection
    Research questions
    • Privacy preserving content protection systems
    • Privacy analysis methodology
    Proposed solutions
    • Privacy analysis methodology: privacy threat framework (Ch 2)
    • Content protection for commercial content: BSW protocols (Ch 3)
    • Privacy protection for personal content: privacy-friendly eHealth (Ch 4) and personal rights management (Ch 5)
  • Chapter 2. Privacy threat analysis framework
  • Contribution (J.RE 2010)
    Background
    • Threat modeling: threats, requirements, countermeasures
    • Two pillars: methodology and knowledge (checklists & patterns)
    • Methodological support in security: goal-oriented (KAOS), scenario-based (STRIDE)
    Problem
    • Privacy threat analysis lacks a systematic approach.
    Our solution
    • A privacy threat analysis framework: model threats to system elements, instantiate threats using threat tree patterns, elicit requirements from misuse cases, and select countermeasures according to the requirements.
  • Chapter 3. Anonymous Buyer-Seller Watermarking Protocols
  • Contribution (J.TIFS 2010, MMSEC 2009)
    Background
    • Massive online distribution: + efficiency and convenience; – threats to intellectual property rights
    • Traditional assumption: providers are trustworthy (no illegal distribution, honest embedding). Not realistic!
    • Traceability discredited: a seller can frame an innocent buyer; a guilty buyer can repudiate.
    Problem
    • Copyright protection (provider) vs. privacy protection (user)
    Our solution
    • Limited trust in the seller
    • Traceability: a unique code embedded per transaction, enabling copyright protection & piracy tracing
    • Buyer’s revocable anonymity
    • Formal security analysis: the actual protocol security is bounded by the security of the watermarking scheme.
  • Chapter 4. Privacy-friendly architecture to manage distributed e-Health information
  • Contribution (J.OIR 2009, E-Health Handbook 2009)
    Background
    • E-Health systems handle privacy-sensitive content: an overview of the patient’s medical history.
    • Privacy threats: cross-referencing content & ID across providers, intensive use of the patient’s ID, different sensitivity levels.
    Problem
    • Content sharing & interoperability (healthcare provider) vs. privacy protection (patient)
    Our solution
    • An architecture for distributed e-health with limited trust in healthcare service providers: a mediating service, data anonymization, and practical validation.
  • Chapter 5. Personal rights management for individual privacy enforcement
  • Contribution (PET’06, CMS’05)
    Background
    • Personal content distribution: (phone) cameras, blogs, social networks, search engines; private pictures taken & published; technology trends worsen the situation.
    • Emerging privacy threats: from governments and industry, and from normal individuals.
    Problem
    • Privacy protection (an individual) vs. personal content distribution (other individuals)
    Our solution
    • A detection mechanism to control pictures taken by others, with no restriction & no privacy infringement for photographers; assumes a distribution channel and a non-professional adversary.
  • Outline
    Introduction
    Overview of contributions
    Privacy threat analysis framework
    Anonymous buyer-seller watermarking protocols
    Conclusion
  • Privacy analysis framework
    System-specific input: assumptions & high-level system description; usage scenarios.
    Methodology: define the data flow diagram (DFD) → map privacy threats to DFD elements → identify misuse case scenarios → risk-based prioritization → elicit privacy requirements → select privacy enhancing solutions.
    Knowledge base: mapping of privacy threats to DFD components; privacy threat tree patterns; risk assessment techniques (not included); mapping of misuse cases to requirements; mapping of privacy objectives to solutions.
  • Privacy threat analysis: illustration

    Privacy properties and the privacy threats they counter:

    Privacy property                    Privacy threat
    Unlinkability                       Linkability
    Anonymity & pseudonymity            Identifiability
    Plausible deniability               Non-repudiation
    Undetectability & unobservability   Detectability
    Confidentiality                     Disclosure of information
    Content awareness                   Content unawareness
    Policy and consent compliance       Policy and consent noncompliance

    Mapping privacy threats to DFD element types (each X corresponds to a threat tree pattern):

    Privacy threat                  Entity   Data flow   Data store   Process
    Linkability                     X        X           X            X
    Identifiability                 X        X           X            X
    Non-repudiation                          X           X            X
    Detectability                            X           X            X
    Information disclosure                   X           X            X
    Content unawareness             X
    Policy/consent noncompliance             X           X            X
  • Elicited privacy requirements & mitigation strategies

    1. Threat scenario: linkability of the social network data store.
       Privacy requirement: unlinkability of data entries within the social network database.
       Mitigation: protect the data store by applying data anonymization techniques, such as k-anonymity.

    2. Threat scenario: linkability of the data flow (user-portal).
       Privacy requirement: unlinkability of messages of user-portal communication.
       Mitigation: employ an anonymity system, e.g. Tor.

    3. Threat scenario: linkability of entities (the social network users).
       Privacy requirement: unlinkability of different pseudonyms (user IDs) of social network users.
       Mitigation: technical enforcement (use an anonymity system such as Tor between user and social network web portal); user privacy self-awareness (revealing too much information online can be privacy invasive); channel and message confidentiality of the data flow.

    4. Threat scenario: identifiability at the social network data store.
       Privacy requirement: anonymity of social network users, such that a user cannot be identified from social network database entries.
       Mitigation: protect the data store by applying data anonymization techniques, such as k-anonymity.

    5. Threat scenario: identifiability at the data flow (user-portal).
       Privacy requirement: anonymity of social network users, such that a user cannot be identified from user-portal communication.
       Mitigation: technical enforcement: use an anonymity system, such as Tor, between user and social network web portal.
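Rows 1 and 4 suggest k-anonymity for the data store. As a minimal sketch (the records and quasi-identifiers below are my own toy example, not data from the thesis): a table is k-anonymous when every combination of quasi-identifier values occurs in at least k records.

```python
from collections import Counter

# Toy, already-generalized records (illustrative column names).
records = [
    {"zip": "30**", "age": "20-30", "diagnosis": "flu"},
    {"zip": "30**", "age": "20-30", "diagnosis": "asthma"},
    {"zip": "30**", "age": "20-30", "diagnosis": "flu"},
    {"zip": "31**", "age": "40-50", "diagnosis": "diabetes"},
    {"zip": "31**", "age": "40-50", "diagnosis": "flu"},
]

def is_k_anonymous(rows, quasi_identifiers, k):
    """True if each quasi-identifier combination appears in at least k rows."""
    counts = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return all(c >= k for c in counts.values())

print(is_k_anonymous(records, ["zip", "age"], k=2))  # True: smallest group has 2 rows
```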
  • Outline
    Introduction
    Overview of contributions
    Privacy threat analysis framework
    Anonymous buyer-seller watermarking protocols
    Conclusion
  • Online transaction scenario
    [Diagram: the parties in an online transaction: Buyer, Seller, Group Manager, and Judge.]
  • Anonymous buyer-seller watermarking protocols
    Protocol phases: 1. registration; 2. watermark generation & embedding; 3. identification & arbitration.
    Building blocks
    • Homomorphic encryption, enabling watermarking in the encrypted domain: for an operation ∘_M on messages and an operation ∘_C on ciphertexts, ∀ m1, m2 ∈ M: E(m1 ∘_M m2) = E(m1) ∘_C E(m2).
    • Group signature
    • Zero-knowledge proof
    Properties
    • Traceability (seller’s security)
    • Non-repudiation (seller’s security)
    • Non-framing (buyer’s security)
    • Anonymity & unlinkability (buyer’s security)
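A minimal sketch of this homomorphic property, using a toy Paillier instance (the scheme named on the implementation slide); the parameters here are illustrative and far too small for real use. Multiplying two ciphertexts yields an encryption of the sum of the plaintexts, which is what allows computation on content the seller cannot read.

```python
import math
import secrets

def paillier_keygen(p: int, q: int):
    n = p * q
    lam = math.lcm(p - 1, q - 1)     # Carmichael's lambda(n)
    mu = pow(lam, -1, n)             # valid because we pick g = n + 1
    return (n, n + 1), (lam, mu)     # public key (n, g), secret key (lam, mu)

def encrypt(pk, m):
    n, g = pk
    r = secrets.randbelow(n - 1) + 1               # random blinding factor
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    return ((pow(c, lam, n * n) - 1) // n * mu) % n   # L(x) = (x - 1) / n, times mu

pk, sk = paillier_keygen(2011, 2029)   # toy primes; the thesis uses a 1024-bit modulus N
c_sum = (encrypt(pk, 42) * encrypt(pk, 17)) % (pk[0] ** 2)
assert decrypt(pk, sk, c_sum) == 42 + 17   # E(m1) * E(m2) is an encryption of m1 + m2
```

This multiply-to-add behaviour is exactly what the Type II construction below uses to combine E(W_S) and E(W_B) into E(W_S + W_B) without the seller ever seeing the buyer’s watermark.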
  • Registration phase
    Buyer and Group Manager interact over a secure & authenticated channel (group joining):
    gsk_i ← GSjoin(gpk, usk_i)
    reg_i ← GSiss(gpk, isk, upk_i)
    The group manager learns the buyer’s ID; the buyer obtains a secret group signature key gsk_i.
  • Watermark generation & embedding phase
    Buyer and Seller interact over an anonymous channel. The buyer computes
    (sk_B, pk_B) ← BKgen(1^k)
    C ← JEnc(pk_J, sk_B)
    c_i ← BEnc(pk_B, W_B,i)
    m ← (pk_B, j, (c_i)_{i=1..l}, C)
    s_m ← GSsig(gpk, gsk_i, m)
    and sends them together with zero-knowledge proofs π_1, π_2:
    • fair encryption of the private key (C encrypts sk_B under the judge’s key)
    • bit encryption of the watermark (each c_i encrypts one bit of W_B)
    The seller then embeds in the encrypted domain: WATemb(swk, X, BEnc(pk_B, W)).
  • Watermark generation & embedding: basic concept and Type I
    Basic concept
    • The seller and the buyer each generate part of the watermark.
    • The seller learns neither the buyer’s watermark nor the watermarked content delivered to the buyer.
    • The buyer learns neither the original content nor the seller’s watermark.
    Type I
    • Security for seller & buyer; supports multiple transactions.
    • Intermediate watermarked content: X′ = X ⊕ V (original content ⊕ index watermark).
    • Final watermarked content, computed in the encrypted domain: E(Y) = E(X′ ⊕ σ(W)) = E(X′) ⊗ E(σ(W)), where σ(W) is the permuted buyer’s watermark.
  • Watermark generation & embedding: Type II
    • Not limited to permutation-tolerant watermarking schemes.
    • Intermediate watermarked content: X′ = X ⊕ V (original content ⊕ index watermark).
    • Composite watermark via the additive homomorphism: E(W) = E(W_S + W_B) = E(W_S) × E(W_B) (seller’s watermark + buyer’s watermark).
    • Final watermarked content: E(Y) = E(X′ ⊕ W) = E(X′) ⊗ E(W).
  • Watermark generation & embedding: Type III
    • Avoids double watermarking.
    • Composite watermark: W = φ ∥ (W_S ⊕ W_B) (index watermark concatenated with the intermediate composite watermark).
    • In the encrypted domain: E(W) = {E(φ_1), …, E(φ_l1)} ∥ {E(W_SB,1), …, E(W_SB,l2)}, where
      E(W_SB,i) = E(W_S,i ⊕ W_B,i) = E(W_B,i) if W_S,i = 0, and E(1) × E(W_B,i)⁻¹ if W_S,i = 1.
    • Final watermarked content: original content ⊕ W.
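Because the seller knows its own watermark bit W_S,i in the clear, the case split above is a purely local operation on the encrypted buyer bit. A hedged sketch of this encrypted XOR, reusing the same toy Paillier construction as the earlier sketch (an illustration of the identity, not the thesis implementation):

```python
import math
import secrets

def keygen(p, q):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    return (n, n + 1), (lam, pow(lam, -1, n))

def enc(pk, m):
    n, g = pk
    r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def dec(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    return ((pow(c, lam, n * n) - 1) // n * mu) % n

pk, sk = keygen(2011, 2029)          # toy parameters, as before
n_sq = pk[0] ** 2

def xor_encrypted(w_s: int, c_wb: int) -> int:
    """Seller-side XOR of its plaintext bit w_s into the encrypted buyer bit."""
    if w_s == 0:
        return c_wb                                        # E(0 XOR w_b) = E(w_b)
    return (enc(pk, 1) * pow(c_wb, -1, n_sq)) % n_sq       # E(1 - w_b) = E(1) * E(w_b)^-1

for w_s in (0, 1):
    for w_b in (0, 1):
        assert dec(pk, sk, xor_encrypted(w_s, enc(pk, w_b))) == w_s ^ w_b
```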
  • Identification and arbitration phase
    The seller detects the watermark in a suspicious copy Y and contacts the judge over a secure & authenticated channel:
    W ← WATdet(swk, Y)
    The judge asks the group manager to open the group signature and reveal the buyer:
    (B_i, τ) ← GSopen(gpk, osk, reg, m, s_m)
  • Implementation of the Type III BSW protocol
    Parameters
    • 512×512-pixel image, ≈ 2 Mbit
    • 128-bit watermark
    • Paillier modulus N of 1024 bits
    • Run on a 2.4 GHz CPU
    Execution time
    • Registration: < 0.5 s
    • Identification & arbitration: < 2.5 s
    • Most of the computational load falls on the seller in the watermark generation & embedding (WGE) phase.
    Communication complexity (in exchanged bits)
    • Watermark generation & embedding: ≈ 8 Mbit
    • Identification & arbitration: ≈ 0.4 Mbit
    • Expansion factor: ≈ 4.2
  • Outline
    Introduction
    Overview of contributions
    Privacy threat analysis framework
    Anonymous buyer-seller watermarking protocols
    Conclusions
  • Conclusions
    Privacy threats emerge from trust in providers.
    A balance is needed between content protection (for the provider) and privacy protection (for the user).
    Privacy, like security, is an embodied value.
    Build privacy in, with a goal-oriented framework.
    Content protection techniques can also protect privacy.
    Yes, it is possible to reconcile privacy with protection of content.
  • List of publications

    International journals
    • Mina Deng, Kim Wuyts, Riccardo Scandariato, Bart Preneel, and Wouter Joosen. A privacy threat analysis framework: supporting the elicitation and fulfillment of privacy requirements. Requirements Engineering Journal, special issue on data privacy, to appear, 27 pages, 2010.
    • Alfredo Rial, Mina Deng, Tiziano Bianchi, Alessandro Piva, and Bart Preneel. Anonymous buyer-seller watermarking protocols: formal definitions and security analysis. IEEE Transactions on Information Forensics and Security, to appear, 11 pages, 2010.
    • Mina Deng, Danny De Cock, and Bart Preneel. Towards a cross-context identity management framework in e-health. Online Information Review, 33(3):422-442, 2009.
    • Mina Deng and Bart Preneel. Attacks on two buyer-seller watermarking protocols and an improvement for revocable anonymity. International Journal of Intelligent Information Technology Application, 1(2):53-64, 2008.

    Book chapters
    • Mina Deng, Danny De Cock, and Bart Preneel. An interoperable cross-context architecture to manage distributed personal e-health information. In M. M. Cunha, R. Simoes, and A. Tavares, editors, Handbook of Research on Developments in E-Health and Telemedicine: Technological and Social Perspectives, ISBN 978-1-61520-670-4, chapter 27, pages 576-602. IGI Global, Hershey, PA, USA, 2009.
    • Mina Deng and Bart Preneel. On secure buyer-seller watermarking protocols with revocable anonymity. In Kyeong Kang, editor, E-Commerce, ISBN 978-953-7619-98-5, chapter 11, pages 184-202. IN-TECH Education and Publishing, Vienna, Austria, 2009.

    International conferences (selected)
    • Mina Deng, Tiziano Bianchi, Alessandro Piva, and Bart Preneel. An efficient buyer-seller watermarking protocol based on composite signal representation. In Proceedings of the 11th ACM Workshop on Multimedia and Security (MMSEC), pages 9-18, Princeton, New Jersey, USA. ACM, 2009.
    • Mina Deng and Bart Preneel. On secure and anonymous buyer-seller watermarking protocol. In Abdelhamid Mellouk, Jun Bi, Guadalupe Ortiz, Dickson K. W. Chiu, and Manuela Popescu, editors, Third International Conference on Internet and Web Applications and Services (ICIW), pages 524-529, Athens, Greece. IEEE Computer Society, 2008.
    • Mina Deng, Lothar Fritsch, and Klaus Kursawe. Personal rights management: taming camera-phones for individual privacy enforcement. In George Danezis and Philippe Golle, editors, Privacy Enhancing Technologies, 6th International Workshop (PET), Revised Selected Papers, volume 4258 of Lecture Notes in Computer Science, pages 172-189, Cambridge, UK. Springer, 2006.
  • Questions? Thank you! ☺ mina.deng@esat.kuleuven.be