“Privacy: A Surmountable Challenge for Computer Vision,” a Presentation from Santa Clara University

For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2022/08/privacy-a-surmountable-challenge-for-computer-vision-a-presentation-from-santa-clara-university/

Susan Kennedy, Assistant Professor of Philosophy at Santa Clara University, presents the “Privacy: A Surmountable Challenge for Computer Vision” tutorial at the May 2022 Embedded Vision Summit.

Ethical concerns about privacy come with the territory of computer vision—after all, it’s hard to deny the privacy implications when considering the digital images or videos that are the bread and butter of these applications. But privacy concerns need not be a roadblock that stands in the way of technological innovation.

While the idea of “informed consent” has been a popular method of preserving privacy in some domains, it may not always be either possible or desirable in the context of emerging technologies like computer vision. In this talk, Kennedy considers a more flexible approach to preserving privacy that pays special attention to the context that a device is being designed for. In addition, she reviews several examples that demonstrate how small changes to the design of technology can make a big difference when it comes to preserving privacy.


Slide 1: Privacy: A Surmountable Challenge for Computer Vision
Susan Kennedy, PhD, Assistant Professor of Philosophy, Santa Clara University
Slide 2: © 2022 Santa Clara University
Slide 3: Privacy: A Roadblock for Innovation?
Slide 4: Agenda
• Limitations of Current Privacy Procedures
• Contextual Integrity Framework
• Case Studies: Privacy in Practice
• Key Takeaways
Slide 5: Looking Beyond Informed Consent
Slide 6: Contextual Integrity
• Context shapes users’ expectations of privacy
• Privacy is a right to the appropriate flow of information
Slide 7: Context-Relative Informational Norms
• Context: What is the prevailing context?
• Actors: Who are the subjects, senders and recipients of information?
• Attributes: What is the type or nature of information?
• Transmission Principle: What are the constraints on the flow of information?
Slide 8: Context-Relative Informational Norms (continued)
If the new practice results in any changes to these four features (context, actors, attributes, transmission principle), the practice is flagged as violating privacy.
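The flagging rule on slide 8 lends itself to a small sketch. The following is a hypothetical illustration, not code from the talk (the class and function names are my own): a norm is represented by its four parameters, and a new practice is flagged if any parameter deviates from the established norm.

```python
from dataclasses import dataclass, asdict


@dataclass(frozen=True)
class InformationalNorm:
    """A context-relative informational norm (after Nissenbaum)."""
    context: str                  # the prevailing social context
    actors: tuple                 # (subject, sender, recipient)
    attributes: tuple             # type or nature of the information
    transmission_principle: str   # constraint on the flow of information


def changed_features(established: InformationalNorm,
                     new_practice: InformationalNorm) -> list:
    """Return the features a new practice changes; a non-empty
    result flags the practice as a prima facie privacy violation."""
    old, new = asdict(established), asdict(new_practice)
    return [name for name in old if old[name] != new[name]]


# Example modeled on the fall-detection case (slides 10-12).
established = InformationalNorm(
    context="caregiving in a health care facility",
    actors=("patient", "patient", "caregiver"),
    attributes=("in-person observations of the patient",),
    transmission_principle="caregiver's mandate, confidential",
)
new_practice = InformationalNorm(
    context="caregiving in a health care facility",
    actors=("patient", "camera", "annotators at Kepler"),
    attributes=("videos and images of the patient",),
    transmission_principle="caregiver's mandate, confidential",
)
print(changed_features(established, new_practice))  # ['actors', 'attributes']
```

Routing patient video to remote annotators changes the actors and attributes of the information flow, so the practice is flagged; as slide 9 notes, it may still be permissible after an all-things-considered moral analysis.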
Slide 9: Second Chances
• The contextual integrity framework can pinpoint privacy concerns and help guide design solutions
• Practices that are flagged as violating privacy may still be permissible, all things considered
Slide 10: Fall Detection Devices
Slide 11: Application of the Contextual Integrity Framework – Fall Detection Device
• Context: Caregiving in health care facilities
• Actors: Patients, caregivers, annotators at Kepler
• Attributes: Videos and images of patients
• Transmission Principle: Caregiver’s mandate, confidential
Slide 12: Fall Detection Device – Analysis
• Design Solutions
  • Security measures for the annotation process
• Moral Considerations
  • Users directly benefit from improved model accuracy
  • Invasions of user privacy are minimized compared to the alternative best practices for caregivers
Slide 13: Application of the Contextual Integrity Framework – ID.me Facial Recognition
• Context: Identity verification and fraud detection for the IRS
• Actors: IRS, ID.me, third parties
• Attributes: Images of users
• Transmission Principle: Mandatory
Slide 14: ID.me Facial Recognition – Analysis
• Design Solutions
  • Alternative verification option for users that does not involve facial recognition
  • Create an avenue for users to provide consent for data sharing
• Moral Considerations
  • Users directly benefit from improved fraud detection
  • Improves equity by enabling online access to government services for users who are overseas
Slide 15: Operating Room Monitoring
(“Human Pose Estimation on Privacy-Preserving Low-Resolution Depth Images,” Srivastav, Gangi, Padoy, 2019)
Slide 16: Design Choices: Levels of Access
(“Blinkering Surveillance: Enabling Video Privacy through Computer Vision,” Senior, Pankanti, Hampapur, Brown, Tian, Ekin, 2004)
Slide 17: Key Takeaways
• The contextual integrity framework can be used to:
  • Pinpoint privacy concerns
  • Guide design solutions
  • Frame moral considerations
• Design choices: hardware selection, de-identification, data minimization, different levels of information access, etc.
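One of the design choices above, different levels of information access, can be illustrated with a minimal, hypothetical sketch (the level names and fields are my assumptions, loosely inspired by the Senior et al. 2004 system cited on slide 16): each viewer role receives only a progressively less redacted rendering of the same analyzed video frame.

```python
from enum import IntEnum


class AccessLevel(IntEnum):
    """Viewer roles ordered from least to most privileged."""
    STATISTICS = 0      # e.g., room occupancy counts only
    EVENTS = 1          # e.g., "fall detected" alerts
    REDACTED_VIDEO = 2  # e.g., blurred or silhouetted footage
    RAW_VIDEO = 3       # full footage (e.g., authorized security staff)


def release(frame_analysis: dict, level: AccessLevel) -> dict:
    """Release only the information permitted at the viewer's level."""
    released = {"occupancy": frame_analysis["occupancy"]}
    if level >= AccessLevel.EVENTS:
        released["events"] = frame_analysis["events"]
    if level == AccessLevel.REDACTED_VIDEO:
        released["video"] = "redacted"
    elif level == AccessLevel.RAW_VIDEO:
        released["video"] = "raw"
    return released


frame = {"occupancy": 2, "events": ["fall detected"], "video": "raw"}
print(release(frame, AccessLevel.EVENTS))  # {'occupancy': 2, 'events': ['fall detected']}
```

Data minimization falls out of the same structure: lower-privilege roles never receive pixel data at all, so compromising a low-privilege account exposes correspondingly less about the people being observed.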
Slide 18: Resources
1 – Helen Nissenbaum’s book Privacy in Context: https://dl.acm.org/doi/10.5555/1822585
2 – Privacy by Design: The 7 Foundational Principles: https://www.ipc.on.ca/wp-content/uploads/resources/7foundationalprinciples.pdf
3 – Privacy by Design – Practical Guidance: https://iapp.org/media/pdf/resource_center/pbd_implement_7found_principles.pdf
Also at the 2022 Embedded Vision Summit: “Ask the Ethicist: Your Questions about AI Privacy, Bias, and Ethics Answered,” Tuesday, May 17, 2:40–3:10 pm PT, Business Insights 1 – Theater (upstairs)
