

Expanded Protections for Children


Frequently Asked Questions




August 2021


Information subject to copyright. All rights reserved.
Overview


At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe. We want to protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material (CSAM). Since we announced these features, many stakeholders, including privacy organizations and child safety organizations, have expressed their support of this new solution, and some have reached out with questions. This document addresses those questions and provides more clarity and transparency about the process.


What are the differences between communication safety in Messages and CSAM detection in iCloud Photos?


These two features are not the same and do not use the same technology.


Communication safety in Messages is designed to give parents and children additional tools to help protect children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives a sexually explicit image, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured that it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.


The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only affects users who have chosen to use iCloud Photos to store their photos. It does not affect users who have not chosen to use iCloud Photos, and there is no impact on any other on-device data. This feature does not apply to Messages.


Communication safety in Messages


Who can use communication safety in Messages?


Communication safety in Messages is only available for accounts set up as families in iCloud. Parent/guardian accounts must opt in to turn on the feature for their family group. Parental notifications can only be enabled by parents/guardians for child accounts age 12 or younger.


	
	


Does this mean Messages will share information with Apple or law enforcement?


No. Apple never gains access to communications as a result of this feature in Messages. This feature does not share any information with Apple, NCMEC, or law enforcement. The communication safety feature in Messages is separate from CSAM detection for iCloud Photos — see below for more information about that feature.


Does this break end-to-end encryption in Messages?


No. This doesn’t change the privacy assurances of Messages, and Apple never gains access to communications as a result of this feature. Any user of Messages, including those with communication safety enabled, retains control over what is sent and to whom. If the feature is enabled for the child account, the device will evaluate images in Messages and present an intervention if an image is determined to be sexually explicit. For accounts of children age 12 and under, parents can set up parental notifications, which will be sent if the child confirms and sends or views an image that has been determined to be sexually explicit. None of the communications, image evaluations, interventions, or notifications are available to Apple.


Does this feature prevent children in abusive homes from seeking help?


The communication safety feature applies only to sexually explicit photos shared or received in Messages. Other communications that victims can use to seek help, including text in Messages, are unaffected. We are also adding support in Siri and Search to give victims, and people who know victims, more guidance on how to seek help.


Will parents be notified without children being warned and given a choice?


No. First, parent/guardian accounts must opt in to enable communication safety in Messages, and can only choose to turn on parental notifications for child accounts age 12 and younger. For child accounts age 12 and younger, each instance of a sexually explicit image sent or received will warn the child that, if they continue to view or send the image, their parents will be sent a notification. Only if the child proceeds with sending or viewing the image after this warning will the notification be sent. For child accounts age 13–17, the child is still warned and asked if they wish to view or share a sexually explicit image, but parents are not notified.
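The age-based rules above can be sketched as a small decision function. This is an illustrative sketch only; the type names, fields, and return structure are hypothetical and not Apple's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class ChildAccount:
    age: int
    parental_notifications_enabled: bool  # opt-in by a parent/guardian


def handle_explicit_image(account: ChildAccount, child_proceeds: bool) -> dict:
    """Return which interventions apply when a sexually explicit image is
    sent or received on a child account with communication safety enabled."""
    result = {"blur_and_warn": True, "notify_parent": False}
    # Parents may only be notified for accounts age 12 and younger, and
    # only after the child is warned and still chooses to proceed.
    if account.age <= 12 and account.parental_notifications_enabled and child_proceeds:
        result["notify_parent"] = True
    return result
```

Note that for a 13–17 account the warning still applies but `notify_parent` can never become true, matching the policy described above.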


	
	


CSAM detection


Does this mean Apple is going to scan all the photos stored on my iPhone?


No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only about the images that match known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.


Will this download CSAM images to my iPhone to compare against my photos?


No. CSAM images are not stored on or sent to the device. Instead of actual images, Apple uses unreadable hashes that are stored on device. These hashes are strings of numbers that represent known CSAM images, but it isn’t possible to read or convert those hashes into the CSAM images they are based on. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. Using new applications of cryptography, Apple is able to use these hashes to learn only about iCloud Photos accounts that are storing collections of photos that match these known CSAM images, and is then only able to learn about photos that are known CSAM, without learning about or seeing any other photos.
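The core idea of hash matching can be sketched as follows. This is a deliberately simplified illustration: the real system uses perceptual hashing plus cryptographic protocols (not a plain set lookup), and the hash values below are made-up placeholders. The point is only that hashes, never images, are compared.

```python
# Illustrative placeholders only: opaque hash strings that cannot be
# reversed into the images they represent.
KNOWN_CSAM_HASHES = {"a3f19c", "9bc24e"}


def matches_known_csam(photo_hash: str) -> bool:
    """Check a photo's hash against the on-device set of known hashes.
    A non-matching photo reveals nothing about its contents."""
    return photo_hash in KNOWN_CSAM_HASHES
```

In the actual design, the match result itself is cryptographically hidden from the device and only becomes interpretable by Apple once an account crosses a threshold of matches; this sketch omits that layer entirely.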


Why is Apple doing this now?


One of the significant challenges in this space is protecting children while also preserving the
privacy of users. With this new technology, Apple will learn about known CSAM photos being
stored in iCloud Photos where the account is storing a collection of known CSAM. Apple will not
learn anything about other data stored solely on device.


Existing techniques as implemented by other companies scan all user photos stored in the cloud. This creates privacy risk for all users. CSAM detection in iCloud Photos provides significant privacy benefits over those techniques by preventing Apple from learning about photos unless they both match known CSAM images and are included in an iCloud Photos account that contains a collection of known CSAM.


	
	


Security for CSAM detection for iCloud Photos


Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?


Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime, and Apple is obligated to report any instances we learn of to the appropriate authorities.


Could governments force Apple to add non-CSAM images to the hash list?


Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear: this technology is limited to detecting CSAM stored in iCloud Photos, and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.


Can non-CSAM images be “injected” into the system to flag accounts for things other than CSAM?


Our process is designed to prevent that from happening. The set of image hashes used for matching is from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.


	
	


Will CSAM detection in iCloud Photos falsely flag innocent people to law enforcement?


No. The system is designed to be very accurate, and the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year. In addition, any time an account is flagged by the system, Apple conducts human review before making a report to NCMEC. As a result, system errors or attacks will not result in innocent people being reported to NCMEC.
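The two safeguards described here, a match threshold and mandatory human review, compose as a simple conjunction. The function and threshold value below are hypothetical illustrations; Apple's actual parameters are not stated in this document.

```python
def should_report(match_count: int, threshold: int,
                  reviewer_confirms_csam: bool) -> bool:
    """An account is reported to NCMEC only if its count of known-CSAM
    matches reaches the threshold AND a human reviewer confirms that
    the flagged images are in fact known CSAM."""
    return match_count >= threshold and reviewer_confirms_csam
```

Because human review gates every report, a false positive from the matching system alone (the second condition failing) never produces a report.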


	
	



More Related Content

Similar to Expanded protections for_children_frequently_asked_questions

Desperate for a Privacy Fix | HMA VPN
Desperate for a Privacy Fix | HMA VPNDesperate for a Privacy Fix | HMA VPN
Desperate for a Privacy Fix | HMA VPNHMA VPN
 
Protecting your Data in Google Apps
Protecting your Data in Google AppsProtecting your Data in Google Apps
Protecting your Data in Google AppsElastica Inc.
 
obtain additional security
obtain additional security 
obtain additional security
obtain additional security snobbishmishap958
 
Building Your Own AI Instance (TBLC AI )
Building Your Own AI Instance (TBLC AI )Building Your Own AI Instance (TBLC AI )
Building Your Own AI Instance (TBLC AI )Brian Pichman
 
[Lithuania] I am the cavalry
[Lithuania] I am the cavalry[Lithuania] I am the cavalry
[Lithuania] I am the cavalryOWASP EEE
 
Facial recognition systems
Facial  recognition  systemsFacial  recognition  systems
Facial recognition systemstom12thomas
 
Facial recognition systems
Facial  recognition  systemsFacial  recognition  systems
Facial recognition systemstom12thomas
 
Facial recognition__systems[1]
Facial  recognition__systems[1]Facial  recognition__systems[1]
Facial recognition__systems[1]christeenamoses
 
Ethics and software engineering
Ethics and software engineeringEthics and software engineering
Ethics and software engineeringSolomon Nsumba
 
5 Ways to Protect your Mobile Security
5 Ways to Protect your Mobile Security5 Ways to Protect your Mobile Security
5 Ways to Protect your Mobile SecurityLookout
 
Online safety
Online safetyOnline safety
Online safetyelioar8
 
Social Media for Parents of Teens
Social Media for Parents of TeensSocial Media for Parents of Teens
Social Media for Parents of TeensKate Gukeisen
 
Technology deep dive project
Technology deep dive projectTechnology deep dive project
Technology deep dive projectHelenaZhao5
 
Digital Literacy Session 2017
Digital Literacy Session 2017Digital Literacy Session 2017
Digital Literacy Session 2017Tom Davidson
 
10 questions for LAFOIP in the classroom
10 questions for LAFOIP in the classroom10 questions for LAFOIP in the classroom
10 questions for LAFOIP in the classroomkmuench
 
Tumblr makes new changes to stay on apple's app store
Tumblr makes new changes to stay on apple's app storeTumblr makes new changes to stay on apple's app store
Tumblr makes new changes to stay on apple's app storeaditi agarwal
 

Similar to Expanded protections for_children_frequently_asked_questions (20)

Desperate for a Privacy Fix | HMA VPN
Desperate for a Privacy Fix | HMA VPNDesperate for a Privacy Fix | HMA VPN
Desperate for a Privacy Fix | HMA VPN
 
Protecting your Data in Google Apps
Protecting your Data in Google AppsProtecting your Data in Google Apps
Protecting your Data in Google Apps
 
obtain additional security
obtain additional security 
obtain additional security
obtain additional security
 
Building Your Own AI Instance (TBLC AI )
Building Your Own AI Instance (TBLC AI )Building Your Own AI Instance (TBLC AI )
Building Your Own AI Instance (TBLC AI )
 
[Lithuania] I am the cavalry
[Lithuania] I am the cavalry[Lithuania] I am the cavalry
[Lithuania] I am the cavalry
 
Thecavalryisus owasp eee-oct2015_v2
Thecavalryisus owasp eee-oct2015_v2Thecavalryisus owasp eee-oct2015_v2
Thecavalryisus owasp eee-oct2015_v2
 
COPPA Compliance
COPPA ComplianceCOPPA Compliance
COPPA Compliance
 
Facial recognition systems
Facial  recognition  systemsFacial  recognition  systems
Facial recognition systems
 
Facial recognition systems
Facial  recognition  systemsFacial  recognition  systems
Facial recognition systems
 
Facial recognition__systems[1]
Facial  recognition__systems[1]Facial  recognition__systems[1]
Facial recognition__systems[1]
 
App Tracking Transparancy.docx
App Tracking Transparancy.docxApp Tracking Transparancy.docx
App Tracking Transparancy.docx
 
Ethics and software engineering
Ethics and software engineeringEthics and software engineering
Ethics and software engineering
 
5 Ways to Protect your Mobile Security
5 Ways to Protect your Mobile Security5 Ways to Protect your Mobile Security
5 Ways to Protect your Mobile Security
 
Online safety
Online safetyOnline safety
Online safety
 
Social Media for Parents of Teens
Social Media for Parents of TeensSocial Media for Parents of Teens
Social Media for Parents of Teens
 
Federated Learning
Federated LearningFederated Learning
Federated Learning
 
Technology deep dive project
Technology deep dive projectTechnology deep dive project
Technology deep dive project
 
Digital Literacy Session 2017
Digital Literacy Session 2017Digital Literacy Session 2017
Digital Literacy Session 2017
 
10 questions for LAFOIP in the classroom
10 questions for LAFOIP in the classroom10 questions for LAFOIP in the classroom
10 questions for LAFOIP in the classroom
 
Tumblr makes new changes to stay on apple's app store
Tumblr makes new changes to stay on apple's app storeTumblr makes new changes to stay on apple's app store
Tumblr makes new changes to stay on apple's app store
 

More from ssuser3957bc1

2023-10-05+Rheinmetall+SurveilSpire+UKR+engl.pdf
2023-10-05+Rheinmetall+SurveilSpire+UKR+engl.pdf2023-10-05+Rheinmetall+SurveilSpire+UKR+engl.pdf
2023-10-05+Rheinmetall+SurveilSpire+UKR+engl.pdfssuser3957bc1
 
Законопроєкт №8011
Законопроєкт №8011Законопроєкт №8011
Законопроєкт №8011ssuser3957bc1
 
У Києві перейменували проспект Перемоги
У Києві перейменували проспект ПеремогиУ Києві перейменували проспект Перемоги
У Києві перейменували проспект Перемогиssuser3957bc1
 
Графік_відключень.pdf
Графік_відключень.pdfГрафік_відключень.pdf
Графік_відключень.pdfssuser3957bc1
 
Графік_відключень_Кем.pdf
Графік_відключень_Кем.pdfГрафік_відключень_Кем.pdf
Графік_відключень_Кем.pdfssuser3957bc1
 
Kramatorsk Report FINAL.pdf
Kramatorsk Report FINAL.pdfKramatorsk Report FINAL.pdf
Kramatorsk Report FINAL.pdfssuser3957bc1
 
220921_Список-215_UKR.pdf
220921_Список-215_UKR.pdf220921_Список-215_UKR.pdf
220921_Список-215_UKR.pdfssuser3957bc1
 
Меру Чернігова Владиславу Атрошенку вручили антикорупційний протокол через «к...
Меру Чернігова Владиславу Атрошенку вручили антикорупційний протокол через «к...Меру Чернігова Владиславу Атрошенку вручили антикорупційний протокол через «к...
Меру Чернігова Владиславу Атрошенку вручили антикорупційний протокол через «к...ssuser3957bc1
 
Україна презентувала план відновлення країни
Україна презентувала план відновлення країниУкраїна презентувала план відновлення країни
Україна презентувала план відновлення країниssuser3957bc1
 
Зеленский підписав указ про введення санкцій проти Путіна.
Зеленский підписав указ про введення санкцій проти Путіна.Зеленский підписав указ про введення санкцій проти Путіна.
Зеленский підписав указ про введення санкцій проти Путіна.ssuser3957bc1
 
Володимир Зеленський підписав указ про санкції проти президента росії путіна,...
Володимир Зеленський підписав указ про санкції проти президента росії путіна,...Володимир Зеленський підписав указ про санкції проти президента росії путіна,...
Володимир Зеленський підписав указ про санкції проти президента росії путіна,...ssuser3957bc1
 
Декларація Путіна
Декларація ПутінаДекларація Путіна
Декларація Путінаssuser3957bc1
 

More from ssuser3957bc1 (20)

2023-10-05+Rheinmetall+SurveilSpire+UKR+engl.pdf
2023-10-05+Rheinmetall+SurveilSpire+UKR+engl.pdf2023-10-05+Rheinmetall+SurveilSpire+UKR+engl.pdf
2023-10-05+Rheinmetall+SurveilSpire+UKR+engl.pdf
 
1687627.pdf
1687627.pdf1687627.pdf
1687627.pdf
 
Законопроєкт №8011
Законопроєкт №8011Законопроєкт №8011
Законопроєкт №8011
 
1606284.pdf
1606284.pdf1606284.pdf
1606284.pdf
 
test3.pdf
test3.pdftest3.pdf
test3.pdf
 
test2.pdf
test2.pdftest2.pdf
test2.pdf
 
test8.pdf
test8.pdftest8.pdf
test8.pdf
 
test9.pdf
test9.pdftest9.pdf
test9.pdf
 
test8.pdf
test8.pdftest8.pdf
test8.pdf
 
test1.pdf
test1.pdftest1.pdf
test1.pdf
 
У Києві перейменували проспект Перемоги
У Києві перейменували проспект ПеремогиУ Києві перейменували проспект Перемоги
У Києві перейменували проспект Перемоги
 
Графік_відключень.pdf
Графік_відключень.pdfГрафік_відключень.pdf
Графік_відключень.pdf
 
Графік_відключень_Кем.pdf
Графік_відключень_Кем.pdfГрафік_відключень_Кем.pdf
Графік_відключень_Кем.pdf
 
Kramatorsk Report FINAL.pdf
Kramatorsk Report FINAL.pdfKramatorsk Report FINAL.pdf
Kramatorsk Report FINAL.pdf
 
220921_Список-215_UKR.pdf
220921_Список-215_UKR.pdf220921_Список-215_UKR.pdf
220921_Список-215_UKR.pdf
 
Меру Чернігова Владиславу Атрошенку вручили антикорупційний протокол через «к...
Меру Чернігова Владиславу Атрошенку вручили антикорупційний протокол через «к...Меру Чернігова Владиславу Атрошенку вручили антикорупційний протокол через «к...
Меру Чернігова Владиславу Атрошенку вручили антикорупційний протокол через «к...
 
Україна презентувала план відновлення країни
Україна презентувала план відновлення країниУкраїна презентувала план відновлення країни
Україна презентувала план відновлення країни
 
Зеленский підписав указ про введення санкцій проти Путіна.
Зеленский підписав указ про введення санкцій проти Путіна.Зеленский підписав указ про введення санкцій проти Путіна.
Зеленский підписав указ про введення санкцій проти Путіна.
 
Володимир Зеленський підписав указ про санкції проти президента росії путіна,...
Володимир Зеленський підписав указ про санкції проти президента росії путіна,...Володимир Зеленський підписав указ про санкції проти президента росії путіна,...
Володимир Зеленський підписав указ про санкції проти президента росії путіна,...
 
Декларація Путіна
Декларація ПутінаДекларація Путіна
Декларація Путіна
 

Recently uploaded

08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking MenDelhi Call girls
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Drew Madelung
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountPuma Security, LLC
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024The Digital Insurer
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processorsdebabhi2
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfEnterprise Knowledge
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking MenDelhi Call girls
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?Igalia
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking MenDelhi Call girls
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024Results
 
Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slidespraypatel2
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUK Journal
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Miguel Araújo
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024The Digital Insurer
 
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024The Digital Insurer
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsJoaquim Jorge
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationMichael W. Hawkins
 

Recently uploaded (20)

08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path Mount
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processors
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivity
 
A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024
 
Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slides
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024
 
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and Myths
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day Presentation
 

Expanded protections for_children_frequently_asked_questions

  • 1. 
 Expanded Protections for Children Frequently Asked Questions 
 August 2021 Information subject to copyright. All rights reserved.
  • 2. Overview At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe. We want to protect children from predators who use communica- tion tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM). Since we announced these features, many stakeholders including privacy organiza- tions and child safety organizations have expressed their support of this new solution, and some have reached out with questions. This document serves to address these questions and provide more clarity and transparency in the process. What are the differences between communication safety in Messages and CSAM detection in iCloud Photos? These two features are not the same and do not use the same technology. Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assur- ances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it. The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the Unit- ed States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. 
It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.


Communication safety in Messages


Who can use communication safety in Messages?


Communication safety in Messages is only available for accounts set up as families in iCloud. Parent/guardian accounts must opt in to turn on the feature for their family group. Parental notifications can only be enabled by parents/guardians for child accounts age 12 or younger.
Does this mean Messages will share information with Apple or law enforcement?


No. Apple never gains access to communications as a result of this feature in Messages. This feature does not share any information with Apple, NCMEC, or law enforcement. The communication safety feature in Messages is separate from CSAM detection for iCloud Photos — see below for more information about that feature.


Does this break end-to-end encryption in Messages?


No. This doesn't change the privacy assurances of Messages, and Apple never gains access to communications as a result of this feature. Any user of Messages, including those with communication safety enabled, retains control over what is sent and to whom. If the feature is enabled for the child account, the device will evaluate images in Messages and present an intervention if the image is determined to be sexually explicit. For accounts of children age 12 and under, parents can set up parental notifications which will be sent if the child confirms and sends or views an image that has been determined to be sexually explicit. None of the communications, image evaluation, interventions, or notifications are available to Apple.


Does this feature prevent children in abusive homes from seeking help?


The communication safety feature applies only to sexually explicit photos shared or received in Messages. Other communications that victims can use to seek help, including text in Messages, are unaffected. We are also adding additional support to Siri and Search to provide victims — and people who know victims — more guidance on how to seek help.


Will parents be notified without children being warned and given a choice?


No. First, parent/guardian accounts must opt in to enable communication safety in Messages, and can only choose to turn on parental notifications for child accounts age 12 and younger.
For child accounts age 12 and younger, each instance of a sexually explicit image sent or received will warn the child that if they continue to view or send the image, their parents will be sent a notification. Only if the child proceeds with sending or viewing an image after this warning will the notification be sent. For child accounts age 13–17, the child is still warned and asked if they wish to view or share a sexually explicit image, but parents are not notified.
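The age-based rules above can be summarized in a small decision function. This is an illustrative model only, not Apple's implementation: the function name and fields are invented, and only the opt-in behavior and age thresholds follow the FAQ.

```python
from dataclasses import dataclass

@dataclass
class Intervention:
    blur_image: bool       # the explicit photo is blurred for the child
    warn_child: bool       # the child is warned and offered resources
    notify_parents: bool   # sent only after the child chooses to proceed

def evaluate(child_age: int, feature_enabled: bool, parental_notifications: bool,
             is_sexually_explicit: bool, child_proceeds: bool) -> Intervention:
    """Illustrative model of communication safety in Messages.

    Parents opt in per family group; parental notifications can only be
    enabled for child accounts age 12 or younger, and fire only if the
    child chooses to view or send the image after being warned.
    """
    if not (feature_enabled and is_sexually_explicit):
        return Intervention(False, False, False)
    notify = child_age <= 12 and parental_notifications and child_proceeds
    return Intervention(blur_image=True, warn_child=True, notify_parents=notify)
```

Note that for a 13–17 account the warning still fires but `notify_parents` can never be true, matching the FAQ's distinction between the two age bands.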
CSAM detection


Does this mean Apple is going to scan all the photos stored on my iPhone?


No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.


Will this download CSAM images to my iPhone to compare against my photos?


No. CSAM images are not stored on or sent to the device. Instead of actual images, Apple uses unreadable hashes that are stored on device. These hashes are strings of numbers that represent known CSAM images, but it isn't possible to read or convert those hashes into the CSAM images they are based on. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. Using new applications of cryptography, Apple is able to use these hashes to learn only about iCloud Photos accounts that are storing collections of photos that match to these known CSAM images, and is then only able to learn about photos that are known CSAM, without learning about or seeing any other photos.


Why is Apple doing this now?


One of the significant challenges in this space is protecting children while also preserving the privacy of users. With this new technology, Apple will learn about known CSAM photos being stored in iCloud Photos where the account is storing a collection of known CSAM. Apple will not learn anything about other data stored solely on device. Existing techniques as implemented by other companies scan all user photos stored in the cloud. This creates privacy risk for all users.
CSAM detection in iCloud Photos provides significant privacy benefits over those techniques by preventing Apple from learning about photos unless they both match to known CSAM images and are included in an iCloud Photos account that includes a collection of known CSAM.
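As a toy model, the account-level behavior described above can be thought of as a thresholded set-membership count. This sketch is illustrative only: the function, hash values, and threshold are invented for exposition, and the real system performs the comparison under cryptographic protections rather than a plain set lookup on readable data.

```python
from typing import Iterable

def flagged_for_review(photo_hashes: Iterable[int],
                       known_csam_hashes: frozenset[int],
                       threshold: int) -> bool:
    """Toy model: an account is surfaced for human review only once the
    number of uploaded photos whose hashes match the known-CSAM set meets
    a threshold; nothing is learned about non-matching photos, and nothing
    is learned about the account below the threshold."""
    matches = sum(1 for h in photo_hashes if h in known_csam_hashes)
    return matches >= threshold
```

The threshold is what lets the design speak of accounts storing "collections" of known CSAM rather than reacting to any single match.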
Security for CSAM detection for iCloud Photos


Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?


Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities.


Could governments force Apple to add non-CSAM images to the hash list?


Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear: this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.


Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM?


Our process is designed to prevent that from happening.
The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
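Because one global hash database ships in the operating system image, a database targeted at a specific user would differ from every other device's copy. A toy integrity check over hypothetical hash values (illustrative only; not a mechanism this FAQ describes) might compare digests of the local database:

```python
import hashlib

def database_digest(hash_set: set[int]) -> str:
    """Digest the sorted, canonically encoded contents of the on-device
    hash database. Any two devices holding the same database produce the
    same digest, so a per-user modification would be detectable by
    comparing digests across devices."""
    canonical = b"".join(h.to_bytes(32, "big") for h in sorted(hash_set))
    return hashlib.sha256(canonical).hexdigest()
```

Equal digests do not reveal anything about the database's contents; unequal digests would prove two copies differ.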
Will CSAM detection in iCloud Photos falsely flag innocent people to law enforcement?


No. The system is designed to be very accurate, and the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year. In addition, any time an account is flagged by the system, Apple conducts human review before making a report to NCMEC. As a result, system errors or attacks will not result in innocent people being reported to NCMEC.
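The "one in one trillion" figure is an account-level bound: with a match threshold, many independent per-image false matches must all occur before an account is even flagged. A binomial upper tail captures the shape of that argument. The per-image false match rate and threshold below are invented placeholders for exposition; this FAQ does not publish those parameters.

```python
from math import comb

def account_false_flag_probability(n_photos: int,
                                   p_false_match: float,
                                   threshold: int) -> float:
    """P[X >= threshold] where X ~ Binomial(n_photos, p_false_match):
    the chance that at least `threshold` of an innocent account's photos
    independently false-match the hash set. Assumes the threshold sits
    above the expected number of matches (true for any sensible setting),
    so terms shrink monotonically and we can stop once they underflow."""
    total = 0.0
    for k in range(threshold, n_photos + 1):
        term = (comb(n_photos, k)
                * p_false_match ** k
                * (1 - p_false_match) ** (n_photos - k))
        total += term
        if term == 0.0:  # remaining terms are smaller still
            break
    return total

# Assumed values: 10,000 photos in a year, a one-in-a-million per-image
# false match rate, and a threshold of 30 matches.
p = account_false_flag_probability(10_000, 1e-6, 30)
```

Even with these deliberately pessimistic placeholder inputs, the tail probability is astronomically below one in one trillion, which is why a threshold plus human review makes false reports of innocent accounts implausible.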