Face Recognition and Privacy by Design for Retail and Market Research – White Paper by IMRSV
1. Anonymous Video Analytics (AVA)
Technology and Privacy
Nov 2013
86 Chambers Suite 704 New York, NY 10007 | www.imrsv.com
1
2. Introduction
There are more than 1.2B internet-enabled desktop, mobile and tablet devices
equipped with camera sensors. According to ABI Research, by 2020 nodes/sensors will
account for the majority (60 percent) of the total installed base (50Bn units) of Internet
of Everything devices. Camera sensors are expected to double to 2.2B by 2017 as
wearable devices, smart TVs and other devices incorporate new smart features and
capabilities.
Three million digital displays in the US are web-enabled, reaching over 70% of US
teens and adults every month in public venues. Retailers and market researchers are
beginning to use pattern detection technology to understand viewing audiences. This
technology enables retailers, packaged goods brands, agencies and operators of
facilities such as malls, airports, colleges and museums to better understand and
communicate with their guests.
What is Anonymous Video Analytics?
AVA analyzes millions of pixels per second and anonymously detects general traits of
viewers, along with demographic and engagement data, from multiple people
simultaneously. Data is extracted and stored as a numerical log file; no images or
video are stored, recorded or transmitted.
How does it work?
Sensors located in display panels, inside mobile/tablet devices, or near product
placements scan the surrounding area. AVA is a computer vision application that
processes video feeds in order to detect an arrangement of pixels that resembles the
general pattern of a human face, using cues such as pixel density around the
eyes, nose and mouth. Video feeds are automatically analyzed on the local computing
device and discarded, and are not transmitted to any person or server. The facial
features in the image are detected, and any other objects like trees, buildings, bodies,
etc. are ignored.
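The local-processing loop described above can be sketched as follows. This is a minimal illustration, not IMRSV's actual implementation: `detect_faces` is a hypothetical stand-in for the trained detector (a production system might use something like a trained cascade classifier), and the key point is that only numeric data survives each frame.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Numeric traits only -- no pixels are kept in this record."""
    x: int       # bounding-box position within the frame
    y: int
    width: int
    height: int

def detect_faces(frame):
    """Hypothetical stand-in for a trained face-pattern detector.

    A real detector scans the frame for pixel arrangements that
    resemble eyes, nose and mouth; this stub returns a fixed box.
    """
    return [Detection(x=120, y=80, width=64, height=64)]

def process_frame(frame):
    """Analyze one frame on the local device, then discard it."""
    detections = detect_faces(frame)
    # Only numbers leave this function -- never the frame itself.
    records = [(d.x, d.y, d.width, d.height) for d in detections]
    del frame  # raw pixels are discarded, not stored or transmitted
    return records
```

Each processed frame yields only tuples of numbers; the video data itself is released as soon as analysis completes.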
Detection algorithms are based on a “learned” face pattern that has been trained on
an audience database of thousands of face images. This allows the software to
determine the gender and age of anonymous participants. Each video frame is
processed to detect the presence of human faces, and there is no database used to
match faces to an identity, as would be the case with face recognition. Non-identifiable
information includes a person’s gender, approximate age, and facial expression.
3. What information is collected?
Anonymous information collected may include:
• Total count of individuals
• Demographic data such as gender and approximate age
• Engagement data such as attention, dwell time and number of glances
• Viewer attributes such as estimated distance and general position
• Emotional expression (Facial Coding)
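As an illustration, one such anonymous log entry could be modeled as below. The field names and types are hypothetical assumptions for the sketch, not IMRSV's actual log schema; what matters is that every field is a number or a coarse category, never an image or an identity.

```python
from dataclasses import dataclass

@dataclass
class ViewerRecord:
    """One anonymous log entry (hypothetical field names)."""
    timestamp: float      # when the face was detected
    gender: str           # estimated: "male" or "female"
    age_range: str        # approximate, e.g. "25-34"
    dwell_seconds: float  # engagement: how long attention was held
    glances: int          # number of separate glances at the display
    distance_m: float     # estimated distance from the sensor
    expression: str       # Facial Coding label, e.g. "smile"
```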
How is the information used?
AVA has no ability to recognize or identify anyone. The software gathers purely
numerical data; no personally identifiable information is collected, and no images are
ever saved, recorded or transmitted. The anonymized data is aggregated by the
software to report numerical statistics. The analytics generated by AVA software
provide marketers and businesses valuable insights into what’s actually happening
within the proximity of displays and other product locations in real time.
Understanding the dynamics of the viewing audience allows businesses to better serve
their guests. AVA also provides marketers the ability to assess the cause and effect of
marketing messaging, and map sales or other data against an audience.
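The aggregation step can be sketched roughly as follows; the record fields and statistic names are illustrative assumptions, not the actual Cara report format. The point is that individual entries are reduced to counts and averages before anything is reported.

```python
from collections import Counter

def aggregate(records):
    """Reduce per-viewer entries to the purely numerical statistics
    a report would contain (aggregates only, no individual rows)."""
    genders = Counter(r["gender"] for r in records)
    avg_dwell = sum(r["dwell_seconds"] for r in records) / len(records)
    return {
        "total_viewers": len(records),
        "gender_counts": dict(genders),
        "avg_dwell_seconds": round(avg_dwell, 1),
    }
```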
The FTC Face Facts report recommends the following guidelines:
1. Design services with consumer privacy in mind
2. Develop reasonable security protections for the information collected
3. Consider the sensitivity of information when developing products and services – for
example, digital signs using facial recognition technologies should not be set up in
places where children congregate.
With AVA, no personal information is stored or collected, and all anonymous numerical
data is securely encrypted and not shared with anyone.
The FTC staff report also recommends that companies take steps to make sure
consumers are aware of facial detection technologies when they come in contact with
them, and that they have a choice as to whether data about them is collected. So, for
example, if a company is using digital signs to determine the demographic features of
passersby, such as age or gender, it should provide clear notice to consumers that
the technology is in use before consumers come into contact with the signs.
4. The Digital Signage Federation (DSF), which represents a wide range of companies
from hardware and software vendors to retailers and fast-food restaurant operators,
has recommended a set of privacy standards based on the internationally used Fair
Information Practices (FIPs), which are incorporated in many privacy laws globally.
The guidelines are voluntary recommendations.
• Transparency: Companies should give consumers “meaningful notice” where the
technology is in use;
• Individual Participation: Consumers should have the right to opt out (with AVA,
notice on site means consumers can choose to avoid the screens and sensors);
• Purpose Specification: Published policies should explain how the collected data
is used;
• Data Minimization: Companies should limit their data collection and retention to
the minimum needed to achieve their specified purposes;
• Use Limitation: Collected data should not be shared or sold for any uses that are
incompatible with the originally specified purposes;
• Data Quality and Integrity: If identifiable data is retained, consumers should
have the right and mechanism to edit that data for accuracy;
• Security: Any data collected should be secured;
• Accountability: End-users should establish internal accountability mechanisms.
Is face recognition the same as AVA?
No. Simply put, face detection detects human faces; it does not recognize who the
person is. AVA has no ability to remember anyone once they have left the scene. Face
recognition is a different type of imaging technology that searches for faces matched
to images stored in a database; it can identify and remember a face even years after it
was first recorded. AVA uses anonymized general traits and does not use face
recognition. No face images are stored and no identity information is matched. AVA
uses some characteristics of the face to classify demographics (age and gender);
pictures of the person are immediately discarded.
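A toy sketch can make the distinction concrete. All names and return values below are hypothetical; real detection and recognition systems are far more complex, but the structural difference holds: detection returns per-frame traits with nothing retained, while recognition requires a stored database of identities to match against.

```python
def detect(frame):
    """Face detection (what AVA does): per-frame general traits,
    nothing remembered once the viewer leaves the scene."""
    return {"faces": 1, "gender": "male", "age_range": "35-44"}

def recognize(face_signature, identity_db):
    """Face recognition (what AVA does NOT do): match a stored
    face signature against a database of known identities."""
    return identity_db.get(face_signature, "unknown")
```

Without an identity database, `recognize` cannot exist; `detect` needs no such database at all.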
5. AVA and Privacy by Design
IMRSV provides clear and unambiguous statements about the “anonymous” nature of
AVA’s processes.
1. No identifiable information is collected, retained, used, or shared using AVA.
2. Real time video is scanned, analyzed and immediately
discarded in the AVA process.
3. The aggregated anonymous data provides valuable,
actionable insights for users.
4. Real time processing means security and privacy risks are constantly addressed.
5. Visibility and transparency: vendors and the user community are encouraging
consumer notice.
6. Respect for user privacy: keep it user-centric. Consumers should be empowered
by this technology to participate and/or verify privacy claims.
Privacy Advocate References:
“Business leaders and innovators who take data protection seriously should bake
privacy by design principles into both their product development and business
practices. Not only is it the right thing to do, in my opinion, but it’s good for business
because it helps build trust with users, and I applaud the IMRSV team for making a
commitment to privacy by design.”
– Natalie Fonseca
Co-founder and Executive Producer at Privacy Identity Innovation
“Privacy by design solutions are critical to implementing new technologies in a world
where data collection has become ubiquitous. Steps that Cara takes, such as not
collecting any personal information and not storing, transferring or recording any
images, are key to ensuring privacy concerns are addressed as these technologies are
rolled out.”
– Jules Polonetsky
Co-Chair and Director at The Future of Privacy Forum