This paper presents an overview of the Drone Protect Task (DPT) of MediaEval 2015, its objectives, related dataset, and evaluation approach. Participants in this task were required to implement a privacy filter or a combination of filters to protect various personal information regions in the video sequences provided. The challenge was to achieve an adequate balance between the degree of privacy protection, intelligibility (how much useful information is retained after privacy filtering), and pleasantness (how minimal were the adverse effects of filtering on the appearance of the video frames). The evaluation methods for this task include subjective evaluation by those working in the video surveillance sector and also by naïve viewers.
http://ceur-ws.org/Vol-1436/
http://www.multimediaeval.org
MediaEval 2015 - Overview of the MediaEval 2015 Drone Protect Task
1. Drone Protect Task Working Paper, MediaEval 2015
Atta Badii, Pavel Korshunov, Hamid Oudi, Touradj Ebrahimi, Tomas Piatrik, Volker Eiselein,
Natacha Ruchaud, Christian Fedorczak, Jean-Luc Dugelay, Diego Fernandez Vazquez
2. Task Description
• To explore the possibilities of optimising the process of privacy filtering so as to:
1. obscure personal visual information effectively, whilst
2. keeping as much as possible of the ‘useful’ information that would enable a human viewer to interpret the obscured video frame.
Slide 2
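One common way to trade off objectives 1 and 2 is pixelation: averaging fixed-size tiles inside a region of interest destroys identity cues while the coarse silhouette remains interpretable. A minimal sketch (assuming numpy; illustrative only, not any participant's actual filter):

```python
import numpy as np

def pixelate_region(frame, x, y, w, h, block=8):
    # Average each block-by-block tile inside the region of interest,
    # destroying fine identity cues while keeping the coarse silhouette.
    roi = frame[y:y + h, x:x + w].astype(float)
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = roi[by:by + block, bx:bx + block]
            tile[:] = tile.mean()
    out = frame.copy()
    out[y:y + h, x:x + w] = roi.astype(frame.dtype)
    return out
```

A larger `block` gives stronger privacy protection at the cost of intelligibility; tuning that parameter per region sensitivity is exactly the balance the task asks for.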
3. • Privacy Protection Level – How adequate was the level of privacy protection achieved by the filter across all testing video clips?
• Level of Intelligibility – How much ‘useful’ information was retained in the video frames after privacy filtering had been applied?
• Pleasantness of the resulting privacy-filtered video frames in terms of their ‘aesthetic’ perceptual appeal to human viewers – How acceptable were any adverse aesthetic effects?
• All evaluation results were sent out, including overall and ranking results based on evaluators’ assigned weightings for the above three criteria.
Slide 3
Privacy Filtering (Side-)Effects, Affects
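The ranking based on evaluator-assigned weightings can be sketched as a weighted mean of the three criterion scores. The function names and the equal-weight default below are hypothetical; the overview does not specify the exact aggregation formula:

```python
def overall_score(privacy, intelligibility, pleasantness,
                  weights=(1.0, 1.0, 1.0)):
    # Weighted mean of the three criterion scores; `weights` stands in
    # for the evaluator-assigned weightings mentioned in the overview.
    w_p, w_i, w_a = weights
    total = w_p + w_i + w_a
    return (w_p * privacy + w_i * intelligibility + w_a * pleasantness) / total

def rank_submissions(scores, weights=(1.0, 1.0, 1.0)):
    # scores: {submission_id: (privacy, intelligibility, pleasantness)}
    return sorted(scores,
                  key=lambda s: overall_score(*scores[s], weights=weights),
                  reverse=True)
```

Changing the weights changes the ranking: a submission that filters aggressively may win under a privacy-heavy weighting yet lose under a pleasantness-heavy one.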
4. Produced in compliance with EU Data Protection; the dataset includes:
• 38 FHD video clips, ~20 seconds each, showing car park security scenarios
• Pre-annotated to signify the Re-Identifiability specificity of features: Low, Medium, High
• Persons carrying specific items (backpacks, umbrellas, wearing scarves)
• Persons near/interacting with cars
• Behaviours tagged as Normal (walking), Suspicious (loitering), and Illicit (stealing/abandoning a car)

Privacy-sensitive region | Sensitivity level
Skin                     | Medium (M)
Face                     | High (H)
Hair                     | Low (L)
Accessories              | Medium (M)
Person                   | Low (L)

DPT 2015 DATASET
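The sensitivity levels above (plus the license plates noted on the next slide) form a small ordered lookup that a filtering pipeline can query. The key names below are illustrative identifiers, not the dataset's actual label strings:

```python
# Sensitivity levels per privacy-sensitive region, as listed in the
# DPT 2015 dataset description (key names are illustrative).
SENSITIVITY = {
    "license_plate": "H",
    "face": "H",
    "skin": "M",
    "accessories": "M",
    "hair": "L",
    "person": "L",
}

_ORDER = {"L": 0, "M": 1, "H": 2}

def regions_at_or_above(level):
    # Regions whose sensitivity is at least `level`, e.g. the set that
    # a strong privacy filter must obscure.
    return sorted(r for r, s in SENSITIVITY.items()
                  if _ORDER[s] >= _ORDER[level])
```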
5. DPT 2015 DATASET
• Ground Truth: bounding boxes manually tagged as LPII, MPII and HPII
• License Plates (H), Skin (M), Face (H), Hair (L), Accessories (M), and Person’s body (L)
Challenges
• RoIs not annotated: the face-head “person-entering-a-car” event as covered in 2014
• Persons at variable distance from camera
• Some jitter effects
Slide 5
6. Evaluation Setup
• Participants submitted privacy-protected video clips using the testing subset
• Evaluation of submitted solutions was based on the human perception of the levels of i) privacy filtering, ii) retained information, i.e., intelligibility, and iii) appropriateness (acceptability-attractiveness) of the privacy-filtered High/Medium/Low PII regions
• 17 evaluators (6 from the security practitioners’ category and 11 from the naïve category) evaluated by responding to 13 evaluation questions for each of the 3 randomly selected videos from each solution set
Slide 6
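The per-evaluator sampling step above can be sketched in a few lines; the function name and seeding are assumptions, not part of the published protocol:

```python
import random

def pick_clips(solution_clips, k=3, seed=None):
    # Draw k distinct clips from one solution set for an evaluator,
    # mirroring the "3 randomly selected videos" step of the setup.
    rng = random.Random(seed)
    return rng.sample(solution_clips, k)
```

Sampling without replacement per evaluator (rather than fixing one triple for everyone) spreads viewing effort across the 38 clips while keeping each evaluator's questionnaire load constant.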
7. Evaluation Process
• A questionnaire consisting of 12 questions had been carefully designed to examine aspects related to privacy, intelligibility, and pleasantness; this was used in streams 2 and 3.
• The first 5 questions were aimed at eliciting the opinions of the evaluators regarding the contents of the viewed videos. The responses to these questions were considered with respect to the ground truth.
• The remaining questions were aimed at eliciting the subjective opinions of the evaluators regarding the viewed videos.
• The average score for all submissions is illustrated in the following figure.
Slide 7
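The averaging behind that figure amounts to a mean over evaluators, then over questions. A sketch under an assumed input format (one list of per-question scores per evaluator; not the task's actual data layout):

```python
def mean_scores(responses):
    # responses: one list of per-question scores per evaluator
    # (hypothetical format); returns the per-question average.
    n = len(responses)
    return [sum(q) / n for q in zip(*responses)]

def submission_average(responses):
    # Single average score for a submission across all questions.
    per_q = mean_scores(responses)
    return sum(per_q) / len(per_q)
```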