Pavel Korshunov (EPFL) Visiting Prof. Jean-Luc Dugelay (Eurecom)
07.04.2013 - 12.04.2013
Internal Visits


This visit continues the collaboration between EPFL and Eurecom on the evaluation of visual privacy filters. This time, instead of subjective studies, the main goal is to investigate objective evaluations and to introduce an objective privacy metric.
Finding the right balance between the privacy of people under surveillance and the functionality of the surveillance system is challenging. Many privacy protection techniques have been developed recently that help preserve privacy without obstructing the main surveillance objective. Typical techniques (i.e., filters) for obscuring personal information in a video include blurring, pixelisation, and masking of sensitive regions. However, there is a noticeable lack of methods to assess the performance of privacy protection tools and their impact on the surveillance task.
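To illustrate how such a filter works, here is a minimal pure-Python sketch of pixelisation, assuming a grayscale frame stored as a list of lists of intensities; the function name, signature, and block size are illustrative, not taken from the actual tools used in this work:

```python
def pixelate(frame, x, y, w, h, block=4):
    """Pixelate a rectangular region of a grayscale frame (list of lists).

    Each block-by-block tile inside the region is replaced by its mean
    intensity, discarding fine detail while keeping coarse structure.
    The original frame is left unmodified. (Illustrative sketch only.)
    """
    out = [row[:] for row in frame]
    for by in range(y, y + h, block):
        for bx in range(x, x + w, block):
            tile = [frame[j][i]
                    for j in range(by, min(by + block, y + h))
                    for i in range(bx, min(bx + block, x + w))]
            mean = sum(tile) // len(tile)
            for j in range(by, min(by + block, y + h)):
                for i in range(bx, min(bx + block, x + w)):
                    out[j][i] = mean
    return out
```

A larger block size removes more detail, which is exactly the distortion-strength knob that an evaluation methodology needs to sweep over.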
In previous joint work between EPFL and Eurecom, a privacy-intelligibility trade-off for privacy protection was demonstrated via a series of subjective tests (in the lab and via crowdsourcing). A privacy-pleasantness trade-off was also shown at the MediaEval grand challenge.
The following were the planned milestones of this collaboration:

  • Identify key visual characteristics of the video that are important for carrying out the surveillance tasks, and study the impact of privacy protection techniques on these characteristics.
  • Conduct objective studies using face detection and recognition algorithms, running them on video footage with privacy-sensitive regions distorted to various degrees by visual privacy protection techniques.
  • Compare the objective evaluation results with subjective studies obtained previously.
  • Analyse how different privacy protection techniques affect the ability of both subjects and video analytics to perceive the surveillance scene, i.e., identify objects and events.

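The objective studies in the second milestone can be sketched as a loop that applies a privacy filter at increasing strengths and records how often a detector still succeeds. The harness below is a hypothetical illustration, not the actual evaluation code; `detector`, `distort`, and the toy inputs in the usage example are assumptions:

```python
def evaluate_filter(frames, detector, distort, levels):
    """For each distortion level, apply the privacy filter to every frame
    and record the fraction of frames on which the detector still succeeds.

    Returns {level: detection_rate}. A steep drop in the rate indicates
    strong privacy protection, but potentially reduced intelligibility
    for the surveillance task. (Hypothetical harness.)
    """
    rates = {}
    for level in levels:
        hits = sum(1 for f in frames if detector(distort(f, level)))
        rates[level] = hits / len(frames)
    return rates


# Toy usage: frames are abstract "detail" scores, the detector needs a
# minimum amount of detail, and distortion subtracts detail.
frames = [10, 8, 6, 4]
rates = evaluate_filter(frames,
                        detector=lambda f: f >= 5,
                        distort=lambda f, lvl: f - lvl,
                        levels=[0, 3, 6])
```

Plotting such detection rates against the distortion level, for each filter, is one way to turn the privacy-intelligibility trade-off into an objective curve.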
The main achievements and outcomes of this joint work between EPFL and Eurecom can be listed as follows:

  • Identified video surveillance use cases where privacy protection tools would be applicable.
  • Annotated the video datasets provided by Eurecom and EPFL.
  • Distorted privacy-sensitive regions with the following visual privacy protection tools: blurring, pixelisation, scrambling, and masking.
  • Adapted several face detection and face recognition algorithms to run on the distorted video.
  • Proposed an objective evaluation methodology and conducted the respective objective tests.
  • Analysed the preliminary results of the objective tests.
  • Compared the objective results with the previously conducted subjective studies.
  • Submitted a publication describing the objective methodology and the evaluation results to SPIE 2013.

A full comparative analysis between subjective evaluations in the lab, crowdsourcing studies, and objective tests is necessary for a clear understanding of the privacy-intelligibility trade-off and how to model it. Future work will focus on this direction, and the results of such an analysis will form the basis of a journal publication planned for submission in September 2013.
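One common way to compare objective scores against subjective ratings is a correlation coefficient. The sketch below computes the Pearson correlation in pure Python; whether this particular statistic was used in the actual comparative analysis is an assumption:

```python
import math


def pearson(xs, ys):
    """Pearson correlation between paired score lists, e.g. objective
    detection rates vs. mean subjective ratings per test condition.
    Returns a value in [-1, 1]. (Illustrative sketch.)
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A correlation close to 1 between the two score sets would indicate that the objective metric is a usable proxy for the subjective studies.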