Séminaires

The seminar of the Privacy Protection working group of the GDR Sécurité is a recurring online event. It is aimed at the community at large and is intended, in particular, to make up for the lack of seminars and conferences caused by the pandemic.

Call for participation for upcoming editions of this seminar:
– long talks (30 min)
– short talks (5 min)

Send proposals to benjamin.nguyen@insa-cvl.fr and
mathieu.cunche@insa-lyon.fr.

Privacy-Preserving Decentralized Machine Learning — Aurélien Bellet (Inria Lille Nord Europe – Magnet) — 18/03/2021 14:00

Abstract: Decentralized machine learning (DML), also known as federated learning, is a setting where many parties (e.g., mobile devices or whole organizations) collaboratively train a machine learning model while keeping their data decentralized. In this talk, I will give a brief introduction to DML and emphasize that most algorithms rely on aggregating local model updates made by participants. I will then show how differential privacy can be integrated into these algorithms to ensure data confidentiality, and discuss how to obtain good trade-offs between privacy, utility and computational costs.
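To illustrate the aggregation idea mentioned in the abstract, here is a minimal sketch (not the speaker's actual algorithm) of differentially private averaging of local model updates: each update is clipped to bound any single participant's influence, then Gaussian noise calibrated to the clipping bound is added to the sum. The clipping norm and noise multiplier below are illustrative assumptions.

```python
import numpy as np

def dp_aggregate(updates, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Average local model updates with clipping and Gaussian noise.

    Clipping each update bounds any single participant's contribution
    (the sensitivity of the sum); Gaussian noise calibrated to clip_norm
    then yields a differential privacy guarantee for the aggregate.
    """
    rng = np.random.default_rng(rng)
    clipped = [u * min(1.0, clip_norm / max(np.linalg.norm(u), 1e-12))
               for u in updates]
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(updates)

# Three participants' local gradient updates (toy values)
updates = [np.array([0.5, -0.2]), np.array([2.0, 1.0]), np.array([-0.3, 0.4])]
noisy_mean = dp_aggregate(updates, clip_norm=1.0, noise_multiplier=0.5, rng=0)
```

Tuning the clipping bound and noise multiplier is exactly where the privacy/utility/computation trade-offs discussed in the talk come into play.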

Bio: Aurélien Bellet is a tenured researcher at Inria (France). He obtained his Ph.D. from the University of Saint-Etienne (France) in 2012 and was a postdoctoral researcher at the University of Southern California (USA) and at Télécom Paris (France). His current research focuses on the design of federated and decentralized machine learning algorithms under privacy constraints. Aurélien served as area chair for ICML 2019, ICML 2020 and NeurIPS 2020, and co-organized several international workshops on machine learning and privacy (at NIPS’16, NeurIPS’18 and ’20, and as stand-alone events). He was also a co-organizer of the 10th edition of the French multidisciplinary conference on privacy protection (APVP) in 2019.

[Short talk] The Cluster Exposure Verification (CLÉA) Protocol — Vincent Roca (Inria Grenoble – Privatics) — 18/03/2021 14:00

Abstract: In this talk, I will give a brief introduction to the Cluster Exposure Verification (CLÉA) protocol, meant to warn the participants of a private event (e.g., a wedding or private party), or the persons present in a commercial or public location (e.g., a bar, restaurant, or train), that the place later became a cluster because people who were present at the same time have since tested COVID-positive. This protocol is the foundation of a dedicated TousAntiCovid module that will offer an additional, complementary service to the existing contact tracing module.
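As a highly simplified illustration of the general idea (an assumption-laden sketch, not the actual CLÉA specification, which relies on dynamic QR codes and a more elaborate cryptographic design), one can think of venues displaying per-period pseudonyms that visitors record locally and later compare against a published cluster list, so the exposure check never reveals who visited which place:

```python
import hashlib
import secrets

def venue_token(venue_secret: bytes, period: int) -> str:
    """Derive a per-period location pseudonym (illustrative only)."""
    return hashlib.sha256(venue_secret + period.to_bytes(8, "big")).hexdigest()

# The venue displays a QR code encoding the current token; visitors scan
# it and store it locally on their phone.
venue_secret = secrets.token_bytes(32)
scanned_tokens = {venue_token(venue_secret, period=20210318)}

# If that venue/period is later flagged as a cluster, the health authority
# publishes the corresponding tokens.
cluster_tokens = {venue_token(venue_secret, period=20210318)}

# The comparison runs locally, so the server never learns who was where.
exposed = bool(scanned_tokens & cluster_tokens)
```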

Bio: After a PhD from Grenoble INP in 1996, Vincent Roca joined the University Paris 6 as an Associate Professor in 1997, and Inria as a researcher in 2000. An active IETF (Internet Engineering Task Force) participant and a member of PRIVATICS since 2012, he now leads this Inria research team specialised in privacy and personal data protection. He focuses in particular on the privacy risks associated with the use of smartphones and Internet of Things devices. He is also a co-author, with PRIVATICS colleagues, of the ROBERT COVID exposure notification protocol that is the foundation of the French TousAntiCovid app.

Hybrid Differential Privacy — Catuscia Palamidessi (Inria Saclay – Comète) — 25/02/2021 14:00

Abstract: Differential Privacy (DP) is one of the most successful proposals to protect the privacy of sensitive data while preserving their utility. In this talk, we will briefly introduce the DP framework and its central and local models, which refer to the cases in which sanitization is done after the data have been collected, or at the level of the individual data, respectively.
We then present an intermediate scenario, which we call hybrid, representing the case in which the data set is distributed across different organizations that do not wish to disclose the original data but only a sanitized version, while still benefiting from the advantages of combining information from different sources. We propose a new mechanism for the hybrid case, which is compositional and particularly suitable for applying a variant of the statistical Expectation-Maximization method, thanks to which the utility of the original data can be retrieved to an arbitrary degree of approximation, without affecting the privacy of the original data owners.
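To make the local model concrete (a minimal sketch under illustrative assumptions, not the hybrid mechanism proposed in the talk), here is the classic Laplace mechanism applied locally: each data owner perturbs their own bounded value before sharing it, yet averaging many sanitized values still recovers the population mean approximately.

```python
import numpy as np

def laplace_local(value, epsilon, lower=0.0, upper=100.0, rng=None):
    """Sanitize one bounded value with epsilon-local differential privacy.

    Sensitivity is the value range; noise scale = (upper - lower) / epsilon.
    """
    rng = np.random.default_rng(rng)
    return value + rng.laplace(0.0, (upper - lower) / epsilon)

# Each owner sanitizes locally; the aggregator only ever sees noisy values.
rng = np.random.default_rng(42)
true_values = rng.uniform(0.0, 100.0, size=10_000)
noisy_values = [laplace_local(v, epsilon=1.0, rng=rng) for v in true_values]
estimated_mean = float(np.mean(noisy_values))
```

The per-value noise is large (scale 100 for epsilon = 1), but it averages out over many participants, which is the intuition behind recovering utility from locally sanitized data.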

Detecting online tracking and GDPR violations in Web applications — Nataliia Bielova (Inria Sophia Antipolis – Privatics) — 17/12/2020 14:00

Abstract: As millions of users browse the Web on a daily basis, they become producers of data that are continuously collected by numerous companies and agencies. Website owners, however, need to become compliant with recent EU privacy regulations (such as GDPR and ePrivacy) and often rely on cookie banners to either inform users or collect their consent to tracking.

In this talk, I will present recent results on detecting Web trackers and analyzing compliance of websites with the GDPR and the ePrivacy Directive. We first develop a tracking detection methodology based on invisible pixels. By analyzing third-party resource loading on 80K webpages, we uncover hidden collaborations between third parties and find that 68% of websites synchronize harmless first-party cookies with privacy-invasive third-party cookies. We show that filter lists, used in the research community as a de facto approach to detect trackers, miss between 25% and 30% of the cookie-based tracking we detect. Finally, we demonstrate that privacy-protecting browser extensions, such as Ghostery, Disconnect, or Privacy Badger, together miss 24% of the tracking requests we detect.

To measure legal compliance of websites, we analyze cookie banners that are implemented by Consent Management Providers (CMPs) following IAB Europe’s Transparency and Consent Framework (TCF). Via cookie banners, CMPs collect and disseminate user consent to third parties. We systematically study IAB Europe’s TCF and analyze the consent stored behind the user interface of TCF cookie banners. We analyze the GDPR and the ePrivacy Directive to identify legal violations in implementations of cookie banners based on the storage of consent, and detect such violations by crawling 23K European websites and further analyzing 560 websites that rely on the TCF. As a result, we find violations in 54% of them: 175 (12.3%) websites register positive consent even if the user has not made their choice; 236 (46.5%) websites nudge the users towards accepting consent by pre-selecting options; and 39 (7.7%) websites store a positive consent even if the user has explicitly opted out. Finally, we provide a browser extension, Cookie Glasses, to facilitate manual detection of violations for regular users and Data Protection Authorities.

Bio: Nataliia Bielova is a Research Scientist in the Privatics team at Inria Sophia Antipolis, where she conducts interdisciplinary research spanning Computer Science and EU Data Protection Law. Her main research interests are the measurement and detection of, and protection from, Web tracking. She also collaborates with legal researchers to understand how the GDPR and the ePrivacy Regulation can be enforced in Web applications.


Groupe de Travail Protection de la Vie Privée