Facial Recognition Technologies and Algorithmic Video Technologies for Mass Surveillance in Terms of the Right to Privacy: A Case Study of the Paris Olympic Games 2024
Summary
This thesis explores the legal and privacy implications of using Facial Recognition Technologies (FRTs) and Algorithmic Video Surveillance Systems (AVSs) for mass surveillance during the Paris 2024 Olympic Games. The research critically examines the regulatory framework governing FRTs and AVSs, focusing on the European Convention on Human Rights (ECHR), the Charter of Fundamental Rights of the European Union (CFR), the General Data Protection Regulation (GDPR), the Law Enforcement Directive (LED), and the newly adopted Artificial Intelligence Act (AI Act).
While the French "Olympic Law" prohibits FRTs because of their potential to violate privacy, it permits AVSs, raising concerns about their compliance with privacy safeguards. The research establishes a distinction between FRTs and AVSs, highlighting the latter's non-biometric data processing and focus on aggregate behavioral patterns. Nevertheless, it underscores the potential risks that the experimental use of AVSs poses to privacy and civil liberties.
Through doctrinal legal analysis and a single case study methodology, the thesis evaluates whether the legal limitations on FRTs should extend to AVSs. It concludes that AVSs, despite operating within less strict legal boundaries than FRTs, still pose risks to fundamental rights. Recommendations emphasize robust regulatory oversight, transparency, and accountability to mitigate privacy concerns while addressing security needs at large-scale public events.