dc.rights.license | CC-BY-NC-ND | |
dc.contributor.advisor | Schaefer, Mirko | |
dc.contributor.author | Graff, Lukas | |
dc.date.accessioned | 2024-04-04T23:01:46Z | |
dc.date.available | 2024-04-04T23:01:46Z | |
dc.date.issued | 2024 | |
dc.identifier.uri | https://studenttheses.uu.nl/handle/20.500.12932/46244 | |
dc.description.abstract | Algorithms are increasingly used in decision-making. As numerous scandals show, this introduces new risks of discrimination at scale. Algorithmic fairness audits have often been proposed as a binding method to reduce this and other risks. Hence, this research investigates what role auditing can play in ensuring algorithmic fairness, in terms of non-discrimination. Strictly defined, auditing leaves no room for subjective interpretation, meaning that all choices faced when assessing algorithmic fairness should be eliminated to create an audit framework. Our research therefore focuses on detecting and eliminating these choices. First, we identify the normative choices that are faced when assessing algorithmic fairness from a computer science perspective. Second, we investigate to what extent Dutch non-discrimination legislation prescribes how these choices should be made. We find that some important normative choices are left open by law. Third, we therefore use informal conversations with algorithmic fairness practitioners to explore best practices in algorithmic fairness assessments and find alternative ways of deciding on these normative choices. Finally, we conclude that algorithmic fairness audits cannot be used directly to ensure non-discrimination. However, both internal and external audits can play a more indirect role in ensuring non-discrimination, by verifying the soundness of either the procedure of internal algorithmic fairness assessments or the documentation thereof. | |
dc.description.sponsorship | Utrecht University | |
dc.language.iso | en | |
dc.subject | This research answered the question of what role 'auditing' can play in ensuring algorithmic fairness, in terms of non-discrimination. It focuses on algorithms used in decision-making. This question was approached by considering computer science, non-discrimination legislation, and practice. Its approach is similar to the algorithmic fairness approach promoted by the ACM FAccT conferences. | |
dc.title | Algorithmic fairness auditing: can discrimination by algorithms be prevented? | |
dc.type.content | Master Thesis | |
dc.rights.accessrights | Open Access | |
dc.subject.keywords | Algorithmic Fairness; Algorithmic Decision-making; Ethical Auditing; Ethical AI; Non-discrimination | |
dc.subject.courseuu | Artificial Intelligence | |
dc.thesis.id | 29748 | |