
dc.rights.license: CC-BY-NC-ND
dc.contributor.advisor: Schaefer, Mirko
dc.contributor.author: Graff, Lukas
dc.date.accessioned: 2024-04-04T23:01:46Z
dc.date.available: 2024-04-04T23:01:46Z
dc.date.issued: 2024
dc.identifier.uri: https://studenttheses.uu.nl/handle/20.500.12932/46244
dc.description.abstract: Algorithms are increasingly used in decision-making. As numerous scandals show, this introduces new risks of discrimination at large scale. Algorithmic fairness audits have often been proposed as a binding method to reduce this and other risks. Hence, this research investigates what role auditing can play in ensuring algorithmic fairness, in terms of non-discrimination. Strictly defined, auditing leaves no room for subjective interpretation, meaning that all choices faced when assessing algorithmic fairness should be eliminated to create an audit framework. Our research therefore focuses on detecting and eliminating these choices. Firstly, we identify the normative choices that arise when assessing algorithmic fairness from a computer science perspective. Secondly, we investigate to what extent Dutch non-discrimination legislation prescribes how these choices should be made. We find that some important normative choices are left open by law. Thirdly, we therefore use informal conversations with algorithmic fairness practitioners to explore best practices in algorithmic fairness assessments, seeking alternative ways of deciding on these normative choices. Finally, we conclude that algorithmic fairness audits cannot be used directly to ensure non-discrimination. However, both internal and external audits can serve non-discrimination more indirectly, by ensuring the soundness of either the procedure of internal algorithmic fairness assessments or the documentation thereof.
dc.description.sponsorship: Utrecht University
dc.language.iso: EN
dc.subject: This research answered the question of what role 'auditing' can play in ensuring algorithmic fairness, in terms of non-discrimination. It focuses on algorithms used in decision-making. This question was approached by considering computer science, non-discrimination legislation and practice. Its approach is similar to the algorithmic fairness approach promoted by the ACM FAccT conferences.
dc.title: Algorithmic fairness auditing: can discrimination by algorithms be prevented?
dc.type.content: Master Thesis
dc.rights.accessrights: Open Access
dc.subject.keywords: Algorithmic Fairness; Algorithmic Decision-making; Ethical Auditing; Ethical AI; Non-discrimination
dc.subject.courseuu: Artificial Intelligence
dc.thesis.id: 29748

