Show simple item record

dc.rights.license: CC-BY-NC-ND
dc.contributor.advisor: Bex, F.J.
dc.contributor.author: Scheffers, Roos
dc.date.accessioned: 2023-07-20T00:01:10Z
dc.date.available: 2023-07-20T00:01:10Z
dc.date.issued: 2023
dc.identifier.uri: https://studenttheses.uu.nl/handle/20.500.12932/44211
dc.description.abstract: The use of automated decision-making is becoming increasingly prevalent. Users of systems that make these decisions must be able to assess a system's biases and trust it. Providing explanations for system decisions is one way to achieve this, and doing so is the focus of the Explainable Artificial Intelligence (XAI) field. One technique used within XAI is formal argumentation: the logic by which an algorithm arrives at a specific decision can be represented as a formal argumentation structure. How such a structure can be translated into human-friendly explanations, however, remains an open question. One concept formalized for explanations in argumentation that takes properties of human explanations into account is 'relevance'. Informally, an argument is relevant to another argument if there is a relation between the two, for example, attacking or defending that argument. In this thesis, the concept of relevance was tested empirically by comparing relevance-based explanations in formal argumentation to explanations provided by participants. One hundred twenty-seven participants provided explanations for scenarios based on two different types of relevance. The results suggest that relevance in argumentation aligns with the explanations participants selected: participants preferred small explanations consisting of direct defenders, that is, arguments that attack the attacker of an argument. However, further investigation is needed to determine whether the task's difficulty affected these results. Future work could build on this study by expanding to non-acceptance and non-extension-based explanations and by investigating differences in explanation behaviour based on participants' prior knowledge and explanation goals.
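The abstract's notion of a "direct defender" (an argument that attacks the attacker of another argument) can be illustrated with a minimal sketch over an abstract argumentation framework represented as a set of attack pairs. The function name and the example arguments below are illustrative assumptions, not taken from the thesis itself.

```python
# Minimal sketch: direct defenders in an abstract argumentation framework.
# An attack relation is modeled as a set of (attacker, attacked) pairs;
# the helper name `direct_defenders` and the arguments A, B, C are
# hypothetical examples, not from the thesis.

def direct_defenders(target, attacks):
    """Return the arguments that attack an attacker of `target`."""
    attackers = {a for (a, b) in attacks if b == target}
    return {c for (c, d) in attacks if d in attackers}

# Example attack graph: B attacks A, and C attacks B.
attacks = {("B", "A"), ("C", "B")}
print(direct_defenders("A", attacks))  # {'C'}: C defends A by attacking B
```

Under this representation, an explanation built from direct defenders of an accepted argument stays small: it contains only the arguments one attack step away from the argument's attackers, which matches the participant preference reported in the abstract.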
dc.description.sponsorship: Utrecht University
dc.language.iso: EN
dc.subject: An empirical study in which theoretical explanations based on formal argumentation were compared to explanations provided by participants.
dc.title: Relevant Explanations in Formal Argumentation, an Empirical Study
dc.type.content: Master Thesis
dc.rights.accessrights: Open Access
dc.subject.keywords: XAI, Argumentation, explanation, experiment, AI
dc.subject.courseuu: Artificial Intelligence
dc.thesis.id: 19491

