Show simple item record

dc.rights.license	CC-BY-NC-ND
dc.contributor.advisor	Doder, Dragan
dc.contributor.author	Diehl, Daniël
dc.date.accessioned	2025-05-31T23:01:26Z
dc.date.available	2025-05-31T23:01:26Z
dc.date.issued	2025
dc.identifier.uri	https://studenttheses.uu.nl/handle/20.500.12932/48995
dc.description.abstract	Explainable Artificial Intelligence (XAI) has emerged as a critical aspect of AI systems, addressing the pressing need to enhance user understanding across various applications and thereby fostering trust and responsible use. Formal argumentation offers a promising approach to conceptualizing explanations, with a variety of explanation semantics defined to extract the relevant arguments that justify the (non-)acceptance of conclusions. This thesis focuses on explanation semantics for structured argumentation, which is particularly suited for modelling real-world applications. Currently, the only explanation semantics defined for structured argumentation is that of Borg and Bex [2024]. While their approach is flexible and adaptable to the user's needs, a key drawback is the high computational complexity of extension-based semantics in ASPIC+, which limits its scalability. As argumentation frameworks grow in size and complexity, it is essential to keep explanations understandable and efficient. To address this challenge, we propose a novel approach to explanations in ASPIC+, drawing inspiration from the results of Lehtonen, Wallner et al. [2020] on efficient reasoning with an ASP-based method. By leveraging their approach, we bypass the complex task of constructing the argumentation framework and directly determine the acceptance of premises, rules and conclusions. This allows us to define explanation semantics for ASPIC+ at the level of these components. A key result of this method is the condensation of explanations, grouping arguments that share the same top rule. Additionally, we exclude irrelevant elements from the explanation, introducing new notions of attack and defence to further condense explanations. Our approach makes explanations shorter and more concise, offering minimal sets of elements that explain the (non-)acceptance of conclusions. This simplification is especially valuable for large and complex frameworks, where existing explanations are often too costly to compute and too intricate to interpret. Our method provides a foundational step toward more computationally efficient explanations.
dc.description.sponsorship	Utrecht University
dc.language.iso	EN
dc.subject	Explanations in ASPIC+ suffer from high computational complexity, which limits their scalability. To address this, we propose a novel approach leveraging answer set programming (ASP). A key result is the condensation of explanations by grouping arguments with the same top rule and introducing new notions of attack and defence. This enables simpler, more concise explanations and provides a foundational step toward computationally efficient explanation semantics.
dc.title	Condense and Efficient Explanations in ASPIC+: An ASP approach to element explanations.
dc.type.content	Master Thesis
dc.rights.accessrights	Open Access
dc.subject.keywords	Explainable AI; Formal Argumentation; Structured Argumentation Frameworks; ASPIC+; Explanation Semantics; Answer Set Programming; Optimising Explanation Generation
dc.subject.courseuu	Artificial Intelligence
dc.thesis.id	41968

