dc.rights.license | CC-BY-NC-ND | |
dc.contributor.advisor | De, M | |
dc.contributor.author | Valkenburg, S.J.M. | |
dc.date.accessioned | 2021-09-01T18:00:16Z | |
dc.date.available | 2021-09-01T18:00:16Z | |
dc.date.issued | 2021 | |
dc.identifier.uri | https://studenttheses.uu.nl/handle/20.500.12932/41424 | |
dc.description.abstract | Recommender systems are all around us: they can be found in news applications, on YouTube and Netflix, in the healthcare industry, and in e-commerce. These systems influence our choices and the information that is presented to us, which makes it crucial to think about the ethical consequences of recommendations and about possible solutions to ethical issues. In this thesis, we identify the main ethical challenges of recommender systems and examine one specific, promising solution: the secondary ethical layer. The secondary ethical layer is a general ethical filter that removes unethical recommendations based on cultural and personal preferences, while also taking into account all the stakeholders that recommendations can affect (such as the user, the provider, the system, and society). We find that this solution can resolve some ethical issues, specifically with regard to inappropriate content, unfairness (bias), and issues for society. It does not solve problems such as the lack of transparency and some privacy issues within recommender systems. This thesis identifies the key elements of the ethical layer and lays the foundations on which a practical solution can be built. | |
dc.description.sponsorship | Utrecht University | |
dc.format.extent | 232385 | |
dc.format.mimetype | application/pdf | |
dc.language.iso | en | |
dc.title | The ethics of recommender systems | |
dc.type.content | Bachelor Thesis | |
dc.rights.accessrights | Open Access | |
dc.subject.keywords | Recommender systems, Ethics of AI, Information Ethics, Netflix ethics, recommendation systems, ethical layer | |
dc.subject.courseuu | Kunstmatige Intelligentie (Artificial Intelligence) | |
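
The abstract describes the secondary ethical layer as a post-hoc filter that screens a recommender's output against cultural and personal preferences while weighing several stakeholders (user, provider, system, society). As a rough illustration of that idea only, here is a minimal sketch in Python; the class, rule names, and tags are hypothetical assumptions for illustration, not the thesis's actual design.

# A minimal, hypothetical sketch of a "secondary ethical layer": a filter
# applied to a recommender's output. All names (Recommendation, EthicalLayer,
# the example rules and tags) are illustrative assumptions, not from the thesis.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Recommendation:
    item_id: str
    tags: set[str] = field(default_factory=set)

# A rule inspects one recommendation and returns True if it is acceptable.
Rule = Callable[[Recommendation], bool]

@dataclass
class EthicalLayer:
    """Filters recommendations with rules grouped per stakeholder
    (user, provider, system, society), covering cultural and
    personal preferences."""
    stakeholder_rules: dict[str, list[Rule]] = field(default_factory=dict)

    def add_rule(self, stakeholder: str, rule: Rule) -> None:
        self.stakeholder_rules.setdefault(stakeholder, []).append(rule)

    def filter(self, recs: list[Recommendation]) -> list[Recommendation]:
        # Keep a recommendation only if every rule of every stakeholder accepts it.
        return [
            r for r in recs
            if all(rule(r)
                   for rules in self.stakeholder_rules.values()
                   for rule in rules)
        ]

# Example usage: one personal preference (user) and one societal rule.
layer = EthicalLayer()
layer.add_rule("user", lambda r: "violence" not in r.tags)           # personal preference
layer.add_rule("society", lambda r: "misinformation" not in r.tags)  # societal concern

recs = [
    Recommendation("a", {"news"}),
    Recommendation("b", {"news", "misinformation"}),
]
print([r.item_id for r in layer.filter(recs)])  # -> ['a']

Modelling each rule as a plain predicate keeps the layer general, in the spirit of the abstract: new cultural, personal, or stakeholder-specific constraints can be added without changing the underlying recommender.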