
dc.rights.license: CC-BY-NC-ND
dc.contributor.advisor: Kustritz, A.M.
dc.contributor.author: Hommenga, A.J.
dc.date.accessioned: 2017-07-05T17:01:51Z
dc.date.available: 2017-07-05T17:01:51Z
dc.date.issued: 2017
dc.identifier.uri: https://studenttheses.uu.nl/handle/20.500.12932/26029
dc.description.abstract: This article investigates how content on Facebook's newsfeeds is framed and to what extent this framing constructs filter bubbles in which encounters with difference are not facilitated. It does so through a case study of two constructed Facebook accounts in the context of the recent American election: one pro-Trump and one pro-Clinton. This is relevant because popular discourse suggests that encounters with the other side of the political spectrum are rare on personalized newsfeeds. The research shows that this is problematic: within digital filter bubbles, reality turns into a constructed, personalized image of the real, and these bubbles do not represent Western society's heterogeneity. The emergence of fake news compounds the problem, because the truthfulness of newsfeed content proves difficult to verify on the newsfeed itself; fake news can therefore be acted upon as if it were real. The newsfeeds are observed as a digital city through which people move, so the analysis builds on theories of a correlation between the surrounding structure in which people act and the way they act within it. The focus, however, lies not on movement but on the framing of meaning in the shared space. A non-participant observational research method is therefore applied, because it allows the way the newsfeed's technologies construct filter bubbles to be isolated. The findings are that hidden technologies of the personalized newsfeeds frame the meaning within them: the pro-Trump newsfeed consists only of content in favor of Trump, and the pro-Clinton newsfeed only of content in favor of Clinton. The newsfeed's hidden technologies thus act as virtual gatekeepers that decide what an individual does and does not see. The result is a set of personalized filter bubbles that do not contest the user's thoughts but merely affirm them. The article builds toward a feature that should be added to the newsfeeds in order for Facebook to take up its responsibility as gatekeeper, because Facebook is responsible.
dc.description.sponsorship: Utrecht University
dc.format.extent: 764475
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.title: A new invisible non-human actor is framing politics, but who is it? A Non-Participatory Observational Research of Facebook's Newsfeeds
dc.type.content: Bachelor Thesis
dc.rights.accessrights: Open Access
dc.subject.keywords: Filter bubble; black box; throwntogetherness; non-human actor
dc.subject.courseuu: Media en cultuur

