Show simple item record

dc.rights.license: CC-BY-NC-ND
dc.contributor.advisor: Wijnholds, G.J.
dc.contributor.advisor: Broersen, J.M.
dc.contributor.author: Teerlink, Y.K.
dc.date.accessioned: 2021-04-06T18:00:16Z
dc.date.available: 2021-04-06T18:00:16Z
dc.date.issued: 2021
dc.identifier.uri: https://studenttheses.uu.nl/handle/20.500.12932/39213
dc.description.abstract: Compared to neural networks (NNs), humans can learn new concepts from very little data. This ability to learn so efficiently may be due to the use of abstractions. To find similarities between human and machine learning, this research analyzes whether NNs benefit from syntactic information during training. We aim to answer the following question: how does enriching training data with syntactic knowledge affect the performance of an NN on natural language processing tasks? This research examines the results of Long Short-Term Memory (LSTM) models trained on two types of datasets: one without Part-of-Speech (POS) tags (a form of abstract knowledge) and one supplemented with POS tags. The results show that an LSTM trained on a relatively small dataset supplemented with POS tags outperforms an LSTM trained on a regular dataset. This increase in performance might suggest that neural networks benefit from abstract information, which in turn might point to similarities in the way humans and machines learn.
dc.description.sponsorship: Utrecht University
dc.format.extent: 608483
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.title: Enriching training data with syntactic knowledge and the effect on performance of a neural network on natural language processing tasks
dc.type.content: Bachelor Thesis
dc.rights.accessrights: Open Access
dc.subject.courseuu: Kunstmatige Intelligentie
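The data-enrichment step described in the abstract, supplementing training text with POS tags, can be sketched as follows. The toy tagger dictionary, the `enrich_with_pos` helper name, and the token/TAG interleaving format are assumptions for illustration only; the thesis does not specify its exact tagger or data layout.

```python
# Hypothetical sketch: interleave each token with its POS tag so an LSTM
# sees the sequence "the DET cat NOUN sat VERB ...". The tiny lookup table
# below stands in for a real POS tagger, which the record does not name.

TOY_TAGGER = {
    "the": "DET", "a": "DET",
    "cat": "NOUN", "mat": "NOUN",
    "sat": "VERB", "on": "ADP",
}

def enrich_with_pos(tokens, tagger=TOY_TAGGER, unk="X"):
    """Return tokens interleaved with their POS tags: [tok, TAG, tok, TAG, ...]."""
    enriched = []
    for tok in tokens:
        enriched.append(tok)
        enriched.append(tagger.get(tok, unk))  # fall back to an unknown tag
    return enriched

print(enrich_with_pos(["the", "cat", "sat", "on", "the", "mat"]))
```

A model trained on the enriched sequences then receives the syntactic category of each word as an explicit input symbol, which is the kind of abstract information the abstract argues may aid learning on small datasets.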

