dc.rights.license: CC-BY-NC-ND
dc.contributor.advisor: Löffler, M.
dc.contributor.advisor: Alakuijala, J.
dc.contributor.author: Asseldonk, R.R.L.J. van
dc.date.accessioned: 2018-06-25T17:01:29Z
dc.date.available: 2018-06-25T17:01:29Z
dc.date.issued: 2018
dc.identifier.uri: https://studenttheses.uu.nl/handle/20.500.12932/29174
dc.description.abstract: Modern neural network architectures can have as many as hundreds of millions of parameters. This makes them power- and memory-hungry, and impedes running networks on resource-constrained devices such as phones. Sparse networks can achieve performance similar to that of dense networks, with a fraction of the parameters. However, sparsification is usually done as an afterthought, without benefits in the learning phase. In this thesis, we propose to simultaneously optimise network architecture and parameters. Apart from sparsity benefits, this eliminates the need to choose a particular network architecture in advance.
dc.description.sponsorship: Utrecht University
dc.format.extent: 1305441
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.title: Online Learning of Sparse Network Architectures
dc.type.content: Master Thesis
dc.rights.accessrights: Open Access
dc.subject.keywords: machine learning, graphs, architecture learning, metalearning, sparsification
dc.subject.courseuu: Game and Media Technology