dc.rights.license: CC-BY-NC-ND
dc.contributor.advisor: van Ommen, M.
dc.contributor.advisor: Dotlačil, J.D.
dc.contributor.author: Steendam, R.P.
dc.date.accessioned: 2020-07-13T18:00:19Z
dc.date.available: 2020-07-13T18:00:19Z
dc.date.issued: 2020
dc.identifier.uri: https://studenttheses.uu.nl/handle/20.500.12932/36152
dc.description.abstract: Mixed-precision floating-point arithmetic promises to reduce the computational cost of deep neural networks. But does the technique live up to that promise, and do all network architectures benefit equally? Using the Dogs vs. Cats dataset, we measured the effect of mixed-precision on VGG, Inception, and ResNet in terms of accuracy, training speed, and inference speed. Mixed-precision accuracy was comparable to single-precision accuracy, and every network became faster in both training and inference: depending on the architecture, the speedup ranged from 32% to 45% for training and from 10% to 40% for inference.
dc.description.sponsorship: Utrecht University
dc.format.extent: 399630
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.title: Testing Mixed-Precision for VGG, Inception and ResNet on the Dogs vs. Cats Dataset
dc.type.content: Bachelor Thesis
dc.rights.accessrights: Open Access
dc.subject.keywords: deep learning, neural networks, single-precision, mixed-precision, quantization, dogs vs. cats, vgg, inception, resnet, machine learning
dc.subject.courseuu: Kunstmatige Intelligentie
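
For illustration of the technique the abstract describes: below is a minimal sketch of mixed-precision training using PyTorch's automatic mixed precision. This is not code from the thesis, and the record does not state which framework the author used; the two-class ResNet-50 setup, the train_step helper, and the hyperparameters are hypothetical, and the sketch assumes a CUDA-capable GPU.

    import torch
    from torch import nn
    from torchvision import models

    # Illustrative setup: a two-class (cats vs. dogs) ResNet-50 classifier.
    device = torch.device("cuda")
    model = models.resnet50(num_classes=2).to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    criterion = nn.CrossEntropyLoss()
    scaler = torch.cuda.amp.GradScaler()  # loss scaling guards against fp16 underflow

    def train_step(images, labels):
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        with torch.cuda.amp.autocast():  # eligible ops (convs, matmuls) run in fp16
            loss = criterion(model(images), labels)
        scaler.scale(loss).backward()    # backprop on the scaled loss
        scaler.step(optimizer)           # unscale gradients, then take the step
        scaler.update()                  # adapt the loss-scale factor
        return loss.item()

The key design point, as in the abstract's experiments, is that weights stay in single precision while most arithmetic runs in half precision, which is where the training and inference speedups come from.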

