dc.rights.license | CC-BY-NC-ND
dc.contributor.advisor | van Ommen, M.
dc.contributor.advisor | Dotlačil, J.D.
dc.contributor.author | Steendam, R.P.
dc.date.accessioned | 2020-07-13T18:00:19Z
dc.date.available | 2020-07-13T18:00:19Z
dc.date.issued | 2020
dc.identifier.uri | https://studenttheses.uu.nl/handle/20.500.12932/36152
dc.description.abstract | Mixed-precision floating-point arithmetic seems promising for reducing the computational cost of deep neural networks. But does the technique live up to that promise, and do all network architectures benefit equally from mixed precision? Using the Dogs vs. Cats dataset, we studied the effect of mixed precision on VGG, Inception and ResNet by measuring accuracy, training speed and inference speed. The results showed that the accuracy of mixed-precision training was comparable to that of single precision. Furthermore, all networks became faster in both training and inference: across architectures, the speedup ranged from 32% to 45% for training and from 10% to 40% for inference.
dc.description.sponsorship | Utrecht University
dc.format.extent | 399630
dc.format.mimetype | application/pdf
dc.language.iso | en
dc.title | Testing Mixed-Precision for VGG, Inception and ResNet on the Dogs vs. Cats Dataset
dc.type.content | Bachelor Thesis
dc.rights.accessrights | Open Access
dc.subject.keywords | deep learning, neural networks, single-precision, mixed-precision, quantization, dogs vs. cats, vgg, inception, resnet, machine learning
dc.subject.courseuu | Kunstmatige Intelligentie (Artificial Intelligence)
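
For context on the technique the abstract describes, below is a minimal sketch of how mixed-precision training is typically enabled. The thesis record does not state which framework was used; TensorFlow/Keras, the ResNet50 backbone, and the binary dogs-vs-cats head here are assumptions chosen purely for illustration, not the author's actual setup.

    # Minimal mixed-precision training sketch (assumed setup, not the
    # thesis's actual code): TensorFlow/Keras, ResNet50, binary classifier.
    import tensorflow as tf
    from tensorflow.keras import layers, mixed_precision

    # Compute in float16 where safe; variables stay in float32.
    mixed_precision.set_global_policy("mixed_float16")

    # weights=None (random init) keeps the sketch self-contained;
    # weights="imagenet" would also work.
    base = tf.keras.applications.ResNet50(include_top=False, weights=None,
                                          pooling="avg",
                                          input_shape=(224, 224, 3))
    x = layers.Dense(1, name="logits")(base.output)
    # Keep the final activation in float32 for numerical stability.
    out = layers.Activation("sigmoid", dtype="float32")(x)
    model = tf.keras.Model(base.input, out)

    # Under the mixed_float16 policy, Keras wraps the optimizer in a
    # LossScaleOptimizer automatically, so the training loop is unchanged.
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    # model.fit(train_ds, validation_data=val_ds, epochs=10)

The float32 output layer and automatic loss scaling are the standard precautions for mixed-precision training: they avoid float16 underflow in the loss and gradients while the bulk of the compute still runs in half precision, which is the source of the training and inference speedups the abstract reports.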