dc.rights.license: CC-BY-NC-ND
dc.contributor.advisor: Feelders, A.
dc.contributor.author: Kolkman, T.M.
dc.date.accessioned: 2015-03-17T18:00:39Z
dc.date.available: 2015-03-17T18:00:39Z
dc.date.issued: 2015
dc.identifier.uri: https://studenttheses.uu.nl/handle/20.500.12932/19543
dc.description.abstract: Active learning can in many cases speed up classification tasks by combining expert knowledge with knowledge about the structure of the data. When the class label is known to increase or decrease with the attribute values, we can exploit this property to greatly reduce the number of labelled examples needed to construct a classifier. Here we study active learning algorithms both in general and in the case where the data exhibits such monotone structure. These monotone relations form the basis of the SMAL algorithm described by Barile and Feelders (2012), which we study in more detail. We describe a special case that can lead to unwanted behaviour in this algorithm and explore several alternative approaches that aim to prevent it. We propose a number of changes to the algorithm that reduce the occurrence of this case, possibly at the cost of some performance. Experimental results look promising: they show only a minor drop in performance across our toy datasets, and in some cases even an improvement.
dc.description.sponsorship: Utrecht University
dc.format.extent: 1657314
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.title: Stochastic Active Learning with Monotonicity Constraints
dc.type.content: Master Thesis
dc.rights.accessrights: Open Access
dc.subject.keywords: Active Learning, Monotonicity, Stochastic Dominance, Machine Learning, Classification
dc.subject.courseuu: Computing Science
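
The monotonicity idea from the abstract can be illustrated with a minimal sketch. This is not the SMAL algorithm itself, which reasons with stochastic dominance rather than hard label bounds; the function names, the two-class default, and the deterministic bound propagation below are illustrative assumptions. The point is that when the label is non-decreasing in the attributes, each labelled example immediately constrains every pool point it dominates or is dominated by, which is why far fewer queries can suffice.

import numpy as np

def dominates(x, y):
    # True if x <= y componentwise, i.e. y dominates x in the attribute order.
    return bool(np.all(np.asarray(x) <= np.asarray(y)))

def monotone_label_bounds(X_labeled, y_labeled, X_pool, n_classes=2):
    # Derive lower/upper bounds on the (ordinal) labels of pool points that
    # follow from monotonicity: if x <= x' componentwise, label(x) <= label(x').
    lower = np.zeros(len(X_pool), dtype=int)
    upper = np.full(len(X_pool), n_classes - 1, dtype=int)
    for xl, yl in zip(X_labeled, y_labeled):
        for i, xp in enumerate(X_pool):
            if dominates(xl, xp):   # labelled point lies below the pool point
                lower[i] = max(lower[i], yl)
            if dominates(xp, xl):   # pool point lies below the labelled point
                upper[i] = min(upper[i], yl)
    return lower, upper

# Pool points whose bounds coincide are already determined by the labelled set,
# so an active learner never needs to query them.
X_labeled = np.array([[1, 2], [4, 5]])
y_labeled = np.array([0, 1])
X_pool = np.array([[0, 1], [5, 6], [2, 1]])
lower, upper = monotone_label_bounds(X_labeled, y_labeled, X_pool)
print(lower == upper)   # [ True  True False]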