dc.rights.license | CC-BY-NC-ND | |
dc.contributor.advisor | Paperno, D. | |
dc.contributor.advisor | Adriaans, F.W. | |
dc.contributor.author | Bekkenutte, R.O.J. | |
dc.date.accessioned | 2020-08-04T18:00:21Z | |
dc.date.available | 2020-08-04T18:00:21Z | |
dc.date.issued | 2020 | |
dc.identifier.uri | https://studenttheses.uu.nl/handle/20.500.12932/36495 | |
dc.description.abstract | TrainTool is a web app in which trainees can practise their communication skills.
The training works by recording the trainee's response to a video and evaluating that recording against a given criterion. Automating the evaluation of these responses would make the system more efficient. To run an automated communication training system effectively, a classifier that evaluates criterion-user-input pairs is necessary. As deep neural networks have enabled text classification to reach new heights, this research tests whether Google's pre-trained neural model BERT can be fine-tuned to effectively classify the criterion-transcription pairs. This task, called criterion-transcription evaluation, differs inherently from the tasks in previous studies and is therefore a novel application of text classification. A multilingual BERT model as well as a pre-trained Dutch BERT model called BERTje were fine-tuned for this task. Results show that both models outperformed the baselines, and that BERTje performs slightly better than multilingual BERT. A larger dataset and more computing power are needed to fine-tune the model further and to obtain results that are more representative of what this classifier can achieve. | |
dc.description.sponsorship | Utrecht University | |
dc.format.extent | 363229 | |
dc.format.mimetype | application/pdf | |
dc.language.iso | en | |
dc.title | Towards automated communication training: Fine-tuning deep contextualized embeddings | |
dc.type.content | Bachelor Thesis | |
dc.rights.accessrights | Open Access | |
dc.subject.courseuu | Kunstmatige Intelligentie | |
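The abstract above describes fine-tuning BERT-style models on criterion-transcription pairs. Below is a minimal sketch of how such pair classification is commonly set up with the Hugging Face transformers library; the checkpoint names (GroNLP/bert-base-dutch-cased for BERTje, bert-base-multilingual-cased for multilingual BERT), the example criterion and transcription, the label set, and the training hyperparameters are illustrative assumptions and are not taken from the thesis itself.

    # Sketch: fine-tuning BERTje for criterion-transcription pair classification.
    # Checkpoint names, example texts, labels, and hyperparameters are assumptions,
    # not the thesis's actual setup.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    MODEL_NAME = "GroNLP/bert-base-dutch-cased"  # BERTje; swap in "bert-base-multilingual-cased" for mBERT

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

    # Encode a criterion-transcription pair as one sentence pair so the [CLS]
    # representation can be classified as "criterion met" vs. "criterion not met".
    criterion = "De deelnemer vat het probleem van de gesprekspartner samen."        # hypothetical example
    transcription = "Als ik het goed begrijp, maakt u zich zorgen over de planning."  # hypothetical example

    inputs = tokenizer(criterion, transcription, truncation=True, return_tensors="pt")
    labels = torch.tensor([1])  # 1 = criterion met (illustrative label)

    # A single fine-tuning step; in practice this loops over the full labelled dataset.
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
    model.train()
    outputs = model(**inputs, labels=labels)
    outputs.loss.backward()
    optimizer.step()

The same script applies to both models by changing MODEL_NAME, which mirrors the comparison between multilingual BERT and BERTje described in the abstract.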