Automatic Sleep Assessment from Eye Cues in Videos of Strongly Occluded Preterm Infants
Summary
Monitoring the sleep of preterm infants provides valuable insights. In video recordings, preterm infants are often strongly occluded, yet their eyes generally remain visible, and eye cues play an important role in the manual sleep assessment of preterm infants. We exploit this relationship in an attempt to fully automate sleep assessment with (low-end) RGB cameras. We propose a framework to consistently extract eye regions from videos of occluded preterm infants, and we show that convolutional neural networks (CNNs) can be trained on these regions to automatically identify eye states. A binary CNN predicts whether the eyes are open or closed, with a test accuracy of 96.3%. Using a sliding window and a binary 3D CNN, we also identify rapid eye movements (REMs), with a test accuracy of up to 74.5%. We aggregate the predicted eye states per minute and translate the resulting features into sleep states with a random forest classifier. Using eye cues exclusively, we automatically discriminate the sleep stages wake, active sleep, and quiet sleep with an accuracy of 92.2%. We discuss remaining issues and propose solutions to further improve performance. Videos recorded at the neonatal intensive care unit of the University Medical Center Utrecht were used to construct the labeled datasets.
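The final stage of the pipeline, aggregating per-frame eye-state predictions into per-minute features and classifying sleep states with a random forest, could look roughly like the sketch below. The feature choices, frame rate, and function names are illustrative assumptions, not the exact configuration used in this work.

```python
# Minimal sketch: per-minute aggregation of eye-state predictions followed by
# random forest classification of sleep states. Assumptions: a fixed frame
# rate, three hand-picked features, and integer-coded sleep states.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

FPS = 25  # assumed video frame rate

def minute_features(eye_closed, rem, fps=FPS):
    """Aggregate binary per-frame predictions into one feature row per minute.

    eye_closed: 1D array, 1 if the eyes are predicted closed in that frame.
    rem:        1D array, 1 if a rapid eye movement is detected in that frame.
    """
    eye_closed = np.asarray(eye_closed)
    rem = np.asarray(rem)
    frames_per_min = fps * 60
    n_minutes = len(eye_closed) // frames_per_min
    rows = []
    for m in range(n_minutes):
        s = slice(m * frames_per_min, (m + 1) * frames_per_min)
        rows.append([
            eye_closed[s].mean(),                  # fraction of the minute with eyes closed
            rem[s].mean(),                         # fraction of the minute with REMs
            np.abs(np.diff(eye_closed[s])).sum(),  # number of eye-state transitions
        ])
    return np.asarray(rows)

# Hypothetical usage: X_train/y_train come from labeled recordings, with y
# encoding the sleep states (e.g. 0 = wake, 1 = active sleep, 2 = quiet sleep).
# clf = RandomForestClassifier(n_estimators=100, random_state=0)
# clf.fit(X_train, y_train)
# predicted_states = clf.predict(minute_features(eye_closed_test, rem_test))
```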