3D Hand Tracking Using Depth Sensors
Summary
In recent years, depth sensors have been used to interpret user gestures and track the motion of people. This has opened doors for many Human Computer Interaction (HCI) related applications, for instance in the fields of gaming, virtual reality, medical assistance, and human behavioural studies. A significant part of gesture interpretation within the HCI domain is based on hand gestures. While many sensors already provide full-body pose recognition, full hand pose recognition using a depth sensor still requires considerable development and improvement in terms of accuracy and speed. In this project we seek to improve upon such existing systems. We will develop our own hand tracking system that estimates the
complete 3D pose of a single isolated hand in near real-time. The system requires a single
RGB-D camera. The 3D pose of the hand is modelled using a hand model that is manipulated through 27 parameters. We present an analysis of the system's performance. To achieve near real-time operation, we use hardware acceleration in the form of a GPU implementation. For the evaluation, we have created a dataset using the SoftKinetic
sensor. The behaviour of the hand tracking system is analysed on this dataset, which includes several different hand gestures. Besides the quantitative evaluation of our system, we compare it with other commercial systems. We show that the system works for simple gestures such as waving and grasping, but a fully working system appears unattainable in cases where occlusions of either the fingers or the palm occur. Some gestures, in particular those in which occlusions arise, cannot be accurately tracked by the method.
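The summary does not spell out how the 27 parameters are partitioned. A common decomposition in the hand tracking literature is a global translation, a global orientation, and per-finger joint angles; the sketch below illustrates one such hypothetical layout (3 translation + 4 quaternion + 20 joint angles, 4 per finger). The actual model in this work may partition the parameters differently.

```python
import numpy as np

# Hypothetical 27-parameter hand pose vector (an assumption, not the
# thesis's exact parameterisation):
#   3  global translation (x, y, z)
#   4  global orientation as a unit quaternion (w, x, y, z)
#   20 joint angles: 4 per finger for 5 fingers
#      (2 DOF at the base joint, 1 DOF at each of the two distal joints)
N_PARAMS = 3 + 4 + 20  # = 27

def split_pose(pose):
    """Split a flat 27-vector into translation, orientation, joint angles."""
    pose = np.asarray(pose, dtype=float)
    assert pose.shape == (N_PARAMS,)
    translation = pose[0:3]
    quaternion = pose[3:7]
    quaternion = quaternion / np.linalg.norm(quaternion)  # renormalise
    joint_angles = pose[7:27].reshape(5, 4)  # one row per finger
    return translation, quaternion, joint_angles

# Neutral pose: hand at the origin, identity rotation, fingers extended.
neutral = np.zeros(N_PARAMS)
neutral[3] = 1.0  # identity quaternion (w = 1)
t, q, angles = split_pose(neutral)
```

A tracker would then optimise this 27-vector per frame so that the rendered hand model best matches the observed depth image.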