Abstract

This paper introduces a novel and efficient segmentation
method for articulated hand motion. The method is based on
a graph representation of temporal structures in human hand-object interaction.
Along with the temporal segmentation method, we provide
an extensive new database of hand motions. Experiments performed
on this data set show that our method is capable of fully automatic hand
motion segmentation that largely coincides with human annotation.