Abstract

Over the last decades, there have been numerous efforts in wearable computing
research to enable interactive textiles. Most work, however, focuses on
integrating sensors for planar touch gestures and thus does not fully exploit
the flexible, deformable, and tangible material properties of textiles. In this
work, we
introduce SmartSleeve, a deformable textile sensor, which can sense both surface
and deformation gestures in real-time. It expands the gesture vocabulary with a
range of expressive interaction techniques, and we explore new opportunities
enabled by advanced deformation gestures such as Twirl, Twist, Fold, Push, and
Stretch. We
describe our sensor design, hardware implementation and its novel non-rigid
connector architecture. We provide a detailed description of our hybrid gesture
detection pipeline that uses learning-based algorithms and heuristics to enable
real-time gesture detection and tracking. Its modular architecture allows us to
derive new gestures by combining them with continuous properties such as
pressure, location, and direction. Finally, we report promising results from
our evaluations, which demonstrate real-time classification.
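The hybrid pipeline described above can be sketched as follows. This is a minimal illustrative example, not the paper's actual implementation: all names, thresholds, features, and templates here are hypothetical assumptions. It shows the general idea of a cheap heuristic stage handling unambiguous surface input while a learned classifier (here a stand-in 1-nearest-neighbour matcher) handles deformation gestures, with continuous properties such as pressure attached to the discrete label.

```python
# Hypothetical sketch of a hybrid gesture-detection pipeline in the spirit of
# SmartSleeve. Heuristics resolve unambiguous surface gestures first; a
# nearest-neighbour matcher (a stand-in for the learned model) classifies
# deformation gestures. Thresholds, features, and templates are illustrative.
import math

def heuristic_stage(frame):
    """Cheap rule for clearly planar input: one light contact is a Touch."""
    if frame["num_contacts"] == 1 and frame["max_pressure"] < 0.2:
        return "Touch"
    return None  # defer to the learned stage

def nearest_neighbour(features, templates):
    """1-NN over hand-crafted deformation features (stand-in classifier)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda t: dist(features, t[0]))[1]

# Illustrative per-gesture feature templates (invented values).
TEMPLATES = [
    ((0.9, 0.1, 0.0), "Fold"),
    ((0.1, 0.9, 0.2), "Twist"),
    ((0.2, 0.3, 0.9), "Stretch"),
]

def classify(frame):
    label = heuristic_stage(frame)
    if label is None:
        label = nearest_neighbour(frame["deform_features"], TEMPLATES)
    # Attach a continuous property (here pressure) to the discrete label,
    # so richer, parameterised gestures can be derived from the combination.
    return label, frame["max_pressure"]

frame = {"num_contacts": 4, "max_pressure": 0.8,
         "deform_features": (0.85, 0.15, 0.05)}
print(classify(frame))  # heuristics defer; 1-NN picks the closest template
```

A per-frame design like this keeps latency low: the heuristic stage short-circuits common cases, and the classifier only runs when needed, which is one way a modular real-time pipeline can be organised.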