Gestures

The most common method of user interaction on Harmattan devices is the touch screen, which supports multipoint touch events. The device detects different types of touch events, called gestures, that you can use in your application.

In Harmattan applications, basic touch events such as tapping a Button are handled automatically. For example, you do not need to explicitly set the Button area to listen to tap events. However, for more advanced gestures, such as swipes, or for multipoint touch gestures, such as pinches and rotations, you need to implement specific listeners or UI elements.

Qt Quick applications can use the Flickable QML element to listen to swipe gestures and the PinchArea QML element to listen to multipoint touch gestures. For example, the following code snippet creates a full-screen rectangular PinchArea containing a text string that reacts to rotate and to pinch or spread gestures. All state changes are logged to the console.
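The original snippet is not reproduced here, so the following is a minimal sketch of such a PinchArea example, assuming QtQuick 1.1 (the version shipped with Harmattan); the `label` id, the window size, and the rotation and scale limits are illustrative choices, not values from the original:

```qml
import QtQuick 1.1

Rectangle {
    width: 360
    height: 640
    color: "white"

    PinchArea {
        // Fill the whole rectangle so pinch gestures work anywhere on screen
        anchors.fill: parent

        // Apply rotation and scale from the gesture directly to the text item
        pinch.target: label
        pinch.minimumRotation: -360
        pinch.maximumRotation: 360
        pinch.minimumScale: 0.5
        pinch.maximumScale: 4.0

        // Log each state change of the gesture to the console
        onPinchStarted: console.log("Pinch started")
        onPinchUpdated: console.log("Rotation: " + pinch.rotation +
                                    " scale: " + pinch.scale)
        onPinchFinished: console.log("Pinch finished")
    }

    Text {
        id: label
        anchors.centerIn: parent
        text: "Pinch or rotate me"
        font.pixelSize: 30
    }
}
```

Setting `pinch.target` lets PinchArea update the target item's `rotation` and `scale` properties automatically; alternatively, you can leave it unset and react to the `pinch` event parameter in the handlers yourself.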