Thursday, April 19, 2012

[update 20 Apr: clarify middle finger emulation]
One of the features added to the upcoming xf86-input-synaptics version 1.6 is support for clickpad-style devices. This post outlines what clickpads are, how they are supported and how you go about enabling the new features.

What are clickpads?

The name ClickPad comes from the Synaptics product of the same name. It describes a touchpad that does not have physical buttons. Instead, the whole touchpad works as a button. Devices like this have been around for a while, most notably the touchpads found in Apple laptops and the Dell Mini 10 series.
The challenge for us was to handle the data streams from these devices correctly. Most of this work was done by Chase Douglas.

Clickpads give us the position of each finger, and a button 1 event when the pad is pressed. The design of the hardware, however, means that whenever you click the pad, you always add one more finger to the mix. The challenge is decoupling that finger from the ones that actually matter, and integrating all this with the other wacky features the synaptics driver currently has.

Clickpad Support

Clickpad support requires server 1.12 [0], as it relies heavily on multitouch support.
Central to the new feature is the "ClickPad" option and property. It is enabled automatically if the kernel sets the INPUT_PROP_BUTTONPAD property on the device; otherwise you can enable it in an xorg.conf.d snippet or at runtime with either xinput or synclient:
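A sketch of the three options (the device name "SynPS/2 Synaptics TouchPad" is an example; check xinput list for the actual name on your system):

```
# at runtime, with synclient:
synclient ClickPad=1

# or at runtime, with xinput:
xinput set-prop "SynPS/2 Synaptics TouchPad" "Synaptics ClickPad" 1
```

Or persistently, in an xorg.conf.d snippet:

```
Section "InputClass"
        Identifier "Enable clickpad"
        MatchDriver "synaptics"
        Option "ClickPad" "true"
EndSection
```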

This toggles a few driver behaviours to make the clickpad much more usable. Most notably, you can use one finger to press the button and another finger to move the cursor around.

Word of warning here: if you enable clickpad support manually at runtime, you also have to manually disable traditional middle mouse button emulation (synclient EmulateMidButtonTime=0). For autoconfigured devices or xorg.conf.d-configured devices this is done automatically.

The second big feature for clickpads is support for software buttons. Since the device only gives us left button clicks, we expose an option to allow for right and middle button clicks. By default, we ship this xorg.conf.d snippet:
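Reconstructed to match the description below (the exact values shipped with your distribution may differ slightly):

```
Section "InputClass"
        Identifier "Default clickpad buttons"
        MatchDriver "synaptics"
        Option "SoftButtonAreas" "50% 0 82% 0 0 0 0 0"
EndSection
```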

The order of the soft button edges is left, right, top, bottom for the right button, then left, right, top, bottom for the middle button. So the above snippet sets the right half of the bottom 18% to act as the right button, with no middle button present. A coordinate of 0 means "stretch to the edge of the touchpad". [1]
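As an illustration of the eight-value order, a hypothetical configuration that also puts a middle button in the left half of the same bottom strip could look like this (set at runtime here; the same value works as an Option in xorg.conf.d):

```
# right button: right half of the bottom 18%
# middle button: left half of the bottom 18%
synclient SoftButtonAreas="50% 0 82% 0 0 50% 82% 0"
```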

Traditional middle mouse button emulation refers to the method of generating a button 2 press when both button 1 and button 3 are pressed simultaneously. It is disabled by default and we don't handle it for clickpads. Mostly because, well, I'm not sure how you would even trigger it for clickpads, given that all buttons except the first are already emulated anyway. You can still generate middle button events through clickfingers (see below), tapping, or the soft button areas as described above.
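For example, one way to get middle button events on a clickpad is via tapping. TapButton2 is an existing synaptics option whose value is the button number to generate for a two-finger tap:

```
# a two-finger tap generates a middle (button 2) click
synclient TapButton2=2
```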

Tapping works as on any other touchpad.

ClickFinger functionality works too, albeit slightly differently. With the default ClickFinger settings, if you have two fingers on the touchpad and you press the left button, a ClickFinger2 action is performed (right button by default). On clickpads we guess the number of fingers by proximity, since you need one finger to actually press the button. Fingers closer together than 30% of the touchpad's size [2] count towards ClickFinger actions; others are skipped. So in the default use-case, where you have two fingers resting in the middle of the touchpad and use a third to click the button, you still get a ClickFinger2 action. Likewise, if you press the pad down with two fingers, you also get a ClickFinger2 action.
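The ClickFinger actions are configurable via the existing ClickFinger1/2/3 options, where the value is the button number to generate. A sketch (the defaults on your system may differ from what these lines assume):

```
# two-finger click -> middle button, three-finger click -> right button
synclient ClickFinger2=2 ClickFinger3=3
```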

All other functions are not affected by clickpad support and behave as before.

[0] Ubuntu 12.04 will ship a 1.11/1.12 mix, which will work as well.

[1] "edge of touchpad" should be 100%, but it isn't due to a years-old mishandling of the coordinate ranges. So 0 is the better value to use here.

[2] All touchpads lie about their size, so we just hardcode 30%; anything we infer is going to be wrong anyway.

Tuesday, April 10, 2012

Short summary: layout of the blog changed.
Long summary: One of the issues I ran into repeatedly was that any code postings would overrun the rather narrow column width. Now I've done the lazy thing and applied a new standard template that appears to be wider. So in the future, code snippets are hopefully easier to read.

Sunday, April 8, 2012

Ok, I was away on holidays for the last week so I missed everything. Thanks to Simon Thum, I found the Chrome team's April Fools' joke: Chrome's Multitask Mode. Allegedly a mode for Chrome that allows you to use multiple pointers simultaneously.

Great, except that it's just an April Fools' joke; they didn't actually implement it. Even though, well, it's been a standard feature in the X.Org X server since September 2009. Yep, right. Fedora 12 had it, for example, and so has every version since. It's ubiquitous on the GNU/Linux desktop. GTK3 has support for it.

Being able to use two hands is quite useful; research papers on the benefits go back well over 20 years. Use cases generally fall into dominant hand/non-dominant hand methods (e.g. an artist holding the palette with one hand while painting with the other) or equal bimanual interaction (e.g. aligning a rectangular mask around an object). None of this is new, and as I said above, it has all been a standard feature in X since 2009. You really just [1] need to add it in your application. So Chrome's April Fools' joke is pretty much a joke about not implementing a feature. Which, well, uhm... ok. Haha.

The video is a bit outlandish, with a guy playing golf and a shooter simultaneously. However, listening to just the audio of the video largely makes sense (except the switching-off-your-computer part). Using two mice became natural for me quite quickly. I even conducted a user study where users used two browser windows simultaneously to research and gather information about a specific topic. Yep, they could do it, including typing up the results in a decidedly single-user AbiWord without major conflicts.

Now I'm waiting for next year's joke, maybe it's about how we drive cars with, wait for it, steering wheels.

[1] I say "just", but implementing it is really hard. It opens a can of worms about assumptions in user interface paradigms that aren't quite that simple to deal with. From a technical point of view, it's a "just" though...