In looking at the Ubuntu xPhone's desktop, and comparing its behavior to Apple and Android xPhone/xPad, are the subsystem components the user interacts with made possible by a window manager (WM), or made possible via a UI app running in a WM?

And, what approach could be used to bring a similar UI to Puppyland for x86 (32/64-bit) touch displays, where the "swiping" kind of activity would be possible?

Any information would be useful.

Edited: for better clarity
_________________
Get ACTIVE Create Circles; Do those good things which benefit people's needs!
We are all related ... It's time to show that we know this!
3 Different Puppy Search Engines or use DogPile
Last edited by gcmartin on Fri 11 Jan 2013, 00:49; edited 2 times in total


From what I have read, it's a combination of both, plus additional per-program code. The WM has to have a layout that works with a touch interface, but most of the actual touch handling is done by the system, which treats the screen as a user-input device like a keyboard or mouse. The drivers, working alongside the kernel, translate the physical input into events that can be understood by the WM. These are then sent to the WM, which in turn does whatever you told it to do.
But that's not the end: individual programs also have to be modified to accept the additional types of input; otherwise a program, say a web browser, will not understand the input it's getting. Some things can be translated simply (a sweeping motion up or down = a scroll-wheel action); others cannot. Multi-touch is where this becomes a rather large issue, since the system has to be able to properly parse the input and turn it into something a program can use, and at the same time programs must be able to manage and deal with that input as well.

At the simplest level a touch screen can operate as a simple click like a Mouse, but its the more complex touch interface interactions where things start to get complicated.
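To make that "sweep up/down = scroll wheel" translation concrete, here is a minimal sketch. All names and the pixels-per-tick value are illustrative assumptions, not anything from Puppy or X; a real system would emit these ticks as button events through the X server rather than return them from a function:

```python
# Sketch: turn a vertical swipe (a series of touch Y samples) into
# discrete scroll-wheel "ticks". Illustrative only.

SCROLL_STEP = 30  # pixels of finger travel per wheel tick (arbitrary)

def swipe_to_scroll(y_positions, step=SCROLL_STEP):
    """Convert touch Y samples into signed wheel ticks.
    Positive ticks = finger moved down, negative = finger moved up."""
    if len(y_positions) < 2:
        return 0
    travel = y_positions[-1] - y_positions[0]  # total vertical movement
    return travel // step  # whole wheel ticks covered by the swipe

# A finger dragged 95 px downward produces 3 ticks:
ticks = swipe_to_scroll([100, 130, 160, 195])
```

This is exactly the kind of "simple translation" mentioned above; a pinch-to-zoom gesture has no such one-line mapping, which is why multi-touch needs cooperation from the applications themselves.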

About 3 weeks ago, a colleague showed some advances he had made with Android-x86 on his HP Touchsmart.

Impressive. But I didn't get to cover much at that time. When I saw the Ubuntu xPhone last week, I wondered again how the hand gestures on a touch screen were carried out, and which system/subsystem components were involved in controlling and carrying out user requests with the apps that sit on its desktop.

I wonder how many of us have seen this in real life, too. There are several YouTube videos and howtos demoing this.

So that my last post doesn't appear too vague, here are a couple of more specific concerns:
Will the trek toward more Linuxes on touch devices start from additional components added to the base OS? Does the base OS, today, have the chipset support to bring the screen interaction into the system for window-manager use?

And is the window manager used by Android (mentioned earlier) capable of being "dropped" into a Linux such that applications can be manipulated in much the same way as they are in Android-x86? If so, did Ubuntu do something like this?

Just trying to understand how this is actually done.


Touch support is already in the kernel, but it doesn't cover all the hardware options out there. In fact, it's been in the kernel since the 2.6.30 days, but hardly anyone made use of it.
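To see what "in the kernel" means here: touch hardware shows up through the same evdev interface as keyboards and mice, delivering fixed-size input_event records on /dev/input/event*. A minimal sketch of decoding one record follows; it uses a synthetically packed record rather than real hardware, and assumes the 64-bit Linux struct layout (two longs for the timestamp, then type, code, value):

```python
# Sketch: decode a raw evdev input_event record, the format the kernel
# uses on /dev/input/event* devices (64-bit layout assumed).
import struct

EVENT_FORMAT = "llHHi"   # timeval (sec, usec), u16 type, u16 code, s32 value
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)

# A couple of event codes from <linux/input-event-codes.h>:
EV_ABS = 0x03             # absolute-axis event
ABS_MT_POSITION_X = 0x35  # multi-touch X coordinate

def decode_event(raw):
    """Unpack one raw input_event record into (type, code, value)."""
    _sec, _usec, ev_type, ev_code, value = struct.unpack(EVENT_FORMAT, raw)
    return ev_type, ev_code, value

# Simulated record: a multi-touch X position report at x=512.
packet = struct.pack(EVENT_FORMAT, 0, 0, EV_ABS, ABS_MT_POSITION_X, 512)
assert decode_event(packet) == (EV_ABS, ABS_MT_POSITION_X, 512)
```

On a real system you would read EVENT_SIZE bytes at a time from an opened /dev/input/eventN file; the point is just that the kernel already speaks multi-touch at this layer, whether or not the WM above it listens.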
So to go from a regular WM to a touch WM, you'd need to take several steps.

1) Make sure your chosen hardware is driver-compatible with what's in the kernel.
If it's not, find open-source drivers for it (good luck), or write your own.

2) Configure your system for the types of touch input you want to use. This means configuring the device input options for how the driver will read the sensors in the screen; you need to decide what will be what.

3) Redesign your WM to take touch control into account. Fingers work differently than mice, so visually things will need to be a tad different.

4) Edit whatever programs, in whatever way you want, so that they can take advantage of your new input gestures.
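For step 2, assuming an X-based Puppy with the xf86-input-evdev driver installed, a common starting point is an Xorg InputClass rule that binds any touchscreen to the evdev driver (the commented calibration values are placeholders you would measure for your own panel):

```
Section "InputClass"
    Identifier "touchscreen defaults"
    MatchIsTouchscreen "on"
    Driver "evdev"
    # Per-device calibration, measured with a tool such as xinput_calibrator:
    # Option "Calibration" "min-x max-x min-y max-y"
EndSection
```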

What Google has done with Android honestly isn't very helpful. Yes, Android is Linux, but only as far as the kernel: Android is mostly a Java stack running on top of the Linux kernel, so it isn't much help to real Linux distros. Android doesn't run any form of the X environment, so the usability of Google's work drops to almost nil.

Now, toss out the WM on your computer, code a Java WM, and then you would be able to port some of Google's code over. But you won't get very far, because then none of your common Linux apps will want to run, since they expect to run in X and not in a Java GUI.

Now, once Canonical releases their image for the Ubuntu phones, everyone can have a look at what they've done. I don't see them using X on a phone because of processor demand, so they will probably go the same route and just run a Java stack on top of the kernel, but who knows. Canonical has the money and devs to throw at this; anything is possible.
