"For two decades, X has been the foundation for Linux graphics. Ubuntu's decision late in 2010 to switch to Wayland shakes things up all the way to those roots. Just over a month ago, the official 1.0.0 release of Wayland appeared, as well as its associated Weston project. How will these milestones affect working GUI programmers? What will happen to all the existing toolkits - Qt, wxWindows, Tk, and others - on which so many graphical applications already depend?"

Many ARM GPUs have an open-source driver, a reverse-engineered driver, or at least an OpenGL ES 2 interface.
Also, the new Intel CPUs with Haswell GPUs remove the need for an Nvidia card (on laptops) and at the same time come with open-source drivers.

You beat me to it. This is the sore spot with switching to ARM as a production platform, at least for me. There are fully open alternatives to ARM like the Loongson platform, but I don't know if the performance is up to par.

Hmm, fewer crashes? What makes you think so?
If the crashes are caused by the drivers, then new usages of the drivers will tend to create *more* crashes not less (at least initially).

'Without glitches' is true, but it's a tradeoff: I believe resizing a window can be jerky/laggy if the program doesn't send its frames quickly enough, whereas with X it could still be smooth, if a bit ugly.

Android was the first major Linux-based client OS to ditch X11 in favor of a direct rendering manager (SurfaceFlinger). Now that Wayland is inching its way toward adoption by Ubuntu and Fedora, what does this mean for the future of Android and SurfaceFlinger?

If Google has any intention of pushing Android upmarket into the productivity and workstation segments, then they may want a presentation framework which natively supports composited client windows. I suppose that ChromeOS, with its new Aura window manager on X11, would be the most likely venue for Google to introduce Wayland, if they have any intention of doing so.

I'm no developer, but I wonder if it would even scale up to desktop use. It's designed for one fullscreen app at a time; no window management would be possible without heavy modification.

And let's not forget what die-hard X fanatics would miss the most: remote X sessions. I've used them in the past from time to time, but there are some people for whom that feature is a necessity.

I may be wrong, but I think the best solution is to clean up and modernize X rather than shoehorn a mobile framework onto desktops. I don't have a problem with Wayland either, but I'm interested to see where the various distros go.

I'm no Mac expert so I don't know what they are up to. The basic concept is to download TTF fonts (probably preprocessed) into the GPU. You then give the GPU a string of text and a rectangle in your coordinate system. As that rectangle is transformed by the display system the GPU would generate the best text possible (including subpixel rendering) without application involvement.

This is not giving the GPU a texture with the glyphs on it and then scaling. The GPU has the equations for the fonts, transforms those equations and then draws with subpixel antialiasing.

The problem being addressed is that apps can't do the antialiasing themselves if the window is going to be transformed in any way. Slightly transform the app's antialiased window and all of the antialiasing gets broken. We need a scheme where the drawing system does the transformation-based antialiasing without application interaction.
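The closest shipped technique to what's described here is probably signed-distance-field glyph rendering (popularized by Valve): the GPU stores a distance field per glyph and recomputes edge coverage per pixel *after* the transform, so the antialiasing is regenerated at the final resolution rather than baked in. A minimal sketch of the per-pixel coverage step (the `smoothing` width is an illustrative choice, not from the original):

```python
def coverage(distance, smoothing=0.75):
    """Map a signed distance to the glyph outline (in pixels, positive
    inside the glyph) to an alpha value. Because the distance field is
    sampled after the window transform, the antialiased edge is
    re-evaluated at whatever scale/rotation the compositor applies."""
    t = (distance + smoothing) / (2.0 * smoothing)
    return max(0.0, min(1.0, t))
```

Pixels well inside the glyph get alpha 1.0, pixels well outside get 0.0, and pixels near the outline get a smooth ramp, which is exactly the part that breaks when a pre-antialiased bitmap is transformed instead.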

There have been about ten papers written on this subject, but I don't know of anyone who has deployed it.

Reading "One technology that hasn't happened yet is GPU based glyph generation. When that happens it will likely have a large impact on the desktop" reminded me about Matrox Parhelia - though it was much simpler ("Glyph acceleration, where anti-aliasing of text was accelerated by the hardware" from http://en.wikipedia.org/wiki/Matrox_Parhelia )

But still, what "large impact on the desktop" do you envision?

I don't know what he meant by it, but glyph acceleration would mean higher-quality text rendering even when the text is in motion (e.g. during scrolling, or various window transformation effects), leading to a smoother experience. Also, rendering text is a somewhat heavy task, and it only becomes heavier with ever-increasing display resolutions, so a decrease in resource usage is always welcome.

There was a great post detailing why exactly X.org sucks, why it holds the Linux desktop back big time, and why it needs to be replaced by something better. Too bad it was on the linuxhaters blog and nobody is aware it exists. Here it is if you want to read it: http://linuxhaters.blogspot.gr/2008/06/nitty-gritty-shit-on-open-so... (just don't read the comments; it feels like having your head drilled with a cheap drill)

Long story short, support for even basic stuff invented 15 years ago, like pbuffers (at SGI, 1997), is missing from X.org. Nvidia had to replace a third of X to get their closed drivers working and fully supporting OpenGL. I think X.org has played a MAJOR role in the flamewars surrounding Linux.
User with a Nvidia card: Linux is good
User with ATI/Intel: WTF is this crap!

GPU glyphs are the final piece needed to allow apps to draw resolution independently. Currently apps are forced to query the display resolution and do all kinds of calculations involving anti-aliasing. All of that platform specific code would disappear.

HTML is an example of something that is partially resolution independent.
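To make the "query the display resolution" point concrete: today each app converts physical sizes to device pixels itself, so the same 12 pt glyph needs a different pixel grid on every display. A sketch of that per-app calculation (illustrative, not from the original):

```python
def points_to_pixels(size_pt, dpi):
    # 1 typographic point = 1/72 inch, so the pixel size of a glyph
    # scales with the display's DPI -- this is the platform-specific
    # arithmetic that resolution-independent drawing would eliminate.
    return size_pt * dpi / 72.0

# the same 12 pt glyph: 16 px on a 96 DPI panel, 32 px at 192 DPI
```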

Uh, you don't need hardware acceleration for resolution independence; you just need the WM and all the toolkits designed for it. Currently they aren't designed top-down for that, and GPU-generated glyphs won't magically fix them.

Maybe not yet - autonomously driving cars are sort of a quite different concept, interface-wise.

Additionally, computerised dashboards are often quite horrible; there's room for a lot of improvement there (they're often so bad that I wouldn't mind Apple focusing on the area and providing an integrated solution for auto makers; with the influence of the big & lavish US market on car designs, everything would hopefully improve)

The problem with evolving car interfaces has to do with the inherent advantages of a physical interface. Knobs and buttons in a car are a huge advantage due to the physical feedback, which means not needing to take your eyes off the road. I've seen cars with touch-screen based interfaces, such as the Tesla Motors Model S (http://www.teslamotors.com/models), and I cringe at the prospect of having to take my eyes off the road to do something as simple as changing the AC temperature.

That's what I meant: the problem is that many manufacturers are going in the direction of (poorly implemented) touchscreens. So maybe we need a company like Apple (not afraid to go against the trends) to stop that - they would likely improve on what does and/or can work (one of the old iPods does have a fully physical, mechanical clickwheel; then there's Siri...)

The real question here is whether Wayland will truly be backward compatible, or whether its developer community will say "tough luck, sod off" to the earlier X user base, as those for Win 8 UI and Unity have.

Given that the Wayland developers have worked on XWayland very quickly, I don't think this is much of a question.