
The Wayland Situation: Facts About X vs. Wayland

With the continued speculation and FUD about the future of Wayland at a time when Canonical is investing heavily into their own Mir Display Server alternative, Eric Griffith and Daniel Stone have written an article for Phoronix where they lay out all the facts. The "Wayland Situation" is explained by first going over the failings of X, the fixes in Wayland, common misconceptions about X and Wayland, and then a few other advantages of Wayland. For anyone interested in X/Wayland or the Linux desktop at a technical level, it's an article certainly worth reading!

Could you explain how X and Wayland use the graphics card? I mean... in the article you say that Wayland receives buffers and instructions on how to display them.
What sort of information will Wayland receive to display the buffers? For hybrid graphics, who decides which graphics card has to be used? Will there be a sort of native Optimus (use the dedicated card if the integrated one is busy, or the like)?

Don't have a lot of time at the moment, so I'm gonna skip over the first question (for now). As to the Optimus question... multiple GPUs are, unfortunately, a client problem. It's not necessarily FAIR for clients to have to do all the heavy lifting in that regard, but they know best what to do and how much power they are going to draw.

By "client" I could mean a few things. For example, the driver could decide based upon load -- which I personally don't think is the best way. Or, the method I prefer: if you code your app in GTK+, EFL, or Qt, I think those toolkits should provide a window hint or similar interface where applications can register "I would like to run on the $powersave GPU" or "I would like to run on the $performance GPU."

As I touched upon in the article... Wayland is meant to be the most minimal way to do things possible. It's not supposed to do the heavy lifting, because since Wayland 1.0 it has promised, at minimum, API compatibility (I think ABI is guaranteed through major versions, but I'd have to double-check that). So the more things we load it up with, such as deciding which GPU an app runs on, the more things we are FORCED to keep around basically forever -- so if we get it wrong... we're screwed.

EDIT: granted, the entire protocol IS versioned, so we wouldn't be -completely- screwed, but it's still legacy cruft we'd have to drag around. And do we really want that?

Question 1:
I assume you'll need nVidia and Intel updating their drivers for Wayland. Is there any news on the subject? Last time I heard, no hardware vendor had planned any Wayland-related work.

Mir, Wayland, and SurfaceFlinger all require an EGL driver. That being said, there is one non-standard extension to EGL that Wayland does want/require. As long as Intel, nVidia, and AMD all have an EGL stack, they should -- to my knowledge -- work just fine across all three, with the small exception that Wayland wants that extra non-standard extension. I think it's buffer_age, but I'd have to double-check that as well.

With the advent of Steam for Linux and the non-stop news about upcoming games for it, I'd like to ask a question which might be interesting for all the "gamers" using Linux: does Wayland have any direct impact on playing games on Linux, such as input lag (which people seem to complain about), performance improvements, gamepad support, and so on? Or will there be next to no difference from the current situation?

In layman's terms -- are their drivers set to be ready to work with Wayland today? In a year?

Open source drivers should all be gold on Wayland -- I know Intel's is, because that's what's in my laptop.

AMD and nVidia... we're gonna have to wait and see how they want to play. If they want to say "Screw Wayland! Go Mir!" they could; they'd have to explicitly refuse to support the EGL extension that Wayland requires. But at the same time, Wayland might be able to work around that missing extension should that day come to pass.

Ideally, in a year, the open source Radeon and Intel graphics drivers should be basically up to par with their Windows brethren (maybe not in OpenGL compliance, but hopefully in power management for Radeon), and we therefore wouldn't NEED the vendors to expressly support Wayland. nVidia is in an interesting position, though, with how crappy (in comparison) Nouveau is.