Posted by Unknown Lamer on Thursday October 25, 2012 @08:22PM
from the just-run-windowed dept.

jones_supa writes "The SDL developers Ryan Gordon and Sam Lantinga have proposed a window manager change to sort out the full-screen X11 window mess, primarily for games. The proposal is to come up with a _NET_WM_STATE_FULLSCREEN_EXCLUSIVE window manager hint that addresses the shortcomings of the full-screen hint currently used by most games, _NET_WM_STATE_FULLSCREEN. Ryan and Sam have already worked out an initial patch for SDL, but they haven't tried hooking it up to any window manager yet. For those interested in the details, information is available in this mailing list message. One of the key changes is that software would ask the window manager to change the resolution, rather than tapping RandR or XVidMode directly. Martin Gräßlin of KDE was rather wary about the patch and said that games changing the resolution just tend to mess up the desktop."
Seems like a reasonable idea, given a bit of time to mature as a spec. In KDE's case, a separate daemon from the window manager handles resolution changes so going through the WM would add complexity, and the plasma shell still has no way to realize that it shouldn't reflow the desktop widgets. Setting window properties seems like a sensible IPC method for communicating intent though (without making yet another aspect of the X desktop reliant upon the not-very-network-transparent dbus): "hey, I need to resize, but just for me so don't reshuffle the desktop and docks."

The desktop doesn't know what caused the changes, so you could run into a lot of strange issues. Imagine you lay out your desktop on a 30" 2560x1440 monitor, then switch to a 1920x1080 monitor and add/remove/move an icon. What happens when you reattach the first monitor? Should everything just "snap back" to its old places, even if you'd since arranged your icons completely differently? To me the solution outlined here seems much smarter: just let the game have its own "screen" with its own settings, no need to even tell the other windows about it.

This is the exact purpose of this proposal: to create a new signal that tells the window manager that the change is temporary and only takes effect while a specific window has focus. This way the window manager knows there's no need even to move the icons in the first place.

Why don't games just spawn a separate X11 window server instance with a different resolution on a separate VC? Adding proper resource sharing between X11 instances seems like it would be a lot easier to do than rearchitecting all the existing apps to do the right thing during a temporary resolution change.

And there's no benefit to a full-screen app running in the same X11 instance as any other app other than making it possible to transition a window from being a normal window to a full screen window and back, and with a resolution change, that won't work very well anyway, which makes even that argument mostly moot.

Why the hell should the user suffer the extra resources eaten up by X because the damn paradigm is a big pile of hurt that goes back to the early days? I remember all the arrogance of X Windows back in NeXT's day, over decisions like Display PostScript. It's rather clear the NeXT design has always been superior, and OS X benefits from it.

You do have to admit, though, it's ironic as hell that everyone here complains about "Windows cruft," and yet here we are talking about an obviously creaky, backwards design going all the way back to the days of NeXT, and you have people defending it or coming up with sucky workarounds rather than just admitting it's broken and probably needs replacing after all these years.

I mean c'mon guys, X11 has had a good run but it should probably be in the same group as Gopher and Telnet, things you can install when you need legacy support, not something everyone is depending on.

I mean c'mon guys, X11 has had a good run but it should probably be in the same group as Gopher and Telnet,

Aw geez not this crap again.

Why, whenever anything new comes up, does a shrill group of people start shrieking omg omg x11 is so old omg omg scrap it omg omg we can't possibly make a minor tweak to fix a minor problem omg omg omg legacy omg omg omg bloat omg oh the legacy omg won't someone please THINK OF THE CHILDREN omg legacy.

Without ever stopping to *THINK*.

Just stop and think. Not about X11, but about any GUI system.

The GUI runs at the monitor's maximum resolution. Things like windows are spread out over the whole area, as perhaps are icons, widgets etc.

If the user reduces resolution, a common thing to do is to move all the windows into the new area, otherwise they may become inaccessible.

So far so good. Nothing specific about X11 in there.

Any good system will have a protocol or API for changing resolutions, so that third-party resolution-changing programs can be written.

So far so good.

But in some cases you don't want to rearrange the windows because the resolution change is temporary, so you need to have an extra flag which tells the system that it's temporary and not to bother.

OK, still nothing about X11 in there.

Now this is a proposal to add such a flag using a mechanism for adding such flags which has been standardised since 1985. And it will work smoothly and be completely backwards compatible.

IOW the design of X11 is ideal for this kind of change and it shows how solid the underlying design is.

Nothing breaks. No need for a ChangeResolution and ChangeResolution2 API, no need to deprecate the old API, no need to break anything.

Seriously, if you scrapped the entire GUI and rendering system whenever a minor tweak is needed, you'd never get anywhere.

Except that we're no longer in the era of CRTs. Since LCDs have one native resolution, they should always be driven at that resolution. If a game wants to run at 640x480, then that should be accomplished by scaling it up and adding black bars, if necessary, but the signal to the LCD should still be at the original resolution.

If you don't trust your LCD to do it (I don't blame you; some LCDs are better at scaling than others), that sounds like something that should be done automatically and transparently by the video driver instead of something the WM should have to manage.
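That driver-side scaling amounts to an aspect-preserving fit with black bars, as described above. A minimal sketch of the arithmetic involved (the function name is illustrative, not any real driver API):

```python
def letterbox(game_w, game_h, screen_w, screen_h):
    """Fit a game's render size onto the native panel resolution while
    preserving aspect ratio; returns (scale, x_offset, y_offset, out_w,
    out_h), where the offsets are the black-bar margins."""
    scale = min(screen_w / game_w, screen_h / game_h)
    out_w, out_h = round(game_w * scale), round(game_h * scale)
    return (scale,
            (screen_w - out_w) // 2,
            (screen_h - out_h) // 2,
            out_w, out_h)

# 640x480 on a 1920x1080 panel: scaled 2.25x, pillarboxed with
# 240-pixel bars on each side
print(letterbox(640, 480, 1920, 1080))  # -> (2.25, 240, 0, 1440, 1080)
```

The panel keeps receiving its native signal; only the blit rectangle changes.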

It's not that I don't trust the LCD. It's that when you change the resolution, you tend to screw other things up as well.

I have three monitors and I game on the center one. I like to keep my email and IRC open on the other ones while I play. But if the game adjusts the resolution, the positions of the other windows move around and I can no longer see all of them. This happens in Windows if the game doesn't run at the same resolution as my center monitor.

Or even integer multiples/divisors of that resolution, ideally exposed via EDID. Since they divide evenly, the screen can do a simple, lossless point-sample scale, which is computationally cheap (compared to the common bilinear), and these low resolutions can go full screen with no added latency (the scaler chips inside most panels are slow). These are needed because desktops might be 2560x1600, but most GPUs won't run games well at that resolution.

NVIDIA's Windows drivers support scaling in the GPU too, but unfortunately
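The integer-divisor idea above is easy to enumerate. A small illustrative helper (not tied to any real EDID tooling) listing the modes reachable by exact point-sample scaling:

```python
def integer_scale_modes(native_w, native_h, min_w=640):
    """Modes that divide the native resolution exactly; each can be
    upscaled by point sampling with no interpolation blur."""
    modes = []
    factor = 2
    while native_w // factor >= min_w:
        if native_w % factor == 0 and native_h % factor == 0:
            modes.append((native_w // factor, native_h // factor))
        factor += 1
    return modes

# A 2560x1600 panel can losslessly display its exact divisors
print(integer_scale_modes(2560, 1600))  # -> [(1280, 800), (640, 400)]
```

Note how few such modes a given panel actually has, which is why games that insist on, say, 1024x768 still need some other scaling path.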

Why do game developers always assume that my computer doesn't have any other purpose except to play their game? I've got other stuff on this computer -- stuff that is more important than the games. My computer is my alarm clock, my calendar, and a communication tool, among other things. Games had darned well better stay in the window I put them in, or I won't be playing them.

Because not everybody wants to be annoyed by the rest of the UI when playing a game. Of course, when fullscreen is available it should be an option (and not the only way to play the game), but that isn't an excuse to completely get rid of it.

Exclusive fullscreen allows for better hardware/3D acceleration; windowed mode requires scaling and positioning of the contents of the window, often done in software because no one plays games in goddamn windowed mode anyway, and wastes even more resources drawing the desktop. It won't matter for Tux Racer, but for more advanced (read: 3D) games it's a big slowdown.

"Let's not fix the underlaying problem and come up with client-side work arounds"

Not what I heard. Sounded more like "why the heck are you making the problem more complicated than it needs to be?"

Honestly, I am not sure why a game of the full-screen sort would need X11 to begin with. Perhaps for portability. You wouldn't want to try and run games over remote X either, so why?

Assuming there are nonetheless plenty of reasons in practice to want to make that work (starting with 'lots of existing games that do require it' of course) then why not just set them to start their own exclusive server instance, tuned for that purpose?

If it's a game that's supposed to be running full screen and not interacting with a desktop, why then force a desktop to be part of the environment at all? Keep it simple.

All I can think of as a semi-valid reason is the use case of having a full-screen game on one monitor and email, web or whatever on the other. I do that at times on Linux and like it, and it pisses me off immensely that I cannot do that on MS Windows (e.g. Skyrim on one screen, web browser with game info on the other, only possible to get from one to the other with ALT-TAB, and it frequently crashes Windows 7 when I do that). However, even in that case it would run just as well if the full screen game is a c

With Linux finally becoming a more "proper" gaming platform (i.e. Steam and others), it's "about time" that this is dealt with. _NET_WM_STATE_FULLSCREEN_EXCLUSIVE, where have you been my whole adult life? Gotta hand it to Ryan Gordon ("Icculus," as I recall) for consistently making Linux gaming that much more viable.

I didn't see any information in the article, but what exactly is the problem with X11 full screen support? I don't game in Linux, and this is the first time I've even heard of this.

The biggest issue is that when the game goes full-screen, it changes the resolution to whatever the game is set to--which may or may not be what you keep your desktop at. Then, when you exit the game, the icons are usually huge, the taskbars are all messed up (even when "locked!"), and you have to futz around to make it usable again. Also, many games on Linux won't even let you Alt-Tab to other windows! Either nothing happens, or the resolution won't be correct, or the game crashes. It's really unpleasant to deal with. Also, it's worth noting that many games (especially Linux games, sadly) are extremely limited about what resolutions they'll let you use--so even if you want to set the game to your native resolution, it might not work or even let you try.

The article contained a lot of detail. The current mechanism is a two-step thing where the application first requests full-screen control from the WM. The WM then resizes the window to fit the current screen (which may not make the game happy), removes decorations, and then gets out of the way. Then the game changes the resolution and resizes the window again. The resolution change notification is delivered to the WM, which then propagates it to all of the applications, so if you want to play a fullscreen game at 640x480 then you may find that all of your application windows have resized themselves to fit in this screen when you exit it. The game then runs (hopefully) and if it crashes then in theory the WM is responsible for restoring the video mode, but in practice it doesn't know for certain that the game was responsible for changing it, so it may not.

With the new proposal, the game resizes its window to the resolution it wants and then requests full screen mode. The WM should then change the screen resolution to the closest to the current window size and doesn't propagate the resolution change notification to any other applications. This means that nothing else gets resized. If the game crashes, or if you do any kind of switching out of the game, then the desktop resolution will be restored.

And, while it's fashionable to hate X11, it's worth noting that Windows and OS X both suffer from some of the problems that this proposal is intended to fix.
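Under the proposed flow described above, the WM, not the game, picks the output mode: it selects the advertised resolution closest to the window's requested size. A toy sketch of that matching step (pure illustration; the actual proposal leaves the selection policy to the window manager):

```python
def closest_mode(win_w, win_h, modes):
    """Pick the advertised mode nearest the window's requested size
    (simple squared-distance metric; a real WM might also weigh
    aspect ratio and refresh rate)."""
    return min(modes, key=lambda m: (m[0] - win_w) ** 2 + (m[1] - win_h) ** 2)

# A typical mode list for a 1920x1080 panel
modes = [(1920, 1080), (1280, 720), (1024, 768), (800, 600), (640, 480)]
print(closest_mode(1280, 800, modes))  # -> (1280, 720)
```

The point of centralizing this in the WM is that it can also remember the previous mode and restore it when the game loses focus or crashes.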

Who is still running a CRT? Who wants any program to change the resolution of their screen?

This strikes me as the wrong solution to the problem: A program should instead request the "current monitor's resolution" (because there can be more than one!) set its display area to that size, and then tell the window manager to "fullscreen" it by removing title bar and border decorations and moving it to (0,0) of that monitor. But NEVER EVER RESIZE MY MONITORS. Thank you. The window manager should always be superior to the app, and one should always be able to manage the window (task switch, move to another desktop, etc) using the window manager, regardless of what the app thinks it is doing.

I agree, I have no idea why game windows are not handled better. It is basically impossible to run many, quite possibly most, games in a window. And even the ones that do allow it often require editing of files or hacking the exe. Theoretically the OS should be sent this visual data and, no matter how it was programmed, you could resize it/run it in a window.

Yes, theoretically, but in reality resizing stuff on the fly, particularly to odd, one-off resolutions to fit in a window, is a big performance sink - fine for some games, but there is a good chunk of the market where that isn't acceptable. Plus, for many games, moving the mouse to the edge of the screen actually has a specific meaning. It's not always straightforward to determine whether you mean to throw the mouse against the edge of the screen in-game or just mean to move the mouse out of the window to che

Someone whose graphics card isn't up to the task of running a game at full native resolution? That'd be my guess anyway; I haven't willingly used a lower resolution for a while. (Some games don't support high resolutions, or don't support widescreen resolutions, and there it's "reasonable" that they change it as well. But a program like that probably wouldn't use that in the first place, so whatever.)

The window manager should always be superior to the app, and one should always be able to manage the window (task switch, move to another desktop, etc) using the window manager, regardless of what the app thinks it is doing.

I don't know enough about this proposal to say how it interacts with this (indeed, I'm rather disappointed by both the summary and TFA not actually, you know, saying what the problems are in the first place), but there's absolutely no reason why those goals are in conflict. In fact, the proposal specifically addresses this: "If the window loses input focus while fullscreen, the Window Manager MUST revert the resolution change and iconify the window until it regains input focus. The Window Manager MUST protect desktop state (icon positions, geometry of other windows, etc) during resolution change, so that the state will be unchanged when the window ceases to be marked as fullscreen."

Someone whose graphics card isn't up to the task of running a game at full native resolution?

For the myriad of responses that brought up this point: the answer is video card hardware scaling. E.g. add a flag _NET_WM_STATE_SCALING_ALLOWED which directs the WM to use hardware scaling from a fixed-size framebuffer, as is done by video players. Not only can you make it full screen, but you can resize it to any arbitrary shape and size (e.g. don't cover your widget bar, etc). Then the Window Manager decides what is "fullscreen". It could even make an app span more than one monitor when "fullscreen", or just one.

For the myriad of responses that brought up this point: the answer is video card hardware scaling.

And this is exactly the solution that the Xbox 360 uses. A lot of newer games are fill-rate limited. Because of the complexity of the pixel shaders that games use, the AMD Xenos integrated GPU in the Xbox 360 (similar to a Radeon X1800) can't keep an acceptable frame rate at any resolution over 1024x576. So games use the Xenos's scaler to turn 1024x576 into 1280x720 or 1920x1080 pixels for the component or HDMI output.

This. I came here to say the same thing, but you already had. Every single modern graphics card is very efficient at scaling textures, and in fact, LCD scaling these days most often ends up happening on the GPU anyway. Don't touch my screen resolution. Ever. If the goal is to get better performance by rendering at a lower resolution, then render at a lower-resolution offscreen buffer and scale that up to the screen resolution.

I wish Wine had a mode that did this for Windows games that expect to change the screen resolution and don't play well with Xinerama. These days I end up using the "virtual desktop" wine mode with per-game settings and KDE's window override support to put it on the right display head and remove the borders, but it's a suboptimal manual solution. The Linux game situation is slightly better (they tend to be configurable to respect the current resolution and usually get the display head right), but still don't have scaling support.

Need inspiration? Do what video players (particularly mplayer) do. That is how fullscreen games should work.

This is exactly how some games work on Mac OS X, for instance Source-based games like Portal and Half-Life 2. They don't muck with the actual screen resolution, but just render into an offscreen buffer at whatever resolution and blit it stretched to the full screen. Switching from the game back to other apps doesn't disturb the desktop in any way.
Would definitely love to see more Linux games using this technique.

Modern games render to more than one off-screen buffer already (necessitated by HDR, deferred shading, and other fun things), only blitting and gamma-correcting the final bits to the screen's framebuffer at the very end.

The tiny amount of RAM occupied by the 8-bit framebuffer needed to accommodate a large screen resolution is dwarfed by these several framebuffers, some of which will use 16-bit components.

The amount of GPU work needed to draw a solid full-screen quad really is too trivial to care about.

I want to change the resolution, and I will tell you why: on a 32" monitor, I can't read the text unless it runs at 720p or even 800x600.

Actually, unless you have become totally inured to blocky, pixelated displays what you really want is for everything to be rendered larger.

Fortunately, many operating systems support resolution independence [wikipedia.org], which would allow you to keep your display at its high, native resolution and still draw your widgets, text, etc at a large size. This is done by changing the DPI hint in the OS so that it knows to render things larger (or smaller).

This approach would accomplish the overall effect you desire while avoi
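The arithmetic behind that DPI hint is simple; a small illustrative helper (the example panel size is an assumption, not something stated in the thread):

```python
import math

def dpi(width_px, height_px, diagonal_in):
    """Physical pixel density of a panel, from its pixel dimensions
    and its diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 30" 2560x1600 panel is roughly 100 DPI; telling the toolkit the
# real DPI lets it draw text larger with no mode change.
print(round(dpi(2560, 1600, 30)))  # -> 101
```

With the true DPI known, text set at "12 point" can be rendered at the correct physical size on any panel, crisp at native resolution.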

Disgusting. You should NEVER have to upscale images if you are doing things right. This is the zeroth rule of Image Manipulation. Might as well save it as a 50% JPEG too, because you're already losing most of the resolution and it's going to be a blurry mess.

Um... why would you want a blurry 640x480 stretched game? In addition to looking like shit, it would consume lots of extra GPU resources. Better to run the game at 960x540 (or some other exact divisor of the native resolution) and have X11/the GPU scale it up losslessly.

Not surprising, since you're ignoring the underlying problem. Your 2560x1600 desktop on that 30" LCD is going to kill the ability of your videocard to display a modern game at an acceptable frame rate. Many gamers will not accept windowed half-screen (or whatever fraction is required) gaming on their $1K LCD.

A program should instead request the "current monitor's resolution" (because there can be more than one!) set its display area to that size, and then tell the window manager to "fullscreen" it by removing title bar and border decorations and moving it to (0,0) of that monitor. But NEVER EVER RESIZE MY MONITORS.

No. Windows and OSX have figured this out. Linux window managers (at least one popular one) need to as well.

The window manager should always be superior to the app, and one should always be able to manage the window (task switch, move to another desktop, etc) using the window manager, regardless of what the app thinks it is doing.

Irrelevant to your desired scheme, where keyboard hotkeys would still be required. In Windows and OSX you can still task switch, move to another desktop, etc. using such hotkeys. Yet the game controls the resolution of the monitor in fullscreen mode.

Not surprising, since you're ignoring the underlying problem. Your 2560x1600 desktop on that 30" LCD is going to kill the ability of your videocard to display a modern game at an acceptable frame rate. Many gamers will not accept windowed half-screen (or whatever fraction is required) gaming on their $1K LCD.

No, you are missing his point. There is no reason the game could not run at a lower resolution and be scaled by the WM, instead of relying on the screen to do the rescaling. Only CRTs can rescale physically; LCDs end up doing it in their scaler anyway, and usually in a crappier manner than what the WM could do.

Unless your game uses OpenGL and you have a fully accelerated driver (read: the proprietary Catalyst or nVidia blob), it will not be able to scale fast enough. Most games use SDL and main memory surfaces that are then blitted to the screen. Any scaling is done in software by the CPU and is dreadfully slow. My Core i7 can handle 1680x1050@60, but just barely, with one core pegged to 100%. The cheapest GPU, of course, can handle that easily, but you must run proprietary drivers and use OpenGL. If you don't, r

Not me, but I want to; it's just impossible to find new, quality ones. Anyway, I still use low resolutions for old games, MAME, demos, etc. I still use an old KVM from Y2K that uses VGA, so not changing resolutions and keeping black bars doesn't work. :(

Lastly SOME games do NOT support arbitrary resolutions. I *want* them to fill my 22" monitor at whatever resolution they DO support. The *fonts* and the rest of the UI elements are BIGGER and easier to see when running in full-screen mode.

Likewise, games that *only* run in full-screen mode are BADLY DESIGNED.

The *proper* thing to do is to give the _user_ choice: Namely the 3 I listed above.

Hope this helps explains some of the issues and WHY this solution is The Right Thing.

Ever heard of 2880×1800 Retina displays? Like to play your games at 60 FPS? Well, as someone rocking one on a 15" monitor with a mid-to-low-end GPU, I frequently run into this issue under Linux. And let me tell you, the current system is terrible. Mac OS can do it, Windows can do it; if Linux wants to get competitive, it needs to fix this issue. Just a few days ago, I fired up Tux Racer for a friend to play. I think I had to reboot after that fiasco.

It's certainly a worthy goal to never need to change the monitor mode. However, I don't think we're quite there yet. Most games that rely on 3D acceleration cannot maintain the maximum frame rate at the maximum resolution supported by the monitor. Therefore, users need to be able to choose resolution to tune the game to their machine and preferences. Once frame rate is truly independent of mode, there should never be a need to reduce resolution.

What difference does it make who (the graphics card or the monitor) is doing the scaling?

Three big differences come to mind:

The graphics card probably has more precise pixel values (floating-point values instead of scaled integers per color channel), so even if the hardware scaling algorithms in the monitor are better than "whatever we can do on ten cents of silicon" (which is a big assumption), they'll still be slightly lower quality than the GPU can produce.

Except for those i386 Linux systems that are trying to run Half-Life 2... perhaps we should lower that resolution to 320x240, just to guarantee we're not butting heads with the window manager. After all, the first goal of every Linux game designer should be to ensure the tail log window you're running is properly proportioned at all times.

Actually, the Amiga OS handles the concept of multiple separate resolutions on the OS level, it's fully supported in a system-friendly way without the need for any hardware-hacking. You just use separate 'screens' (I guess each 'screen' can be thought of as a single virtual desktop, but can have different resolutions and size).

Your second reason is stupid. Just because Windows and OSX still sort of do it that way doesn't mean it actually makes sense that you should have to futz with the resolution just to make widgets use more or less screen real estate for better viewing. Window managers should handle scaling of UI elements and text sanely.

If I want bigger text to be easier to read, I still want crisp text. If I want smaller text to have more stuff on the screen, I don't want the letters to all run together like a censor bar.

Seems like a reasonable idea, given a bit of time to mature as a spec.

So another ten years? Seriously, this is well past due. This is the second story about someone wanting to fix the desktop in the last month or so. Hopefully, if there are enough of them, one might actually gain traction. Here's hoping. The X system really is a heap. As much as the purists like to bitch about it, thank goodness for nvidia when it comes to multiple monitor support. Too bad it doesn't help the gaming though

I still think X needs to go. For truly forward thinking, it needs to be replaced. Just look at Android. Android would not be useful if it were forced to use X.

Frankly, X does too many things that too few people need. It was designed for a different era and it scales to modern workflows very clumsily. Multi-monitor desktops and gaming on windows is effortless. On X it's frankly a chore.

Sorry, no, network transparency is not an important feature anymore. Probably used by 0.001% of regular users. VNC/RDP-style remote access is the way it's done now. And no, nobody cares if it's technically inferior. It's hundreds of times easier to implement and use.

Modern toolkits pretty much ignore 95% of X's built-in features and just pass bitmaps.

Yeah, X has lots of cool things, but you have to realize most of them are impractical or unnecessary. Today we really have the memory, computational power, and bandwidth to bang whatever we want onto the screen without any trouble. The latency and overhead X presents are the enemies today.

Now stop. - Yes, you, stop. I know you're about to type up a 10-paragraph screed about how you ported X app to obscure platform Y, or remotely managed 10,000 servers with facial twitches and GTK. Just stop. Your use case does not represent the vast majority of computer users. It doesn't even represent a full fraction of a percent.

Legacy baggage and clinging to old ideas spawned x.org. The same thing is what will spawn whatever is to replace X.

Android? Look at the N9, with an award-winning UI. It uses X and is really cool (on outdated hardware).

Network transparency is really useful. VNC/RDP sucks compared to X. And I don't see how it is easier to use than X. Maybe there are more GUIs for it that make it easier for beginners, but that is not for any technical reason.

I don't see what overhead X causes. It worked fine decades ago. Latency is an issue over the network, but only because the toolkits never cared about that. It's not a p

I agree that we need to come up with a brand new system to handle today's graphics systems. That's what Wayland is for, and why it's such an interesting project. It is not legacy baggage, but a ground up designed system. You have heard of it, haven't you? Seems like every Linux user and their dog knows about it these days.

Also, I'm very glad that Wayland is implementing an X compatibility layer. I'm one of those fraction of a percent that use and enjoy network transparency. It would annoy the hell out

Wayland has an X compatibility layer, sure, but you may also be pleased to know that there are efforts underway to get native Wayland network transparency.

See, the cool thing about wayland is the protocol is written to be flexible enough to have some other program handle the network transparency part, seamlessly. It's not part of the core design of wayland simply because it doesn't have to be in the core at all.

An added bonus of this flexibility is the ability to do network-transparency things that even X c

That... is awesome news! Thanks for that information. I hadn't heard anything about it.

The X compatibility layer will still be useful regardless, but I'm very glad to hear that Wayland will likely have network transparency as well. The more I hear about this system, the more I'm liking it.

The feature I was referring to was demo'd here [youtu.be], where the presenter forwards a window from one display to the other, ending up with the same window on two displays. This is local in the demo, but he says that it's transferring graphics data over the network and I have no reason to not believe him. This presentation is from last month, almost a year after the blog post you linked.

Also, X can move windows from one screen to another without a problem. Most toolkits don't support it, but there have been extensions/patches around for a while.

I was not aware that this was a feature of X, but I would like to point out that this is not an *extension* of Wayland doing this,

Wayland has an X11 compatibility mode. So in what sense is that a "go to hell message"?

Have you ever used an "X11 compatibility layer" on, e.g., OS X or Windows?

They suck, because the integration between X and non-X sucks. They suck because you can't use an X11 window manager to manage native windows. They suck because native windows can't be remoted using X11.

Basically it makes X11 programs red-headed stepchildren, and it doesn't work nearly as well as using a single system.

And the flag is passed to an API? (libX11 level? higher?), or it lives within the X11 protocol? Or both? I understand the background, I was just saying it was weird to use a flag name as being #define'ed in source code without the context required for it to make any sense.

I dunno. If a game is running amok because gamers and game programmers suffer from an 80s mentality that a computer is a game console, then perhaps you don't want the rest of the GUI acknowledging this foolishness.

The fact that games on Linux don't scramble my desktop like they do under Windows IS ACTUALLY A GOOD THING.

Even with the status quo, cleaning up after a game run amok is less bothersome under Linux.

When a game starts, it wants the entire desktop; it doesn't want the other desktop elements at all: no dock, no icons, no interaction, etc.

Why isn't there a function to create a new virtual desktop at any resolution you want and leave the other desktop untouched? So when you switch between them it knows to switch resolutions as well. Have the resolution tag part of the desktop, so when you switch between them it knows what to switch to.

I'm replying to the "game wants the entire desktop so it takes it" comment. If I want to expand an app to the full screen, fine. That's my choice. But if I don't want to, then the developers should damned well figure out how to write 'well behaved' apps.

Times changed, assorted Shit Happened (both within and without my PC), and my SDL tinkery and SVG tinkery became Blender tinkery (short YouTube video of mine) [youtube.com] became "fuck this I'll just play some Torchlight and roughly build a witch there instead", but I still had the feed on watch and saw a relevant change [libsdl.org]. The summary had a different tone from th

Not sure what 'mess' is referred to in the title, but I sidestepped the issues I met with Baldur's Gate by running it in its own X server on a separate VT. As I recall it just took a simple script to start the server and the game, plus one extra command to make the mouse cursor less fugly. My main desktop remained completely undisturbed and just an Alt-F7 away. With a little polish this approach could be a good general solution, no?
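Something like this minimal launcher reproduces that trick (the game path, the `:1` display and the `vt8` number are assumptions; adjust them for your setup -- and note the dry-run guard so it doesn't fire up a second X server unless you ask it to):

```shell
#!/bin/sh
# Launch a game in its own X server on a spare VT, leaving the main
# desktop (typically :0 on vt7) completely untouched.
GAME="${1:-/usr/local/games/baldur.sh}"   # hypothetical path
CMD="xinit $GAME -- :1 vt8"               # :1 = second display, vt8 = its own VT

if [ -n "$LAUNCH" ]; then
    exec $CMD
fi
# Safety valve: print the command instead of starting a server.
echo "dry run; set LAUNCH=1 to run: $CMD"
```

Alt-F7 gets you back to the desktop, Alt-F8 to the game, and when the game exits the second server goes away with it.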

X is a very old protocol, with a lot of things that need to be implemented in order for something to say that it "speaks X". Things like font rendering and 2D drawing routines. Things that nobody actually uses anymore.

X used to be in charge of providing font rendering and such, but now libraries like Freetype and Cairo do it instead (and do it better). X used to be in charge of accelerated video, but now we have good OpenGL support and Kernel Mode Setting. The only thing X really does now is act as a proxy between window managers and applications. But X still has support for all the old stuff, and so it's huge and lumbering.

Wayland is a new protocol designed to be used between window managers and applications directly. In a new breed of window managers, the window manager itself will set up the screen and draw to it using kernel mode setting and OpenGL, and it will communicate with and delegate windows to applications by speaking the Wayland protocol. So there's nothing like the X server sitting between them anymore: the window manager runs directly on the console and talks directly to applications.

This might sound like it would cause a lot of duplication of effort, and in one sense that's true. However, the amount of boilerplate code needed to set up a simple Wayland-speaking window manager is about the same as the amount of boilerplate code needed to set up a simple X11 window manager. Except in the Wayland case, there's no huge X server sitting between applications and the screen, because Wayland is so simple the window managers can speak it directly.
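To give a feel for how small the protocol is, here's a toy model in plain Python (not libwayland -- the class is invented, and only the interface names are real) of the shape of the conversation: every message is addressed to an object ID and goes straight between client and compositor, with the client bootstrapping by asking object 1 (always the display) for a registry and reading back "global" events advertising what the compositor offers:

```python
# Toy model of the Wayland conversation (plain Python, NOT libwayland):
# messages are (object_id, request, args) tuples exchanged directly
# between client and compositor -- no display server in the middle.

class ToyCompositor:
    def __init__(self):
        self.objects = {1: "wl_display"}  # object ID 1 is always the display
        self.globals = ["wl_compositor", "wl_shm", "wl_seat"]

    def handle(self, object_id, request, *args):
        iface = self.objects[object_id]
        if (iface, request) == ("wl_display", "get_registry"):
            registry_id = args[0]            # the client picks the new ID
            self.objects[registry_id] = "wl_registry"
            # Answer with one "global" event per advertised interface.
            return [(registry_id, "global", name) for name in self.globals]
        raise ValueError(f"unhandled request {iface}.{request}")


# Client side: ask the display for a registry, then learn what's offered.
compositor = ToyCompositor()
events = compositor.handle(1, "get_registry", 2)
print([name for (_, _, name) in events])
```

That request/event exchange around numbered objects is essentially the whole wire model; everything else is specific interfaces layered on top of it.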

One side effect of this is that the Wayland library API is *much* simpler to use than the X libraries, so it becomes a lot easier to write new experimental window managers. I expect we'll see a lot of new WMs after Wayland becomes standard. Another plus is that Wayland has built-in support for multiple pointers and arbitrary window transformations, and that's an extremely nice touch for multi-touch screens.

These two files, window.c [freedesktop.org] and image.c [freedesktop.org], are an entire simple GUI toolkit and an example program using that toolkit, for use with a Wayland compositor.

This directory [freedesktop.org] is an entire compositing window manager that speaks the Wayland protocol. This is already impressively small, but keep in mind that most of the complexity here is in actually drawing to the screen and getting input events from hardware, something that Wayland has nothing to do with, and it's *still* small.

The "network transparency" objection is a red herring, and it's getting rather tiresome. We're not "losing" network transparency. First, we don't have network transparency now; when nearly every application depends on Xshm and direct rendering for anything resembling reasonable display performance, the fact that you can draw obsolete primitives on the server through X11 core rendering protocol requests is hardly relevant. Remote X11 apps have already been reduced to rendering their windows locally and sendi