Well, basically it's a display server, like X11/X.org, except it's optimized to run locally, unlike X11/X.org, which has its origins in networked environments and tries its best to work either locally or remotely.

Except that X has been tinkered with to work for desktop systems for so long that there's little of that network-oriented code left around, whereas Wayland tries to get rid of that aspect completely.

Wayland is designed to fix a lot of the problems that X has. X, for historical reasons, does a TON of things. It has network transparency, it's responsible for input, for setting up the graphics card's memory and registers, drawing various primitive shapes, font rendering, etc.

But today, 99% of the time, people don't use the network transparency stuff in X; they run locally. But all sorts of memory has to be shuffled around. X mandates all sorts of bitmap formats that must be supported. Today the kernel, through KMS, can set up the graphics card. We have libraries like Cairo to draw basic shapes. Then there are all sorts of weird things that have been hacked into/onto X to support common features like resizing and rotating your desktop.

Wayland basically started with a blank slate. The kernel can set up the video card, so it won't do that. Most people don't use network transparency, so it doesn't do that (you can run an X client on Wayland, for when you still need the feature). The GUI toolkits and OpenGL libraries already draw everything, so it doesn't do that stuff.

LWN had an article [lwn.net] from two years ago about what Wayland set out to accomplish. Things may have changed since then; here are two [lwn.net] updates [lwn.net] from LWN describing Wayland earlier this year.

Wayland is an attempt to remove the network transparency of X... in a world where everything is networked.

I just don't get why they are so keen to get rid of that feature... When I was in uni (just four years ago) our department had a powerful Sun server that mostly powered the computer lab's thin clients but also allowed ssh access from the outside. Being able to log into that server via ssh with X forwarding and run Maple, Mathematica, etc. was awesome and saved me the expense of getting any of that software myself.

They are not. This is not about replacing X; it's simply about splitting the graphics and input code out of X and moving it over to the kernel and Wayland. You can still run X11 apps under Wayland. So it's really more a pragmatic approach to getting the X.org code base cleaned up than an abandonment of network transparency.

Also, in general, X11 network transparency is a bit overrated in my opinion: while it is good for things such as basic thin-client computing, it is completely useless for basic everyday uses like screen sharing or moving applications from one display to another. It's also rather useless for multimedia, be it video due to the lack of bandwidth or sound due to X11 simply not handling sound. Wayland won't solve these problems, but a clean code base means that it will be easier for other people to attack them.

What overhead? Show me the data that suggests that network transparency is what slows down X, and not the bloated toolkits people use (and will still use on Wayland). When X is used locally, it's as fast as anything else.

Nope. While a small minority of loud retards were repeating other people's outdated arguments, most of the rest of the world recreated the network transparency of X. Now you have a corporate environment where doing Unix-y things from 20 years ago is commonplace.

As much as I despise Windows, it does do the remote desktop thing well enough to be useful. I can't say the same for Macs.

The rest of the world has finally caught up with X. It's the people that want to dump X that are really living in the past.

It turns out that in a highly networked world, network transparency is actually a very handy thing. Dumping it just because you're an Apple wannabe is just stupid.

I just realized why I ignore Wayland: every comment in favor of Wayland, somewhere in the comment, will have the same fallacy about the network feature of X being a problem. Myself and lots of others use the remote X feature on a regular basis. I and thousands of IT professionals use remote access constantly on other platforms, mainly Windows. Using any of the remote technologies available for Windows makes me grind my teeth constantly, since I was ruined by using X first and know how it should work instead of what I have to use every day. I assume I'm not alone, or even just part of a small crowd, in that respect.
If there are some good sources for the percentage of users who don't want it, that's fine: quote 'em and I'm glad to read up on it. But most times I hear just a claimed lack of need for it, when I know that at least for myself and many others remote X is one of the most valuable features in X. If there was at least SOME info on Wayland that either left off the comments about how bad remote X is, or showed some facts to back the claim, I'd pay more attention. Unfortunately I've seen no comment yet for Wayland that does more than say "I don't know of anyone who uses remote X, so most users don't use it."

And one day Wayland may actually catch on and take over the desktop in Unix. No matter how good it is, it will be a sad day, since a useful piece of functionality is thrown away when it used to be included for free (free in terms of performance cost, free in terms of setup, free in terms of no effort needed for developers to support it, etc.). Doesn't make it easy to look forward to Wayland right now. This comment was at least less inflammatory than most and the rest of it was very informative, but I still would like, just once, for someone to back the claims against the network feature, since I value it quite a bit and have heard many others say the same.

I think Wayland is meant to move X and other legacy stuff out of the local rendering path, making X a second-class (and optional) citizen. One (big) difference between Wayland and previous attempts is that it is actually meant to work with X. Rather than removing X, it displaces X from being the center of the rendering universe. I don't know if Wayland will succeed, but it would satisfy both the network transparency AND the local desktop crowds. It seems like a very sane way to progress to me.

Optional is the issue. With X remote is available ANYTIME it's needed. Or are you saying there's no way apps will be written directly for Wayland? If that's the case, why have it anyway?
Otherwise you're saying Wayland has a well designed and/or well written migration path... but it's still migration AWAY from the feature that I and many others feel is important...

I'm not sure I understand "written to support it". That's my comment on X: written for X, it works remotely. What have you run into that didn't work? This is a serious question, since I haven't run into any native X apps that didn't at least run. Network speed makes many perform badly, but it still works for the things I've tried... even Windows apps via Wine at least used to work, but I haven't tried that trick in a while.

Wayland supports X11 in the same way OS X does. There is an X server running as a Wayland client. It would work like a normal X server, doing all its own compositing, etc., and then send the output to Wayland to be composited with all the other programs/windows.

If you launch that X server, normal X programs should continue to work.

I believe they specifically decided not to even try to make Wayland network transparent. I think you'd either have to run the program on the remote host and transfer everything VNC style, or implement your own GUI/processing separation and handle the networking between the two yourself.

There are people who use X forwarding, you're obviously one. I believe Wayland was designed from the ground up to make things easier for the client.

you already generally have to use something like VNC or xpra (layered over X, not really using it) in order to get reasonable performance or to detach and reattach remotely.
X might seem like the answer until you try to use it... it isn't good enough because it's nowhere near as usable as GNU screen.

you already generally have to use something like VNC or xpra (layered over X, not really using it) in order to get reasonable performance or to detach and reattach remotely.

X might seem like the answer until you try to use it... it isn't good enough because it's nowhere near as usable as GNU screen.

Quite apart from the fact that terminals are a heck of a lot simpler than a GUI system, there are plenty of cases where high performance and detaching simply aren't needed. So what if the program goes away when you close the connection? You can run it again when you connect again, no problem. That's worked fine for all the cases in the past 5 years where I've needed remote X (such as installing engineering simulators or running certain types of performance monitors).

The big piece of evidence is the failure of network transparency to become a killer feature of Linux. I started using X in '88 and was using it daily by '92. I thought X was amazingly cool: the ability to send windows around and change displays. I've seen corporate settings where these features are integrated into applications and made real use of. And certainly, for system administration, the ability to run X applications remotely has been useful, particularly for monitoring. I thought this would be Linux's killer feature.

There's no question remote X has issues. Like you say, the networks and computing environment of today give X a lot of challenge. My big push and the original complaint in this thread is that every Wayland comment includes the remark that remote app usage is not important enough to be a base feature. I disagree and get tired of someone saying the feature is useless when I know it's not.
Right now, there are workarounds for the issues X has, but they are still workarounds. What X does needs to be done better.

I agree that the "unneeded and unused" line is BS. The people who support Wayland are proposing eliminating network transparency in exchange for other advantages having to do with higher refresh rates. I agree with you that it would be more honest to just say they believe on balance this is the right tradeoff. X existed when most modern GUIs which made the same tradeoffs were built. Wayland supporters are just saying that Commodore, Microsoft, Apple, IBM (OS/2),... were right and SGI, Sun, Digital, HP, IBM (AIX),... were wrong in figuring out the right balance of features. Wayland supporters are basically saying that ultimately, even in 2012, it's all about ramming as many triangles through the video card as possible, and doing that with predictable timings; that anything that slows down those triangles, like networking, must go overboard.

Similarly, X has a 30-year history of doing a really bad job of delivering a smooth GUI experience. While in theory the network protocol shouldn't cost much, in practice it often seems to complicate design tremendously. X supporters, IMHO and in my experience, have trouble admitting how many GUI projects fail, or take 10x longer than they should, because of the complexity of working with the X / multiple window manager / multi-GUI stack.

If everyone were putting their cards on the table, then we could have an honest conversation about tradeoffs. Because X servers can run on top of Wayland, it might be possible to even come to an agreement about which applications should remain network transparent and which shouldn't. I suspect most supporters of network transparency couldn't care less if games and video editing software went local-only. And I suspect that most supporters of Wayland couldn't care less if server monitoring and server installation software remained X forever. Longer term, though, the tradeoffs become real. Gnome and KDE will either be built around Wayland or built around X; it's going to be impossible for them to do both well. If around Wayland, then Linux will be a system of local GUIs with at best a few networkable applications. If around X, then Wayland will be a hack run in place of the GUI, or only in full-screen mode for real-time rendering.

I suspect most supporters of network transparency couldn't care less if games and video editing software went local-only.

I agree that I don't care about those... until the day that I do. Suppose I need to test the installation of some game or video editing software. ;-P Or, for example, one day I had office hours downstairs but needed to run a program off my machine upstairs (yes, it actually happened, and X worked like a dream). There is no reason to believe that the monitoring-installation vs. games-video-editing division, or any other division you can think of, will be the correct one for all use cases. Such a division would always leave someone out.

Similarly, X has a 30-year history of doing a really bad job of delivering a smooth GUI experience. While in theory the network protocol shouldn't cost much, in practice it often seems to complicate design tremendously. X supporters, IMHO and in my experience, have trouble admitting how many GUI projects fail, or take 10x longer than they should, because of the complexity of working with the X / multiple window manager / multi-GUI stack.

That's an interesting assertion you're making there, that supporting a networked protocol causes excessive complication in the server. Would you care to provide (or point to) some evidence for that?

In my experience, the hard parts of making GUI code are dealing with multiple platforms (why would you want to write code for a single platform?) and going from functional-but-dull to snazzy-and-usable. The networking side of things (or not) is nowhere on that map.

Have you any actual experience with a networked server? The actual protocol is easy enough to do. Sending data from one party to another and parsing it is easy.

The real problem is that you go from a situation where all components can talk to each other quickly to a situation where everything you do must be carefully analysed because each time you cross the boundary between local and remote you take a possibly big hit from latency and possibly also bandwidth if it's a lot of data.

In my experience, the hard parts of making GUI code are dealing with multiple platforms (why would you want to write code for a single platform?) and going from functional-but-dull to snazzy-and-usable. The networking side of things (or not) is nowhere on that map.

I don't write software where I have to push large numbers of frames through per second either. On the other hand, I use software where large numbers of frames per second matter. Interestingly enough, I just got the Mac with the Retina display. Because the Retina is doing virtual adjustments (i.e. there are several virtual screens being drawn to by applications, and those are re-rendered to another virtual screen which gets pushed to the physical screen), I could easily see frame-rate problems in even day-to-day applications, like video inside a web browser while scrolling, before the driver improvements in OS X 10.8. What Apple did in 10.8 to get rid of those problems would be impossible under X.

Kristian Høgsberg, who wrote a lot of the X acceleration you are probably using, was the one who started Wayland. He was frustrated about what he couldn't do. Under X, applications are not able to control rendering. They cannot make the decisions required to avoid visible tearing. They cannot force the X server to draw potential windows in advance to avoid lag.

Another problem is that either the client and server (to use X terminology) share a video memory buffer or they don't. If they don't, you pick up a lot of time passing information between them. Your CPU's memory bandwidth is probably no more than a few gigabytes per second; that is the maximum speed you can get data from one buffer to another under the best conditions. And with screens that are 5 megapixels at 4 bytes of color per pixel, every one-way trip is 1/100th of a second under perfect conditions. You aren't getting perfect conditions, and 2 round trips is common. And if X wanted to implement something like the resolution system Apple uses for Retina, it would be worse (though the CPU speed for memory is likely about double), because you could be rendering virtual screens as large as 14 megapixels, with some round trips being 4 hops... you could be talking flicker over 1/10th of a second.
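The copy-cost arithmetic above can be checked with a quick back-of-envelope script. The 5-megapixel screen, 4 bytes per pixel, and ~2 GB/s memcpy rate are the parent's assumptions, not measured figures:

```python
# Back-of-envelope check of the buffer-copy argument above.
pixels = 5_000_000          # ~5 Mpx screen (parent's assumption)
bytes_per_pixel = 4         # 32-bit color
memcpy_rate = 2e9           # ~2 GB/s buffer-to-buffer copy, optimistic

frame_bytes = pixels * bytes_per_pixel          # 20 MB per full frame
one_way = frame_bytes / memcpy_rate             # seconds per one-way copy

print(f"{one_way * 1000:.0f} ms per one-way copy")       # 10 ms, i.e. 1/100 s
print(f"{4 * one_way * 1000:.0f} ms for 2 round trips")  # 40 ms
```

Which matches the parent's "1/100th of a second per one-way trip" figure.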

If everyone were putting their cards on the table, then we could have an honest conversation about tradeoffs.

Sometimes it seems like people don't even know what cards they're holding. All these arguments are missing the point from a usability perspective.

When I type "ssh -X", I don't actually care what protocol is used. All I care about is that it works on every single computer *by default*. The solution is obvious: modify the Wayland spec to demand that every system that implements Wayland also includes VNC integrated with SSH. Problem solved, everyone can be happy.

Yes, performance won't be exactly the same, the specified protocol might not end up being VNC, etc. but these endless arguments about Wayland are much worse. We have the software to implement this, so let's just please standardize on *something* so we have usable systems out of the box. It's not going to prevent someone from manually installing a better network protocol in the future, so Wayland trying to remain neutral on network protocols is just ideological posturing.

modify the Wayland spec to demand that every system that implements Wayland also includes VNC integrated with SSH. Problem solved, everyone can be happy.

I don't know if it should be part of the spec nor do I know if Wayland can demand anything but...

I think that's a great idea for a strong suggestion! Wayland by default should support VNC. ssh by default should use VNC (-v is taken; I think -X should remain with X11/X12, but I have no problem with -Z, which is free). That solves most of the problem.

But in the end, X network transparency doesn't work very well over a WAN. It doesn't work very well over MPLS. In general, it doesn't work all that well in any situation where you couldn't just be using some other sort of remote solution.

X works very well over a LAN, and, as bandwidth becomes cheaper, problems running over a WAN will go away.

X wasn't able to handle the security problems and so the whole infrastructure of remote X and remote shells has gotten more complex and thus less useful.

X works very well over a LAN, and, as bandwidth becomes cheaper, problems running over a WAN will go away

It's not generally a problem of bandwidth alone. Even with tons of bandwidth, latency is a problem over a WAN. IPv6 will make that somewhat better by reducing latency. Moving to fiber will make it somewhat better. On the other hand, introducing more satellite, over-the-air, and wifi links will make it worse. An MPLS network will solve jitter, but where there is jitter it gets even worse.

X works very well over a LAN, and, as bandwidth becomes cheaper, problems running over a WAN will go away.

No, it won't. The problem with X over a WAN is latency, and no amount of technology is going to change the fact that light can only go so fast.

The company I work for has a *very* fast WAN between offices, and X over the WAN is still a dog. The problem is that X is to a large degree synchronous, and operations involve multiple round trips. So no matter how much bandwidth you have, you get killed by latency.

The solution to this is to either use a framebuffer-based protocol (VNC and friends) or to use an asynchronous compressing X (NX). Neither of which is really taking advantage of the network features of X.
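The latency-dominated behaviour described above can be sketched with a rough cost model. The round-trip counts, RTTs, and bandwidth numbers below are illustrative assumptions, not measurements:

```python
# Rough model: total time = round_trips * RTT + payload / bandwidth.
# For a synchronous protocol, the first term dominates on a WAN no
# matter how fat the pipe is.
def remote_op_time(round_trips, rtt_s, payload_bytes, bandwidth_bps):
    """Time for an operation needing serialized round trips plus data transfer."""
    return round_trips * rtt_s + payload_bytes / bandwidth_bps

# Hypothetical operation: 50 synchronous round trips, 1 MB of data,
# over a 1 GB/s link in both cases.
lan = remote_op_time(50, 0.0005, 1_000_000, 1e9)  # 0.5 ms RTT on a LAN
wan = remote_op_time(50, 0.040, 1_000_000, 1e9)   # 40 ms RTT over a WAN

print(f"LAN: {lan:.3f} s, WAN: {wan:.3f} s")
```

The data transfer contributes only a millisecond in both cases; on the WAN, the 2 seconds of accumulated round-trip latency is the entire problem, which is why NX-style asynchronous protocols help where more bandwidth does not.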

Actually, the solution is a combination of X and VNC. I have a persistent VNC session on the LAN, to which I send the display from multiple X11 apps running on many different machines, and then I connect to the VNC session either over the LAN or over the Internet (VPN or tunnelled over SSH).

The big piece of evidence is the failure of network transparency to become a killer feature of Linux

I don't know about you, but that IS the killer feature that got it placed onto everyone's desks in my workplace. A lot of scientific and engineering software that needs some CPU power behind it never got ported to MS Windows. A lot of it makes more sense running on big noisy stuff in server rooms instead of desktop PCs anyway.

I don't know about you, but that IS the killer feature that got it placed onto everyone's desks in my workplace. A lot of scientific and engineering software that needs some CPU power behind it never got ported to MS Windows. A lot of it makes more sense running on big noisy stuff in server rooms instead of desktop PCs anyway.

That's easy enough to implement as client server. You have a display client and a server which does the noisy stuff.
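That client/server split can be sketched as a toy model: a compute server does the "noisy stuff" and a thin display client only shows the result. The one-integer protocol over loopback sockets is invented for illustration; a real system would stream pixels or drawing commands:

```python
import socket
import threading

# Compute server: listens on a loopback socket picked by the OS.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))      # port 0 = let the OS choose a free port
srv.listen(1)
port = srv.getsockname()[1]

def serve():
    """Pretend number-cruncher: squares whatever integer it receives."""
    conn, _ = srv.accept()
    with conn:
        n = int(conn.recv(64).decode())
        conn.sendall(str(n * n).encode())   # the heavy lifting happens here

t = threading.Thread(target=serve)
t.start()

# Thin display client: sends a request, shows the answer.
with socket.socket() as cli:
    cli.connect(("127.0.0.1", port))
    cli.sendall(b"12")
    result = int(cli.recv(64).decode())

t.join()
srv.close()
print(result)   # 144
```

The point is that the application itself decides where the boundary goes, instead of relying on the display protocol to be network transparent.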

VNC-type stuff generally sucks badly for those sorts of situations, but there has been some success with TurboVNC over WAN. Of course, they are VNC sessions of X sessions displaying X apps that get updated so rarely that some of them still only work in 8-bit colour, so stuff in that niche won't be replaced with anything else any time soon. For other stuff, fine, but the network transparency of X is the reason I have a job with Linux, Solaris and AIX in the first place, and why Linux is at least on some corporate desktops.

I'll add the entire point I should have put in to start: the people doing all the stuff that requires X to run their (in my case geophysical) remote applications are not just running it on one host, and they are interacting with it in real time. Just showing one other desktop does not cut it, especially when they are writing reports etc. and cutting and pasting to something running on their local machine or a different remote machine. They could VNC into some sort of head node and do everything as if they are local, but that defeats the point.

But what about native Wayland apps? (Is there such a thing?) Will I be able to run those across the network? (Asking; I don't know.) Like the GP, I use X over the network. If I can't run *every* graphical app on my machine over the network, then, sorry, no sale.

I haven't had the best performance with everything, but they all ran. And I've tried a lot... even xine runs OK remotely... Doom3 ran (not playably, but then again, I was not on the same local network when I tried it). Menus were fine. What have you run into that does not work?

I haven't had any luck with programs using OpenGL or with compositing, but then again I haven't tried all that hard, so there might be a way to do these things. Wayland supports remote desktops, which I find more useful than remoting individual applications, but I can see why network transparency matters to some people.

I had problems too. Then I put a video card with 4MB of memory and some 3D acceleration into my machine. Then I could use 3D visualisation software running on a big SGI machine and operate it from my crappy Pentium 60 in another building. That was in 1999, and very easily fixed with old donated hardware. Anyway, crap drivers are crap drivers, and some of the MS Windows implementations of X are crap at OpenGL (Exceed? maybe they finally fixed that) while others are better.

Nice job clipping that response to suit your needs. If you bothered to read the second paragraph:

This doesn't mean that remote rendering won't be possible with Wayland; it just means that you will have to put a remote rendering server on top of Wayland. One such server could be the X.org server, but other options include an RDP server, a VNC server, or somebody could even invent their own new remote rendering model. Which is a feature when you think about it: layering X.org on top of Wayland has very little overhead, but the other types of remote rendering servers no longer require X.org, and experimenting with new protocols is easier.

Now, call me crazy, but isn't a large part of Linux about the user being able to choose how their machine does the work that they ask it to do? It seems to me Wayland isn't trying to force anything on anyone, rather, just trying to open up more choices that would otherwise be limited.

Here is a perfect demonstration of the "clueless retard" mentality in this discussion. Running X on top of Wayland doesn't help you. It's like running X on Windows or running X on MacOS. It only allows you to run those apps still coded to use X.

Running X on Wayland doesn't allow me to run Wayland apps remotely.

It's just like how running X on a Mac doesn't allow me to run iTunes on one of my Linux boxes.

This still seems like a backwards step, though, if we're not even going to consider how Wayland should deal with running programs over a network.

We're in a world with rising tablet and "cloud" computing usage, so the need for an easy-to-use remote desktop/application system seems more obvious than ever. Where is the sense in developing a new display server and not including in the design some type of road map for how this feature would be supported from the get-go?

Of course, this means that the graphics card has to be on the same machine as the applications, and that it has to have kernel drivers that support not just KMS but also DRI2 graphics acceleration. Without DRI2 support, there's no way to pass images of windows to the compositor and therefore no way to actually display anything. Precisely none of the existing closed-source drivers support the KMS or DRI2 interfaces that Wayland needs, so they can't run on it. In fact, for licensing reasons they can't actually use those interfaces.

Basically the idea of Wayland is that, by using the Wayland libraries, the window manager becomes the display server. There are no restrictions on how the window manager works, other than that it is a compositor. The end result is that you can have tiling window managers like Awesome, but they will leverage the GPU a tiny bit for the rendering. From an end-user perspective there is no reason you should see a change, given that the devs are halfway competent.

Can Awesome be ported to Wayland itself, so that it manages X clients and native Wayland clients?

Yes, but with a twist. Wayland doesn't have window managers as a separate process. Instead of porting your preferred window manager to work with Wayland, one would implement the Wayland protocol support in the window manager, with help from libwayland for the common parts. Supposedly the Wayland support only requires about as much code as the boilerplate for an X window manager. Of course, X core rendering and XRender will be unavailable. If the WM already uses a portable library like Cairo, GTK+ or Qt for rendering that shouldn't be a problem; otherwise all the drawing code would need to be ported as well.

Since Ubuntu's apparently planning on having a system-wide instance of Wayland that starts at boot and keeps running until shutdown, presumably the only way to change your window manager would be to do it system-wide. It doesn't look like there's going to be any way to support individual users setting their own window managers.

I don't think TPP was asking you to run Google for him. He was giving the editors a hard time for not answering an obvious question (WTF is it?) in their summary.

Not that Googling did you much good, since the Wayland web site is also sloppy about describing what they're doing, and you came away with the weird impression that it was about reinventing X on a different model. Not even close. Here's a much better summary from the Ubuntu site:

Wayland is a new protocol that enables 3D compositors to be used as primary display servers, instead of running the 3D compositor as an extension under the (2D) X.org display server.

Or, in layman's terms, it assumes you're using a 3D desktop from the start, instead of bolting 3D capabilities onto a 2D framework.

Except that X has been tinkered with to work for desktop systems for so long that there's little of that network-oriented code left around

It was never there to start with, because the design is to use local sockets for local stuff and network sockets for network stuff. The big thing with X is that network stuff can work as if it's on the local screen, but for some reason some people get that virtue backwards and assume that means local stuff is double-handled. Wayland is a different way to get stuff into the framebuffer.

It's so far from ready that this is what it's currently achieving. When you are remaking a compositor from the beginning, these are significant steps.

I have been waiting for this announcement from Ubuntu since they said they were going to try to use it next release. It might be close to being OK for a 2D, no-acceleration window manager, but trying to run Unity was never happening in 6 months.

A way more pragmatic answer: if they switch to Wayland they will piss off Valve, who is working on porting their games to Ubuntu. If they piss off Valve there will be no games, and as a consequence there will be no year of the Linux desktop for a long time. Linux as a big force in the desktop space is Ubuntu's goal, so no X server means no Valve, which means no Steam and no games.

My understanding of Wayland says that it wouldn't bother Valve at all.

If you use a toolkit that has been ported to Wayland, you use the toolkit and nothing changes.

If you use a toolkit that hasn't been ported, you'd just run the X server that runs as a Wayland client, so things keep working.

But the important thing is running OpenGL, which works just fine in Wayland (which is built on OpenGL).

Unless Valve is writing their own rendering directly against X (which seems like it would be an idiotic thing to do in general, especially considering Wayland has been coming for over 2 years), I wouldn't think this would really affect them. In fact, they could decide to go Wayland-only (assuming it's done enough at that point) and avoid whatever hassle X might have given them.

If Linux ever comes to dominate the desktop it'll be in some form that just comes out of nowhere the way Android did* and made Linux a huge force in the mobile space. Windows would be a lot harder to displace than the iPhone though.

*Yes to people on here Android was old hat by the time it actually appeared but to most consumers it just appeared out of nowhere one day in 2008.

It didn't make Linux a huge force in the mobile space. Google made Android huge in the mobile space. The kernel has simply been utilized, with little flowing upstream into it from Android and frequent GPL violations by Chinese ODMs.

It is, otherwise, incompatible with what's commonly referred to as "Linux."

It didn't make Linux a huge force in the mobile space. Google made Android huge in the mobile space. The kernel has simply been utilized, with little flowing upstream into it from Android and frequent GPL violations by Chinese ODMs.

It is, otherwise, incompatible with what's commonly referred to as "Linux."

-- Linus Torvalds, commenting on his blog about his at-the-time newly acquired Nexus One

I will politely defer to his opinion over what anyone else thinks in this matter.

Full sentence for context:

At the same time I love the concept of having a phone that runs Linux, and I've had a number of them over the years (in addition to the G1, I had one of the early China-only Motorola Linux phones) etc.

That's as close to an explicit declaration by Torvalds on Android being Linux as I can find. If you have something where he says otherwise in a more direct way, I'd be happy to see it. Otherwise, if the Nexus One with Froyo Android on it is said to "run Linux", then Android is Linux.

No doubt this is a stupid question (I'm not really that familiar with the technical details of the Linux graphics stack), but why is middleware like X11 or Wayland needed at all? Why can't the desktop/window manager talk directly to OpenGL, which in turn talks to the graphics hardware via a driver? Intuitively, it would seem like this would give better performance and fewer places for bugs to crop up. Why do there have to be 20 different layers in the rendering stack? Is this just abstraction for abstraction's sake or is there actually a good reason?

This is almost exactly what Wayland is doing. Wayland is a communication protocol between compositors and the things they composite: usually, between window managers and applications. This means that the window manager is responsible for communicating with applications, and for pushing video data on to the screen (via OpenGL + Kernel Mode Setting). So instead of launching X, then launching a window manager on top of that, you just launch the window manager.
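The protocol being delegated here really is tiny: every request is a few 32-bit words sent over a Unix socket. As a rough illustration (a toy re-implementation of the wire encoding in Python, not the actual libwayland API), here is how a client's wl_display.get_registry request would be packed:

```python
import struct

def wayland_request(object_id, opcode, *args):
    """Pack a Wayland wire-protocol request: a 32-bit sender object id,
    then a 32-bit word whose upper half is the total message size in
    bytes and whose lower half is the opcode, then the 32-bit args.
    (Real messages use host byte order; little-endian assumed here.)"""
    size = 8 + 4 * len(args)  # 8-byte header + 4 bytes per argument
    return struct.pack("<II", object_id, (size << 16) | opcode) + b"".join(
        struct.pack("<I", a) for a in args
    )

# wl_display is always object id 1; request opcode 1 is get_registry,
# whose single argument is the client-chosen id for the new registry.
msg = wayland_request(1, 1, 2)
print(msg.hex())
print(len(msg))  # 12 bytes: 8-byte header + one 4-byte argument
```

That's essentially the whole transport; everything else (what the opcodes mean, which objects exist) comes from the protocol XML that libwayland generates bindings from.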

This is the primary advantage of Wayland: it's simple. Really really simple. It's basically just OpenGL and a protocol for delegating render surfaces to other applications (to render on to using OpenGL). By comparison, an X server needs font rendering, shape rendering, and a ton of other things that aren't used today anyway because everyone uses freetype and cairo and such. Wayland leaves those out and expects you to get that from other places (like, say, freetype and cairo).

(Wayland is also the name of a C library implementing the Wayland protocol. The Wayland project also produces the Weston compositor, as a reference implementation of a simple "window manager".)

What does Wayland propose to give us that isn't already available with a stack like E17 (with Evas alone doing a lot of what the entire benefit of Wayland is supposed to be) on top of X11? I've looked at their list of features. Is there something else that provides some advantage that hasn't been listed? I just cannot see any advantage other than "let's have our own X, but without the hookers and blackjack".

Everything that Wayland does is possible under X, it just might be hard to write the X code to do it.

The biggest point to Wayland is that it is extremely simple (compared to X... X is huge), and it's capable of doing 95% of what people use X for. The other 5% is network transparency, a feature I hold dearly but one that I acknowledge not many people care about.

So what's so great about simple? For one, it's easier to maintain. It's also easier to write clients for, since it's almost entirely OpenGL plus a small protocol library.

Yes, but my question was "what does it give us". If it's a simple monolithic thing, does that lock it down, or are there plans to make it extensible in the future like X is? Does it all have to go into memory at once, or only just the bits you use, like the X extensions, Evas etc. do now (without having to muck around directly with the X libraries anyway)? Correct me if I'm wrong, but I see it as a cathedral vs. bazaar thing, where it's hard to add more to the monolithic thing but easy to add a bit more to the modular one.

For Wayland to work, applications will need to support it, and nVidia and AMD will need to support it. It needs to be available via a fairly simple install before I'll try to port my applications to it. I'm hoping the Wayland developers are actually talking to nVidia and AMD, and that Canonical doesn't release this until they have at least beta drivers.

I'm not too worried about the network transparency, even though I use it every day. Most of the applications I use remotely are things like emacs that are a bit slow anyway.

Ubuntu helped me completely switch over to Linux back in 2004-2005. But once they changed to Unity, I moved to Linux Mint Debian and haven't looked back.

Sad, really. Their forums were one of the best things about them. As I learned tips and tricks, it became sort of a hobby to visit the forums and try to answer questions for even bigger noobs than me. Not much I can do there now. And the Debian boards are more of a listen and learn place for me. I miss the community, but the OS has made itself unusable.

I thought this was a hardcore tech site, but Ubuntu is a pile of crap, and anyone who has tried other distros (crap like CentOS doesn't count) usually likes the other distros better. Debian, Arch Linux, Gentoo - these are distros that don't suck, don't go into dependency hell on every upgrade, and don't make a GUI for everything, with ads and daemons and useless crap tossed in.

I don't see choosing some particular distro that important. They all carry mostly the same software and have somewhat similar mechanisms for package management and maintaining the system.

It would be hilarious to me, if it weren't already infuriating, that so many people decry how crappy Ubuntu is, and then go on to complain about the user interface. If you're a hardcore user, it shouldn't matter what distribution you're running; you can change whatever you want about it, and that's what's great about Linux. But some people need to show how big their Linux Organ is by dumping on a somewhat popular form of Linux. Don't like Unity? There are TONS of other DEs out there, try one. If you still hate Xubuntu