Posted by samzenpus on Monday March 04, 2013 @05:35PM
from the rolling-it-out dept.

An anonymous reader writes "The Mir specification is now up on the Ubuntu Wiki; Mir is a next-generation display server not based on X11/X.Org or Wayland. Canonical is rolling their own display server for future releases of Ubuntu, across form factors from mobile phones to the desktop. Mir is still in development but is said to support Android graphics drivers and open-source Linux graphics drivers, and Canonical is pressuring hardware vendors with commercial closed-source drivers to support it too. They also say X11 apps will be compatible, along with GTK3 and Qt/QML programs. Canonical isn't using X11 or Wayland for their future Unity desktop, as they see many shortcomings in these existing and commonly used components."

A lot of times in software, someone starts some grand-plan project which takes forever to get anywhere. Then some lone programmer comes along with something small, well focused, and just plain well thought out, which causes the grand project to be abandoned. There are so many examples of this one can't count them; the Linux kernel itself, compared to Hurd, is just one. Let Canonical have a shot at this. They've got some good ideas, and if they can pull it off, the result will stand on its own merits.

The problem is quite simple - Wayland started very small and simple, but was of course held back by legacy-support requests (and then there are those closed binary video drivers), and Ubuntu planned to do the next LTS with it. However, Canonical suddenly changed direction 2 years ago and tried to push into the mobile market. Wayland (and Xorg too) can be used for a mobile platform; it just needs more work. The problem is that Canonical's time is running out. They can't wait. They also don't want to be in the same position as the others. They want to be first. They don't want to waste all their money only to be beaten by some guy who will put GNOME 3 and GNOME Shell together, make it sexy, and give all the phone/tablet wannabes a run for their money. So they retreat more and more into NIH land.

I don't mean them ill. But it's serious fragmentation, and it's trying to destroy the de facto Linux desktop ecosystem - to become the ultimate winner instead. I'm not sure I can support that in any way anymore.

I think we're seeing a natural cycle in the software world. During the 80s there were dozens of architectures, operating systems, languages, etc.; the best (for some definition of best) became dominant, and the field consolidated during the 90s. Now we're in the midst of another explosion in new technology (languages, display servers, processor architectures, perhaps even operating systems) that will eventually lead to reconciliation and consolidation in another five to ten years.

Things like Wayland have to appear, and even fail: their existence allows new ideas to be tested giving us a better idea of where to go from here.

Except the proper integrated solution (in this case, a rendering pipeline that doesn't suck hardcore) has already been done by others decades ago. E.g., NeXTSTEP, OS X, hell, even the Windows DirectX stack - like it or not, each is far superior to the clusterfuck that is X11.

It's kinda the whole problem with Linux: any "standard" is just de facto and ever shifting. Yeah, for sure, it is something that holds Linux back compared to the stability of proprietary platforms. But it is also the thing that allows it to move forward. Canonical will give this a shot, and if it's great, perhaps it will be the new standard. If it's rubbish, it won't be. Let's just see what they come up with. If Wayland were perfect, I'm sure Canonical would not want to throw money at a problem that is already solved.

It is not good enough to "leave it well enough alone". That results in stagnation and obsolescence and toleration of mediocrity. This is the reason why consumer PCs were saddled with a piece of shit operating system that did not even have full 32-bit protected memory, for the better part of 2 decades after the i386 was released. This is the reason why most of the big RISC and UNIX vendors have stagnated and fallen.

New things should continually be explored and improved, new ideas tested.

One of the main reasons the UNIX vendors stagnated and fell is because of fragmentation. Consumer PCs were saddled with a POS OS (and also later iterations of it which did have full 32-bit protected memory, and later 64-bit) because that POS eliminated fragmentation. Fragmentation is one of the big things keeping Linux from being a true threat to Windows.

I'm guessing that they're running an elaborate experiment to see just what one has to do to ruin a distro thoroughly and completely. Otherwise, none of this makes any sense.

Microsoft is trying to be a copycat of Apple; Ubuntu is trying to be a copycat of Google. Google scrapped everything but the kernel and wrote all-new code - you can tell by the Apache 2.0 license and the lack of GPL userspace code. Ubuntu is now trying to do the same, wanting to go head to head with Android, not realizing a house cat can't hunt the same way, or the same prey, as a lion. But then, they're used to being a 1% company in a 99% Win/Mac world; maybe they'll manage being a 1% company in an Android/iOS world too.

I think Shuttleworth has just decided (probably correctly) that he can't make any money on the desktop, but mobile is still a possibility. The Unity interface and now this are an attempt to compete with Android.

I abandoned Ubuntu for my desktop when Unity came, but I think I might actually like it on a tablet or phone. Anyway, I'll try to keep an open mind when the devices actually come out. I hope one of the non-Android Linux phone efforts finds a niche, whether it's Ubuntu, Jolla, Tizen, or Firefox OS. If Shuttleworth can pull it off, then more power to him.

I think Shuttleworth has just decided (probably correctly) that he can't make any money on the desktop, but mobile is still a possibility.

It is highly doubtful that he can make any money in the mobile sphere, that is pretty well decided now, too. He probably stood a better chance with the desktop, particularly after Windows 8.

The Unity interface and now this are an attempt to compete with Android.

If the goal was to compete with Android, they should have gone KDE. KDE's Plasma Active is a much more attractive development environment and much further along than Ubuntu's mobile offerings, which don't even use the standard Unity interface.

I abandoned Ubuntu for my desktop when Unity came, but I think I might actually like it on a tablet or phone. Anyway, I'll try to keep an open mind when the devices actually come out. I hope one of the non-Android Linux phone efforts finds a niche, whether it's Ubuntu, Jolla, Tizen, or Firefox OS. If Shuttleworth can pull it off, then more power to him.

Study after study shows that Unity does not work well on a tablet/touch device. It only looks like it should work, but all of the apps are mouse-centric. The problem for Canonical going mobile is that most of the apps in their repositories, which are a large selling point (even if free), won't work on mobile. So from the very start, they will be competing with Apple and Android, who have a huge head start, and even with Microsoft, who, while a very distant third, is light-years ahead of Canonical.

As I said earlier, they should have gone Plasma Active. If all of the resources that they dumped into Unity and now their mobile offerings had been used to further that project, they would have been to market earlier and had apps ready to deploy. Instead they chose to go their own way, which is their right, but not necessarily the wisest business decision as even Microsoft is late to the game.

Actually it makes quite a bit of sense -- if you're not making a Linux distro.

This appears to be the fundamental fact that Shuttleworth and Canonical seem to have forgotten - or rather, they want you to forget. Canonical is trying to position itself not just as a Linux distribution, but as a platform a la Android, where the only role Linux serves is to get around the licensing costs of using something like QNX instead... the work's already done, the community -are- the testers. They get an OS for embedded

There's something quixotic about all the recent changes in Ubuntu, isn't there? In the real world they are a Linux distro preferred by 2% of users for its good driver support and its ease of use. But in Shuttleworth's mind, they are a smartphone/tablet/TV operating system that is about to go mainstream and take over the world. Maybe if his desktop market share was a tad higher than 2% it would be realistic, but it just seems to me that they are overreaching and mostly daydreaming of grandeur where they shou

Let me tell you a story. A bunch of Swedish guys stay in a hotel in the US. Their manager speaks Spanish and chats to the staff. The staff complain the Swedes don't tip. So the manager talks to them and explains they should all put a dollar bill on the table each day. Some of them leave change and the cleaners tell the manager this is unacceptable. Eventually all but one of them do the crisp $1 per day thing. The one that doesn't claims that tipping is feudal and turns the cleaners into supplicants, the hotel should pay the staff a decent wage like in Sweden, the US should have a social democratic party like in Sweden to stick up for the workers and so on and so on and refuses to do it.

When he checks out he finds out the cleaners have put on the porn channel every day after he left the room and turned it off just before he got back.

I think we can all learn a lesson from that story, can't we?

I would say that the lesson is that hotel cleaners in the US are criminals. And that the tipping system in the US sucks. If the cleaners (or others in the service industry) feel they are entitled to the tip, it is not really a tip any more; it is just hidden direct taxation for services.

Unity started to turn out quite nicely until they turned it into Amazon spyware. It was made unusable completely unnecessarily, but the underpinnings were there. As for coming up with a new display server, I am sure some hotshot OS programmer sold them on a demo of something that seems pretty spectacular. However, the fact that X has hung around for a decade past its use-by date shows it's not easy to replace. There is a horrible amount of legacy to be supported in terms of standards and hardware. I agree ther

And what might that lesson be? In 12.10, Unity is quite good. I used to run an alternate LXDE environment for my gaming needs, but then I got unredirected fullscreen in Unity. There is no speed difference now when running fullscreen games, whether native or under Wine. Using the HUD is a godsend in applications that I use rarely: just search for a certain function in the HUD instead of looking through all the drop-down menus.

And yeah, I know the state Unity was released in. Know what I did? I used Gnome while I

It's possible they have a small team that has overcome all the corner cases discovered by the Xorg, XCB, and Wayland folks over the past couple of decades by fundamentally refactoring the problem into a more correct solution, and has achieved excellent performance by doing so.

It's also possible that space aliens gave them this technology, but that's only slightly more likely.

Mark Shuttleworth, while proclaiming publicly and often that he won't support the company forever and that it needs to be profitable, decides in his infinite wisdom not only to fork a major toolchain piece (upstart) but to fork the GUI as well, rather than putting his limited resources into the community projects.

I think he has as much chance succeeding at this as he does of the aliens giving him the technology.

The fundamental issue with a replacement for Xorg is redefining the problem domain. Xorg (and X11 before it) runs on a client-server model, which is dead set sexy and really, really useful if you're operating in a client-server environment. The problem is that it's a horribly inefficient kludge if you happen to be running the client and the server on the same machine. This is more important if you're trying to run in a limited-resource environment like a tablet while still having all the new shiny. Yes you can

something on the display server (in X11 the client) is likely to be in GPU memory, and something on the application (in X11 the server) is definitely in CPU memory

This is completely wrong. First, in X11, X is the server and the application is the client. Second, modern X11 applications do their own hardware-accelerated rendering in GPU memory and pass the rendered image to the X server for compositing, so the client/server memory distinction you're positing doesn't exist. Neither does "network transparency" in any meaningful sense; the extensions which allow efficient local rendering, like XShm and DRI2, aren't available over the network, so applications can either use a completely different rendering path, forfeiting transparency, or get horrible performance due to the complete lack of image compression in the X protocol and the fact that inputs to the rendering process (particularly things like textures) are often much larger than the differences in the output from frame to frame. Rendering with local hardware acceleration and sending the results over the network in the form of compressed video, a la VNC, RDP, Xpra, and the plans for remote Wayland, is much more efficient, and actually transparent to the application.
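The bandwidth claim above is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch (the 30 Mbit/s figure for a compressed desktop stream is an illustrative assumption, not a measurement):

```python
# Rough cost of shipping every frame of a desktop uncompressed over the
# network, versus a VNC/RDP-style compressed video stream.

def raw_stream_mbps(width, height, bytes_per_pixel=4, fps=60):
    """Bandwidth (megabits/s) needed to push each frame as raw pixels."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

raw = raw_stream_mbps(1920, 1080)
print(f"Uncompressed 1080p60: {raw:.0f} Mbit/s")  # ~3981 Mbit/s

# A compressed stream of typical desktop activity fits in a few tens of
# Mbit/s. 30 Mbit/s is an illustrative assumption, not a measurement.
print(f"Raw-to-compressed ratio: ~{raw / 30:.0f}x")
```

Two orders of magnitude either way, which is why every serious remote-desktop system renders locally and ships compressed output rather than raw pixels or raw drawing commands.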

Unless they can convince the wider Linux community to adopt some of their technologies, Canonical is basically going to end up forking the platform. If that happens, it will be a fairly major step backwards for Linux on the desktop since developers will be on the hook to adjust to supporting not just multiple packaging systems and multiple library versions, but also multiple incompatible core system API's. Essentially Ubuntu will no longer be "Linux" in any way that matters to developers and all the support for Linux out there now will either die or just switch over to being Ubuntu specific and I don't see how that benefits anyone in the community.

I've always been a Debian guy. It's just clean, simple, does what it's told, and leaves me alone when it can. I always love it and sing its praises when testing is fresh yet stable. But as time passes and I'm still using stone-age software near testing's transition to stable, I always start looking elsewhere.

Same here. Debian Sid is great when Testing isn't frozen, but then it stops being fun for far too long. If I wanted to run Stable, I'd run Stable, and if I want to run a rolling distro, I'd rather not run some slow-moving, semi-stable slush. So after Squeeze was frozen, I moved to Arch for some time, but quickly jumped back again when Sid got moving again. Repeat for Wheezy, but this time I'm so happy with Arch that I'm not sure I'll be moving back. Things work, and I've got all the packages I want.

Arch is as near as it comes (except maybe Gentoo and Linux From Scratch) to plain rolling Linux. It's got some great features, as well as one or two mistakes. The mistake that matters is dropping sysvinit as an option - not just adopting systemd as the default, but outright kicking sysvinit to the curb. That is hardly unique to Arch, though.

Also the Arch devs are entirely willing to let pacman break your system if you don't religiously keep up with the announcements on their website. If you let a system go for a few months and then run pacman without reading the last several announcements, your system has an excellent chance of being hosed in a way that requires manual tinkering.

It was never a viable escape route. Debian just goes with the irresistible flow, very realistically. The escape route is very simple: Xfce. Period. And you can run it on practically any distro. Hell, I'll say it: ANY distro (compile from source if you have to, but that should be a very rare case).

Now we just have to figure out an escape route from systemd hell - or just accept it.

I dumped Ubuntu quite some time ago. The last Ubuntu install I had going, a web server, was shut down last fall. I've switched over to Debian, which has everything I liked about Ubuntu without any of the things I absolutely loathed about Ubuntu.

On the desktop I've switched from Ubuntu to Mint, but on the server I've changed from Debian to Ubuntu LTS. For my use case having more up-to-date software is more important than utter stability and the outside chance of major-vendor support for programs that I'm not going to run anyway.

In fact, my roll-my-own-distro choice is now Ubuntu Server, which is far less likely to spontaneously break than the current favorite riceboy distro, Arch.

Come on. Seriously?? Debian is nobody's bitch; certainly not Gnome's. You have a completely free choice of desktops in Debian, just as in practically all other distros. It's dead simple to select Xfce [debian.org], and it's dead simple to select KDE [debian.org], and it's dead simple to select LXDE [debian.org], just for example.

Why would you have them completely drop support for ANY major desktop? Open source is about choice. Choice is good.

As for "Ubuntu's bitch", color me completely mystified. I can't even begin to imagine how anyone can connect that to reality.

If that happens, it will be a fairly major step backwards for Linux on the desktop since developers will be on the hook to adjust to supporting not just multiple packaging systems and multiple library versions, but also multiple incompatible core system API's.

Unless they can convince the wider Linux community to adopt some of their technologies, Canonical is basically going to end up forking the platform. If that happens, it will be a fairly major step backwards for Linux on the desktop since developers will be on the hook to adjust to supporting not just multiple packaging systems and multiple library versions, but also multiple incompatible core system API's. Essentially Ubuntu will no longer be "Linux" in any way that matters to developers and all the support for Linux out there now will either die or just switch over to being Ubuntu specific and I don't see how that benefits anyone in the community.

Forking the platform, you mean like Apple did with BSD and Google did with Linux? I think Canonical isn't interested in having a Linux distribution. Just like Apple has OS X and Google has Android, Canonical's plan is to have Ubuntu as the operating system.

Sure, breaking tradition will cause a little more fragmentation in the Linux world, but is that so bad? We don't think our needs, or that of our users, are always met by sticking to the 'same old song and dance' so we're bucking the trend.

You forgot "...so we can claim exclusivity of the Linux desktop platform to ourselves only and therefore getting....profit?!" part of that claim.

Seriously, this is stupid. Breaking tradition isn't bad. But first wasting its money on repairing one alternative, then trashing it and picking another one just because you suddenly feel lucky about the mobile platform - that's a bad, messy strategy. I fail to see how this will work to Canonical's advantage.

I understand the desire to replace X. Big chunks of X either aren't needed any more or have moved into other locations (mostly the kernel). But I find it hard to believe that the direction and goals of Wayland are so different from what Ubuntu wants that it's worth starting fresh.

Maybe now that a display server has so little to do, it's something that a small team can knock up in a few months. In which case, maybe every window manager will end up being a display server.

It's the desire to trash everything and start again, but this time doing it *right*.

Big chunks of X either aren't needed any more or have moved into other locations (mostly the kernel).

Yes and no. Mostly no.

For better or worse, quite a bit of the hardware side has moved into the kernel.

The other bits (old-style graphics and font rendering) are no longer big. They were big in 1987, but by 2013 standards they're a few K, perhaps even a few M of memory. Utterly irrelevant.

The other parts of X work really pretty well.

Sure, there are warts. But the better solution is not to nuke it from orbit; it's to come up with protocol fixes that give things like persistence and fewer round trips (e.g. like NX). The trouble with nuking things is that all the edge and corner and even marginally non-mainstream cases just get thrown away too.
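The round-trip point can be quantified with a trivial sketch. The request count and latencies below are made-up illustrative figures, not measurements of any real toolkit:

```python
# Each synchronous request-reply in a display protocol costs one network
# round trip, so a chatty startup sequence scales directly with link latency.

def startup_delay_ms(sync_round_trips, rtt_ms):
    """Latency added by synchronous round trips during application startup."""
    return sync_round_trips * rtt_ms

lan = startup_delay_ms(200, 0.5)  # fast LAN: 100 ms, barely noticeable
wan = startup_delay_ms(200, 50)   # long-haul link: 10 full seconds of nothing
print(f"LAN: {lan:.0f} ms, WAN: {wan:.0f} ms")
# An NX-style proxy caches and batches requests, collapsing most of those
# round trips so the WAN case behaves much more like the LAN one.
```

Same protocol, same application; only the link changed. That is why fixing round trips at the protocol/proxy level pays off so much more than rewriting everything.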

X does a lot of things well, and large parts of the protocol have aged very gracefully. Did you know that copy/paste with advanced (non text) types and drag and drop is all implemented using mechanisms compatible with the original 1987 X protocol?

Oh, and you can pry my server-side decorations from my cold, dead hands :)

Also, what moron on the X team got rid of the keycombo to nuke server grabs for misbehaving applications? I think the reasoning was that it shouldn't be necessary because that's an application bug and should never happen. No shit it's a bug, Sherlock! Now these monkeys are trying to give us the next great compositor.

I really don't have the technical knowledge to praise or damn the idea, but as I understand it, there are some clever moves in this:

It appears that they rip out enough of Android that they can use Android graphics drivers for Mir, so that every device with Android drivers delivers "free" drivers for Mir too. That would give them a huge advantage in the smartphone and tablet arena.

QtMir, QtUbuntu, Qt/QML; it looks like Ubuntu dumps Gnome/GTK in favour of Qt5 for core OS (GUI) development. As I see it they will clone KDE/Qt, substituting the KDE parts with QtUbuntu.

Drivers aren't the issue - they sit a layer below the display server. On the desktop, Mir re-uses all the acronyms Wayland employs - GBM, KMS, DRM, OpenGL ES, etc. Mir will run on the same infrastructure, e.g. using nouveau but none of the 'legacy' drivers tied to X11. On mobile, see Pekka Paalanen's efforts to port Wayland to Android, using the pre-existing underlying Android graphics APIs.

I've been a user of Kubuntu since 2007, and a happy one too. I don't get why people only talk about Ubuntu and, when disappointed with it, switch to other distributions, when Kubuntu still gives you the classical desktop experience, and not something broken like Unity.

I hope that whatever they do with Mir, they don't end up breaking Kubuntu. At least it survived the Unity madness, and doesn't send your keystrokes to Amazon.

For someone who loves choice so much you're pretty hard set on X fanaticism. In any other arena X would be described as a monopoly. Should Canonical not be allowed the freedom to compete? Or should your zealotry force their roadmap?

We have competing window managers, competing graphical toolkits, competing desktop environments, X even has competing methods of rendering, a competing display server will make things interesting and looks like it's paving the way for easier cross platform application development.

Chances are Mir will be an open source, open spec standard under a nicey nice GPLish license allowing freedom of choice to distributions, application developers and end users alike.

Linux has been a fractured splintered platform for well over a decade, this doesn't really make that much of a difference.

While the claim of a Soviet-style dictatorship may be part of the drama, I can't agree with the rest of the claims, though.

"For someone who loves choice so much you're pretty hard set on X fanaticism. In any other arena X would be described as a monopoly. Should Canonical not be allowed the freedom to compete? Or should your zealotry force their roadmap?"

No, they can compete in any way they like. However, they trashed Xorg first, claiming it doesn't do what they want to do - ok, fine. Then they supported Wayland. Ok, it's n

>> "I also find the name to be odd. Do they name it after a soviet space station as an indication that they are planning to take away our rights in a soviet style dictatorship?"

I'm not sure if you're trolling or just ignorant, so let me share some knowledge in case anyone takes this silliness half seriously.

The Russian word "mir" is typically used to mean "world" or "peace", depending on its usage. The term "mir" can also be used in a similar sense to the English words "village", "community" or "global". The word "mir" is actually a perfect fit with the rest of Canonical's naming structure. Ubuntu refers to community, the Unity desktop is named with an idea of many coming together to form a whole, and Mir continues this trend as the term refers to a unified group or community.

This will go nowhere. Canonical has "completion" issues. Look at their past track record on Linux. They focus on a feature for a release or two and then either declare it done or stop talking about it. They were going to make everything easy: printing, wifi, audio. Pulse Audio is still far from perfect and network manager still has issues. Then we have 10 second boot times, better looking than Mac, desktop notifications, Wayland, and 200 million users by October 2013.

They were going to make everything easy: printing, wifi, audio. Pulse Audio is still far from perfect and network manager still has issues. Then we have 10 second boot times, better looking than Mac, desktop notifications

Hmmm, maybe they have done a lot? Pulseaudio hasn't failed me (except when using Skype, but I blame Skype), network manager is great; ok, no 10 seconds; better looking than Mac? Maybe. I also like the desktop notifications - much better than ones you must click away.

Regardless of what anyone thinks about where Ubuntu is going and how sucky their plans are, your points about completion don't really stack up.

Wayland, Pulseaudio, Network Manager, etc. aren't Ubuntu/Canonical projects. The projects you mention are more like Red Hat projects than Ubuntu ones, and Canonical generally had very limited influence over them.

The projects that Ubuntu did start and does run (e.g. Unity, Upstart, etc.) are the ones that they seem to have committed to for the long haul.

My points are valid. I remember when Ubuntu took up each of these issues and adopted or created software to solve them.

Network manager is far from perfect. Try setting a static IP address for your wired adapter with network-manager. Or getting a working bridge going. Or having a wi-fi connection active upon booting a computer but before logging in. When Ubuntu adopted network manager, people filed bug reports and brought up those shortcomings. Ubuntu said it would get taken care of in the next couple of releases. They did not.

They said we would have graphical boots from GRUB to desktop. They did work on it for a bit. But even now, most desktops do not have a graphical boot from GRUB. Forget about that as an out-of-box experience with an Nvidia card. From not working in GRUB, you move to not working in Plymouth. Again, Ubuntu did not create these technologies, but they did adopt them and set as a goal what they wanted to do with them. Then they fell short, got bug reports, and promised they would fix it in a release or two. After a release or two, they announce another half-baked initiative and move on.

Does Pulse Audio work? Yes. Does it still have issues? Yes. Can it be a pain to get software designed to work with OSS or ALSA working with it? Yes, it can. I have every right to complain. Ubuntu promised 6 years ago, when they adopted it, that they would get it all fixed and sorted out. They have not.

You mention Unity and Upstart. Upstart still is not delivering on Ubuntu's promised sub-10-second boot times, which, by the way, were promised with graphical boot screens as well. Still not happening. What about 200 million users by 13.10? Again, another half-baked promise.

Ubuntu has done a lot. The Linux desktop is better off than it was in 2006. Ubuntu has helped improve some of these projects. But so far, every time Ubuntu announces an initiative and makes some big claim about what they will accomplish, they end up doing a half baked job when you look at how well they have met their objectives.

- 200 million users by October 2013
- 10 second boot times
- Desktop looking better than OS X
- 100% graphical boots on all Linux systems
- Network manager as robust as the OS X or Windows XP network manager
- Pulse Audio as robust as the OS X or Windows XP sound system

I am not the one making these promises. Ubuntu is. They are the one telling us we should all hop on board and promote Ubuntu to all of our friends. All of this great stuff they are doing.

What I see are half-baked, half-fulfilled promises. Being told we are a community, and then, the minute the majority of us don't like something - like the close button being moved to the left side of the window, or Unity - we are told Mark is in charge and it is not a community decision. I see the word Linux purged from anything Ubuntu is involved with. I am tired of being lied to and treated like the ugly girlfriend that Ubuntu wants to have sex with but will not hold hands with in public.

Commercial success? What commercial success? What makes you think that forking even more of their core system will actually lead to any sort of commercial success? If anything, they are just increasing their own burden.

Ubuntu does not represent "commercial success".

If I wanted to whine about commercial success like some "hippie", then I would whine about some other company.

It's not the commercial success itself - pretty sure more people would grant Ubuntu leeway if they were commercially successful - but their aspirations to commercial success are ultimately taking them down the wrong path.

The description of Mir's goals in TFA wasn't really very clear, but it is true and widely recognized that *some* alternative to X11 is badly needed. X11 has become FOSS's own "X.25 elephant", like something conjured up in the committees of the ITU, whereas what open systems need is a lightweight "TCP/IP gazelle".

Whether Canonical can produce the goods is a different question entirely, but the need for such an alternative is rapidly approaching certainty.

The only way it could possibly become the TCP/IP of graphics is if it were licensed as BSD or something similar.
TCP/IP is an open standard today because of BSD-licensed code.