I'm sure I'm in the minority here, but I actually like the innovations Canonical keeps bringing to Unity (although I'm not in love with the Amazon searches). My problem with it is execution: run Gnome Shell on identical hardware and it will always run circles around Unity, especially in terms of UI responsiveness. Just press the super key after you've been doing something else for a while and start typing: in my experience Gnome Shell responds almost instantaneously, while Unity can take as long as several seconds to bring up the default lens on decent hardware.

I like a lot of their design decisions like the global menu, dechroming maximized windows, easy custom lenses etc. But the compiz/custom code base just isn't up to the task. Worse, unity is so tied to their non-mainline gui stack that it is a herculean effort to get it to run on anything besides ubuntu, preventing them from getting fixes from outside their ecosystem.

I know there is a lot of bad blood these days between gnome and canonical but they really need to bite the bullet and rebase on mutter, gtk3 & gnome shell. Canonical is great at a lot of things, but core performance oriented software just isn't one of them. Let somebody else do the heavy lifting.

I haven't gotten a chance to try the Gnome Shell spin that they're producing this cycle. Can anyone comment on how well it's working? My experience trying to go vanilla Gnome 3 on Ubuntu in the past has been less than stellar, but I really appreciate the Ubuntu community and things like Launchpad PPAs.

You can't configure it without an absurd number of extensions, the default interface makes me want to punch kittens, and it wiped my group policies and network profiles, so I'm still trying to repair my startup procedure; right now I have to start network-manager manually each time.

I won't blame that necessarily on Gnome Shell though. I will say I don't care how speedy it is, I can't use it for anything in its default state, and after a couple hours trying to get enough extensions to make it manageable, I just gave up. If I want a Gnome 3 desktop I'll use Cinnamon.

The only plus for Gnome Shell is that the Super + search syntax that I love from many modern desktops works, but it also works in Unity (albeit slowly) and Cinnamon (albeit not as well: Cinnamon doesn't intelligently reorder results, it just lists them alphabetically).

This isn't meant to be a retort - I agree that there is a lot of mess in deb-ubu/gnome 3. But if you have the time, try fedora on bare hardware to get an idea of how gnome is supposed to work. You probably won't want to stick with it for a variety of legitimate reasons, but the core ui feel is very, very nice - and that's what I wish I could get under ubuntu.

I've used Gnome Shell on the last two releases of Fedora. I've grown comfortable with it, but the idea of having to rely on multiple extensions (9 in my case) to get to that state is too much. Couple that with having to worry whether extensions will be updated to work with each new release of Gnome. I had been interested in Cinnamon but was under the impression it was the distinguishing feature of Linux Mint. Cinnamon is now available for several distributions including Fedora (yum install cinnamon). For me, Cinnamon is what Gnome Shell should have been, and I have no problem recommending it.

I still have a bad feeling about how the Mint team have been actively forking every GTK3 project Gnome has. Especially when they forked Nautilus into Nemo, I really feel like they should have gone with Thunar or Marlin. Maybe they should have also opted to try unifying under XFCE and fixing gtk3 to work with it instead of just forking Gnome Shell.

I dunno, I just feel like the direction of "fork everything even when alternatives with our same mindset exist" seem like a waste of effort.

The alternatives aren't GNOME, and many people liked GNOME until very recently. (Well, it's been a year or a year and a half now, I guess, from the end-user perspective. Longer in development.)

It seems like Mint is trying to execute a hostile takeover of GNOME. Of course, GNOME is free software and forkable, and branches aren't necessarily subservient to the lines they branch from, so "takeover" means "establish a reputation as the highest-quality fork".

If the majority of the community begins to respect and support Mint as a better GNOME than the current stewards of GNOME, and Mint "rescues" the good majority of the GNOME software, we can jettison the old GNOME management, welcome the migration of contributors from old GNOME, and carry on.

XFCE is kind of Gnome. Built off gtk2 and all. It was never mainline, super-integrated Gnome, and it reimplemented a lot of dumb shit that didn't need rewriting (see: the fudging calculator, the terminal emulator, etc.), but I still feel like Cinnamon/XFCE should be in the same camp. They want the same thing, after all.

I ran Verne for a bit in a VM and from a live CD, but the Ubuntu Software Center has a convenience factor for Humble Bundles, and since Ubuntu is the target of a lot of Linux adoption efforts such as Steam, I'm kind of platform-locked to it now.

I still run Arch as my side distro for having a machine I'm prouder of. Upstart can suck it.

I always thought Unity was all right in general; Compiz in particular has a wealth of functionality that just doesn't exist yet in Gnome 3. The Unity dock is very good, particularly the way it deals with multiple application windows (broadly: first click brings the LRU window to front, second click exposes all application windows).

But I still think the "Lenses" thing is just awful. It's fine for starting a specific application by typing in the name (the most basic "Quicksilver" functionality). It used to be really slow, but I think that's better now. It still sucks for application discovery, which the old hierarchical menu excels at. And the other lenses are mostly just silly: you've got this simplistic interface, and people are trying to shoehorn all kinds of use cases into it. Look at [0] and [1]. Some of those would be much more useful if they weren't limited by the restrictions of the Lens framework (others are just hopeless).

I'm giving Unity another shot these days and it's still frustrating me. I still find it significantly less convenient than the old Gnome 2 was, and that's after tweaking some of the more annoying offenders (simple Alt press bringing up a huge window).

For example, the "expose" feature shows a bunch of windows and lets you browse between them with the arrows. However, they lowered the contrast between them so much that I really have no idea which window I'm currently selecting.

Similarly, multiple windows barely show which is in focus, due to this weird fetish for near-zero contrast.

Also, I'm still missing my task list, wanting to know what I've been using to remind me where I stand in each desktop. The little arrows are much less helpful.

When I press the winkey, it waits something like a whole second before it shows me the Unity bar's numbering, and without those numbers the keyboard is useless. I use the keyboard to work faster, not slower, but Unity is slowing me down here as well.

These are just the criticisms off the top of my head, but overall it's been a pretty annoying week or so with Unity.

I noticed the lag when pressing the super key in Unity as well. Then I found out that Super+a is apps, Super+m is music, etc., and that any one of those usually comes up faster than a pure Super press (for the default lens or whatever). I'm wondering if it's "waiting" just a bit for that +[a,m,etc]...

Be aware that this isn't a great user experience: a number of Ubuntu-specific features will work poorly or not at all while you're logged in under Gnome Shell, and several Gnome 3 desktop integration features also won't work. It's mostly fixable, but very much a trial-and-error experience.

I don't like the design of Unity, but I put up with it for a simple reason - it's been working stable without crashes here from the start. As long as they keep that I'm fine and hope they improve the rest over time.

One word of advice: don't do the automatic upgrade, do a full reinstall instead. I've tried the upgrade for the last 5 or so releases, always thinking that this time they'd fixed it. I'm not falling for that again. You'll see, as always, some reports of people saying they tried it and it worked. But be careful, it's a trap!

I really don't understand why the upgrade seems to be so hit and miss between people. My current laptop has been on the same ubuntu install since (at least) 10.04. I've even upgraded to beta versions a couple of times and the only problem I've ever had was when I upgraded to the 12.10 beta and it ran out of disk space in the middle of the install. But that was recoverable.

Of course, this comment might be a trap. There's only one way to find out...

I had problems updating from 9.04 to 9.10: I remember it messed up my system so badly I had to reinstall it from scratch. After that, I moved to Mint for a while, then tried Ubuntu again for 11.10 and did the update to 12.04 without any issue. I will try upgrading to 12.10 within the system again this time.

It's because the more customizations a user has done, the more likely they are to have made a change that is incompatible with the upgrade process. It's the same for any OS. The only mitigation is to allow less customization of the files that get upgraded during an upgrade.

Server upgrades can also be hit-and-miss if you've got any irregularities in your setup... I had a rough upgrade from 10.04 to 12.04 a few weeks ago (it died halfway through, leaving the system in a half-upgraded, broken-but-bootable state). In the end it turned out to be due to a no-longer-maintained package (gitosis) which I'd replaced some time ago (with gitolite); I hadn't actually removed the gitosis package, and there was some hangup about deleting the gitosis user.

In fairness, after manually resolving the problem with removing gitosis and cleaning up a couple of other half-upgraded things, the upgrade process did recover and finish successfully, but it certainly wasn't painless.

My installation media is 9.10. I had one troublesome update where it only partially completed, but it was happy to finish after a reboot (11.10). Other than that, I believe I've only had one thing behave other than as I would like. In the upgrade to 12.04, it hit a point where something I had changed in /etc (turning off AppArmor for Evince, so that I could use a custom hyperlink protocol, which made it easier to work with LilyPond) caused it to stop, awaiting a decision from me on what to do with it. As a result, I got up in the morning and lo: it was only a third of the way through. Not what I had intended.

How are people's experiences with using Juju based setups on real production servers? Was anyone "lazy" enough to try this?

--
EDIT (adding context): the article says about the use case for Juju that "configuring and setting up complex services with lots of application components can still be a bit tricky"... then goes on to give a basic LAMP WordPress setup. The thing is that a LAMP WP setup is neither complex nor does it have lots of components; it's basically the drop-dead simplest case of web app deployment, so they are clearly missing the point using this as an example :|
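For what it's worth, the article's WordPress walkthrough boils down to just a handful of commands, which is part of why it feels like a weak showcase. A sketch (the charm names are the stock 12.10-era ones, an assumption on my part; the commands are written to a file rather than executed, since `juju bootstrap` provisions real machines):

```shell
# Not executed here: juju bootstrap would spin up actual instances.
# Charm names are assumed to match the 12.10-era charm store.
cat > /tmp/juju-wordpress.sh <<'EOF'
juju bootstrap
juju deploy wordpress
juju deploy mysql
juju add-relation wordpress mysql
juju expose wordpress
EOF
cat /tmp/juju-wordpress.sh
```

Five commands for a two-service stack; a genuinely complex topology (load balancers, caches, replicated databases) would be a far better demonstration of the tool.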

[Unity] has received a number of additions, all built around the idea of demolishing the walls between local applications and web applications

Here's why Ubuntu is on shaky ground these days.

The above idea is grand, but like everything that tries to be the future of something (in this case computing), it leaks. And leaky things never take off for real, much like leaky abstractions don't hold: the truth leaks out sooner or later, and then the abstraction isn't worth much, since you can only trust it superficially.

Early Ubuntu, Windows 95, and XP had something in common: they were all built mostly around how the computer worked. These operating systems tried to make the underlying computer available to the user, give or take a few sugar coatings. And they all pretty much succeeded.

Conversely, I think most systems that pretend to be something they aren't will not succeed. Web applications won't become local applications just like that: the user will see the visible glue that holds the parts together. You've seen it so many times: something arrives with great features that only work until you really want something done, and then it turns out the system doesn't do its magic all the way through. You only get the one kind of magic that was preprogrammed into it, and once the initial impression fades you observe that this one kind of magic can't deal with everything you need from the system. Then you can't trust the system anymore, since you know there's more available than the system can agree to offer you.

Ubuntu is still basically a local installation: some stuff can originate from the cloud, but it cannot be a grand computing environment that unifies web and local services, because it matters that the user has his own local installation. You can't boot Ubuntu from a USB stick and have your environment seamlessly load from the cloud.

Systems like Android or iOS are much better positioned for seamlessly integrating local and web applications and local and cloud services. Using a tablet interface you don't really have a sense of local vs. web at all: you just have apps, and once you sign in on another device your apps are available automatically. This is maybe what Shuttleworth is envisioning with Unity and his current plans for Ubuntu, but the downside is that the regular Ubuntu desktop will suffer.

Ubuntu suffers because it doesn't pay respect to its natural, physical environment that is a local computer. It can be a highly tuned system that takes the most out of your hardware or it can be an ethereal, ubiquitous cloud service that's available regardless of hardware. But not both.

"Ubuntu suffers because it doesn't pay respect to its natural, physical environment that is a local computer. It can be a highly tuned system that takes the most out of your hardware or it can be an ethereal, ubiquitous cloud service that's available regardless of hardware. *But not both.*"

Emphasis added there at the end.

My claim is that "we", folks who engineer software and create products, have made those software products indispensable to folks who never used to care about computers much less shell out money to own one. They never did want to buy a computer, they don't want to buy one now, what they want is the function that is provided by some app or collection of apps. These people want to follow tweets, or facebook, or chat, or see cat pictures, they buy computers to do that because they have to, not because they want to. What is worse, the 'computerness' of computers, their re-programability, their flexibility, causes more problems for these people than it solves. They want turn-key, instant on, instant off, tools.

And folks are making these tools for them: Chromebooks, and iPhones, and iPads, and Slates and Surfaces. They are marketed as tools that get a particular job done, not as universal tools. Can you imagine a power drill where you take the motor off and use it in your mixer, then take it off and use it in your desk fan, and then take it off and use it to pump water and wash your deck? No, you get separate tools for those tasks, and they all have a motor in them, but the motor isn't universal; it's optimized for the tool. We are moving that way with processors. No more "boot whatever you want", no more manuals describing the instruction set or peripherals, no more general-purpose tool chains or operating systems. Processors designed to do one thing well, like being a "phone", with proprietary value-added parts (like a GPU) and special instructions (like Jazelle) which you only get to know about if you agree to buy a million a month and design them into your specialized product.

The needs of an operating system for one of those devices are very different from the needs of a programmer's or developer's operating system. Sure, they share some things in common (both render to a screen), but how and when they do that, and what API they use: those things are important to a developer but not to a tool/appliance user.

Yason is exactly correct that Ubuntu is standing astride this crevice while it widens underneath them. Soon, unless steps are taken, it may find itself neither fish nor fowl: an unacceptable environment for developers (too closed off) and for appliance users (too technical). Personally I'd love to see two desktop distros, one "End user" and one "Developer", with very different design targets.

In the past month or two, I've come around on Unity after hiding out in gnome-session-fallback and then cinnamon. With the launch bar icons shrunk to their smallest size and Docky on the bottom, the Unity interface has gotten to the point where it's smooth and usable. Call me a convert, but I decided Unity has gotten to the point where it's easier to work with it than to keep trying to work against it.

So I use Ubuntu 12.04 on my laptop and Linux Mint Debian Edition on the desktop. LMDE is OK, but I'm not sure Mate is a long-term solution. On the one hand it is great that they have maintained the old look and feel of Gnome 2, but forking so much "crap" seems like a huge wasted effort.

For example the configuration changes - instead of gconf, you now have mateconf, which uses a whole other set of files that are not compatible with gnome. So switching from an old Gnome desktop (say) to Mint+Mate means your settings are all lost, even though the forked code between gnome2 and mate is really close.
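To give a concrete flavor of how close yet incompatible the two are: the key trees differ mostly in the root path, so a raw gconf dump would need a mechanical rewrite before Mate could plausibly use it. A sketch (the key name below is hypothetical, and whether mateconf would actually accept a rewritten gconf dump is an assumption, which is exactly the problem):

```shell
# Fake one-entry gconf dump (hypothetical key) to illustrate the rename needed.
printf '<entry name="/desktop/gnome/interface/font_name" value="Sans 10"/>\n' \
  > /tmp/gconf-dump.xml

# The purely mechanical part of a migration would be just a path rewrite:
sed 's|/desktop/gnome|/desktop/mate|g' /tmp/gconf-dump.xml > /tmp/mateconf-dump.xml
cat /tmp/mateconf-dump.xml
```

Even if that rewrite worked for every key, nobody ships it, so in practice your settings are simply lost when you switch.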

On a default LMDE install it adds gnome-terminal and mate-terminal, so on the menus you have 2 "Terminal" entries. Ditto for a load of other core applications that are in Gnome and Mate - stuff like Archive Manager, screenshot, and document viewer. Was there really a need to fork file-roller?

Debian have already pretty much said "no" to including Mate, the reason being that the duplication and resuscitation of crufty old code is not a good idea. It would be better, they say, to work with Gnome upstream to get a more Gnome 2-ish feel in Gnome 3. I think Cinnamon is in Debian or will be shortly, which probably has a brighter future.

Anyway, for now I'm happy with Mate. It works fine and didn't need any tweaking to get to a usable state; compare that to the Gnome 3 classic session on Ubuntu 12.04, which I spent weeks tweaking and patching (!) to get back to a state that worked like 10.04. But I think its days are numbered: the Mint devs alone can't keep a full-on Gnome 2 fork alive forever, it has too many duplicated applications that are already fine in base Gnome, and eventually I think Cinnamon will be the only viable option for Mint.

I question whether it is more profitable to move forward and have what amounts to really good devs building mashups than to go back and focus on OS stability and core functionality (the things you arm salespeople with when selling to corporations, and that provide for greater general user loyalty).

The immediate rebuttals for my comment will be (1) user studies, (2) the os is fine, (3) were making money, (4) customers told us.

The not so obvious rebuttals will be:
(1) someone told us to do these, (2) we've reached user and customer saturation, (3) desktops aren't the rage, (4) core customers/users moved to using server only and we're reeling them back in to desktop, (5) corporate entities cannot perceive enough value, thus all the new mashups.

It is this second set which I want to hear discussed and why they are or aren't the case.

Basically my guess is that core users are moving away from where these guys are spending money and they think spending more in the area will bring them back.

Take it with a grain of salt, but the marketing theory is that people, when obtaining something (convinced or otherwise), will benefit if the product in context serves all their needs, and more if possible. That grey area or boundary between core and nice-to-have is what is in question here. The more they use it, and the more it serves them, the more likely they are to return to it (the product context).

Yes I do use Firefox and that's a rather silly assertion if you consider the technical implementation.

In the case of Firefox, a default search page for Google is included in the Firefox chrome, as is a default search engine. The default page doesn't call home at all until you enter a search, i.e. until you use it. The rest of the browser is functional without using this feature, so you can type "duckduckgo.com" without transmitting data to Google.

The same is not true for the Unity shopping lens which transmits your searches for local data and applications on your workstation to Canonical and Amazon by default.

Sure, I'm not agreeing with the default out-of-the-box functionality, but other than the fact that the user expectation is that a local search does not transmit information to remote servers (the same issue as Chrome's omnibox for typed URLs), it is exactly the same. But maybe Canonical should make explicit what the default does (maybe they do, I don't know).

But in the case of Firefox, you are explicitly trying to do a web search. While in Unity, you are just running your local app launcher.

Confusing a private local search for your own files with an online search that sends information to a third party is a fairly substantial privacy issue, and the fact that Canonical is just blowing it off (by saying "we already have root") is incredibly concerning regarding their judgement.

If it's only coming with 3.2.3, it will break many existing tools (think Django).

If you look at the Mac, it comes with older versions of Python too (2.7- and 2.5-based), so it supports older Python projects without breaking the system.

The latest Ubuntu might well come with older versions of Python too, as many Ubuntu apps and tools themselves use them. But the default python will be 3.2.3.

To use the right version of Python, we would have to manually install the older versions if the system doesn't ship them, then symlink the installed versions into /usr/bin and set $PATH appropriately for the version we need, calling them as python2.7, python2.6, or python2.5 when running scripts.
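A minimal sketch of that kind of per-user setup (the /opt install path is hypothetical, and linking into ~/bin sidesteps touching /usr/bin directly):

```shell
# Hypothetical: a manually installed Python 2.7 lives under /opt/python2.7.
# Link it into a per-user bin directory instead of /usr/bin (no root needed).
mkdir -p "$HOME/bin"
ln -sf /opt/python2.7/bin/python "$HOME/bin/python2.7"
export PATH="$HOME/bin:$PATH"

# Scripts can then pick the interpreter explicitly via the shebang:
#   #!/usr/bin/env python2.7
```

The `env` shebang keeps the script portable across machines where the interpreter lives somewhere else, as long as it's on $PATH under that name.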

It will still be really easy to have Python 2 & 3 installed side by side, and "python" will still be Python 2. So you don't need to do any symlinking or messing with $PATH.

I think Ars misunderstood, as well: 12.10 has both Python 2 and 3 by default. They're working towards dropping Python 2 from the default install, but it's not there yet. When they do, it will just be an "apt-get install python" away.
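An easy way to check what any given install actually ships, rather than guessing from release notes (nothing here is release-specific):

```shell
# Print the version (or absence) of each Python interpreter on $PATH.
# Older Python 2 releases print --version to stderr, hence the 2>&1.
for py in python python2 python3; do
  if command -v "$py" >/dev/null 2>&1; then
    printf '%s -> %s\n' "$py" "$("$py" --version 2>&1)"
  else
    printf '%s not installed\n' "$py"
  fi
done | tee /tmp/py-versions.txt
```

On a stock 12.10 install you should see both a 2.7.x and a 3.2.x entry.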

I don't know. I wish I did, because then I wouldn't have made the above comment. :/ It would not surprise me to find the occasional tool that still relies on it. As does my entire scientific library, unfortunately...

I have been running Java on Ubuntu each release by just downloading it directly from Oracle and exporting my JAVA_HOME and PATH variables. This is a pretty clean approach and allows me to use different JVMs for each type of application, and you can use update-alternatives to do it the Ubuntu way.
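Roughly what that looks like in practice (the /opt/jdk1.7.0 path is just an assumption standing in for wherever you unpacked the Oracle tarball; the update-alternatives registration is shown commented since it needs root):

```shell
# Point the environment at a manually unpacked JDK (path is hypothetical).
export JAVA_HOME=/opt/jdk1.7.0
export PATH="$JAVA_HOME/bin:$PATH"

# The "Ubuntu way" instead: register the JVM with update-alternatives so
# /usr/bin/java itself points at the chosen install (requires root):
#   sudo update-alternatives --install /usr/bin/java java "$JAVA_HOME/bin/java" 100
#   sudo update-alternatives --config java

echo "JAVA_HOME=$JAVA_HOME" > /tmp/java-env.txt
cat /tmp/java-env.txt
```

The plain-export route keeps each shell session independent (handy for testing several JVMs), while update-alternatives changes the system-wide default.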

Everyone has their own preferred setup, and that's great! It's part of what makes FOSS great: Your choice.

My choice is Xubuntu. I've been using Xubuntu on my laptop for the past 6 months, and the 12.10 beta for the past two.

The link below is my task list after install to really make Xubuntu shine. Oh! And now with updates to the Ubiquity installer, FDE (full-disk encryption) can be done from the GUI instead of fiddling with the debian-installer text partitioner!

I was so fond of Ubuntu before they introduced Unity and Gnome 3. It was responsive and easy enough for a beginner, yet it didn't get in the way of an expert user (apart from the lack of regex support in the default editor, perhaps).
I tried Xubuntu, Kubuntu and other distros, but I kept going back to the default Ubuntu one.
I used it as my only OS for years, but now, sadly, it's the opposite and I boot it only when I need it.

I guess I should give up trying to like Ubuntu and start using XFCE instead. Thank you for your list; I will use it to check out this desktop environment.

I'm the opposite. I used XFCE for a decade and stuck by it because it felt familiar... it felt like home. With the release of 12.04, I switched to vanilla Ubuntu because I feel that's what I'll see the most in the tech industry and I wanted to get used to it.

It took about 2 days to get used to it, but now I find myself using super+(search term) for everything instead of trying to find stuff in an applications list, and that has spilled over to my Win7 use as well.

I still miss a few functionalities, mainly right-click->open terminal here, but I find myself reaching for my mouse MUCH less and I'm not wasting time trying to find something buried in a list or folder somewhere. Search is FAST (on both platforms).

You can just install gnome-do on Xubuntu and you'll have the same search-based workflow in the interface you're used to, with the Ubuntu base below.

I TRIED to like Ubuntu's Unity, but I can't either (a) right-click things like panels, choose 'configure' or 'properties', and adjust everything to my liking (the Windows way, which I still prefer), or (b) use a goddamn config/settings center where I can actually adjust the UI settings I care about (the KDE and Xfce way that works...). I mean, I still don't know how to disable window grouping for the damn "taskbar".

Yes, that's fine and all, but Xubuntu no longer (for me) has the lightweight advantage that it once had over Ubuntu and I have seen (again, personally) significant performance gains from Ubuntu OVER Xubuntu. Once you have to start installing GTK and KDE toolkits just to be productive on Xubuntu, it becomes quite heavy.

Yeah, Xfce is great. I don't know why more people who keep bitching about Unity and Gnome 3 don't just switch to Xfce. It's a solid DE that doesn't pull the rug out from under you on a whim because the devs decided they wanted to copy Apple.

Thanks for this. I have been on the lookout for a lightweight OS like CrunchBang to upgrade my desktop from Ubuntu 10.10. Being an intermediate user, I don't want a lot of bells and whistles, but I appreciate a more responsive system. I manage a lot of Ubuntu servers, so I would prefer an Ubuntu development environment as well. In fact, I am seriously considering using a virtual machine for this purpose.

The first release with Unity, I tried it for several days before giving up. It was just too buggy with the Compiz setup I wanted/needed for some things; as soon as I touched anything in the configuration, or with certain options enabled, it would crash and require a reboot/gdm restart.

The second release I tried again. It almost passed muster, but it was still just too buggy.

The third release I tried again.

I never went back to the GNOME desktop.

I love Unity. It's not perfect. In things related to keyboard shortcuts in particular it's still got a ways to go, but it is good. So much better than the GNOME 2 desktop. (I've never tried GNOME Shell, so I can't comment on it.)

How's the performance? I tried a beta of Quantal a couple weeks ago and Unity was so slow compared to vanilla Gnome that it wasn't really usable (I have a Radeon HD 3000 card, using the FOSS driver). I hope it's not that bad in the actual release.