
During the last six years I have spent countless unpaid hours bolstering Ubuntu's position as *the* Linux desktop. I am not alone. We have collectively ensured Ubuntu's current position. We did it because we believed in Mark, a former Debian developer, and his intentions. Canonical is not free to do whatever it wants, not without consequences. We gave him our trust; in return we expect decent behaviour. We expect him to follow the fundamental principles of free software. We expect Canonical to be a positive force in the free software world, and to collaborate with it. Mir may just prove to be the final nail in the coffin for Ubuntu's desktop dominance.

I was responding to the notion that Canonical was intending to be nothing more than an integrator of community-provided free software. It's an unsustainable long-term strategy, financially speaking.

Please do not put words in my mouth. That was not what I said, and it was not what I meant. For instance, I consider Red Hat a good citizen of the free software world; they are certainly much more than integrators. Ubuntu itself brought Gnome to new heights in 10.04, and even brought me to leave KDE for a while, so they did far more than integrating. The hundred paper cuts was an excellent campaign, exactly what we needed. Unfortunately Mark got greedy when the netbook boom arrived and priced Ubuntu out of the market (according to the information I got), a horrible mistake that cost us all. The CLAs are another provocation where Mark failed to see the writing on the wall. The list adds up. It is pretty clear that he plans to make money on dual-licensing Ubuntu with proprietary extensions. I did not plan to go on a wild bashing rant against Canonical, and I hope you respect that by avoiding more strawmen.

Well, you posted the graph. I agree that their numbers are probably somewhat subject to anomalies because the Linux base is so tiny - I don't think their sample size is huge. But the trend is more or less the same as w3schools: slow growth, a short period of faster growth, then flat. It's been flat for two and a half years; that's hard to write off as an anomaly, especially when w3schools shows the same thing.

The graph that I posted did not have such marked anomalies. But if you go all the way from 2008 until today, you will cross several of them. The w3schools numbers are indeed interesting and any alternative explanation would probably have to account for the apparent coinciding numbers.

Originally Posted by AdamW

So it often doesn't matter much to you what it is that provides that shell, so long as it's competent. Ubuntu would do fine. So would Fedora, or Arch, or RHEL, or anything else at all. Maybe you like one more than the other in some other context, but in the EC2 context, it probably just doesn't matter.

I am not convinced. If you are most familiar with Fedora (or your company has standardized on it), why would your preference suddenly not matter in the EC2 context, leading you to choose Ubuntu? Just because the list is full of Ubuntu images? I could understand why someone who otherwise prefers Gentoo would rather not pay by the minute for gcc compilation. But I imagine that everyone else enters their preferred distro in the search box or chooses it from the platform list. In other words, I think that EC2 share is heavily influenced by what people use at home or at work. Except maybe when it comes to Windows, but there could be some distrust there, as even Microsoft Azure has to reset its "days since last major outage" counter way too often for my taste.

Originally Posted by AdamW

So the fact that Ubuntu goes to the trouble to make tested images that work right out of the box available right from Amazon's AMI download page counts for a lot. You're not going to bother going out and trying to find Fedora's or Arch's AMI images, which probably don't work as well anyway. Because Ubuntu's done the work to be the big prominent Linux choice on the AMI download pages, you'll just grab Ubuntu. It's there, and it'll get you to a Linux environment with a shell, and you're happy. You probably don't care about expert paid tech support for your EC2 instance, or whether it'll still have updates in five years' time, or any of that crap; you just want a working Linux environment.

This can't be too much trouble, can it? To get your EC2 images properly set up and QA'ed, I estimate that you need one or two full-time engineers and a dozen or two community members to whom you generously offer free EC2 instances if they help you with testing and bug squashing. If that leads to instant market share on EC2, why don't the other commercial Linux vendors help their distros in that regard? Either my estimate of how much work that is must be off, or "tested images that work right out of the box" is not enough to be attractive to EC2 users.

Originally Posted by AdamW

Personally I vastly prefer engineers to analysts. RH engineers have been building the basic pieces of the cloud since before the cloud was a thing. We're perfectly on top of the trends, thanks.

May I suggest then that RH make their new company slogan "We worked on cloud technology before it was cool"?

Originally Posted by AdamW

It seems plausible, I'll give you that; I'm just not sure I'd be confident stating it as an unalloyed fact.

Uh-huh, you keep telling yourself that. Come back to me when you've actually figured out linguistics and how it relates to math and other languages, and learned enough history and the way things work not to make an absolute fool of yourself. In fact, reading up on history is perhaps the best thing you can do to get back in touch with reality, since you don't seem to have any clue about how the development cycle of anything actually works.

Counting from the creation of assembly in 1949, programming as we know it is 64 years old.

Counting from the creation of C in 1972, modern languages have been around for 41 years.

Counting from the creation of C++ in 1983 and its ratification in 1998, object-oriented programming has been around for 30 years and usable for 15.

(Yes, one can argue for dating this to other languages besides assembly, but the ultimate point is that software engineering is an extremely young field.)

On the other hand, building engineering has been around for almost as long as humanity has, so it's to be expected that they get things right the first time (although even then it's not always true, see Galloping Gertie), because all the hard R&D is done. Space travel, counter to your spouting, is the perfect example of why that Adobe guy and you are full of shit. Don't take my word for it: actually research the history of NASA, and realize that there's a real world out there, and in that real world the math doesn't always work. Should you be surprised? No; math is a model, and so by definition can't be the real thing, and so you have to expect limitations. Solutions aren't just automatic when you do the math; there is development and planning of models required.

Dark matter is a perfect example of the limitations of math. Despite the sci-fi nonsense, dark matter is purely fudging the calculation in order to make it work. That's right: it's a hole in the model, so they shoved a placeholder in there to balance the books, and it's not the only placeholder they have for that particular hole, just the one that's well known because the sci-fi writers liked the term.

When we finally have enough quality, permissively licensed, well-tested modular components available to cover almost all use cases (such that people aren't having to write their own), we will finally be able to do the equivalent of just throwing out a bridge on time and without meaningful bugs. Until that point, writing software is and will continue to be in the R&D stage of an engineering field. That said, for being 64 years old, software engineering is in an absolutely amazing state; it is one of, if not the, most rapidly developed engineering disciplines to have come into existence.

WTF?
Just ask Knuth or Dijkstra (if you could) what programming really is, and stop throwing out things that add up to nothing. I'm a mathematician and you have no fscking idea what it is really all about, so stop talking about it as if you did, because you are embarrassing yourself. Also, linguistics? WTF, you have no idea what you are saying.
Now, keep thinking that OOP is the true foundation of programming (I don't care if you don't want to open yourself up and have such a NARROW view of programming), but stop implying that Knuth or Dijkstra or even Stepanov are just fools, or implying that Alonzo Church and Kurt Gödel are too, because you owe much of what you do to people like them.
"Math is a model and so by definition..." Haha, you are lost...

Fedora is lightyears ahead. And if one can't install it on their hardware, then that's because they didn't do their homework on buying good hardware. Of course all the hacks and exceptions will be added later on so it can be installed on shitty chipsets and/or pisspoor driver hacks.

Never before was my Linux experience this pitch-perfect. It screams performance, stability and quality I've never seen in a Linux distro, ever.

Ubuntu doesn't have multiseat (that requires Wayland). Ubuntu is only now picking up systemd. Unity search is piss-poor, and even malware. Unity doesn't come anywhere near Gnome 3.6. Ubuntu Touch will get slaughtered by Gnome 3.8's touch work. Unity has Mesa glitches that were fixed only recently in the open drivers (as can be seen with Intel and Radeon GPUs)... What was Canonical thinking? Probably has something to do with cognitive decline.

While Android grows more towards vanilla (it's still a GNU-toolchainless crapfest), Canonical seems to think that Android is the way to go. Boy, are they wrong...

PS: Oh, and don't forget the Gnome/KDE/GTK/Qt HTML5 work, so X.org can be axed. Yet Canonical thinks a C++-based Qt SDK and X.org are the way to go. Jeez... Even Microsoft realizes HTML5 is the future.

An ecosystem, simply put, is a system where organisms interact with each other and their environment.

A software ecosystem is a system where software programs interact with each other and their environment (the OS, or even the computer itself, if you will).

I don't see it as that much of a stretch. On the other hand, a software biosphere would be something to see...

If the term "ecosystem" applies anywhere in software, I think it's different Linux distros and how they interact with each other. That's a real ecosystem, with new organisms (distros) being born and dying, organisms evolving, learning, collaborating and sometimes fighting with each other, organisms spawning child organisms (distros spawning derivatives), mating with each other, eating each other...

Following the same metaphor, Android doesn't have or isn't an ecosystem - it's more like a colony or outgrowth of the Linux ecosystem. Windows and MacOS aren't ecosystems at all - they're more like zoos: you have to pay to enter and see the animals, some of the animals are glad to be there because they don't know of anything better, but ultimately the zookeeper controls what the selection of animals looks like and what kinds of animals the zoo is capable of supporting.

In a larger sense, the free software community could be called an ecosystem also - with each FOSS project being an organism. Sometimes, these organisms get infected by proprietary parasites. When an organism is infected by these parasites, its descendants also carry the infection and become unable to interact with their original ecosystem, being instead forced to work only for the benefit of the parasite. Luckily, most of the organisms have developed an immunity - the GPL gene. However, some organisms are still without that immunity, so they get exploited by proprietary parasites - BSD being the prime example; the poor thing has been suffering from a bad case of the Apples, and refuses to take the antidote...