
snydeq writes: Developers are embracing a range of open source technologies, writes Matt Asay, virtually none of which are supported or sold by Red Hat, the purported open source leader. "Ask a CIO her choice to run mission-critical workloads, and her answer is a near immediate 'Red Hat.' Ask her developers what they prefer, however, and it's Ubuntu. Outside the operating system, according to AngelList data compiled by Leo Polovets, these developers go with MySQL, MongoDB, or PostgreSQL for their database; Chef or Puppet for configuration; and ElasticSearch or Solr for search. None of this technology is developed by Red Hat. Yet all of this technology is what the next generation of developers is using to build modern applications. Given that developers are the new kingmakers, Red Hat needs to get out in front of the developer freight train if it wants to remain relevant for the next 20 years, much less the next two."

For the "big stuff", much of what's listed in the summary, they probably can't create the bandwagon. The reason developers jump on something like that is because it's already in widespread use. All the "big stuff" already has leaders. The best RH could hope to do is to buy some of those out and take them over.

OTOH, do we developers want that? Look at the controversy surrounding systemd, directly developed by RH. If that's a sample of what they do, I'm not so keen for their solutions.

Having worked in Linux-based IT for nearly a decade now, I've seen IT departments get very frustrated by Red Hat's package management and the concept of needing both an Entitlement and various Channels to get updates; on the flip side of this summary is Ubuntu, which IT departments can't stand due to its constant change and unstable nature. Every IT department I've worked in and with seems to prefer administering and deploying Debian, and battles with devs on Ubuntu and management on Red Hat.

I'd also prefer RHEL over all others, except for the costs (which are inconsequential to me, usually). I too get stuck trying to push CentOS...most shops I've been at want to be able to point the finger at someone, hence paying for RHEL. What's odd is, the shop I've been working at for the past year actually uses Ubuntu LTS, so I've (unfortunately, or perhaps fortunately, in the name of expanding my knowledge) had to learn the system pretty quickly. Haven't had any problems with it so far, and actually I'm impressed with the LTS version's stability (while originally I abhorred it for no actual reason, heh). Seems like a cross between RHEL's stability and Fedora's up-to-date packages.

If you want "up to date" then Red Hat is NOT for you. Their claim to fame is that they are far enough behind the bleeding edge to be stable. Fedora is for people who need the latest stuff and have blood to spare when the patches go awry.

"Red Hat will not issue any more security advisories for the MySQL 5.0 packages (mysql-5.0.* and related packages). Security advisories will be provided only for MySQL 5.5." https://access.redhat.com/docu... [redhat.com]

Looks like the article I linked is out of date ("As of October 1, 2013, MySQL 5.5 packages have been added to the Red Hat Enterprise Linux 5.10 Beta, and therefore will be in the forthcoming GA release."). 5.10 was released on 2013-10-01 according to https://access.redhat.com/articles/3078#RHEL5 [redhat.com]

Thanks for pointing it out. I've commented on the article requesting it be updated.

I don't think that's a cut-and-dried sort of thing. As a developer, I hate the fact that Ubuntu is changing so quickly that I can't keep up. Leading edge is fine, but bleeding edge gets blood everywhere.

The great benefit of Red Hat is that it's stable and supported for a very long time, like 20 years. They don't change anything major in a release, and releases are few and far between. This is great for 'Enterprise' stuff, but the web is moving quickly and package support for RHEL boxes isn't great.

Having said that, where I work we have lots of stuff on RHEL/CentOS, and more and more stuff on Ubuntu. The Ubuntu stuff keeps me awake at night - literally. It's always falling over. I have never experienced a kernel like the ones the Ubuntu team is putting out. It's absolutely atrocious. The biggest problem is that the software we need to use has better support for Ubuntu than RHEL, so we're stuck using a dire OS to run it on.

The RHEL and CentOS boxes we have are rock solid stable and have never really given us significant issues. I walk into the office and get a new Ubuntu problem every day.

(FWIW I use Debian for all my own stuff exclusively, so I know my way around Debian-derivatives - this isn't a configuration issue).

But who are you? You don't have a name, or a mother. You're just an anonymous coward. If you really believed what you're saying, you'd log in.

I install Ubuntu and then nvidia won't install until I fucking massage the thing. And that's the selling point of Ubuntu. Give me a break. It's cool how fancy it is, and how it supports stuff, but it's not cool how flaky it is.

Ask a developer who has recently made (or tried to make) the transition from Windows to Linux: they expect inconsistency, and besides, doesn't everyone use it? Ask a seasoned Linux dev and they wouldn't touch Ubuntu with Bill Gates' $INSERT_APPENDAGE_HERE

The tension is stability versus the latest tech. RedHat purposely moves very, very slowly. The same can be said about Debian stable. As an admin I like slow-moving targets. The problem is that developers want to use the latest stuff. So what does RedHat do about this? I think they are trying to solve it in two ways. First is their Software Collections. These are packages that sit outside the base OS and make it easy to pivot to a newer version. This allows multiple versions of things like Python to be installed in parallel. Very handy!
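For illustration, using a collection looks roughly like this; the python33 collection name is from the RHSCL 1.0 era, and exact names vary by release, so treat this as a sketch:

```shell
# Install a Software Collection; it lands under /opt/rh,
# leaving the base OS python untouched (package name illustrative):
yum install python33

# Run a single command with the collection's environment active:
scl enable python33 'python --version'

# Or open a whole shell with the collection on PATH:
scl enable python33 bash
```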

Another thing that is helping quite a bit is Docker. RedHat is big on Docker. By packaging containers as apps, this allows a developer to easily control the dependencies outside of the OS that the app is running on. This makes everyone happy! Fedora is tracking some interesting tooling with Docker (geard, os-tree).
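A sketch of that idea as a Dockerfile; the base image tag, packages, and app layout here are all hypothetical:

```dockerfile
# Hypothetical container image: the app pins its own runtime and
# dependencies, independent of whatever the host RHEL box ships.
FROM fedora:20

RUN yum install -y nodejs npm && yum clean all

# Dependency versions come from package.json, not the host OS:
COPY package.json /app/
WORKDIR /app
RUN npm install

COPY . /app
CMD ["node", "server.js"]
```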

I like that RedHat tries to solve bigger problems than just packaging and releasing a distro. They are trying to make things manageable (see FreeIPA, OpenLMI, RDO, CloudForms, oVirt).

Personally, I like RedHat. I like Debian. I run Fedora on my desktop and notebook. I maintain a CI/CD pipeline on RedHat at work. I never jumped on the Ubuntu bandwagon. It seems to me that Ubuntu has made quite a few more missteps in their short existence than RedHat has over the years. I get the feeling that a lot of people are just dropping back to Debian, which is just fine with me!

Very important for certain customers: RH has a Common Criteria certificate. So, it's basically the ONLY Linux you can run in an IA environment. The other option is Windows. I don't even know if Solaris is there, still. I've seen customers migrate entire Ubuntu networks to Red Hat to meet this set of requirements.

This means revenue for Red Hat, and this drives them to work towards being a one-stop-shop for IA Enterprise systems.

With other environments leaning towards HIPAA and other sets of security regul

Agreed on all points. I especially feel good supporting RedHat as it really does help drive the GNU/Linux ecosystem forward. Ubuntu has tried doing the same, but when it does there are always strings attached or development is behind closed doors.

When someone asks me to connect to a Linux server, I think "Cool". When I find out it's Ubuntu, I think they probably don't know much about Linux or they wouldn't be running Ubuntu as a server. My sampling is probably biased, but most of the Ubuntu users I've met are beginning desktop users.

Plus, RedHat is the one pushing the new and untested systemd. That's another example of something you don't expect from a stable server distribution.

It's not new and untested, it's been used in at least Fedora since Fedora 15.

No, RedHat is not 'cool' or stable. They're fishing for consulting dollars, and they're trying to monopolize Linux mindshare by pushing systemd (themselves being the authors), and injecting it as a dependency everywhere else.

Yeah exactly, Red Hat supports a project that they ship as part of their product. That's outrageous, or something.

I have to disagree. I think tons of things are broken in Ubuntu. They usually get the GUI right, but the underlying system is a mess, especially if you want to configure things from the command line. hostname -f has been broken for years. I like sane limits in ulimit. I agree with you on the aliases to rm. Training wheels.

I would disagree with you. Despite the desktop-ness of Ubuntu, the distribution comes with a lot of things set up right. RedHat, on the other hand, assumes you're an idiot and treats you accordingly. Which of the two has rm aliased to 'rm -i' by default? RedHat. I'm not a fucking DOS user, I know that I want to delete something, this is supposed to be UNIX. Which of the two limits each username to 1024 threads/processes (ulimit -u)? RedHat again, a supposedly enterprise server distribution. Which one has /sbin only in the PATH of the root user? RedHat again. I don't want to fucking 'su' or do the full path to run ifconfig.
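The defaults being complained about here are easy to inspect, and to undo for the current session, on any box:

```shell
# See the per-user process limit (RHEL 6 shipped a 1024 default
# via /etc/security/limits.d/90-nproc.conf):
ulimit -u

# Check whether rm is aliased to 'rm -i' in this shell:
type rm

# Drop the training wheels for this session (no-op if not aliased):
unalias rm 2>/dev/null || true

# Put the sbin directories on a non-root PATH so ifconfig resolves:
export PATH="$PATH:/usr/sbin:/sbin"
command -v ifconfig || true
```

These are per-session tweaks; making them permanent means editing the shell profile or the limits.d config instead.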

Plus, RedHat is the one pushing the new and untested systemd. That's another example of something you don't expect from a stable server distribution.

No, RedHat is not 'cool' or stable. They're fishing for consulting dollars, and they're trying to monopolize Linux mindshare by pushing systemd (themselves being the authors), and injecting it as a dependency everywhere else.

First, you are so not thinking "production", system management and security. You are obviously a newbie Linux user/developer who has never deployed anything where security was important.

NEVER, and I mean NEVER, use "root" to do anything, at least not directly, you apparently do. If you want "ifconfig" to work for you, put it in your PATH, if you want it to run for everybody, put it in the skeleton account or modify the necessary files at the system level. Some folks don't want this kind of stuff to show

More than a decade ago, when they abandoned desktop and regular users and only focused on enterprise, they made their biggest mistake. Where do you think Ubuntu Server users come from?

Even most of us who are highly knowledgeable and understand Linux to its most profound depths appreciate a good desktop experience. The fact we can compile a kernel or any software does not mean we prefer that to a nice end-user experience.

It is still not too late for RedHat, and given the horrible direction Ubuntu has b

PPAs are similar to adding a -release package to Fedora/RedHat/CentOS. So for example, say I want to add EPEL to my repos. I just click on the epel-release rpm and it installs. I'm not so hip on the Software Center. I like to stick to core Debian tools when using a .deb-based system.
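For comparison, the two workflows side by side; the PPA name below is made up:

```shell
# RHEL/CentOS: enabling EPEL is just installing its -release package,
# which drops a .repo file under /etc/yum.repos.d/:
yum install epel-release

# Ubuntu: the rough equivalent is registering a PPA, then refreshing:
add-apt-repository ppa:example/tools   # hypothetical PPA
apt-get update
```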

More than a decade ago, when they abandoned desktop and regular users and only focused on enterprise, they made their biggest mistake. Where do you think Ubuntu Server users come from?

This.

Absolutely true. RedHat desktop was awful (in comparison to other distros) for a while. Unfortunately, it's going that way again (Gnome 3). I only hope that someone will create a MATE repository for RHEL/CentOS 7.

What this implies is that the execs at RedHat don't eat their own dogfood, which is terrible for any

Red Hat used to have a distribution for everyone. It was one of the most popular Linux distributions. Then it moved to Red Hat Enterprise, and that really caused many Linux users to find something else and switch. Fedora is nice and all, but it felt like Red Hat throwing a bone.

Ubuntu came up and took its place as the distribution for everyone. Red Hat got stuck in the stuffy enterprise market.

As most people know, Enterprise software means overpriced software that barely works, but somehow it ma

As most people know, Enterprise software means overpriced software that barely works, but somehow it makes executives feel good about using it, probably because they need a full IT staff just to keep it running.

No, that's not a reason for using Red Hat. The reason to pay Red Hat is for support. Let's say you have an issue that your "expert" is unable to resolve. If you have a Red Hat license, you put in a ticket with them and access the stable of engineers they keep employed to get you an answer. Chances are they have seen the issue before and many times have the developer who wrote the stuff in house. If you actually find a bug, they work that for you too because they have open relationships with the developmen

1. The infrastructure needs to be supported as well. If the various necessary agents (backups, monitoring, application distribution) only work on Red Hat (or CentOS) then Red Hat is what's acceptable in the production environment.

2. The staff needs to be in place to support it. We have three major Operating Systems we support (team of 5 admins). Solaris, HP-UX, and Red Hat/CentOS. With almost 1,100 systems, environments outside our expertise a

When I had a choice in Linux desktop, it was always Fedora because I was used to it, and even with its bleeding-edge slant, it rarely fell over with updates even with some third-party repos in my mix. That was from Fedora 1 through like 16? They're up to 20 now so I have some catching up to do!

I don't know if anyone's mentioned that Redhat owns JBoss and all the tools and technologies around that which are very popular in the enterprise development markets. When I think of Redhat, I see a company:

Agile developers expect agile everything. Ubuntu happens to just be a happy compromise between agile and waterfall.

If you look at RHEL, it's 5-10 year old packages, kept alive by an enormous engineering team that backports fixes to old, dead software, which creates a huge pile of technical debt for any developer trying to use "modern", highly modular frameworks.

As far as developers go, in the Ruby, Python, and Node ecosystems, anything that's not the latest doesn't exist. They don't use the system package management, they use gem, pip, and npm. They really don't care about the underlying OS, until it gets in the way, and getting in the way is exactly what a decade-old OS does.

Just to throw out an example. Take some modern ruby on rails application, say Discourse. (discourse.org). Go download a tarball from github. Now try to make it work with nothing but software from the official RHEL repository. Let me know how that works out for you. After you tear out all your hair and skin trying to do that, try to get the pieces from 3rd party repos that will make that work. See how much you have to bring in as far as new libraries and new packages just to make it work. It's still a nightmare even with the 3rd party repos, and that RHEL support contract doesn't cover them - every single piece that's likely to break your application, is now outside of your support agreement, so your company is now wasting at least $799/year for support.

As soon as they start trying to develop on RHEL, the dirty hacks start. There are things missing - the versions of software that they need to make their dependencies work don't exist on RHEL. They end up in a kind of dependency hell fighting with libraries that are a decade too old to compile their dependencies. One thing leads to another. Eventually, you recreate an entire current OS in /usr/local, or install one piece by piece from 3rd party repositories. At that point, it's not RHEL anymore. It might still say it's RHEL, but it's a bastardized system that looks more like an evil child of Gentoo and Fedora. (both of which are fine distributions by the way, just they aren't meant to crossbreed). The only thing you have left of RHEL at that point are the parts your application doesn't care about, which is probably not much.

Or, you can attempt to containerize with kvm, chroots, or lxc, which, while not breaking the underlying system as badly, means the application is really running on something other than RHEL.

If Red Hat wants developers back, they are going to have to be able to deliver a product with an aggressive delivery schedule, maybe even a rolling release, and be able to deliver the kind of support to make operations feel good. That's a whole new territory that nobody has touched yet, but if they are up to the challenge of keeping decade-old software on life support, they are probably up to the challenge of an agile OS.
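The "don't touch the system package manager" workflow described above can be sketched with Python's venv; the same pattern holds for rvm/gemsets in Ruby or a local node_modules in Node:

```shell
# Create a per-project environment; nothing installed into it ever
# touches the system RPM/DEB database.
python3 -m venv app-env

# The project now has its own interpreter under its own prefix:
app-env/bin/python -c 'import sys; print(sys.prefix)'

# From here a developer pulls dependencies straight from PyPI at
# whatever versions the project pins, e.g. (illustrative):
#   app-env/bin/pip install flask
```

The underlying OS only matters again when a C extension needs a header or library that the decade-old distro doesn't ship.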

"As far as developers go, In the Ruby, Python, and Node ecosystems, anything that's not the latest doesn't exist. They don't use the system package management, they use gem, pip, and npm. They really don't care about the underlying OS, until it gets in the way, and getting in the way is exactly what a decade-old OS does."

^^These developers are idiots and don't deserve support.^^

Yeah, I'm aware of everything wrong with that statement, but it's a perspective that's valid for a lot of people. This culture evo


If modern, highly modular frameworks are interested in getting into the big enterprise space, then they will dedicate the time to making their software work with RHEL. If a company is running Ubuntu in production, then they don't particularly care that much about stability, have a small server install base, or have a team that can hack around enough to make things work.

If you install the newer packages you want, who cares what the "default" package is?

Personally I'd much rather have a distro that lets me choose which version of packages to install rather than shoving one down my throat randomly during updates of the system.

Granted, the Debian stable I run isn't full of the latest shiny, shiny, but it isn't causing update problems by rolling out new versions of packages, either. Both Debian stable and RedHat RHEL are focused on stability, not bleeding edge development. No one in their right mind runs production systems on untested versions of packages, and no one (not even banks) can afford to do constant regression testing on the latest releases of software just because it's "new."

I'm constantly surprised at how many people opt for downloading the "production" version of my own project, even though that really was just a peg in the dirt of functionality, not some big fancy schmancy roll-out that went through more testing than other releases. There are bug fixes and new features in the latest and greatest, but a lot of people don't want that -- they want that peg in the dirt, and are content to wait for an SP1 to get access to the new features and bug fixes.

Don't forget it can often take a few months to properly regression test software. It isn't just an issue of booting with the latest version and making sure it starts running -- it's testing how it responds to having network cables yanked, power flipped off hard, sometimes even yanking hardware components while a box is running. Serious servers aren't something you just push out after running them with a dozen users for a week.

I've been a Linux developer for just over 20 years and I happen to hate Ubuntu. It's similar to how Slackware was in 1994 when I got started. Even the basic stuff requires tweaking to get working properly. In those days, that is how Linux was and we were all hobbyists enthusiastic about fixing problems. For example, burning a CD didn't work on the last Ubuntu system I used a few years ago. That is basic stuff that has worked the same way for 10+ years that no distribution should screw up. Other basic things

I work at a large university. IT gave us two options for operating systems on our servers, Redhat or Windows. They also offer a DIY vmware setup. Rather than having IT manage our servers, I have to do it just so we can run Ubuntu. It is impossible to run certain packages like OpenCPU on Redhat because no one ever bothered to port it. Before you jump to the conclusion that linux is linux, it's really not. You can blame Ubuntu for going off the beaten path or Redhat for not keeping up with the times but some software packages only run on one linux distro without considerable effort. Conversely, the only supported backup solution for our servers is IBM tivoli crap and I went through hell to convert the rpm based installer into something that would work on Ubuntu LTS. IBM doesn't get that Ubuntu (or debian derived) distros are popular now either.

As a *BSD guy, I find both Ubuntu and Redhat irritating but at least ubuntu has apt-get. Funny thing is I started on Redhat 5.0 in '99 or so as my first *nix like os. Back then they had a desktop that didn't suck though.

Given that developers are the new kingmakers, Red Hat needs to get out in front of the developer freight train if it wants to remain relevant for the next 20 years, much less the next two.

It's very hard to avoid a snarky response, but I'll try.

* Developers are not kingmakers
* Developers are not system administrators
* Developers don't understand operations
* Developers often don't understand scale engineering unless they can abstract it away by not thinking too hard about anything
* Red Hat Enterprise Linux (and its derivatives) are not intended to be shiny new, but to be reliable
* Use Fedora if you want bleeding edge, or re-package things yourself. RPMs aren't hard.
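In the spirit of "RPMs aren't hard": a minimal spec file for a hypothetical one-script package might look like this (name, paths, and license are all illustrative):

```spec
Name:           mytool
Version:        1.0
Release:        1%{?dist}
Summary:        Example single-script package
License:        MIT
BuildArch:      noarch
Source0:        mytool.sh

%description
Packages a single shell script under /usr/bin.

%install
mkdir -p %{buildroot}%{_bindir}
install -m 0755 %{SOURCE0} %{buildroot}%{_bindir}/mytool

%files
%{_bindir}/mytool
```

Build it with `rpmbuild -ba mytool.spec`; real packages grow %prep/%build sections and dependency tags, but the skeleton stays this small.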

RedHat does have MySQL, so some of the presumptions of the post are false. True, RedHat is now moving to MariaDB, currently a MySQL branch and a fork in the future. But RedHat is a great choice for developers. What about Tomcat or JBoss? Their long support window and awesome packaging make them a great choice for risk-averse organizations. I see lots of orgs adopting these app servers supported by RedHat.

Why? Red Hat has been the one distro that spearheaded Linux adoption in the enterprise. It's stable, has a very long support life cycle, and if you do not want to pay for licensing you can (and many startups do) use CentOS.
I hear people complaining about rpm/yum. Guess what. Many of us have extensive experience with it and have no problems with it. Creating a repo cache is fairly easy and allows you to have total control of what is deployed to your server. And yes, I do like dpkg and apt-get. They are very nice to

I think you're kind of missing the point. Developers don't think "hey, I know Ubuntu/Mint, and it works great for me, but yum just got a little bit friendlier? Forget everything I know, I'm installing Red Hat."

People change distributions with a purpose. For me personally the odyssey was:

Mandrake: because (I kid you not) it came on a CD in a Linux magazine
Gentoo: because of the performance gains
Mandrake: because (unlike Gentoo) you don't have to spend half your life compiling
Ubuntu: they did all the annoying stuff (e.g. making Flash work) for me
Mint: Shuttleworth gave the middle finger to the Ubuntu community vs. Mint <3s their community

The point is, no one is going back to Red Hat unless it offers something significant that their current distro doesn't (besides just yum). Making Red Hat one distro instead of two doesn't give me a reason to leave Mint. Making yum friendlier doesn't give me a reason either. At best changes like that might help stem the tide of departing Red Hat users ("why do I need Ubuntu, Red Hat finally got friendly") but if Red Hat ever wants to become a dominant distro again they have to offer a compelling reason to switch.

Dude, the question was about Enterprise servers. Do your development on Mint, that's just fine, but are you really going to deploy your production enterprise application on a farm of... Mint servers? really?

The whole point was that developers influence the choice of distro on the server

There must be cases where this is true. However, it's really unclear to me why most developers would care and why they would feel themselves qualified if they have competent sysadmins to work with.

When I've got my sysadmin hat on, most of the developers I work with are developing on Macs. They have no hangups about their code being deployed on EL systems in a big data center. Nobody is clamoring for a shelf full of MacPro tubes to deploy on.

When I've got my developer hat on, I usually write on a Fedora machine. But I'm not daft enough to try to run Fedora on a server and have to worry about the maintenance cycle. I put my configs in a puppet module that pushes the code out to whichever VM I'm going to run it on, regardless of the OS, hypervisor, hardware, or country that code is bound for.

If my code doesn't run on a particular distro, then my code is probably broken (or my devops is hosed).

Maybe there are some startups with a bunch of kids and one third-career CEO and they all tell him what's going to happen. Good for them, I guess. Someday a sysadmin might come in and help them fix their stack. Let's not speak of the failwhale.

Here's one example: how do you track packages? If every developer in your company is using apt (well, or brew for those Mac people, but let's ignore them because the server is NOT going to be a Mac), then it makes sense to compile a list of apt packages right? So then when you go to deploy the sysadmin just has to sudo apt-get those packages.

But if your server runs Red Hat, somebody has to translate that list of apt packages to yum packages. Not a huge deal, but why would you want headaches like that,
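The translation step being described can be sketched with a hypothetical hand-maintained name map; real mappings are longer and messier, and some packages have no one-to-one equivalent:

```shell
# Hypothetical apt-name -> yum-name map:
cat > pkgmap.txt <<'EOF'
apache2 httpd
libssl-dev openssl-devel
build-essential gcc
EOF

# Translate the list and build the install command (echoed, not run):
cmd=$(awk '{print $2}' pkgmap.txt | xargs echo sudo yum install -y)
echo "$cmd"
```

Keeping that map current across two distros is exactly the kind of ongoing headache the parent is pointing at.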

Another thing to consider is debugging. As a developer, you want to debug on a system that's as close as possible to the machine where the bug occurred. Obviously it's easier to be sure that your environment is the same as your server's (and that you're seeing the same problem the server saw) if the two run the same distro.

Two words: virtual machine

Even if you were to run the same OS and version on your primary desktop as your server, you're still VERY likely to end up installing stuff that the server does not have (e.g. maybe you want to use Eclipse and the latest JDK for it, or you need a newer version of Python for some VCS tool you use). In any case, you are better off running the code on a VM that is very similar to production.
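One sketch of that setup using Vagrant, with a made-up box name and paths:

```ruby
# Hypothetical Vagrantfile: develop against the same OS the production
# fleet runs, regardless of what's on the desktop.
Vagrant.configure("2") do |config|
  config.vm.box = "chef/centos-6.5"        # box name is illustrative
  config.vm.synced_folder ".", "/srv/app"  # edit locally, run in the VM
  config.vm.provision "shell", inline: "yum install -y httpd"
end
```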

Because Red Hat just works? Somewhere along the line I was surprised that Ubuntu got popular, because there was so much controversy with it. Maybe it's a generational thing, as Red Hat feels like Unix and Ubuntu feels like Windows.

And besides, once you have gcc and vi or emacs, what more does a developer need?

And besides, once you have gcc and vi or emacs, what more does a developer need?

PyCharm (i.e. IntelliJ), Chrome, a music program (Spotify, Pandora, etc.), a chat program (Pidgin, Hipchat, etc.), GIMP for image manipulation...

I have no beef with the emacs/vi folks, but some of us think that development technology (like every other kind of technology) has advanced since the 80's, and we want an OS that looks like it's from this decade to run it on.

I've found SELinux useful. Yes, it can be a pain, but if the device is Internet facing or in the DMZ, it can do a lot to contain a security breach. As always, it can be shut off with a single command, but it is a layer of security that is generally worth having if at all possible. That way, even if the Web server has an exploit, an attacker manages to get into its context, then get root... they still are limited to the directories the Web server is allowed into. It isn't perfect, but it does help.
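A quick tour of the knobs mentioned above; the module name is made up, and these commands assume an SELinux-enabled RHEL-style host:

```shell
# What mode is SELinux in right now?
getenforce

# Temporarily log-but-allow denials while debugging (don't leave it so):
setenforce 0

# Inspect the security contexts on the web root:
ls -Z /var/www/html

# Turn logged denials into a local policy module instead of disabling:
grep httpd /var/log/audit/audit.log | audit2allow -M mywebapp
semodule -i mywebapp.pp
```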

SELinux is another layer of security people should learn. Is it difficult the first time you use it? True, but that doesn't mean it isn't useful.

Every time someone says that SELinux should be disabled instead of learning how to use it, I remember the days when Windows changed from FAT to NTFS, and people said "disable NTFS, format FAT, filesystem permissions are difficult" :)

I hope you're talking about your personal desktop and not publicly accessible servers. Many years ago, many packages didn't have SELinux policies, and that was painful. Disabling it was rather tempting. With all the many Linux computers I manage, I haven't run into a single SELinux related issue in several years. If you're disabling it now based on your experience in 2007, it might be worth taking another look.

You do realize that SELinux was not originally a Red Hat thing, right? That little nightmare came from the NSA. But it is now firmly part of the Linux kernel, so you can blame the kernel team for keeping it around. SELinux has its place, though, when you need really enhanced security, which just doesn't include most people running this stuff at home or in corporate environments. High security = pain to set up, so you get what you pay for.

Since you aren't sexist, it's safe to say you don't believe there's anything better or worse about the two pronouns. In that case, does it really matter what pronoun someone uses to substitute a proper noun? We've used he as the default for years and the general consensus was mostly "meh." Why is "she" so bothersome? Is it aesthetic? Is it because the usage of "she" brings up images of shrill anti-male feminists? If it's because of the second reason... well think of it this way. That's like being the polar