The reality for me over the past 10 years has been that in any Rails application, 80% of time is spent untangling messes to figure out how to make a change, or fixing bugs. Only 20% of time is spent on actually adding value. That ratio should be flipped around, and there are other languages and frameworks out there that try to do just that.

The poster mostly recounts how they have maintained only bad Rails codebases. Nothing about Ruby.

It pretends to have an option type, and you can treat potentially-nil references that way if you go out of your way to do so, but there is nothing stopping other code from just dereferencing nil. It’s a good step, done pretty well, but nil is still there.

Even so, the original author was not claiming, even a little, that nil values are the key problem that makes Ruby worse than other mainstream programming languages.

Rails incentivizes so many things that lead to potential pain later. It changes out from under you. It actively discourages anything but color-by-number software design. I have too much experience for that to be an enjoyable workflow. In short, I don’t want to program in DHH’s little sandbox.

I still think it did great things overall for software, but I disagree with the core assumptions that it makes.

I work for a company that hosts/maintains/develops a lot of rails apps, and I don’t agree with this statement.

Also, the author lumps errors and crashes into the same bucket, which strikes me as strange/wrong. 500s are not crashes. And in Rails you don’t -have- to respond to errors with 500s: you can add a rescue_from block to return a 4xx if that’s the behavior you want, really easily. If you are seeing behavior resulting in a 500 that is more accurately handled with a 4xx, that’s not a big deal to handle in Rails.
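To make this concrete, here is a sketch of the idea. In a real Rails controller the actual API is a one-liner like `rescue_from ActiveRecord::RecordNotFound, with: :render_not_found`; the snippet below is a framework-free simulation of the same dispatch, so the names (`RecordNotFound`, `handle_request`) are illustrative, not Rails:

```ruby
# Map an expected error class to a 4xx response instead of a 500.
# In Rails, rescue_from does this for a whole controller hierarchy;
# here we simulate it with plain begin/rescue.

class RecordNotFound < StandardError; end

def handle_request
  [200, yield]
rescue RecordNotFound => e
  [404, e.message]        # expected user-facing error -> 4xx, not a "crash"
rescue StandardError => e
  [500, e.message]        # genuinely unexpected -> 500
end

status, _body = handle_request { raise RecordNotFound, "no such widget" }
# status is 404 rather than 500
```

The point is that the 500-vs-4xx split is a policy decision the application makes in one place, not something the framework forces on you.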

The nature of web apps is that your application is likely to receive bad input from users sometimes, though, and logging errors for those cases seems normal to me. If you get enough of the same type of error and you decide you don’t care about them, stop logging them.

If you had to tell your boss that you need to upgrade to the paid plan because the application is logging more than 300 errors per day, the correct response from the boss should be “fix those errors then”, but it never is, because that would take a massive amount of time and effort, completely contradicting the “productivity” of the framework.

If bugs in application logic are causing your errors, then you decide whether they’re important enough to fix, and fix them. Ruby/Rails has robust testing libraries and integration, so you’re only making excuses if you say it’s too hard. Still, I don’t agree the correct response is to fix every bug: there’s a cost/benefit analysis involved.

I think the author is in for a rude awakening if they truly believe there’s a language/framework where they won’t have to deal with poorly-written applications. Buried under all the wrongness of the OP’s writing, there might be an argument that Rails makes it easy to write messes that work, but I have seen messes in every language/framework and have yet to encounter one that forces code to be good. I’ve worked with some good Ruby/Rails code too, so it does exist, just as good and bad code almost certainly exist in other languages/frameworks.

Agreed. One of our recent hires came to us (where a double-digit error count is a bad day) from a company in the “thousands of errors per day Rails app” category, despite that company having fewer users and less traffic than us. They were making a transition to Python and I-forget-which-framework. Wouldn’t you know, the same team that created a giant exception-spewing mess in Ruby on Rails was dutifully working its way toward an equally exception-spewing mess in Python.

I like Ruby as a language, and work in a Rails-based project every day, but would definitely agree with the statement that “Rails makes it easy to write messes”, or even that “Ruby makes it easy to write messes.” But people tend to extrapolate that into “Ruby makes it hard to not make a mess”, and that I disagree with.

Look at the early community behind the rise of Ruby (pre-Rails and immediately post-Rails) from its obscurity as a Lisp-and-Smalltalk-inspired Perl alternative from Japan. It wasn’t new developers learning to code for the first time in Ruby.

It was language geeks who try obscure languages for fun. It was folks fleeing the boilerplate-ridden strictures of big enterprise Java, or coming from vast swamps of spaghetti Perl. They had seen big messes and didn’t want to recreate them; they saw the utility of a cleaner object model and first-class functions. Or they had seen towering edifices of design patterns, set up to constrain them, to somehow save them from themselves, with XML configurations stretching to the horizon: a mess made in the name of keeping them from making a big mess. Some of those people wanted a little more freedom, a little less tooling.

People from those groups are less likely to make a mess when using Ruby, because they’re less likely to make a mess in any language, compared to a team of junior developers, and even more so compared to junior developers who are pointed at a framework and told to change the world using only the power of Google, Stack Overflow, and gem install.

In the end, the team and its practices are a more influential variable than the language, but it can be uncomfortable to criticize people, and it is much safer to blame their tools.

It seems particularly true in programming: no tool survives contact with the mainstream. People will demand something permissive-by-default, then make a mess of it and turn around and blame you for allowing them to do it. Then they’ll go somewhere else and repeat the process.

They won’t choose something that will break the cycle because it will “cramp their style.”

Now, ask yourself why these defects happen too often. If your answer is that our languages don’t prevent them, then I strongly suggest that you quit your job and never think about being a programmer again; because defects are never the fault of our languages. Defects are the fault of programmers. It is programmers who create defects – not languages.

And what is it that programmers are supposed to do to prevent defects? I’ll give you one guess. Here are some hints. It’s a verb. It starts with a “T”. Yeah. You got it. TEST!

I edited my comment to say “most compilers” because I realized that was the case. Overwhelmingly, web applications to date have been written in languages where that is not the case. I -think- the author is clumsily making a case against Ruby’s type system, but it’s poorly expressed here.

I get that many consider a language that can eliminate an entire class of errors to be advantageous/superior. It takes some really generous interpretation to get from the author’s gripes to that argument. If that’s the point here, it’s not cogent.

Yes, this is a Ruby error message. However, the author is ranting about Rails, and in fact not actually expressing anything interesting about the Ruby language itself other than “it’s not strongly/statically typed”. Duh! Want static typing? Go use another language :)
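For anyone outside the Ruby world, the archetypal error message this thread keeps circling is the classic nil dereference, which Ruby reports at runtime rather than rejecting at compile time. A small illustration (the exact message wording varies between Ruby versions):

```ruby
# Calling a method on an unexpected nil: a statically typed language
# would reject this at compile time; Ruby raises NoMethodError at runtime.
user_name = nil

begin
  user_name.upcase                 # nil has no #upcase
rescue NoMethodError => e
  # Message looks like: undefined method `upcase' for nil
  # (older Ruby versions say "for nil:NilClass")
  handled = true
end

# The safe navigation operator opts into nil-tolerance explicitly:
maybe_upper = user_name&.upcase    # just returns nil, no exception
```

So Ruby gives you tools to handle nil deliberately; it just doesn’t force you to.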

Web development is in a shabby state. I find it displeasing to work on many web applications. The reason for this, I think, is that frameworks like Rails set the barrier to entry/learning curve too low. What I mean specifically is that, in my experience, web dev projects do not require well-thought-out design decisions. Instead all you need is a couple individuals out of a bootcamp and maybe a senior installing gems/extensions with a couple mods. This will allow most projects to fulfill their requirements, and in maybe 4-5 years a new website will be built to replace it.

I agree, even though I am new to the industry (2-3 years) and work on frontend. The low barrier doesn’t just apply to the actual web programmers (fairly low, but not as bad as it seems); almost every aspect of the project’s planning, resource management, and execution is rushed and done with little thought for maintenance (to be “agile”). It doesn’t matter, though, because 3-4 years later it’ll be rewritten and your old code will effectively cease to exist.

That said, I’d rather work on a Rails project with fellow newbies than a Node/React one which becomes an absolute mess because of the lack of convention.

Web development is in a shabby state. I find it displeasing to work on many web applications

The entire platform seems smothered between lots of people changing careers (good on them!) and the alpha nerds of the platform parroting cliches about the “open web” ad nauseam. It’s like the hype of making money on the platform overrides all quality concerns.

Things are held together by duct tape, but we should be proud of this thing because we’ve worked so hard on making it somewhat performant on Haswell i7s. Sunk cost fallacy all the way down.

Also, I really resent the self-justification that the easiest platform for users is the one that is the most important. This cedes control to people who don’t know any better. We should be framing how users use technology, and I’ll be the first to admit we’ve never prized accessibility as much as we should have.

I was discussing this with two people earlier in the week, in the context of visiting a local code school’s graduation showcase.

All three of us had “grown up” with the web. We remembered when CSS came to be, when DHTML was still a thing, and when JavaScript was only used to make the website snow during December. None of us could possibly imagine being in the shoes of someone learning the web now. All of us were trying to get a sampling of that from the various code school grads. Having a gradual history of the technology in our heads, we all felt it was easier to navigate new technologies as they come about, and to not let the new shiny distract from core software engineering principles.

On one hand, I definitely want more people to be able to code, to understand the digital world, and to grow intellectually or professionally. But when the most experienced fellow among us said “Anyone who can write 2 lines of JavaScript thinks they’re the god of the web,” I had a mental flash of agreement before opening my mouth to push back. The resulting conversation was around how and when someone matures out of that.

My current thinking revolves around the first time you realize you’ve added to a mess rather than having fixed it.

I meet a lot of ambitious junior developers who go into one of their early-career jobs with a platonic ideal of clean code in their heads. They behold the vast sprawl of legacy code around them and think “This is a swamp! A huge pile of mud! I guess I’ll be the one to build real structure and bring sanity to this place.”

Among those who are lucky and are given the chance to do that, the majority will fail, and the best among them will look back at what they built and see that all of their scaffolding was just heaping more mud onto the pile. Then the healing can begin. Then they have some perspective on how the mess comes to be in the first place: well-intentioned people just like them.

all you need is a couple individuals out of a bootcamp and maybe a senior installing gems/extensions with a couple mods. This will allow most projects to fulfill their requirements and in maybe 4-5 years a new website will be built to replace it.

We, who care about quality and medium- to long-term maintenance, might not like it, but this is positive if you are on the other side of the table. A junior team can crank out something that works in short order.

I’m still sad every time I think about Opa failing to gain mindshare. It really should have been the next Rails, and it would have advanced the state of web development marvellously. Let’s see what Elm and Phoenix can do.

Usually this is when a team starts writing tests because they realize the situation is not sustainable.

So they’re writing a webapp in a dynamically-typed language, they only started writing tests after 3 months of active development, when they were already deep in a slog, and they want me to believe the language and framework are the problem? You need to be writing tests on day 1, especially in Ruby. It’s way, way easier to start testing, and designing for testability, on day 1 than when your application is already a mess of spaghetti code. These kinds of problems tend not to happen, or at least are straightforward to fix, when you have a robust test suite.
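And a day-one test really is cheap in Ruby. A minimal sketch using Minitest, which ships with Ruby (`SlugGenerator` is a made-up example class, not anything from Rails):

```ruby
# Minimal day-one unit test with Minitest (bundled with Ruby).
# SlugGenerator is a hypothetical first unit of the app, written as a
# plain object precisely so it can be tested without booting a framework.
require "minitest/autorun"

class SlugGenerator
  def call(title)
    title.downcase
         .gsub(/[^a-z0-9]+/, "-")  # collapse punctuation/spaces to dashes
         .gsub(/\A-+|-+\z/, "")    # trim leading/trailing dashes
  end
end

class SlugGeneratorTest < Minitest::Test
  def test_collapses_punctuation_and_spaces
    assert_equal "hello-rails-world", SlugGenerator.new.call("Hello, Rails World!")
  end

  def test_trims_leading_and_trailing_noise
    assert_equal "42-things", SlugGenerator.new.call("  42 things?!")
  end
end
```

Keeping logic in plain objects like this, rather than buried in controllers and callbacks, is most of what “designing for testability” means in practice.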

Ok seriously, what’s up with the community bashing dynamic languages? I’ve seen plenty of Python projects in great shape, and also quite a few in terrible shape. The same goes for Java, C++, or Go. Static typing is not related to any of this.

It’s the usual tribalistic measuring contest. “My <X> is better than your <Y> because I like <Z> feature of <X> and <Y> doesn’t have that.”

I remember a story my Dad told me once – he’s a C-hacker and wouldn’t have it any other way (except maybe going back to punch cards or something). I was touting some new thing I was learning in school at the time as being the wave of the future or some such nonsense, and he asked, “Joe, do you know how much code is written in C?” I was a math major, I did some guesstimating and said, “Probably a lot.” (High quality mathematicianship right there). He said, “Damn right. How long has most of that code been around for?”

“Average? Maybe like 10-15 years, oldest things I can think of though might border on 30+ if you think about some Unix tools and stuff.”

“Damn right. Is there any reason to expect we’re going to replace a large portion of all the code in the world with some new language because it’s marginally better in some respect? Or is it more likely that in 30 more years, I’ll still be writing C, and there’ll just be six or sixteen or six hundred other new languages we have to maintain?”

“No.”

“Software engineering isn’t picking the best language to learn and master, it’s picking the most. Generalists always have jobs.”

People are always going to whine that something is better and that theirs is the one true way. The rest of us will make like the PHP programmers, the C hackers, and the other people who just learn what they need to learn to make the blinkenlights go off the way people want them to. JavaScript still sucks though.