Special remark: Using curly braces is no substitute for quoting, so don’t pretend it is:

Extra bad (adds confusion): ${my_var}

I actually completely disagree, and I would argue that you should always add the curly braces. I’ve never heard of anyone thinking curly braces equated to quoting, but I’ve heard of many, many people (myself included) who don’t know all the rules about what constitutes the boundaries of a variable name. (Quick: does "$my-var" expand the variable my or my-var?) The author actually stumbles upon one of these confusing cases further down ("$10" versus "${10}") and uses it to add a special exception to their no-curly-braces rule.
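To make the boundary rules concrete, here is a small illustrative snippet (the names my and my_var are just examples):

```shell
# Bash variable names match [a-zA-Z_][a-zA-Z0-9_]*, so '-' ends a name:
my=foo
echo "$my-var"    # expands $my, then the literal "-var": prints "foo-var"

# Positional parameters past $9 have the same pitfall:
set -- a b c d e f g h i j
echo "$10"        # $1 followed by a literal 0: prints "a0"
echo "${10}"      # the tenth positional parameter: prints "j"
```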

If you simply put every variable in its own set of quotes, you never need braces. Except for the exception.

You can avoid any exceptions by always using braces. Braces also clearly delineate the variable name in any context.

The only “good” one to me, of the ones listed above, is "filename${num}blah.dat", although all three are safe, so it’s really a style issue. (I realize this is different from my example, which didn’t have double quotes, but it was just for illustrative purposes.) I personally find filename"$num"blah.dat quite distasteful.

With sensible variable naming (a-z, A-Z and underscore) this should not be a problem and curly brackets should not be needed for regular variables. Reserving curly brackets for special cases also makes it easier to spot the locations where parameter expansion (like ${asdf%%asdf}) is happening.
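For reference, here is a sketch of what such suffix-stripping expansions do (the filename is just an example), and why braces make them easy to spot:

```shell
path="archive.tar.gz"
echo "${path%%.*}"   # remove the longest suffix matching ".*": prints "archive"
echo "${path%.*}"    # remove the shortest suffix matching ".*": prints "archive.tar"
```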

FWIW you can use ${a[@]+"${a[@]}"} to get around the empty-array / set -u issue. In bash 4.3, empty arrays and set -u were incompatible. bash 4.4 fixed that, after probably a decade-plus of broken behavior.
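A minimal sketch of that workaround (using the array name a from above):

```shell
set -u               # treat unset variables as errors
a=()
# On bash 4.3, "${a[@]}" on an empty array dies with "unbound variable".
# ${a[@]+"${a[@]}"} expands to nothing when the array is empty, and to the
# quoted elements otherwise, so it is safe on both old and new bash:
args=( ${a[@]+"${a[@]}"} )
echo "got ${#args[@]} elements"    # prints "got 0 elements"
```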

Recent Linux distros like Ubuntu 16.04 LTS are still on bash 4.3, so it’s hard to recommend using things that are bash 4.4 specific.

But I don’t recommend that trick – I mostly use strings as pseudo-arrays and I don’t use shell scripts with untrusted filenames. (e.g. my shell scripts mostly process files in my own git repos)
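For what it’s worth, a string used as a pseudo-array means something like this sketch, which only works when the items contain no whitespace or glob characters (e.g. filenames in your own repos):

```shell
# A whitespace-separated string standing in for an array:
files="a.txt b.txt c.txt"
count=0
for f in $files; do    # intentionally unquoted so the string word-splits
  count=$((count + 1))
done
echo "$count items"    # prints "3 items"
```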

If you really need to process untrusted filenames in shell, you might want to follow these guidelines, but otherwise I think it’s too pedantic and ugly. I couldn’t recommend this to someone “with a straight face”.

Me too. I might attempt this at some point as an extension to shellharden.

Possibilities for a sensible/modern language:

Fish - basically apply Wikipedia’s bash/fish translation table. Interestingly, every variable in fish is an array, so the corresponding bash code would be verbose ($var → "${var[@]}"), but so be it. I’m a daily fish user, btw.
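As a sketch of that translation (the variable name var is illustrative): fish’s $var expands each list element as its own word, which on the bash side requires an array and the "${var[@]}" form:

```shell
# Each element survives as one word, spaces included:
var=("first item" "second item")
printf '<%s>\n' "${var[@]}"    # prints "<first item>" then "<second item>"
```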

Quite a nice list! One thing that stands out to me is the #!/bin/bash shebang, since I’m a NixOS user and it won’t work there (this is NixOS’s fault for not following FHS, but it has compelling reasons for doing so ;) ).

I always use #!/usr/bin/env bash, which AFAIK looks up bash from the current PATH; this lets the user use their preferred bash (e.g. somewhere in ~), or propagate an augmented PATH through subprocesses (e.g. PATH=/my/favourite/bash/bin:$PATH ./my-top-level-script).

By using /usr/bin/env for bash, python, runhaskell, etc. we only need to hard-code a single path (which we can even special-case, e.g. with a find/replace, so the path itself doesn’t even need to exist!)
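A sketch of the idea (the resolved path will vary from system to system):

```shell
#!/usr/bin/env bash
# env searches PATH for bash instead of hard-coding /bin/bash, so e.g.
#   PATH=/my/favourite/bash/bin:$PATH ./my-top-level-script
# would run this script under that bash instead.
echo "resolved bash: $(command -v bash)"
```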

I’m not missing that point at all. newlisp compiles and installs on virtually all systems easily. Its binary has almost no dependencies. To run scripts you only need the ~300kb executable, which can be shipped with them.

It’s not as easy as using something that’s already installed… it’s easier.

Could you explain how it is easier to ship a binary for every platform the thing will run on, and to have a shim which discovers the platform type and selects the right binary? I think most people would disagree that that is easier.

There are a million different ways you could do it, depending on what you’re doing; not all of them require what you just described. If you’re writing something non-trivial in bash, you’re likely going to need a package manager for something anyway. You can install newlisp with a bunch of package managers, compile it yourself, or ship a binary alongside your scripts. You can do one of a million things, and in the end it’s always easier than writing thousands of lines of ugly, hairy bash nonsense, which, btw, if you’ve ever done it, you’ll notice always ends up doing system-specific things anyway.

But, again, I don’t believe you’ve addressed the actual criticism: bash is already there, so why make life even harder? I think what you’re saying is newLISP is so much better than bash that any deployment problem is actually non-problematic. If that is your claim then I hope you can find a way to convince people, because for most people the convenience of bash being available is going to override any possible language win. And a blanket statement of “just use X, cause itistoday says so” is not very convincing (at least to me).

FWIW, my bash scripts almost never require knowing what environment they are running on so I cannot relate to your argument.

I think what you’re saying is newLISP is so much better than bash that any deployment problem is actually non-problematic.

Yes, the amount of time you will fight with bash will in the end be more trouble than learning newlisp (which can be done in less than a day), installing it, and writing whatever you wanted instead in it.

Pretty sure that’s a safe assumption when you’re writing bash scripts as well. And as I said, you can ship the tiny binary with the scripts if you need to. Not sure where or why this pushback is coming from. OK, I will not share useful tips with you people, lol.

Most bash scripts I use do not require full control of the system to be used.

Not sure where or why this pushback is coming from.

How are you not sure? Commenters have been pretty clear that they think using something that already ships on a vast majority of systems is easier than shipping something with the package. Nobody is arguing that bash is a superior language, just that convenience is hard to trump.

OK, I will not share useful tips with you people, lol.

I don’t understand this statement. You made a suggestion, you got people who disagreed with you, life moves on. Why do you think you need to withhold other suggestions?

You’re telling UNIX folks to replace one of their key tools with another. They rarely give up a shell or key tool they value.

That other is a LISP. That brings its own resistance, despite LISPs being more powerful, safer, more self-contained, and potentially higher-performance. I’m with you: few if any people moved from a LISP machine to terminal UNIX happily. ;)

You were ignoring the already-widely-installed argument. That’s basically the by-default and convenience arguments in one. Those are very powerful principles of psychology, powering billions in economic activity and lots of inertia in tech. Your alternative can’t be equivalent if it requires extra effort to get started.

So, they’re right that what’s there and/or idiomatic for UNIX will be a safer bet due to ecosystem effects. Then, they resist the LISPs, since the UNIX ecosystem almost always does. It ain’t happening for the masses, but LISPy shells might have their niche. I have some ideas on how to make that more interesting. Way into the future, though.

They’re not things people will actually state. You just watch what they do individually or as a whole. Most UNIX folks here or in general aren’t giving up their main tools for substantially different ones outside of UNIX style. Most things people bring up are standard UNIX fare. They usually resist learning or using LISP despite its productivity advantages. You don’t see anyone trying to embrace the newlisp recommendation or posting other LISP-on-UNIX things they found.

So, I’m assuming the default behavior is still going on since I haven’t seen anything to the contrary.

You’re saying that regardless of what anyone on this thread says you know what is in our hearts. That is not a very evidence-based statement. I’d prefer it if you stuck to what people actually said rather than your unfalsifiable beliefs.

Now you’re just making stuff up. My claims about keeping with legacy tools like bash or using LISPs are easy to falsify. Here are a few things you can try:

Look at what the defaults are in the config for most sys administration of UNIX boxes. Was it legacy tools like bash, or LISPs? And is the same true for things like UNIX training manuals and StackOverflow questions?

After LISP’s benefits were established, did most non-performance-critical software in UNIX start getting written in LISPs? And performance-critical software in a non-GC, C-like LISP? Or is it mostly in C or C++?

Are most of the distros bringing in LISPs by default, or other, weaker stuff?

Were the big companies pushing new and improved LISPs, or ALGOLs?

There are all kinds of ways to test whether the majority of UNIX software and admins leverage defaults or have an anti-LISP bias; those are a few. Go look. You’ll find a lot more bash and C, despite LISP having been able to do what both did at a faster pace, and memory-safely in C’s case.

I just mentioned what was widely installed vs widely used. No. 3 was specifically about that. So, do you see more Common LISP or Scheme in use by UNIX developers and admins out there than bash, C, Perl, and Python, which are the defaults in a lot of distros? And did this change after the piles of CVEs in C apps, or the studies showing LISP developers outpaced C developers?

Did they start massively investing in and standardizing on the thing that did better in many ways? Or did they keep making excuses to stay on what performed worse, and/or keep using stuff as similar to it as possible?