"return to table of content"

test, [, and [[ (2020)

neverrroot
44 replies
16h8m

[[ is bash only. If you know you’ll only use bash, use it. For details, the article is nice.

shmerl
35 replies
16h0m

Why would you not use Bash (unless you're explicitly using a completely different shell like Fish)?

It feels like some completely archaic concern to target the minimal common shell denominator.

LambdaComplex
10 replies
15h56m

Not installed by default on BSDs. Sometimes not installed in minimal environments, e.g. where BusyBox is all you have available. Maybe not installed by default on Unixes? Not sure.

loa_in_
9 replies
15h11m

Honestly, installing bash into environments seems like less work than avoiding it. It's not hard to do.

cpuguy83
3 replies
12h41m

You don't always have a package manager, root access to even be able to install anything, network access, etc.

"Just install bash" is not always easy, and is not even necessary when posix shell can easily do what most people use bash-specific syntax for.

emmelaich
2 replies
12h5m

That's the sort of yak-shaving I will never do. I'd rather just not use such a primitive Unix, and/or not be in a job where none of pdksh/ksh/zsh/bash is available at all.

isatty
0 replies
10h21m

Well, it’s good that you have the choice to; but it’s also not all that difficult to understand that not everyone who needs to use your script has that choice.

JohnFen
0 replies
3h3m

That's cool. We all decide for ourselves what markets we're willing and unwilling to address. But the market that involves "primitive" unix is lucrative and that can be worth it.

For me as a developer, it's not a big deal. You can do all the same things, just in a slightly different way, and you have a script that can run almost everywhere.

JohnFen
3 replies
13h36m

When your script will be running on customer machines, you often don't have the luxury (and sometimes it's not technically possible because it doesn't exist) of installing bash into the environment.

loa_in_
1 replies
5h15m

What prevents you from shipping bash with your solution?

JohnFen
0 replies
3h13m

These scripts are often used to install the application, so that introduces a chicken-and-egg situation. How can I install bash before running the installer?

Also, we'd have to build and maintain our own just for platforms missing it, which increases cost and complexity. It's easier, cheaper, and less error-prone just to use a minimum common denominator that runs on everything.

Additionally, when your customers have strict policies regarding how applications are approved for use and have to vet and test everything you give them, it's best practice to avoid installing anything that you can avoid installing. Doing that minimizes the effort and hassle for both the customer and the developer.

DonHopkins
0 replies
13h5m

Time to find better customers.

OmarAssadi
0 replies
9h35m

It's not usually hard to do, no, but it is >130K lines of C alone, excluding whitespace, comments, tests, examples, etc. I don't want it on my system from a security perspective alone.

Add in the bootstrapping pain it imposes due to autoconf and other stuff, and I think there are many valid reasons to avoid it and choose another shell that is more auditable yet still has just as many eyeballs on it (e.g., mksh, the default on Android; dash, the default on Debian; BusyBox, on practically every embedded system).

driggs
6 replies
15h42m

Because Apple switched the default macOS shell from the GNU-licensed `bash` to the BSD-licensed `zsh`?

bgm1975
5 replies
12h59m

Apple switched when bash switched from GPL2 to GPL3, which they didn’t like. The older bash is still available.

masklinn
2 replies
11h40m

Apple switched when bash switched from GPL2 to GPL3

Apple just didn’t update. It took them years to finally switch to zsh.

The older bash is still available.

Aside from it being bash (a great reason not to use it as far as I’m concerned), it’s now a 17-year-old version of bash.

toast0
1 replies
11h28m

Aside from it being bash (a great reason not to use it as far as I’m concerned), it’s now a 17-year-old version of bash.

I thought people liked macOS for its vintage feel? Remember a time when computers could only render a single menu bar in a fixed location, feel the experience of SYN floods, run a version of bash that is old enough to vote in the next presidential election.

mkesper
0 replies
9h7m

Honestly, it is such a waste of time every time I have to argue with developers about installing up-to-date Homebrew (or whatever) versions of the coreutils that I wish Apple would simply DELETE all these ancient versions of tools from macOS. As a bonus, Homebrew no longer offers --with-default-names, and some tools (like make) are installed into different paths, so you need to add your own symlinks or add multiple paths to your PATH.

toast0
1 replies
11h46m

According to the interwebs, zsh became the default with Catalina in 2019 [1], ten years after bash 4 was released under GPL v3 or later.

Also, the interwebs suggest Apple used to use tcsh as the default shell [2]; I don't know when they changed that, but it may have been after bash 4 was released? (Thanks; 10.3 was in 2003, so several years before the license changed.)

[1] https://www.theverge.com/2019/6/4/18651872/apple-macos-catal...

[2] https://news.ycombinator.com/item?id=18853318

masklinn
0 replies
11h38m

bash became the default shell in 10.3. tcsh was the default before then.

mikem170
3 replies
15h54m

Bash seems to be about as archaic as sh.

I write my shell scripts in sh, because it comes with every unix-like os by default.

I know that bash has more features, but I've never really missed them. I switch from sh to perl for more complicated tasks.

To each their own, right?

15457345234
2 replies
14h56m

This to me seems like good practice; it's better to use a 'real' programming language when you're crossing the boundary from 'script' to 'program'.

Shellscript has way too many idiosyncrasies and weirdnesses that would have been beaten out of a proper programming language by now. (I know that talking about weirdnesses is amusing in relation to Perl, which also has a whole armload of them.)

hnbad
1 replies
9h22m

This to me seems like good practice; it's better to use a 'real' programming language when you're crossing the boundary from 'script' to 'program'.

PowerShell would like a word with you.

account42
0 replies
5h6m

PowerShell's problem is that it is a real programming language, which makes it less suitable for a shell.

aidenn0
3 replies
15h37m

IIRC, /bin/sh on Ubuntu defaults to dash

eikenberry
1 replies
15h29m

That is from Debian and all child distros inherit it by default, not just Ubuntu.

jabl
0 replies
7h18m

If you want to nitpick, it was first done by Ubuntu and then later upstreamed into Debian (from which other child distros, as you correctly point out, inherit it).

andrewshadura
0 replies
11h47m

In fact, only dash is supported as /bin/sh on Debian and Ubuntu.

yjftsjthsd-h
2 replies
15h52m

IIRC Debian uses dash as /bin/sh because it's faster (execution speed) than bash.

emmelaich
1 replies
12h4m

That's like using a horse instead of a mule. Just not worth it. Move away from the whole equine world and use a real programming language.

yjftsjthsd-h
0 replies
11h18m

Eh, there are things that shell is really good at, and there are things that other languages are good at. I will grant that shell is best for glue code; my personal heuristic is that I stick to POSIX sh, and if that hurts then I take that as a sign that I should be considering moving to Python or whatever. But avoiding it completely strikes me as a poor choice, because for the "glue together separate programs and manipulate files" tasks that it's meant for, it's really good and IMO everything else still falls short.

JohnFen
2 replies
13h38m

I've worked on several modern projects that needed to be able to run on a wide variety of Unix platforms, several of which didn't have bash. Writing for the common shell denominator was important, not archaic.

shmerl
1 replies
12h15m

Still sounds archaic that it's a problem today that needs to even be addressed.

jtafurth
0 replies
8h6m

I wouldn't say it's an archaic problem. An example I can think of is writing scripts for infrastructure running minimal Docker images, where you want to keep the image size to a minimum; you would usually need to support both bash and other shells.

Embedded applications come to mind as well.

dredmorbius
0 replies
9h15m

Writing Bourne-compliant scripts ensures maximum portability.

As many here have noted, bash isn't universally available, with another possible issue being OpenWRT devices. Stock/base images tend to use a Bourne-compatible shell, not full Bash. Though the latter's installable through opkg, for sufficiently small devices (typical of consumer kit) you simply won't have the space to install it.

There's also the slight PITA that Apple's OSX ships with a very old, pre-GPLv3 Bash, out of licensing concerns. (Apple is phenomenally averse to GPL-based code, much as some *BSDs are, such as OpenBSD.)

And if you're dealing with legacy systems (which tend to be extraordinarily and stubbornly persistently legacy), you'll often find that either bash isn't present or is quite dated.

I freely confess that I tend to write fairly recent-feature bash scripts myself by default, and appreciate many of the newer features. But if and when I am writing portable code, I'll go back to Bourne-compatibility.

But when writing system level code, an appreciation for standards and the very long tail of legacy standards and the limitations they impose is in fact a large component of professional maturity.

csydas
0 replies
11h37m

it’s contextual as most things. need to actually use and work on the machine? use whatever shell you want to make your life easier.

need a script to run on many different systems and/or need to write a script to be managed automatically by a service account? probably you want a shell with syntax that is guaranteed to be the same on all your systems.

bregma
0 replies
5h22m

It's archaic if you're a dabbler, a hobbyist, an academic, or a junior dev who lacks the experience of shipping software into the wild.

For the rest of us, we have learned the hard way, sometimes repeatedly, that portability and adherence to published standards matter.

emmelaich
3 replies
12h8m

Always use [[

zsh and ksh have it; in fact I'm pretty sure it originated with ksh in 1988 or earlier.

OmarAssadi
2 replies
9h54m

Always use [[

While zsh, bash, mksh, ksh93, probably others have it, sure. But many don't -- and not totally irrelevant ones either. Debian's default, dash, for example, does not support `[[`.

IMO, unless you're writing something like shell-specific dotfiles, avoid non-POSIX features.

It's usually pretty trivial to avoid them, especially if you're willing to call other mandated commands like awk, etc. But often, with a bit of creative thinking, most non-standard features can be replicated with some combination of `set`, separate functions and/or subshells.
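
For instance (a minimal sketch; $file is a hypothetical variable), a bash-only match like [[ $file == *.txt ]] can be rewritten as a POSIX case:

    case $file in
        *.txt) echo "text file" ;;
        *) echo "something else" ;;
    esac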

Shell scripts, in general, have dozens of footguns, are pretty much impossible to statically analyze, difficult to make truly robust, and many of the shells themselves -- e.g., bash -- have huge, borderline inauditable codebases.

I can think of a dozen reasons not to write shell scripts. Yet still, there is incredible value in the fact that some form of POSIX-compliant/compliant-enough shell can usually be found on most systems.

All of that value goes out the window, though, the moment you start relying on non-standard features.

massysett
0 replies
7h34m

Don’t worship at the altar of portability. There is real cost to portability, as it forces you to cater to broken implementations and to not use features that may be very useful. Also, it can be difficult to ensure compatibility with systems that you don’t test on, so true portability requires having different systems to test on.

Sometimes all those costs are necessary for the task at hand. For example, the whole point of GNU Autoconf is that it runs on a wide variety of systems.

On the other hand, many programs are for in-house or personal use and will not run on obscure systems. The cost of writing for portability simply might not be worthwhile in these situations. And that’s ok, notwithstanding some conventional wisdom of “always try to be portable.”

arp242
0 replies
6h30m

Depends what you're writing.

If you're writing an installer script or whatever that's going to be run on $many computers that you don't control: sure, it's probably best to go to the extra effort to stick with POSIX sh.

But that's not most scripts. Most scripts are things that you run on computers you control. I just write things as zsh scripts because it's so much easier than bash (never mind POSIX sh). That's fine, because I can just install zsh. And actually, this often makes scripts more portable because you need to rely on external utilities a lot less.

re
2 replies
15h55m
dredmorbius
0 replies
9h20m

"bash only" typically refers to "bashisms", that is, bash features not present in the plain Bourne shell (or Bourne-compatible interpreters such as dash). The fact that other shells (such as zsh) may include those features ... is beside the point of writing universally compatible shell scripts.

Confirming my facts for this comment, #TIL that "dash" is the "Debian Almquist Shell", that is, Debian's "ash" shell:

<https://en.wikibooks.org/wiki/Guide_to_Unix/Explanations/Cho...>

bgm1975
0 replies
13h12m

And ksh (ksh88 & later)

nerdponx
0 replies
16h6m

That is, test and [ are specified by POSIX and usually are physical binaries (but might also be masked by shell builtins). Whereas [[ is not specified by POSIX and usually only exists as a shell builtin.
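
You can check this from a typical bash session (exact paths vary by system; note bash actually reports [[ as a keyword rather than a builtin, but the point stands):

    $ type test [ [[
    test is a shell builtin
    [ is a shell builtin
    [[ is a shell keyword
    $ ls /usr/bin/test /usr/bin/[
    /usr/bin/[  /usr/bin/test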

ridiculous_fish
23 replies
15h48m

The biggest footgun in `[` and `test` is the single argument behavior. For example, you might attempt to check if a variable is nonempty like so:

     [ -n $FOO ]
but if FOO is unset, it expands to nothing (as opposed to the empty string), so this is equivalent to:

     [ -n ]
and POSIX requires that the one-argument form of `[` succeed if that argument (here, "-n") is non-empty. So this will falsely report that $FOO is non-empty.

Remember to quote your variables!
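
A quick demonstration of the trap (any POSIX-ish shell should behave the same way):

    $ unset FOO
    $ [ -n $FOO ] && echo "claims non-empty"
    claims non-empty
    $ [ -n "$FOO" ] && echo "claims non-empty"
    $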

whateveracct
14 replies
15h37m

ShellCheck your scripts!

mr_toad
13 replies
15h0m

Or use set -u to fail early.

Or use [ -n ${FOO-} ] which will replace the unset variable with an empty string.

kazinator
8 replies
14h4m

That is false. There is no difference between ${FOO} and ${FOO-}. Both disappear if unquoted when FOO is unset or blank:

  $ printf "<%s>\n" alpha ${beta} omega
  <alpha>
  <omega>
  $ printf "<%s>\n" alpha ${beta-} omega
  <alpha>
  <omega>
The ${var-} form is useful for safely evaluating a variable that might be unset, when "set -u" mode is in effect, and for whatever reason we cannot just fix the script so that the variable is set.
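
For instance, under "set -u" (never_set is a hypothetical unset variable; bash's error wording shown):

    $ set -u
    $ echo "$never_set"
    bash: never_set: unbound variable
    $ echo "ok${never_set-}ok"
    okok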

ninkendo
3 replies
4h29m

and for whatever reason we cannot just fix the script so that the variable is set

I use this pattern all the time for variables that should be overridable by whoever is calling the script, e.g. `FORCE="${FORCE-no}"`, overridable with `FORCE=yes foo.sh`. I'm not sure of any other way to do this.

(Yes, using getopt or other option parsing is better, but for my own scripts, env vars are just such a simpler way to pass options, and after 20 years of using bash/unix/linux I still can't write a getopt stanza from memory.)

neuromanser
2 replies
1h32m

    : ${FORCE=no}
Also, it's not so difficult

    while getopts :hab:c o; do
      case $o in
      a) opt_a=x ;;
      b) opt_b="$OPTARG" ;;
      c) opt_c=x ;;
      h) usage 0 ;;
      ?) usage 1 $o "$OPTARG" ;;
      esac
    done
    shift $((OPTIND - 1))

ninkendo
1 replies
1h24m

Also, it's not so difficult

Heh, was that an attempt at sarcasm? That's super difficult to do from memory, at least for me. Maybe I'm not as good at memorizing things as you are.

neuromanser
0 replies
30m

Heh, was that an attempt at sarcasm?

There was definitely a wink component, but at the same time, I meant it.

Half the code is just case / esac syntax, which you can (I do) find plenty of use for outside of getopts. getopts itself is… manageable. In the end, the code is such a strong pattern, it's basically a snippet. The lack of variation is what makes it kinda easy to remember. Or you could put it in your notes or a gist and copy/paste.

That's super difficult to do from memory, at least for me. Maybe I'm not as good at memorizing things as you are.

Nah, my memory is shot. It's a matter of practice, and not being shy with man pages. Use it often enough, it's in memory; after a long hiatus, the details are a "/^\s*getopts \[" away (assuming your man pager is less).

js2
3 replies
13h44m

They meant "${FOO:-}" which should still be quoted.

The general form is "${FOO:-default}" where default can itself be another variable or whatever string you want.

I usually prefer to set default values at the top of a script, though, using this idiom:

   : "${FOO:=bar}"
And when creating a local variable I'll immediately set it to an empty string if there's a chance it won't be assigned later:

  local foo=""

kazinator
2 replies
12h53m

That colon makes no difference if the replacement is blank.

js2
1 replies
11h22m

Holy crap, 25 years of writing shell scripts and I just learned the difference:

With a colon, it tests the variable for unset or empty. Without the colon, it tests only for unset. It's POSIX too:

https://pubs.opengroup.org/onlinepubs/9699919799/utilities/V...

kazinator
0 replies
10h29m

So ${foo-bar} will not expand to bar if foo exists but is empty; the empty value will prevail. ${foo:-bar} will expand to bar.

If we have nothing in place of bar, they are effectively the same.
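
To illustrate (foo is just a scratch variable):

    $ foo=
    $ echo "[${foo-bar}] [${foo:-bar}]"
    [] [bar]
    $ unset foo
    $ echo "[${foo-bar}] [${foo:-bar}]"
    [bar] [bar]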

yjftsjthsd-h
0 replies
14h54m

Preferably you'd use set -u to avoid certain problems, and also keep using shellcheck for all the other things it can catch.

sweeter
0 replies
9h39m

"set -x" is a life saver for debugging. Also "set -euo pipefail" so a script just exits on errors or unexpected behavior. If done well you can really reign in squirrely behavior.

Bash has some odd behaviors and footguns but it can also be surprisingly reliable and versatile. Ive caught some seriously messed up stuff that I couldn't figure out otherwise, using set -x. Also shellcheck will forcefully hammer good practice into your head and catch a ton of hangups that are hard to know before making the mistakes.

Also, setting fallback values is a good idea, like: pictures="${pics_dir:-$HOME/Pictures}"

Xiol32
0 replies
9h44m

set -euo pipefail

Effectively strict mode for shell scripts.

LegibleCrimson
0 replies
14h53m

That bottom one will also fail. The empty string is not treated as a shell word.

You need double quotes.

dixie_land
3 replies
12h41m

[ x"$FOO" != x"" ]

stouset
0 replies
10h16m

Please stop with tricks like this and just quote your variables. Everywhere.

overtomanu
0 replies
11h21m

whoa, lot of history in this trick

https://news.ycombinator.com/item?id=26776956

andrewshadura
0 replies
12h1m

No. This hasn't been necessary for decades.

mattrighetti
1 replies
9h56m

In this case I would use [ -n "${FOO:?}" ] so that the script will immediately stop if $FOO is null or unset
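
In a script the shell aborts right at that expansion; interactively it just prints the error (bash's wording shown):

    $ unset FOO
    $ [ -n "${FOO:?}" ]
    bash: FOO: parameter null or not set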

8n4vidtmkvmk
0 replies
9h31m

Set -xufo pipefail

Or whatever the magic string is. Enable all the errors at the start of the script

jstimpfle
0 replies
8h48m

I think your last sentence needs to go first. Quote your variables! There's no actual footgun in the specification of the test builtin -- the footgun is shell itself. The behaviour that you mention makes sense because

   [ "$FOO" ]
is always the non-empty check regardless of what it contains (could be "-n").

gpvos
0 replies
8h0m

Also, if $FOO contains a space, it will expand into multiple arguments. Just quote your variables, always.

cellularmitosis
13 replies
15h26m

I stopped using [ a few years ago because ‘test’ reinforces the idea that this is just a command like any other, not syntax. Also, “man test” is much more pleasant than sifting through “man bash”.

kazinator
9 replies
15h11m

Your comment makes no sense. GNU Coreutils has a "man [" as well as "man test" man page.

Bash has "help test" for a quick cheatsheet.

The [ command is very old; it was already present in Version 7 Unix in 1979.

c0l0
6 replies
9h18m

Parts of the comment make a LOT of sense actually, when you look at shell scripts written by the uninitiated. Oftentimes, I see constructs like

    while [ 1 ]; do ...; done
which are a pretty clear indication of the author's misconception about the perceived nature of [ ], I think.

kazinator
5 replies
9h1m

The nice way to write that isn't any kind of test command but:

   while true; do ...; done
There is never any reason to use test, other than portability to some broken environment that is missing [ but not missing test.

Diti
3 replies
4h31m

There is never any reason to use test

    test -d /nix && echo "$_ exists" || echo "$_ doesn’t exist"
You cannot do this with [ – Don’t Repeat Yourself, and your scripts become more maintainable.

(Not POSIX-compliant. Documented in Bash and Zsh.)

teo_zero
2 replies
3h55m

You cannot do this with [

Why not?

  [ -d /nix ] && echo ...

forgotpwd16
1 replies
3h6m

As '$_' is set to the last argument of the previously run command, the test example will have it be the name of the directory, whereas with [ it will always be ].
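
For example (assuming /nix exists; bash shown):

    $ test -d /nix && echo "$_ exists"
    /nix exists
    $ [ -d /nix ] && echo "$_ exists"
    ] exists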

elteto
0 replies
1h32m

  D=/nix; test -d "$D" && echo "$D exists" …

tuyiown
0 replies
8h20m

There is never any reason to use test

What's the reason for always using [ ?

rascul
0 replies
2h57m

Note that bash implements [ as a builtin so the coreutils man page might not match exactly.

Karellen
0 replies
6h15m

I think GP's point was that `[` feels like syntax, but - importantly - isn't.

Yes, `[` is a command, and has a man page and everything, but in a script it doesn't look like a command. It looks like something "special" is going on.

Whereas using `test` emphasises the point that you're just running another program. That all `if` does is check the exit status of whatever program is being run, whether that's `test`/`[`, or `grep`, or anything else at all.

(Personally, I don't think that emphasis is necessary. But I've been using shell scripts long enough that I understand these nuances fairly well without having to think about them much any more. So I think that GP's point is a reasonable perspective to have, and worth considering.)

tuyiown
2 replies
8h23m

I'm with you: the [ (and the [[ bashism) introduce lots of confusion about what is really happening; I could never manage any real confidence.

That said, [[ being guaranteed to be a built-in certainly had its purpose in an age when shell script performance had any kind of relevance, and that was not so long ago.

kqr
1 replies
8h7m

[[ has more to do with trying to build an intuitive shell scripting environment than performance. [[ makes conditionals behave much more like you'd expect from other programming languages. I think it's a great idea, but then again, if I don't have to care for POSIX portability, I'd rather use something that's not a shell language for scripting.

DonHopkins
0 replies
4h53m

While they were at it, just to be consistent, they should have added syntax for easy-to-use non-fucked-up less-arbitrarily-punctuated versions of control structures, too:

    ifif [[ x == 1 ]] thenthen
      echo "x is one"
    elseelse
      echo "x is not one"
    fifi

    forfor x in 1 2 3 dodo
        echo "x is $x"
    donedone

    whilewhile [[ x == 1 ]] dodo
      echo "x is one"
      x=$((x + 1))
    donedone

    untiluntil [[ x != 1 ]] dodo
      echo "x is not one"
    donedone
The whole [[ ]] $(( )) ifif fifi dodo thing seems like they're just doubling down instead of admitting they made a mistake.

And if they really wanted [[ ]] to seem like syntax instead of a shell command, they could at least allow it to be used without spaces on each side of it like $(( )) or parens or brackets in any other language. And every other language lets you use as many redundant parens as you like for clarity, without running twice as many commands or producing a syntax error or weird unexpected behavior. But I don't think clarity was ever a design goal with Unix shell scripting languages.

jmmv
10 replies
15h57m

Hey, original author here. Thanks for sharing this and making it rise to the front page! :) By the way, the title probably deserves a (2020) and it would be nice if "test" wasn't capitalized, because it actually refers to the command.

Here is something related from 2021 that also touches on bash's [[ operator and that I think you might enjoy in this context: https://jmmv.dev/2021/08/useless-use-of-gnu.html

gnfargbl
4 replies
8h21m

This article complains that using the extended GNU features (--ignore-case, set -o pipefail, etc.) makes scripts less portable. Fair enough.

What it doesn't explain is why a Linux user should much care about portability. OpenBSD and FreeBSD are alive and well, but the number of users seems so small that they aren't a particular concern. Maybe you could argue that we "should" consider these OSes out of a sense of fairness, but where does that stop? Do I also need to consider something obscure like VxWorks?

BusyBox (Alpine) is more interesting, but the changes there are so significant that a port will almost always be needed anyway.

Are there other compelling reasons to care about the non-GNU ecosystem?

kqr
1 replies
7h54m

Is your comment best read in Internet Explorer 6?

----

What I'm trying to say is that open standards (and the portability that comes with them) don't just happen on their own. They take active maintenance, and part of that maintenance is opting to adhere to the standard even when it would be more convenient to use extensions available in the most popular systems.

Will you personally suffer from liberally using Bashisms? Not in the first order. But if we encourage that sort of thinking as a rule, the standards become meaningless. I believe that would be a net negative change for the world, but there are many intelligent people who would disagree.

dspillett
0 replies
4h30m

I don't disagree with you in the context of public scripts, but the vast majority of what I do with bash is local-only (where local is ${DayJob} and my own infrastructure/projects) so the fact that extensions to the standard make my work in those areas quicker to string together and easier to maintain afterwards means I'm happy to lock myself into bash.

From my PoV it is generally portable anyway: almost everywhere I use a shell of that nature, bash is present. The exceptions to this are things that need to run in small environments, like anything which may end up in initrd (where busybox is generally providing shell/script support).

Though I do wish people would make sure they specify bash in the shebang (#!/bin/bash) in scripts unless trying to stick to the standard, as using #!/bin/sh causes problems when extensions are used and something lighter than bash is the default script runner (i.e. dash on Debian, and again busybox commonly in small environments). Only use #!/bin/sh if you know you are making an effort to be compliant.

pjungwir
0 replies
49m

I want to agree, but macOS is a big one. For example `sed -i` has bitten me before.

chasil
0 replies
4h45m

You will notice that the parent article mentions "dash."

The dash shell is small and fast, but it does not allow any bash/korn language extensions beyond what was recorded in the POSIX.2 standard in the early 1990s.

Linux users should care because the Debian/Ubuntu family use dash as the system shell, so this problem is very real as many have learned.

o11c
1 replies
15h7m

[[ is not really a builtin; it's fundamentally syntactical (but presumably uses a mostly-inaccessible builtin internally). Fun fact: `]]` is also a reserved word, despite never being allowed in a context where reserved words matter.

In some non-bash shells, the `function` keyword is needed to declare certain types of function.

For make `$(shell)`, if you're building a lot of targets the performance difference can be measurable. Still, it loses in the nop case, so you should actually usually do `include` to trigger re-making.

GNU is completely right to ignore POSIX, since POSIX is not useful for solving most real problems.

account42
0 replies
5h25m

In some non-bash shells, the `function` keyword is needed to declare certain types of function.

In shells that aren't POSIX-compliant [0], maybe. In which case: yes, there are many wildly different scripting languages with REPLs.

[0] https://pubs.opengroup.org/onlinepubs/9699919799/utilities/V...

dang
1 replies
10h34m

Ok, I've lowercased the leading 't'. We never do that but for this, ok :)

edanm
0 replies
7h58m

As per the Zen of Python:

Special cases aren't special enough to break the rules.

Although practicality beats purity.

:)

account42
0 replies
5h20m

A lot of the GNUisms described in https://jmmv.dev/2021/08/useless-use-of-gnu.html are very useful in interactive use. Search the current directory without an explicit . - useful. Append an option to a command you just typed - heck yeah, always annoyed when commands don't support that.

For scripts though, sticking to POSIX sh often makes sense, yeah. You should at least be aware of it when you use Bash-isms.

stevage
9 replies
15h12m

But if you know your script is going to be Bash-specific anyway, you are probably better served by using [[ unconditionally and consistently

Honestly, I think writing a Bash script is just a terrible idea. Use a real language, and all of these nightmares go away.

For me, lately, that's been Zx (which uses Node), but there are other fine choices too.

jmmv
6 replies
13h49m

Bash is "real language" if you treat it as such. Yes, there are plenty of spaghetti scripts out there, riddled with global variables and full of side-effects, but if you write shell in a principled manner, you can solve some pretty big problems with ease, and you can write maintainable code.

I wouldn't start new projects in shell, of course, but one area in which I think the shell shines is in writing integration tests for tools. More on this in a recent post I wrote: https://jmmv.dev/2023/10/unit-testing-with-shtk.html

Joker_vD
3 replies
12h30m

but if you write shell in a principled manner

The same can be said about QBasic as well, honestly. And here's the catch: people who are inclined to write anything in a principled manner are likely the ones who'd write things in a principled language instead of e.g. shell. Or to put it another way, if someone chose to write in shell (or failed to consider an alternative, which also happens quite often), they're quite likely exactly the person who won't write anything in a principled manner.

jmmv
2 replies
12h2m

Oftentimes, the constraints around what you have to do dictate what language(s) you can use. So you do what you can with them and you try to get the best out of them. https://jmmv.dev/2023/11/why-do-i-know-shell-and-how-can-you...

Joker_vD
1 replies
10h22m

I think one of Rust's creators (or was it Go's?) said that they wanted to get the C++ developers to switch to their language but that didn't happen; they unexpectedly got Python developers instead, but retrospectively it makes sense to them: people who wanted to switch from C++ and could afford to had done so already, so the remaining C++ developers are the people who either don't mind C++ or have to write it.

Which brings me back to my point: yes, technically you can write decent code even in a sloppy language but that misses the bigger picture which is that most of the code written in a sloppy language will inevitably be sloppy because most of the people who write it won't care, due to the dynamics described.

Heck, the Ops department in my org was forced by the security guys to globally enable ShellCheck on commit for their internal repos, and those security guys still have to drop by about every 2-3 months to rip away their "# shellcheck disable" pragmas and force them to actually fix their broken code because those people in the ops actively don't care. The Ruby scripts they write end up slightly less broken because Ruby itself is a slightly more principled language, not because they suddenly care more when they write Ruby.

Not to mention myself: I've spent quite some time and energy on learning shell's semantics and tricks and quirks and DOs and DON'Ts, but it's such an infinite, never-ending descent into the abyss with rewards of dubious value that nowadays I just don't care. Whenever I have to write a one-off script, I give up even before I start and write it however sloppily, with minimal quoting, to save my time; and only when it breaks, or when I have to reuse it, or when I have to share it — which happens quite rarely — do I re-write it in a proper language. On the whole, that has definitely saved me both my time and my sanity. As for the cases when I have to write a properly behaving shell script, well, those almost always arise in the course of tasks that can be delegated to our ops team, and now that's their problem.

P.S. I found that re-writing naive but broken shell scripts in a proper language is quite easy: the intended semantics is generally obvious, it's just that the shell requires quirkier syntax to properly express it; re-writing the (mostly) non-broken shell scripts is much harder: you have to decipher the intended meaning from the quirky syntax while keeping in mind that the original author still could have gotten it wrong by not knowing about a particular quirk you're aware of (or vice versa).

yakubin
0 replies
7h59m

It was Go. Rust has many converts from C++. But from scripting languages too. It’s a pretty heterogeneous programmer base.

stevage
0 replies
11h3m

but if you write shell in a principled manner, you can solve some pretty big problems with ease, and you can write maintainable code.

So if you are extremely good at writing shell, and you write it extremely carefully, then you can achieve basically what you can do in other languages without those caveats?

It sounds like we both agree that shell is not a sensible choice for scripting, for most people, then.

lmm
0 replies
12h2m

Bash is "real language" if you treat it as such. Yes, there are plenty of spaghetti scripts out there, riddled with global variables and full of side-effects, but if you write shell in a principled manner, you can solve some pretty big problems with ease, and you can write maintainable code.

Even carefully written bash code tends to be much less maintainable than code in better languages. It's just badly designed on multiple levels, like the famous post about PHP. Yes, if you're really careful you can write decent code in Malbolge - but why would you?

I wouldn't start new projects in shell, of course, but one area in which I think the shell shines is in writing integration tests for tools. More on this in a recent post I wrote: https://jmmv.dev/2023/10/unit-testing-with-shtk.html

The fact that you've written something that compiles to shell scripts rather than writing shell scripts rather undermines your claim that shell is a decent language. Integration tests of tools are important, but I'd still find e.g. TCL a much better way to write them.

tuyiown
0 replies
8h10m

People would have had such a take and told you to use Perl twenty years ago. Look where we are now. I'd rather have improved my shell scripting abilities right away.

It's a three-step (each step optional) thing really:

- I want something that I can use on multiple systems right away (shell)

- OK, things are getting a bit serious, I'll allow myself to add some pre-requirement tooling on the system (other scripts)

- pre-requirements suck, I'll invest in native code tooling and produce a binary (e.g. Rust & Go for the choices of the moment).

LegibleCrimson
0 replies
14h47m

I agree. If a POSIX shell script serves your needs, use it. If it doesn't, you shouldn't be reaching for a more powerful shell, but an actual language. I'd reach for Perl before Bash, and I can't stand Perl.

yjftsjthsd-h
8 replies
15h48m

I actually thought [ and test were symlinks, but that might just be from Alpine Linux where they're both symlinks to busybox.

somat
6 replies
15h20m

Hardlinks actually, but who's counting? Plus, in real-world usage you will hit the shell builtin anyway.

  ls -li /bin/\[ /bin/test
  26016 -r-xr-xr-x  2 root  bin  133256 Mar 25  2023 /bin/[
  26016 -r-xr-xr-x  2 root  bin  133256 Mar 25  2023 /bin/test
I have to admit I avoid "[" in my scripts; it is a weird hack trying to make a command look like syntax, and this really bothers me for some reason.

yjftsjthsd-h
3 replies
15h10m

No, Alpine uses symlinks:

    $ docker run --rm -ti alpine
    / # ls -l /usr/bin/test /usr/bin/[
    lrwxrwxrwx    1 root     root            12 Sep 28 11:18 /usr/bin/[ -> /bin/busybox
    lrwxrwxrwx    1 root     root            12 Sep 28 11:18 /usr/bin/test -> /bin/busybox
    / # ls -li /usr/bin/test /usr/bin/[
     495254 lrwxrwxrwx    1 root     root            12 Sep 28 11:18 /usr/bin/[ -> /bin/busybox
     495360 lrwxrwxrwx    1 root     root            12 Sep 28 11:18 /usr/bin/test -> /bin/busybox
    / #

somat
1 replies
6h52m

Interesting... Doubly so because I thought busybox was a sort of Linux crunchgen: a way to pack many independent executables into one to save space. With crunchgen at least, each executable name is then linked to the packed binary (that is, hardlinks), and the filesystem name is used to pick the correct code to run. Why did they go with softlinks? My guess: it can be moved across filesystem boundaries. Perhaps interference from cgroups?

http://man.openbsd.org/crunchgen

arp242
0 replies
6h22m

Both just look at the first entry in argv. Whether you used a hard- or symlink isn't very important for that. Or at least, that's how FreeBSD crunchgen works. Maybe they changed that in OpenBSD?

figmert
0 replies
10h31m

This is different. Alpine uses busybox, where everything is implemented in the busybox binary, thus everything is symlinked to it.

In other distributions, test is a separate binary, and [ is a link.

chatmasta
1 replies
12h44m

What bothers me is that you can write equivalent code with `test` and `[`, but when using `[` you need to terminate your conditional expression with `]`. Why? Isn't it the same binary? Why is the shell adding this extraneous requirement, just to surface "syntax errors" rather than simply treating `]` as a no-op?

kqr
0 replies
8h1m

It is the test command that requires the terminating ] when it is invoked as [.

It's possible your shell pre-empts this requirement and presents it as a proper syntax error, but it would have failed anyway.
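
You can see this by bypassing the builtin (GNU coreutils' wording shown; other implementations phrase it differently):

    $ /usr/bin/[ -d /tmp
    /usr/bin/[: missing ']'
    $ /usr/bin/[ -d /tmp ] && echo ok
    ok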

mmphosis
0 replies
1m

They are different binaries on Linux Mint!

  # file /bin/test /bin/[
  /bin/test: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, interpreter /lib64/ld-linux-x86-64.so.2, BuildID[sha1]=6fe552c80ab0b3d3e60de2ab09167329e222eb67, for GNU/Linux 3.2.0, stripped
  /bin/[:    ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, interpreter /lib64/ld-linux-x86-64.so.2, BuildID[sha1]=99cfd563b4850f124ca01f64a15ec24fd8277732, for GNU/Linux 3.2.0, stripped

  # /bin/test --version
  # /bin/[ --version
  [ (GNU coreutils) 8.30
  Copyright (C) 2018 Free Software Foundation, Inc.
  License GPLv3+: GNU GPL version 3 or later <https://gnu.org/licenses/gpl.html>.
  This is free software: you are free to change and redistribute it.
  There is NO WARRANTY, to the extent permitted by law.

  Written by Kevin Braunsdorf and Matthew Bradburn.
Built with LBRACKET or not ...

https://git.savannah.gnu.org/gitweb/?p=coreutils.git;a=blob;...

kqr
8 replies
8h14m

Taking the last point one step further, we can also dispense with the if block entirely:

    if [ a = b ]; then
        echo "Oops!"
    else
        echo "Expected; phew!"
    fi
becomes

    [ a = b ] && echo "Oops!" || echo "Expected; phew!"
I'm not sure how often you should do this but sometimes it comes in handy for things like

    [ "$debug" ] && echo "what's going on" >&2
to conditionally print debug output to stderr.

----

And the fact that the if block tests a regular command means we can also do things like

    if grep -q 'debug' /var/log/nginx/access.log; then
        echo "Debug request found!"
    fi
----

Something I have not yet bothered to figure out is whether I should write

    [ $(expr 1 + 1) -eq 2 ] && [ $(expr 2 + 2) -eq 4 ]
or use the built-in logical AND of test:

    [ $(expr 1 + 1) -eq 2 -a $(expr 2 + 2) -eq 4 ]
As long as performance is not a concern, I can see roughly equal reasons in favour of either.

darrenf
5 replies
6h32m

Regardless of `-a` vs two tests and `&&`, there's no need to shell out to `expr` if bash's arithmetic evaluation is available:

    [ $((1+1)) -eq 2 ]

kqr
2 replies
6h2m

I'm going to assume that if one is using [ rather than [[ then one will also want to use expr rather than $(()).

ykonstant
0 replies
5h44m

Arithmetic expansion is a feature of all POSIX-compliant shells, and POSIX furthermore advises against using `expr`. The latter is only needed for non-compliant shells like some legacy implementations of the Bourne shell.

darrenf
0 replies
5h21m

What's the basis for that assumption? I'm struggling to see a reason to not use the shell for arithmetic, regardless of `/usr/bin/[`, builtin `[` or `[[`.

neuromanser
1 replies
2h23m

arithmetic evaluation is ((…)), $((…)) is arithmetic expansion.

There’s no need for test(1) / [(1) or conditional expressions ([[…]]) if you’re doing arithmetic:

if ((1+1 == 2)); then …; fi

darrenf
0 replies
1h25m

Even better, and TIL. Thanks!

nwellnhof
0 replies
6h49m

Something I have not yet bothered to figure out

According to POSIX, the -a and -o binary primaries and the '(' and ')' operators have been marked obsolescent. See https://pubs.opengroup.org/onlinepubs/9699919799/utilities/t... under "Application Usage".

illo
0 replies
7h25m

[ a = b ] && echo "Oops!" || echo "Expected; phew!"

Not to be taken as a general rule though. I might be mistaken but I think that bash would parse the line as:

  ([ a = b ] && echo "Oops!") || echo "Expected; phew!"
so if the command sequence after `&&` fails, then the code sequence after `||` is executed anyway:

  illo@joe:~ $ [ "a" == "a" ] && >/dev/full echo "strings match" || echo "strings don't match"
  -bash: echo: write error: No space left on device
  strings don't match
  illo@joe:~ $
This is different from the semantics of the `if` block:

  illo@joe:~ $ if [ "a" == "a" ]; then >/dev/full echo "strings match"; else echo "strings don't match"; fi
  -bash: echo: write error: No space left on device
  illo@joe:~ $

shmerl
7 replies
16h3m

> And now for the final lolz. I’ve said above that these are the commands you use to evaluate expressions… but the shell also has expressions of its own via the !, &&, and || operators—all of which work on command exit statuses.

It works for simulating very basic boolean expressions, but it gets ugly quickly if you need more complex combos of NOT, OR and AND. I wish Bash had proper boolean expression support.

remram
4 replies
15h49m

What do you mean? Bash does have (), {}, !, &&, ||

What else do you need for "proper boolean expression support"?

shmerl
3 replies
15h15m

I don't think you can assign the result of a boolean expression to a variable, because there is no boolean type.

Something like this won't work to become false:

    foo=true
    bar=false

    baz=$foo && $bar
    echo $baz
Or even simply:

    foo=! $foo
With numeric expressions, you can at least use $((...))

emmelaich
1 replies
11h56m

Yeh you'd have to do something like

    foo=true
    bar=false

    $($foo) && $($bar); baz=$?
    case $baz in
    1) echo false;;
    0) echo true;;
    esac
It just ain't worth it.

shmerl
0 replies
8h43m

Yeah, that's very convoluted. It would be useful to have some context for boolean expressions, similar to how $((...)) works for numeric ones.

account42
0 replies
4h54m

But you can do

    foo=0 # true
    bar=1 # false

    [ $foo = 0 ] && [ $bar = 0 ]
    baz=$?
    echo $baz
The syntax is a bit unusual coming from more modern languages, as is the use of 0 as true. But you can express whatever conditionals you want. Remember that C originally had no dedicated boolean type either.

You could also use numerical expressions although I don't recommend it due to being even less readable (even more so if you have expression complex enough that you need to protect against overflow):

    baz=$((foo + bar)) # foo && bar
    baz=$((foo * bar)) # foo || bar

candiddevmike
1 replies
15h47m

It works fine, you can satisfy your boolean logic needs with bash tests and exit codes.

You'd be amazed at how much of the world runs (just fine!) on bash conditionals.

hyperhopper
0 replies
13h21m

You'd be surprised how much of the world runs on COBOL and Fortran.

Doesn't mean those are good or even okay or that people should choose to use them.

cduzz
7 replies
15h16m

I have strong opinions about shell and they don't actually line up with the rest of the world...

I believe that [ should never be used, only "test" because [ gives the illusion that the mechanism is some part of the language syntax when it is just another "program" (I'm including built-ins and functions in "program").

(if / || / && look at exit status; a program can't see the exit status of other things other than looking at the magic variable $? which is just another string once expanded; case looks at strings but doesn't operate based on exit status and doesn't set an exit status as part of the case ... esac operation; "programs" set an exit status)

I also believe that [ / test should only ever be used for evaluating filesystem constructs (test -f /dev/null); if you've got string evaluations, use case.
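
String tests in that style look something like this (with a hypothetical $answer):

    case "$answer" in
        y|Y|yes) echo "proceeding" ;;
        *) echo "aborting" ;;
    esac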

Unsurprisingly, most scripts make me itchy, and scripts that I write people find weird.

[edited to add the explanation of "program" vs "syntax" opinion]

chatmasta
4 replies
13h25m

Hello fellow traveler. I've got 8 years of scripts with `test` in them to prove I agree with you. It's a lonely road out there... I blame the Google shell style guide.

I picked up the habit of preferring `test` when I was writing scripts that needed to run in both `sh` and `bash`, but I kept it because it makes more semantic sense to me than treating a character like `[` as a command. It's also weird that `]` is an argument to `[` rather than also being a binary. I mean, I understand the technical reasoning... but it feels like a hack.

js2
2 replies
12h53m

Hello fellow traveler. I've got 20 years of scripts with `if test` in them. I only use `[[` when I need functionality it provides that `test` does not (pattern or regex matching, typically, and for pattern matching I'll generally use a case statement instead).

chatmasta
1 replies
12h40m

Yeah, that's when I use it too. I figure if I've already given in to the temptations of Bash, I may as well go all the way. Sometimes if I'm feeling extra frisky I even do [[ .. ]] || { echo "error!"; exit 1; }

cduzz
0 replies
12h31m

I imagine most people have sets of macros they habitually add to any sufficiently complex script they're modifying...

  yelp(){
    es=$1
    shift
    echo "$@" >&2
    exit $es
  }
  
  test -d /tmp/goober || yelp 33 "couldn't find goober!"
(yes, I'm sure there are standards for exit status ranges and 33 is not such a thing)

quicklime
0 replies
11h37m

The Google style guide doesn’t disagree with you; it says to code exclusively for bash where possible (i.e. mostly everywhere inside Google) and to prefer [[ over [ and test:

https://google.github.io/styleguide/shellguide.html#s6.3-tes...

In a scenario where you also need to support shells other than bash, Google’s shell style guide doesn’t say to use [ over test.

tuyiown
0 replies
8h17m

Reading the comments here it looks like there are somehow dozens of us using test only. Dozens !

cduzz
0 replies
14h49m

Joke's on me; that's the point of the article.

I prefer, when writing in shell, to do this:

    if 
      program
    then
      something
    else
      otherthing
    fi
To emphasize that the thing after the if is just "look at the exit status of the last command before the then."

    if
      ls /tmp/goober
      test -d /tmp/goober
      echo "I'll always execute the then clause because echo will always return a 0 return code"
    then
      echo "this always gets run"
    else
      echo "this never gets run"
    fi
because the "if" is just looking at the exit status of "echo".

[edited to do code blocks]

skrebbel
6 replies
6h41m

I'll never understand how there's a whole class of developers who absolutely despise JavaScript for acting weird when adding arrays to objects, but at the same time gladly write bash scripts and send each other articles about how [ is totally a program but ] isn't and that's all fine and dandy and in no way objectionable.

EDIT: this comment is a bog-standard HN middlebrow dismissal and the blog post doesn't deserve for it to be the top comment. It was morning and I was grumpy. I can't delete it anymore so would appreciate some downvotes.

prerok
1 replies
5h46m

Well, shell scripts are small helper programs, where you don't have to implement all sorts of business rules and complex object interactions. I am pretty sure most developers (I definitely would) would be horrified at the prospect of having to implement in shell what we are now doing in JS. Even if that were possible.

What this gets liked for is the design simplicity and uniformity (across processes), not how it looks or what it actually does.

DonHopkins
0 replies
5h36m

Oh come on. Who are you to say developers don't have to implement business rules and complex interactions? You're just making an after-the-fact rationalization for a terrible, foolish, thoughtless, pointlessly complex design. And then trying to dictate how people should use their tools and what kinds of problems they should limit themselves to solving.

It's not like the shell script designers sat down at a meeting and said:

"OK, it's very important that we don't want people using this language to implement all sorts of business rules and complex object interactions, because decades from now there will be invented an Ousterhoutian dichotomy and government regulations enforcing that developers should use other kinds of languages for that, because of the essential definition of what it means to be a shell scripting language, so we've got to come up with some way of punishing people who attempt to do that, and introduce obscure hard to spot bugs in their programs as a consequence if they have the audacity to do that, or even if their initially simple scripts later get more requirements and have to become more complex. Now let's brainstorm about how we can do that, and make sure the syntactic syrup of ipecac we come up to solve this problem fits in well with the rest of the language design by being totally off-the-wall and unlike every other piece of syntax in any other programming language including itself."

Then again, maybe you have a point, and they did do it on purpose, judging by how terrible the rest of the language is!

"Language Design Is Not Just Solving Puzzles" -Guido van Rossum

https://news.ycombinator.com/item?id=20672739

https://www.artima.com/weblogs/viewpost.jsp?thread=147358

Summary: An incident on python-dev today made me appreciate (again) that there's more to language design than puzzle-solving. A ramble on the nature of Pythonicity, culminating in a comparison of language design to user interface design.

Some people seem to think that language design is just like solving a puzzle. Given a set of requirements they systematically search the solution space for a match, and when they find one, they claim to have the perfect language feature, as if they've solved a Sudoku puzzle. For example, today someone claimed to have solved the problem of the multi-statement lambda.

But such solutions often lack "Pythonicity" -- that elusive trait of a good Python feature. It's impossible to express Pythonicity as a hard constraint. Even the Zen of Python doesn't translate into a simple test of Pythonicity. [...]

http://lambda-the-ultimate.org/node/1298

Guido: Language Design Is Not Just Solving Puzzles

And there's the rub: there's no way to make a Rube Goldberg language feature appear simple. Features of a programming language, whether syntactic or semantic, are all part of the language's user interface. And a user interface can handle only so much complexity or it becomes unusable.

The discussion is about multi-statement lambdas, but I don't want to discuss this specific issue. What's more interesting is the discussion of language as a user interface (an interface to what, you might ask), the underlying assumption that languages have character (e.g., Pythonicity), and the integrated view of semantics and syntax of language constructs when thinking about language usability. [...]

Ousterhout's Dichotomy is a contrived descriptive, not prescriptive, fiction to rationalize the design of TCL after the fact. And despite its flaws and limitations, TCL is orders of magnitude better and more thoughtfully designed, purposefully thought out, and internally consistent than any Unix shell scripting language.

https://news.ycombinator.com/item?id=9970505

https://en.wikipedia.org/wiki/Ousterhout%27s_dichotomy

Ousterhout's dichotomy is computer scientist John Ousterhout's categorization[1] that high-level programming languages tend to fall into two groups, each with distinct properties and uses: system programming languages and scripting languages – compare programming in the large and programming in the small. This distinction underlies the design of his language Tcl. [...]

Criticism: Critics believe that the dichotomy is highly arbitrary, and refer to it as Ousterhout's fallacy or Ousterhout's false dichotomy.[4] While static-versus-dynamic typing, data structure complexity, and dependent versus stand-alone might be said to be unrelated features, the usual critique of Ousterhout's dichotomy is of its distinction of compiling versus interpreting. Neither semantics nor syntax depend significantly on whether a language implementation compiles into machine language, interprets, tokenizes, or byte-compiles at the start of each run, or any mix of these. In addition, basically no languages in widespread use are purely interpreted without a compiler; this makes compiling versus interpreting a dubious parameter in a taxonomy of programming languages.
hun3
1 replies
6h30m

You can replace bash with JS. Scripting is their common application domain. Can you replace JS with bash, though?

If the answer is "no," that means JS has applications (e.g., servers and web clients) that bash can't be used for. And they're saying JavaScript is bad at them. Bash is arguably worse, but it's usually not an option in the first place so you don't get to complain.

mananaysiempre
0 replies
5h27m

You can replace bash with JS

Noo-ot really except in a Turing-tarpit sort of way. Like it or not, once learned (!) Bourne shell together with the traditional tools is a well-designed user interface and works even better for that than, say, Tcl[1]. Not an automation or scripting language, a user interface for all daily interaction. I would absolutely hate to manually sort through my files in JS, while in shell I can often do it faster than in a GUI file manager.

And, of course, it’s also a fairly strong contender as a programming language in the paradigm of many pipelined imperative processes—probably because that paradigm remains largely unexplored. I can only maybe name Icon as a viable competitor, and Icon’s also very nice. (Python, no matter how “inspired” it is by Icon, has traded its command of streams for more mainstream ease of use.) By comparison, the old complement of parallelized JS build tools (Gulp? Grunt? I forgot, it’s been a while) always surprised me with how awkwardly it accomplished shell-script-equivalent tasks.

To be clear, it’s not that Bourne shell is good and JS is bad. JS is a passable Fortran[2], while shell is at best a marginally functional one—there’s a reason Awk exists. But shell competes in categories that most other languages don’t even try to qualify for.

[Yes, I know about rc. I happen to think rc’s focus on one-level lists of strings (incidentally shared by Jam, an attempt at a better make) is a mistake, and more consistent arbitrarily-nested quoting, giving a “stringy Lisp” in the vein of Tcl, would be the way to go.]

[1] http://yosefk.com/blog/i-cant-believe-im-praising-tcl.html

[2] http://conal.net/blog/posts/can-functional-programming-be-li...

simias
0 replies
4h43m

I dislike JavaScript and shell scripts equally, but sometimes I have to add a feature to a web page, so I use JS; and sometimes I need to automate some un*x system task portably and without heavy deps, so shell scripts are the obvious solution.

What annoys me is using JavaScript and shell scripts when there are clearly superior alternatives and no clear advantage besides familiarity (which, admittedly, can be a strong argument).

Shell scripts being an arcane mess is no excuse for JavaScript being as clunky as it is, and vice versa.

rnmmrnm
0 replies
6h8m

One language being the only option available on browsers, and the other being a (replaceable) glue layer between _other_ programs? I also see a minimalistic beauty in the fact that `[` is not even part of the shell: "do one thing and do it well" at its finest.

pavlov
6 replies
10h21m

Writing bash scripts seems like the ideal use case for ChatGPT-style programming.

The programs are fairly short, examples are common in the corpus, and the syntax and execution model are so inscrutable that only a machine can pretend to understand what’s going on.

saagarjha
1 replies
8h27m

I tried doing this and it made a bunch of mistakes with handling spaces and whatnot, the kind that often make shell scripts brittle. Of course, a human would probably make the same mistakes, because who actually knows how to do that correctly? But it doesn’t really make me comfortable using it for anything but the smallest of automation tasks.

account42
0 replies
4h43m

who actually knows how to do that correctly?

Many people. It's not exactly rocket science to quote your arguments, which already gets you most of the way there.
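
A minimal sketch of what "most of the way there" looks like (the filename here is made up):

    f='my file.txt'
    ls $f         # unquoted: the shell splits this into `my` and `file.txt`
    ls "$f"       # quoted: a single argument, as intended

    # Looping over find output safely takes one more idiom (bash):
    find . -name '*.txt' -print0 |
    while IFS= read -r -d '' f; do
        printf '%s\n' "$f"
    done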

croo
1 replies
10h14m

I already tried customizing my command-line prompt with it. All the arcane colouring escape codes and random character-escaping missions became a simple "put the git branch name in parentheses and make it cyan".

db48x
0 replies
10h4m

What’s so arcane about `echo "$(tput setaf 6)cyan$(tput sgr0)"`?
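
For croo's actual ask, a rough sketch (assuming bash, git, and a terminfo-aware terminal; git_branch is a helper made up for illustration):

    cyan=$(tput setaf 6)
    reset=$(tput sgr0)
    git_branch() { git symbolic-ref --short HEAD 2>/dev/null; }
    # \[ \] mark the escapes as zero-width; $(...) re-runs at each prompt.
    PS1='\u@\h \w \[$cyan\]($(git_branch))\[$reset\]\$ '
It still prints empty parentheses outside a git repo, which is where the arcana start creeping back in.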

rjblackman
0 replies
10h6m

Yes, this is the best use case I have found for ChatGPT. I've been automating away all of my little annoyances with PowerShell.

metadat
6 replies
16h7m

I've learned to only use [[ when I want to do a regex match, e.g.

    if [[ "${foo}" =~ ^bar$ ]]; then echo Yes; fi
Otherwise, just stick with "test" or "[".

85,000 lines of bash and counting... Not saying bash is great, but it's still meeting my needs for a lot of stuff. Shrug.

stephenr
2 replies
14h32m

If you want to do (relatively) simple pattern matching in a POSIX-compliant way, `expr`[1] can match BREs, and even return a capture group.

1: https://pubs.opengroup.org/onlinepubs/9699919799/utilities/e...
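
A sketch of both uses (expr patterns are BREs anchored at the start of the string; a \(...\) group makes expr print the capture instead of the match length):

    expr 'libfoo-1.2.3.tar.gz' : 'libfoo-\(.*\)\.tar\.gz'    # prints 1.2.3
    expr "$name" : '[A-Za-z_][A-Za-z0-9_]*$' >/dev/null && echo 'valid identifier'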

somat
1 replies
6h17m

As someone who enjoys shell scripting, I don't think I have ever used the "expr" command. This astonishes and baffles me, because I used to read the man pages for fun; I even had a little script to pick a random page. So thanks a bunch for bringing it to my attention.

My only guess as to why I am unfamiliar with the command is that perhaps younger me failed to figure out that "expression" means regular expression. It only clicked this time because of your comment and the "SEE ALSO: re_format" link at the bottom.

http://man.openbsd.org/expr

stephenr
0 replies
5h14m

Yeah it's slightly obscure for a lot of people I think: most who do know of it seem to associate it purely with arithmetic, and rightly use `$((...))` instead.
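
A quick illustration of that split (both lines are POSIX):

    expr 2 \* 3       # prints 6, but forks a separate utility and needs \* escaped
    echo $((2 * 3))   # prints 6 using the shell's built-in arithmetic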

cvccvroomvroom
1 replies
15h53m

That's a pointless example.

    [ "$foo" = bar ] && echo Yes
For prefix matches, a POSIX case statement with * globs is generally good enough. ([ itself can't glob: its = is plain string equality, and an unquoted extra* would be expanded against filenames in the current directory.)

    case $bar in extra*) echo '$bar began with extra' ;; esac
Bash's regex dialect is primitive and rarely worth fussing over. For anything complicated, use another tool, be it grep, awk, perl, or such. There are diminishing returns in obsessing over doing everything in bash when a complicated task demands capabilities better suited to another tool with greater reusability, modularity, and intrinsic types.

aidenn0
0 replies
15h36m

I thought regex inside [[ was the same as egrep?

penguin_booze
0 replies
9h52m

The fact that the regex is written bare and unquoted got me a while ago. I'm someone who religiously quotes everything; it took me a while to figure out why my stupidly simple regex didn't match.
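
A sketch of the trap, for anyone who hasn't hit it yet (since bash 3.2, quoted parts of the right-hand side match literally):

    s=abc
    [[ $s =~ ^a.c$ ]] && echo match     # prints match: . is a regex metacharacter
    [[ $s =~ "^a.c$" ]] && echo match   # prints nothing: the quoted pattern is literal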

hibbelig
2 replies
1h55m

I have recently written

    if some_command 1>/dev/null 2>/dev/null; then
        : # A-OK
    else
        here_is_the_code_i_actually_needed
    fi
And it leaves me wondering if there is a way to “negate” the exit status of a command...

(The command was “docker volume inspect VOLUME”, and in one place in my script I had to do things when the volume existed, and in other places I had to do things when the volume did not exist...)

jeramey
1 replies
1h46m

Yes, there is. To negate, just use ! as with many other languages.

    if ! command …; then
        do_needed_things
    fi
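
Applied to the docker case from the parent comment (a sketch; VOLUME is a placeholder):

    if ! docker volume inspect "$VOLUME" >/dev/null 2>&1; then
        docker volume create "$VOLUME"
    fi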

ramshorns
0 replies
23m

How does that work? It doesn't look like ! is a command or shell builtin, so I guess it's an argument to if.

user261
1 replies
13h51m

I really didn't understand why the last if statement is confusing. Is it because, when starting out with shell scripting, one would usually assume that [ is part of the bash scripting language and not just another program? If so, then I think I get it now; otherwise, please mention why it's surprising. Also, @author, thanks for a nice article. It was a good read.

computerfriend
0 replies
2h58m

Even if you don't know about [ as a binary, I still don't see how it's confusing. Seems like very normal bash to me.

xyzzy_plugh
0 replies
1h33m

Some people insist that Linux distributions be called GNU/Linux, not just Linux. The reason is that, strictly speaking, Linux is just a kernel and, when you are using a Linux system, you are primarily interacting with the GNU userland. While I sympathize with that idea, I’m not the one to adopt the GNU/Linux name, in part because most Linux distributions today ship with software developed by many more vendors than just GNU. In this post, however, I use the GNU/Linux term because saying Linux alone would be unfair to Linux.

Hear, hear!

xiaodai
0 replies
6h37m

R programmers be like: I still don't know when to use [ or [[. I just put a browser() there and test out the options.

wodenokoto
0 replies
8h41m

I had no idea [ was a program, and the fact that it checks whether the last argument is a closing bracket is kinda funny to me.

But at least it explains why you need spaces on both sides of the brackets.
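
Easy to check for yourself; this is from a typical GNU/Linux box (paths and exact messages vary):

    $ type -a [
    [ is a shell builtin
    [ is /usr/bin/[
    $ /usr/bin/[ 1 -eq 1 ]; echo $?
    0
    $ /usr/bin/[ 1 -eq 1; echo $?     # forgot the closing ]
    /usr/bin/[: missing ']'
    2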

stn_za
0 replies
10h27m

js frontend dev masses ripping on bash here. lulz

glandium
0 replies
11h32m

One place where you learn not to use [ or [[, or be very careful about them is autoconf shell scripts (and that applies to more than `if`, like globs, sed, awk, etc.). Because square brackets are quote characters in autoconf. https://www.gnu.org/savannah-checkouts/gnu/autoconf/manual/a...
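
A sketch of the usual workaround in configure.ac (AS_IF and the quadrigraphs are standard autoconf; the enable_foo flag is made up):

    dnl [ and ] are M4 quote characters here, so spell the test out:
    AS_IF([test "x$enable_foo" = xyes], [AC_MSG_NOTICE([foo enabled])])
    dnl For a literal bracket, use quadrigraphs: @<:@ is [ and @:>@ is ]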

JNRowe
0 replies
15h45m

chubot has written an interesting document¹ exploring more of the nuance with test/[/[[, and many of the other entries in that blog have intriguing explanations of the oddities of our shells(a random example²).

¹ https://www.oilshell.org/blog/2017/08/31.html

² https://www.oilshell.org/blog/2016/11/18.html

DonHopkins
0 replies
5h44m

Syntactic Syrup of Ipecac.