
Command line interface guidelines (2021)

pimlottc
43 replies
4d2h

Please also consider a --dry-run option that gives a preview of what actions would be taken without actually making any changes. This is really helpful for learning a tool and making sure you got complex options and fileglobs correct before committing to potentially irreversible changes.
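
For illustration, a minimal sketch of such a flag in Python with argparse (the tool name and the remove action are hypothetical):

    import argparse
    import os

    parser = argparse.ArgumentParser(prog="cleanup")
    parser.add_argument("paths", nargs="+")
    parser.add_argument("--dry-run", action="store_true",
                        help="show what would be removed without removing anything")
    args = parser.parse_args()

    for path in args.paths:
        if args.dry_run:
            print(f"would remove {path}")   # preview only
        else:
            print(f"removing {path}")
            os.remove(path)                  # the irreversible part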

samstave
26 replies
4d2h

dry-run should be a pipeable | function for any command:

> do_thing.py | dry-run

--

Think of it as "explain your work, step by step" as one would prompt...

also, it's food for thought, you muppets.

---

@jasonjmcghee ; ( $ ) . ( $ ) great justice.

jasonjmcghee
20 replies
4d2h

How could that work? The script on the left executes first, piping its output to another command won’t prevent that

saghm
17 replies
4d2h

I think there are ways to detect if stdout is a pipe and operate differently (e.g. by using different defaults for coloring output), but I'm not sure if there are ways to detect what the other side of the pipe is, much less what `dry-run` would actually be expected to do in this case

sokoloff
7 replies
4d

I often pipe output to tee and would be pretty annoyed if that changed the behavior of the original command to not do anything because stdout was a pipe.

indymike
6 replies
3d21h

> I often pipe output to tee and would be pretty annoyed if that changed the behavior of the original command

The output sent to tee is usually not the same as the output from the command to the terminal, so you are getting something different from what most human users expect from the original command... the reason is that terminal escape codes and other formatting for humans may need to be omitted from output to a pipe. You do this by asking the OS, "is this thing a terminal?".

Python example

"terminal" if sys.stdout.isatty() else "something else"

C is very similar:

if (isatty(1)) fprintf(stdout, "Terminal."); else fprintf(stdout, "Not Terminal.");  /* needs <unistd.h> and <stdio.h> */

(printf works, too).

saghm
3 replies
3d17h

You can see this pretty easily with `ls` as well by using different options for the color. If you run `ls --color=auto`, you'll get colored output when running directly but black and white output if you pipe to `less`. However, if you pass `--color=always`, you'll get colored output when running directly and a bunch of garbage around some of the entries when piping to `less` because it doesn't interpret ANSI escape codes for color by default (although depending on what the output you're piping into `less` is, there are some workarounds like https://www.gnu.org/software/src-highlite/)

funcDropShadow
2 replies
3d6h

You can use ls --color=always | less -R to render the colors in less.

samstave
0 replies
3d5h

That's fn cool

saghm
0 replies
17h2m

Huh, not sure how I've never found that option before. Thanks for the lesson, I'll need to rtfm a bit more closely!

sokoloff
1 replies
3d21h

Sure; the output format changes, but the functionality doesn't.

Even ls outputs a tabular format by default when it's on a terminal and a list with one file/dir per line when it's not on a terminal (if it's piped to cat, for example, which is why ls | wc -l correctly counts the entries).

But the (essential) behavior of the command remains the same. ls still lists files/dirs... scp still copies files, etc.

indymike
0 replies
2d23h

> But the (essential) behavior of the command remains the same.

Of course. A command needs to do its defined function.

You'll find some programs that are quite a bit different when invoked from outside the terminal vs inside the terminal. Developers need to take into account both situations, which is really the point of the original post.

jasonjmcghee
3 replies
4d1h

Maybe I'm misunderstanding, but it sounds like you're proposing modifying the command on the left to detect whether it's piping its output (curious how you do that btw, sounds pretty cool / useful)

But at that point you could just handle --dry-run directly

kergonath
2 replies
4d1h

> Maybe I'm misunderstanding, but it sounds like you're proposing modifying the command on the left to detect whether it's piping its output (curious how you do that btw, sounds pretty cool / useful)

You can do that. Possibly not from all languages, but for anything that can call functions in the C standard library, that's what isatty() is for (among other uses). It takes a file descriptor and returns whether it is a terminal or not. If you do this with stdout, this tells you whether it goes to a terminal or whether it is redirected in some way.

As the parent suspects, though, this won’t tell you anything about what is on the other side of the redirection.

jasonjmcghee
0 replies
4d1h

Very cool function - thank you!

epcoa
0 replies
3d4h

It also doesn’t tell you what is the terminus of a pipeline. Because often with isatty what you really want to know is if the pipeline ends with a tty. `ls` is screen formatted but `ls | less` is not without extra work.

gpderetta
2 replies
4d1h

on the other hand you could have 'dry-run <command>' that via .so interposition tricks could intercept and list all destructive changes done by an arbitrary <command>, as a form of sandboxing.

account42
1 replies
3d4h

Not generally because you can't e.g. know if a write to a socket is destructive or just a query for information needed to decide the next steps/output.

gpderetta
0 replies
3d2h

True, I was only thinking of fs access!

kjs3
1 replies
4d1h

If it operated differently based on what it was outputting to, that kinda defeats the point of 'dry run', which is to see exactly what's going to happen based on what's on the other side of the pipe when I run it for real. "Did this blow up because it's a bad command, or because there's a bug in the 'only kinda real dry run' code?".

samstave
0 replies
3d21h

+++

-

What if there was a 'deep-pipe' '||' which would be based on a set env/docker/blah - which would launch an env and execute your '||'d code in it, and output some log/metrics/whatever?

int_19h
1 replies
3d23h

Here's a possibly more interesting take on this: instead of do-thing.py, imagine if there was thing.py, which processed all those complex CLI options, and as output produced a linear shell script, with each line invoking some atomic operation and a comment explaining which options contributed to that particular line.
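
A hedged sketch of that idea in Python (the options and the commands it emits are made up; the point is that the output is a reviewable script, not an action):

    import argparse
    import shlex

    parser = argparse.ArgumentParser(prog="thing.py")
    parser.add_argument("--src", required=True)
    parser.add_argument("--dest", required=True)
    parser.add_argument("--archive", action="store_true")
    args = parser.parse_args()

    # Emit a linear shell script instead of doing the work directly.
    print("#!/bin/sh -eu")
    print("# mkdir emitted because --dest was given")
    print(f"mkdir -p {shlex.quote(args.dest)}")
    flags = "-a" if args.archive else "-r"
    print(f"# cp flag {flags} chosen because --archive={args.archive}")
    print(f"cp {flags} {shlex.quote(args.src)} {shlex.quote(args.dest)}")

The reviewed output can then be piped to sh, as the sibling comment describes.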

eichin
0 replies
3d22h

Yeah, that's a pattern I use for occasional sysadmin tools - the command itself generates the actual commands, I review them, then "<up arrow> | sh -xeu". (Yes, there's no guarantee that the output is the same, so I don't use the pattern when that's a risk; it's also rarely if ever used for things I expect other people to run, just bulk operations that are saving me repetition.)

koolba
1 replies
4d2h

Having it be a shell function would work. It could create a copy-on-write file system or override the syscalls for file system access.

agos
0 replies
4d2h

not all the destructive actions are on file system

jasonjmcghee
1 replies
4d1h

Wasn't trying to be a prick; I was commenting "here's what I'm seeing with your proposal and the presented obstacles, how would you overcome them?"

One approach could be something like "set -x" after setting a confirmation with the "trap" command.

    # extdebug makes a non-zero return from the DEBUG trap skip the trapped command
    shopt -s extdebug

    confirm_execution() {
        echo -n "Execute $BASH_COMMAND? [y/N] "
        read -r response
        if [[ $response != [yY] ]]; then
            echo "Skipped."
            return 1
        fi
    }

    trap 'confirm_execution' DEBUG
    set -x
    user_command
    set +x

But that's a wrapper script

samstave
0 replies
3d21h

confirm_execution() {
    echo -n "Execute $BASH_COMMAND? [y/N] "
    read response
    if [[ $response != [yY] ]]; then
        echo "Skipped."
        return 1
    fi
}

draw_heart() {
    cat << EOF
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" width="100" height="100">
  <path fill="red" d="M12 21.35l-1.45-1.32C5.4 15.36 2 12.28 2 8.5 2 5.42 4.42 3 7.5 3c1.74 0 3.41.81 4.5 2.09C13.09 3.81 14.76 3 16.5 3 19.58 3 22 5.42 22 8.5c0 3.78-3.4 6.86-8.55 11.54L12 21.35z"/>
</svg>
EOF
}

trap 'confirm_execution' DEBUG
set -x
draw_heart
set +x

foobarqux
0 replies
4d1h

"try" is not a prefix (like watch, nice, etc) uses an overlayfs in order to be able to see and accept or reject changes to your filesystem from a command

https://github.com/binpash/try

bongodongobob
7 replies
4d

WhatIf is one of my favorite things about PowerShell. Don't know what I'd do without it. Has saved me many times from completely destroying things.

snuxoll
3 replies
3d22h

WhatIf in Powershell also remains completely broken; not that it doesn't work, but it is not properly passed down the call stack like it's supposed to from cmdlets or functions written in PowerShell (as opposed to those loaded from .net assemblies).

jcotton42
2 replies
3d22h

Where have you seen this? If the function does it properly (SupportsShouldProcess) then it should pass down automatically.

snuxoll
0 replies
2d21h

I have written just in the last month cmdlets that have SupportsShouldProcess, and it does not pass on as it should. See https://learn.microsoft.com/en-us/powershell/scripting/learn... and https://github.com/PowerShell/PowerShell-RFC/pull/221#issuec...

partdavid
0 replies
3d

I have seen it. I think it's because it's not easy to see whether the cmdlets or functions you're calling support it, and support it right. It does work, but I have seen it not always work, and that results in having to check or test every command. I think the GP is overstating the inconvenience, though, and PowerShell is still much, much better than any Unix shell in this respect (which simply has no mechanism and no standard way to discover one).

ojintoad
2 replies
4d

Agree!

I'll go further. PowerShell covers a lot of the concerns in the OP out of the box. It is extremely well thought out and is one of my favorite cli models to buy into.

bigstrat2003
1 replies
3d22h

Powershell is just plain a great shell. Blows all the *sh variants out of the water. I'd love for it to gain traction in Linux (so that, for example I could use it as my shell on my desktop) but I don't really see that happening.

partdavid
0 replies
3d

What are the things that are stopping you?

I use MacOS on my personal machine and Linux for various shellboxes and I switched to Powershell years ago and haven't looked back. Occasionally I invoke bash as a language runtime for checking shell script stuff, the way I would any other language's REPL, but for a shell? Powershell is strictly better.

The one actual problem with Powershell in this area is quoting for external commands. It's solvable in scripts by replacing certain things with double-quoted equivalents but not really interactively, and it is an occasional pain (though I still think it's overall less of a problem than quoting in general in bash and its ilk).

azemetre
3 replies
3d20h

Do you have an example of a CLT that has a dry run flag? I am very confused on how to design one for some CLTs I'm making.

If you call an API to retrieve data, how can that be a dry run? Are you supposed to give fake examples with fake output?

scbrg
0 replies
3d16h

apt-get. It simply tells you what it's going to remove/install.

Calling an API to retrieve data is not really the type of program that requires a dry-run flag. It's mainly useful for commands that change the state of something in ways that could potentially be destructive, unwanted and/or hard to revert.

funcDropShadow
0 replies
3d6h

make has -n or --dry-run. But originally it was called -n, and that has been used by many CLI tools.

amake
0 replies
3d12h

The AWS CLI has --[no]-dry-run in many of its subcommands.

https://docs.aws.amazon.com/cli/latest/userguide/cli-usage-h...

epage
1 replies
4d

Or if the command can't be undone and has significant side effects, make it `--dry-run` by default and have an `--execute` flag

BD103
0 replies
3d23h

Exactly. This is implemented by the Javascript code formatter Prettier [1] where you have to pass `--write` in order to overwrite an unformatted file.

[1]: https://prettier.io

danstewart_
1 replies
3d21h

I tend to go the opposite way and have the default behaviour not actually make any changes and require passing `--commit` to actually do something.

I feel it’s safer for scripts that have irreversible (or difficult to reverse) actions.

enriquto
0 replies
3d20h

Even safer: your program never performs any actual deed. It just prints commands that you can run afterwards, using an external program, ideally a shell. This has the advantage of allowing the user to edit all the actions one by one.

Instead of

    myprog           # see what would happen
    myprog --commit  # alright, do it

You do

    myprog           # see what would happen
    myprog | sh      # alright, do it

But if you want to change something:

    myprog > x
    vi x
    cat x | sh

And if you just want to run everything in parallel:

    myprog | parallel -j 16

ceving
25 replies
4d3h

The current Unix command line situation is on the one hand "incredibly useful" and on the other hand "broken by design".

Why is it incredibly useful?

Just imagine how long it would take to write the following in C or Rust:

    curl -sS https://go.dev/doc/devel/release |
      html2text |
      grep -o -P '\bgo\d+\.\d+\.\d+\b' |
      sort -V |
      uniq |
      tail -1
Why is it broken by design?

Read this: https://news.ycombinator.com/item?id=29747034

The problem: a command line interface must be human readable and machine readable at the same time. There is no canonical way to solve this problem.

shadowgovt
9 replies
4d3h

> Why is it broken by design?

I think your example self-explains why it's broken by design. It's a good example.

> a command line interface must be human readable and machine readable at the same time. There is no canonical way to solve this problem.

And you know, there could be one. Apple has Human Interface Guidelines to reify the meaning of the visual abstractions in its desktop UI. The problem is that the command line didn't come from people who think like Apple designers; it came from people who think "How can I express what I want using the least code possible, because laziness, impatience, and hubris are virtues?" And they weren't wrong for the time (especially because every byte matters), but the design decisions they made got baked into tooling that can't now be moved.

I think at this point we'd have to punt the POSIX toolchain to get something better; it's hard for me to imagine how we'd build discoverable, conceptually-consistent UX atop what we currently have.

ceving
7 replies
4d2h

I don't think graphical user interfaces set a particularly high standard. They do not compose. How would you express a loop or a recursion in a graphical user interface?

samatman
5 replies
3d23h

I'm genuinely unsure how to translate your question into, hmm. Anything actually.

The snarky answer is "By writing it in source code, in a GUI text editor", a thing I do frequently. But the problem is that I have no idea what you're getting at in the first place, so that's just an attempt to recover some meaning from what you wrote.

shadowgovt
4 replies
3d22h

Shells allow one to pipe around the data flowing from called process to called process in a very smooth and transparent fashion.

By and large, GUIs do not. There is no such thing as a universal shell for GUIs, and attempts to layer automation atop the GUI abstraction are generally spotty and unreliable (certainly when compared to CLI and shells). This is both for technical reasons (i.e. it's much easier to clearly delineate the two ends of a pipe than to clearly delineate "I want to click on the red square inside the 'diagram' window inside the drawing app") and for ecosystem reasons (since GUIs aren't thought of as automatable, GUI designers are free to move pieces around version-to-version of software, making it extremely challenging to describe a GUI structurally).

I've seen some neat attempts at GUI automation (Sikuli is my favorite) but it's never been a core feature like it is in the CLIs-glued-together-by-shells world.

samatman
3 replies
3d22h

CLIs are in the modern world a subset of GUI programs which run inside a text-oriented window, and plenty of more sophisticated GUI programs can be directed textually, with macros or short scripts, and/or have a command-line component for doing that sort of pipe or batch-oriented work.

Yeah, sometimes there's functionality stuck behind a button or menu select which I want exposed in a more textual way, in macOS that's when you break out Automator, Alfred, or Hammerspoon (the only one I've ever used fwiw), Linux and Windows have their own equivalents.

I don't think the distinction you're pointing to is nearly so stark or clear-cut as it's often made out to be. Ecosystems converge towards the tasks which are amenable to batching and pipelining being equipped to do so.

shadowgovt
2 replies
3d20h

But in general, batching and pipelining are done via a text-based interface. Extremely rare is the teachable GUI interface where there's a point-and-click way to describe the concept "See these five items I clicked? Do that twenty-eight more times with this radio button family swizzled via a linear sweep."

samatman
1 replies
3d19h

My point being, I don't see why there ever should be. A text window is a valid and frequently-included part of GUI programs (I happen to be using one this instant, in fact), so there isn't a lot of advantage in replacing something like Lua scripting with a bunch of buttons and whizbangs. It's been tried for normal programming, and no one likes it.

The specific kind of task you described is frequently exposed as macros in programs complex enough to deserve it.

shadowgovt
0 replies
3d2h

> so there isn't a lot of advantage in replacing something like Lua scripting with a bunch of buttons and whizbangs

I mean... "no-code development" is an entire category of product offerings. People are positively thirsty for being able to do that sort of thing without having to learn to love staring at giant blobs of text all day.

shadowgovt
0 replies
4d2h

They are a challenge for composition, often.

Great for discoverability though, and that doesn't require graphics, just context.

There should be a button I can push in my shell that lets me ask "what does the token at the cursor mean," and a button that lets me type a plain-language search string that wires down to a contextual search (i.e. if I'm in the middle of typing out "grep", I should be able to ask "how do I search folders?").

We didn't have the tools to build this when grep was invented; we have them now.

rascul
0 replies
4d1h

> I think at this point we'd have to punt the POSIX toolchain to get something better; it's hard for me to imagine how we'd build discoverable, conceptually-consistent UX atop what we currently have.

POSIX could potentially do this. It already has a bit about conventions and could be expanded. The problem is getting things to adhere to it, plus I doubt the POSIX authors could be convinced to add a lot more to it.

https://pubs.opengroup.org/onlinepubs/9699919799/basedefs/V1...

ta8645
6 replies
4d3h

> The problem: a command line interface must be human readable and machine readable at the same time.

Since the computer is there to serve us, ultimately the solution must be for the machine to read as well as humans.

ceving
3 replies
4d2h

Never ever. Humans are idiots. If you build machines like humans you will get idiotic machines. Machines need to do things exactly right and not almost right or mostly right. Current AI trends will not create better machines, just more idiotic machines. Those idiotic machines will be sufficient to impress idiots, but they will not help to do things exactly right. Automated theorem proving is the only way to build better machines.

ta8645
0 replies
4d2h

> Automated theorem proving is the only way to build better machines.

Sure, but that will just be one aspect of an integrated and holistic AI, which can configure the parameters input to the automated theorem proving component, and act on the results.

samatman
0 replies
3d23h

Automated theorem proving will only ever be an arrow in the quiver, because even in theory, the space of useful computer programs is a vast superset of those which may have all their properties formally verified. In practice, it's a much larger superset than theory allows. Growing the space of formally verifiable subprograms is a worthy endeavor, sure, but so is good old-fashioned engineering.

funcDropShadow
0 replies
3d6h

> Automated theorem proving is the only way to build better machines.

That is only true for a very narrow notion of "better machines". Automated theorem proving is still light years away from being applicable in the majority of software projects. Don't get me wrong I am aware of the progress made in the last decades. Yet, it will be some time until someone writing the next iOS app will reach routinely for an automated theorem prover to lower the defect rate.

And even then, the question remains whether fulfilling a formal specification is at all correlated with "better machines" or "better software". There are domains where this is conceivable: OS kernels (L4), certifying compilers (CompCert), and others. But how does a theorem prover, automated or not, help with improving the next generations of video codecs? That is an intrinsically subjective problem -- the quality axis, less so the performance axis. How does theorem proving help with neural networks? How does it help with capturing the right business process to actually improve business outcomes and not just introduce new bureaucracy?

oblio
1 replies
4d2h

What if the ultimate solution is only achieved after our lifetimes?

ta8645
0 replies
4d2h

I don't really understand the question. Nobody got to fly until we figured out how to build airplanes, people before that lived on the ground. Likewise, we'll live with iterations on the current command line, until AI is fully integrated with it.

friendzis
2 replies
4d2h

> The problem: a command line interface must be human readable and machine readable at the same time. There is no canonical way to solve this problem.

* Powershell has entered the chat

riddley
1 replies
4d2h

Powershell has some cool things, but it's a disaster when it comes to usability.

adamrezich
0 replies
3d23h

how so?

n_plus_1_acc
0 replies
3d18h

If you assume a similar set of primitives, it could be as simple as

    html2text(get("https://go.dev/doc/devel/release"))
        .find_all("\bgo\d+\.\d+\.\d+\b")
        .sorted()
        .unique()
        .last()

It's just that Rust is designed to be more robust in exchange for stricter compile-time checks.
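
For comparison, a hedged sketch of the same pipeline in plain Python (standard library only; the html2text step is skipped because the version regex works on the raw HTML anyway):

    import re
    import urllib.request

    # curl -sS equivalent: fetch the release notes page
    html = urllib.request.urlopen("https://go.dev/doc/devel/release").read().decode()

    # grep -o equivalent: pull out every goX.Y.Z version string
    versions = set(re.findall(r"\bgo\d+\.\d+\.\d+\b", html))

    # sort -V | uniq | tail -1 equivalent: pick the numerically highest version
    latest = max(versions, key=lambda v: tuple(map(int, v[2:].split("."))))
    print(latest)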

hi41
0 replies
2d23h

Could you please explain what that command does?

avgcorrection
0 replies
4d1h

The article covers this. Special machine-readable output modes like `--json`.

PurpleRamen
0 replies
4d2h

> The problem: a command line interface must be human readable and machine readable at the same time. There is no canonical way to solve this problem.

It's not really the "same time". Usually, human and machine use the same command, but at different times. And there are many ways to enable different outputs, even at the same time. But this all depends on having some standard which everyone follows. And that's where it becomes complicated.

BlueTemplar
0 replies
4d1h

Your example not only doesn't prove that it's broken by design (only broken most of the time ?), it even seems to give a (relatively ?) easy way to fix it ?

> Since very few implementations of ls allow you to terminate filenames with NUL characters instead of newlines

In fact the solution to this issue seems to be so obvious that I might be missing something ? (Rejection by shell interfaces for some reason ??)

jansan
13 replies
4d3h

What I find super irritating is that some terminals will automatically execute a command if you paste it from the clipboard and there is a newline char at the end. IMO command line interfaces should not do this.

riddley
2 replies
4d3h

You might try a clipboard manager. Most of these that I've used have an option to strip new lines from anything in the clipboard.

eviks
1 replies
4d2h

that would break multi-line commands, so not a good option

riddley
0 replies
4d2h

It just strips off the final one.

avgcorrection
2 replies
4d1h

Bash

    bind 'set enable-bracketed-paste on'

mixmastamyk
1 replies
3d14h

Put it in .inputrc, which I believe is portable to readline shells:

https://wiki.archlinux.org/title/Readline#Bracketed_paste

avgcorrection
0 replies
3d7h

Wouldn’t surprise me if that’s the article where I got it from. :)

crlfcrlf
1 replies
4d3h

You find it irritating that terminals execute commands when they receive a newline, which is indistinguishable from pressing the enter key? The program is doing exactly as it should. Consider not copying newline characters, and you will solve the problem.

mixmastamyk
0 replies
3d14h

Bracketed paste has been a thing for twenty years now:

https://en.wikipedia.org/wiki/Bracketed-paste

Put it in .inputrc:

https://wiki.archlinux.org/title/Readline#Bracketed_paste

aequitas
1 replies
4d

Fish shell by default does the expected behaviour of inserting it and allowing you to edit the command (multiline if needed) before executing it with [enter]. I've found Fish shell to have a lot of sane defaults and have yet to find a thing I would like to customize except for the prompt.

theshrike79
0 replies
3d1h

And the best way to customise the prompt is to install https://starship.rs =)

twic
0 replies
3d4h

If I know there will be at most one newline, I start by typing "# ", then paste. If the newline gets interpreted, then the whole line is just a comment anyway. It's then easy to edit the # off the start of the line before running it for real.

marcosdumay
0 replies
4d2h

You want to use C-x C-e.

bloopernova
0 replies
4d3h

On macOS, iTerm asks if you try to paste a string with a newline. You can disable it if you find it annoying.

bluetomcat
12 replies
4d3h

> Traditionally, UNIX commands were written under the assumption they were going to be used primarily by other programs. They had more in common with functions in a programming language than with graphical applications.

Not quite. They were primarily intended for interactive use within a login shell. There are the programs which generate output on stdout (ls, cat, find, tty, who, date), and there are the "silent" text filters (tr, grep, cut, uniq, sort, wc). A one-liner would enable you to do basic computing tasks in that era. Any complex program would be written in C. After the appearance of DSLs like sed and AWK, certain string-heavy programs were offloaded to the shell.

The shell is not a sane programming environment and was never intended as such.

rollcat
10 replies
4d3h

> The shell is not a sane programming environment and was never intended as such.

There are 10kloc C programs that could be 10 lines of shell and there are 1kloc shell programs that could've been 100 lines of C.

Both kinds are nowadays probably better done in Python or Lua, but the shell and C are what's most universally available.

bluetomcat
9 replies
4d3h

> There are 10kloc C programs that could be 10 lines of shell

Only when the shell calls other external C programs. Ten lines of calling ffmpeg or curl is not shell programming.

> there are 1kloc shell programs that could've been 100 lines of C

The 1kloc shell programs are fragile spaghetti that breaks in weird ways. Any invocation of an external program can fail for a variety of reasons, and the shell doesn't provide adequate mechanisms for dealing with it, apart from exit codes and filtering error text output.

fargle
6 replies
4d2h

> Only when the shell calls other external C programs. Ten lines of calling ffmpeg or curl is not shell programming.

100% wrong. this is what the shell was designed to do and where it is at its best.

often shell "scripts" are used like "macros" or power-tools, shortcuts to save off a complex invocation or workflow. error handling isn't as important in a one-off and "adequate" is whatever gets the job done for the user, which it does.

it's rare that 1kloc shell script is the best engineering choice vs. (in the ancient days) Perl or (today) Python. e.g. "real" programming languages. you mostly should not write large programs in shell. and you really should not glue together pipelines of external programs using, for example, Python or C, which is onerous.

ah, the HN crowd: where everything is either black or white, great or terrible. how about "each to his own" and "use the right tool for the job"

bongodongobob
5 replies
4d

It's not rare at all. In every corporate environment I've worked in, my choices were PowerShell or 3 months of red tape, meetings, and security audits.

I get that in a dev shop that's not the case, but most businesses employ 0 devs. So people end up being forced into shell scripting because it's the only approved option.

int_19h
2 replies
3d23h

At least PowerShell gives you access to the entirety of .NET, even if syntax is not ideal for non-interactive use in many cases.

bongodongobob
1 replies
3d23h

Yup. My "scripts" end up with a lot of .NET stuff in there. I'd rather use Python, but being forced to use PS for years, I've come to like it.

int_19h
0 replies
3d15h

Depending on what you're doing, C# might also be an option - since the compiler is a part of the .NET Runtime (not just SDK), it's available on any Windows install with PowerShell these days. It's too bad that the system one is always an old version that doesn't do .csx, but still.

fargle
1 replies
3d4h

it's not rare, it's rare that it's the best engineering choice.

if your management can't allow even a python script, but it can allow bash? then it's not an engineering problem. as you even allude - your business does not have a software engineering culture.

a good engineering culture does not depend on job titles or budget or degree either. but it can't exist without healthy management.

bongodongobob
0 replies
2d23h

It's much easier to control allowed functionality in PowerShell via security policies. Python is comparatively a free for all. It makes sense in a non-dev environment.

shadowgovt
0 replies
4d3h

> Ten lines of calling ffmpeg or curl is not shell programming

Ten lines of calling ffmpeg or curl is shell programming in precisely the same sense that 100 lines of C that `#include <sys/socket.h>` are C programming.

If your code isn't standing on the shoulders of giants, you're probably wasting everyone's time.

numeromancer
0 replies
3d2h

> Any invocation of an external program can fail for a variety of reasons, and the shell doesn't provide adequate mechanisms for dealing with it, apart from exit codes and filtering error text output.

True, handling multiple sub-processes is difficult in most Unix shells. This is a function where Powershell could have done so much better, but didn't. It is somewhat better than bash&co, but could have been much more so. Python does it much better than Powershell does.

samatman
0 replies
3d23h

Sane or not, shells are programming languages, and in early Unix this was quite a bit more prominent and obvious, the fanout into sh, bash, ksh, and csh being exemplary.

To me it makes total sense to think of the standard POSIX toolkit as the standard library of the various shell languages, that seems basically correct in fact.

candiddevmike
8 replies
4d3h

I get that some CLIs are absolutely huge and require nesting (like aws), but it really drives me nuts traversing nested CLIs. I'd rather most apps spit out all their options in the help and let me use less to find what I need instead of going into each level and running help.

wjdp
3 replies
4d2h

I find it depends on the number of subcommands and if those subcommands are clear. It's rather useful to have the options whittled down to just the ones you need if you know which subcommand you need, but otherwise it can be painful.

Grepping through a large list of options is also painful; seems there needs to be a balance here.

candiddevmike
2 replies
4d2h

No need to grep, just use less and vi commands like forward slash to search through things/move about a long list.

hk__2
1 replies
4d1h

It’s the task of searching in a long list that’s painful, not the tool to do it.

cellularmitosis
0 replies
3d20h

But we are comparing the pain of searching vs the pain of having to re-run the help command for each possible subcommand, and parsing through all of that. Much easier to just have one command invocation and search through it.

There is a web analogy for this in how people organize FAQs. Some have a list of section links, and you have to click on a section to get the FAQs for that topic. Others just put everything on one giant page.

Here's the problem scenario with splitting things up into section pages: You think you see the appropriate section, but then you don't see your concern answered. There are two possibilities: either the organization was counter-intuitive and your concern was answered in one of the other sections, or your concern wasn't answered anywhere. And what's the only way to be sure? Visit every single section page and search through all of them.

Much, much less painful to just have it all on one page and search it.

frontalier
1 replies
4d2h

isn't that what man(ual) is for?

otteromkram
0 replies
3d23h

You mean manpage? Yes.

swozey
0 replies
3d23h

I've spoken about this previously; I write a huge number of TUI apps. I write every single app to have a nice TUI frontend for people who are less technical (like QA) that is purely the UI/UX, IF you want to use it. What that does, in all of my apps, is pass the settings chosen in the TUI to some sort of separate generator file or function that can always be called on its own in the terminal, with all options/help that are used in the TUI.

So it's scriptable and useful by multiple levels of skill.

epage
0 replies
4d

Something that can help is flattening subcommands' `--help` into their parent

e.g. `git stash <sub> --help` just reports `git stash --help` which includes each `<sub>`

avgcorrection
8 replies
4d1h

The worst part about terminal programs is that they can’t be deprecated. Or I don’t know of a way to do it. Because once you release something someone might immediately put it into a script somewhere instead of running it interactively. Now what do you do? Display a “hint” about the deprecation? Well no one’s gonna read that because it’s a script which is run non-interactively.

So you just have to design the UI perfectly on the first try. That’s possible for small tools but what about larger ones? Past a certain point it becomes a truism that do-it-once perfectly without iteration is impossible.

fleischhauf
6 replies
4d1h

how is this different from some library somewhere?

avgcorrection
4 replies
4d1h

How is it similar? I get deprecation warnings if I update the library. I can pin the version of the library. A library is something I work with, unlike a terminal program which might be written and forgotten.

Then those terminal programs get upgraded on the next system update because hey, you're supposed to get the latest version, right?

Terminal programs can do the same (`-v1`) in principle. Few do.

samatman
3 replies
3d23h

A terminal program is quite easy to pin, though, just do this:

    - $package-manager install $program 
    - which $program
    - cp $program ~/my/own/directories
    - $package-manager uninstall $program
done. It won't ever change by accident.

10000truths
1 replies
3d21h

Not quite that simple. You can't just pin the program itself, you also have to pin its dependencies. Which means you have to run ldd to get the libraries it loads (whether directly or transitively), and copy those too, and then you patch the program and its dependencies to set the rpath to $ORIGIN/my/own/directories, then you have to examine all the other more subtle application-level gotchas (e.g. path search order for config files, hard-coded absolute paths, etc.) and port those too. Once you're done, congratulations! You've done 50% of the work of a package manager.

samatman
0 replies
3d19h

Yeah, I wish this kind of thing[0] were a core POSIX tool which worked reliably on all systems and binaries. Bit of an oversight, that.

[0]: https://github.com/greenpau/statifier

avgcorrection
0 replies
3d21h

Nice. Then I will not hesitate to break the UI in my upcoming terminal program releases. (... they are quite well designed in my head.)

Viliam1234
0 replies
4d

Other programs can use a later version of the library.

This would be analogous, if commands always included their version. For example:

    rm-2.44.287-SNAPSHOT -r /

account42
0 replies
3d3h

Good. You shouldn't deprecate interfaces used only directly by users either, if at all possible.

enriquto
7 replies
4d3h

> Most people today don't know what the command line is, much less why they would want to bother with it.

This is true today, and it was true as well "in the 1980s", to use the same time frame as TFA. The difference is that today there are more people than ever who know what the command line is, and who can use it. At least an order of magnitude more people; maybe two. We can certainly say that we live in the CLI golden age!

atoav
2 replies
4d

Absolute vs relative numbers.

If you want to judge the shift in quality in a thing that grew in quantity, then you should look at percentages — otherwise you end up creating statements that are true, but don't say anything meaningful.

E.g. there are probably more absolute listeners of Jazz music today than back at the height of the cultural bloom of the genre. But that isn't because Jazz is more popular today than it once was, but because there are more absolute listeners of any kind of music. Would you say that Jazz in the US is now more important, influential etc. than it was at its peak?

hiAndrewQuinn
1 replies
3d22h

No, but there's so much great jazz these days that wouldn't have existed without those absolute numbers. Quantity (aggregate supply) is a quality (increases specialization of labor) all its own!

atoav
0 replies
3d17h

Yeah, sure. But I hope you are aware your argument is dishonest? If you want to compare two times, looking at absolute numbers isn't useful unless those numbers contributed to the rise or fall of the phenomenon we are observing.

Whether from that quantity new qualities emerged is a different question. E.g. the new quantity of diverse jazz listeners probably has lead to an explosion of new sub-genres, that might have never emerged otherwise.

Still, just because there is more different Jazz now does not mean Jazz has become more relevant overall. This was the factor we discussed btw.

nerdponx
1 replies
4d1h

Was that true as a % of computer users in the 80s?

enriquto
0 replies
4d

I guess not, but why does it matter? Let people enjoy other things.

mvdtnz
0 replies
3d22h

If you substitute "people" with "computer users" (which is obviously implied, despite your pedantry) then the author's gist is correct. What a villager in a remote undiscovered Amazonian civilisation thinks of the terminal is not relevant.

hinkley
0 replies
4d

I’ve been writing command lines for parts of our app to help split up the monolith. Global state and dependencies thwart reasoning and thus debugging and performance optimization. Splitting out 500, 1000, 5,000, 10,000 or sometimes 50k lines of code to work separately can clarify a lot of things.

If done right it can also encourage Functional Core, Imperative Shell, because a sensible Unix-philosophy command line needs lots of actions without side effects and a few with. You can write a little command that generates and dumps out the system state just before a (bad) decision is made, and do so against production systems with virtual impunity. And that means you can hand these tools to someone who you need to become part of a bus number, even if they are otherwise hesitant to do so.

OJFord
7 replies
3d23h

> If stdout is not an interactive terminal, don't display any animations. This will stop progress bars turning into Christmas trees in CI log output

Never display animations in stdout! I quite liked TFA in general, but I was skimming around looking for where they were going to advise on the difference between stderr & stdout until I saw that and realised they weren't.

stderr should be all (not just 'errors') of your logging, informational type stuff, the bits that maybe you might animate (and some people will hate) if tty, etc.

stdout should be the useful output - which you may or may not have - regardless of whether tty or not, primarily because an inconsistency like that is just confusing.

e.g.

    echo foo | mysed 's/oo/aa/' | cat
    # mysed should:
    # stdout: faa
    # stderr: mysed version 1 here hello\nfound oo\nprinting aa (or whatever)
I don't want to have to fight your tool with grep to get the 'actual' output lines. And I don't want to struggle to debug it because if I remove `| cat` above (as a silly example) it behaves differently than with it.
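
A minimal sketch of that split in Python, following the mysed example above (the messages are illustrative):

    import sys

    print("mysed version 1 here hello", file=sys.stderr)   # chatter/logging: stderr
    for line in sys.stdin:
        print("processing a line", file=sys.stderr)          # progress: stderr
        sys.stdout.write(line.replace("oo", "aa"))            # the actual result: stdout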

samatman
3 replies
3d23h

That's a good rule, pity about the name.

If we could go back in time and make it stdin, stdout, and stdext (because UNIX so six letters, but standard_extra, or standard_extended), we might have a prayer of getting people to follow that convention.

But it's called stderr, so devs think, quite reasonably, that it should be used for error reporting, and conversely, if it isn't an error, it goes in stdout.

But I agree with you that this is the better way to structure a program. You might confuse more people with it, but they'll be able to do more useful things with its output, so that's a win.

nerdponx
1 replies
3d13h

Powershell has something like 5 or 6 different output streams.

OJFord
0 replies
2d20h

Unix doesn't really have a limit that I know of, certainly not a low one, stdin/out/err are just names for the (very) common descriptors numbered 0/1/2 respectively; you can use another. (In fact it's sometimes helpful in scripting for dealing with exactly this issue of potentially badly behaved subprocesses.)

OJFord
0 replies
3d23h

I think if I could go back in time, my fix would be to make stderr the default. i.e. everything written 'out' goes to stderr until you explicitly write it as an output to stdout (or another descriptor/file), the inverse of the current situation.

(Ok, sure, I'd probably change the name too!)

eikenberry
1 replies
3d23h

> stdout should be the useful output

To add a small tweak to this rule... "stdout should be what you ask for". If I ask for --help, that should go to stdout. If I ask for logs, they should go to stdout. If I don't ask for it, stderr.

OJFord
0 replies
3d23h

Yup, absolutely, that is what I meant - the useful output from the command, the thing you were looking for, that it's actually doing/retrieving.

nmz
0 replies
1d22h

So where do you display animations then? I'd also say that coloring is not information.

tester457
6 replies
4d2h

> Do not read secrets from environment variables

> Secrets should only be accepted via credential files, pipes, AF_UNIX sockets, secret management services, or another IPC mechanism.

Which one of these is the most convenient and portable to use?

Do you use secret management services for work only, or do you use them in your personal projects too?

tashian
3 replies
4d

Hi, I'm one of the authors of CLI Guidelines.

See my post https://smallstep.com/blog/command-line-secrets/ for a bit more of a deep dive about using secrets on the command line.

Credential files are a good, simple, portable option. Files have permissions already. They don't depend on an external service or a proprietary API.

And, if your program accepts a credential file, it will be compatible with systemd credentials. systemd credentials offer more security than an unencrypted credential file. They are encrypted and can be TPM-bound, but they don't require the software using the credential to have native TPM support.

cbm-vic-20
2 replies
3d22h

It's probably a good idea to check the permissions of that file, too, and emit a warning or exit with an error if they're too permissive.
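
A hedged sketch of that check in Python (the path and the "too permissive" policy, namely readable by group or others, are assumptions modeled on ssh's key check):

    import os
    import stat
    import sys

    cred_path = os.path.expanduser("~/.config/mytool/credentials")  # hypothetical path
    mode = os.stat(cred_path).st_mode

    # refuse to run if group or others can read the credential file
    if mode & (stat.S_IRGRP | stat.S_IROTH):
        print(f"error: {cred_path} is readable by group/others; try chmod 600", file=sys.stderr)
        sys.exit(1)

    with open(cred_path) as f:
        secret = f.read().strip()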

indymike
1 replies
3d21h

A good example that all of us have seen is ssh. It does not run if permissions on certs are incorrect.

n_plus_1_acc
0 replies
3d18h

But it definitely could improve its error messages in this case.

flgstnd
0 replies
4d1h

I like it when programs have a way to specify a command to retrieve secrets. mbsync (https://isync.sourceforge.io/mbsync.html) e.g. has afaik 3 options to provide a password for IMAP authentication: If you don't configure a password, you'll be prompted on execution. You can also put the plain text password in the configuration (impractical if you want to share your configuration). But there is also a configuration option to provide a command to retrieve the password. That way you can delegate the password handling to another program, e.g. a password manager like pass(1) (https://www.passwordstore.org/) or some interactive graphical prompt.
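
A small Python sketch of that delegation (the configured command, here pass(1), is just an example; normally it would come from the tool's config file):

    import shlex
    import subprocess

    password_cmd = "pass show mail/example"  # user-configured command, illustrative

    # run the command and treat its stdout as the secret
    result = subprocess.run(shlex.split(password_cmd),
                            capture_output=True, text=True, check=True)
    password = result.stdout.strip()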

AlwaysNewb23
0 replies
4d1h

A secrets management service would be most convenient. Documentation makes them easy to set up without having to build anything extra yourself. A secrets manager like Doppler (https://Doppler.com) or AWS Secrets Manager (https://aws.amazon.com/secrets-manager/) has the advantage of protecting your secrets in a secure place and the advantage of minimizing exposure of those secrets - even to your own developers. That way, you don't end up with a data breach that could have easily been avoided. These types of leaks can cost companies everything and are becoming way more common.

ho_schi
6 replies
4d4h

If you wonder about command-line argument parsing with C and C++, getopt() is built-in:

        https://www.gnu.org/software/libc/manual/html_node/Getopt.html
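
Python's getopt module mirrors the C function, so for a quick feel of the pattern, a minimal sketch (the -o/--output option here is hypothetical):

    import getopt
    import sys

    try:
        opts, args = getopt.getopt(sys.argv[1:], "ho:", ["help", "output="])
    except getopt.GetoptError as err:
        print(err, file=sys.stderr)
        sys.exit(2)

    for opt, value in opts:
        if opt in ("-h", "--help"):
            print("usage: prog [-h] [-o FILE] ARG...")
            sys.exit(0)
        elif opt in ("-o", "--output"):
            output_file = value
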
That said, I think CLI programs are user-friendly for professionals, because they support input-process-output: the input is stdout read by the human, the process is the human thinking, and the output is keyboard input to stdin by the human.

UIs for the general audience? TUIs! TUIs are easy to parse, succinct in organization and fast to input - all for humans. Humans can parse TUIs. And they can build up a mental model.

GUIs often fail with a lack of organization, information overload and distractions by weird metaphors. The Windows 95 desktop metaphor is an example. It doesn't make sense. Same for Windows 11 and its file-browser, which makes it hard to recognize the filesystem or even just the home-directory. Now open Nautilus on Linux: it opens your home-directory by default (in most cases the place to be).

I like the CLI but TUIs are my love. GUIs are okay if they are like a TUI.

unwind
1 replies
4d3h

That is only true in some environments, since it's not in fact part of the language specification.

See the manual page's [1] "STANDARDS" section, which reads:

    getopt()
        POSIX.1-2008.

    getopt_long()
    getopt_long_only()
        GNU.

        The use of '+' and '-' in optstring is a GNU extension.

[1]: https://www.man7.org/linux/man-pages/man3/getopt.3.html

ho_schi
0 replies
4d3h

Correct.

It is probably a feature which isn't necessary for the languages themselves, but for Linux/POSIX.

kergonath
1 replies
4d1h

> UIs for the general audience? TUIs! TUIs are easy to parse, succinct in organization and fast to input - all for humans. Humans can parse TUIs. And they can build up a mental model.

TUIs have the same drawbacks as GUIs and then some. Their big advantage is to work seamlessly over SSH but that’s pretty much it. They are not more discoverable and they are not more efficient than GUIs. There is nothing preventing you from having a decent GUI along the same lines as Midnight Commander to have a file manager without the metaphors you dislike, for example (as a matter of fact, there are several).

Linux-Fan
0 replies
3d23h

Working over SSH is not the only TUI advantage: IMHO one of the greatest benefits of TUIs is the usage of characters to display the UI. This way, the font size is always equal (no unreadably small fonts).

When designing TUIs I found this to be limiting in a creative sense -- I have to really think about how I arrange the TUI elements and information because I cannot put as many elements as I can on a GUI in the same screen space.

Also, TUIs seem to be mostly unaffected by the trend to make every GUI element "touch-friendly" large, which is an advantage for me as a desktop user.

Full rant here: <https://masysma.net/37/why_terminal.xhtml>

indymike
0 replies
3d21h

> UIs for the general audience? TUIs!

TUIs are ideal (sometimes) where a command needs to be interactive. Many commands lend themselves well to batch processing or require no interactivity at all. In many cases, a script piped into a text editor (which is a TUI) is all that is needed, sparing apps from having to embed a text editor and deal with all of the design choices. Other times GUI will work a lot better.

IshKebab
0 replies
3d23h

getopt is not built in.

nmz
5 replies
3d23h

There's something I wish command lines would not adopt as a CLI flag, and that's --json or --csv. Instead this should be offered as an environment variable that runs through all the commands, like IOFORMAT=JSON { cmd1|cmd2|cmd3 }. If they support the environment variable, they can all read input in a safe way and behave appropriately; if they don't, then they just print as normal.
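
A hedged sketch of how a tool might honor such a variable (the IOFORMAT name comes from this comment; the fallback tab-separated output and the field names are made up):

    import json
    import os
    import sys

    records = [{"name": "alpha", "size": 1}, {"name": "beta", "size": 2}]  # example data

    if os.environ.get("IOFORMAT", "").upper() == "JSON":
        json.dump(records, sys.stdout)
        print()
    else:
        # default: human-readable, tab-separated
        for r in records:
            print(f"{r['name']}\t{r['size']}")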

gumby
4 replies
3d23h

(BTW sometimes ENV vars are bug removers and sometimes a terrible source of mysterious bugs as well as a possible injection vector.)

A good option package should support long options being both command line flags and environment variables. The really good ones support them in init files as well (I'm partial to the JSON-ish HOCON format, though .INI works too).

I always spec (later overrides previous settings):

    - system default   (/etc/foo.conf)
    - user defaults    (~/.foo.conf)
    - local defaults   (${CWD}/foo.conf)
    - user-specified init file (overrides local default)
    - environment vars (FOO_OUTPUT=JSON)
    - command line var (--output=json)
As well as --no-init and --no-env
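
A rough Python sketch of that precedence (the JSON config format and the single "output" option are illustrative; the user-specified init file and the --no-init/--no-env switches are left out for brevity):

    import json
    import os
    import sys

    def load(path):
        try:
            with open(path) as f:
                return json.load(f)
        except FileNotFoundError:
            return {}

    settings = {}
    for path in ["/etc/foo.conf", os.path.expanduser("~/.foo.conf"), "./foo.conf"]:
        settings.update(load(path))          # system, then user, then local defaults

    if "FOO_OUTPUT" in os.environ:           # environment beats config files
        settings["output"] = os.environ["FOO_OUTPUT"]

    for arg in sys.argv[1:]:                 # command line beats everything
        if arg.startswith("--output="):
            settings["output"] = arg.split("=", 1)[1]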

nmz
3 replies
2d23h

I disagree with having programs use env vars. ENV propagates, so now every single program has an env var for that single program that you barely use (a good example is LSCOLORS, or nnn), and all programs now consume extra memory. Which isn't a lot, but it adds up if every single program decides to use ENV.

gumby
2 replies
2d21h

I wouldn't worry about memory. As I noted, the propagation value of environment vars can be very handy (make sure the programs are communicating via JSON, from an upstream example) and can also lead to confusing bugs (why is this generating JSON output???)

nmz
1 replies
2d17h

I don't see the problem; you're just making explicit what used to be implicit. Unix is implicitly TSV; well, now it's a variable that you can change.

gumby
0 replies
2d16h

I do think it is superior to specify the setting explicitly on the cmd line, but sometimes you’re calling a script and want something specified.

arrakeen
5 replies
4d1h

> Use symbols and emoji where it makes things clearer.

for the love of god please don't. the yubikey-agent example provided exemplifies everything i dislike about github READMEs and whimsical user interfaces.

on the technical side, symbols and emojis can render inconsistently among terminals, leading to potentially confusing messaging. on the artistic side, personal tolerances towards whimsy and playfulness vary wildly, so they should only be used very sparingly and ONLY if you know what you're doing (if you have to ask, you probably don't)

nerdponx
1 replies
4d1h

I like it as an optional feature. Some people enjoy it, some people don't. Turn it off by default, but let users choose.

kevindamm
0 replies
3d22h

like a --verbosity=cute level, perhaps

zestyping
0 replies
2d18h

I'd be good with the guidelines:

1. Use a vocabulary of at most two emojis in output (e.g. one for success, one for failure).

2. Any information conveyed by an emoji should be redundantly conveyed by text.

samatman
0 replies
3d23h

I like it. I like colored terminal output, and emoji are colorful, which helps me rapidly form a gestalt of what's going on. I like syntax highlighting too, and find code quite a bit more difficult to read without it.

Not everyone is like that, and that's ok. I don't expect my whims to be catered to, and you shouldn't either.

Linux-Fan
0 replies
3d23h

Strongly resonates. The first time I saw the CLI Guidelines I even stopped reading them upon reaching that section. This time I read through the remainder and found it quite OK.

wang_li
3 replies
4d1h

If your program detects -? and tells the user to use -h or --help or --long-help or --full-help you should be kicked in the crotch. Just show me the help.

rascul
1 replies
4d1h

I often show help if an unknown option was given but in the cases I don't I am not going through a list of special cases that I don't even know of to try and determine that someone wants the help text. I've never even seen -? before that I can recall.

account42
0 replies
3d3h

/? is common for help on Windows and some programs translate that to -? elsewhere or just accept both / and - for their arguments.

SAI_Peregrinus
0 replies
4d1h

I'd say -h and --help should return 0 & print help to stdout. Any unknown option should return EINVAL and output an error message and help text to stderr.
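
A small sketch of that behaviour in Python (EINVAL is 22 on Linux; using errno values as exit codes is the suggestion above, not a universal convention):

    import errno
    import sys

    USAGE = "usage: mytool [-h|--help] FILE..."

    args = sys.argv[1:]
    if "-h" in args or "--help" in args:
        print(USAGE)                                   # asked for: stdout, exit 0
        sys.exit(0)

    unknown = [a for a in args if a.startswith("-") and a not in ("-h", "--help")]
    if unknown:
        print(f"mytool: unknown option {unknown[0]}", file=sys.stderr)
        print(USAGE, file=sys.stderr)                  # not asked for: stderr, non-zero exit
        sys.exit(errno.EINVAL)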

flykespice
3 replies
4d3h

POSIX standard had its own command line interface guide already, didn't it?

0cf8612b2e1e
1 replies
4d1h

Doesn’t POSIX comment more on what programs did vs what they should do? More a manual of backwards compatibility.

Like, why does tar get to be so special that it takes command line flags with or without leading dashes?

Sprocklem
0 replies
3d21h

tar isn't a POSIX command, and retains its argument format from before the POSIX CLI guidelines were standardized (as does the POSIX ar command). pax, the POSIX-specified (but rarely implemented/used) equivalent of tar does follow the POSIX CLI guidelines.

rascul
0 replies
4d1h
ttyprintk
2 replies
4d3h

not much has changed since the prior comments:

https://news.ycombinator.com/item?id=25304257

scbrg
1 replies
4d1h

Seems they took a small step back from their previous "don't bother with man pages" stance. Now it's "Consider providing man pages."

I still find it a rather shocking order of priority, honestly.

https://clig.dev/#documentation

ttyprintk
0 replies
3d16h

Good eye!

klodolph
2 replies
3d23h

If you have -r / --recursive, I would rather just have -recursive work as the others. POSIX be damned, the double hyphen is just an opportunity to make more mistakes, and the combined -rptgo single-hyphen flags are more opportunities to make mistakes.

Let some legacy programs like ls and sync keep their single-hyphen combinations. Most new programs should just accept separate flags, and allow either single or double hyphens.

Edit: Yeah, this opinion collects downvotes, doesn’t it? Y’all love it when you accidentally type -recursive instead of --recursive, and it turns out that it means the same thing as -r -e -c -u -s -i -v? That never made any sense to me, for the vast majority of tools out there. It sucks, to be honest.

OJFord
1 replies
3d23h

I don't like combined single-hyphens for a different reason: sometimes they accept arguments, and it seems strange to me that `-a0` can variously mean `-a 0` and `-a -0`. (I think some insist that argument-having options are long, and short options can only be flags, which sort of solves it, except for the existence of the others which makes it still confusing.) That said, in a script, which is the main time I'd have to read the way someone else has written it anyway, I favour everything being --long --anyway --to-make-it=really-clear. (Though there are a few that are so common I won't insist on, `cut -dX -fY` say.)

You must like `find`.

klodolph
0 replies
3d17h

> I don't like combined single-hyphens for a different reason: sometimes they accept arguments, and it seems strange to me that `-a0` can variously mean `-a 0` and `-a -0`.

I think the answer to that is just to stop designing argument parsers that way—if you want -a 0 then maybe -a 0 is acceptable, maybe -a=0 is acceptable, but -a0 should not be. I don’t see how this part is controversial.

For the same reason, -all should not be parsed as -a -l -l.

> I favour everything being --long --anyway --to-make-it=really-clear.

Yeah—I think it would be equally clear to write it like -long -anyway -to-make-it=really-clear, if we decided to make more parsers that worked that way. Some parsers do work that way.

> You must like `find`.

I have to assume that this is just sarcasm. The `find` command gives you a DSL for writing queries, and for some reason, the tokens in that DSL are option flags starting with -. Bizarre. I don’t think anybody wants to design something like that, and I don’t think there’s really anything to learn from find except maybe “sometimes, for historical reasons, the command-line arguments for standardized tools just plain suck.”

EasyMark
2 replies
4d3h

This document is far too large for "guidelines".

tester457
0 replies
4d2h

I thought so too until I ignored the philosophy section.

shrikant
0 replies
4d3h

What's this based on? If anything, this is among the more concise sets of interface guidelines I've come across. Design guidelines typically run to hundreds of pages when PDF'ed -- this page is just about thirty on default "Save as PDF" settings...

xixixao
1 replies
4d3h

These are great. Most frustrating to me is that the basic commands (rm, mv, cp, touch, mkdir) do not follow some of the basic guidelines, such as "ask before destructive action". I have been reimplementing them in Rust for myself to fix this (although a wrapper would have been perhaps better, it's hard to write it and have the program work in any shell).

stephenr
0 replies
4d2h

It's pretty easy to have shell aliases for commands like rm, cp, etc to use the "safe" mode (the `-i` option) by default.

Writing a replacement utility seems a bit like overkill.

shadowgovt
1 replies
4d3h

Of all the challenges with using CLIs, the one that bites me consistently is capitalization of flags. Really wish we'd standardized on being capital-agnostic (or even demanding lowercase only).

... but we didn't, so now you have to memorize whether recursion is capital or lowercase R, restart is capital or lowercase R, poweroff is capital or lowercase P, etc.

thworp
0 replies
4d2h

Most utilities I use daily have --long-flags for all of these. Using completion (preferably https://github.com/unixorn/fzf-zsh-plugin) you can complete them pretty quickly and without knowing the precise name.

peterisza
1 replies
3d10h

One of my favorite CL use-case is pizza ordering. We had a script for it in college, something like this:

# pizza --cheese

Ordering...

Success.

#

3abiton
0 replies
3d9h

Before the age of captcha. You'll have to tip your captcha solver nowadays.

hiAndrewQuinn
1 replies
3d22h

I know it goes without saying for most of us here, but actually being a heavy terminal user yourself is one of the most important things to understand how to design CLIs. It helps a ton to understand the ecosystem you live in, not just your own organism.

Example: Something I did a few months back for a tiny personal project @ https://github.com/hiAndrewQuinn/finstem was implement `--format CSV`, `TSV` and `JSON` flags. I haven't had need for any of these myself, but they exist so any future people who want to use `csvkit`, `awk` and `jq` respectively to wrap around my program have easy ways to do so. That's not stuff I would have had the instincts to do if I wasn't myself a user of all 3 of those programs.
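
A hedged sketch of such a flag (the column names and the single row are made up; csvkit, awk, and jq then consume the respective outputs downstream):

    import argparse
    import csv
    import json
    import sys

    rows = [{"word": "talo", "stem": "talo", "form": "talossa"}]  # illustrative data

    parser = argparse.ArgumentParser()
    parser.add_argument("--format", choices=["csv", "tsv", "json"], default="csv")
    args = parser.parse_args()

    if args.format == "json":
        json.dump(rows, sys.stdout, ensure_ascii=False)
        print()
    else:
        delim = "\t" if args.format == "tsv" else ","
        writer = csv.DictWriter(sys.stdout, fieldnames=rows[0].keys(), delimiter=delim)
        writer.writeheader()
        writer.writerows(rows)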

hollerith
0 replies
3d22h

> being a heavy terminal user . . . is one of the most important things

I wish people wouldn't conflate CLIs with terminals. I run (shell) command lines all day, but try hard to avoid terminals / apps that emulate terminals.

(To run command lines, I use Emacs, which I never run inside a terminal.)

eterps
1 replies
4d

The most comprehensive CLI guidelines in the past have been in this book by Eric Raymond:

https://www.goodreads.com/book/show/104745.The_Art_of_UNIX_P...

https://www.catb.org/esr/writings/taoup/

It has been a while since I read that book, but after skimming through clig.dev I gather opinions have changed over time quite a bit.

hiAndrewQuinn
0 replies
3d22h

I thought much the same thing when I read CLIG. I'm a big fan of the 17 Unix rules he puts down, though, and I think a lot of really good equilibria exist on the spectrum between the two.

MichaelMoser123
1 replies
3d22h

whatever happened to "Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new 'features'"?

now we got lots of mega-cli programs, each one with its own distinct option language.

Examples: kubectl, docker, openssl, git - (git got two command line languages: plumbing and porcelain)

https://danluu.com/cli-complexity/

... ls had 11 command line options in 1979, in 2017 it got 58.

numeromancer
0 replies
3d1h

We ran so much faster when we didn't have to run with shoes on.

restalis
0 replies
3d17h

"Don’t allow arbitrary abbreviations of subcommands. [...] you allowed them to type any non-ambiguous prefix, like mycmd ins, or even just mycmd i, and have it be an alias for mycmd install. Now you’re stuck: you can’t add any more commands beginning with i, because there are scripts out there that assume i means install."

Please avoid the use of short arguments in scripts. It makes the least sense there. The short arguments (along with aliases, abbreviations, and whatnot) are a convenience for human usage, to reduce the amount of manual typing. In scripts you can be explicit with minimal cost (and you also should, considering the ratio of writes vs. reads).

pmig
0 replies
4d3h

We recently chose cobra[1] to create a cli application. It comes with so many best practices already packaged like autocompletions, help texts etc. etc.

[1]: https://github.com/spf13/cobra

gumby
0 replies
3d23h

> It is assumed that command-line interfaces are the opposite of this—that you have to remember how to do everything. The original Macintosh Human Interface Guidelines, published in 1987, recommend “See-and-point (instead of remember-and-type),” as if you could only choose one or the other.

I find that GUIs have worse discoverability and CLIs better. It's pretty hard to search for a GUI affordance. Plus they are essentially unscriptable.

egberts1
0 replies
3d21h

systemd fails badly against these CLI guidelines.

My biggest beef is the total lack of CLI exit codes.

Makes it useless for bash programming with systemd utils.

dang
0 replies
3d23h

Related:

Command Line Interface Guidelines - https://news.ycombinator.com/item?id=38053692 - Oct 2023 (1 comment)

Command Line Interface Guidelines - https://news.ycombinator.com/item?id=31651161 - June 2022 (1 comment)

Command Line Interface Guidelines - https://news.ycombinator.com/item?id=25492119 - Dec 2020 (5 comments)