
In the Beginning Was the Command Line (1999)

mg
61 replies
1d10h

The command line is still king.

Whenever I see new coders struggle, it usually is because they:

    - Don't know the context of what they are executing

    - Don't know about the concept of input and output

On the command line, the context is obvious. You are in the context. The working dir, the environment, everything is the same for you as it is for the thing you execute via ./mything.py.

Input and output are also obvious. Input is what you type, output is what you see. Using pipes to redirect it comes naturally.
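
A minimal sketch of what that looks like in practice (the file and command names are just illustrative):

  # run a script in the current directory; it sees the same cwd and env you do
  ./mything.py

  # redirect its output through other tools and into a file
  ./mything.py | sort | uniq -c > counts.txt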

Not being natively connected to context, input, and output is often at the core of the problems I see even senior programmers struggle with.

xk3
12 replies
1d9h

On the command line, the context is obvious. You are in the context

While I share the opinion that the command line is _the one true way_ of computing, I don't think this is really all that true. Computers are alien. GUIs are alien. CLIs are alien. Everything is learned. Everything is experience. Everything is culture. Learning, experience, and culture blind us from the experience of the novice. This is "expert blindness".

Why Western Designs Fail in Developing Countries https://youtu.be/CGRtyxEpoGg

https://scholarslab.lib.virginia.edu/blog/novice-struggles-a...

copperx
8 replies
1d9h

More succinctly, "the only intuitive interface is the nipple."

fanf2
6 replies
1d8h

Not even that. Newborns have to learn how to suckle, and their mother has to learn how to hold everything in the right positions so it can work. It’s a tricky skill and many aren’t successful even if they want to breastfeed.

lamuswawir
5 replies
1d7h

Newborns don't have to learn how to suckle. It's a reflex, called a suckling reflex. Lacking a suckling reflex is an indicator of disease. Basically a newborn suckles everything that goes into the mouth, nipple or not.

lll-o-lll
3 replies
1d6h

Are you a woman? Have you had children? Have you partnered a woman while she and your child cried through the night, both trying to make this “breastfeeding” thing work, both failing? “How can something so intrinsic to basic survival be so hard!” And yet it is.

Talk confidently when you have experience.

lamuswawir
0 replies
1d2h

Having a reflex and breastfeeding are two different things. The baby can have the reflex, but the mother has no milk or the latching technique is poor (as mentioned elsewhere in these comments). So the child not breastfeeding doesn't mean there is no reflex. And yes, the process can be painful and stressful for many mothers, especially first-timers (experience also counts).

jstanley
0 replies
1d4h

That doesn't negate the fact that newborns have a suckling reflex.

anthk
0 replies
1d5h

Sucking is a reflex; the rest is pseudoscience. Your personal anecdotes don't count.

rolisz
0 replies
1d7h

Yes they have a suckling reflex, but they might not latch on correctly and then breastfeeding is extremely painful.

lukan
0 replies
1d8h

Some babies actually do not have that intuition and instead bite and need time to learn.

noufalibrahim
0 replies
1d6h

True. All interfaces are abstractions, and I think the command line interface is the best one we have. It gives you the maximum power. It does have a steep but surmountable learning curve, though, and the effort is usually worth it given its ubiquity and programmability.

johnisgood
0 replies
1d8h

I mean yeah, you are right, it is learned, but "CLI is the one true way" for me because it is much more powerful than most GUIs I have seen. I am much more productive using the CLI, etc.

heraldgeezer
0 replies
1d7h

Why Western Designs Fail in Developing Countries

Why add this?

What's the solution then?

We have already given so much in the Western world.

jasode
9 replies
1d8h

>On the command line, the context is obvious.

But CLI contexts are only obvious if the computer user is already familiar with the CLI, which biases the learned mind to perceive things as obvious when they really are not.

A lot of CLI command syntax is based on position instead of explicit argument names.

E.g. creating file system links via CLI has opposite syntax positions in Linux vs Windows:

  - Linux:  ln srcfile targetlink

  - Windows:  mklink targetlink srcfile

If one goes back and forth between the two operating systems, it's easy to type the wrong syntax because the CLI doesn't make it obvious. On the other hand, using a GUI file browser like Ubuntu Nautilus or Windows Explorer lets a novice create links ("shortcuts") without memorizing CLI syntax.

This gap of knowledge is also why there is copy&paste cargo-culting of inscrutable ffmpeg, git, rsync, etc commands.

E.g. using ffmpeg to convert a DVD to mp4 by manually concatenating *.VOB files has very cumbersome and error-prone syntax. It's easier to use a visual GUI like Handbrake to click and choose the specific title/chapters to convert.
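
For the curious, the kind of incantation meant here looks roughly like this (the VOB file names are made up; the flags are standard ffmpeg/x264 options):

  # concatenate one title's VOB files and transcode to mp4
  cat VTS_01_1.VOB VTS_01_2.VOB VTS_01_3.VOB | ffmpeg -i - -c:v libx264 -c:a aac movie.mp4

Get the file order or a stream option wrong and the output is subtly broken, which is exactly why people reach for Handbrake.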

CLI vs GUI superiority depends on the task and the knowledge level of the user.

fragmede
4 replies
1d6h

These days it’s easier to ask ChatGPT for the ffmpeg command line to do the thing you want, imo.

anthk
2 replies
1d2h

When some big shit hits the fan, either with ChatGPT or with internet connections, the only people able to fix systems without an internet connection will be us, the millennials.

The rest will be promptly fired on the spot.

Gormo
1 replies
1d

I've found that as far as generational cohorts go, the most adept people are the late Gen-Xers and very early Millennials -- the people who grew up surrounded by C64s and Apple IIs and later had to fiddle with their config.sys and autoexec.bat files to get their game to fit into conventional memory.

People whose first exposure to computing was in the mid-'90s or later seem to have less depth of understanding of the fundamentals and less experience tinkering and using process of elimination to solve problems. I started encountering people who didn't understand certain fundamental concepts and had no exposure to the CLI, but still had CS degrees and were applying for developer positions, about 15 years ago.

anthk
0 replies
22h56m

Yeah, these too. But lots of millennials learnt with Red Hat and Debian, and they knew a bit of DOS for some issues with Windows 98 and potential recovery from disasters. Still, they are far more able than the ChatGPT-aided people blindly pasting commands into a shell, which is a recipe for a huge disaster.

Gormo
0 replies
1d

Everything is "easy" when someone else is doing it for you. Results are best when you develop and apply your own expertise to a problem, but it's not feasible to do that for every problem domain -- relying on others is often important.

It's just that if I'm going to rely on someone else, I'd prefer for that someone else to be one of the people who has developed and applied their own expertise to that problem domain, rather than a statistical model that only actually knows how likely words are to appear in proximity to each other, and has no capacity to validate its probabilistic inferences against the outside world.

tjoff
3 replies
1d8h

Context isn't the same as syntax?

Yes, the command line suffers from poor discoverability, and different applications (such as ln/mklink) may not be consistent.

It is one of the bigger problems (imho) of the cli but it doesn't go against GPs point.

The command line does have a learning curve (partly because of the above), but it is also quite rewarding.

sgc
2 replies
1d5h

When I start typing a formula in LibreOffice Calc, there is a popup showing possible matching functions, then when I choose the function, the popup shows the required syntax for the function and where I currently am within that syntax. A bash plugin that would do that would be an absolute game changer imho.

The CLI excels because it is extremely flexible, with far more options available than a set of buttons could ever display. But discoverability rounds down to 0, and there are footguns. It seems like spreadsheet software has found an almost drop-in UI that would greatly enhance the CLI.

thinkmassive
1 replies
1d5h

Tab completion can get you much of the way there.

sgc
0 replies
1d4h

It's not the same thing. Tab completion is useful and will complete something you know of. But it does not help you discover something you don't know, or provide you the syntax of the command after it is entered. The problem I would like to solve is discoverability.

It's a 3-part problem: available commands, their options, their syntax. Part one would need to capture prompt input before Enter was hit, using solutions similar to those found at [1]; perhaps the most useful but least complete one there is the one that uses apropos, so something like `apropos -s 1 '' | sort | grep calc | less`. Similar solutions would be required for parts two and three. The roughest and easiest prototype would probably be two tabs in a split-screen view, which would allow for selection of displayed matches to then modify the prompt creating those matches in the other tab. But Calc-style popups directly attached to your cursor would be more useful still.

[1] https://stackoverflow.com/questions/948008/linux-command-to-...

dakiol
7 replies
1d10h

The more experience I accumulate, the more I rely on GUIs. Explanation: when I was younger I used exclusively the CLI and underestimated GUIs. Now I tend to appreciate GUIs and use them more.

lucianbr
3 replies
1d9h

If you're experienced with the command line, it's easy to use GUIs and get good results.

If one starts with GUIs and doesn't really understand what is behind, then all kinds of trouble happen.

So I guess, as with any tool, understanding is key.

lukan
2 replies
1d8h

Not all GUIs are just a graphical wrapper for a CLI. But in general sure, understanding the tech behind helps.

lucianbr
1 replies
1d5h

No doubt, if you're working in AutoCAD, there's no command line that you need to understand first.

But then again, if you're working in AutoCAD, you'd never say "I used to work in CLI only, now I use GUIs more and more".

Clearly they meant GUIs that have CLIs behind, or at least CLI alternatives.

kalleboo
0 replies
1d5h

AutoCAD is an unlucky choice of example here, because it's one of the few GUI drawing tools that actually does have a command line behind it that you have to understand sometimes! Look up a screenshot of AutoCAD and you can see the command prompt at the bottom of the window.

And if you were using AutoCAD in the 80's you can say exactly that you used to use the CLI only!

the_duke
1 replies
1d6h

Can you give some examples? Which GUIs are you using?

dakiol
0 replies
1d5h

I have used git extensively in the terminal. But nowadays I see myself more and more relying on GUIs like the ones integrated into IntelliJ IDEA, Sourcetree, etc.

Another example could be qemu and the GUIs that we have nowadays. One final example would be simply dragging and dropping files via Finder instead of using cp/mv.

devjab
0 replies
1d6h

I used to sort of like the Azure GUI (yes, I'm a total psychopath), but then they changed it 9 billion times and now I just use the CLI. It's frankly often experiences like this that drive me back to the CLI. I like GitKraken, but then it does an update and forgets my SSO, or it doesn't support a specific thing I need to do, and then I'm using their console.

I’m not really religious about anything, but I often end up going back to the CLI for a lot of things because it’s just less of an annoyance.

delusional
6 replies
1d9h

I agree with you, but I think there's a caveat. The command line is king in Linux, BSD, MacOS, AIX, and to a lesser extent Windows. These operating systems were crafted from the bottom up with the commandline as a foundational layer. The idea of the context, of the "working directory", the "environment", were concepts that were lifted from that commandline centric world, into what we run now.

I think Windows very much wanted to be something different with COM. Instead of starting a shell in the context of the program, you'd connect some external "shell" into the very object graph of your program to inspect it. It turns out to be very difficult, and Windows has largely retreated to a commandline-centric architecture, but I think there was some essential attempt at foundational innovation there.

I would argue that the commandline has very much proven to be the best trade-off between being useful and being simple, but there is no saying if there exists some alternative.

steve1977
4 replies
1d7h

The command line is king in Linux, BSD, MacOS, AIX, and to a lesser extent Windows. These operating systems were crafted from the bottom up with the commandline as a foundational layer.

This is definitively not true for macOS.

skydhash
3 replies
1d6h

MacOS is a very complete, very well funded desktop environment targeted towards the general user. You want anything extra and you land in applications using private APIs and the command line.

steve1977
2 replies
1d6h

The command line is not a "foundational layer" in macOS, that was my point. It exists on the same layer as the GUI does.

steve1977
0 replies
1d3h

The foundation of macOS contains elements of UNIX (or rather BSD) and the OS is UNIX certified, I'm fully aware of that. But these are two different things.

For one thing, UNIX != command line.

In the same vein, Windows NT is not based on DOS anymore, even if it has a command line which resembles (parts of) DOS.

ThrowawayR2
0 replies
1d6h

Not every version of MacOS. Classic MacOS, System 1-7 and MacOS 8-9, were definitely not crafted with the command line environment as a foundational layer. Using it was like being wrapped in several layers of bubble wrap. You were trapped in the GUI and if you wanted to do something the GUI didn't allow for, you were "using it wrong".

vegabook
4 replies
1d6h

Reductio ad absurdum: we should all just use an interactive assembler. Then you really are "in the context".

There is a level of abstraction that makes sense. What level that is is dependent on your objectives.

skydhash
2 replies
1d6h

My first real programming experience was through the browser's console (JavaScript) and IDLE's REPL (Python). The short feedback cycle works wonders for understanding, compared with struggling through the multistep process of C compilation or Java verbosity. I tried my hand at reverse engineering, and using a disassembler like IDA gives the same feeling of immediacy. Great DX for me is either a good debugger or a proper REPL, whatever the abstraction level.

vegabook
0 replies
1d6h

Agreed. When writing the comment I returned seconds later and added the word "interactive".

richrichie
0 replies
1d4h

the multistep process of C compilation

it can be a single make command

teddyh
0 replies
1d6h

DDT, the shell for the ITS operating system, was also an assembly-level debugger.

at_a_remove
4 replies
1d8h

The command line is king, but sometimes the king is mad. Which is to say, it can be difficult to work with the monarchy when the syntax is shit. And there's a lot of bad syntax out there: overly cute, so terse as to be cryptographic, the straight-up baffling ...

Outside of the syntax (which seems to live forever), you have things like non-sane defaults, obscurantist man pages ... the list goes on.

skydhash
1 replies
1d6h

the syntax is shit

When you’re typing a lot, you really don’t want to do a lot of typing for each step. And the shell scripts were for automating some process, not solving a particular problem (you use a programming language for that). The workflow is to have the references ready in case you forgot something.

That brings up the man pages, which can vary in quality but, for most software I've used, tend to be comprehensive. But they assume that you're knowledgeable. If you're not, take some time to read a book about system administration (users, processes, file permissions, …).

at_a_remove
0 replies
1d6h

I'm not sure if I have O'Reilly's System Administration book. I used to get pre-prints back in the early nineties when they were still quite new. In any case, yes, I have read and I have Been Around.

And I still think that we can improve. Moreover, we ought to improve.

fragmede
1 replies
1d6h

The ability to hit up and edit and retry is such a redeeming feature. Repeating the same actions in a GUI with no keyboard shortcuts is an exercise in frustration.

jodrellblank
0 replies
1d4h

GUIs can have keyboard shortcuts. Have you honestly never pressed Alt+F, x to exit a GUI program or Ctrl+S to save a document in a GUI editor, or Ctrl+Tab to switch tabs in a GUI browser or Tab to move focus between fields, or the context-menu button next to Alt Gr and then a keyboard accelerator key for the menu, or Ctrl+C then Ctrl-P or anything?

Repeating the same actions in a CLI with no readline is an exercise in frustration, but ... that's not what happens most of the time.

sandreas
3 replies
1d10h

This. And it does not even exclude having a (T)UI. Modern terminal tools like neovim, lazygit, zellij, btop++ or yazi can do many things such as window management, image previews and colors, as well as having mouse support.

galangalalgol
2 replies
1d7h

Are there any good tools to be able to ssh into a machine and preview images or render markdown directly inline in the CLI?

fragmede
0 replies
1d6h

Sixel support lets you display images on the command line, for terminal emulators that support it.
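
For example, with libsixel installed on the remote machine and a sixel-capable terminal emulator locally, something like this works over plain ssh, since the image is just escape sequences on stdout (the file name is made up):

  img2sixel photo.png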

steve1977
1 replies
1d7h

Oh believe me, I wish what you wrote was true, but it isn't.

I've seen people think they have a specific Python environment active just because they were in their project's directory on the command line.

I've seen people not understand that "python -m pip" is a command and even if they are in a directory which has "python" in its name, they still have to type "python" for that command.
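
A couple of commands that make that context visible instead of assumed (a generic sketch, nothing project-specific):

  which python             # which interpreter will actually run
  echo $VIRTUAL_ENV        # empty unless a virtualenv is really active
  python -m pip --version  # shows which site-packages pip will touch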

PS: The command line might even be an emperor. And the emperor could be naked...

fragmede
0 replies
1d6h

I've seen people think they have a specific Python environment active just because they were in their project's directory on the command line.

I wrote python-wool as a simple wrapper around python to make that true, because it's just easier that way. Direnv can be configured to do that as well.

http://GitHub.com/fragmede/python-wool
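
The direnv route mentioned above looks roughly like this, assuming direnv is installed and hooked into your shell:

  # .envrc in the project root
  layout python3

After a one-time `direnv allow`, entering the directory activates the venv and leaving it deactivates it.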

pavlov
1 replies
1d10h

This is such generic advice about computing, it’s like saying:

“To make a building, you need to have a foundation, something to keep the roof up, and a way for people to move inside.”

mg
0 replies
1d10h

The analogy I would make is that living in the command line is like using a CAD program while living in IDEs is like using CorelDraw to design houses.

CorelDraw feels more efficient because one quickly has what looks like a beautiful, colorful house on the screen. And then one does not understand why the doors don't work correctly.

zoky
0 replies
1d8h

On the command line, the context is obvious.

You sound like someone who never tried to write a cronjob script…
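
For anyone who hasn't hit this yet: cron runs your job with almost none of your interactive context, so the "obvious" context quietly disappears (the script path is made up):

  # crontab -e
  * * * * * /home/me/bin/backup.sh
  # under cron: PATH is typically a bare /usr/bin:/bin, your shell dotfiles
  # are never read, and the working directory is $HOME, not where you tested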

samatman
0 replies
1d3h

Input and output are also obvious. Input is what you type, output is what you see.

Input maybe, although realizing that instead of typing, you can pipe, is a major conceptual breakthrough when new users are learning to work the command line.

But output? The existence of stdout and stderr as two sources of text with no visible distinction is highly nonobvious and continues to trip me up in practical situations to this very day.
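
A small demonstration of that invisible split:

  ls /nonexistent /etc > out.txt              # the error still lands on your screen via stderr
  ls /nonexistent /etc > out.txt 2> err.txt   # only now are the two streams separated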

fragmede
0 replies
1d5h

Its unsung strength is in having multiple terminal windows and a browser open, and the simplicity of being able to hit up, edit, and retry a failed command. I can't do that in Photoshop.

bloopernova
0 replies
1d4h

Yep, I ran a short "introduction to the command line" course for the devs in my team. Afterwards, I noticed that their usage of the vscode terminal was much higher, and folks were more comfortable exploring on their own.

Quick outline of the course, in case anyone wants a starting point:

  * Introduction
    Quick history of Unix

  * When you log in
    Difference between login and interactive shells
    System shell files vs user shell files
    .zshenv for environment variables like PATH, EDITOR, and PAGER
    .zprofile for login shells, we don't use it
    .zshrc for interactive shells
    Your login files are scripts, and can have anything in them

  * Moving Around

  ** Where am I?
    pwd = "print working directory"
    stored in variable $PWD
    Confusingly, also called current working directory, so you may see CWD or cwd mentioned

  ** What is here?
    ls
    ls -al
    ls -alt
    . prefix to filenames makes them hidden
    . is also the current directory!
    .. means the parent directory
    file tells you what something is
    cat displays a file
    code opens it in vscode

  ** Finding my way around
    cd
    cd -
    dirs | sed -e $'s/ /\\\n/g'

  ** Getting Help From The Man
    man 1 zshbuiltins
    manpage sections

  ** PATH
    echo $PATH | sed -e $'s/:/\\\n/g'
    zshenv PATH setting
    which tells you what will be run

  ** Environment Variables
    env | sort
    EDITOR variable

  ** History
    ctrl-r vs up arrow

  ** Permissions
    Making something executable

  ** Prompts
    zsh promptinit
    zsh prompt -l

  ** Pipes and Redirection
    Iterate to show how pipes work
    cat ~/.zshrc | grep PATH
    ls -al > ~/.tmp/ls-output.txt

  ** Commands

  *** BSD vs GNU commands
    BSD are supplied by Apple, and Apple often uses old versions
    GNU are installed via homebrew, and match those commands available in Linux
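
(One quick check I'd add to that last section, since it trips people up: the GNU tools answer to --version, while Apple's BSD-derived ones generally don't.)

  ls --version    # GNU coreutils prints a version string; BSD ls complains about an illegal option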

1vuio0pswjnm7
0 replies
1d8h

"Don't know about the concept of input and output"

Wow, that seems quite fundamental. Computing 101.

I'm not a "coder" and I spend "99%" of my time on the command line. Because I prefer it. Ever since the 80s when I first used a VAX.

mxwsn
57 replies
1d11h

This essay by Neal Stephenson was first published in 1999. https://en.m.wikipedia.org/wiki/In_the_Beginning..._Was_the_...

The analogy of OS as cars (Windows is a station wagon, Linux is a tank) is brought up in the recent Acquired episode on Microsoft, where Vista was a Dodge Viper but Windows 7 was a Toyota Camry, which is what users actually wanted.

GrumpyYoungMan
41 replies
1d6h

And Neal Stephenson acknowledged it was obsolete in 2004:

"I embraced OS X as soon as it was available and have never looked back. So a lot of 'In the beginning was the command line' is now obsolete. I keep meaning to update it, but if I'm honest with myself, I have to say this is unlikely."

https://slashdot.org/story/04/10/20/1518217/neal-stephenson-...

But people still dredge this quarter century old apocrypha up and use it to pat themselves on the back for being Linux users. "I use a Hole Hawg! I drive a tank! I'm not like those other fellows because I'm a real hacker!"

llm_trw
22 replies
1d3h

Given what OS X has become it's un-obsoleted itself again.

It's kind of ironic that you're using a post from 20 years ago to invalidate an essay from 25 years ago, about an OS that's been substantially dumbed down in the last 10 years.

Bad corporate blood will tell.

UniverseHacker
12 replies
1d3h

In what way has it been “dumbed down?” I use modern MacOS as a Unix software development workstation and it works great- nothing substantial has changed in 20 years other than better package managers. I suppose they did remove X11 but it’s trivial to install yourself.

isametry
7 replies
1d2h

Not GP, but usually when people talk about the "dumbing down" of macOS, they refer to new apps and GUI elements adopted from iOS.

macOS as an operating system has been "completed" for about 7 years. From that point, almost all additions to it have been either focused on interoperation with the iPhone (good), or porting of entire iPhone features directly to Mac (usually very bad).

Another point of view is that macOS is great, but all ideas that make it great come from 20 years ago, and have died at the company since then. If Apple were to build a desktop OS today, there's no way they would make it the best Unix-like system of all time.

layer8
4 replies
1d1h

Another point of view is that macOS is great, but all ideas that make it great come from 20 years ago, and have died at the company since then.

This also applies to Windows, by the way (except it’s more like 20-30 years ago).

jthoward64
3 replies
21h45m

Whereas Linux never stopped coming up with new ideas, but doesn't have the manpower to implement them

Animats
2 replies
12h43m

systemd!

(Currently struggling with the way systemd inserts itself into the DNS query chain and then botches things.)
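
For anyone hitting the same wall, the usual first diagnostics, assuming systemd-resolved is the piece in play:

  resolvectl status                   # which DNS servers each link is actually using
  systemctl status systemd-resolved   # whether the stub resolver is even running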

psd1
1 replies
10h30m

It likes to fall over to the secondary server, doesn't it.

bornfreddy
0 replies
8h43m

It likes to botch things.

wizardforhire
0 replies
15h44m

Yeah, that's fair and I concur, but GP is right.

However, and unfortunately, I feel your last statement is spot tf on! Our only hope, I guess, is that they have incurred enough tech debt to be unable to enshittify themselves.

For those not in the know, Apple is an OG hacker company; their first product was literally a blue box! Why this matters, why GP is correct, why Linux peeps get in a tizzy, and what Stephenson was getting at with the Batmobile analogy, is that traditionally, if hackers built something consumer facing, they couldn't help themselves but bake in the Easter eggs.

kolanos
0 replies
3h24m

Another point of view is that macOS is great, but all ideas that make it great come from 20 years ago, and have died at the company since then. If Apple were to build a desktop OS today, there's no way they would make it the best Unix-like system of all time.

Many of those ideas came from NeXT, so more like 30 years ago.

jlarocco
0 replies
15h49m

I stopped using it a few years ago, but IMO it was definitely being dumbed down and not respecting users any more. Things like upgrades resetting settings that I went out of my way to change - Apple has a "we know better than you" attitude that's frustrating to work around.

jhbadger
0 replies
7h9m

With each new version it has become increasingly hostile to installing new software, particularly open-source software that hasn't been "signed" by a commercial developer, throwing up huge warning windows suggesting that anyone daring to run such stuff is taking a huge risk. And many of the standard UNIX locations have become locked down making it impossible to install stuff there. It's clear that Apple would like to see a world where everything is installed on a Mac via their App Store and everyone writing Mac software are official paid developers as with their phones.

fake-name
0 replies
22h12m

OS X started going downhill as soon as they replaced Spaces and Exposé with Mission Control.

bear141
0 replies
23h56m

Just using it at a barely advanced level for 20 years or so as I do, the other comment was correct: the changes seem to have been made to make it more familiar to iOS users and "idiot proof".

Mainly slowly hiding buttons, options, and menus that used to be easily accessible and now require holding a function key, re-enabling things in settings, or using the terminal to bring them back.

rewgs
5 replies
1d

I will never understand the sentiment that macOS has been “dumbed down.”

It’s a zsh shell with BSD utils. 99% of my shell setup/tools on Linux just work on macOS. I can easily install the gnu utils if I want 99.9% similarity.

I very happily jump between macOS and Linux, and while the desktop experience is always potentially the best on Linux (IMO nothing compares to hyprland), in practice macOS feels like the most polished Linux distro in existence.

Do people just see, like, some iOS feature and freak out? This viewpoint always seems so reactionary. Whereas in reality, the macOS of the past that you’re pining for is still right there. Hop on a Snow Leopard machine and a Ventura machine and you’ll see that there are far, far more similarities than differences.

j2kun
3 replies
21h46m

I stopped using Mac because of the planned obsolescence, which is a huge problem with machines even 4 years old. Can't even update those BSD utils anymore because you can't update the OS, because you don't have the latest hardware, because you don't want to spend 2k more to get back the basics you had already paid for.

kstrauser
2 replies
20h55m

MacOS Sequoia supports every computer 4 years old. Almost no apps or CLI programs will absolutely require it unless they’re specifically invented to support some brand new feature. I’m running it on my 2018 Mac Mini.

If you wish Apple supported computers longer, fine. I’d personally disagree because I’ve had wonderful luck with them supporting my old hardware until said hardware was so old that it was time to replace it anyway, but would respect your different opinion. Don’t exaggerate it to make a point though.

jlarocco
1 replies
15h46m

It's ridiculous that you would claim this isn't a problem.

I'm typing this on a 12 year old MacBook Pro running Debian whose hardware is perfectly fine, but which hasn't been supported by Apple in years.

FWIW, Debian supports it fine, though NVidia recently dropped support for the GPU in their Linux drivers.

I'm going to miss it when it dies, too. Plastic Lenovos just can't compare.

kstrauser
0 replies
15h4m

I never said any such thing. I said it’s not a problem for me but that others may have a different opinion. And you could still use an older OS on that Mac; the ones it shipped with don’t magically stop working.

esalman
0 replies
12h58m

You're not a typical Mac OS user. The typical users I know do not know how to use finder, let alone shell commands, to navigate file system. Personally I have no issues developing in either Mac, Linux or Windows because I'm advanced level. But for the same reason, I prefer Linux or even Windows because those provide more freedom to the developer.

samatman
2 replies
1d3h

I also "embraced OS X as soon as it was available". My first Linux install was Yggdrasil, but I cut my teeth on SPARC stations. I respected two kinds of computer: Unix workstations, and the Macintosh.

So when Apple started making workstations, I got one. I've been a satisfied customer ever since.

I have no idea whatsoever what dumbing down you're referring to. The way I use macOS has barely changed in the last ten years. In fact, that's a major part of the appeal.

gizajob
1 replies
3h34m

Seconded. Make it 20 years.

aj7
0 replies
2h1m

Just got a Windows 11 machine. Had to, to run Solidworks. Have it next to a new M3 iMac. They’re all configured with the same apps. Despite not having used Windows in 10 years, these machines behave identically. But Windows 11 is snappier. And you can actually find things that you don’t know where they are!

I was amazed.

Phiwise_
11 replies
1d6h

"Obsolete" is too strong a word, I think. OSX isn't an evolution of the Macintosh's operating system; That'd be Pink, which was even mentioned, and it crashed and burned. OSX was far closer to a Linux box and a Mac box on the same desk, therefore the only change really needed is to replace mentions of Unix or specifically Linux with Linux/OSX as far as the points of the piece are concerned. If Jobs had paid Torvalds to call OSX "Apple Linux" (Or maybe just called it Apple Berkeley Unix) for some reason this would be moot.

I also primarily use Windows and don't have a dog in the fight you mentioned. I might actually dislike Linux more than OSX, though it has been quite a while since I've seriously used the one-button OS.

wpm
6 replies
1d3h

macOS shares zero lineage with Linux, which itself shares zero lineage with the UNIX derivatives. It would make zero sense for Apple to call macOS "Apple Linux" when it doesn't use the Linux kernel. Mac OS X is closest to a NeXTStep box with a coat of Mac-like polish on top. Even calling it "Apple Berkeley Unix" wouldn't make sense, because the XNU kernel is a mish mash of both BSD 4.3 and the Mach kernel.

Linux and the UNIX derivatives are not even cousins. Not related. Not even the same species. They just both look like crabs a la https://en.wikipedia.org/wiki/Carcinisation.

frabbit
1 replies
4h28m

Your statement seems very strong. Is Mac OS X not based on Darwin? Are you defining "lineage" in some way to only mean licensing and exclude the shared ideas (a kernel manipulating files)? Thanks for the "carcinisation" link.

frabbit
0 replies
4h17m

To be more precise: this simplified history suggests a shared "lineage" back to the original UNIXes of all of BSDs (and hence Darwin and OSX) and of the GNU/Linux and other OSes https://unix.stackexchange.com/a/3202

So, "zero shared lineage" seems like a very strong statement.

Phiwise_
1 replies
23h14m

OS X is closest to a NeXTStep

This is already in the piece. Why waste time repeating it?

dylan604
0 replies
22h20m

Because the person they replied to used a term like Apple Linux, which is ridiculous. So somebody on the internet was wrong and needed to be corrected.

jholman
0 replies
11h32m

I used to have a coworker, a senior dev of decades of experience, who insisted that MacOS was a real Linux, "just like BSD". Sigh.

Of course, this belief probably had no downsides or negative consequences, other than hurting my brain, which they probably did not regard as a significant problem.

13of40
0 replies
1d

The crabs analogy isn't a good one, because they evolve independently to play well in a common environment. GNU/Linux is a rewrite of Unix that avoids the licensing and hardware baggage that kept Unix out of reach of non-enterprise users in the 80s and early 90s.

chipotle_coyote
2 replies
1d3h

OSX was far closer to a Linux box and a Mac box on the same desk

Setting aside the "more BSD/Mach than Linux", OS X pressed a lot of the same buttons that BeOS did: a GUI system that let you drop to a Unix CLI (in Be's case, Posix rather than Unix, if we're going to be persnickety), but whose GUI was sufficiently complete that users rarely, if ever, had to use the CLI to get things done. Folks who love the CLI (hi, 99% of HN!) find that attitude baffling and shocking, I'm sure, but a lot of people really don't love noodling with text-based UIs. I have friends who've used the Mac for decades -- and I don't mean just use it for email and web browsing, but use it for serious work that generates the bulk of their income (art, desktop publishing, graphic design, music, A/V editing, etc.) -- who almost never open the Terminal app.

though it has been quite a while since I've seriously used the one-button OS

Given that OS X has supported multi-button mice since 2001, I certainly believe that. :)

giantrobot
0 replies
1d2h

Given that OS X has supported multi-button mice since 2001, I certainly believe that. :)

And since MacOS 8 before that...

Phiwise_
0 replies
23h16m

Given that OS X has supported multi-button mice since 2001, I certainly believe that.

Just a joke, mate.

Animats
0 replies
12h38m

Apple was in some ways on the right track with an OS that had no command line. Not having a command line meant you had to get serious about how to control things.

The trouble with the original MacOS was that the underlying OS was a cram job to fit into 128KB, plus a ROM. It didn't even have a CPU dispatcher, let alone memory protection. So it scaled up badly. That was supposed to be fixed in MacOS 8, "Copland", which actually made it out to some developers. But Copland was killed so that Apple could hire Steve Jobs, for which Apple had to bail out the NeXT failure.

theandrewbailey
1 replies
6h38m

There was a competing bicycle dealership next door (Apple) that one day began selling motorized vehicles--expensive but attractively styled cars with their innards hermetically sealed, so that how they worked was something of a mystery.

Neal said the essay was quickly obsolete, especially in regards to Mac, but I'll always remember this reference about hermetically sealed Apple products. To this day, Apple doesn't want anyone to know how their products work, or how to fix them, to the point where upgrading or expanding internal hardware is mostly impossible.

II2II
0 replies
1h24m

The difference between Apple and IBM is the latter lost control of their platform and those who inherited, arguably Intel and Microsoft, had no interest in exerting absolute control. (If they tried, it likely would have backfired anyhow.)

As for Apple, their openness comes and goes. The Apple II was rather open, the early Macintosh was not. Macintosh slowly started opening up with early NuBus machines through early Mac OS X. Since then they seem to be closing things up again. Sometimes it was for legitimate reasons (things had to be tightened up for security). Sometimes it was for "business" reasons (the excessively tight control over third-party applications for iOS and the incredible barriers to repair).

As for the author's claims about their workings being a mystery, there wasn't a huge difference between the Macintosh and other platforms. On the software level: you could examine it at will. At the hardware level, nearly everyone started using custom chips at the same time. The big difference would have been IBM compatibles, where the chipsets were the functional equivalent of custom chips yet were typically better documented simply because multiple hardware and operating system vendors needed to support them. Even then, by 1999, the number of developers who even had access to that documentation was limited. The days of DOS, where every application developer had to roll their own hardware support were long past. Open source developers of that era were making a huge fuss over the access to documentation to support hardware beyond the most trivial level.

eikenberry
1 replies
21h51m

Neal gave up on Linux because he wasn't a developer. He couldn't take advantage of the freedoms it provided, and it worked the same for him as any proprietary OS would; i.e., he had the excuse that programming is hard, a specialty that requires much practice. This is an ongoing issue with free software and is why it is niche: it primarily appeals to software developers, as they are the only ones who can take advantage of the freedoms it provides and the ones who truly sacrifice that freedom when they use a non-free OS.

pacaro
0 replies
21h1m

Yeah, this is basically my take too. I had a hardcopy of this sometime between '99 and '02, and read it several times.

At the time I was an embedded developer at Microsoft and had been a Windows programmer in the mid '90s. It was pretty clear that there was some Dunning-Kruger going on here. Neal knew enough about tech to be dangerous, but not really enough to be talking with authority.

gumby
0 replies
23h59m

I keep meaning to update it, but if I'm honest with myself, I have to say this is unlikely."

IME he hates to revisit anything he's already written so this claim is polite but implausible.

bawolff
0 replies
18h55m

I think it's still quite relevant. There are still people who enjoy a more DIY OS. Nowadays they use Arch or something. That doesn't make them any better or worse than anyone else. Some people enjoy tinkering with their OS, other people just want something that Just Works(tm), and there is nothing wrong with that.

There is an implicit superiority in the text which is just as cringey now as it was at the time, but I think it's still a good analogy about the different preferences and relationships different people have with their computers.

throwawaydummy
3 replies
19h42m

I didn't know myself, let alone computers, but does anyone know why Vista and Windows 8 got such bad reps?

thfuran
0 replies
3h7m

Vista significantly changed the security model, introducing UAC and requiring driver signing. This caused a bunch of turmoil. It also had much higher system requirements than xp for decent performance, meaning that it ran at least a bit shit on many people's machines when it was new, and that did it no favors.

theandrewbailey
0 replies
6h46m

I'm one of the few people who thought Vista was fine. It mostly got a bad rap because of bad drivers and high system requirements.

smackeyacky
0 replies
16h3m

They were given “tablet friendly” user interfaces that made navigation extremely difficult. But Microsoft never finished the job so you’d try to find something in the stupid sliding screen interface only to be thrown into a windows 3.1 era control panel. Windows server of that era was even worse, as mostly you administered using RDP but it never played nice with the “hover here to bring up the whole screen menu” required to drive it.

Underneath it was just Windows, but the interface ruined it

justin66
3 replies
1d

recent Acquired episode on Microsoft, where Vista was a Dodge Viper but Windows 7 was a Toyota Camry, which is what users actually wanted.

I assume if anyone associated with Microsoft compared Vista to anything other than an abject failure, it's because they are - at best - broken or defective people who were involved in the creation of Vista, and therefore not objective and not to be trusted in any way.

Dodge Viper? WTF?

throwawaydummy
2 replies
19h37m

Dodge Vipers get a bad rep even in Need for Speed lol, but idk enough about cars to know why.

00N8
0 replies
18h54m

The first generation or two of Vipers were very raw & unrefined, with ~400 horsepower, no stability control, no anti-lock brakes, minimal safety features, cheap & janky looking interiors, etc. It sounds great to me (aside from the lack of a/c & nice ergonomic interior), but it would be extremely easy to have a bad time if you're an average driver.

I'm not sure about the analogy though, they might have been thinking of later Viper versions where the complaints would be more about cost, gas mileage, or general impracticality for daily use.

hiAndrewQuinn
3 replies
1d11h

Windows XP would probably be an old 1970s Volvo that is still, somehow, running in all kinds of places otherwise forgotten by the waking world.

johnohara
0 replies
1d3h

Mostly in Oregon, Northern CA, and parts of Washington. But parts are increasingly hard to find.

heraldgeezer
0 replies
1d7h

OS/2 also! :D

simonmysun
0 replies
1d4h

I feel differently now. Linux was never that much more powerful than other OSes. It felt more like a tractor 10 years ago, and now it is a good alternative to the "Toyota Camry", with a pretty good user experience.

Meanwhile Windows has become one of those cars with two 27" screens as a dashboard, which have a bad user experience and are full of advertisements.

davidgerard
0 replies
1d3h

The actual comparison is that Vista was the Ford Edsel (hilarious and widely-mocked failure) but 7 was the Ford Comet (a huge hit).

It's a close analogy, because the Comet was actually the next model of Edsel. They just changed the branding. Same with Vista to 7.

bitwize
0 replies
6h39m

The way I explained my continued loathing of Microsoft to a friend was that back in the day, it was like having two choices of car: a Ford Pinto, or a pickup truck. Not everybody needed the heavy transport capabilities of the pickup, but their only choice if they didn't want the added bulk and fuel consumption was the Pinto, which had dangerous failure modes. Later, the Pinto would be withdrawn and replaced with a Ford Taurus, a serviceable but not particularly fun or performant vehicle which worked fine usually, but was based on a transaxle design, so when it did break you had to dismantle the entire front end of the vehicle in order to repair it. And now imagine that there were roads you couldn't drive on and places you couldn't go unless you had one of these three vehicles, simply due to Ford's market pressure on infrastructure planners, not for any good reason; and besides, people mocked you for wanting something else.

ndsipa_pomu
15 replies
1d5h

One major advantage of the CLI is that instructions/fixes etc are very concise and can be easily communicated. If someone has a Linux system that needs a known fix, then it's trivial to send someone the commands to copy/paste into a terminal. However, if there's a known fix for a graphical program, then it suddenly becomes much harder to communicate - do you go for a textual instruction (e.g. click on the hamburger menu, then choose "preferences"...), or a series of screenshots along with some text?
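
To make the contrast concrete, the CLI side of that trade-off is often just a line or two that can be pasted verbatim (a Debian-flavoured example of a classic known fix):

  sudo dpkg --configure -a   # finish interrupted package configuration
  sudo apt-get -f install    # then repair broken dependencies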

anthomtb
6 replies
1d3h

I think your trailing question is a hypothetical but to answer:

Screenshots along with text, then use Microsoft Paint to mark up the screen shots. For example, circling the appropriate menu option in a thick red line. Sadly, I do not know how to graphically convey the double-click operation.

It's a time-consuming and error-prone process, best used if the buggy GUI application is abandonware. Otherwise, the new version will have the same bug or non-intuitive process, but the GUI will be unrecognizable. Probably because some manager, somewhere, read too many HN articles and decided a user-facing refactor was the most important thing they could possibly do.

ndsipa_pomu
5 replies
1d2h

You raise a good point about GUIs changing over time. Obviously, CLIs can change, but that happens less often and is usually in response to a poorly thought out first attempt at providing functionality (or when additional functionality is bolted on).

dredmorbius
4 replies
19h40m

Scripts are UI glue in more than one sense.

The usual interpretation is that scripts glue together system commands that don't otherwise incorporate one another. This is tremendously useful and is by itself a huge advantage over GUIs.

But the second sense is that vast libraries of scripts keep developers bound to their previous UI promises. You don't capriciously change command-line options without raising all kinds of ire. I've seen GNU tools (tar comes to mind) in which options have been "deprecated" for decades, yet are still available. I seem to recall a very small number of cases where options had their senses entirely inverted, to much protest, and ultimately abandonment of those tools. In other cases you've got generations of programs which support backwards-compatible options with precursors: exim and postfix with sendmail, ssh with rsh, alpine with pine, off the top of my head. Developer headspace and script compatibility are valuable.

On that note, whilst I find the *BSD's extreme conservatism annoying myself, I can appreciate that it does lead to increased consistency over time as compared with GNU userland.
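
A small concrete case of that kept promise: GNU tar still honours the ancient bundled-option style alongside the newer spellings, decades on:

  tar xzf backup.tar.gz                        # old-style bundled options
  tar -x -z -f backup.tar.gz                   # equivalent short options
  tar --extract --gzip --file=backup.tar.gz    # equivalent long options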

ndsipa_pomu
1 replies
9h33m

There are also a lot of text-based tools that are designed to be used as part of a script and don't provide much functionality in themselves - the philosophy of do one thing (and do it well). GUIs tend to be designed to provide the entire functionality that's wanted, and little thought is given to them being used as part of a toolchain.

dredmorbius
0 replies
5h18m

Right. CLIs / TUIs offer composability. GUIs don't.

If someone makes a GUI that does specifically what you want, you're in luck. If not, you're S.O.L.

With CLIs, you can almost always find some way to string things together to get what it is you want.

The hard case is working in domains which aren't inherently textual: graphics, audio, video, etc. Even there, you can often get much more accomplished from the terminal than you'd think, though setting up the process can be challenging at times. Not so useful for one-offs, but invaluable for repetitive processes.

mjevans
1 replies
12h0m

The correct response to an API change is to re-version the API for the library. E.G.

bsdtar -> libarchive -- Arguments as presently known

bsdtar2 -> libarchive -- Some new overhaul that does things way differently

dredmorbius
0 replies
9h57m

Sure, for APIs or CLI tools.

Doesn't work so well for GUIs, which was sort of the original point.

limit499karma
2 replies
21h45m

This seems to deny the possibility of an equivalence between any sequence of actions taken in a bounded space of entities (named widgets, thus type:id) and another 'representation' (e.g. text):

   { select[i]@dropdown:states > click@button:submit }

The fact that we don't have this (yet) does not mean it is not possible. In fact, given that the current darling of tech, LLMs, can 'compute visually' based on tokens (text), it should be clear that any 'representation' can ultimately be encoded in text.

So a 'record' feature on GUI 'a' can create action encoding that can be emailed to your peer looking at GUI 'b' and 'pasted'.

wmf
0 replies
21h21m

MacOS kind of had this with AppleScript, including recording. It's a little disappointing that it isn't widespread now but I realize that demand for GUI automation is extremely niche.

jerf
0 replies
6h26m

I really wish one of the little GUI frameworks would develop this, but alas, the evidence is that "nobody" actually wants this. As evidenced by things like AppleScript that existed, and died.

I feel like it could be done if it was really a goal from day one, and there were things like "record this set of actions as a script" built into the toolkit. Even AppleScript was still an afterthought, I think, albeit a very well supported one.

In the meantime, given the comprehensive failure of UI toolkits to support this style, it is completely fair for people to act and speak as if it doesn't exist.

gregw2
2 replies
1d

Agreed. Poor scripting/replay inherently limits the GUI.

That said, the best late 90s expression of the core advantage of the GUI over the TUI/CLI is that it demands less of the user:

"recognize and click" vs

"remember and type"

That seems very fundamental to me.

I have not seen as succinct an expression of the tradeoffs for V(oice)UIs or L(LM)UIs.

mjevans
0 replies
11h58m

The 90s UIs all had 'hints' for how to activate UI features using keyboard shortcuts.

Want to know how to copy and paste quickly? It's Right There in the menu where you found the action. Don't know it yet? Alt+E (the E is underlined in Edit), then some other key to jump to the action in the list; or, now that you see the list, abort the previous command sequence with Esc and start memorizing the new shortcuts, Ctrl+C and Ctrl+V.

dsubburam
0 replies
1d

How about the new kid on the block here, the chat interface? Neither "recognize and click" nor "remember and type", esp. if done over voice.

Maybe the chat interface does away with the first half of the GUI/CLI schemes, skipping over learning the affordance part of the interface.

gumby
0 replies
1d

Or even write a script which is just those commands. Meanwhile, every step a user has to do is an opportunity to cause a new and different problem.

xg15
13 replies
1d9h

Buyer: "But this dealership has mechanics on staff. If something goes wrong with my station wagon, I can take a day off work, bring it here, and pay them to work on it while I sit in the waiting room for hours, listening to elevator music."

Bullhorn: "But if you accept one of our free tanks we will send volunteers to your house to fix it for free while you sleep!"

Did Linux distros actually offer support at some point? (By what I assume would be some project contributor ssh-ing into your machine)

My impression was always the arguments were more like "Well yes, but we have this literal building full of technical manuals that describe every bolt and screw of your tank - and we can give you a copy of all of them for free! And think about it - after you have taken some modest effort to read and learn all of them by heart, you'll be able to fix and even modify your tank all on your own! No more dependence on crooked car dealers! And if you need help, we have monthly community meetups you can attend and talk with people just as tank-crazy as you are! (please only attend if you're sufficiently tank-crazy, and PLEASE only after you read the manuals)"

(This was decades ago, the situation has gotten significantly better today)

lukan
7 replies
1d7h

"By what I assume would be some project contributor ssh-ing into your machine"

Who would want that?

"Stay away from my house, you freak!" would be the normal reaction. Unless some serious trust is developed, I would not let people into my house while I sleep.

Also the actual usual reaction would have been more like: "hey it is open source, you can fix anything on your tank yourself"

You need a new module to connect with your other devices, just build it yourself, no big deal!

xg15
4 replies
1d6h

I know you shouldn't look too closely at analogies, but I think there is an interesting inconsistency in the "car dealerships" story:

The tank people offer to send someone to look into the car (rsp. tank) but the buyer rejects them from entering their house.

That's significant, because a car is much less private than a house. In the real world, if my car had an issue, it would be perfectly reasonable to give it into the hands of a mechanic, even if I don't know them personally. (And evidently the reputation of the dealership isn't the deciding factor either, otherwise all the independent repair shops wouldn't exist)

On the other hand, I'd be much more wary to let strangers into my house without supervision, because I have far more private and valuable possessions there than in my car.

So the question is whether computers are more like cars or like houses. I'd argue, they sort of blur the line and have definitely moved closer to "house" in the last decades. But it might have been different back then.

charonn0
1 replies
22h12m

Obviously a computer is like an RV.

xg15
0 replies
21h56m

It might also be a bus.

dredmorbius
0 replies
19h35m

That nitpick over car/house analogies is rapidly breaking down as cars have increased compute and storage themselves, and integrate with other personal electronics, most especially "smart" phones.

Some recent discussion on the border phone search ruling:

<https://news.ycombinator.com/item?id=41084384>

BenjiWiebe
0 replies
23h43m

I'd say the reputation of dealerships is why independent repair shops exist. :)

heraldgeezer
1 replies
1d7h

Companies want that when prod stops.

owl57
0 replies
10h47m

Reminded me of an extreme case. In 2012, the largest Russian bank brought its prod back online, and then posted literally this:

We invite specialists to take part in a professional discussion to identify the causes of Oracle DBMS failure

System logs and a more complete description of the situation will be posted in the blog. To participate in the discussion, you must fill out the registration form and wait for an invitation to your email.

https://web.archive.org/web/20120716225650/http://www.sbrf.r...

bregma
2 replies
1d7h

Did Linux distros actually offer support at some point?

Ever wonder how Red Hat became a billion-dollar company before it was bought by IBM, and now makes up a huge segment of IBM's revenue stream?

Have you noticed SuSE is still around?

Have you ever speculated on how Canonical keeps its lights on?

Paid support, my naive friend. Linux support is big business and is what keeps the popular distros alive.

xg15
0 replies
1d7h

Good point!

matheusmoreira
0 replies
2h23m

Arch Linux seems to be an exception.

https://wiki.archlinux.org/title/Arch_Linux

Arch developers remain unpaid, part-time volunteers, and there are no prospects for monetizing Arch Linux.

delusional
1 replies
1d9h

Did Linux distros actually offer support at some point? (By what I assume would be some project contributor ssh-ing into your machine)

I don't think that was the intended implication. I think the analogy is more akin to: "If you send us a bug report, we'll fix it and ship a new version that you can download and use for free." In the olden days, you'd have to buy a new version of commercial software if it didn't work for your machine; complimentary patches were rare.

jmclnx
0 replies
1d5h

Depends on where you lived. In the early days there were lots of LUGs in some areas, usually college areas, but some would be hosted at a few companies.

I think DEC had one or two. And you could find someone who would meet you somewhere to help you out, it was an exciting time. Also there were lots of install fests for Linux.

Most activity took place on USENET, so getting help was rather easy.

For example, I had asked how I could connect 2 monitors to my 386SX, one controlled by a VGA card, the other via a mono-card, each monitor with a couple of VTs. That was doable with Coherent on install. A day later I got a patch.

Things moved very quickly back then :)

hgyjnbdet
12 replies
1d11h

Me: seems like my sort of thing.

Me: navigate to linked website, see wall of text.

Me: clicks reading mode

Me: *193 - 245 minutes*

Me: bookmark to read later; probably not

imiric
5 replies
1d8h

I ran it through an LLM and asked it to summarize and answer questions about it. Worked great to get the gist.

johnisgood
3 replies
1d8h

I am curious, why is this comment being down-voted? I mean, I would like to hear an opinion against it (not that I care about the points).

johnisgood
0 replies
1d6h

I am just simply asking for the opinion of people who disagree with OP. I do not care about the down-vote per se, more so about the opinion of people who disagree indicated by the down-votes.

imiric
0 replies
1d7h

Because HN has a hate boner against "AI". :)

The reality is that summarizing text and answering questions about it is one of the best use cases for what we currently call "AI".

bregma
0 replies
1d7h

Are you purposefully being ironic?

sshine
3 replies
1d10h

It’s a short novel.

Putting it in a browser window gives it bad odds. You can also listen to it:

https://youtu.be/KpaUg6WwdzU

Begins at 01:30, 25 minutes.

sshine
1 replies
1d1h

Thanks for correcting me.

boomskats
0 replies
19h40m

Thanks for the link! I'm 3/7ths of the way through it.

yetihehe
0 replies
1d11h

It's very engaging, please try reading it.

enugu
0 replies
1d10h

You can skip ahead to his playful thesis: the universe emerging from the command line.

In his book The Life of the Cosmos, which everyone should read, Lee Smolin gives the best description I've ever read of how our universe emerged from an uncannily precise balancing of different fundamental constants. The mass of the proton, the strength of gravity, the range of the weak nuclear force, and a few dozen other fundamental constants completely determine what sort of universe will emerge from a Big Bang. If these values had been even slightly different, the universe would have been a vast ocean of tepid gas or a hot knot of plasma or some other basically uninteresting thing--a dud, in other words. The only way to get a universe that's not a dud--that has stars, heavy elements, planets, and life--is to get the basic numbers just right. If there were some machine, somewhere, that could spit out universes with randomly chosen values for their fundamental constants, then for every universe like ours it would produce 10^229 duds.

Though I haven't sat down and run the numbers on it, to me this seems comparable to the probability of making a Unix computer do something useful by logging into a tty and typing in command lines when you have forgotten all of the little options and keywords. Every time your right pinky slams that ENTER key, you are making another try. In some cases the operating system does nothing. In other cases it wipes out all of your files. In most cases it just gives you an error message. In other words, you get many duds. But sometimes, if you have it all just right, the computer grinds away for a while and then produces something like emacs. It actually generates complexity, which is Smolin's criterion for interestingness.

Not only that, but it's beginning to look as if, once you get below a certain size--way below the level of quarks, down into the realm of string theory--the universe can't be described very well by physics as it has been practiced since the days of Newton. If you look at a small enough scale, you see processes that look almost computational in nature.

I think that the message is very clear here: somewhere outside of and beyond our universe is an operating system, coded up over incalculable spans of time by some kind of hacker-demiurge. The cosmic operating system uses a command-line interface. It runs on something like a teletype, with lots of noise and heat; punched-out bits flutter down into its hopper like drifting stars. The demiurge sits at his teletype, pounding out one command line after another, specifying the values of fundamental constants of physics:

universe -G 6.672e-11 -e 1.602e-19 -h 6.626e-34 -protonmass 1.673e-27....

and when he's finished typing out the command line, his right pinky hesitates above the ENTER key for an aeon or two, wondering what's going to happen; then down it comes--and the WHACK you hear is another Big Bang.

pavlov
8 replies
1d8h

The CLI has a massive blind spot in today’s operating systems: it knows nothing useful about events.

Yet events are the primary way anything happens on a computer, whether it’s a user system or a server.

skydhash
2 replies
1d6h

In the Unix world, everything is a file, so you can poll a file waiting for something to happen. And there's the whole signal mechanism, and dbus exists.
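
A minimal sketch of the file-polling half of that, in plain shell (the flag-file path and the one-second interval are just placeholders):

    #!/bin/sh
    # crude event wait: block until a (hypothetical) flag file appears, then act
    while [ ! -e /tmp/job.done ]; do
        sleep 1
    done
    echo "flag file appeared, processing results"

The signal half has the same shape: install a handler with trap and sleep in a loop until some other process sends the signal with kill.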

pavlov
1 replies
1d5h

Yeah, these are the event paradigms that I meant by “nothing useful.”

Files are not a good abstraction for events. Signals are broken in many ways. And DBus is both extremely clunky to use and non-portable.

There isn’t a built-in event paradigm similar to how streams and pipes are an integral part of the Unix-style CLI.

dandanua
0 replies
1d5h

Files are not a good abstraction for events

Why is that? On the low level everything is a state of electronic cells. Files address those cells in a suitable fashion. Modern programming abstractions such as async/await are very simple, but fail miserably if you need something really complex and efficient.

nonrandomstring
0 replies
1d8h

Annoying though it may be, you can run a program in the background that can write to your open terminal.

Just in userspace you have;

   dmesg -w

   tail -f /var/log/messages

There's also dbus to monitor on Linux systems, and a lot of kernel hook tricks you can use to get a message to pop up when an event happens.

It gets annoying to have a process splurge notification stuff into a term you are working in, which is why many terminal emulators support info-bars.
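
For instance, a minimal sketch gluing the dmesg -w idea above to a desktop pop-up (assumes notify-send from libnotify is available; dmesg may need elevated privileges on some systems):

    # follow the kernel log and raise a desktop notification on USB events
    dmesg -w | grep --line-buffered -i usb | while read -r line; do
        notify-send "kernel event" "$line"
    done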

johnklos
0 replies
1d

I can't even understand this. How does the CLI relate to events? Are you saying that all servers secretly have an invisible GUI and handle events there?

This just seems like an odd non sequitur.

fock
0 replies
1d8h

huh? DBUS is very much a thing and has CLI-tooling?

dredmorbius
0 replies
16h40m

That's the specific hole which Expect fills for scripting:

<https://en.wikipedia.org/wiki/Expect>

Much of a modern operating system is event-based and relies on polling for specific actions, devices connecting / disconnecting, etc. Linux device management is based on this, for example. There are numerous scripts which fire as a service is enabled or disabled (networking "ifup" and "ifdown" scripts, for example), or as services are started / stopped. Systemd extends these capabilities markedly.

And of course there's a whole slew of conditional processing, beginning with init scripts, [], test, &&, ||, and specific conditional logic within shells, scripting, and programming languages.
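
A minimal sketch of the Expect idea, feeding a script to expect from a shell heredoc (the FTP host and prompt strings here are hypothetical):

    expect <<'END'
    # log into a hypothetical anonymous FTP server, list files, quit
    spawn ftp ftp.example.com
    expect "Name*:"    { send "anonymous\r" }
    expect "Password:" { send "guest@example.com\r" }
    expect "ftp>"      { send "ls\r" }
    expect "ftp>"      { send "bye\r" }
    expect eof
    END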

anthk
0 replies
1d2h

You mean, like entr triggering commands?
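
For example (the Makefile test target and server.py here are hypothetical):

    # re-run the test suite whenever a tracked C file changes
    ls *.c | entr make test

    # or restart a long-running process on edit (-r restarts the child)
    ls server.py | entr -r python3 server.py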

bee_rider
7 replies
1d4h

Yet now the company that Gates and Allen founded is selling operating systems like Gillette sells razor blades. New releases of operating systems are launched as if they were Hollywood blockbusters, with celebrity endorsements, talk show appearances, and world tours.

I was a kid at the time, but did many people actually buy windows? I know about the ad-thing where the cast of Friends or whatever bought windows 95, but as I recall even back then the OS just came with the device. The only exception was OSX, which was a “Big Deal,” even non-technical people downloaded it.

Anyway, it is funny to see this in retrospect. Nowadays, operating systems have become so commoditized that you can’t even make a business selling them.

I love Linux but his description is quite optimistic.

crazygringo
1 replies
1d3h

Absolutely -- yes Windows came with your PC, but you'd buy the new version as a (cheaper) upgrade if you didn't want to wait until the next PC you bought.

Today new OS versions aren't such a big deal, but when Windows 95 came out, and then XP, they were huge, with total interface redesigns.

On the other hand, I don't think people went out of their way as much to buy smaller upgrades like Windows 98.

Shawnj2
0 replies
1d3h

I think the fact that OS updates are free now kinda killed a lot of the marketing/promotion since most consumers will only update if their computer is old.

shostack
0 replies
1d

It feels like the marketing of launches may still be like that, but purely to usher along adoption of the target future state: doing away with versions and locking you into an ever-increasing subscription fee just to keep using your computer, plus data collection that can be separately monetized.

This is markedly different from how it was in the past when they needed people to get up, go to a store, and buy the disc containing the new version of the OS.

plasticchris
0 replies
1d3h

Yes - people bought it to upgrade, as the step up from dos to 3.1 to win95 was huge.

layer8
0 replies
1d1h

I actually bought Windows 95 shrink-wrapped in store, because I was so excited about the new UI, for my previously DOS-only 486DX. Also OS/2 a bit later.

giantrobot
0 replies
1d

I was a kid at the time, but did many people actually buy windows?

Definitely. A new boxed OS version would often be the only update anyone ever applied to their system. Even if you had Internet access, dial-up speeds and limited disk space meant downloading OS updates was often impractical; even relatively small updates took forever to download.

There was also the relative cost of a computer. A $2,000 computer in 1995 would be about $4,000 in today's dollars. Buying an OS update would be a relatively inexpensive way to upgrade the capabilities of your expensive computer without completely replacing it. Going from some Windows 3.1 release to Windows 95 would have been a nice upgrade in system stability for many people. Certainly not for everyone, but for many.

MetaWhirledPeas
0 replies
1d3h

I was a kid at the time, but did many people actually buy windows?

Yep! Windows did indeed come with machines, but the upgrades were always a big seller. I remember when Windows 3.1 hit the shelves and seemed to be everywhere. Same with Windows 95 but that one was a tougher upgrade because of the increased system requirements.

kkfx
5 replies
1d10h

To be (re)read together with the Unix Haters Handbook https://web.mit.edu/~simsong/www/ugh.pdf to realize that what we need is remade LispMs, Smalltalk workstations, or the OS as a single application: a framework open down to the lowest level, in the user's hands, fully end-user programmable, discoverable, and fully integrated. A 2D or even 3D/4D CLI as the UI, which is to say DocUIs with optional 3D and video/multimedia elements.

As a conceptual framework http://augmentingcognition.com/assets/Kay1977.pdf

lproven
4 replies
1d9h

Riotous applause from the cheap seats

kkfx
3 replies
1d7h

Thanks :-) but allow me to elaborate, in my poor English: "we" in the West are the heirs of those who invented IT and modern industry, and we are evidently in a declining phase. Someone else is trying to emerge, and the most prominent new power, China, is emerging by doing what we did when we were at the top of the world in industry, public research, and education. In other words, that "technical model" works.

Financial capitalism evidently worked for a certain period, but it does not work anymore. So why keep committing suicide? We started with WWI, kept going with WWII, and we continue now.

We are still the leaders in IT, and we know what works: the classic FLOSS/open model, the one where vendors sell iron, not bits; where customers own their systems and bend them as they wish; where communication is not owned by walled gardens but open, like Usenet or classic mail (as opposed to webmail, which hides the decentralized nature of email from most users, who do not own the WebMUA). To continue the China comparison: I recently bought some battery tools, very cheap stuff but enough for domestic use, and I noticed that the batteries have a standard connector, so I can swap them between brands without issue; batteries and chargers are "standard". I also own some high-end domestic battery tools that have an NFC tag inside the battery tying it to the device, even though inside they are ordinary li-ion cells connected in the classic way. I've seen the same with BEVs: some friends and I own three Chinese BEVs of different brands/models, and they share an enormous number of common parts. So "open/standard" pays back: yes, it might erode SOME OEMs' margins, but it pays back society as a whole and, as a result, the OEM itself. The same is true in software terms. My desktop is Emacs/EXWM; I use consult/orderless/vertico as a search-and-narrow framework, and they simply plug into any other package because the system is a single application, end-user programmable at runtime. You do not need to "specifically support" a package to integrate it, and you do not need a third party to explicitly support your package in order to use them together. And what do we do instead? Our best to create walled gardens: we had XMPP and SIP/RTP, and now most people are on Zoom/Teams/Meet/Slack/Skype/*, all walled gardens. Many even push to replace email with some new colorful walled garden. Similarly, every app tries to bolt on features some other app already has, since it's impossible to just use them together, like a sexp in a buffer.

As a result, modern IT has regressed from Unix, where the user could at least combine simple tools with some IPC in a script, to being limited to cut&paste, drag&drop and not much more. Everything is damn complicated because there is no shared data structure in a shared environment; every "app" does everything on its own, often with a gazillion dependencies that have a gazillion dependencies of their own, with a big load of parsers of every kind, and even the "standards for passing data" (like JSON) that try to emerge are still not a shared data structure in a single integrated environment.

All of this is Greenspun's tenth rule at work, and it is killing our IT innovation, killing one of the last sectors where we still lead, for essentially the same reasons that killed our industry.

lproven
2 replies
1d7h

I 100% agree, and your points about Chinese tools are particularly incisive.

As an aside, but I think relevant and you might find it interesting:

A decade or so ago I discovered Oberon, the last masterwork of the great genius of programming languages Niklaus "Bucky" Wirth. A complete OS, UI and compiler, in about four and a half thousand lines of code.

I have it running in various forms.

I introduced it to the Squeak Smalltalk community, and when I told them what I was looking for:

« a relatively mature, high-performance, small OS underneath, delivering some degree of portability -- something written in a type-safe, memory-managed language, with multiprocessor support and networking and so on already present, so that these things do not have to be implemented in the higher-level system. »

That search is how I found Oberon. They told me such a thing did not and could not exist.

I gave them some links and people were amazed.

It seems deeply obscure in, as you say, the West.

But I have found an active community working on it and developing it. It is in Russia.

It may be that in other countries now the target of Western sanctions, we may inadvertently be fostering some very interesting tech developments...

kkfx
0 replies
1d4h

I know Oberon only by name. I used (and honestly did not love) some Pascal dialect back in high school, but that was not real programming, just a badly organized introductory course, so it's hard to tell. I probably encountered Oberon in a river-navigation application around 10 years ago, but I wasn't really involved, so I can't say much; I essentially know nothing but the name. If you have some interesting links to share, I'll skim them with pleasure.

In broader terms, I do not pay much attention to any specific programming language, even though an OS-as-single-application is clearly tied to one. There are many languages, and many factions loving one and hating the others; the point is offering something usable at the user level, like "just type a sexp and execute it", even from inside an email, because integration also means enormous small-scale diversity and heavy dialogue, and therefore innovation. With such a model we can keep our lead, and more importantly we can't lose it, because essentially everything floats in a common sea.

The main obstacle to reaching such a goal, I think, is the general culture of the masses: today most people, many programmers included, think that IT means computers, which is like saying that astronomy is the science of telescopes. Of course computers, like pen and paper, are an essential tool, but they are a tool; the purpose of IT is information, and that's not a narrow technical task but a broad concern involving essentially all disciplines. Until this is clear to everyone, there is little hope people will understand the need for, power of, and issues with IT; they'll keep looking at the finger pointing at the Moon instead of at the Moon.

The rest comes after; even the most basic computer skills come after, because to learn we need to be motivated, and learning "just because you have to" as a social rule is not productive.

yetihehe
3 replies
1d11h

My favourite part explaining how unix/linux users feel regarding windows: 'THE HOLE HAWG OF OPERATING SYSTEMS'.

This essay should be mandatory reading for all CS students and anyone wanting to call himself a hacker.

chuchurocka
1 replies
1d6h

I teach data analytics and it’s required reading in the first week. It doesn’t soften the blow of throwing people into the CLI, but it provides perspective on why I do.

askvictor
0 replies
19h16m

Just tell them it's the 'chat interface' to the computer.

raldi
0 replies
1d10h

Yes, and the part about Reagan.

booleandilemma
2 replies
1d5h

The capitalist system loves the CLI.

That complicated series of commands you just ran? Copy and paste them into the Jira ticket so the junior employee who makes half your salary can run them next time.

yakireev
0 replies
1d4h

> Copy and paste them into the Jira ticket so the junior employee who makes half your salary can run them next time.

Or, more likely: so that you yourself can remember what you did the next time the problem arises. Or your colleague, who is senior but does not know this part of the codebase or infra well. Heck, you can even write a shell script, automate things, and increase your productivity!
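
For instance, a minimal sketch of turning that pasted one-off fix into a runbook script (the ticket number, service name, and commands are all hypothetical):

    #!/bin/sh
    # runbook for "worker queue stuck" (hypothetical JIRA-1234)
    set -eux                                   # abort on errors, echo each step
    systemctl restart my-worker                # hypothetical service name
    journalctl -u my-worker -n 20 --no-pager   # confirm it came back up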

These things will be just as true and useful in some communist FOSS context as they are in a capitalist system.

kstrauser
0 replies
1d1h

It’s not capitalist oppression to not want to reinvent the wheel every time.

If the solution to a problem is as simple as “copy and paste this command”, I want to hand that off to a junior employee so I can go do other things that require more creativity.

willguest
1 replies
1d9h

Shouldn't this be written in Latin?

layer8
0 replies
1d1h

Hebrew, you mean?

sschober
0 replies
12h45m

I love that you can do that with the text.

dang
0 replies
17m

I'm pretty sure it's far from #1 (consider "Story of Mel", "You and Your Research", and many others), but I don't have that list or even know how to create such a list, as it's perturbed by variations in both titles and URLs.

inciampati
1 replies
1d10h

I read this during my university graduation ceremonies. It was hidden in a fold in my robes. It was the most fitting thing I could have done, as I immediately changed my life direction and focused on exactly the ideas outlined in this work. My goal: move the world from the command line. I've almost managed to.

throwaway211
0 replies
1d10h

I liken this to the move away from IP Chains.

We should move to the Command Table.

zeruch
0 replies
11h13m

Such an underrated book.

themadturk
0 replies
3h19m

It may be outdated, but it's still one of my favorite Stephenson texts. It's been a while since I read it, but the best part for me is his explanation of abstraction in graphical systems, which I think is still valuable today as we get further and further away from whatever the fundamental interface is between human and computer.

simonmysun
0 replies
1d4h

I just wrote a command-line interface to LLMs in (almost) pure Bash[1]. I believe in the future of LLMs because of the points in this article: people talk to LLMs the same way we talk to CLI shells, and everything is plain-text based (it's the Unix philosophy!). I should've read this earlier to get more ideas before writing the CLI client.

[1]: https://github.com/simonmysun/ell

sharas-
0 replies
9h0m

My first 486DX PC had a "turbo" button. You press it - and it wasn't just a teletype anymore.

ryandv
0 replies
1d7h

The section on "THE INTERFACE CULTURE" is critical to understand in today's digital media landscape. Disney's Animal Kingdom is to the written works of Lewis Carroll and J.M. Barrie as the GUI is to the command-line interface. The message of one medium is audiovisual spectacle and immersive experience; the other, cold intellectualism demanding participation from the reader to paint a picture in his mind's eye through the interpretation of raw text, words on a screen, or a piece of paper.

    But this is precisely the same as what is lost in the transition from the
    command-line interface to the GUI.

    Why are we rejecting explicit word-based interfaces, and embracing
    graphical or sensorial ones--a trend that accounts for the success of both
    Microsoft and Disney?

    But we have lost touch with those intellectuals, and with anything like
    intellectualism, even to the point of not reading books any more, though we
    are literate.

Elsewhere [0] I have called this concept "post-literacy," and this theme pervades much of Stephenson's work - highly technologically advanced societies outfitted with molecular assemblers and metaverses, populated by illiterate masses who mostly get by through the use of pictographs and hieroglyphic languages (emoji, anyone?). Literacy is for the monks who, cloistered away in their monasteries, still scribble ink scratchings on dead trees and ponder "useless" philosophical quandaries.

The structure of modern audiovisual media lends itself to the immediate application of implicit bias. On IRC, in the days of 56k before bandwidth and computer networks had developed to the point of being able to deliver low-latency, high definition audio and video, perhaps even for "real-time" videoconferencing, most of your interactions with others online was mediated through the written word. Nowhere here, unless some party chooses to disclose it, do race, gender, accent, physical appearance, or otherwise, enter into the picture and possibly cloud your judgment of who a person is - or, more importantly, the weight of their words, and whether or not they are correct, or at least insightful; consider Turing's "Computing Machinery and Intelligence" paper which first introduced what is now called the "Turing test," and how it was designed to be conducted purely over textual media as a written conversation, so as to avoid influencing through other channels the interrogator's judgment of who is the man, and who is the machine.

    The only real problem is that anyone who has no culture, other than this
    global monoculture, is completely screwed. Anyone who grows up watching TV,
    never sees any religion or philosophy, is raised in an atmosphere of moral
    relativism, learns about civics from watching bimbo eruptions on network TV
    news, and attends a university where postmodernists vie to outdo each other
    in demolishing traditional notions of truth and quality, is going to come
    out into the world as one pretty feckless human being.

Moreover, the confusion of symbols for reality, the precession of digitized, audiovisual content from a mere representation to more-than-real, digital hyperreality (since truth and God are all dead and everything is merely a consensual societal hallucination), leads people to mistake pixels on a screen for actual objects; narrative and spin for truth; influencers, videos, and YouTube personalities for actual people; or words from ChatGPT as real wisdom and insight - much in the same way that Searle's so-called "Chinese room" masquerades as an actual native speaker of Mandarin or Cantonese: "What we're really buying is a system of metaphors. And--much more important--what we're buying into is the underlying assumption that metaphors are a good way to deal with the world."

    So many ignorant people could be dangerous if they got pointed in the wrong
    direction, and so we've evolved a popular culture that is (a) almost
    unbelievably infectious and (b) neuters every person who gets infected by
    it, by rendering them unwilling to make judgments and incapable of taking
    stands.

    It simply is the case that we are way too busy, nowadays, to comprehend
    everything in detail.

The structure of modern short-form, upvote-driven media lends itself to the production of short-form messages and takes with laughably small upper bounds on the amount of information they can contain. In a manner reminiscent of "you are what you eat," you think similarly to the forms of media you consume - and one who consumes primarily short-form media will produce short-form thoughts bereft of nuance and critical thinking, and additionally suffer from all the deficits in attention span we have heard of as the next braindead 10-second short or reel robs you of your concentration, and the next, and the next...

Beyond the infectious slot machine-like dopamine gratification of the pull-to-refresh and the infinite doomscroll, the downvote has become a frighteningly effective means of squashing heterodoxy and dissent; it is only those messages that are approved of and given assent to by the masses that become visible on the medium. Those who take a principled stand are immediately squashed down by the downvote mob, or worse, suffer from severe censure and invective at the hands of those zealous enforcers of orthodoxy. The downvote mechanism is reminiscent of the three "filters" Chomsky wrote of when he was discussing the mass media in "Manufacturing Consent," and the way advertisers, government, and capital all control and filter what content is disseminated to media consumers.

The message of modern, audiovisual, short-form, upvote-driven social media is bias and group compliance bereft of nuance. If you want to produce and consume novel ideas you are better served by media based on the written word.

[0] https://news.ycombinator.com/item?id=39990133

rrnechmech
0 replies
22h31m

This is timeless

qubex
0 replies
1d9h

I’ve got this in soft-cover. I think I read it back before the turn of the millennium. As a BeOS aficionado I loved the reference to batmobiles.

marmot777
0 replies
1d8h

Because I had to read this before falling asleep, I had the audio going at 2x, so it sounds like a fifties cartoon.

macleginn
0 replies
20h56m

Using plain $LANGUAGE to communicate with LLMs is an interesting contender: even easier for end users than GUIs, even less precise, and even further removed from the underlying computations. In spirit, it is definitely a step in the same general direction, which makes a lot of the formulations in this article sharply prescient.

kaladin_1
0 replies
1d

Thanks for posting. A very interesting read. This is my first time seeing this well-written piece. It was written in a text file, a sign of his distrust of software that mangles your ASCII :)

jart
0 replies
1d2h

It is commonly understood, even by technically unsophisticated computer users, that if you have a piece of software that works on your Macintosh, and you move it over onto a Windows machine, it will not run. That this would, in fact, be a laughable and idiotic mistake, like nailing horseshoes to the tires of a Buick.

Not anymore. https://justine.lol/ape.html

danielovichdk
0 replies
1d9h

Yes. Cults are real. At the same time very telling. Some people really believe they are better because they have a different machine than someone else. And that shit runs deep.

dang
0 replies
20h42m

Related:

In the Beginning was the Command Line (1999) - https://news.ycombinator.com/item?id=37314225 - Aug 2023 (2 comments)

In the Beginning Was the Command Line - https://news.ycombinator.com/item?id=29373944 - Nov 2021 (4 comments)

In the Beginning was the Command Line (1999) - https://news.ycombinator.com/item?id=24998305 - Nov 2020 (64 comments)

In the beginning was the command line (1999) [pdf] - https://news.ycombinator.com/item?id=20684764 - Aug 2019 (50 comments)

In the Beginning Was the Command Line (1999) - https://news.ycombinator.com/item?id=16843739 - April 2018 (13 comments)

In the Beginning Was the Command Line (1999) - https://news.ycombinator.com/item?id=12469797 - Sept 2016 (54 comments)

In the beginning was the command line - https://news.ycombinator.com/item?id=11385647 - March 2016 (1 comment)

In the Beginning was the Command Line, by Neal Stephenson - https://news.ycombinator.com/item?id=408226 - Dec 2008 (12 comments)

In the beginning was the command line by Neil Stephenson - https://news.ycombinator.com/item?id=95912 - Jan 2008 (5 comments)

In the Beginning Was the Command Line - https://news.ycombinator.com/item?id=47566 - Aug 2007 (2 comments)

(Reposts are fine after a year or so; links to past threads are just to satisfy extra-curious readers. In the case of perennials like this one, it's good to have a new discussion every once in a while so newer user cohorts learn what the classics are.)

almog
0 replies
22h4m

I still remember one of the quotes from The Metaphor Shear chapter, where he tried to explain how he felt after some of his files on his Mac got corrupted to such degree that he just couldn't recover: "It was sort of like watching the girl you've been in love with for ten years get killed in a car wreck, and then attending her autopsy, and learning that underneath the clothes and makeup she was just flesh and blood."

KingOfCoders
0 replies
1d9h

In the beginning was a switch.