Niklaus Wirth has died

lukego
79 replies
22h54m

Besides all his innumerable accomplishments he was also a hero to Joe Armstrong and a big influence on his brand of simplicity.

Joe would often quote Wirth as saying that yes, overlapping windows might be better than tiled ones, but not better enough to justify their cost in implementation complexity.

RIP. He is also a hero for me for his 80th birthday symposium at ETH where he showed off his new port of Oberon to a homebrew CPU running on a random FPGA dev board with USB peripherals. My ambition is to be that kind of 80-year-old one day, too.

cscheid
72 replies
22h5m

not _better enough_

Wirth was such a legend on this particular aspect. His stance on compiler optimizations is another example: only add optimization passes if they improve the compiler's self-compilation time.

Oberon also (and also deliberately) only supported cooperative multitasking.

frognumber
52 replies
21h51m

Cooperative multitasking won in the end.

It just renamed itself to asynchronous programming. That's quite literally what an 'await' is.

kragen
18 replies
21h45m

async/await has the advantage over cooperative multitasking that it has subroutines of different 'colors', so you don't accidentally introduce concurrency bugs by calling a function that can yield without knowing that it can yield

i think it's safe to say that the number of personal computers running operating systems without preemptive multitasking is now vanishingly small

as i remember it, oberon didn't support either async/await or cooperative multitasking. rather, the operating system used an event loop, like a web page before the introduction of web workers. you couldn't suspend a task; you could only schedule more work for later

nottorp
11 replies
21h15m

And these fancy new names aren't there just for hiding the event loop? :)

frognumber
5 replies
20h21m

Sort of and sort of not.

The key thing about 2023-era asynchronous versus 1995-era cooperative multitasking is code readability and conciseness.

Under the hood, I'm expressing the same thing, but Windows 3.1 code was not fun to write. Python / JavaScript, once you wrap your head around it, is. The new semantics are very readable, and rapidly improving too. The old ones were impossible to make readable.

You could argue that it's just syntactic sugar, but it's bloody important syntactic sugar.

hathawsh
2 replies
16h57m

Exactly. The way I think about it, the "async" keyword transforms function code so that local variables are no longer bound to the stack, making it possible to pause function execution (using "await") and resume it at an arbitrary time. Performing that transformation manually is a fair amount of work and it's prone to errors, but that's what we did when we wrote cooperatively multitasked code.
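As a minimal Python sketch of that difference (all the names here, like fake_io and pending, are invented for illustration): the callback version must smuggle its locals into a closure by hand, while the async version keeps ordinary locals across the pause.

    import asyncio

    pending = []                          # a toy event queue

    def fake_io(url, callback):
        # stand-in for a nonblocking read: queue the callback for later
        pending.append(lambda: callback("data from " + url))

    def fetch_and_process(url, done):
        # manual transformation: `url` and `done` survive only because
        # the closure captures them; the original stack frame is gone
        # by the time on_data finally runs
        def on_data(data):
            done(data.upper())
        fake_io(url, on_data)

    fetch_and_process("example", print)
    while pending:                        # the "event loop"
        pending.pop(0)()                  # prints: DATA FROM EXAMPLE

    async def fake_io_async(url):
        await asyncio.sleep(0)            # suspend; other tasks may run
        return "data from " + url

    async def fetch_and_process_async(url):
        data = await fake_io_async(url)   # paused and resumed here
        return data.upper()               # `data` is an ordinary local

    print(asyncio.run(fetch_and_process_async("example")))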

nottorp
1 replies
7h30m

when we wrote cooperatively multitasked code

That's my point, we still do that. And based on your phrasing we're forgetting it :)

hathawsh
0 replies
20m

Sure, that's a good way to look at it. Another way to look at it: because the process of transforming code for cooperative multitasking is now much cleaner and simpler, it's fine to use new words to describe what to do and how to do it.

wglb
0 replies
14h41m

Coroutines are better than both. Particularly in reasoning about code.

bjoli
0 replies
10h58m

I never left 1991 and I haven't seen anything that has made me consider leaving ConcurrentML except for the actor model, but that is so old the documentation is written on parchment.

kragen
4 replies
21h7m

if the implied contrast is with cooperative multitasking, it's exactly the opposite: they're there to expose the event loop in a way you can't ignore. if the implied contrast is with setTimeout(() => { ... }, 0) then yes, pretty much, although the difference is fairly small—implicit variable capture by the closure does most of the same hiding that await does

nottorp
3 replies
21h1m

Not asking about old JavaScript vs new JavaScript. Asking about explicit event loop vs hidden event loop with fancy names like timeout, async, await...

kragen
2 replies
20h54m

do you mean the kind of explicit loop where you write

    for (;;) {
        int r = GetMessage(&msg, NULL, 0, 0);
        if (!r) break;
        if (r == -1) croak();
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
or, in yeso,

    for (;;) {
        yw_wait(w, 0);
        for (yw_event *ev; (ev = yw_get_event(w));) handle_event(ev);
        redraw(w);
    }
async/await doesn't always hide the event loop in that sense; python asyncio, for example, has a lot of ways to invoke the event loop or parts of it explicitly, which is often necessary for integration with software not written with asyncio in mind. i used to maintain an asyncio cubesat csp protocol stack where we had to do this

to some extent, though, this vitiates the concurrency guarantees you can otherwise get out of async/await. software maintainability comes from knowing that certain things are impossible, and pure async/await can make concurrency guarantees which disappear when a non-async function can invoke the event loop in this way. so i would argue that it goes further than just hiding the event loop. it's like saying that garbage collection is about hiding memory addresses: sort of true, but false in an important sense
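a tiny python sketch of the hazard, with invented names: a function that isn't marked async can still spin up the event loop, so nothing in its signature warns callers that other tasks may run during the call

    import asyncio

    async def fetch():
        await asyncio.sleep(0)   # other tasks could run here
        return 42

    def fetch_sync():
        # non-async, but it invokes the event loop anyway; callers
        # can't tell from the signature that this is a yield point
        return asyncio.run(fetch())

    print(fetch_sync())          # 42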

nottorp
1 replies
19h38m

What worries me is we may have a whole generation who doesn't know about the code you posted above and thinks it's magic or, worse, real multiprocessing.

kragen
0 replies
19h16m

okay but is that what you meant by 'hiding the event loop' or did you mean something different

scubbo
5 replies
18h13m

(To set the tone clearly - this seems like an area where you know a _lot_ more than me, so any questions or "challenges" below should be considered as "I am probably misunderstanding this thing - if you have the time and inclination, I'd really appreciate an explanation of what I'm missing" rather than "you are wrong and I am right")

I don't know if you're intentionally using "colour" to reference https://journal.stuffwithstuff.com/2015/02/01/what-color-is-... ? Cooperative multitasking (which I'd never heard of before) seems from its Wikipedia page to be primarily concerned with Operating System-level operations, whereas that article deals with programming language-level design. Or perhaps they are not distinct from one another in your perspective?

I ask because I've found `async/await` to just be an irritating overhead; a hoop you need to jump through in order to achieve what you clearly wanted to do all along. You write (pseudocode) `var foo = myFunction()`, and (depending on your language of choice) you either get a compilation or a runtime error reminding you that what you really meant was `var foo = await myFunction()`. By contrast, a design where every function is synchronous (which, I'd guess, more closely matches most people's intuition) can implement async behaviour when (rarely) desired by explicitly passing function invocations to an Executor (e.g. https://www.digitalocean.com/community/tutorials/how-to-use-...). I'd be curious to hear what advantages I'm missing out on! Is it that async behaviour is desired more often in other problem areas I don't work in, or that there's some efficiency provided by async/await that Executors cannot provide, or something else?

versteegen
2 replies
16h11m

I ask because I've found `async/await` to just be an irritating overhead

Then what you want are coroutines[1], which are strictly more flexible than async/await. Languages like Lua and Squirrel have coroutines. I and plenty of other people think it's tragic that Python and JavaScript added async/await instead, but I assume the reason wasn't to make them easier to reason about, but rather to make them easier to implement without hacks in existing language interpreters not designed around them. Though Stackless Python is a CPython fork that adds real coroutines, also available as the greenlet module in standard CPython [2]; amazing that it works.

[1] Real coroutines, not what Python calls "coroutines with async syntax". See also nearby comment about coroutines vs coop multitasking https://news.ycombinator.com/item?id=38859828

[2] https://greenlet.readthedocs.io/en/latest/

wglb
1 replies
14h43m

Bliss 36 and siblings had native coroutines.

We used coroutines in our interrupt-rich environment in our real-time medical application way back when. This was all in assembly language, and the coroutines vastly reduced our multithreading errors to effectively zero. This is one place where C, claimed to be close to the machine, falls down.

kragen
0 replies
14h10m

interesting, i didn't even realize bliss for the pdp-10 was called bliss-36

how did coroutines reduce your multithreading errors

kragen
1 replies
16h57m

well some of the things i know are true but i don't know which ones those are; i'll tell you the things i know and hopefully you can figure out what's really true

yes! i'm referencing that specific rant. except that what munificent sees as a disadvantage i see as an advantage

there's a lot of flexibility in systems design to move things between operating systems and programming languages. dan ingalls in 01981 takes an extreme position in 'design principles behind smalltalk' https://www.cs.virginia.edu/~evans/cs655/readings/smalltalk....

An operating system is a collection of things that don't fit into a language. There shouldn't be one.

in the other direction, tymshare and key logic's operating system 'keykos' was largely designed, norm hardy said, with concepts from sigplan, the acm sig on programming languages, rather than sigsosp

sometimes irritating overhead hoops you need to jump through have the advantage of making your code easier to debug later. this is (i would argue, munificent would disagree) one of those times, and i'll explain the argument why below

in `var foo = await my_function()` usually if my_function is async that's because it can't compute foo immediately; the reasons in the examples in the tutorial you linked are making web requests (where you don't know the response code until the remote server sends it) and reading data from files (where you may have to wait on a disk or a networked fileserver). if all your functions are synchronous, you don't have threads, and you can't afford to tie up your entire program (or computer) waiting on the result, you have to do something like changing my_function to return a promise, and putting the code below the line `var foo = await my_function()` into a separate subroutine, probably a nested closure, which you pass to the promise's `then` method. this means you can't use structured control flow like statement sequencing and while loops to go through a series of such steps, the way you can with threads or async
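to make that concrete, here's a minimal python sketch (fetch is an invented stand-in for a slow async step): with await, a plain for loop works across the suspension points; with callbacks, the rest of the function moves into a continuation and the loop has to be re-expressed as recursion

    import asyncio

    async def fetch(n):                   # invented async step
        await asyncio.sleep(0)
        return n + 1

    async def main():                     # with await: structured control flow
        total = 0
        for i in range(3):
            total += await fetch(i)
        return total

    def main_cb(done, i=0, total=0):      # without await: the loop becomes
        if i == 3:                        # recursion on the callback
            done(total)
            return
        task = asyncio.ensure_future(fetch(i))
        task.add_done_callback(
            lambda t: main_cb(done, i + 1, total + t.result()))

    async def run_cb():
        fut = asyncio.get_running_loop().create_future()
        main_cb(fut.set_result)
        return await fut

    print(asyncio.run(main()))            # 6
    print(asyncio.run(run_cb()))          # 6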

so what if you use threads? the example you linked says to use threads! i think it's a widely accepted opinion now (though certainly not universal) that shared-mutable-memory threading is the wrong default, because race conditions in multithreaded programs with implicitly shared mutable memory are hard to detect and prevent, and also hard to debug. you need some kind of synchronization between the threads, and if you use semaphores or locks like most people do, you also get deadlocks, which are hard to prevent or statically detect but easy to debug once they happen

async/await guarantees you won't have deadlocks (because you don't have locks) and also makes race conditions much rarer and relatively easy to detect and prevent. mark s. miller, one of the main designers of recent versions of ecmascript, wrote his doctoral dissertation largely about this in 02006 http://www.erights.org/talks/thesis/index.html after several years working on an earlier programming language called e based on promises like the ones he later added to js; but i have to admit that, while i've read a lot of his previous work, i haven't read his dissertation yet

cooperative multitasking is in an in-between place; it often doesn't use locks and makes race conditions somewhat rarer and easier to detect and prevent than preemptive multitasking, because most functions you call are guaranteed not to yield control to another thread. you just have to remember which ones those are, and sometimes it changes even though your code didn't change

(in oberon, at least the versions i've read about, there was no way to yield control. you just had to finish executing and return, like in js in a web page before web workers, as i think i said upthread)

that's why i think it's better to have colored functions even though it sometimes requires annoying hoop-jumping

pjmlp
0 replies
10h12m

async/await guarantees you won't have deadlocks

You will get them in .NET and C++, because they map to real threads being shared across tasks.

There is even a FAQ maintained by the .NET team regarding gotchas, like not calling ConfigureAwait with the right thread context in some cases where it needs to be explicitly configured, like a task moving between foreground and background threads.

jerf
12 replies
21h2m

It hasn't won. Threads are alive and well and I rather expect async has probably already peaked and is back on track to be a niche that stays with us forever, but a niche nevertheless.

Your opinion vs. my opinion, obviously. But the user reports of the experience in Rust are hardly even close to unanimous praise, and I still say it's a mistake to sit down with an empty Rust program and immediately reach for "async" without considering whether you actually need it. Even in the network world, juggling hundreds of thousands of simultaneous tasks is the exception rather than the rule.

Moreover, cooperative multitasking was given up at the OS level for good and sufficient reasons, and I see no evidence that the current thrust in that direction has solved them. As you scale up, the odds of something jamming your cooperative loop monotonically increase. At best we've increased the scaling factors, and even that may just be an effect of faster computers rather than better solutions.

kragen
9 replies
15h27m

in the 02000s there was a lot of interest in software transactional memory as a programming interface that gives you the latency and throughput of preemptive multithreading with locks but the convenience of cooperative multitasking; in haskell it's still supported and performs well, but it has been largely abandoned in contexts like c#, because it kind of wants to own the whole world. it's difficult to add incrementally to a threads-and-locks program

i suspect that this will end up being the paradigm that wins out, even though it isn't popular today
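the optimistic core of stm is small enough to sketch; here's a toy python version (every name is invented, and a real implementation is enormously more subtle): a transaction buffers its writes, records the version of everything it reads, and retries from scratch if validation fails at commit time

    import threading

    class Ref:
        def __init__(self, value):
            self.value, self.version = value, 0

    _commit_lock = threading.Lock()

    def atomically(tx):
        while True:
            reads, writes = {}, {}
            def read(ref):
                if ref in writes:
                    return writes[ref]
                reads.setdefault(ref, ref.version)
                return ref.value
            def write(ref, value):
                writes[ref] = value
            result = tx(read, write)
            with _commit_lock:
                if all(ref.version == v for ref, v in reads.items()):
                    for ref, value in writes.items():
                        ref.value = value
                        ref.version += 1
                    return result
            # something we read changed under us: retry the transaction

    a, b = Ref(100), Ref(0)
    atomically(lambda read, write: (write(a, read(a) - 10),
                                    write(b, read(b) + 10)))
    print(a.value, b.value)               # 90 10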

senderista
7 replies
13h50m

I was considering making a startup out of my simple C++ STM[0], but the fact that, as you point out, the transactional paradigm is viral and can't be added incrementally to existing lock-based programs was enough to dissuade me.

[0] https://senderista.github.io/atomik-website/

kragen
4 replies
13h38m

nice! when was this? what systems did you build in it? what implementation did you use? i've been trying to understand fraser's work so i can apply it to a small embedded system, where existing lock-based programs aren't a consideration

senderista
3 replies
13h26m

It grew out of an in-memory MVCC DB I was building at my previous job. After the company folded I worked on it on my own time for a couple months, implementing some perf ideas I had never had time to work on, and when update transactions were <1us latency I realized it was fast enough to be an STM. I haven't finished implementing the STM API described on the site, though, so it's not available for download at this point. I'm not sure when I'll have time to work on it again, since I ran out of savings and am going back to full-time employment. Hopefully I'll have enough savings in a year or two that I can take some time off again to work on it.

kragen
2 replies
13h23m

that's exciting! i just learned about hitchhiker trees (and fractal tree indexes, blsm trees, buffer trees, etc.) this weekend, and i'm really excited about the possibility of using them for mvcc. i have no idea how i didn't find out about them 15 years ago!

senderista
1 replies
3h12m

Then you may be interested in this paper which shows how to turn any purely functional data structure into an MVCC database.

https://www.cs.cmu.edu/~yihans/papers/concurrency.pdf

kragen
0 replies
1h45m

thank you!

LispSporks22
1 replies
11h3m

Sounds nifty. Did this take advantage of those Intel (maybe others?) STM opcodes? For a while I was stoked on CL-STMX, which did (as well as implementing a non-native version with the same interface).

senderista
0 replies
14m

No, not at all. I'm pretty familiar with the STM literature by this point, but I basically just took the DB I'd already developed and slapped an STM API on top. Given that it can do 4.8M update TPS on a single thread, it's plenty fast enough already (although scalability isn't quite there yet; I have plenty of ideas on how to fix that but no time to implement them).

Since I've given up on monetizing this project, I may as well just link to its current state (which is very rough, the STM API described in the website is only partly implemented, and there's lots of cruft from its previous life that I haven't ripped out yet). Note that this is a fork of the previous (now MIT-licensed) Gaia programming platform (https://gaia-platform.github.io/gaia-platform-docs.io/index....).

https://github.com/senderista/nextdb/tree/main/production/db...

The version of this code previously released under the Gaia programming platform is here: https://github.com/gaia-platform/GaiaPlatform/blob/main/prod.... (Note that this predates my removal of IPC from the transaction critical path, so it's about 100x slower.) A design doc from the very beginning of my work on the project that explains the client-server protocol is here (but completely outdated; IPC is no longer used for anything but session open and failure detection): https://github.com/gaia-platform/GaiaPlatform/blob/main/prod....

denton-scratch
0 replies
7h10m

in the 02000s there was a lot of interest

I read that as octal; so 1024 in decimal. Not a very interesting year, according to Wikipedia.

https://en.wikipedia.org/wiki/1024

hathawsh
1 replies
17h6m

Meanwhile, in JS/ECMAScript land, async/await is used everywhere and it simplifies a lot of things. I've also used the construct in Rust, where I found it difficult to get the type signatures right, but in at least one other language, async/await is quite helpful.

samus
0 replies
12h29m

Await is simply syntactic sugar on top of what everybody was forced to do already (callbacks and promises) for concurrency. As a programming model, threads simply never had a chance in the JS ecosystem because on the surface it has always been a single-threaded environment. There's too much code that would be impossible to port to a multithreaded world.

mkl
10 replies
21h44m

It has mostly won for individual programs, but very much not for larger things like operating systems and web browsers.

epcoa
5 replies
21h39m

Mostly won for CRUD apps (yes and a few others). Your DAW, your photo editor, your NLE, your chatbot girlfriend, your game, your CAD, etc might actually want to use more than one core effectively per task. Even go had to grow up eventually.

pjmlp
2 replies
21h3m

Indeed, however the experience with crashes and security exploits has proven that scaling processes, or even distributing them across several machines, scales much better than threads.

kragen
1 replies
19h4m

preemptively scheduled processes, not cooperatively scheduled

pjmlp
0 replies
11h50m

Ah, missed that.

frognumber
1 replies
21h26m

It's moving in more and more.

A core problem is that it's now clear most apps have hundreds or thousands of little tasks going, increasingly bound by network, IO, and similar. Async gives nice semantics for implementing cooperative multitasking, without introducing nearly as many thread coherency issues as preemptive.

I can do things atomically. Yay! Code literally cooperates better. I don't have the messy semantics of a Windows 3.1 event loop. I suspect it will spread more and more into all walks of code.

Other models are better for either:

- Highly parallel compute-bound code (where SIMD/MIMD/CUDA-style models are king)

- Highly independent code, such as separate apps, where there are no issues around cooperation. Here, putting each task on a core, and then preemptive, obviously wins.

What's interesting is all three are widely used on my system. My tongue-in-cheek comment about cooperative multitasking winning was only a little bit wrong. It didn't quite win in the sense of taking over other models, but it's in widespread use now. If code needs to cooperate, async sure beats semaphores, mutexes, and all that jazz.

funcDropShadow
0 replies
11h6m

Async programming is not an alternative to semaphores and mutexes. It is an alternative to having more threads. The substantial drawback of async programming in most implementations is that stack traces and debuggers become almost useless, or at least very hard to use productively.

funcDropShadow
3 replies
11h12m

In the last 15 to 20 years asynchronous programming --- as a form of cooperative multi-tasking [1] --- did gain lots of popularity. That was mainly because of non-scalable thread implementations in most language runtimes, e.g. the JVM. At the same time the JS ecosystem needed to have some support for concurrency. Since threads weren't even an option, the community settled first on callback hell and then on async/await. That former reason for asynchronous programming's alleged win is currently being reversed: the JVM has introduced lightweight threads that have the low runtime cost of asynchronous programming and all the niceties of thread-based concurrency.

[1]: Asynchronous programming is not the only form of cooperative programming. Usually cooperative multi-tasking systems have a special system call, yield(), which gives up the processor in addition to IO-induced context switches.
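As an illustration, here is a minimal cooperative scheduler sketched in Python (all names invented), with generators standing in for tasks and a bare `yield` playing the role of that yield() call:

    from collections import deque

    def run(tasks):
        ready = deque(tasks)
        while ready:
            task = ready.popleft()
            try:
                next(task)                # run the task until it yields...
                ready.append(task)        # ...then requeue it
            except StopIteration:
                pass                      # the task returned; drop it

    def worker(name):
        for i in range(2):
            print(name, i)
            yield                         # voluntarily give up the processor

    run([worker("a"), worker("b")])       # prints a 0, b 0, a 1, b 1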

pjmlp
2 replies
10h17m

In .NET and C++, asynchronous programming is not cooperative: it hides the machinery of a state machine mapping tasks onto threads, it gets preempted, and you can write your own scheduler.

funcDropShadow
1 replies
10h11m

But isn't the separation of the control flow into chunks, either separated by async/await or by the separation between call and callback, a form of cooperative thread yielding on top of preemptive threads? If that isn't true for .NET, then I'd be really interested to understand what else it is doing.

pjmlp
0 replies
8h57m

No, it is a state machine that generates an instance of a Task from the Task Parallel Library, and automates the Run()/Get() invocations from it.

Assuming your type isn't an Awaitable, with magic methods to influence how the compiler actually generates the state machine.

vram22
5 replies
21h28m

Cooperative multitasking won in the end.

Is this the same as coroutines as in Knuth's TAOCP volume 1?

Sorry, my knowledge is weak in this area.

kragen
4 replies
21h11m
vram22
3 replies
20h55m

Thanks, will check that.

steveklabnik
2 replies
19h25m

The quick answer is that coroutines are often used to implement cooperative multitasking because it is a very natural fit, but it's a more general idea than that specific implementation strategy.

kragen
1 replies
19h22m

interesting, i would have said the relationship is the other way around: cooperative multitasking implies that you have separate stacks that you're switching between, and coroutines are a more general idea which includes cooperative multitasking (as in lua) and things that aren't cooperative multitasking (as in rust and python) because the program's execution state isn't divided into distinct tasks

i could just be wrong tho

steveklabnik
0 replies
18h44m

Yeah thinking about it more I didn’t intend to imply a subset relationship. Coroutines are not only used to implement cooperative multitasking, for sure.

pjmlp
1 replies
10h19m

Not in the Java, .NET, and C++ case, as it is mapped to tasks managed by threads, and you can even write your own scheduler if so inclined.

Someone
0 replies
9h43m

Also (AFAIK) not in JavaScript. An essential property of cooperative multitasking is that you can say “if you feel like it, pause me and run some other code for a while now” to the OS.

Async only allows you to say “run foo now until it has data” to the JavaScript runtime.

IMO, async/await in JavaScript is more like one-shot coroutines, not cooperative multitasking.

Having said that, the JavaScript event loop is doing cooperative multitasking (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Even...)

smartscience
0 replies
21h40m

I always knew my experience with RISC OS wouldn't go to waste!

superluserdo
12 replies
21h49m

His stance on compiler optimizations is another example: only add optimization passes if they improve the compiler's self-compilation time.

What an elegant metric! Condensing a multivariate optimisation between compiler execution speed and compiler codebase complexity into a single self-contained meta-metric is (aptly) pleasingly simple.

I'd be interested to know how the self-build times of other compilers have changed by release (obviously pretty safe to say, generally increasing).
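(Measuring the meta-metric is easy to sketch; here's a rough Python version, where the compiler binary and its source list are hypothetical stand-ins:)

    import subprocess, time

    def self_compile_time(compiler, compiler_sources):
        # time the compiler compiling its own sources
        start = time.perf_counter()
        subprocess.run([compiler] + compiler_sources, check=True)
        return time.perf_counter() - start

    # Wirth's acceptance test, roughly: keep a new optimization pass only
    # if the new compiler recompiles itself faster than the old one did,
    # i.e. the speedup it buys outweighs the work it adds.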

amelius
6 replies
21h34m

Hmm, but what if the compiler doesn't use the optimized constructs, e.g. floating point optimizations targeting numerical algorithms?

hoosieree
1 replies
18h43m

Simple fix: floating-point indexes to all your tries. Or switch to base π or increment every counter by e.

Someone
0 replies
18h0m

That’s not a simple fix in this context. Try making it without slowing down the compiler.

You could try to game the system, though, by combining a change that slows down compilation with one that compensates for it, but I think code reviewers of the time wouldn't accept that.

bunderbunder
1 replies
20h55m

Life was different in the '80s. Oberon targeted the NS32000, which didn't have a floating point unit. Let alone most of the other modern niceties that could lead to a large difference between CPU features used by the compiler itself, and CPU features used by other programs written using the compiler.

That said, even if the exact heuristic Wirth used is no longer tenable, there's still a lot of wisdom in the pragmatic way of thinking that inspired it.

whartung
0 replies
14h53m

Speaking of that, if you were ever curious how computers do floating point math, I think the first Oberon book explains it in a couple of pages. It’s very succinct and, for me, one of the clearest explanations I’ve found.

samus
0 replies
12h48m

Rewrite the compiler to use an LLM for compilation. I'm only half joking! The biggest remaining technical problem is the context length, which is severely limiting the input size right now. Also, the required humongous model size.

kragen
0 replies
21h33m

probably use a fortran compiler for that instead of oberon

thesz
3 replies
20h43m

You cannot add a loop skew optimization to the compiler before the compiler itself needs a loop skew optimization. And it never would, because the kind of code that needs loop skewing is matrix code, such as a loop skew optimization pass itself.

In short, the compiler is not an ideal representation of the user programs it needs to optimize.
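(For the curious, a minimal Python sketch of the transform in question, on a made-up 2-D recurrence: loop skewing walks the array along anti-diagonals, where every cell is independent of the others on the same diagonal, so the inner loop could be vectorized or parallelized.)

    n, m = 4, 5
    a = [[1] * m for _ in range(n)]

    # original: a[i][j] depends on a[i-1][j] and a[i][j-1],
    # so neither loop is safe to run in parallel as written
    for i in range(1, n):
        for j in range(1, m):
            a[i][j] = a[i-1][j] + a[i][j-1]

    b = [[1] * m for _ in range(n)]   # fresh array for the skewed version

    # skewed: iterate anti-diagonals w = i + j; cells on one diagonal
    # depend only on earlier diagonals, so the inner loop is parallel
    for w in range(2, n + m - 1):
        for i in range(max(1, w - m + 1), min(n, w)):
            j = w - i
            b[i][j] = b[i-1][j] + b[i][j-1]

    assert a == b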

Someone
1 replies
17h48m

Wirth ran an OS research lab. For that, the compiler likely is a fairly typical workload.

But yes, it wouldn’t work well in a general context. For example, auto-vectorization likely doesn’t speed up a compiler much at all, while adding the code to detect where it’s possible will slow it down.

So, that feature can never be added.

On the other hand, this may lead to better designs. If, instead, you add language features that make it easier for programmers to write vectorized code, they would have to write more code, but they would also have to guess less about whether their code would end up being vectorized.

kragen
0 replies
15h30m

perhaps you could write the compiler using the data structures used by co-dfns (which i still don't understand) so that vectorization would speed it up, auto- or otherwise

WJW
0 replies
20h32m

Perhaps Wirth would say that compilers are _close enough_ to user programs to be a decent enough representation in most cases. And of course he was sensible enough to also recognize that there are special cases, like matrix operations, where it might be wirthwhile.

EDIT: typo in the last word but I'm leaving it in for obvious reasons.

teleforce
0 replies
3h39m

His stance should be adopted by all language authors and designers, but apparently it's not. The older generation of programming language gurus like Wirth and Hoare were religiously focused on simplicity, hence they never took compilation time for granted, unlike the authors of most popular modern languages. C++, Scala, Julia and Rust are all behemoths in terms of complexity of language design and hence have very slow compilation times. Popular modern languages like Go and D are a breath of fresh air with their lightning-fast compilation, due to the inherent simplicity of their design. This is to be expected, since Go is just a modern version of Modula and Oberon, and D is designed by a former aircraft engineer, for whom simplicity is mandatory, not an option.

steveklabnik
2 replies
19h45m

Do you happen to remember where he said that? I've been looking for a citation and can't find one.

I think that some of the text in "16.1. General considerations" of "Compiler Construction" are sorta close, but does not say this explicitly.

steveklabnik
1 replies
17h19m
microtherion
0 replies
7h10m

The author cited, Michael Franz, was one of Wirth's PhD students, so what he relates is an oral communication from Wirth that may very well never have been put in writing. It does seem entirely consistent with his overall philosophy.

Wirth also had no compunction about changing the syntax of his languages if it made the compiler simpler. Modula-2 originally allowed undeclared forward references within the same file. When his implementation moved from the original multi-pass compilers (e.g. Logitech's compiler had 5 passes: http://www.edm2.com/index.php/Logitech_Modula-2) to a single-pass compiler http://sysecol2.ethz.ch/RAMSES/MacMETH.html he simply started requiring that forward references be declared (as they used to be in Pascal).

I suspect that Wirth not being particularly considerate of the installed base of his languages, and not being very cooperative about participating in standardization efforts (possibly due to burnout from his participation in the Algol 68 process), accounts for the ultimately limited commercial success of Modula-2 & Oberon, and possibly for the decline of Pascal.

spongebobism
0 replies
1h38m

That's fascinating. I'd imagine there are actually two equilibria/stable states possible under this rule: a small codebase with only the most effective optimization passes, or a large codebase that incorporates pretty much any optimization pass.

A marginally useful optimization pass would not pull its weight when added to the first code base, but could in the second code base because it would optimize the run time spent on all the other marginal optimizations.

Though the compiler would start out closer to the small equilibrium in its initial version, and there might not be a way to incrementally move towards the large equilibrium from there under Wirth's rule.

pjmlp
0 replies
10h21m

Note that Oberon descendants like Active Oberon and Zonnon do have preemptive multitasking.

kristianp
0 replies
15h18m

In the past, this policy of Wirth's has been cited when talking about go compiler development.

Go team member Robert Griesemer did his PhD under Mössenböck and Wirth.

gjvc
2 replies
21h56m

... his 80th birthday symposium at ETH where he showed off his new port of Oberon to a homebrew CPU running on a random FPGA dev board with USB peripherals.

This was a fantastic talk. https://www.youtube.com/watch?v=EXY78gPMvl0

lynguist
1 replies
20h49m

Thank you for sharing. I was there and didn’t expect to see this again. :)

He had the crowd laughing and cheering, and the audience questions in the end were absolutely excellent.

gjvc
0 replies
18h23m

Always glad to be of service.

I think I last watched it during the pandemic and was inspired to pick up reading more about Oberon. A demonstration / talk like that is so much better when the audience are rooting for the presenter to do well.

vram22
0 replies
21h43m

A Wirthwhile ambition. :)

Sorry, couldn't resist.

I first wrote it as "worthwhile", but then the pun practically fell out of the screen at me.

I love Wirth's work, and not just his languages. Also his stuff like algorithms + data = programs, and stepwise refinement. Like many others here, Pascal was one of my early languages, and I still love it, in the form of Delphi and Free Pascal.

RIP, guruji.

Edited to say guruji instead of guru, because the ji suffix is an honorific in Hindi, although guru is already respectful.

microtherion
0 replies
18h9m

According to his daughter (she runs a grocery store, and my wife occasionally talks to her), he kept on tinkering at home well past 80.

kragen
0 replies
22h27m

i hope you are! we miss you

kragen
75 replies
23h32m

begin

this is terrible news;

is there a better source than twitter (edit: https://lists.inf.ethz.ch/pipermail/oberon/2024/016856.html thanks to johndoe0815);

wirth was the greatest remaining apostle of simplicity, correctness, and software built for humans to understand; now only hoare and moore remain, and moore seems to have given the reins at greenarrays to a younger generation;

young people may not be aware of the practical, as opposed to academic, significance of his work, so let me point out that begin

the ide as we know it today was born as turbo pascal;

most early macintosh software was written in pascal, including for example macpaint;

robert griesemer, one of the three original designers of golang, was wirth's student and did his doctoral thesis on an extension of oberon, and wirth's languages were also a very conspicuous design inspiration for newsqueak;

tex is written in pascal;

end;

end.

rollcat
20 replies
21h13m

wirth was the greatest remaining apostle of simplicity, correctness, and software built for humans to understand;

And yet far from the last. Simple, correct, and beautiful software is still being made today. Most of it goes unnoticed, its quiet song drowned out by the cacophony of attention-seeking, complex, brittle behemoths that top the charts.

That song never faded, you just need to tune in.

kragen
19 replies
21h5m

who are today's new great minimalists?

rollcat
16 replies
20h8m

In no particular order: 100r.co, OpenBSD (& its many individual contributors such as tedu or JCS), Suckless/9front, sr.ht, Alpine, Gemini (&gopher) & all the people you can find there, Low Tech Magazine, antirez, Fabrice Bellard, Virgil Dupras (CollapseOS), & many other people, communities, and projects - sorry I don't have a single comprehensive list, that's just off the top of my head ;)

amatecha
9 replies
19h32m

uxn ftw <3

kragen
8 replies
19h30m

uxn is cool but it's definitely not the same kind of achievement as oberon, pascal, quicksort, forth, and structured programming; rek and devine would surely not claim it was

amatecha
7 replies
19h28m

to quote gp, "beautiful software is still being made today". It's not a competition.

kragen
6 replies
19h24m

you don't get to be described as an 'apostle of simplicity' just because you like simplicity. you have to actually change the world by creating simplicity. devine and rek are still a long way from a turing award

amatecha
4 replies
18h19m

you don't get to dictate who does or doesn't get recognized for creating awesome works that influence and inspire others. take your persistent negativity elsewhere.

btw, uxn is absolutely the exemplification of "software built for humans to understand" and simplicity. I mean...

the resulting programs are succinct and translate well to pen & paper computing.

to make any one program available on a new platform, the emulator is the only piece of code that will need to be modified, which is explicitly designed to be easily implemented

https://100r.co/site/uxn_design.html

how one can frame this as trivial is beyond me.

kragen
3 replies
16h34m

i don't think uxn is trivial, i think it's a first step toward something great. it definitely isn't the exemplification of "software built for humans to understand"; you have to program it in assembly language, and a stack-based assembly language at that. in that sense it's closer to brainfuck than to hypertalk or excel or oberon. it falls short of its goal of working well on small computers (say, under a megabyte of ram and under a mips)

the bit you quote about uxn having a standard virtual machine to permit easy ports to new platforms is from wirth's 01965 paper on euler http://pascal.hansotten.com/niklaus-wirth/euler-2/; it isn't something devine and rek invented, and it may not have been something wirth invented either. schorre's 01963 paper on meta-ii targets a machine-independent 'fictitious machine' but it's not turing-complete and it's not clear if he intended it to be implemented by interpretation rather than, say, assembler macros

i suggest that if you develop more tolerance for opinions that differ from your own, instead of deprecating them as 'persistent negativity' and 'dictating', you will learn more rapidly, because other people know things you don't, and sometimes that is why our opinions differ. sometimes those things we know that you don't are even correct

i think this is one of those cases. what i said, that you were disagreeing with, was that uxn was not the same kind of achievement as oberon, pascal, quicksort, forth, and structured programming (and, let me clarify, a much less significant achievement) and that it is a long way from [meriting] a turing award. i don't see how anyone could possibly disagree with that, or gloss it as 'uxn is trivial', as you did, unless they don't know what those things are

i am pretty sure that if you ask devine what he thinks about this comment, you will find that he agrees with every word in it

jibal
1 replies
12h38m

Yeah, these younguns have a lot to learn. :-) The notion that there's something innovative about using a small VM to port software is hilarious. BTW, here is a quite impressive and effective use of that methodology: https://ziglang.org/news/goodbye-cpp/

Dewey Schorre and Meta II, eh? Who remembers such things? Well, I do, as I was involved with an implementation of Meta V when I was on the staff of the UCLA Comp Sci dept in 1969.

kragen
0 replies
12h29m

nice! which direction did meta-v go meta in?

i recently reimplemented meta-ii myself with some tweaks, http://canonical.org/~kragen/sw/dev3/meta5ixrun.py

i guess i should write it up

amatecha
0 replies
10h57m

There wasn't a single place I asserted that uxn is specifically novel or unprecedented. In fact, Devine's own presentation[0] about uxn specifically cites Wirth and Oberon, among countless other inspirations and examples. I'm saying it's awesome, accessible, simple and open.

I don't need to "develop more tolerance for differing opinions" - I have no problem with them and am completely open to them, even from people who I feel are communicating in an unfriendly, patronizing or gatekeeping manner. rollcat shared some other people and projects and you took it upon yourself to shoot down as much as possible in that comment - for what purpose? No one said Drecker is "on Wirth's level" when it comes to programming. We don't need him to write FizzBuzz, let alone any other software. I'm sorry you don't recognize the value of a publication like Low-Tech Magazine, but the rest of us can, and your need to shoot down that recognition is why I called your messages persistently negative.

Further, when I give kudos to uxn and recognize it as a cool piece of software, there's absolutely no point in coming in and saying "yeah but it's no big deal compared to ____" , as if anyone was interested in some kind of software achievement pissing contest. The sanctity and reverence for your software idols is not diluted nor detracted from by acknowledging, recognizing and celebrating newer contributors to the world of computing and software.

I have to come back and edit this and just reiterate: All I originally said was "uxn ftw" and you found it necessary to "put me in my place" about something I didn't even say/assert, and make it into some kind of competition or gatekeeping situation. Let people enjoy things. And now, minimizing this thread and never looking at it again.

[0] https://100r.co/site/weathering_software_winter.html

Ringz
0 replies
10h11m

From your Wikipedia link about the meaning of the Word Apostle:

“The term [Apostle] is also used to refer to someone who is a strong supporter of something.[5][6]“

So I would call many people and myself (as someone who started studying Computer Science with Assembler and Modula-2) apostles of simplicity.

No need for techno-classism.

nextos
1 replies
18h14m

I'd also mention https://www.piumarta.com

kragen
0 replies
16h52m

ooh, yeah, now that's a possibility

kragen
1 replies
19h33m

i... really don't think kris de decker is on niklaus wirth's level. i don't think he can write so much as fizzbuzz

fabrice bellard is wirth-level, it's true. not sure about tedu and jcs, because i'm not familiar enough with their work. it's absurd to compare most of the others to wirth and hoare

you're comparing kindergarten finger paintings to da vinci

you said wirth was "far from the last" apostle of simplicity. definition of apostle: https://en.wikipedia.org/wiki/Apostle

rollcat
0 replies
6h4m

it's absurd to compare most of the others to wirth and hoare

You're the one trying to directly compare achievement, not me. If you're looking for top achievers, I'd have to name PHP or systemd, and THAT would be out of place ;)

I even said "in no particular order", because I don't think any two can be easily compared.

My main criterion for inclusion was the drive for simplifying technology, and publishing these efforts:

An apostle [...], in its literal sense, is an emissary. The word is [...] literally "one who is sent off" [...]. The purpose of such sending off is usually to convey a message, and thus "messenger" is a common alternative translation.

Every single project, person, or community I've named here has some form of web page, blog, RSS feed, papers/presentations, and/or source code, that serve to carry their messages.

Achievement can be measured, simplicity can only be appreciated.

johndoe0815
1 replies
17h0m

I would add Jochen Liedtke (unfortunately he passed away already more than 20 years ago) as inventor of the L4 microkernel.

Several research groups continued work on L4 after Liedtke's death (Hermann Härtig in Dresden, Gernot Heiser in Sydney, a bit of research at Frank Bellosa's group in Karlsruhe and more industrial research on L4 for embedded/RT systems by Robert Kaiser, later a professor in Wiesbaden), but I would still argue that Liedtke's original work was the most influential, though all the formal verification work in Sydney also had significant impact - but that was only enabled by the simplicity of the underlying microkernel concepts and implementations.

kragen
0 replies
16h52m

agreed, though i think l4 was more influential than eumel (which is free software now by the way) even though eumel preceded l4

ryukoposting
1 replies
15h16m

I use Suckless's terminal, st. It's great. To install it, I compile it from source. It takes a couple seconds. Take a look at their work.

kragen
0 replies
15h3m

do you think writing st is an achievement that merits a turing award

because i've also written a terminal emulator, and it compiles from source in 0.98 seconds https://gitlab.com/kragen/bubbleos/blob/master/yeso/admu-she... screenshot of running vi in it at https://gitlab.com/kragen/bubbleos/blob/master/yeso/admu_she...

(steps to reproduce: install dependencies; make; rm admu-shell admu-shell-fb admu-shell-wercam admu admu.o admu-shell.o admu_tv_typewriter.o; time make -j. it only takes 0.37 seconds if you only make -j admu-shell and don't build the other executables. measured on debian 12.1 on a ryzen 5 3500u)

i wrote pretty much all of it on october 12 and 13 of 02018 so forgive me if i don't think that writing a terminal emulator without scrollback is an achievement comparable to pascal and oberon etc.

not even if it were as great as st (and actually admu sucks more than st, but it can still run vi and screen)

bumbledraven
16 replies
19h0m

wirth was the greatest remaining apostle of simplicity, correctness, and software built for humans to understand; now only hoare and moore remain…

No. There is another.

https://en.m.wikipedia.org/wiki/Arthur_Whitney_%28computer_s...

vidarh
14 replies
18h7m

Some would dispute the "built for humans to understand". Whitney's work is brilliant, but it's far less accessible than Wirth's.

bumbledraven
13 replies
17h49m

The point of Whitney's array languages is to allow your solutions to be so small that they fit in your head. Key chunks should even fit on one screen. A few years ago, Whitney reportedly started building an OS using these ideas (https://aplwiki.com/wiki/KOS).

vidarh
12 replies
17h29m

I'm aware of the idea. I'm also aware that I can read and understand a whole lot of pages of code in the amount of time it takes me to decipher a few lines of K, for example, and the less dense code sticks far better in my head.

I appreciate brevity, but I feel there's a fundamental disconnect between people who want to carefully read code symbol by symbol, who often seem to love languages like J or K, or at least be better able to fully appreciate them, and people like me who want to skim code and look at the shape of it (literally; I remember code best by its layout and often navigate code by appearance without reading it at all, and so dense dumps of symbols are a nightmare to me).

I sometimes think it reflects a difference between people who prefer maths vs languages. I'm not suggesting one is better than the other, but I do believe the former is a smaller group than the latter.

For my part I want grammar that makes for a light, casual read, not something I have to decipher. I want to be able to get a rough understanding at a glance, and gradually fill in details, not read things start to finish (yes, I'm impatient).

A favourite example of mine is the infamous J interpreter fragment, where I'd frankly be inclined to prefer a disassembly over the source code. But I also find the ability to sketch out such compact code amazing.

I think Wirth's designs very much fit in the "languages that are skimmable and recognisable by shape and structure" category. I can remember parts of several of Wirth's students' PhD theses from the 1990s by the shape of the procedures in their Oberon code to this day.

That's not to diminish Whitney's work, and I find that disconnect in how we process code endlessly fascinating, and regularly look at languages in that family because there is absolutely a lot to learn from them, but they fit very different personalities and learning styles.

kragen
10 replies
15h37m

i keep hoping that one day i'll understand j or k well enough that it won't take me hours to decipher a few lines of it; but today i am less optimistic about this, because earlier tonight, i had a hard time figuring out what these array-oriented lines of code did in order to explain them to someone else

    from numpy import right_shift, array, arange

    textb = 'What hath the Flying Spaghetti Monster wrought?'
    bits = (right_shift.outer(array([ord(c) for c in textb]),
                              arange(8))).ravel() & 1
and i wrote them myself three months ago, with reasonably descriptive variable names, in a language i know well, with a library i've been using in some form for over 20 years, and their output was displayed immediately below, in https://nbviewer.org/url/canonical.org/~kragen/sw/dev3/rando...

i had every advantage you could conceivably have! but i still guessed wrong at first and had to correct myself after several seconds of examination

i suspect that in j or k this would be something like (,(@textb)*.$i.8)&1 though i don't know the actual symbols. perhaps that additional brevity would have helped. but i suspect that, if anything, it would have made it worse

by contrast, i suspect that i would have not had the same trouble with this

    bits = [(ord(c) >> i) & 1 for c in textb for i in range(8)]
however, as with rpn, i suspect that j or k syntax is superior for typing when you're immediately evaluating expressions rather than writing a program to maintain later, because the amount of finger typing is so much less. but maybe i just have a hard time with point-free style? or maybe, like you say, it's different types of people. or maybe i just haven't spent nearly enough time writing array code during those years

rak1507
6 replies
10h39m

The k solution is ,/+|(8#2)\textb

k doesn't have a right-shift operator, but you don't need one: you can use the base-encoding operator instead.

Personally I think this is clearer than both the array-ish python and the list comp.

https://ngn.codeberg.page/k/#eJwrSa0oSbJSCs9ILFEA4gyFkoxUBbe...

vidarh
3 replies
9h36m

I'm sitting here with the K reference manual, and I still can't decode this.

I'm wildly guessing that your approach somehow ends up doing something closer to this:

    textb.bytes.flat_map{_1.digits(2)}
Which I must admit it took me embarrassingly long to think of.

kragen
1 replies
8h18m

i can't decode it either but i think you're right. note that this probably gives the bits in big-endian order rather than little-endian order like my original, but for my purposes in that notebook either one would be fine as long as i'm consistent encoding and decoding

rak1507
0 replies
1h8m

It would do, if the bits weren't reversed, which is done with |

rak1507
0 replies
1h9m

If you go to the link and press 'help' you'll see some docs for ngn/k

The relevant line is

    I\ encode    24 60 60\3723 -> 1 2 3    2\13 -> 1 1 0 1

So (8#2)\x encodes x in binary, with 8 positions. And because k is an array language, you don't need to do any mapping, it's automatic.

,/+|x is concat(transpose(reverse(x))) (,/ literally being 'reduce with concat')

kragen
1 replies
10h2m

thank you very much! which k implementation does this page use?

oh, apparently a new one called ngn/k: https://codeberg.org/ngn/k

vidarh
0 replies
9h18m

As much as it's a struggle to read, you've got to love that the people implementing this are dedicated enough to write their C as if it was K.

vidarh
2 replies
11h9m

I think the K would likely be both simpler and harder than your first example: it reads very straightforwardly in a single direction, but the operators read like line noise. In your case, my Numpy is rusty, but I think this is the Ruby equivalent of what you were doing?

    textb = 'What hath the Flying Spaghetti Monster wrought?'

    p textb.bytes.product((0...8).to_a).map{_1>>_2}.map{_1 & 1}
Or with some abominable monkey patching:

    class Array
      def outer(r) = product(r.to_a)
      def right_shift = map{_1>>_2}
    end

    p textb.bytes.outer(0...8).right_shift.map{_1 & 1}
I think this latter is likely to be a closer match to what you'd expect in an array language, in terms of being able to read in a single direction and having a richer set of operations. We could take it one step further and break the built-in Array#&:

    class Array
      def &(r) = map{_1 & r}
    end

    p textb.bytes.outer(0...8).right_shift & 1
Which is to say that I don't think the operator-style line-noise nature of K is what gives it its power. Rather that it has a standard library that is fashioned around this specific set of array operations. With Ruby at least, I think you can bend it towards the same Array nature-ish. E.g. a step up from the above that at least contains the operator overloading and instead coerces into a custom class:

    textb = 'What hath the Flying Spaghetti Monster wrought?'
    
    class Object
      def k = KArray[*self.to_a]
    end
    
    class String
      def k = bytes.k
    end

    class KArray < Array
      def outer(r) = product(r.to_a).k
      def right_shift = map{_1>>_2}.k
      def &(r) = map{_1 & r}.k
    end

    p textb.k.outer(0...8).right_shift & 1
With some care, I think you could probably replicate a fair amount of K's "verbs" and "adverbs" (I so hate their naming) in a way that'd still be very concise but not line-noise concise.

kragen
1 replies
10h45m

that all seems correct; the issue i had was not that python is less flexible than ruby (though it is!) but that it required a lot of mental effort to map back from the set of point-free array operations to my original intent. this makes me think that my trouble with j and k is not the syntax at all. but conceivably if i study the apl idiom list or something i could get better at that kind of thinking?

vidarh
0 replies
10h4m

I think you could twist Python into getting something similarly concise one way or other ;) It might not be the Python way, though. I agree it often is painful to map. I think in particular the issue for me is visualizing the effects once you're working with a multi-dimensional set of arrays. E.g. I know what outer/product does logically, but I have to think through the effects in a way I don't need to do with a straightforward linear map(). I think I'd have been more likely to have ended up with something like this if I started from scratch even if it's not as elegant.

    p textb.bytes.map{|b| (0...8).map{|i| (b>>i) & 1} }.flatten
EDIT: This is kind of embarrassing, but we can of course do just this:

    textb.bytes.flat_map{_1.digits(2)}
But I think the general discussion still applies, and it's quite interesting how many twists and turns it took to arrive at that

chongli
0 replies
11h1m

I sometimes think it reflects a difference between people who prefer maths vs languages. I'm not suggesting one is better than the other, but I do believe the former is a smaller group than the latter.

This dichotomy exists in mathematics as well. Some mathematicians prefer to flood the page with symbols. Others prefer to use English words as much as possible and sprinkle equations here and there (on their own line) between paragraphs of text.

The worst are those that love symbols and paragraphs, writing these dense walls of symbols and text intermixed. I’ve had a few professors who write like that and it’s such a chore to parse through.

kragen
0 replies
15h52m

well, simplicity anyway, arguably (like moore) to an even higher degree than wirth

wozer
10 replies
23h29m

No better source yet, I think.

But it is the real account of Bertrand Meyer, creator of the Eiffel language.

kragen
7 replies
23h28m

yeah, and i hope meyer would know

but still, it's twitter, liable to vanish or block non-logged-in access at any moment

PaulHoule
6 replies
23h11m

Since Twitter is suppressing the visibility of tweets that link outside their site I think it would be perfectly fair to block links to twitter, rewrite them to nitter, etc. There also ought to be gentle pressure on people who post to Twitter to move to some other site. I mean, even I've got a Bluesky invite now.

kragen
4 replies
22h1m

bluesky seems like the site for people who think that the problem with twitter was that the wrong billionaire gets to decide which ideas to suppress

(admittedly you could make the same criticism of hn; it certainly isn't decentralized and resilient against administrative censorship like usenet was)

PaulHoule
3 replies
21h39m

Well I didn't mean to just endorse Bluesky but call it out as one of many alternatives.

I'm actually active on Mastodon but I am thinking about getting on Instagram as well because the content I post that does the best on Mastodon would fit in there.

zilti
1 replies
16h38m

Do you know about Pixelfed?

PaulHoule
0 replies
6h4m

Yep. I interact w/ Pixelfed users on Mastodon all the time.

kragen
0 replies
21h31m

it won't surprise you to learn that i like mastodon but haven't used it in months

rileyphone
0 replies
22h19m

Please don't make dang do more work.

https://news.ycombinator.com/item?id=38847048

johndoe0815
1 replies
22h12m

Niklaus Wirth's death was also announced (by Andreas Pirklbauer) an hour ago on the Oberon mailing list:

https://lists.inf.ethz.ch/pipermail/oberon/2024/016856.html

kragen
0 replies
22h5m

thank you

dang, maybe we can change the url to this instead? this url has been stable for at least 14 years (http://web.archive.org/web/20070720035132/https://lists.inf....) and has a good chance of remaining stable for another 14, while the twitter url is likely to disappear this year or show different results to different people

Qem
9 replies
22h43m

wirth was the greatest remaining apostle of simplicity, correctness, and software built for humans to understand; now only hoare and moore remain

Also Alan Kay still with us.

kragen
8 replies
22h21m

in the neat/scruffy divide, which goes beyond ai, wirth was the ultimate neat, and kay is almost the ultimate scruffy, though wall outdoes him

alan kay is equally great, but on some axes he is the opposite extreme from wirth: an apostle of flexibility, tolerance for error, and trying things to see what works instead of planning everything out perfectly. as sicp says

Pascal is for building pyramids—imposing, breathtaking, static structures built by armies pushing heavy blocks into place. Lisp is for building organisms—imposing, breathtaking, dynamic structures built by squads fitting fluctuating myriads of simpler organisms into place.

kay is an ardent admirer of lisp, and smalltalk is even more of an organism language than lisp is

nerpderp82
7 replies
22h13m

I had wanted to interview Val Schorre [1], and looked him up on a business trip because I was close. Died 2017, seems like a righteous dude.

https://www.legacy.com/us/obituaries/venturacountystar/name/...

[1] https://en.wikipedia.org/wiki/META_II

kragen
6 replies
22h3m

yeah, i wish i had had the pleasure of meeting him. i reimplemented meta-ii 3½ years ago and would recommend it to anyone who is interested in the parsing problem. it's the most powerful non-turing-complete 'programming language' i've ever used

http://www.canonical.org/~kragen/sw/dev3/meta5ixrun.py

(i mean i would recommend reimplementing it, not using my reimplementation; it takes a few hours or days)

after i wrote it, the acm made all old papers, including schorre's meta-ii paper, available gratis; they have announced that they also plan to make them open-access, but so far have not. still, this is a boon if you want to do this. the paper is quite readable and is at https://dl.acm.org/doi/10.1145/800257.808896

nerpderp82
5 replies
21h10m

It is a good paper, and I give much respect to the ACM for opening up their paywall of old papers. They even stopped rate limiting downloads. I'd like to think my incessant whining about this had some effect. :) It is such a wonderful thing for curious people everywhere to be able to read these papers.

I haven't reimplemented meta-ii yet, but I will.

You might like https://old.reddit.com/r/rust/comments/18wnqqt/piccolo_stack...

And https://www.youtube.com/@T2TileProject/videos

kragen
4 replies
21h3m

thanks! i like lua a lot despite its flaws; that's what i wrote my own literate programming system in. lua's 'bytecode' is actually a wordcode (a compilation approach which i think wirth's euler paper was perhaps the first published example of) and quite similar in some ways to wirth's risc-1/2/3/4/5 hardware architecture family

i hope they do go to open access; these papers are too valuable to be lost to future acm management or bankruptcy

protomolecule
1 replies
18h48m

What are the lua's flaws in your opinion? Sincere question.

kragen
0 replies
15h52m

there are a lot of design decisions that are pretty debatable, but the ones that seem clearly wrong to me are:

- indexing from 1 instead of 0;

- the absence of a nilness/nonexistence distinction (so misspelling a variable or .property silently gives the wrong answer instead of an exception);

- variables being global by default, rather than local by default or requiring an explicit declaration;

- printing tables by default (with %q for example) as identities rather than contents. (you could argue that this is a simplicity thing; lua 5.2 is under 15000 lines of code, which is pretty small for such a full-featured language, barely bigger than original-awk at 6200 lines of c and yacc plus 1500 lines of awk, and smaller than mawk at 16000 lines of c, 1100 lines of yacc, and 700 lines of awk. but a recursive table print function with a depth limit is about 25 lines of code.)

none of these are fatal flaws, but with the benefit of experience they all seem like clear mistakes

jacquesm
1 replies
9h19m

Are you familiar with the 'Leo' editor? It is the one that comes closest to what I consider to be a practically useful literate programming environment. If you haven't looked at it yet I'd love it if you could give it a spin and let me know what you make of it.

https://leo-editor.github.io/leo-editor/

kragen
0 replies
1h41m

i read a little about it many years ago but have never tried it. right now, for all its flaws, jupyter is the closest approximation to the literate-programming ideal i've found

usr1106
6 replies
16h53m

tex is written in pascal;

Just thought about that when Donald Knuth's Christmas lecture https://www.youtube.com/live/622iPkJfYrI led me to one of his first TeX lectures https://youtu.be/jbrMBOF61e0 : If I install TeX on my Linux machine now, is that still compiled from the original Pascal source? Is there even a maintained Pascal compiler anymore? Well, GCC (as in GNU Compiler Collection) probably has a frontend, but that still does not answer the question about maintenance.

These were just thoughts. Of course researching the answers would not be overly complicated.

anon7725
4 replies
16h31m

Is there even a maintained Pascal compiler anymore?

Of course

https://www.freepascal.org/

usr1106
3 replies
16h29m

Nice. Although the "latest news" is from 2021.

renewedrebecca
0 replies
14h26m

It's not like Pascal changes a whole lot.

badsectoracula
0 replies
8h4m

Free Pascal does very infrequent releases, though the compiler is under active development and keeps gaining new features, both in the compiler (e.g. a wasm backend) and in the language itself. There are always multiple daily commits in the git log from several developers.

anta40
0 replies
6h59m

The discussions in Gitlab are still active: https://gitlab.com/freepascal.org/fpc/source/-/issues

Of course, you can always build the latest version from Git.

svat
0 replies
16h7m

If I install TeX on my Linux machine now, is that still compiled from the original Pascal source?

If you install TeX via the usual ways (TeX Live and MiKTeX are the most common), then the build step runs a program (like web2c) to convert the Pascal source (with changes) to C, then uses a C compiler. (So the Pascal source is still used, but the Pascal "compiler" is a specialized Pascal-to-C translator.) But there is also TeX-FPC (https://ctan.org/pkg/tex-fpc), a small set of change (patch) files to make TeX compilable with the Free Pascal compiler (https://gitlab.com/freepascal.org/fpc/).

For more details see https://tex.stackexchange.com/questions/111332/how-to-compil...

nerpderp82
3 replies
22h15m

As far as I know, Henry Baker is still with us. I had a dream where I interviewed Wirth for like 20 hrs so we could clone him with an LLM. We need to grab as many video interviews from these folks as possible.

kragen
2 replies
22h8m

henry baker has made many great contributions, but last time i talked to him, he was waiting for somebody to start paying him again in order to do any more research

but i'm sure he'd agree his achievements are not in the same league as wirth's

nerpderp82
1 replies
21h17m

I wasn't trying to compare them in any way other than that Henry Baker is still with us.

kragen
0 replies
20h53m

cool, sorry if i came across as confrontational

vidarh
1 replies
18h10m

I've been a massive fan of the PhD dissertation of Wirth's student Michael Franz since I first read it in '94. He's now a professor at UC Irvine, where he supervised Andreas Gal's dissertation work on trace trees (what eventually became TraceMonkey)

kragen
0 replies
16h33m

thank you, i definitely should have mentioned franz's work, even though i didn't know he was gal's advisor

perhaps more significant than tracemonkey was luajit, which achieves much higher performance with the tracing technique

yawaramin
0 replies
22h34m

Martin Odersky, creator of the Scala language and Wirth's student, also seems to believe it: https://twitter.com/odersky/status/1742618391553171866

tahnyall
0 replies
20h52m

It's OK on Hacker News to dis a reputable news source now?

rramadass
0 replies
14h30m

wirth was the greatest remaining apostle of simplicity, correctness, and software built for humans to understand

Absolutely!

And equally important was his ability to convey/teach CS precisely, concisely and directly in his books/papers. None of them have any fluff or unnecessary obfuscation in them. These are the models to follow and the ideals to aspire to.

As an example see his book Systematic Programming: An Introduction.

tasty_freeze
33 replies
23h11m

Besides his contribution to language design, he authored one of the best puns ever. His last name is properly pronounced something like "Virt" but in the US everyone calls him by "Worth".

That led him to quip, "In Europe I'm called by name, but in the US I'm called by value."

vram22
13 replies
21h12m

I saw it stated somewhere, maybe on his Wikipedia or Wikiquote pages, that it's pronounced "Veert".

nwellnhof
11 replies
20h27m

It's pronounced with an ɪ like in "wit".

vram22
5 replies
18h1m

Okay, either I or the reference may have been wrong, but I distinctly remember "Veert", because it was non-intuitive to me as the way to pronounce Wirth (as a guy who didn't know any German at the time, only English). So the reference, probably.

denton-scratch
3 replies
7h20m

"Veert" is correct, in the sense that it's how a German would pronounce it. Of course, the great man wasn't German; I don't know how he pronounced his own surname.

"Wit" is just wrong. Perhaps that was a joke that I missed about the man's humour.

G3rn0ti
2 replies
3h22m

Of course, the great man wasn't German.

He was Swiss, more exactly from the city of Winterthur located in the canton (state) of Zürich. The canton's official language is German, however. Of course, people over there speak in a strong local dialect called "Züritüütsch".

denton-scratch
1 replies
3h6m

Thank you for the correction. I assumed he spoke a variant of American English. So how did he pronounce his own name? I've never heard his voice.

a strong local dialect called "Züritüütsch".

Damn, I've never seen a word with three u-umlauts in it. How the hell do you pronounce two consecutive u-umlauts? "eu-eu"?

G3rn0ti
0 replies
1h53m

You just elongate the vowel, i.e. pronounce it longer. The double „ü“ just indicates that this vowel is stressed. Dialects do not follow a strict orthography, however, so you might find it written slightly differently in other contexts.

Wirth lived in the United States for some time throughout his life but was a Zürich native. He must have spoken Züritüütsch („Zurich German“) privately, I am pretty sure (without having known him personally).

housecarpenter
0 replies
7h34m

It's not really wrong. There are English accents (such as Received Pronunciation) where an "ee" before an "r" is normally pronounced with an [ɪ] like in "wit". In any case, even if you pronounce the "ee" as something else like [i], "Veert" is probably still the sequence of letters that maximises the likelihood that an English speaker will understand by it something close to the true German pronunciation ([vɪʁt] or [vɪɐt]). "Virt", for example, would be read by most people as [vɜrt] (rhyming with "hurt") which to my ear is further off from the correct pronunciation compared to something like [viət].

dotancohen
4 replies
7h59m

I think that the "i" sound like in "wit" does not exist in German. The Germans pronounce "i" like English speakers pronounce "ee".

jstimpfle
1 replies
6h57m

We have both, and I'd tend to pronounce "Wirth" similar to "wit" as far as the "i" goes. It's not always clear just from looking at the letter. But some words have explicit cues: There are "stretching consonants" like a-a, a-h, e-e, e-h, i-e, i-h, etc: Aal, Kahn, dehnen, dienen, sühnen, etc. And sometimes the following consonant gets doubled up to indicate a shorter pronunciation, like in "Bitte".

dotancohen
0 replies
3h39m

Thank you.

housecarpenter
1 replies
7h29m

The "i" sound in "wit" does exist in German and is what is normally indicated by "i" on its own. The long "ee" sound is normally spelt as "ie" in German.

dotancohen
0 replies
3h38m

Thank you.

Thorrez
0 replies
3h16m

The linked tweet says "Whereas Europeans generally pronounce my name the right way ('Ni-klows Wirt'), Americans invariably mangle it into 'Nick-les Worth'. This is to say that Europeans call me by name, but Americans call me by value."

phooda
6 replies
21h33m

He was the go to guy for witty comments.

fsckboy
4 replies
20h40m

Dijkstra would consider that one harmful. In America we consider Wirth the to go guy for witty takeaways.

vram22
0 replies
17h50m

worthy takeaways

vram22
0 replies
17h57m

Yogi Berra would love you.

vram22
0 replies
17h51m

Yes, Macs do have some worth, though they are harmful.

We should take them away.

tomcam
0 replies
17h4m

Sweet! Annoyed I didn’t think of that one

vram22
0 replies
17h51m

Goto the top of the class.

rabbits77
4 replies
20h54m

The joke really only works if you use his first name! The complete joke is that "by value" means pronouncing first and last name to sound like "Nickles Worth".

ggambetta
1 replies
8h49m

The way I understand it, it doesn't -- by name (aka by reference) vs by value, as in function arguments.
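
For what it's worth, here is how the two parameter-passing modes look in Pascal itself; a minimal sketch (Algol 60's call by name was a subtler mechanism, re-evaluating the argument expression at each use, but the value/reference contrast carries the same flavor of the pun):

    program CallDemo;
    var n: integer;

    { by value: the procedure works on a copy; the caller's n is untouched }
    procedure ByValue(x: integer);
    begin
      x := x + 1
    end;

    { by reference ("var"): the procedure works on the caller's variable }
    procedure ByReference(var x: integer);
    begin
      x := x + 1
    end;

    begin
      n := 0;
      ByValue(n);      { n is still 0 }
      ByReference(n);  { n is now 1 }
      writeln(n)
    end.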

ak_111
0 replies
8h14m

Way better interpretation that I missed!

coldtea
0 replies
9h30m

I always (and still) understood it to just need the surname.

alanbernstein
0 replies
20h11m

"worth" alone still means "value" though

blast
2 replies
22h29m

The joke goes back to Adriaan van Wijngaarden introducing Wirth at a conference in the 1960s. I'd love to see a video of the audience reaction to that one.

https://en.wikiquote.org/wiki/Niklaus_Wirth

https://lists.racket-lang.org/users/archive/2014-July/063519...

moritzwarhier
0 replies
20h23m

Today I learned (from the Wikiquote page) what an obviously witty and sociable person he seems to have been!

Finally a short story for the record. In 1968, the Communications of the ACM published a text of mine under the title "The goto statement considered harmful", which in later years would be most frequently referenced, regrettably, however, often by authors who had seen no more of it than its title, which became a cornerstone of my fame by becoming a template: we would see all sorts of articles under the title "X considered harmful" for almost any X, including one titled "Dijkstra considered harmful". But what had happened? I had submitted a paper under the title "A case against the goto statement", which, in order to speed up its publication, the editor had changed into a "letter to the Editor", and in the process he had given it a new title of his own invention! The editor was Niklaus Wirth.

It is refreshing to see the old-fashioned trope of the genius computer scientist / software engineer as a "foreigner to the world" being contested again and again by stories like this.

Of course people like Niklaus Wirth are exceptional in many ways, so it might be that the trope has/had some grain of truth that just does not correlate with the success of said person :)

And of course people might want to argue about the differences between SE, CS and economics.

After all that rambling... RIP and thank you Niklaus!

lallysingh
0 replies
16h46m

ACM Historian JAN Lee said he was the origin of that joke, if I recall that conversation correctly.

wiz21c
0 replies
22h53m

Best CS pun ever. Thanks for sharing!!!

vram22
0 replies
18h6m

Said the same here:

https://news.ycombinator.com/item?id=15361069

at the end of the comment.

the_arun
0 replies
22h42m

Looks like he had a good sense of humor too.

simonebrunozzi
0 replies
2h47m

This is brilliant.

It reminds me of my only meeting with Andy Tanenbaum / AAT [0], one of the smartest, nicest computer science guys I've ever met in my life. I can't recall the many puns and jokes he shared, but it was just incredible.

[0]: https://en.wikipedia.org/wiki/Andrew_S._Tanenbaum

khazhoux
26 replies
23h22m

Wirth was the chief designer of the programming languages Euler (1965), PL360 (1966), ALGOL W (1966), Pascal (1970), Modula (1975), Modula-2 (1978), Oberon (1987), Oberon-2 (1991), and Oberon-07 (2007). He was also a major part of the design and implementation team for the operating systems Medos-2 (1983, for the Lilith workstation), and Oberon (1987, for the Ceres workstation), and for the Lola (1995) digital hardware design and simulation system. In 1984, he received the Association for Computing Machinery (ACM) Turing Award for the development of these languages.

pjmlp
13 replies
23h2m

Also collaborated with Apple on the initial design of Object Pascal, and with his students on Component Pascal, Active Oberon, Zonnon, and many other research projects derived from Oberon.

bsimpson
12 replies
22h3m

For those who don't know, Pascal was what a lot of the classic Mac software was written in, before Objective-C and Swift. It grew into Delphi, which was a popular low-code option on Windows.

musicale
4 replies
17h26m

It's a shame that Pascal was largely abandoned (except for Delphi, which lived on for a while); I believe several Pascal compilers supported array bounds checking, and strings with a length field. In the 1980s this may have been considered overly costly (and perhaps it is considered so today as well), but the alternative that the computing field and industry picked was C, where unbounded arrays and strings were a common source of buffer overflow errors. Cleaning this up has taken decades and we still probably aren't done.

Better C/C++ compilers and libraries can help, but the original C language and standard library were certainly part of the issue. Java and JavaScript (etc.) may have their issues but at least chasing down pointer errors usually isn't one of them.
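
To make that concrete, here is a minimal sketch of what Pascal-style checking looked like, using the {$R+} range-checking directive supported by Turbo Pascal and Free Pascal (the directive is the only compiler-specific assumption here):

    program BoundsDemo;
    {$R+}                          { enable runtime range checking }
    var
      a: array[1..10] of integer;
      s: string[10];               { length-prefixed string, at most 10 chars }
      i: integer;
    begin
      for i := 1 to 11 do
        a[i] := 0                  { aborts with a range-check error at i = 11, }
                                   { where the C equivalent silently scribbles  }
                                   { past the end of the array }
    end.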

pjmlp
1 replies
12h14m

Delphi still lives on, to the extent that there are enough people to sell conference tickets in Germany, and a new release came out last month.

Scea91
0 replies
8h56m

My father celebrated his 60th two weeks back and told me he had bought a license for the new Delphi and loves it; I was quite surprised by the development he described.

I considered telling him that he could get most of the things (he also buys various components) for free today, but then... he is about 5 years from retirement and won't relearn all his craft now.

Myself, I am not sure whether it's nostalgia, but I miss the experience of Delphi 7 that I started with 20 years back. In many ways, the simplicity of the VCL and the interface is still unbeaten.

dragonwriter
1 replies
17h24m

The industry picked C when Pascal was still widely supported, not as a result of it being abandoned.

pjmlp
0 replies
11h41m

A side effect of UNIX adoption: C was already in the box, whereas anything else would cost money, and no famous dialect (Object Pascal, VMS Pascal, Solaris Pascal, UCSD Pascal) was portable.

Unfortunately Pascal only mattered to legions of Mac and PC developers.

merelysounds
3 replies
21h51m

I wouldn’t describe Delphi as low code, it is an IDE. Wikipedia also describes it like this[1] and does not include it in its list of low code development platforms[2].

[1]: https://en.m.wikipedia.org/wiki/Delphi_(software)

[2]: https://en.m.wikipedia.org/wiki/List_of_low-code_development...

auselen
2 replies
21h3m

It was a RAD platform though. From following your links:

Low-code development platforms trace their roots back to fourth-generation programming language and the rapid application development tools of the 1990s and early 2000s.

Delphi was originally developed by Borland as a rapid application development tool for Windows as the successor of Turbo Pascal.
pjmlp
1 replies
21h1m

It still is, and got a new release last month.

auselen
0 replies
20h44m

I wouldn’t know, I was like a Borland fan…

sedatk
2 replies
20h22m

AFAIK, even Photoshop was originally written in Pascal.

zengid
0 replies
20h7m

It was, according to Sean Parent (Adobe employee) in an interview about Pascal (around 8:03): https://adspthepodcast.com/2023/12/29/Episode-162.html

cxr
0 replies
18h1m

The Photoshop 1.0.1 source code is available from the Computer History Museum <https://computerhistory.org/blog/adobe-photoshop-source-code...>

Comments: <https://news.ycombinator.com/item?id=17132058>

TremendousJudge
3 replies
23h1m

I learned Pascal and MODULA-2 in college, in my first two programming semesters. MODULA-2 was removed shortly afterwards but Pascal is still used in the introductory programming course. I'm very happy to have had these as the languages that introduced me to programming and Wirth occupies a very special place in my heart. His designs were truly ahead of their time.

scotty79
2 replies
22h7m

I had Pascal and some Modula as well (in a concurrent programming course).

I learned C++ later on my own, as a Pascal with bizarre syntax. I always felt like the semantics of C++ were taken entirely from Pascal. No two languages ever felt closer to each other for me; like one was just a reskin of the other.

pjmlp
0 replies
20h57m

I already told this story multiple times: when I came to learn C, I already knew Turbo Pascal from 4.0 up to 6.0; luckily the same teacher that was teaching us about C also had access to Turbo C++ 1.0 for MS-DOS.

I adopted C++ right away as the sensible path beyond Turbo Pascal for cross-platform code, and never saw a use for C's primitive and insecure code, beyond being asked to use it in specific university projects and some jobs during the dotcom wave.

On Usenet C vs C++ flamewars, there might be still some replies from me on the C++ side.

TremendousJudge
0 replies
21h13m

I learned C that way (the algorithms class was in C), and even had a little printout table of the different syntaxes for the same constructs (here's how you write a for, an if, a record, declare a variable, etc). At the time I remember thinking that the C syntax was much uglier, and that opinion has stayed with me since -- when I learned Python everything just seemed so natural.

skrebbel
2 replies
22h24m

I still wonder what the tech world would’ve been like today if Wirth had had the marketing sense to call Modula “Pascal 2”

sedatk
1 replies
20h23m

…also if he hadn’t insisted on uppercase keywords.

pjmlp
0 replies
10h7m

That wasn't the issue many make it out to be, thanks to programmers' editors.

We already had autoformatting tools in the early 1990s; Go did not invent them.

kloch
2 replies
22h39m

Pascal was the second language I learned after Fortran. I didn't particularly like Fortran but Pascal really hit home and motivated me to learn C.

boznz
1 replies
20h17m

Love motivated me to learn Pascal; money motivated me to learn C.

hurtbird
0 replies
5h36m

Love motivated me to replace C.

musicale
0 replies
17h29m

I'm kind of a fan of Lola, an easy-to-learn HDL inspired by Pascal/Oberon, as opposed to Verilog (inspired by C) and VHDL (inspired by Ada).

I like Wirth's whole software stack: RISC-5 (not to be confused with RISC-V) implemented in Lola, Oberon the language, and Oberon the environment. IIRC Lola can generate Verilog - I think the idea was that students could start with an FPGA board and create their own CPU, compiler, and OS.

I also like his various quips - I think he said something like "I am a professor who is a programmer, and a programmer who is a professor." We need more programmer/professors like that. Definitely an inspiration for systems people everywhere.

de6u99er
0 replies
22h7m

Pascal was my second language after Basic. I was always interested in learning Modula, but picked up Delphi instead.

kleiba
13 replies
23h21m

@HN: Black banner, please?

erikpukinskis
7 replies
22h38m

Not trying to be crude, but is someone passing away after a long, rich life of 89 years something to mourn? Isn’t that kind of the best case scenario?

For me something like a black banner signifies a tragedy, not merely a death. A bunch of children being shot, a war, a disease ravaging a country, etc.

I’m curious to learn others’ perspectives however.

vidarh
0 replies
22h27m

We can mark it without particularly mourning it. HN often puts up black banners specifically for people who meant something to the HN community. E.g:

https://bear.willmeyers.net/whos-received-a-black-bar/

masklinn
0 replies
22h20m

Not trying to be crude, but is someone passing away after a long, rich life of 89 years something to mourn? Isn’t that kind of the best case scenario?

It can be "kind of a best case scenario" and yet you still mourn the loss. Mourning doesn't require a tragedy.

My grandmother died in her sleep at 94, pretty healthy all things considered (still had a good head, could putter along, and was in her own home of more than 60 years), after having had a great day. Pretty much the best death she and we could have hoped for. I still wouldn't have minded having her nearby for a few more years.

madmountaingoat
0 replies
22h19m

I think mourning is about more than just tragedy. It's recognition of loss. And the tradition of black things around death has seemed more a sign of respect than an indication of some tragic underpinnings. But I actually don't know the history of the tradition, so I am happy to be corrected.

froh
0 replies
22h29m

for computer science, Niklaus Wirth was not simply "someone"

and a black ribbon signifies a great loss, not a tragedy.

fasterik
0 replies
22h25m

A lot of tragedies happen in the world, but you're not going to see a black bar on HN for every one of them. It's not so much about the magnitude of the loss, but the contribution that person made to the history of computing.

bcantrill
0 replies
22h26m

You know, I had a comment earlier about the importance of death rites being broadly lost on the young (and without meaning to sound pejorative, I have to believe that you are relatively young). I had thought to myself that I was perhaps being unfair -- surely even a child understands the importance of a funeral? -- but your comment shows that I wasn't wrong.

So as it apparently does need to be said: we're humans -- we mourn our dead. That is, the black bar denotes death, not tragedy; when we mourn those like Wirth who lived a full life, we can at once take solace in the fullness of a life lived and mourn that life is finite. The death rite allows us to reflect on the finiteness of our own lives, the impact that Wirth had on us, and the impact that we have on others. You are presumably too young to have felt this personal impact, but I assure you that many are brought back to their own earliest exposure to computing -- which for many of us was Pascal.

Again, RIP Nik Wirth; thank you for giving so many of us so much.

Jtsummers
0 replies
22h24m

While very out of fashion these days, a black armband used to be a signal of mourning someone's death, whether the death was a "tragedy" (likely meaning unexpected, particularly violent, particularly early, or something similar) or not. The black bar is a digital imitation of that.

Niklaus Wirth contributed quite a bit to our field, and, directly or indirectly, impacted many of the people who frequent this (programming technology oriented) site.

https://en.wikipedia.org/wiki/Black_armband

adamnemecek
2 replies
23h4m

If there was ever a reason for one, this is it.

wiz21c
0 replies
22h55m

Double on this.

bcantrill
0 replies
22h49m

Perhaps perversely (or maybe it's just a reflection of my own middle age?), the HN black bar is one of my favorite aspects of HN. Death rites are essential, but their significance is often lost on the young who (naturally) pervade tech; what HN has developed in the black bar is really perfect.

Anyway: I trust we're just seeing natural human latency here, but this clearly merits the HN black bar. RIP, Nik Wirth -- truly one of the giants, and someone whose work had a tremendous personal influence for so many of us!

nvrmnd
0 replies
22h38m

Some people say that it doesn't matter if someone dies at age 89 -- after they have lived a full life and contributed all they had to give -- it's still just as sad and shocking.

Personally, I don't agree, to me it's just not as sad or shocking. People don't live forever and Wirth's life was as successful and complete as possible. It's not a "black day" where society truly lost someone before they fulfilled their potential.

Copenjin
0 replies
22h49m

YES.

bhaak
9 replies
22h14m

I'm a former student of his. He was one of the people who turned me from a teenager who hacked on his keyboard until something ran into a seasoned programmer who thinks before he codes.

Even before I met him at the university I was programming in Oberon because there was a big crowd of programmers doing Wirth languages on the Amiga.

He will be missed.

sterlind
6 replies
21h0m

Which languages? I've just restored an Amiga 500 with Workbench 2.1 and I'd love to honor his memory.

mkesper
4 replies
20h39m

Modula-2 was available and got used on the Amiga. Silly teenager me found such high-level languages "cheating" at the time.

greggman7
3 replies
17h34m

Lords of the Rising Sun was written in Modula-2 on the Amiga

https://www.google.com/search?q=lords+of+the+rising+sun

My understanding (please correct me) is that Turbo Pascal on the PC was actually Modula-2?

pjmlp
0 replies
10h10m

No, Borland did have a Modula-2 compiler (which Martin Odersky of Scala fame actually worked on), but they decided to focus on Turbo Pascal and sold it off.

linsomniac
0 replies
14h9m

In the recent discussion here about Turbo Pascal, commenters said it was written in assembly. That seems to be supported by https://en.wikipedia.org/wiki/Turbo_Pascal

FabHK
0 replies
7h53m

There was a fun break-out clone on the Atari ST, Bolo, that was written in Modula-2.

https://en.wikipedia.org/wiki/Bolo_(Breakout_clone)

http://www.atarimania.com/game-atari-st-bolo_8775.html

bhaak
0 replies
19h32m

At least several Pascal, Modula-2, and Oberon-2 compilers.

My very first compiled programming language was Pascal. I got the free "PCQ Pascal" from the Fish disks as I wasn't able to get the C headers from Commodore which I would have needed for doing proper Amiga programming. Likewise later Oberon-A although I don't remember where I got that from.

There were also commercial Modula-2 and Oberon-2 compilers. I just found that the Modula-2 compiler was open sourced some years back. https://m2amiga.claudio.ch/

Aminet has directories for Oberon and Modula-2 related programs: https://aminet.net/dev/obero and https://aminet.net/dev/m2

pjmlp
0 replies
10h10m

Yes, the Amiga was one of the platforms where Modula-2 had quite a crowd, more so than on the PC, as we got spoiled with Turbo Pascal instead.

microtherion
0 replies
17h44m

I'm also a student of his, and later met him socially on a few occasions as a graduate student (in a different institute).

Undergraduate students were all in awe of him, but I got the impression that he did not particularly enjoy teaching them (Unlike other professors, however, he did not try to delegate that part of his responsibilities to his assistants). He seemed to have a good relationship with his graduate students.

In his class on compiler construction, he seemed more engaged (the students were already a bit more experienced, and he was iterating the Oberon design at the time). I remember an exchange we had at the oral exam — he asked me to solve the "dangling ELSE" problem in Pascal. I proposed resolving the ambiguity through a refinement of the language grammar. He admitted that this would probably work, but thought it excessively complex and wondered where I got that idea, since he definitely had not taught it, so I confessed that I had seen the idea in the "Dragon Book" (sort of the competition to his own textbook). Ultimately, I realized that he just wanted me to change the language to require an explicit END, as he had done in Modula-2 and Oberon.
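
For readers who haven't met the dangling ELSE, here is a rough sketch of the two answers (the variables are just placeholders):

    program DanglingElse;
    var a, b: boolean; x: integer;
    begin
      a := true; b := false;
      { Pascal's grammar as written permits two parses of the following;   }
      { the language resolves it by fiat: the ELSE binds to the nearest IF }
      if a then
        if b then
          x := 1
        else            { belongs to "if b", not "if a" }
          x := 2
      { Wirth's later fix, sketched here in Modula-2/Oberon syntax, makes }
      { the structure explicit, so no ambiguity can arise:                }
      {   IF a THEN                                                       }
      {     IF b THEN x := 1 ELSE x := 2 END                              }
      {   END                                                             }
    end.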

Socially, he was fun to talk to, had a great store of computer lore, of course. He was also much more tolerant of "heresies" in private than in public, where he came across as somewhat dogmatic. Once, the conversation turned to Perl, which I did not expect him to have anything good to say about. To my surprise, he thought that there was a valid niche for pattern matching / text processing languages (mentioning SNOBOL as an earlier language in this niche).

pjmlp
5 replies
23h5m

A sad day for the history of computing: the loss of a great language designer who influenced many of us toward better ways of approaching systems programming.

FpUser
2 replies
22h57m

It is sad, but the guy had a long and fulfilling life many can only dream about. I would raise a toast to that. Hopefully he is in a coding Valhalla.

082349872349872
1 replies
22h53m

Not just coding: he was also interested* in hardware and built whole machines.

(* might Carver Mead describe him as a metaphorical "tall, thin, person"?)

kragen
0 replies
22h26m

he very well might, dave. i miss talking to you

frognumber
1 replies
21h43m

I am not very sad. Death is part of life.

I'm much more sad when life sort of decays (Alzheimer's, dementia, or simply becoming slow/stupid/decrepit), ends early, or when life is simply wasted.

He was about to turn 90.

He led a long, impactful, fulfilling life.

That's a life to celebrate.

marttt
0 replies
11h45m

This is beautiful phrasing, very much how I think about life myself. Let's hope the last days of Mr Wirth were free from physical pain. Thinking of my grandpa who died all of a sudden, apparently without serious physical impairments or aches, at age 90, after a happy, well-lived, ethical life.

Heaven is one person happier now, for sure. And maybe some compilers over there need tinkering, too. Rest in peace, Mr Wirth.

eatonphil
5 replies
23h16m

I loved the book Compiler Construction. Wirth's emphasis on minimalism is a huge inspiration.

munificent
2 replies
23h2m

I haven't read that one yet, but "Algorithms + Data Structures = Programs" is just an absolutely beautiful gem of a book. It embodies his principles of simplicity and clarity. Even though it's outdated in many places, I adored reading it.

ufo
0 replies
21h11m

Speaking of which, that book is one of the very few sources I could find that talks about recursive descent error recovery and goes further than panic mode.

eatonphil
0 replies
22h59m

That's the other one I keep hearing about, must read it.

int_19h
1 replies
20h42m

There's also an interesting book "A model implementation of standard Pascal" - not by Wirth - that implements an ISO Standard Pascal compiler in Standard Pascal, written as a "literate program" with copious descriptions of every single aspect. Basically, the entire book is one large program in which non-code is interspersed as very long multiline comments.

I couldn't find it available for download anywhere, but Internet Archive has a version of it in the library that can be checked out for free: https://archive.org/details/modelimplementat0000wels/page/n9...

pdw
0 replies
18h43m

Here's my transcription of that book: https://2k38.be/misp/

compiler.pas can be compiled with a modern Pascal compiler, but the resulting compiler cannot compile itself. I don't know if that's caused by a transcription error, a bug in the modern compiler or a bug in the Model Implementation.

I would love it if somebody gets this working. I don't think I myself will continue with this project.

parshua
3 replies
22h42m

I started my first company based on Delphi, which itself was based on Turbo Pascal. Wirth was a great inspiration, and his passing is no small loss. May his work keep inspiring new programmers for generations to come.

One of his quotes: "Whereas Europeans generally pronounce my name the right way ('Ni-klows Wirt'), Americans invariably mangle it into 'Nick-les Worth'. This is to say that Europeans call me by name, but Americans call me by value."

blast
1 replies
22h21m

Wirth must have adopted the quote (how could he not), but it actually goes back to a clever line by someone introducing him at a conference.

https://news.ycombinator.com/item?id=38858993

dvaun
0 replies
22h19m

What conference was it? Edit: Nvm, saw your other comment[0].

[0]: https://news.ycombinator.com/item?id=38858993

bmo-at
0 replies
22h29m

He was indeed! I wrote my bachelor's thesis on bringing modularity to a language for monitoring real-time systems, and his work, especially on MODULA-2, was a huge source of inspiration.

madamelic
3 replies
22h13m

Am I understanding correctly that he was the sole maintainer of Algol at the age of 86? Or was it more supervisory / BDFL?

kragen
2 replies
21h21m

algol isn't a piece of software, so it doesn't have maintainers. i don't know if the algol committee ever officially disbanded but wirth had already resigned before algol 68 came out

madamelic
1 replies
20h53m

Oh sorry, I misremembered the list and meant Oberon but fair point. I had just noticed that the last stable release was in 2020.

If I had read the wording more closely: the language was _designed_ by Wirth, but that doesn't necessitate him being fingers-to-keyboard (or whatever the modality), despite it saying he was the developer.

kragen
0 replies
20h52m

oh, i guess you could say he was bdflish

Sunspark
3 replies
22h45m

RIP GOAT.

switchbak
2 replies
22h32m

The passing of a titan like he was deserves a little more respect than simply saying: "RIP GOAT".

hooverd
0 replies
22h29m

Rest in peace, greatest of all time.

bregma
0 replies
22h5m

So it goes.

self_awareness
2 replies
10h28m

What I'm saddened by is that this doesn't have HN's black bar at the top of the page.

erik_seaberg
1 replies
10h9m

The black bar was up yesterday. I’m guessing it’s midnight to midnight Pacific time.

SushiHippie
0 replies
5h13m
omoikane
2 replies
22h44m

Niklaus Wirth was also responsible for changing the title of Dijkstra's paper to "Goto Statement Considered Harmful".

https://en.wikipedia.org/wiki/Considered_harmful#cite_ref-6

tommsy64
1 replies
21h58m

Relevant excerpt of Dijkstra's own account (from EWD1308 [1]):

Finally a short story for the record. In 1968, the Communications of the ACM published a text of mine under the title "The goto statement considered harmful", which in later years would be most frequently referenced, regrettably, however, often by authors who had seen no more of it than its title, which became a cornerstone of my fame by becoming a template: we would see all sorts of articles under the title "X considered harmful" for almost any X, including one titled "Dijkstra considered harmful". But what had happened? I had submitted a paper under the title "A case against the goto statement", which, in order to speed up its publication, the editor had changed into a "letter to the Editor", and in the process he had given it a new title of his own invention! The editor was Niklaus Wirth.

[1] Transcription - https://www.cs.utexas.edu/%7EEWD/transcriptions/EWD13xx/EWD1... PDF - https://www.cs.utexas.edu/%7EEWD/ewd13xx/EWD1308.PDF

bb88
0 replies
19h47m

And it continues to this day!

marttt
2 replies
11h32m

After having read some of the comments on Pascal here -- fellow HNers, what's your view on Pascal as a teaching/introductory language in 2023, for children aged 10+? Mostly thinking of FreePascal, but TurboPascal in DOSBox/FreeDOS/SvarDOS is also a possibility.

I'd also be thankful for references to "timeless" Pascal books or online teaching materials that would be accessible to a 10+ year old kid who is fine with reading longer texts.

(My condolences are below, fwiw. His death is, interestingly, a moment of introspection for me, even if I'm just a hobbyist interested in small systems and lean languages.)

cryptos
0 replies
7h42m

I wouldn't teach Pascal any more. While the ecosystem around it is not quite dead, it is not really alive either, so everything feels a bit fallen out of time. At least to me it would be demotivating to learn the Latin of computer science.

I would rather pick Python or Kotlin.

augustk
0 replies
10h0m

Niklaus Wirth is most famous for Pascal, but the best language is his last, namely Oberon, which is both smaller and more capable than Pascal. If you are interested in a freestanding compiler (separate from the operating system Oberon), have a look at OBNC.

https://miasap.se/obnc/

sedatk
1 replies
19h7m

I just needed a feature of Pascal yesterday in one of my Rust libraries: ranged integers. I know, you can go about it in different ways, like structs with private fields and custom constructors, or just with a new generic type. But having the built-in ability to specify that some integer can only be between 15 and 25 is a fantastic language feature. That's true even with runtime bounds checking disabled, because the compiler will still complain about some subset of violations.

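In Pascal that looks roughly like this; a minimal sketch (with range checking enabled, e.g. via {$R+} in Turbo/Free Pascal, the out-of-range assignments that survive compilation are caught at runtime):

    program RangeDemo;
    {$R+}                    { enable runtime range checking }
    type
      TScore = 15..25;       { subrange type: only 15..25 are legal values }
    var
      s: TScore;
      n: integer;
    begin
      s := 20;               { fine }
      { s := 30; }           { constant out of range: rejected at compile time }
      n := 30;
      s := n;                { compiles, but fails the runtime range check }
    end.
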
What an innovator and a role model. I hope I can be as passionate about my work in my 80s as he was.

pjmlp
0 replies
10h4m

Not only did Pascal (TP, more precisely) teach me systems programming in a safer language, it was also my first foray into type-driven programming: learning to use the type system to rule out conditions that should never happen.

Ranged numerics and enumerations were part of that.

raphlinus
1 replies
21h59m

Prof Wirth was a major inspiration for me as a kid. I eagerly read his book on Pascal, at the time not appreciating how unusual it was for its elegance and simplicity. I also followed with interest his development of the Oberon language and Lilith workstation. When I was 13, he gave a talk not too far away, I think it might have been Johns Hopkins, and my dad took me to it. It was a wonderful experience, he was very kind and encouraging, as I think the linked photo[1] shows.

[1]: https://mastodon.online/@raph/111693863925852135

de6u99er
0 replies
21h42m

Great story. Thanks for sharing.

olvy0
1 replies
21h45m

Pascal was my first "real" language after Basic, learned it in the late 80s, wrote a couple of small apps for my dad in it.

Learned most of it from a wonderful book whose name I have forgotten; it had a wrench on its cover, I think?

Anyway, still rocking Pascal to this day, since I still maintain 3 moderately complex installers written with InnoSetup, which uses RemObjects Pascal as a scripting language.

4 years ago, a new guy on our team, fresh from school, who never even knew this language existed, picked up Pascal in a week, and started maintaining and developing our installers much further. He did grumble a bit about the syntax but otherwise did a splendid job. I thought that was a tribute to the simplicity of the language.

lioeters
0 replies
19h43m

Pascal was my first "real" language after Basic, learned it in the late 80s

Me too, word for word. I spent a few years in my pre-teens immersed in the Turbo Pascal IDE, which was a full-on educational tool of its own that explained everything about the language. I moved on to C after that, but I still get a nostalgic vibe from looking at Pascal syntax. It was a great foundational experience for me as a programmer.

formerly_proven
1 replies
23h28m

end;

boznz
0 replies
23h14m

end.

forinti
1 replies
21h42m

Pascal was my second language, after BASIC. I was about twelve, and pointers took me a little effort to understand. But the first hurdle was not having line numbers. It seemed weird.

In the end, it was definitely worth the effort, and I learnt good habits from it. I used it in college, and I suppose I kinda still do, because I do a lot of PL/SQL.

He was hugely important for generations of coders.

RIP.

zeroc8
0 replies
7h39m

Delphi/Object Pascal was my favorite programming environment for a long time. PL/SQL is Ada though.

fiddlerwoaroof
1 replies
21h41m

I learned to program in Delphi and, to this day, I haven’t ever found a tool for building GUIs as pleasant. So, I’m fond of Wirth-inspired languages.

saiya-jin
0 replies
21h19m

... and you never will, since OSes don't provide anything like that out of the box, and making a design ecosystem work like that would require a major effort for fat clients; all this in a time when dev tools are mostly free (apart from IntelliJ IDEA, but that approach has its own drawbacks) and focused on other technologies/platforms.

Qem
1 replies
22h39m

RIP Mr. Wirth. The first programming language I ever learnt was Pascal, that brings me fond memories. A big loss for the computer science community.

kragen
0 replies
21h24m

minor quibble: dr. wirth had a doctorate from berkeley

AlbertCory
1 replies
23h23m

I'm happy to say I got to meet him, thanks to Charles Simonyi. May his memory be a blessing.

pjmlp
0 replies
23h0m

I was at the session he did at CERN back in 2004, but sadly never managed to talk to him.

zengid
0 replies
20h14m

ADSP podcast (named after Wirth's book) just had an episode on the history of Pascal https://adspthepodcast.com/2023/12/29/Episode-162.html

wormius
0 replies
20h19m

RIP King. 2nd language I learned was Pascal (Turbo 5 then 6) in high school. Tried UCSD P-System from a family friend with corporate/educational connections on 5.25" but didn't have a manual, and this was before the internet. I could/should have tried to use the library to get books about it, but gave up.

Fond memories; I feel like the 90s kids were the last ones to really get to appreciate Pascal in a "native" (supportive, institutional) setting.

I also loved learning Oberon/Bluebottle (now A2 I guess), which I was so fascinated with. I think that and Plan 9's textual/object interface environments are super interesting and a path we could have taken (may converge to someday?)

vidarh
0 replies
22h23m

When I first got to play with Turbo Pascal (3.something?), I was more impressed by the concise expression of the language in the EBNF in the manual than by Turbo Pascal itself. It was what made me interested in parsers and compilers, and both Wirth's approach to them and the work his students undertook in his research group have been an interest of mine for well over 30 years since.
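
To give a flavor of what impressed me: a few lines of Wirth-style EBNF describe a whole expression sublanguage. This is a sketch of a Pascal-ish grammar, not the manual's actual text:

    expression = term { ( "+" | "-" ) term } .
    term       = factor { ( "*" | "/" ) factor } .
    factor     = number | identifier | "(" expression ")" .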

vaxman
0 replies
11h16m

He must have clicked on a video of Swift 5.9 ...those enums! https://youtu.be/pD2XZHnDKvo

I probably wouldn't have learned algorithms and data structures as well without Pascal, but I never learned it right until C eventually came along.

PS: We still have Dr. Donald Knuth with us :)

tibbydudeza
0 replies
22h49m

A giant of the programming language field. My first programming language was Pascal (Borland) before I got introduced to C.

throw_m239339
0 replies
22h18m

RIP, a great loss for computer science.

Some people here are recommending the great books he wrote; definitely worth reading.

throw7
0 replies
22h7m

RIP. After Basic (on a Commodore), I learned Pascal (Turbo Pascal) in high school of all places.

submeta
0 replies
22h13m

Pascal (Turbo Pascal on a PC) was my second programming language after Assembler and some C (I had a copy of the Aztec C compiler) on an Amiga when I was 17ish. Pascal taught me modular programming, breaking down large systems. I'd written my own matrix calc library and would program animations for my physics class. And I learned the basic concepts of OO. It was a joy to program in.

RIP Niklaus Wirth.

squarefoot
0 replies
22h26m

This is a huge loss in computer science. Everyone interested in computing, no matter if using other languages than Pascal or derivatives, should read his "Algorithms + Data Structures = Programs" book. R.I.P.

rmrf100
0 replies
16h30m

The first programming language I learned in college was Pascal. RIP and thank you, Niklaus!

revskill
0 replies
20h36m

Pascal and Delphi are great examples of simplicity that works.

progre
0 replies
3h47m

No black bar? I think he deserves it.

pengaru
0 replies
23h30m
notorandit
0 replies
20h21m

So long, and thanks for all of the data structures and algorithms!

nobleach
0 replies
23h27m

I owe a debt of gratitude to him and the Pascal programming language. Sincerest condolences to those he left behind.

nhatcher
0 replies
19h29m

Barely a week ago I was quoting from a paper of his here on HN, "A plea for lean software". It's been discussed here before:

https://news.ycombinator.com/item?id=27661559

RIP

nerpderp82
0 replies
22h19m

Damn.

neilv
0 replies
23h22m

I don't see obituaries yet. In the meantime: https://en.wikipedia.org/wiki/Niklaus_Wirth

msie
0 replies
23h30m

R.I.P.

mkovach
0 replies
20h22m

Now I have an excuse to write my next project in Pascal. :)

Bwahah! Bwahah!

mise_en_place
0 replies
19h53m

He was one of the most influential personalities in our field; most modern programming languages descend from his work and contain many of his ideas.

markus_zhang
0 replies
21h52m

Oh God, no!

I need to read his compiler book once I've completed my toy interpreter.

lordgroff
0 replies
23h2m

One of the titans of the era, Pascal greatly contributed to my love of programming and my eventual career. Rest in peace Dr. Wirth.

lispm
0 replies
20h37m

He is a true legend.

krylon
0 replies
9h26m

Rest in peace.

justinclift
0 replies
18h37m
jart
0 replies
18h33m

The greatest of all quiche eaters has just passed away. May he rest in peace. https://www.pbm.com/~lindahl/real.programmers.html But seriously, PASCAL was the first programming language I loved that's actually good. Turbo Pascal. Delphi. Those were the days. We got a better world thanks to the fact that Niklaus Wirth was part of it.

hyperswine
0 replies
17h23m

Will be missed.

hasmanean
0 replies
14h16m

Pascal pointer notation was so logical. Dereferencing a pointer was just a caret to the right of the variable.

A double pointer was just two carets. And so on.

There was a strict symmetry about the whole thing. C broke that, especially with struct pointers.
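
A small sketch (the @ address-of operator is a Turbo/Free Pascal extension; standard Pascal would obtain pointers via new):

    program PointerDemo;
    type
      PInt  = ^integer;    { pointer to integer }
      PPInt = ^PInt;       { pointer to pointer to integer }
    var
      n: integer;
      p: PInt;
      pp: PPInt;
    begin
      n := 42;
      p := @n;
      pp := @p;
      writeln(p^);         { one caret, one dereference: 42 }
      writeln(pp^^)        { two carets, two dereferences: 42 }
    end.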

groos
0 replies
20h18m

Wirth taught me how to write recursive descent parsers and design languages suited to them. RIP.
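
The charm of the technique is that the code mirrors the grammar, one procedure per rule. A minimal Pascal-style sketch for expressions; sym, GetSym, Error, and the token constants are assumed scanner helpers, not Wirth's actual code:

    { expression = term { ("+"|"-") term } .
      term       = factor { ("*"|"/") factor } .
      factor     = number | "(" expression ")" .  }

    procedure Expression; forward;

    procedure Factor;
    begin
      if sym = number then GetSym
      else if sym = lparen then
      begin
        GetSym; Expression;
        if sym = rparen then GetSym else Error('")" expected')
      end
      else Error('factor expected')
    end;

    procedure Term;
    begin
      Factor;
      while (sym = times) or (sym = slash) do
        begin GetSym; Factor end
    end;

    procedure Expression;
    begin
      Term;
      while (sym = plus) or (sym = minus) do
        begin GetSym; Term end
    end;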

googamooga
0 replies
22h14m

R.I.P. Niklaus Wirth. Your ideas, languages and designs were the inspiration for several generations of computer scientists and engineers. Your Lilith computer and Modula-2 language inspired a small group of students in Western Siberia's Akademgorodok to create KRONOS - an original RISC processor-based computer, with its own OS and Modula-2 compiler, and lots of tools. I was very lucky to join the KRONOS team in 1986 as a 16 yo complete beginner, and this changed my life forever as I became obsessed with programming. Thank you, Niklaus.

gjvc
0 replies
23h21m

now would be a good time to read this in his memory https://cr.yp.to/bib/1995/wirth.pdf

Also, his Oberon system provides a rich seam to mine. This, from a symposium held at ETH Zurich on the occasion of his 80th birthday in 2014, is a whirlwind retrospective. "Reviving a computer system of 25 years ago" https://www.youtube.com/watch?v=EXY78gPMvl0

One of the greats.

froh
0 replies
22h13m

I hold an old print of his Pascal language report near and dear on my bookshelf. He bootstrapped Oberon with one peer in 1-2 years.

His preference for clarity over notational fanciness inspired so many of us.

The Pascal family of languages is not only syntactically unambiguous to the compiler; it is also clear and unambiguous to humans. The Carbon successor to C++ strives for the same, iirc.

facorreia
0 replies
20h46m

Rest in peace. I owe a lot to his work.

"Algorithms + Data Structures = Programs" was a seminal book for me when I was learning about software development and it has influenced how I think about programming. Also, Pascal (in its various dialects) was my main language for many years on multiple platforms (CP/M, MS-DOS, Windows).

emmelaich
0 replies
21h15m

I liked his witticism "You can call me by name (Veert) or call me by value (Worth)."

But I can't find a reliable attribution.

[edit - see other comment, apparently said by Adriaan van Wijngaarden not Wirth]

elvis70
0 replies
21h35m

I really appreciate his work. He had a full life. Since yesterday, without knowing it, I had been studying a section of a book detailing the code generation of one of the first Pascal compilers for the CDC 6400.

dstanko
0 replies
18h56m

Compiler Construction was and is one of my all-time favorite books on the matter. You can't put it down once you start; it is that good.

https://people.inf.ethz.ch/wirth/CompilerConstruction/Compil...

doubloon
0 replies
15h14m

End. in peace

deadmarshal
0 replies
11h50m

R.I.P. Niklaus Wirth. One of the greatest men who ever lived.

dark-star
0 replies
18h44m

Sad news indeed.

Didn't someone already publish a draft of his last book? I think I read that somewhere...

cmrdporcupine
0 replies
16h8m

Still remember at 14 scrounging $$ together to buy a 2nd hand copy of a Modula-2 compiler for my Atari ST and then eagerly combing through the manual as my parents drove me home from the city. Really was a different era. Like a lot of other people who have posted here who probably came of age like me in the 80s, I went from BASIC to Pascal to Modula-2 and only picked up C later. Wirth's creations were so much a part of how I ended up in this industry. The world of software really owes him a lot.

blackhaz
0 replies
22h21m

Coming from ZX Spectrum at home and seeing the beauty of Turbo Pascal on an IBM PC-compatible has greatly contributed to my love of programming. R.I.P., Professor Wirth.

bitwize
0 replies
23h25m

Time for a black bar on the front page. Wirth had been around since forever, influencing how we and generations before us program.

betimsl
0 replies
19h0m

May the earth rest lightly on him. Rest in peace, brother.

baus
0 replies
22h43m

Modula-2 had a huge influence on my early understanding of software engineering and computer science. I feel it is one of his undervalued contributions. RIP Niklaus. One of the great ones.

anticensor
0 replies
22h5m

Wirth made one of the most critical observations in the whole history of computing: as hardware gets faster, software grows more complex to compensate, slowing things down even further.

abc_lisper
0 replies
22h43m

My very first language was Pascal. I have since forgotten it, but I distinctly remember the feeling that computers are fun! And the red Pascal book. Thank you, Niklaus, for all the fun and the impact you had on subsequent languages.

_ph_
0 replies
20h55m

A sad day. He was a titan of computing and still deserved even more attention than he got. If his languages had been more prevalent in software development, a lot of things would be in better shape.

After playing around a bit with Basic on the C64/128, Pascal became the first "real" programming language I learned: UCSD Pascal on an Apple II at my school, as well as Turbo Pascal 3.0 on an IBM PC (no AT or any fanciness yet; actually a Portable PC with a built-in amber CRT).

When I got my Amiga 500, Modula-2 was a very popular language on the Amiga, and the M2Amiga system was actually the most robust dev env. I still think fondly of that time, as Modula-2 made it so easy to develop structured and robust programs. The module concept was quite ahead of its time, while the C world kept recompiling header files for many years to come (see the sketch below). Today, Go has picked up a lot from Modula-2, one reason I immediately jumped onto it. Not by chance: Robert Griesemer was a student of Wirth.

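The interface/implementation split that made Modula-2 modules such a step up (and that Turbo Pascal later adopted as units) looks roughly like this; a minimal sketch in Turbo Pascal unit syntax:

    unit MathUtil;

    interface                 { the module's public face; clients depend }
                              { only on this part }
    function Square(x: integer): integer;

    implementation            { private; can change without clients even }
                              { needing to recompile }
    function Square(x: integer): integer;
    begin
      Square := x * x
    end;

    end.
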
During the 90s, while MS-DOS was still in use, Turbo Pascal remained the go-to language on the PC for everyone, as it was powerful yet approachable for non-fulltime software developers. It picked up a lot of extensions from Modula-2 and also had a nice object system. It peaked at versions 6 and 7, probably to this day my favorite development environment, partially because of the unmatched speed of a pure character-based UI. Turbo Pascal combined a nice development environment with a language that found a great compromise between power and simplicity.

Unfortunately, I was only vaguely familiar with his later work on Oberon. I ran the Oberon system natively on my 386 for some toying around. It was extremely impressive with its efficiency and full GUI in the era of DOS on the PC. A pity it didn't attract more attention; it would probably have been very successful if it had gained traction before the late 80s, and by the early 90s of course Windows came along.

From a purist's point of view, the crowning achievement was of course when he really earned the job title of "full stack developer": not only designing Oberon and the OS, but the CPU to run it as well. Very impressive, and of huge educational value.

END.

WesolyKubeczek
0 replies
22h9m

May he and his work be forever remembered.

StillBored
0 replies
12h50m

RIP, and thanks for helping indirectly to put me on my career path.

I learned Pascal fairly late in the grand scheme of things (BASIC -> 6502 assembly -> C and then eventually Pascal), but it was used for the vast majority of my formal CS education, first by instruction, then by choice, and eventually in my first real programming job. The later Pascal dialects remain, IMHO, far better than any other languages I write/maintain/guide others in using. As with many others of his stature, it was just one of his many hits. Niklaus Wirth is one of the giants I feel the industry stands on, and I thank him for that.

"All those moments will be lost in time, like tears in rain..."

MelvinButtsESQ
0 replies
19h19m

begin

  For every, there is an
end.

RIP

I_am_tiberius
0 replies
19h30m

Here's a fantastic interview with him (in case you speak German): https://de.wikipedia.org/wiki/Datei:ProfessorNiklausWirth.we...

HumblyTossed
0 replies
22h15m

A great loss. I cherish my copy of "Algorithms + Data Structures = Programs". I re-read it every couple of years.

Decabytes
0 replies
22h3m

I was just exploring Pascal last month. I've been meaning to do some more programming in it. I think it's a good compromise for someone who wants a lower-level language but doesn't want to use C or C++. The Free Pascal compiler also rips through thousands of lines of code a second, so compile times are really short.

ChuckMcM
0 replies
21h7m

From a comment I left on Mastodon:

He gave a talk at the CHM (he was inducted as a fellow in 2004). I got to talk with him and was really struck that someone who had had such a huge impact was so approachable. When another person in the group challenged Modula-2, he listened respectfully and engaged on the idea that the speaker's premise was true, then nicely dissented based on objective observations. I hope I can always be that respectful when challenged.

Arnt
0 replies
18h1m

I bought the Art of Computer Programming volume 4A a few years ago and didn't even start reading it. 1-3 I read when I… god, my youngest child is almost that age now.

I think tonight is the time to start on 4A, before we lose Knuth too.

And as I took it down I noticed that, almost by coincidence, AoCP stood next to Wirth's PiM2. It wasn't intentional but it feels very right. There's a run of language books that ends with Systems Programming with Modula-3 and the Lua book; Thinking Forth, PiM2, then a gap, and then the theory-ish section starts with five volumes of Knuth. Sigh.