Enlightenmentware

delta_p_delta_x
27 replies
19h44m

I couldn’t fathom why anyone would use Windows²

I saw this sentence, was about to type something in response, and then I expanded the footnote (side note: is it really a 'foot'note if it's not in the footer of the page?):

I became significantly more tolerant since my early university years. Windows (specifically the NT family) is a great operating system. I even have it installed on my gaming pc so that I can buy games I never play.

It's pretty rare to see such a balanced perspective with respect to Windows when someone starts off with 'UNIX'.

20240519
7 replies
17h5m

Windows 2000 was great IMO, and peak Windows!

XP was peak if hardware compatibility is important.

Then it went downhill slowly: questionable UI decisions, telemetry, then needing an internet connection and an MS account to install, and now Win11 refusing to install on perfectly good but older hardware.

Microsoft cloudifying Office ironically makes going to Linux as a normie fairly easy, as Office is the only thing I would miss. And mainly due to its dominance rather than it being great.

Windows' dark side is a shame, as MS as a developer's company is really good. VS, VSCode, TypeScript, C# and F# are awesome. And also some changes to Windows are good.

GuB-42
4 replies
10h7m

I put the peak at Windows 7.

Windows 7 UI is like Windows XP but prettier, thanks to GPU acceleration. Compared to the XP generation, it had better security and 64-bit support out of the box; it was an "internet age" version of Windows, but it could still run offline and wasn't too obnoxious with ads, telemetry, etc.

It definitely went downhill after that.

timeon
2 replies
8h4m

Windows 7 UI is like Windows XP but prettier

Isn't 7 UI basically Vista?

Aeolun
1 replies
6h29m

Sorta, but Vista was slow and stuttery, so they don’t feel the same.

pxc
0 replies
2h10m

Is Windows 7 fast on Vista-era hardware?

Zecc
0 replies
7h51m

I mostly agree. But Windows 10 added virtual desktops[0]. Took them long enough!

Its dialog when copying files is nicer as well, IMO.

[0] Windows 7 supported up to 4 virtual desktops, but not out-of-the-box: https://learn.microsoft.com/en-us/sysinternals/downloads/des...

Affric
1 replies
16h8m

Occasionally I have to help my father with his Windows computer, and each time it actually gets worse. Edge. Suggestions.

Like I am sure the actual fundamentals of the OS are fairly high quality but holy shit it sucks to use.

h4kor
0 replies
11h40m

I had a similar, frustrating experience last weekend trying to copy data from a phone to a Windows machine.

Windows is trying to hide all "technical" stuff so hard that it becomes impossible to do anything.

mhh__
6 replies
17h27m

Pedantry: it feels like false balance.

Windows is fine, that's it. I use it every day, but it's got so many weird quirks (that I can't do anything about, which is the difference from Linux) that it seems ridiculous to call it "great".

I'm in the other room from my Windows laptop. It's late, there's almost nothing running on it, the lid is closed, and surprise surprise, I can still hear the fans.

xandrius
3 replies
11h34m

The fans are probably either a rogue service you can find in the task manager or a hardware problem (maybe it needs some new thermal paste or a good air blow to remove dust).

That usually has nothing to do with the OS itself.

klibertp
0 replies
2h40m

Counterpoint: I bought a System76 laptop last year. As it came, the CPU fan was never off for longer than a few seconds, even when there was no load at all. The fans are not very loud, but the coil whine just before re-enabling the fan was disturbing.

The motherboard's firmware is open, however, so I rebuilt it with a slightly adjusted "CPU fan curve." After flashing it, the fans now go online when there's an actual need, which is to say - rarely. The coil whine still happens, but hearing it once or twice a day is much less irritating than hearing it every 10 seconds.

So it's possible the problem is on the OS side (I think we can agree the firmware is part of the OS) and it's sometimes possible to fix the problem in software... as long as you have control over it.

andrepd
0 replies
11h11m

Recent Windows releases are notable for not running software without the consent or wishes of the user :)

Narishma
0 replies
4h13m

And what about rogue services that come with the OS itself? Things like Windows Update, Windows Defender, the Phone app thingy, diagnostics policy service, the Xbox game bar or whatever it's called, the .NET optimization thingy and a dozen other things that like to wake up randomly and start consuming resources whenever they feel like it. Most of these things you can only disable temporarily, if at all, without resorting to dubious 3rd party tools.

rusk
1 replies
11h36m

I can still hear the fans.

I could not deal with that, I'm sorry. I'd have to turn it off.

The only way I could make Windows usable day after day in a previous job was to shut it down every night; I then had the BIOS configured to start it up again and iterate through my extensive startup list before I got into work. It was quite effective like that.

Aeolun
0 replies
6h26m

My Ubuntu machine is also noisy. I’m fairly certain it has nothing to do with system activity.

DeathArrow
2 replies
11h29m

After 20 years of using Linux on the desktop (and FreeBSD, and NetBSD) in parallel with Windows, I gave up. I don't like having to constantly configure things, I don't need 20 different ways to accomplish a task, and some of the software I use is not available on Linux. So I've been Windows-only on the desktop for the last 4 years. Of course, when I had to do something server-side, it was Linux only.

Recently I bought a MacBook Pro and the experience is very Windows like. I don't have to mess with the OS and it just works.

worksonmine
1 replies
7h53m

Stick to one Linux distribution and you can have the "one size fits all" experience you want. Who's forcing you to unixhop and constantly fiddle with your stuff? I'm on Debian and never have to change anything and my setup just works.

bradstewart
0 replies
2h17m

For me at least, it's personal discipline. If I can fiddle and change stuff, I will.

lelanthran
1 replies
10h18m

It's pretty rare to see such a balanced perspective with respect to Windows when someone starts off with 'UNIX'.

I dunno. I've posted some pretty balanced opinions on OSes: I've frequently criticised Windows, Macs, Gnome, Plasma and more.

They each suck in their own specific ways. Most people acknowledge this.[1] Many people are just like me: we put up with the crap on each system in order to get work done.

[1] The exceptions are almost always Mac and Gnome users: try pointing out that UI can be objectively bad, and that the default Mac/Gnome experience fails on more than a few of the objective UI metrics, and you almost always get multiple Mac/Gnome users saying that UI is all subjective.

Aeolun
0 replies
6h24m

To some extent it is? After all, those users like that objectively bad UI you're talking about. The fact that they're idiots doesn't make it any less subjective.

ghostpepper
1 replies
19h24m

From that moment on, unix followed me through all stages of my life: the toddler phase of keeping up with the cutting-edge Ubuntu releases, the rebellious teens of compiling custom kernels for my Thinkpad T61p and emerging the @world on Gentoo, the maturity of returning to Ubuntu lts and delaying upgrades until the first dot one release, and to the overwhelmed parent stage of becoming a happy macOS user.

I started off as a Linux zealot and followed a very similar trajectory. I think it’s a sign of maturity to realize there is no absolute “best” in engineering, just a best solution in a particular problem space, and Windows is the best for a large number of users for a reason.

aulin
0 replies
12h34m

I started as a Linux enthusiast a long time ago; these days, in my own time, I use macOS and I don't miss Linux that much. As long as I'm in the terminal, I don't feel a difference.

At my day job I'm forced to use Windows; the only thing keeping me from changing jobs is WSL2. I'm just not productive with mouse-based tools; I need a terminal, and PowerShell doesn't do it for me. Everything feels alien and less usable to me even after years: fonts, window decorations, the file manager, UI inconsistencies between different tools. Everything seems slightly hostile and out of place.

pie_flavor
0 replies
16h37m

On the contrary, footnotes on a webpage belong in the margins. Putting them at the foot of the article in current year is like banging rocks together.

flobosg
0 replies
10h57m

(side note: is it really a 'foot'note if it's not in the footer of the page?)

These are rather sidenotes (or margin notes).

ctenb
0 replies
8h36m

I think the peak of Windows was when they introduced WSL, making Windows the ultimate cross-platform dev OS.

bitwize
0 replies
9h12m

I'm not the Linux zealot I was as a kid, but I can never see myself going back to Windows. The particular niceties of a Unix environment are ones that I've come to rely on, and I could never go back to managing all my files and data through rat wrestling, the way Windows seems to want you to do.

That said, I can see the merits of Windows, especially for normies or video game players. It's just absolute friction town for me to use it.

analog31
0 replies
15h58m

The main thing keeping me on Windows is touch screen support. I know I'm a freak for wanting it, but we all have our ergonomic preferences.

Beyond that, I treat the OS as an appliance. Most of the software that I use is platform independent.

jchw
25 replies
20h37m

Doubt anyone will be surprised by this from me, but, Nix, 1000x. The amount of crazy stuff you can make work with Nix and Nixpkgs is nuts. This weekend someone pinged me wanting a static build of a Rust binary that had some gnarly bindings to C++ libraries. In under 100 lines of Nix, we have everything: static musl-based build, dynamic glibc build.

Want an AppImage? `nix bundle` with an AppImage bundler. Want an OCI image? dockerTools.buildImage on top of your derivation. Throw it in GitHub Actions using the Determinate Nix Installer action and you get automatic caching of the Nix store using the GitHub Actions cache; pretty useful since neither musl Rust nor static LLVM are cached in Hydra.

Want to share your cache? Pipe a list of Nix store paths to Cachix, then they can pull it down, or add the Cachix GitHub action to automatically pull from and/or push to it for the CI build. So if anyone wanted to re-use your cache from GitHub Actions CI runs, they could, provided they trust you. You can even cross-compile with MinGW, or run it on macOS.

It's a hugely complex time sink, but my God, it's great. Whereas I don't generally recommend people go down the NixOS rabbit hole unless they're convinced it's right for them, I definitely think Nix is worth having in your toolbelt, it's ridiculously versatile.
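To give a flavor of what a setup like that can look like, here is a minimal hypothetical flake sketch (the package name, version, and paths are all illustrative, not the actual project from that weekend):

```nix
{
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  outputs = { self, nixpkgs }:
    let
      system = "x86_64-linux";
      pkgs = nixpkgs.legacyPackages.${system};
      # Shared arguments for a hypothetical Rust crate.
      args = {
        pname = "my-tool";    # illustrative name
        version = "0.1.0";
        src = ./.;
        cargoLock.lockFile = ./Cargo.lock;
      };
      myTool = pkgs.rustPlatform.buildRustPackage args;
    in {
      packages.${system} = {
        # Dynamic glibc build.
        default = myTool;
        # Static musl build, via the static package set.
        static = pkgs.pkgsStatic.rustPlatform.buildRustPackage args;
        # OCI image wrapping the dynamic build.
        image = pkgs.dockerTools.buildImage {
          name = "my-tool";
          config.Cmd = [ "${myTool}/bin/my-tool" ];
        };
      };
    };
}
```

With something along these lines, `nix build .#static` would produce the musl binary and `nix build .#image` the container image, while `nix bundle` can wrap the default package into an AppImage.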

memothon
7 replies
18h53m

What's the right way to learn how to use NixOS?

There are so many different approaches to everyday development ("oh, use flakes", or "just make it a default installed package") that I've found it tedious to get lots of things done.

jchw
3 replies
17h8m

I don't think there's a right way to do it, you are correct in that learning NixOS is pretty tedious.

Re: flakes, my personal opinion is to use flakes. While Flakes are imperfect, they still provide a lot of functionality that Nix doesn't otherwise have. In my mind, it's like Nix's equivalent of "Go modules" or something like that. I do feel like people who do not like flakes make many valid points (the boilerplate, the fact that the top-level flake expression is a subset of Nix for some reason, etc.) but the argument isn't that those problems shouldn't be solved, it's that flakes are a sub-optimal design. Since they're so proliferated throughout the ecosystem though, it is quite unlikely that Nix or any prominent fork will outright drop flakes support any time in the near future. For better or worse, Flakes are part of the Nix ecosystem for the foreseeable future. In my opinion, one may as well take advantage of that.

If you haven't already, I'd get your feet wet with installing Nix on a non-NixOS machine first, and please feel free to ask questions about Nix in the NixOS Discourse "Help" section.

I have some recommendations:

1. https://github.com/nix-community/nix-direnv - Since Nix derivations usually wrap around other build systems, the entire derivation is recomputed when any file in it changes; using direnv, you can just get your normal dev tools upon cd'ing into your project directories. This gives you a lot of the benefits of Nix during local development, but with your normal stack, and without needing to globally install anything. Importantly, this works around the problem of needing to rebuild your project every time a file changes; Nix caching isn't granular enough (at least when wrapping around other build systems as it normally does.)

2. If you are trying to build something, chances are you can find inspiration in Nixpkgs. Are you curious how you might package a Bevy game? No problem: literally search "bevy" on the Nixpkgs GitHub repo and see what comes up. I found a derivation that does: https://github.com/NixOS/nixpkgs/blob/master/pkgs/games/jump...

3. If you use flakes, you should keep the flake "schema" handy. There are a lot of different kinds of flake outputs and there are different ways to specify the same thing, which is somewhat needlessly confusing; keeping the flake schema handy will make it easier to understand what Nix is looking for in a flake, which might make it easier to see what's going on (especially if it's obfuscated.) The most important takeaway here: A command like `nix run flake#attr` will try multiple different attributes. https://nixos.wiki/wiki/flakes#Flake_schema

4. Likewise, I really recommend reading up on what NixOS modules are. NixOS modules are the basis for configurations on NixOS, and having a clear understanding of what is even going on with them is a good idea. For example, you should understand the difference between the Nix language's `import` directive and using the NixOS module system's `imports` attribute to import other NixOS modules. Understanding how the configuration merge works saves a lot of headache, makes it easier to understand how other people's configurations work, and also makes it easier to modularize your own NixOS configurations, too. https://nixos.wiki/wiki/NixOS_modules
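To illustrate that `import` vs. `imports` distinction with a tiny hypothetical module (file names made up):

```nix
# configuration.nix -- a NixOS module
{ config, pkgs, ... }:
{
  # Module-system import: ./audio.nix is merged in as another module,
  # with its own option declarations and definitions.
  imports = [ ./audio.nix ];

  # Language-level import: just evaluates the expression in the file.
  # Here ./hostname.nix would contain a plain string like "mybox".
  networking.hostName = import ./hostname.nix;
}
```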

Unfortunately though, there's just no way to make it "click", and I can't guarantee that it's worth all of the effort. For me, I felt it was, but yes, there's no one correct way to do it.

But please feel free to ask questions if anything seems confusing.

SSLy
1 replies
9h9m

Point 4 is incredibly under-marketed. Almost all Nix/NixOS documentation focuses on the language-y and build-system-y parts of Nix, and the NixOS modules are usually not talked about. The terrible state of the docs doesn't help.

pxc
0 replies
3h22m

Knowing the 3 official manuals for the 3 most important projects in the core Nix ecosystem (the manuals for Nix, Nixpkgs, and NixOS, respectively), which together make up the core of the official docs, will save newbies a lot of trouble.

the language-y and build-system-y parts of Nix

Language-y manual: https://nixos.org/manual/nix/stable/language/index.html

Build system-y stuff manual: https://nixos.org/manual/nixpkgs/stable/

the NixOS modules

Using the NixOS module system in the sense of leveraging existing modules is the main topic of the NixOS manual: https://nixos.org/manual/nixos/stable/

Details about the module system mostly live in this section of it: https://nixos.org/manual/nixos/stable/#sec-writing-modules

The piece they don't tell you is that as a NixOS user, you generally want to look for a NixOS module that supports your application first. Only if you don't see one should you then directly/manually install a package into, e.g., environment.systemPackages.

In other words, search here first: https://search.nixos.org/options

And search here second: https://search.nixos.org/packages
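A hypothetical illustration of that search order in a NixOS configuration (the choice of services and packages here is illustrative):

```nix
{ pkgs, ... }:
{
  # Found under the options search: a module that not only installs the
  # package but also wires up config files, users, and systemd units.
  services.openssh.enable = true;

  # No suitable module: fall back to just putting the package on the path.
  environment.systemPackages = [ pkgs.ripgrep ];
}
```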

The landing page that ties these reference docs together and also contains a lot more example-centric and tutorial content is nix dot dev, here: https://nix.dev/

IMO nix.dev is a great entrypoint and quickly getting better. In addition to providing a bit of a map to help you navigate the official docs, it includes the best official starting docs on flakes, and links to the highest-quality external tutorial and expository material about Nix.

Make a mental note about the 3 big reference manuals, and bookmark nix.dev, and you have everything you need to learn your way around the official docs.

memothon
0 replies
15h33m

Thanks for the inspiration. I actually already have NixOS installed on another laptop, but lapsed back to my Ubuntu machine out of a bit of frustration. I'll try it again and see how far I get with these tips.

pxc
0 replies
13h59m

What's the right way to learn how to use NixOS?

Just commit. Make it your daily driver, engage the community for help, and learn whatever you need to learn to do things the idiomatic way. Then settle in and get comfy.

That said, I hear you about the paralyzing number of choices. Here are some okay choices you can fall back on when you get dizzy:

  - use flakes with direnv for all your projects
  - when in doubt, just write the damn package
    - need to use some Python library that depends on another Python library that's not in Nixpkgs? Package 'em. Package all the recursive dependencies, then package the library you wanted to use.
    - want some program that's not in Nixpkgs? package it
    - want to reuse a binary built for a traditional distro? package it. Look at examples in Nixpkgs of similar things (e.g., Discord, Steam)
  - seek community engagement on Discourse and Matrix, not elsewhere (not Reddit, not Discord, not Slack, not IRC)
This is the best way to learn NixOS, imo. But if it seems like too much, it's not the only way.

null_point
0 replies
3h50m

I found using the Nix package manager on my current daily-driver OS was a great way to break the ice. After translating my dotfiles to Nix and figuring out my project-specific development workflow, I had given myself a strong foundation for NixOS.

Jumping into the deep end and going straight to daily-driving NixOS is certainly also a good option.

atrus
0 replies
16h6m

The No Boilerplate video is what got me started: https://youtu.be/CwfKlX3rA6E?si=hJZ_mm9vaeKI0w8V

It's just super nice having everything in a git backed config, for multiple systems. Working with NixOS seems like I'm simultaneously working 10 years in the future and 10 years in the past.

emporas
4 replies
18h37m

Nix is one of the reasons open source wins in the long run. I would have never imagined that package dependencies could be installed and managed across programming language boundaries, kernel boundaries, shell boundaries, dotfile boundaries etc.

Given that computer complexity grows exponentially, at some point everyone will be forced to use something like Nix.

I agree with the article on Emacs and Unix/Linux.

fire_lake
3 replies
12h22m

Functional Programming approaches usually win in the long run. This is a common thread in the article too, although not explicit.

trashburger
2 replies
7h24m

I think you mean declarative. Whether something is FP or OOP usually doesn't matter too much (unless you go to the extremes of each end of the spectrum, a la AbstractBeanFactory / IO Functor State). Declarative solutions last longer by virtue of their declarativeness; the underlying implementation can evolve as long as the declarative layer is untouched.

zarathustreal
1 replies
6h45m

Isn’t a class a declaration (of a class type)? Isn’t a function a declaration (of an algorithm)?

I’d assert that every programming language is declarative. Especially once you get enough written to constitute a DSL.

bubblyworld
0 replies
2h9m

I think they mean declarative in the sense that you write down what you _want_ in a solution (Prolog/Datalog/Terraform/etc.) without specifying exactly how to compute it (as in day-to-day programming languages).

renewiltord
3 replies
19h59m

Wait, this is a pretty good sell. I'm going to give this Nix thing a shot. All the other times it's been posted, people talk about things I don't care about, like replicable builds and stuff.

__MatrixMan__
2 replies
19h38m

My guess is that if you use it long enough for it to start being useful, you'll find that "replicable builds" solves a wider variety of problems than you initially thought it did.

At that point, the hard part becomes getting your co-workers to recognize that all of these little problems they perceive as separate are actually just facets of the same huge nondeterminism problem.

renewiltord
1 replies
19h25m

I'm sure. It's just not a selling point for me right now.

pxc
0 replies
2h11m

Perhaps that's exactly the problem: 'reproducibility', 'determinism', 'functional'-- these are unfortunately too much 'insider language' to be great sells to most people who aren't already FP or Nix people... even when they're closely related to benefits those prospective newcomers might enjoy!

I'm glad you've identified a use case that appeals to you. Have fun!

jauntywundrkind
3 replies
19h3m

I see tons of people very happily using Nix, but from what I've seen, Nix is also one of the most opaque, inscrutable systems on the planet. The language is impossible to learn, and my limited experience has been that there are extremely few good guides to peering under the covers.

Nix is one of the most powerful and well-used bits of modern computing, hugely adopted and loved, but my limited understanding (and what's kept me broadly uninterested) is that it is on the wrong side of the war in which the Age of Reason tries to overthrow the Age of Magic.

jchw
1 replies
17h2m

Honestly, I suspect a lot of what makes it opaque is the fact that it hinges a lot on functional paradigms; this definitely made it a lot harder for me to understand. The trouble is that now that I understand it, I'm not sure how they could architect this in a more scrutable way. This is the problem.

I think most of us who use and love Nix understand that it is too complicated, but it's too complicated because none of us can figure out how to make it simpler.

infogulch
0 replies
15h36m

I'm also nix-curious and intimidated by the wall. I feel like things would be a lot easier to grok if Nix actually had a static type system, or even something opt-in, instead of being "the object has whatever fields I happen to put in it, good luck lol"-typed.

colordrops
0 replies
13h48m

The language is not the hard thing, it's actually straightforward once you get it. It's the crazy amount of convention and the number of layers in the stack that make up the Nix ecosystem that takes so long to grok.

xyzzy_plugh
1 replies
20h3m

This 100%. It's the gift that keeps on giving.

nicce
0 replies
19h38m

You need to give the first gift, however: time. And a lot of it.

helpfulclippy
1 replies
13h34m

I had only a dim awareness that Nix was even a thing before a few weeks ago. Then somehow I decided that I needed to commit to it 100% and make NixOS my daily driver and make myself learn this. It's been a lot of fun. Like, Dwarf Fortress kinds of fun. The kind of fun where when I first looked at it I thought it was insane inscrutable nonsense, and now I kind of wonder what happened to me that it's kind of making sense now. The kind of fun where I keep telling myself I just want to make some tiny little thing work, but actually I find excuses to rabbit hole down a bunch of different pathways and find amazingness under every stone. The kind of fun where I know better than to try to count how many hours I've spent on this now.

Except unlike Dwarf Fortress, I feel like things are actually improving over time instead of shambling ever-onwards towards an inevitable downfall. So I guess maybe it's more like the kind of fun the very first time I installed Linux and didn't know my way around anything.

I'm surprised how much I enjoy customizing things now. I always thought of my desktops sort of like betta fish before -- like, I've taken care of them, but also known better than to get too attached. Eventually the reformat is gonna come, and I'm never gonna set things up QUITE like I had it before. That's definitely not true now. I could start from scratch and be up and running with my entire suite of applications, themes, add-ons and configurations in no time, because it's all just a git repo of nixfiles.

pxc
0 replies
44m

I had only a dim awareness that Nix was even a thing before a few weeks ago. Then somehow I decided that I needed to commit to it 100% and make NixOS my daily driver and make myself learn this. It's been a lot of fun. Like, Dwarf Fortress kinds of fun. The kind of fun where when I first looked at it I thought it was insane inscrutable nonsense, and now I kind of wonder what happened to me that it's kind of making sense now. The kind of fun where I keep telling myself I just want to make some tiny little thing work, but actually I find excuses to rabbit hole down a bunch of different pathways and find amazingness under every stone. The kind of fun where I know better than to try to count how many hours I've spent on this now.

That's absolutely what diving in with NixOS was like for me. Wonderful description. :)

I'm surprised how much I enjoy customizing things now. I always thought of my desktops sort of like betta fish before -- like, I've taken care of them, but also known better than to get too attached. Eventually the reformat is gonna come, and I'm never gonna set things up QUITE like I had it before. That's definitely not true now.

Yes. NixOS makes customization feel more worth it than other operating systems can! I first heard it expressed best in this video: https://www.youtube.com/watch?v=17-TRCpDizA

WatchDog
15 replies
14h37m

I can't really think of a good example that other people haven't mentioned, but I have an anti-enlightenment piece of software: the Spring framework.

Spring actively hindered my ability to understand the simple concept that is dependency injection, and I know I'm not alone. Many a Java developer thinks that you need a big complicated framework to pass dependencies into your modules, instead of having your modules fetch them for themselves.

This isn't a criticism of Spring per se; it's fine and provides value for people, but I think it can lead people to build software that is more complicated and less portable than it needs to be.

kookamamie
10 replies
14h21m

I would dare to say Dependency Injection as a concept is unnecessary and creates more problems than it solves.

WatchDog
5 replies
11h54m

I'm still very much pro DI as a concept.

I don't think DI itself really causes any problems; the solutions designed to save you from a little bit of boilerplate code cause the problems.

kookamamie
3 replies
11h52m

I meant the original statement from the perspective that only certain languages/environments (Java, etc.) propose DI as a solution. E.g. in my current language of choice, C++, DI is nowhere to be found.

WatchDog
1 replies
11h42m

C++ has constructors, doesn't it?

ReleaseCandidat
0 replies
11h38m

And higher order functions.

ReleaseCandidat
0 replies
11h38m

in my current language of choice, C++, DI is nowhere to be found

STL, for example when passing explicit allocators. You can even call any higher order function using dependency injection.

And of course there are C++ codebases that look like Java - the pattern book works with C++ too.

stonemetal12
0 replies
3h19m

Are you pro DI as in "Dependency Injection", or pro DI as in "Dependency Inversion Principle"?

DIP is a good way to build software. When injecting dependencies becomes so complex that you need a framework, or a separate concept of DI (sans P), then I think something has gone wrong; incidental complexity has won.

The_Colonel
1 replies
10h17m

One elementary thing DI (or perhaps IoC more generally) provides is the ability to mock certain parts of your application for automated tests. While I'm not a fan of mocking too much and prefer higher-level tests (integration/component), mocking is still quite often needed. Is there some alternative to IoC/DI?

poorlyknit
0 replies
8h37m

Effect systems spring to mind but they're rather esoteric (in the Java world).

throwanem
0 replies
14h8m

One might about as sensibly say the same of functions being able to take arguments. If this is meant to illustrate the damage working with Spring does to understanding the concept, then it's an excellent, if mildly horrifying, illustration.

jerf
0 replies
3h56m

Dependency injection is a basic tool of writing robust, testable code. The alternative is strict hard-wiring of the dependencies, which deprives you of places your code can be tested.

But do not confuse "dependency injection" with "massive heavyweight opaque framework with a billion bells and whistles that breaks constantly". Dependency injection includes things like passing in a handle to the SQL database instead of it being a global variable, which your test suite uses to switch between various test instances of the database instead of the target code being hard coded to some variable, or even hard coded with its own connection credentials.

If you're not using dependency injection you are almost by definition using a lot of global variables. I'm as happy or happier than the next programmer to be a contrarian, but, no, the collective wisdom on those is dead on. They're deadly and should be avoided. Dependency injection is the biggest tool for that.

Not having dependency injection creates more problems than it solves.

However, I'm not sure that most "frameworks" for it aren't creating more problems than they solve. Probably one of the classic examples of lopsided accounting, in this case looking at the benefits but neglecting the costs. Anything looks good if you do that. But a lot of "frameworks" seem to bring along a suite of middling, sort of convenient benefits at the cost of massive opacity, unreliability, and the bad kind of magic. Not a good trade for a lot of programs.
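The database-handle point above can be sketched in a few lines. This is illustrative Python with made-up names (the same shape applies in Java or any other language): the service takes its handle as a constructor argument, so a test can inject a stand-in with no framework involved.

```python
class ReportService:
    """Takes its database handle as a constructor argument (dependency
    injection) instead of reaching for a global connection."""

    def __init__(self, db):
        self.db = db

    def user_count(self):
        return self.db.query("SELECT count(*) FROM users")


class FakeDb:
    """Test stand-in: same interface, canned results, no real database."""

    def query(self, sql):
        return 42


# Production code would wire in a real connection object here;
# the test suite injects the fake instead.
service = ReportService(FakeDb())
assert service.user_count() == 42
```

The whole "framework" is the one constructor parameter; everything beyond that is convenience.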

penguin_booze
1 replies
11h59m

I've had my short stint using Spring. Often I was dropped into a project where it was already set up and working. When something breaks, or I want to extend/modify what was working, I hit a wall in terms of discoverability: how it's been working all this time, and how to find a suitable level of documentation to help me. There are reams of documentation for Spring, but nothing of the kind that'll help me if I'm lost. So, from my perspective, it's a write-only framework; it's hard to reason backwards from it.

WatchDog
0 replies
11h5m

I spent a lot of time with Spring and became a bit of an expert on it; I'm able to debug and understand most issues that come up with it.

Still, I agree wholeheartedly: there is too much magic. It's very difficult for someone to come into a Spring codebase without a lot of background experience and understand how things tie together, and I don't think acquiring that experience is really time well spent.

The_Colonel
0 replies
10h10m

Many a Java developer think that you need a big complicated framework to pass dependencies into your modules, instead of having your modules fetch them for themselves.

Not sure what you mean by "module" here, but DI means that the dependencies are defined externally to the actual business logic, which kinda contradicts this "fetch them for themselves".

I think the problem with Spring is that it has too many features, too many different ways to do the same thing, too much configurability. Historically, Spring jumped on most hypes, tried to basically support everything. We ended up with this complexity monster, but this also made it popular. Spring is the Oracle DB of the backend development in the sense that you won't ever get fired for choosing it, it's the safe choice which will support in some way everything you might need to do.

RamblingCTO
0 replies
12h33m

Totally agree. Spring and actually everything in that realm is plain horrible. DI is awesome on its own (especially if you do hexagonal architecture), but Spring hides behaviour and brings in undocumented changes on updates and things like that (same for any Spring-related project and for Hibernate). That's my biggest problem with it.

jiehong
13 replies
21h14m

Speaking of Bazel, I wanted to try it out for a Java project, but it felt a bit more complex than expected.

Would you recommend using it even for single-language projects?

colordrops
7 replies
20h57m

If you haven't already committed, consider Nix instead.

threePointFive
6 replies
20h39m

I'm still trying to understand why people recommend Nix in place of a build system. Nixpkgs' stdenv by default expects an autotools project. It will happily integrate with other build systems, as long as you've spelled out your dependencies in both. I've yet to see it generate a Makefile or make any decisions about compilation that weren't spelled out in a "traditional" build system. Could you shed some light on what I've missed?

ris
4 replies
18h42m

So... it's sort of a battle over territory between build system and package manager.

Bazel is there becoming ever more complex and unwieldy in an attempt to provide supposed reproducibility, taking control of the provision of ever more of a project's dependencies (in often very janky ways). But to Nix people it's clear that what people are actually doing here is slowly building a Linux/software distribution around their project, but in a very ad-hoc and unmaintainable way. And Bazel projects will continue to grow in that direction because until you have control of the whole dependency stack (down to the kernel), you're going to struggle to get robust reproducibility.

I don't think many Nix people would suggest actually using Nix as the build system, but probably to use a comparatively simple cmake/meson/whatever build-system and use Nix to provide dependencies for it in a reproducible and manageable way.

threePointFive
2 replies
17h49m

Thanks for the summary. I've been using Meson + Nix, so the comments about using Nix as a build system have been confusing. I think what I've been seeing though are "use Nix instead of Bazel", not "use Nix as your build system".

colordrops
1 replies
13h52m

What I mean is use a relatively simple build system instead of Bazel, and deal with dependencies and reproducibility through a Nix development environment.

fire_lake
0 replies
12h14m

You lose out on some of the incremental compilation speed that Bazel offers doing this. I think many in the Bazel space suggest using Bazel inside of a Nix environment.

zokier
0 replies
11h28m

You call the Bazel side janky and ad-hoc, but to me (as a complete outsider) using a monorepo + build tool seems more principled and closer to working with fundamentals, while Nix feels more ad-hoc, trying to fix stuff post-facto.

And bazel projects will continue to grow in that direction because until you have control of the whole dependency stack (down to the kernel), you're going to struggle to get robust reproducibility.

This is a bit of a weird statement, considering that it's not where Bazel is growing to, but where Bazel is growing from. The whole starting point for Bazel is having full control (via the monorepo) of the dependency stack.

__MatrixMan__
0 replies
19h20m

I'm not sure why you'd want to generate a Makefile if you're using Nix. Unlike make, Nix understands the inputs to a build step and won't bother rerunning it unless those inputs have changed. You would lose that if you generated a Makefile, instead of having Nix build whatever it is that the Makefile builds.

Otherwise it does the same things as make: this bunch of commands depends on this other bunch of commands... It just makes you express that as a function so it can be smarter about memoization.

I've not used it for large complex builds, so maybe there's some itch it fails to scratch at finer granularity which I'm overlooking. I liked this article about where it shines and where it fails to be a build system: https://www.tweag.io/blog/2018-03-15-bazel-nix/. I've been waiting for the problem to arise that encourages me to learn Bazel so I can use it alongside Nix, and it just hasn't yet.

AlotOfReading
1 replies
20h53m

Bazel is probably at its simplest in a monolingual codebase. Toolchains have a lot of complexity.

It's like that Churchill quote about democracy: Bazel is the worst build system except for all those others.

sanderjd
0 replies
17h43m

Used bazel for years, now using pants[0] and really enjoying it as a tool that is good in the same ways, but better in some smaller ways.

0: https://www.pantsbuild.org/

fire_lake
0 replies
12h18m

I would choose it for a C++-only project, but that's because the alternatives are so horrible.

WatchDog
0 replies
14h56m

While maven and gradle may lack the architectural purity of bazel, they work much in the same way.

They plan and execute steps, building a graph of the required build tasks and only executing the tasks whose inputs have changed.

With that said, it's quite common to see maven/gradle builds that are misconfigured and execute tasks unnecessarily.
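
The input-tracking idea described above can be sketched in a few lines. This is a toy illustration, not any real tool's API: all names here are made up.

```python
# Toy sketch of input-based task skipping, as build tools like
# Maven/Gradle/Bazel do it: each task declares its inputs, we
# fingerprint them, and we re-run the task only when the fingerprint
# differs from the last recorded one.
import hashlib


def fingerprint(input_blobs):
    """Combine the contents of all inputs into one stable hash."""
    h = hashlib.sha256()
    for blob in input_blobs:
        h.update(blob)
    return h.hexdigest()


class TaskRunner:
    def __init__(self):
        self.last_fingerprint = {}  # task name -> fingerprint of last run
        self.executed = []          # tasks that actually ran

    def run(self, name, input_blobs, action):
        fp = fingerprint(input_blobs)
        if self.last_fingerprint.get(name) == fp:
            return  # inputs unchanged: the task is up to date, skip it
        action()
        self.last_fingerprint[name] = fp
        self.executed.append(name)
```

A misconfigured build in this model is simply one whose declared inputs don't match what the task actually reads, which is exactly why real builds re-execute tasks unnecessarily (or, worse, skip them when they shouldn't).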

Tijdreiziger
0 replies
20h45m

Why not Gradle or Maven?

bbkane
9 replies
16h57m

Coming from C++ and Python, Go's packaging + deployment tooling really enlightened me. It's SO EASY to depend on things and deploy my apps, I love it!

I've also heard Rust gives folks similar warm fuzzy feelings in regards to building and deploying, one day I'll try that too

nox101
8 replies
16h41m

I felt that way about node, and yet node led to an explosion of poorly written and designed packages and constant notifications about needing to upgrade project X because it depended on Y, which depends on Z, and Z has some DoS issue if you pass the wrong regex to it.

I don't feel confident that Rust won't go the same way. When I tried to build the Rust docs site (https://github.com/rust-lang/docs.rs):

    cargo build
    Downloaded 541 crates (55.2 MB)
Seriously? 541 crates for a static site generator for docs?

Rust is clearly off to copy npm in all of its faults. I have no idea if Go is similar in its explosion of dependencies and supply-chain attack surface.

nicce
6 replies
15h55m

Seriously? 541 crates for a static site generator for docs? Rust is clearly off to copy npm in all of its faults. I have no idea if Go is similar in its explosion of dependencies and supply-chain attack surface.

In Rust, it is a design choice. They try to keep the standard library small and let the community create competing packages. You see the result in those numbers.

It is hard to judge based on those numbers alone.

kergonath
3 replies
10h57m

The philosophy does not really matter, though. Any one of these dependencies could be a vector for a supply-chain attack, and all these libraries being updated independently and asynchronously is just asking for 2 dependencies requiring incompatible versions of something else. We've seen this happening already, and it usually ends up in one of 2 ways:

- the node approach: who cares? YOLO!

- the particular circle of hell that is Python and its 25 ways of doing virtual environments. Wait, 26, yet another one just dropped.

For all its faults (and there are some), a centralised approach like with Boost has some merit.

nicce
2 replies
6h53m

Rust, the language itself depends on 220 packages: https://github.com/rust-lang/rust/blob/e8753914580fb42554a79...

If you trust nobody, it is hard to use anything.

But about your second note (environments, mismatched dependencies), I would argue that Rust provides the best tooling to solve or identify issues in that area.

hu3
1 replies
6h11m

Rust depending on 220 packages is somewhat understandable.

After all it runs on many platforms. I counted 16 Windows packages just by glancing over and also many macOS related.

But 541 for docs?

Surely there's a gradient between trusting no one and trusting 541 packages to generate static files.

nicce
0 replies
4h33m

Are you confusing `docs.rs` with `cargo doc`?

It is indeed many packages, but if you look into the dependencies and code, docs.rs is a full-blown standalone async HTTP server which uses tokio, AWS S3, and PostgreSQL. It is used to host docs.rs, which serves the documentation of every published crate.

Maybe they should feature-gate some functionality and also split the dependencies.

The static site generator itself mostly lives in the Rust repository: https://github.com/rust-lang/rust/tree/master/src/librustdoc

nox101
1 replies
11h37m

That's the same argument node people make. See how well it's worked out.

pxc
0 replies
31m

NPM's got bigger problems than just having lots of small libraries, like allowing circular dependency relations between packages.

steveklabnik
0 replies
2h22m

for a static site generator for docs?

docs.rs has a lot more to do than just that. But also, actually building those static pages takes a lot of work. To do so, it has to actually build every crate, sandboxed, of course. This makes it closer to "universal CI for the entire ecosystem" than "generate a few html pages."

If you look at the dependencies, they're pretty normal for a website that does this kind of thing. It's roughly 80 dependencies, then 11 used for development only, and a couple more that are build-time only. The larger number is the sum of all transitive dependencies.

dist1ll
8 replies
20h11m

I would say the compiler explorer[0] fits the definition perfectly. It may seem like a straightforward piece of software, but it has immensely changed the way people discuss and share knowledge around compilers and performance optimization.

I regularly feel the impact on the quality of forum discussions. There's a lot less speculation about whether "call X gets inlined" or "Y gets vectorized". Bold claims can be supported or disproven quickly by sharing a link. And then you have tools like llvm-mca[1] or uiCA[2], if you don't mind going into the weeds.

[0] https://godbolt.org/

[1] https://llvm.org/docs/CommandGuide/llvm-mca.html

[2] https://uica.uops.info/

zem
4 replies
18h16m

along those lines, the entire notion of a web playground (a sandbox where users can just write and execute or otherwise process code) has vastly reduced the barrier for checking out a project or experimenting with its behaviour

ReleaseCandidat
3 replies
10h59m

web playground

This _so_ much. Where in the past I've used Jupyter Notebooks for short, one off stuff or to test something, I now do that online for almost any language.

Notebooks are still useful to write documentation though.

coldtea
1 replies
10h35m

Well, notebooks' main use case is a different purpose: not trying one-off stuff or checking if some syntax is valid, but doing stuff step by step, annotating the steps and/or explaining each result.

Web playgrounds are ok for testing some syntax (if you don't have a local REPL/easy way to test), but not for one-off stuff that involves file input or that you want to check against real environment assets.

ReleaseCandidat
0 replies
10h29m

Well, Notebooks main use case is a different purpose, not for trying one-off stuff or checking if some syntax is valid.

I know, but that's what I had used it for too. Like posting some code, for example in an HN post ;) As I've said, I still use it mainly for documentation.

roman-kashitsyn
0 replies
5h28m

I was thinking about including Mathematica as enlightenmentware. Mathematica 6 (https://www.wolfram.com/mathematica/newin6/) was the first truly interactive system I used (we happened to have a box with a license at the university). It impressed me so much that I still have a lot of warm fuzzy feelings toward Stephen Wolfram and his work.

Unfortunately, my relationship with Mathematica didn’t go anywhere: It was too expensive back then, and I never found a good use for it except for double-checking my homework.

I tried other computer algebra systems, but they didn’t impress me as much.

If you own a Mathematica license and found a good application for it, please let me know!

rnewme
1 replies
14h52m

I would dare say that, for me, https://pythontutor.com/ fits the bill even more than Compiler Explorer. (hint: it's not just for python)

AceJohnny2
0 replies
12h46m

(hint: it's not just for python)

I would definitely have overlooked it, and it really needs a better domain name.

ot1138
0 replies
7h6m

Pahole is a related utility for high performance programmers. I've been able to attain orders-of-magnitude performance improvements in trading applications using it.

hu3
5 replies
19h51m

Docker is one of these tools for me.

The unspeakable amount of my time and headache it saved during my consulting career puts a smile on my face.

Docker allows me to quickly iterate over the steps required to run ancient projects.

Not having to install 5 different relational database servers on my host OS alone is worth the learning curve.

Also crucial for running concurrent, reproducible Python environments on my laptop.

udev4096
3 replies
13h21m

Imo, making a Dockerfile for an ancient project isn't always easy

fmbb
1 replies
12h27m

But usually easier than making it build locally.

patrickmay
0 replies
3h53m

And it's more easily repeatable. Plus, it avoids polluting the host environment.

klibertp
0 replies
2h31m

In comparison to what? Making a chroot env for such a project is way harder than dockerizing it. VirtualBox and Vagrant might not be much harder, but are slower to the point of being irritating. I might be missing some alternatives, but among the approaches I tested, Docker is still the easiest way to build and run unfamiliar projects.

XorNot
0 replies
18h25m

While it borrows heavily from Docker's UI, podman with user namespaces enabled is the completion of this idea for me.

No more sudo requirements, greatly reduced attack surface, isolation as a "user" concern. It sits happily in my UI doing exactly what I need it to do for all these use cases.

vvern
4 replies
19h45m

Maybe one day the buck2 ecosystem will evolve to be that Bazel replacement. It has a much smaller core. Right now it's lacking in ecosystem support, tooling, examples, and local build sandboxing (which could be fine if there were an easy-to-use local implementation of remote build that felt natural). Also, the lack of Go support is sort of painful, and as I understand it, it's a bunch of work to get something like rules_go working with buck2.

zokier
0 replies
11h57m

Buck2 is really enticing. If I had too much time on my hands, I'd love to try to make a full bootable Linux distro as a monorepo built with buck2. I do see some sort of convergence between package managers and build tools; nix and buck2 do seem to approach a similar problem from different angles.

ReleaseCandidat
0 replies
9h9m

And a remark: the "real" problem when using Buck 2 is the interface to the LSP, as most LSPs only work with the "native" project configuration. For C++, generating a `compile_commands.json` is quite easy (see my C++ examples in the other post), not least because there is no single standard for a project's configuration.

ReleaseCandidat
0 replies
11h8m

examples

In case people don't find them: the official examples are in `examples` https://github.com/facebook/buck2/tree/main/examples and the language examples are in `examples/with_prelude` https://github.com/facebook/buck2/tree/main/examples/with_pr...

There is a minmal Go example too: https://github.com/facebook/buck2/tree/main/examples/with_pr...

More "real world" examples are dtolnay's CXX for Rust and C++ interop: https://github.com/dtolnay/cxx

and my own ones (sorry to link these again, but I do not know of any others):

C++ with Vcpkg: https://github.com/Release-Candidate/Cxx-Buck2-vcpkg-Example...

C++ with Conan: https://github.com/Release-Candidate/Cxx-Buck2-Conan-Example...

OCaml: https://github.com/Release-Candidate/OCaml-Buck-2-Examples

nighthawk454
4 replies
18h52m

They are "round": they pack the most volume in the smallest surface area. unix surface area is tiny, but it unlocks much power. Emacs and Git are all over the place, but their core is small, sweet, and easy to appreciate.

I really like this 'round' concept, seems very precise. Maximum interface area / use cases (surface area) with minimum core volume.

ssivark
2 replies
14h43m

Unfortunately, a sphere is characterized by exactly the opposite property. The article switches the intended meanings of surface and volume to get away with the metaphor, but I'm less than thrilled about the metaphor.

kergonath
0 replies
11h27m

I am not defending this specific metaphor, as I am not sure it is really good in this case. But they are right about this specific property of spheres: they have the highest volume/surface ratio (i.e., that's the way to minimise the surface area of the envelope for a given volume).
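
For the record, the sphere property in question is the 3-D isoperimetric inequality:

```latex
% Among all closed surfaces enclosing a fixed volume V, the sphere has
% the smallest surface area A:
A^3 \ge 36\pi V^2,
% with equality exactly for the sphere; plugging in A = 4\pi r^2 and
% V = \tfrac{4}{3}\pi r^3 gives 64\pi^3 r^6 on both sides:
(4\pi r^2)^3 = 64\pi^3 r^6 = 36\pi \left(\tfrac{4}{3}\pi r^3\right)^2.
```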

infogulch
0 replies
12h59m

"encapsulate the most complexity with the smallest API" seems to fit the metaphor better.

mbwgh
0 replies
7h20m

In Ousterhout's "A philosophy of software design", this aspect is described similarly via "deep classes", as opposed to "shallow classes".

donatj
4 replies
15h8m

We used SVN at my first job in 2006, and I had the exact opposite experience with it. I never fully understood what I was doing, nothing made rational sense, merges were an absolute nightmare, and somehow I would always end up corrupting the repo and have to pull a nightly backup to get out of the broken state.

Git was a literal breath of fresh air in comparison. I fell in love hard and fast. Everything just made sense, even if our workflow using git am patches seems downright ancient these days. Friends at the time tried to sell me on hg, but I was in love.

WatchDog
3 replies
14h49m

I thought SVN was great, easy to use, and very intuitive, that is until you had any merge conflicts.

At the time, it worked very well for our small team, I imagine it would work less well for large teams on a single codebase.

I miss having a commit serial number.

ahartmetz
1 replies
14h23m

I used kdiff3 and could never understand people complaining so much about SVN merges. Now I use kdiff3 with Git and it's fine, too. What isn't fine (though occasionally improving) is Git's UI and its mess of terminology and concepts.

WatchDog
0 replies
11h47m

It's been a long time since I used it, I don't remember what I was using for diffs, but I suspect it was just whatever the default built in diff support was.

It occurs to me that many people these days use git with github exclusively, and have it configured to only allow commits via PR, and only allow either squash or rebase merges, it's kinda SVN with extra steps.

globular-toast
0 replies
11h10m

You have to sacrifice the serial number to get a distributed system. Well worth it IMO. But if you really wanted you could tag every commit on master with the next number (should be easy to do with a hook).
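
As a sketch of that hook idea: a hypothetical post-commit hook that derives an SVN-style serial number from the commit count and attaches it as a lightweight tag. It assumes the `git` binary is on PATH; all function names here are made up for illustration.

```python
#!/usr/bin/env python3
# Hypothetical post-commit hook: tag each commit on master with an
# SVN-style serial number ("r1", "r2", ...) derived from the number of
# commits reachable from HEAD.
import subprocess


def git(*args, repo="."):
    """Run a git command in `repo` and return its trimmed stdout."""
    result = subprocess.run(("git", *args), cwd=repo, check=True,
                            capture_output=True, text=True)
    return result.stdout.strip()


def tag_with_serial(repo="."):
    # Only number commits on the master branch.
    if git("rev-parse", "--abbrev-ref", "HEAD", repo=repo) != "master":
        return None
    serial = "r" + git("rev-list", "--count", "HEAD", repo=repo)
    git("tag", serial, repo=repo)  # lightweight tag, e.g. "r42"
    return serial


if __name__ == "__main__":
    tag_with_serial()
```

Note the caveat that makes this only a local convenience: `rev-list --count` is a property of one clone's history, so two divergent clones can hand out the same "serial" to different commits, which is exactly the distribution trade-off the comment describes.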

brcmthrowaway
4 replies
16h44m

Closed tab at Bazel.

zvorygin
2 replies
16h12m

I quit my last job in no small part because of Bazel. I hated it so much. It tortured me.

I think Bazel is the kind of really complicated language that invites clever engineers to build incomprehensible balls of spaghetti. And the tooling and docs are really underinvested in.

But I got a new job, and to my surprise I've been doing Bazel all day. And I love it. I don't really know why.

All this to say, don't make a final judgement yet, there's something brilliant buried underneath that pile of rules and aspects.

roman-kashitsyn
0 replies
10h11m

Bazel confused the hell out of me at first, and I think the two-phase execution model (the “plan-execute pattern” as I called it) is to blame.

My favorite thing about Bazel is how easy it is to get stuff done if somebody sets up the rules for you. Copy-paste a code snippet and fiddle with the dependency list until it works.

But as soon as you go deeper, you get overwhelmed with new concepts, and the documentation doesn’t explain them well enough. I think this huge spike in complexity makes people hate Bazel, especially if their colleagues force it on them, breaking the usual workflows.

I don’t love Bazel, but it’s the build system I hate the least. And it taught me a lot.
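
A minimal illustration of that two-phase ("plan-execute") model, with made-up rule names rather than anything Bazel-specific: the first phase only turns rule definitions into an ordered action graph; nothing runs until the second phase.

```python
# Toy two-phase build: phase 1 (plan) walks the dependency graph and
# collects actions without executing anything; phase 2 (execute) runs
# the planned actions in dependency order.

def plan(rules, target):
    """Return actions for `target` and its deps, dependencies first."""
    ordered, seen = [], set()

    def visit(name):
        if name in seen:
            return
        seen.add(name)
        rule = rules[name]
        for dep in rule.get("deps", []):
            visit(dep)
        ordered.append((name, rule["cmd"]))

    visit(target)
    return ordered  # nothing has run yet: this is just the plan


def execute(actions, log):
    for name, cmd in actions:
        log.append(f"{name}: {cmd}")


rules = {
    "lib": {"cmd": "compile lib.c"},
    "app": {"cmd": "link app", "deps": ["lib"]},
}
```

The confusing part in real systems is that user-visible macros run during planning, not execution, so "why didn't my code run?" usually means it ran in the wrong phase.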

ASinclair
0 replies
13h41m

I work on a couple sets of rules at Google. I enjoy it. Though I agree people really can make balls of spaghetti with it. Everyone’s macros are terrible except for mine. The configuration system is where things can get really out of hand: transitions, selects, flags, etc.

mellery451
0 replies
1h51m

Agree. What's more, the author is really talking about blaze, where everything "just works" because there are massive dedicated resources maintaining it. He literally admits that he likes the copy-pasta-no-think-about-it approach:

Surprisingly, I didn’t need to fiddle with blaze, nor did I have to understand how it worked. I could copy some build targets and edit the dependency list, and the build worked as expected.

Sure, systems that just work are great, until they break.

bazel on the other hand, not so much. Heaven help you if it doesn't support your use-cases - you will wish you had Google to maintain your build.

8s2ngy
4 replies
14h9m

Magit, the git client for emacs, fits the bill perfectly. It is a masterclass in simplicity, effectiveness, and discoverability. It is one of those rare tools that makes you better at the underlying tool it abstracts over; instead of introducing its own jargon and workflow it exposes git's capabilities better than git does.

beautron
3 replies
10h25m

I'm sure Magit is lovely. But I can't resist sharing the story of my bad experience with it over a decade ago (which has left me scared away).

It was maybe early 2012, and I was excited to try Magit. I got it set up, and called 'M-x magit-init' from a source file I was editing. My understanding was that this would create a new git repo in that source file's directory, ending up with something like "/home/beautron/myproject/.git".

But something else happened. The git repo was put here instead: "/home/beautron/myproject/~/myproject/.git". Note the peculiar "~" directory, created inside the project directory.

Huh. Weird. Well, let's get rid of this mess and try again. I went to the project directory in my bash shell, typed "rm -r ~", and hit enter. Somewhere between my mind firing the signal to hit enter, and enter actually being hit, I realized with horror what this command would do. But it was too late to cancel the brain signal.

I didn't lose everything, because I had not typed something worse like "rm -rf ~", and somewhere in my home directory tree was a read-only file. So the command only deleted as far as that file, and then paused to ask for confirmation.

I estimated I lost about half of everything (the first half of the alphabet was gone from my home directory). The most frustrating thing was not even being sure what all I had lost. On the plus side, this experience improved my regimen around backups.

As I was trying to salvage the wreck of my system, I had a separate laptop out on the side, where I was trying to get some help, or maybe just some sympathy, from the #archlinux irc channel on freenode. But the two people who responded to me on the channel were very snarky to me. I felt they thought I was clearly an idiot for having run that command.

The irc people refused to believe that Magit created the "~" directory. They were convinced I had done that myself, with some other series of stupid commands. (If you had to guess the source of weird "~", who would you choose: the established Magit project, or the guy who just deleted half his home directory?)

But a short time later I was vindicated! From Magit's github issue 383, Feb 29, 2012:

So if you're editing "~/Temp/foobar/boo.txt" and call "M-x magit-init" which defaults to "~/Temp/foobar", instead of creating a git repo in "/Users/jimeh/Temp/foobar" it creates it in "/Users/jimeh/Temp/foobar/~/Temp/foobar".

Source: https://github.com/magit/magit/issues/383

It was a long night (and I had to leave on a trip the next morning). Now it's fun memory, perhaps with a number of lessons in it.

akho
1 replies
4h57m

Do you have backups now?

ziddoap
0 replies
4h40m

On the plus side, this experience improved my regimen around backups.
pama
0 replies
5h47m

Sorry to hear that. Code has bugs, and those evolve over time. I hope you try Magit again some day. It is sometimes hard to remember to apply correct quoting in a shell, so within Emacs you can also use dired for dealing with such mistakes in less risky ways.

20240519
4 replies
17h8m

I would add

Haskell - the rabbit hole to category theory and all the mad stuff Haskell people get into (compilers, proofs etc.)

Bitcoin - love it or loathe it, technically it is a marvel. At the time Bitcoin came out, I was musing on the same problem but could never figure out how to avoid double spends, and I would never have come up with the blockchain!

tsimionescu
2 replies
17h6m

Note that the blockchain itself is actually not novel; it was already a part of Git at the time of the Bitcoin white paper. The novel part, I believe, was the proof-of-work concept.

kevindamm
0 replies
15h39m

Even proof of work was not novel, there were proposals for fighting email spam with similar techniques. Bitcoin's fortune is combining the right pieces at the right time and getting sufficient buy-in to become relevant and more difficult to ignore.

20240519
0 replies
15h25m

Yes, I take blockchain to mean proof of work (or proof in general) validating blocks, such that you can track time to some extent in a system that cannot be sure of time, because it is distributed and its nodes are not trusted.
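
A toy version of that idea, just to make it concrete (illustrative only, nothing like production Bitcoin): a nonce is valid when the block hash falls below a difficulty target, so producing a valid block demonstrably costs work, while checking it is cheap.

```python
# Toy proof-of-work sketch: mine() searches for a nonce whose
# SHA-256 hash has a given number of leading zero bits; verify()
# re-checks the same condition in a single hash.
import hashlib


def _hash(block_data: bytes, nonce: int) -> int:
    digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big")


def mine(block_data: bytes, difficulty_bits: int = 12) -> int:
    """Search for a nonce whose hash clears the difficulty target."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while _hash(block_data, nonce) >= target:
        nonce += 1
    return nonce


def verify(block_data: bytes, nonce: int, difficulty_bits: int = 12) -> bool:
    """Verification costs one hash, while mining cost many."""
    return _hash(block_data, nonce) < (1 << (256 - difficulty_bits))
```

Raising `difficulty_bits` by one doubles the expected mining effort without changing verification cost, which is the asymmetry that lets untrusted nodes agree on an ordering of blocks.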

roman-kashitsyn
0 replies
9h43m

Haskell was one of my most important discoveries; it affected my thinking and approach to software engineering the most.

However, as I mentioned in the opening section, I deliberately removed programming languages from candidates for several reasons: 1. They received enough praise already. 2. My presentation would be very biased. 3. The article would be way too long.

vl
3 replies
14h37m

I’ll mention Typescript.

It elegantly improves almost unimprovable mess of JavaScript. It makes JavaScript development much more productive and pleasant. And it’s surprisingly powerful.

xandrius
2 replies
11h29m

I agree but it's a shame we need to put a bandaid over JS instead of having a properly typed language option for the web.

Now we start seeing some wasm being used but I still wouldn't use it for the whole project, so TS is the way for now.

vl
0 replies
2h34m

If you think about it, the entire web infrastructure is bizarre and inefficient.

A sane design would be to have some kind of bytecode for web code and some compact format instead of HTML. The waste of delivering this data and then executing JS is enormous. And we need to pre-process and build websites and web apps before serving anyway!

ativzzz
0 replies
4h37m

Losing decades of backwards compatibility and portability is probably not worth it

jwrallie
3 replies
19h31m

I hope the author would consider trying Doom Emacs; he mentioned knowing Vim and using VS Code, so with Evil and LSP support he will feel at home. The configuration has great defaults!

roman-kashitsyn
1 replies
9h49m

I tried God Mode and Evil several times, but it never ended well. Mostly because of the muscle memory I accumulated over a decade.

Evil is fantastic, but it doesn’t play well with all the packages in the ecosystem. There are adapters for Magit, Compile, etc., but most packages define key bindings that require Ctrl-Meta chords, which causes cognitive dissonance.

God mode is a neat idea, but I never committed to it, and I found it confusing at times.

Overall, I decided to accept Emacs as it is, with all its quirks, and embrace its way of doing things. I’m not married to it, so I can occasionally cheat by switching to Vim or VS Code without feeling guilty.

mkesper
0 replies
8h30m

I never was able to use evil per se, but it works really well in doom emacs (and before that in spacemacs). Oh, and use Caps Lock as Ctrl to minimize pinky usage!

srcreigh
0 replies
16h11m

Evil offends my simple sensibilities. I prefer boon for modal editing due to its simplicity and how it fits with emacs so well.

WillAdams
3 replies
16h50m

Yeah, one of the best programmers I've ever worked with would launch Epsilon (a commercial emacs style editor for various OSs) each morning and then do _all_ of his work from it.

The closest I come to that is missing emacs keyboard shortcuts when I'm not using a Mac.

I really wish that there were more programs which completely re-examined all aspects of various tasks _and_ incorporated scripting in a fashion which allows folks to take advantage of it.

Some of the apps I would consider if putting together such a list:

- LyX --- billed as a "What You See is What You Mean" Document Processor, v2.4 is looking to be quite promising...

- TeXshop/TeXstudio --- the former in particular is _very_ nice for folks who aren't able to devote the effort to learning emacs

- pyspread --- having a spreadsheet where every cell can contain a Python program or SVG graphic is _way_ cool --- I just wish it was as flexible as Lotus Improv/Quantrix Financial Modeler

- Solvespace --- I wish I could do better with 3D --- usually I fall back to OpenSCAD, esp. now that there's a Python-enabled version: https://pythonscad.org/ though I often use: https://github.com/derkork/openscad-graph-editor

- TikzEdt/IPE --- I really wish there was a nice graphical front-end for METAPOST/METAFONT (or that the graphical front-end for Asymptote was more flexible)

On the gripping hand, one has to give props to the Krita folks for making scripting a first-class citizen: https://scripting.krita.org/lessons/introduction

senkora
2 replies
16h5m

LyX

During college, I time-tracked how long I spent on each homework for each class. I can confidently say that using LyX instead of LaTeX for my math assignments resulted in me finishing them 50% faster.

I think that most of the improvement was that the WYSIWYM reduced the cognitive load enough that I could write equation reductions inside the editor without having to write them out on paper first.

I highly, highly recommend LyX to anyone who needs to typeset math equations.

WillAdams
0 replies
6h10m

That also helps folks downstream --- when I did book composition, the cleanest LaTeX manuscript I ever worked on was done by an author who used LyX.

rjmunro
2 replies
17h0m

Git was nothing like Subversion. It had a steep learning curve and confused everyone to no end

I've actually found git much easier to teach to people who don't know Subversion than to people who do. It's still a confusing mess, though. Why do you create a branch with `git checkout -b` rather than something like `git branch -c` (`-c` to check out the new branch)?

It looks like the `git switch` command helps a lot, but I never remember to use it as I'm used to the old ways, so I never teach it to new people either. I wish I could alias `git branch` and `git checkout` to remind me to use `git switch`, but you can't alias over a built-in command.

yjftsjthsd-h
0 replies
14h23m

I started on mercurial, then git, and I think that was the happy path / easy on-ramp. Actually I still mostly prefer hg, but it has some downsides and the overall ecosystem really prefers git.

mkesper
0 replies
8h36m

For teaching to newbies, please use the new commands! Much better to distinguish `git switch` from `git restore` than to use `git checkout` for all possible tasks.

kaycebasques
2 replies
10h44m

These tools capture our imagination, open new possibilities, and affect how we design our own systems.

Puppeteer/Playwright fits the bill for me. Learning how to scrape websites with those tools subtly changed how I approach lots of other programming tasks and opened many new possibilities in both personal and work projects.

The event model of the web platform has had a deep effect on how I generally think about systems.

Some people may throw tomatoes at me for this last one: Google Apps Script. The docs and API aren't great, but it has opened so many new possibilities for automation of things related to my personal and work Google accounts.

Re: capturing my imagination, physical computing with RPi/Arduino. I've also gotta admit that GenAI APIs have been an explosion of new possibilities for my line of work (technical writing)

Thank you to OP for creating a productive and inspiring topic (enlightenmentware) for us to discuss and share ideas around

kreyenborgi
1 replies
9h37m

Can you give some examples of what you've created with Google Apps Script?

I've used it a little, but it's one of those things I always spend an hour building up courage to attempt (because of the slow feedback loop, browser-based testing, confusion about different deployment types, permission models and extension types). In such ways it's the opposite of writing Emacs extensions, though I can see how it promises to give something similar in the end.

kaycebasques
0 replies
5h1m

In a previous job I rigged up the submission of a Google Form to send out an FYI email to relevant stakeholders and to create a GitHub issue (glossing over the details of why this was useful)

Apps Script also has an API that essentially allows me to expose my spreadsheet data as a web service that returns the data as JSON

For a while I was doing daily journals in Google Docs. I used Apps Script to auto-populate the heading for each day, e.g. an H1 with the text "21 May 2024". I've done lots of little auto-population things like this

jovas
2 replies
18h1m

LaTeX.

I use LaTeX, Git, and Emacs every day.

nephanth
1 replies
14h9m

Honestly, TeX/LaTeX the engine is a marvel of technology.

But every time I see a \makeatletter or get a runaway argument, it reinforces my belief that LaTeX the language was a mistake.

isametry
0 replies
7h58m

(La)TeX is an example of a very enlightened _idea_ that offed itself \footnotemark{} with a spectacularly cursed user interface. It is simply gross to write, and it's difficult for frontends, converters and GUIs to make it much better.

Yes yes, I can already hear the cultists chant "YoU dOn'T wRiTe In LaTeX" but this mentality is precisely the problem. If I can't write directly in your typesetting system nowadays, then I'm sorry, your system probably sucks.

You could unfortunately write an article or thesis quite comfortably in Word or even InDesign, while formatting as you go. (I say "unfortunately" because from a business-model and hacker's perspective, these tools suck.)

\footnotetext{not implying that LaTeX is dead, but referring to how it sentenced itself to the academic niche, in which case it might as well be dead…}

joe8756438
2 replies
19h5m

Happy to see emacs on the list, changed my life.

Enlightenment hardware: kinesis advantage 2, many non obvious, non ergonomic benefits, like adapting the hardware to _my way_ of working/thinking.

Certain games also feel like enlightenmentware.

dpflug
1 replies
6h5m

What games would you classify as enlightenmentware?

joe8756438
0 replies
2h45m

dwarf fortress

ultima-likes

minecraft

I’m not much of a gamer, but those definitely made a pretty big impact on me as far as what one or a small group of people can accomplish

bluepoint
2 replies
9h14m

Although I’m writing these words in Visual Studio Code, I always have my Emacs open

He is writing the blog's text in VS Code while praising Emacs. That makes me a bit sceptical. Or am I missing something?

zarathustreal
0 replies
4h25m

Emacs is an operating system, VS Code is a code editor.

ativzzz
0 replies
4h34m

You can learn a lot from Emacs about software development philosophy, but then move to something with more batteries included and less upkeep, like VSCode, as you move forward in life, especially once you lose the ability to tinker (mostly for lack of time, due to things like children).

valtism
1 replies
8h50m

React is this for me. Before it, I was fumbling around with libraries like ExtJS for my first job, but after I started using it, the concept of components that produce a view as a functional output of state really made so much sense to me.

It has given me so many powerful primitives to use while coding for the web

JoeyJoJoJr
0 replies
7h16m

I would also say Redux. Even though I grew to dislike Redux, understanding the power of reducers was quite mind blowing.

And further I would throw in Tanstack/Redux query.

somishere
1 replies
19h19m

Great article. For me it's creative use of tools by others - sometimes myself - in a non-standard way (sometimes that becomes standard!) that brings enlightenment. Be it language, source control, networking, you name it.

Aside: assuming the author is reading, minor typo in the first para: But once in a w̶h̶i̶t̶e̶ while

roman-kashitsyn
0 replies
6h12m

But once in a w̶h̶i̶t̶e̶ while

Thanks, fixed! It's “occasionally” now. Even Grammarly didn’t find it :(

sneilan1
1 replies
16h36m

I found Quicken to be enlightening. It took me two months to master my family's finances and budgeting, and I've never looked back. Learning its ins and outs, and why it provides some features but not others, was a wonderful learning experience.

roman-kashitsyn
0 replies
5h21m

I worked with John Wiegley for a couple of years and discovered one of his projects, Ledger (https://ledger-cli.org/), during our conversations. This tool taught me double-entry accounting and helped me understand finance and blockchains on a deeper level.

Unfortunately, I’m too lazy to use the tool on a daily basis, so it’s the second most insightful piece of software I’ve never used :D
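The core idea Ledger teaches is small enough to fit in a few lines: every transaction moves value between at least two accounts, and the postings must sum to zero. A hypothetical journal entry (the account names and amounts are mine, not from the thread):

```
2024/05/21 Grocery store
    Expenses:Food            42.00 EUR
    Assets:Checking         -42.00 EUR
```

Ledger even lets you omit one amount and infers it from the zero-sum rule; `ledger -f journal.dat balance` then reports the running total per account.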

ryukoposting
1 replies
6h5m

I'll nominate another text editor for this subject: Sam. It's a graphical text editor, but it doesn't resemble any other graphical editor I've ever seen. I've been told it's "like ed on steroids" but I can't verify that statement as I've never used ed.

Sam made me reconsider the way I think about the mouse. In Sam, the mouse is integral to your workflow, but the way you use the mouse is unlike any other editor I've ever used. Brief strokes blended with keyboard stuff. Sam translates remarkably well to the trackpoint, even though it was designed a decade before laptops existed. Its command language is somehow as simple as sed, but (almost) as powerful as vim. It's really old software now, so it lacked some features I wanted (auto-indent, mainly). The source code is simple enough that I just added auto-indent myself!

I used Sam for the last two years of my undergrad. I wrote Python, C++, Nim, and Verilog using Sam. Sam shaped the way I think about computers in so many ways. At a glance it would seem idiosyncratic and weird. Having used it, I consider it to be wonderfully ergonomic and creative.

roman-kashitsyn
0 replies
1h3m

I believe Sam is a predecessor of Acme[1][2], which still attracts new users today. One of its notable users is Russ Cox, the tech lead of the Golang team. I haven’t used Acme yet, but I want to try it out one day.

If you’re interested in learning more about ed, I highly recommend “Ed Mastery” by Michael W Lucas [3]

[1] https://doc.cat-v.org/plan_9/4th_edition/papers/acme/ [2] https://www.youtube.com/watch?v=dP1xVpMPn8M [3] https://www.amazon.de/gp/product/B07BVBSDNZ

maxander
1 replies
10h58m

I don’t know whether this is a provocative or tedious thing to say, but the quintessential ‘enlightenmentware’ to have come out of the past several years is ChatGPT. Name anything that brings as much functionality with so simple an interface and so elegant a core!

(It’s simply a pity that we can’t install it locally or tinker with the internals. :) )

ReleaseCandidat
0 replies
10h53m

Name anything that brings as much functionality with so simple an interface and so elegant a core!

The most important thing of the last few years has been LSP support! I can live perfectly fine without LLM autocompletion (although I did use Tabnine long before ChatGPT came out), but not without an LSP.

komali2
1 replies
12h32m

I really like the buku terminal bookmark manager. https://github.com/jarun/buku I like that I can just `man buku` when I don't understand something and I can actually find the answer I'm looking for.

roman-kashitsyn
0 replies
4h59m

Nice!

I used https://wiki.systemcrafters.net/emacs/org-roam/ for a while but switched to LogSeq (https://logseq.com/) because org-roam was buggy.

I like working with LogSeq, but even after a couple of years of using it, I’m not convinced by the Zettelkasten method. Maybe I’m doing it wrong!

dizhn
1 replies
20h25m

You have a typo. "earn to" when you probably meant "yearn to".

roman-kashitsyn
0 replies
10h20m

Fixed, thanks for reporting!

cryptonector
1 replies
1h22m

ClearCase is confusing like a Russian novel: All the characters have strange names, the plot is complex, and it doesn’t end well.

Well done.

Git removed the friction from using version control; there was no excuse not to version anything of value anymore. Merging branches with Git didn’t cause anxiety disorders. The staging area—confusingly named index—became essential to my workflows. But my favorite feature was the breathtaking beauty of Git’s design, the elegant mix of distributed systems, acyclic graphs, and content-addressed storage.

I love seeing glowing reviews of Git.

goeiedaggoeie
0 replies
40m

Many moons ago, before Git was released, I put SVN in front of ClearCase and had people commit locally to that; SVN hooks would check the file out of ClearCase, commit it, and check it back in. The whole engineering group I worked in switched to using the SVN front end and sped up, although individual attribution was lost in the ClearCase log.

alceta
1 replies
11h44m

lazygit (https://github.com/jesseduffield/lazygit) is enlightenmentware for me. It helps me navigate Git commands I forget all the time, like using the reflog to undo things, custom patches, or rebase --onto.

It makes working with Git a lot more fun, and I giggle like a little child whenever one of the weirder things works out again.
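For anyone who finds `rebase --onto` opaque, here is a throwaway-repo sketch of what it does (branch and file names are invented; assumes git >= 2.28 for `init -b`):

```shell
# main has commit A; "next" adds B on top; "topic" (cut from next) adds C.
# `rebase --onto` moves just C onto main, leaving B behind.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q -b main
git config user.email you@example.com
git config user.name "you"
echo a > a.txt && git add a.txt && git commit -qm A   # main:  A
git switch -q -c next
echo b > b.txt && git add b.txt && git commit -qm B   # next:  A-B
git switch -q -c topic
echo c > c.txt && git add c.txt && git commit -qm C   # topic: A-B-C
# Replay topic's own commits (just C) onto main, skipping B:
git rebase -q --onto main next topic
git log --format=%s                                   # prints C, then A
```

The three arguments read as "take the commits after `next` that are reachable from `topic`, and replant them onto `main`" — which is exactly the kind of thing a TUI like lazygit makes discoverable.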

xandrius
0 replies
11h31m

I find VSCode/Codium to be even better at that! And worst case, open up the terminal and do the job there.

Even merging which was often an annoying endeavour is quite smooth there.

zem
0 replies
18h19m

Blaze is one of mine too, specifically for the realisation that your build setup really feels rock solid when your entire dependency list is spelt out as an explicit DAG in a build file. Inferring or otherwise auto-discovering dependencies on the fly is seductive, but it always ends up letting me down when things get complex.

vegabook
0 replies
18h59m

NixOS fits the mold. The confidence it instills feels similar to when I moved from Windows to (Ubuntu) Linux.

tkgally
0 replies
17h19m

I’m barely a programmer, but I have been using computers for nearly four decades. Among the various tools that have, over the years, captured my imagination, opened new possibilities, and affected how I create things with computers, the current leading enlightenmentware by far is LLMs. Nearly every day I discover something surprising and useful that they can do for me.

tithe
0 replies
15h21m

Shells, and how by scripting them you have programmatic access to the entire operating system.
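That composability is easy to demonstrate: a word-frequency counter built entirely from standard tools, no programming language required (the sample sentence is mine):

```shell
# Pipe small single-purpose tools together: each one reads text,
# transforms it, and passes it on to the next.
printf 'the cat and the dog and the bird\n' |
  tr ' ' '\n' |       # split into one word per line
  sort |              # group identical words together
  uniq -c |           # count each group
  sort -rn |          # most frequent first
  head -n 3           # keep the top three
```

Every step is a full program with its own man page, yet the pipeline reads almost like a sentence.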

susam
0 replies
6h15m

But occasionally, we discover a piece of software that transcends mere utility. These tools capture our imagination, open new possibilities, and affect how we design our own systems.

For me, it was DEBUG.EXE on MS-DOS.

This humble debugger allowed me to peek into interrupt vector tables, inspect the content of ROM, learn how MS-DOS boots from scratch, etc.

I fondly remember the days when armed with an assembler, some knowledge of the CPU and the computer architecture, we could plunge into the depths of the system, unravelling its intricacies to our heart's content.

sssilver
0 replies
17h40m

I also must confess to a strong bias against the fashion for reusable code. To me, "re-editable code" is much, much better than an untouchable black box or toolkit.

I had never come across this particular thought by Knuth, but this hits home so hard.

It feels that most of our productivity is a function of how re-editable the code is in the codebase we operate.

It’s intriguing to think about what makes code re-editable vs simply reusable.

roman-kashitsyn
0 replies
1d1h

Once in a while, we discover a piece of software that transcends mere utility. These tools capture our imagination, open new possibilities, and affect how we design our own systems. I call such software enlightenmentware.

In this article, I praise the software that contributed the most to my enlightenment.

What’s your enlightenmentware?

revskill
0 replies
16h24m

It is VSCode and ES6 for me.

nephanth
0 replies
14h7m

Agda. Not something I'd use every day, but using it definitely shaped the way I reason about programming and type systems.

mihaitodor
0 replies
9h37m

Streaming and transforming structured documents at scale used to require some awfully complex machinery such as Apache Camel, Kafka Connect, Flink, etc. I was so happy when I bumped into Benthos https://benthos.dev which can be used as a lightweight replacement in most cases. Bonus: It’s written in Golang, so I don’t have to bother with heavy dependencies and slow start times.

kevinwang
0 replies
5h20m

The only good tools I've ever used are typescript and tqdm (a python progress bar).

Wandb and React can get honorable mentions.

jdeaton
0 replies
19h25m

JAX

jauntywundrkind
0 replies
18h54m

I want to thank wmii for being a vanguard force in illuminating the path before me, by being both a fine tiling window manager and one that shows its guts, exposed as a 9P file system.

Being able to craft together super crappy shell scripts to monitor and manage my windows was amazing. It was a huge turn-on to feel like someone who was really communing with the computer at its deeper levels, rather than just surfing above the application's crust. This feels like the real enlightenment goal: bringing forward the (ab-)natural philosophy that underlies each bit of software, rather than crafting facades of interface atop the core.

I have high hopes that general systems research can one day again spark an age of revelation and understanding, and that we can form a better, more earnest symbiosis with machines. wmii was a good example of one way to let bonds grow close and strong.

igammarays
0 replies
3h13m

Laravel. I had been developing web apps since I was 12 years old, but Laravel completely changed the speed at which I could deploy and maintain a production-ready app. And the ecosystem of plugins, add-ons and developer tooling is incredible.

hcarvalhoalves
0 replies
4h57m

Common Lisp. Not just the language, but the entire runtime: repl, debugger, system loading, quicklisp. It just works.

globular-toast
0 replies
11h4m

My road to Emacs was similar to the author's. When I was learning programming everyone was using IDEs. It seemed like they were inseparable from the programming language. I remember thinking I couldn't learn C++ because I didn't have a C++ IDE. After my second or third language I started to think this was ridiculous. It's all just text! And these built in text editors are terrible! It took the best part of a decade for everyone else to realise and move to VSCode, a vastly inferior editor.

flatline
0 replies
16h31m

Boost::graph feels like one of the dustier corners of the Boost libraries. I have used it, it worked, but it took a long time to wrap my head around the design and actually adapt it to my project. It is not great for getting simple things done, but it will get them done, with the power and flexibility as stated in the article. You will likely never see the Boost interfaces poking through whatever facade you end up erecting around it.

fbn79
0 replies
10h10m

In my case, Cycle JS (https://cycle.js.org) was very enlightening and pedagogical. It made me realize that software is always and only a matter of data transformation, and that those pure data transformations can be kept separate and decoupled from "side effects".

comment_ran
0 replies
16h58m

One day, I'm working on a Linux machine with my Emacs open, using Bazel to build my to-do list project. I open the browser and find a person who wrote a blog post about Boost.Graph, which I had never heard of but am really interested in reading. I finish this writing, save the buffer, and =C-c g= to launch magit, write the commit message "good day", then push to my git repo.

boffinAudio
0 replies
10h43m

I have a couple of these to add as well:

VCVRack - simply one of the most mind-expanding things a synthesizer-nerd can play with. (https://vcvrack.com/)

ZynthianOS - another example of a simple software solution to a problem nobody realized existed, opening the door to an absolutely astonishing array of Audio processing tools (https://zynthian.org/)

bitwize
0 replies
9h30m

Emacs, Squeak, and Genera (Lisp machine OS) all qualify for me. It's no surprise that all of these examples are what I call "pervasively programmable": you can not only extend them with code, but examine and modify the running system by typing code into it, shaping the system to your needs as it runs.

Another pervasively programmable piece of enlightenmentware is Ashton-Tate Framework. It wasn't just a spreadsheet but a piece of "integrated software" (the 80s term for office suite), sporting spreadsheet, word processing, database, graphing, and serial communications capabilities, all under a unified desktop-GUI-like interface and programmable with the Lisp-like FRED language. If there were ever an "Emacs for business", it'd be Framework. It's pretty amazing that a program that powerful existed on 1980s PC hardware.

aulin
0 replies
12h32m

It turned a mundane job of fixing bugs into an exercise in skill.

Emacs does this for me. It's like a toy always there to play with when you're bored with mundane job tasks.

RivieraKid
0 replies
17h33m

JAGS - allows you to specify a probabilistic model and sample from the posterior distribution

Narigo
0 replies
18h39m

Docker would be on the list for me - for reproducible environments. Probably JUnit as it was my first real testing framework - for being able to use test driven development for hard problems.

With programming, I had so many "aha"-moments, it's hard to remember them. It's not all about software, but more about understanding the concepts and being able to transfer this knowledge. Being able to pass functions or function pointers. Streaming / piping data instead of a fixed data structure. Interpreted vs compiled languages. How everything we do here only happens through a long list of 0s and 1s and how this clever setup makes us even see graphics on the screen. Or hear audio through a screen reader....

Barrin92
0 replies
19h0m

Smalltalk and as a particular case Pharo is an example of this for me. (https://pharo.org/). When I was in uni a paper that I always came back to was Licklider's 1960s paper on human-computer symbiosis.

"[...] to enable men and computers to cooperate in making decisions and controlling complex situations without inflexible dependence on predetermined programs."

Experimenting with Smalltalk (and also with Clojure and Emacs) was one of the things to me that genuinely felt like that vision of programs as living, interactive, organic things rather than the formulaic, static and low level programming that I was used to learning. I think it's still such a shame that in daily jobs it's so difficult to convince people of trying these technologies out because it requires such a big shift in how people think about software.

https://worrydream.com/refs/Licklider_1960_-_Man-Computer_Sy...