
I quit my job to work full time on my open source project

bad_user
16 replies
10h50m

This is the dream for many of us, but I hope she comes up with a good business model that doesn't rely on the goodwill of people (e.g., open core) because FOSS is terrible for earning money. The reason is that FOSS is essentially part of the commons, but without being maintained by taxes.

In general, people just want free stuff, companies rarely pay for support, and SaaS providers will steal your business if they can. I can think of several apps that macOS users are paying for, such as Bartender, Alfred, or MailMate. Clearly, there's a market for utilities, but only with scarcity.

goodpoint
1 replies
9h58m

Some grants exist, but they are really tiny compared to the amount of FOSS that needs funding.

pompino
0 replies
19m

You have to weed out the people who just want to dick-around at home versus people who are actually providing value to society, so the bar is set very high right now.

RamblingCTO
0 replies
10h2m

The EU also has open source grants FYI

oefrha
3 replies
8h51m

The author's project is a command line productivity tool. I have co-authored a now-archived command line productivity tool with half the number of stars (6-7k; maybe worth more today due to inflation) in the past, and my observation is that people are generally very stingy with this type of project. We did make people's lives a little bit better, but unlike frameworks, libraries, etc., it's not going to end up in any money-making product, so people don't think it's essential (it's not), and companies, which tend to donate larger sums than individuals, are completely out of scope. I think the total donations (through a PayPal link in the README) we got over more than half a decade were less than $100. In comparison, I once made some web-based analytics tools when playing a casual mobile game, and got a few thousand in donations over a year or two -- not much considering the time that went into it, but two orders of magnitude better than the command line productivity tool.

However, we hardly ever marketed our project and never tried to "build a following" or beg for donations in any way, so maybe the author will do a lot better than us. The server component should also help remind people it's not free.

Edit: One thing I forgot: I think command line utilities are in a worse financial position than GUI utilities, because people are accustomed to paying for GUI apps, but aren’t accustomed to paying for things in the shell at all.

Tolexx
1 replies
2h36m

I think command line utilities are in a worse financial position than GUI utilities, because people are accustomed to paying for GUI apps, but aren’t accustomed to paying for things in the shell at all.

I think that's because command line tools appeal more to technical people. For convenience's sake and ease of use, most average computer users will use GUI tools or applications.

oefrha
0 replies
2h14m

I'm specifically talking about developer utilities. Even developers are accustomed to paying for GUI apps but not command line stuff. To be fair I've never seen anyone selling a local (non-SaaS) command line only tool either.

johnmaguire
0 replies
2h17m

I think this is pretty accurate as a general rule. Anecdotally, the only CLI utility I pay for is Filebot: https://www.filebot.net/

And it has an optional GUI...

anticorporate
1 replies
3h40m

How to pay for the open source commons is far from a solved problem, but I'm glad individuals are trying to make it work for themselves anyway.

I wish we wouldn't act like producing software and making gobs of money are inextricably linked. Yes, we absolutely need to find a way to fund people who are building critical infrastructure. But sometimes, "I quit my job to work on open source" can be more akin to "I quit my job to hike the Appalachian Trail." I wish tech had a lot fewer people who were here for the money.

dingnuts
0 replies
1h24m

I wish we wouldn't act like producing software and making gobs of money are inextricably linked. Yes, we absolutely need to find a way to fund people who are building critical infrastructure

Don't you see the contradiction? Critical infrastructure costs gobs of money. The software has to pay for itself, or it has to survive on crumbs; that's just reality. Software is really, really expensive to create and maintain, because it takes a lot of time, and time costs money.

I wish Richard Stallman hadn't duped a generation into thinking that they have to use licenses that make Amazon richer, instead of just using proprietary licenses to protect yourself, as those licenses were designed to do, so that people with more lawyers can't just steal your work.

Why did our whole generation listen to a guy who was caught on camera eating something off of his foot?

runjake
0 replies
53m

In terms of making money, I feel like the sweet spot might be a closed source app with a rich open source ecosystem around it, such as Raycast or Obsidian Notes.

The apps themselves are closed source and making money, but the extensions and add-on functionalities are mostly open source.

pjmlp
0 replies
7h23m

Yes, just yesterday the author of ImageSharp was complaining on Twitter that despite 80 million NuGet downloads, hardly anyone is paying.

This in an ecosystem that grew out of Windows developer culture, where paying for developer tools is quite common.

Let alone in other communities; quite hard indeed.

pierotofy
0 replies
3h16m

While it's certainly challenging, it's not terrible, but it requires some thought and a sustainable business model, something many FOSS developers don't want to do. https://piero.dev/category/foss-funding/

conaclos
0 replies
1h36m

Maybe it's crazy, but I'm one of those people who tries to make a living off the goodwill of people and companies. I have been working for free for several months on Biome (https://biomejs.dev), a fast formatter and linter for JS/TS/JSX. At the moment we do not have enough donations to be paid for our contributions.

DoreenMichele
0 replies
9h49m

She seems aware this is an issue.

Possibly of interest to anyone pondering doing open source:

https://news.ycombinator.com/item?id=17824166

255kb
0 replies
8h39m

I can confirm that funding one's work on an open-source project through sponsoring is one hell of a ride. As you said, most people are not concerned by OSS funding, let alone companies that have a hard time justifying paying for something free...

However, OSS is not incompatible with some form of monetization; coming up with a plan to sell courses, custom services, or cloud options is probably a safer road.

motoxpro
14 replies
12h13m

What should I be using shell history for? I've never wanted to go back in my history but obviously people are getting a lot of value from it.

todd3834
3 replies
12h4m

Here are some of my frequent use cases: long Kubernetes commands, SSHing into an IP you haven't memorized, curl commands when testing an API. Anything on the CLI that is long and either hard to remember or just annoying to type.

JohnMakin
1 replies
10h59m

I tend to write scripts or docs when things are annoying and hard to remember. Shell history is usually littered with sensitive stuff; I usually go way out of my way to prevent my machines from saving any of it, but to each their own.

ellieh
0 replies
10h36m

shell history is usually littered with sensitive stuff

Atuin is trying to solve this one too :)

By default it won't save history containing AWS credentials, slack credentials, etc. The list of things to ignore is configurable with regex

While it's not difficult to avoid putting credentials into your shell, I know most people will just paste things in that perhaps they should not

shaneoh
0 replies
11h5m

I use [Shuttle](https://fitztrev.github.io/shuttle/) for that ssh issue which often comes in handy. Should solve for most of those CLI command issues.

cimnine
1 replies
11h30m

I often hit CTRL-R to reload a service's config. I press `CTRL+R`, enter `reload`, and keep hitting `CTRL+R` until the right service appears. Enter. Done. Usually way quicker, especially when switching between distros, as one calls it httpd and one apache, sometimes it's systemd and sometimes an init script, and so on.

WinstonSmith84
0 replies
11h7m

Yes, I'm using this command occasionally. While the demo of Atuin (https://atuin.sh/) clearly looks cool and more powerful than ctrl+r, I must say that ctrl+r has always been enough for me.

theshrike79
0 replies
9h53m

I'm lazy and I don't want to type

With fish's shell history, for example, I can just type 's' and it completes to 'ssh user@host.tld', because that's the last command starting with 's' that I used. If it's not right, I can type up to 'ssh' and press the up arrow to pick the ssh command I want.

Then I might remember that I did this fancy jq thing once to parse a field in a specific way, I can easily use Atuin to look for it with a nice text-mode UI just by pressing C-r and typing 'jq' as the initial filter.

sevagh
0 replies
8h14m

Not to be dramatic but atuin is life-changing. I can resume my working context from any project, no matter how far in the past (e.g. what docker or cmake or make commands I run). I don't need a personal wiki of cli tricks like an "ffmpeg cheatsheet" or whatever, I just access my own personal ffmpeg history.

Oh, also, I use tmux to split my terminal and do separate things in each one, so I like that atuin consolidates all the histories from my separate panes.

rcpt
0 replies
11h18m

I don't really know how to use awk and ctrl-r is faster than going back to stackoverflow to figure out what I did last time

paxys
0 replies
2h8m

Have you ever used the up arrow key to go back to your last command? This is like that, but does a lot more.

noelwelsh
0 replies
9h12m

I'm similar. I do most of my work in Emacs and I don't do much sys admin, so I don't really use the shell much. It's always interesting to me to see other people's workflows, though.

baq
0 replies
9h5m

it's like having a multi-register clipboard in that once you figure out the workflow, it's hard to go back.

...but if you effectively have this in emacs, I can see why you wouldn't need it in the shell.

absoluteunit1
0 replies
2h17m

I'll sometimes write a useful one-liner that I want to turn into a shell script, but if I'm in the middle of something and don't have time I'll do this:

<my> | <cool> | <one liner> # add script for this

The # comment makes it easy for me to search through my history to find one-liners I want to build a shell script from
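
That search step can be as simple as grepping the history file for the marker comment (a sketch; it assumes bash's default history location, and atuin or zsh users would point it elsewhere):

```shell
# Pull every tagged one-liner back out of the history file
grep '# add script for this' "${HISTFILE:-$HOME/.bash_history}"
```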

CraigJPerry
0 replies
11h29m

I feel like I get a decent return on mining historical command usage for new (single keystroke?) aliases to set up; the most useful ones change over time for me.

Another use case I feel pays off is complicated one-liners where I need to do something similar, but not quite the same, again; a good time saver as a starting point. This depends on you being a mostly-CLI kind of person, obviously; if you instinctively reach for Excel over awk then YMMV.

lanza
10 replies
14h35m

I love the idea of Atuin, but it's just way too slow with large history files. I've synced my history on my own for the past decade and have something like 170k lines, and the ctrl-r history search just crawls.

I don't need most of the history, but there's 0 chance in hell I'm auditing that many lines to decide what I need and what I don't need.

Brian_K_White
6 replies
13h5m

You don't need any of it.

History is useful enough to exist as a feature, I up-arrow routinely, but it doesn't actually matter when it doesn't exist.

I find the idea of going out of your way to preserve and migrate years of shell history and make it searchable in a db to be about like this:

You have a problem that water is flooding your kitchen floor. Normally you deal with a spill with a mop or towels. There is now too much water and so you decide that your normal towels aren't good enough and so you get more & better towels, or even put a sump pump in the corner to keep pumping all this water away.

I've written a lot of complicated pipelines and awk and sed etc, but they were either one-offs that are of hardly any value later, or I made a script, and the few things that are neither of those, are so few they automatically don't matter because they are few.

It's not illegal or immoral, just goofy.

cuu508
3 replies
12h12m

History can also contain potentially sensitive things like hostnames of non-public systems, usernames, filenames, URLs. I would not want that stuff to hang around indefinitely.

CraigJPerry
1 replies
11h35m

You can tell your shell to ignore commands that start with a space:

    HISTCONTROL=ignorespace
Or you can use HISTIGNORE to always ignore specific patterns.
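
For example, a HISTIGNORE line for a .bashrc (patterns are colon-separated globs matched against the whole command line; these particular patterns are just illustrative):

```shell
# Never record commands matching these patterns
HISTIGNORE='ls:cd:cd -:pwd:exit:history*'
```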

tandav
0 replies
1h20m

You can also use

    HISTCONTROL=ignoreboth
to use both ignorespace and ignoredups

berkes
0 replies
11h0m

This is one thing that Atuin announced it would solve: ignore patterns, keys, passwords. From the announcement I got the impression it's going to be far more useful than HISTIGNORE, both out of the box and in capabilities.

lanza
0 replies
11h12m

You don't need any of it. History is useful enough to exist as a feature, I up-arrow routinely, but it doesn't actually matter when it doesn't exist.

That's absurdly naive, to think the simplistic constraints of your own workflow are a general rule.

euos
0 replies
12h15m

I use those `!` bash history features all the time, e.g. `!?some_test` to just rerun a test case I ran several months ago. I don't need to sync histories between PCs (they are different enough), but history is important.

ta988
0 replies
13h57m

Remove duplicates, and remove trivial commands like cat, less, more, cp, etc.
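
A rough sketch of that kind of pruning on a plain-text bash history file, keeping the most recent copy of each duplicate and writing a new file rather than touching the original (the command list is illustrative; `tac` is GNU coreutils):

```shell
# Dedupe (keep last occurrence) and drop trivial commands
tac ~/.bash_history \
  | awk '!seen[$0]++' \
  | grep -vE '^(ls|cat|less|more|cp|cd|pwd)( |$)' \
  | tac > ~/.bash_history.pruned
```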

eviks
0 replies
13h46m

Does your history store frequency? You could prune entries with only a single run until it's performant.

ellieh
0 replies
10h33m

but it's just way too slow with large history files

hey! what issues were you having here? slow to open or slow to search?

we have a whole bunch of people with way more than 170k lines, so that shouldn't be happening :/

ellieh
10 replies
10h31m

Hey! Author here, thanks for sharing cwaffles

Happy to answer any questions :)

globular-toast
3 replies
1h54m

I'm a big fan of keeping tons of bash history. I have an idea for your tool: allow per-project history tracking. I do this myself for some projects using some bash history hacks: https://blog.gpkb.org/posts/project-local-bash-history/
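
The core of that kind of hack can be as small as swapping HISTFILE per project (a sketch under bash, not necessarily the linked post's exact approach; the function name is made up):

```shell
# Enter a project directory and keep its shell history in a local file
enter_project() {
  cd "$1" || return
  export HISTFILE="$PWD/.project_bash_history"
  history -r "$HISTFILE" 2>/dev/null  # load any existing project history
}
```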

ponyous
1 replies
1h48m

It allows this out of the box and you can easily change search modes with CTRL+R (directory, global, workspace, host, session):

https://docs.atuin.sh/configuration/config/#workspaces

globular-toast
0 replies
1h34m

That is pretty cool! I would recommend this tool to people like bioinformaticians and scientists who do a lot of work on the command line.

wswope
0 replies
1h49m

It’s already supported; you can ctrl + R from the atuin interface to filter for commands run in your current directory.

octocop
1 replies
7h39m

Nice to see someone pursuing something like this. I wish you all the luck.

ellieh
0 replies
5h13m

thank you!

nbobko
1 replies
8h47m

I'm curious whether you've figured out a good monetization strategy for Atuin. The article doesn't answer that question.

Like you said, you won't be paying your rent with sponsors any time soon, but you've already quit your job. Are you living on your savings, and trying to come up with valuable paid features meanwhile? Is that the plan?

Anyway, good luck with whatever you're up to! Building a good monetization strategy is hard

ellieh
0 replies
8h42m

Are you living on your savings, and trying to come up with valuable paid features meanwhile

Correct! I do already have a bunch of ideas I'd like to explore, will share more soon.

I'm lucky to have had a pretty good career so far, so my savings are more than enough for the next year or so

Thank you!

cudder
1 replies
9h4m

I can't seem to scroll with the arrow keys on your website (ellie.wtf). Is this intentional? Really annoying at the very least.

Good luck on your project!

ellieh
0 replies
8h42m

Not intentional, will see if I can sort it. Thanks!

the_arun
9 replies
13h52m

What is the security of this?

1. If someone steals my laptop & breaks in, can they get access to all my history?

2. After breaking in, if they run `atuin key`, that will get them the key for my history, which they can use from any device (if they know the userid)

3. If you are passing passwords to servers as command line arguments on that device, they have all that.

euos
4 replies
12h12m

What do you have in the history that's sensitive? Keys and passwords should not be in shell history anyway (e.g. I delete them from bash history if I enter one by mistake)

the_arun
1 replies
10h45m

Shell history is wiped out after the user logs out for a reason. More details are here - https://www.securityhq.com/blog/security-101-dont-bash-your-....

wswope
0 replies
43m

Please read the page you linked; it’s flushed to a file, not wiped.

extheat
1 replies
12h4m

I don't think it's that unusual. What comes to mind immediately: it's not unusual for me to clone something from a private git repo, where a username+password is needed for permissions. In that case it's possible to put in `git clone http://username:password@example.com` or another git command that interacts with remotes. (To be clear, the "password" is typically a token and not a human-generated string, but it still functions like a password.)

drbaba
0 replies
11h48m

For that example: Any reason the server doesn’t just have an SSH server? Then you can use `git clone` in the “usual way”, using SSH certificate authentication.

drbaba
1 replies
11h52m

If you are running servers passing passwords as command line arguments in that device, they have all that.

I make a point out of never doing that. It’s way too easy to accidentally expose things. For instance, doing a live demo with an audience, and using Ctrl-R out of muscle memory? Suddenly you flashed your password in front of everyone.

Generally, I'd recommend using a tool like Unix `pass` or your OS's default keyring to store your secrets; then you can run `command1 --password=$(command2)` to feed a password from one command to another. If I really have to type something sensitive, I prefix the whole shell command with a space, which in many shells can be configured to mean that it doesn't enter history. If something slips in by accident, the shell history file can be edited in vim.
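
A minimal sketch of that `command1 --password=$(command2)` pattern, with shell functions standing in for the real tools (`secret_store` plays the role of `pass show ...` or a keyring CLI; `deploy` is a hypothetical consumer):

```shell
# Stand-in for `pass show db/prod` or an OS keyring lookup
secret_store() { printf '%s' 's3cr3t'; }

# Hypothetical command that needs the password
deploy() { echo "authenticated with ${1#--password=}"; }

# Only the literal $(secret_store) text lands in shell history,
# never the secret itself
deploy --password="$(secret_store)"
```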

the_arun
0 replies
10h50m

This is a good approach. Thx for sharing.

fourteenminutes
0 replies
12h54m

If someone steals your laptop and breaks in, I think you have many problems equal to, or more severe than, your shell history.

ellieh
0 replies
8h19m

1. If someone steals my laptop & breaks in, can they get access to all my history

Yes, but this is the case anyway with current shell history. I think if someone breaks into your laptop you have bigger problems than your shell history. It's best to get into the habit of not pasting secrets into your shell

2. After breaking in, if they run `atuin key`, that will get them the key for my history, which they can use from any device (if they know the userid)

They would need your username, your password, _and_ your encryption key

3. If you are passing passwords to servers as command line arguments on that device, they have all that.

Yes. If you're doing this, then all of your passwords are currently stored as plaintext in your home directory - with or without Atuin. I'd consider them no longer secure if this is the case, as any program you run could read .bash_history

Atuin by default comes with a set of filters to ignore secrets and not record them to history - AWS creds, slack creds, GitHub tokens, etc etc. So it may well reduce the impact of this

brundolf
9 replies
37m

I'm curious to hear what people's usecases are for retaining long shell history. For myself, I never really use it beyond the current work day (up arrow)

zem
7 replies
35m

for work, there are a lot of commands with complex invocations that I do not use regularly enough to bother remembering all the flags. it is really useful to just grep through my full shell history to find all the times I ran that command and just copy/paste the relevant one.

stoniejohnson
3 replies
21m

Yep, I do `history | grep "<whatever>"` all the time.

I should probably make it a function now that I think about it.
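
One possible shape for that function (name and details hypothetical; it greps the on-disk history file, so commands from the current session that haven't been flushed yet won't show up):

```shell
# Search the full shell history for a pattern
hgrep() { grep -- "$1" "${HISTFILE:-$HOME/.bash_history}"; }
```

e.g. `hgrep 'docker run'` lists every stored invocation at once.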

_kblcuk_
1 replies
16m

Most shells have a built-in reverse history search, why not just use that? `Ctrl + R` in bash / fish / zsh, at least.

zem
0 replies
12m

i usually want to just see all the options at once and pick the one i want, rather than searching through them one by one

zem
0 replies
13m

i called my function `fh` for "full history"

Ntrails
1 replies
10m

I end up manually curating skeleton command lines in a text file - partly because I've never had functional full bash history

zem
0 replies
6m

i shared my full-history script on here a while back and someone pointed me to atuin :) https://news.ycombinator.com/item?id=32477431

brundolf
0 replies
20m

It may just be that I don't use the shell very much compared with other people, mostly for building and running local code

nox101
0 replies
8m

I definitely refer to months-old command lines. I work on 10-15 projects: ~2 are every-day projects, ~6 every-week projects, and the rest every month or two. I often recall commands I used on those months-ago projects. I often remember the first few letters and press the up arrow, or I Ctrl-R for a part of the command line I remember. I rarely have to dig out the docs to remember how to type the full line.

chaosprint
8 replies
10h5m

I quit my job as well to work on

https://glicol.org

I have a lot of feelings, but I don't have a blog so far. I feel that universities should allocate some of their funding to many of these open source projects, and that the open source community should be better managed than by relying on donations. As for me, my plan is to start my own company and work on hardware.

jebarker
3 replies
3h9m

I feel that universities should allocate some of their funding to many of these open source projects

Can you expand on this? Why would universities fund these projects?

marcosdumay
0 replies
2h9m

I think something similar to the GP; in my case it's because universities have the problem of "how to fund potentially society-changing projects that mostly go nowhere and depend on really qualified people" closer to solved than any other institution.

There are a lot of problems that I think universities could be working on. Creating free software is one of them.

gyrovagueGeist
0 replies
2h13m

It doesn't make sense to me either. Universities barely even fund the lifecycles of their own research based software projects.

chaosprint
0 replies
2h9m

Maybe this doesn't apply to all situations. It's just that in my case, especially in music technology, I often see funding being allocated to things that I find somewhat disappointing; of course, that's just my subjective opinion. More often than not, there is an overall lack of funds. Many open source projects are actually of great academic value and worthy of study. In fact, many contributors to open source projects love blogging, and I often feel more rewarded reading these blogs than dozens of pages of academic papers. I can clearly feel that I will have more time to make pure open source contributions during my funded Ph.D., or even when lecturing at the university. But now I have to balance that with some practical considerations.

xialvjun
0 replies
9h7m

How can I follow you on Hacker News? :)

kolektiv
0 replies
8h32m

I'm working on a hobby project right now, which at the moment is all about sequencing, and my long-term plan is to integrate Glicol in some way - it's a great project, and I can't wait to start digging around further.

I'm looking forward to seeing what you do with hardware - I'm sure it'll be cool!

ibz
0 replies
9h26m

Awesome to hear that! Been following Glicol for a while because it is such a unique and bold project!

While Sonic Pi is also beautiful and much easier to start with as a beginner, I later found the hard way that its architecture is incredibly messy - lots of unrelated parts glued together with duct tape. The simplicity and cleanness of Glicol's code is what made me immediately love it!

Thank you for such a beautiful project!

debevv
0 replies
7h47m

This is awesome

tpmp313459
5 replies
15h8m

Open sourcing is an utter waste of time. Why are you still working for big corporate companies for free? This is how 70% of open sourcing literally works: you work for developers who are employed at firms, and often firms also fork your projects. Four years ago I inspected the iOS WhatsApp bundle; they used a lot of open source projects. I am not talking about contributing to Linux; rather, projects copied by Apple & Windows businesses.

If you do open source work on something like Linux or server software that would otherwise be available only at high cost, then it could be considered a great hobby. Open sourcing your innovative inventions isn't good in the software industry; they remain free for commercial firms. All your intentions just go to waste. Most people will see this comment as negative. On the other side, one day major job loss will directly affect you as well, even if you have settled well. You will understand!

quickthrower2
4 replies
14h58m

Not really. You just need to be strategic. $5/m, for example, would be reasonable for a hosted version of this; maybe add some AI touches for pro users, and then have thousands of users just expense it, as it is well below approval thresholds, or pay for it themselves.

ipaddr
3 replies
14h33m

$5 a month with 1,000 users is $5,000 a month. Add in the costs and taxes and you are left supporting a thousand users without being able to afford a hire. If you had 2,000 users you might be able to afford someone, but now you have to support 2,000 users.

If prices were tripled you could afford to staff for now and for the future (R&D).

When small you need to go higher. When you are big you can use size to scale.

quickthrower2
1 replies
14h31m

At $5 a month, the only support you get is the ability to cancel :-). However, I kind of agree: given that this handles potentially sensitive data, customers might get demanding about that, so charge a bit more.

TillE
0 replies
12h52m

Yeah the Marco Arment approach of "I'm one guy, please don't ask me to spend my time on support instead of development" is clearly the correct one at that scale.

Like absolutely you want to accept bug reports and feedback, but your users should not expect active help with normal usage.

goenning
0 replies
2h32m

Why would they need to hire? I'm running a $5k/mo business with thousands of users and I'm working solo. I get 1-2 support chat messages per day, which takes 10 minutes.

ilaksh
5 replies
13h37m

Has anyone who uses fish shell compared that to Atuin? I suspect that having my entire shell history accessible might not really be necessary.

WD-42
1 replies
12h23m

I have the same question. Lots of comments here from people hand-rolling complex solutions to things it seems fish does out of the box: commands per directory, partial completions. Heck, there's even an embedded web server UI for searching and manipulating the history.

Multi device sync is not there without some effort but I don’t really care to mix my personal history with my work machines anyway.

wt__
0 replies
8h31m

Multi device sync is not there without some effort but I don’t really care to mix my personal history with my work machines anyway.

Yep, obviously there are many benefits to per-device history, but I think I'd find it more annoying having it synced between devices, especially if there are commands that either won't work on a particular machine or might even be dangerous in a different environment.

theshrike79
0 replies
9h46m

I use fish's own completion (the dim text that appears after you type) for most stuff. I also use the fish up arrow search instead of Atuin like this: atuin init fish --disable-up-arrow

Atuin is there when I need to find something more complex I remember doing 3 months ago and could actually repurpose today.

shawabawa3
0 replies
10h7m

I use fish with atuin

I only map atuin to ctrl+r and use fish's native up arrow search for simpler stuff

Gee8r9
0 replies
11h24m

Fish has smoother usability, autocompletion, and search, and it is useful on a fresh machine in a vanilla configuration. Installing bash plugins is not always possible, and installing some sync plugin on a sensitive server is a no-no!

Also, my biggest problem with bash: sometimes it does not keep part of recent history if the bash process gets killed. Fish does not have this problem.

I usually keep useful commands in notes, and sync my notes instead.

sireat
3 replies
10h13m

Does Atuin offer the ability to save the current path (with a timestamp as a bonus) for each command as well?

Looking at OP, it looks like it does: "recording additional command context"

Often (meaning once every few months) I have to SSH to some less used machines and remember a few incantations where path and time is crucial.

In bash there is a hacky way to add the current path and a timestamp to history, but I've never gotten it to work exactly right. If you add a timestamp, it seems to duplicate the timestamp when you repeat the command.

samlanning
2 replies
9h46m

Yes it does, both path and timestamp.

You can even filter to commands for your current directory by just pressing Ctrl+R a few times

ellieh
1 replies
9h34m

so we store:

- command
- directory
- timestamp
- duration
- session ID
- exit code
- hostname

currently thinking of neat ways we can store extra things too (git remote, etc)

zipperhead
0 replies
1h59m

I wonder if atuin could parse tags if given at the end of a command, after a # ?

    sudo somecommand arg1 arg2  # admin
    mpv arg1  # media

I have work-related stuff that I'd love to be able to tag and then filter history based on that. Hmm now that I think of it, maybe it will just work like that anyway.

Love atuin btw, best of luck!

outcoldman
3 replies
14h48m

I have kept my shell history in a sqlite database since 2017. Around 120k records at this point. I never synced history from my work laptops; only personal history.

In 2017 I wrote my own bash script (later optimized for zsh) to just record everything in sqlite with hooks on the prompt. [1]

I mostly work on a Mac right now and don't need to support Linux anymore, so I wrote an app for Mac that syncs the history over iCloud and has a GUI. [2]

Anyway, storing years of shell history somewhere you can do complex searches, and actually find some magic command you ran a few years ago, is priceless.

- [1] https://www.outcoldman.com/en/archive/2017/07/19/dbhist/

- [2] https://loshadki.app/shellhistory/
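
For the curious, the gist of such a prompt hook, heavily simplified from what the linked dbhist script does (table layout and names here are illustrative, and it assumes the sqlite3 CLI is installed):

```shell
# Bash: after every prompt, append the last command to a sqlite database.
# A real version would also dedupe, escape more carefully, and record
# exit codes, hostnames, etc.
log_last_cmd() {
  local cmd
  cmd=$(HISTTIMEFORMAT='' history 1 | sed 's/^ *[0-9]* *//')
  sqlite3 "$HOME/.shellhist.db" \
    "CREATE TABLE IF NOT EXISTS hist(ts INTEGER, dir TEXT, cmd TEXT);
     INSERT INTO hist VALUES(strftime('%s','now'), '$PWD',
                             '$(printf '%s' "$cmd" | sed "s/'/''/g")');"
}
PROMPT_COMMAND=log_last_cmd
```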

thekashifmalik
2 replies
14h12m

Do you know any way to accomplish the same thing while keeping the history in plaintext? I would love to still be able to run SQL over it.

hughesjj
0 replies
10h55m

Write to two places in the hook?

bdcravens
0 replies
13h50m

Couldn't you just point DuckDB or a similar tool to the history file?

cwaffles
3 replies
16h26m

Atuin: https://atuin.sh/

Note: I am a user, and not the author (Ellie Huxtable)

netcoyote
1 replies
15h29m

I’m an atuin user too, and think it’s great. It’s a significant improvement over the bash history configurations (incantations?) I used previously.

Where atuin really shines is in keeping a single unified history across multiple shell windows, which my incantations could never get to work correctly on all the platforms I use (zsh/bash on OSX/Linux/msys/cygwin/babun).

I’ve also enjoyed running SQL queries on my atuin-history to learn more about my own workflows to see where I can optimize.

Thanks!

ellieh
0 replies
10h28m

thank you for the kind words!

I’ve also enjoyed running SQL queries on my atuin-history to learn more about my own workflows to see where I can optimize.

if you could share more about what you found useful there, that would be amazing

llimllib
0 replies
15h49m

I’m a long-term user, and it’s been a fantastic tool for me. Simple, useful, and never in my way.

Wishing her the best of luck making it her job.

jorisboris
2 replies
14h56m

Off-topic: there is a “wtf” domain extension?!

mkl
0 replies
14h54m
input_sh
0 replies
11h11m

We're up to ~1500 TLDs.

At this point I just auto-assume there's a domain for every word I'm interested in, though some are not open to registration or prohibitively expensive.

ctur
2 replies
14h0m

While I doubt I'd quit my day job for it, over the past couple of years I've been poking at my own database-backed shell history. The key requirements for me were that it be extremely fast and that it support syncing across multiple systems.

The former is easy(ish); the latter is trickier since I didn't want to provide a hosted service but there aren't easily usable APIs like s3 that are "bring your own wallet" that could be used. So I punted and made it directory based and compatible with Dropbox and similar shared storage.

Being able to quickly search history, including tricks like 'show me the last 50 commands I ran in this directory that contained `git`' has been quite useful for my own workflows, and performance is quite fine on my ~400k history across multiple machines starting around 2011. (pxhist is able to import your history file so you can maintain that continuity)

https://github.com/chipturner/pxhist
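The "last 50 commands containing `git` run in this directory" query described above can be sketched in plain SQL. This is an illustration against a made-up schema with ts, cwd, and cmd columns (pxhist's actual schema may differ):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE history (ts INTEGER, cwd TEXT, cmd TEXT)")
db.executemany(
    "INSERT INTO history VALUES (?, ?, ?)",
    [
        (1, "/home/me/project", "git status"),
        (2, "/home/me/project", "ls"),
        (3, "/home/me/other", "git log"),
        (4, "/home/me/project", "git commit -m 'wip'"),
    ],
)

# Last 50 commands containing "git" run in the current directory,
# newest first.
rows = db.execute(
    """
    SELECT ts, cmd FROM history
    WHERE cwd = ? AND cmd LIKE '%git%'
    ORDER BY ts DESC
    LIMIT 50
    """,
    ("/home/me/project",),
).fetchall()
```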

neurostimulant
0 replies
11h35m

CouchDB might be useful for this scenario due to its multi-master support so devices can sync to each other without using a centralized database. It's also very performant, though if you put gigabytes of data into it, it'll also consume gigabytes of RAM.

abathur
0 replies
12h4m

Built something similar (though I've yet to get around to the frontend for it; I vaguely intend to borrow one).

I neither love nor hate it as a sync mechanism, but I ended up satisficing with storing the history in my dotfile repo, treating the sqlite db itself as an install-specific cache, and using sqlite exports with collision-resistant names for avoiding git conflicts.

alexchantavy
2 replies
15h34m

Anyone know what were the big inflection points in 2022 and 2023 for star growth?

mkl
1 replies
15h1m

2023's one may be HN: https://news.ycombinator.com/item?id=35839470

Ellie's Show HN was in 2021: https://news.ycombinator.com/item?id=27079862

2022's must be somewhere else.

ellieh
0 replies
10h22m

Pretty much!

Atuin tends to get shared more readily on Mastodon/Twitter than it does HN, which explains everything other than the 2023-HN-spike. We've also been on a few podcasts and newsletters

xialvjun
1 replies
9h6m

I'm going to quit my job to relax. And my way of relaxing is to build some open source things.

walteweiss
0 replies
8h14m

That’s how I approach open source in the first place. And part of the appeal is that others can help you along your way, while you let them use your code as well.

ukd1
1 replies
1h14m

Eventually I suspect someone will abuse this for the free (and encrypted) storage, and then they'll have to fight that battle, which won't be fun, and they'll likely end up charging. They should probably start charging for storage sooner rather than later, and just keep self-hosting free.

kazinator
0 replies
1h9m

which won't be fun

Says who?

souvlakee
1 replies
13h41m

Is it possible to sync existing shell history? Fish history, specifically.

ellieh
0 replies
10h24m

Yes! You can import your existing history from all shells we support:

- zsh
- bash
- fish
- nushell

ram_rar
1 replies
11h54m

I'm curious about the calculus involved in dedicating oneself to a full-time open-source project. Could someone with prior experience share insights on generating income or potential future exits solely through open-source contributions?

QuantumG
0 replies
11h17m

Freelancers typically make money by coding up solutions, leaning heavily on open-source, for individual clients (as opposed to mass-market software) or other forms of consulting. Sometimes they will become the caretaker of one or more "projects" that multiple clients rely on. That maintenance is billable. It's not uncommon for contract programmers to become small businesses. If they think they can get support from the community, open-source becomes the best option.

da39a3ee
1 replies
15h32m

I use this and it's been fantastic and rock solid (I've been using Unix shells intensively for 15 years; I started using it a year ago and never looked back).

latchkey
0 replies
19m

The real question here is whether you'd pay for it.

chaxor
1 replies
12h19m

Why does it have kubernetes?

ellieh
0 replies
8h24m

Because some people like to host it on Kubernetes

zubairq
0 replies
11h12m

I like all these new use cases for SQLite, which appears to be a new platform itself

tra3
0 replies
1h30m

Best of luck. I hope there's a path forward where open source can provide a reasonable income stream.

I maintain a couple of open source packages for Emacs -- it's a labour of love. I'm happy to help folks with their issues, but it's easy to say "sorry, I don't have capacity to add this feature" or "no, I don't think this is a good fit for the project". If I depended on this for money... well, that would change the whole approach, wouldn't it?

thesurlydev
0 replies
4h15m

Logging into another machine just to get a long command resonated with me and is enough that I've decided to give it a try. That, and I'm a sucker for tools written in Rust :)

teddyh
0 replies
3h16m

This person has gone from having a solid business model (having a job) to having at best a very vague one (“I hope I can focus on adding new premium hosted features for advanced users, and begin to support business usage.”).

You know what this sounds like to me? “I quit my job to play with and walk my dog full time. I am hoping this will give me time to develop skills to eventually earn some money by maybe walking other people’s dogs, or maybe doing public performances with my dog.”

spiral09
0 replies
14h49m

Yeah it's blocked, how to check if sites are blocked

pull_my_finger
0 replies
14h27m

Bold move given the current job market. Always a risk, but seems especially so in 2024. Hope it works out!

ptek
0 replies
16m

Nice website \o/: dark mode, a simple sidebar layout (which I might steal) with a clickable "Notes" section, and it loads fast.

3 Trackers with EFF Privacy badger.

Best of luck Ellie :o)

prakhar897
0 replies
12h56m

Does fig.io have this feature?

poidos
0 replies
15h17m

atuin is awesome! I self-host with 0 problems across my machines.

nonameiguess
0 replies
1h55m

I think she is tackling a problem here that can't be solved by another tool, no matter how good it is. Thus, ever getting enough people to pay for it will be tough.

The issue isn't "I can't remember how to run a command I just ran." The issue is that the universe of CLI tooling you use is too large, too inconsistent, and too complex to remember how to use them all. Conventions may be wildly different between Windows and Unix, BSD and GNU, many tools have existed for over 50 years now and have accumulated enormous feature creep. Many newer tools try to improve upon perceived complexity of past tools, but by being different, they introduce even more complexity into the overall set of tools for anyone who can't abandon the past tools. There are huge debates about environment variables, config file formats, whether parameters should use one dash or two, what even is a parameter versus an argument versus a flag, how a tool should use STDOUT versus STDERR, how it should use exit codes, whether output should be structured or free text, and nobody agrees on the answers. There is very little standardization, and where standards exist, you can't count on anything to actually follow these.

Contrast this with the tools of a painter or woodworker. They're similar enough that learning to paint in a high school art class transfers muscle memory near perfectly to every brush and surface you ever use for the rest of your life. Creating a tool like this is throwing up your hands and saying no human can ever hope to remember how to use their tools, so they need an additional tool that remembers for them. But now we also need to remember how to query this memory augmenter, so you've introduced yet another thing to learn for anyone who isn't willing to stop learning other tools entirely and rely 100% on yours.

It isn't to say it can't be useful, but you're trying to solve an ecosystem problem with a tool. You can't. At best, you can alleviate a tiny portion of the difficulty for a very small number of users sufficiently similar to you. Then you run into the culture, mentioned elsewhere, of not having to pay for these things. Windows and Mac may be paid systems, but once you pay, you automatically get the full suite of system utilities and CLI tooling. BSD and GNU were free creations made largely by university professors and industry professionals in their spare time for the purpose of sharing, not for making money. Fair or not, the expectation became and will likely remain that these tools either come as part of a larger package, or they're donated from the spare time of their own users.

Exceptions are few and far between. You've got things like curl and openssl that sustain themselves reasonably well as open source CLI packages, but even those don't charge for the tool itself. They only succeed because they're so ubiquitous that if a barely perceptible proportion of users ever donate or pay for support, that is still enough. That model doesn't work if your userbase isn't virtually the entire world of computing.

ndyg
0 replies
35m

This is fantastic. I'd love to live in a world populated by tens of thousands of incredible, practical projects like Atuin, whose maintainers earn a good living. Best of luck, Ellie!

lepetitchef
0 replies
17m

All the best Ellie. I use atuin and never miss any command from my past sessions.

The only feedback I want to call out: sometimes when I close the terminal tab, the atuin server keeps running in the background and I get a warning message.

k3vinw
0 replies
11m

You’re my hero, Ellie! Congratulations on your success and best wishes for the future.

hiq
0 replies
1h4m

I have some local setup replicating what Atuin does, but I'm realizing I don't use it at all even though I thought it'd be useful, and I don't use my shell history much overall. I have an alias "aled" to edit my aliases quickly, so it's easy for me to add new ones, and that's where I put the commands I want to use regularly, or even ones I'd like to be able to find years from now if I ever need them again. It's easier to add documentation in a .zshrc than in a shell history in any case.

Global aliases (which you can use anywhere, not only at the beginning) are also nice to compose aliases with each other.
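A sketch of the setup described above as a .zshrc fragment; the definition of "aled" is a guess at what the commenter might use, and the G/L global aliases are just examples of composition:

```shell
# ~/.zshrc fragment (illustrative). A normal alias only expands at the start
# of a command line; a global alias (alias -g) expands anywhere, so globals
# can be composed mid-pipeline.
alias aled='${EDITOR:-vi} ~/.zshrc'   # hypothetical "aled": edit aliases quickly
alias -g G='| grep -i'                # global: usable anywhere on the line
alias -g L='| less'

# Usage: `ps aux G atuin L` expands to `ps aux | grep -i atuin | less`.
```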

g-b-r
0 replies
14h28m

Standard shell history searching is so bad, a good version of that would already be a huge leap for most people

firecall
0 replies
15h46m

Good luck!

The product is great, and I discovered it just the other day!

Via the Console email news letter I think! :-)

darkteflon
0 replies
14h21m

Said this last time Atuin popped up, but absolutely love it. I’m by no means a power user but there’s just something so elegant about the UX of combined shell history across multiple machines.

bsnnkv
0 replies
13h57m

TIL about Atuin!

I just deployed this to my "everything" NixOS server with `services.atuin.enable` and synced a few of my machines up with it. Very cool! I hope this move goes well for Ellie!

bdhcuidbebe
0 replies
1h3m

I'm one of those who disables shell history persistence, ever since I first found people's user folders on misconfigured Apache servers in the '90s, complete with .bash_history files containing passwords and hostnames.

I don't recall ever needing to dig up old commands I typed.

I don't think this one is for me, but good luck!

LanzVonL
0 replies
14h44m

I think it's great to see more women moving into full time open source careers. We don't have many!