I feel like such a downer when I ask this about Bun and Deno, but: why should I use them instead of Node?
I don’t mean to take away from the obviously impressive engineering effort here. But VC funding always gives me pause because I don’t know how long the product is going to be around. I was actually more interested in Deno when it promised a reboot of the JS ecosystem but both Bun and Deno seem to have discovered that Node interoperability is a requirement so they’re all in the same (kinda crappy) ecosystem. I’m just not sure what the selling point is that makes it worth the risk.
We could drastically simplify the build and deployment process of our services. By far the greatest advantage is that it runs TS natively. Dropping the compilation stage simplifies everything, from Docker imaging to writing DB migrations to getting proper stack traces.
You don't need source maps. You don't have to map printed stack traces to the source. Debugging just works. You don't need to configure directories because src/ is different than dist/ for DB migrations. You don't have to build a `tsc --watch & node --watch` pipeline to get hot reloading. You don't need cross-env. No more issues with cjs/esm interop. Maybe you don't even need a multi-stage Dockerfile.
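As a minimal sketch of what that looks like (file name hypothetical), the whole dev loop collapses to one command:

```typescript
// greet.ts: run directly with `bun run greet.ts`, or `bun --watch greet.ts`
// for hot reload. No tsc step, no dist/ directory, no source maps needed.
type Greeting = { name: string };

function greet(g: Greeting): string {
  return `hello ${g.name}`;
}

console.log(greet({ name: "world" })); // prints "hello world"
```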
That's for bun. Deno might have a similar story. We did not opt-in to the Bun-specific APIs, so we can migrate back if Bun fails. Maybe we could even migrate to something like ts-node. Shouldn't be that hard in that case.
IMHO the API of Bun, as well as the package manager, sometimes tries to be _too_ convenient or is too permissive.
So why doesn't any major runtime run Java natively? Or C++ natively? Or Rust natively?
Why is this such a cool unlock that hasn't been done for any other language?
---
85% of this is people tired/bored of Node.js.
I don't understand your point. I don't even understand your argument.
The java runtime runs java.
And C++ and Rust are compiled and have no runtime.
JVM runs java bytecode.
Java bytecode is compiled from Java (javac), Scala (scalac), etc
Java is compiled, too. The Java runtime is the JVM which runs byte code.
Just, what?!
Python is among the most popular languages and it doesn't require a compile step.
TypeScript is entirely metadata, so it just doesn't make sense to need to compile it.
JVM bytecode was designed to be bytecode from day 1.
JS is TS's bytecode, but it was designed as a language to be developed in, which causes impedance mismatches as tools and people get confused about the usage context.
Kind of. When you do try to run bun in production you'll find out that it has significant differences from node -- like not handling uncaught exceptions: https://github.com/oven-sh/bun/issues/429
Then you'll use bun build and run node in production, only to find that sourcemaps don't actually work at all: https://github.com/oven-sh/bun/issues/7427
So you'll switch to esbuild + node in production :)
Definitely excited for the promise of bun, but it's not quite baked yet.
We’ll add the uncaught exceptions handler likely before Bun 1.2 and fix the issue with sourcemaps. Sourcemaps do work (with some rough edges) at runtime when running TS/JSX/JS files, but are in a worse state with “bun build” right now.
We’ve been so busy with adding Windows support that it’s been hard to prioritize much else
Every couple weeks I try again to run my app fully on bun. For now I just use it as a packager.
The big ones for me are continual prisma issues. Mainly due to poor decisions on their side it seems…
Vite crashing. Because I’m using remix.
And then the worst one I don’t see a way around: sentry profiling which requires v8.
I can’t wait for the day everything can be on bun. Everything else sucks and is so slow or requires really bad configuration to make it work.
Can’t believe node itself and TS are so terrible with module compatibility. Bun solves all of this and is 20000x faster when I can use it!
What prisma issues are you running into? For us we just installed node alongside bun in our docker container and then ran prisma with node… was there something else?
Much appreciated and definitely rooting for bun! It’s still my goto choice for dev and can’t wait to switch production back to bun :)
You can use tsx as a loader with node if you want to run TypeScript directly:
node --import tsx ./file.ts
The problem is when you have ESM and then a tool in your repo, like jest, that requires CommonJS.
Now you have to compile stuff.
In my case I’ve had apps use certain ts config options, and then another library has a start script which is incompatible.
So you’re stuck needing a different TS config for both things. These annoyances are solved with bun
Does it support editing the source-files while in the debugger?
I've been hesitant to move to TypeScript because I'm unsure how well the debugger works in practice.
My current platform is Node.js + WebStorm IDE as a debugger. I can debug the JavaScript and I can modify it while stopping at a breakpoint or debugger-statement. It is a huge time-saver that I don't have to see something wrong with my code while in the debugger and then find the original source-file to modify and recompile and then restart.
Just curious, do Deno and Bun support edit-while-debug, out of the box? Or do I need to install some dependencies to make that work?
ts-node has all of these features too?
Yes, but it's slower. Here are the times for running a script that just prints hello.
Curious: does it run TS natively or does it just transpile for you? Because the former suggests exciting opportunity for better compiling or JITting if it can actually commit to holding on to the typing.
It does not do any type checking. You have to run tsc with noEmit separately. If you run `bun run foo.ts`, it just ignores all type annotations. It is transpiled to JS internally by removing the types (or it skips the types while parsing). While doing that, it keeps track of the original source locations. If you see some stack trace, you get the original location in the ts source.
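A tiny illustration of that split (file name hypothetical): `bun run erased.ts` executes this as-is because the annotations vanish at runtime, while `tsc --noEmit erased.ts` is the separate step that actually checks the types.

```typescript
// erased.ts
interface User {
  id: number;
  name: string;
}

const u: User = { id: 1, name: "ada" };

// After type stripping this is plain JS: the `User` type leaves no runtime trace.
console.log(typeof (globalThis as Record<string, unknown>)["User"]); // "undefined"
console.log(u.name); // "ada"
```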
Running tsc with noEmit is pretty much the standard in the frontend as well, as the TS is bundled by esbuild/rollup directly.
I'm not sure how difficult it would be for Nodejs to support .ts files natively. But if that's the main reason to use Bun, I'd be worried about its long term viability. Node could announce native .ts support at any time and then Bun might not look so good.
I can give a bit of perspective here. I'm currently porting the Vanilla Forums frontend (~500k lines of TypeScript) from Node, Yarn (we adopted it back before npm supported lockfiles), and Webpack to building with bun and Vite.
There are a few notable differences:
- The out of the box typescript interoperability is actually very nice, and much faster than using `ts-node` as we were before.
- Installations (although rare) are a fair bit faster.
- With bun I don't have to do the frankly crazy song and dance that node now requires for ES modules.
- Using bun is allowing us to drop `jest` and related packages as a dependency entirely and it executes our test suite a lot faster than jest did.
For my personal projects I now reach for bun rather than node because
- It has Typescript support out of the box.
- It has a nice test runner out of the box.
- It has good runtime compatibility with browsers (`fetch` is a good example).
- The built-in web server is sufficient for small projects and avoids the need to pull in various dependencies.
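For instance, the fetch-family globals (Request/Response/Headers) behave like their browser counterparts, so a snippet like this runs unchanged in Bun, Deno, Node 18+, or a browser console:

```typescript
// Response and Headers are globals in modern runtimes; no imports needed.
const res = new Response(JSON.stringify({ ok: true }), {
  headers: { "content-type": "application/json" },
});

console.log(res.headers.get("content-type")); // "application/json"
res.json().then((body) => console.log(body.ok)); // true
```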
ESM interop is inarguable. But these days Node has a test runner and compatibility with browsers (it implements fetch)… I guess I feel like Node is likely to catch up with most of this stuff over the lifetime of any long running project.
Sure, but node also has a huge amount of baggage and as others have pointed out is much slower.
One of things that makes me more bullish on bun rather than Deno is that bun is intentionally aiming for compatibility with node and the npm ecosystem while Deno doesn't seem to be.
My old-dog experience has proven multiple times that staying with the main reference tool for the platform always pays off long term, as most forks or guest languages eventually fade out after the hype cycle is over.
The existing tools eventually get the features that actually matter, and I avoid rewriting stuff twice; in the meantime I gladly help some of those projects backport into the reference tooling for the platform.
The only place I really haven't followed this approach was in regards to C++ in UNIX, which at first sight might feel I am contradicting myself, however many tend to forget C++ was born at Bell Labs, on the same building as UNIX folks were on, and CFront tooling was symbiotic with UNIX.
Yep! There is absolutely no reason to adopt early. If it becomes the reference tool one day, then it should be easy to switch.
But you also have to remember this is JS we’re talking about… stuff changes every 10 minutes.
How is the Bun test runner’s compatibility with Jest’s methods? Can a mature test suite be easily ported?
We are currently looking at vite and vitest to run 1600 jest tests.
You can track the progress here: https://github.com/oven-sh/bun/issues/1825
There's still a ways to go but folks are actively contributing.
Bun test is so enjoyably faster than Jest.
I have a file of thousands of string manipulation tests that Jest just crashes on after 3 minutes, while Bun runs it in milliseconds.
FYI if you want to make a list on HN you're gonna need to add an extra line break everywhere.
ElysiaJS is a good library when you do need a bit more with the routing + middleware. It has great benchmarks as well.
The most compelling argument for Deno is the permission system in my opinion. Node added a permission system recently, but it's much more coarse grained than Deno's. Being able to limit a script to listening on a specific hostname and port, or only allowing it to read a specific environment variable is pretty cool, and makes me less paranoid about trusting third party dependencies. Both Bun and Deno are also more performant than Node in many cases, and add a bunch of little quality of life improvements.
The real question is how much you can trust this. Those kinds of permission systems have been tried before - e.g. .NET used to have something called "Code Access Security". It was retired largely because the very notion of VM-enforced sandbox was deemed inadequate from experience. IIRC SecurityManager in Java was something similar, also deprecated for similar reasons. I'm afraid that Deno will just be a repeat of that.
I definitely wouldn't make the Deno sandbox my only line of defense — I'm a strong proponent of defense in depth. Now having said that, there's definitely a precedent for trusting V8's sandboxing capabilities. Cloudflare is running untrusted user code across their entire network and relying on V8 isolates as a sandboxing mechanism for Cloudflare Workers. I'm not sure I would go that far, but I do think we should be taking advantage of the strides browser developers have been making from a security perspective. When I re-watched Ryan Dahl's original conference talk where he introduced Deno, the sandboxing aspect was the part that resonated the most with me. But again, it's always best to have multiple layers of security. You should sandbox your applications and audit your dependencies, those mitigation techniques aren't mutually exclusive.
.NET sandbox was used for ClickOnce and Silverlight for many years. Java's was used for applets even longer. It worked until it didn't.
The people who designed those things ultimately threw in the towel and said that if you want that kind of security, use containers or VMs.
I can see why they chose that route. It's a huge maintenance burden. I can't imagine Google throwing in the towel when it comes to securing their browser's JS engine though.
That's assuming that you can fit everything within the needs of the browser sandbox. But server-side JS needs are broader than that.
It's much easier to worry about locking down the few server-side modules which allow access to the underlying OS, than it is to have to worry about securing V8's JIT compiler. Node's module-based permission system literally just bans certain standard library modules from being imported (Deno's is more fine grained thankfully). That's a much smaller attack surface area to worry about compared to securing the underlying JS engine.
Also, with Deno it becomes very easy to write a typed CLI. A .ts file can be run as a script, with permission access defined at the top of the script, such as:
#!/usr/bin/env -S deno run --allow-net
Then one can just run ./test.ts if the script has +x permission.
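A complete (hypothetical) script along those lines; the shebang caps what the script may do, so network access is granted while file and env access would still be denied at runtime:

```typescript
#!/usr/bin/env -S deno run --allow-net
// test.ts: chmod +x test.ts && ./test.ts
// Permissions are declared once, on the shebang line above.
const greeting: string = "hello from a typed script";
console.log(greeting);
```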
Also, projects such as https://cliffy.io have made writing CLIs way more enjoyable than in node.
It is a good idea to be wary of the VC funding. So it is a good idea to support projects such as Hono (projects that conform to modern web standards and are runtime agnostic across JS runtimes).
I do this all the time. I used to use `npx tsx` in my hashbang line to run TS scripts with Node, but I've started using Deno more because of the permissions. Another great package for shell scripting with Deno is Dax, which is like the Deno version of Bun shell: https://github.com/dsherret/dax
This looks cool. I've always used the npm package Inquirer (which also works with Deno), but I'll have to compare Cliffy to that and see how it stacks up in comparison.
Hono is awesome. It's fast, very well typed, runs on all JS runtimes, and has zero dependencies.
What do you think of WebAssembly modules? It looks to me like the shared memory support that came with the threading proposal (which seems somewhat widely supported) should allow libraries to be isolated in their own modules and exchange data through explicitly shared memory even if they run on the same thread.
With secure isolation being a requirement for web browsers, and with the backing of multiple big companies, it seems like there should be enough momentum to make it work properly. Or maybe the browsers rely entirely on process isolation between different origins and won't care about the security of isolating individual modules?
I used both Deno and Bun.
Bun is really nicely compatible with node.
Speed of course is excellent, but the main reason I use Bun and recommend it:
You replace node, npm, tsx, jest, nodemon and esbuild with one single integrated tool that's faster than all of the others.
Nicely compatible, until it's not.
We banned all these forks at work.
The dev onboarding overhead is not worth the benefits.
Having all 1700 repos using the same build tooling is more important than slight increases in build performance.
Banned? Is that why you had to post this on a green text account? Because that sounds immature. If you really have so many repos it sounds annoying that there isn't room for team level experimentation.
It's not immature, it's pragmatic. You do have to weigh the benefits of being able to use non-standard tools against the cost of not being able to reuse the same tooling, linters, compilers, and what-not for all projects.
When you have a lot of projects to support, it's rare for the benefits to outweigh the costs.
Devil's advocate: Deno and Bun are not yet fully backwards compatible with Node. I myself have run into a _ton_ of pain trying to introduce Bun for my team.
This can become a big time sink on bigger teams. That time could be saved by just not allowing it until a full team initiative is agreed on.
For what it's worth, I'll say that I can understand such top down governance: you'd have an easier time around moving across projects that you work on within the org, there'd be less risk of a low bus factor, BOM and documentation/onboarding might become easier.
Same as how there are Java or .NET shops out there, that might also focus on a particular runtime (e.g. JDK vendor) or tooling (an IDE, or a particular CI solution, even if it's Jenkins).
On the other hand, if the whole org would use MySQL but you'd have a use case for which PostgreSQL might be a better fit, or vice versa, it'd kind of suck to be in that particular situation.
It's probably the same story for Node/Deno/Bun, React/Vue/Angular or anything else out there.
No reason why that mandated option couldn't eventually be Bun, though, for better or worse.
Why do you have 1700 repos?
The wonders of JavaScript package dependencies, where basic CS stuff is a function exported as a package.
Probably an agency environment, or an enterprise environment that insists on having private mirrors of all 3rd party code.
Node is the safe choice, IMO. I tried Deno and I think it's cool, but I'm staying on Node for the time being. Things that Deno makes easier are not that hard with Node, and stability matters to me. For example, I had to spend a few hours rewriting my tiny service after a Deno API change. Don't have any experience with Bun, though.
If there are any specific places where Deno didn't support a node API, please file a bug -- we definitely shoot for as close to node.js as possible, and if you have to modify your code that's most often on us.
(I'm a deno core dev)
How do we report that? Some issues have come up with AWS on esm.sh: https://esm.sh/@aws-sdk/client-secrets-manager@3.540.0
It is just not working.
https://github.com/denoland/deno/issues is the ideal place -- we try to triage all incoming issues, the more specific the repro the easier it is to address but we will take a look at everything that comes in.
No, I wrote my service some time ago (basically a GitHub webhook that does some crypto to validate the payload and invokes kubectl) using the Deno API, and a few months ago I took some time to update dependencies, but found out that with the new standard library version some APIs were either deprecated or removed (I don't really remember), so I felt the need to rewrite it.
Don't take it as a criticism, I totally understand that you need to iterate on API and nobody promised me that it'd be stable till the end of time, but still work is work.
Ack, I ran into a similar issue when writing a github webhook. It might be related to JWT handling if it's the same issue.
If you recall that exact issue, definitely feel free to file an issue.
The speed increases are nothing to sneeze at; I've moved a few Vite projects over to Bun and even without specific optimizations it's still noticeably faster.
A specific use case where Bun beat the pants out of Node for me was making a standalone executable. Node has a very VERY in-development API for this that requires a lot of work and doesn't support much, and all the other options (pkg, NEXE, ncc, nodejs-static) are out-of-date, unmaintained, support a single OS, etc.
`bun build --compile` worked out-of-the-box for me, with the caveat of not supporting native node libraries at the time—this 1.1 release fixes that issue.
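For reference, the entry point can be ordinary TS (file and binary names here are hypothetical):

```typescript
// cli.ts: build a standalone binary with
//   bun build --compile cli.ts --outfile mycli
// The executable embeds the runtime, so the target machine needs no bun install.
const args: string[] = process.argv.slice(2);
console.log(`mycli invoked with: ${args.join(" ")}`);
```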
Bun's standalone executables are great, but as far as I'm aware unlike Deno and Node there's no cross compilation support, and Node supports more CPU/OS combinations than either Deno or Bun. Node supports less common platforms like Windows ARM for example (which will become more important once the new Snapdragon X Elite laptops start rearing their heads [1]).
[1] https://www.youtube.com/watch?v=uWH2rHYNj3c
We'll add cross-compilation support and Windows arm64 eventually. I don't expect much difficulty from Windows ARM once we figure out how to get JSC to compile on that platform. We support Linux ARM64 and macOS arm64.
Does it support building an EXE with all source-code removed?
I've hit significant failures every time I've tried to use `bun build --compile`; the most recent code I was trying to compile hit this one[1].
I documented how to build a binary from a JS app with node, deno, and bun here [2]. Node SEA is a bad DX, but not that complex once you figure it out.
1: https://github.com/oven-sh/bun/issues/6832
2: https://github.com/llimllib/node-esbuild-executable
I’m pretty sure issue #6832 was fixed in Bun v1.0.32 or so and we forgot to mark the issue as fixed
Will check once I get to my computer
Edit: was not fixed. We made other changes to fs.readSync to improve Node compatibility, but missed this. It will get fixed though
Bun's equivalent of "npm i" is extremely fast, at least an order of magnitude faster on all 3 of my machines.
Since I run "npm i" many times per day, that in itself is a big timesaver, not just for local dev but also in CI pipelines.
How does it compare to pnpm?
pnpm is 3x slower or even more
https://miro.medium.com/v2/resize:fit:832/1*noWyIXw-yeNZq5Db...
https://kinsta.com/wp-content/uploads/2023/12/bun-benchmark....
And for running scripts, it's not even close
https://miro.medium.com/v2/resize:fit:1400/1*oyPEXsFfZsTg4mO...
I found Bun to be faster. Monorepo support is a bit kludgy though. Once you know of the workarounds, it's ok. See my comment on https://github.com/oven-sh/bun/issues/5413#issuecomment-1956...
AFAIK, pnpm monorepos do not follow the standard npm layout. Bun does follow standard npm monorepos.
Pnpm's feature to override dependency versions is nice for legacy projects with many 3rd party dependencies. Not sure if Bun has the same feature. I mostly use it on greenfield projects with dependencies that I control.
Have you tried pnpm?
https://pnpm.io/
I haven’t used them yet for full sized apps, but they are both fantastic for scripting and small CLIs. Between the ease of running scripts, nice standard libraries, npm ecosystem, and excellent type system, I now feel TypeScript is a better scripting language than Python or Ruby.
I used to think so too, but that was because I had never really used Python. I still think Ruby is a mess, but it's so amazing how easy it is to manipulate data in Python, and so much faster.
I recently wrote a Node/Bun/Deno app that parses a 5k line text file into JSON.
The JavaScript on any runtime takes 30-45 seconds.
The Python implementation is sub 1 second.
I would not have been able to finish the tool so quickly if I were stuck relying on JS.
I still love Typescript but I'm not as blind about it now.
That run time doesn't make any sense. This script creates a 1,000,000 line CSV string and then parses it into JSON in 700ms with Bun, and it does both things the slow way: creating the string with a giant array map and join, and parsing with a giant split('\n') on newlines and map.
https://gist.github.com/david-crespo/8fea68cb38ea89edceb161d...
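Roughly the shape of it (a simplified sketch with a smaller N, not the exact gist code):

```typescript
// Build a CSV string the slow way (array map + join), then parse it back
// the slow way (split + map).
const N = 100_000;
const csv = Array.from({ length: N }, (_, i) => `${i},item${i}`).join("\n");

const t0 = performance.now();
const rows = csv.split("\n").map((line) => {
  const [id, name] = line.split(",");
  return { id: Number(id), name };
});
console.log(rows.length, `${(performance.now() - t0).toFixed(0)}ms`);
```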
What does the code do? 30-45 seconds to parse a 5k line text file into JSON sounds like something is going very wrong
The real question is why would I use Bun over Vite? Even the ThreeJS developers determined Vite is the best.
Bun and Vite are not analogous. Bun is a runtime with a standard library, bundler, test runner. Vite is a bundler. You can run Vite through Bun.
Bun has been great as a package manager, test suite runner and typescript interpreter. We use node in prod
I don't know how you're using Node and not thinking "I wish there was a better option than this". I can't wait to jump ship but Bun/Deno aren't quite there yet, for my needs.
Cute logo.
And UX is pretty great: integrated fetch, simplified fs api, integrated test runner (I miss good old TAP style assertions though), ESM/CJS modules just work, some async sugar.
I think if they offer me a paid *worker solution, with sqlite, that's something I'm willing to pay for.
It also helps avoid a node/v8 monoculture, just like with web browsers. I'm sure the ecosystem as a whole will get better because of it, even if you decide not to use it.
I really like Deno for small scripts and small side projects - it's just fast to get started with. And it allows me to use web standards, like URL imports to grab packages from CDNs instead of having a config file. There's just less to think about, like oh what was Node's crypto thing? Node is making strides in web compatibility, and building in things like a test runner. And I don't have much interest in migrating company projects away from Node. But Deno feels really fresh and light when I just need to run some JS.
See my answer "what's cool about deno" https://gitlab.com/brlewis/brlewis-aoc/-/blob/main/README.md...
I asked myself the same question a couple of weeks ago and decided to use Node for some side stuff, simply because Node is the most mature, boring choice. Still, I like the DX improvements of both Bun and Deno a lot. We'll see how it all plays out in a few years.
I can't speak for deno, but bun is drop in compatible for most things and the test runner speed alone is enough to make it worth using.