
Bun 1.1

afavour
82 replies
17h4m

I feel like such a downer when I ask this about Bun and Deno, but: why should I use them instead of Node?

I don’t mean to take away from the obviously impressive engineering effort here. But VC funding always gives me pause because I don’t know how long the product is going to be around. I was actually more interested in Deno when it promised a reboot of the JS ecosystem but both Bun and Deno seem to have discovered that Node interoperability is a requirement so they’re all in the same (kinda crappy) ecosystem. I’m just not sure what the selling point is that makes it worth the risk.

nikeee
20 replies
16h20m

We could drastically simplify the building and deployment process of our services. By far the greatest advantage is that it runs TS natively. Dropping the compilation stage simplifies everything, from building Docker images to writing DB migrations to getting proper stack traces.

You don't need source maps. You don't have to map printed stack traces back to the source. Debugging just works. You don't need to configure directories because src/ differs from dist/ for DB migrations. You don't have to build a `tsc --watch & node --watch` pipeline to get hot reloading. You don't need cross-env. No more issues with CJS/ESM interop. Maybe you don't even need a multi-stage Dockerfile.
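As a rough sketch (script names and paths are hypothetical), the two setups side by side as package.json scripts:

```json
{
  "scripts": {
    "dev:node": "tsc --watch & node --watch dist/index.js",
    "dev:bun": "bun --watch src/index.ts"
  }
}
```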

That's for Bun. Deno might have a similar story. We did not opt in to the Bun-specific APIs, so we can migrate back if Bun fails. Maybe we could even migrate to something like ts-node. Shouldn't be that hard in that case.

IMHO the API of Bun, as well as the package manager, sometimes tries to be _too_ convenient or is too permissive.

paulddraper
6 replies
15h0m

By far the greatest advantage is that it runs TS natively.

So why doesn't any major runtime run Java natively? Or C++ natively? Or Rust natively?

Why is this such a cool unlock that hasn't been done for any other language?

---

85% of this is people tired/bored of Node.js.

7bit
2 replies
10h59m

I don't understand your point. I don't even understand your argument.

The Java runtime runs Java.

And C++ and Rust are compiled and have no runtime.

paulddraper
0 replies
7m

The JVM runs Java bytecode.

Java bytecode is compiled from Java (javac), Scala (scalac), etc.

hibbelig
0 replies
9h45m

Java is compiled, too. The Java runtime is the JVM which runs byte code.

solumunus
0 replies
4h1m

Just, what?!

smt88
0 replies
14h10m

Python is among the most popular languages and it doesn't require a compile step.

TypeScript is entirely metadata, so it just doesn't make sense to need to compile it.

baq
0 replies
8h5m

the jvm bytecode has been designed to be bytecode from day 1.

JS is TS's bytecode, but it was designed as a language to be developed in, which causes impedance mismatches as tools and people get confused about the usage context.

ibash
4 replies
15h50m

Kind of. When you do try to run bun in production you'll find out that it has significant differences to node -- like not handling uncaught exceptions: https://github.com/oven-sh/bun/issues/429

Then you'll use bun build and run node in production, only to find that sourcemaps don't actually work at all: https://github.com/oven-sh/bun/issues/7427

So you'll switch to esbuild + node in production :)

Definitely excited for the promise of bun, but it's not quite baked yet.

Jarred
3 replies
15h41m

We’ll add the uncaught exceptions handler likely before Bun 1.2 and fix the issue with sourcemaps. Sourcemaps do work (with some rough edges) at runtime when running TS/JSX/JS files, but are in a worse state with “bun build” right now.

We’ve been so busy with adding Windows support that it’s been hard to prioritize much else

zackify
1 replies
14h26m

Every couple weeks I try again to run my app fully on bun. For now I just use it as a packager.

The big ones for me are continual prisma issues. Mainly due to poor decisions on their side it seems…

Vite crashing. Because I’m using remix.

And then the worst one I don’t see a way around: sentry profiling which requires v8.

I can’t wait for the day everything can be on bun. Everything else sucks and is so slow or requires really bad configuration to make it work.

Can’t believe node itself and TS are so terrible with module compatibility. Bun solves all of this and is 20000x faster when I can use it!

ibash
0 replies
13h13m

What prisma issues are you running into? For us we just installed node alongside bun in our docker container and then ran prisma with node… was there something else?

ibash
0 replies
13h15m

Much appreciated and definitely rooting for bun! It’s still my goto choice for dev and can’t wait to switch production back to bun :)

searchableguy
2 replies
14h50m

You can use tsx as loader with node if you want to directly run typescript.

node --import tsx ./file.ts

zackify
0 replies
14h23m

The problem is when you have ESM and then a tool in your repo, like Jest, requires CommonJS.

Now you have to compile stuff.

In my case I’ve had apps use certain ts config options, and then another library has a start script which is incompatible.

So you’re stuck needing a different TS config for both things. These annoyances are solved with bun

galaxyLogic
0 replies
14h26m

Does it support editing the source-files while in the debugger?

I've been hesitant to move to TypeScript because I'm unsure how well the debugger works in practice.

My current platform is Node.js + WebStorm IDE as a debugger. I can debug the JavaScript and I can modify it while stopping at a breakpoint or debugger-statement. It is a huge time-saver that I don't have to see something wrong with my code while in the debugger and then find the original source-file to modify and recompile and then restart.

Just curious, do Deno and Bun support edit-while-debug, out of the box? Or do I need to install some dependencies to make that work?

__alexs
1 replies
11h11m

ts-node has all of these features too?

xdennis
0 replies
8h55m

Yes, but it's slower. Here are the times for running a script that just prints hello.

    $ time bun hello.ts 
    real  0m0.015s
    user  0m0.008s
    sys   0m0.008s

    $ time ts-node hello.ts 
    real  0m0.727s
    user  0m1.534s
    sys   0m0.077s

Waterluvian
1 replies
16h7m

Curious: does it run TS natively or does it just transpile for you? Because the former suggests exciting opportunity for better compiling or JITting if it can actually commit to holding on to the typing.

nikeee
0 replies
15h59m

It does not do any type checking. You have to run tsc with noEmit separately. If you run `bun run foo.ts`, it just ignores all type annotations. It is transpiled to JS internally by removing the types (or it skips the types while parsing). While doing that, it keeps track of the original source locations. If you see some stack trace, you get the original location in the ts source.

Running tsc with noEmit is pretty much the standard in the frontend as well, as the TS is bundled by esbuild/rollup directly.
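To make that concrete, here's a tiny hypothetical file: the annotation below deliberately lies about the runtime value, and because types are simply stripped, a runtime like Bun never notices; only a type-checking pass such as `tsc --noEmit` is in a position to object (the double cast here silences even that, purely to show that annotations never affect runtime behavior):

```typescript
// The annotation claims `n` is a number, but after type stripping only
// the value remains: at runtime it is still a string.
const n: number = "42" as unknown as number;

console.log(typeof n); // "string", despite the `: number` annotation
```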

leptons
0 replies
13h16m

I'm not sure how difficult it would be for Nodejs to support .ts files natively. But if that's the main reason to use Bun, I'd be worried about its long term viability. Node could announce native .ts support at any time and then Bun might not look so good.

charrondev
10 replies
16h51m

I can give a bit of perspective here. I'm currently porting the Vanilla Forums frontend (~500k lines of TypeScript) from Node, Yarn (we adopted it back before npm supported lockfiles), and Webpack to building with Bun and Vite.

There are a few notable differences:

- The out of the box typescript interoperability is actually very nice, and much faster than using `ts-node` as we were before.

- Installations (although rare) are a fair bit faster.

- With bun I don't have to do the frankly crazy song and dance that node now requires for ES modules.

- Using bun is allowing us to drop `jest` and related packages as a dependency entirely and it executes our test suite a lot faster than jest did.

For my personal projects I now reach for bun rather than node because

- It has Typescript support out of the box.

- It has a nice test runner out of the box.

- It has good runtime compatibility with browsers (`fetch` is a good example).

- The built-in web server is sufficient for small projects and avoids the need to pull in various dependencies.

afavour
2 replies
16h43m

ESM interop is inarguable. But these days Node has a test runner and compatibility with browsers (it implements fetch)… I guess I feel like Node is likely to catch up with most of this stuff over the lifetime of any long running project.

fastball
0 replies
16h40m

Sure, but node also has a huge amount of baggage and as others have pointed out is much slower.

charrondev
0 replies
16h23m

One of the things that makes me more bullish on Bun rather than Deno is that Bun is intentionally aiming for compatibility with Node and the npm ecosystem, while Deno doesn't seem to be.

pjmlp
1 replies
9h51m

My old-dog experience has proven multiple times that staying with the main reference tool for the platform always pays off long term, as most forks or guest languages eventually fade out after the hype cycle is over.

The existing tools eventually get the features that actually matter, and I have avoided rewriting stuff twice; in the meantime I gladly help some of those projects backport into the reference tooling for the platform.

The only place I really haven't followed this approach was in regards to C++ in UNIX, which at first sight might feel I am contradicting myself, however many tend to forget C++ was born at Bell Labs, on the same building as UNIX folks were on, and CFront tooling was symbiotic with UNIX.

yurishimo
0 replies
9h22m

Yepp! There is absolutely no reason to upgrade early. If it becomes the reference tool one day, then it should be easy to switch.

But you also have to remember this is JS we’re talking about… stuff changes every 10 minutes.

1123581321
1 replies
15h38m

How is the Bun test runner’s compatibility with Jest’s methods? Can a mature test suite be easily ported?

We are currently looking at vite and vitest to run 1600 jest tests.

triyambakam
0 replies
14h9m

Bun test is so enjoyably faster than Jest.

I have a file of thousands of string manipulation tests that Jest just crashes on after 3 minutes, while Bun runs it in milliseconds.

fastball
0 replies
16h41m

FYI if you want to make a list on HN you're gonna need to add an extra line break everywhere.

briantakita
0 replies
13h25m

- The built-in web server is sufficient for small projects and avoids the need to pull in various dependencies.

ElysiaJS is a good library when you do need a bit more with the routing + middleware. It has great benchmarks as well.

throwitaway1123
9 replies
16h47m

The most compelling argument for Deno is the permission system in my opinion. Node added a permission system recently, but it's much more coarse grained than Deno's. Being able to limit a script to listening on a specific hostname and port, or only allowing it to read a specific environment variable is pretty cool, and makes me less paranoid about trusting third party dependencies. Both Bun and Deno are also more performant than Node in many cases, and add a bunch of little quality of life improvements.

int_19h
8 replies
15h28m

The real question is how much you can trust this. Those kinds of permission systems have been tried before - e.g. .NET used to have something called "Code Access Security". It was retired largely because the very notion of VM-enforced sandbox was deemed inadequate from experience. IIRC SecurityManager in Java was something similar, also deprecated for similar reasons. I'm afraid that Deno will just be a repeat of that.

throwitaway1123
6 replies
15h18m

I definitely wouldn't make the Deno sandbox my only line of defense — I'm a strong proponent of defense in depth. Now having said that, there's definitely a precedent for trusting V8's sandboxing capabilities. Cloudflare is running untrusted user code across their entire network and relying on V8 isolates as a sandboxing mechanism for Cloudflare Workers. I'm not sure I would go that far, but I do think we should be taking advantage of the strides browser developers have been making from a security perspective. When I re-watched Ryan Dahl's original conference talk where he introduced Deno, the sandboxing aspect was the part that resonated the most with me. But again, it's always best to have multiple layers of security. You should sandbox your applications and audit your dependencies, those mitigation techniques aren't mutually exclusive.

int_19h
5 replies
14h45m

.NET sandbox was used for ClickOnce and Silverlight for many years. Java's was used for applets even longer. It worked until it didn't.

The people who designed those things ultimately threw in the towel and said that if you want that kind of security, use containers or VMs.

throwitaway1123
4 replies
14h35m

The people who designed those things ultimately threw in the towel and said that if you want that kind of security, use containers or VMs.

I can see why they chose that route. It's a huge maintenance burden. I can't imagine Google throwing in the towel when it comes to securing their browser's JS engine though.

int_19h
3 replies
14h12m

That's assuming that you can fit everything within the needs of the browser sandbox. But server-side JS needs are broader than that.

throwitaway1123
2 replies
13h47m

It's much easier to worry about locking down the few server-side modules which allow access to the underlying OS, than it is to have to worry about securing V8's JIT compiler. Node's module-based permission system literally just bans certain standard library modules from being imported (Deno's is more fine grained thankfully). That's a much smaller attack surface area to worry about compared to securing the underlying JS engine.

vmfunction
1 replies
11h1m

Also, with Deno it becomes very easy to write a typed CLI. A .ts file can be run as a script very easily, with permission access defined at the top of the script, such as:

#!/usr/bin/env -S deno run --allow-net

Then one can just run ./test.ts if the script has +x permission.

Also, projects such as https://cliffy.io have made writing CLIs way more enjoyable than Node.

It is a good idea to be wary of the VC funding. So it is a good idea to support projects such as Hono (projects that conform to modern web standards and are runtime-agnostic across JS runtimes).

throwitaway1123
0 replies
1h36m

Also, with Deno it becomes very easy to write a typed CLI. A .ts file can be run as a script very easily, with permission access defined at the top of the script, such as:

I do this all the time. I used to use `npx tsx` in my hashbang line to run TS scripts with Node, but I've started using Deno more because of the permissions. Another great package for shell scripting with Deno is Dax, which is like the Deno version of Bun shell: https://github.com/dsherret/dax

Also project such as https://cliffy.io has made writing cli way more enjoyable than node.

This looks cool. I've always used the npm package Inquirer (which also works with Deno), but I'll have to compare Cliffy to that and see how it stacks up in comparison.

Hono (projects conform to modern web standard, and is runtime agnostic for JS)

Hono is awesome. It's fast, very well typed, runs on all JS runtimes, and has zero dependencies.

ptx
0 replies
4h14m

What do you think of WebAssembly modules? It looks to me like the shared memory support that came with the threading proposal (which seems somewhat widely supported) should allow libraries to be isolated in their own modules and exchange data through explicitly shared memory even if they run on the same thread.

With secure isolation being a requirement for web browsers, and with the backing of multiple big companies, it seems like there should be enough momentum to make it work properly. Or maybe the browsers rely entirely on process isolation between different origins and won't care about the security of isolating individual modules?

yolm
8 replies
16h0m

I used both Deno and Bun.

Bun is really nicely compatible with node.

Speed of course is excellent, but the main reason I use Bun and recommend it:

You replace node, npm, tsx, jest, nodemon and esbuild with one single integrated tool that's faster than all of the others.

bducycy
7 replies
15h57m

Nicely compatible, until it's not.

We banned all these forks at work.

The dev onboarding overhead is not worth the benefits.

Having all 1700 repos using the same build tooling is more important than slight increases in build performance.

triyambakam
3 replies
14h7m

Banned? Is that why you had to post this on a green text account? Because that sounds immature. If you really have so many repos it sounds annoying that there isn't room for team level experimentation.

yen223
0 replies
12h50m

It's not immature, it's pragmatic. You do have to weigh the benefits of being able to use non-standard tools against the cost of not being able to reuse the same tooling, linters, compilers, and what-not for all projects.

When you have a lot of projects to support, it's rare for the benefits to outweigh the costs.

dimgl
0 replies
13h49m

Devil's advocate: Deno and Bun are not yet fully backwards compatible with Node. I myself have run into a _ton_ of pain trying to introduce Bun for my team.

This can become a big time sink on bigger teams. That time could be saved by just not allowing it until a full team initiative is agreed on.

KronisLV
0 replies
13h38m

If you really have so many repos it sounds annoying that there isn't room for team level experimentation.

For what it's worth, I'll say that I can understand such top-down governance: you'd have an easier time moving across projects that you work on within the org, there'd be less risk of a low bus factor, and BOM and documentation/onboarding might become easier.

Same as how there are Java or .NET shops out there, that might also focus on a particular runtime (e.g. JDK vendor) or tooling (an IDE, or a particular CI solution, even if it's Jenkins).

On the other hand, if the whole org would use MySQL but you'd have a use case for which PostgreSQL might be a better fit, or vice versa, it'd kind of suck to be in that particular situation.

It's probably the same story for Node/Deno/Bun, React/Vue/Angular or anything else out there.

No reason why that mandated option couldn't eventually be Bun, though, for better or worse.

tipiirai
2 replies
12h58m

Why do you have 1700 repos?

pjmlp
0 replies
9h53m

The wonders of JavaScript package dependencies, where basic CS stuff is a function exported as a package.

darylteo
0 replies
8h8m

Probably an agency environment, or an enterprise environment that insists on having private mirrors of all 3rd-party code.

vbezhenar
5 replies
16h58m

Node is the safe choice, IMO. I tried Deno and I think it's cool, but I'm staying on Node for the time being. The things Deno makes easier are not that hard with Node, and stability matters to me. For example, I had to spend a few hours rewriting my tiny service after a Deno API change. I don't have any experience with Bun, though.

mmastrac
4 replies
15h59m

If there are any specific places where Deno didn't support a node API, please file a bug -- we definitely shoot for as close to node.js as possible, and if you have to modify your code that's most often on us.

(I'm a deno core dev)

mmastrac
0 replies
2h21m

https://github.com/denoland/deno/issues is the ideal place -- we try to triage all incoming issues, the more specific the repro the easier it is to address but we will take a look at everything that comes in.

vbezhenar
1 replies
13h24m

No, I wrote my service some time ago (basically a GitHub webhook which does some crypto to validate the payload and invokes kubectl) using the Deno API, and a few months ago I took some time to update dependencies but found that with the new standard library version some APIs were either deprecated or removed (I don't really remember which), but I felt the need to rewrite it.

Don't take it as a criticism, I totally understand that you need to iterate on API and nobody promised me that it'd be stable till the end of time, but still work is work.

mmastrac
0 replies
2h20m

Ack, I ran into a similar issue when writing a github webhook. It might be related to JWT handling if it's the same issue.

If you recall that exact issue, definitely feel free to file an issue.

Osmose
5 replies
16h36m

The speed increases are nothing to sneeze at; I've moved a few Vite projects over to Bun and even without specific optimizations it's still noticeably faster.

A specific use case where Bun beat the pants out of Node for me was making a standalone executable. Node has a very VERY in-development API for this that requires a lot of work and doesn't support much, and all the other options (pkg, NEXE, ncc, nodejs-static) are out-of-date, unmaintained, support a single OS, etc.

`bun build --compile` worked out-of-the-box for me, with the caveat of not supporting native node libraries at the time—this 1.1 release fixes that issue.

throwitaway1123
2 replies
16h24m

Bun's standalone executables are great, but as far as I'm aware unlike Deno and Node there's no cross compilation support, and Node supports more CPU/OS combinations than either Deno or Bun. Node supports less common platforms like Windows ARM for example (which will become more important once the new Snapdragon X Elite laptops start rearing their heads [1]).

[1] https://www.youtube.com/watch?v=uWH2rHYNj3c

Jarred
1 replies
14h46m

We'll add cross-compilation support and Windows arm64 eventually. I don't expect much difficulty from Windows ARM once we figure out how to get JSC to compile on that platform. We support Linux ARM64 and macOS arm64.

galaxyLogic
0 replies
14h6m

Does it support building an EXE with all source-code removed?

Jarred
0 replies
15h3m

I’m pretty sure issue #6842 was fixed in Bun v1.0.32 or so and we forgot to mark the issue as fixed

Will check once I get to my computer

Edit: was not fixed. We made other changes to fs.readSync to improve Node compatibility, but missed this. It will get fixed though

angra_mainyu
4 replies
15h11m

Bun's equivalent of "npm i" is extremely fast, at least an order of magnitude faster on all 3 of my machines.

Since I run "npm i" many times per day, that in itself is a big timesaver, not just for local dev but also in CI pipelines.

rlt
2 replies
14h49m

How does it compare to pnpm

briantakita
0 replies
13h38m

I found Bun to be faster. Monorepo support is a bit kludgy though. Once you know of the workarounds, it's ok. See my comment on https://github.com/oven-sh/bun/issues/5413#issuecomment-1956...

AFAIK, Pnpm monorepos do not follow standard npm. Bun does follow standard npm monorepos.

Pnpm's feature to override dependency versions is nice for legacy projects with many 3rd party dependencies. Not sure if Bun has the same feature. I mostly use it on greenfield projects with dependencies that I control.

arcanemachiner
0 replies
14h50m

Have you tried pnpm?

https://pnpm.io/

dcre
3 replies
14h37m

I haven’t used them yet for full sized apps, but they are both fantastic for scripting and small CLIs. Between the ease of running scripts, nice standard libraries, npm ecosystem, and excellent type system, I now feel TypeScript is a better scripting language than Python or Ruby.

triyambakam
2 replies
14h12m

I used to think so too, but that was because I had never really used Python. I still think Ruby is a mess, but it's so amazing how easy it is to manipulate data in Python, and so much faster.

I recently wrote a Node/Bun/Deno app that parses a 5k line text file into JSON.

The JavaScript on any runtime takes 30-45 seconds.

The Python implementation is sub 1 second.

I would not have been able to finish the tool so quickly if I were stuck relying on JS.

I still love Typescript but I'm not as blind about it now.

dcre
0 replies
12h45m

That runtime doesn't make any sense. This script creates a 1,000,000-line CSV string and then parses it into JSON in 700ms with Bun, and it's doing both things the slow way: creating the string with a giant array map and join, and parsing with a giant split('\n') and map.

https://gist.github.com/david-crespo/8fea68cb38ea89edceb161d...
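For reference, a minimal sketch of the kind of deliberately naive generate-then-parse run described above (the three-numeric-column layout is an assumption for illustration, not the gist's exact code):

```typescript
// Build a million-line CSV the "slow" way: one big map + join.
const N = 1_000_000;
const csv = Array.from({ length: N }, (_, i) => `${i},${i * 2},${i * 3}`).join("\n");

// Parse it the "slow" way: one big split + map into objects.
const rows = csv.split("\n").map((line) => {
  const [a, b, c] = line.split(",");
  return { a: Number(a), b: Number(b), c: Number(c) };
});

console.log(rows.length, JSON.stringify(rows[0]));
```

Even written like this, holding the whole string and every row object in memory at once, it finishes on the order of a second on current runtimes, which is why 30-45 seconds for a 5k-line file suggests something pathological elsewhere in the code.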

Jarred
0 replies
14h10m

What does the code do? 30-45 seconds to parse a 5k line text file into JSON sounds like something is going very wrong

Solvency
1 replies
15h54m

The real question is why would I use Bun over Vite? Even the ThreeJS developers determined Vite is the best.

dcre
0 replies
14h40m

Bun and Vite are not analogous. Bun is a runtime with a standard library, bundler, test runner. Vite is a bundler. You can run Vite through Bun.

waynenilsen
0 replies
16h54m

Bun has been great as a package manager, test suite runner and typescript interpreter. We use node in prod

solumunus
0 replies
4h6m

I don't know how you're using Node and not thinking "I wish there was a better option than this". I can't wait to jump ship but Bun/Deno aren't quite there yet, for my needs.

postepowanieadm
0 replies
11h37m

Cute logo.

And UX is pretty great: integrated fetch, simplified fs api, integrated test runner (I miss good old TAP style assertions though), ESM/CJS modules just work, some async sugar.

I think if they offer me a paid *worker solution, with sqlite, that's something I'm willing to pay for.

hiccuphippo
0 replies
16h36m

It also helps avoid a node/v8 monoculture, just like with web browsers. I'm sure the ecosystem as a whole will get better because of it, even if you decide not to use it.

crabmusket
0 replies
14h55m

I really like Deno for small scripts and small side projects - it's just fast to get started with. And it allows me to use web standards, like URL imports to grab packages from CDNs instead of having a config file. There's just less to think about, like oh what was Node's crypto thing? Node is making strides in web compatibility, and building in things like a test runner. And I don't have much interest in migrating company projects away from Node. But Deno feels really fresh and light when I just need to run some JS.

WuxiFingerHold
0 replies
9h56m

I asked myself the same question a couple of weeks ago and decided to use Node for some side stuff, simply because Node is the most mature, boring choice. Still, I like the DX improvements of both Bun and Deno a lot. We'll see how it all plays out in a few years.

CuriouslyC
0 replies
15h1m

I can't speak for deno, but bun is drop in compatible for most things and the test runner speed alone is enough to make it worth using.

Jarred
26 replies
1d2h

I work on Bun and happy to answer any questions.

Note that Bun v1.1 is still compiling at the time of writing (probably for another 20 minutes)

toastal
9 replies
1d2h

Now that Discord will start showing ads, is there any chance Bun will support a communication platform that is open & ad-free like IRC, XMPP, or Matrix?

dcgudeman
7 replies
1d

What kind of question is this? He is working on a JavaScript/TypeScript runtime not building a communication product. Why would you run to him to solve your pet grievance with Discord?

toastal
3 replies
23h54m

If open-source, free software is a good enough ethos for your code base, it should be good enough for your community communications. Supporting only a proprietary platform locks out a swath of users, and Discord in particular is an information black hole—and now with ads!

fastball
2 replies
16h45m

Which users are unable to use Discord? Note that unable is different from unwilling, and "locked out" is not an accurate description of the latter.

vladxyz
0 replies
16h23m

Me! Discord does not consider me a real person if I don't have a phone number it approves of.

toastal
0 replies
13h49m

Users that need special clients (accessibility, hardware, etc.). Users blocked by US sanctions. Users that have been moderated off the platform for something outside your community (account bans even happen accidentally). Users with privacy/anonymity concerns about the data collection (and now ads), especially given that the chat rooms require a SIM card. Users that take their FOSS or otherwise ethical software views, or anti-corporate views, to heart and want something built on those principles: from wanting to use free software to make free software, to wanting to outright avoid what some now call enshittification, where a free (gratis) account clashes with the idea of freedoms, etc.

tbeseda
0 replies
1d

I imagine the complaint is around Bun's use of Discord as community coordination tooling -- linked in the site's header. The post isn't implying Bun should be involved in the creation of an alternative.

(I'm not here to throw shade at using Discord for OSS "communities" - I do as well - And am concerned about the path forward. Just want to clarify the question's intent.)

scubbo
0 replies
14h34m

I think they used the word "support" to mean "use", rather than "build". That is - they weren't asking Bun developers to _build_ an alternative to Discord, but rather to stop _using_ Discord.

mdeeks
0 replies
1d

I'm assuming it is because Bun uses Discord? There are Discord links on their site and on Github, though the links don't seem to work.

Jarred
0 replies
15h52m

Ship an open-source Discord alternative that more people want to use than Discord and we’ll happily switch.

gr4vityWall
4 replies
1d2h

Is there a privacy policy available for the telemetry collected by default by Bun?

throwitaway1123
3 replies
1d1h

The opt-out telemetry is worrisome. Along with the fact that there doesn't appear to be a way to disable it for single-file executables if you plan to distribute a Bun cli app to users: https://github.com/oven-sh/bun/issues/8927

throwitaway1123
1 replies
15h31m

Yeah I saw a Github discussion where he mentioned that the code for uploading telemetry data was disabled, but he also said he plans to re-enable that at some point: https://github.com/oven-sh/bun/discussions/2605#discussionco...

I would prefer to have the telemetry become opt-in before data collection is turned on.

whstl
0 replies
4h20m

Completely agree.

eddd-ddde
2 replies
1d2h

Are there plans for adding concurrent or parallel execution to the test runner? I recently tried looking at the code base to maybe implement it myself, and it looks like it wouldn't be easy without some reworks.

Jarred
1 replies
15h54m

We need to do some form of this, but I'm not exactly sure what yet. I suspect same process but multiple globals might work well. A lot of tests spend time sleeping or waiting for things; they might benefit from that kind of parallelism (like async/await, except between things it runs a whole other global object).

Threads could also work, but the problem is you have to re-parse & evaluate all the code. That's a lot of duplicate work. It's probably still worth it for large enough apps.

eddd-ddde
0 replies
12h31m

Isn't there some way of cloning a loaded VM after loading a module? I would imagine that should be possible somehow; that way you could parse once, then execute in multiple threads.

cheema33
1 replies
21h42m

I work on Bun and happy to answer any questions.

I think I saw somewhere that 1.0 did not support NextJS. Does 1.1?

yolm
0 replies
15h58m

Not a question, but wanted to say: Great job!

txdv
0 replies
1d

love it, even `bun upgrade` is fast:

On my rpi 4 that I capped to 600 MHz for performance testing:

    Bun v1.1.0 is out! You're on 1.0.36 [3.93s] Upgraded.

theThree
0 replies
1d2h

When will `worker` API be ready for production?

rcarmo
0 replies
10h27m

When are we getting UDP/dgram support?

fofolo
0 replies
1d2h

GG for Windows support and all the additions in 1.1 :) Thxx

WuxiFingerHold
0 replies
9h38m

Yours and the team's performance is so impressive. I'm not brave enough to use Bun in production yet, but count me in in a year or so ... great stuff.

pier25
16 replies
17h10m

Is Bun executing TS or is it also compiling down to JS and executing that?

Edit:

The docs mention:

Because Bun can directly execute TypeScript, you may not need to transpile your TypeScript to run in production. Bun internally transpiles every file it executes (both .js and .ts), so the additional overhead of directly executing your .ts/.tsx source files is negligible.

https://bun.sh/docs/runtime/typescript

The idea I'm getting from this is that both JS and TS are transpiled to something else. Are types preserved in this bytecode, AST, or whatever it is?

vbezhenar
7 replies
16h56m

Transpiling TS is a really easy task, because the TS developers made a huge effort to make it possible. You basically remove all types and that's about it. It could probably be done with a simple character-streaming algorithm.

At this point, IMO, it should just be implemented within V8. Would make things much simpler for everyone.
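As a toy illustration of the "just remove types" idea (a hypothetical snippet; real transpilers also have to handle the few constructs with runtime semantics, like enums):

```javascript
// TypeScript source:
//   function add(a: number, b: number): number {
//     return a + b;
//   }
//
// After erasing the annotations, the emitted JavaScript is the same
// code with the types removed - nothing else changes:
function add(a, b) {
  return a + b;
}

console.log(add(2, 3)); // 5
```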

whstl
0 replies
4h20m

Hmmm good catch.

But on the other hand, it would be already fantastic to have at least a subset of TypeScript...

WorldMaker
0 replies
45m

It is more compatible than not, and there are workarounds for most of the things that wouldn't work: many you shouldn't be using in TypeScript today anyway, some have known workarounds (enums), and some you probably already have lint rules warning against (namespaces). (There are more details elsewhere in that proposal document, outside of the directly linked FAQ question.)

forty
1 replies
11h16m

Why would doing work at run time that can be done at build time be a good idea? I have a CI, I'd rather have it so the build work rather than delegating that to my production servers.

forty
0 replies
11h13m

(note that you still have to tsc everything anyway in the CI to check the types, so when you ship TS files to production your CI does the hard work but then doesn't finish the easy part, so your prod server has to do it? Why?)

chuckadams
0 replies
14h8m

It's not just stripping types: enums have a runtime representation that needs to be generated.
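As a sketch of that point, this is approximately the JavaScript tsc generates for a two-member enum (the exact output varies by compiler version and options):

```javascript
// enum Color { Red, Green } compiles to an IIFE that builds a
// two-way lookup table at runtime - this can't simply be erased.
var Color;
(function (Color) {
    Color[Color["Red"] = 0] = "Red";
    Color[Color["Green"] = 1] = "Green";
})(Color || (Color = {}));

console.log(Color.Red); // 0
console.log(Color[0]);  // "Red"
```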

Kognito
4 replies
17h0m

It’s not running TS directly, it’s just preconfigured to transpile TS to JS without the user having to bring extra tooling. Neat, but you’ll see the docs still recommend tsc for type checking at build.

galaxyLogic
3 replies
13h47m

I wonder what's the benefit of TS if there's no type-checking? If types are not checked that means the TS type-declarations could be totally wrong and nobody would know. In other words they could be misleading.

Why incur the type-declaration overhead if they are not used after all?

pfg_
2 replies
12h53m

This is how typescript is run today. Typescript types never exist at runtime regardless of how typescript is run. There is no overhead defining types because they are deleted at runtime. The purpose of typescript is to make the editor experience better (autocomplete, error highlight). Typically typechecking is run in addition to tests to make sure there aren't a bunch of errors no one saw in editor.

galaxyLogic
1 replies
9h49m

So "type-checking is run". Could it not be run by Bun automatically?

noahtallen
0 replies
12m

It could be, but even today without Bun, a common approach is to do type checking in a separate step from the build. This is because tsc doesn’t parallelize well, so type checking will slow down the build a lot. So you can put the type check step in a separate CI job, and have it fail like unit tests would. Then the main build can be a lot faster since it just has to strip the annotations.

Plus, for local dev, iteration and watch/rebuild is more important than failing with invalid types on every change. Sometimes it’s helpful to circle back to fix/update types after you’ve tried a couple approaches. (TS can still be finicky at times!) On top of that, your IDE should report type errors as you work anyways.
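A minimal sketch of that split as package.json scripts (script names and the esbuild invocation are illustrative, not from this thread):

```json
{
  "scripts": {
    "typecheck": "tsc --noEmit",
    "build": "esbuild src/index.ts --bundle --outfile=dist/index.js"
  }
}
```

CI can then run `typecheck` and `build` as independent jobs, so a slow tsc doesn't block producing the artifact.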

emmanueloga_
0 replies
17h5m

"TypeScript & JSX support. You can directly execute .jsx, .ts, and .tsx files; Bun's transpiler converts these to vanilla JavaScript before execution."

--

https://bun.sh/docs#design-goals

crabmusket
0 replies
14h49m

or is it also compiling down to JS and executing that?

This is the only way to execute TypeScript. That's how every tool that "executes" TypeScript works.

Jarred
0 replies
15h57m

Bun transpiles Typescript to JavaScript before execution, removing types and doing some dead code elimination

gregoriol
10 replies
9h49m

Is it me or does this project try to do too many things at once? "Bun is an npm-compatible package manager", and an http server, and a websocket server, and a test library, and a bundler, and... why?

norman784
3 replies
9h4m

I think it's that modern languages like Go and Rust (there are surely others, but I don't have experience with any other language that ships with all the tools) come with everything you need: formatter, linter, test runner, etc. Go goes even further than Rust and ships a very complete std focused on the web. JavaScript is used primarily on the web, so it makes sense for a runtime to ship with all the libraries needed to build a web server; WebSocket is also a standard, so it's easy to implement and make it work with the browser.

Nowadays if you start a new JavaScript project you need to set up vite/esbuild/webpack, eslint/oxlint/biome, prettier, typescript, etc. That is a ton of dependencies that YOU need to maintain for years; if they are part of the tool you are using, then you don't. Ideally there shouldn't be breaking changes; let's see how Bun manages that when the time comes.

I am waiting for Bun or a tool that has everything I need to bundle my frontend app; I'm very tired of fiddling with all the dependencies and trying to make them work together. I have a legacy project that I work on, and I would have migrated to another tool a long time ago, but there is none that would fix the current issue, which is managing the project's build, test, formatter, and lint dependencies. After using Rust I feel super frustrated with the state of the JavaScript ecosystem.

Also, you asked why? Because I want to work on the project. There is already a lot of work in keeping dependencies up to date; the tooling should not be part of that work.

CaptainOfCoit
1 replies
6h29m

Nowadays if you start a new Javascript project you need to setup, vite/esbuild/webpack, eslint/oxlint/biome, prettier, typescript, etc, that is a ton of dependencies that YOU need to maintain for years

That's a choice YOU make, not everyone makes that choice, especially because they want to be able to continue working on a project for months/years without accruing automatic technical debt as all those projects move forward without actually thinking about backwards compatibility.

noahtallen
0 replies
19m

Right, but that’s why languages like Rust have an excellent DX. There is no complex choice about what linter, formatter, test runner, doc builder, or package manager to use. These are all such common requirements for building software at scale with lots of contributors that the language tooling just includes them. It’s not hard to maintain, because the language toolchain is so foundational it needs to be stable.

JS doesn’t have anything like that, which is why projects like Deno, Bun, Biome, etc are interesting. These projects explore how JS can also get a great out of the box experience without requiring the complex setup and maintenance steps that so many existing tools require.

Besides, professionally, you normally don’t make the choice in a vacuum. Linters, package managers, testing, bundling/building, type safety, and even formatting are all very useful in big projects with lots of people. So you often don’t get to say “ah we just won’t have unit tests because jest doesn’t care enough about backwards compatibility.”

gregoriol
0 replies
3h31m

No one likes to set up or update things. But when everything is coupled (why not even include a framework in Bun?), you are even more dependent on choices made by them and will be forced to upgrade no matter what. For example, if they decide that their implementation of the test component wasn't good enough in version X and they completely reimplement it in version X+1, you will have to upgrade your code, but maybe at that point you don't want to rewrite all your test suites, you just want to get the new http3 request handler, but you still must rewrite all your test suites...

When things are not that coupled, independent components you bring together, you can update the webserver, and/or some components, and keep the old version of the test library for now, until you decide it's time to upgrade.

dimitrisnl
3 replies
9h38m

That's the selling point

gregoriol
2 replies
9h17m

That's the scary point

norman784
0 replies
9h2m

I like their selling point and if there is enough demand, I think the node community will implement at least a few of those features.

dimitrisnl
0 replies
5h53m

Migrating bundlers and changing test-runners is no fun. If bun can handle that with consistent performance and no goofy upgrade paths, I'm happy.

melodyogonna
0 replies
9h2m

Because the project would be a failure otherwise, and including those is the main goal of the project.

dgellow
0 replies
9h13m

Because that’s what people want. That’s how you can get a really good developer experience similar to golang or other languages. Just install one tool to build, lint, format, run tests, and run your local project. No time spent trying to set up a bundler when what you want is to build a new project.

Regarding runtime libraries, it’s similar to the battery-included approach of go or python: you get what you need to get started out of the box and only reach for dependencies when you want to go further. A testing library, an http server, a websocket server: that’s perfectly reasonable to have as the core library of a runtime developed to run web servers.

gr4vityWall
10 replies
1d1h

Seems like a good release. I watched their video, and some charts were a bit unclear, as in, I didn't know if they were comparing with the previous Bun version or Node.js.

My experience with using Bun in side projects has been good. The built-in APIs work well in my experience, and I hope popular runtimes adopt at least a subset of them. The hashing and the SQLite bindings come to mind as APIs that I wish were available in Deno and Node.js as well.

They collect some telemetry by default. I don't think the install script tells you that. The only mention of it that I've found is in the bunfig documentation: https://bun.sh/docs/runtime/bunfig#telemetry

I'd prefer if it was opt-in, and that users were given instructions for disabling it if they want to during installation.

They offer an option to create a dependency-free executable for your project, which bundles the runtime with your .js entrypoint. That works great if you want a single binary to distribute to users, but at the moment, the file size is still pretty big (above 90MB on GNU/Linux for a small project). Not terrible, but nothing comparable to Go or QuickJS yet. I wonder if in the future, Bun would offer an option to compile itself with certain features disabled, so we'd get a smaller binary.

I have been playing with using Bun as a Haxe target. It works pretty well and IMO it's a choice to consider if you like Haxe more than TypeScript, or if you want to add a web server to an existing Haxe project without adding another programming language. You can also do things like generating validation code at compile time, which seems hard to do with just TypeScript.

notelem
3 replies
16h7m

They collect some telemetry by default

I wondered why every time I upgraded bun macos would pop up a permissions dialog. This explains that.

Anyway, it can be disabled by adding the following to your environment:

    export DO_NOT_TRACK=1
This is the data collected:

https://github.com/oven-sh/bun/blob/801e475c72b3573a91e0fb4c...

This 64 bit machine_id line is concerning:

https://github.com/oven-sh/bun/blob/801e475c72b3573a91e0fb4c...

It may be a unique identifier for your machine.

Jarred
2 replies
16h1m

That is unrelated. This is a code signing / notarization issue because we don’t distribute Bun via the Mac App Store, and likely the way it was installed was via npm or something other than the .zip file we distribute. Code signing is necessary due to the JIT entitlement in Bun (otherwise Bun would be a whole lot slower)

Jarred
0 replies
15h26m

This is dead code from before Bun 1.0. This code exists but is not run and probably stripped from the final executable (Zig is great at dead code elimination). We do get the Linux kernel version to detect if syscalls like pidfd_open are supported and enable fast paths.

stefanos82
1 replies
21h0m

Thank you for letting me know about telemetry; I had no idea.

rnmkr
0 replies
17h1m

And yet people went crazy when Golang announced they wanted to enable informed opt-out telemetry.

moritzwarhier
1 replies
21h17m

I'd prefer if it was opt-in, and that users were given instructions for disabling it if they want to during installation.

So you mean an informed opt-out, right?

Thank you for the interesting hands-on experiences and insights regarding Bun. I followed the coverage and posts by Jarred here on HN for quite a while, think since the initial alpha release, but haven't used it.

Will keep your examples of helpful added platform APIs in mind for when I hopefully get around to doing something with Bun!

It sounds like a great platform for JS scripting as well - I also think that could be a good and easy way to test the waters.

Really, kudos to Bun and Jarred Sumner for living up to the promise he made when the first version was announced!

gr4vityWall
0 replies
20h37m

So you mean an informed opt-out, right?

I worded it wrong - I meant I'd prefer if it was opt-in, or at least informed opt-out, like you said. Thanks for pointing it out. :)

It sounds like a great platform for JS scripting as well - I also think that could be a good and easy way to test the waters.

It is. Some other features you might enjoy are the built-in TypeScript support and test runner. It works well for one-off scripts too, if you'd prefer not using Bash. For me, it was refreshing coming from Node.js. Hope it is an enjoyable experience for you as well.

Jarred
1 replies
16h3m

Note that we do not currently send telemetry anywhere. The extent of what we track right now is a bitset of what builtin modules were imported and a count of how many http requests were sent with fetch(), and a few things like that. This is used in the message printed when Bun panics so we can have a better idea of what the code was doing/did when it crashed

gr4vityWall
0 replies
1h59m

Sorry, I didn't understand your post. From the last 2 phrases, it seems to be the case that you do collect some data.

So, when you say

Note that we do not currently send telemetry anywhere.

You mean that Oven does not send that data to someone else, right?

I still see value in having a privacy policy so that users can find out what is collected by Oven in a concise way, and how to opt out of that. As far as I know, the fact that any data is collected at all, and that there's a flag to disable it, is only mentioned in a documentation page for Bun's TOML config file.

pritambarhate
6 replies
1d

What's the revenue model for Bun? What happens when the VC funding runs out?

marviel
3 replies
23h36m

I'm not sure why this is being downvoted -- it feels like a valid question for anything that enjoys wide adoption.

fastball
2 replies
16h47m

It's a valid question, but does it matter for anyone except the dev team? Bun is open source, so VC-backing is mostly a helpful jumpstart. If they find a viable business model – great, development can be funded in perpetuity. If they don't, development was funded for a while by someone else's money and then Bun is just like any other open source project that lacks direct funding (most of them).

marviel
0 replies
16h19m

I suppose you're right -- the MIT model makes it a non-issue

keb_
0 replies
6h48m

I think it does matter. Open-source software can still suffer from "enshittification" when there is constant need to generate profit. Fortunately it is open source, so it can be forked when things get bad, but even then there still may be lots of tech debt to undo.

eddd-ddde
0 replies
23h24m

I imagine some hosted service like everyone does is a likely option.

galaxyLogic
6 replies
13h42m

Looks like a great development.

One thing I miss in Node.js is the ability to run an HTTPS server in a simple way without having to fiddle with generating & installing a correct type of certificate. I understand there can be a "self-signed certificate", but there doesn't seem to be any npm module I could install to take care of that.

Since Bun is a "server-side" JavaScript platform, it would be great if it could support HTTPS out of the box too.

galaxyLogic
1 replies
9h51m

Does it work with Node.js or Bun or Deno?

francislavoie
0 replies
12h9m

More precisely, it runs its own local CA to issue certs for itself. Not exactly the same as a self-signed cert (a cert signed with its own key), but better, because leaf certs are short-lived and easily cycled. This allows for setting up trust easily: you just need to trust the one root CA cert, and every leaf cert for any domain served will be trusted.

eddd-ddde
0 replies
12h33m

Does this work? https://get.localhost.direct/

No need to mess with self signed certs, there are already some public 127.0.0.1 wildcard domains like the one I linked.

Aspos
0 replies
13h36m

Curious how do you actually see this working. Care to elaborate?

thatguyagain
4 replies
10h11m

I find it hilarious that we now present runtimes and other programming stuff like it was Apple presenting a new iPhone. This would have been satire 15 years ago. No disrespect to Bun tho, I love Bun.

yurishimo
0 replies
9h31m

I feel like it’s meant to be satire here as well.

madsbuch
0 replies
9h6m

The audience for these types of announcements is bigger.

My intuition is that there are many more consumers of node-like environments today than of any runtime 15 years ago.

daredoes
0 replies
2h26m

I find these articles useful when migrating a legacy system, as they sometimes contain migration notes or rare minor details from the developers.

This combined with the Wayback machine makes for a great way to keep track of detailed information

CuriouslyC
0 replies
4h49m

That's because there's so much cacophony emanating from the internet that you have to shout to be heard today.

liampulles
4 replies
10h59m

"...Bun on Windows passes 98% of our own test suite for Bun on macOS and Linux."

Does this mean the release was made with failing tests, or am I misunderstanding?

basil-rash
1 replies
9h49m

Skipping particular tests depending on platform is a very common practice, for better or worse.

CaptainOfCoit
0 replies
6h28m

That's not what's happening. Bun has tests that are supposed to work on the platform, but currently don't.

Skipping tests from a Linux suite that don't make sense to run on Windows is very common. Skipping tests that should pass on a platform but don't, just in order to cut a release, isn't as common.

vallode
0 replies
10h49m

Looks like it. It seems the 2% are mostly odd platform-specific issues that the authors did not deem very important (my assumption for the release happening anyway). AFAIK this[1] PR tries to fix them.

[1]: https://github.com/oven-sh/bun/pull/9729

baq
0 replies
8h17m

At some point it becomes infeasible to 100% the test suite for all configurations.

At some point further away it becomes infeasible to 100% the test suite for any configuration.

earslap
4 replies
16h7m

Did not realize Bun had (even if rudimentary) macros - bundle time executing code support. That is pretty neat! https://bun.sh/docs/bundler/macros

Waterluvian
1 replies
16h1m

Wow that’s pretty neat. Because of that I also learned about import attributes (https://github.com/tc39/proposal-import-attributes) which is probably going to be quite useful and make the 50 lines of imports in some of my files look even dumber.

Valodim
0 replies
11h52m

Some loaders already encode stuff in the path url style, e.g. vite-svg-loader

import iconUrl from './my-icon.svg?url'

Maybe I'm not imaginative enough but this seems like a reasonably restricted (i.e. simple) way of parametrizing imports.

agos
0 replies
9h21m

this is one of the most interesting new things in the bundler space - Parcel is introducing some form of macro too.

It's way overdue to have some way to program what happens at bundling time (without writing your own bundler plugin)

fastball
1 replies
16h51m

I don't think the bun devs are currently parcelling things up like that so doubt this would be worth the effort at this time, i.e. I don't think they're backporting any fixes from 1.1 into a 1.0.x release.

captn3m0
0 replies
16h9m

Stating that would be helpful as well.

tempaccount420
0 replies
7h53m

No LTS versions please. Release often, but don't break old code. Be more like Rust (v1 forever) than like Node (new major releases every year or so).

tjenkinsqs
3 replies
22h37m

We've been using bun for a while now. We love the speed, but we love the integration even more. No need to use node, npm, nodemon, tsx, esbuild and jest.

Bun is our one-stop-shop for Typescript.

Thank you!

fastball
2 replies
16h43m

Can I ask what you use it for specifically?

tjenkinsqs
0 replies
6h36m

Bun is used in github.com/yolmio/boost as a Typescript runner to build a complete system. We use it also for all kinds of other scripts.

ivanjermakov
0 replies
15h38m

I use Bun as a test and dev build runner for TS programs. I still compile with tsc though.

fleetfox
3 replies
12h20m

Faster this, faster that. Is it finally segfault-free? I've tried it like 3 times in the span of the last year with different projects, only to find out it segfaults at runtime or when installing a package.

dimgl
0 replies
11h15m

Same. Tons of tooling breaks and segfaults. Our codebase has a dylib unknown symbol error that hasn't been fixed since before v1.

QuinnyPig
0 replies
12h19m

But look how quickly it segfaults!

CuriouslyC
0 replies
4h39m

I only use bun for tests/builds/storybook, but I haven't had it segfault at all. I suspect that you've got a dependency that is hitting an undocumented node API that isn't fully implemented. They talk about those in the blog post; they're a known thing.

ericyd
3 replies
15h45m

Not sure I see the benefit of the bun shell. I use shell scripts when I know that the other people using the script will be able to run it in a similar shell to me, in order to cut down on dependencies. If I need it to be cross platform I just use a scripting language like JS.

Bun shell keeps the more esoteric syntax of Unix-like shells but also requires a dependency (Bun itself). If you already have Bun installed why wouldn't you just write a JS script?

zamadatix
0 replies
15h20m

I can definitely see the value of not wanting to write the boilerplate kind of stuff you need for more shell-like scripting. I can also see wanting to emulate the traditional syntax while abstracting away whether you are on a Unix system, and still allowing traditional JavaScript syntax in between the shell-y parts.

On the dependency side it'd be slick if you could bun --compile these like normal bun apps.

pfg_
0 replies
12h59m

It's really nice when I need to do shell stuff - much nicer to be able to use js to write a shell script than to either go look up shell syntax again or use js with child_process.spawn()

CuriouslyC
0 replies
4h41m

I mandate WSL when working with Windows developers, and GNU coreutils for Mac developers, so that I can assume some things about their dev environments. This solves that problem for shell scripts. There are still other areas where it's useful, but a scripting environment is probably the biggest one.

The velocity of writing filesystem and file-manipulation tasks in shell is many times greater than in javascript. Bun shell lets you leverage the power of shell for that stuff while being able to leverage your javascript code at the same time, with fairly minor downsides in exchange.

dgellow
3 replies
10h59m

Really impressive list of changes. Bun sounds like the dream node alternative, I hope they succeed in their mission.

And I’m glad they spent time on Windows support, that’s something neglected in the web development world

madsbuch
2 replies
10h33m

I am curious, why do you think they have not succeeded yet?

dgellow
0 replies
9h22m

From their website:

The goal of Bun is to run most of the world's server-side JavaScript and provide tools to improve performance, reduce complexity, and multiply developer productivity.

Bun is still pretty young and experimental, and not really production ready, though it’s getting there fast. If it grows enough to force node to improve, or if it takes over node, that would be a success based on their own goal

WuxiFingerHold
0 replies
9h41m

There's been some polls on social media: Overall picture was 80-90% using Node. Then Bun, then Deno. I'd bet in the real world it's 99% Node for production. If in 3 years 5% were using Bun, it would be a great success (Node usage is huge). I think they're on track, but would not recommend Bun for production backends as of now.

misiek08
2 replies
9h46m

So they're able to optimize Node.js and be $x times faster, but not able to calculate a percentage of speedup correctly? 1.3s vs 2s is not a 58% speedup - it's 35%...

Such an approach makes those projects look like unicorns, not production stuff - even though they are.

dgellow
0 replies
9h8m

You can communicate the same without the snarky tone

anonymoushn
0 replies
9h42m

Using the numbers given, it's a 53% speedup.
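Both replies are computing something real from the same 1.3s vs 2s figures; they just disagree on which ratio "speedup" names. A worked version of the arithmetic:

```javascript
const nodeTime = 2.0; // seconds
const bunTime = 1.3;  // seconds

// "35% less time": the fraction of wall-clock time saved.
const timeReduction = (nodeTime - bunTime) / nodeTime;

// "~54% faster": the increase in throughput (runs per unit time).
const speedup = nodeTime / bunTime - 1;

console.log((timeReduction * 100).toFixed(1)); // "35.0"
console.log((speedup * 100).toFixed(1));       // "53.8"
```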

MarkSweep
2 replies
13h52m

I was quite curious about the .bunx file format. I think this could be quite a useful thing, a universal binary format. Then I see the shim DLL:

https://github.com/oven-sh/bun/blob/801e475c72b3573a91e0fb4c...

Even before this past week's XZ backdoor revelation, checking binaries into source control rather than building from source seems quite questionable. In fairness to the Bun developers, they have a comment in their build.zig file acknowledging that this shim should be built more normally rather than being checked in.

Then I look into the source for it:

https://github.com/oven-sh/bun/blob/801e475c72b3573a91e0fb4c...

For no discernible reason, it is using a bunch of undocumented Windows APIs. The source cites this Zig issue as one reason for why they think it is OK to use undocumented APIs:

https://github.com/ziglang/zig/issues/1840

I don't see any good reasons cited here for using undocumented, unstable interfaces. For Zig's part, there seems to be some poorly-explained interest in linking against "lower level" libraries without any motivating use case (just some hand-waving about security and drivers, neither of which makes much sense. Onecore.lib is a thing if you wanted a documented way of linking an executable that runs on a diverse set of Windows form factors. And compiling drivers may as well be treated as a separate target, since function names are different). For Bun, I assume they are trying to have low binary size. But targeting NTDLL vs. Kernel32 should not make a big difference, especially when the shim is just doing basic file IO. For an example of making a small executable with standard APIs, you can make hello world 4kb using MSVC just by using /NODEFAULTLIB and /ENTRY:main with link.exe and this program:

    #include <Windows.h>
    #define MY_MSG "Hello World!\n"
    int main() {
        WriteFile(GetStdHandle(STD_OUTPUT_HANDLE), MY_MSG, sizeof(MY_MSG), nullptr, nullptr);
        return 0;
    }
So it should be possible to make a .bunx shim of small size without having to resort to undocumented APIs. (The current exe is 12kb.) But even if the shim exe was 100kb, that would still be a more acceptable tradeoff for me than having to debug any problem that results from using non-standard APIs.

nektro
1 replies
12h33m

the motivation behind zig#1840 is that while the functions in ntdll aren't as well documented as the kernel32 functions, they're not unstable, and not having our binaries depend on kernel32.dll would lead to faster startup times as well as allow us to do things like use more performant algorithms for UTF-8 <-> UTF-16 conversion. on top of the things mentioned in the issue like having APIs with more powerful features.

MarkSweep
0 replies
4h10m

For Bun's shim, it is linking against kernel32 anyway. And there is nothing special about its use of NtCreateFile, NtReadFile, NtWriteFile, and NtClose that would preclude it from using the standard functions.

I'm not sure it's possible to not have kernel32 loaded into your process anyways. Even if you create an EXE that imports 0 DLLs, kernel32 gets loaded into the process by NTDLL. The callstack from main:

    ConsoleApplication1.exe!main()
    kernel32.dll!BaseThreadInitThunk()
    ntdll.dll!RtlUserThreadStart()
There are valid reasons to use APIs from NTDLL. Where I disagree with zig#1840 is the idea that it is always better to use the NTDLL versions of APIs. Every other software ecosystem uses the standard Win32 APIs, and diverging from that without a good reason seems like a good way to have unexpected behavior. One concrete example: most users and programmers expect Windows to redirect some file system paths when running on WOW64. But this is implemented in Kernel32, not ntdll.

https://github.com/ziglang/zig/issues/11894

sharas-
1 replies
8h49m

Bun is run by a little prince

__jonas
0 replies
8h44m

could you elaborate?

preommr
1 replies
16h29m

Huge fan of Bun.

Came for the ts interoperability, stayed for the performance.

Also seems like the most sensible project in the space - I tried Deno and it was... rough. Bun on the other hand was easy to integrate and a very pleasant experience.

woutr_be
0 replies
14h58m

I started using Bun by default for small personal projects. Having to set up Node, with Typescript and reloads always took the fun out of quickly prototyping something.

Have yet to run Bun in production tho.

yoavm
0 replies
12h11m

I just tried using Bun to run one of our more complex projects. Did the same with Deno a week ago and too many things weren't working. With Bun everything loaded perfectly, and I could immediately drop ts-node and nodemon because they're essentially redundant when using Bun. Great stuff!

rcarmo
0 replies
10h29m

Still can’t use it without dgram support. But it’s nice to see it adding batteries.

ptx
0 replies
7h51m

The API for the shell function is kind of neat, in that it seems to prevent you from accidentally creating shell injection vulnerabilities by calling it without properly quoting the arguments.

For example, in Python you could easily do this:

  message = '; cat /etc/passwd'

  # Whoops, shell injection vulnerability!
  subprocess.run(f'echo Message: {message}', shell=True)

  # Correct (assuming sh-compatible shell).
  subprocess.run(f'echo Message: {shlex.quote(message)}', shell=True)

  # Correct (without using shell).
  subprocess.run(['/bin/echo', 'Message:', message])
But the Bun API doesn't separate quoting from executing the command, so you can't make that kind of mistake:

  let message = '; cat /etc/passwd';

  // Works correctly.
  await $`echo Message: ${message}`.text();

  // Fails safely by throwing error about incorrect usage.
  await $('echo Message: ' + message).text();

ojosilva
0 replies
1h15m

One of my favorites: Bun has a working FFI that is also fast.

https://bun.sh/docs/api/ffi

So has Deno, but Bun's feels more evolved, as it comes bundled with tools and sufficient examples for working with pointers. The only thing Bun is missing right now is the Deno equivalent of non-blocking FFI calls, i.e. `await mylib.myFunc()`.

Another one on my wishlist for Bun: embedded Bun. A library distribution would be nice, so that we could call into Bun as e.g. "libbun.so" from other languages. It would be more resourceful than just embedding WebKit/SpiderMonkey/v8, as these lack any real capabilities besides running vanilla JS.

modeless
0 replies
21h42m

Hooray for Windows support! That was keeping me from using Bun since I'm on Windows a fair bit. My experience with Bun has been excellent so far and I'm looking forward to using it more.

madsbuch
0 replies
10h34m

I use both Deno and Bun in production (albeit, on different projects).

They are both great upgrades from node. In particular with the first class support for TypeScript.

Bun is great for large projects with the enhanced DX over any Node-based environments I have worked on - I use it for a mono-repo project with several frontends and a GraphQL backend. Involved test suites run in 5 seconds, etc.

Deno seems to work really well in lambda-style environments (I use it with Supabase) due to its entirely standalone module approach. This is great for small scripts to glue things together.

brap
0 replies
7h24m

On one hand I like the work Bun is doing, on the other hand the Bun module is starting to look like a messy "utils" module, aka random junk drawer. All of them useful of course, just... incoherent.

bluelightning2k
0 replies
11h57m

This guy ships!

blackhaj7
0 replies
16h48m

Bun churning out the good stuff, yet again.

So often I read the release notes and think “what a great idea, why wasn’t this already a thing” e.g. syntax highlighted errors

atonse
0 replies
5h22m

I admire your video production values :) Nice job there. One thing: what gear do you guys use? The audio was really good for what was probably a shotgun mic.

Can you tell I don't use Bun yet? Because none of my questions are about the release. But I tried it out a few weeks ago on my NextJS project and it didn't work, so I will try Bun 1.1.

XCSme
0 replies
1h20m

Could I use this with Parcel.js for my local development of a React app?

Or does Bun also replace Parcel.js?

WuxiFingerHold
0 replies
9h59m

The energy and velocity behind Bun is mind blowing.

MrResearcher
0 replies
1h21m

Are there any measurements or statistics how many times a particular npm package was downloaded for usage in Node.js vs Bun vs Deno?

E.g. I see 9m downloads for vite: https://www.npmjs.com/package/vite

What proportion of that is currently for Bun?

CuriouslyC
0 replies
4h46m

Lots of good changes here. Bun Shell + Windows support solves a real problem.

Is `remix run dev` fully working now with the additional node API support?