What is the state of Deno's node.js compatibility when compared to Bun?
I'm interested in the WebGPU feature.
With Slint [1] we're working on a framework that lets you build a desktop GUI in JavaScript/TypeScript without bringing in a browser/webview. Currently we do it by shipping native binaries through napi-rs so we can bring up a window using the platform's native APIs, and then we do some hacks to merge the event loops.
But if Deno supports bringing up a window directly, this means we could just ship wasm instead of a native binary for each platform. I also hope the event loop integration would be simplified.
Although we'd also need more APIs than just showing a window (mouse and keyboard input, accessibility, popup windows, system tray, ...)
Edit: I got excited a bit too early. The WebGPU feature doesn't include an API to launch a window. One still needs to rely on an extra library binary.
This doesn't really seem related, it seems like you're saying "maybe we will use this, here is a link to our commercial GUI library".
Also, why would you want to make a GUI even more bloated by integrating a browser to get WebGPU and wasm (and why WebGPU instead of just WebGL?). That might be easy for library makers, but why would someone want a giant library to make a GUI when they already have Electron, if they don't care about 350MB binaries to pop up a window.
They are talking about using webgpu for rendering to screen without going through a browser.
Webgpu and wasm without a web browser would be a nice way to distribute portable sandboxed code.
The reason for WebGPU is that it has semantics closer to modern APIs like Metal or DirectX.
What does it have to do with Deno and why do any of that to draw boxes and text on the screen? FLTK could do GUIs that took almost no CPU power starting with 100KB binaries 30 years ago.
People doing hardware accelerated GUIs have been using openGL for almost as long. This doesn't need to be a science project or a rabbit hole, drawing a GUI quickly is well worn territory.
FLTK and other libraries cannot be used from JavaScript.
JavaScript is one of the most popular programming languages, but a JavaScript dev who needs to make a desktop GUI will usually need to use Electron to bring up a browser for the GUI. That means actually shipping two JavaScript engines (Node and Chromium).
The reason it has to do with Deno is that you need a JavaScript/TypeScript runtime, but you may not need a browser. So you can develop your application in JavaScript using a framework that doesn't use the DOM but shows a native window instead.
This is more lightweight and more secure (no risk of HTML injection, and the browser is a big attack vector).
FLTK is permanently stuck in the 90s. It lacks full Unicode support and right-to-left & bidirectional text, and it doesn't support accessibility tools.
Not to mention that development has been stagnant for 15 years, since its flagship application (Nuke) was ported to Qt.
Edit: I will not be taking this comment in good faith. I had a look through the parent's comments and it looks like they simply do not agree with Slint's approach to GUI programming: https://news.ycombinator.com/item?id=39223499#39229382
What does it have to do with Deno
Can we at least try to give people the benefit of the doubt instead of pretending everyone is out to get you? If you read the changelog, it spells it out well:
Our goal is to provide a windowing solution for WebGPU without linking to native windowing systems like X11
This is a low level API that can be used by FFI windowing libraries like sdl2, glfw, raylib, winit and more to create a WebGPU surface using native window and display handles.
do you get it yet?
People doing hardware accelerated GUIs have been using openGL for almost as long. This doesn't need to be a science project or a rabbit hole, drawing a GUI quickly is well worn territory.
I suggest talking to other graphics professionals to get a better understanding of why OpenGL is not _the_ solution. For a tl;dr [0]:
regular OpenGL on desktops had Problems. So the browser people eventually realized that if you wanted to ship an OpenGL compatibility layer on Windows, it was actually easier to write an OpenGL emulator in DirectX than it was to use OpenGL directly and have to negotiate the various incompatibilities between OpenGL implementations of different video card drivers. The browser people also realized that if slight compatibility differences between different OpenGL drivers was hell, slight incompatibility differences between four different browsers times three OSes times different graphics card drivers would be the worst thing ever. From what I can only assume was desperation, the most successful example I've ever seen of true cross-company open source collaboration emerged: ANGLE, a BSD-licensed OpenGL emulator originally written by Google but with honest-to-goodness contributions from both Firefox and Apple, which is used for WebGL support in literally every web browser.
Also, OpenGL is deprecated on macOS.
[0]: https://cohost.org/mcc/post/1406157-i-want-to-talk-about-web...
it looks like they simply do not agree with Slint's approach to GUI programming
My comment here is explaining exactly why I don't agree, no digging required.
Your quote is also about browsers implementing the WebGL API; it has nothing to do with using basic OpenGL for GUIs, which, again, is not required, because CPUs have been rendering GUIs for decades.
I totally get the idea of bringing JS into new environments because it is so ubiquitous. On the other hand, I hate the language and wish we could replace it in the browser, the opposite direction. I'd be much happier with ClojureScript or PureScript or something being the standard with the ecosystem to go with it.
Typescript takes away a significant amount of the pain for me; the only hold up after that was getting an environment set up to compile it. Deno supporting typescript without any configuration is incredible.
Big fan of custom rendering approaches, but wouldn't such a design system sidestep any and all accessibility tools? There'd be nothing for a screen reader to hook into; text and interfaces would be neither native nor DOM-based.
Slint uses https://github.com/AccessKit/accesskit to provide cross-platform a11y.
I poked around your website because I was curious if the name was a band reference. Love the Spiderland photo of the staff!
It actually does: it's not part of the WebGPU API itself, but it's separate: https://deno.com/blog/v1.40#webgpu-windowing--bring-your-own...
Would love to see the compile situation fixed - the generated executables are ~90MB+ at this stage and do not allow compression without erroring out. Deploying à la Golang is not feasible at that size, but could well be down the line if this dev branch is picked up again!
The exe output grew from ~50MB to ~90MB+ between 2021 and 2024: https://github.com/denoland/deno/discussions/9811 which means Deno is worse than Node.js's pkg solution by a decent margin.
When I download a modern game it's like 700GB, so I dunno why people complain about 100MB self-contained deploys for JavaScript. Most of it is the internationalization libraries anyway.
I find it pretty ridiculous: I go to a website today and it's at least 15MB each time I refresh, but 100MB on the server is a problem? Dude, c'mon.
There are size limits when deploying on serverless or edge infrastructure so developers have to care about that. The providers also typically charge by compute seconds * memory consumed so a larger executable costs real money as well.
The compiled option isn't intended for serverless or edge uses: you just deploy the source files and the platform takes over.
Some serverless use cases work like you say, but Docker-based options such as AWS ECS, Docker-based Lambda functions, or Kubernetes would all commonly make use of compiled options
This is whataboutism. Different sizes are acceptable for different people based on context. I worry about 10s of kilobytes for things I work on for instance.
So the fact that everything around is poorly made forbids him from whining that a certain solution is as bad as everything else? Shouldn't we criticize and point out stuff that is bad, no matter whether it's as bad as something else? It's because of attitudes like yours that we get 15MB of JavaScript on every website, 500GB games, and UIs that take seconds to load.
I'm not sure what your requirements are, but I've had a good amount of success with converting Node.js libraries to native libraries by embedding a CommonJS module into the binary, then running the actual code through QuickJS. Much smaller binaries.
If you really are pressed for space, you could use upx, or store 7z compressed code and embed the 7z library to decompress before passing it along to QuickJS.
Here's a proof of concept: https://github.com/ijustlovemath/jescx
Isn't QuickJS order(s) of magnitude slower than V8? That doesn't seem like a practical tradeoff to make outside of embedded.
Like I said, I wasn't sure of their requirements. I can say QuickJS is orders of magnitude easier to embed and understand than V8, which is why I adopted it for my use case.
Currently the generated binary is not static anyway, so you still need some parts of the system installed to run code. To be more precise, you can't use a "from scratch" container image base, but need to use something that at the very least has libgcc installed, such as the distroless "cc" image (gcr.io/distroless/cc).
Deploying à la Golang is not feasible at that size, but could well be down the line if this dev branch is picked up again!
While I agree with you that it's not optimal and should get fixed, 90MB doesn't sound like it can stop you from deploying it either.
I'm not sure using deno compile as a way of deploying to a controlled environment has much benefit anyway. Unlike some other languages/runtimes, only a single system dependency is really needed (a new-enough deno installation) to run your code
In my view, deno compile is more about shipping command line tools to people with all sorts of personal environments (which may not have deno at all)
Hi, Bartek from the Deno team here. We are actively looking into improving this situation and from initial investigations we were able to shave off roughly 40% of the baseline size, with more improvements possible in the future. Stay tuned for upcoming releases.
I know it sounds silly, but I love the aesthetic behind Deno's brand; it makes me want to use it for one project or another. Node.js is so old now: it's reliable but boring. I want something new, bold, and daring. Node.js is not it, although it was ~11 years ago. It's interesting how our views change.
This does indeed sound silly. If people were to use proper arguments to select technology, we wouldn't be in this mess in the first place. Unfortunately, most of us are more susceptible towards aesthetics, novelty, and admiration of self-proclaimed software gurus. Thanks for being open about this and sharing.
It would have been nice if I had realized this aspect of technological evolution earlier in my career. I might have spent more time learning about marketing than about the actual technology.
If people were to use proper arguments to select technology, we wouldn't be in this mess in the first place.
The ultimate killer argument is always "does it improve my CV?"
CV driven development is usually the reason for questionable technology choices
No worries, it's never too late to learn important truths about human nature!
Novelty is good. Not every new thing is an improvement, but every improvement is a new thing.
I'm not sure if this is real or a subtle parody of the stereotypical JS/node user.
Maybe both?
Yeah.. It's become a cliché that web developers, especially in the JavaScript ecosystem, are always chasing after what's "new and shiny". This leads to constant churn, new frameworks and libraries that reinvent the wheel, endless stuff to learn and catch up. It gets tiring after a while.
So the old-timers remind us, "Choose boring technology." Reliable and boring is a good thing.
On the other hand, I recently started learning Bun, and oh what a breath of fresh air, it makes things fun again. I love that its codebase is so readable and small still, and everything is freshly designed with the insights of experience and hindsight. It doesn't have the accumulated cruft, years of decisions and compromises.
So, sometimes it's great to shed the old and grow with the new generation.
I like it too. I even liked the old website better than the current one. It was more colorful and fancy.
A brand reflects a project's philosophy. Deno is meant to be easy and simple, and the "childish" branding reflects that.
So you may not be wrong at all in your criteria. Subjective human communication can be more effective than objective communication sometimes ;). It just works at a different level.
I know! The dinosaur brand is awesome :)
what is a great comparison between Bun, Deno, and Node.js?
Why should I choose Bun over Deno or vice versa?
Pick your horse to bet on. They both have great teams behind them. Bun (written in Zig) claims higher performance. It looks promising but independent tests have yet to validate these claims.
They also kinda have different goals. Bun seeks to be more of a drop-in replacement for Node, whereas Deno, being spearheaded by the same person who made Node, seeks to move the industry forward and fix mistakes Deno made. However, Deno has also, out of necessity, come to place a high value on backwards compatibility with the Node ecosystem.
Deno, being spearheaded by the same person who made Node, seeks to move the industry forward and fix mistakes Deno (I assume you mean Node.js) made.
If they aren't able to evolve Node.js to overcome the mistakes (e.g. because they're technical in nature, because of the momentum of the install base, or because they don't have the leadership ability), I am worried that they might repeat the same pattern with Deno, since it isn't possible to NOT make any mistakes.
OTOH, having a clean slate that's learned from mistakes and you can bring a lot of your code along doesn't seem like a major impediment.
Is code written for any of the 3 runtimes generally transferable? I realize some of it won't be (e.g. Deno has WebGPU support, which is attractive to me), but generally?
The main difference is that Deno was designed from the beginning to evolve hand in hand with browsers. Node followed its own way and fragmented the JS ecosystem. It got to a point where it was basically impossible to realign Node's direction without breaking everyone's codebase. Mistakes will be made in Deno too, but the direction of the project now takes the ecosystem as a whole into account.
It's a weird scenario. You have node which is entrenched, feature-rich, and stable although not perfect. Then you have two runtimes/ecosystems trying to optimize on that but in slightly different ways. I love it but I feel like there is barely room for even one. It's just incredible there is so much work being put forth into moving the needle maybe an extra 20% on the existing node/npm status quo.
fix mistakes Deno made
You probably meant "node" here.
Just build something useful. Your users don't care. Even PHP is good enough if you know what to build.
If you don't, focus on pointless things like Bun vs Node vs Deno, get some dopamine hits and fool yourself into thinking you are being productive.
Deno has built-in support for TypeScript.
Deno has members in Ecma TC39, so they take part in the development and standardization of JS.
Deno brings a new mentality to development focused on simplifying things, whereas Bun aims to be an improved Node.
Deno aligns with web APIs: you can use the same APIs in the browser and in Deno. Many packages work in both platforms. Deno is even in the compatibility tables in MDN.
Deno simplifies DX greatly. It's very easy to work with Deno. Easy to install things and set up a project, easy to deploy, no config files etc.
Deno is written in Rust, which allows them to move faster and more safely. Contributing to and extending Deno is a breeze. You can add Rust crates to the runtime and use them from JS.
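To make the zero-config and web-API points above concrete, here is a minimal sketch (file name and contents are illustrative, not taken from Deno's docs) of a TypeScript server that runs directly, with no tsconfig, package.json, or build step:

    // server.ts — run with: deno run --allow-net server.ts
    Deno.serve((req: Request): Response => {
      const path = new URL(req.url).pathname; // standard web URL API
      return new Response(`Hello from ${path}\n`, {
        headers: { "content-type": "text/plain" },
      });
    });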
I feel like the Bun team has been too pressured by executives and marketing. They announced 1.0 when Bun was clearly not stable enough, giving lots of segfaults. Even their readme had a huge statement right in the beginning saying that Bun was not production ready yet, despite 1.0 being hyped all over the place.
I'm not a big fan of JavaScript but I admit I stayed away from it because I dislike nodejs and npm terribly.
I was forced to start coding again in JS some weeks ago and I wanted to try Deno. I must say it's been a very smooth and fast experience so far. Very well done!
Can you comment on what you prefer about it? I find npm/js pretty smooth and the rough edges of Deno seem to kill the purported improvements at this point. That was just my gut-take several months ago and I was already steeped in the node/npm ecosystem so I'm curious about your perspective.
For starters Deno is much, much faster in both installation and runtime.
I am not sure if I'll have the chance to exploit other features it has (besides the built-in dotenv support).
Not so many years ago, even installing npm wasn't straightforward.
How is it faster in runtime? Isn't it all V8 at the end?
It's not the JavaScript runtime that's faster but the built-in APIs. Supposedly (I haven't tested this myself), Deno has faster implementations of many Node.js APIs, which I have seen reflected in benchmarks for things like throughput in an HTTP server.
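As a sketch of how such comparisons are usually set up (file name illustrative; this assumes Deno's Node compatibility layer covers node:http, which it largely did by these releases), the exact same code can be run under both runtimes, so any throughput gap comes from each runtime's implementation of the API rather than from V8 itself:

    // server.mjs — run with either:
    //   node server.mjs
    //   deno run --allow-net server.mjs
    import { createServer } from "node:http";

    createServer((_req, res) => {
      res.writeHead(200, { "content-type": "text/plain" });
      res.end("ok");
    }).listen(8000);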
We use TypeScript for virtually everything at my place of work. Not so much because it's a great language for a lot of the backend (we do use some C++ for bottlenecks), but because it's so much more productive to use a single language when you're a small team. Not only can everyone help each other, we can also share resources between the front end and the back end, and we have several in-house libraries to help us with things like ODATA querying and APIs, since there aren't really any public packages that are in any way useful for that. I guess we probably should've gone with something other than ODATA, but since we use a lot of Microsoft Graph APIs, which despite their name are ODATA, it made sense at the time.

We don't have trouble with Node or NPM, and when we onboard new people, they tend to step right into our setup and like it. Granted, we've done some pretty extensive and very opinionated skeleton projects that you have to use if you want to pass the first steps in our deployment pipeline. This took a little bit of effort, and it's still a team effort to keep our governance up to something we all agree on, but with those in place, I've found it's genuinely a nice way to work. An example of how strict we are is that you can't have your functions approved without return types, and you certainly can't change any of the linting or TypeScript configs. Similarly, you can't import third-party NPM packages which aren't vetted first, and we have no auto-update on third-party packages without a two-week grace period and four sets of human eyes. I'm not going to pretend that all of these choices are in any way universal, but it's what we've come together and decided works for us.
Anyway, you're certainly not alone in your opinion, but I think a lot of the bad reputation with Node and NPM comes from the amount of "change management" you need to do to "limit" the vast amount of freedom the tools give you into something that will be workable for your team. Once you get there, however, I've found it to be much nicer to work with than things like NuGet and dotnet, requirements.txt and Python, Cargo and Rust, and a lot of others. I do have a personal preference for yarn, but I guess that's mostly for nostalgic reasons. I also very much appreciate that things like PyPI are going down a route similar to NPM's.
Curious to hear your opinion on ODATA after using it on your projects. Pros, cons, anything.
You can join the waitlist via the sneak peek of JSR linked at the end of the article: https://jsr.io/waitlist
I'm curious on what the Deno team is building here.
Like others have said, it's going to be a new package registry. It was unofficially announced at SeattleJS Conf 2023: https://www.youtube.com/watch?v=Dkqs8Mcxbvo&t=424s
I didn't see any major reason for why they were creating another registry.
I thought their whole thing was to maintain backwards compatibility, not introduce new, redundant standards?
JSR = JavaScript Registry (from the site). It seems fairly clear this is a package registry, i.e. an NPM alternative.
But they promoted the idea that they would never need one, to the extent of saying they would never build one. There was only deno.land as a place to discover libraries and what the community builds.
Hmm, curious too. Maybe, given the few hints, an alternative to the npmjs.com registry.
I assume it's an npm competitor? Likely with a different technical design.
If Deno is looking for more growth they should make it super easy to run Next.js with it
Deno has also created a Next.js competitor, Fresh. I found it a few weeks ago and am starting to go through the docs, looks like a good overall concept. https://fresh.deno.dev/
I actually found this worrying, since it means they are not committed to making Next.js work nicely; quite the opposite, they now have an incentive to not make it happen.
NextJS doesn't have an incentive to run on anything else than Vercel. I wouldn't blame this on Deno.
I wonder if Vercel might eventually choose to run on Deno
Bartek from the Deno team here - we are actively working on improving experience running NextJS projects in Deno.
Fresh is quite new; imo it's not quite ready for a large project.
Deno Deploy dropped from 35 GCP regions to just 12: https://news.ycombinator.com/item?id=39127598
Sure, but is that a bad thing? These regions probably see next to zero use, and cutting them frees up money to be used on other things.
They’re operating like a startup, which like you say should be perfectly fine. Can’t pretend like a giant when you don’t have the bank account for that.
Though they’re also selling a developer ecosystem. A lock-in. Are you willing to bet your company’s own software tools on a vendor that could go bankrupt from their other (hosting) business? The question is if Deno hosting dies, will Deno as a platform still thrive?
The question is if Deno hosting dies, will Deno as a platform still thrive?
That I am unsure about; a large part of the appeal, at least for me, is Deno Deploy.
URL imports are a bit of a hassle when you are trying to get them from say a private github repo.
Deno supports package.json now
Sure, but is that a bad thing? These regions probably see next to zero use
If it's a startup looking for growth then definitely. Future prospects may no longer signup because the needed region isn't available.
A program run with Deno has no file, network, or environment access unless explicitly enabled.
You can do this using containerization technology, no need to invent this per language runtime.
Agree. This feature just makes it worse as a scripting language, which is supposed to allow rapid development.
You just add a flag to the command line to give permissions. It won't harm your productivity.
Even without flags, it will ask to allow access interactively instead of silently aborting.
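For illustration, a minimal sketch (hypothetical script, not from the article) of how the flags map onto what a script actually touches:

    // fetch_config.ts — with no flags, Deno prompts interactively for each
    // permission; to grant them up front:
    //   deno run --allow-env=API_TOKEN --allow-net=example.com fetch_config.ts
    const token = Deno.env.get("API_TOKEN");          // needs --allow-env
    const res = await fetch("https://example.com/", { // needs --allow-net
      headers: token ? { authorization: `Bearer ${token}` } : {},
    });
    console.log(res.status);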
Yes, but on non-Linux systems you then have the pretty large overhead of that.
On macOS, you have built-in sandboxing via "sandbox-exec", which shouldn't incur any noticeable overhead. It's used by Chrome, Bazel, etc.
Not sure what's available on Windows.
Interestingly, Deno and Node were both originally developed by the same person, Ryan Dahl.
Why did he feel the need to build a competitor to his own product? Whatever features are supposed to make Deno "better" than Node...why didn't he just work on integrating them into Node?
I understand that sometimes changes to software can be infeasible, especially if they are large fundamental/foundational changes, but this is still a bit of a head scratcher to me.
He couldn't work them into Node because he doesn't control Node.
He explains why in this video: https://youtu.be/M3BM9TB-8yA
He also can't unilaterally change them in NodeJS because at that point it wasn't a solo project of his anymore.
He explains it in this talk, which is also when Deno was first announced: https://www.youtube.com/watch?v=M3BM9TB-8yA
Why did Bill Gates have a team build Windows NT when he already had the extremely successful MS-DOS?
Sometimes you do need to reset the foundations, then build compatibility on top.
It is explained here: https://choubey.gitbook.io/internals-of-deno/introduction/hi...
I have a Deno app in production and it is working just fine. However, I still think Node is superior when you self-host without Docker. Deno, afaik, still doesn't support any way of running one process per CPU like the cluster module in Node.js.
I like to run my shit on the metal and without Docker and it feels like Deno was designed to run on Docker or some other kind of virtual containerized environment.
However, you can run it on Node thanks to pm2, but then I guess: why even run it on Deno in the first place?
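For reference, a sketch of the Node.js cluster pattern the comment is referring to (one worker process per CPU core, all sharing the same listening socket; file name illustrative):

    // cluster.mjs — Node.js only; Deno has no direct built-in equivalent.
    import cluster from "node:cluster";
    import { createServer } from "node:http";
    import { cpus } from "node:os";

    if (cluster.isPrimary) {
      for (let i = 0; i < cpus().length; i++) cluster.fork();
    } else {
      createServer((_req, res) => {
        res.end(`handled by worker pid ${process.pid}\n`);
      }).listen(8000); // the primary distributes incoming connections
    }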
OTOH Deno can produce self-contained binaries. For me this is a plus, it makes distribution so much easier. I don't care if it's 100MB, I just want to have one file which I can throw on my server.
So does node.js nowadays. But yeah, I agree it is an awesome feature!
That’s overstating it a bit. As you can see, it’s an experimental, tedious, manual process with Node:
Stability: 1 - Experimental: This feature is being designed and will change.
— https://nodejs.org/api/single-executable-applications.html
Deno, on the other hand, has it as a supported, complete feature that's very easy to use.
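A minimal sketch of the workflow (file and output names are illustrative):

    // main.ts — any entry point works; producing a standalone executable
    // is a single command:
    //   deno compile --output mytool main.ts
    //   ./mytool Ada     # prints "Hello, Ada!"
    console.log(`Hello, ${Deno.args[0] ?? "world"}!`);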
"Jupyter, the open source notebook tool, added support for JavaScript and TypeScript using Deno. This means data science, visualization, and more can all be done using modern JavaScript and TypeScript and web standards APIs."
I love this! At the same time, who would want to do this, given Python's excellent support for numbers and mathematics? And what about Haskell?
Simply because using the same language for both Web and server is incredible.
The people who already know JS/TS and would like to occasionally do something interesting with a piece of data. With Python, most of my time goes into googling how that list filtering / mapping syntax went again or some other basic level stuff that I do every day with JS. I know Python, but I'm not fluent in it. And I will likely never be fluent in it, because my Python use cases are so infrequent.
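For example, a typical notebook-style cell would just use the array methods a JS/TS dev already knows by heart (sample data made up for illustration):

    const temps = [21.5, 19.2, 25.7, 30.1, 18.4];
    const warmDays = temps.filter((t) => t > 20).map((t) => `${t} °C`);
    console.log(warmDays); // [ "21.5 °C", "25.7 °C", "30.1 °C" ]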
It is interesting to me that none of the new NodeJS alternatives support multithreading. Why is that?
Is it just a side-effect of using V8 engine for the heavy lifting, or is it some part of the ECMAScript specification which forbids multithreaded implementations of the language?
Multithreading is achieved through workers and the event-based architecture. It's not the same thing, but we should rather discuss its pros and cons.
Pros: Vertical scalability.
Cons: All your async code becomes full of data races.
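A sketch of the worker model being described (file names illustrative): each worker runs on its own thread with its own event loop, and communication happens by message passing rather than shared memory by default.

    // main.ts
    const worker = new Worker(new URL("./worker.ts", import.meta.url), {
      type: "module", // Deno only supports module workers
    });
    worker.onmessage = (e) => console.log("sum from worker:", e.data);
    worker.postMessage([1, 2, 3, 4]);

    // worker.ts
    // self.onmessage = (e: MessageEvent<number[]>) => {
    //   self.postMessage(e.data.reduce((a, b) => a + b, 0));
    // };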
I suppose adding multithreading to the Node ecosystem would be as hard as (if not harder than) removing the GIL from Python. At least Python has locking primitives. Node, as far as I know, has none.
I use Deno as a Javascript/Typescript-REPL. It has syntax-highlighting built-in.
So does node, only for JS though.
I deployed my first non-trivial Deno app to production in 2023. There were some teething issues with learning to keep the lock file in sync, especially in a repo with multiple entry points each with separate lock files. Some of the granular permissions stuff didn't work how I expected, to the point where I almost gave up and just allowed network to/from all hosts. But overall the experience was good, and I have positive feelings towards Deno. I look forward to seeing where they take it.
Is Deno using the same request/response buffer sizes as Node in those benchmarks?
I'm really interested in deno, but I didn't yet have the chance to test it properly.
I would love to see runtime support this year for either Google Cloud Functions or AWS Lambda.
Slightly off topic but I just came here to show appreciation for the hang glider in the picture.
Shout out to all my fellow hang gliders.
Sounds like it would be a great talk for a polyglot event like the Carolina Code Conference…
Is anyone a user of Deno KV? What's your experience been like?
I'd love to use Fresh but a framework for web development which calls itself v1.x and yet only supports Tailwind for styling purposes feels very immature.
The fact that for the next iteration they are prioritizing view transitions and not CSS bundling is baffling.
Shouldn't the HN title be *Deno in 2023 (2024)*,
so we know:
- it is not about a discussion of Deno being new, relevant, or modern in 2023, and
- the article itself is not from 2023?
Deno wasn't originally designed to be node compatible, but I think they realized nobody would want to switch to it because node is so prevalent already...
I think the main appeal of projects like Bun and Deno is the built-in tooling for building/bundling modern typescript applications without requiring dozens of dependencies for even a basic hello world app.
If node.js decided to include functionality similar to what is available on Bun/Deno, both projects would probably lose traction quickly.
I believe this too. The big appeal for me is not having to install typescript, eslint, jest AND then set up all the configs.
deno has nice defaults, though the importing via URLS and browser compatible API do make deno very tempting
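As a small illustration (file name and std version chosen arbitrarily), a test file like this runs with the built-in toolchain and nothing else installed or configured:

    // math_test.ts — run with: deno test   (plus `deno lint` / `deno fmt`)
    import { assertEquals } from "https://deno.land/std@0.208.0/assert/mod.ts";

    Deno.test("adds numbers", () => {
      assertEquals(1 + 2, 3);
    });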
That feels like a really weak value prop to me. How often do you have to install that stuff? How hard is it actually? Can you really not use, e.g. for React, the typical Vite starter and be done?
The other side of it is if you want to distribute your code not as a server. If you write a CLI in Node + TS + ... then it might be pretty fiddly for someone to clone that repo and get it running locally. You'll certainly have to document exactly what's needed.
Whereas with Deno you can compile to a single binary and let them install that if they trust you. Or they can `deno install https://raw.githubusercontent.com/.../cli.ts`, or clone the repo and just run `deno task install` or `deno task run`. For those they need to install Deno, but nothing else.
With Node + TS, it is straightforward (and common) to generate JS output at publish time for distribution. Then, using the CLI tool or whatever is only an `npm install -g <pkg>` away, no extra steps.
Sure, it's not a single binary, but I'd argue _most_ users of a general CLI utility don't necessarily care about this.
It's not that much of an issue; I have created a template repo on GitHub from which I can make new projects, but now I have to maintain it.
It is more a matter of trust than effort, e.g. being less exposed to supply-chain attacks.
For one-off scripts: every time.
So Deno is better at small scripts written in Typescript than Node. Then, the question becomes, if you're going to have Deno installed and if it works well enough to replace Node, why keep Node?
Except they are playing catch-up with what Microsoft says TypeScript is supposed to mean.
I'd rather have pure JavaScript, or use TypeScript from the source, without having to figure out whether a type-analysis bug comes from me or from a tool that is still catching up to TypeScript vlatest.
Being an old dog on the prairie, I see the outcome of these projects being like egcs and io.js.
They create a rift, make the key incumbent improve itself, and then the world moves on as if nothing happened.
Not sure what modern TypeScript means, but you only need one or two dependencies (esbuild and tsc) unless you are doing something more involved, in which case Deno alone might not work either.
OTOH the ways that you can improve upon node's shortcomings while staying compatible with it are limited. Bun is taking the pragmatic approach of providing fast drop-in replacements for node, npm and other standard tools, while Deno was the original creator of node going "if I started node today, what would I do differently?". So, different approaches...