Behind the scenes of Bun Install

275 points by Bogdanp | 83 comments | 9/11/2025, 12:39:29 PM | bun.com

Comments (83)

captn3m0 · 2h ago
> The M4 Max MacBook I'm using to write this would've ranked among the 50 fastest supercomputers on Earth in 2009.

I attempted to validate this: you'd need >75 TFlop/s to get into the top 50 of the TOP500[0] rankings in 2009. An M4 Max review says 18.4 TFlop/s at FP32, but TOP500 uses LINPACK, which runs at FP64 precision.

An M2 benchmark gives a 1:4 ratio for double precision, so you'd get maybe 9 TFlop/s at FP64? That wouldn't make it to TOP500 in 2009.

[0]: https://top500.org/lists/top500/list/2009/06/
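
For what it's worth, a rough back-of-the-envelope sketch of that estimate (all figures are the parent comment's numbers, not measurements, and the true FP64:FP32 ratio for the M4 Max GPU is exactly what's in question here):

    // TOP500 #50 in June 2009 needed roughly 75 TFlop/s of FP64 LINPACK.
    const top50CutoffTflops = 75;
    const m4MaxFp32Tflops = 18.4; // FP32 figure cited above

    // Try both a 1:2 and a 1:4 FP64:FP32 ratio; either way it falls short.
    for (const ratio of [1 / 2, 1 / 4]) {
      const fp64 = m4MaxFp32Tflops * ratio;
      console.log(`~${fp64.toFixed(1)} TFlop/s FP64 ->`,
        fp64 >= top50CutoffTflops ? "would make the 2009 top 50" : "would not");
    }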

nine_k · 1h ago
> Now multiply that by thousands of concurrent connections each doing multiple I/O operations. Servers spent ~95% of their time waiting for I/O operations.

Well, no. The particular thread of execution might have been spending 95% of time waiting for I/O, but a server (the machine serving the thousands of connections) would easily run at 70%-80% of CPU utilization (because above that, tail latency starts to suffer badly). If your server had 5% CPU utilization under full load, you were not running enough parallel processes, or did not install enough RAM to do so.

Well, it's a technicality, but the post is devoted to technicalities, and such small blunders erode trust in the rest of the post. (I'm saying this as a fan of Bun.)

robinhood · 4h ago
Complex subject, beautifully simple to read. Congrats to the author.

Also: I love that super passionate people still exist, and are willing to challenge the status quo by attacking really hard things - things I don't have the brain to even think about. It's not normal that we have better computers each month and slower software. If only everyone (myself included) were better at writing more efficient code.

ljm · 2h ago
I didn’t know it was written in Zig. That’s a fascinating choice to me given how young the language is.

Amazing to see it being used in a practical way in production.

epolanski · 41s ago
The language is very much in development, but its ecosystem and tooling are absolutely mature.
robinhood · 1h ago
Zig was created in 2016 though - almost 10 years ago at this point. Perhaps the surprise is that we aren't as exposed to it in well-known, established projects as we are to languages like Rust, Go and C.
blizdiddy · 7h ago
I used bun for the first time last week. It was awesome! The built-in server and SQLite meant I didn't need any dependencies besides bun itself, which is certainly my favorite way to develop.
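
For readers who haven't seen it, a minimal sketch of that zero-dependency combo using Bun's built-in bun:sqlite driver and Bun.serve (the table, route, and port here are made up for illustration):

    import { Database } from "bun:sqlite";

    const db = new Database("app.db");
    db.run("CREATE TABLE IF NOT EXISTS hits (path TEXT, at TEXT)");
    const insert = db.query("INSERT INTO hits (path, at) VALUES (?, ?)");

    Bun.serve({
      port: 3000,
      fetch(req) {
        const { pathname } = new URL(req.url);
        insert.run(pathname, new Date().toISOString()); // record the visit
        return new Response(`hello from ${pathname}`);
      },
    });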

I do almost all of my development in vanilla JS despite loathing the Node ecosystem, so I really should have checked it out sooner.

k__ · 7h ago
I tried using Bun a few times, and I really liked working with it.

Much better than Node.

However...!

I always managed to hit a road block with Bun and had to go back to Node.

First it was the crypto module that wasn't compatible with Node.js signatures (now fixed); then Playwright refused to work with Bun (via Crawlee).

koakuma-chan · 7h ago
You can use Bun as package manager only. You don't have to use Bun as runtime.
winterrdog · 3h ago
Sure?

Does it work if I have packages that have nodejs c++ addons?

abejfehr · 47m ago
Why wouldn’t it? The end result of an npm install or a bun install is that the node_modules folder is structured the way it needs to be, and I think it can run node-gyp for the packages that need it.
petralithic · 2h ago
You should try Deno, they have good Node compatibility
drewbitt · 2h ago
Deno doesn't work with crawlee either unfortunately
Cthulhu_ · 7h ago
I think this is the big one that slows adoption of "better" / "faster" tooling down, that is, backwards compatibility and drop-in-replacement-ability. Probably a lot of Hyrum's Law.
jherdman · 4h ago
Storybook is another for me.
simantel · 3h ago
Node also has a built-in server and SQLite these days though? Or if you want a lot more functionality with just one dependency, Hono is great.
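
For reference, Node's built-in driver lives behind the node:sqlite module (still marked experimental in recent releases); a minimal sketch of what it looks like:

    import { DatabaseSync } from "node:sqlite";

    const db = new DatabaseSync(":memory:");
    db.exec("CREATE TABLE t (x INTEGER)");
    db.prepare("INSERT INTO t (x) VALUES (?)").run(42);
    console.log(db.prepare("SELECT x FROM t").all()); // [ { x: 42 } ]
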
blizdiddy · 2h ago
And how many dependencies does Hono have? Looks like about 26. And how many dependencies do those have?

A single static Zig executable isn’t the same as a pipeline of package management dependencies susceptible to supply chain attacks and the worst bitrot we’ve had since the DOS era.

manuhabitela · 4h ago
I'm impressed how pleasant and easy to read this pretty technical explanation was. Good job on the writing.
winterrdog · 3h ago
Truth!

Lydia is very good at presenting complex ideas simply and well. I've read and watched most of her work or videos. She really goes to great lengths in her work to make it come to life. Highly recommend her articles and YouTube videos.

Though she's been writing less lately, I think due to her current job.

thornewolf · 6h ago
I think they forgot to include the benchmark time for "npm (cached)" inside the Binary Manifest Caching section. We have bun, bun (cached), npm. I think the summary statistics are also incorrect.
alberth · 3h ago
I really enjoyed the writing style of this post.

A few things:

- I feel like this post, repurposed, could be a great explanation of why io_uring is so important.

- I wonder if Zig's recent I/O updates in v0.15 bring any perf improvements to Bun beyond its already fast performance.

aleyan · 6h ago
I have been excited about Bun for about a year, and I thought that 2025 was going to be its breakout year. It is really surprising to me that it is not more popular. I scanned the top 100k repos on GitHub, and for new repos in 2025, npm is 35 times more popular and pnpm is 11 times more popular than Bun [0][1]. The other up-and-coming JavaScript runtime, Deno, is not so popular either.

I wonder why that is? Is it because it is a runtime, and getting compatibility there is harder than just for a straight package manager?

Can someone who tried bun and didn't adopt it personally or at work chime in and say why?

[0] https://aleyan.com/blog/2025-task-runners-census/#javascript...

[1] https://news.ycombinator.com/item?id=44559375

dsissitka · 5h ago
I really want to like Bun and Deno. I've tried using both several times and so far I've never made it more than a few thousand lines of code before hitting a deal breaker.

Last big issue I had with Bun was streams closing early:

https://github.com/oven-sh/bun/issues/16037

Last big issue I had with Deno was a memory leak:

https://github.com/denoland/deno/issues/24674

At this point I feel like the Node ecosystem will probably adopt the good parts of Bun/Deno before Bun/Deno really take off.

hoten · 1h ago
uh... looks like an AI user saw this comment and fixed your bun issue? Or maybe it just deleted code in a random manner idk.

https://github.com/oven-sh/bun/commit/b474e3a1f63972979845a6...

phpnode · 6h ago
It’s a newer, VC-funded competitor to the open source, battle-tested, dominant player. It has incentives to lock you in and ultimately is just not that different from Node. There’s basically no strategic advantage to using Bun; it doesn’t really enable anything you can’t do with Node. I have not seen anyone serious choose it yet, but I’ve seen plenty of unserious people use it.
marcjschmidt · 1h ago
I think that summarizes it well. It's not 10x better, which is what would make the risky bet of vendor lock-in with a VC-backed company worth it. Same issue with Prisma and Next for me.
johnfn · 6h ago
I am Bun's biggest fan. I use it in every project I can, and I write all my one-off scripts with Bun/TS. That being said, I've run into a handful of issues that make me a little anxious to introduce it into production environments. For instance, I had an issue a bit ago where something simple like an Express webserver inside Docker would just hang, but switching bun for node worked fine. A year ago I had another issue where a Bun + Prisma webserver would slowly leak memory until it crashed. (It's been a year, I'm sure they fixed that one).

I actually think Bun is so good that it will still net save you time, even with these annoyances. The headaches it resolves around transpilation, modules, workspaces etc, are just amazing. But I can understand why it hasn't gotten closer to npm yet.

williamstein · 6h ago
I am also very curious what people think about this. To me, as a project, Node gives off a vibe of being mature, democratic and community driven, especially after successfully navigating the io.js fork drama etc. a few years ago. Clearly neither Bun nor Deno is a community-driven, democratic project, since they are both VC funded.
silverwind · 6h ago
Take a look at their issue tracker, it's full of crashes because apparently this Zig language is highly unsafe. I'm staying on Node.
audunw · 2h ago
Zig isn’t inherently highly unsafe. It's a bit less safe than Rust in some regards, but arguably safer in a few others.

But the language hasn’t even reached 1.0 yet. A lot of the strategies for writing safe Zig aren't fully developed.

Yet, TigerBeetle is written in Zig and is an extremely robust piece of software.

I think the focus of Bun is probably more on feature parity in the short term.

petralithic · 2h ago
That's why, if I had to choose a Node competitor out of Bun and Deno, I'd choose Deno.
mk12 · 4h ago
Good thing libuv is written in a "safe" language.
otikik · 4h ago
npm is a minefield that thousands of people traverse every day. So you are unlikely to hit a mine.

bun is a bumpy road that sees very low traffic. So you are likely to hit some bumps.

MrJohz · 6h ago
I think part of the issue is that a lot of the changes have been fairly incremental, and therefore fairly easy to include back into NodeJS. Or they've been things that make getting started with Bun easier, but don't really add much long-term value. For example, someone else in the comments talked about the sqlite module and the http server, but now NodeJS also natively supports sqlite, and if I'm working in web dev and writing servers, I'd rather use an existing, battle-tested framework like Express or Fastify with a larger ecosystem.

It's a cool project, and I like that they're not using V8 and trying something different, but I think it's very difficult to sell a change on such incremental improvements.

veber-alex · 6h ago
Neither Bun nor Deno have any killer features.

Sure, they have some nice stuff that should also be added in Node, but nothing compelling enough to deal with ecosystem change and breakage.

gkiely · 1h ago
bun test is a killer feature
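
For anyone who hasn't tried it, a minimal sketch: a file matching the test pattern (say, math.test.ts, a made-up name) runs with `bun test` and no extra framework installed:

    import { test, expect } from "bun:test";

    test("adds numbers", () => {
      expect(1 + 2).toBe(3);
    });
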
tracker1 · 4h ago
There's still a few compatibility sticking points... I'm far more familiar with Deno and have been using it a lot the past few years, it's pretty much my default shell scripting tool now.

That said, for many work projects I need to access MS-SQL, and the way its driver does socket connections isn't supported by the Deno runtime, or some such. Which limits what I can do at work. I suspect there are a few similar sticking points with Bun for other modules/tools people use.

It's also very hard to break away from entropy. Node+npm had over a decade and a lot of effort to build that ecosystem that people aren't willing to just abandon wholesale.

I really like Deno for shell scripting because I can use a shebang, reference dependencies and the runtime just handles them. I don't have the "npm install" step I need to run separately, it doesn't pollute my ~/bin/ directory with a bunch of potentially conflicting node_modules/ either, they're used from a shared (configurable) location. I suspect bun works in a similar fashion.
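
As a rough illustration of that style (the npm: package below is just an example dependency, not something the commenter mentioned), a single-file script declares what it needs inline and Deno fetches and caches it on first run:

    #!/usr/bin/env -S deno run --allow-net
    import chalk from "npm:chalk@5";

    const res = await fetch("https://example.com");
    console.log(chalk.green(`status: ${res.status}`));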

That said, with work I have systems I need to work with that are already in place or otherwise chosen for me. You can't always just replace technology on a whim.

oefrha · 5h ago
To beat an incumbent you need to be 2x better. Right now it seems to be a 1.1x better (for any reasonably sized projects) work in progress with kinks you’d expect from a work in progress and questionable ecosystem buy-in. That may be okay for hobby projects or tiny green field projects, but I’m absolutely not gonna risk serious company projects with it.
k__ · 1h ago
Seems awfully close to 2x, and that was last year.

https://dev.to/hamzakhan/rust-vs-go-vs-bun-vs-nodejs-the-ult...

davidkunz · 6h ago
I tried to run my project with bun - it didn't work so I gave up. Also, there needs to be a compelling reason to switch to a different ecosystem.
turtlebits · 4h ago
Tried it last year - I spent a few hours fighting the built-in SQLite driver and found it buggy (silent errors), and the docs were very lacking.
fkyoureadthedoc · 6h ago
Bun is much newer than pnpm, looking at 1.0 releases pnpm has about a 6 year head start.

I write a lot of one off scripts for stuff in node/ts and I tried to use Bun pretty early on when it was gaining some hype. There were too many incompatibilities with the ecosystem though, and I haven't tried since.

madeofpalk · 6h ago
Honestly, it doesn't really solve a big problem I have, and it introduces all the problems that come with being "new" and less used.
koakuma-chan · 6h ago
> I wonder why that is?

LLMs default to npm

fkyoureadthedoc · 6h ago
You sure it's not just because npm has been around for 15 years as the default package manager for node?
koakuma-chan · 6h ago
Didn't prevent me from switching to Bun as the cost is 0.
wink · 6h ago
> Node.js uses libuv, a C library that abstracts platform differences and manages async I/O through a thread pool.

> Bun does it differently. Bun is written in Zig, a programming language that compiles to native code with direct system call access:

Guess what, C/C++ also compiles to native code.

I mean, I get what they're saying and it's good, and nodejs could have probably done that as well, but didn't.

But don't phrase it like it's inherently not capable. No one forced npm to use this abstraction, and npm probably should have been a nodejs addon in C/C++ in the first place.

(If anything of this sounds like a defense of npm or node, it is not.)

k__ · 6h ago
To me, the reasoning seems to be:

Npm, pnpm, and yarn are written in JS, so they have to use Node.js facilities, which are based on libuv, which isn't optimal in this case.

Bun is written in Zig, so it doesn't need libuv and can do its own thing.

Obviously, someone could write a Node.js package manager in C/C++ as a native module to do the same, but that's not what npm, pnpm, and yarn did.

lkbm · 5h ago
Isn't the issue not that libuv is C, but that the thing calling it (Node.js) is Javascript, so you have to switch modes each time you have libuv make a system call?
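
That per-call overhead is easy to see with a rough micro-benchmark sketch like the one below (numbers are machine-dependent, and "package.json" just stands in for any small local file): each awaited stat round-trips through libuv's thread pool, while the sync variant stays on the calling thread.

    import { statSync, promises as fsp } from "node:fs";

    const target = "package.json";

    async function run(n: number) {
      let t = performance.now();
      for (let i = 0; i < n; i++) statSync(target);
      console.log(`sync:  ${(performance.now() - t).toFixed(1)} ms`);

      t = performance.now();
      for (let i = 0; i < n; i++) await fsp.stat(target);
      console.log(`async: ${(performance.now() - t).toFixed(1)} ms`);
    }

    run(20_000);
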
tracker1 · 4h ago
I'm somewhat curious how Deno stands up with this... also, not sure what packages are being installed. I'd probably start a vite template project for react+ts+mui as a baseline, since that's a relatively typical application combo for tooling. Maybe hono+zod+openapi as well.
tracker1 · 1h ago
For my own curiosity, on a React app on my work desktop:

    - Clean `bun install`, 48s - converted package-lock.json
    - With bun.lock, no node_modules, 19s
    - Clean with `deno install --allow-scripts`, 1m20s
    - with deno.lock, no node_modules, 20s
    - Clean `npm i`, 26s
    - `npm ci` (package-lock.json), no node_modules, 1m 2s (wild)
So it looks like, if Deno added a package-lock.json conversion similar to Bun's, the installs would be very similar all around. I have no control over the security software used on this machine; it was just convenient since I was in front of it.

Hopefully someone can put eyes on this issue: https://github.com/denoland/deno/issues/25815

steve_adams_86 · 2h ago
I think Deno isn't included in the benchmark because it's a harder comparison to make than it might seem.

Deno's dependency architecture isn't built around npm; that compatibility layer is a retrofit on top of the core (which is evident in the source code, if you ever want to see). Deno's core architecture around dependency management uses a different, URL-based paradigm. It's not as fast, but... It's different. It also allows for improved security and cool features like the ability to easily host your own secure registry. You don't have to use npm or jsr. It's very cool, but different from what is being benchmarked here.
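
For anyone unfamiliar, the URL-based paradigm looks roughly like this (the registry URL and version below are illustrative assumptions): the import itself names the source, and Deno caches it locally instead of populating node_modules.

    import { assertEquals } from "https://deno.land/std@0.224.0/assert/mod.ts";

    assertEquals(2 + 2, 4);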

tracker1 · 1h ago
All the same, you can run deno install in a directory with a package.json file and it will resolve and install to node_modules. The process is also written in compiled code, like Bun... so I was just curious.

edit: replied to my own post... looks like `deno install --allow-scripts` is about 1s slower than bun once deno.lock exists.

k__ · 6h ago
"... the last 4 bytes of the gzip format. These bytes are special since store the uncompressed size of the file!"

What's the reason for this?

I could imagine many tools profiting from knowing the decompressed file size in advance.

philipwhiuk · 6h ago
It's straight from the GZIP spec if you assume there's a single GZIP "member": https://www.ietf.org/rfc/rfc1952.txt

> ISIZE (Input SIZE)

> This contains the size of the original (uncompressed) input data modulo 2^32.

So there are two big caveats:

1. Your data is a single GZIP member (I guess this means everything in a folder)

2. Your data is < 2^32 bytes.
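
With those caveats, reading it is cheap. A minimal TypeScript sketch, assuming a single-member gzip file under 4 GiB (the filename is made up):

    import { openSync, fstatSync, readSync, closeSync } from "node:fs";

    // The gzip trailer is CRC32 (4 bytes) followed by ISIZE (4 bytes,
    // little-endian): the uncompressed size mod 2^32.
    function gzipUncompressedSize(path: string): number {
      const fd = openSync(path, "r");
      try {
        const { size } = fstatSync(fd);
        const buf = Buffer.alloc(4);
        readSync(fd, buf, 0, 4, size - 4); // read only the last 4 bytes
        return buf.readUInt32LE(0);
      } finally {
        closeSync(fd);
      }
    }

    console.log(gzipUncompressedSize("some-package.tgz"));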

k__ · 6h ago
Yeah, I understood that.

I was just wondering why GZIP specified it that way.

ncruces · 5h ago
Because it allows streaming compression.
k__ · 5h ago
Ah, makes sense.

Thanks!

lkbm · 5h ago
I believe it's because you get to stream-compress efficiently, at the cost of stream-decompress efficiency.
8cvor6j844qw_d6 · 6h ago
gzip.py [1]:

    def _read_eof(self):
        # We've read to the end of the file, so we have to rewind in order
        # to reread the 8 bytes containing the CRC and the file size.
        # We check the that the computed CRC and size of the
        # uncompressed data matches the stored values.  Note that the size
        # stored is the true file size mod 2**32.

[1]: https://stackoverflow.com/a/1704576

randomsofr · 3h ago
Wow, crazy to see Yarn being so slow when it used to beat npm by a lot. At a company I was at, we went from npm to Yarn to pnpm and back to npm. Nowadays I try to use Bun as much as possible, but Vercel still doesn't use it natively for Next.
chrisweekly · 3h ago
why leave pnpm?
djfobbz · 6h ago
I really like Bun too, but I had a hard time getting it to play nicely with WSL1 on Windows 10 (which I prefer over WSL2). For example:

  ~/: bun install
  error: An unknown error occurred (Unexpected)
lfx · 6h ago
Why you prefer WSL1 over WSL2?
tracker1 · 4h ago
FS calls across the OS boundary are significantly faster in WSL1, as the biggest example from the top of my head. I prefer WSL2 myself, but I avoid using the /mnt/c/ paths as much as possible, and never, ever run a database (like sqlite) across that boundary, you will regret it.
djfobbz · 3h ago
WSL1's just faster, no weird networking issues, and I can edit the Linux files from both Windows and Linux without headaches.
rs_rs_rs_rs_rs · 7h ago
Python has uv, JS has bun, what does Ruby or PHP have? Are the devs using those languages happy with how fast the current popular dependency managers are?
JamesSwift · 7h ago
You're looking at it wrong. Python has nix, JS has nix, Ruby and PHP have nix : D

That's closer to how pnpm achieves its speedup though. I know there is 'rv' recently, but I haven't tried it.

koakuma-chan · 6h ago
You mean nix the package manager? I used to use NixOS and I had to switch off because of endless mess with environment variables.
tommasoamici · 7h ago
it's pretty new, but in Ruby there's `rv` which is clearly inspired by `uv`: https://github.com/spinel-coop/rv.

>Brought to you by Spinel

>Spinel.coop is a collective of Ruby open source maintainers building next-generation developer tooling, like rv, and offering flat-rate, unlimited access to maintainers who come from the core teams of Rails, Hotwire, Bundler, RubyGems, rbenv, and more.

hu3 · 6h ago
PHP is getting Mago (written in Rust).

Repo: https://github.com/carthage-software/mago

Announcement 9 months ago:

https://www.reddit.com/r/PHP/comments/1h9zh83/announcing_mag...

For now it has three main features: formatting, linting, and fixing lint issues.

I hope they add package management to do what composer does.

weaksauce · 5h ago
Bundler is generally pretty fast on the Ruby side. It also reuses dependencies for a given Ruby version, so you don't have the stupid node_modules folder in every project with every dependency re-downloaded and stored. If you have 90% of the dependencies for a project, you only have to download and install/compile the other 10%. Night and day difference.
aarondf · 6h ago
PHP has Composer, and it's extremely good!
kijin · 6h ago
PHP is much closer to raw C and doesn't do any threading by default, so I suppose composer doesn't suffer from the thread synchronization and event loop related issues that differentiate bun from npm.
swyx · 4h ago
I'm curious why Yarn is that much slower than npm? What's the opposite of this article?
phildougherty · 4h ago
Bun is FUN to say.
moffkalast · 3h ago
Anyone else also having a first association to https://xkcd.com/1682 instead of, you know, bread?
paularmstrong · 4h ago
This is all well and good, but the time it takes to install node modules is not a critical blocker for any project that I've ever been a part of. It's a drop in the bucket compared to human (ability and time to complete changes) and infrastructure (CI/deploy/costs). Cutting 20 seconds off the dependency install time is just not a make or break issue.
tracker1 · 4h ago
It's more than enough to lose your focus. If you can make a process take a couple seconds or less vs over 15, you should do that.
paularmstrong · 3h ago
How often are you doing a full install of dependencies? Re-runs for me using npm/pnpm/yarn are 1-2 seconds at worst in very large monorepos. I can't imagine needing to do full installs with any sort of frequency.
tracker1 · 2h ago
I find that it's heavily dependent on drive speed, so I've leaned into getting current-generation, very fast drives as much as possible when I put together new computers, and sometimes a mid-generation upgrade. Considering I often do consulting work across random projects, I'm pretty often having to figure out and install things in one monorepo managed with pnpm, another with yarn, etc., so the pain is relatively real. That said, the fastest drive matters as much or more, especially with build steps.

When handling merge/pull requests, I'll often do a clean step (removing node_modules and temp files) before a full install and build to test that everything works. I know not everyone else is this diligent, but this can happen several times a day... Automation (usually via Docker) can help a lot, with many things tested through a CI/CD environment; that said, I'm also not a fan of having to wait too long for that process... it's too easy to get side-tracked and off-task. I tend to set alarms/timers throughout the day just so I don't miss meetings. I don't want to take a moment to look at HN and then, next thing I know, it's a few hours later. Yeah, that's my problem... but others share it.

So, again, if you can make something take less than 15s that typically takes much more, I'm in favor... I went from eslint to Rome/Biome for similar reasons... I will switch to faster tooling to reduce the risk of going off-task and not getting back.

paularmstrong · 3h ago
I also just tried bun install in my main work monorepo vs yarn. bun took 12s and yarn took 15s. This is hardly a difference worth noting.
tracker1 · 2h ago
Yeah, I find drive speed can make more of a difference too. Gen 5 PCIe is amazing... the difference in Rust builds was pretty phenomenal over even a good Gen 4 drive.