Modern Node.js Patterns

219 points | eustoria | 78 comments | 8/3/2025, 7:16:18 PM | kashw1n.com

Comments (78)

farkin88 · 2h ago
The killer upgrade here isn’t ESM. It’s Node baking fetch + AbortController into core. Dropping axios/node-fetch trimmed my Lambda bundle and shaved about 100 ms off cold-start latency. If you’re still npm i axios out of habit, 2025 Node is your cue to drop the training wheels.
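
For anyone making the switch, the built-in pair is enough for request timeouts with zero dependencies. A minimal sketch, assuming Node 18+ (the URL and function name are illustrative, not from the thread):

```javascript
// Minimal sketch: core fetch + AbortSignal.timeout (Node 18+).
// The URL and function name are illustrative.
async function getJson(url, ms = 5000) {
  // AbortSignal.timeout() aborts the request once `ms` elapses
  const res = await fetch(url, { signal: AbortSignal.timeout(ms) });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}
```
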
tanduv · 19m ago
I never really liked fetch's syntax: the extra await for response.json(), and the error handling you have to implement yourself -

  const axios = require('axios'); // external dependency: npm i axios

  async function fetchDataWithAxios() {
    try {
      const response = await axios.get('https://jsonplaceholder.typicode.com/posts/1');
      console.log('Axios Data:', response.data);
    } catch (error) {
      console.error('Axios Error:', error);
    }
  }



  async function fetchDataWithFetch() {
    try {
      const response = await fetch('https://jsonplaceholder.typicode.com/posts/1');

      if (!response.ok) { // Check if the HTTP status is in the 200-299 range
        throw new Error(`HTTP error! status: ${response.status}`);
      }

      const data = await response.json(); // Parse the JSON response
      console.log('Fetch Data:', data);
    } catch (error) {
      console.error('Fetch Error:', error);
    }
  }
farkin88 · 5m ago
Yeah, that's the classic bundle size vs DX trade-off. Fetch definitely requires more boilerplate. The manual response.ok check and double await is annoying. For Lambda where I'm optimizing for cold starts, I'll deal with it, but for regular app dev where bundle size matters less, axios's cleaner API probably wins for me.
exhaze · 20m ago
Tangential, but thought I'd share since validation and API calls go hand-in-hand: I'm personally a fan of using `ts-rest` for the entire stack since it's the leanest of all the compile + runtime zod/json schema-based validation sets of libraries out there. It lets you plug in whatever HTTP client you want (personally, I use bun, or fastify in a node env). The added overhead is totally worth it (for me, anyway) for shifting basically all type safety correctness to compile time.

Curious what other folks think and if there are any other options? I feel like I've searched pretty exhaustively, and it's the only one I found that was both lightweight and had robust enough type safety.

franciscop · 1h ago
As a library author it's the opposite: while fetch() is amazing, ESM has been a painful but definitely worthwhile upgrade. It has all the things the author describes.
farkin88 · 1h ago
Interesting to get a library author's perspective. To be fair, you guys had to deal with the whole ecosystem shift: dual package hazards, CJS/ESM compatibility hell, tooling changes, etc so I can see how ESM would be the bigger story from your perspective.
franciscop · 26m ago
I'm a small-ish time author, but it was really painful for a while since we were all dual-publishing in CJS and ESM, which was a mess. At some point some prominent authors decided to go full-ESM, and basically many of us followed suit.

The fetch() change has been big only for the libraries that did need HTTP requests, otherwise it hasn't been such a huge change. Even in those it's been mostly removing some dependencies, which in a couple of cases resulted in me reducing the library size by 90%, but this is still Node.js where that isn't such a huge deal as it'd have been on the frontend.

Now there's an unresolved one, which is the Node.js streams vs WebStreams, and that is currently a HUGE mess. It's a complex topic on its own, but it's made a lot more complex by having two different streaming standards that are hard to match.

yawnxyz · 1h ago
node fetch is WAY better than axios (easier to use/understand, simpler); didn't really know people were still using axios
reactordev · 42m ago
You still see axios used in amateur tutorials and stuff on dev.to and similar sites. There’s also a lot of legacy out there.
bravesoul2 · 12m ago
AI is going to bring that back like an 80s disco playing Wham. If you gonna do it do it wrong...
benoau · 44m ago
axios got discontinued years ago I thought, nobody should still be using it!
Raed667 · 1h ago
I do miss the axios extensions tho, it was very easy to add rate-limits, throttling, retry strategies, cache, logging ..

You can obviously do that with fetch but it is more fragmented and more boilerplate

farkin88 · 1h ago
Totally get that! I think it depends on your context. For Lambda where every KB and millisecond counts, native fetch wins, but for a full app where you need robust HTTP handling, the axios plugin ecosystem was honestly pretty nice. The fragmentation with fetch libraries is real. You end up evaluating 5 different retry packages instead of just grabbing axios-retry.
hiccuphippo · 1h ago
Sounds like there's space for an axios-like library built on top of fetch.
farkin88 · 1h ago
I think that's the sweet spot. Native fetch performance with axios-style conveniences. Some libraries are moving in that direction, but nothing's really nailed it yet. The challenge is probably keeping it lightweight while still solving the evaluating 5 retry packages problem.
crabmusket · 28m ago
Is this what you're looking for? https://www.npmjs.com/package/ky

I haven't used it but the weekly download count seems robust.

farkin88 · 1h ago
Right?! I think a lot of devs got stuck in the axios habit from before Node 18 when fetch wasn't built-in. Plus axios has that batteries included feel with interceptors, auto-JSON parsing, etc. But for most use cases, native fetch + a few lines of wrapper code beats dragging in a whole dependency.
pbreit · 25m ago
It has always astonished me that platforms did not have first class, native "http client" support. Pretty much every project in the past 20 years has needed such a thing.

Also, "fetch" is lousy naming considering most API calls are POST.

synergy20 · 19m ago
axios works for both node and browser in production code, not sure if fetch can do as much as axios in browser though
vinnymac · 52m ago
Undici in particular is very exciting as a built-in request library, https://undici.nodejs.org
farkin88 · 38m ago
Undici is solid. Being the engine behind Node's fetch is huge. The performance gains are real and having it baked into core means no more dependency debates. Plus, it's got some great advanced features (connection pooling, streams) if you need to drop down from the fetch API. Best of both worlds.
panzi · 14m ago
Unless it has changed how Node.js handles this, you shouldn't use Promise.all(). Because if more than one promise rejects, the second rejection will emit an unhandledRejection event, and by default that crashes your server. Use Promise.allSettled() instead.
vinnymac · 1m ago
Promise.all() itself doesn't inherently cause unhandledRejection events. Any promise that is left unhandled will trigger an `unhandledRejection` event. There are still legitimate use cases for Promise.all, as there are ones for Promise.allSettled, Promise.race, Promise.any, etc. They each serve a different need.

Try it for yourself:

> node

> Promise.all([Promise.reject()])

> Promise.reject()

> Promise.allSettled([Promise.reject()])

Promise.allSettled never results in an unhandledRejection, because it never rejects under any circumstance.
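
A runnable version of the same comparison — allSettled never rejects, it reports each outcome instead:

```javascript
// allSettled resolves with one descriptor per input promise,
// whether that promise fulfilled or rejected.
async function settleDemo() {
  const results = await Promise.allSettled([
    Promise.resolve(1),
    Promise.reject(new Error('boom')),
  ]);
  return results.map((r) => r.status); // ['fulfilled', 'rejected']
}
```
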

bravesoul2 · 14m ago
Anyone else find they discover these sorts of things by accident? I never know when a feature was added, just a vague idea of "that's modern". Feels different from when I only did C# and you'd read the new language features and get all excited. In a polyglot world, with the rate even individual languages evolve, it's hard to keep up! I usually learn through osmosis or a blog post like this (but that is random learning).
tyleo · 2h ago
This is great. I learned several things reading this that I can immediately apply to my small personal projects.

1. Node has built in test support now: looks like I can drop jest!

2. Node has built in watch support now: looks like I can drop nodemon!

pavel_lishin · 36m ago
I still like jest, if only because I can use `jest-extended`.
vinnymac · 31m ago
If you haven't tried vitest I highly recommend giving it a go. It is compatible with `jest-extended` and most of the jest matcher libraries out there.
pavel_lishin · 9m ago
I've heard it recommended; other than speed, what does it have to offer? I'm not too worried about shaving off half-a-second off of my personal projects' 5-second test run :P
hungryhobbit · 17m ago
Eh, the Node test stuff is pretty crappy, and the Node people aren't interested in improving it. Try it for a few weeks before diving headfirst into it, and you'll see what I mean (and then if you go to file issues about those problems, you'll see the Node team not care).
gabrielpoca118 · 2h ago
Don’t forget the native typescript transpiler which reduces the complexity a lot for those using TS
mmcnl · 1h ago
Exactly. You don't even need --experimental-strip-types anymore.
vinnymac · 16m ago
You no longer need to install chalk or picocolors either, you can now style text yourself:

`const { styleText } = require('node:util');`

Docs: https://nodejs.org/api/util.html#utilstyletextformat-text-op...

rco8786 · 1h ago
I've been away from the node ecosystem for quite some time. A lot of really neat stuff in here.

Hard to imagine that this wasn't due to competition in the space. With Deno and Bun trying to eat up some of the Node market in the past several years, it seems like Node development got kicked into high gear.

prmph · 1h ago
I think slowly Node is shaping up to offer strong competition to Bun.js, Deno, etc. such that there is little reason to switch. The mutual competition is good for the continued development of JS runtimes
gear54rus · 1h ago
Slowly, yes, definitely welcome changes. I'm still missing Bun's `$` shell functions though. It's very convenient to use JS as a scripting language, and I don't really want to run 2 runtimes on my server.
adriancooney · 55m ago
You might find your answer with `zx`: https://google.github.io/zx/
austin-cheney · 35m ago
I see two classes of emerging features, just like in the browser:

1. new technologies

2. vanity layers for capabilities already present

It’s interesting to watch where people place their priorities given those two segments

azangru · 1h ago
Matteo Collina says that the node fetch under the hood is the fetch from the undici node client [0]; and that also, because it needs to generate WHATWG web streams, it is inherently slower than the alternative — undici request [1].

[0] - https://www.youtube.com/watch?v=cIyiDDts0lo

[1] - https://blog.platformatic.dev/http-fundamentals-understandin...

vinnymac · 43m ago
If anyone is curious how they are measuring these are the benchmarks: https://github.com/nodejs/undici/blob/main/benchmarks/benchm...

I did some testing on an M3 Max Macbook Pro a couple of weeks ago. I compared the local server benchmark they have against a benchmark over the network. Undici appeared to perform best for local purposes, but Axios had better performance over the network.

I am not sure why that was exactly, but I have been using Undici with great success for the last year and a half regardless. It is certainly production ready, but often requires some thought about your use case if you're trying to squeeze out every drop of performance, as is usual.

fleebee · 1h ago
Nice post! There's a lot of stuff here that I had no idea was in built-in already.

I tried making a standalone executable with the command provided, but it produced a .blob which I believe still requires the Node runtime to run. I was able to make a true executable with postject per the Node docs[1], but a simple Hello World resulted in a 110 MB binary. This is probably a drawback worth mentioning.

Also, seeing those arbitrary timeout limits I can't help but think of the guy in Antarctica who had major headaches about hardcoded timeouts.[2]

[1]: https://nodejs.org/api/single-executable-applications.html

[2]: https://brr.fyi/posts/engineering-for-slow-internet

llimllib · 55m ago
I have a blog post[1] and accompanying repo[2] that shows how to use SEA to build a binary (and compares it to bun and deno) and strip it down to 67mb (for me, depends on the size of your local node binary).

[1]: https://notes.billmill.org/programming/javascript/Making_a_s...

[2]: https://github.com/llimllib/node-esbuild-executable#making-a...

MuffinFlavored · 7m ago
Is current node.js a better language than .NET 6/7/8/9, why or why not?
amclennon · 1h ago
Some good stuff in here. I had no idea about AsyncIterators before this article, but I've done similar things with generators in the past.

A couple of things seem borrowed from Bun (unless I didn't know about them before?). This seems to be the silver lining from the constant churn in the Javascript ecosystem

keysdev · 2h ago
About time! The whole dragging of feet on ESM adoption is insane. A lot of npm packages are still stuck on CommonJS. In some ways, glad jsr came along.
yawnxyz · 1h ago
I feel like node and deno conventions are somehow merging (which is a good thing)
lvl155 · 2h ago
Thank you for this. Very helpful as I was just starting to dig into node for first time in a few years.
asgr · 56m ago
Deno has sandboxing tho
refulgentis · 1h ago
The LLM made this sound so epic: "The node: prefix is more than just a convention—it’s a clear signal to both developers and tools that you’re importing Node.js built-ins rather than npm packages. This prevents potential conflicts and makes your code more explicit about its dependencies."
bashtoni · 2m ago
Agreed. It's surprising to see this sort of slop on the front page, but perhaps it's still worthwhile as a way to stimulate conversation in the comments here?
wavemode · 1h ago
so in other words, it's a convention
Hackbraten · 41m ago
Also no longer having to use an IIFE for top-level await is allegedly a „game changer.“
insin · 25m ago
"SlopDetector has detected 2 x seamlessly and 7 x em-dash, would you like to continue?"
Lockal · 15m ago
Screaming "you’re not just writing contemporary code—you’re building applications that are more maintainable, performant, and aligned"
serguzest · 1h ago
One thing you should add to section 10 is encouraging people to pass `cause` option while throwing new Error instances. For example

new Error("something bad happened", {cause:innerException})
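
A sketch of `cause` in action (function name illustrative) — the original failure stays attached instead of being swallowed:

```javascript
// Re-throwing with { cause } preserves the inner exception.
function parseConfig(raw) {
  try {
    return JSON.parse(raw);
  } catch (innerException) {
    throw new Error('config is not valid JSON', { cause: innerException });
  }
}
```

Whoever catches it can then inspect `err.cause` for the original SyntaxError.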

serguzest · 47m ago
I love Node's built-in testing and how it integrates with VSCode's test runner. But I still miss Jest matchers. The Vitest team ported Jest matchers for their own use. I wish there were a similar compatibility between Jest matchers and Node testing as well.
vinnymac · 33m ago
Currently for very small projects I use the built in NodeJS test tooling.

But for larger and more complex projects, I tend to use Vitest these days. At 40MBs down, and most of the dependency weight falling to Vite (33MBs and something I likely already have installed directly), it's not too heavy of a dependency.

tkzed49 · 40m ago
assertions in node test feel very "technically correct but kind of ugly" compared to jest, but I'll use it anyway
kfuse · 2h ago
Node now has limited support for TypeScript and has SQLite built in, so it becomes really good for small/personal web-oriented projects.
chickenzzzzu · 2h ago
Yet more architecture astronaut behavior by people who really should just be focusing on ifs, fors, arrays, and functions.
triyambakam · 2h ago
Architecture astronaut is a term I hadn't heard but can appreciate. However I fail to see that here. It's a fair overview of newish Node features... Haven't touched Node in a few years so kinda useful.
chickenzzzzu · 2h ago
It's a good one with some history and growing public knowledge now. I'd encourage a deep dive; it goes all the way back to at least C++ and Smalltalk.

While I can see some arguments for "we need good tools like Node so that we can more easily write actual applications that solve actual business problems", this seems to me to be the opposite.

All I should ever have to do to import a bunch of functions from a file is

"import * from './path'"

anything more than that is a solution in search of a problem

MrJohz · 2h ago
Isn't that exactly the syntax being recommended? Could you explain what exactly in the article is a solution in search of a problem?
WickyNilliams · 1h ago
Did you read the article? Your comments feel entirely disconnected from its contents - mostly low level piece or things that can replace libraries you probably used anyway
flufluflufluffy · 2h ago
what? This is an overview of modern features provided in a programming language runtime. Are you saying the author shouldn’t be wasting their time writing about them and should be writing for loops instead? Or are you saying the core devs of a language runtime shouldn’t be focused on architecture and should instead be writing for loops?
programmarchy · 2h ago
One of the core things Node.js got right was streams. (Anyone remember substack’s presentation “Thinking in streams”?) It’s good to see them continue to push that forward.
chickenzzzzu · 2h ago
Why? Why is a stream better than an array? Why is the concept of a realtime loop and for looping through a buffer not sufficient?
bblaylock · 2h ago
I think there are several reasons.

First, the abstraction of a stream of data is useful when a program does more than process a single realtime loop. For example, adding a timeout to a stream of data, switching from one stream processor to another, splitting a stream into two streams or joining two streams into one, and generally all of the patterns one finds in the Observable pattern, in unix pipes, and in event-based systems more broadly, are modelled better in push- and pull-based streams than in a realtime tight loop.

Second, for the same reason that looping through an array with map or forEach is often favored over a for loop, for loops over while loops, and while loops over goto statements: it reduces the amount of human-managed control-flow bookkeeping, which is precisely where humans tend to introduce logic errors.

And lastly, because it almost always takes less human effort to write and maintain stream-processing code than a realtime loop against a buffer.

Hopefully this helps! :D

cluckindan · 1h ago
Streams have backpressure, making it possible for downstream to tell upstream to throttle their streaming. This avoids many issues related to queuing theory.

That also happens automatically, it is abstracted away from the users of streams.

dwb · 2h ago
A stream is not necessarily always better than an array, of course it depends on the situation. They are different things. But if you find yourself with a flow of data that you don't want to buffer entirely in memory before you process it and send it elsewhere, a stream-like abstraction can be very helpful.
WickyNilliams · 1h ago
Why is an array better than pointer arithmetic and manually managing memory? Because it's a higher level abstraction that frees you from the low level plumbing and gives you new ways to think and code.

Streams can be piped, split, joined etc. You can do all these things with arrays but you'll be doing a lot of bookkeeping yourself. Also streams have backpressure signalling

chickenzzzzu · 10m ago
Backpressure signaling can be handled with your own "event loop" and array syntax.

Manually managing memory is in fact almost always better than what we are given in node and java and so on. We succeed as a society in spite of this, not because of this.

There is some diminishing point of returns, say like, the difference between virtual and physical memory addressing, but even then it is extremely valuable to know what is happening, so that when your magical astronaut code doesn't work on an SGI, now we know why.