Reports of Deno's Demise Have Been Greatly Exaggerated

166 stephdin 163 5/20/2025, 11:33:14 AM deno.com ↗

Comments (163)

IshKebab · 1h ago
> Most developers weren’t deploying simple stateless functions. They were building full-stack apps: apps that talk to a database

Honestly that seemed really obvious from the start - it's hard to think of many use cases where this isn't the case. Glad they realised anyway.

diggan · 10h ago
> There’s been some criticism lately about Deno - about Deploy, KV, Fresh, and our momentum in general.

It seems like they never replied to the criticism of their momentum (something I haven't seen myself; what would the argument even be?). Was that intentional, or just missed?

> Some of that criticism is valid.

Would have been great to also outline what criticism is/was valid, and how they're aiming to solve those things. Sure, maybe a bit "shoot yourself in the foot", but personally I really prefer companies that are upfront about their drawbacks; it makes it more likely I'll choose them. Migadu is a great example of this: they have a pro/con page where they are upfront about the drawbacks of using Migadu (https://migadu.com/procon/). Just the existence of that page is probably ~20% of why I chose Migadu in the first place.

skybrian · 9h ago
Here’s how they addressed momentum:

> Since the release of Deno 2 last October - barely over six months ago! - Deno adoption has more than doubled according to our monthly active user metrics.

The obvious question is: doubled, but compared to what? And what are they measuring? They’re not disclosing any real metrics on adoption.

I think what happened is that people were giving them the benefit of the doubt because they were new and you could imagine huge growth. The disappointment is by comparison to vague hopes and dreams.

sholladay · 8h ago
I was excited about Deno precisely because it was a greenfield approach without backwards compatibility. Early on, they focused on reducing complexity and it worked. There were definitely some new pain points compared to Node, but I found them pretty manageable.

At some point, rather than coming up with native solutions to those pain points, they retreated and started leaning on backwards compatibility as a workaround.

Today, Deno feels more complex than Node does because it contains both approaches. And now there are lots of edge cases where a Node package ought to work, but doesn’t because of one unimplemented API or option or a bug that exists only in Deno. My favorite testing framework, AVA, still isn’t supported.

I used to just ignore the npm compatibility layer and target Deno itself, but that's become more cumbersome over time. For example, run `deno run --help` and look at how many command line options and env vars there are. It's exploded in the past few years. A lot of that is for npm interoperability. For me, it's just a lot of noise.

The one area of Node compatibility that I want the most is support for ESLint configs in the Deno linter. Yet they don’t seem to want to do that.

I really want Deno to succeed, if for no other reason than because it’s pushing Node to do things that they should’ve done years ago, such as adding a permission system. I just don’t think the current vision for Deno is very coherent or consistent with its original purpose.

mark_and_sweep · 5h ago
> My favorite testing framework, AVA, still isn’t supported.

Have you checked recently? The docs (https://docs.deno.com/runtime/fundamentals/testing/) specifically mention AVA as being supported. Then again, I'd assume that most devs using Deno just use the built-in `deno test` instead of a third-party testing framework.

> The one area of Node compatibility that I want the most is support for ESLint configs in the Deno linter.

Again, have you checked recently? According to the docs this is supported: "Deno's built-in linter, `deno lint`, supports recommended set of rules from ESLint to provide comprehensive feedback on your code. (...) You can specify custom rules, plugins, and settings to tailor the linting process to your needs." (https://docs.deno.com/runtime/fundamentals/linting_and_forma...)
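For what it's worth, rule selection for `deno lint` lives under the `lint` key of `deno.json`; a minimal sketch (the specific rule names here are my assumptions based on the docs, not verified against the current rule list):

```jsonc
{
  "lint": {
    "rules": {
      // start from the recommended ESLint-style rule set...
      "tags": ["recommended"],
      // ...then opt individual rules in or out
      "include": ["eqeqeq"],
      "exclude": ["no-explicit-any"]
    }
  }
}
```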

I've been using Deno for 6 years now. And I'm actually quite happy that most Deno projects don't have a custom testing and linting setup.

mattlondon · 10h ago
I was super-excited about Deno right up until they threw away their earlier commitments and added backwards compatibility for node and all the shite that comes with it.

The whole selling point for me was that deno was node without the bullshit and baggage, but they dropped that and basically just turned it into node with built in typescript support and a few other minor things like the permissions.

Similar story with bun.sh - node backwards compatibility (although not using V8).

Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?

skybrian · 9h ago
Why do you treat adding a feature (npm compatibility) like you're losing something? You don't have to use any Node APIs in your app; Deno's APIs are pretty comprehensive. You can also stick with the libraries available on jsr.io if you're satisfied with what you can find there.

If you want the developer experience of using something that’s not Node, you can still get it from Deno.

But it turns out that few people care that much about purity, so it’s fortunate that they’re not relying on that.

lolinder · 9h ago
An equivalent argument to yours could be made to defend the introduction of async/await to a language that has previously not had it (edit: like Rust): if you don't like async/await, just don't use it! What does it hurt you to have another feature added?

The answer is obvious in the programming language case: for those who do not want async, the addition of async/await begins to poison the ecosystem. Now they have a growing list of libraries that they cannot use if they want to avoid async, so the effort involved in picking a library goes up and the odds get increasingly high that they're locked out of some of the key tools in the ecosystem because new libraries without async become harder and harder to find.

For those who really hate colored functions, the addition of async is the removal of a feature: colorless functions are replaced with colored functions.

The same can be said of NPM compatibility. Sure, I can try to avoid it and stick to Deno imports and inspect each library that I use for NPM dependencies. But it gets harder and harder as time goes on, because a key feature of Deno has been removed: it's no longer an ecosystem reset.
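The function-coloring mechanics referenced above can be seen in a few lines of TypeScript (all names here are hypothetical, just for illustration):

```typescript
// Sketch of function "coloring": once a callee is async, its result is
// only directly reachable from another async function.
async function fetchCount(): Promise<number> {
  return 42; // stand-in for real async work
}

// A sync caller never sees the number itself, only a Promise:
function describeSync(): string {
  const result = fetchCount();
  return result instanceof Promise ? "got a Promise" : "got a number";
}

// So any caller that needs the value must itself turn async; the
// "color" propagates up the call stack:
async function describeAsync(): Promise<number> {
  return (await fetchCount()) + 1;
}
```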

skybrian · 8h ago
Node compatibility isn’t a language feature, though, and it doesn’t result in “colored” functions. If a Deno library uses a Node API or an npm library somewhere, it can be entirely encapsulated, so you might not even notice until you see it in a stack trace. That doesn’t seem very intrusive?

So it reminds me more of trying to avoid CGO in Go or avoid “unsafe” in Rust.

It would be worse if Node-specific types started appearing as function parameters or return types, but that seems fairly rare even in npm libraries, so it seems easy to avoid.

lolinder · 8h ago
> If a Deno library uses a Node API or an npm library somewhere, it can be entirely encapsulated, so you might not even notice until you see it in a stack trace. That doesn’t seem very intrusive?

Node API yes, NPM library no. If you add a dependency on a library that uses NPM you now depend on an entire web of transitive NPM dependencies, with all of the problems that entails. People don't dislike NPM because it's aesthetically displeasing—you can't just abstract away the problems with NPM. The ecosystem causes real problems with real software, and Deno initially recognized those real problems and set out to reset the ecosystem.

The only way in which NPM-compat is different than colored functions is that there's no static compiler feature telling you when you've added a dependency on a bunch of NPM libraries.

skybrian · 8h ago
I think that’s best addressed by avoiding dependencies and looking for libraries with few indirect dependencies. There are lots of npm packages that advertise few or no dependencies as a feature.

Though, it is nicer if it’s on jsr.io because you’ll see Typescript source code in the debugger.

There’s nothing about starting over that prevents ending up with a whole new rat’s nest of dependencies, if you’re not careful.

wink · 8h ago
I'm not saying it's a realistic view, but I had hoped without any inclusions from NPM there would exist a couple more clean-room (or at least decoupled) implementations of things in TS, leaving everything JS behind.

ogoffart · 4h ago
Off-topic, but the idea of "colored functions" from the "What Color is Your Function" article doesn't apply to Rust's async/await. That article is about JavaScript before it had async/await, when it used callbacks.

In Rust, you can call async functions from normal ones by spawning them on the executor. The .await syntax isn't as painful as dealing with callbacks and closures in JavaScript. Plus, if you call an async function incorrectly, Rust's compiler will catch it and give you a clear error message, unlike JavaScript, where bad function calls could lead to unpredictable behavior. The premises of the article don't apply, so Rust's async/await doesn't introduce the same "colored function" issues.

(See also https://without.boats/blog/let-futures-be-futures/ )

lolinder · 3h ago
I read the article when it hit HN months ago but I don't agree that function coloring doesn't apply. What you're describing is that Rust makes coloring less painful, not that functions aren't colored.

JavaScript itself has come a long way towards making coloring less painful. TypeScript+ESLint solves the weird unpredictable behavior issues with JS and async/await solves the syntax issue. Promises in general give well-defined semantics to calling an async function from a sync function. But all that only undoes some of the arguments about function coloring, not all of them. Fundamentally the same question applies: do you make async-ness part of the type system or do you instead build a system like green threads that doesn't put it in the type system?

I happen to think that coloring functions according to their async-ness is actually the right move (with the right ergonomic improvements), but plenty of people don't agree with me there even with all the ergonomic improvements Rust and TypeScript have made to the model.

incrudible · 8h ago
There is no such thing as a "colorless" alternative to colored functions[1] in Javascript, at least as far as browser-compatibility is concerned. Promises are a convention for what used to be all the colors in the rainbow (and then some imaginary ones). Async/await is syntactic sugar on top that makes it more readable. The inherent pitfalls of asynchronous code don't disappear if you remove that sugar.

If you're gonna argue that fragmentation is a problem in the node ecosystem (which I agree with), you can't convince me that a plethora of approaches to asynchronous code is preferable to async/await and promises.

1) The original essay that coined this term was looking at it from a language design perspective. The argument is a fair one if that design question is up for debate, but that isn't the case for Javascript.

lolinder · 8h ago
To be clear, I like async. I just don't think "you don't have to use it if you don't like it" is a good argument in favor of it because it's obviously not true.

stevage · 8h ago
Not the person you're replying to, but I don't get how your argument applies here. JS functions could already return promises. Some of them being declared as async doesn't change anything for the consumer does it?

(In general, I do agree that "you don't have to use it" is not a strong argument.)

jerf · 7h ago
Using the function color concept was an example of a place where this problem can occur, not the actual problem.

The problem is that if you think statically, you can say "oh, just use the 'clean' subset". But the world is not static. If you think dynamically, you can see the full Node ecosystem as a fairly powerful attractor; why would I write a deno library that only works on deno when I can write a node library that works on both? Well, if I'm writing in the Node ecosystem, why not use the whole thing?

This is a general effect; it is very hard for people to set up long-term ecosystems that are "too close" to existing ecosystems. Generally the new thing will either get pulled in (as in this case) or ignored (as in the many cases of "hey, Typescript seems to have worked pretty well, let me do a Typescript-like gloss of this other language", which generally just get ignored). There are successes: Typescript itself (JS is in general a special case; being the only language in the browser for so long, it was both a language and a compile target, and most other attempts to "Typescriptify" a language flounder on the fact that few if any other languages have that situation), Elixir (which managed to avoid just being absorbed by Erlang; IMHO a similar situation where the "base language" for the ecosystem was not really the best), and the occasional Lisp variant that bubbles up (though, like Clojure, usually with a story about where it can be run). But in general this is very hard to pull off, harder in some ways than simply starting a brand new language ecosystem, which is itself no picnic.

lolinder · 8h ago
I'm not talking about JS. I had Rust in mind.

Also, promises already color functions just like callbacks do. Async/await just changes the syntax by which that coloring is expressed. The real problem people have with async is that they prefer green threads as a solution to concurrency, not that they don't like the syntax.

incrudible · 8h ago
...but you don't have to use it. You can keep using raw promises and you can trivially use any async/promise-based API with informal callbacks. I doubt many people want to do that, but they can.

Of course, in (browser-compatible) Javascript, some things can not be done synchronously, but that's not up for debate.
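To make the "you can keep using informal callbacks" point concrete, a minimal sketch (all names made up) of wrapping a promise-based API in a Node-style callback:

```typescript
// A promise-based API, standing in for fetch() or similar:
function getData(): Promise<string> {
  return Promise.resolve("payload");
}

// An informal Node-style callback wrapper around it:
function getDataCb(cb: (err: Error | null, data?: string) => void): void {
  getData().then(
    (data) => cb(null, data),
    (err) => cb(err instanceof Error ? err : new Error(String(err))),
  );
}

// The caller never writes `await`, though the work is still asynchronous:
getDataCb((err, data) => {
  if (err) console.error(err);
  else console.log(data); // "payload"
});
```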

afavour · 8h ago
> Why do you treat adding a feature (npm compatibility) like you’re losing something?

Because you are losing something: a better ecosystem. Standardizing around… standards is a good thing. When I dive into the dependencies of any given Node app it’s a mess of transpiled code, sometimes minified even, there’s no guarantee what API it’ll be using and whether it’ll run outside Node (is it using the W3C streams API or the Node streams API?). But inertia is a powerful force. People will just use what’s there if they can. So the ecosystem never gets made.

> But it turns out that few people care that much about purity, so it’s fortunate that they’re not relying on that.

By that logic we never build anything newer or better. Python 3 is better than Python 2 and sets the language up for a better future. Transitioning between the two was absolutely torturous and if they just prioritised popularity it would never have happened.

pier25 · 3h ago
> Why do you treat adding a feature (npm compatibility) like you’re losing something?

Because they are losing something.

All the time and money they are investing into Node compat could have been used towards a Deno first-party ecosystem. It's not like they have hundreds of millions to spare; Deno is a small company with limited resources.

People kept complaining that they couldn't use Deno with NPM packages, so Deno ended up focusing on providing faster horses.

nicce · 9h ago
It is less about purity and more about this: why continue improving the Deno APIs, now that the same stuff can be handled with Node or a Node-powered library? Especially if you are driven by profits. That means all available hours will be pulled away from future Deno API development. It also doesn't force older libraries to adapt and publish versions that use the Deno APIs.

skybrian · 9h ago
What Deno APIs do you miss, compared to Node? It seems like they're pretty built out?

I’m looking forward to whatever they’re going to do instead of KV, which I tried and is too limited, even for a KV store. (64k values are too small.) Something like Cloudflare’s Durable Objects might be nice.

You can’t “force” maintainers of old libraries to do anything. Deno never had that power. For people who are interested in supporting multiple alternate platforms, jsr.io seems pretty nice?

nicce · 8h ago
You should think of Deno as a standard library for JavaScript/TypeScript. Node was that. How well does Node compare to Go or Python, for example? We would like to see the most-used small Node libraries merged at some level into Deno's standard library, so that the number of dependencies and deprecations goes down.

> You can’t “force” maintainers of old libraries to do anything. Deno never had that power. For people who are interested in supporting multiple alternate platforms, jsr.io seems pretty nice?

If enough people find Deno useful enough to skip some old libraries, maintainers are "forced", even though Deno is not forcing anyone. If they do not adapt, then someone will just create a new library with better practices. In either case there is pressure for better JS/TS evolution.

rtpg · 9h ago
by adding node compatibility it reduces pressure for libs to be written "the deno way". Libraries that could be cleaner!

At least that's the theory. To be honest I don't see Deno's value add. The runtime is like... I mean node works fine at this point? And the capabilities system is both too pedantic and too simplistic, so it's not actually useful.

I don't understand the value add of Bun much either. "Native" Typescript support but at the end of the day I need a bundler that does more than what these tools tend to offer.

Now if one of these basically had "esbuild but built in"....

int_19h · 7m ago
Bundler is for shipping code, but then you need hacks like tsx to use TS in tests, build scripts etc, and configuring all that can be surprisingly gnarly and prone to breakage (e.g. tsx uses unstable Node APIs).

Although now that Node itself has basic TS support with type-stripping, that substantially improves matters. But it's a fairly recent thing; both Deno and Bun predate it by a long time.

Also Bun has a built-in bundler? I'm not sure how it compares with esbuild tho.

user3939382 · 9h ago
I like JSDoc. Doesn’t have anything to do with Node and you get many of the same benefits without all the compilation toolchain complexity.

pier25 · 3h ago
> Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?

Cloudflare Workers' runtime, workerd, comes to mind, but it's fundamentally a different thing.

https://github.com/cloudflare/workerd

It's not meant to be a generalist backend runtime and it provides almost zero batteries.

hyperpape · 9h ago
> Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?

I'm sure you can find other projects that are going to fail, but why do you want to?

Node has lots of problems (I am basing this statement on the fact that it's a major tech project). None of them are sufficient to prevent it from being extremely widely used.

To fix those problems in a product that will be used, it is not sufficient to provide something sort of like Node but without those problems. You either have to:

1. Provide a tool that requires a major migration, but has some incredible upside. This can attract greenfield projects, and sometimes edge out the existing tool.

2. Provide a tool with minimal migration cost, and without the same problems. Maybe this tool can replace the existing one. Ideally there will be other carrots (performance, reliability, ease of use). Such a tool can get people to migrate if there are enough carrots, and the migration cost is low enough.

Deno was a classic example of messing this up. It's not #1 or #2, it has the worst of both worlds. The upside was that it did things "the right way", and the downside was that you couldn't run most code that worked on Node. This is the kind of approach that only attracts zealots and hobbyists.

diggan · 10h ago
> Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?

What's the point? If you're in love with static types but have to do JavaScript because you're targeting the browser, I kind of understand why you'd go for TypeScript. But if you're on the backend and don't need anything JS, why limit yourself to TypeScript, which is a "Compile-to-JS" language? You control the stack; make another choice.

lolinder · 9h ago
Because out of all the languages that stand a chance of being adopted at most workplaces, TypeScript is in my opinion the single most enjoyable to code in. It has an extremely expressive type system that gets out of your way and allows you to model almost anything you can come up with, it has great IDE integrations in both VS Code and JetBrains, and it has strong first-class support for functional programming patterns. On top of all the things the language itself has going for it, it allows me to write the same language on frontend and backend, which in my experience actually does make a huge difference in avoiding context switching and which also helps with avoiding code duplication.

People like to sneer at TypeScript, but let's be honest: people like to sneer at anything that's popular enough. The fact is that no language that I enjoy better than TypeScript (which is already not a very long list) stands any chance of adoption in an average workplace.

xnorswap · 9h ago
Well there's C# / .NET, which ticks off all of those boxes, even the functional syntax is well supported since it added pattern matching and people write a lot of fluent functional style anyway with LINQ.

It also interops nicely with F#, so you can write a pure functional library in F# and call it from a C# program in the "functional core, imperative shell" style.

It has an incredibly solid runtime, and a good type system that balances letting you do what you want without being overly fussy.

lolinder · 9h ago
It misses the frontend/backend symmetry and has too large a coupling to Microsoft and Windows in my head. I know that these days it's supposed to be cross platform, but every time I've tried to figure out how to install it I get lost in the morass of nearly-identical names for totally different platforms and forget which one I'm supposed to be installing on Linux these days.

That doesn't mean there's anything wrong with it and I've often thought to give it another shot, but it's not a viable option right now for me because it's been too hard to get started.

HideousKojima · 6h ago
>but every time I've tried to figure out how to install it I get lost in the morass of nearly-identical names for totally different platforms and forget which one I'm supposed to be installing on Linux these days.

I realize Microsoft is terrible at naming things, but for .NET/C# it's really not that hard these days. If you want to use the new, cross platform .NET on Linux then just install .NET 8 or 9.

New versions come out every year, with the version number increasing by one each time. Even-numbered versions are LTS; odd-numbered releases are only supported for about a year. This naming scheme for the cross-platform version of .NET has been used since .NET 5, almost five years ago; it's really not too complicated.

lolinder · 1h ago
Fair enough, I guess I haven't looked in the last few years. The last time that I did a search for .NET there were about five different names that were available and Mono still turned up as the runtime of choice for cross platform (even though I knew it wasn't any more).

int_19h · 1m ago
This is mostly the legacy stuff still ranking high up in search results.

These days you just add the Microsoft package repo for your distro and then do `apt install dotnet-sdk-9.0` or whatever.

It's also been spreading into the official distro repos. Nix, Arch, and Homebrew all have it.

HideousKojima · 13m ago
To clear things up for you a bit more (hopefully, or I'll just make it worse):

Any legacy .NET projects are made with .NET Framework 4.x (4.8.1 is the latest). So if it's 4.x, or called .NET Framework instead of just .NET, it's referring to the old one.

.NET Core is no longer used as a name, and hasn't been since .NET Core 3.1. They skipped version 4 completely (to avoid confusion with the old one, but I think they caused confusion with this decision instead) and dropped the "Core" for .NET 5. Some people still call .NET 5+ ".NET Core" (including several of my coworkers), which I'm sure doesn't help matters.

Mono isn't 100% completely dead yet, but you'll have little if any reason to use it (directly). I think the Mono Common Language Runtime is still used behind the scenes by the newer .NET when publishing on platforms that don't allow JIT (like iOS). They've started adding AOT compilation options in the newest versions of .NET so I expect Mono will be dropped completely at some point. Unless you want to run C# on platforms like BSD or Solaris or other exotic options that Mono supports but the newer .NET doesn't.

coolcase · 9h ago
I'm torn between Go (nicer runtime) and Node (get to use TS for nicer types!)

lolinder · 9h ago
For me Go doesn't even come close. It's way too restrictive in what I can and can't do with it.

I might feel differently if I worked with a large number of people who I didn't trust, but on small to medium teams composed of very senior people using Go feels like coding with one hand tied behind my back.

coolcase · 2h ago
I'm talking about the runtime though, e.g. the concurrency story being the main thing. Speed too. Easier multi-platform use. Memory footprint.

mattlondon · 8h ago
Yep I control the stack, and I want it to be typescript.

At this stage, I don't think anyone needs to try and persuade anyone why JavaScript and typescript are the Lingua Franca of software engineering.

Performant, expressive, amazing tooling (not including node/npm), natively cross-platform.

An absolute joy to code with. Why would anyone want to use anything else for general purpose coding?

In my mind there are two viable approaches in the current ecosystem: C++ when absolute 100% maximal performance is the overriding objective, consequences be damned; for everything else, just use Typescript.

diggan · 8h ago
> anyone why JavaScript and typescript are the Lingua Franca of software engineering.

I mean, it obviously isn't, although for web development I'd probably agree with you. But regardless, zealots who hold opinions like this, where there is "one language to rule them all", are why discussing with TS peeps is so annoying. In your world there is either C++ or TypeScript, but the rest of us tend to use different languages depending on what problem we're solving, as they all have different strengths and drawbacks. If you cannot see any drawbacks with TypeScript, it's probably not because there aren't any, but because you're currently blind to them.

> Why would anyone want to use anything else for general purpose coding?

Because you've tasted the world of having a REPL connected to your editor where you can edit running code live and execute forms. Just one example. There are so many languages available out there when you control your stack. I understand using JavaScript for frontend web stuff, because you have no other options, and I personally have nothing against JavaScript itself. But for the love of god, realize there is a world out there outside of your bubble, and some of those languages have benefits too.

hungryhobbit · 7h ago
Do you know what lingua franca means?

https://survey.stackoverflow.co/2024/technology#most-popular... ... JS is the most popular language in the world, per Stack Overflow.

rowanG077 · 5h ago
Do you know what it means? A lingua franca is able to facilitate communication between many parties who do not have a common mother language. JavaScript most certainly does not fit that bill. You could argue C is the lingua franca from the side of the CPU since C runs everywhere, it is literally meant for that. A portable assembly.
mattlondon · 4h ago
C does not run everywhere - you need to compile it to a binary per-platform and per-architecture first, then your platform+arch specific binary only runs on that specific combination... And even then there might be dynamically linked libs to worry about.

You may as well call binary (i.e. 1s and 0s) the Lingua Franca in that case.

Let's not get started on C build chains (especially cross-compiling): cmake vs. cygwin vs. msvc vs. whatever else these days, with hacky-and-brittle ifdef conditionals everywhere just to make it work. Chaos! JavaScript just runs on pretty much any modern computer you can sit down at or put in your pocket, and even on more exotic things that don't have it installed by default you are about 10 seconds away from installing the official package and being off and running.

rowanG077 · 2h ago
Obviously "runs everywhere" means after compilation. No language works without some kind of processing, and no, machine code is not a language. Even if you called machine code a language, it cannot be a lingua franca by definition, since it's designed to be architecture-specific. I'm not sure why you even bring up linked libs, or linking or libraries at all. That's far removed from the language itself; the C language standard does not even prescribe how to link. It's an implementation detail.
coffeebeqn · 7h ago
> amazing tooling

Oh I get it, you’re joking

Octoth0rpe · 10h ago
> why limit yourself to TypeScript which is a "Compile-to-JS" language? You control the stack, make another choice.

Because some of us _like_ typescript, or at a minimum, have invested a significant portion of our careers learning ts/js. We want an ROI, we just don't want node/npm.

diggan · 9h ago
> Because some of us _like_ typescript, or at a minimum, have invested a significant portion of our careers learning ts/js.

Right, makes sense. It also makes sense that most of those learnings are transferable; it's not like TypeScript is the only language with types, so your design/architecture skills can be used elsewhere too. Locking yourself into the ecosystem of one language, then asking other runtimes to adhere to your preference, sounds like a sure way of getting disappointed, instead of being pragmatic and flexible enough to choose the right tool for the problem.

rtpg · 9h ago
Typescript's type system is uniquely good at preventing bugs in "bog standard enterprise code". No other language comes close to it. Two words: untagged unions. And of course all the other utilities it provides.

It's extremely good! Shame about it being coupled to Javascript.

LoganDark · 9h ago
I haven't seen any true competition with TypeScript, at least for me. Go is unsound (Go slices...), Python is too high-level (even with mypyc), Rust is too low-level (in comparison to TS), and so on. It's not just "a language with types".

Also, when I was writing a frontend and backend both in TS, I could literally share the exact same type definitions between them. Then I could use a compiler plugin (`typescript-is`) to validate on the server that payloads match the appropriate type. It was amazing and worked quite well, and I can't really see that being nearly as easy and seamless with anything else.
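A rough sketch of that pattern (the hand-rolled guard here stands in for what a plugin like `typescript-is` generates at compile time; all names are hypothetical):

```typescript
// Shared between client and server, e.g. from a common types.ts:
interface CreateUserPayload {
  name: string;
  age: number;
}

// Hand-written runtime guard; a plugin like typescript-is would derive
// this automatically from the interface.
function isCreateUserPayload(x: unknown): x is CreateUserPayload {
  if (typeof x !== "object" || x === null) return false;
  const o = x as Record<string, unknown>;
  return typeof o.name === "string" && typeof o.age === "number";
}

// Server-side handler: the wire payload is `unknown` until validated.
function handleBody(body: unknown): string {
  if (!isCreateUserPayload(body)) return "400 Bad Request";
  return `created ${body.name}`; // body is now typed as CreateUserPayload
}
```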

diggan · 9h ago
> I could literally share the exact same type definitions between them. Then I could use a compiler plugin (`typescript-is`) to validate on the server that payloads match the appropriate type. It was amazing and worked quite well, and I can't really see that being nearly as easy and seamless with anything else.

But isn't that benefit just because TypeScript does compile to JavaScript and is compatible with JavaScript? Remove that compatibility, and you wouldn't get that benefit anymore, right? And if you still can get that benefit, why wouldn't you be able to get that benefit with other languages too?

It's not like TypeScript gives you some inherent benefit that makes it easier to convert to JavaScript, besides the fact that it's literally a "Compile-to-JS" language.

LoganDark · 8h ago
> But isn't that benefit just because TypeScript does compile to JavaScript and is compatible with JavaScript? Remove that compatibility, and you wouldn't get that benefit anymore, right?

JavaScript does make it easier to target both the web browser and Node.js, sure. But TypeScript also has a fairly mature type system and ecosystem (flaws in `tsc` itself notwithstanding). Not to say that no novel approaches are worth exploring, though; I just haven't seen one that rivals my TS experience yet.

> And if you still can get that benefit, why wouldn't you be able to get that benefit with other languages too?

That depends. In many other programming languages (such as ones that compile to WASM) it's also possible to have common code shared between server and client, but it's usually pretty inconvenient to actually get the code running in both environments. It's also possible to have a common interface definition and generate types for server and client from that definition, but that's still more complicated.

Anyway I don't fault anyone for being disappointed that Deno fell into the Node.js compatibility trap. Pure TypeScript without that particular cruft is also something I was excited about. I also was excited to see what looked like some fair innovation (like their import mechanism and their sandboxing) but I don't know how that'll continue if Node.js compatibility ends up being too much of a time sink.

I don't have very strong opinions because I've never really used Deno and I probably won't even bother at this point, but I definitely would not agree that this is just a problem of needing to use another programming language instead.

skydhash · 9h ago
I don’t understand the need to share types between frontend and backend. It feels like a strong incentive to make the wrong decisions down the line instead of using the right data model for the domain.
LoganDark · 9h ago
I mean JSON payload types.
ChocolateGod · 9h ago
> But if you're on the backend, and don't need anything JS, why limit yourself to TypeScript

Why not use it? What high level programming language would you suggest instead with the same level of performance and ecosystem support.

liveoneggs · 9h ago
Java has (much) better performance and a bigger ecosystem. It has types and multiple alternative languages which target it (kotlin, scala, clojure, etc)
alabastervlog · 7h ago
Hell, last I checked PHP is still king of the server-side Web scripting languages, as far as real-world speed. Like, fast enough that you've gotta be pretty careful in Java or Go or whatever, or you'll end up slower than PHP.

The way you make a scripting language fast is by getting the hell out of it and into C or C++ as fast as possible, and PHP's library ecosystem embraces that harder than just about any other scripting language; that, I think, is the reason.

[EDIT] My point is mainly that Node's performance isn't really that impressive, in the field of "languages one might use to write server-side Web code". It beats absolute slugs like Python and Ruby handily, but past that... eh. And really, in its speed category you'd probably do just as well using either of those and paying a little more attention to things like network calls and database access patterns.

coffeebeqn · 7h ago
Go. If by same level of performance you mean much better performance. The language even comes with an HTTP server built in, so you never have to deal with something like Node.
silverwind · 11m ago
Performance maybe, but you will suffer Go's rudimentary language features and its very limited type system.
koakuma-chan · 10h ago
Why would you use something other than a JS framework to build a web app back-end? You don't have to deal with OpenAPI or GraphQL, you can just use server actions.
williamdclt · 9h ago
"server actions" seems to be a NextJS thing, not a JS thing? JS does not mean React and NextJS. The communication between frontend and backend is (almost) always HTTP, regardless of the languages and frameworks (server actions are http). Having a wrapper around HTTP isn't really a compelling reason to choose a technology for the very large majority of people: probably the opposite actually.

IDK what you mean by "deal with OpenAPI", OpenAPI is a spec not a technology like graphql.

In all honesty (and sorry for the directness), you don't really seem to understand these concepts and how relevant or not they are to this conversation

koakuma-chan · 9h ago
> "server actions" seems to be a NextJS thing, not a JS thing?

It's a JS framework thing. Every mainstream JS framework has server actions or equivalent.

> Having a wrapper around HTTP isn't really a compelling reason to choose a technology for the very large majority of people: probably the opposite actually.

It is way more convenient to write a server action and be able to immediately use it in a client component than having to write an HTTP endpoint in a separate back-end project, and then regenerate your client via OpenAPI, or whatever else you use.

> IDK what you mean by "deal with OpenAPI"

I mean dealing with tooling to generate an HTTP client from OpenAPI schema.

> In all honesty (and sorry for the directness), you don't really seem to understand these concepts and how relevant or not they are to this conversation

Wrong

williamdclt · 5h ago
> Every mainstream JS framework has server actions or equivalent

No. "Server actions" are a React concept, it has little to do with backend technology (the backend still speaks HTTP). This concept is completely irrelevant to most big frameworks like Express, NestJS, Koa, Hapi. Next is barely a "backend" framework: it's rather a React server framework that has a few basic backend functionalities.

koakuma-chan · 5h ago
Okay, every full-stack JS framework.
dboreham · 6h ago
Raising my hand with the opinion that you're wrong. Making it confusing as to whether code is executing on server or client is imho an antipattern.
koakuma-chan · 5h ago
I’m wrong in what? Even if you’re confused whether server actions run on the server or on the client, it doesn’t take away benefits I listed before.
homebrewer · 9h ago
Better/scalable performance, actual runtime type checking without wrapping everything with third-party libraries and paying the associated overhead, talent pool in my area, better (or even existing) libraries for the task at hand, better observability instrumentation, personal preference... is this really an honest question?
koakuma-chan · 9h ago
> is this really an honest question?

Yes, there is nothing that works better than server actions. None of what you listed really makes sense to me. I have never had any runtime performance problems with TypeScript, and isn't JavaScript the most popular programming language in the world (the talent pool argument)?

diggan · 10h ago
Because there are other options available that might be better? Personally I'd choose Clojure for anything I have a choice with. Cargo culting a language like that does no one any favors.
koakuma-chan · 9h ago
> Personally I'd choose Clojure for anything I have a choice with.

And then you would have to solve the problem of how to communicate with the client.

diggan · 9h ago
I dunno, HTTP works pretty well for me as a transport layer to communicate with clients. Otherwise Websockets. Not sure why you'd think that'd be a difficult thing?
koakuma-chan · 9h ago
> Not sure why you'd think that'd be a difficult thing?

You aren't suggesting to handwrite an HTTP API client, right? You would have to set up either OpenAPI which is a mess, or GraphQL which is also a mess. LMK if you have a better solution.

diggan · 9h ago
What exactly is the problem you're encountering when trying to interact with HTTP APIs? If you really want to use OpenAPI for whatever reason, there are plenty of ways of doing that even in Clojure/Java, so I'm sure it's possible in other languages as well. But it's not like using OpenAPI or GraphQL are the only two options, if you're of that opinion I'm afraid you've drank way too much of the koolaid.
koakuma-chan · 9h ago
> What exactly is the problem you're encountering when trying to interact with HTTP APIs?

The problem is that, unlike with server actions, nothing automatically generates bindings when you're using HTTP APIs.

> If you really want to use OpenAPI for whatever reason

No, I don't. But people use OpenAPI to avoid having to handwrite an HTTP client. This is especially relevant if you are developing a public API.

> But it's not like using OpenAPI or GraphQL are the only two options

What are other options?

diggan · 8h ago
> when using HTTP APIs, there is nothing that automatically generates bindings.

Is that really the biggest problem you face when programming? How many endpoints do you have? Even with projects with ~30 endpoints, it doesn't seem problematic to me, but maybe people regularly work on projects with 100s of endpoints, then it kind of makes sense. But I'm not sure that's typical enough.

> No, I don't. But people use OpenAPI to avoid having to handwrite an HTTP client. This is especially relevant if you are developing a public API.

People do a lot of stuff for a lot of nonsense reasons, doesn't mean that's the best way to approach things. The JS/TS ecosystem seems extra easy to fall into cargo culting too.

floydnoel · 8h ago
"handwriting" an HTTP client is too much work for a developer? you need to import a library that does it for you? wow. abstraction at any cost, eh?
koakuma-chan · 8h ago
Yes, you have to waste time handwriting it and making sure that you actually wrote a proper binding. Then you also probably need to deal with API versioning. All of this goes away with server actions.
coolcase · 9h ago
You don't need to deal with those anyway. RPC it: fetch on one end, route on the other.
koakuma-chan · 9h ago
> RPC it: fetch on one end, route on the other.

What do you mean by this?

coolcase · 2h ago
The JS fetch function in the browser.

Add a route on the back end.

RPC means: just call it! Don't worry about REST, GQL, etc.
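The suggestion above can be sketched concretely: one shared type describes the RPC surface, the server dispatches by name behind a single route, and the client calls through a typed wrapper. All names are illustrative, and the transport is simulated in-memory so the sketch runs without a server (in practice `handleRpc` would sit behind a route and `call` would use `fetch`):

```typescript
// Shared between client and server: the RPC surface as a plain type.
type Api = {
  add: (a: number, b: number) => number;
  greet: (name: string) => string;
};

// Server side: the implementation, plus one "route" that dispatches by name.
const impl: Api = {
  add: (a, b) => a + b,
  greet: (name) => `hello, ${name}`,
};

function handleRpc(body: string): string {
  const { method, args } = JSON.parse(body) as { method: keyof Api; args: unknown[] };
  const result = (impl[method] as (...xs: unknown[]) => unknown)(...args);
  return JSON.stringify({ result });
}

// Client side: a typed wrapper; swap `handleRpc` for a real `fetch` call.
function call<M extends keyof Api>(method: M, ...args: Parameters<Api[M]>): ReturnType<Api[M]> {
  const response = handleRpc(JSON.stringify({ method, args }));
  return JSON.parse(response).result;
}

console.log(call("add", 2, 3));      // 5
console.log(call("greet", "deno"));  // hello, deno
```

Because `Api` is shared, renaming a method or changing a parameter type breaks the client at compile time, which is the same guarantee server actions provide, minus the framework.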

liveoneggs · 9h ago
nextjs or bust? This is a wild take
koakuma-chan · 9h ago
Yep, Next.js has the best support for vibe coding.
diggan · 7h ago
> vibe coding

"Vibe coding" as a concept is a fun joke, not a workflow you employ for doing serious engineering. It was a tiny experiment that somehow people thought was a suggested way of developing software, which obviously it isn't. Read the code yourself, otherwise it'll be really hard to call yourself any sort of engineer.

koakuma-chan · 7h ago
> "Vibe coding" as a concept is a fun joke, not a workflow you employ for doing serious engineering.

Well I guess making Next.js apps isn't really "serious engineering"

> Read the code yourself, otherwise it'll be really hard to call yourself any sort of engineer.

I do read the code but I barely write any code by hand.

diggan · 7h ago
> Well I guess making Next.js apps isn't really "serious engineering"

Where did I say that?

> I do read the code but I barely write any code by hand.

Right, so you use the words "vibe coding" yet you don't actually understand the concept? A lot of things make sense now. The description "vibe coding" is explicitly about "programming" with a LLM without reading or writing any code at all, for any purpose. If you read the code, you're not really vibe coding as originally described by Karpathy.

koakuma-chan · 6h ago
> Where did I say that?

You replied to a comment that says "Yep, Next.js has the best support for vibe coding."

> Right, so you use the words "vibe coding" yet you don't actually understand the concept? A lot of things make sense now.

You can stop arguing that glancing at the code means one is no longer vibe coding; in practice, looking at the code or even the LLM's thoughts lets you catch things you don't want early.

msie · 9h ago
Wow…
jgalt212 · 10h ago
maybe because async Python is painful, and async TypeScript/JS is the default.
diggan · 9h ago
Use another language where async isn't painful then? Since it's the backend, you have 100% control over what stack to choose, unless some other requirement gets in the way. And no, async isn't "the default" in at least JavaScript, not sure where you'd get that from.
jgalt212 · 9h ago
> And no, async isn't "the default" in at least JavaScript, not sure where you'd get that from.

My bad. I was conflating common idioms and actuality.

yahoozoo · 9h ago
The problem with Deno is that it has lost the plot. When it was first announced years ago, it was simply a safer, faster JS/TS runtime “written in Rust” which I assume it still is but now you go to the website and you click a “Products” drop down containing a bunch of other shit.

It’s like they looked at what Vercel did with introducing a deployment platform after their initial NextJS work and wanted to follow suit.

bredren · 9h ago
Was it this or did the JavaScript / node communities get their acts together?

I had thought a lot of what Deno was setting out to do was cool beans for a time but parity was faster to come from js/node than expected.

candiddevmike · 10h ago
Post is by the CEO and doesn't really address the criticisms around Deno, just seems to justify their own internal decisions (or his?). Seems like Deno products work really well for Deno though!
nchmy · 10h ago
What criticisms of deno do you think went unaddressed?
candiddevmike · 10h ago
They don't really address stability, and even go as far as to say they aren't chasing parity. Blog post gives off major "you're holding it wrong" vibes.
azdavis · 10h ago
This is likely in response to https://news.ycombinator.com/item?id=43863937
dbushell · 10h ago
Doesn't inspire confidence.

I guess we’ll see soon enough what Deploy will become since that's "imminent".

KV is dead if they've no desire to develop it out of beta and are working on something new. No reason to ever use it for a new project now.

Fresh is being refactored with an alpha in "late Q3 2025 (likely September)". It was a fairly basic framework to begin with. The no-compilation/build-step approach was the only interesting idea, and that's going away.

The runtime is actively developed but I find this statement amusing:

> We’re not chasing feature parity with other runtimes.

The release notes on Node/NPM compatibility would suggest otherwise.

pier25 · 6h ago
> KV is dead

Yeah this is a terrible move. Companies aren't relying on KV precisely because it's in beta, not because it was a bad idea. I use Cloudflare Workers KV a lot and I'm not interested in durable objects. I was really interested in Deno KV until now.

Plus the optics of announcing a product and abandoning it are not good. Ryan is a great technical guy but these decisions don't look good from a strategic perspective.

fallinditch · 8h ago
> KV is dead if they've no desire to develop it out of beta and are working on something new. No reason to ever use it for a new project now.

I think you're right, I was just about to use it for something but now I'm considering other options...

wyuenho · 10h ago
> Most developers weren’t deploying simple stateless functions. They were building full-stack apps: apps that talk to a database, that almost always is located in a single region.

I wonder if this is true in general for most people on serverless these days. If so, was this the original intention of the movement, or do these people just not want to deal with Docker/k8s?

mosura · 9h ago
My gut feeling is that people want a modernized heroku. Managed RDBMS and an auto scaling set of servers that use it.

That covers a massive proportion of the companies that don’t need or want massive scale.

o_m · 7h ago
Most people and even most companies don't need horizontal scaling. Hardware has been much faster and cheaper since Heroku's heyday. Scaling vertically with 80+ cores on a single CPU and 256gb+ of ram only costs a few hundred dollars a month these days. With caching on a server like that, it can handle a million requests a second, or tens of thousands a second for dynamic data from the database on the same server.
nicce · 10h ago
> One of the biggest questions we’ve been hearing is about Deno Deploy — specifically, the reduction in available regions. While we understand the optics of this scaling back, it isn’t for the reasons often feared or accused.

> Rather, reality is: most applications don’t need to run everywhere. They need to be fast, close to their data, easy to debug, and compliant with local regulations. We are optimizing for that.

Why does this sound very odd? I chose not to use Deno Deploy because the region was not close enough, and it would have just made everything slower than the alternatives. (There are many options to host data closer to end users, and some regulations also apply at the country level.)

k__ · 8h ago
It might just be my perception, but I had the impression Deno got its ass whooped by Bun and Node.js.

While some people whine about the Node.js compat, I'd assume it's the main point that kept Deno on life-support in the long run.

Bun did it right from the start and it seems people love it. Being quite a bit faster than both Node.js (even with the compat APIs) and Deno obviously helps too. If they keep that going, they'd enter Go levels of performance.

ctz · 13m ago
It is instructive to compare Bun and Deno's issue tracker. Like, the five most recent issues for Bun at the time of writing are all crashes. Some of these are controlled panics or assertion failures, but others are like "we are now executing from address -1" or "we are trying to read from address 0x00000069." Recently written software simply should not have these classes of problem.
eknkc · 10h ago
Whenever I read a blog post assuring me that something is not how it looks, it turns out to be exactly how it looks at the end.

BTW, I don't use deno and haven't been following any news whatsoever so this is simply a shitty statement from an outsider. It is interesting that I tested deno a couple of times but kept using node until bun came around and I basically switched to bun. I can't say why exactly.

CuriouslyC · 9h ago
Bun has high Node compatibility with lightning-fast testing and a good/fast built-in package manager. I'd use Bun for local dev even if I was deploying with Node.
bredren · 9h ago
Why not esbuild? It was ~fast enough first and free of capital entanglements.
CuriouslyC · 6h ago
Bun is still faster, and Bun's testing is insanely fast: I had a test suite that would take 30 seconds with Jest that finished in 800ms with Bun. Plus Bun's networking performance is insane compared to Node, and you can have a lot more concurrent clients on a light VPS (think 1GB).
fluidcruft · 10h ago
I agree also as an outsider. These sorts of "meta" discussions always smell of spin aimed at investors and usually are not good news for customers. Customers generally care about things like product and long term reliability and stability. These meta things always have the tone of Monty Python's "Bring out your dead!" segment.
teucris · 6h ago
I’m seeing some debate on Deno’s decision to ensure Node compatibility, apparently as it gives up a core value prop of early Deno to try and hit the reset button.

Can someone help me understand what was lost here? Is there no longer a way to use Deno without using the Node ecosystem?

ffsm8 · 7h ago
Mark Twain was born in 1835, made the quote in 1897, and died in 1910.

So he was around 62 when he made the quote, and he perished roughly a fifth of that span later.

Deno was released in 2018 and has now quoted the statement, 7 years later. I guess the next 2 years are gonna be interesting?

thunderbong · 6h ago
Is this a pattern seen elsewhere?
ffsm8 · 6h ago
Of projects and companies saying "we're not going to close down" and then - shortly after - shutting down?

It's not rare, so kinda.

The fact that a project addresses these rumors at all means they've noticed a trend and are worried about it.

Just like Meta isn't publishing articles about how React isn't going anywhere - they know it won't, despite the countless articles claiming otherwise.

What this kind of statement actually means is basically "we're not secure, but we can't admit to it, as that would cement it." Which funnily enough applied to Twain too, as he did indeed suffer from the illness people were gossiping about. It was just a lot less imminently dangerous than the rumors claimed.

jppope · 9h ago
Personally I love what Ryan and the Deno team have done. If there is anything to really say, it's that incumbency in languages and software ecosystems is really strong - and it's stronger the further down the stack you get.

I will say that I was disappointed when they added NPM into the project, I understand why they did it but I would have preferred they not do it.

With that said all of my blogs and client sites are all being happily built in lume with deno right now (hosted on cloudflare) and they have been great for years now. I am still very happy for having made that change.

rafram · 9h ago
I’m not sure why anyone would want their JS runtime to be their package manager, code formatter, compiler, bundler, web framework, KV store, and cloud provider(!) all at the same time. There’s just no way that they can ship the best product in all of those categories.
rglover · 2h ago
It's less likely (and really, depends on the quality of the team and their attention to detail—it's not a "law"), but certainly not a "no way." I'm far more concerned when the official answer to a feature missing is "just use a third-party!" That spoils a lot of tools for me as I immediately read that as "you're on your own, buddy, have fun with your Frankenstein app!"

Over time, unless the team building the thing is entirely tone deaf, I'd expect each individual tool to improve as demanded/necessary. Not only that, but knowing that those tools are being thought about as parts of a whole is deeply comforting (I trust them more than standalone tools as interdependency headaches have likely been solved).

One of the biggest headaches in JS is the tendency for tool builders to just eschew responsibility in favor of sending their community on a goose chase. I commend the Deno folks for taking this approach. We should have more, not less of this attitude in the ecosystem.

ecares · 9h ago
Deno is just a marketing company dressed as a software startup
rc00 · 8h ago
And Rust is the perfect lingua franca to accomplish this. The demise of both of these trends cannot come soon enough.
popcorncowboy · 10h ago
I get the "earnest, 'authentic', 'responsible' engagement by the CEO", but this post and title is lifted straight out of media-playbook-fails-101. Post a title like this at your peril. The content doesn't "own" the missteps, it writes the epitaph of Deno. All this article does is validate the "reports" of "demise" and unavoidably presents as "doth protest too much". If you insist on engaging with a negative narrative there are more constructive ways to frame it. Don't talk about "committing" to anything, just DO. Ryan, if you do nothing else, think about changing the title. But this entire post should ideally get rewritten. There are some really positive things you're doing. But they're covered in stink.
eqvinox · 8h ago
This is really OT, but if I don't ask now I might never get an answer…

Someone mentioned to me "Deno-style event loops" / "Deno-style main loops". I asked what that is but they were gone. I've tried to look it up, to no avail.

I do quite a bit of work on low level event loops. I'm continually interested in how different projects are doing it and what ideas and tricks they come up with. It bugs me to no end that I can't find anything on what this "Deno style loop" is supposed to be.

Anyone know what's meant / have a pointer or two?

skybrian · 8h ago
I’m guessing the use of Promises rather than Node-style network programming, which gets pretty hairy.
TiredOfLife · 8h ago
"Google Stadia is not shutting down" was posted by Stadia 2 months before being shut down.
NHQ · 7h ago
Deno ought to become an engine for the innovative development of web browsers. That is what we need, and what it could very well offer. Better permissions, protocol choices, simpler extensions, all around more options and control.

Business-wise, turn their deploy system into a resource for the browser base: for instance an app store, flash compute/rendering, or agent hosting services.

afavour · 10h ago
I’m sure a bunch of the criticism of Deno is exaggerated. But there’s something fundamental holding me back from investing my time in Deno, or Bun for that matter: they’re both VC funded.

The post is a good illustration of why that matters. Very little of it is about Deno itself, instead it’s mostly about the paid-for services Deno Inc offers. They have to prioritise and chase that because their investors want to see financial growth.

It’s the same reason they abandoned the idea of Deno being a fresh start in the JS ecosystem (an idea I loved) and instead started patching in Node compatibility layers. They had to reduce friction, not add to it. But for me that compromised the reason for using it in the first place.

Node has many flaws. And it’s boring. But its existence doesn’t depend upon the whims of random investors who might decide to pull the plug at any moment. So I’m just going to stick with it.

nchmy · 10h ago
Could it be that they added node compatibility because people wanted node compatibility? If investors pushed for it as well, then they were just being sensible...

I started working with JS/TS just before Deno 2 came out and having, essentially, full node (and TypeScript) compatibility was the primary reason I switched to it. It is all just so simple in comparison to node.

But, I agree about the VC funding - it certainly gives cause for concern about Deno's direction and longevity. But what other option is there, really? Hopefully what this post said about the reduction of Deno Deploy locations being a function of use rather than economics is true.

vegadw · 1h ago
> "But what other option is there, really?"

Not taking VC funding, having slow organic growth, making a good product, and having pride in your work?

Like, maybe I'm missing something, but why does the end goal always have to be VC funding and acquisition? Is it too much to ask to stay independent and just make something you take pride in and enjoy the craft over many years of a successful, but not self-cannibalizing, business?

I dunno man, I just keep seeing every smaller business's end goal be to get acquired or turn into a money-pumping SaaS, and it's just depressing. Let's make good things, enjoy delivering a product that people like, and spread good. Keep yourself (and your employees, if you have them) making a good living, and be happy?

nchmy · 1h ago
Have you considered that it's a much larger job than someone can just bootstrap? Also, Node itself was initially sponsored by a corporate entity, and that led to considerable problems as well. Now it has backing in many other ways.

Also, are you aware that Deno is being built by literally the creator of Node? This isn't some get-rich-quick scheme - it's something that he deeply wants to see exist. He's also leading the charge against Oracle (a genuine parasite) for the copyright/trademark of JavaScript.

afavour · 8h ago
> Could it be that they added node compatibility because people wanted node compatibility

I imagine that’s exactly the reason! But they outlined their reasoning for a clean break pretty well in their 1.0 announcement post[1] and they haven’t, to my knowledge, posted a follow up “here’s why we were wrong about all that” post.

All of which is to say I understand the business reasons why they did it, but to me it compromises the original technical promise of Deno. A rebooted, sensible JS ecosystem was the reason I was interested in Deno originally. I use Node every day and I'm mostly happy with it, but whenever I need to dive into a dependency to see what's going on, it's a five-layer-deep rat's nest of transpiled and sometimes even minified code using different module formats and platform-specific APIs. I'd love to be done with all that.

Sometimes it pays to be bold when you’re challenging an entrenched incumbent. Any non-Node JS platform has to pitch "don't use the status quo, take a risk, use me" and absent the original benefit I don’t see a good argument to use Deno, especially when factoring in the risk of VC-driven priorities. I’m not saying everyone has to agree with me on that but it’s my personal perspective.

[1] https://deno.com/blog/v1

WorldMaker · 6h ago
A lot of the boldness stands though, even with the compatibility layer, and it's a "layer" more than a pivot for Deno. deno.json is still far simpler than package.json. Deno still takes a batteries included approach with smart defaults by default. Deno still pushes you toward a modern-standards "native" approach: ESM by default; ESM native libraries including a growing "standard library" on JSR; your dependency graph is still mostly an importmap you can also dump directly into a browser, too (even some of the compatibility shims with the npm ecosystem).

Deno still has a permissions model that is very different and far more opt-in than Node. This post makes a case for thinking of Deno's deep, native OpenTelemetry support as something very new and different from Node's approach, and clearly important to the future of application deployment and observability.

Technically Deno is still very interesting in technical promise, especially compared to Node, even with a larger compatibility layer.

nchmy · 1h ago
You nailed it. It's incredible how many dunces here are lamenting how Deno abandoned everything, when all they really did was add a compatibility layer.
afavour · 14m ago
Maybe next time instead of name calling you could consider that other people simply have different perspectives to you?

I and others are lamenting that the compatibility layer removed incentive to help create a new JS ecosystem that isn’t layers of garbage piled on top of each other. That new ecosystem is what I wanted and Deno is no longer the path to it. If that makes me a “dunce”, so be it.

nchmy · 1h ago
Evidently people didn't sufficiently value the technical promise of Deno and just wanted a (MUCH) better node. But, it also has plenty of new, bold, extra things (a sibling comment elaborates on it much better than I can). I, for one, am quite happy with it.

Given that you still use Node, you might want to try Deno 2 out... It'll likely solve a lot of your headaches.

afavour · 12m ago
I’ve tried it, I like it, but the positives for me personally aren’t worth investing in a VC backed product. I don’t have many headaches with Node itself these days.
bityard · 7h ago
I almost never write server-side JS/TS so I don't have a horse in this race, but it sounds like a good time and reason for a community fork that eschews legacy compatibility in order to focus on only modern features.
afavour · 6h ago
Announcing Oden...
zem · 6h ago
when it hits 1.0 they can feature freeze and call it "done"
pier25 · 7h ago
> If investors pushed for it as well, then they were just being sensible...

Not really.

The biggest issue with Node is the dependence on the fragile NPM ecosystem. Strategically, fixing this is the thing that would distinguish Deno and make it more valuable.

And Node is already adding TS and other features that were initially the reason to leave for Deno.

nchmy · 1h ago
Deno created JSR, and it already has TS (Node may take a while to implement that) and plenty more.
pier25 · 34m ago
JSR is essentially NPM with a couple of extra improvements.

And it's only a matter of time until Node has full TS support.

wink · 8h ago
Yes, of course: more people probably wanted Node compat than explicitly didn't want it (like the person you replied to, and me too).

You have to decide where to go, and apparently not being a niche product was one of the reasons - that's fine, but now they have to live with at least 2+ unhappy (ex?) users.

Lerc · 7h ago
>it’s mostly about the paid-for services Deno Inc offers.

In a way I think that's a good thing. Their plan for making money is to provide those services. That goal is enhanced by Deno being healthy. I would be more concerned if Deno was the product they were wanting to sell.

As long as Deno itself is FOSS, then I think I'm ok with it.

WorldMaker · 6h ago
Also, I've been using Deno Deploy for hobby projects and it is a delight to work with so far. In terms of finding a product that is a good complement to the open source Deno, they seem to have good ideas. Though I'm still in the VC-subsidized freeloader category in my hobby usage today, so I haven't experienced it yet as a paid product.
lolinder · 9h ago
Yep. Honestly, the pivot to Node/NPM compatibility was the moment I lost interest in Deno. I know why they did it, and as you say from a financial perspective it makes complete sense, but they had the chance to be a fresh start to the whole ecosystem and they gave that up.

I really like coding in TypeScript and think that most people's irritation with JavaScript isn't actually related to the language so much as to the NPM ecosystem. The exponentially growing web of dependencies and the constant churn of deprecations are exhausting, detracting from a core language that is now pretty solid.

Deno set out to change that and be something new, but they squandered that chance because it was too risky for their investors. And again, that's totally fair—resetting an ecosystem is risky and probably wouldn't have yielded the return they needed! But giving up on that was giving up on what made Deno different and interesting. If I'm going to use NPM anyway why not stick with Node while I'm at it?

the_gipsy · 8h ago
They basically said they were taking a risk (node/npm incompatible) for a big long-term benefit. They gave up on that forever, for some short-term growth. How many more times would they back-pedal on any risk they announced taking?
skybrian · 8h ago
They did say something like that, but I don’t remember what the big long-term benefit was supposed to be. What specifically did they give up? Maybe it wasn’t that important after all.
afavour · 8h ago
Their original announcement post covers it pretty well:

https://deno.com/blog/v1

IMO their logic still holds up. Dahl had a whole talk about the mistakes made with Node:

https://www.youtube.com/watch?v=M3BM9TB-8yA

skybrian · 8h ago
I haven’t done all that much network programming in Deno, but I think it’s still fairly easy to get “promises all the way down” by sticking with Deno API’s? What’s your experience with this?
afavour · 8h ago
My complaint is not with Deno APIs. From what I’ve seen they’re great. My problem arrives the moment you install a dependency, because it isn’t using Deno APIs. And digging into exactly what any dependency is doing is often an odyssey through transpiled-to-ES3 JavaScript, outdated APIs and so on.

The original promise of Deno was a consistent ecosystem. Absent that it doesn’t matter to me all that much how great Deno is within itself, the case for using it simply isn’t compelling enough. These days the newer, standards-compliant Node APIs are pretty good too!
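(For context, the "promises all the way down" point is that Deno's core APIs return promises natively, while Node historically used callbacks. The newer standards-compliant Node APIs mentioned above do close much of that gap; a minimal runtime-agnostic sketch, assuming Node 18+ or Deno with Node compat:)

```typescript
// Promise-based file I/O via the `node:` prefix - available in modern
// Node and, through the compatibility layer, in Deno as well.
import { readFile, writeFile } from "node:fs/promises";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Write a string to a temp file and read it back, promises end to end.
async function roundTrip(path: string, data: string): Promise<string> {
  await writeFile(path, data, "utf8");
  return readFile(path, "utf8");
}

const path = join(tmpdir(), "deno-demo.txt");
const text = await roundTrip(path, "promises all the way down");
console.log(text); // → "promises all the way down"
```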

monooso · 4h ago
Surely you can choose to only install "Deno native" dependencies?

It may sometimes be difficult to find such an option, but that was always going to be the case without Node compatibility.

Now, in theory at least, you have the option of sticking with Deno native dependencies, and an escape hatch when none are available.

That seems like the most pragmatic solution to the ideology vs adoption dilemma.

the_gipsy · 6h ago
They gave up on a quality Deno ecosystem. And being relevant.
skydhash · 9h ago
Yeah. JavaScript is fine if you’re dealing with the DOM and vendor libraries, or if you’re using it in some scripting environment like GNOME. Node was OK too, but it failed to develop a standard library like Go’s or Python’s. Addressing that failure would go a long way towards a better JS ecosystem.
skybrian · 8h ago
Deno’s standard libraries [1] seem pretty nice, but to be honest I never used them much, because lots of stuff on MDN works fine in Deno.

[1] https://jsr.io/@std
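(The "works fine in Deno" point is that web-standard APIs documented on MDN - URL, TextEncoder, fetch, crypto - are globals there, no std import needed. A small sketch that runs unchanged in Deno, modern Node, or a browser console:)

```typescript
// Web-standard globals from MDN - no imports, no runtime-specific APIs.
const url = new URL("https://jsr.io/@std/path?version=1.0.0");
const version = url.searchParams.get("version") ?? "";
const bytes = new TextEncoder().encode(version);

console.log(url.hostname, bytes.length); // → jsr.io 5
```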

indigodaddy · 7h ago
I stumbled upon Deno when I needed to spin up a simple API to add to/update a CSV file, and really the only thing I found was deno-csv library, and it worked great. I was pleased with how easy it was with Deno, had it going in under an hour.
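(For flavor, the core of such a task - parsing a CSV and appending a row - is only a few lines. This is a runtime-agnostic illustration of the idea, not the deno-csv API itself, and it assumes simple quote-free fields:)

```typescript
// Minimal CSV helpers: an illustration of the kind of add/update
// endpoint described, NOT the deno-csv library's actual API.
type Row = Record<string, string>;

// Parse header + data lines into an array of keyed rows.
function parseCsv(text: string): Row[] {
  const [header, ...lines] = text.trim().split("\n");
  const cols = header.split(",");
  return lines.map((line) => {
    const cells = line.split(",");
    return Object.fromEntries(cols.map((c, i) => [c, cells[i] ?? ""]));
  });
}

// Append one row, ordering its fields by the existing header.
function appendRow(text: string, row: Row): string {
  const cols = text.trim().split("\n")[0].split(",");
  const line = cols.map((c) => row[c] ?? "").join(",");
  return text.trim() + "\n" + line + "\n";
}

const csv = "id,name\n1,Ada\n";
const updated = appendRow(csv, { id: "2", name: "Grace" });
console.log(parseCsv(updated).length); // → 2
```
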
dkarl · 8h ago
> Deno set out to change that and be something new, but they squandered that chance because it was too risky for their investors

Maybe it was too risky for users? The people with the most appetite for a new start and a new way of doing things are people who are suffering from their existing investment in Node. Making a halfway jump to a new platform with no path to completing the migration would leave their customers running on two platforms indefinitely. It's the worst-case outcome for a migration, going halfway and getting stuck with your feet in two canoes.

By supporting Node, Deno lets customers switch to something new and better and bring their legacy baggage along as well.

lolinder · 8h ago
It was always going to be too risky for a subset of users, from the moment they announced it. That would not have stopped a project that was not VC funded—a smaller project with less at stake could easily have stuck to their guns and just appealed to the people who were actually interested in pioneering a new ecosystem.
1vuio0pswjnm7 · 4h ago
"But there's something fundamental holding me back from investing my time in Deno, or Bun for that matter: they're both VC funded."

For me it is the lack of support for musl. Perhaps there is a connection between inattention to certain details and being VC-funded.

bredren · 9h ago
Yes. And this is also why Poetry remains good enough in the face of PDM.

That said, Next.js achieved widespread adoption and displaced Create React App. However you feel about the framework, it and React itself are possibly reasons to believe.

What others are out there?

ramesh31 · 8h ago
>And it’s boring.

This is a feature. Once upon a time, Node was the new hotness that all the cutting edge hackers were excited to play around with, and needed a hard sell to management. It has since graduated to IBM status - i.e. "no one ever got fired for...". And thank god for that. It's the most mature possible ecosystem choice at this point for the niche it fills, and we are able to build rock solid maintainable systems with it (and hire people who know it deeply). That didn't come cheaply or easily (IO.js drama anyone?), and anything that wants to take its place will need to make it through the same process.

v3ss0n · 10h ago
There are very few positives in moving an existing Node project to Deno - and those are mostly frontend-related.
redwood · 9h ago
Anyone looking at Mastra on top of Deno for at-scale concurrent agent orchestration?