Reports of Deno's Demise Have Been Greatly Exaggerated
129 points by stephdin | 137 comments | 5/20/2025, 11:33:14 AM | deno.com
The post is a good illustration of why that matters. Very little of it is about Deno itself, instead it’s mostly about the paid-for services Deno Inc offers. They have to prioritise and chase that because their investors want to see financial growth.
It’s the same reason they abandoned the idea of Deno being a fresh start in the JS ecosystem (an idea I loved) and instead started patching in Node compatibility layers. They had to reduce friction, not add to it. But for me that compromised the reason for using it in the first place.
Node has many flaws. And it’s boring. But its existence doesn’t depend upon the whims of random investors who might decide to pull the plug at any moment. So I’m just going to stick with it.
I started working with JS/TS just before Deno 2 came out and having, essentially, full node (and TypeScript) compatibility was the primary reason I switched to it. It is all just so simple in comparison to node.
But I agree about the VC funding - it certainly gives cause for concern about Deno's direction and longevity. Then again, what other option is there, really? Hopefully what was said in this post about the reduction of Deno Deploy locations being a function of usage rather than economics is true.
I imagine that’s exactly the reason! But they outlined their reasoning for a clean break pretty well in their 1.0 announcement post[1] and they haven’t, to my knowledge, posted a follow up “here’s why we were wrong about all that” post.
All of which is to say I understand the business reasons why they did it, but to me it compromises the original technical promise of Deno. A rebooted, sensible JS ecosystem was the reason I was interested in Deno originally. I use Node every day and I’m mostly happy with it but whenever I need to dive into a dependency to see what’s going on it’s a five-layer-deep rat’s nest of transpiled and sometimes even minified code using different module formats and platform-specific APIs. I’d love to be done with all that.
Sometimes it pays to be bold when you’re challenging an entrenched incumbent. Any non-Node JS platform has to pitch "don't use the status quo, take a risk, use me" and absent the original benefit I don’t see a good argument to use Deno, especially when factoring in the risk of VC-driven priorities. I’m not saying everyone has to agree with me on that but it’s my personal perspective.
[1] https://deno.com/blog/v1
Deno still has a permissions model that is very different and far more opt-in than Node. This post makes a case for thinking of Deno's deep, native OpenTelemetry support as something very new and different from Node's approach, and clearly important to the future of application deployment and observability.
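For anyone who hasn't tried it, this is roughly what the opt-in permission model looks like in practice (the file name and the granted host below are just illustrative):

```ts
// fetch_example.ts (illustrative name)
// With no permission flags, Deno prompts for (or denies) network access
// instead of letting the script reach the network silently.
const res = await fetch("https://example.com");
console.log(res.status);

// Grant exactly the access the script needs:
//   deno run --allow-net=example.com fetch_example.ts
```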
Deno's technical promise is still very interesting, especially compared to Node, even with a larger compatibility layer.
Not really.
The biggest issue with Node is the dependence on the fragile NPM ecosystem. Strategically, fixing this is the thing that would distinguish Deno and make it more valuable.
And Node is already adding TS and other features that were initially the reason to leave for Deno.
You have to decide where to go, and apparently not being a niche product was one of the reasons - that's fine, but now they have to live with at least 2+ unhappy (ex?) users.
In a way I think that's a good thing. Their plan for making money is to provide those services. That goal is enhanced by Deno being healthy. I would be more concerned if Deno was the product they were wanting to sell.
As long as Deno itself is FOSS, then I think I'm ok with it.
I really like coding in TypeScript and think that most of people's irritation with JavaScript isn't actually related to the language so much as the ecosystem of NPM. The exponentially growing web of dependencies and the constant churn of deprecations are exhausting, detracting from a core language that is now pretty solid.
Deno set out to change that and be something new, but they squandered that chance because it was too risky for their investors. And again, that's totally fair—resetting an ecosystem is risky and probably wouldn't have yielded the return they needed! But giving up on that was giving up on what made Deno different and interesting. If I'm going to use NPM anyway why not stick with Node while I'm at it?
https://deno.com/blog/v1
IMO their logic still holds up. Dahl had a whole talk about the mistakes made with Node:
https://www.youtube.com/watch?v=M3BM9TB-8yA
The original promise of Deno was a consistent ecosystem. Absent that it doesn’t matter to me all that much how great Deno is within itself, the case for using it simply isn’t compelling enough. These days the newer, standards-compliant Node APIs are pretty good too!
Maybe it was too risky for users? The people with the most appetite for a new start and a new way of doing things are people who are suffering from their existing investment in Node. Making a halfway jump to a new platform with no path to completing the migration would leave their customers running on two platforms indefinitely. It's the worst-case outcome for a migration, going halfway and getting stuck with your feet in two canoes.
By supporting Node, Deno lets customers switch to something new and better and bring their legacy baggage along as well.
[1] https://jsr.io/@std
That said, Next.js achieved widespread adoption and displaced Create React App. And however you feel about the framework, it and React itself are possibly reasons to believe.
What others are out there?
This is a feature. Once upon a time, Node was the new hotness that all the cutting edge hackers were excited to play around with, and needed a hard sell to management. It has since graduated to IBM status - i.e. "no one ever got fired for...". And thank god for that. It's the most mature possible ecosystem choice at this point for the niche it fills, and we are able to build rock solid maintainable systems with it (and hire people who know it deeply). That didn't come cheaply or easily (IO.js drama anyone?), and anything that wants to take its place will need to make it through the same process.
The whole selling point for me was that deno was node without the bullshit and baggage, but they dropped that and basically just turned it into node with built in typescript support and a few other minor things like the permissions.
Similar story with bun.sh - node backwards compatibility (although not using V8).
Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?
If you want the developer experience of using something that’s not Node, you can still get it from Deno.
But it turns out that few people care that much about purity, so it’s fortunate that they’re not relying on that.
The answer is obvious in the programming language case: for those who do not want async, the addition of async/await begins to poison the ecosystem. Now they have a growing list of libraries that they cannot use if they want to avoid async, so the effort involved in picking a library goes up and the odds get increasingly high that they're locked out of some of the key tools in the ecosystem because new libraries without async become harder and harder to find.
For those who really hate colored functions, the addition of async is the removal of a feature: colorless functions are replaced with colored functions.
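To make the "coloring" concrete, a minimal TypeScript sketch: once one function in a call chain becomes async, everything above it has to change too.

```ts
// A dependency that used to be synchronous...
function getConfigSync(): string {
  return "config";
}

// ...ships a new async API: a differently "colored" function.
async function getConfig(): Promise<string> {
  return "config";
}

// Every caller has to change color as well, all the way up the stack,
// because `await` is only legal inside another async function.
async function startApp(): Promise<void> {
  const config = await getConfig();
  console.log(config);
}

startApp();
```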
The same can be said of NPM compatibility. Sure, I can try to avoid it and stick to Deno imports and inspect each library that I use for NPM dependencies. But it gets harder and harder as time goes on, because a key feature of Deno has been removed: it's no longer an ecosystem reset.
So it reminds me more of trying to avoid CGO in Go or avoid “unsafe” in Rust.
It would be worse if Node-specific types started appearing as function parameters or return types, but that seems fairly rare even in npm libraries, so it seems easy to avoid.
Node API yes, NPM library no. If you add a dependency on a library that uses NPM you now depend on an entire web of transitive NPM dependencies, with all of the problems that entails. People don't dislike NPM because it's aesthetically displeasing—you can't just abstract away the problems with NPM. The ecosystem causes real problems with real software, and Deno initially recognized those real problems and set out to reset the ecosystem.
The only way in which NPM-compat is different than colored functions is that there's no static compiler feature telling you when you've added a dependency on a bunch of NPM libraries.
Though, it is nicer if it’s on jsr.io because you’ll see TypeScript source code in the debugger.
There’s nothing about starting over that prevents ending up with a whole new rat’s nest of dependencies, if you’re not careful.
If you're gonna argue that fragmentation is a problem in the node ecosystem (which I agree with), you can't convince me that a plethora of approaches to asynchronous code is preferable to async/await and promises.
1) The original essay that coined this term was looking at it from a language design perspective. The argument is a fair one if that design question is up for debate, but that isn't the case for Javascript.
(In general, I do agree that "you don't have to use it" is not a strong argument.)
The problem is that if you think statically, you can say "oh, just use the 'clean' subset". But the world is not static. If you think dynamically, you can see the full Node ecosystem as a fairly powerful attractor; why would I write a deno library that only works on deno when I can write a node library that works on both? Well, if I'm writing in the Node ecosystem, why not use the whole thing?
This is a general effect; it is very hard for people to set up long-term ecosystems that are "too close" to existing ecosystems. Generally the new thing will either get pulled in (as in this case) or ignored (as in the many cases of 'hey Typescript seems to have worked pretty well, let me do a Typescript-like gloss of this other language', which generally just get ignored). There are successes, like Typescript (JS is in general a special case because being the only language in the browser for so long it was both a language and a compile target; most other attempts to "Typescriptify" a language flounder on the fact that few if any other languages have that situation), Elixir (managed to avoid just being absorbed by Erlang, IMHO kind of a similar situation where the 'base language' for the ecosystem was not really the best), and the occasional Lisp variant that bubbles up (though like Clojure, usually with a story about where it can be run), but in general this is very hard to pull off, harder in some ways than simply starting a brand new language ecosystem, which is itself no picnic.
Also, promises already color functions just like callbacks do. Async/await just changes the syntax by which that coloring is expressed. The real problem people have with async is that they prefer green threads as a solution to concurrency, not that they don't like the syntax.
Of course, in (browser-compatible) Javascript, some things can not be done synchronously, but that's not up for debate.
Because you are losing something: a better ecosystem. Standardizing around… standards is a good thing. When I dive into the dependencies of any given Node app it’s a mess of transpiled code, sometimes minified even, there’s no guarantee what API it’ll be using and whether it’ll run outside Node (is it using the W3C streams API or the Node streams API?). But inertia is a powerful force. People will just use what’s there if they can. So the ecosystem never gets made.
> But it turns out that few people care that much about purity, so it’s fortunate that they’re not relying on that.
By that logic we never build anything newer or better. Python 3 is better than Python 2 and sets the language up for a better future. Transitioning between the two was absolutely torturous and if they just prioritised popularity it would never have happened.
I’m looking forward to whatever they’re going to do instead of KV, which I tried and is too limited, even for a KV store. (64k values are too small.) Something like Cloudflare’s Durable Objects might be nice.
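For anyone who hasn't looked at it, the API itself is pleasant enough; a minimal sketch (KV is still flagged unstable, and the value-size cap is the limit mentioned above):

```ts
// Run with: deno run --unstable-kv kv_example.ts
const kv = await Deno.openKv();

// Keys are arrays of parts; values are structured-clonable,
// but each value is capped at 64 KiB, which is what makes it
// too small for anything beyond fairly simple records.
await kv.set(["users", "alice"], { name: "Alice", plan: "free" });

const entry = await kv.get<{ name: string; plan: string }>(["users", "alice"]);
console.log(entry.value?.name);

kv.close();
```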
You can’t “force” maintainers of old libraries to do anything. Deno never had that power. For people who are interested in supporting multiple alternate platforms, jsr.io seems pretty nice?
> You can’t “force” maintainers of old libraries to do anything. Deno never had that power. For people who are interested in supporting multiple alternate platforms, jsr.io seems pretty nice?
If enough people find Deno useful enough to skip some old libraries, maintainers are "forced", even though Deno is not forcing anyone. If they do not adapt, then someone will just create a new library with better practices. In both cases there is pressure for better JS/TS evolution.
At least that's the theory. To be honest I don't see Deno's value add. The runtime is like... I mean node works fine at this point? And the capabilities system is both too pedantic and too simplistic, so it's not actually useful.
I don't understand the value add of Bun much either. "Native" TypeScript support, but at the end of the day I need a bundler that does more than what these tools tend to offer.
Now if one of these basically had "esbuild but built in"....
I'm sure you can find other projects that are going to fail, but why do you want to?
Node has lots of problems (I am basing this statement on the fact that it's a major tech project). None of them are sufficient to prevent it from being extremely widely used.
To fix those problems in a product that will be used, it is not sufficient to provide something sort of like Node but without those problems. You either have to:
1. Provide a tool that requires a major migration, but has some incredible upside. This can attract greenfield projects, and sometimes edge out the existing tool.
2. Provide a tool with minimal migration cost, and without the same problems. Maybe this tool can replace the existing one. Ideally there will be other carrots (performance, reliability, ease of use). Such a tool can get people to migrate if there are enough carrots, and the migration cost is low enough.
Deno was a classic example of messing this up. It's not #1 or #2, it has the worst of both worlds. The upside was that it did things "the right way", and the downside was that you couldn't run most code that worked on Node. This is the kind of approach that only attracts zealots and hobbyists.
What's the point? If you're in love with static types but have to do JavaScript because you're targeting the browser, I kind of understand why you'd go for TypeScript. But if you're on the backend and don't need anything JS, why limit yourself to TypeScript, which is a "Compile-to-JS" language? You control the stack; make another choice.
People like to sneer at TypeScript, but let's be honest: people like to sneer at anything that's popular enough. The fact is that no language that I enjoy better than TypeScript (which is already not a very long list) stands any chance of adoption in an average workplace.
It also interops nicely with F#, so you can write a pure functional library in F# and call it from a C# program in the "functional core, imperative shell" style.
It has an incredibly solid runtime, and a good type system that balances letting you do what you want without being overly fussy.
That doesn't mean there's anything wrong with it and I've often thought to give it another shot, but it's not a viable option right now for me because it's been too hard to get started.
I realize Microsoft is terrible at naming things, but for .NET/C# it's really not that hard these days. If you want to use the new, cross platform .NET on Linux then just install .NET 8 or 9.
New versions come out every year, with the version number increasing by one each year. Even numbered versions are LTS, odd numbered releases are only supported for about a year. This naming scheme for the cross-platform version of .NET has been used since .NET 5, almost 5 years ago, it's really not too complicated.
I might feel differently if I worked with a large number of people who I didn't trust, but on small to medium teams composed of very senior people using Go feels like coding with one hand tied behind my back.
At this stage, I don't think anyone needs to try to persuade anyone why JavaScript and TypeScript are the lingua franca of software engineering.
Performant, expressive, amazing tooling (not including node/npm), natively cross-platform.
An absolute joy to code with. Why would anyone want to use anything else for general purpose coding?
In my mind there are two alternative approaches in the current ecosystem: C++, where absolute 100% maximal performance is the overriding primary objective and be damned with the consequences, and then for everything else just use TypeScript.
I mean, it obviously isn't, although for web development I'd probably agree with you. But regardless, zealots who hold opinions like this, where there is "one language to rule them all", are why discussing with TS peeps is so annoying. In your world there is either C++ or TypeScript, but the rest of us tend to use different languages depending on what problem we're solving, as they all have different strengths and drawbacks. If you cannot see any drawbacks with TypeScript, it's probably not because there aren't any, but because you're currently blind to them.
> Why would anyone want to use anything else for general purpose coding?
Because you've tasted the world of having a REPL connected to your editor where you can edit running code live and execute forms. Just one example. There are so many languages available out there when you control your stack. I understand using JavaScript for frontend web stuff, because you have no other options, and I personally have nothing against JavaScript itself. But for the love of god, realize there is a world out there outside of your bubble, and some of those languages have benefits too.
https://survey.stackoverflow.co/2024/technology#most-popular... ... JS is the most popular language in the world, per Stack Overflow.
Oh I get it, you’re joking
Because some of us _like_ typescript, or at a minimum, have invested a significant portion of our careers learning ts/js. We want an ROI, we just don't want node/npm.
Right, makes sense. It also makes sense that most of those learnings are transferable; it's not like TypeScript is the only language with types, so your design/architecture skills can be used elsewhere too. Locking yourself into the ecosystem of one language, then asking other runtimes to adhere to your preference, sounds like a sure way of getting disappointed, instead of being pragmatic and flexible enough to choose the right tool for the problem.
It's extremely good! Shame about it being coupled to Javascript.
Also, when I was writing a frontend and backend both in TS, I could literally share the exact same type definitions between them. Then I could use a compiler plugin (`typescript-is`) to validate on the server that payloads match the appropriate type. It was amazing and worked quite well, and I can't really see that being nearly as easy and seamless with anything else.
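The pattern, roughly (typescript-is needs its compiler transform enabled at build time, which isn't shown here, and the file layout and names are just illustrative):

```ts
// shared/types.ts - one definition imported by both the React client and the server
export interface CreatePostRequest {
  title: string;
  body: string;
  tags: string[];
}
```

```ts
// Server side: typescript-is replaces is<T>() with a generated runtime check
// for the shared type, so the payload is validated against the same definition
// the client was compiled against.
import { is } from "typescript-is";
import type { CreatePostRequest } from "./shared/types";

export function parseCreatePost(payload: unknown): CreatePostRequest {
  if (!is<CreatePostRequest>(payload)) {
    throw new Error("payload does not match CreatePostRequest");
  }
  return payload; // now both statically and dynamically known to be CreatePostRequest
}
```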
But isn't that benefit just because TypeScript does compile to JavaScript and is compatible with JavaScript? Remove that compatibility, and you wouldn't get that benefit anymore, right? And if you still can get that benefit, why wouldn't you be able to get that benefit with other languages too?
It's not like TypeScript gives you some inherent benefit that makes it easier to convert to JavaScript, besides the fact that it's literally a "Compile-to-JS" language.
JavaScript does make it easier to target both the web browser and Node.js, sure. But TypeScript also has a fairly mature type system and ecosystem (flaws in `tsc` itself notwithstanding). Not to say that no novel approaches are worth exploring, though; I just haven't seen one that rivals my TS experience yet.
> And if you still can get that benefit, why wouldn't you be able to get that benefit with other languages too?
That depends. In many other programming languages (such as ones that compile to WASM) it's also possible to have common code shared between server and client, but it's usually pretty inconvenient to actually get the code running in both environments. It's also possible to have a common interface definition and generate types for server and client from that definition, but that's still more complicated.
Anyway I don't fault anyone for being disappointed that Deno fell into the Node.js compatibility trap. Pure TypeScript without that particular cruft is also something I was excited about. I also was excited to see what looked like some fair innovation (like their import mechanism and their sandboxing) but I don't know how that'll continue if Node.js compatibility ends up being too much of a time sink.
I don't have very strong opinions because I've never really used Deno and I probably won't even bother at this point, but I definitely would not agree that this is just a problem of needing to use another programming language instead.
Why not use it? What high level programming language would you suggest instead with the same level of performance and ecosystem support?
The way you make a scripting language fast is by getting the hell out of it and into C or C++ as fast as possible, and PHP's library ecosystem embraces that harder than just about any other scripting language, which is the reason (I think).
[EDIT] My point is mainly that Node's performance isn't really that impressive, in the field of "languages one might use to write server-side Web code". It beats absolute slugs like Python and Ruby handily, but past that... eh. And really, in its speed category you'd probably do just as well using either of those and paying a little more attention to things like network calls and database access patterns.
IDK what you mean by "deal with OpenAPI"; OpenAPI is a spec, not a technology like GraphQL.
In all honesty (and sorry for the directness), you don't really seem to understand these concepts and how relevant or not they are to this conversation
It's a JS framework thing. Every mainstream JS framework has server actions or equivalent.
> Having a wrapper around HTTP isn't really a compelling reason to choose a technology for the very large majority of people: probably the opposite actually.
It is way more convenient to write a server action and be able to immediately use it in a client component than having to write an HTTP endpoint in a separate back-end project, and then regenerate your client via OpenAPI, or whatever else you use.
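Roughly what that convenience looks like in Next.js (names below are illustrative; the framework handles the HTTP plumbing and serialization):

```ts
// app/actions.ts - the module-level "use server" directive marks the
// exported async functions as server actions.
"use server";

export async function createPost(title: string, body: string) {
  // Runs only on the server: direct database access, secrets, etc.
  return { id: Date.now(), title, body };
}
```

```tsx
// app/new-post-button.tsx - a client component calling the action like a
// local function, with no handwritten fetch client or OpenAPI-generated bindings.
"use client";
import { createPost } from "./actions";

export function NewPostButton() {
  return (
    <button onClick={async () => console.log(await createPost("Hello", "World"))}>
      Create post
    </button>
  );
}
```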
> IDK what you mean by "deal with OpenAPI"
I mean dealing with tooling to generate an HTTP client from OpenAPI schema.
> In all honesty (and sorry for the directness), you don't really seem to understand these concepts and how relevant or not they are to this conversation
Wrong
No. "Server actions" are a React concept, it has little to do with backend technology (the backend still speaks HTTP). This concept is completely irrelevant to most big frameworks like Express, NestJS, Koa, Hapi. Next is barely a "backend" framework: it's rather a React server framework that has a few basic backend functionalities.
Yes, there is nothing that works better than server actions. None of what you listed really makes sense to me. I have never had any runtime performance problems with TypeScript, and wasn't JavaScript the most popular programming language in the world (the talent pool argument)?
And then you would have to solve the problem of how to communicate with the client.
You aren't suggesting to handwrite an HTTP API client, right? You would have to set up either OpenAPI which is a mess, or GraphQL which is also a mess. LMK if you have a better solution.
The problem is that, unlike when using server actions, when using HTTP APIs, there is nothing that automatically generates bindings.
> If you really want to use OpenAPI for whatever reason
No, I don't. But people use OpenAPI to avoid having to handwrite an HTTP client. This is especially relevant if you are developing a public API.
> But it's not like using OpenAPI or GraphQL are the only two options
What are other options?
Is that really the biggest problem you face when programming? How many endpoints do you have? Even with projects with ~30 endpoints, it doesn't seem problematic to me, but maybe people regularly work on projects with 100s of endpoints, then it kind of makes sense. But I'm not sure that's typical enough.
> No, I don't. But people use OpenAPI to avoid having to handwrite an HTTP client. This is especially relevant if you are developing a public API.
People do a lot of stuff for a lot of nonsense reasons; that doesn't mean it's the best way to approach things. The JS/TS ecosystem seems extra prone to cargo culting, too.
What do you mean by this?
"Vibe coding" as a concept is a fun joke, not a workflow you employ for doing serious engineering. It was a tiny experiment that somehow people thought was a suggested way of developing software, which obviously it isn't. Read the code yourself, otherwise it'll be really hard to call yourself any sort of engineer.
Well I guess making Next.js apps isn't really "serious engineering"
> Read the code yourself, otherwise it'll be really hard to call yourself any sort of engineer.
I do read the code but I barely write any code by hand.
Where did I say that?
> I do read the code but I barely write any code by hand.
Right, so you use the words "vibe coding" yet you don't actually understand the concept? A lot of things make sense now. The description "vibe coding" is explicitly about "programming" with a LLM without reading or writing any code at all, for any purpose. If you read the code, you're not really vibe coding as originally described by Karpathy.
You replied to a comment that says "Yep, Next.js has the best support for vibe coding."
> Right, so you use the words "vibe coding" yet you don't actually understand the concept? A lot of things make sense now.
You can stop arguing that if one glances at the code one is no longer vibe coding, because in practice by looking at the code or even the LLM's thoughts you can catch things you don't want early.
My bad. I was conflating common idioms and actuality.
It seems like they never replied to the criticism about their momentum (something I haven't seen myself - what would the argument even be?). Was that intentional or just missed?
> Some of that criticism is valid.
Would have been great to also outline what criticism is/was valid, and how they're aiming to solve those things. Sure, maybe a bit “shoot yourself in the foot”, but personally I really prefer companies that are upfront about their drawbacks, and it makes it more likely I'll choose them. Migadu is a great example of this: they have a pro/con page where they are upfront about the drawbacks of using Migadu (https://migadu.com/procon/). Just the existence of that page is probably ~20% of why I chose Migadu in the first place.
> Since the release of Deno 2 last October - barely over six months ago! - Deno adoption has more than doubled according to our monthly active user metrics.
The obvious question is: doubled, but compared to what? And what are they measuring? They’re not disclosing any real metrics on adoption.
I think what happened is that people were giving them the benefit of the doubt because they were new and you could imagine huge growth. The disappointment is by comparison to vague hopes and dreams.
At some point, rather than coming up with native solutions to those pain points, they retreated and started leaning on backwards compatibility as a workaround.
Today, Deno feels more complex than Node does because it contains both approaches. And now there are lots of edge cases where a Node package ought to work, but doesn’t because of one unimplemented API or option or a bug that exists only in Deno. My favorite testing framework, AVA, still isn’t supported.
I used to just ignore the npm compatibility layer and target Deno itself, but that’s become more cumbersome to do over time. For example, look at `deno run --help` and look at how many command line options and env vars there are. It’s exploded in the past few years. A lot of that is for npm interoperability. For me, it’s just a lot of noise.
The one area of Node compatibility that I want the most is support for ESLint configs in the Deno linter. Yet they don’t seem to want to do that.
I really want Deno to succeed, if for no other reason than because it’s pushing Node to do things that they should’ve done years ago, such as adding a permission system. I just don’t think the current vision for Deno is very coherent or consistent with its original purpose.
Have you checked recently? The docs (https://docs.deno.com/runtime/fundamentals/testing/) specifically mention AVA as being supported. Then again, I'd assume that most devs using Deno just use the built-in `deno test` instead of a third-party testing framework.
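For reference, a minimal example of what the built-in runner expects (the assertion import is from the standard library on JSR):

```ts
// example_test.ts - picked up automatically by `deno test`
import { assertEquals } from "jsr:@std/assert";

Deno.test("addition works", () => {
  assertEquals(1 + 1, 2);
});
```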
> The one area of Node compatibility that I want the most is support for ESLint configs in the Deno linter.
Again, have you checked recently? According to the docs this is supported: "Deno's built-in linter, `deno lint`, supports recommended set of rules from ESLint to provide comprehensive feedback on your code. (...) You can specify custom rules, plugins, and settings to tailor the linting process to your needs." (https://docs.deno.com/runtime/fundamentals/linting_and_forma...)
I've been using Deno for 6 years now. And I'm actually quite happy that most Deno projects don't have a custom testing and linting setup.
I guess we’ll see soon enough what Deploy will become since that's "imminent".
KV is dead if they've no desire to develop it out of beta and are working on something new. No reason to ever use it for a new project now.
Fresh is being refactored with an alpha in "late Q3 2025 (likely September)". It was a fairly basic framework to begin with. The no compilation/build step was the only interesting idea and that's going away.
The runtime is actively developed but I find this statement amusing:
> We’re not chasing feature parity with other runtimes.
The release notes on Node/NPM compatibility would suggest otherwise.
Yeah, this is a terrible move. Companies aren't relying on KV precisely because it's in beta, not because it was a bad idea. I use Cloudflare Workers KV a lot and I'm not interested in durable objects. I was really interested in Deno KV until now.
Plus the optics of announcing a product and abandoning it are not good. Ryan is a great technical guy but these decisions don't look good from a strategic perspective.
I think you're right, I was just about to use it for something but now I'm considering other options...
It’s like they looked at what Vercel did with introducing a deployment platform after their initial NextJS work and wanted to follow suit.
I had thought a lot of what Deno was setting out to do was cool beans for a time but parity was faster to come from js/node than expected.
So the quote was made when Twain was around 60 years old, and he died roughly 1/4 of that time later.
Deno was released in 2018 and has now quoted the statement, 7 years later. I guess the next 2 years are gonna be interesting?
It's not rare, so kinda.
The fact that a project addresses these rumors at all means that they've noticed a trend and are worried about it.
Just like Meta isn't publishing articles about how React isn't going anywhere - they know it won't, despite the countless articles claiming otherwise.
What this kind of statement actually means is basically "we're not secure, but we can't admit to it as that would cement it." Which, funnily enough, applied to Twain too, as he did indeed suffer from the illness people were gossiping about. It was just a lot less imminently dangerous than the rumors claimed.
I wonder if this is true in general for most people on serverless these days. If so, I wonder whether that was the original intention of the movement, or whether these people just don't want to deal with Docker/k8s.
That covers a massive proportion of the companies that don’t need or want massive scale.
While some people whine about the Node.js compat, I'd assume it's the main point that kept Deno on life-support in the long run.
Bun did it right from the start and it seems people love it. Being quite a bit faster than Node.js (even with the compat APIs) and Deno obviously helps too. If they keep that going, they'd approach Go-level performance.
Can someone help me understand what was lost here? Is there no longer a way to use Deno without using the Node ecosystem?
BTW, I don't use deno and haven't been following any news whatsoever so this is simply a shitty statement from an outsider. It is interesting that I tested deno a couple of times but kept using node until bun came around and I basically switched to bun. I can't say why exactly.
> Rather, reality is: most applications don’t need to run everywhere. They need to be fast, close to their data, easy to debug, and compliant with local regulations. We are optimizing for that.
Why does this sound very odd? I chose not to use Deno Deploy because the region was not close enough and it would have just made everything slower than using other means. (There are many options to host data closer to end users overall, and some regulation also happens at the country level.)
Someone mentioned to me "Deno-style event loops" / "Deno-style main loops". I asked what that is but they were gone. I've tried to look it up, to no avail.
I do quite a bit of work on low level event loops. I'm continually interested in how different projects are doing it and what ideas and tricks they come up with. It bugs me to no end that I can't find anything on what this "Deno style loop" is supposed to be.
Anyone know what's meant / have a pointer or two?
I will say that I was disappointed when they added NPM into the project, I understand why they did it but I would have preferred they not do it.
With that said all of my blogs and client sites are all being happily built in lume with deno right now (hosted on cloudflare) and they have been great for years now. I am still very happy for having made that change.
Business-wise: turn their deploy system into a resource for the browser base - for instance an app store, flash compute/rendering, or agent hosting services.