Codex CLI is going native

139 points by bundie | 125 comments | 6/1/2025, 11:22:55 AM | github.com ↗

Comments (125)

wiseowise · 1d ago
Recent resurgence of tools going native is really interesting. For years (at least the last decade) I've heard the mantra that JITs/interpreters are getting so good that you don't need to bother with native languages anymore, that clients can be written in Python/Ruby because it's a one-off operation anyway, etc., etc.

And now everyone is rewriting everything in Go/Rust left and right.

amazingamazing · 2m ago
> And now everyone is rewriting everything in Go/Rust left and right.

Seems like confirmation bias.

cameronh90 · 1d ago
It really comes down to the Go and Rust ecosystems being easy to work with. A decade or two ago, writing a native app meant setting up an automake/autoconf Rube Goldberg machine - especially difficult if you needed to build for multiple architectures.

I'd argue Rust and Go are even easier to work with than Python/JS/TS. The package management is better, and static linked native binaries eliminate so many deployment headaches.

pjmlp · 11h ago
As someone educated in the likes of BASIC and Z80, it isn't as if it was really that complicated, versus the willingness to learn.

Sometimes people really need to follow Yoda's and Mr Miyagi's advice, instead of jumping right into it.

cameronh90 · 9h ago
It is complicated, though. Not as complicated as quantum physics, but still far more complicated than it needs to be - especially if you care about multiple architectures and platforms. At one point I was making builds for x86, AMD64 and Itanium on Windows, macOS, and Linux (which itself was subdivided into various distributions with different glibc/openssl/whatever versions). It took more work maintaining the build pipeline than working on features.

Go and Rust prove you can get most of the benefit of C/C++ without paying that complexity cost.

pjmlp · 7h ago
Well, Modula-2 and Object Pascal already proved that quite a while ago, but they didn't come with curly brackets, nor UNIX.
jvanderbot · 11h ago
Both can be true. It can be easy to learn and also a complete pain to set up and get right for every new project and menu of architectures you want to support.
fragmede · 19h ago
Cross-compiling to Mac from Linux, cross-architecture, used to be an impossible bitbake adventure to go on. In Go, it's just two env vars!
metaltyphoon · 19h ago
With a big footnote: Until you need CGO
supriyo-biswas · 18h ago
Having recently done some investigation along the same lines for a side project that I did not continue to work on: is it not just a matter of passing CGO_LDFLAGS=--static and building on musl?
steveklabnik · 7h ago
To say what your siblings said, but more generally: basically this only works for Linux. Other OSes generally don't let you do this, you must have some degree of dynamic linking.

Go used to try and let you do it, but has walked back those implementations after all the bugs they've caused, in my understanding.

saagarjha · 15h ago
I don't think you can do this when targeting macOS
metaltyphoon · 10h ago
or FreeBSD
fragmede · 19h ago
fair.
diggan · 1d ago
> And now everyone is rewriting everything in Go/Rust left and right.

Especially interesting for software that is 99.9% of the time waiting for inference to come back to you. Sure, it makes sense to rewrite something that relies heavily on the CPU, or where you want an easier time dealing with concurrency, but I feel like that's not what makes Codex take a long time to finish a prompt.

With that said, I also rewrote my local LLM agent software in Rust, as it's easier to deal with concurrency compared to my initial Python prototype. That it compiles into a neat binary is an additional benefit, but I could have just as easily gone with Go instead of Rust.
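
For a taste of what that looks like, here's a minimal sketch with tokio; `fetch_completion` and `run_tool` are made-up stand-ins for my own agent's functions, not anything from Codex:

  // Fan out several I/O-bound tasks (e.g. tool calls) and collect their
  // results while the main inference request is in flight.
  use tokio::task::JoinSet;

  async fn fetch_completion(prompt: String) -> String {
      // ... call the inference API; the agent spends most of its time here
      format!("response to: {prompt}")
  }

  async fn run_tool(name: &'static str) -> String {
      // ... read a file, run a command, etc.
      format!("{name}: done")
  }

  #[tokio::main]
  async fn main() {
      let completion = tokio::spawn(fetch_completion("refactor this".into()));

      // Run auxiliary tools concurrently instead of one after another.
      let mut tools = JoinSet::new();
      for name in ["read_file", "grep", "run_tests"] {
          tools.spawn(run_tool(name));
      }
      while let Some(result) = tools.join_next().await {
          println!("{}", result.unwrap());
      }

      println!("{}", completion.await.unwrap());
  }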

gwynforthewyn · 1d ago
> Especially interesting for software that is 99.9% of the time waiting for inference to come back to you.

In a different domain, I've seen a CLI tool that requests an OAuth token in Python be rewritten in Rust and get a huge performance boost. The Rust version had requested a token and presented it back to the app in a few milliseconds, but it took Python about five seconds just to load the modules the OAuth vendor recommends.

That’s a huge performance boost, never mind how much simpler it is to distribute a compiled binary.

rybosome · 19h ago
I’ve spent some time optimizing Python performance in a web app and CLI, and yeah it absolutely sucks.

Module import cost is enormous, and while you can do lots of cute tricks to defer it past startup in a long-running app because Python is highly dynamic, for one-time CLI operations that don't run a daemon or something there's just nothing you can do.

I really enjoy Python as a language and an ecosystem, and feel it very much has its place…which is absolutely not anywhere that performance matters.

EDIT: and where there's a viable alternative. Python is the ML language.

emporas · 1d ago
Python's startup cost is terrible. Same with Node. Go is very good, but Rust is excellent.

Even if a GC'ed language like Go is very fast at allocating/deallocating memory, Rust often has no need to allocate/deallocate that memory in the first place. The programmer gives the compiler the tools to optimize memory management, and machines are better at optimizing memory than humans. (Some kinds of optimizations, anyway.)
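
To make that concrete, a small sketch (illustrative only, nothing to do with Codex's code): the first function borrows and allocates nothing, the second explicitly allocates, and both results are freed deterministically at end of scope, with no collector involved:

  // Borrowing vs. allocating: the first function returns a view into the
  // input and allocates nothing; the second builds a new String on the heap.
  fn first_word(s: &str) -> &str {
      s.split_whitespace().next().unwrap_or("")
  }

  fn first_word_owned(s: &str) -> String {
      s.split_whitespace().next().unwrap_or("").to_string()
  }

  fn main() {
      let line = String::from("codex cli going native");

      let borrowed = first_word(&line);    // no allocation, just a view
      let owned = first_word_owned(&line); // one explicit heap allocation

      println!("{borrowed} / {owned}");
      // Both values are dropped deterministically here, no GC pass needed.
  }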

nasretdinov · 13h ago
TBH I'm still surprised how quickly Go programs start up given how much stuff there is in init() functions, even in the standard library (e.g. unicode tables, etc.)
teaearlgraycold · 1d ago
Packaging Python apps is pure hell. npm gets a lot of shit, but Python deserves as much if not more.
dayjah · 1d ago
I know far too much about python packaging while only knowing a little about it.

I agree it’s hell. But I’ve not found many comprehensive packaging solutions that aren’t gnarly in some way.

IMHO the Python Packaging community have done an excellent job of producing tools to make packaging easy for folks, especially if you're using GitHub Actions. Check out: https://github.com/pypa/cibuildwheel

PyPA has an extensive list of GitHub Actions for various use cases.

I think most of us end up in the “pure hell” because we read the docs on how to build a package instead of using the tools the experts created to hide the chaos. A bit like building a deb by hand is a lot harder than using the tools which do it for you.

teaearlgraycold · 1d ago
That’s fair. I’m also thinking about the sheer size of Python apps that make use of the GPU. I have to imagine a C++ app performing neural network shenanigans wouldn’t be >1GB before downloading weights.
QuadmasterXLII · 22h ago
I’ve tried; it is still gigabytes, unless you try to dynamically link to user-installed CUDA libraries from C++. Which I don’t recommend.
teaearlgraycold · 8h ago
Oof
crabmusket · 1d ago
Speaking of which, I didn't realise Node had a built in packaging feature to turn scripts into a single executable:

https://nodejs.org/api/single-executable-applications.html

bxdhxhxh · 18h ago
I was not aware of that feature either, thanks for the heads up

In my opinion, bundling the application payload would be sufficient for interpreted languages like Python and JavaScript.

dweekly · 1d ago
Everything follows a cycle. Client/server, flat/skeuomorphic, compiled/interpreted, cloud/colo, colorful/minimalist, microservices/monolithic, standards-driven/vendor-driven, open/proprietary, CPU/xPU. There is a natural tension in the technical universe that pushes progress in one direction that loads the spring in the other direction. Over time you'll see the cycles rhyme.
BrouteMinou · 1d ago
I can't wait for the "rewritten in Rust" era to end!
tonyhart7 · 1d ago
Wait until Zig (and its ecosystem) gets mature first.

That's the next trend.

apwell23 · 1d ago
Crazy how fast "rewritten in Go" ended. Now suddenly all those native Go projects are uncool again.
jeremyloy_wt · 22h ago
The official Typescript compiler is being rewritten in Go as we speak.
apwell23 · 21h ago
ah, interesting

"The existing code base makes certain assumptions -- specifically, it assumes that there is automatic garbage collection -- and that pretty much limited our choices."

mbb70 · 1d ago
Feels like the big shift is Rust hitting critical mass mindshare and LLM assisted translation making these rewrites viable. Rust is a very vibable language.

An announcement of Codex CLI being rewritten in C++ would be met with confusion and derision.

energy123 · 1d ago
> Rust is a very vibable language.

Why would you say this for Rust in particular?

falcor84 · 1d ago
Once a Rust program finally compiles, it's much more likely to be correct than in other languages, at least in the sense of avoiding unexpected runtime behavior.
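
A toy sketch of the kind of thing the compiler enforces (illustrative only, not Codex's actual code): exhaustive matching turns a forgotten case into a compile error rather than a runtime surprise:

  // Adding a new variant to `Provider` later turns every non-exhaustive
  // match into a compile error instead of a surprise at runtime.
  enum Provider {
      OpenAi,
      Anthropic,
      Local,
  }

  fn base_url(p: &Provider) -> &'static str {
      match p {
          Provider::OpenAi => "https://api.openai.com",
          Provider::Anthropic => "https://api.anthropic.com",
          Provider::Local => "http://localhost:8080",
          // Omit a variant and this function no longer compiles.
      }
  }

  fn main() {
      println!("{}", base_url(&Provider::Local));
  }
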
csomar · 19h ago
If you properly structure your types and document them well, you can speed run lots of code generation.
wrsh07 · 1d ago
Good error messages, which are great for humans but also great for LLMs that are trying to debug their initial attempt.
energy123 · 1d ago
What about this which says the opposite:

https://news.ycombinator.com/item?id=44149809

koakuma-chan · 1d ago
The comment you replied to is talking about Rust compiler's errors.

The comment you linked is talking about an unspecified application's runtime errors.

lmm · 20h ago
I suspect it's accidents of history that those are "native". Go's advantage is in its Google backing, and Rust is just a good language design that's managed to hit on the right kind of community. As far as I can see all the reasons that native was unnecessary are still valid.
pjmlp · 1d ago
And then they ship an Electron based GUI on top.

Note that most of these rewrites wouldn't be needed if the JIT language were Java, C#, Common Lisp, Dart, Scheme, or Racket.

And everything on that list also has AOT compilers and JIT cache tooling.

NitpickLawyer · 19h ago
> And then they ship an Electron based GUI on top.

If this catches on, and more tools get the "chatgpt, translate this into rust, make it go brrr" treatment, hopefully someone puts in the time & money to take Tauri that extra 10-20% left to make it a universal Electron replacement. Tauri is great, but still has some pain points here and there.

h1fra · 1d ago
For CLIs, I'm sure the biggest drawbacks of Node are the lack of portability and the very large footprint of the "executables".
gschier · 1d ago
You really notice the startup delay with a large CLI written in something like NodeJS. Going from >100ms to ~0ms is a huge QoL improvement.
rane · 15h ago
It largely depends on how often you have to run the command.
miki123211 · 1d ago
CLIs are different than other software, as they're launched and killed often.
CSMastermind · 1d ago
If only we could shift off of React Native as well.
ralusek · 1d ago
Speaking for myself: one reason would be because it's one of the things LLMs/tools like Codex are the most useful for.

When I have a smallish application, with tests, written in one language, letting an LLM convert those files into another language is the single task I'm most comfortable handing over almost entirely. Especially when I review the tests and all tests in the new language are passing.

crop_rotation · 1d ago
Codex is terrible compared to Claude Code, even though Anthropic's individual models are not really that much better than OpenAI's. They should have made that their top priority instead of a rewrite, which will just make the improvements take a back seat.
serverlessmania · 14h ago
Had an awful experience with Claude Code—it couldn’t even handle a basic API test for existing code, while Codex with o3 nailed it in one shot.

Claude Code tends to write meaningless tests just to get them to pass—like checking if 1 + 1 = 2—and somehow considers that a job well done.

adsharma · 1d ago
Wouldn't it be a great showcase if Codex showed that the rewrite to Rust was done using Codex itself instead of a human?

If it's not possible today, what are the challenges and where does a human need to step in and correct the model?

csomar · 19h ago
If it were possible, then OpenAI would stop selling AI and start selling services.
tptacek · 21h ago
This has been coming for a while now (there's been a Rust fork available to try forever); as I understand it, the impetus here is just getting the CLI installed in places where Node is annoying to install.
pjmlp · 1d ago
With all the RIIR going on in the nodejs ecosystem, I am waiting for the blog posts on how people completely got rid of nodejs, rewriting the whole backend in Rust, and how everything is so much better.

Waiting for Show HN: AbrasionNext, the framework evolution for frontend devs, with SaaS cloud deployment.

wiseowise · 1d ago
It seems Rust "just" needs RoR level web framework and Qt level GUI framework to take over the world; everything else is already conquered (with some Go enclaves still holding on).
ogoffart · 1d ago
> Qt level GUI framework

Trying to bring that with Slint: https://slint.rs

csomar · 19h ago
> RoR level web framework

This is not happening. The new folks have moved to SPA/RSC and a RoR type framework doesn't make much sense in that context.

spiderice · 6h ago
Elixir/Phoenix is doing great
tonyhart7 · 1d ago
"It seems Rust "just" needs RoR level web framework"

It's already HERE: https://loco.rs/

I'm writing a production-level app with it right now.

satvikpendem · 1d ago
> I am waiting for the blog posts on how people completely got rid of nodejs, rewriting the whole backend in Rust

I did this for several projects, works great with much lower costs and compute/memory usage.

gavinray · 1d ago
This is one of the most ridiculous RIIRs I've seen.

It's a CLI tool that makes API calls. I'd bet my bottom dollar that the performance difference between API-wrapping CLI tools in something like Ruby/Python vs Rust/C++ is negligible in perceived experience.

If they wanted people to not have a dependency on Node pre-installed, they could have shipped Single Executable Applications [0] or used a similar tool for producing binaries.

Or used Deno/Bun's native binary packaging.

[0] - https://nodejs.org/api/single-executable-applications.html

franga2000 · 1d ago
Having the CLI self-contained and easy to cross-compile is a huge improvement, as is the binary size (from what I've seen, wrapped Node binaries are huge). Also, a CLI tool should have low latency and no matter how good your engine is, an interpreter will have much higher startup latency than a proper binary.
ramoz · 23h ago
It's not just API calls though.

It's often parallel processing of I/O (network, filesystem) and computational tasks like testing and compiling code.

yahoozoo · 12h ago
These tools just run the test/compile commands in a shell…
geertj · 1d ago
I wonder how much of the rewrite was done using codex itself. It does seem like a perfect use case for it.
joshka · 1d ago
I imagine a reasonable amount. The maintainer who is doing most of the Rust rewrite submitted a PR to one of the Ratatui widget libraries I maintain that seemed to be Codex produced[1].

[1]: https://github.com/joshka/tui-markdown/pull/80

tymscar · 1d ago
This is an interesting proof of something I keep reading about, where people are quick to make something, like a PR, with AI, but then doing the last 10% is left in the air.

If Codex was half as good as they say it is in the presentation video, surely they could’ve sent a request to the one in ChatGPT from their phone while waiting for the bus, and it would’ve addressed your comments…

winrid · 21h ago
The last 10% you're referring to are nits. That's like the last 0.000001%. Also, it could have fixed all these in like a minute by itself.
suddenlybananas · 17h ago
Why didn't it?
apwell23 · 1d ago
That's an interesting question: how much of the code at these companies is written by the tools they are selling?

I suspect it's not much, because I never see any stats published by any of these companies.

NitpickLawyer · 19h ago
Aider does publish this, and it's fascinating - https://aider.chat/HISTORY.html

> Aider writes most of its own code, usually about 70-80% of the new code in each release. These statistics are based on the git commit history of the aider repo.

apwell23 · 18h ago
> Whenever aider edits a file, it commits those changes with a descriptive commit message.

Interesting model: every accepted prompt response gets a git commit without human modifications.

quotemstr · 1d ago
It's interesting how people conflate language choice with memory management strategy, compilation mechanism, distribution toolset, and type system strictness. I mean, on one hand, people explain that Rust's manual memory management makes it fast, but then also praise garbage collected Go for its speed and low startup latency. It's not that manual=fast and GC=slow: it's that Go's GC doesn't suck, and you can make fast GCed programs and slow manually memory managed ones (like games and their infamous loading screens). The equation of GC with slow is just a simplifying idea embedded in industry consciousness.

Likewise, you can make a single file distribution of a TypeScript program just fine. (Bun has built in support even.) But people don't think of it as a "thing" in that ecosystem. It's just not the culture. TypeScript means npm or Electron. That's the equivalence embedded in the industry hive mind.

To be clear, I'm not decrying this equivalence. Simplification is good. We use language as a shorthand for a bundle of choices not necessarily tied to language itself. You can compile Python with Nuitka or even interpret C. But why would you spend time choosing a point on every dimension of technical variation independently when you could just pick a known good "preset" called a language?

The most important and most overlooked part of this language choice bundle is developer culture. Sure, in principle, language choice should be orthogonal to mindset, areas of interest, and kinds of aptitude. But let's be real. It isn't. All communities of human beings (Go developers included) evolve shared practices, rituals, shibboleths, and priesthoods. Developer communities are no exception.

When you choose, say, Rust, you're not just choosing a language. You're choosing that collection of beliefs and practices common among people who like to use that language. Rust people, for example, care very much about, say, performance and security. TypeScript people might care more about development speed. Your C people are going to care more about ABI stability than either.

Even controlling for talent level, you get different emphases. The Codex people are making a wire format for customizing the agent. C people would probably approach the problem by making a plugin system. TypeScript people would say to just talk to them and they'll ship what you need faster than you can write your extension.

Sometimes you even see multiple clusters of practitioners. Game development and HFT might use the same C++ syntax, but I'd argue they're a world apart and less similar to each other than, say, Java and C# developers are.

That's why language debates get so heated: they're not just expressing a preference. They're going to war for their tribe. Also, nothing pisses a language community off more than someone from a different community appropriating their sacred syntax and defiling it by using it wrong.

Codex isn't so much swapping out syntax as making a bet that Rust cultural practices outcompete TypeScript ones in this niche. I'm excited to see the outcome of this natural experiment.

rgbrgb · 1d ago
I enjoy this ethnographic take on programming language choice being as much about culture as technical features. I wonder what effect LLM coding agents will have here as it makes big rewrites between languages trivial and potentially allows developers to quickly shift their programs between languages to gain advantages. Echoes some of the immigration debates the right stirs up.
pjmlp · 11h ago
Many are yet to fully grasp that their favourite languages will become irrelevant when agents start spewing executables directly.

We are in the middle of a transition in programming paradigms.

Let the AI coding models flamewars start.

pjmlp · 1d ago
Somehow, compilers should be part of any kind of computing-related teaching, as a means to dispel programming-language urban myths.

Unfortunately, that is a utopia that will never be realised.

People will keep learning programming languages based on hearsay, whatever books they find somewhere, influencers, and whatnot.

quotemstr · 1d ago
If I were drafting a CS curriculum, I'd include as mandatory projects writing both a C interpreter and a Scheme compiler.
ThouYS · 1d ago
Sounds like a post-hoc rationalization for the whims of their devs. That being said, Node was a weird choice to begin with.
justachillguy · 4h ago
Would love to see OpenAI add Codex CLI as part of their subscription and not API pricing.
vendiddy · 1d ago
I would love for them to optimize the ChatGPT desktop client. It lags on me quite frequently.
jbellis · 1d ago
I'm surprised that performance wasn't on the list. The main reason I started Brokk on the JVM was for access to https://github.com/joernio/joern, but high performance code that can easily go multicore is a very nice secondary reason.
jauntywundrkind · 1d ago
On the list:

> Optimized Performance — no runtime garbage collection, resulting in lower memory consumption

Introducing the list (efficiency resonates with me as a more specific form of performance):

> Our goal is to make the software pieces as efficient as possible and there were a few areas we wanted to improve:

jbellis · 1d ago
GC has negligible impact on LLM agents.

The others ("zero dependencies") are not actually related to efficiency.

jauntywundrkind · 21h ago
It's not totally unjustified to home in on the GC mention as over-focusing. But that itself feels like strong over-focusing to me.

Efficiency is the top-level goal, and that equates directly to performance in most computing tasks: more efficiency means being able to let other work happen. There are times where single-threaded outright speed is better, but usually in computing we try as hard as possible to get parallelism or concurrency in our approaches, such that efficiency can directly translate to overall performance.

Overall performance as a bullet seems clear enough. Yes, it's occluded by a mention of GC, but I don't think the team is stupid enough to think GC is the only performance win they might get here, even if they don't list other factors.

Even a pretty modest bit of generosity makes me think they're doing what was asked for here. Performance is very explicitly present, and to me it is quite clearly an objective.

simianwords · 15h ago
GC is especially not relevant in async flows. I don't know what they are optimising for...
CGamesPlay · 1d ago
It sounds like what they're actually doing is going closed-source, and using a Rust rewrite as cover for that.

This is interesting, because the current Codex software supports third-party model providers. This makes sense for OpenAI Codex, because it is the underdog compared to Claude Code, but perhaps they have changed their stance on this.

[edit] Seems that this take is incorrect; the source is in the tree.

mappu · 1d ago
mdaniel · 23h ago
For all this "AI is the future," how in the world does OpenAI's own codebase still have "TODO" comments for the most trivial thing I can possibly imagine? <https://github.com/openai/codex/blob/rust-v0.0.2505302325/co...> made extra wtf by a comment at the top of the file laying out that requirement, so no "?" required <https://github.com/openai/codex/blob/rust-v0.0.2505302325/co...>

I would bet it took more wall-clock time to type out that comment than it would have for any number of AI agents to snap the required equivalent of `if not re.match(...): continue` into place

quesera · 23h ago
Seems uncharitable to me.

  // TODO: Verify server name: require `^[a-zA-Z0-9_-]+$`?
There may be several elements of server name verification to perform.

That regex does not cover the complete range of possibilities for a server name.

The author apparently decided to punt on the correctness of this low-risk test -- pending additional thought, research, consultation, or simply higher prioritization.
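
For scale, a sketch of roughly what the deferred check could look like using the regex crate; this is purely illustrative, not the actual Codex implementation:

  // Sketch of the deferred check, not the actual Codex code: accept only
  // names matching ^[a-zA-Z0-9_-]+$ and reject everything else.
  use regex::Regex;

  fn is_valid_server_name(name: &str) -> bool {
      // In real code you'd compile the regex once (e.g. with std::sync::LazyLock).
      let re = Regex::new(r"^[a-zA-Z0-9_-]+$").expect("valid pattern");
      re.is_match(name)
  }

  fn main() {
      for name in ["my-server_1", "bad name!", ""] {
          println!("{name:?} -> {}", is_valid_server_name(name));
      }
  }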

CGamesPlay · 23h ago
OK, that's a good sign. I didn't see any links to the source in the Github thread, and the only mention of contributing in that thread is an email to apply for a job. I'm not going to delete my comment, because the subtree you showed doesn't have a LICENSE file so it isn't totally clear, but I agree it does appear to at least continue to be source-available.

Reviewing the source for this tree, looks like it's been in public development for a fair amount of time, with many PRs.

xyzzy_plugh · 23h ago
It is in fact totally clear. There is a LICENSE file in the root of the repository. Adding a new subtree (directory) should not call into question whether or not that tree is covered by the preexisting license. That's silly. If they wanted to change the license then there needs to be an actual change to the license.
phito · 7h ago
Imagine writing a CLI in JS in the first place...
npalli · 18h ago
People will do anything to avoid reading and maintaining someone else's code. If that means rewriting in native, and someone (usually VCs) is paying for it, so be it. If you think working in existing TS/JS code is hard, wait until someone hands you their 100k+ line Rust codebase and asks you to make changes. In five years, another big shift to rewrite and change everything.
drodgers · 14h ago
Making changes to huge rust projects is quite easy. For a substantial alteration, you make your change, the compiler tells you the 100 problems it caused, and you fix them all (~50% auto fix, 30% Claude/Codex, 20% manual), then the program probably does the thing.

Architecting the original 100kloc program well requires skill, but that effort is heavily front loaded.

antimora · 1d ago
Now waiting for Claude Code to be rewritten in Rust.
light_hue_1 · 11h ago
Everyone is missing why they're doing this. It has nothing to do with anything in the post and it's not a technical decision.

It's a way to close off codex. There's no point in making a closed source codex if it's in typescript. But there is if it's in rust.

This is just another move to make OpenAI less open.

lioeters · 1d ago
They're rewriting Codex CLI from TypeScript to Rust, for performance, security, zero dependency install, extensibility.
laurent_du · 1d ago
What kind of performance gains can you get from that? Seems to me that 99.9% of the compute happens remotely.
tux3 · 1d ago
Probably minimal, it's mostly lower memory use from not having to boot a separate V8 engine with its own GC, like how Electron apps have to boot a separate browser engine. But CPU-wise it's not doing anything interesting on the client.

The neat thing for me is just not needing to set up a Node environment. You can copy the native binary and it should run as-is.

Wowfunhappy · 1d ago
There are various solutions for turning node programs into standalone binaries, by more-or-less including Node within the binary. One I've used before with success is "pkg".
satvikpendem · 1d ago
I would rather not have to package the entire runtime of a language just to run one program, hence Rust is a great choice for this.
koakuma-chan · 1d ago
Don’t have to wait 2 seconds for the runtime to start.
wrsh07 · 1d ago
If you're parsing JSON or other serialization formats, I expect that can be nontrivial. Yes, it's dominated by the LLM call, but (de)serialization is pure CPU.

Also, ideally your lightweight client logic can run on a small device/server with bounded memory usage. If OpenAI spins up a server for each Codex query, the size of that server matters (at scale/cost), so shaving off MBs of overhead is worthwhile.
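
For the curious, the client-side CPU work being described is roughly this kind of thing; a sketch using serde_json with a made-up event shape, not necessarily what Codex actually does:

  // Deserializing a simplified, made-up streaming event: pure CPU work on
  // the client, independent of how long inference itself takes.
  use serde::Deserialize;

  #[derive(Debug, Deserialize)]
  struct StreamEvent {
      id: String,
      delta: String,
      done: bool,
  }

  fn main() -> Result<(), serde_json::Error> {
      let raw = r#"{"id":"evt_1","delta":"fn main() {}","done":false}"#;
      let event: StreamEvent = serde_json::from_str(raw)?;
      println!("{} -> {:?}", event.id, event);
      Ok(())
  }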

yahoozoo · 1d ago
Yeah, they could probably get away with doing this in Go.
littlestymaar · 1d ago
Sure, both languages are equally fit for this kind of task.
qsort · 1d ago
Probably uses less memory?

The big one is not having node as a dependency. Performance, extensibility, safety, yeah, don't really warrant a rewrite.

mrweasel · 1d ago
I get why OpenAI didn't invest time and money into this, but I do wonder if there's some reason that nobody has written a JavaScript frontend for LLVM.

There shouldn't be a reason why you couldn't and it would give you performance and zero dependency install.

wiseowise · 1d ago
> but I do wonder if there's some reason that nobody has written a JavaScript frontend for LLVM

Astral folks are taking notes. (I wouldn't be surprised if they already have a super secret branch where they rewrite Python and make it 100x faster, but without AI bullshit like Mojo).

littlestymaar · 1d ago
What do you mean by “a JavaScript frontend for LLVM”?

Edit: ah, I see, I read "LLM" instead of LLVM at first! It's only after I posted my question that I realized my mistake.

I'm not sure it makes sense to compile JavaScript natively, due to the very dynamic nature of the language, you'd end up with a very slow implementation (the JIT compilers make assumptions to optimize the code and fall back to the slow baseline when the assumptions are broken, but you can't do that with AoT).

mrweasel · 1d ago
> I'm not sure it makes sense to compile JavaScript natively, due to the very dynamic nature of the language

That's a good point, maybe TypeScript would be a better candidate.

crabmusket · 1d ago
TypeScript is just as dynamic as JavaScript.

For what it would take to compile TS to native code, check out AssemblyScript.

jacob019 · 1d ago
Thank you. It is not clear what "going native" means.
karn97 · 1d ago
They are running out of actual useful stuff to sell huh? LLMs provide 0 business value (any value is offset by uncertainty)
wiseowise · 1d ago
> LLMs provide 0 business value (any value is offset by uncertainty)

Hilarious take.

alexchamberlain · 1d ago
I think you've highlighted the real engineering challenge of using LLMs - overcoming the uncertainty. In some contexts, a reasonable level of uncertainty is fine, as the tool itself is being used by experts (i.e. the coding tools), but in other cases, you need to engineer a lot more guardrails to ensure a result of sufficient quality.
satvikpendem · 1d ago
> LLMs provide 0 business value

Usage is in the eye of the user, I see.

semiinfinitely · 21h ago
The justification is so weak; it's literally just "rewrite it in Rust because I wanna".
winrid · 21h ago
Getting Node installed is a blocker for some of their customers, evidently. It doesn't surprise me. If it holds back one mid sized enterprise customer, the rewrite is worth it.
mahmoudimus · 9h ago
I have largely avoided the entire TypeScript/JavaScript ecosystem specifically because I don't want to deal with Node or its ecosystem. It's just so confusing with yarn, npm, npx, then the build systems: gulp, grunt, webpack, etc. - it felt very overwhelming.

Yes, if I spent more time learning these things, it would become simple but that seemed like a massive waste of time.

rcleveng · 18h ago
It's a blocker for many windows enterprise customers. The suggested way is to use nvm to manage it (link: https://learn.microsoft.com/en-us/windows/dev-environment/ja...).

This needs admin permissions, which means a ticket with IT and a good chance it'll be rejected, since it's scary: it opens the door to many admin-level installs of software that IT has no control over.

Installing Node under WSL is a better approach anyway, but that makes it harder for enterprise customers still.

pjmlp · 11h ago
It can literally be a zip that one unpacks somewhere and sets the PATH.

https://nodejs.org/en/download

I never used nvm.

If someone doesn't get this, it is a skill issue.

rileytg · 21h ago
Node install can be a real pain sometimes. The Node ecosystem has had a number of security-related issues over the years. Supply-chain attacks are one of my main fears.
mahmoudimus · 8h ago
I think most package systems are going to start facing real supply-chain attacks, if they aren't already. The Node ecosystem, from an attacker's lens, has a heavily skewed ratio of non-security-conscious users, which makes it a better breeding ground for exploitation.
unshavedyak · 19h ago
Is that not a valid justification?
puskuruk · 9h ago
It’s just a waste of time. JavaScript is the most natural computer language out there, for now.