A new experimental Go API for JSON
160 points by darccio | 48 comments | 9/9/2025, 2:54:32 PM | go.dev
https://github.com/centralci/go-benchmarks/tree/b647c45272c7...
So it seems both are operating at the edge of Go's capabilities.
Personally, I think JSON should live in Go's core as highly optimised SIMD/C code, not in Go's std library as standard Go code. As JSON is such an important part of the web nowadays, it deserves to be treated with more care.
Edit: See: https://go.dev/wiki/AssemblyPolicy
I think when it's introduced it might be worth discussing that again. Otherwise, providing assembly for JSON of all packages seems like a huge maintenance burden for very little end-user benefit (since faster alternatives are readily available).
There is a case to be made here, but CORBA, SOAP and XML-RPC likely looked similarly sticky and eternal in the past.
Sonic may be different, but I'm feeling once bitten, twice shy about "faster" JSON parsers at this point. A highly optimised SIMD version might be nice, but the stdlib json package needs to work for everything out there, not just the cases the author decided to test on, and I'd be a lot more nervous about something like that being sufficiently well tested given the extra complexity.
If you're pushing data around on disk where the serialization library is your bottleneck, pick a better format.
https://github.com/bytedance/sonic/issues/785
Second of all, Sonic apparently uses unsafe to (unsafely) cast byte slices to strings, which is of course going to be faster than doing things correctly, but is also of course not comparable to doing things correctly.
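For anyone curious, the zero-copy trick looks roughly like this (a minimal sketch, not Sonic's actual code): the resulting string aliases the slice's backing array, which is exactly why it's fast and why it silently breaks string immutability if the buffer is reused.

    package main

    import (
        "fmt"
        "unsafe"
    )

    func main() {
        buf := []byte(`{"name":"gopher"}`)

        // Safe conversion: copies the bytes into a fresh, immutable string.
        safe := string(buf)

        // Zero-copy conversion: the string shares buf's backing array.
        // Faster, but mutating or reusing buf later changes the "string".
        fast := unsafe.String(unsafe.SliceData(buf), len(buf))

        fmt.Println(safe, fast)

        buf[2] = 'N' // `fast` now reads {"Name":...}; `safe` is unchanged
        fmt.Println(safe, fast)
    }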
like almost all benchmark data posted to hn -- unsound, ignore
No, it's an exception. It was badly designed from the start - it's not just that people's json needs (which hardly changed) outgrew it.
Over time, it became evident that the JSON package didn't meet the needs of its users, and the package has evolved as a result. The size of the evolution doesn't matter.
It's also true that a built-in JSON library typically wouldn't be so poorly designed in a language's first release that it would immediately be in need of maintenance.
The JSON library was released with Go 1, in 2012. That makes the library 13 years old [0].
If that's immediate, I'm fine with that kind of immediate.
[0]: https://pkg.go.dev/encoding/json@go1
Obviously encoding/json doesn't "work perfectly"; TFA lists several of its problems and the need for a new version, and that's straight from the horse's mouth. Is the Go team "trolling" as well?
Second, we're not talking about whether it "does the job", which is what you might mean by "works perfectly great".
We're talking about whether it's a good design for the problem domain, or whether it has footguns, bad API choices, performance problems, and other such issues.
The latter is a weaselly way to put the blame on changing needs - as if it was fine initially, but user needs grew and it no longer covers them. Truth is, user needs are the same; we haven't had any magical change in JSON use patterns over the last 10 years. The design was just flawed to begin with.
I'd argue it didn't "become evident over time" either. It was evident on day one, and many people pointed it out 10 and 13 years ago.
The types themselves have a way to customize their own JSON conversion code. You could have a struct serialize itself to a string, an array, do weird gymnastics, whatever. The JSON module calls these custom implementations when available.
The current way of doing it is shit though. If you want to customize serialization, you basically have to return a blob of JSON yourself. Then the serializer has to check whether you actually managed to return something sane. You also have no idea whether any JSON options were set - maybe there is an indentation setting or whatever. No, you just return a byte array.
Deserialization is also shit because a) again, no options, and b) the parser has to hand you a byte array to parse: "hey, I have this JSON value, parse it." If that value is 100MB long, too bad - it has to be read completely and allocated again for you to work on, because you can only accept a byte array to parse.
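Concretely, the v1 hooks being described are the Marshaler/Unmarshaler interfaces from encoding/json; a trimmed-down example showing the "you get/return a byte array, with no options and no streaming" shape of it:

    package main

    import (
        "encoding/json"
        "fmt"
        "strings"
    )

    // Temperature serializes itself as a string like "21.5C".
    type Temperature float64

    // MarshalJSON must return the complete encoding as a byte slice;
    // there is no access to indentation or any other encoder options.
    func (t Temperature) MarshalJSON() ([]byte, error) {
        return json.Marshal(fmt.Sprintf("%.1fC", float64(t)))
    }

    // UnmarshalJSON receives the whole raw value as a byte slice,
    // however large, already read into memory.
    func (t *Temperature) UnmarshalJSON(data []byte) error {
        var s string
        if err := json.Unmarshal(data, &s); err != nil {
            return err
        }
        var v float64
        if _, err := fmt.Sscanf(strings.TrimSuffix(s, "C"), "%f", &v); err != nil {
            return err
        }
        *t = Temperature(v)
        return nil
    }

    func main() {
        out, _ := json.Marshal(Temperature(21.5))
        fmt.Println(string(out)) // "21.5C"

        var back Temperature
        _ = json.Unmarshal(out, &back)
        fmt.Println(float64(back)) // 21.5
    }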
The new API fixes these. It provides a Decoder or Encoder to you. These carry any options from the top, and they can also stream data. So you can serialize your 10GB array value by value while the underlying writer writes it to disk, for example, instead of allocating it all in memory first, as the older API forces you to.
There are other improvements too, but the post mainly focuses on these, so that's what I got from it. (I haven't tried the new API btw, this is all from the post, so maybe I'm wrong on some points.)
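For what it's worth, based on the post's description the streaming side looks roughly like the sketch below, using the experimental encoding/json/v2 and encoding/json/jsontext packages (I haven't run this against the experiment either, so treat names and defaults as approximate). Each element goes straight to the underlying io.Writer instead of being buffered into one giant byte slice.

    package main

    // Needs a toolchain with the jsonv2 experiment enabled
    // (e.g. GOEXPERIMENT=jsonv2).
    import (
        "log"
        "os"

        "encoding/json/jsontext"
        json "encoding/json/v2"
    )

    type Record struct {
        ID   int    `json:"id"`
        Name string `json:"name"`
    }

    func main() {
        // The encoder writes directly to the underlying writer (a file,
        // a socket, ...), so the whole array never sits in memory at once.
        enc := jsontext.NewEncoder(os.Stdout)

        if err := enc.WriteToken(jsontext.BeginArray); err != nil {
            log.Fatal(err)
        }
        for i := 0; i < 3; i++ { // imagine millions of records here
            if err := json.MarshalEncode(enc, Record{ID: i, Name: "r"}); err != nil {
                log.Fatal(err)
            }
        }
        if err := enc.WriteToken(jsontext.EndArray); err != nil {
            log.Fatal(err)
        }
    }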
Now here's a new implementation that addresses some of the architectural problems that made the old library structurally problematic for some use cases (streaming large JSON docs being the main one).
How did that make it into the v1 design?
Also, now that a nil map is an empty object, shouldn't that extend to every nil struct that doesn't have a custom marshaller? It would be an object if it weren't nil, after all...
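For reference, the current v1 behaviour (runnable today), with the v2 defaults described in the post noted in a comment since the experiment may still change:

    package main

    import (
        "encoding/json"
        "fmt"
    )

    type Payload struct {
        Tags  map[string]string `json:"tags"`
        Items []int             `json:"items"`
    }

    func main() {
        out, _ := json.Marshal(Payload{}) // nil map, nil slice
        fmt.Println(string(out))
        // v1 prints: {"tags":null,"items":null}
        // Per the post, v2's defaults would emit {"tags":{},"items":[]}
        // instead (assuming the described defaults stay as-is).
    }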
And then people complain that Rust doesn't have a batteries-included stdlib. It is done to avoid cases like this.
Both v1 packages continue to work; both are maintained. They get security updates, and both were improved by implementing them on top of v2 to the extent possible without breaking their respective APIs.
More importantly: the Go authors remain responsible for both the v1 and v2 packages.
What most people want to avoid with a "batteries-included standard library" (and few additional dependencies) is the debacle we had just today with NPM.
What they want is well-maintained packages from a handful of reputable sources, with predictable release schedules, a responsive security team, and a well-specified security process.
You can't get that with hundreds of independently developed dependencies.
Yes, we should definitely go with the Rust approach instead.
Anyway, I'd better get back to figuring out which crate am I meant to be using...
It is good to see some partial solutions to this issue. It plagues most languages and introduces a nice little ambiguity that is just trouble waiting to happen.
Ironically, JavaScript, with its hilarious `null` and `undefined`, does not have this problem.
Most JSON parsers and emitters in most languages should use a special value for "JSON null".
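A sketch of what that could look like in Go, using a hypothetical Nullable wrapper (not an existing library type) on top of today's encoding/json, tracking "absent", "present but null" and "present with a value" as separate states:

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // Nullable is a hypothetical wrapper distinguishing three states:
    // field absent, field present but null, and field present with a value.
    type Nullable[T any] struct {
        Present bool // the key appeared in the JSON document
        Valid   bool // the value was something other than null
        Value   T
    }

    func (n *Nullable[T]) UnmarshalJSON(data []byte) error {
        n.Present = true // UnmarshalJSON only runs when the key exists
        if string(data) == "null" {
            n.Valid = false
            return nil
        }
        n.Valid = true
        return json.Unmarshal(data, &n.Value)
    }

    type Patch struct {
        Name Nullable[string] `json:"name"`
    }

    func main() {
        for _, doc := range []string{`{}`, `{"name":null}`, `{"name":"bob"}`} {
            var p Patch
            _ = json.Unmarshal([]byte(doc), &p)
            fmt.Printf("%-16s present=%v valid=%v value=%q\n",
                doc, p.Name.Present, p.Name.Valid, p.Name.Value)
        }
    }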