A new experimental Go API for JSON

160 points by darccio on 9/9/2025, 2:54:32 PM (go.dev)

Comments (48)

analytically · 5h ago
Benchmark Analysis: Sonic vs Standard JSON vs JSON v2 in Go

https://github.com/centralci/go-benchmarks/tree/b647c45272c7...

gethly · 4h ago
Those numbers look similar to goccy's. I used to use it in the past; even Kubernetes has it as a direct dependency. But the number of issues has been piling up for quite some time, so I no longer trust it.

So it seems both are operating at the edge of Go's capabilities.

Personally, I think JSON should live in Go's core as highly optimised SIMD/C code, not in the standard library as ordinary Go code. JSON is such an important part of the web nowadays that it deserves to be treated with more care.

tptacek · 4h ago
What does "highly optimized" have to do with whether it's in the standard library? Highly-optimized cryptography is in the standard library.
ronsor · 4h ago
Not to mention that Go is never going to put C code in the standard library for anything portable. It's all Go or assembly now.
dilyevsky · 3h ago
The Go team has previously been vocal about sacrificing performance to keep the stdlib idiomatic and readable. I guess the crypto packages are the exception because they're used heavily inside Google, while json and some others (like image/jpeg, which had poor performance last time I checked) are not.

Edit: See: https://go.dev/wiki/AssemblyPolicy

nasretdinov · 4h ago
Go doesn't yet have native SIMD support, but it actually might in the future: https://github.com/golang/go/issues/73787

I think when it's introduced it might be worth discussing that again. Otherwise providing assembly for JSON of all packages seems like a huge maintenance burden for very little benefit for end users (since faster alternatives are readily available)

godisdad · 2h ago
> As JSON is such an important part of the web nowadays, it deserves to be treated with more care.

There is a case to be made here, but CORBA, SOAP and XML-RPC likely looked similarly sticky and eternal in their day.

CamouflagedKiwi · 3h ago
Agreed. goccy has better performance most of the time but absolutely appalling worst-case performance, which renders it unacceptable for many use cases; in my case, even with trusted input, it took effectively an eternity to decode. It's literally a quadratic worst case. What's the point of a bunch of super clever optimisations if the big-O behaviour is that bad?

Sonic may be different, but I'm feeling once bitten, twice shy about "faster" JSON parsers at this point. A highly optimised SIMD version might be nice, but the stdlib json package needs to work for everything out there, not just the cases the author decided to test on, and I'd be a lot more nervous about something like that being sufficiently well tested given the extra complexity.

ForHackernews · 25m ago
The fact that JSON is used so commonly for web stuff seems like an argument against wasting your time optimizing it. Network round trip is almost always going to dominate.

If you're pushing data around on disk where the serialization library is your bottleneck, pick a better format.

tgv · 5h ago
IIRC, sonic does JIT, has inline assembly (github says 41%), and it's huge. There's no way you can audit it. If you don't need to squeeze every cpu cycle out of your json parser (and most of us don't; go wouldn't be the first choice for such performance anyway), I'd stick with a simpler implementation.
Thaxll · 44m ago
And Sonic, with its "cutting edge" optimizations, is still slower than std json on arm64 for basic use cases. It shows that JIT, SIMD, and low-level code come at a maintenance cost across platforms.

https://github.com/bytedance/sonic/issues/785

kiitos · 1h ago
first of all, that doesn't exercise JSON v2 at all, afaict

second of all, sonic apparently uses unsafe to cast byte slices to strings (see the sketch below), which of course is going to be faster than doing things correctly, but is also of course not comparable to doing things correctly

like almost all benchmark data posted to hn -- unsound, ignore
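
For the curious, the cast in question looks roughly like this; a minimal sketch using unsafe.String from Go 1.20+, not sonic's actual code:

    package main

    import (
        "fmt"
        "unsafe"
    )

    func main() {
        buf := []byte("hello")

        // Safe: copies the bytes into a new, immutable string.
        s1 := string(buf)

        // Unsafe: reinterprets the slice's backing memory as a string
        // without copying (Go 1.20+). Faster, but if buf is later
        // mutated or reused, s2 changes too, breaking Go's guarantee
        // that strings are immutable.
        s2 := unsafe.String(unsafe.SliceData(buf), len(buf))

        fmt.Println(s1, s2)
    }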

coldtea · 4h ago
> Over time, packages evolve with the needs of their users, and encoding/json is no exception

No, it's an exception. It was badly designed from the start; it's not just that people's JSON needs (which have hardly changed) outgrew it.

bayindirh · 3h ago
A bad design doesn't invalidate the sentence you have quoted.

Over time, it became evident that the JSON package didn't meet the needs of its users, and the package has evolved as a result. The size of the evolution doesn't matter.

pcthrowaway · 3h ago
It's true that packages (generally) evolve with the needs of their users.

It's also true that a built-in JSON library typically wouldn't be so poorly designed in a language's first release that it would immediately be in need of maintenance.

bayindirh · 3h ago
> immediately

The JSON library was released with Go 1, in 2012, which makes it 13 years old [0].

If that's immediate, I'm fine with that kind of immediate.

[0]: https://pkg.go.dev/encoding/json@go1

pcthrowaway · 3h ago
In need of maintenance and having received maintenance are two different things
coldtea · 3h ago
"immediately be in need of maintenance" means it needed this update 13 years ago.
kiitos · 1h ago
encoding/json works perfectly great even today, please troll somewhere else
coldtea · 40m ago
Please don't apply the "troll" accusation to something just because you disagree with it. It's weaselly.

Obviously encoding/json doesn't "work perfectly": TFA lists several problems it has, and the need for a new version, and that's straight from the horse's mouth. Is the Go team "trolling" as well?

Second, we're not talking whether it "does the job", which is what you might mean by "works perfectly great".

We're talking about whether it's a good design for the problem domain, or whether it has footguns, bad API choices, performance problems, and other such issues (see the example below).
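
One concrete example of such a footgun: v1 matches JSON member names to struct fields case-insensitively by default (a minimal sketch; the v2 experiment reportedly switches to case-sensitive matching):

    package main

    import (
        "encoding/json"
        "fmt"
    )

    type Account struct {
        UserName string `json:"username"`
    }

    func main() {
        var a Account
        // v1 matches names case-insensitively, so "USERNAME"
        // silently fills the "username" field.
        _ = json.Unmarshal([]byte(`{"USERNAME":"ada"}`), &a)
        fmt.Println(a.UserName) // prints: ada
    }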

pcthrowaway · 45m ago
It works well enough, despite being clunky, but it has a few issues, including significant performance-related ones.
coldtea · 3h ago
"Becoming evident it doesn't meet the needs of its users" is not the same as "packages evolve with the needs of their users".

The latter is a weaselly way to put the blame on changing needs, as if the package were initially fine but user needs grew beyond it. In truth, user needs are the same; we haven't had any magical change in JSON use patterns over the last 10 years. The design was just flawed to begin with.

I'd argue it didn't "become evident over time" either. It was evident on day one, and many people pointed it out 10 and 13 years ago.

geodel · 2h ago
Well, mostly I have seen people learn the shortcomings of software by using or creating it, and come up with a new version when possible. In your case it seems v1 is perfect every time.
breakingcups · 2h ago
If/once this goes through, I wonder what adoption is going to be like, given that all LLMs still only have the v1 API in their corpus.
sroerick · 3h ago
Could somebody give a high-level overview of this for me, as a non-Go dev? It looks like the Go JSON lib can encode native Go structures as JSON, which is cool, but maybe it was done badly, which is not as cool. Do I have that right?
eknkc · 3h ago
Go already has a JSON parser and serializer. It somewhat resembles the JS API, where you push objects into JSON.stringify and it serializes them, or you push in a string and get an object (or string, etc.) back from JSON.parse.
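
In sketch form (standard encoding/json; error handling elided):

    package main

    import (
        "encoding/json"
        "fmt"
    )

    type User struct {
        Name string `json:"name"`
        Age  int    `json:"age"`
    }

    func main() {
        // Roughly JSON.stringify: Go value in, JSON bytes out.
        b, _ := json.Marshal(User{Name: "Ada", Age: 36})
        fmt.Println(string(b)) // {"name":"Ada","age":36}

        // Roughly JSON.parse: JSON bytes in, Go value out.
        var u User
        _ = json.Unmarshal([]byte(`{"name":"Ada","age":36}`), &u)
        fmt.Println(u.Name) // Ada
    }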

The types themselves have a way to customize their own JSON conversion code. You could have a struct serialize itself to a string, an array, do weird gymnastics, whatever. The JSON module calls these custom implementations when available.

The current way of doing it is shit, though. If you want to customize serialization, you basically have to return a JSON string, and then the serializer has to check whether you actually managed to return something sane. You also have no idea whether any JSON options were set; maybe there's an indentation setting or whatever. No, you just return a byte array.

Deserialization is also shit because (a) again, no options, and (b) the parser has to hand you a byte array to parse: "Hey, I have this JSON string, parse it." If that JSON string is 100MB long, too bad; it has to be read completely and allocated again for you to work on, because you can only accept a byte array.

The new API fixes these. It hands you a Decoder or Encoder, which carries any options set at the top, and it can stream data. So you can serialize your 10GB array value by value while the underlying writer flushes it to disk, for example, instead of allocating it all in memory first, as the older API forces you to.
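
A rough sketch of a streaming decode under the new API, based on the experimental encoding/json/v2 and encoding/json/jsontext packages the post describes (requires GOEXPERIMENT=jsonv2 on Go 1.25; details may change):

    package main

    import (
        "encoding/json/jsontext" // experimental token/value streaming layer
        json "encoding/json/v2"  // experimental v2 marshaling API
        "fmt"
        "strings"
    )

    func main() {
        // Pretend this reader is a multi-gigabyte file, not a short string.
        r := strings.NewReader(`[{"id":1},{"id":2},{"id":3}]`)
        dec := jsontext.NewDecoder(r)

        // Consume the '[' that opens the top-level array.
        if _, err := dec.ReadToken(); err != nil {
            panic(err)
        }
        // Decode one element at a time; memory use stays bounded no
        // matter how large the array is.
        for dec.PeekKind() != ']' {
            var elem struct {
                ID int `json:"id"`
            }
            if err := json.UnmarshalDecode(dec, &elem); err != nil {
                panic(err)
            }
            fmt.Println(elem.ID)
        }
    }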

There are other improvements too, but the post mainly focuses on these, so that's what I got from it. (I haven't tried the new API btw; this is all from the post, so maybe I'm wrong on some points.)

stackedinserter · 1h ago
gjson/sjson is probably for you if you need to work with 100MB JSONs.
skywhopper · 20m ago
Nah, the existing implementation is pretty decent, actually, but it doesn't address every use case and has some flaws that are hard or impossible to fix. For lots of use cases it works great.

Now here's a new implementation that addresses some of the architectural problems that made the old library structurally problematic for some use cases (streaming large JSON docs being the main one).

tibbe · 2h ago
> Since encoding/json marshals a nil slice or map as a JSON null

How did that make it into the v1 design?

rowanseymour · 2h ago
I had a back-and-forth with someone who really didn't want to change that behavior. Their reasoning was that since you can create and provide an empty map or slice, having the marshaler do that for you, and then also needing a way to disable that behavior, was unnecessary complexity.
binary132 · 2h ago
how is a nil map not null? It certainly isn’t a zero-valued map, that would be {}.
rplnt · 1h ago
Well, those are different things, aren't they? An empty slice/map is different from nil, so it makes a lot of sense that nil = null and []string{} = [], and you have a way to express both (see the sketch below). That said, it starts to make less sense when you work with Go, where the API mostly treats them as equivalent (append, len, []). So that would be my guess as to how it ended up the way it did.

Also, now that a nil map is an empty object, shouldn't that extend to every nil struct pointer that doesn't have a custom marshaller? It would be an object if it weren't nil, after all...
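
For reference, the v1 behaviour under discussion (runnable as-is; the v2 defaults and option names are from the experimental package and may change):

    package main

    import (
        "encoding/json"
        "fmt"
    )

    func main() {
        var nilSlice []string        // nil slice
        emptySlice := []string{}     // non-nil, zero length
        var nilMap map[string]int    // nil map
        emptyMap := map[string]int{} // non-nil, empty

        for _, v := range []any{nilSlice, emptySlice, nilMap, emptyMap} {
            b, _ := json.Marshal(v)
            fmt.Println(string(b))
        }
        // v1 prints: null [] null {}
        // v2's default reportedly flips nil to [] and {}, with options
        // (FormatNilSliceAsNull, FormatNilMapAsNull) to opt back out.
    }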

stackedinserter · 1h ago
Why shouldn't it be? nil is null and an empty array is an empty array; they are completely different objects.
afdbcreid · 1h ago
This is the second time a v2 has been released for a package in Go's standard library. Other ecosystems are not free of this problem.

And then people complain that Rust doesn't have a batteries-included stdlib. It is kept minimal precisely to avoid cases like this.

ncruces · 1h ago
That has its own downsides, though.

Both v1 packages continue to work, and both are maintained: they get security updates, and both were improved by reimplementing them on top of v2 to the extent possible without breaking their respective APIs.

More importantly: the Go authors remain responsible for both the v1 and v2 packages.

What most people want to avoid with a "batteries included standard library" (and few additional dependencies) is the debacle we had just today with NPM.

Well-maintained packages, from a handful of reputable sources, with predictable release schedules, a responsive security team, and a well-specified security process.

You can't get that with 100s of independently developed dependencies.

oncallthrow · 1h ago
Wow, two whole times in 19 years? That sounds terrible.

Yes, we should definitely go with the Rust approach instead.

Anyway, I'd better get back to figuring out which crate I'm meant to be using...

skywhopper · 18m ago
Two v2s in 15 years seems pretty good given the breadth of the stdlib.
kiitos · 1h ago
I'm not sure how this is a problem, and I'm very sure that even in the presence of this "problem" it is far better for a language to have a batteries-included stdlib than to not
rjrodger · 4h ago
null != nil !!!

It is good to see some partial solutions to this issue. It plagues most languages and introduces a nice little ambiguity that is just trouble waiting to happen.

Ironically, JavaScript with its hilarious `null` and `undefined` does not have this problem.

Most JSON parsers and emitters in most languages should use a special value for "JSON null".
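
In Go terms, the ambiguity looks like this (a minimal sketch with v1; "null" and "absent" collapse into the same nil):

    package main

    import (
        "encoding/json"
        "fmt"
    )

    type T struct {
        X *int `json:"x"`
    }

    func main() {
        var a, b T
        _ = json.Unmarshal([]byte(`{"x": null}`), &a) // field present, explicitly null
        _ = json.Unmarshal([]byte(`{}`), &b)          // field absent entirely
        // Both decode to a nil pointer, so "JSON null" and "missing"
        // are indistinguishable after the fact.
        fmt.Println(a.X == nil, b.X == nil) // true true
    }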

pjmlp · 4h ago
Fixed in 1976 by ML, followed up by Eiffel in 2005, but unfortunately yet to be made common.
afiori · 3h ago
Null and undefined are fine imho, with a sort of empty/missing semantics (especially since you mostly just care to == them). I have a bigger issue with how similar yet different an undefined key and a not-defined key are. I would almost prefer it if

    obj['key']=undefined

 was the same as 

    delete obj['key']
h1fra · 2h ago
I still don't get how a common thing like JSON is not a solved problem in Go. How convoluted it is to just get a payload from an API call, compared to other languages, is baffling.
simpaticoder · 19m ago
It may be because Go programs use gRPC more heavily than REST at Google.
Thaxll · 39m ago
You should read this; it's still relevant: https://seriot.ch/projects/parsing_json.html