Green Tea Garbage Collector

91 points · cirwin · 5 comments · 6/14/2025, 4:10:56 AM · github.com

Comments (5)

brianolson · 6h ago
"In select GC-heavy microbenchmarks ... we observed anywhere from a 10–50% reduction in GC CPU costs"

- Yay!

"The Go compiler benchmarks appear to inconsistently show a very slight regression (0.5%)"

- Boo

"Green Tea is available as an experiment at tip-of-tree and is planned to be available as an opt-in experiment in Go 1.25"

I definitely know some application code that spends 30% of CPU time in GC that needs to try this.
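A quick way to check whether your own service is in that "30% of CPU in GC" club is the standard runtime/metrics package. This is a minimal sketch using stable metric names from that API (not anything specific to the Green Tea post); run it before and after rebuilding with the experiment enabled (reportedly `GOEXPERIMENT=greenteagc`) to compare:

```go
package main

import (
	"fmt"
	"runtime/metrics"
)

// gcCPUFraction returns the fraction of this process's total CPU time
// spent in the garbage collector so far, per the runtime's own accounting.
func gcCPUFraction() float64 {
	samples := []metrics.Sample{
		{Name: "/cpu/classes/gc/total:cpu-seconds"},
		{Name: "/cpu/classes/total:cpu-seconds"},
	}
	metrics.Read(samples)
	gc := samples[0].Value.Float64()
	total := samples[1].Value.Float64()
	if total == 0 {
		return 0
	}
	return gc / total
}

func main() {
	fmt.Printf("GC CPU fraction so far: %.1f%%\n", 100*gcCPUFraction())
}
```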

Imustaskforhelp · 5h ago
Regarding "The Go compiler benchmarks appear to inconsistently show a very slight regression (0.5%)"

Let the golang developers "cook"; I am pretty sure they are going to do what is right for the language.

"The Go compiler benchmarks appear to inconsistently show a very slight regression (0.5%). Given the magnitude and inconsistency of the regression, these benchmarks appear to be rather insensitive to this change. One hypothesis is that the occasional regression may be due to an out-of-date PGO profile, but remains to be investigated."

So it is going to be investigated, and an explanation of why this occurs (and how to fix it) would likely come before you or I would use it in 1.26 (since they are saying it would most likely ship by default in 1.26, if I remember correctly?), so there is no need to boo, I guess.

Great job from the golang team.

Imustaskforhelp · 5h ago
This is fantastic, if I am reading it correctly. Making Go even faster.

silisili · 10h ago
Well, I don't love that reported performance regressions are hand-waved away as being not the fault of the new GC, but of code doing something wrong or abnormal.

Will wait for more real-world cases showing substantial improvements, but existing (and possibly bad) code is out there, and it shouldn't be blamed for regressions.

zozbot234 · 9h ago
I didn't see anyone "handwaving away" performance regressions in the thread. The closest was a special case where a Golang program was auto-tuning caching decisions based on heap size metrics, and the improved metrics w/ the new GC led to excessive caching, causing an apparent regression. That's hardly the common case!
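That failure mode is easy to reproduce in miniature: a cache that sizes itself from heap accounting changes behavior whenever the accounting changes, even if the collector itself got better. A hedged sketch; the budget, names, and entry size are invented for illustration:

```go
package main

import (
	"fmt"
	"runtime"
)

// heapBudget is an invented target; the cache may use whatever headroom
// remains below it.
const heapBudget = 512 << 20 // 512 MiB, illustrative

// tunedCacheCap derives a cache capacity from the runtime's reported
// live-heap size. If a new GC reports a smaller HeapAlloc for the same
// program, this returns a larger capacity and the program caches more
// aggressively; the "apparent regression" comes from this feedback loop,
// not from the collector.
func tunedCacheCap(entrySize uint64) uint64 {
	var ms runtime.MemStats
	runtime.ReadMemStats(&ms)
	if ms.HeapAlloc >= heapBudget {
		return 0 // no headroom: cache nothing
	}
	return (heapBudget - ms.HeapAlloc) / entrySize
}

func main() {
	fmt.Println("cache capacity (entries):", tunedCacheCap(1024))
}
```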

(In general though, if you take the authors' concerns about the increased future impact of memory bandwidth and memory non-locality seriously, the obvious answer is: don't use GC in the first place, except when you really, really can't avoid it. And even then, try to keep your object graphs as small and compact as possible wrt. memory use; don't have a single tracing phase that ends up scanning all sorts of unrelated stuff together. Of course this is unhelpful if you need to work w/ existing codebases, but it's good to keep in mind for new greenfield projects!)
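The "small and compact object graphs" advice maps onto a well-known Go technique: replacing pointer-linked nodes with indices into one flat slice, so the GC traces a single pointer-free allocation instead of N separate objects. A sketch with invented types to show the contrast:

```go
package main

import "fmt"

// Pointer-heavy layout: every node is a separate heap object the GC
// must find and trace during marking.
type nodePtr struct {
	val  int
	next *nodePtr
}

// Compact layout: one flat slice where links are indices. The backing
// array contains no pointers, so the GC has nothing inside it to scan.
type nodeIdx struct {
	val  int
	next int32 // index of the next node; -1 terminates the list
}

// sumIdx walks the index-linked list starting at head.
func sumIdx(nodes []nodeIdx, head int32) int {
	s := 0
	for i := head; i >= 0; i = nodes[i].next {
		s += nodes[i].val
	}
	return s
}

func main() {
	nodes := []nodeIdx{{val: 1, next: 1}, {val: 2, next: 2}, {val: 3, next: -1}}
	fmt.Println(sumIdx(nodes, 0)) // 6
}
```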