Generalist AI doesn't scale (2024)
2 points · TMWNN · 6/16/2025, 1:33:14 AM · daemonology.net
Comments (1)
NoahZuniga · 14h ago
This idea has basically already been implemented (even before this post was published) in an architecture called mixture of experts. It makes the model easier to train, while still keeping it somewhat interconnected.
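
For readers unfamiliar with the construction the comment refers to, here is a minimal, illustrative sketch of a mixture-of-experts layer (the class name, sizes, and toy router below are my own assumptions, not taken from the linked article). A learned router scores each token against every expert, only the top-k experts actually run on that token, and their outputs are combined; specialization lives in the experts while the shared router keeps the model somewhat interconnected.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

class MoELayer:
    """Toy mixture-of-experts layer: a router picks the top-k experts
    per token and mixes their outputs, so only a fraction of the
    parameters is active for any given input."""
    def __init__(self, d_model, d_hidden, n_experts=8, top_k=2, seed=0):
        rng = np.random.default_rng(seed)
        self.top_k = top_k
        # Router: one logit per expert for each input token.
        self.router = rng.normal(scale=0.02, size=(d_model, n_experts))
        # Each expert is a small two-layer MLP.
        self.w1 = rng.normal(scale=0.02, size=(n_experts, d_model, d_hidden))
        self.w2 = rng.normal(scale=0.02, size=(n_experts, d_hidden, d_model))

    def forward(self, x):
        # x: (n_tokens, d_model)
        gate_logits = x @ self.router                       # (n_tokens, n_experts)
        probs = softmax(gate_logits)
        top = np.argsort(-probs, axis=-1)[:, :self.top_k]   # chosen experts per token
        out = np.zeros_like(x)
        for t, token in enumerate(x):
            weights = probs[t, top[t]]
            weights = weights / weights.sum()                # renormalize over the top-k
            for w, e in zip(weights, top[t]):
                h = np.maximum(token @ self.w1[e], 0.0)      # expert MLP with ReLU
                out[t] += w * (h @ self.w2[e])
        return out

# Usage: route 4 tokens of width 16 through 8 experts, 2 active per token.
layer = MoELayer(d_model=16, d_hidden=32)
y = layer.forward(np.random.default_rng(1).normal(size=(4, 16)))
print(y.shape)  # (4, 16)
```

The sketch omits the load-balancing losses and batched expert dispatch that production MoE implementations use; it is only meant to show the routing idea the comment alludes to.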