We should ask AI to do coding tasks more efficiently, just to save tokens

2 points by lidangzzz | 4 comments | 7/8/2025, 9:06:56 AM | github.com ↗

Comments (4)

082349872349872 · 13h ago
If you want to save tokens, how about using a source language in the APL family?
lidangzzz · 12h ago
APL may save a lot of tokens for computing tasks, but probably not for other general tasks, such as backend development.

Also, I guess LLMs don't have enough APL code in their training datasets, which could be a big problem.

LLMs are still much better at popular languages, so moving to APL for general tasks is probably a bad choice.
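(As a rough sketch of the token-saving claim: the same computation is far shorter in APL than in a mainstream language, here comparing source lengths for the mean of a vector. Character count is only a crude proxy for LLM tokens, though; BPE tokenizers may split rare APL glyphs like `÷` and `≢` into multiple tokens each, which could eat into the savings.)

```python
# Compare source-code length of "mean of a vector" in APL vs Python.
# Character count is a crude stand-in for LLM token count.
apl_src = "(+/÷≢)x"          # APL: sum divided by tally (mean of x)
py_src = "sum(x) / len(x)"   # Python equivalent

print(len(apl_src), len(py_src))  # → 7 15
```

So the APL form is less than half the length in characters, but whether that translates into fewer *tokens* depends entirely on the tokenizer's vocabulary.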

skruger · 11h ago
As to the training set's depth of APL coverage, yes, it's an issue. However, it's worth seeing how well MoonBit [1, 2] works with LLMs despite facing exactly the same problem -- they integrate the LLM directly into the parser pipeline.

1: https://www.youtube.com/watch?v=SnY0F9w1xdM 2: https://www.moonbitlang.com/blog/moonbit-ai

lidangzzz · 11h ago
Hongbo blocked me on Twitter, lol