Distillation Makes AI Models Smaller and Cheaper

2 points by nsoonhui · 7/19/2025, 11:22:35 AM · quantamagazine.org

Comments (1)

nickpsecurity · 3h ago
"The paper was rejected from a conference, and Vinyals, discouraged, turned to other topics."

"The original distillation paper, still published only on the arxiv.org preprint server, has now been cited more than 25,000 times (opens a new tab)."

I wonder why such a critical paper was rejected.