Ask HN: Why aren't AIs being used as app beta testers yet?
13 points by amichail 15h ago 17 comments
Ask HN: How would you expose a scam involving a powerful figure?
10 points by soueuls 13h ago 8 comments
Ask HN: What are your personal data backup and sync setups?
15 points by shelled 13h ago 12 comments
Theoretical Analysis of Positional Encodings in Transformer Models
24 points by PaulHoule 6/27/2025, 10:07:11 PM (arxiv.org) 2 comments
Comments (2)
semiinfinitely · 3h ago
Kinda disappointing that RoPE, the most common PE, is given about one sentence in this work and omitted from the analysis.
gsf_emergency_2 · 55m ago
Maybe it's because RoPE by itself does nothing for model capacity?
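
For context, a minimal NumPy sketch of RoPE as usually formulated (rotating pairs of query/key dimensions by position-dependent angles); the function name and shapes are illustrative, not from the paper. One reading of the capacity point above is that RoPE introduces no learned parameters at all:

    import numpy as np

    def rope(x: np.ndarray, base: float = 10000.0) -> np.ndarray:
        """Apply rotary position embedding to x of shape (seq_len, dim)."""
        seq_len, dim = x.shape
        assert dim % 2 == 0, "RoPE pairs up dimensions, so dim must be even"
        # One frequency per dimension pair, on the usual sinusoidal schedule.
        freqs = base ** (-np.arange(0, dim, 2) / dim)    # (dim/2,)
        angles = np.outer(np.arange(seq_len), freqs)     # (seq_len, dim/2)
        cos, sin = np.cos(angles), np.sin(angles)
        x1, x2 = x[:, 0::2], x[:, 1::2]                  # split into pairs
        out = np.empty_like(x)
        out[:, 0::2] = x1 * cos - x2 * sin               # 2-D rotation per pair
        out[:, 1::2] = x1 * sin + x2 * cos
        return out

    # Applied to queries and keys before attention; the dot product q.k then
    # depends only on the relative offset between the two positions.
    q = rope(np.random.randn(8, 64))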