Can an LLM predict how markets evolve through hallucinations?

10 headsnap_io 6 8/29/2025, 8:04:57 PM laurentiu-raducu.medium.com ↗

Comments (6)

victor10111011 · 1h ago
There are a lot of structural challenges with relying on LLM-based trading agents, especially when they're given free rein to make real trades. These systems are highly dependent on the quality, freshness, and transparency of the data fed into them; any bias, missing context, or sudden market event can seriously skew their decisions and risk exposure. The lack of explainability and clear oversight (the "black box" problem), plus the potential for poor adaptation to novel scenarios, makes it hard to fully trust the long-term outcomes, despite initial gains. While the article shares an exciting experiment, much more rigorous backtesting over multiple market cycles is needed before anyone should consider putting real money behind this kind of agent.
laurentiurad · 1h ago
GPT 3.5 detected
anbieter27 · 1h ago
It's important to remember that everything tends to go green in a bull market, so short-term gains don't necessarily validate the strategy. The recent uptick in stocks like RKLB and HRZN may simply reflect broader market trends rather than skillful stock picking by the agent. Caution is warranted before attributing any success to the AI rather than the prevailing bullish environment.
laurentiurad · 1h ago
I totally agree. This is just an experiment, and you can easily switch to Alpaca's Paper API if you don't want to invest real money.
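For context, switching between Alpaca's live and paper environments is essentially a base-URL change; the agent's order logic stays identical while the paper endpoint routes everything to a simulated account. A minimal sketch (the endpoint URLs are Alpaca's documented ones; the helper function name is my own):

```python
# Alpaca exposes separate REST endpoints for live and paper trading.
# Pointing the client at the paper endpoint simulates fills without
# moving real money.
PAPER_URL = "https://paper-api.alpaca.markets"
LIVE_URL = "https://api.alpaca.markets"

def alpaca_base_url(paper: bool = True) -> str:
    """Pick the trading endpoint; defaults to the safe paper endpoint."""
    return PAPER_URL if paper else LIVE_URL

# usage with the alpaca-trade-api client, e.g.:
#   api = alpaca_trade_api.REST(key_id, secret_key,
#                               base_url=alpaca_base_url(paper=True))
```

Note the paper and live environments use separate API key pairs, so switching also means generating paper keys from the Alpaca dashboard.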
nationaloil · 1h ago
It's great to see someone actually building and experimenting with a transparent LLM-based trading agent, and being honest about the practical challenges and limits after just a month of live data. It's the kind of hands-on, pragmatic experiment the community needs, and the results (including the hedging step and Congress trade tracking) show how fast this area is evolving.
laurentiurad · 1h ago
Thanks! I'm the person behind the repo. I'd love some feedback on what I can add to make it better.