Show HN: Merliot – plugging physical devices into LLMs
40 points | sfeldma | 8 comments | 5/17/2025, 1:09:38 AM | github.com
Merliot Hub is an AI-integrated device hub.
What does that mean? It means you can control and interact with your physical devices, such as your security cameras or your thermometer, seamlessly using natural language from an LLM host such as Claude Desktop or Cursor. The hub is a gateway between AI and the physical world.
What could go wrong?
The idea is I’d go on stage singing and playing guitar with a looper and some samples, then bring out a robot toy and introduce the robot “controlling” the looping and sampling as my bandmate.
It’s a gimmick that’s been done before, but with LLMs driving the verbal interaction, and now with something like this to animate a robot, it becomes pretty compelling. I’d plug the LLM into the audio feed so I could banter with it and get responses, then have the robot avatar animate accordingly.
If only my full time job saw value in this project.
One example, on unmanned boats: a human could radio the boat over VHF and say “move 100 meters south”. That speech would go through speech-to-text, feed into an LLM that extracts the intent, and the LLM would make the corresponding MCP tool call.
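A minimal sketch of that pipeline, assuming a hypothetical `move` tool exposed by the boat's MCP server (the tool name and argument shape are illustrative, not any real hub's API; a regex stands in for the LLM's intent extraction):

```python
import json
import re

def command_to_tool_call(transcript: str) -> dict:
    """Turn a transcribed radio command like 'move 100 meters south'
    into an MCP-style JSON-RPC 'tools/call' request. In practice the
    LLM does this extraction; here a regex stands in for it."""
    m = re.match(r"move (\d+) meters (north|south|east|west)",
                 transcript.strip().lower())
    if not m:
        raise ValueError(f"could not parse command: {transcript!r}")
    distance, heading = int(m.group(1)), m.group(2)
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            # 'move' is a hypothetical tool name for this sketch.
            "name": "move",
            "arguments": {"distance_m": distance, "heading": heading},
        },
    }

call = command_to_tool_call("move 100 meters south")
print(json.dumps(call, indent=2))
```

The nice part is that the LLM absorbs all the phrasing variation ("head south a hundred meters") and the MCP server only ever sees the structured call.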
I’ll have to install this and play around.
I tried the MCP server with the demo (https://merliot.io/demo) using Cursor and asked:
What is the location of the "GPS nano" device?
The location of the "GPS nano" device is: Latitude: 30.448336 Longitude: -91.12896
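At the protocol level, an exchange like that boils down to an MCP `tools/call` request and a text result. A rough sketch of what the client and server traffic could look like (the tool name `get_location` and its argument shape are hypothetical, not Merliot's actual tool; the result shape follows MCP's content-list convention):

```python
# What the client (Cursor) might send: an MCP 'tools/call' JSON-RPC request.
# 'get_location' and its arguments are hypothetical for this sketch.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "get_location", "arguments": {"device": "GPS nano"}},
}

# What a server reply could look like: MCP tool results carry a list of
# content items, here a single text block with the coordinates from the demo.
response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "content": [
            {"type": "text",
             "text": "Latitude: 30.448336 Longitude: -91.12896"}
        ]
    },
}

print(response["result"]["content"][0]["text"])
```

The LLM host then rephrases that text block into the natural-language answer shown above.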
https://www.marksetbot.com/
(not affiliated, just a fan)