Show HN: RNN Musical Instrument

2 points by cochlear | 0 comments | 9/7/2025, 8:54:37 PM | blog.cochlea.xyz
In this small demo, I first overfit a single-layer RNN to a short segment of classical music, decomposing it into two "factors":

- a sparse "control signal", something like a score, representing the injection of energy into the system or instrument(s)
- an RNN which models the resonances of the instrument(s) being played
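The decomposition can be sketched roughly as follows. This is a minimal toy, not the demo's actual code: the dimensions, the plain `tanh` recurrence, and the random weights (which in the demo would be learned by overfitting to the audio segment) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
control_dim, hidden_dim, T = 8, 32, 256

# Hypothetical "instrument" parameters. In the demo these are learned
# by overfitting to a short segment of classical music; here they are
# random stand-ins.
W_in = rng.normal(scale=0.1, size=(hidden_dim, control_dim))
W_rec = rng.normal(scale=0.5, size=(hidden_dim, hidden_dim))
w_out = rng.normal(scale=0.1, size=hidden_dim)

# Sparse control signal: occasional energy injections, like note onsets
# in a score. Almost all entries are zero.
control = np.zeros((T, control_dim))
control[::32, 0] = 1.0

# The RNN rings out between injections, playing the role of the
# instrument's resonances.
h = np.zeros(hidden_dim)
audio = np.empty(T)
for t in range(T):
    h = np.tanh(W_rec @ h + W_in @ control[t])
    audio[t] = w_out @ h
```

Because the control signal is sparse, most of the output's structure has to come from the recurrent dynamics, which is what lets the RNN act like a playable resonator.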

Finally, hand-tracking landmarks (thanks to MediaPipe) are mapped onto the RNN's input space via a random linear projection.
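A random linear projection from landmark space to control space might look like the sketch below. MediaPipe Hands does report 21 (x, y, z) landmarks per hand, but the control dimension, the scaling, and the function name are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_landmarks, control_dim = 21, 8  # MediaPipe Hands: 21 landmarks per hand

# Fixed random projection from flattened landmarks to the RNN's control
# space; 1/sqrt(d) scaling keeps the output magnitude roughly independent
# of the input dimension.
P = rng.normal(size=(control_dim, n_landmarks * 3)) / np.sqrt(n_landmarks * 3)

def landmarks_to_control(landmarks):
    """Map one frame of (21, 3) hand landmarks to a control vector."""
    return P @ np.asarray(landmarks).reshape(-1)

frame = rng.random((n_landmarks, 3))  # stand-in for one tracked frame
control_vec = landmarks_to_control(frame)
```

Keeping the projection fixed means the mapping is arbitrary but consistent, so the performer, not the model, does the adapting.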

While recent text-to-music models are groundbreaking, I'm exploring ways to enhance the _real-time_ experience (and joy) of action and reaction that comes from playing a musical instrument, or even from tapping on some random object that sounds particularly pleasant.

In "Livewired", David Eagleman discusses the astounding ability of our neural networks to quickly integrate new senses and sources of information. Similarly, I think musicians are able to quickly and expertly map high-dimensional signals onto the space of possible sounds.

I'd like to grow this collection of sensors, projections, and resonance models.
