Show HN: Sheet Music in Smart Glasses

kevinlinxc · 5/6/2025, 3:47:21 PM · 96 points · 12 comments
Hi everyone, my name is Kevin Lin, and this is a Show HN for my sheet music smart glasses project. My video was on the front page on Friday: https://news.ycombinator.com/item?id=43876243, but dang said we should do a Show HN as well, so here goes!

I’ve wanted to put sheet music into smart glasses for a long time, but the perfect opportunity to execute came in mid-February, when Mentra (YC W25) tweeted about a smart glasses hackathon they were hosting - winners would get to take home a pair. I went, had a blast making a bunch of music-related apps with my teammate, and we won, so I got to take them home, refine the project, and make a pretty cool video about it (https://www.youtube.com/watch?v=j36u2i7PKKE).

The glasses are Even Realities G1s. They look normal, but they have two microphones, a screen in each lens, and can even be made with a prescription. Every person I’ve met who tried them on was surprised at how good the display is, and video recordings of them unfortunately don’t do it justice.

The software runs on AugmentOS, Mentra’s smart glasses operating system that works on various third-party smart glasses, including the G1s. All I had to do to make an app was write and run a TypeScript file using the AugmentOS SDK. The SDK gives you voice transcription and raw audio as input, and text or bitmaps as output to the screens; everything else is completely abstracted away. Your glasses communicate with an AugmentOS app, and that app communicates with your TypeScript service.

The only hard part was creating a Python script to turn sheet music (MusicXML format) into small, optimized bitmaps to display on the screens. To start, the existing landscape of music-related Python libraries is poorly documented, and I ran into multiple never-before-seen error messages. Downscaling to the small size of the glasses’ screens also made stems and staff lines disappear, so I used morphological dilation to emphasize them without making the notes unintelligible. The final pipeline was MusicXML -> music21 library to render chunks of bars to PNG -> dilate with OpenCV -> downscale -> convert to bitmap with Pillow -> optimize bitmaps with ImageMagick. This is far from the best code I’ve ever written, but the LLMs’ attempts at this whole task were abysmal, and my years of Python experience really got to shine here. The code is on GitHub: https://github.com/kevinlinxc/AugmentedChords.
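
The dilation step looks roughly like this (a minimal sketch rather than the repo’s exact code; the 576 px target width is an assumption):

    import cv2
    import numpy as np
    from PIL import Image

    # Load a chunk of bars that music21 already rendered to PNG.
    img = cv2.imread("bars.png", cv2.IMREAD_GRAYSCALE)

    # Notation is dark-on-light, so invert first: dilation grows the
    # bright regions, i.e. the (now white) stems and staff lines.
    inverted = cv2.bitwise_not(img)
    kernel = np.ones((3, 3), np.uint8)
    thickened = cv2.dilate(inverted, kernel, iterations=1)

    # Downscale to the display width, then invert back and quantize
    # to a 1-bit bitmap with Pillow.
    h, w = thickened.shape
    target_w = 576  # assumed display width
    resized = cv2.resize(thickened, (target_w, h * target_w // w),
                         interpolation=cv2.INTER_AREA)
    Image.fromarray(cv2.bitwise_not(resized)).convert("1").save("bars.bmp")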

Putting it together, my TypeScript service serves these bitmaps locally when requested. I built a UI where I can navigate menus and sheet music with voice commands (e.g. show catalog, next, select, start, exit, pause), and then I connected foot pedals to my laptop. Because of bitmap sending latency (~3s right now, but future glasses will do better), using foot pedals to turn the bars while playing wasn’t viable, so I instead had one of my pedals toggle autoscrolling, and the other two pedals sped up or temporarily paused the scrolling.
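
The control loop is simple; here’s a hypothetical sketch (in Python for brevity, since the real service is TypeScript; read_pedal and send_bitmap are stand-ins, not the actual code):

    import time

    def autoscroll(bitmaps, send_bitmap, read_pedal, seconds_per_bar=4.0):
        scrolling = False
        i = 0
        while i < len(bitmaps):
            pedal = read_pedal()  # "toggle", "faster", "hold", or None
            if pedal == "toggle":
                scrolling = not scrolling  # pedal 1: start/stop autoscroll
            elif pedal == "faster":
                seconds_per_bar = max(0.5, seconds_per_bar * 0.8)  # pedal 2
            elif pedal == "hold":
                time.sleep(2.0)  # pedal 3: brief pause without stopping
            if scrolling:
                send_bitmap(bitmaps[i])  # the ~3s send latency lives here
                i += 1
                time.sleep(seconds_per_bar)
            else:
                time.sleep(0.05)  # idle poll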

After lots of adjustments, I was able to play a full song using just the glasses! It took many takes, and there is definitely lots of room for improvement. For example:

- Bitmap sending is pretty slow, which is why using the foot pedals to turn bars wasn’t viable.

- The resolution is pretty small; I would love to fit more bars in at once so I can flip less frequently.

- Since foot pedals aren’t portable, it would be cool to have a mode where the audio dictates when the sheet music changes. I tried implementing that with FFT (sketched below), but it was often wrong and more effort is needed. Head-tilt controls would be cool too, because full manual control is a hard requirement for practicing.
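
The gist of the FFT approach looks like this (an illustration of the idea, not the code I actually tried): take a chunk of raw mic audio, find the loudest frequency, and compare it against the pitch expected in the current bar.

    import numpy as np

    def dominant_freq(samples, sample_rate=16000):
        # Window to reduce spectral leakage, then take the magnitude
        # spectrum and return the frequency of the strongest bin.
        windowed = samples * np.hanning(len(samples))
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
        return freqs[np.argmax(spectrum)]

Piano audio is polyphonic and rich in harmonics, so the loudest bin often lands on an overtone or another chord tone instead of the melody note, which is the classic failure mode of naive FFT pitch tracking.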

All of these pain points are being targeted by Mentra and other companies competing in the space, and so I’m super excited to see the next generation! Also, feel free to ask me anything!

Comments (12)

eitally · 1h ago
Is there an opportunity to partner with (or sell to) one of the big digital sheet music vendors (like Musescore or Music Notes, etc)? I've never come upon a compelling personal use case for smart glasses, but as a pianist this could be it. I would HAPPILY purchase both glasses and a subscription from one of the big music vendors if this worked seamlessly and I could do things like embed a metronome or link it to my DAW so I could control things like tempo, rewind, even key transposition.
kevinlinxc · 56m ago
This would make the most sense, since MuseScore is notoriously litigious about usage and redistribution of their library/MusicXMLs, so a collaboration would be necessary to get a usable music catalog for smart glasses.
zharknado · 37m ago
Congrats! Great video and write-up also!
kevinlinxc · 22m ago
Thank you!
KyleBrandt · 1h ago
A full orchestra on stage playing with no music stands sure would make for a nice sight (assuming the glasses looked like regular old glasses -- or maybe Blues Brothers shades).
kevinlinxc · 20m ago
Agreed! These glasses do look very normal - the only tell is that at a certain angle you can see the green of the screen, and the part near the ear is a bit bigger (but easy to conceal with hair).
floren · 3h ago
Three seconds to send a bitmap? And I thought the Brilliant Monocle/Frame was slow! In the video it looks like you don't get more than a bar or two on-screen at a time... wouldn't any reasonably fast piece outpace the rate at which you can get the next bar on the device?
kevinlinxc · 3h ago
Yeah, it's a big deal for sure. I was bugging Mentra all hackathon to try to lower it, and I also reached out to Even for suggestions (which Mentra is implementing). Regardless, I made it work, and next-gen hardware, firmware, and software are all definitely going to be better for bitmaps.
floren · 3h ago
If they're using the same Nordic BLE chips everybody else is, there's just gonna be a cap on how quickly you can move stuff, I think.

I've found the display capabilities of the current gen smartglasses pretty disappointing. Yes they're less obtrusive, but the resolution is pitiful. I've found the Vufine a lot more useful, if more ridiculous looking.

alex1115alex · 2h ago
Mentra here.

The Nordic MCU they use isn't actually the limiting factor; rather, it's the glasses' firmware. For bitmaps from third-party apps (like AugmentOS), the firmware enforces 194-byte chunk sizes and does not support RLE (run-length encoding). Their first-party app does not have these limitations. We're stuck with this problem on the G1, but we're working with hardware partners to make sure future glasses don't have these issues.
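
For intuition, a back-of-envelope estimate (every number here is an assumption for illustration, not a measured G1 figure):

    # Assumed 1-bit full-frame size and per-chunk BLE write time.
    frame_bytes = 640 * 200 // 8       # 16,000 bytes (assumed resolution)
    chunks = -(-frame_bytes // 194)    # ceil division -> 83 chunks
    interval_s = 0.036                 # assumed effective time per write
    print(f"{chunks} chunks -> ~{chunks * interval_s:.1f}s per frame")

At those (assumed) numbers you land right around the ~3s mentioned above, and it's easy to see how RLE on mostly-white sheet music would cut that dramatically.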

kevinlinxc · 2h ago
If I were designing around this limit, I would put in enough memory to store a nice buffer of bitmaps in either direction, and then do sends that don't change what's currently displayed. I feel like that memory probably exists; I just don't have access to the firmware, sadly.
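
Conceptually, something like this (entirely hypothetical; send_to_slot and show_slot are imagined firmware calls, not a real G1 API):

    def prefetch_window(n_bitmaps, current, radius=2):
        # Indices the firmware would keep resident around the shown bar.
        lo = max(0, current - radius)
        hi = min(n_bitmaps, current + radius + 1)
        return range(lo, hi)

    def on_page_turn(bitmaps, current, resident, send_to_slot, show_slot):
        show_slot(current)  # instant: this frame is already in memory
        for i in prefetch_window(len(bitmaps), current):
            if i not in resident:
                send_to_slot(i, bitmaps[i])  # slow sends happen off-screen
                resident.add(i)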
paul7986 · 2h ago
Great and cool to see this, as well as to see some fellow smart glasses enthusiasts on Hacker News.

I've been an avid enthusiast and promoter of Meta Ray-Bans since Oct 2023. They are very handy, and I think for any person who wears sunglasses or glasses and uses their phone to take pics or vids, they make a ton of sense (both things you can do with them without needing your phone... you can also ask them for the time). Though I'm not sure even the HN population is much into them.

Although I love them, I do not think the claim you see in the media, and I guess from Zuckerberg, that they are the next computing platform holds true. You cannot take selfies with smart glasses unless they offer a tiny pop-out drone to take pics of you, lol. Thus, I think they will be complementary to our personal pocket smart devices and/or upcoming pocket AI devices, which will be able to take the best selfies of you ever (your AI friend on the lock screen directs you to the best light to get the best selfies).