Show HN: Lossless Semantic Matrix Analysis (99.999% accuracy, no training)

2 points by AyodeleFikayomi on 7/23/2025, 1:09:16 AM | 4 comments
I built an open-source system for lossless, structure-preserving analysis of matrix data.

It auto-discovers semantic relationships between datasets (e.g., drugs ↔ genes ↔ categories) with 99.999% accuracy — without deep learning or lossy compression.

It supports CSV/TSV/Excel data, runs via Docker, and outputs semantic clusters, property stats, similarity scores, and visualizations.

No setup needed. Works on any tabular dataset. Fully explainable.

GitHub: https://github.com/fikayoAy/MatrixTransformer
Docker: fikayomiayodele/hyperdimensional-connection
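
Quickest way to try it: pull the Docker image and mount your data into the container. A minimal sketch follows; the mount point and arguments below are placeholders, so check the repo README for the exact invocation:

    # Pull the published image
    docker pull fikayomiayodele/hyperdimensional-connection

    # Run it with a folder of CSV/TSV/Excel files mounted in.
    # /data is only an illustrative mount point; see the README
    # for the paths and arguments the image actually expects.
    docker run --rm -v "$(pwd)/data:/data" fikayomiayodele/hyperdimensional-connection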

Happy to answer any questions!

Comments (4)

ndgold · 5h ago
Hey, I’m interested, but I need an example to understand this better. Can you maybe add one to the repo README?
TXTOS · 6h ago
this is genuinely cool — lossless semantic alignment without DL is a breath of fresh air.

we’ve been exploring the inverse direction: letting semantic tension stretch and bend across interaction histories, and then measuring the ΔS divergence between the projection layers.

we benchmarked it as an external semantic resonance layer that wraps around Claude/GPT etc — boosted multi-turn coherence +42% in evals.

would love to see if your static-matrix pipeline could "snap-in" upstream for symbolic grounding.

drop by our playground if curious (PDF only, no setup): https://github.com/onestardao/WFGY

AyodeleFikayomi · 6h ago
Hi, thank you for your feedback; your project is really impressive, and it's good to find like minds. To clarify, though: the system I built is dynamic and adaptive, focused on high-dimensional connection discovery and coherent transformations, not a static matrix pipeline. It's designed to find and adapt to structure over time, not to apply fixed transformations.
AyodeleFikayomi · 6h ago
And I definitely think it's great to explore other approaches to AI without the pain of DL.