Ring Convolution Networks – Novel neural architecture achieves 90.1% on MNIST

bigdatateg1992 | 6/30/2025, 7:38:49 PM
Hi HN! I'm an independent researcher from Uzbekistan, and I've developed a new neural network architecture called Ring Convolution Networks (RCN).

The key idea: Instead of fixed weights, each weight exists in a "ring" of values that create a quantum-inspired superposition. Think of it as giving each weight multiple "perspectives" that combine during computation.
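A minimal sketch of the ring idea in Python (illustrative only; `ring_values` and its signature are mine, not the repo's API):

```python
# Illustrative sketch: the "ring" of values around one scalar weight.
# Level k of the ring adds the mirrored pair (w - k*delta, w + k*delta).
def ring_values(w, delta, depth):
    ring = [w]  # the center weight
    for k in range(1, depth + 1):
        ring += [w - k * delta, w + k * delta]
    return ring

# A depth-1 ring around w = 1.0 with delta = 0.25:
print(ring_values(1.0, 0.25, 1))  # [1.0, 0.75, 1.25]
```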

Results:
- 10.6% accuracy with random (untrained) weights, vs. 8.8% for a standard NN
- 90.1% accuracy after training (v2.0, released today)
- Works particularly well on structured data such as images

What makes it interesting:
1. No explicit convolution kernels, yet it achieves CNN-like performance
2. "Smart mirrors" implementation: O(1) memory per weight regardless of ring depth
3. Inspired by quantum superposition, but runs on classical hardware
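On point 2, the way I get O(1) memory is that ring neighbors are never materialized: only the center and δ are stored per weight, and mirrored values are computed on demand. A rough sketch (not the exact repo code):

```python
# Sketch of the "smart mirrors" storage trick: only two scalars
# (center, delta) are kept per weight, regardless of ring depth.
class RingWeight:
    def __init__(self, center, delta):
        self.center = center  # stored
        self.delta = delta    # stored

    def value(self, k):
        # k = 0 is the center; k = -1/+1 mirror one step left/right, etc.
        # Computed on the fly, so deeper rings cost no extra memory.
        return self.center + k * self.delta

w = RingWeight(1.0, 0.25)
print(w.value(-1), w.value(0), w.value(1))  # 0.75 1.0 1.25
```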

Technical details:
- Each weight w becomes a ring {w_center, w_center − δ, w_center + δ, ...}
- Forward pass: y = 0.5×w_center + 0.25×w_left + 0.25×w_right, where w_left = w_center − δ and w_right = w_center + δ
- Training: gradients propagate through the ring structure
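For a whole dense layer, the forward rule above can be sketched with NumPy (illustrative; `ring_forward` and its signature are mine, not the repo's):

```python
import numpy as np

# Sketch of the ring forward pass for one fully connected layer.
# W holds the center weights; w_left/w_right are the ring neighbors.
def ring_forward(x, W, delta=0.1):
    w_left = W - delta
    w_right = W + delta
    # Per-weight mixing: 0.5*w_center + 0.25*w_left + 0.25*w_right
    W_eff = 0.5 * W + 0.25 * w_left + 0.25 * w_right
    return x @ W_eff.T

x = np.ones((1, 3))
W = np.full((2, 3), 0.5)
print(ring_forward(x, W).shape)  # (1, 2)
```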

Code: https://github.com/Akbar1992A/ring-convolution-networks
Paper (v1): https://doi.org/10.5281/zenodo.15776775
Paper (v2.0): https://doi.org/10.5281/zenodo.15777644

I'd love to hear your thoughts on:
- A theoretical explanation for why this works on images
- Potential applications beyond computer vision
- Optimization techniques for ring structures

This is my first major research project, and I'm excited to share it with the HN community. Happy to answer any questions!
