Continuous Neural Networks: A Physics-Inspired Framework

milocoombs · 5/31/2025, 3:12:58 PM · drive.google.com

Comments (1)

milocoombs · 17h ago
This is a recent piece of independent research I did exploring a new formulation of neural networks as continuous integral operators, inspired by variational principles in physics. My ideas went viral on LinkedIn.

Instead of stacking discrete layers, this framework uses a weight function W(s,t) that maps input functions to output functions via integration. Learning is framed as minimizing an action functional composed of kinetic and potential energy terms, much like in classical mechanics.
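The integral-operator layer above, y(s) = ∫ W(s,t) x(t) dt, can be sketched numerically by discretizing on a grid. This is a minimal illustration under my own assumptions (grid size, interval, and the Gaussian kernel are all illustrative choices, not taken from the write-up):

```python
import numpy as np

# Discretize the continuous layer y(s) = ∫ W(s,t) x(t) dt on a shared grid.
# All concrete choices below (n, interval, kernel shape) are illustrative.
n = 128
t = np.linspace(0.0, 1.0, n)
dt = t[1] - t[0]

# A smooth weight function W(s, t); a Gaussian bump, purely for illustration.
W = np.exp(-((t[:, None] - t[None, :]) ** 2) / 0.02)

def apply_operator(W, x, dt):
    """Approximate y(s) = ∫ W(s,t) x(t) dt with a Riemann sum."""
    return W @ x * dt

x = np.sin(2 * np.pi * t)   # an example input function sampled on the grid
y = apply_operator(W, x, dt)  # the output function on the same grid
```

Finer grids (or better quadrature rules than a plain Riemann sum) trade compute for fidelity to the continuous operator.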

It leads to an integro-PDE for the optimal weight function, which invites interesting analytical and numerical techniques from physics and applied math. The framework is especially promising for scientific computing and signal processing, though it's still early-stage.
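To make the variational framing concrete: the write-up's exact functional isn't reproduced here, but one generic action of the kind described, pairing a smoothness ("kinetic") term with a data-fit ("potential") term over the weight function, could look like this (my assumption, not the paper's formula):

```latex
% Generic action: kinetic (smoothness) + potential (data-fit) terms.
S[W] = \int\!\!\int \frac{\lambda}{2}\,\lvert \nabla W(s,t)\rvert^{2}\,ds\,dt
     + \int \Big( \int W(s,t)\,x(t)\,dt - y(s) \Big)^{2} ds

% Stationarity \delta S/\delta W = 0 yields an Euler–Lagrange condition that
% couples a Laplacian with an integral of W — i.e. an integro-PDE:
-\lambda\,\nabla^{2} W(s,t)
  + 2\,x(t)\Big( \int W(s,t')\,x(t')\,dt' - y(s) \Big) = 0
```

The resulting equation contains both a differential term and an integral of the unknown W, which is what makes it an integro-PDE and opens the door to the PDE toolbox mentioned above.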

Would love feedback or thoughts from others exploring functional or physics-based ML models. Link is to the full write-up.