Computational complexity of neural networks (2022)

18 points by mathattack · 7/21/2025, 12:23:33 AM · lunalux.io

Comments (4)

constantcrying · 1h ago
"If we once again assume that there are the same number of neurons in each layer, and that the number of layers equal the number of neurons in each layer we find:"

These are terrible assumptions.

Why not compute the runtime as the product of the actual layer sizes? Then the comparison would also make more sense.
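
A minimal sketch of that suggestion (assuming a plain fully connected MLP; the layer widths below are made up for illustration). Per pair of adjacent layers the cost is a product of their sizes; the total forward cost is the sum of those products:

    # Forward-pass multiply-accumulates for a dense MLP, from actual layer widths.
    def forward_macs(layer_sizes):
        # Each weight matrix between layers i and i+1 contributes
        # layer_sizes[i] * layer_sizes[i+1] multiply-adds.
        return sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))

    print(forward_macs([784, 256, 128, 10]))  # 234752 for these hypothetical widths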

duvenaud · 12h ago
This is simply wrong. Backprop has the same asymptotic time complexity as the forward pass.
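
A back-of-the-envelope count supporting this (assuming a dense MLP and ignoring activation-function costs): per layer, backprop does roughly two matrix products (one for the weight gradients, one to propagate the error to the previous layer) where the forward pass does one. That is a constant factor of about 2, not a different asymptotic class:

    # One pass through a dense MLP; backward costs ~2x forward per layer.
    def pass_macs(layer_sizes, backward=False):
        per_layer = sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))
        return 2 * per_layer if backward else per_layer

    sizes = [784, 256, 128, 10]  # hypothetical widths
    print(pass_macs(sizes, backward=True) / pass_macs(sizes))  # 2.0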
bobmarleybiceps · 9h ago
I think they're misusing "forward propagation" and "backward propagation" to basically mean "post-training inference" and "training". They seem to be assuming n iterations of the backward pass, which is why it comes out larger...
vrighter · 6h ago
n iterations would just be a constant factor, which gets dropped in asymptotic complexity
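
To make the two readings concrete (all numbers below are hypothetical): if "training" means T gradient steps, the total cost is T times the per-step cost, so the bound only changes asymptotic class if T is assumed to grow with the network size n, which appears to be what the article does:

    # Total training cost = iterations x per-step cost (forward + ~2x backward).
    forward_macs_per_example = 234_752  # from the earlier sketch's widths
    T = 10_000                          # hypothetical number of gradient steps
    print(T * 3 * forward_macs_per_example)  # 7042560000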