2 papers across 2 sessions
Spike-timing-dependent plasticity can be reinterpreted as a noisy gradient estimate, which allows us to obtain convergence guarantees for the stochastic learning dynamics.
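A toy sketch of the "noisy gradient" idea (my illustration, not the paper's model): a plasticity-style update that equals the true gradient only in expectation still converges on a simple quadratic loss, because the zero-mean noise averages out.

```python
import random

# Toy quadratic loss L(w) = (w - w_star)^2 / 2, minimized at w_star.
# Each update uses the exact gradient plus zero-mean Gaussian noise,
# mimicking a stochastic plasticity rule that is a gradient only on average.
random.seed(0)
w_star = 1.0
w = 0.0
lr = 0.05
for _ in range(2000):
    grad = w - w_star                   # exact gradient of L at w
    noise = random.gauss(0.0, 0.5)      # zero-mean perturbation
    w -= lr * (grad + noise)            # noisy gradient step
# w settles in a small neighborhood of w_star (fluctuation ~ sqrt(lr) * noise std)
```

The stationary fluctuation around the minimizer shrinks with the learning rate, which is the usual route to convergence guarantees for such stochastic dynamics.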
We accelerate Taylor mode for practically relevant differential operators by collapsing Taylor coefficients; this can be done automatically via compute-graph simplifications.