We make neural network training cheaper and more accurate by progressively dropping a portion of the training data after each epoch.
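As a concrete illustration, the per-epoch data-dropping idea might be sketched as follows. The source sentence does not specify the model, the drop schedule, or the criterion for which examples to discard, so everything below is an assumption: a linear model trained by SGD, a fixed drop fraction per epoch, and a heuristic that discards the best-fit (lowest-loss) examples on the theory that they contribute least to further training.

```python
import numpy as np

def progressive_train(X, y, epochs=5, drop_frac=0.2, lr=0.01, rng=None):
    """Sketch of progressive data dropping (assumed details, not the
    authors' exact method): after each epoch, drop a fraction of the
    surviving training examples -- here, those with the smallest loss."""
    rng = rng or np.random.default_rng(0)
    w = np.zeros(X.shape[1])
    idx = np.arange(len(X))   # indices of examples still in the training set
    sizes = []                # record the shrinking dataset size per epoch
    for _ in range(epochs):
        sizes.append(len(idx))
        # One epoch of plain SGD on the surviving subset (linear model).
        for i in rng.permutation(idx):
            w -= lr * (X[i] @ w - y[i]) * X[i]
        # Per-example squared loss on the surviving subset.
        losses = (X[idx] @ w - y[idx]) ** 2
        # Keep the highest-loss examples; drop the well-fit ones.
        keep = max(1, int(len(idx) * (1 - drop_frac)))
        idx = idx[np.argsort(losses)[-keep:]]
    return w, sizes
```

Because each epoch touches fewer examples, total training cost shrinks geometrically with the drop fraction; whether accuracy also improves depends on the drop criterion, which is the interesting design choice.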