Reshared post from +gwern branwen

Man, how does that work…?

"In this paper we propose stochastic depth, a training procedure that enables the seemingly contradictory setup to train short networks and obtain deep networks. We start with very deep networks but during training, for each mini-batch, randomly drop a subset of layers and bypass them with the identity function. The resulting networks are short (in expectation) during training and deep during testing. Training Residual Networks with stochastic depth is compellingly simple to implement, yet effective. We show that this approach successfully addresses the training difficulties of deep networks and complements the recent success of Residual and Highway Networks. It reduces training time substantially and improves the test errors on almost all data sets significantly (CIFAR-10, CIFAR-100, SVHN). Intriguingly, with stochastic depth we can increase the depth of residual networks even beyond 1200 layers and still yield meaningful improvements in test error (4.91%) on CIFAR-10."

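The mechanics are simple: each residual block l is kept during training with a "survival" probability p_l, and when a block is dropped, its input passes through the identity skip connection unchanged. At test time every block is active and its residual branch is scaled by p_l to match the expected training-time output. Below is a minimal PyTorch sketch of the idea; the two-convolution block structure, channel count, and depth are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class StochasticDepthBlock(nn.Module):
    """Residual block that may be skipped entirely during training.

    A minimal sketch of the procedure described in the abstract; the
    block body (two 3x3 convolutions with batch norm) is an assumption.
    """

    def __init__(self, channels: int, survival_prob: float):
        super().__init__()
        self.survival_prob = survival_prob  # p_l: chance the block is kept
        self.branch = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # One coin flip per mini-batch: keep the block with
            # probability p_l, otherwise bypass it with the identity.
            if torch.rand(1).item() < self.survival_prob:
                return torch.relu(x + self.branch(x))
            return x
        # At test time all blocks are active; scaling the residual
        # branch by p_l matches the expected training-time output.
        return torch.relu(x + self.survival_prob * self.branch(x))


# Linear decay rule from the paper: p_l = 1 - (l / L) * (1 - p_L),
# with p_L = 0.5 for the last block, so early blocks are almost
# never dropped. L = 54 blocks is a hypothetical configuration.
L = 54
net = nn.Sequential(
    *[StochasticDepthBlock(16, 1.0 - (l / L) * 0.5) for l in range(1, L + 1)]
)
```

With this linear decay the expected number of active blocks per mini-batch is roughly 3L/4, which is where the training-time savings come from, while the test-time network still uses all L blocks.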
Embedded Link

[1603.09382] Deep Networks with Stochastic Depth
https://arxiv.org/abs/1603.09382
