Many modern stochastic techniques applied to neural networks share these characteristics:

  • they are not well understood, or originated as neural-network hacks that happened to improve performance
  • they appear to imitate an ensemble of models by adding stochastic processes to a single neural network
  • they rely on very large networks and biologically inspired architectures

In general, use of these methods tends to be referred to as deep learning.

Deep learning ensemble methods include:

Skipforward Nets

Residual Nets

Swapout

Introduced by Singh et al. (2016), swapout is a generalization of dropout and stochastic depth, and it offers some intuition about the choice of layer width and depth in neural networks.

Swapout is a neural net ensemble technique that combines the skipforward, residual net, and dropout techniques (Singh et al., 2016).
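
Concretely, a swapout unit computes Y = Θ1 ⊙ X + Θ2 ⊙ F(X), where Θ1 and Θ2 are independent per-unit Bernoulli masks and F is the layer's transformation. Fixing the masks recovers the familiar special cases: (0, 0) drops the unit, (1, 0) is an identity skip, (0, 1) is a plain feedforward unit, and (1, 1) is a residual unit. The sketch below illustrates this sampling rule in PyTorch; the function name, keep probabilities, and the deterministic inference step (scaling by the expected mask values) are illustrative assumptions, not the authors' reference implementation.

  import torch

  def swapout(x: torch.Tensor, fx: torch.Tensor,
              p1: float = 0.5, p2: float = 0.5,
              training: bool = True) -> torch.Tensor:
      """One swapout unit: y = theta1 * x + theta2 * F(x).

      x:  input to the block
      fx: F(x), the block's transformation of x (e.g. a conv-BN-ReLU output)
      p1, p2: Bernoulli keep probabilities for the two per-unit masks
      """
      if training:
          # Sample independent per-unit masks; each unit then behaves as
          # dropped (0,0), identity skip (1,0), feedforward (0,1), or
          # residual (1,1), giving an implicit ensemble of architectures.
          theta1 = torch.bernoulli(torch.full_like(x, p1))
          theta2 = torch.bernoulli(torch.full_like(fx, p2))
          return theta1 * x + theta2 * fx
      # Deterministic inference: replace each mask by its expectation.
      # (Averaging several sampled forward passes approximates the implicit
      # ensemble more closely than this deterministic rule.)
      return p1 * x + p2 * fx

Setting p1 = 1 and sharing one per-layer mask for Θ2 reduces this rule to stochastic depth, while setting p1 = 0 reduces it to dropout applied to F(X).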