Many modern stochastic techniques applied to neural networks share these characteristics:
In general, the use of these methods tends to be referred to as deep learning.
Deep learning ensemble methods include:
Introduced by Singh et al. (2016), swapout is a generalization of dropout and stochastic depth, and it offers some intuition about the choice of layer width and depth in neural networks.
Swapout is a neural network ensemble technique combining skip-forward, residual network, and dropout techniques.
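As a rough sketch of the idea (not the authors' implementation), a swapout unit computes y = Θ₁ ⊙ x + Θ₂ ⊙ F(x), where Θ₁ and Θ₂ are independent per-unit Bernoulli masks and F is the layer's transformation. The function name and parameters below are illustrative assumptions; NumPy stands in for a real deep learning framework:

```python
import numpy as np

def swapout(x, f_x, p1=0.5, p2=0.5, rng=None):
    """Sketch of a swapout unit: y = Theta1 * x + Theta2 * F(x).

    x    : layer input
    f_x  : F(x), the layer's transformation of x (computed by the caller)
    p1   : probability each unit of x is kept
    p2   : probability each unit of F(x) is kept
    """
    rng = np.random.default_rng() if rng is None else rng
    # Independent per-unit Bernoulli masks.
    theta1 = (rng.random(x.shape) < p1).astype(x.dtype)
    theta2 = (rng.random(x.shape) < p2).astype(x.dtype)
    return theta1 * x + theta2 * f_x
```

Special cases recover the techniques swapout generalizes: with p1 = 1 and p2 = 1 every unit computes x + F(x), a plain residual unit; with p1 = 1 and p2 < 1 only F(x) is dropped, i.e. dropout on the residual branch; sharing a single mask across both terms gives a per-unit form of stochastic depth.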