A regularization method for deep neural networks that works by randomly dropping out neurons/nodes (temporarily setting their outputs to zero) during training. This reduces overfitting by preventing the network from relying on any single neuron, forcing it to learn more robust representations that tend to be simpler.
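As a rough illustration (not the exact formulation from the original paper), here is a minimal NumPy sketch of "inverted" dropout applied to a layer's activations; the function name `dropout_forward` and the parameter `p_drop` are illustrative choices, not standard API names:

```python
import numpy as np

def dropout_forward(x, p_drop=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each activation with
    probability p_drop and rescale the survivors by 1/(1 - p_drop)
    so the expected activation stays unchanged at test time."""
    if not training or p_drop == 0.0:
        # At test time the layer is the identity: all neurons are kept.
        return x
    rng = np.random.default_rng() if rng is None else rng
    # Random binary mask: 1 keeps a neuron, 0 drops it.
    mask = (rng.random(x.shape) >= p_drop) / (1.0 - p_drop)
    return x * mask
```

Because of the rescaling during training, no extra scaling is needed at inference; calling the function with `training=False` simply returns the activations unchanged.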
How does the dropout method work in deep learning?
Dropout: A Simple Way to Prevent Neural Networks from Overfitting
See also New advances in deep learning