Found in Residual neural networks and Highway networks.
See connections to Spiking neural networks here. See also Time-delay neural network.
Bridging the Gaps Between Residual Learning, Recurrent Neural Networks and Visual Cortex
Skip Connections as Effective Symmetry-Breaking. We argue that skip connections help break symmetries inherent in the loss landscapes of deep networks, leading to drastically simplified landscapes. We find, however, that skip connections confer additional benefits over and above symmetry-breaking, such as the ability to deal effectively with the vanishing gradients problem.
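A minimal NumPy sketch of the identity skip connection these results concern. The two-matrix residual branch, the ReLU, and the dimensions are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

def residual_block(x, W1, W2):
    """One residual block: y = x + F(x).

    The identity term x gives gradients a direct additive path
    back through the block, which is the vanishing-gradient
    benefit discussed above. W1, W2 are illustrative weights.
    """
    h = np.maximum(0.0, x @ W1)   # ReLU nonlinearity in the residual branch
    return x + h @ W2             # skip connection: add the input back

rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal(d)
W1 = rng.standard_normal((d, d)) * 0.1
W2 = rng.standard_normal((d, d)) * 0.1
y = residual_block(x, W1, W2)
```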
Highway and Residual Networks Learn Unrolled Iterative Estimation
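The gate form y = H(x) * T(x) + x * (1 - T(x)) is the original highway formulation; a sketch, where the tanh transform and the single-layer H are assumptions for brevity:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def highway_layer(x, Wh, bh, Wt, bt):
    """Highway layer: y = H(x) * T(x) + x * (1 - T(x)).

    The learned gate T interpolates between transforming the
    input (H) and copying it through unchanged, so a stack of
    such layers can behave like repeated refinement of a single
    estimate: the "unrolled iterative estimation" view above.
    """
    H = np.tanh(x @ Wh + bh)      # candidate transformation
    T = sigmoid(x @ Wt + bt)      # transform gate in (0, 1)
    return H * T + x * (1.0 - T)  # gated mix of transform and carry
```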
Inside-Outside Net: Detecting Objects in Context with Skip Pooling and Recurrent Neural Networks. Contextual information outside the region of interest is integrated using spatial recurrent neural networks. Inside, we use skip pooling to extract information at multiple scales and levels of abstraction. – video – Skip connections: what and where
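A rough sketch of the skip-pooling idea: pool a region's features from several layers, normalize each so magnitudes are comparable, and concatenate. Global average pooling stands in for the paper's ROI pooling, and the normalization details are assumptions:

```python
import numpy as np

def skip_pool(feature_maps):
    """Pool region features from several layers (different scales
    of abstraction), L2-normalize each pooled vector, and
    concatenate into one multi-scale descriptor.
    """
    pooled = []
    for fm in feature_maps:                            # fm: (channels, H, W)
        v = fm.mean(axis=(1, 2))                       # pool over the region
        pooled.append(v / (np.linalg.norm(v) + 1e-8))  # L2 normalize
    return np.concatenate(pooled)
```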
DelugeNets: Deep Networks with Massive and Flexible Cross-layer Information Inflows – Densely Connected Convolutional Networks – Hypercolumns for Object Segmentation and Fine-grained Localization
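A sketch of the dense cross-layer connectivity these architectures share: each layer receives the concatenation of the input and every earlier layer's output. Fully connected layers stand in for convolutions, and the growth rate k is an illustrative choice:

```python
import numpy as np

def dense_stack(x, weight_list):
    """DenseNet-style connectivity: layer l sees the concatenation
    of the input and all earlier outputs, so information flows
    directly between every pair of layers. weight_list[l] maps the
    growing concatenated width to k new features (the growth rate).
    """
    features = [x]
    for W in weight_list:
        inp = np.concatenate(features)             # all preceding feature maps
        features.append(np.maximum(0.0, inp @ W))  # k new features via ReLU
    return np.concatenate(features)

rng = np.random.default_rng(1)
d, k = 8, 4                                        # input width, growth rate
Ws = [rng.standard_normal((d + l * k, k)) * 0.1 for l in range(3)]
out = dense_stack(rng.standard_normal(d), Ws)      # final width: d + 3 * k
```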
See applications in Image segmentation and Object detection.
I think skip connections can simulate Polychronization.
Recurrent Residual Learning for Sequence Classification – We show that for sequence classification tasks, incorporating residual connections into recurrent structures yields accuracy similar to Long Short-Term Memory (LSTM) RNNs with far fewer model parameters. – Code
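A generic sketch of a residual recurrence; the paper's exact placement of the connection may differ, and the tanh update is an assumption:

```python
import numpy as np

def residual_rnn(xs, Wx, Wh, b):
    """Residual recurrence: h_t = h_{t-1} + tanh(...).

    Adding the previous state back, rather than replacing it,
    gives an additive path across time steps (the role the LSTM
    cell state plays) without any gate parameters.
    """
    h = np.zeros(Wh.shape[0])
    for x in xs:
        h = h + np.tanh(x @ Wx + h @ Wh + b)  # residual connection over time
    return h
```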
Architectural complexity of RNNs
Deep transition RNN – How to Construct Deep Recurrent Neural Networks
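A sketch of one deep-transition step, where the hidden-to-hidden map is itself a small multilayer network rather than a single affine map; the depth of two and the tanh activations are illustrative choices:

```python
import numpy as np

def deep_transition_step(x, h, Wx, W1, W2, b1, b2):
    """One deep-transition RNN step: the update from h_{t-1} to
    h_t passes through an intermediate nonlinear layer, deepening
    the state transition itself.
    """
    z = np.tanh(x @ Wx + h @ W1 + b1)  # intermediate transition layer
    return np.tanh(z @ W2 + b2)        # new hidden state h_t
```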