Deep learning

cosmos 26th February 2018 at 2:42pm
Machine learning

Deep learning: Machine learning done in a modular way using layers, as in Torch; i.e. Artificial neural networks with many layers.
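A minimal sketch of this layered, modular style in PyTorch (the Python successor to Torch); the layer sizes here are arbitrary assumptions:

```python
import torch
import torch.nn as nn

# A deep network is a stack of layers, each a reusable module.
model = nn.Sequential(
    nn.Linear(784, 256),  # layer 1: affine map
    nn.ReLU(),            # nonlinearity
    nn.Linear(256, 64),   # layer 2
    nn.ReLU(),
    nn.Linear(64, 10),    # output layer
)

x = torch.randn(32, 784)  # a batch of 32 inputs
y = model(x)              # forward pass through all layers in order
```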

http://www.abigailsee.com/2018/02/21/deep-learning-structure-and-innate-priors.html

Two Minute Papers - How Does Deep Learning Work? The computer that mastered Go

Oxford course (with video); see lecture 12

matlab

The idea is also that layers are recursive: layers can themselves be made up of layers.
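A sketch of that recursion in PyTorch: a module built out of modules, which then composes like a single layer (the `Block` class and sizes are made up for illustration):

```python
import torch
import torch.nn as nn

class Block(nn.Module):
    """A 'layer' that is itself built out of layers."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU())

    def forward(self, x):
        return self.net(x)

# Blocks compose like any other layer, so layers nest recursively.
model = nn.Sequential(Block(64), Block(64), nn.Linear(64, 10))
out = model(torch.randn(8, 64))
```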

future

Concepts as programs; programs as networks

Probabilistic programming, Program induction

Training models based on demonstration

Multi-agents, and Communication

Generating programs is not that different from generating explanations

Augmented RNNs

http://www.thespermwhale.com/jaseweston/

NIPS2016


Deep learning methods

Neural networks for spatially structured data

Convolutional neural network

Multi-scale networks, and an application:

Scene Parsing with Multiscale Feature Learning, Purity Trees, and Optimal Covers

http://www.clement.farabet.net/research.html#parsing

Computer vision

Residual neural network

Neural networks for sequential data

Recurrent neural network

Transfer learning

Good for generalizing models: transfer learning, multi-task learning. Useful when you don't have much supervised data.
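A minimal fine-tuning sketch of the idea in PyTorch; `backbone` here is a hypothetical stand-in for any network pretrained on a large task:

```python
import torch
import torch.nn as nn

# Pretend 'backbone' was pretrained on a large dataset.
backbone = nn.Sequential(nn.Linear(784, 128), nn.ReLU())

for p in backbone.parameters():
    p.requires_grad = False       # freeze the pretrained features

head = nn.Linear(128, 5)          # small task-specific head, trained from scratch
model = nn.Sequential(backbone, head)

# Only the head's parameters are updated, so little labelled data is needed.
opt = torch.optim.SGD(head.parameters(), lr=0.01)
logits = model(torch.randn(16, 784))
```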

Neural networks with memory

Memory is good for recognizing time-sequence data. See Long short-term memory.
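A small PyTorch sketch: the LSTM carries a cell state (its memory) across time steps, which is what suits it to sequence data; all sizes are arbitrary assumptions:

```python
import torch
import torch.nn as nn

# An LSTM keeps an internal memory (the cell state) across time steps.
lstm = nn.LSTM(input_size=16, hidden_size=32, batch_first=True)

x = torch.randn(4, 100, 16)   # batch of 4 sequences, 100 steps, 16 features
out, (h, c) = lstm(x)         # h: final hidden state, c: final cell (memory) state
print(out.shape)              # torch.Size([4, 100, 32])
```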

Attention in machine learning

Integrating symbols into deep learning

Deep reinforcement learning

more...

Structured learning – Learning to learn and compositionality with deep recurrent neural networks


Some techniques for deep learning

Layers for deep learning

New advances in deep learning

Dropout. Usefulness of dropout.

Batch normalization

Predicting Parameters in Deep Learning. The intuition motivating the techniques in this paper is the well-known observation that the first-layer features of a neural network trained on natural image patches tend to be globally smooth with local edge features, similar to local Gabor features [6, 13]. That is, they exploit the Simplicity often found in real-world structures. Given this structure, representing the value of each pixel in the feature separately is redundant, since it is highly likely that the value of a pixel will be equal to a weighted average of its neighbours.

The core of the technique is based on representing the weight matrix as a low-rank product of two smaller matrices.
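A sketch of the low-rank idea in PyTorch, using a truncated SVD rather than the paper's exact parameter-prediction scheme; the matrix sizes and rank are arbitrary assumptions:

```python
import torch

n, m, k = 512, 512, 32            # layer dimensions and rank (assumed values)
W = torch.randn(n, m)             # a full weight matrix: n*m parameters

# Low-rank product: represent W by two smaller matrices U (n x k), V (k x m).
U, S, Vh = torch.linalg.svd(W)
U_k = U[:, :k] * S[:k]            # absorb singular values into U's columns
V_k = Vh[:k, :]
W_approx = U_k @ V_k              # only (n + m) * k parameters

print(W.numel(), U_k.numel() + V_k.numel())   # 262144 vs 32768
```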


Deep learning theory

Deep learning applications

Deep art

Applications of AI


Hardware for deep learning

Software for deep learning


People in deep learning

History of deep learning


Books and resources

Deep learning in neural networks: An overview

https://deepmind.com/publications.html

http://www.deeplearningbook.org/

http://carpedm20.github.io/


Work on giving prior knowledge to deep learning: https://yani.io/annou/thesis_online.pdf