aka statistical learning
A part of Artificial intelligence that uses methods ranging from Computer science to Statistics to create automated (machine) learners: systems that can extract Knowledge and insight from Information and data.
https://paperswithcode.com/sota
http://bit.do/oxtorch – http://ml4a.github.io/guides/ – https://developers.google.com/machine-learning/crash-course/
Good discussion panel on current ML research (2017) – https://blog.google/topics/machine-learning/introducing-machine-learning-practica/
– Book recommendations – another list
Building Machine Learning Systems with Python – Machine learning in Matlab – Lecture list of Andrew Ng's course: – lecture notes – Andrew Ng machine learning course https://www.youtube.com/watch?v=UzxYlbK2c7E (on lecture 2) – Machine Learning - mathematicalmonk – Machine Learning: A Probabilistic Perspective and here – Machine Learning: Discriminative and Generative (The Springer International Series in Engineering and Computer Science)
– Pedro Domingos: "The Master Algorithm" Talks at Google. Grand unified theory of learning?
http://www.r2d3.us/visual-intro-to-machine-learning-part-1/
lecture series – another one, focusing on theory
Parametric approaches start with a model with a fixed number of parameters, and a learning algorithm to find the best parameters for the data. See Learning theory
Nonparametric approaches define a procedure that yields a function; they can be seen as models with a variable number of parameters. Some examples are:
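The contrast can be sketched in code (a toy example; the data, noise level, and choice of k are arbitrary): a least-squares line has a fixed number of parameters no matter how much data arrives, while k-nearest-neighbours keeps the whole training set around.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2 * x + 1 + rng.normal(0, 0.1, 50)  # noisy samples of y = 2x + 1

# Parametric: fit a line y = a*x + b -- exactly 2 parameters, regardless of data size.
a, b = np.polyfit(x, y, 1)

# Nonparametric: k-nearest-neighbours regression -- the "parameters" are the data itself.
def knn_predict(x_new, k=5):
    idx = np.argsort(np.abs(x - x_new))[:k]  # indices of the k closest training points
    return y[idx].mean()

print(a, b)              # close to 2 and 1
print(knn_predict(0.5))  # close to 2*0.5 + 1 = 2
```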
Often in machine learning we assume the observations are independent, but we can also treat non-independent observations with sequence learning
A newer paradigm, in which we try to learn as much of the pipeline as possible, from the features up to the classification, by using deep models.
Training data consists of inputs and outputs. We want to find a function relating inputs to outputs, to then be able to predict new outputs from new inputs. The problem is thus formalized as function approximation.
Two main types:
Actually, I think unsupervised learning is the most general. After all, supervised learning can be seen as a special case of unsupervised learning, where the data points are pairs (x, y), and we want to find a function f so that the data can be modeled as (x, f(x)) as well as possible; no need to interpret this as "supervising", but can instead interpret it as "finding structure".
–> Well, actually: I think the distinction is that in unsupervised learning your training and test data have the same form, while in supervised learning they differ (training is labelled, and test isn't)
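A minimal "finding structure" sketch of the unsupervised case (toy data with two obvious blobs; the blob centres, spread, and iteration count are all made up): plain k-means is given points with no labels at all and recovers the cluster centres.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two well-separated 2-D blobs; no labels are ever shown to the algorithm.
data = np.concatenate([rng.normal(0, 0.5, (50, 2)),
                       rng.normal(5, 0.5, (50, 2))])

# Plain k-means with k=2: alternate point assignment and centre updates.
centres = np.array([data[0], data[-1]])  # deterministic init, one point from each end
for _ in range(10):
    dists = np.linalg.norm(data[:, None] - centres[None, :], axis=2)
    labels = dists.argmin(axis=1)        # assign each point to its nearest centre
    centres = np.array([data[labels == j].mean(axis=0) for j in range(2)])

print(np.sort(centres[:, 0]))  # roughly 0 and 5, the true blob centres
```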
Variations on supervised and unsupervised
You are given a set of inputs, but you only have the corresponding outputs for some of them. You have to predict the outputs for the rest (by learning the input–output function, for instance, as in Supervised learning).
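One common semi-supervised scheme is self-training: fit on the few labelled points, pseudo-label the unlabelled ones, and refit on everything. A toy 1-D sketch (the data, the nearest-centroid classifier, and the iteration count are all arbitrary choices, not a standard recipe):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two 1-D clusters; only ONE labelled example per class.
x_lab = np.array([0.0, 5.0])
y_lab = np.array([0, 1])
x_unl = np.concatenate([rng.normal(0, 0.5, 20), rng.normal(5, 0.5, 20)])

# Self-training with a nearest-centroid classifier: pseudo-label the
# unlabelled points, then refit the centroids on labelled + pseudo-labelled.
for _ in range(2):
    centroids = np.array([x_lab[y_lab == c].mean() for c in (0, 1)])
    pseudo = (np.abs(x_unl - centroids[1]) < np.abs(x_unl - centroids[0])).astype(int)
    x_lab = np.concatenate([x_lab[:2], x_unl])   # keep the 2 true labels, add pseudo-labels
    y_lab = np.concatenate([y_lab[:2], pseudo])

print(centroids)  # near the true cluster centres, 0 and 5
```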
Like semi-supervised learning, but the algorithm can ask for extra data, choosing the data it deems most useful.
Basically, the loss functions/costs used by the learning agent are based on Decision theory. See example here.
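A toy illustration of the decision-theoretic view (the probabilities and the loss matrix are made up for the example): the agent picks the action minimizing expected loss under its predictive distribution, which can differ from picking the most probable state.

```python
import numpy as np

# Hypothetical model output: probability an email is spam vs. not spam.
p = np.array([0.7, 0.3])  # [P(spam), P(not spam)]

# loss[action, true_state]: deleting a genuine email hurts much more
# than letting one spam through -- a deliberately asymmetric, made-up loss.
loss = np.array([[0.0, 10.0],   # action 0: delete
                 [1.0,  0.0]])  # action 1: keep

expected = loss @ p     # expected loss of each action
best = expected.argmin()
print(expected, best)   # "keep" wins despite P(spam) = 0.7
```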
Incremental learning is a machine learning paradigm in which learning takes place whenever new examples emerge, adjusting what has been learned according to the new examples.
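A minimal incremental-learning sketch (toy data stream; the true slope, learning rate, and step count are arbitrary): the parameter is adjusted as each example arrives, and old data is never revisited.

```python
import numpy as np

rng = np.random.default_rng(0)

# Online SGD for y = w*x on a stream of examples (true w = 3).
w, lr = 0.0, 0.1
for _ in range(500):
    x = rng.uniform(-1, 1)              # a new example arrives...
    y = 3 * x + rng.normal(0, 0.1)
    w += lr * (y - w * x) * x           # ...and we take one gradient step
                                        # on its squared error, then discard it.
print(w)  # close to 3
```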
Related: Transfer learning
Inferring values of missing entries in data
To me it seems like the difference from supervised learning is that you don't specify input–output pairs, but just outputs: desired outputs and undesired outputs. There is no input, but the problem is still not trivial (i.e. it doesn't just ever produce one output), because the model is probabilistic.
Sequence of decisions
Reward function
Used often in robotics.
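These ingredients, a sequence of decisions driven by a reward function, can be seen in a minimal tabular Q-learning sketch (toy corridor environment; all constants are arbitrary, and the behaviour policy is uniformly random, which Q-learning tolerates because it is off-policy):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny corridor: states 0..4, reward 1 on reaching state 4 (terminal).
# Actions: 0 = left, 1 = right; the wall at state 0 is reflecting.
n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma = 0.5, 0.9

for _ in range(200):                    # episodes
    s = 0
    while s != 4:
        a = rng.integers(n_actions)     # explore with a uniformly random policy
        s2 = max(0, s - 1) if a == 0 else s + 1
        r = 1.0 if s2 == 4 else 0.0
        # Q-learning update: bootstrap from the best action at the next state.
        Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
        s = s2

print(Q.argmax(axis=1))  # the learned greedy policy: go right in every non-terminal state
```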
See Computational learning theory
The theory and algorithms for learning.
Models for Probability distributions. These models relate Random variables, using some more or less general assumptions about the nature of the data.
Many other models used in different areas of machine learning.
Good framework: Stan
Other forms of Artificial intelligence, particularly symbolist AI, can be useful for machine learning
Promising approaches combine several of the paradigms: Integrating symbols into deep learning
Learning as the inverse of deduction, going from instances to generalities. See vid
Mathematics of machine learning
See more in Machine learning in science and engineering, and Applications of AI.
Lectures on theoretical foundations of data science
Try Torch:
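A minimal sketch using PyTorch (assumes `torch` is installed; the data and hyperparameters are made up): fit y = 2x with a one-parameter linear layer and plain gradient descent.

```python
import torch

# Toy data for y = 2x (no noise, just to show the training loop).
x = torch.linspace(-1, 1, 100).unsqueeze(1)
y = 2 * x

model = torch.nn.Linear(1, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for _ in range(200):
    loss = torch.nn.functional.mse_loss(model(x), y)
    opt.zero_grad()   # clear gradients from the previous step
    loss.backward()   # autograd computes d(loss)/d(parameters)
    opt.step()        # gradient descent update

print(model.weight.item())  # close to 2
```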
See https://www.youtube.com/watch?v=DHspIG64CVM#t=45m40s
http://www.robots.ox.ac.uk/~az/lectures/index.html
https://www2.eecs.berkeley.edu/Faculty/Homepages/jordan.html
https://www.wikiwand.com/en/Transduction_(machine_learning)
Good collection of tutorials https://medium.com/machine-learning-in-practice/over-200-of-the-best-machine-learning-nlp-and-python-tutorials-2018-edition-dd8cf53cb7dc