A learning principle that approximates the fully Bayesian expectation of a model's output, which would average over all hypotheses weighted by their posterior probability, by considering only the single hypothesis with maximum a posteriori probability.
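A minimal sketch in symbols, with notation introduced here for illustration ($D$ for the data, $\theta$ for a hypothesis, $x, y$ for a new input and output): the MAP hypothesis maximizes the posterior, and plugging it in replaces the full posterior average.

```latex
\hat{\theta}_{\mathrm{MAP}}
  = \arg\max_{\theta}\, p(\theta \mid D)
  = \arg\max_{\theta}\, p(D \mid \theta)\, p(\theta),
\qquad
p(y \mid x, D)
  = \int p(y \mid x, \theta)\, p(\theta \mid D)\, d\theta
  \;\approx\; p(y \mid x, \hat{\theta}_{\mathrm{MAP}}).
```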
video – MAP on graphical models
It is formally equivalent to Maximum likelihood estimation applied to the likelihood multiplied by the prior, i.e., the log of the prior is added to the log likelihood before maximizing.
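A small Python sketch of that equivalence, using a hypothetical coin-flip example with a Beta prior (the data, hyperparameters, and function names are assumptions for illustration only): the MAP objective is just the maximum-likelihood objective plus the log prior.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy data: estimate a coin's heads probability theta from observed flips,
# with an assumed Beta(a, b) prior over theta.
flips = np.array([1, 1, 0, 1, 0, 1, 1, 1])  # 1 = heads, 0 = tails
a, b = 2.0, 2.0                              # assumed prior hyperparameters

def neg_log_posterior(theta):
    # Log likelihood of the flips under a Bernoulli(theta) model.
    log_lik = np.sum(flips * np.log(theta) + (1 - flips) * np.log(1 - theta))
    # Log of the Beta(a, b) prior density (up to an additive constant).
    log_prior = (a - 1) * np.log(theta) + (b - 1) * np.log(1 - theta)
    # MAP maximizes log_lik + log_prior; we minimize its negative.
    return -(log_lik + log_prior)

res = minimize_scalar(neg_log_posterior, bounds=(1e-6, 1 - 1e-6), method="bounded")
heads, n = flips.sum(), len(flips)
print("MLE estimate:", heads / n)                              # likelihood alone
print("MAP estimate (numeric):", res.x)                        # likelihood * prior
print("MAP estimate (closed form):", (heads + a - 1) / (n + a + b - 2))
```

With a uniform prior (a = b = 1) the log-prior term vanishes and the MAP estimate coincides with the maximum-likelihood estimate, which is one way to see the formal equivalence.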