Optimization

cosmos 29th November 2017 at 12:22am
Engineering Mathematical methods Operations research Scientific computing

https://en.wikipedia.org/wiki/Mathematical_optimization

https://en.wikipedia.org/wiki/Optimization_%28disambiguation%29

oxford course

http://www.benfrederickson.com/numerical-optimization/


\min\limits_{x \in X} f(x)

  • Continuous optimization. The space X over which you are optimizing is continuous.
  • Discrete optimization. The space X over which you are optimizing is discrete. Good book

Continuous optimization

Good learning resource: book + lectures.

Unconstrained and constrained optimization

\min\limits_{x \in X} f(x)

See intro slides. Often X = R^n, an n-dimensional real vector space.

Constrained optimization refers to optimization where the optimization space is some subspace of an "underlying space". This is a bit arbitrary, but the underlying space is most often assumed to be R^n, and the constraints then define a subset of R^n through inequalities and equalities.
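In practice such problems are often handed to a numerical solver. A minimal sketch with SciPy's SLSQP method (the objective, the line x + y = 1, and the starting point are all made up for illustration):

```python
from scipy.optimize import minimize

# Objective: squared distance from the point (1, 2)
f = lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2

# Equality constraint restricting R^2 to the line x + y = 1
cons = {"type": "eq", "fun": lambda x: x[0] + x[1] - 1}

res = minimize(f, x0=[0.0, 0.0], constraints=[cons], method="SLSQP")
# res.x is close to (0, 1), the projection of (1, 2) onto the line
```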

Constrained optimization

All of these mostly refer to continuous optimization.

wiki, notes, notes2

Optimization problem with inequality constraints

A mathematical optimization problem, or optimization problem.

One can also have equality constraints, apart from inequalities.

Primal and dual problem and KKT conditions

Andrew Ng video. General optimization problem. Dual problem, which uses the Max–min inequality. Conditions for the primal and dual problems being equivalent (including the Karush–Kuhn–Tucker conditions).

Has applications in Support vector machines. The reason we use the dual problem is that it has many useful properties; in particular, the dual is always a convex problem (a concave maximization over the dual variables), even when the primal is not.
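A sketch of the standard primal/dual setup (in the usual convention, with inequality constraints f_i and equality constraints h_j):

\min_x f_0(x) \quad \text{s.t.} \quad f_i(x) \le 0, \;\; h_j(x) = 0

L(x, \lambda, \nu) = f_0(x) + \sum_i \lambda_i f_i(x) + \sum_j \nu_j h_j(x)

g(\lambda, \nu) = \inf_x L(x, \lambda, \nu), \qquad \text{dual: } \max_{\lambda \ge 0,\, \nu} g(\lambda, \nu)

Weak duality (dual optimum ≤ primal optimum) follows from the max–min inequality; g is concave because it is a pointwise infimum of functions affine in (λ, ν), which is why the dual is always convex.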

Linear programming: Linear objective and constraint functions.

Used in Operations research. Can be used to solve (either exactly or approximately) some discrete optimization problems. See Linear programming relaxation
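A toy example with SciPy's linprog (objective and constraints made up for illustration; linprog minimizes, so we negate the objective to maximize):

```python
from scipy.optimize import linprog

# Maximize x + y subject to x + 2y <= 4, 3x + y <= 6, x >= 0, y >= 0
# (linprog minimizes, so the objective is negated; bounds default to x, y >= 0)
res = linprog(c=[-1, -1], A_ub=[[1, 2], [3, 1]], b_ub=[4, 6])
# The optimum sits at the vertex where the two constraints intersect: (1.6, 1.2)
```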

Nonlinear programming

Objective and constraint functions are non-linear. Some special cases:

Often for global nonlinear optimization, we need to use brute-force methods, with Computational complexity exponential in the dimension of the optimization space (the space of optimization variables). For approximate but faster solutions, we can use local optimization, or heuristics.

Lagrange multiplier method

http://mat.gsia.cmu.edu/classes/QUANT/NOTES/chap4/node6.html

Good explanation of inequality constraints using an extension of Lagrange multipliers. Think of planes in 3D as constraint surfaces for intuition on the equation! At a constrained optimum, ∇f should be perpendicular to the hypersurface defined by the constraints.
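A worked toy example of the method with SymPy (objective and constraint chosen arbitrarily): minimize x² + y² on the line x + y = 1.

```python
import sympy as sp

x, y, lam = sp.symbols("x y lam")
f = x**2 + y**2      # objective
g = x + y - 1        # constraint g(x, y) = 0
L = f - lam * g      # Lagrangian

# Stationarity: grad L = 0, i.e. grad f = lam * grad g, plus the constraint
sols = sp.solve([sp.diff(L, x), sp.diff(L, y), g], [x, y, lam], dict=True)
# sols -> [{x: 1/2, y: 1/2, lam: 1}]
```

Geometrically, at (1/2, 1/2) the gradient ∇f = (1, 1) is parallel to the constraint normal, i.e. perpendicular to the line.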

Discrete optimization

Discrete optimization using energy minimization

Can formulate discrete optimization as an Energy minimization problem on a set of random variables which take values in a discrete set. This can be formulated as a Graphical model: Energy-based model, Ising model, etc.
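A toy sketch of this framing: energy minimization on a 1D Ising chain, solved by brute force over all 2^n spin configurations (the coupling J and chain length are arbitrary choices):

```python
from itertools import product
import numpy as np

# Energy of a 1D Ising chain: E(s) = -J * sum_i s_i * s_{i+1}, with s_i in {-1, +1}
def ising_energy(spins, J=1.0):
    spins = np.asarray(spins)
    return -J * np.sum(spins[:-1] * spins[1:])

# Brute-force discrete optimization over all 2^n configurations
n = 6
best = min(product([-1, 1], repeat=n), key=ising_energy)
# A ground state has all spins aligned, with energy -(n - 1)
```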

Linear programming relaxation

Local optimization

Find local optima, as a method to approximately solve hard nonlinear optimization problems. Main methods:

The effect of gradient noise on the energy landscape of deep networks

Explorations on high dimensional landscapes
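The simplest local method is gradient descent. A minimal sketch on an arbitrary nonconvex 1D function, showing that which local optimum is found depends on the initialization:

```python
# Gradient descent on the nonconvex f(x) = x^4 - 3x^2 + x, which has two
# local minima; the one found depends on the starting point x0.
def grad_descent(grad, x0, lr=0.01, steps=2000):
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # step downhill along the negative gradient
    return x

grad = lambda x: 4 * x**3 - 6 * x + 1  # derivative of f

left = grad_descent(grad, x0=-2.0)   # converges to the left local minimum
right = grad_descent(grad, x0=2.0)   # converges to the right local minimum
```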

More things on optimization


Heuristic optimization

Evolutionary computing

Artificial intelligence?


Hyperoptimization

Gradient-based Hyperparameter Optimization through Reversible Learning


Probabilistic programming

See links here

Memetic algorithm

Evolutionary computing

Simulated annealing! http://www.mit.edu/~dbertsim/papers/Optimization/Simulated%20annealing.pdf
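A minimal simulated-annealing sketch (the toy objective, neighbourhood move, and cooling schedule are all arbitrary choices for illustration):

```python
import math
import random

# Minimize f over the integers; uphill moves are accepted with probability
# exp(-delta / T) (the Metropolis rule), with temperature T slowly cooled.
def anneal(f, x0, T0=10.0, cooling=0.995, steps=5000, seed=0):
    rng = random.Random(seed)
    x, T = x0, T0
    for _ in range(steps):
        cand = x + rng.choice([-1, 1])  # propose a neighbouring state
        delta = f(cand) - f(x)
        if delta <= 0 or rng.random() < math.exp(-delta / T):
            x = cand                    # accept downhill, or sometimes uphill
        T *= cooling                    # cool down: fewer uphill moves over time
    return x

best = anneal(lambda x: x * x, x0=40)  # ends up near the global minimum at 0
```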

https://en.wikipedia.org/wiki/Inferential_programming


Applications

Many applications in Science, Engineering, Statistics...


Can we measure the difficulty of an optimization problem?