Optimization Introduction
Biostat/Biomath M257
1 Introduction
Optimization considers the problem
\[
\begin{aligned}
& \text{minimize} && f(\mathbf{x}) \\
& \text{subject to} && \text{constraints on } \mathbf{x}.
\end{aligned}
\]
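A familiar statistical instance of this template is nonnegative least squares:
\[
\begin{aligned}
& \text{minimize} && \|\mathbf{y} - \mathbf{X} \boldsymbol{\beta}\|_2^2 \\
& \text{subject to} && \boldsymbol{\beta} \geq \mathbf{0}.
\end{aligned}
\]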
Possible confusion:
- We (statisticians) talk about maximization: \(\max \, L(\boldsymbol{\theta})\).
- The optimization literature talks about minimization: \(\min \, f(\mathbf{x})\). The two conventions are equivalent up to a sign change, as shown below.
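There is no real conflict: maximizing a function is the same as minimizing its negative,
\[
\max_{\mathbf{x}} f(\mathbf{x}) = -\min_{\mathbf{x}} \, [-f(\mathbf{x})], \qquad
\arg\max_{\mathbf{x}} f(\mathbf{x}) = \arg\min_{\mathbf{x}} \, [-f(\mathbf{x})],
\]
so maximizing a log-likelihood \(L(\boldsymbol{\theta})\) is the same problem as minimizing \(-L(\boldsymbol{\theta})\).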
Why is optimization important in statistics?
- Maximum likelihood estimation (MLE).
- Maximum a posteriori (MAP) estimation in the Bayesian framework.
- Machine learning: minimize a loss function plus a regularization term (a small sketch combining MLE with a penalty follows this list).
- …
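To make the first and third bullets concrete, here is a minimal Julia sketch, assuming the Optim.jl package is available; the simulated data, the objective, and the penalty weight 0.1 are illustrative choices, not course material. It computes the MLE of a normal mean by minimizing the negative log-likelihood, then adds a ridge-type penalty in the "loss + regularization" spirit.

```julia
# Minimal sketch (assumes Optim.jl is installed; not part of the course notes):
# MLE of a normal mean via minimization of the negative log-likelihood,
# plus a ridge-penalized variant illustrating "loss + regularization".
using Optim, Random

Random.seed!(257)
y = 1.5 .+ randn(100)                            # simulated data with true mean 1.5

negloglik(μ) = 0.5 * sum(abs2, y .- μ[1])        # -log L(μ), up to additive constants
penalized(μ) = negloglik(μ) + 0.1 * abs2(μ[1])   # illustrative penalty weight 0.1

mle = optimize(negloglik, [0.0], BFGS())         # gradient by finite differences
pen = optimize(penalized, [0.0], BFGS())

Optim.minimizer(mle)   # ≈ mean(y)
Optim.minimizer(pen)   # shrunk toward zero by the penalty
```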
Our major goals (or learning objectives) are to
- have a working knowledge of some commonly used optimization methods:
- Newton-type algorithms (a toy Newton iteration is sketched after this list)
- quasi-Newton algorithms
- expectation-maximization (EM) algorithm
- majorization-minimization (MM) algorithm
- convex programming, with emphasis on statistical applications
- implement some of them in homework
- get to know some optimization tools in Julia
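As a preview of the Newton-type methods listed above, here is a toy sketch of a plain Newton iteration in base Julia; the objective \(f(x) = e^x - 2x\) and the helper name `newton` are illustrative choices, not course code.

```julia
# Toy Newton iteration for minimizing f(x) = exp(x) - 2x, whose minimizer is log(2).
# Each step solves the quadratic approximation: x ← x - f'(x) / f''(x).
function newton(f′, f′′, x; maxiter = 25, tol = 1e-10)
    for _ in 1:maxiter
        step = f′(x) / f′′(x)   # Newton step
        x -= step
        abs(step) < tol && return x
    end
    return x
end

xstar = newton(x -> exp(x) - 2, x -> exp(x), 0.0)
xstar ≈ log(2)   # true
```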
What’s not covered in this course:
- Optimality conditions
- Convergence theory
- Convex analysis
- Modern algorithms for large-scale problems (ADMM, coordinate descent, proximal gradient, stochastic gradient, …)
- Combinatorial optimization
- Stochastic algorithms
- Many others
You must take advantage of the great resources at UCLA.
- Lieven Vandenberghe: EE236A (Linear Programming), EE236B (Convex Optimization), EE236C (Optimization Methods for Large-scale Systems). One of the best places to learn convex programming.
- Kenneth Lange: Biomath 205 (Top Computational Algorithms). The best place to learn MM type algorithms.