Convex optimization is a collection of techniques for optimizing a small but important class of functions. Classic examples of convex optimization problems include least squares regression and linear programming. It's a very interesting subject, but since most optimization problems data scientists care about aren't convex, it may not offer the same practical utility as more applied subjects.
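As a small illustration of why convexity matters, here is a sketch of the least squares problem mentioned above: because the objective ||Ax - b||² is convex, setting its gradient to zero (the normal equations) yields the global minimum in closed form. The matrix sizes and data here are made up for demonstration.

```python
import numpy as np

# Least squares: minimize ||Ax - b||^2, a canonical convex problem.
# Convexity means the first-order condition A^T A x = A^T b
# (the normal equations) characterizes the global minimum.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))   # tall matrix, full column rank
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true                      # noise-free observations

x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(np.allclose(x_hat, x_true))  # True: recovers x_true exactly
```

With noisy data the recovery is approximate rather than exact, but the solution is still the unique global minimizer as long as A has full column rank.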
Stephen Boyd and Lieven Vandenberghe
- In-text exercises
- Errata etc.
This is the book that everyone we know has used as an introduction to convex optimization, and it's easy to understand why. It's coherently written, covers the basic mathematics, applications, and algorithms well, and doesn't feel much harder than linear algebra or calculus. We found the section on statistical estimation especially useful. We wish the authors provided code for the algorithms, but they aren't that hard to implement in general. The appendix on numerical linear algebra is also quite well done. We're not sure how much practical utility this book will have for most data scientists, but it is a great book.
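To back up the claim that the book's algorithms aren't hard to implement, here is a minimal sketch of gradient descent with backtracking line search, one of the basic unconstrained methods the book covers. The parameter names and values (alpha, beta) and the quadratic test problem are our own choices, not taken from the book.

```python
import numpy as np

# A minimal sketch of gradient descent with backtracking line search
# for a smooth convex objective f with gradient grad.
def grad_descent(f, grad, x0, alpha=0.3, beta=0.8, tol=1e-8, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is tiny
            break
        t = 1.0
        # Backtracking: shrink the step until sufficient decrease holds.
        while f(x - t * g) > f(x) - alpha * t * (g @ g):
            t *= beta
        x = x - t * g
    return x

# Example: minimize the convex quadratic f(x) = x^T Q x / 2 - b^T x,
# whose exact minimizer is the solution of Q x = b.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x_star = grad_descent(lambda x: 0.5 * x @ Q @ x - b @ x,
                      lambda x: Q @ x - b,
                      np.zeros(2))
print(np.allclose(x_star, np.linalg.solve(Q, b), atol=1e-6))  # True
```

For constrained or large-scale problems the book's interior-point and Newton methods take more care, but the core loop above captures the flavor of what implementing them involves.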