




MIT 6.7220/15.084 — Nonlinear Optimization (Spring ‘25) Tue, Feb 4th 2025
Lecture 1
Introduction
Instructor: Prof. Gabriele Farina ( gfarina@mit.edu)
L1.1 Mathematical optimization, and this course
One cannot overstate how pervasive and far-reaching mathematical optimization is. Opti-
mization is everywhere. This is true in two senses:
• Nearly all human endeavors and designs are driven by an aspiration to optimize:
minimize risk, maximize reward, reduce energy consumption, train a neural network
to minimize model loss, et cetera.
• Nature’s phenomena themselves seem to be guided by optimization—the principle
of least action, Hamiltonian mechanics, the Rayleigh-Ritz variational principle (from
which the Schrödinger equation can be derived), et cetera.¹
But the benefits of an optimization mindset are also more subtle. For example, optimization
in the space of semidefinite matrices has led to important breakthroughs in classical combi-
natorial optimization problems in graph theory, such as the Max-Cut problem [GW95].
Optimization theory can also be used to prove the fundamental theorem of algebra; amusingly,
Güler presents such a proof as early as page 34 (!) of his “Foundations of Optimization” book
[Gül10], adapting some ideas by Terkelsen, F. [Ter76] and Fefferman, C. [Fef67].
The literature on optimization dates back to (at least) the ancient Greeks, with their
work on the isoperimetric problem. The problem, which asks to construct the shape that
maximizes the area given a fixed perimeter, is at the center of the legend of how Carthage
—one of the most powerful empires of the classical world—was founded.² It was solved (to
the standards of rigor of those times, at least) by Zenodorus.³
Fast-forwarding to today, mathematical optimization is a vast field, subdivided into a myriad
of subfields depending on what types of optimization problems are being solved; here are
just a few:
• Linear optimization refers to problems that seek to minimize or maximize linear
objectives subject to linear constraints imposed on continuous optimization variables.
• Integer linear optimization is akin to linear optimization, but variables can be further
restricted to only taking integer values.
• Combinatorial optimization more generally studies optimization problems over finite
sets.
• Convex optimization studies optimization of convex objectives on convex sets.
• Calculus of variations studies optimization problems in function spaces (that is, where
the optimal object that is sought is a function).
L1.1.1 Focus and objectives of this course
This is a graduate-level course on nonlinear optimization. The main focus of this course
is continuous optimization in finite-dimensional spaces. Nonlinear optimization
encompasses a large variety of topics and algorithms, with applications that range from
semidefinite relaxations of combinatorial problems, to regression problems, to the training of
deep neural networks. I foresee that this course will attract students from (at least) two very
different categories:
• Some of you are interested in nonlinear optimization theory. Typically, you’re facing
an optimization problem that you hope to solve in closed form, so that you can plug
in or reason about the solution as part of a larger mathematical endeavor.
• Some of you are interested in nonlinear optimization algorithms. Typically, you are
facing an optimization problem for which you simply want a numerical solution.
In this course, the TAs and I have worked hard to cater to both of these categories of
students.
L1.1.2 Prerequisites
This class is pretty light on prerequisites. Fundamentally, the only prerequisites are Linear
algebra (18.06) and Real analysis (18.100A/B/Q). Beyond some basic mathematical concepts
(continuity, vector spaces, some basic point-set topology), we will try to keep this course
as self-contained as possible.
L1.2 Optimization problems: Preliminary considerations
In general, an optimization problem has the form
    min_𝑥    𝑓(𝑥)
    s.t.     𝑥 ∈ Ω.
In terms of nomenclature:
• The function we are trying to minimize (or maximize), in this case 𝑓(𝑥), is called the
objective function.
• The entries of the vector 𝑥 are called optimization variables.
• The set Ω is called the feasible set or constraint set.
In particular, in this class we are interested in optimization problems that satisfy certain
assumptions:
• First of all, we assume that Ω is a subset of some ambient finite-dimensional Euclidean space 𝐸.
Without loss of generality, 𝐸 = ℝⁿ.
• The objective function 𝑓(𝑥) is a continuous real-valued function. Very often, we also
assume it is differentiable.
• A category of problems we pay particular attention to is that of problems with functional
constraints, that is, those in which the feasible set is defined as

    Ω ≔ {𝑥 ∈ 𝐸 | 𝑔𝑖(𝑥) ≤ 0 for 𝑖 = 1, ..., 𝑟;   ℎ𝑗(𝑥) = 0 for 𝑗 = 1, ..., 𝑚}.    (1)
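As a concrete illustration, the following minimal Python sketch encodes a toy problem in the form (1) (the particular 𝑓, 𝑔1, and ℎ1 are arbitrary choices, used only for illustration) and checks whether a candidate point lies in Ω.

    import numpy as np

    # Hypothetical toy data in the form (1): f(x) = ||x||^2, one inequality, one equality.
    def f(x):
        return float(np.dot(x, x))

    ineq = [lambda x: x[0] + x[1] - 1.0]   # g_1(x) = x_1 + x_2 - 1 <= 0
    eq   = [lambda x: x[0] - 2.0 * x[1]]   # h_1(x) = x_1 - 2 x_2   = 0

    def is_feasible(x, tol=1e-9):
        # Membership in Omega: all g_i(x) <= 0 and all h_j(x) = 0 (up to a tolerance).
        return all(g(x) <= tol for g in ineq) and all(abs(h(x)) <= tol for h in eq)

    x = np.array([0.4, 0.2])
    print(is_feasible(x), f(x))            # True 0.2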
When faced with an optimization problem, two questions are immediate:
1. Does a solution exist?
2. If yes, can we compute it?
We will look into some preliminary observations regarding these two questions in Sec-
tions L1.2.1 and L1.2.2, respectively.
L1.2.1 Existence of solutions: Weierstrass theorem
There are two reasons why an optimization problem might not attain a minimum:
• Ω is empty, that is, no feasible points exist at all; or
• for any candidate minimizer, there always exists another point that achieves a lower
value, as shown in the following two examples.
Example L1.1. Consider the minimization of the following two functions, each over its
domain:
[Figure: two plots. Left: 𝑓(𝑥) = tan(𝑥) on Ω = (−𝜋/2, 𝜋/2). Right: 𝑓(𝑥) = −10 + 𝑒^{−𝑥} on Ω = ℝ.]
In the plot on the left, the objective function is unbounded below: values can be made
arbitrarily small. No minimizer exists.
In the plot on the right, the objective has a lower bound, but for any value of 𝑥, every
𝑥′ > 𝑥 achieves a strictly smaller value 𝑓(𝑥′) < 𝑓(𝑥). Again, no minimizer
exists.
So, assuming that the objective 𝑓 is continuous, when can we be certain that an optimizer
exists? If we inspect Example L1.1, we notice that both functions are continuous. In the
first example, the domain is open (the endpoints ±𝜋/2 are not in the domain) and bounded.
In the second example, the domain is unbounded and closed. This shows that, if there is
any hope of establishing a general result on the existence of minimizers, such a result must
require both closedness and boundedness. We are in luck:
Theorem L1.1 (Weierstrass theorem). Let 𝑓 : Ω → ℝ be a continuous function defined
on a nonempty and compact (i.e., closed and bounded) set Ω. Then, there exists a
minimizer 𝑥⋆ ∈ Ω of 𝑓 on Ω, that is,

    𝑓(𝑥⋆) ≤ 𝑓(𝑥)   for all 𝑥 ∈ Ω.
A note on boundedness. Weierstrass theorem is a pretty universal tool that you can keep
in your toolbox for when you need to argue that an optimization problem has a solution.
However, many optimization problems do not start off with a bounded feasible set Ω.
Consider the following example:
Example L1.2. Let 𝑦 ∈ ℝⁿ be a point, and 𝑆 ⊆ ℝⁿ be a closed but not necessarily
bounded set. Prove that there always exists a projection point of 𝑦 onto 𝑆, that is, a solution to

    min_𝑥    ‖𝑥 − 𝑦‖
    s.t.     𝑥 ∈ 𝑆.

[Figure: a closed, unbounded set 𝑆 and a point 𝑦 outside of it.]
Solution. The idea is to show that we can safely restrict our attention to a compact
subset of the feasible set of the problem—in this case, 𝑆. By “safely”, we mean that the
minimizer does not change before and after making the modification.
In this case, this is simple: pick an arbitrary point 𝑧 ∈ 𝑆. Then, we know for sure that,
if a minimizer exists, it must live in the intersection

    Ω ≔ 𝑆 ∩ 𝔹‖𝑧−𝑦‖(𝑦)

between 𝑆 and the closed ball centered at 𝑦 of radius ‖𝑦 − 𝑧‖ (see the figure on the side).
Now, Ω is nonempty (it contains 𝑧), closed (since both
𝑆 and the ball are closed and intersection preserves
closedness) and bounded (since the ball has finite
radius). So, since the objective function is continuous,
we can invoke Weierstrass theorem and conclude the
proof.
[Figure: the set 𝑆, the point 𝑦, the chosen point 𝑧 ∈ 𝑆, and the compact intersection Ω.]
□
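As a small numerical illustration of this restriction argument (the particular set 𝑆 is a hypothetical choice, and the grid search only mimics the compactness argument, without any claim of efficiency), the following Python sketch projects 𝑦 onto the closed, unbounded set 𝑆 = {𝑥 ∈ ℝ² : 𝑥1 ≥ 1} by searching only over the compact set Ω = 𝑆 ∩ 𝔹‖𝑧−𝑦‖(𝑦).

    import numpy as np

    # Hypothetical closed, unbounded set S = {x in R^2 : x_1 >= 1}; project y = (0, 0) onto it.
    y = np.array([0.0, 0.0])
    z = np.array([2.0, 2.0])                     # an arbitrary point of S
    R = np.linalg.norm(z - y)                    # radius making Omega = S ∩ B_R(y) compact

    # Brute-force grid search over the compact set Omega (the true projection is (1, 0)).
    grid = np.linspace(-R, R, 401)
    A, B = np.meshgrid(grid, grid)
    pts = np.stack([A.ravel(), B.ravel()], axis=1)
    dist = np.linalg.norm(pts - y, axis=1)
    in_Omega = (pts[:, 0] >= 1.0) & (dist <= R)  # membership in S and in the closed ball
    best = pts[in_Omega][np.argmin(dist[in_Omega])]
    print(best, np.linalg.norm(best - y))        # approximately [1. 0.] 1.0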
More generally, we have the following:
Theorem L1.2 (Weierstrass theorem for compact sublevel sets). Let 𝑓 be a continuous
function defined on a set 𝑆. If 𝑓 has a nonempty and compact sublevel set, that is, there
exists 𝛼 ∈ ℝ such that
{𝑥 ∈ 𝑆 : 𝑓(𝑥) ≤ 𝛼}
is nonempty, bounded, and closed, then 𝑓 has a minimizing point in 𝑆.
A note on closedness. When the feasible set is bounded but not closed, things are slightly
more tricky. However, in many cases you will still be able to use Weierstrass theorem. As
a rule of thumb, there are two ways to deal with this:
• Show that you can safely restrict your attention to a compact subset of the feasible
set, as in Example L1.2.
• Start by considering the closure of the feasible set, and then show that the objective
function is continuous on the closure. After applying Weierstrass theorem, you
can then argue that the minimizer is not one of the boundary points you added
when taking the closure. This approach could for example be used in the following
application, which comes up naturally in online learning and reinforcement learning:
entropy-regularized best responses.
Example L1.3. Let Δ̊𝑛 denote the set of probability distributions over 𝑛 actions
{1, ..., 𝑛} that have full support, that is, the open set

    Δ̊𝑛 ≔ {(𝑥1, ..., 𝑥𝑛) ∈ ℝⁿ_{>0} : 𝑥1 + ⋯ + 𝑥𝑛 = 1}.

Given an assignment of values 𝑣𝑖 for each action 𝑖 = 1, ..., 𝑛, the entropy-regularized
best response given the values is the distribution that solves the following problem:

    min_𝑥    𝑔(𝑥) ≔ − ∑_{𝑖=1}^{𝑛} 𝑣𝑖𝑥𝑖 + ∑_{𝑖=1}^{𝑛} 𝑥𝑖 log 𝑥𝑖
    s.t.     𝑥 ∈ Δ̊𝑛.
Show that a minimizer exists.
(You should try to solve this!)
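Without spoiling the exercise, one can at least probe the problem numerically. The sketch below (assuming NumPy and SciPy are available; the values 𝑣 are an arbitrary choice for illustration) minimizes 𝑔 over the closed simplex and prints the computed minimizer, whose entries indeed come out strictly positive.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import xlogy

    v = np.array([1.0, 2.0, 0.5])                 # hypothetical action values
    n = len(v)

    def g(x):
        # g(x) = -<v, x> + sum_i x_i log x_i; xlogy extends x log x to 0 by continuity.
        return -v @ x + xlogy(x, x).sum()

    x0 = np.full(n, 1.0 / n)                      # start from the uniform distribution
    res = minimize(g, x0,
                   bounds=[(0.0, 1.0)] * n,
                   constraints=[{"type": "eq", "fun": lambda x: x.sum() - 1.0}])
    print(res.x, res.fun)                         # the numerical minimizer has strictly positive entries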
L1.2.2 Nonlinear optimization is computationally hard
We now turn our attention to the second question: what is reasonable to expect in terms of
computational efficiency in finding the solution?
Let me start with the good news: there are entire subclasses of nonlinear optimization
problems that are computationally tractable (most notably, convex optimization problems,
albeit with some exceptions). As we will see, even beyond these classes, usually it’s possible
to at least guarantee some form of convergence to some weaker, “local”, notion of optimality.
However, in general, finding minimizers of nonconvex problems is computationally
intractable. By this we mean that we believe no efficient algorithm exists that can
solve all nonconvex optimization problems. This is because nonconvex optimization captures
problems that are widely believed to be intractable, such as integer linear programming and
the NP-hard Max-Cut problem,⁴ already under benign assumptions such as quadratic functional
constraints and objective.
Example L1.4 (Max-Cut as a nonlinear optimization problem). Max-Cut is one of the
most fundamental problems in theoretical computer science and discrete optimization.
The definition is very simple:
• Let 𝐺 = (𝑉 , 𝐸) be a graph, where 𝑉 ≔ {1, 2, ..., 𝑛} is the set of vertices.
• We want to partition 𝑉 into two sets 𝑆 and 𝑉 ∖ 𝑆, such that the number of edges
going between 𝑆 and 𝑉 ∖ 𝑆 is maximized.
Can you see how this optimization problem might be formulated as a nonlinear
optimization problem with integer variables? Can you remove the constraint that the
variables are integer, and convert it into a continuous problem?
Solution. As a general rule of thumb, the first thing we want to do when translating a problem
into a mathematical programming formulation is to pick the variables. In this case, it is
intuitive that we need variables that encode whether each node belongs to 𝑆 or 𝑉 ∖ 𝑆.
While several choices make perfect sense, let’s go with the following. Let’s assign to each
node 𝑣 ∈ 𝑉 a variable 𝑥𝑣 ∈ {−1, +1}, with the meaning that:
• if 𝑥𝑣 = +1, then 𝑣 ∈ 𝑆;
• if 𝑥𝑣 = −1, then 𝑣 ∈ 𝑉 ∖ 𝑆.
It is immediate to see that an edge (𝑢, 𝑣) ∈ 𝐸 is a cut edge (that is, connecting one node
in 𝑆 to one node in 𝑉 ∖ 𝑆) if and only if 𝑥𝑢 ⋅ 𝑥𝑣 = −1; otherwise, 𝑥𝑢 ⋅ 𝑥𝑣 = +1. Hence, we can
count exactly the number of edges in the cut via the objective function
    ℎ(𝑥) = ∑_{(𝑢,𝑣)∈𝐸} (1 − 𝑥𝑢 ⋅ 𝑥𝑣)/2.
This leads to the discrete optimization problem
    max_𝑥    ℎ(𝑥)
    s.t.     𝑥𝑣 ∈ {−1, +1}   ∀𝑣 ∈ 𝑉.
This problem can be readily written as a continuous, nonlinear optimization problem by
noting that
    𝑥𝑣 ∈ {−1, +1}   ⟺   𝑥𝑣² = 1.
This leads to the nonlinear optimization problem
    max_𝑥    ∑_{(𝑢,𝑣)∈𝐸} (1 − 𝑥𝑢 ⋅ 𝑥𝑣)/2
    s.t.     𝑥𝑣² = 1   ∀𝑣 ∈ 𝑉,    (2)
which is a quadratically-constrained quadratic optimization problem.
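As a sanity check on this formulation, here is a minimal Python sketch (the toy graph is an arbitrary 4-cycle chosen for illustration) that evaluates ℎ and solves Max-Cut by brute force, which is of course only viable for very small graphs.

    import itertools

    # Hypothetical toy graph: the 4-cycle on vertices 0, 1, 2, 3.
    V = [0, 1, 2, 3]
    E = [(0, 1), (1, 2), (2, 3), (3, 0)]

    def h(x):
        # Each cut edge contributes (1 - x_u * x_v) / 2 = 1; each uncut edge contributes 0.
        return sum((1 - x[u] * x[v]) / 2 for (u, v) in E)

    # Enumerate all assignments x_v in {-1, +1}; feasible only because the graph is tiny.
    best = max(itertools.product([-1, 1], repeat=len(V)), key=h)
    print(best, h(best))                          # (-1, 1, -1, 1) 4.0: the 4-cycle is bipartite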
Despite the generally negative message, I want to end on a note of encouragement. Sure, we
cannot hope to construct a machine that can solve all nonlinear optimization
problems. But nonlinear optimization theory is still incredibly powerful, both because many
useful/powerful optimization problems can be solved efficiently, and because even for the
hard problems, nonlinear optimization might have some tricks up its sleeve. As a case in
point, it turns out that the optimization problem in (2), which is hard, can be relaxed to a
pretty non-obvious optimization problem that can be efficiently solved.
For those interested, this is obtained by introducing the relaxation
    max_𝑦    ∑_{(𝑢,𝑣)∈𝐸} (1 − 𝑦𝑢𝑣)/2
    s.t.     𝑦𝑣𝑣 = 1   ∀𝑣 ∈ 𝑉
             [𝑦𝑢𝑣] ⪰ 0,
where [𝑦𝑢𝑣] denotes the matrix where the entry in row 𝑢 ∈ 𝑉 and column 𝑣 ∈ 𝑉 is 𝑦𝑢𝑣 and
⪰ 0 means that the matrix just defined must be positive semidefinite. The important fact
here is that this relaxation can be efficiently solved (we will see how later on in this course).
With some clever rounding scheme, one can then recover a solution to the original Max-
Cut instance that is guaranteed to be 0.878-approximate. This algorithm, due to Goemans,
M. X., & Williamson, D. P. [GW95], makes deep use of nonlinear optimization.
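For concreteness, here is a small sketch of the relaxation and of the random-hyperplane rounding step in Python (assuming the cvxpy modeling package with an SDP-capable solver such as SCS is installed; the 5-cycle graph is an arbitrary example). This is only an illustration of the idea, not a reference implementation.

    import numpy as np
    import cvxpy as cp

    # Hypothetical toy graph: the 5-cycle (its maximum cut has value 4).
    n = 5
    E = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]

    # Semidefinite relaxation: maximize the sum over edges of (1 - y_uv)/2 with unit diagonal.
    Y = cp.Variable((n, n), PSD=True)
    objective = cp.Maximize(sum((1 - Y[u, v]) / 2 for (u, v) in E))
    cp.Problem(objective, [cp.diag(Y) == 1]).solve()

    # Random-hyperplane rounding: factor Y = V V^T and cut according to sign(V r).
    w, U = np.linalg.eigh(Y.value)
    vecs = U * np.sqrt(np.clip(w, 0.0, None))     # row v is the vector assigned to vertex v
    r = np.random.default_rng(0).standard_normal(n)
    x = np.sign(vecs @ r)                         # x_v in {-1, +1}
    print(sum((1 - x[u] * x[v]) / 2 for (u, v) in E))   # rounded cut value; typically 4 here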
Bibliography for this lecture
[GW95] Goemans, M. X., & Williamson, D. P. (1995). Improved approximation algorithms
for maximum cut and satisfiability problems using semidefinite programming. J.
ACM, 42(6), 1115–1145. https://doi.org/10.1145/227683.227684
[Gül10] Güler, O. (2010). Foundations of Optimization. Springer. https://link.springer.
com/book/10.1007/978-0-387-68407-9
[Ter76] Terkelsen, F. (1976). The Fundamental Theorem of Algebra. Am. Math. Mon.
https://www.tandfonline.com/doi/abs/10.1080/00029890.1976.11994202
[Fef67] Fefferman, C. (1967). An Easy Proof of the Fundamental Theorem of Algebra. Am.
Math. Mon., 74(7), 854–855. http://www.jstor.org/stable/2315823
[Blå05] Blåsjö, V. (2005). The Isoperimetric Problem. Am. Math. Mon. https://www.
tandfonline.com/doi/pdf/10.1080/00029890.2005.11920227
[GJS76] Garey, M. R., Johnson, D. S., & Stockmeyer, L. (1976). Some simplified NP-
complete graph problems. Theoret. Comput. Sci., 1(3), 237–267. https://doi.org/10.1016/0304-3975(76)90059-1
Changelog
• 02/04/2025: Added note on bounded but not closed domains.
• 02/07/2025: Removed footnote in L1.2, which was meant under the hypothesis of 𝑆
closed (thanks Charis).
These notes are class material that has not undergone formal peer review. The TAs and I are grateful
for any reports of typos.
¹See https://en.wikipedia.org/wiki/Variational_principle for a longer list.
²Dido was the legendary founder—and first queen—of Carthage, a Phoenician city-state that would then
become one of the most powerful empires in the world, and would later be destroyed by Rome. As recounted
by Blåsjö, V. [Blå05], the legend says that “Dido, daughter of the king of Tyre, fled her home after her
brother had killed her husband”. She later “ended up on the north coast of Africa, where she bargained to
buy as much land as she could enclose with an oxhide. So she cut the hide into thin strips, and then she
faced, and presumably solved, the isoperimetric problem.”
³Approx. 150 BC; see https://en.wikipedia.org/wiki/Zenodorus_(mathematician)
⁴NP-completeness of Max-Cut was shown by Garey, M. R., Johnson, D. S., & Stockmeyer, L. [GJS76].