Consider the following optimization problem:

$$\begin{aligned}
\min \quad & x_1^2 + x_2^2 \\
\text{subject to} \quad & (x_1-1)^2 + (x_2-1)^2 \le 1, \\
& (x_1-1)^2 + (x_2+1)^2 \le 1.
\end{aligned}$$

- Sketch the feasible set and the level sets of the objective function. Find the optimal point $x^{\ast}$ and the optimal value $f^{\ast}$.
- Give the KKT conditions. Do there exist Lagrange multipliers $\mu_1^{\ast}$ and $\mu_2^{\ast}$ that satisfy the KKT conditions?
- Write down the Lagrange dual function and the dual problem.
- Solve the dual problem. Does strong duality hold? Does Slater's condition hold?
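As a sanity check on the sketch, the feasible set can be probed numerically. The snippet below is a minimal illustration assuming NumPy; the grid range, resolution, and the small tolerance on the constraints (to catch points lying on the boundary) are arbitrary choices, not part of the problem.

```python
import numpy as np

# Minimal numerical probe of the feasible set
#   {x : (x1-1)^2 + (x2-1)^2 <= 1, (x1-1)^2 + (x2+1)^2 <= 1},
# evaluating f(x) = x1^2 + x2^2 at the feasible grid points.
# Grid range/resolution and tolerance are arbitrary choices.

xs = np.linspace(-1.0, 3.0, 801)          # step 0.005
X1, X2 = np.meshgrid(xs, xs)
g1 = (X1 - 1) ** 2 + (X2 - 1) ** 2 - 1    # first constraint, g1 <= 0
g2 = (X1 - 1) ** 2 + (X2 + 1) ** 2 - 1    # second constraint, g2 <= 0
feasible = (g1 <= 1e-6) & (g2 <= 1e-6)    # tolerance for boundary points

pts = np.column_stack([X1[feasible], X2[feasible]])
print("feasible grid points:\n", pts)
print("objective values:", pts[:, 0] ** 2 + pts[:, 1] ** 2)
```

Plotting the two constraint circles together with a few level circles of the objective (e.g. via matplotlib's `contour`) produces the sketch the first part asks for.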


Consider the following optimization problem:

$$\begin{aligned}
\min \quad & f(x) = x^T Q x \\
\text{subject to} \quad & \Vert x \Vert_2 = 2.
\end{aligned}$$

We would like to solve it by the projected gradient descent method:

$$x_{k+1} = \mathcal{P}(x_k - t \cdot \nabla f(x_k)).$$

- Suppose that $x_k \ne \mathbf{0}$. Find the closed form of $x_{k+1}$.
- Let $Q = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$ from now on. Solve the optimization problem by finding Lagrange multipliers.
- Assume that $\{x_k\}$ is the sequence produced by the projected gradient descent method. Let $x_k = (x_{k,1}, x_{k,2})$ and $y_k = x_{k,2}/x_{k,1}$. Find the connection between $y_{k+1}$ and $y_k$ (assume $x_{k,1} \ne 0$).
- Find a possible value of $t$ so that $\{x_k\}$ will converge.
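For experimenting with the iteration, here is a minimal sketch assuming NumPy. The projection formula $\mathcal{P}(z) = 2z/\Vert z \Vert_2$ (for $z \ne 0$, the closest point on the sphere) and the step size $t = 0.1$ are working assumptions for illustration, not the derivations the problem asks for.

```python
import numpy as np

# Minimal sketch of projected gradient descent for
#   min x^T Q x  subject to ||x||_2 = 2,  with Q = diag(1, 2).
# Assumption: the projection of a nonzero z onto {||x||_2 = 2}
# is 2 * z / ||z||_2, the closest point on the sphere.

Q = np.diag([1.0, 2.0])

def project(z):
    return 2.0 * z / np.linalg.norm(z)   # z != 0 assumed

def pgd(x0, t, iters=200):
    x = project(np.asarray(x0, dtype=float))
    for _ in range(iters):
        grad = 2.0 * Q @ x               # gradient of x^T Q x
        x = project(x - t * grad)
    return x

# t = 0.1 is an arbitrary small step that converges empirically here
x_star = pgd([1.0, 1.0], t=0.1)
print("iterate after 200 steps:", x_star)
print("objective value:", x_star @ Q @ x_star)
```

Printing $y_k = x_{k,2}/x_{k,1}$ inside this loop is a quick way to check the recursion derived in the third part against the implementation.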


- How long does it take you to do this homework?
- If $1$ represents "very easy" and $10$ represents "too hard", how difficult do you feel this homework is? (You can give different points for different problems.)