{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "Name: **Your name here** \n", "UID: **Your student ID num here**" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Homework 7: Dual methods \n", "Upload a pdf version of your solution to gradescope.\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Problem 1: The conjugate and the dual\n", "Consider the \"monotropic\" program\n", "\\begin{align}\n", "\\text{minimize} & \\quad \\|x\\|_\\infty \\\\\n", "\\text{subject to} & \\quad Ax=b . \\nonumber\n", "\\end{align}\n", "\n", "Write this as an unconstrained (or implicitly constrained) problem using the characteristic function of the zero vector $\\chi_0(z) .$ This function is zero if it's argument is zero, and infinite otherwise.\n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Your answer here...\n", " $$\\|x\\|_\\infty + \\chi_0(Ax-b) $$" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### What is the conjugate of $f(z)= \\|z\\|_\\infty$?" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Your answer here...\n", "\n", "$f^*(z) = \\chi_1(z),$ the characteristic function of the 1-norm ball. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### What is the conjugate of $g(z)=\\chi_0(z)$?" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Your answer here...\n", "\n", "The conjugate is defined as\n", "$$g^*(y) = \\max_z y^Tz - \\chi_0(z).$$\n", "The maximizer will always have $z=0,$ and so \n", "$$g^*(y) = 0.$$" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Using the conjugate functions, write down the dual of the monotropic problem." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Your answer here...\n", "\n", "The original problem is\n", "$$\\min_x f(x)+g(Ax-b).$$\n", " The dual is \n", "$$\\max_\\lambda -\\lambda^Tb -f^*(-A^T\\lambda)-g^*(\\lambda).$$\n", "Using our dual formulas we get\n", "$$\\max_\\lambda -\\lambda^Tb -\\chi_1(-A^T\\lambda)-0$$\n", "which simplifies to\n", "\\begin{align}\n", "\\text{minimize}_\\lambda \\quad & \\lambda^Tb\\\\\n", "\\text{subject to} \\quad & \\|A^T\\lambda\\|_1 \\le 1.\n", "\\end{align}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Problem 4: Linear programming\n", "Consider the linear program\n", "\\begin{align*}\n", "\\text{minimize} \\quad & c^T x \\\\\n", "\\text{subject to} \\quad & Ax=b\\\\\n", " & x\\ge 0.\n", "\\end{align*}\n", "\n", "#### Write the optimality conditions for this problem (i.e., the KKT system)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Your answer here...\n", "\n", "We'll start with the Lagrangian\n", "$$c^T x + \\lambda^T(Ax-b)-\\eta^Tx$$\n", "The KKT system is then\n", "\\begin{align}\n", "c+A^T\\lambda-\\eta = 0 &\\quad \\text{Primal optimality}\\\\\n", "Ax=b &\\quad \\text{Primal feasibility}\\\\\n", "x\\ge 0 & \\quad\\text{Primal feasibility}\\\\\n", "\\eta \\ge 0 & \\quad\\text{Dual feasibility}\\\\\n", "x_i \\eta_i =0 & \\quad\\text{Complementary slackness}\\\\\n", "\\end{align}\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Write the Lagrangian for this problem." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Your answer here...\n", "\n", "$$c^T x + \\lambda^T(Ax-b)-\\eta^Tx$$" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Minimize out the primal variables in the Lagrangian, and write the dual formulation of this linear program." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Your answer here...\n", "\n", "The saddle point problem is\n", "$$\\min_x \\max_{\\lambda,\\eta}c^T x + \\lambda^T(Ax-b)-\\eta^Tx$$\n", "Let's minimize out $x$. Take the gradient of the Lagrangian with respect to $x$. At optimality, this is zero, and so \n", "$$c+ A^T\\lambda -\\eta^T=0.$$\n", "Note that, because $\\eta\\ge 0,$ this equation only has a solution when $c+ A^T\\lambda \\ge 0.$ Plugging this into the Lagrangian, we now have \n", " \\begin{align}\n", " \\text{minimize}_{\\lambda} \\quad & -\\lambda^T b\\\\\n", " \\text{subject to} \\quad & c+ A^T\\lambda \\ge 0\n", " \\end{align}\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Problem 5: Sensitivity bound\n", "Consider the problem\n", "\\begin{align}\n", "\\text{minimize} \\quad & f(x) \\\\\n", "\\text{subject to}\\quad & g(x) \\le 0.\n", "\\end{align}\n", "Let $x_0$ be a solution to this problem, and $\\lambda_0$ be the corresponding optimal Lagrange multiplier. Now, define a perturbed problem\n", " \\begin{align}\n", "\\text{minimize}\\quad & f(x) \\\\\n", "\\text{subject to}\\quad & g(x) \\le \\epsilon\n", "\\end{align}\n", "where $\\epsilon$ is a vector. Let $x_\\epsilon$ be a solution to the perturbed problem. Note, if we put large negative values in $\\epsilon$, then the constraint set gets smaller, and we expect the corresponding value of $f(x_\\epsilon)$ to increase. \n", "\n", "Prove the ``sensitivity bound''\n", "$$ f(x_0) - \\lambda_0^T \\epsilon \\le f(x_\\epsilon). $$\n", "This bound shows that the Lagrange multipliers determine how much the objective increases as the vector $\\epsilon$ becomes more negative." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Your solution here\n", "The lagrange multiplier is such that the optimal $x$ satifies\n", "$$ f(x_0) =\\min_x f(x)+ \\lambda_0^T g(x)$$\n", "The value $f(x_\\epsilon)$ satisfies\n", "$$ f(x_\\epsilon) = \\max_\\lambda \\min_x f(x)+ \\lambda^T (g(x)-\\epsilon) \\ge \\min_x f(x)+ \\lambda_0^T (g(x)-\\epsilon) = \\min_x f(x)+ \\lambda_0^T g(x) - \\lambda_0^T \\epsilon = f(x_0)- \\lambda_0^T \\epsilon.$$" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.6" } }, "nbformat": 4, "nbformat_minor": 2 }