Questions tagged [karush-kuhn-tucker]

In mathematical optimization, the Karush–Kuhn–Tucker (KKT) conditions, also known as the Kuhn–Tucker conditions, are first-order necessary conditions for a solution in nonlinear programming to be optimal, provided that some regularity conditions are satisfied. Allowing inequality constraints, the KKT approach to nonlinear programming generalizes the method of Lagrange multipliers, which allows only equality constraints. The system of equations and inequalities corresponding to the KKT conditions is usually not solved directly, except in the few special cases where a closed-form solution can be derived analytically. In general, many optimization algorithms can be interpreted as methods for numerically solving the KKT system of equations and inequalities.

The KKT conditions were originally named after Harold W. Kuhn and Albert W. Tucker, who first published the conditions in 1951. Later scholars discovered that the necessary conditions for this problem had been stated by William Karush in his master's thesis in 1939.

The KKT conditions include stationarity, primal feasibility, dual feasibility, and complementary slackness.
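In the notation most commonly used (minimize $f(x)$ subject to inequality constraints $g_i(x) \le 0$ and equality constraints $h_j(x) = 0$, with multipliers $\mu_i$ and $\lambda_j$), the four conditions can be written compactly as:

```latex
\begin{align*}
&\text{Stationarity:} & \nabla f(x^*) + \sum_i \mu_i \nabla g_i(x^*) + \sum_j \lambda_j \nabla h_j(x^*) &= 0 \\
&\text{Primal feasibility:} & g_i(x^*) \le 0, \qquad h_j(x^*) &= 0 \\
&\text{Dual feasibility:} & \mu_i &\ge 0 \\
&\text{Complementary slackness:} & \mu_i\, g_i(x^*) &= 0
\end{align*}
```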

433 questions
272
votes
8 answers

Please explain the intuition behind the dual problem in optimization.

I've studied convex optimization pretty carefully, but don't feel that I have yet "grokked" the dual problem. Here are some questions I would like to understand more deeply/clearly/simply: How would somebody think of the dual problem? What…
13
votes
3 answers

Are KKT conditions necessary and sufficient for any convex problem?

In Boyd's Convex Optimization, pp. 243, for any optimization problem ... for which strong duality obtains, any pair of primal and dual optimal points must satisfy the KKT conditions i.e. $\mathrm{strong ~ duality} \implies \mathrm{KKT ~ is ~…
12
votes
1 answer

Help me organize these concepts -- KKT conditions and dual problem

This is a long question in which I explain my current understanding of certain ideas. If anyone is interested in reading this and would like to provide any commentary/feedback that may help me understand these ideas more clearly, or that you think…
littleO
  • 48,104
  • 8
  • 84
  • 154
11
votes
1 answer

Simple explanation of Lagrange multipliers with multiple constraints

I'm studying support vector machines and in the process I've bumped into Lagrange multipliers with multiple constraints and Karush–Kuhn–Tucker conditions. I've been trying to study the subject, but still can't get a good enough grasp on it.…
9
votes
1 answer

Determining active constraints in KKT

Suppose there is a constrained optimization problem having inequality constraints. We can solve it using Karush-Kuhn-Tucker conditions. My question is how do we determine which constraints are active and which are inactive? I read it in a KKT post1,…
8
votes
1 answer

How to solve this nonlinear constrained optimization problem

I have the following nonlinear optimization problem: $$ \begin{align*} \text{Find } x \text{ that maximizes } & \frac{1}{\|Ax\|} (Ax)^{\top} y \\ \text{Subject to } & \sum_{i=1}^n x_i = 1 \\ & x_i \geq 0 \; \forall \: i \in \{1\dots n\} \\ …
8
votes
4 answers

Minimize $-\sum\limits_{i=1}^n \ln(\alpha_i +x_i)$

While solving PhD entrance exams I have faced the following problem: Minimize the function $f(x)=- \sum_{i=1}^n \ln(\alpha_i +x_i)$ for fixed $\alpha_i >0$ under the conditions: $\sum_{i=1}^n x_i =1$ and $x_i \ge0$. I was trying to use KKT…
J.E.M.S
  • 2,456
  • 1
  • 17
  • 30
7
votes
1 answer

Big picture behind how to use KKT conditions for constrained optimization

What is the point of KKT conditions for constrained optimization? In other words, how is the best way to use them. I have seen examples in different contexts, but miss a short overview of the procedure, in like one or two sentences. Should we use…
6
votes
2 answers

Linear independence of equality constraint gradients in constraint qualifications

I'm, trying to get an intuitive feel for the various constraint qualifications for KKT points. Most of them seem to rely on the linear independence of $\nabla g_i(x^*)$ where $g_i$ are the equality constraints. The book doesn't really state why. The…
6
votes
1 answer

Understanding Karush-Kuhn-Tucker conditions

Suppose we may want to use the K–T conditions to find the optimal solution to: \begin{array}{cc} \max & (\text { or } \min ) z=f\left(x_{1}, x_{2}, \ldots, x_{n}\right) \\ \text { s.t. } & g_{1}\left(x_{1}, x_{2}, \ldots, x_{n}\right) \leq b_{1}…
6
votes
2 answers

Question about KKT conditions and strong duality

I am confused about the KKT conditions. I have seen similar questions asked here, but I think none of the questions/answers cleared up my confusion. In Boyd and Vandenberghe's Convex Optimization [Sec 5.5.3] , KKT is explained in the following…
5
votes
1 answer

Uniform convergence of objective function implies convergence of minimizers

Let $A,A_h\in M_{n\times m}(\mathbb{R})$ be $n\times m$ matrices with $\Vert (A-A_h)x\Vert\leq h^2 \Vert x\Vert$ for $h\in (0,1)$ and $e\in \mathbb{R}^n$. Now consider for a fixed $\lambda>0$ the optimization…
5
votes
1 answer

Minimization problem with solutions on boundary

I am trying to solve the following optimization problem. I do not care about the particular values of the $x$'s, I just care about the optimal value of the function. \begin{equation} \min_{x\geq0}…
5
votes
1 answer

Minimize $x^T A y$, subject to $ x^Ty\geq 0$, where $A=\Phi^T\Phi$ is symmetric and positive semidefinite.

I try to solve it by KKT conditions. The Lagrangian is $L=x^TAy-\lambda x^Ty$. Its KKT conditions are given by $$ \begin{align} Ay-\lambda y=&0\quad (1)\\ A^Tx-\lambda x=&0\quad (2)\\ \lambda\geq &0\quad (3)\\ x^Ty\geq &0 \quad (4)\\ \lambda…
5
votes
2 answers

Second order optimal condition for this question

I am trying to solve a very simple constrained optimization problem below: P: $\min{x_1 + x_2}$ subject to $ x_1 \geq 0$ $ x_2 \geq 0$ $ (2x_1+x_2)^2 = 4$ By solving the KKT condition, I have the KKT point as…
Lynn
  • 63
  • 3