Questions tagged [karush-kuhn-tucker]

In mathematical optimization, the Karush–Kuhn–Tucker (KKT) conditions, also known as the Kuhn–Tucker conditions, are first-order necessary conditions for a solution in nonlinear programming to be optimal, provided that some regularity conditions are satisfied. Allowing inequality constraints, the KKT approach to nonlinear programming generalizes the method of Lagrange multipliers, which allows only equality constraints. The system of equations and inequalities corresponding to the KKT conditions is usually not solved directly, except in the few special cases where a closed-form solution can be derived analytically. In general, many optimization algorithms can be interpreted as methods for numerically solving the KKT system of equations and inequalities.

The KKT conditions were originally named after Harold W. Kuhn and Albert W. Tucker, who first published them in 1951. Later scholars discovered that the necessary conditions for this problem had already been stated by William Karush in his 1939 master's thesis.

The KKT conditions include stationarity, primal feasibility, dual feasibility, and complementary slackness.
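
For a problem of the form $\min_x f(x)$ subject to $g_i(x) \le 0$ ($i=1,\dots,m$) and $h_j(x)=0$ ($j=1,\dots,p$), with multipliers $\mu_i$ and $\lambda_j$, these conditions read:

$$
\begin{aligned}
&\text{Stationarity:} && \nabla f(x^{*}) + \sum_{i=1}^{m} \mu_i \nabla g_i(x^{*}) + \sum_{j=1}^{p} \lambda_j \nabla h_j(x^{*}) = 0,\\
&\text{Primal feasibility:} && g_i(x^{*}) \le 0, \quad h_j(x^{*}) = 0,\\
&\text{Dual feasibility:} && \mu_i \ge 0,\\
&\text{Complementary slackness:} && \mu_i\, g_i(x^{*}) = 0.
\end{aligned}
$$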

433 questions
0
votes
0 answers

Optimization of a function with an inequality constraint

I have a function to be maximized subject to constraints. I can write the primal Lagrange function as the following: (objective function WITH two constraints in the last two terms) $$L_P = \frac{1}{2} ||\beta||^2 + C\sum_{i=1}^N \xi_i -…
0
votes
1 answer

Eliminate cases before calculating all KKT conditions

I have the following nonlinear program to solve: $$\left\{\begin{matrix} \min & (x-3)^2 + (y-2)^2 \\ s.t. & x^2 +y^2 \leq 5 \\ & x+y\leq 3 \\ & x \geq 0\\ & y\geq 0 \end{matrix}\right.$$ To check the first-order KKT conditions, I should check…
Giiovanna
  • 3,147
  • 3
  • 18
  • 55
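
Not the analytic case analysis this question asks for, but a minimal numerical cross-check of the problem above; it uses SciPy's SLSQP solver, and the starting point and formulation below are illustrative assumptions:

```python
# Numerical cross-check: minimize (x-3)^2 + (y-2)^2
# subject to x^2 + y^2 <= 5, x + y <= 3, x >= 0, y >= 0.
# SLSQP expects inequality constraints written as fun(v) >= 0.
from scipy.optimize import minimize

def objective(v):
    x, y = v
    return (x - 3) ** 2 + (y - 2) ** 2

constraints = [
    {"type": "ineq", "fun": lambda v: 5 - v[0] ** 2 - v[1] ** 2},  # x^2 + y^2 <= 5
    {"type": "ineq", "fun": lambda v: 3 - v[0] - v[1]},            # x + y <= 3
]
bounds = [(0, None), (0, None)]  # x >= 0, y >= 0

res = minimize(objective, x0=[0.5, 0.5], method="SLSQP",
               bounds=bounds, constraints=constraints)
print(res.x, res.fun)  # reported minimizer and objective value
```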
0
votes
0 answers

For which values of $c_1, c_2$, and $c_3$ is $(1, 2, -2)$ a local minimum?

Consider the problem $$\left\{\begin{matrix} \min & x^2 -2xy + 2xz +y^2 + 4yz + z^2 + c_1x + c_2y + c_3z \\ s.t. & g(x,y,z)=-x^2 -4xy - 4xz -2y^2 -4yz - 2z^2 + x -y+z+4 =0 \\ \; & h(x,y,z)=-x-y+z+5 \geq 0 \end{matrix}\right. $$ For which values…
Giiovanna
  • 3,147
  • 3
  • 18
  • 55
0
votes
1 answer

Constrained Optimization: $\min x_1$

Consider the problem $$\left\{\begin{matrix}\min & x_1 \\ s.t. & x_2 \geq 0 \\ \; & x_2 \leq x_1^3 \end{matrix}\right.$$ It is asked to find the minimum and to show why it does not satisfy the KKT conditions. For the given region, I concluded that the…
Giiovanna
  • 3,147
  • 3
  • 18
  • 55
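
For context, a brief sketch of why stationarity fails here: the constraints force $x_1^3 \ge x_2 \ge 0$, so the minimizer is $(0,0)$; writing both constraints as $g_i \le 0$, i.e. $-x_2 \le 0$ and $x_2 - x_1^3 \le 0$, stationarity would require

$$\nabla f(0,0) + \mu_1 \nabla(-x_2)\big|_{(0,0)} + \mu_2 \nabla(x_2 - x_1^3)\big|_{(0,0)} = \begin{pmatrix}1\\0\end{pmatrix} + \mu_1\begin{pmatrix}0\\-1\end{pmatrix} + \mu_2\begin{pmatrix}0\\1\end{pmatrix} = \begin{pmatrix}1\\ \mu_2-\mu_1\end{pmatrix} \neq \begin{pmatrix}0\\0\end{pmatrix},$$

which is impossible in the first component; both active constraint gradients are parallel to the $x_2$-axis, so the usual constraint qualifications (e.g. LICQ) fail at $(0,0)$.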
0
votes
1 answer

Nocedal/Wright: Numerical Optimization, Lemma 12.3.(ii)

In the monograph cited above (1999, 1st edition), the following parametrized system of equations $R:\mathbb{R}^n \times \mathbb{R} \rightarrow \mathbb{R}^n$ is introduced: $$ R(z,t) := \left[ \begin{array}{c} c(z) - tAd \\ Z^T (z - x^* -…
0
votes
0 answers

How to obtain the optimal Lagrange multiplier vectors if the globally optimal solution for a nonconvex QCQP is found?

I am using a black-box solver to solve the following nonconvex QCQP to global optimality. $$ \min_x x^TQ_0x + c^T x \\ s.t. \quad x^TQ_1x+c_1^Tx=b_1 \\ Ax=b \\ l\leq x\leq u $$ where $Q_0$ is an indefinite diagonal matrix and $Q_1$ is positive…
0
votes
1 answer

Why is one of the KKT conditions the same as one of the constraints?

I'm working through an SVM tutorial (from Andrew Ng's Stanford course notes). In the brief coverage of Lagrange duality, the primal optimization problem is stated as $$ \min_{w} \theta_{\mathcal{P}}(w) = \min_{w} \max_{\alpha, \beta \::\: \alpha_{i} \ge…
0
votes
1 answer

Is there any way to make the following function convex?

I need to find the optimal Lagrange multiplier vectors for a quadratic programming problem subject to three quadratic equality constraints and several other linear inequality constraints. I would like to use the KKT conditions for that. The objective…
0
votes
2 answers

Optimization of two "max" functions

Does anyone know how to use Lagrange multipliers (or KKT conditions) to minimize an objective function such as $L(\beta,\beta_0)=\sum_{i=1}^n[a_i(1-y_if(x_i))_+ + b_i(1+y_if(x_i))_+]$, where $a_i$, $b_i$ are all constant and $x_i$, $y_i$ are known? Also…
0
votes
1 answer

Solving an optimization problem with KKT conditions

I've been studying KKT conditions and now I would like to test them on a generated example. My task is to solve the following problem: $$\text{minimize}:\;\;f(x,y)=z=x^2+y^2$$ $$\text{subject to}:\;\;\;\; 38x+32y-24z+964=0$$ I generated a…
jjepsuomi
  • 8,135
  • 12
  • 49
  • 89
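
A sketch of how the stationarity system for the problem above could be set up symbolically; it assumes the intended constraint substitutes $z = x^2 + y^2$, and the SymPy code is illustrative, not part of the original question:

```python
# Lagrange/KKT stationarity for: minimize x^2 + y^2
# subject to 38x + 32y - 24(x^2 + y^2) + 964 = 0  (after substituting z = x^2 + y^2).
import sympy as sp

x, y, lam = sp.symbols("x y lambda", real=True)
f = x**2 + y**2
h = 38*x + 32*y - 24*(x**2 + y**2) + 964

L = f + lam * h
# Stationarity in x and y, together with the equality constraint itself.
equations = [sp.diff(L, x), sp.diff(L, y), h]
for sol in sp.solve(equations, [x, y, lam], dict=True):
    print(sol, f.subs(sol))  # candidate point and its objective value
```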
-1
votes
1 answer

Lagrange duality compared with the Lagrange multiplier method

As we all know, the Lagrange multiplier method says that, in order to find an extremum of $f(x)$ over $x$ subject to $g(x)=0$, one instead finds an extremum of $f(x)+\lambda g(x)$ over $x$ and $\lambda$. Note that here only finding extrema is discussed, not min…
feynman
  • 61
  • 6
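
For context, one way to see the contrast raised in the question: Lagrange duality uses the same Lagrangian $f(x)+\lambda g(x)$, but forms the dual function by minimizing over $x$ first and then maximizing over $\lambda$, which yields at least a lower bound (weak duality):

$$q(\lambda) = \inf_{x}\bigl(f(x) + \lambda\, g(x)\bigr), \qquad \sup_{\lambda} q(\lambda) \;\le\; \inf_{x:\,g(x)=0} f(x),$$

with equality under suitable convexity and regularity assumptions (strong duality).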
-1
votes
1 answer

Can KKT be used for minimization subject to a constant lower bound?

Can KKT be used for $\min g(x)$ s.t. $x \geq \text{constant}$, where the constant is $> 0$? I have read this page ("The Kuhn-Tucker method"), which says that this is an alternative, and slightly simpler, method for dealing with the common case where there are positivity constraints, that…
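
Writing the bound in the question above as a constant $c > 0$ and the constraint as $c - x \le 0$ with multiplier $\mu$, a brief sketch of the resulting KKT system (assuming $g$ is differentiable):

$$g'(x^{*}) - \mu = 0, \qquad \mu \ge 0, \qquad \mu\,(c - x^{*}) = 0, \qquad x^{*} \ge c,$$

so either $x^{*} > c$ with $g'(x^{*}) = 0$, or the bound is active with $x^{*} = c$ and $\mu = g'(c) \ge 0$.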
-1
votes
2 answers

Lagrangian multiplier

Consider the function $$f(x, y)=x^4-y^2$$ and the set $A=\{(x,y)\in \mathbb{R}^2: x^2+y^2=1\}$. Find the Lagrangian equation that determines the extreme points of $f$ on $A$ and compute the solutions of this equation. Characterize the…
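
A sketch for the question above, assuming the task is the standard Lagrange condition $\nabla f = \lambda \nabla g$ on the circle; the SymPy setup below is illustrative, not part of the original question:

```python
# Stationary points of f(x, y) = x^4 - y^2 on the circle g(x, y) = x^2 + y^2 - 1 = 0,
# from the Lagrange condition grad f = lambda * grad g together with the constraint.
import sympy as sp

x, y, lam = sp.symbols("x y lambda", real=True)
f = x**4 - y**2
g = x**2 + y**2 - 1

equations = [sp.diff(f, x) - lam * sp.diff(g, x),
             sp.diff(f, y) - lam * sp.diff(g, y),
             g]
for sol in sp.solve(equations, [x, y, lam], dict=True):
    print(sol, f.subs(sol))  # candidate extreme point and its value
```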