Questions tagged [lagrange-multiplier]

This tag is for the questions on Lagrange multipliers. The method of Lagrange multipliers (named after Joseph Louis Lagrange) provides a strategy for finding the local maxima and minima of a function subject to equality constraints.

When are Lagrange multipliers useful?

One of the most common problems in calculus is that of finding maxima or minima (in general, "extrema") of a function, but it is often difficult to find a closed form for the function being extremized. Such difficulties often arise when one wishes to maximize or minimize a function subject to fixed outside conditions or constraints. The method of Lagrange multipliers is a powerful tool for solving this class of problems without the need to explicitly solve the conditions and use them to eliminate extra variables.

Put more simply, it's usually not enough to ask, "How do I minimize the aluminum needed to make this can?" (The answer to that is clearly "Make a really, really small can!") You need to ask, "How do I minimize the aluminum while making sure the can will hold $10$ ounces of soup?" Or similarly, "How do I maximize my factory's profit given that I only have $\$15,000$ to invest?" Or to take a more sophisticated example, "How quickly will the roller coaster reach the ground assuming it stays on the track?" In general, Lagrange multipliers are useful when some of the variables in the simplest description of a problem are made redundant by the constraints.

The mathematics of Lagrange multipliers:

To find critical points of a function $f(x, y, z)$ on a level surface $g(x, y, z) = C$ (that is, subject to the constraint $g(x, y, z) = C$), we must solve the following system of simultaneous equations: $$\nabla f(x, y, z) = \lambda \nabla g(x, y, z)$$ $$g(x, y, z) = C$$ Remembering that $\nabla f$ and $\nabla g$ are vectors, we can write this as a collection of four equations in the four unknowns $x, y, z,$ and $\lambda$: $$f_x(x, y, z) = \lambda\, g_x(x, y, z)$$ $$f_y(x, y, z) = \lambda\, g_y(x, y, z)$$ $$f_z(x, y, z) = \lambda\, g_z(x, y, z)$$ $$g(x, y, z) = C$$ The variable $\lambda$ is an auxiliary variable called a Lagrange multiplier; we only really care about the values of $x, y,$ and $z$.
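As a concrete sketch of this system (the function and constraint below are chosen purely for illustration, not taken from any question on this page), consider maximizing $f(x,y,z) = x + 2y + 3z$ on the unit sphere $g(x,y,z) = x^2 + y^2 + z^2 = 1$. The four equations can be solved by hand and checked in a few lines of Python:

```python
import math

# Illustrative example: f(x, y, z) = x + 2y + 3z on the unit sphere
# g(x, y, z) = x^2 + y^2 + z^2 = 1.
#
# The Lagrange system  grad f = lambda * grad g  reads
#   1 = 2*lam*x,   2 = 2*lam*y,   3 = 2*lam*z,   x^2 + y^2 + z^2 = 1.
# Solving the first three equations for x, y, z and substituting into the
# constraint gives (1 + 4 + 9) / (4*lam^2) = 1, i.e. lam = +-sqrt(14)/2.

def critical_points():
    points = []
    for lam in (math.sqrt(14) / 2, -math.sqrt(14) / 2):
        x, y, z = 1 / (2 * lam), 2 / (2 * lam), 3 / (2 * lam)
        points.append((x, y, z))
    return points

f = lambda x, y, z: x + 2 * y + 3 * z

# Plug each critical point back into f; the largest value is the
# constrained maximum, the smallest the constrained minimum.
values = [f(*p) for p in critical_points()]
print(max(values), min(values))  # mathematically sqrt(14) and -sqrt(14)
```

Note that the code never needs the value of $\lambda$ for its own sake: it is eliminated once the constraint pins down the critical points, exactly as described above.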

Once we have found all the critical points, we plug them into $~f~$ to see where the maxima and minima occur. The critical points where $~f~$ is greatest are maxima, and the critical points where $~f~$ is smallest are minima.

Note:

Solving the system of equations can be hard! Here are some tricks that may help:

$1.\quad$ Since we don’t actually care what $~λ~$ is, first solve for $~λ~$ in terms of $~x,~ y,~$ and $~z~$ and substitute, to eliminate $~λ~$ from the equations.

$2.\quad$ Try first solving for one variable in terms of the others.

$3.\quad$ Remember that whenever you take a square root, you must consider both the positive and the negative square roots.

$4.\quad$ Remember that whenever you divide an equation by an expression, you must be sure that the expression is not $~0~$. It may help to split the problem into two cases: first solve the equations assuming that a variable is $~0~$, and then solve the equations assuming that it is not $~0~$.
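Tricks $3$ and $4$ can be seen together in a small worked example (again chosen for illustration only): maximize $f(x,y) = xy$ on the circle $x^2 + y^2 = 1$. The system is $y = 2\lambda x$, $x = 2\lambda y$, $x^2 + y^2 = 1$. In the case $x = 0$, the first equation forces $y = 0$, which violates the constraint, so $x \ne 0$ and we may safely divide the two equations, giving $y/x = x/y$, i.e. $y = \pm x$ and four candidate points after taking both square roots:

```python
import math

# Case analysis for f(x, y) = x*y on x^2 + y^2 = 1:
# the case x = 0 is impossible (it forces y = 0, off the circle),
# so dividing the equations gives y = +-x, and the constraint
# 2*x^2 = 1 gives x = +-1/sqrt(2).  All four sign combinations
# are critical points.
candidates = [(sx / math.sqrt(2), sy / math.sqrt(2))
              for sx in (1, -1) for sy in (1, -1)]

f = lambda x, y: x * y
best = max(f(x, y) for x, y in candidates)  # mathematically 1/2

# Sanity check against a brute-force sweep of the circle,
# parameterized as x = cos(t), y = sin(t).
sweep = max(f(math.cos(t), math.sin(t))
            for t in [k * 2 * math.pi / 100000 for k in range(100000)])
print(best, sweep)  # both approximately 0.5
```

Had we divided by $x$ without checking the $x = 0$ case, the conclusion would have been the same here, but in general skipping that step can silently discard valid critical points.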

Reference:

https://en.wikipedia.org/wiki/Lagrange_multiplier

2512 questions
12 votes · 3 answers

The meaning of $\lambda$ in Lagrange Multipliers

This is related to two previous questions which I asked about the history of Lagrange Multipliers and intuition behind the gradient giving the direction of steepest ascent. I am wondering if the constant $\lambda$ in the Lagrange equation $$\nabla…
user142299
12 votes · 2 answers

The least value of $\frac{a}{b^3+54}+\frac{b}{c^3+54}+\frac{c}{a^3+54}$

For every $a,b,c$ non-negative real number such that:$a+b+c=1$ how to find the least value for : $$\frac{a}{b^3+54}+\frac{b}{c^3+54}+\frac{c}{a^3+54}$$
user56821
12 votes · 4 answers

Why is the Lagrange dual function concave?

In a book I'm reading (Convex Optimization by Boyd and Vandenberghe) it says I'm struggling to understand the last sentence. Why can one conclude concavity from having a pointwise infimum of a family of affine functions?
user2820379
11 votes · 1 answer

Simple explanation of Lagrange multipliers with multiple constraints

I'm studying support vector machines and in the process I've bumped into lagrange multipliers with multiple constraints and Karush–Kuhn–Tucker conditions. I've been trying to study the subject, but still can't get a good enough grasp on the subject.…
11 votes · 2 answers

Minima of symmetric functions given a constraint

If $f(x,y,z,\ldots)$ is symmetric in all variables, (i.e $f$ remains the same after interchanging any two variables), and we want to find the extrema of $f$ given a symmetric constraint $g(x,y,z,\ldots)=0$, $$\bf\text{When is it true that the…
nbubis
11 votes · 4 answers

Minimizing a quadratic function subject to quadratic constraints

Okay, so I am attempting to minimize the function $$f(x,y, z) = x^2 + y^2 + z^2$$ subject to the constraint of $$4x^2 + 2y^2 +z^2 = 4$$ I attempted to solve using Lagrange multiplier method, but was unable to find a $\lambda$ that made the system…
11 votes · 3 answers

Geometric interpretation of duality and Slater's condition

I am trying to study about optimization problems, Lagrange duality and related topics. I came across some presentation on the net, which claims to show the geometric interpretation of the duality and Slater's condition for a simple problem with only…
10 votes · 2 answers

Strong duality: When does the optimal primal variable coincide with the primal variable giving the dual function?

I'm considering the inequality-constrained optimization problem of finding $$ x^{\star} = \arg \min_{x} f(x) \;\; \text{s.t.} \;\; h(x) \le 0 $$ which is assumed to have a unique minimizer. The objective $f$ maps $R^{n} \rightarrow R$ and $h$ maps…
10 votes · 4 answers

Maximum of $(1-q_1)(1-q_2)\ldots(1-q_n)$

I'm trying to find the maximum of $(1-q_1)(1-q_2)\ldots(1-q_n)$ where $n\ge 2$, on the set $\{(q_1,\ldots , q_n) : q_1^2+q_2^2+\ldots+q_n^2=1,\ q_i\ge 0 \}$ (with the condition $q_i\ge0$ this is just the upper half of the sphere). This appeared to…
user63697
10 votes · 1 answer

Are elementary symmetric polynomials concave on probability distributions?

Let $S_{n,k}=\sum_{S\subset[n],|S|=k}\prod_{i\in S} x_i$ be the elementary symmetric polynomial of degree $k$ on $n$ variables. Consider this polynomial as a function, in particular a function on probability distributions on $n$ items. It is not…
10 votes · 1 answer

How to set up Lagrangian optimization with matrix constraints

Suppose we have a function $f: \mathbb{R} \to \mathbb{R} $ which we want to optimize subject to some constraint $g(x) \le c$ where $g:\mathbb{R} \to \mathbb{R} $ What we do is that we can set up a…
Boby
9 votes · 1 answer

Proving the AM-GM Inequality with Lagrange Multipliers

Exercise: Let $x_1,x_2,...,x_n$ be real positive numbers. Prove the arithmetic-geometric mean inequality, $(x_1x_2...x_n)^{1/n}\le (x_1+x_2+...+x_n)/n$. Hint: Consider the function $f(x_1,x_2,...,x_n)=(x_1+x_2+...+x_n)/n$ subject to the constraint…
Jeff
9 votes · 3 answers

True or false: $a^2+b^2+c^2 +2abc+1\geq 2(ab+bc+ca)$

Is this inequality true? $a^2+b^2+c^2 +2abc+1\ge2(ab+bc+ca)$, where $a,b,c\gt0$. Can you find a counterexample for this or not?
user85046
9 votes · 1 answer

A lower bound of $\sum_{i=1}^n a_i \sum_{i=1}^n \frac{1}{a_i}$

For fixed $n \ge 2$ , find the maximum of real number $t$ such that $$ \sum_{i=1}^n a_i \sum_{i=1}^n \frac{1}{a_i} \ge n^2 + t \cdot \frac{\displaystyle\sum_{1\le i
9 votes · 1 answer

Maximizing $\int_0^\infty (1+xy')^2e^y dx$ subject to $\int_0^\infty e^ydx = 1$

I'm trying to solve a calculus of variations-type problem, which requires finding the extrema of: $$\int_0^\infty (1+xy')^2e^y dx, $$ subject to the constraint that $\int_0^\infty e^ydx = 1$. Intuitively, $e^y$ is the probability density function of…