Problem: minimise $F(x,y,y')$ over $x$, constrained by $G(x,y,y')=0$.

$$J_1(x,y,y')=\int_{x_0}^{x_1}\bigl[F(x,y,y')+ \lambda (x)\, G(x,y,y')\bigr]\,dx$$

I understand the Euler-Lagrange equation, and I understand Lagrange multipliers in multivariable (i.e. not variational) calculus, but I am having a hard time putting them together: I don't understand the logic behind this equation.

  1. Is this the same $\lambda$ that appears in the multivariable $\nabla f (x,y)= \lambda \nabla g (x,y) $? If so, why is it not also a function of $y$?

  2. By the fundamental lemma of the calculus of variations, doesn't $ \int_{x_0}^{x_1}\lambda (x)\, G(x,y,y')\,dx=\int_{x_0}^{x_1}\lambda (x)\cdot 0\,dx=0$, thus $J=J_1$, and so the constraint has had no effect on the integral?

  3. More importantly, what is the proof that if the minimum of $J_1$ is found (i.e. the Euler-Lagrange equations are satisfied with this new integrand) $G$ will be constrained properly?

    In short: what is the logic behind that integrand?

Side-note: Without constraints, the problem is simply to minimise $$J(x,y,y')=\int_{x_0}^{x_1}F(x,y,y')\,dx$$ i.e. solve the Euler-Lagrange equation for the functional $F$ and the variable $y$. (Side-question: why is $G$ usually given as a function of only $x$ and $y$, and $J$ of $y$? Surely in $J$'s case this severely limits the number of possible functionals $F$ [as the integral must not be a function of $x$ or $y'$], and in $G$'s case there are many possible constraints involving $y'$ that are overlooked?)

  • On 3.: That's not the case; you need to solve the resulting Euler-Lagrange equations *together* with the constraint equation; the constraint equation isn't automatically satisfied. You've introduced one more unknown, $\lambda$, and you have one more equation to determine it, just as in the multivariable case. – joriki Jan 15 '13 at 20:31
  • @joriki Thank you. Why not just solve the 'normal' (nonconstrained) Euler-Lagrange equation and the constraint together (what is special about the 'new' Euler- Lagrange equation?)? I suppose that may be one of the cruces of my bafflement. – Meow Jan 15 '13 at 20:37
  • You can't -- without $\lambda$, you don't have the additional freedom required to satisfy them both. If you just solve the normal Euler-Lagrange equation, the solution will generally not fulfil the constraint. It's exactly analogous to the multivariable case. – joriki Jan 15 '13 at 20:40

1 Answer


I think you are running into problems because you are using incorrect notation. Let me rewrite your original problem: \begin{align*} \text{minimize}\quad & J(y) = \int_{x_0}^{x_1} F(x, y(x), y'(x)) \, \mathrm{d} x \\ \text{subject to}\quad & G(x, y(x), y'(x)) = 0 \quad\text{for all } x \in [x_0,x_1]. \end{align*} Here, $F : \mathbb{R} \times \mathbb R \times \mathbb R \to \mathbb R$ and $G : \mathbb R \times \mathbb R \times \mathbb R \to \mathbb R^n$. Do you see the differences? $J$ depends only on the function $y$, whereas the integrand $F$ and the constraint $G$ depend on real numbers.

Now (if a constraint qualification is satisfied), you get a multiplier $\lambda : [x_0, x_1] \to \mathbb R^n$ (compare with section 6.2 in your link: you get a multiplier for each constraint, that is, for each $x$), such that the derivative of the Lagrangian $$J(y) + \int_{x_0}^{x_1} G(x, y(x), y'(x)) \, \lambda(x) \, \mathrm{d} x$$ with respect to $y$ is zero (that is, the derivative of your Lagrangian w.r.t. the optimization variable). Now, you can continue as in the derivation of the Euler-Lagrange equation.
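Writing $L = F + \lambda\, G$ for the augmented integrand, that derivation yields (stated here for concreteness, in the notation above):

```latex
\frac{\partial}{\partial y}\bigl(F + \lambda\, G\bigr)
- \frac{\mathrm{d}}{\mathrm{d}x}\,\frac{\partial}{\partial y'}\bigl(F + \lambda\, G\bigr) = 0,
\qquad
G(x, y(x), y'(x)) = 0 ,
```

to be solved *together* with the constraint for the two unknowns $y$ and $\lambda$.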

@2.: Yes, this is correct, but the derivative of $J$ w.r.t. $y$ does not equal the derivative of $J_1$ w.r.t. $y$.
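To see why the derivatives differ even though the values agree on the constraint set, compare the first variations (a sketch in the notation above):

```latex
\delta J_1 = \delta J + \int_{x_0}^{x_1} \lambda(x)
\left( \frac{\partial G}{\partial y}\,\delta y
     + \frac{\partial G}{\partial y'}\,\delta y' \right) \mathrm{d} x .
```

Even when $G = 0$ along the current $y$, the partial derivatives of $G$ need not vanish, so the second term contributes for general variations $\delta y$.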

@3.: As joriki already said, you have to solve the resulting Euler-Lagrange equations together with the constraint. In other words: the Euler-Lagrange equations depend on $\lambda$. Once you have fixed $\lambda$, they have a unique solution $y$ (depending on $\lambda$). It remains to choose $\lambda$ such that the corresponding $y$ satisfies the constraints (this is reasonable, since you have as many constraints as degrees of freedom in $\lambda$).
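The mechanics can be checked symbolically. The snippet below is a toy example (not from the question): take $F = y'^2$ with the hypothetical pointwise constraint $G = y - \sin x = 0$. The Euler-Lagrange equation of $F + \lambda G$, combined with the constraint, then determines $\lambda$.

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')
lam = sp.Function('lam')  # the multiplier lambda(x)

# hypothetical concrete constraint: G(x, y) = y - sin(x) = 0
g = sp.sin(x)

# augmented integrand L = F + lam*G with F = y'^2
L = sp.diff(y(x), x)**2 + lam(x) * (y(x) - g)

# Euler-Lagrange equation: dL/dy - d/dx (dL/dy') = 0
EL = sp.diff(L, y(x)) - sp.diff(sp.diff(L, sp.diff(y(x), x)), x)

# enforce the constraint y = sin(x), then solve for lam(x)
EL_on_constraint = EL.subs(y(x), g).doit()
lam_sol = sp.solve(sp.Eq(EL_on_constraint, 0), lam(x))[0]
print(lam_sol)  # -2*sin(x)
```

Here $y$ is pinned down by the constraint alone, and the multiplier $\lambda(x) = -2\sin x$ absorbs the residual of the unconstrained Euler-Lagrange equation, exactly as in the finite-dimensional case.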

  • Thanks for the thorough answer, but I have one further question: if we know $G(x,y(x),y'(x))=0$, why don't we just use this knowledge: insert it into $J_1$ and make our lives easier? This doesn't seem incorrect, but implies that $J=J_1$, which is obviously untrue as the unconstrained problem is very different from the constrained one. – Meow Jan 22 '13 at 18:54
  • Your $J_1$ is the Lagrangian and $J$ is the objective. Note that $J_1$ depends on your optimization variable and additionally on your multiplier $\lambda$. You stated correctly that $J_1(y, \lambda) = J(y)$ iff $y$ satisfies the constraints. However, the multiplier rule tells you to compute the zeros of the derivative of $J_1$. But this differs from the derivative of $J$. – gerw Jan 23 '13 at 11:33