Problem: minimise $\int_{x_0}^{x_1}F(x,y,y')\,dx$ over functions $y(x)$, subject to the constraint $G(x,y,y')=0$. The standard prescription is to extremise the augmented functional

$$J_1[y]=\int_{x_0}^{x_1}\bigl(F(x,y,y')+\lambda(x)\,G(x,y,y')\bigr)\,dx$$

I understand the Euler-Lagrange equation, and Lagrange multipliers in multivariable (i.e. non-variational) calculus, but I am having a hard time putting them together: I don't understand the logic behind this equation.

Is this the same $\lambda$ that appears in the multivariable $\nabla f (x,y)= \lambda \nabla g (x,y) $? If so, why is it not also a function of $y$?
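To make concrete the finite-dimensional picture I have in mind, here is a small sketch using `scipy.optimize.minimize`; the particular $f$ and $g$ are just an example I made up, and the single scalar $\lambda$ is recovered numerically from $\nabla f = \lambda \nabla g$:

```python
import numpy as np
from scipy.optimize import minimize

# Finite-dimensional analogue: minimise f(x,y) = x^2 + y^2
# subject to g(x,y) = x + y - 1 = 0.
f = lambda p: p[0]**2 + p[1]**2
g = lambda p: p[0] + p[1] - 1

res = minimize(f, [0.0, 0.0], constraints={"type": "eq", "fun": g})
x, y = res.x                      # expect (0.5, 0.5)

# At the optimum, grad f = lambda * grad g for a single scalar lambda:
grad_f = np.array([2 * x, 2 * y])  # analytic gradient of f
grad_g = np.array([1.0, 1.0])      # analytic gradient of g
lam = grad_f[0] / grad_g[0]        # lambda comes out as approximately 1.0
assert np.allclose(grad_f, lam * grad_g, atol=1e-6)
```

Here $\lambda$ is one number for the whole problem, which is what makes the jump to a *function* $\lambda(x)$ confusing to me.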

By the fundamental lemma of the calculus of variations, doesn't $\int_{x_0}^{x_1}\lambda(x)\,G(x,y,y')\,dx=\int_{x_0}^{x_1}\lambda(x)\cdot 0\,dx=0$, thus $J=J_1$, and so the constraint has had no effect on the integral?

More importantly, what is the proof that if a minimum of $J_1$ is found (i.e. the Euler-Lagrange equations are satisfied with this new integrand), the constraint $G=0$ *will* be satisfied? In short: what is the logic behind that integrand?
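For concreteness, by "the Euler-Lagrange equations with this new integrand" I mean (assuming I have set this up correctly) the stationarity conditions

$$\frac{\partial}{\partial y}\bigl(F+\lambda G\bigr)-\frac{d}{dx}\,\frac{\partial}{\partial y'}\bigl(F+\lambda G\bigr)=0,\qquad G(x,y,y')=0,$$

with $\lambda(x)$ treated as an additional unknown to be solved for alongside $y(x)$.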

*Side-note:*
Without constraints, the problem is simply to minimise $$J[y]=\int_{x_0}^{x_1}F(x,y,y')\,dx,$$
i.e. to solve the Euler-Lagrange equation for the integrand $F$ and the unknown function $y$. *(Side-question: why is $G$ usually given as a function of only $x$ and $y$, while $J$ is written as a function of $y$ alone? Surely in $J$'s case this severely limits the admissible integrands $F$ [as the integral must then not depend on $x$ or $y'$], and in $G$'s case there are many possible constraints involving $y'$ that are overlooked?)*
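To illustrate the distinction I am asking about: a constraint of the kind usually written, depending only on $x$ and $y$ (a holonomic constraint), might be $$G(x,y)=x^2+y^2-1=0,$$ whereas a constraint involving the derivative, such as $$G(x,y,y')=y'^2-1=0,$$ seems equally natural to me yet is usually not treated. (These two examples are my own, just to fix ideas.)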