How do we apply the Lagrange multiplier method to a problem with more than one constraint? Here is my work so far, for the optimization problem below:

Basically the same way, except that instead of having to deal separately with points where the gradient of the constraint vanishes, you have to check every point where the gradients of all the constraints are linearly dependent (if you think about it, that's a logical generalization of the case of a single constraint). – Jonathan Y. Dec 09 '13 at 17:09

Can you elaborate a little more, please? – Niousha Dec 09 '13 at 17:16
1 Answer
The Lagrange multiplier theorem states that if $\Omega\subset\mathbb{R}^n$ is open, $f:\Omega\to\mathbb{R}$ is differentiable, and, for some $m\leq n$, the functions $g_1,g_2,\ldots,g_m:\Omega\to\mathbb{R}$ are continuously differentiable, then at every local extremum $a\in\Omega$ of $f$ subject to the constraints $g_i=0$ for $i=1,2,\ldots,m$, either there exist $\lambda_1,\lambda_2,\ldots,\lambda_m$ such that $\nabla f(a) = \sum_{i=1}^m\lambda_i\nabla g_i(a)$, or the gradients $\{\nabla g_i(a): i=1,2,\ldots,m\}$ are linearly dependent.
Therefore, to find local extrema of $f$ subject to the constraints $g_i=0$, we need to treat separately the points where $\{\nabla g_i(a): i=1,2,\ldots,m\}$ are linearly dependent, and solve the system of $n+m$ equations in the $n+m$ unknowns $x_1,\ldots,x_n,\lambda_1,\ldots,\lambda_m$ given by $$\frac{\partial f}{\partial x_j}=\sum_{i=1}^m\lambda_i\frac{\partial g_i}{\partial x_j},\quad j=1,\ldots,n;\qquad g_i = 0,\quad i=1,\ldots,m.$$
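To make the recipe concrete, here is a small sketch using sympy on an illustrative problem of my own choosing (not the one from the question): minimize $f=x^2+y^2+z^2$ subject to the two constraints $g_1=x+y+z-1=0$ and $g_2=x-y=0$. The gradients $\nabla g_1=(1,1,1)$ and $\nabla g_2=(1,-1,0)$ are linearly independent everywhere, so only the $n+m=5$-equation system needs solving:

```python
import sympy as sp

# Unknowns: n = 3 variables plus m = 2 multipliers
x, y, z, l1, l2 = sp.symbols('x y z lambda1 lambda2', real=True)

f = x**2 + y**2 + z**2   # objective (example of my choosing)
g1 = x + y + z - 1       # constraint g1 = 0
g2 = x - y               # constraint g2 = 0

# n equations: df/dx_j - sum_i lambda_i * dg_i/dx_j = 0
eqs = [sp.diff(f, v) - l1*sp.diff(g1, v) - l2*sp.diff(g2, v)
       for v in (x, y, z)]
# m equations: the constraints themselves
eqs += [g1, g2]

sols = sp.solve(eqs, [x, y, z, l1, l2], dict=True)
print(sols)
# [{x: 1/3, y: 1/3, z: 1/3, lambda1: 2/3, lambda2: 0}]
```

The single solution $(1/3,1/3,1/3)$ is the constrained minimum; note that $\lambda_2=0$, meaning the second constraint's gradient contributes nothing to $\nabla f$ at that point.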

I see. Thanks a lot. I think I understand the algorithm, but now I have a problem with taking derivatives! – Niousha Dec 09 '13 at 17:35