Questions tagged [optimization]

Optimization is the process of choosing the "best" value among possible values. Optimization problems are often formulated as the minimization or maximization of functions, with or without constraints.

In mathematics, computer science, economics, and management science, mathematical optimization (alternatively, optimization or mathematical programming) is the selection of a best element (with regard to some criterion) from some set of available alternatives.

An optimization problem can be represented in the following way: given a function $f:A\to\mathbb{R}$ from some set $A$ to the real numbers, we want to find an element $x_0\in A$ such that $f(x_0)\le f(x)$ for all $x \in A$ ("minimization") or such that $f(x_0)\ge f(x)$ for all $x \in A$ ("maximization").
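When $A$ is a finite set, the minimizer in this definition can be found by exhaustive comparison; a minimal Python sketch (the particular set and objective below are illustrative assumptions):

```python
def argmin(f, A):
    """Return an element x0 of A with f(x0) <= f(x) for all x in A."""
    return min(A, key=f)

# Illustrative instance: A is a grid in [-5, 5], f(x) = (x - 2)^2.
A = [x / 10 for x in range(-50, 51)]
f = lambda x: (x - 2) ** 2

x0 = argmin(f, A)
print(x0)  # 2.0, since f(2.0) = 0 is the smallest value on this grid
```

Maximization is the same computation with `max` in place of `min` (or with `-f`).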

20164 questions
4 votes · 0 answers

For cubic $P(x)$ with a root in $[0,1]$, find the smallest $C$ such that $\int_0^1 |P(x)|dx\leq C\max_{x\in[0,1]}|P(x)|$

Find the smallest constant $C$ such that for every polynomial $P(x)$ of degree $3$ with a root in $[0,1]$, $$\int_0^1 |P(x)|dx\leq C\max_{x\in[0,1]}|P(x)|.$$ Here's my rough work. Write $P(x)=a_3x^3+a_2x^2+a_1x+a_0$. Then $$\int_0^1…
user706791
4 votes · 2 answers

Maximizing distance between points

I asked a similar question on SciComp, but it is a little outside that site's domain, so I thought I'd give it a try here as well. Given n points, I would like to place them in a periodic box (periodic such that the distance between two points "wraps around"…
Nick
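The "wrap-around" distance this question alludes to is commonly taken to be the minimum-image distance in a periodic box; a sketch assuming a box of side length 1 (the helper name and box size are illustrative):

```python
import numpy as np

def periodic_distance(p, q, box=1.0):
    """Distance between p and q under periodic boundary conditions:
    each coordinate difference is wrapped to the nearest periodic image."""
    d = np.abs(np.asarray(p) - np.asarray(q)) % box
    d = np.minimum(d, box - d)  # take the shorter of the two ways around
    return float(np.sqrt((d ** 2).sum()))

# In a unit box, 0.05 and 0.95 are only 0.1 apart through the boundary.
print(periodic_distance([0.05], [0.95]))  # approximately 0.1, not 0.9
```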
4 votes · 2 answers

How to compute derivative with Hadamard product?

Let $\mathbf{x}$, $\mathbf{y}$ and $\mathbf{z}$ be $n$-dimensional column vectors, and let $$f = \mathbf{x}\circ \mathbf{y} \circ\mathbf{z}.$$ Here $\circ$ is the element-wise Hadamard product. Then how does one compute the gradient $\frac{\partial f…
jason
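Since $f = \mathbf{x}\circ\mathbf{y}\circ\mathbf{z}$ acts elementwise ($f_i = x_i y_i z_i$ depends only on $x_i$), a plausible answer is that the Jacobian with respect to $\mathbf{x}$ is the diagonal matrix $\operatorname{diag}(\mathbf{y}\circ\mathbf{z})$; a quick numerical check (the random test vectors are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
x, y, z = rng.standard_normal((3, n))

# Claimed Jacobian of f = x * y * z (elementwise) with respect to x.
J = np.diag(y * z)

# Compare against a central finite-difference Jacobian.
eps = 1e-6
J_fd = np.empty((n, n))
for j in range(n):
    e = np.zeros(n); e[j] = eps
    J_fd[:, j] = ((x + e) * y * z - (x - e) * y * z) / (2 * eps)

print(np.allclose(J, J_fd, atol=1e-6))  # True
```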
4 votes · 3 answers

Why Is the Unit Circle, Including Its Interior, Not a Polyhedron?

I'm in a linear programming class and I'm trying to understand why the unit circle with its interior is not a polyhedron. I know there is a proof by contradiction that the unit circle not including its interior is not a polyhedron. But I'm wondering…
4 votes · 0 answers

Subgradient of a composition

Are there general rules or references for the subgradient of a composition? For example, consider the composition $f(x)=h(g(x))$ where $h$ is convex and nondecreasing and $g$ is convex; then the subgradient of $f$ can be computed as in slide 15. However, what…
John Smith
4 votes · 1 answer

Proving optimal solution for Linear Programming

Suppose we have a standard optimization problem. $A'$ is an optimal solution to the problem. If we add a constraint to our original optimization problem, and $A'$ satisfies the new constraint, then is $A'$ still optimal for the new optimization…
4 votes · 1 answer

Extrema of $f(x_1,\ldots,x_n) = (1+x_1)\cdots(1+x_n)$

Consider $f: (\mathbb R_+)^n \to \mathbb R$ defined by $$f(x_1,\ldots,x_n) = (1+x_1)\cdots(1+x_n)$$ I am looking for local or global extrema under the condition $x_1\cdots x_n=a^n, a >0$ using the method of Lagrange multipliers. First of all I…
user699386
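For this constrained problem, a known inequality already identifies the answer the multiplier method should produce; a sketch (via the superadditivity of the geometric mean, sometimes called Huygens' inequality, rather than the Lagrange route the question asks about). For $x_1,\dots,x_n>0$ with $x_1\cdots x_n = a^n$ and $n\ge 2$,
$$\prod_{i=1}^{n}(1+x_i)\ \ge\ \Bigl(1+\Bigl(\prod_{i=1}^{n}x_i\Bigr)^{1/n}\Bigr)^{n} = (1+a)^n,$$
with equality iff $x_1=\dots=x_n=a$. So $f$ attains its global minimum $(1+a)^n$ at $(a,\dots,a)$, while it is unbounded above on the constraint set (e.g. take $x_1=t$, $x_2=a^2/t$, $x_3=\dots=x_n=a$ and let $t\to\infty$), so there is no maximum.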
4 votes · 2 answers

Derivative of the Prox / Proximal Operator

Consider a proximal operator, $$ \operatorname{Prox}_{ \lambda f( u ) } \left( x \right) = \arg \min_{u} \lambda f \left( u \right) + \frac{1}{2} {\left\| u - \mu x \right\|}_{2}^{2}.$$ What is the partial derivative of the proximal operator w.r.t.…
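For the special case $f(u)=\lvert u\rvert$ with $\mu = 1$, the prox has a well-known closed form (soft thresholding), which makes its almost-everywhere derivative easy to see: it is $1$ where $\lvert x\rvert > \lambda$ and $0$ where $\lvert x\rvert < \lambda$. A sketch (the choice of $f$ and the test values are illustrative assumptions, not the general answer):

```python
import numpy as np

def prox_l1(x, lam):
    """prox of lam * |u| at x: sign(x) * max(|x| - lam, 0), applied elementwise."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

lam = 0.5
x = np.array([-2.0, -0.2, 0.3, 1.5])
# Entries with |x| <= lam are set to 0; the rest are shrunk toward 0 by lam.
print(prox_l1(x, lam))
```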
4 votes · 1 answer

Why does the Hamilton-Jacobi-Bellman equation lead to an optimal control law?

I am an electrical engineer, and currently I am reading some literature on control engineering. I read the following assertion, which is presented without any proof: Let $\dot x = f(x,u)$ be an autonomous control system with smooth $f:U \times…
4 votes · 1 answer

Minimize Discrete Integral

Assume I have discrete samples of a function and thereby estimate a continuous function $f(x)$ by connecting the samples linearly. It could look, for example, like this: [figure omitted] Is there an easy way to minimize $b - a$…
man zet
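The piecewise-linear estimate the question describes can be built directly with `numpy.interp`; a sketch (the sample values below are illustrative):

```python
import numpy as np

# Discrete samples (xs, ys) define a piecewise-linear estimate f of the
# underlying function: between consecutive samples, f interpolates linearly.
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([0.0, 1.0, 0.5, 2.0])

def f(x):
    return np.interp(x, xs, ys)

print(f(0.5))  # 0.5, halfway along the first linear segment
```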
4 votes · 1 answer

Questions about different formulations of the Taylor expansion terms

I'm reading Numerical Optimization by Nocedal and Wright, and was playing around with the matrix notation of the Taylor expansion, about which I have two questions. We have a 3-times differentiable function $f\,:\,\mathbb{R}^n\rightarrow…
4 votes · 4 answers

Range of $f(x,y)=\frac{4x^2+(y+2)^2}{x^2+y^2+1}$

I am trying to find the range of this function: $$f(x,y)=\frac{4x^2+(y+2)^2}{x^2+y^2+1}$$ So I think that means I have to find minima and maxima. Using partial derivatives gets messy, so I was wondering if I could do some change of variables to…
Dude111
4 votes · 2 answers

How to solve this? (Argument of a complex number in the complex plane)

Let $z \in \mathbb{C}$ be such that $|z-10i| = 6$. $\newcommand{\Arg}{\operatorname{Arg}}$ Let $\theta = \Arg(z)$. Find the maximum and minimum values of $8\sin \theta + 6\cos \theta$. My trial: trying to solve with a picture and graph, I considered $8i\sin…
4 votes · 1 answer

Verification for maximum principle

Given optimal control problem $$ \dot x = f(t,x(t),u(t)), \quad x(0) = x_0,\\ J(u) = \int_0^T f^0(t,x(t),u(t))dt \to \min, $$ we can apply Pontryagin's maximum principle to get a necessary condition for an optimum. My question is when maximum…
niyazets
4 votes · 1 answer

Does generalizing the conjugate gradient like this work?

Given $A \in \mathbb{R}^{n \times n}$, an SPD matrix, and a vector $b \in \mathbb{R}^n$, it is possible to solve the problem $$\min_x \| Ax - b\|$$ with the conjugate gradient method. Its algorithm basically is $r_0 = b - Ax_0\\ p_0 = r_0\\ k =…
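For reference, a standard conjugate-gradient iteration along the lines the question begins to sketch ($r_0 = b - Ax_0$, $p_0 = r_0$, …); the SPD test matrix below is an illustrative construction:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Solve A x = b for SPD A by the conjugate gradient method."""
    x = np.zeros_like(b, dtype=float) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x          # residual r_0 = b - A x_0
    p = r.copy()           # first search direction p_0 = r_0
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:  # stop when the residual is small
            break
        p = r + (rs_new / rs) * p  # A-conjugate update of the direction
        rs = rs_new
    return x

# Small SPD example: M^T M + I is symmetric positive definite for any M.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M.T @ M + np.eye(5)
b = rng.standard_normal(5)
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b))  # True
```

For SPD nonsingular $A$, solving $Ax=b$ and minimizing $\|Ax-b\|$ coincide, which is why the question's formulation makes sense here.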