Questions tagged [proximal-operators]

Use this tag for questions related to the Proximal Operator / Proximal Mapping. It may also be used for questions about the Proximal Gradient Method and the Alternating Direction Method of Multipliers (ADMM).

Dedicated to the Proximal Operator.

155 questions
21
votes
8 answers

Derivative of the nuclear norm

The nuclear norm is defined in the following way $$\|X\|_*=\mathrm{tr} \left(\sqrt{X^T X} \right)$$ I'm trying to take the derivative of the nuclear norm with respect to its argument $$\frac{\partial \|X\|_*}{\partial X}$$ Note that $\|X\|_*$ is a…
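For reference, a minimal NumPy sketch (function name mine) of the fact that, at a full-rank $X$ with thin SVD $X = U\Sigma V^T$, the gradient of the nuclear norm is $U V^T$; at rank-deficient points this is only one subgradient.

```python
import numpy as np

def nuclear_norm_and_grad(X):
    """Return ||X||_* and the gradient U @ Vt, valid where X has full rank.

    Where some singular values are zero the nuclear norm is not
    differentiable and U @ Vt is only one element of the subdifferential.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return s.sum(), U @ Vt

# finite-difference sanity check at a random (almost surely full-rank) point
X = np.random.randn(5, 3)
val, G = nuclear_norm_and_grad(X)
E = 1e-6 * np.random.randn(5, 3)
print(np.isclose(nuclear_norm_and_grad(X + E)[0] - val, np.sum(G * E), atol=1e-8))
```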
17
votes
3 answers

Derivation of Soft Thresholding Operator / Proximal Operator of $ {L}_{1} $ Norm

I was going through the derivation of soft thresholding at http://dl.dropboxusercontent.com/u/22893361/papers/Soft%20Threshold%20Proof.pdf. It says the three unique solutions for $\arg \min_x \|x-b\|_2^2 + \lambda\|x\|_1$ are given…
user34790
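A minimal NumPy sketch of the resulting soft-thresholding operator (names mine). It uses the common $\frac{1}{2}\|x-b\|_2^2 + \lambda\|x\|_1$ convention; with the $\|x-b\|_2^2$ form quoted above the threshold becomes $\lambda/2$.

```python
import numpy as np

def soft_threshold(b, lam):
    """prox of lam*||.||_1 at b: componentwise sign(b) * max(|b| - lam, 0)."""
    return np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)

# sanity check: the prox output should beat nearby points on the objective
b, lam = np.array([1.5, -0.2, 0.7, -3.0]), 0.5
x_star = soft_threshold(b, lam)
obj = lambda x: 0.5 * np.sum((x - b) ** 2) + lam * np.sum(np.abs(x))
print(all(obj(x_star) <= obj(x_star + 1e-3 * np.random.randn(b.size)) for _ in range(100)))
```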
14
votes
2 answers

The Proximal Operator of the $ {L}_{\infty} $ (Infinity Norm)

What is the proximal operator of the $ \left\| x \right\|_{\infty} $ norm: $$ \operatorname{Prox}_{\lambda \left\| \cdot \right\|_{\infty}} \left( v \right) = \arg \min_{x} \frac{1}{2} \left\| x - v \right\|_{2}^{2} + \lambda \left\| x…
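One standard route, sketched below with names of my own choosing, goes through the Moreau decomposition: since the conjugate of $\lambda \left\| \cdot \right\|_{\infty}$ is the indicator of the $\ell_1$ ball of radius $\lambda$, the prox is $v$ minus the Euclidean projection onto that ball, computed here with the usual sort-based projection.

```python
import numpy as np

def project_l1_ball(v, radius):
    """Euclidean projection onto {x : ||x||_1 <= radius} (sort-based method)."""
    if np.sum(np.abs(v)) <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]            # |v| sorted in decreasing order
    css = np.cumsum(u)
    j = np.arange(1, v.size + 1)
    rho = np.nonzero(u - (css - radius) / j > 0)[0][-1]
    theta = (css[rho] - radius) / (rho + 1)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def prox_linf(v, lam):
    """prox of lam*||.||_inf via Moreau: v minus its projection onto lam * (L1 ball)."""
    return v - project_l1_ball(v, lam)

# sanity check against random perturbations of the objective
v, lam = np.random.randn(5), 0.8
x_star = prox_linf(v, lam)
obj = lambda x: 0.5 * np.sum((x - v) ** 2) + lam * np.max(np.abs(x))
print(all(obj(x_star) <= obj(x_star + 1e-3 * np.random.randn(5)) for _ in range(100)))
```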
12
votes
2 answers

What Is the Difference Between Interior Point Methods, Active Set Methods, Cutting Plane Methods and Proximal Gradient Methods?

Can you help me understand the basic differences between Interior Point Methods, Active Set Methods, Cutting Plane Methods, and Proximal Gradient Methods? What is the best method, and why? What are the pros and cons of each method? What is the geometric intuition…
11
votes
4 answers

Proof of the Moreau Decomposition Property of Proximal Operators

Given the prox operator, i.e. $$ \operatorname{prox}_{ h \left( \cdot \right) } \left( x \right) = \arg \min_{u} h \left( u \right) + \frac{1}{2} {\left\| u - x \right\|}_{2}^{2} $$ the Moreau decomposition property says that $$ x =…
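A quick numerical illustration of the identity $x = \operatorname{prox}_h(x) + \operatorname{prox}_{h^*}(x)$, using $h = \lambda \|\cdot\|_1$, whose conjugate is the indicator of the box $\{z : \|z\|_\infty \le \lambda\}$, so its prox is just a clip:

```python
import numpy as np

# Moreau decomposition check for h = lam*||.||_1; h* is the indicator of the
# box {z : ||z||_inf <= lam}, so prox_{h*} is the projection onto that box.
lam = 0.7
x = np.random.randn(6)
prox_h = np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)   # soft threshold
prox_h_conj = np.clip(x, -lam, lam)                      # clip onto the box
print(np.allclose(x, prox_h + prox_h_conj))              # True
```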
11
votes
3 answers

What Is the Motivation of Proximal Mapping / Proximal Operator?

For a convex function $h$, its proximal operator is defined as: $$\operatorname{prox}_h(x)=\arg\min_u \Big(h(u)+\frac{1}{2}\|u-x\|^2\Big)$$ Can anyone provide an intuitive explanation/motivation of proximal mapping?
8
votes
4 answers

Firm Non Expansiveness in the Context of Proximal Mapping / Proximal Operators

$\newcommand{\prox}{\operatorname{prox}}$ Probably the most remarkable property of the proximal operator is the fixed point property: the point $x^*$ minimizes $f$ if and only if $x^* = \prox_f(x^*)$. So, indeed, $f$ can be minimized by finding a fixed…
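This fixed-point property is what the proximal point method iterates; a toy sketch (names mine) with $f = \|\cdot\|_1$, whose minimizer, the origin, is exactly the fixed point of the prox:

```python
import numpy as np

def prox_l1(x, t):
    """prox of t*||.||_1 (soft threshold)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# proximal point iteration x_{k+1} = prox_{t f}(x_k) for f = ||.||_1;
# the iterates shrink toward the minimizer x = 0, a fixed point of the prox
x, t = np.random.randn(5), 0.3
for _ in range(100):
    x = prox_l1(x, t)
print(x)                                 # the zero vector
print(np.allclose(prox_l1(x, t), x))     # x is a fixed point: True
```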
8
votes
1 answer

Weak Jacobian of Proximal Operator

Given a convex function $g(x):\mathbb{R}^n\rightarrow \mathbb{R}$, the proximal operator of $g$ is defined as $P_g(x)=\underset{u}{\arg\min}\ \frac{1}{2}\|x-u\|_2^2+g(u)$. Since $g(x)$ is convex, the proximal is a singleton, i.e., there is a…
8
votes
1 answer

Proximal Operator / Proximal Mapping for Composition of Functions

Suppose I have a convex function $f(x)$ for which I can easily compute the proximal mapping $\operatorname{prox}_f(z) = \arg\min_{x} f(x) + \frac{1}{2}\|x-z\|^2_2$. Is there a simple expression for the proximal mapping of $f(\mathcal{A}(x))$ where $\mathcal{A}(x)$…
David P
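A closed form does exist in the special case $\mathcal{A}(x) = Ax$ with $A A^T = \nu I$ (a semi-orthogonal / tight-frame operator); a sketch of that case (names mine), checked with $f = \|\cdot\|_1$ against random perturbations:

```python
import numpy as np

def prox_l1(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_f_of_A(z, A, nu, prox_f):
    """prox of x -> f(A x) when A A^T = nu * I:
    prox_{f o A}(z) = z + (1/nu) * A^T (prox_{nu f}(A z) - A z)."""
    Az = A @ z
    return z + (A.T @ (prox_f(Az, nu) - Az)) / nu

# example: A = sqrt(2) * (random orthogonal matrix), so A A^T = 2 I
Q, _ = np.linalg.qr(np.random.randn(4, 4))
A, nu = np.sqrt(2.0) * Q, 2.0
z = np.random.randn(4)
x_star = prox_f_of_A(z, A, nu, prox_l1)

obj = lambda x: np.sum(np.abs(A @ x)) + 0.5 * np.sum((x - z) ** 2)
print(all(obj(x_star) <= obj(x_star + 1e-3 * np.random.randn(4)) for _ in range(100)))
```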
7
votes
1 answer

The Proximal Operator of the $ {L}_{1} $ Norm Function

Write down explicitly the optimal solutions to the Moreau-Yosida regularization of the function $f(x)=\lambda\|x\|_1$, where $f:\mathbb{R}^n\to(-\infty,+\infty]$. I have found that the answer is $x_i=\operatorname{sgn}(x_i)\max\{|x_i|-1,0\}$. Here is my attempt to…
7
votes
3 answers

$ {L}_{1} $ Regularized Unconstrained Optimization Problem

I am encountering an unconstrained minimization problem. The problem is of the form $$\min_x \frac{\|x-a\|_2^2}{2}+\lambda\|x\|_1$$ where $x, a \in \mathbb{R}^n$, $x$ is the optimization variable, and $\lambda \in \mathbb{R}$. The problem can be separated in each…
6
votes
2 answers

Proximal Mapping of Least Squares with $ {L}_{1} $ and $ {L}_{2} $ Norm Terms Regularization (Similar to Elastic Net)

I was trying to solve $$\min_x \frac{1}{2} \|x - b\|^2_2 + \lambda_1\|x\|_1 + \lambda_2\|x\|_2,$$ where $ b \in \mathbb{R}^n$ is a fixed vector, and $\lambda_1,\lambda_2$ are fixed scalars. Let $f = \lambda_1\|x\|_1 + \lambda_2\|x\|_2$, that is to…
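A sketch of the closed form (names mine): for $\lambda_1\|\cdot\|_1 + \lambda_2\|\cdot\|_2$ the prox decomposes as the $\ell_2$ (block) shrinkage applied after the elementwise soft threshold, i.e. the sparse-group-lasso prox.

```python
import numpy as np

def prox_l1_plus_l2(b, lam1, lam2):
    """prox of lam1*||.||_1 + lam2*||.||_2: soft threshold, then L2 shrinkage."""
    y = np.sign(b) * np.maximum(np.abs(b) - lam1, 0.0)   # prox of lam1*||.||_1
    norm_y = np.linalg.norm(y)
    if norm_y == 0.0:
        return y
    return max(1.0 - lam2 / norm_y, 0.0) * y             # prox of lam2*||.||_2

# sanity check against random perturbations of the objective
b, lam1, lam2 = np.random.randn(6), 0.4, 0.3
x_star = prox_l1_plus_l2(b, lam1, lam2)
obj = lambda x: (0.5 * np.sum((x - b) ** 2) + lam1 * np.sum(np.abs(x))
                 + lam2 * np.linalg.norm(x))
print(all(obj(x_star) <= obj(x_star + 1e-3 * np.random.randn(6)) for _ in range(100)))
```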
6
votes
1 answer

Simple Optimization Algorithm Compared to Douglas Rachford

The Douglas-Rachford optimization algorithm solves problems of the form $$\text{minimize} \hspace{8pt} f(x) + g(x)$$ where $f$ and $g$ are closed, convex, and proper (CCP). It is useful when both $f$ and $g$ have simple proximal operators (in the sense…
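A toy Douglas-Rachford sketch (the step size and the choice of $f$, $g$ are mine): with $f(x) = \frac{1}{2}\|x-b\|_2^2$ and $g(x) = \lambda\|x\|_1$, both proxes are explicit and the iterates can be checked against the known minimizer, the soft threshold of $b$.

```python
import numpy as np

def prox_quad(z, t, b):
    """prox of t * (1/2)||x - b||^2."""
    return (z + t * b) / (1.0 + t)

def prox_l1(z, t):
    """prox of t * ||x||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# Douglas-Rachford splitting for f(x) = 0.5||x - b||^2 and g(x) = lam*||x||_1;
# the minimizer of f + g is the soft threshold of b, which x should converge to
b, lam, t = np.random.randn(5), 0.5, 1.0
z = np.zeros_like(b)
for _ in range(200):
    x = prox_quad(z, t, b)             # x = prox_{t f}(z)
    y = prox_l1(2 * x - z, t * lam)    # y = prox_{t g}(2x - z)
    z = z + y - x                      # reflect-and-average update
print(np.allclose(x, np.sign(b) * np.maximum(np.abs(b) - lam, 0.0), atol=1e-6))
```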
6
votes
1 answer

Proximal Operator of Weighted $ {L}_{2} $ Norm

Define the weighted $ {L}_{2} $ norm $ {\left\| x \right\|}_{2,w} = \sqrt{ \sum_{i = 1}^{n} {w}_{i} {x}_{i}^{2} }$. Find the formula for $ \operatorname{prox}_{\lambda {\left\| \cdot \right\|}_{2,w} }(y)$, where $ \lambda > 0 $. By definition we…
5
votes
1 answer

Proximal Operator - Scaling by a Matrix

The proximal operator is defined for matrices as a map $\operatorname{prox}_f : \mathbb{R}^{m\times n} \rightarrow \mathbb{R}^{m\times n}$: $\operatorname{prox}_f(X) := \arg\min_{Y\in \mathbb{R}^{m\times n}} f(Y) + \frac{1}{2}\|Y-X\|^2$. In the case of vectors, it is known http://arxiv.org/pdf/0912.3522v4.pdf…
trembik