The optimization problem defined by the prox operator is:

$$ \operatorname{Prox}_{\lambda {\left\| \cdot \right\|}_{1}} \left( x \right) = \arg \min_{u} \left\{ \frac{1}{2} {\left\| u - x \right\|}^{2} + \lambda {\left\| u \right\|}_{1} \right\} $$

This problem is separable in the coordinates of $ u $ (each term couples $ {u}_{i} $ only with $ {x}_{i} $), hence it suffices to solve, for each $ i $:

$$ \arg \min_{ {u}_{i} } \left\{ \frac{1}{2} {\left( {u}_{i} - {x}_{i} \right)}^{2} + \lambda \left| {u}_{i} \right| \right\} $$

Now, you can proceed using the first order optimality condition with the subgradient of the $ \operatorname{abs} \left( \cdot \right) $ function, or you can employ a simple trick.

The trick is to note that $ {u}_{i} $ can be either positive, zero, or negative, and to handle each case separately.

Assuming $ {u}_{i} > 0 $, the derivative is $ {u}_{i} - {x}_{i} + \lambda $, which vanishes at $ {u}_{i} = {x}_{i} - \lambda $; this is consistent with the assumption $ {u}_{i} > 0 $ only when $ {x}_{i} > \lambda $.

The same procedure for the case $ {u}_{i} < 0 $ yields $ {u}_{i} = {x}_{i} + \lambda $, valid when $ {x}_{i} < -\lambda $.

For $ {x}_{i} \in \left[ -\lambda, \lambda \right] $ the only remaining candidate is $ {u}_{i} = 0 $. There the subgradient of $ \left| {u}_{i} \right| $ may be chosen freely in $ \left[ -1, 1 \right] $, so the optimality condition $ 0 \in {u}_{i} - {x}_{i} + \lambda \left[ -1, 1 \right] $ is satisfied by $ {u}_{i} = 0 $.
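The three cases above can be checked numerically by minimizing the scalar objective over a dense grid (a sketch using NumPy; the grid bounds and resolution are arbitrary choices):

```python
import numpy as np

def scalar_prox_brute(x, lam):
    """Brute-force minimizer of 0.5*(u - x)**2 + lam*abs(u) over a dense grid of u."""
    grid = np.linspace(-10.0, 10.0, 200001)  # step size 1e-4
    obj = 0.5 * (grid - x) ** 2 + lam * np.abs(grid)
    return grid[np.argmin(obj)]

lam = 1.0
print(scalar_prox_brute(3.0, lam))   # case x > lam:  close to x - lam = 2.0
print(scalar_prox_brute(-2.5, lam))  # case x < -lam: close to x + lam = -1.5
print(scalar_prox_brute(0.4, lam))   # case |x| <= lam: close to 0.0
```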

In summary:

$$ \operatorname{Prox}_{\lambda {\left\| \cdot \right\|}_{1}} \left( x \right)_{i} = \operatorname{sign} \left( {x}_{i} \right) \max \left( \left| {x}_{i} \right| - \lambda, 0 \right) $$
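The closed form can be written in a couple of lines (a minimal NumPy sketch; the function name `prox_l1` is my own choice):

```python
import numpy as np

def prox_l1(x, lam):
    """Soft thresholding: the prox of lam * ||.||_1, applied element-wise."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

x = np.array([3.0, -2.5, 0.4, -0.2])
print(prox_l1(x, 1.0))  # entries shrink toward zero by lam: [2.0, -1.5, 0.0, 0.0]
```

Note that entries with $ \left| {x}_{i} \right| \leq \lambda $ are set exactly to zero, which is what makes the $ {L}_{1} $ prox useful for sparsity-inducing methods such as ISTA.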

As @NicNic8 noted, this operation is called *Soft Threshold*.