Since the logarithm is strictly concave, Jensen's inequality shows that $E(\ln(X))\leqslant \ln E(X)$, with equality iff $X$ is almost surely constant.
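For a concrete illustration, here is a quick Monte Carlo sketch (assuming, purely as an example, $X\sim\mathrm{Exp}(1)$, for which $E(\ln X)=-\gamma\approx-0.577$ while $\ln E(X)=0$), using only Python's standard library:

```python
import math
import random
import statistics

random.seed(0)

# Example distribution (an assumption for illustration): X ~ Exponential(1),
# so E(X) = 1 and E(ln X) = -gamma (Euler-Mascheroni constant).
xs = [random.expovariate(1.0) for _ in range(100_000)]

mean_ln_x = statistics.fmean(math.log(x) for x in xs)  # estimates E(ln X)
ln_mean_x = math.log(statistics.fmean(xs))             # estimates ln E(X)

# Jensen's inequality: E(ln X) <= ln E(X), strict since X is not a.s. constant.
assert mean_ln_x <= ln_mean_x
```

Note that the inequality between the two empirical quantities holds for every sample, not only in the limit, since it is Jensen's inequality applied to the empirical distribution.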

**Edit** (This is to expand on a remark made by Shai.)

Shai's answer explains how to prove $E(\ln(X))\leqslant \ln E(X)$ using only the AM-GM inequality and the strong law of large numbers. These very tools yield the following refinement (adapted from the paper *Self-improvement of the inequality between arithmetic and geometric means* by J. M. Aldaz).

Apply the AM-GM inequality to the *square roots* of an i.i.d. sequence $(X_i)$ of positive integrable random variables distributed as $X$, that is,
$$
\sqrt[n]{\sqrt{X_1}\cdots\sqrt{X_n}}\leqslant\frac1n(\sqrt{X_1}+\cdots+\sqrt{X_n}).
$$
When $n\to\infty$, the strong law of large numbers, applied to $(\ln\sqrt{X_i})$ on the left-hand side and to $(\sqrt{X_i})$ on the right-hand side, yields
$$
\exp(E(\ln\sqrt{X}))\leqslant E(\sqrt{X}),
$$
that is,
$$
E(\ln X)\leqslant 2\ln E(\sqrt{X})=\ln\big(E(\sqrt{X})^2\big)=\ln (E(X)-\mbox{var}(\sqrt{X})).
$$
Finally:

For every positive integrable $X$,
$$
E(\ln X)\leqslant [\ln E(X)]-\delta(X)\quad\mbox{where}\ \delta(X)=\ln[E(X)/E(\sqrt{X})^2].
$$

The correction term $\delta(X)$ is nonnegative for every $X$, and $\delta(X)=0$ iff $X$ is almost surely constant.
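Numerically, the refined bound and the sign of $\delta(X)$ can be checked with a short sketch (again assuming, for illustration only, an exponential sample; for $\mathrm{Exp}(1)$ the theoretical value is $\delta(X)=\ln(4/\pi)\approx0.24$):

```python
import math
import random
import statistics

random.seed(0)

# Example distribution (an assumption): X ~ Exponential(1),
# for which E(sqrt(X)) = sqrt(pi)/2, hence delta(X) = ln(4/pi).
xs = [random.expovariate(1.0) for _ in range(100_000)]

e_x = statistics.fmean(xs)
e_sqrt_x = statistics.fmean(math.sqrt(x) for x in xs)
e_ln_x = statistics.fmean(math.log(x) for x in xs)

# Correction term delta(X) = ln[E(X) / E(sqrt(X))^2]
delta = math.log(e_x / e_sqrt_x ** 2)

assert delta >= 0                       # delta(X) is nonnegative
assert e_ln_x <= math.log(e_x) - delta  # E(ln X) <= ln E(X) - delta(X)
```

Both assertions hold deterministically for any sample, since they amount to Jensen's inequality for the empirical measure.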

Naturally, this can also be obtained directly by applying Jensen's inequality to $\sqrt{X}$.

And this result is just a special case of the fact that, for every $s$ in $(0,1)$,
$$
E(\ln X)\leqslant [\ln E(X)]-\delta_s(X)\quad\mbox{where}\ \delta_s(X)=\ln[E(X)/E(X^s)^{1/s}].
$$
The quantity $\delta_s(X)$ is a nonincreasing function of $s$, hence the upper bound $[\ln E(X)]-\delta_s(X)$ improves as $s$ decreases to $0$. For every $X$, $\delta_1(X)=0$, $\delta_{1/2}(X)=\delta(X)$, and, in the limit $s\to0^+$, $\delta_s(X)\to\delta_0(X)=[\ln E(X)]-E(\ln X)$.
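The monotonicity of $s\mapsto\delta_s(X)$ can likewise be observed empirically (a sketch; the exponential sample is again an assumed, arbitrary choice):

```python
import math
import random
import statistics

random.seed(0)
# Assumed example distribution: X ~ Exponential(1).
xs = [random.expovariate(1.0) for _ in range(100_000)]

def delta_s(s, xs):
    """Empirical delta_s(X) = ln[E(X) / E(X^s)^(1/s)], for 0 < s <= 1."""
    e_x = statistics.fmean(xs)
    e_x_s = statistics.fmean(x ** s for x in xs)
    return math.log(e_x / e_x_s ** (1.0 / s))

# delta_s is nonincreasing in s: the bound improves as s decreases toward 0.
values = [delta_s(s, xs) for s in (1.0, 0.75, 0.5, 0.25, 0.1)]
assert all(a <= b for a, b in zip(values, values[1:]))
assert abs(values[0]) < 1e-9  # delta_1(X) = 0 identically
```

The monotonicity here is again sample-wise exact, being the power mean inequality for the empirical distribution.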

The interesting point in all this, if any, is that one has *quantified* the discrepancy between $E(\ln X)$ and $\ln E(X)$ and, simultaneously, recovered the fact that $E(\ln X)=\ln E(X)$ iff $X$ is almost surely constant.