The inverse normal distribution does not furnish a suitable example, because if $Y = Z^{-1}$ where $Z \sim \operatorname{Normal}(0,1)$, then $\operatorname{E}[Y]$ does not exist: the integral defining $\operatorname{E}|Y|$ diverges at the origin. We can, however, start from a double gamma distribution: define $$f_X(x) = \frac{|x|}{2}e^{-|x|}, \quad -\infty < x < \infty.$$ It is easy to check that this function integrates to $1$ and hence defines a density. Now let $Y = X^{-1}$; the change-of-variables formula gives the density of $Y$ as $$f_Y(y) = f_X(y^{-1})y^{-2} = \frac{1}{2|y|^3} e^{-1/|y|}, \quad y \ne 0,$$ a double inverse gamma density. This distribution does have a well-defined expectation, since $$\int_{y=0}^\infty yf_Y(y) \, dy = \frac{1}{2} < \infty,$$ so that $\operatorname{E}|Y| = 1$. Then, because $f_Y$ is an even function, $\operatorname{E}[Y] = 0$.
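As a sanity check, both integrals can be verified numerically. This is a pure-stdlib sketch; the truncation points and grid sizes are my own choices, picked so the truncated tails are negligible:

```python
import math

def f_Y(y):
    """Density of Y = 1/X on y > 0: exp(-1/y) / (2 y^3)."""
    return math.exp(-1.0 / y) / (2.0 * y ** 3)

def midpoint(g, a, b, n):
    """Composite midpoint rule for the integral of g over [a, b]."""
    h = (b - a) / n
    return h * sum(g(a + (k + 0.5) * h) for k in range(n))

# Total mass on the positive half-line should be 1/2 (the density is even,
# so the full integral is then 1).
mass = midpoint(f_Y, 1e-8, 1_000.0, 400_000)

# The half-line mean integral from the text should be 1/2.
mean_pos = midpoint(lambda y: y * f_Y(y), 1e-8, 50_000.0, 1_000_000)

print(round(mass, 4), round(mean_pos, 4))
```

Both values come out at $1/2$ up to discretization error, matching the derivation above.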

Now, whether the harmonic mean of an IID sample drawn from $Y$ is in some sense the "best" estimator of $\operatorname{E}[Y]$ because $\bar x$ is the "best" estimator of the mean of $X$ and $Y = 1/X$, I am not so sure. The estimator $\tilde y = n (\sum_{i=1}^n y_i^{-1})^{-1}$ has expectation $$\operatorname{E}[\tilde y] = n \operatorname{E}\left[\left(\sum_{i=1}^n y_i^{-1}\right)^{-1}\right],$$ but the RHS is not in general equal to $$n \left(\operatorname{E}\left[\sum_{i=1}^n y_i^{-1}\right]\right)^{-1},$$ inasmuch as we cannot generally write $$\operatorname{E}[g(X)] = g(\operatorname{E}[X]):$$ the expectation of a function of a random variable does not generally equal the function evaluated at the variable's expected value. If you could say that, then linearity would pass the expectation through the sum and you'd get $$n \left(\sum_{i=1}^n \operatorname{E}[y_i^{-1}]\right)^{-1} = n \left(n \operatorname{E}[X]\right)^{-1} = \operatorname{E}[X]^{-1}.$$ And again you run into the same fallacy: you can't claim that this last expression equals $\operatorname{E}[Y]$. (Indeed, in the example above $\operatorname{E}[X] = 0$ by symmetry, so $\operatorname{E}[X]^{-1}$ is not even defined.) Thus, the idea of appealing to inverse distributions seems dubious to me.
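A quick simulation illustrates the problem. Since $y_i^{-1} = x_i$, the harmonic mean $\tilde y$ is exactly $1/\bar x$; and because the mean of the double gamma $X$ is zero, $\bar x$ concentrates near zero and $\tilde y$ blows up, while the plain sample mean of the $y_i$ does approach $\operatorname{E}[Y] = 0$. (A sketch; the sample size and seed are arbitrary. Sampling uses $|X| \sim \operatorname{Gamma}(2,1)$ with a random sign, since $f_X$ is a reflected gamma density with shape $2$.)

```python
import random

random.seed(0)
n = 200_000

# X is double gamma: |X| ~ Gamma(shape=2, scale=1) with a random sign.
x = [random.choice((-1, 1)) * random.gammavariate(2.0, 1.0) for _ in range(n)]
y = [1.0 / xi for xi in x]  # Y = 1/X has the double inverse gamma density

sample_mean = sum(y) / n                       # consistent for E[Y] = 0
harmonic_mean = n / sum(1.0 / yi for yi in y)  # equals 1 / mean(x): diverges

print(sample_mean, harmonic_mean)
```

Typical runs give a sample mean close to zero but a harmonic mean in the hundreds, since it is the reciprocal of a quantity of order $n^{-1/2}$.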