First, we must show that $\beta^{*} = \{ f_{1},f_{2},f_{3}\}$ is a basis for the dual space $V^{*} = \mathcal{L}(V,F) = \mathcal{L}(\mathbb{R}^{3},\mathbb{R}):$

In the finite-dimensional setting, there is a one-to-one correspondence between linear transformations and matrix representations. Hence, we can compute the dimension of the dual space, as follows:
$$\dim(V^{*}) = \dim(\mathcal{L}(V,F)) = \dim(V) \cdot \dim(F) = \dim(V) = \dim(\mathbb{R}^{3}).$$

Thus, $\dim(V^{*}) = 3.$

Now, any linearly independent subset of $V^{*}$ containing exactly $3$ vectors is a basis for $V^{*}.$ So consider $\beta^{*} = \{ f_{1},f_{2},f_{3}\},$ and suppose that $\sum_{i=1}^{3} a_{i} f_{i} = 0,$ where $a_{i} \in \mathbb{R}$ for $i=1,2,3$ (here $0$ denotes the zero functional). Evaluating both sides at an arbitrary vector $(x,y,z) \in \mathbb{R}^{3}$ gives
$$0 = \sum_{i=1}^{3} a_{i} f_{i}(x,y,z),$$ where now $0$ is the zero scalar. Evaluating in particular at the standard basis vectors $e_{1}, e_{2}, e_{3}$ produces a homogeneous linear system whose only solution is $a_{1}=a_{2}=a_{3}=0.$ Hence $\beta^{*}$ is a linearly independent subset of $V^{*}$ with exactly $3$ vectors, and therefore a basis.
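As a quick sanity check, the independence argument can be verified with sympy. This sketch assumes the functionals read off the rows of the augmented matrix below, namely $f_{1}(x,y,z)=x-2y,$ $f_{2}(x,y,z)=x+y+z,$ $f_{3}(x,y,z)=y-3z$:

```python
import sympy as sp

# Rows are the coefficient vectors of f1, f2, f3 (an assumption,
# read off the augmented matrix used in the second part):
#   f1(x,y,z) = x - 2y,  f2 = x + y + z,  f3 = y - 3z.
A = sp.Matrix([[1, -2,  0],
               [1,  1,  1],
               [0,  1, -3]])

# a1*f1 + a2*f2 + a3*f3 = 0 means A^T (a1, a2, a3)^T = 0, and a
# nonzero determinant forces the trivial solution a1 = a2 = a3 = 0.
print(A.det())   # -10, nonzero
print(A.rank())  # 3, so {f1, f2, f3} is linearly independent
```

A nonzero determinant is equivalent to full rank here, so either line alone settles the independence claim.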

Second, we follow the solution of BaronVT to find a basis $\beta$ for $\mathbb{R}^{3}$ such that $\beta^{*}$ is the dual basis of $\beta:$

Namely, writing $\beta = \{v_{1},v_{2},v_{3}\},$ the defining conditions $f_{i}(v_{j}) = \delta_{ij}$ give three systems of linear equations with the same coefficient matrix, which we encode in the single augmented matrix
$$\left(\begin{array}{ccc|ccc} 1 & -2 & \phantom{-} 0 & 1 & 0 & 0 \\ 1 & \phantom{-} 1 & \phantom{-} 1 & 0 & 1 & 0 \\ 0 & \phantom{-} 1 & -3 & 0 & 0 & 1 \end{array}\right),$$
then we perform Gauss-Jordan elimination; the left block reduces to the identity, and the columns of the resulting right block are the vectors of $\beta.$
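The elimination can be carried out (and checked) with sympy's `rref`; this is a sketch of that computation, not BaronVT's original working:

```python
import sympy as sp

# The augmented matrix (A | I) from above. Gauss-Jordan elimination
# turns the left block into I and leaves A^{-1} on the right; its
# columns are the vectors of beta.
M = sp.Matrix([[1, -2,  0, 1, 0, 0],
               [1,  1,  1, 0, 1, 0],
               [0,  1, -3, 0, 0, 1]])

R, _ = M.rref()          # row-reduced echelon form
Ainv = R[:, 3:]          # right block = A^{-1}
print(Ainv)
# Columns: v1 = (2/5, -3/10, -1/10),
#          v2 = (3/5,  3/10,  1/10),
#          v3 = (1/5,  1/10, -3/10)

# Sanity check that f_i(v_j) = delta_ij, i.e. A * Ainv = I:
A = M[:, :3]
print(A * Ainv == sp.eye(3))  # True
```

So, under this computation, $\beta = \left\{ \left(\tfrac{2}{5}, -\tfrac{3}{10}, -\tfrac{1}{10}\right), \left(\tfrac{3}{5}, \tfrac{3}{10}, \tfrac{1}{10}\right), \left(\tfrac{1}{5}, \tfrac{1}{10}, -\tfrac{3}{10}\right) \right\}.$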