Let's first generalize the problem, find an optimal solution for that,
and then specialize it to the given problem.

OK, so the general problem is: We have a source that gives us one of
$n$ different results uniformly at random. We want to emulate a
source giving us one of $k$ different results uniformly at random.
And we want to do it with as few draws from the original source as
possible.

Since we want $k$ different results, we need $k$ equally probable
events. For $d$ draws from the source, we get $n^d$ equally probable
results. So unless $k=n^d$ for some $d$, we are going to have to throw
away *some* information (if $k=n^d$, the solution is obvious: Draw a
random number $d$ times). But of course we want to throw away as
little information as possible.

Now it is obvious that we cannot give $k$ equally probable results if
there are fewer than $k$ equally probable base events available. So
the first step is to draw from the source often enough that
$n^{d_0}\ge k$, where $d_0$ is the number of draws. For example, for
emulating a coin with a standard six-sided die, we only need to throw
once, while for emulating a six-sided die with a coin, we need at
least $3$ tosses (because $2^2<6<2^3$).
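This first step is easy to compute directly; here is a minimal Python
sketch (the helper name `min_draws` is mine, not from the text):

```python
def min_draws(n: int, k: int) -> int:
    """Smallest d0 with n**d0 >= k: the minimum number of draws from
    an n-valued source before k equally probable groups can exist."""
    d, power = 0, 1
    while power < k:
        power *= n
        d += 1
    return d

print(min_draws(6, 2))  # emulating a coin with a die: 1 throw
print(min_draws(2, 6))  # emulating a die with a coin: 3 tosses
```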

Next, we want $k$ equal-probability events. We have $n^{d_0}$
equal-probability base events. We now find the largest multiple of $k$
that's not larger than $n^{d_0}$; say that multiple is $m_0k$. Then we
make $k$ groups of $m_0$ base events each, leaving $r_0:=n^{d_0}-m_0k$
base events not covered by any group. Obviously those $k$ groups have
equal probability, so if the base event of our draw is in one of the
groups, we can just take as result which group it is in. Note that
there is another piece of uniformly distributed information we didn't
use, namely which element of that group we selected; that information
may be reused if we want to draw another number, in the same way as
explained below for the case that we get one of the "leftover" base
events.
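As a concrete illustration (using the values $n=6$, $k=10$, $d_0=2$
from the specific problem treated below), the grouping step can be
sketched in Python; the variable and function names are my own:

```python
n, k, d0 = 6, 10, 2
base = n ** d0         # 36 equally probable base events
m0 = base // k         # size of each of the k groups: 3
r0 = base - m0 * k     # leftover base events not in any group: 6

def group_of(v):
    """Group (= result) of base event v in 0..base-1, or None if leftover."""
    return v // m0 if v < m0 * k else None

print(m0, r0)                                   # 3 6
print(group_of(0), group_of(29), group_of(30))  # 0 9 None
```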

When we get one of the $r_0$ base events that are not in any of the
groups, we have so far failed to get a result. Now the simplest way to
continue would be to start over; however, that way we would discard
information, namely the information about which of the $r_0$
"leftover" events we got (unless $r_0=1$, in which case there is of
course no information left). Those leftover events are also uniformly
distributed, so we can easily use them to generate the new uniform
set. Thus if we draw an additional $d_1$ times, we get not just
$n^{d_1}$, but $r_0n^{d_1}$ equally probable values.

Note that this reuse also implies that we can simplify the step
before: if the number of events after some throw is still too small,
we simply get groups of zero elements (and thus zero probability of
getting a result, as it should be, given that we haven't yet thrown
often enough to generate sufficiently many events).

To accommodate the case of $k\le n$ (where a single draw from the
source might generate one or even several results), the algorithm
below is stated as iterated steps on the leftover distribution,
starting from the trivial distribution of size $1$.

So the algorithm goes as follows:

You have a uniform distribution of size $r_i$ left over by
previous steps (if no information is left, which includes the
initial state, that distribution has size $r_i=1$), and a value
$v_i$ from that distribution (which I assume for simplicity to go
from $0$ to $r_i-1$; for $r_i=1$ we obviously then have $v_i=0$).

Calculate $m_i = \lfloor r_i/k \rfloor$.

If $v_i < m_ik$, give as result $d_i=\lfloor v_i/m_i \rfloor$ and
generate as new leftover data (for possible further draws)
$r_{i+1}=m_i$ and $v_{i+1} = v_i-m_id_i$.

Otherwise, don't generate a result; instead draw a new random number
$s_i$ from the source (which I assume to give numbers from $0$ to
$n-1$), set $r_{i+1} = (r_i-m_ik)n$, $v_{i+1} = (v_i-m_ik)n + s_i$
and start over. The subtraction of $m_ik$ keeps only the uncovered
"leftover" part of the distribution, on which $v_i$ is still
uniformly distributed.
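The steps above can be sketched in Python. This is my own sketch (the
function and parameter names are mine), with the convention that on a
leftover event only the uncovered part of the distribution, of size
$r_i-m_ik$, is carried over, since conditioned on landing there the
value is uniform on that part:

```python
import random

def emulated_draw(k, n, state=(1, 0), source=None):
    """Draw a uniform value in 0..k-1 using an n-valued source.

    state is the leftover pair (r, v): a uniform distribution of size
    r together with a value v from it (initially the trivial (1, 0)).
    Returns (result, new_state); pass new_state into the next call to
    reuse the leftover randomness.
    """
    if source is None:
        source = lambda: random.randrange(n)  # uniform on 0..n-1
    r, v = state
    while True:
        m = r // k
        if v < m * k:
            # v lies in one of the k groups of size m: the group index
            # is the result, the position inside the group is leftover
            return v // m, (m, v % m)
        # leftover event: keep the uncovered part and fold in a new draw
        r, v = (r - m * k) * n, (v - m * k) * n + source()

# Example: simulate a ten-sided die (results 0..9) with a six-sided die
state = (1, 0)
result, state = emulated_draw(10, 6, state)
```

Feeding `state` back in on the next call is exactly the reuse of
leftover information described above.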

Now let's apply that algorithm to the specific problem, where $n=6$
and $k=10$ (where I implicitly subtract $1$ for the dice throws and
add $1$ for the results wherever applicable).

So we start out with $r_0=1$ and $v_0=0$. Then we get $m_0=0$. Since
clearly $0 < 0$ is false, we throw once, resulting (after subtraction
of $1$) in a random value $s_0$ between $0$ and $5$. Our new leftover
values are thus $r_1 = 1\cdot 6=6$ and $v_1 = 0\cdot 6 + s_0$. That
is, our leftover distribution contains a number between $0$ and $5$.

In the next iteration, we still find $m_1=0$ (we don't yet have
enough data) and thus we again get to draw a new number (that is,
throw again). We thus arrive at a uniform distribution on the numbers
from $0$ to $35$.

In the next iteration, we get $m_2=3$. So if our uniformly
distributed number is less than $30$ (which happens in $30/36 = 5/6$
of all cases), we get a result (and a leftover distribution from $0$
to $2$ for further throws; note that this means drawing a second
number from $1$ to $10$ in the best case needs only one further dice
throw).

Otherwise, we get a leftover distribution of size $6$ (namely the
values from $30$ to $35$, shifted down to $0$ to $5$), which is
exactly as if we had thrown only once. Therefore, the following
iterations are exactly like this one.

Indeed, when changing slightly the way the result is calculated, this
gives rise to the following specific algorithm for drawing the first
number:

Throw your dice once. This gives you the result $a$ (from $1$ to
$6$).

Now throw until you have something other than a $6$. That gives
you the result $b$ (from $1$ to $5$).

Take the last digit of $a+6b$ and add $1$.

This gives a minimal number of two throws, and an average number of
$11/5 = 2.2$ throws.
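The recipe is easy to check exhaustively: the $30$ equally likely
pairs $(a,b)$ with $a$ from $1$ to $6$ and $b$ from $1$ to $5$ hit
each result the same number of times. A quick Python check (my own):

```python
from collections import Counter

def d10_result(a: int, b: int) -> int:
    """a: the first throw (1..6); b: the first non-6 throw (1..5).
    The result is the last digit of a + 6b, plus 1."""
    return (a + 6 * b) % 10 + 1

counts = Counter(d10_result(a, b)
                 for a in range(1, 7) for b in range(1, 6))
print(sorted(counts.items()))
# each of the results 1..10 occurs exactly 3 times out of 30 pairs
```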

Let's also consider mathmandan's idea of using not only the top side, but the complete orientation of a cube forced to lie on a given square.

Here we have $n=24$, which is larger than $10$, so we have a chance at every throw. The first throw already gives a result with probability $20/24 = 5/6$ (while leaving a single bit as leftover for generating another random number). However, unlike with mathmandan's solution, in the other case we don't just start over, but reuse the remaining information, which is a uniform distribution of $4$ values.

After another throw, we then have a uniform distribution of $4\cdot 24 = 96$ different values. Now with probability $90/96 = 15/16 \approx 0.94$ you get a result (and a comfortable $9$-value distribution for a possible next draw). If despite that great chance you again fall into the leftover range, you've got a uniform distribution of size $6$, so after the next throw you get a distribution of size $6\cdot 24 = 144$. Now you have a probability of a whopping $140/144 = 35/36 \approx 0.97$ of getting a result (and in that case a comfortable starter distribution of size $14$ for a possible further draw). If you're unlucky enough to fall into the remaining $3\%$, you again have a leftover distribution of size $4$, so from here on everything repeats.

So with the "full cube orientation" method, the algorithm gives an average number of $1 + 102/575 \approx 1.18$ tosses for the first draw.
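That expectation can be computed exactly from the three failure probabilities $1/6$, $1/16$ and $1/36$ above: after the first miss, the process cycles between the size-$4$ and size-$6$ leftover states. A short exact computation with Python's `fractions` module (my own sketch):

```python
from fractions import Fraction as F

# E4 = expected further tosses from the size-4 leftover state,
# E6 = the same from the size-6 state:
#     E4 = 1 + (1/16) * E6,    E6 = 1 + (1/36) * E4
# Substituting the second equation into the first and solving:
E4 = (1 + F(1, 16)) / (1 - F(1, 16) * F(1, 36))

# The first toss always happens; it misses with probability 1/6.
expected = 1 + F(1, 6) * E4
print(expected, float(expected))  # 677/575, about 1.1774
```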

Note that further draws are more efficient because you've got a leftover from the previous draw.