I got stuck on a problem that popped up in my mind while learning limits. I am still a high school student.

Define $P(m)$ to be the statement: $\quad \lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{m})=0$

The statement holds for $m = 1$: $\quad \lim\limits_{n\to\infty}\frac{1}{n}=0$.

Assume that $P(k)$ holds for some natural number $k$, i.e. put $m = k$: $\quad \lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{k})=0$.

We prove $P(k + 1)$: $\quad \lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{k+1}) =\lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{k}+\frac{1}{n})$

$=\lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{k}) +\lim\limits_{n\to\infty}\frac{1}{n}$

$=0+0=0$.
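As a quick numeric sanity check (my own illustration, not part of the proof): for any *fixed* number of terms $m$, the sum of $m$ copies of $\frac{1}{n}$ is just $\frac{m}{n}$, which shrinks toward $0$ as $n$ grows.

```python
def repeated_sum(m, n):
    """Sum of m copies of 1/n (equals m/n up to floating-point rounding)."""
    return sum(1.0 / n for _ in range(m))

# For each fixed m, the sum gets smaller as n increases.
for m in [1, 5, 100]:
    for n in [10**3, 10**6]:
        print(f"m={m}, n={n}: sum = {repeated_sum(m, n)}")
```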

By mathematical induction, the statement therefore holds for all natural numbers $m$.

If we let $m=n$, then $\lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{n})=0 \tag{*}$.

However, $\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{n}=1 \implies \lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{n})=1 \tag{$\dagger$}$.
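Checking $(\dagger)$ numerically (again, my own illustration): when the number of terms is $n$ itself rather than a fixed $m$, the sum of $n$ copies of $\frac{1}{n}$ is $n \cdot \frac{1}{n} = 1$ for every $n$, so it stays at $1$ no matter how large $n$ gets.

```python
def sum_n_copies(n):
    """Sum of n copies of 1/n; mathematically n * (1/n) = 1 for every n."""
    return sum(1.0 / n for _ in range(n))

# The sum stays (essentially) 1 however large n is.
for n in [10, 1000, 100000]:
    print(f"n={n}: sum = {sum_n_copies(n)}")
```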

Then $(*)$ and $(\dagger)$ together yield $1=0$?

Can anybody explain this? Thanks.