Suppose the proof went like this.

As we know, if $f(x) = \color{blue}a\cdot x$, then $f'(x)= \color{blue}a$.

So that means if $f(x) = x^2 = \color{blue}x\cdot x$, then $f'(x)$ ought to be $\color{blue}x$.

Suppose we even took it further and figured $f'(x) = \lim_{h\to 0}\frac {f(x+h)-f(x)}h =\lim_{h\to 0} \frac {x\cdot (x+h)-x\cdot x}{h} = \lim_{h\to 0} \frac {x^2 + h\cdot x - x^2}h=\lim_{h\to 0}\frac {h\cdot x}h = x$.

We are treating the squaring of $x$ (multiplying $x$ by $x$) as though the $x$ we are multiplying by is a constant, and not a variable that changes in the same manner as the *other* $x$ does. Clearly $f(x+h) \ne x\cdot (x+h)$; it is $(x+h)(x+h)$, which is entirely different.
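For contrast, running the same limit with the correct $f(x+h) = (x+h)(x+h)$ gives the familiar answer:

$$f'(x) = \lim_{h\to 0}\frac{(x+h)^2 - x^2}{h} = \lim_{h\to 0}\frac{2x\cdot h + h^2}{h} = \lim_{h\to 0}(2x + h) = 2x.$$

The flawed version keeps only one of the two $x\cdot h$ cross terms in the numerator, which is exactly why it lands on $x$ instead of $2x$.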

It's true that $f(x) = x^2 =x\cdot x = \sum_{k=1}^x x$ assumes $x \in \mathbb N$, and it is true that $f: \mathbb N \to \mathbb R$ cannot be differentiated. But more importantly, the upper index of $\sum_{k=1}^{\color{blue} x} x$ is a *variable* and not a constant. Even if we *could* define $\lim_{h\to 0}\frac {f(x+h) - f(x)}h$ where $f(x) = \sum_{k=1}^{\color{blue} x} x$ (we can't, but even if we could), the derivative would be $\lim_{h\to 0} \frac {\sum_{k=1}^{\color{blue}{x+h}} (x + h) - \sum_{k=1}^{\color{blue} x} x}h$ and it *WOULDN'T* be $\lim_{h\to 0} \frac {\sum_{k=1}^{\color{blue}{x}} (x + h) - \sum_{k=1}^{\color{blue} x} x}h$; the index $\color{blue}{x+h}$ would make all the difference.
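To see concretely that the index matters, take the smallest increment the sum definition allows, $h = 1$ (so that $x+h$ is still a natural number). The difference quotient with the honest index is

$$\frac{\sum_{k=1}^{\color{blue}{x+1}}(x+1) - \sum_{k=1}^{\color{blue}x} x}{1} = (x+1)^2 - x^2 = 2x+1,$$

which tracks the true derivative $2x$, while freezing the index gives

$$\frac{\sum_{k=1}^{\color{blue}x}(x+1) - \sum_{k=1}^{\color{blue}x} x}{1} = x(x+1) - x^2 = x,$$

which is precisely the bogus answer.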

Of course that argument is somewhat fetid dingo kidneys, as $\sum_{k=1}^{\color{blue}{x+h}} (x + h)$ doesn't actually make any sense: $x+h$ is not a natural-number index. But if it did make sense (which it would if we defined $f(x)= x\cdot x$, so that $f(x+h) = (x+h)(x+h) \ne x\cdot (x+h)$), that argument would be ... healthy and fresh dingo kidneys.
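If you want to see the two computations side by side, here is a quick symbolic check (a sketch using sympy; the names `flawed` and `correct` are just illustrative labels, not anything from the argument above):

```python
# Compare the bogus difference quotient (one factor of x frozen as a
# constant) with the honest one, symbolically, via sympy.
from sympy import symbols, limit

x, h = symbols('x h')

# Flawed: pretends f(x+h) = x*(x+h), i.e. one x never moves.
flawed = limit((x*(x + h) - x*x) / h, h, 0)

# Correct: f(x+h) = (x+h)*(x+h).
correct = limit(((x + h)*(x + h) - x*x) / h, h, 0)

print(flawed)   # x
print(correct)  # 2*x
```

The only difference between the two quotients is whether the first factor of $x$ is allowed to move to $x+h$; that single frozen factor is the entire error.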