
This is a tricky math question I encountered. I know a little bit about the answer, but I want somebody who is very good at math to help me find the real reason behind it.

OK, let's start:

$1^2 = 1$

$2^2 = 2+2$

$3^2 = 3+3+3$

$\vdots$

$x^2 = \underbrace{x+x+\cdots+x}_{x\text{ times}}$

Differentiating both sides with respect to $x$, we get

$2x = \underbrace{1+1+\cdots+1}_{x\text{ times}}$

which is equal to

$2x = 1 \cdot x$

$2x = x$

Which is incorrect.

Where did I go wrong?

2 Answers


Think about the case when $x$ is a real number rather than an integer.

Then you will see the problem: what happens when $x = 1.5$?

$x^2 = x \cdot x$

By the product rule,

$\frac{d}{dx}\,x^2 = 1 \cdot x + x \cdot 1 = x + x = 2x$
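One way to make this precise: the right-hand side $\underbrace{x+x+\cdots+x}_{x\text{ times}}$ only makes sense when $x$ is a positive integer, and the derivative

$$f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}$$

needs $f(x+h)$ to be defined for all small real $h$, not just integer values. A function defined only on the integers has no derivative to take in the first place.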

Ashutosh Gupta

Your problem arises because you have assumed that $$\frac{d}{dx}\sum_{i=1}^{x}x$$ can be calculated by simply summing the derivatives of the individual summands; i.e., that $$\frac{d}{dx}\sum_{i=1}^{x}x=\sum_{i=1}^{x}\frac{d}{dx}x.$$ This is not true: the value of $x$ affects both the summands themselves and the number of summands.
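One way to account for both effects at once (a sketch; the two-variable function $f(u,v)$ and the names $u$, $v$ are introduced here purely for illustration): write the sum as $f(u,v)=\sum_{i=1}^{v}u=uv$ evaluated at $u=v=x$, and apply the multivariable chain rule:

$$\frac{d}{dx}\,f(x,x)=\underbrace{\frac{\partial f}{\partial u}}_{=\,v\,=\,x}+\underbrace{\frac{\partial f}{\partial v}}_{=\,u\,=\,x}=x+x=2x.$$

The term-by-term differentiation in the question captures only the first contribution, $\frac{\partial f}{\partial u}=\underbrace{1+1+\cdots+1}_{x\text{ times}}=x$, and silently drops the second contribution from the changing number of summands, which is exactly the missing $x$.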

David Simmons