When the following number is converted to a string and then split at the decimal point, the string representing the decimal portion is incorrect. The expected value is 0123456789, but the value displayed in the console is 01234567.
const number = 123456789.0123456789;
console.log(number.toString().split("."));
If the number of digits to the left of the decimal point is changed, the decimal portion displayed in the console also changes, but it has never been more than 8 characters in the testing I've done so far. Why is it behaving like this? How can I get it to display the true string of digits that follow the decimal point?
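In case it helps to reproduce this, here is a small loop (my own sketch, not the original test code) that varies the number of digits to the left of the decimal and logs what survives of the fractional digits. Note that the total count of significant digits in the output never exceeds 17, which is the round-trip limit for an IEEE 754 double:

```javascript
// Sketch (not the original test code): vary the integer part and
// log what toString() keeps of the typed fraction ".0123456789".
for (const intPart of ["9", "99999", "123456789"]) {
  const n = Number(intPart + ".0123456789");
  const [whole, frac] = n.toString().split(".");
  console.log(whole, frac, "->", (whole + frac).length, "significant digits");
}
```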
EDIT -
After some trial and error testing, here are some additional findings -
When number = 2097151.0123456789
the decimal portion is correct.
When number = 2097152.0123456789
the decimal portion is incorrect.
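A minimal loop (my sketch) that reproduces the boundary above. 2097152 is 2 ** 21, and just past a power of two the gap between adjacent doubles itself doubles, so one fewer bit of precision is left for the fractional part; that is presumably why the behavior flips exactly there:

```javascript
// Sketch reproducing the observed boundary. 2097152 === 2 ** 21;
// crossing a power of two doubles the spacing between adjacent
// doubles, leaving less precision for the fractional digits.
for (const n of [2097151.0123456789, 2097152.0123456789]) {
  console.log(n.toString().split(".")[1]);
}
```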