I know that certain numbers pick up slight variations from their original values in floating-point arithmetic. For example:
0.1 + 0.2 -> 0.30000000000000004
But if I do Math.round(0.30000000000000004 * 100) / 100, I get the correct answer -> 0.3.
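To make that concrete, a quick check you can paste into a console:

console.log(0.1 + 0.2);                           // 0.30000000000000004
console.log(Math.round((0.1 + 0.2) * 100) / 100); // 0.3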
I ran a JavaScript test and found that the results stay accurate at least up to 1e+10.
Are there any caveats to doing this?
If I use Math.round(result * 100) / 100 after every calculation, can I be sure the results will be accurate?
The only calculations I plan to make are addition and multiplication, and all numbers will have only 2 decimal places, as confirmed by Math.round(n * 100) / 100.
I don’t need the numbers to be accurate over about $1000.
Can I be sure my results will be accurate to the nearest cent?
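For concreteness, here is a simplified sketch of the kind of check I mean (the values are arbitrary, not my actual test data):

// Round to 2 decimals after a calculation, then compare against
// exact integer-cent arithmetic on the same values.
function roundCents(n) {
  return Math.round(n * 100) / 100;
}
console.log(roundCents(1234567.89 + 9876543.21)); // 11111111.1
console.log((123456789 + 987654321) / 100);       // 11111111.1 (exact cents)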
Answer
You may still run into errors when using Math.round(n * 100) / 100.
It won't always give the expected result. For example:
console.log(Math.round(0.145 * 100) / 100); // prints 0.14
The expected result would be 0.15, but it prints 0.14. This happens because 0.145 cannot be represented exactly as a binary float, so 0.145 * 100 evaluates to 14.499999999999998, which Math.round takes down to 14.
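You can see the intermediate values directly:

console.log(0.145 * 100);             // 14.499999999999998
console.log(Math.round(0.145 * 100)); // 14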
I suggest using a different approach, such as one of the following:
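Here are two sketches (the helper names round2, toCents, and fromCents are just illustrative, not standard APIs). The first nudges the value by Number.EPSILON before rounding, which fixes borderline cases like 0.145 for positive numbers; the second keeps all amounts as integer cents, which stays exact well past $1000, since integers up to 2^53 - 1 are represented exactly:

// Option 1: nudge up by Number.EPSILON before rounding.
// Still a heuristic, but it handles borderline halves like 0.145.
function round2(n) {
  return Math.round((n + Number.EPSILON) * 100) / 100;
}
console.log(round2(0.145)); // 0.15

// Option 2: do all arithmetic in integer cents; convert back to
// dollars only for display.
function toCents(dollars) {
  return Math.round(dollars * 100); // one rounding at the boundary
}
function fromCents(cents) {
  return cents / 100; // display only
}
console.log(fromCents(toCents(0.1) + toCents(0.2))); // 0.3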
More on that topic: