I am just trying to implement a simple RNG in JS.
What’s happening is JavaScript evaluates 119106029 * 1103515245 to 131435318772912110 rather than 131435318772912105. We know that’s wrong, since the product of two odd numbers can’t be even.
Anyone know what’s up? I just want a reliable repeatable RNG, and because of these incorrect values I can’t get results to match up with my C implementation of the same thing.
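Here’s a quick console session showing it (both factors are odd, yet the product comes out even):

> 119106029 * 1103515245
131435318772912110
> 119106029 % 2
1
> 1103515245 % 2
1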
Answer
Per the ECMAScript standard, all numbers in JavaScript are (64-bit IEEE 754) floating-point numbers by default.
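The catch is that a 64-bit double carries only 53 bits of integer precision, so whole numbers above 2^53 − 1 can no longer all be represented exactly; the product above is roughly 2^57, so it gets rounded to the nearest representable double. You can check this directly (Number.MAX_SAFE_INTEGER and Number.isSafeInteger are both ES2015 additions):

console.log(Number.MAX_SAFE_INTEGER);                       // 9007199254740991, i.e. 2^53 - 1
console.log(Number.isSafeInteger(119106029 * 1103515245));  // false: the result was rounded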
However, all 32-bit integers can be represented exactly as floating-point numbers. You can force a result to 32 bits with the appropriate bitwise operator, like this:
x = (a * b) >>> 0;  // force to unsigned int32
x = (a * b) | 0;    // force to signed int32
Weird, but that’s the standard.
(Incidentally, this rounding behavior is one of the most frequently reported “bugs” against Firefox’s JavaScript engine. Looks like it’s been reported 3 times so far this year…)
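One caveat for the RNG use case: a bitwise operator only truncates a product that has already been rounded as a float, so when the true product exceeds 2^53 the low bits are already wrong. Modern engines provide Math.imul (ES2015) for an exact C-style 32-bit multiply. Here’s a minimal sketch of a C-compatible LCG step; only the multiplier 1103515245 comes from the question, while the increment 12345 and the 2^31 modulus are assumptions based on the classic ANSI C rand():

// Sketch of an LCG step. Math.imul keeps the multiply exact in
// 32 bits, where (state * 1103515245) | 0 would not.
function makeLcg(seed) {
  var state = seed >>> 0;
  return function () {
    // state = (state * 1103515245 + 12345) mod 2^31, all in integer math
    state = (Math.imul(state, 1103515245) + 12345) & 0x7fffffff;
    return state;
  };
}

var rand = makeLcg(119106029);
console.log(rand(), rand()); // identical on every run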
If you want real integer math, you can use BigInt values, a separate numeric type written with an n at the end:
> 119106029n * 1103515245n
131435318772912105n
This is a relatively recent JS feature, and may not be implemented in old browsers.
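For example, the multiply from the question can be done exactly with BigInt and the low 32 bits taken afterwards; note that BigInt and Number values can’t be mixed in one expression, so converting back is explicit (this sketch is mine, not part of the original answer):

let seed = 119106029n;
seed = (seed * 1103515245n) & 0xffffffffn;  // exact product, keep low 32 bits
let asNumber = Number(seed);                // convert back for ordinary float math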
As for reproducible random numbers in JavaScript, the V8 benchmark uses this:
// To make the benchmark results predictable, we replace Math.random
// with a 100% deterministic alternative.
Math.random = (function() {
  var seed = 49734321;
  return function() {
    // Robert Jenkins' 32 bit integer hash function.
    seed = ((seed + 0x7ed55d16) + (seed << 12)) & 0xffffffff;
    seed = ((seed ^ 0xc761c23c) ^ (seed >>> 19)) & 0xffffffff;
    seed = ((seed + 0x165667b1) + (seed << 5)) & 0xffffffff;
    seed = ((seed + 0xd3a2646c) ^ (seed << 9)) & 0xffffffff;
    seed = ((seed + 0xfd7046c5) + (seed << 3)) & 0xffffffff;
    seed = ((seed ^ 0xb55a4f09) ^ (seed >>> 16)) & 0xffffffff;
    return (seed & 0xfffffff) / 0x10000000;
  };
})();
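Once that snippet has run, every call to Math.random() is deterministic, so results can be compared across runs and engines:

// These print the same two values on every run, since Math.random
// has been replaced by the seeded generator above:
console.log(Math.random());
console.log(Math.random());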