I want to define a BigInt number in JavaScript, but when I assign it, the wrong number is stored: in fact, 1 is added to the number when storing.
let num = BigInt(0b0000111111111111111111111111111111111111111111111111111111111111);
console.log(num);             // Output: 1152921504606846976n
console.log(num.toString(2)); // Output: 1000000000000000000000000000000000000000000000000000000000000
So the number stored is 1152921504606846976, but it should be 1152921504606846975. Why is that?
Answer
Converting a Number to a BigInt can’t create bits that weren’t there before.
0b1 (just like 1) is a Number literal, so it creates a Number. 0b1n (just like 1n) is a BigInt literal, so it creates a BigInt.
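For instance, in any engine with BigInt support:

```javascript
console.log(typeof 0b1);  // "number": a Number literal
console.log(typeof 0b1n); // "bigint": a BigInt literal
```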
By writing BigInt(0b1), you’re first creating a Number and then converting that to a BigInt. As long as the value is 1, that works just fine; once the value exceeds what you can losslessly store in a Number [1], you’ll see that the value of the final BigInt won’t match the literal you wrote down. Whether you use binary (0b...), decimal, or hex (0x...) literals doesn’t change any of that.
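Applied to the literal from the question, the trailing n is the fix. A minimal sketch, assuming a BigInt-capable runtime:

```javascript
// Number literal converted to BigInt: precision is lost before BigInt() ever runs.
const viaNumber = BigInt(0b0000111111111111111111111111111111111111111111111111111111111111);

// BigInt literal (note the trailing n): every bit survives.
const viaLiteral = 0b0000111111111111111111111111111111111111111111111111111111111111n;

console.log(viaNumber);  // 1152921504606846976n (2^60, rounded)
console.log(viaLiteral); // 1152921504606846975n (2^60 - 1, exact)
```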
(And just to be extra clear: there’s no reason to write BigInt(123n), just like you wouldn’t write Number(123). 123n already is a BigInt, so there’s nothing to convert.)
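You can confirm that such conversions are no-ops:

```javascript
// The value is already the right type, so each conversion just returns it.
console.log(BigInt(123n) === 123n); // true
console.log(Number(123) === 123);   // true
```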
A simple non-BigInt way to illustrate what’s happening is to enter 12345678901234567890 into your favorite browser’s DevTools console: you can specify Number literals of any length you want, but they’ll be parsed into an IEEE 754 64-bit “double”, which has limited precision. Any extra digits in the literal simply can’t be stored, though of course each digit’s presence affects the magnitude of the number.
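For example (typed is just an illustrative name):

```javascript
const typed = 12345678901234567890;       // 20 digits in the source text
console.log(typed);                       // 12345678901234567000: the last digits are gone
console.log(Number.isSafeInteger(typed)); // false

// Two different literals can even parse to the very same double:
console.log(12345678901234567890 === 12345678901234567891); // true
```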
[1] Side note: this condition is more subtle than just saying that Number.MAX_SAFE_INTEGER is the threshold, though that constant is related to the situation: any integer up to and including MAX_SAFE_INTEGER can be stored losslessly, but there are plenty of numbers above MAX_SAFE_INTEGER that can also be represented exactly. Random example: 1e20.
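To see why 1e20 qualifies: 10^20 = 2^20 × 5^20, and 5^20 fits comfortably in a double’s 53-bit significand, so the value is exact even though it is far above MAX_SAFE_INTEGER. A quick check:

```javascript
console.log(Number.MAX_SAFE_INTEGER);        // 9007199254740991 (2^53 - 1)
console.log(1e20 > Number.MAX_SAFE_INTEGER); // true
console.log(BigInt(1e20));                   // 100000000000000000000n: lossless conversion here
```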