Why does [][[]] evaluate to undefined?

The expression [][[]] evaluates to undefined in JavaScript. My understanding of this was that the compiler sees the second set of [...] and interprets it as an array subscript operator (because you can’t have two array literals next to each other).

So the compiler knows that the inner expression, [], must be an index, and so after evaluating it, it coerces it to a number. Number([]) evaluates to 0, and so we have [][0], which is undefined.
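Each piece of that reasoning can be checked on its own in a console (a quick sketch to verify the individual steps):

Number([])  // 0 — an empty array coerces to the number 0
[][0]       // undefined — an empty array has no element at index 0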

However, [1][[]] does not evaluate to 1 as I would expect, but rather to undefined, suggesting that in this case (or maybe also in the previous case) [] isn’t being coerced to a number. It seems that I must use the unary + to force the type coercion:

[1][+[]] // returns 1

So if the inner [] in the expression [][[]] is not being coerced to a number, then why does that expression evaluate to undefined?


Answer

The faulty assumption was that the expression that evaluates to the index is coerced to a number. It is in fact coerced to a string, as are all object property keys (except for Symbols, which stay Symbols).
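A quick way to see that rule in action with an ordinary object (a minimal sketch; obj and sym are just throwaway names):

const obj = {};
obj[[]] = "a";          // the key [] is stringified to ""
obj[{}] = "b";          // the key {} is stringified to "[object Object]"
Object.keys(obj);       // ["", "[object Object]"]

const sym = Symbol();
obj[sym] = "c";         // Symbol keys are stored as Symbols, not strings
obj[sym];               // "c"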

Thus, [1][[]] turns into [1][""] (an empty array stringifies to the empty string), and since the "" property doesn’t exist on the array, we get undefined instead of 1.
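Putting the same steps together for the array case (again just a sketch to verify in a console):

String([])  // "" — an empty array stringifies to the empty string
[1][""]     // undefined — there is no "" property on the array
[1][+[]]    // 1 — unary + yields 0, so this reads index 0
[1][[0]]    // 1 — [0] stringifies to "0", and array indices are string keys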
