I have a question regarding the outcome of this code in JavaScript, as I don't really understand it. Why does the following code produce these results?
var a = [1][1];
var b = [1][0];
if (a) { console.log(true); } else { console.log(false); } --> logs false
if (b) { console.log(true); } else { console.log(false); } --> logs true
How exactly does JavaScript interpret this code to arrive at these results?
Answer
Pretty simple actually, let's break it down:
var a = [1][1];
Broken down, this is:
var a = [1]; // an array with the value 1 at index 0
a = a[1];    // assigns a the result of index 1, which is undefined
Same with b, but b uses index 0, which is defined (as 1); see the breakdown below.
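A sketch of the same two-step breakdown applied to b:

var b = [1]; // an array with the value 1 at index 0
b = b[0];    // assigns b the result of index 0, which is 1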
a is undefined, which is falsy, and b is 1, which is truthy.
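You can confirm the truthiness directly by coercing each value with Boolean(), which applies the same conversion an if condition does:

console.log(Boolean(undefined)); // false, so the else branch runs for a
console.log(Boolean(1));         // true, so the if branch runs for b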