Simple question, but I’m interested in the nuances here.
I’m generating random booleans using the following method I came up with myself:
const rand = Boolean(Math.round(Math.random()));
Whenever random() shows up, it seems there's always a pitfall: it's not truly random, it's compromised by something or other, etc. So, I'd like to know:
a) Is the above the best-practice way to do it?
b) Am I overthinking things?
c) Am I underthinking things?
d) Is there a better/faster/elegant-er way I don’t know of?
(Also somewhat interested in whether (b) and (c) are mutually exclusive.)
Update
If it makes a difference, I’m using this for movement of an AI character.
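If a concrete example helps, here's roughly what I'm doing with it; the character object and its velocity field are simplified stand-ins for illustration, not my actual code:

// Hypothetical character state, just for illustration
const character = { x: 0, vx: 0 };

// Use the random boolean to pick a horizontal direction
const goLeft = Boolean(Math.round(Math.random()));
character.vx = goLeft ? -1 : 1;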
Answer
You can compare Math.random() to 0.5 directly, as the range of Math.random() is [0, 1) (this means 'in the range 0 to 1, including 0 but not 1'). You can divide that range into two equal halves, [0, 0.5) and [0.5, 1), so the comparison Math.random() < 0.5 is true with probability 0.5.
var random_boolean = Math.random() < 0.5;
// Example
console.log(Math.random() < 0.1); // 10% probability of getting true
console.log(Math.random() < 0.4); // 40% probability of getting true
console.log(Math.random() < 0.5); // 50% probability of getting true
console.log(Math.random() < 0.8); // 80% probability of getting true
console.log(Math.random() < 0.9); // 90% probability of getting true
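If you need different probabilities in several places, you could wrap the comparison in a small helper. This is just a sketch; the function name and its default value are my own choice, not part of any library:

// Returns true with the given probability (defaults to a fair 50/50)
function randomBoolean(probability = 0.5) {
  return Math.random() < probability;
}

console.log(randomBoolean());    // true about half the time
console.log(randomBoolean(0.1)); // true about 10% of the time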