I was playing around with JavaScript, creating a simple countdown clock, when I came across this strange behavior:
var a = new Date(),
    now = a.getTime(),
    then = Date.UTC(2009, 10, 31),
    diff = then - now,
    daysleft = parseInt(diff / (24 * 60 * 60 * 1000));
console.log(daysleft);
The daysleft value is off by 30 days.
What is wrong with this code?
Edit: I changed the variable names to make them clearer.
Answer
The month is zero-based in JavaScript (0 = January, 11 = December).
Days and years are one-based.
Go figure.
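So Date.UTC(2009, 10, 31) asks for the 31st day of November; since November has only 30 days, the date rolls over to December 1, which is where the roughly 30-day discrepancy comes from. A minimal sketch of the fix, assuming the intended target date was October 31, 2009 (month index 9):

// Month 10 is November, and "November 31" rolls over to December 1.
console.log(new Date(Date.UTC(2009, 10, 31)).toUTCString()); // Tue, 01 Dec 2009 00:00:00 GMT

// Assuming the intended date was October 31, 2009, pass 9 for the month:
var now = new Date().getTime(),
    then = Date.UTC(2009, 9, 31),                    // months run 0 (January) through 11 (December)
    diff = then - now,
    daysleft = Math.floor(diff / (24 * 60 * 60 * 1000)); // whole days remaining
console.log(daysleft);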
UPDATE
The reason this is so, from the creator of JavaScript, is:
JS had to “look like Java” only less so, be Java’s dumb kid brother or boy-hostage sidekick. Plus, I had to be done in ten days or something worse than JS would have happened.
http://www.jwz.org/blog/2010/10/every-day-i-learn-something-new-and-stupid/#comment-1021