
Inconsistent behaviour with JavaScript Date getTime() function

I am trying to make a filter based on hours for timestamps (in this example, a filter for all times after 8 am):

JavaScript
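
A minimal sketch of the kind of filter described, assuming the 8 am cutoff is rebuilt from each timestamp's own date with the deprecated getYear()/setYear() pair; the function and variable names are illustrative, not taken from the original snippet:

// Illustrative sketch of the filter described above.
function isAfterEightAm(timestamp) {
  // Build an 8:00 cutoff on the same calendar day as the timestamp,
  // copying the year with the deprecated getYear()/setYear() pair.
  var cutoff = new Date();
  cutoff.setYear(timestamp.getYear());
  cutoff.setMonth(timestamp.getMonth());
  cutoff.setDate(timestamp.getDate());
  cutoff.setHours(8, 0, 0, 0);

  console.log(timestamp.getTime() + ' VS ' + cutoff.getTime());

  if (timestamp.getTime() >= cutoff.getTime()) {
    return true;   // 1st IF: at or after 8 am
  }
  if (timestamp.getTime() < cutoff.getTime()) {
    return false;  // 2nd IF: before 8 am (the branch that never fires, as described below)
  }
}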

I have debugged my way to finding out that I wasn't getting a FALSE value from the 2nd IF; however, I am at a loss as to why .getTime() returns such vastly different values in my console log:

“136556220816000000000 VS -5.0438323821312e+21”

“136560264336000000000 VS -5.0438323821312e+21”


Answer

The problem is in the following line:

JavaScript
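
In the spirit of the sketch above (names still illustrative), this is the line that copies the year onto the cutoff date:

cutoff.setYear(timestamp.getYear());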

The (deprecated) getYear() function returns

[a] number representing the year of the given date, according to local time, minus 1900.

The (also deprecated) setYear() function

[…] interprets any two-digit number as an offset to 1900

In your case, getYear() returns a value like 121, which is not a two-digit number. When you subsequently invoke setYear() with that value, you get a date that is set to the year 121 instead of 2021.

Since getTime() returns the number of milliseconds since 1 January 1970, and the year 121 is well before 1970, you get a negative number.
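
A short demonstration of the effect (run while getYear() returns 121, i.e. in 2021):

var d = new Date('2021-04-10T09:00:00');
console.log(d.getYear());       // 121 -- years since 1900

d.setYear(d.getYear());         // 121 is not a two-digit number, so the year becomes literally 121
console.log(d.getFullYear());   // 121
console.log(d.getTime());       // a large negative number: year 121 is long before 1970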


TL;DR: use getFullYear() and setFullYear() instead of getYear() and setYear().
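
Applied to the sketch above, a corrected version might look like this (names still illustrative):

function isAfterEightAm(timestamp) {
  var cutoff = new Date();
  // setFullYear() takes the full four-digit year, plus optional month and day arguments.
  cutoff.setFullYear(timestamp.getFullYear(), timestamp.getMonth(), timestamp.getDate());
  cutoff.setHours(8, 0, 0, 0);
  return timestamp.getTime() >= cutoff.getTime();
}

console.log(isAfterEightAm(new Date('2021-04-10T09:30:00'))); // true
console.log(isAfterEightAm(new Date('2021-04-10T06:15:00'))); // false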
