I am trying to make a filter based on hours for timestamps (in this example filter for all times after 8 am):
var beginningTimeValue = new Date('2020-01-01 08:00:00');
var unique = [
  ["value", "value", "value", "value", "12/01/2021 00:03:35", "value"],
  ["value", "value", "value", "value", "01/01/2020 00:03:35", "value"],
  ["value", "value", "value", "value", "01/01/2020 08:03:35", "value"],
  ["value", "value", "value", "value", "01/01/2020 13:03:35", "value"]
];
if (!beginningTimeValue == "") {
  unique = unique.filter(function (row) {
    var rYear = row[4].substring(6, 10);
    var rMonth = row[4].substring(3, 5);
    var rDay = row[4].substring(0, 2);
    var rHour = row[4].substring(11, 13);
    var rMinute = row[4].substring(14, 16);
    var rSecond = row[4].substring(17, 19);
    var bTime = new Date(parseInt(rYear, 10), parseInt(rMonth, 10), parseInt(rDay, 10),
                         parseInt(rHour, 10), parseInt(rMinute, 10), parseInt(rSecond, 10));
    console.log("ODATE = " + rYear + "/" + rMonth + "/" + rDay + "_" + rHour + ":" + rMinute + ":" + rSecond);
    console.log("BDATE = " + bTime.getFullYear() + "/" + bTime.getMonth() + "/" + bTime.getDate() + "_" + bTime.getHours() + ":" + bTime.getMinutes() + ":" + bTime.getSeconds());
    beginningTimeValue.setYear(bTime.getYear());
    beginningTimeValue.setMonth(bTime.getMonth());
    beginningTimeValue.setDate(bTime.getDate());
    if (bTime.getTime() >= beginningTimeValue.getTime()) {
      console.log(bTime.getTime() * 24 * 3600 * 1000 + " VS " + beginningTimeValue.getTime() * 24 * 3600 * 1000);
    } else {
      console.log("FALSE");
    }
    return bTime.getTime() >= beginningTimeValue.getTime();
  });
}
console.log(unique);
I have debugged my way to finding out that I never get a FALSE value in the 2nd IF; however, I am at a loss as to why the .getTime() function returns vastly different values in my console log:
“136556220816000000000 VS -5.0438323821312e+21”
“136560264336000000000 VS -5.0438323821312e+21”
Answer
The problem is in the following line:
beginningTimeValue.setYear(bTime.getYear());
The (deprecated) getYear() function returns

[a] number representing the year of the given date, according to local time, minus 1900.
The (also deprecated) setYear() function

[…] interprets any two-digit number as an offset to 1900.
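The quoted behavior can be seen in a small standalone sketch (the variable names here are illustrative, not from the question):

```javascript
// setYear() interprets two-digit values as offsets from 1900,
// and everything else as an absolute year.
const d = new Date(2020, 0, 1);

d.setYear(21);                            // two-digit: 1900 + 21
const twoDigitResult = d.getFullYear();   // 1921

d.setYear(121);                           // not two-digit: taken literally
const literalResult = d.getFullYear();    // 121

console.log(twoDigitResult, literalResult);
```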
In your case, getYear() returns a value like 121 (for 2021), which is not a two-digit number. When you subsequently invoke setYear() with that value, it is treated as an absolute year, so your date ends up in the year 121 instead of 2021.
Since getTime() returns the number of milliseconds since 1 January 1970, and the year 121 is long before 1970, you get a large negative number.
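Here is a reconstruction of the bug with hard-coded sample values from the question (a sketch; the variable names follow the original code):

```javascript
const bTime = new Date(2021, 11, 1, 0, 3, 35);       // 1 Dec 2021, 00:03:35
const beginningTimeValue = new Date(2020, 0, 1, 8);  // 1 Jan 2020, 08:00

console.log(bTime.getYear());                        // 121 (2021 - 1900)

beginningTimeValue.setYear(bTime.getYear());         // year becomes 121 AD!
console.log(beginningTimeValue.getFullYear());       // 121
console.log(beginningTimeValue.getTime() < 0);       // true: 121 AD is before 1970
```

Note that the console.log in the question additionally multiplies both timestamps by 24\*3600\*1000, which is why a negative value on the order of -5.8e13 ms shows up as the -5.04e+21 seen in the output.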
TL;DR: use getFullYear() and setFullYear() instead of getYear() and setYear().
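With the non-deprecated accessors, the normalization step behaves as intended (a sketch with hard-coded sample values; in the question this runs inside the filter callback):

```javascript
const beginningTimeValue = new Date(2020, 0, 1, 8, 0, 0); // filter threshold: 08:00
const bTime = new Date(2021, 11, 1, 0, 3, 35);            // row timestamp

beginningTimeValue.setFullYear(bTime.getFullYear());      // 2021, not 121
beginningTimeValue.setMonth(bTime.getMonth());
beginningTimeValue.setDate(bTime.getDate());

// Both dates now fall on the same calendar day, so comparing getTime()
// effectively compares only the time of day:
console.log(bTime.getTime() >= beginningTimeValue.getTime()); // false: 00:03 < 08:00
```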