
Why Does JavaScript Interpret 1/1/0000 As 1/1/2000?

I encountered some strange behaviour, and while I don't really have an issue with it, I would like to know the reasoning behind it. I wrote the statement console.log(new Date('1/1/0000')), and the date it logged was January 1st, 2000.

Solution 1:

This is browser-specific.

This is yet another area where JavaScript and browser inconsistencies cause extra programmer pain. Some browsers interpret all two-digit years as 19xx, so new Date('1/1/49') gives January 1st, 1949, and new Date('1/1/50') gives January 1st, 1950. Others treat 50 as the two-digit-year cutoff, so new Date('1/1/49') gives January 1st, 2049, while new Date('1/1/50') still gives January 1st, 1950.
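A quick way to see this for yourself is to parse the two boundary dates and log the parsed year; the comments below describe the cutoff behaviour, and the results will differ in browsers that map everything to 19xx:

    // In browsers with the 49/50 cutoff (e.g. Chrome, Safari) the first
    // line logs 2049; in browsers that map everything to 19xx it logs 1949.
    // The second line logs 1950 either way.
    console.log(new Date('1/1/49').getFullYear());
    console.log(new Date('1/1/50').getFullYear());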

Two Digit Years in JavaScript - Chris Bristol

You have to bear in mind that the RFC document you've referenced was published in April 2001, when the 1900s had only just ended. I guess that, in an attempt to modernise dates away from the 1900s, some browsers now map two-digit years 0 to 49 to 2000 to 2049; 50, however, still gets mapped to 1950.
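The ambiguity lives in the string parser only. As a small sketch of how to sidestep it, the numeric Date constructor maps years 0 to 99 to 1900 to 1999, and setFullYear lets you set any year, including 0, explicitly:

    // The numeric constructor treats years 0-99 as 1900-1999, so this
    // logs 1900 in every engine:
    console.log(new Date(0, 0, 1).getFullYear());

    // To get an unambiguous year, including years below 100, set it
    // explicitly on an existing Date:
    const d = new Date(2000, 0, 1);
    d.setFullYear(0);              // the actual year 0
    console.log(d.getFullYear());  // logs 0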

The article quoted above also goes on to give some test results:

Here is a brief summary of the results:

  • IE9: No two digit year cutoff found. Year 00 = 1900.
  • Chrome 24.0: Two digit years change between 49 and 50. Year 00 = 2000.
  • Opera: No two digit year cutoff found. Year 00 = 1900.
  • Firefox: No two digit year cutoff found. Year 00 = 1900.
  • Safari: Two digit years change between 49 and 50. Year 00 = 2000.

The article is quite dated now, though, so I imagine the data for Opera and Firefox above may have changed.
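If you would rather check your own browser than rely on the table above, a short sketch like the one below (the findTwoDigitCutoff name is just for illustration) walks every two-digit year and reports where, if anywhere, the parsed century flips:

    // Parse each two-digit year and report the point where the parsed
    // full year drops (e.g. 2049 -> 1950). If it never drops, the
    // browser has no cutoff and maps every two-digit year to 19xx.
    function findTwoDigitCutoff() {
      let previous = null;
      for (let yy = 0; yy < 100; yy++) {
        const year = new Date('1/1/' + String(yy).padStart(2, '0')).getFullYear();
        if (previous !== null && year < previous) {
          console.log('Cutoff between ' + (yy - 1) + ' and ' + yy);
        }
        previous = year;
      }
      console.log('Year 00 = ' + new Date('1/1/00').getFullYear());
    }

    findTwoDigitCutoff();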
