ISO 8601 and ECMAScript - the headache of misreading standards

    We are developing an integration service for one very third-party system. The service itself runs on Node.js, and everything would have been fine, except that the server's unavailability during garbage collection really unnerved that third-party system.

    On New Year's Eve, we decided to give the server a present: an upgrade of Node.js from version 0.4.8 to 0.6.6. For a number of organizational reasons that I would rather not discuss here, the upgrade was performed directly on the production system, and without any regression testing.

    Could something have gone wrong in this situation?

    We upgraded and went on working. Then it suddenly turned out that in the messages sent by the third-party system, the time was shifted four hours ahead. The business consequences of such a shift hardly need explaining.

    We start thinking. The hypothesis: if everything worked before and broke right after the Node.js upgrade, the problem must be in Node.js itself or in V8. "Impossible," I say. "A bug that blatant, and we're the first to notice it? Surely our admin messed something up. He must have the time zone settings on the server wrong." We looked into every possible nook and cranny: no, everything was clean.

    That leaves the last, most improbable hypothesis: parsing of time in ISO 8601 format is broken. That is the format in which the third-party system sends time in its messages. What could possibly break there? A local time arrives: "2011-12-30T22:00:00". We quickly check on the server:

    > var d = new Date('2011-12-30T22:00:00')
    undefined
    > d
    Fri, 30 Dec 2011 22:00:00 GMT
    > d.getHours()
    2
    

    Unbelievable. The server apparently lives in GMT. It must have emigrated. We check:

    > var d = new Date('2011-12-30T22:00:00Z')
    undefined
    > d
    Fri, 30 Dec 2011 22:00:00 GMT
    > d.getHours()
    2
    

    In both cases, regardless of whether the time is given as local or as GMT, it is treated as GMT. Let's see what Chrome has to say about this:

    > var d = new Date('2011-12-30T22:00:00')
    undefined
    > d.toString()
    "Sat Dec 31 2011 02:00:00 GMT+0400 (MSK)"
    > d.getHours()
    2
    

    Now this is getting unpleasant. Chrome is running on a workstation, and the time zone there is definitely in order.

    While the developer taunts me with the question "So when do we file a bug against Google?", I read up on the standard. We don't have 130 Swiss francs to spare for the official document, so we study Wikipedia instead:
    If no UTC relation information is given with a time representation, the time is assumed to be in local time

    Digging further.

    > var d = new Date('2011-12-30T22:00:00')
    undefined
    > d.getTimezoneOffset() / 60
    -4
    
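    A note on the sign here, since it trips people up: getTimezoneOffset() returns (UTC minus local time) in minutes, so a zone ahead of UTC, like UTC+4, comes out negative. The relationship can be verified directly (a quick illustration of the API's convention, not part of the original investigation):

```javascript
// getTimezoneOffset() is defined as (UTC - local time) in minutes,
// so a zone ahead of UTC, such as UTC+4, yields -240 (i.e. -4 hours).
var d = new Date(2011, 11, 30, 22, 0, 0); // local 2011-12-30 22:00:00

// Minutes-of-day identity: UTC clock minus local clock equals the
// offset, modulo a full day (the mod handles date-line wraparound).
var utcMin = d.getUTCHours() * 60 + d.getUTCMinutes();
var localMin = d.getHours() * 60 + d.getMinutes();
var diff = ((utcMin - localMin - d.getTimezoneOffset()) % 1440 + 1440) % 1440;
console.log(diff); // 0 in any time zone
```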

    We get -4 on every computer and platform we test. Then a thought crosses my mind: so far we have tested everywhere except Firefox and except Windows. I check Firefox on Windows:

    > var d = new Date('2011-12-30T22:00:00')
    undefined
    > d.getUTCHours()
    18
    > d.getHours()
    22
    

    Checking the same thing in Chrome on Windows:

    > var d = new Date('2011-12-30T22:00:00')
    undefined
    > d.getUTCHours()
    22
    > d.getHours()
    2
    

    Whoa! So it really is a screw-up on Google's side? Again someone suggests writing a bug report. I'm still too lazy to compose it, so I start reading the documentation.

    The Mozilla Developer Network (MDN) description of the Date.parse method says:
    If you do not specify a time zone, the local time zone is assumed

    Here the time is parsed according to the ISO standard, so Firefox is doing the right thing.

    There is no standalone documentation for V8, but there is a link to ECMAScript. We download version 5.1 of the specification and read on page 181:
    The value of an absent time zone offset is "Z".

    There it is! The third-party system gives us local time in full accordance with the ISO standard, and our server interprets it as GMT, in full accordance with the ECMA standard.

    Just in case, I read MSDN too; Microsoft has long claimed that their JavaScript implementation is the most correct, because it follows ECMA. And sure enough:
    If you do not include a value in the Z position, UTC time is used.

    A look through the repositories showed that this change in V8's behavior was made in revision 8513 at the end of June 2011. When Mozilla was asked to make the same change on their side, however, the discussion pointed out the differences between the ISO and ECMA 5.1 standards. As a result, there is a chance that in version 6 of the ECMA standard, time will finally be understood correctly.

    In the meantime, we had to add a crutch to the code: if a time coming from the third-party system contains no time zone designator, we add getTimezoneOffset to it. Once the new ECMA standard, with this discrepancy fixed, is released and V8 is updated to follow it, we will have to keep track of that and remove the crutch.
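    Such a crutch can be sketched roughly like this (a minimal sketch, not the actual service code; the function name and the engine-behavior probe are ours). It first detects how the current engine treats a missing offset, so the correction only kicks in where V8's ECMA-style parsing is in effect:

```javascript
// Probe: does this engine parse a zone-less ISO string as UTC
// (ECMA 5.1 / V8 behavior) or as local time (ISO 8601 / Firefox behavior)?
var ZONELESS_PARSED_AS_UTC =
  new Date('2000-01-02T03:04:05').getTime() === Date.UTC(2000, 0, 2, 3, 4, 5);

// Parse an ISO 8601 timestamp, treating a missing zone designator as
// local time regardless of what the engine does by default.
function parseAsLocal(isoString) {
  var d = new Date(isoString);
  // A trailing 'Z' or an explicit +hh:mm / -hh:mm suffix means the zone is known.
  var hasZone = /(?:Z|[+-]\d{2}:?\d{2})$/.test(isoString);
  if (!hasZone && ZONELESS_PARSED_AS_UTC) {
    // The engine read the string as UTC; shift by getTimezoneOffset()
    // (UTC minus local, in minutes) to recover the intended local time.
    d = new Date(d.getTime() + d.getTimezoneOffset() * 60000);
  }
  return d;
}
```

    With this, parseAsLocal('2011-12-30T22:00:00').getHours() yields 22 on either kind of engine, while strings with an explicit zone pass through untouched. (A single-offset correction like this can still be off by an hour right around a DST transition, which is an accepted limitation of the sketch.)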

    P.S. The main blow in the events described fell on habrauser zerodivisi0n, who unfortunately has read-only rights. If anyone has enough karma (or whatever it takes) to promote him to a fuller status, we will both be very grateful, and he will then be able to answer questions himself.
