If only I had a dollar for every time my question "in what time zone are your timestamps?" is answered with a deadpan "it's just a number"…

Yes, it is just a number. What it means is that the time zone information is not provided with it, and has to be either assumed or provided by other means. This is exactly why I'm asking the question.

What the answer is telling me is that you rely on your programming environment to make that assumption for you, and you don't even know.


@isagalaev Do you mean unix timestamps? My understanding was that it was seconds since 00:00:00 UTC on 1 January 1970. So when converting to a time you could read it in any time zone, but the timestamp itself doesn't have one. Have I been misunderstanding this?

@badtuple you just said both "UTC" and "itself doesn't have one" about the same thing :-)

So yes, a timestamp doesn't carry any information about the time zone (same as the string "1970-01-01T00:00:00"). Many people, however, believe that a time represented by an integer must be assumed to mean time in UTC. Even though popular libraries like "moment" in JS and datetime.fromtimestamp() in Python's stdlib use the local time zone when constructing full datetime objects from timestamps, for example.
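
To make the Python part of that concrete, a minimal sketch using only the standard library (the US/Eastern reading in the comment is just an illustration of one possible local zone):

    from datetime import datetime, timezone

    ts = 0  # the epoch itself

    # With no tz argument, fromtimestamp() converts to the machine's local time
    # and returns a naive datetime, e.g. 1969-12-31 19:00:00 on a US/Eastern box.
    print(datetime.fromtimestamp(ts))

    # With an explicit tz it returns an aware datetime pinned to that zone:
    # 1970-01-01 00:00:00+00:00
    print(datetime.fromtimestamp(ts, tz=timezone.utc))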

@isagalaev No no, I meant the epoch itself is defined as 1970-01-01 00:00:00 UTC. The timestamp (which is just seconds since then) isn't necessarily UTC; it's just a number of seconds.

So you can consider the epoch in different time zones (for instance 01:00:00 BST). Then you just add the "seconds since" to the epoch as expressed in the time zone you're "viewing" it in. That way the timestamp itself is tz-agnostic.

I'm willing to be wrong about this, but it really seems like it's how unix timestamps are defined.
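
To put that "same instant, viewed in different zones" idea into code, a rough Python sketch (the timestamp value is arbitrary):

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo  # Python 3.9+

    ts = 1_000_000_000  # an arbitrary timestamp

    utc = datetime.fromtimestamp(ts, tz=timezone.utc)
    london = utc.astimezone(ZoneInfo("Europe/London"))

    print(utc)            # 2001-09-09 01:46:40+00:00
    print(london)         # 2001-09-09 02:46:40+01:00 (BST)
    print(utc == london)  # True: aware datetimes compare as instants, not wall clocks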

@badtuple if you treat a timestamp as seconds since a concrete moment in UTC, then you *do* necessarily get an unambiguous moment in time. You can represent it in other time zones, but the important part is that it's unambiguous.

What I'm talking about is that this assumption — that the epoch is in UTC — is not what happens in practice. The libraries I mentioned would happily consider timestamps to be seconds from 1970-01-01 00:00:00 *in the local TZ*. (cont.)

@badtuple this makes timestamps ambiguous: you have to somehow communicate your assumption along with it if you want to communicate a concrete moment in time (sometimes you don't).

@badtuple actually, let me revise the bit about JS 'moment' and Python 'datetime': they both do indeed consider timestamps to be offsets from a concrete UTC point, so I was wrong about that. Still, there's a lot of code out there that doesn't do that and just works in the naive (tz-less) space.
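
A rough sketch of how that naive (tz-less) space bites, again in Python and purely illustrative: two naive datetimes built from the same timestamp under different assumptions have the same type, disagree in value on any machine not set to UTC, and neither one records which assumption produced it.

    from datetime import datetime, timezone

    ts = 1_000_000_000

    local_naive = datetime.fromtimestamp(ts)  # local wall clock, tzinfo dropped
    utc_naive = datetime.fromtimestamp(ts, tz=timezone.utc).replace(tzinfo=None)  # UTC wall clock, tzinfo dropped

    print(local_naive == utc_naive)  # False unless the local zone is UTC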

@isagalaev Well... We have a machine that intakes, transforms, and pushes out data. Each piece of data gets a long epoch timestamp added as it is pushed out. The clock isn't corrected. The timestamp is only to help identify a locked process that might result in us pushing out copies of old data. Nobody cares about accurate time or what time zone was used. We could have incremented a counter. We could have used randoms. We used time.

It's just a number. Sometimes that's okay.

@ericphelps obviously! As long as all parts of the code know about this assumption, it's fine. What I'm saying is "it's just a number" doesn't communicate any assumption. It's different from "we use it as an opaque monotonically increasing number".
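
For contrast, a tiny Python sketch of what "an opaque monotonically increasing number" could look like when the intent is spelled out (purely illustrative, not how that device works):

    import itertools
    import time

    # If the value is only an opaque change/ordering token, either of these makes
    # that intent explicit and drags no time-zone question along with it:
    counter = itertools.count()
    t1, t2 = next(counter), next(counter)
    assert t2 > t1

    m1, m2 = time.monotonic_ns(), time.monotonic_ns()
    assert m2 >= m1  # a monotonic clock never goes backwards within a process,
                     # unlike time.time(), which can if the system clock is adjusted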

@isagalaev In all honesty, when I first got hired, I tried to use the time output of that device. C'mon, it was labeled "time" in the XML, so why not? I found it was wildly inaccurate and varied between devices. I had to have it explained to me that the consuming server just used it to look for differences between successive message blocks. So yes, documenting it (and naming it appropriately) is important.
