The UNIX timestamp is used by many computers to record the time of events. It counts the seconds that have elapsed since 00:00:00 UTC on January 1st, 1970 (the Unix epoch).
But numbers in computers can't be arbitrarily big, because you have to allocate a fixed amount of memory for them. UNIX timestamps were traditionally stored as signed 32-bit integers, which lets you store numbers up to 2,147,483,647 (that's 2^31 - 1).
On January 19th, 2038 at 03:14:07 UTC, more seconds than that will have passed since the epoch, and the resulting overflow could lead to problems not unlike the Y2K bug.
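For illustration, here's a minimal C sketch (not part of the original thread) that treats the 32-bit maximum as a Unix timestamp and prints the exact UTC moment a signed 32-bit counter runs out:

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    /* The largest value a signed 32-bit counter can hold. */
    int32_t max32 = INT32_MAX;  /* 2,147,483,647 */

    /* Interpret it as seconds since the Unix epoch and format
     * the corresponding UTC date: the last second a 32-bit
     * time_t can represent. */
    time_t t = (time_t)max32;
    struct tm *utc = gmtime(&t);

    char buf[64];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", utc);
    printf("32-bit time_t overflows after: %s\n", buf);
    /* Prints: 32-bit time_t overflows after: 2038-01-19 03:14:07 UTC */
    return 0;
}
```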
@gudenau There's nothing wrong with signed values, as long as you have enough bits of precision to support them. 64 bits gives you enough range to measure as far back as the origin of the universe, with a future wrap-around roughly 292 billion years from now.
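A quick back-of-the-envelope check of that range (a sketch, assuming a year of 365.25 days):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* A signed 64-bit second counter tops out at 2^63 - 1. */
    int64_t max64 = INT64_MAX;  /* 9,223,372,036,854,775,807 */

    /* Approximate seconds in a year of 365.25 days. */
    const int64_t secs_per_year = 365LL * 24 * 3600 + 6 * 3600; /* 31,557,600 */

    /* Years representable on either side of the 1970 epoch,
     * reported in billions. */
    printf("Range: about +/- %lld billion years\n",
           (long long)(max64 / secs_per_year / 1000000000LL));
    /* Prints: Range: about +/- 292 billion years */
    return 0;
}
```

Since the value is signed, the same ~292 billion years extends into the past as well, comfortably covering the ~13.8 billion years back to the Big Bang.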
@vertigo I guess stuff did happen before 0.
@fribbledom Oh wow, did not know that.