Time is a mess. Always. The author only scratched the surface of all the issues. Even if we exclude the relativistic time dilation that affects GPS/GNSS satellites - whether it is due to the difference in gravitational pull or to their relative speed over ground - it's still a mess.
Timezones; sure. But what about before timezones came into use? Or even halfway through - which timezone applies, considering Königsberg used CET when it was part of Germany, but switched to EET after it became Russian? There are even countries whose timezones differ by 15 minutes.
And don't get me started on daylight saving time. There's been at least one instance where DST was - and was not - in use in Lebanon at the same time! Good luck booking an appointment...
Not to mention the transition from the Julian calendar to the Gregorian, which took place over many, many years - varying from country to country - as defined by the country borders of the time...
We've even had countries that forgot to insert a leap day in certain years, causing March 1 to occur on different days altogether for a couple of years.
Time is a mess. It is, always has been, and always will be.
Messy.
voidUpdate · 20m ago
Even worse, there are some areas where the timezone depends on your religion. Lebanon had "Muslim time" and "Christian time" at one point (Unsure if that's still a thing)
johnisgood · 2h ago
It is; there are timezones that differ not just by whole hours but by 30 and even 45 minutes. India is UTC+5:30, Lord Howe Island is UTC+10:30 / +11:00, the Chatham Islands (New Zealand) are UTC+12:45 / +13:45, Iran is UTC+3:30 / +4:30, and so on. Where the format is X / Y, X is standard time and Y is daylight time.
I think the full list can be found here: https://www.timeanddate.com/time/time-zones-interesting.html
You can use a Bash script to build an exhaustive list from the files under /usr/share/zoneinfo/, i.e. to find the timezones with non-whole-hour offsets.
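Not a Bash script, but a rough Python sketch of the same idea, assuming the stdlib zoneinfo module can see the system tz database:

# List IANA zones whose current UTC offset is not a whole number of hours.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo, available_timezones

now = datetime.now(timezone.utc)
for name in sorted(available_timezones()):
    offset = now.astimezone(ZoneInfo(name)).utcoffset()
    if offset.total_seconds() % 3600 != 0:
        print(name, offset)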
volemo · 2h ago
I don’t understand this. What practical difference does it make to round the time to the nearest quarter of an hour instead of the nearest hour? Personally, I don’t care if noon (sun at its zenith) happens half an hour before 12:00 or half an hour after.
Why do such time zones exist?
johnisgood · 1h ago
Well, I do not know the answer to that; my guess is that it is for historical, political, geographical, and socio-economic reasons.
For example, India had two timezones before it adopted a compromise: UTC+5:30.
Nepal uses UTC+5:45, partly to distinguish itself from Indian Standard Time, reinforcing national identity.
minkzilla · 11h ago
The author covers how IANA handles Königsberg; it is logically its own timezone.
An IANA timezone uniquely refers to the set of regions that not only share the same current rules and projected future rules for civil time, but also share the same history of civil time since 1970-01-01 00:00+0. In other words, this definition is more restrictive about which regions can be grouped under a single IANA timezone, because if a given region changed its civil time rules at any point since 1970 in a way that deviates from the history of civil time for other regions, then that region can't be grouped with the others.
I agree that time is a mess. And the 15 minute offsets are insane and I can't fathom why anyone is using them.
% zdump -i Europe/Warsaw | head
TZ="Europe/Warsaw"
- - +0124 LMT
1880-01-01 00 +0124 WMT
1915-08-04 23:36 +01 CET
1916-05-01 00 +02 CEST 1
1916-10-01 00 +01 CET
1917-04-16 03 +02 CEST 1
1917-09-17 02 +01 CET
1918-04-15 03 +02 CEST 1
% zdump -i Europe/Kaliningrad | head -20
TZ="Europe/Kaliningrad"
- - +0122 LMT
1893-03-31 23:38 +01 CET
1916-05-01 00 +02 CEST 1
1916-10-01 00 +01 CET
1917-04-16 03 +02 CEST 1
1917-09-17 02 +01 CET
1918-04-15 03 +02 CEST 1
1918-09-16 02 +01 CET
1940-04-01 03 +02 CEST 1
1942-11-02 02 +01 CET
1943-03-29 03 +02 CEST 1
1943-10-04 02 +01 CET
1944-04-03 03 +02 CEST 1
1944-10-02 02 +01 CET
1945-04-02 03 +02 CEST 1
1945-04-10 00 +02 EET
1945-04-29 01 +03 EEST 1
1945-10-31 23 +02 EET
%
drob518 · 10h ago
Yep. Fortunately, a lot of apps can get by with just local civil time and an OS-set timezone. It’s much less common that they need to worry about leap seconds, etc. And many also don’t care about millisecond granularity, etc. If your app does care about all that, however, things become a mess quite quickly.
yen223 · 8h ago
The way Google implemented leap seconds wasn't by sticking a 23:59:60 second at the end of 31st Dec. The way they did it was more interesting.
What they did instead was to "smear" it across the day, by adding 1/86400 of a second to every second on 31st Dec. That 1/86400 s is well within the margin of error for NTP, so computers could carry on doing what they do without throwing errors.
Edit: They smeared it from the noon before the leap second to the noon after, i.e. 31st Dec 12pm - 1st Jan 12pm.
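A toy illustration of that linear smear (not Google's production code; the 24-hour window is the noon-to-noon interval from the edit):

# Spread one positive leap second linearly over a 24-hour window, so each
# reported second lasts 1 + 1/86400 SI seconds.
LEAP = 1.0         # leap seconds to smear
WINDOW = 86_400.0  # reported seconds in the smear window (noon to noon)

def reported_seconds(true_elapsed):
    # Seconds shown by the smearing clock after `true_elapsed` SI seconds.
    return true_elapsed / (WINDOW + LEAP) * WINDOW

# Over the whole window the clock shows 86400 s while 86401 SI seconds pass,
# ending the window exactly in step with post-leap-second UTC.
print(reported_seconds(WINDOW + LEAP))  # 86400.0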
http://rachelbythebay.com/w/2025/01/09/lag/
That was probably at Move Fast And Break Things Corp, instead of We Used To Be Do No Evil Inc.
BlackFly · 49m ago
> How does general relativity relate to the idea of time being a universal, linear, forward-moving "entity"?
TAI provides a time coordinate generated by taking the weighted average of the proper times of 450 world lines tracked by atomic clocks. Like any other time coordinate, it provides a temporal orientation but no time coordinate could be described as "universal" or "linear" in general relativity. It would be a good approximation to proper time experienced by most terrestrial observers.
Note that general relativity doesn't add much over special relativity here (the different atomic clocks will have different velocities and accelerations due to altitude and so have relative differences in proper time along their world lines). If you already have a sufficiently general notion of spacetime coordinates, the additional curvature from general relativity over Minkowski space is simply an additional effect changing the relation between the coordinate time and proper time.
johnisgood · 2h ago
> One way the website could handle this is by storing the user's exact input 2026-06-19 07:00, and also store the UTC+0 version of that datetime (if we assumed that the timezone rules won't change); this way, we can keep using the UTC+0 datetime for all logic, and we can recompute that UTC+0 datetime once we detect that the time rules for that timezone have changed.
Well, how do we know what timezone is "2026-06-19 07:00" in, to be able to know that the time rules for that timezone have changed, if we do not store the timezone?
Additionally, how do we really "detect that the time rules for that timezone have changed"? We can stay informed, sure, but is there a way to automate this?
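For what it's worth, the usual answer is that you store the IANA zone identifier alongside the wall-clock input; a minimal sketch with made-up names:

from dataclasses import dataclass
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

@dataclass
class StoredEvent:
    wall_clock: str  # exactly what the user typed, e.g. "2026-06-19 07:00"
    zone: str        # IANA zone the user chose, e.g. "Europe/Warsaw"
    utc: datetime    # derived instant used by the scheduling logic

def to_utc(wall_clock, zone):
    local = datetime.strptime(wall_clock, "%Y-%m-%d %H:%M").replace(tzinfo=ZoneInfo(zone))
    return local.astimezone(timezone.utc)

def recompute(event):
    # Re-derive the UTC instant from the stored wall clock + zone after a
    # tzdata update touches that zone; the user's input never changes.
    return StoredEvent(event.wall_clock, event.zone, to_utc(event.wall_clock, event.zone))

As for detecting the rule change, the usual automation is to track tzdata releases (IANA's tz-announce list, or simply the OS package update) and rerun the recomputation for the zones mentioned in the release notes.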
karmakaze · 10h ago
Two things that aren't really covered:
- system clock drift. Google's instances have accurate timekeeping using atomic clocks in the datacenter, and leap seconds smeared over a day. For accurate duration measurements, this may matter.
- consider how the time information is consumed. For a photo sharing site the best info to keep with each photo is a location, and local date time. Then even if some of this is missing, a New Year's Eve photo will still be close to midnight without considering its timezone or location. I had this case and opted for string representations that wouldn't automatically be adjusted. Converting it to the viewer's local time isn't useful.
bigiain · 4h ago
The calendar event scheduling problem is hard too.
If I'm in Sydney and I accept a 4pm meeting in 3 weeks' time, say 4pm July 15 2025 in San Francisco, how should my calendar store that event's datetime and how does my calendar react to my phone changing locations/timezones?
And now try and work that out if the standard/summertime changeover happens between when the event is created in one timezone and the actual time the event (is supposed to) occur. Possibly two daylight savings time changes if Sydney goes from winter to summer time and San Francisco goes from summer to winter time - and those changeovers don't happen at the same time, perhaps not even the same week.
valenterry · 1h ago
That's easy though. An event of such type is about an absolute point in time, so your calendar stores it like that and then displays it in your current timezone (or whatever one you specify).
When you change locations and you have your calendar configured to show events in "the" timezone of your location, it does so. And should there be no clear timezone, it should ask you.
Very simple problem and simple solutions. There are much harder problems imho.
As you can see, the summertime change doesn't even matter here.
bigiain · 50m ago
But I don't want my San Francisco meeting to display in my calendar as 1am the day before when I'm in Sydney, then switch to 4pm on Tuesday once I'm in California. And I sure as hell don't want the displayed time in Sydney to switch from 1am to 11pm or 3am just because daylight savings kicked in.
It's a 4pm Tuesday meeting. I want it to show as 4pm while I'm in Sydney, 4pm while I'm on a stopover in Hawaii, and correctly alert me for my 4pm meeting when I'm in San Francisco. And it probably should alert me at 4pm San Francisco time even if I'm not there, in case I missed my connecting flight in Hawaii and I want to call in at the correct time. And that last requirement conflicts with the "I want it to show as 4pm while I'm on a stopover in Hawaii" requirement, because I'm human and messy and I want the impossible without expending any effort to make it happen.
I'm pretty sure there is no "simple solution" for getting the UX right so I can add a meeting in San Francisco on my phone while I'm in Sydney, and have it "just work" without it always bugging me by asking for timezones.
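One common compromise (a hedged sketch with hypothetical names, not anyone's actual calendar implementation) is to store the meeting's wall-clock time plus the IANA zone of the venue and derive everything else from current tzdata: the app can then either always display the venue-local "4pm" or convert to the viewer's zone, and it can fire the alert at the correct instant either way:

from datetime import datetime
from zoneinfo import ZoneInfo

MEETING_WALL_TIME = "2025-07-15 16:00"    # "4pm Tuesday", as scheduled
MEETING_ZONE = "America/Los_Angeles"      # the venue: San Francisco

def as_instant(wall, zone):
    return datetime.strptime(wall, "%Y-%m-%d %H:%M").replace(tzinfo=ZoneInfo(zone))

def render_for(viewer_zone):
    # What a calendar converting to the viewer's zone would display.
    local = as_instant(MEETING_WALL_TIME, MEETING_ZONE).astimezone(ZoneInfo(viewer_zone))
    return local.strftime("%Y-%m-%d %H:%M %Z")

print(render_for("Australia/Sydney"))     # the 9am-next-day view from Sydney
print(render_for("Pacific/Honolulu"))     # the stopover view
print(render_for("America/Los_Angeles"))  # 16:00 PDT at the venue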
mzl · 1h ago
As many others have said, time and calendars are messy, and there is often no correct solution, just a bunch of trade-offs. Jon Skeet's Storing UTC is not a Silver Bullet (https://codeblog.jonskeet.uk/2019/03/27/storing-utc-is-not-a...) was very influential for me in realizing some of the subtleties in what a point in time means for a user, and how that should influence the design of a system.
wpollock · 9h ago
Very nice write up! But I think your point that time doesn't need to be a mess is refuted by all the points you made.
I know you had to limit the length of the post, but time is an interest of mine, so here's a couple more points you may find interesting:
UTC is not an acronym. The story I heard was that the English acronym would be "CUT" (the name is "coordinated universal time") and the French complained, the French acronym would be "TUC" and the English-speaking committee members complained, so they settled for something that wasn't pronounceable in either. (FYI, "ISO" isn't an acronym either!)
Leap seconds caused such havoc (especially in data centers) that no further leap seconds will be used. (What will happen in the future is anyone's guess.) But for now, you can rest easy and ignore them.
I have a short list of time (and NTP) related links at <https://wpollock.com/Cts2322.htm#NTP>.
> other epochs work too (e.g. Apollo_Time in Jai uses the Apollo 11 rocket landing at July 20, 1969 20:17:40 UTC).
I see someone else is a Vernor Vinge fan.
But it's kind of a wild choice for an epoch, when you're very likely to be interfacing with systems whose Epoch starts approximately five months later.
r2_pilot · 9h ago
That's kind of the point of software archeology, isn't it? Sometimes something so evident to people within the first few hundred years becomes opaque in reasoning later on, and what's 5 months anyway? You'd need a Rosetta stone to be sure you were even off in time, otherwise you just might have a few missing months that historians couldn't account for.
Raphell · 5h ago
I never really took time seriously until one of my cron jobs skipped execution because of daylight saving. That was the moment I realized how tricky time actually is.
This article explains it really well. The part about leap seconds especially got me. We literally have to smear time to keep servers from crashing. That’s kind of insane.
bigiain · 4h ago
I avoid running "daily" cron jobs or other scheduled tasks around 2am for that reason; they might not get run, or they might get run twice.
Where practical I schedule them around 12:00 (but I'm sure one day I'll get stung by some odd country that chooses to implement its daylight savings changeover in the middle of the day).
quantike · 11h ago
Nice post. I think about time... all the time haha. There's another source you might enjoy (Re: your NTP and synchronization question) from TigerBeetle: [Implementing Time](https://www.youtube.com/watch?v=QtNmGqWe73g)
One thing I found out when programming a timeline lately was that year zero doesn’t exist. It just goes from -1 to 1. Which looks very weird if you want to display intervals of e.g. 500 years: -1001, -501, -1, 500, 1000, etc
zokier · 1h ago
ISO 8601, one of the most common date notations, does have year zero. It really depends on what specific calendar you use. The Gregorian calendar itself only starts in 1582; anything before that is some sort of extension (the proleptic Gregorian calendar).
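The off-by-one between the two conventions is the part that bites; a small sketch of the mapping between historical BC/AD years and astronomical/ISO 8601 numbering (which has a year 0):

# 1 BC is astronomical/ISO year 0, 2 BC is year -1, and so on; AD years match.
def bc_to_iso(year_bc):
    return 1 - year_bc               # e.g. 1 BC -> 0, 1000 BC -> -999

def iso_to_historical(year):
    return f"{year} AD" if year > 0 else f"{1 - year} BC"

print(bc_to_iso(1))                  # 0
print(iso_to_historical(-1001))      # 1002 BC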
nzach · 10h ago
> What explains the slowdown in IANA timezone database updates?
My guess is that, with the increasing dependency on digital systems in our lives, the edge cases where these rules aren't properly updated cause increasing amounts of pain "for no good reason".
In Brazil we recently changed our DST rules - it was around 2017/2018 - and it caused a lot of confusion. I was working with a system where these changes were really important, so I was aware of this change ahead of time. But there are a lot of systems running without much human intervention, and they are mostly forgotten until someone notices a problem.
klabb3 · 7h ago
It’s quite different from how I think about time, as a programmer. I treat human time and timezones as approximate. Fortunately I’ve been spared from working on calendar/scheduling for humans, which sounds awful for all the reasons mentioned.
Instead I mostly use time for durations and for happens-before relationships. I still use Unix flavor timestamps, but if I can I ensure monotonicity (in case of backward jumps) and never trust timestamps from untrusted sources (usually: another node on the network). It often makes more sense to record the time a message was received than trusting the sender.
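A minimal sketch of the duration half of that, assuming Python: use the monotonic clock, which can't jump backwards when NTP steps the wall clock:

import time

start = time.monotonic()
time.sleep(0.25)                     # stand-in for the work being timed
elapsed = time.monotonic() - start   # a duration, immune to wall-clock steps
print(f"{elapsed:.3f}s")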
That said, I am fortunate to not have to deal with complicated happens-before relationships in distributed computing. I recall reading the Spanner paper for the first time and being amazed how they handled time windows.
zokier · 10h ago
It is a pet peeve of mine, but any statement that implies that Unix time is a count of seconds since the epoch is annoyingly misleading and perpetuates that misconception. Imho a better mental model for Unix time is that it has two parts - days since the epoch * 86400, and seconds since midnight - which get added together.
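A quick sanity check of that mental model (POSIX timestamps ignore leap seconds, so the identity holds for any civil UTC time):

# unix_time == days_since_epoch * 86400 + seconds_since_midnight,
# because Unix time pretends every day is exactly 86400 seconds long.
from datetime import date, datetime, timezone

t = datetime(2016, 12, 31, 23, 59, 59, tzinfo=timezone.utc)  # just before a leap second
days = (t.date() - date(1970, 1, 1)).days
secs = t.hour * 3600 + t.minute * 60 + t.second
assert int(t.timestamp()) == days * 86400 + secs  # 1483228799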
valenterry · 1h ago
But it's correct. It's "a" count. Just not the count that you might always expect. And the "second" in this definition means what people usually understand as a second, as in the duration is always the same. That's all, and it's pretty useful imho.
zokier · 42m ago
> And the "second" in this definition means what people usually understand as a second, as in the duration is always the same.
Umm what? In Unix time some values span two seconds, which is the crux of the problem. In UTC every second is a proper nice SI second. In Unix time the value increments every one or two SI seconds.
charcircuit · 9h ago
How is it misleading? The source code of UNIX literally has time as a variable of seconds that increments every second.
adgjlsfhk1 · 8h ago
leap seconds
LegionMammal978 · 7h ago
Also, UTC had a different clock rate than TAI prior to 1972. And TAI itself had its reference altitude adjusted to sea level in 1977.
smurpy · 6h ago
We don’t have much trouble yet with relativistic temporal distortions, but Earth’s motion causes us to lose about 0.152 seconds per year relative to the Solar system. Likewise we lose about 8.5 seconds per year relative to the Milky Way. I wonder when we’re going to start to care. Presumably there would be consideration of such issues while dealing with interplanetary spacecraft, timing burns and such.
zokier · 1h ago
It is why we are introducing LTC, Coordinated Lunar Time. Apparently the relativistic effects on the Moon are already big enough to make using UTC problematic.
Bjartr · 6h ago
GPS satellite clocks have to be corrected for the combined relativistic effects of moving fast (which slows them by about 7 microseconds per day) and sitting significantly farther out of Earth's gravity well (which speeds them up by about 45 microseconds per day). Without this, the net drift of roughly 38 microseconds per day would accumulate into around 11 km of positioning error per day compared to earthbound clocks.
https://www.gpsworld.com/inside-the-box-gps-and-relativity/
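A back-of-the-envelope check of those figures (the microsecond numbers are the commonly quoted ones, not taken from the linked article):

c = 299_792_458                  # speed of light, m/s
drift_per_day = 45e-6 - 7e-6     # net clock drift: gravity minus velocity, s/day
print(c * drift_per_day / 1000)  # ~11.4 km of ranging error per day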
But I hate how when I stack my yearly weather charts, every four years either the graph is off by one day so it is 1/366th narrower and the month delimiters don't line up perfectly, or I have to duplicate Feb 28th so there is no discontinuity in the lines. Still not sure how to represent that, but it sure bugs me.
dijksterhuis · 10h ago
I think this is one of my favourite write ups on HN for a while. I miss seeing more things like this.
drob518 · 10h ago
Me too
a_t48 · 5h ago
I’m all about monotonic time everywhere after having seen too many badly configured time sync settings. :)
TZubiri · 1h ago
>Two important concepts for describing time are "durations" and "instants"
The standard name for a duration in physics is a "period", written as uppercase T (lowercase t being a point in time), which curiously enough is the inverse of a frequency (or the frequency is the inverse of it). A period can also be thought of as an interval [t0, t1], i.e. the points t with t0 <= t <= t1, with T = t1 - t0.
> The concept of "absolute time" (or "physical/universal time") refers to these instants, which are unique and precisely represent moments in time, irrespective of concepts like calendars and timezones.
Funnily enough, you mean the opposite. An absolute time physically does not exist, just as an absolute distance does not: there is no kilometer 0. Every measurement is relative to another; in the case of time you might measure relative to the birth of (our Lord and saviour) Jesus Christ. But you never have time "irrespective" of something else, and if you do, you are probably referring to a period with an implicit origin. For example, if I say a length of 3m, I mean an object whose distance from one end to the other is 3m. And if I say 4 minutes of a song, I mean that the end is 4 minutes after the start, in the same way that a direction might be represented by a 2D vector [1,1] only because we are assuming a relationship to [0,0].
That said, it's clear that you have a lot of knowledge about calendars from practical software experience of implementing time features in global products; I'm just explaining time from the completely different framework of classical physics, which is of course of little use when trying to figure out whether 6PM in Buenos Aires and 1PM in 6 months in California will be the same time.
lionelholt · 10h ago
... humans don't generally say
"Wanna grab lunch at 1,748,718,000 seconds from the Unix epoch?"
I'm totally going to start doing that now.
moffkalast · 10h ago
Obligatory falsehoods programmers believe about time:
https://gist.github.com/timvisee/fcda9bbdff88d45cc9061606b4b...
In a nutshell if you believe anything about time, you're wrong, there is always an exception, and an exception to the exception. And then Doc Brown runs you over with the Delorean.
kevindamm · 8h ago
Marty!! We have to go back...
to string representations!
jcranmer · 3h ago
> What's the history of human timekeeping? Particularly before the Gregorian calendar, what historical records do we have for who was tracking/tallying the days elapsed over time? How did people coordinate on the current date globally (if at all)? How did local mean time (LMT) work in the past?
Ooh, this is a really interesting topic!
Okay, so the first thing to keep in mind is that there are three very important cyclical processes that play a fundamental role in human timekeeping and have done so since well before anything we could detect archaeologically: the daily solar cycle, the lunar cycle (whence the month), and the solar year. All of these are measurable with mark 1 human eyeballs and nothing more technologically advanced than a marking stick.
For most of human history, the fundamental unit of time from which all other time units are defined is the day. Even in the SI system, a second wasn't redefined to something more fundamental than the Earth's kinematics until about 60 years ago. For several cultures, the daylight and the nighttime hours are subdivided into a fixed number of periods, which means that the length of the local equivalent of 'hour' varied depending on the day of the year.
Now calendars specifically refer to the systems for counting multiple days, and they break down into three main categories: lunar calendars, which look only at the lunar cycle and don't care about aligning with the solar year; lunisolar calendars, which insert leap months to keep the lunar cycle vaguely aligned with the solar year (since a year is about 12.5 lunations long); and solar calendars, which don't try to align the lunations (although you usually still end up with something akin to the approximate length of a lunation as subdivisions). Most calendars are actually lunisolar calendars, probably because lunations are relatively easy to calibrate (when you can go outside and see the first hint of a new moon, you start the new month) but one of the purposes of the calendar is to also keep track of seasons for planting, so some degree of solar alignment is necessary.
If you're following the history of the Western calendrical tradition, the antecedent of the Gregorian calendar is the Julian calendar, which was promulgated by Julius Caesar as an adaptation of the Egyptian solar calendar for the Romans, after a series of civil wars caused the officials to neglect the addition of requisite leap months. In a hilarious historical example of fencepost errors, the number of years between leap years was confused and his successor Augustus had to actually fix the calendar to have a leap year every 4th year instead of every third year, but small details. I should also point out that, while the Julian calendar found wide purchase in Christendom, that didn't mean that it was handled consistently: the day the year started varied from country to country, with some countries preferring Christmas as New Year's Day and others preferring as late as Easter itself, which isn't a fixed day every year. The standardization of January 1 as New Year's Day isn't really universal until countries start adopting the Gregorian calendar (the transition between Julian and Gregorian calendar is not smooth at all).
Counting years is even more diverse and, quite frankly, annoying. The most common year-numbering scheme is a regnal numbering: it's the 10th year of King Such-and-Such's reign. Putting together an absolute chronology in such a situation requires accurate lists of kings and such that are often lacking; there are essentially perennial conflicts in Ancient Near East studies over how to map those dates to ones we'd be more comfortable with. If you think that's too orderly, you could just name years after significant events (this is essentially how Winter Counts work in Native American cultures); the Roman consular system works on that basis. If you're lucky, sometimes people also had an absolute epoch-based year number, like modern people largely agree that it's the year 2025 (or Romans using 'AUC', dating the mythical founding of Rome), but this tends not to be the dominant mode of year numbering for most of recorded human history.