I just use millis since epoch
(Recently learned that this isn’t accurate because it disguises leap seconds. The standard was fucked from the start)
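For anyone curious, a minimal sketch in Python of what "millis since epoch" looks like, plus the leap-second caveat (variable names are mine):

```python
import time
from datetime import datetime, timezone

# Milliseconds since the Unix epoch (1970-01-01T00:00:00Z).
millis = time.time_ns() // 1_000_000
print(millis)

# Round-trip back to a human-readable UTC timestamp.
print(datetime.fromtimestamp(millis / 1000, tz=timezone.utc).isoformat())

# Caveat: Unix time defines every day as exactly 86,400 seconds, so the
# leap seconds inserted since 1972 are invisible here -- that's the
# "disguising" mentioned above.
```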
This is the way.
In one work report, I recorded the date as “1/13/25”, “13/1/25” and “13JAN2025”
I have my preference, but please for the love of all that is fluffy in the universe, just stick to one format…
“13.1.25”, not “13/1/25”.
I use ss/mm/hh/dd/MM/YYYY
t.european
I never saw year-first in Europe.
You’re reading the post backwards.
All my homies hate ISO, RFC 3339 for the win.
All my homies hate ISO
Said no-one ever?
EDIT: thanks for informing me, I now retract my position
Nah, ISO is a shit organization. The biggest issue is that all of their “standards” are locked behind paywalls and can’t be shared. This creates problems for open source projects that want to implement them, because it inherently limits how many people are actually able to look at the standard. Compare that to the RFCs, which have always been free. And not only that, the RFC series contains most of the standards the internet is built upon (like HTTP and TCP, just to name a few).
Besides that, they happily looked away when members were openly taking bribes from Microsoft during the standardization of OOXML.
In any case, ISO-8601 is a garbage standard.
P1Y
is a valid ISO-8601 string. Good luck figuring out what that means. Here’s a more comprehensive page demonstrating just how stupid ISO-8601 is: https://github.com/IJMacD/rfc3339-iso8601

P1Y is period notation. It means a Period of 1 Year. It actually makes decent sense tbh.
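If you actually need to parse both notations in code, a sketch using the third-party isodate package (my pick, nothing the standard prescribes; the stdlib datetime module parses timestamps but not durations):

```python
import isodate  # third-party: pip install isodate

# "P1Y" is ISO 8601 duration notation: a Period of 1 Year.
duration = isodate.parse_duration("P1Y")
print(repr(duration))

# The same standard also covers plain timestamps, but the two
# notations are not interchangeable: a duration is not a datetime.
moment = isodate.parse_datetime("2025-01-27T12:00:00")
print(moment)  # 2025-01-27 12:00:00
```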
Sure, it means something, and the meaning is not stupid. But since it’s all one standard, it should at least be possible to use it to represent the same data consistently across its notations. Which it isn’t.
I think it is reasonable to say: “for all representations of time (points in time, intervals, sets of points or intervals, etc.) we follow the same standard”.
The alternative would be using one standard for points in time, another for intervals, another for time differences, another for changes to a timezone, another for …
The alternative would be
More reasonable, if you ask me. I’ve at least come to value modularity in programming; maybe with standards it doesn’t work as well, but I don’t see why it wouldn’t.
Standards are used to increase interoperability between systems. The more different standards a single system needs, the harder it is to interface with other systems. If you have to define a list of 50 standards you use, chances are the other system uses a different standard for at least one of them. It’s much easier if you rely on only a handful instead.
If I’m not wrong, it’s because both are essentially the same (with slight differences in what is allowed and what is not: https://github.com/IJMacD/rfc3339-iso8601), but the RFC is more free as in freedom.
Thx, I take that back.
“Europe”, as if there weren’t several languages in Europe with different date formats per language…
Hot take: 2025-Jan-27 is better than 2025-01-27 in monolingual contexts.
The beautiful part of 2025/01/27 is that it can inherently be sorted without formatting.
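A quick sketch of that property (sample dates made up): with the year first and zero padding, plain string order is chronological order, which day-first formats break:

```python
iso = ["2024-12-31", "2025-01-27", "2025-02-01"]
dmy = ["31.12.2024", "27.01.2025", "01.02.2025"]

# Year-first strings sort chronologically with no date parsing at all...
print(sorted(iso))  # ['2024-12-31', '2025-01-27', '2025-02-01']

# ...while DD.MM.YYYY sorts by day-of-month first, scrambling the years:
print(sorted(dmy))  # ['01.02.2025', '27.01.2025', '31.12.2024']
```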
mm ≠ MM !!!
I’m almost 40 and now just realizing my insistence on how to structure all my folders and notes is actually an ISO standard. Way to go me.
I stumbled upon it years ago because sorting by name sorts by date. There was no other thought put into it.
It’s incredibly annoying that in clinical research we are prohibited from using it because every date must comply with the GCP format (DD mmm yyyy). Every file has the GCP date appended to the end.
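For what it’s worth, producing that GCP-style date is a one-liner (a sketch; check whether your sponsor wants “Jan” or “JAN”):

```python
from datetime import date

# GCP-style "DD mmm yyyy", e.g. "13 Jan 2025" (under the C locale).
print(date(2025, 1, 13).strftime("%d %b %Y"))
```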
Maybe in programming or technical documentation, but no: when I check the date I want to know the day and the month. Beyond that, it’s all unnecessary information for everyday use, and we have it right in Europe.
You can’t change my mind. ¯\_(ツ)_/¯
These people are just too far into the ISO rabbit hole. I completely agree with you that DD.MM.YYYY is the best format for everyday use.
Nah. Sort that alphabetically and you end up with a useless list.
And sorting the other formats alphabetically does not yield the same result?
No…?
Thank you! 😂
E: I even said how I can see it being useful in some applications, but fuck, if I’m looking at the date it’s almost certainly to see what day it is today, what day (and maybe month) an appointment is, what day some food is going off, stuff like that. I know what month and year it is right now, and if I want to know the time, I look at a clock, not a calendar. If they love extra and often unnecessary information so much they’re free to use whatever format they want, but I’m good, and so are many others, and they just need to learn to be ok with that lmao
The “best” format for everyday use is each individual person’s personal preference.
You may be more used to DDMMYYYY due to culture, language, upbringing, and usage. In the same vein, I am more used to YYYYMMDD because in Chinese we go 年月日 (year-month-day), and it makes organizing files and spreadsheet entries much more intuitive anyway.
Well, in that case people should stop complaining about us wanting to use DD.MM.YYYY. It’s perfectly fine, and the only format that should be shot on sight is MM.DD.YYYY.
You can’t change my mind.
That’s not a good thing. That attitude keeps you from improving how you do things because you’ve gotten emotionally attached to some arbitrary … never mind. Have a nice day.
I don’t know why anyone would ever argue against this. Least precise to most precise. Like every other number we use.
(I don’t know if this is true for EVERY numerical measure, but I’m sure someone will let me know of one that doesn’t)
They are all equally precise. The American one is stupid, just like their stupid-ass imperial units. The European one is two systems slapped together (since date and time are rarely used together, and when they are, it’s the ISO format), and ISO is what the European standard should be.
You misunderstand my comment.
I’m saying the digits in a date should be printed in an order dictated by which units give the most precision.
A year is the least precise, a month is the next least, followed by day, hour, minute, second, millisecond.
You are looking not for precision but for largest to smallest: descending order. This is distinct from precision, which is a measure of how finely something is measured. 2025.07397 is actually more precise than 2025/01/27, but it is expressed entirely in the largest unit.
And to address the argument on precision versus descending order: I disagree. An instrument counting seconds is more precise than one counting minutes, hours, days, weeks, months, etc., and that holds true through the chain. The precision is in the unit.
We can debate this all day. And I can’t honestly say that I would take either side in a purely semantics argument.
But the wording comes directly from RFC 3339, which is, to me, the definitive source for useful date representation.
https://www.ietf.org/rfc/rfc3339.txt
5.1. Ordering
If date and time components are ordered from least precise to most precise, then a useful property is achieved. Assuming that the time zones of the dates and times are the same (e.g., all in UTC), expressed using the same string (e.g., all “Z” or all “+00:00”), and all times have the same number of fractional second digits, then the date and time strings may be sorted as strings (e.g., using the strcmp() function in C) and a time-ordered sequence will result.
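In other words (a sketch with made-up timestamps, using Python’s sorted() in place of strcmp()):

```python
timestamps = [
    "2025-01-27T09:30:00Z",
    "2024-12-31T23:59:59Z",
    "2025-01-27T09:29:59Z",
]

# Same zone string ("Z") and same fractional-second width, so per
# RFC 3339 section 5.1, plain string sorting is chronological sorting.
print(sorted(timestamps))
# ['2024-12-31T23:59:59Z', '2025-01-27T09:29:59Z', '2025-01-27T09:30:00Z']
```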
Largest to smallest is also wrong. In 2025/01/28, the 28 is larger than the 01.
It should be “most significant” to “least significant”
Largest to smallest is correct. 1 mile is larger than 20 meters. If I had specified numerical value or some such, maybe you’d be correct. Though significance works as well.
Largest to smallest is at best ambiguous. It can refer to the size of the number itself, or the size of the unit.
There is a reason this exact concept in maths/computer science is known as the “significance” of the digit. E.g., the “least significant bit” in binary is the last one.
Sorting with either the month or the day ahead of the year results in more immediately relevant identifiable information being displayed first. The year doesn’t change very often, so it’s not something you necessarily need to scan past for every entry. The hour changes so frequently as to be irrelevant in many cases. Both the month and the day represent a more useful range of time that you might want to see immediately.
Personally, I find month first to be more practical because it tells you how relatively recent something is on a scale that actually lasts a while. Going day first means that if you’ve got files sorted this way, you’re going to have days of the month listed more prominently than the months themselves, so the first of January through the first of December will all be closer together than the first and second of January in your list. Impractical.
Year first makes sense if you’re keeping a list around for multiple years, but the application there is less useful in the short term. It’s probably simpler to just have individual folders for years and then also tack it on after days to make sure it’s not missing.
Also, like, this format is how physical calendars work assuming you don’t have a whole stack of them sitting in front of you.
By keeping years in different folders you are just implicitly creating the ISO format, e.g.:
2025/"04/28.xls"
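A sketch of the point (the file name is hypothetical): read the full path and it’s year-month-day again:

```python
from pathlib import Path

# Year as a folder, month and day in the file name: the full path
# reads 2025/04/28.xls, which is just ISO order with separators.
path = Path("2025") / "04" / "28.xls"
print(path.as_posix())  # 2025/04/28.xls
```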
My stupid ass read this top to bottom and I was confused why anyone would start with seconds
I work with international clients and use 2025-01-26 format. Without it… confusion.
That’s an ISO date, and it’s gorgeous. It’s the only way I’ll accept working with dates and timezones, though I’ll make an exception for end-user-facing output and format it according to locale if I’m positive they’re not going to feed it into some other app.
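A sketch of that split, using the third-party Babel library for the locale step (my pick, not the only option): keep ISO internally, localize only at the display edge:

```python
from datetime import date
from babel.dates import format_date  # third-party: pip install Babel

d = date(2025, 1, 26)

# Store and exchange the ISO form...
print(d.isoformat())  # 2025-01-26

# ...and only format per locale for end-user-facing output.
print(format_date(d, format="short", locale="en_US"))  # 1/26/25
print(format_date(d, format="short", locale="de_DE"))  # 26.01.25
```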
How is day smaller than month? There are up to 12 values for month, but up to 31 for days
obvious troll is obvious
It’s sorted by the length of time, so a day is shorter than a month.