@DavitMasia I don't know, I find this format to be backwards. I mean I remember what year it is (only time travellers wouldn't!) and same with the month. It is the day I might forget. Having to adopt yyyy-mm-dd just because Americans apparently forget the month more than the day (what's the deal with their mm-dd-yyyy otherwise?) is just unfair to say the least. So why use yyyy etc, except to be sure of being understood by American-made computers?

@lmintmate @DavitMasia

To have the date order match the "alphabetical" one?

@Nocta @DavitMasia (Sorry for not replying earlier, it was night in my timezone) What do you mean "alphabetical" date order? Maybe what this guy said (cybre.space/@rick_777/10062481), that sorting alphabetically also sorts by date? If so, I still don't see the point since I can always sort by last modified date.

@DavitMasia Used to program Java, ugly brutal language that. Those years I spent a lot of time on Google, and one typical search was `Java Date`; every time I hit enter I shuddered at the thought that *this time* there would be some website trying to hook me up with the Java developer of my dreams.

@cjd let me interject for a moment, but compared to JS, Java is a beautiful language.

@dax Yet @cjd was a big fan of it, last time I checked.

@Wolf480pl For me, JS is like a big sandbox: you can build anything you want. You're gonna get all dirty and whatever you build will be kind of fragile, but it's yours; you are the king in that sandbox.

Java is part of a culture of rigid obedience. Soul-crushing mediocrity because "cleverness isn't Enterprise". The worst part is that, in the end, it's still a sandbox; it's just that it's the CEO's sandbox and you're one of the toys.

@cjd
I've spent way more time reading other people's code than writing my own.

From my POV, the job of any programming language other than a macro assembler is to eliminate a large fraction of bad programs and save me the time of figuring out on my own why they're bad.

A nice side effect is that it eliminates other people's bad programs, which makes it easier to find ones that meet a certain level of correctness and readability.

@Wolf480pl That's why the image of a dating service for Java programmers is so awful. It's like something out of Metropolis, except that the workers develop such a Stockholm syndrome for the Factory and Manager and FactoryManager that they actively seek to breed within their caste.

@cjd is it the same for Haskell programmers?

@Wolf480pl IMO no, the relationship that people have with such languages as Haskell and Rust is different. Just because a language has a powerful type system doesn't mean it is proletarianizing. The deeper question is "who was it made for?" (programmer or the boss). One might argue that gofmt is more insidious than the borrow checker. IMO any language which has powerful macros has chosen the programmer.

@cjd ok, yeah, lack of macros can be annoying.

btw. I know a language that was made for neither the programmer nor the boss. It was made for compiler writers. What language do you think it is?

@Wolf480pl Sounds like some kind of Lisp variant, given it's already in a nice AST, but then eval throws a wrench into compiler design...

Lots of languages exist for learning & experimentation which is great.

The only languages which really make me sick are corporate ones: COBOL, Java, C#, Go... All lacking macros (don't allow anyone to create anything too complicated for an intern to understand), and all limited in the way code can be written; it all looks the same.

@Wolf480pl C++ is a battleground language: templates and macros make it too powerful to be a business language, but it's not so powerful as to allow creating entire new EDSLs, and managers still very much try to fit it into the mold. IMO Rust is a cultural extension of C++.

@Wolf480pl
My ideal language would be a macro assembler with arbitrary compile-time computation & easy-to-write AST transformers. Let the programmer pull in language features as libraries.

My configuration would be object-capability w/ first-class closures & a declarative language for static assertions which are validated by a prover; on top of this I'd build the type system.

Problem w/ this though is everyone would make their own dialect & it becomes unteachable -> no business would use it.

@cjd
IMO C++ is just a kitchen sink, a language without a vision.

@cjd
Nope :P
By a language made with compiler writers in mind, I meant C.
It just gives them so much leeway wrt. what a particular piece of code should do, and if it invokes UB, then it's "yeah you are free to compile the program to whatever you want"
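A toy illustration of what that leeway means in practice (my own made-up example, not anything from a real codebase): signed overflow is UB, so a compiler is allowed to assume it never happens and quietly delete the check.

```c
#include <limits.h>
#include <stdio.h>

/* Looks like a sane overflow guard... */
static int will_not_overflow(int x)
{
    /* ...but x + 1 overflowing is UB, so the optimizer may assume it
     * cannot happen and fold the whole comparison to "true". */
    return x + 1 > x;
}

int main(void)
{
    /* At -O2, gcc and clang typically print 1 here even for INT_MAX. */
    printf("%d\n", will_not_overflow(INT_MAX));
    return 0;
}
```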

@Wolf480pl I would say C was designed more by people who had to write a lot of ASM and eventually got tired of keeping their data structures as packet diagrams written on paper stuck to the wall.

The portability of C to essentially every architecture was a brilliant idea; it freed up processor design, but sadly it also created constraints on the language. The System V calling convention doesn't consider how many registers are *needed*; slow calls + no closures -> proliferation of for/while loops.

@cjd designed, maybe. But it was standardised with compiler writers in mind, I think.

Btw. it's funny how it's no longer a low-level language, in that it doesn't reflect the underlying machine in any way.

It's also funny how we go from imperative C, to static single assignment, to imperative x86 machine code, then within the CPU pipeline back to SSA, then to something VLIW-like.

@Wolf480pl Probably standardized as a documentation exercise after the fact, which is probably the right way to do standards anyway.

@cjd I'd argue the right way to do standards is an iterative process, where multiple prototypes implement the latest draft, they're tested, lessons are learned, the draft is adjusted based on those lessons, the prototypes are adjusted to the latest draft, and so on.

I believe that's what IETF does.

@Wolf480pl Also, when you want to implement `x << y`, you can do it as one instruction, but processors handle the edge cases differently. You could test y before running it, but then you slow down good code to protect bad code. A good prover could make sure you're being safe, but that probably wasn't worth the effort at the time, so they shrugged their shoulders and said UB.
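To make the hardware disagreement concrete (a toy example of my own; the exact behaviours are from memory):

```c
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint32_t x = 1;
    int y = 32;

    /* Shifting a 32-bit value by 32 is UB in C. The reason it was left
     * undefined: x86's SHL masks the count to 5 bits (result would be 1),
     * while e.g. ARM shifts the value out entirely (result would be 0).
     * The standard refuses to pick a winner, so anything can happen here. */
    printf("%u\n", (unsigned)(x << y));
    return 0;
}
```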

@cjd yeah, but they didn't just say "the result of this expression is undefined", they said "the behaviour of the whole program is undefined".

@Wolf480pl Yeah, probably because they looked at the compiler, found that it generates crash-code, scratched their heads a bit, and then said "oh well, write what you see"...
I recently ran into that because my allocator was providing memory aligned to sizeof(uintptr_t) instead of MAX_ALIGNMENT; compiling for ARM NEON -> SIGBUS while pushing registers for a function call.
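Roughly the fix, as a sketch (my_alloc / MY_MAX_ALIGN are made-up names, not my actual allocator): hand out blocks aligned for any type, not just for pointers.

```c
#include <stddef.h>
#include <stdlib.h>

/* Alignment suitable for any standard type; typically 8 or 16
 * depending on the ABI, vs. 4 or 8 for sizeof(uintptr_t). */
#define MY_MAX_ALIGN _Alignof(max_align_t)

static void *my_alloc(size_t size)
{
    /* aligned_alloc (C11) wants the size rounded up to a multiple
     * of the requested alignment. */
    size_t rounded = (size + MY_MAX_ALIGN - 1) & ~(size_t)(MY_MAX_ALIGN - 1);
    return aligned_alloc(MY_MAX_ALIGN, rounded);
}
```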

@cjd It's not just that you'll get dirty; it's that after 5 minutes of playing, everything is covered in sand to the point that I can't tell a square from a circle.

@Wolf480pl IMO types are a form of static assertion, which I love, by the way; I actually wish I had a more powerful way to program static assertions into my code. But I want to be in charge.
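C11's static_assert is the tame version of what I mean; a toy example (the struct is made up):

```c
#include <assert.h>
#include <stdint.h>

struct packet_header {
    uint16_t type;
    uint16_t length;
    uint32_t checksum;
};

/* Checked entirely at compile time: a violation is a build error,
 * not a runtime crash. */
static_assert(sizeof(struct packet_header) == 8,
              "wire header must stay 8 bytes");
```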

Thanks for the link, I tabbed it...

One which explains some of where I'm coming from is: youtube.com/watch?v=mZyvIHYn2z

@cjd
Watched it, it was a pretty interesting talk.
Turns out I'm "obsessive", maybe with some "sadistic" traits.
I still don't understand why somebody would want a "psychotic" language like JS.

@Wolf480pl
Good video, I agree with what he has to say. I don't have a lot of experience with "good" type systems like Haskell's, only with bad type systems like Java's, or with dynamic typing. My experience has generally been that either I'm not worried about bugs (throwaway scripts, code with few/simple inputs, stateless code) and I want the language to get out of my way so I can get it done, or I'm writing scary code (big, long-lived data structures) and the type system is far too simple to help.

@cjd when I write some quick-and-dirty script, below 1k LOC, I just use Python, and it's pretty good for that purpose.

But if I write something that I may want to refactor, I want an easy way to check that I didn't make stupid mistakes during refactoring. And even Java's type system is good enough for that.

But Haskell's is even better. IMO you should try either OCaml or Haskell, just to see what it's like.

@DavitMasia this is the most controversial toot I've seen in my day of being on the site.

@DavitMasia
Who forgets what year it is? Day, month, then year makes the most sense.

@sproid
If you format your filenames with YYYY-MM-DD, sorting alphabetically also sorts them by date.

Most significant digit first.
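A tiny illustration (filenames made up): plain lexicographic comparison is already a chronological comparison.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* qsort comparator: compare the filename strings lexicographically. */
static int by_name(const void *a, const void *b)
{
    return strcmp(*(const char *const *)a, *(const char *const *)b);
}

int main(void)
{
    const char *files[] = {
        "2019-11-30_notes.txt",
        "2019-02-07_notes.txt",
        "2018-12-24_notes.txt",
    };
    qsort(files, 3, sizeof files[0], by_name);

    /* Prints oldest first: 2018-12-24, 2019-02-07, 2019-11-30. */
    for (int i = 0; i < 3; i++)
        puts(files[i]);
    return 0;
}
```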

@DavitMasia

@sproid @DavitMasia ask me again in the first two weeks of January…

But think about reading the date of a random event: if the first piece of information is the year, you immediately get an idea of how long ago it happened (or how far in the future it will be), and then, if you need it, you can read on to get more precision from the month and day.

@valhalla
I see that more as a marketing/promotion way to exalt an event, to make it clear and maybe avoid confusion. But on every other, more casual occasion it's not that relevant to have the year first.
@DavitMasia

@DavitMasia I like the lack of ambiguity in DD-MMM-YYYY but it does have some English imperialism built in.
