True UML has never been tried,
Rational Rose can never fail, it can only *be* failed,
The Gang of Four were alone responsible for the downfall of the Patterns Movement,
@natecull Well, in the other thread where I mentioned "true" OOP, I was mostly referring to this train of thought: https://wiki.c2.com/?AlanKaysDefinitionOfObjectOriented
Almost all popular OOP languages today fail on even the first item: everything is an object. Ruby is a notable exception (and Smalltalk, though it's not particularly popular).
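For what it's worth, Python makes the failure easy to demonstrate: all *values* are objects, but control constructs aren't values at all. A minimal illustration (plain Python, nothing assumed beyond the standard language):

```python
# Data, functions, and even types are objects...
assert isinstance(1, object)
assert isinstance(len, object)
assert isinstance(int, object)

# ...but control constructs are pure syntax, not values. There is no
# object you can bind to a name that *is* `if`; uncommenting the next
# line is a SyntaxError, not a lookup of some `if` object:
# conditional = if
```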
Python is a fundamentally conservative and restrictive language. Much like Java, you can look at Python code and know right away that it's Python code. This isn't the case with languages like Ruby.
A control construct like 'if' could definitely be an object, as long as you had a syntax/semantics for passing in unevaluated (or partially evaluated) code blocks and environments.
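A Python sketch of that idea, using zero-argument lambdas as stand-ins for unevaluated code blocks (the `If` class and `my_if` name are invented for illustration, in the spirit of Smalltalk's `ifTrue:ifFalse:`):

```python
# An "if object": the branches arrive as unevaluated blocks (thunks),
# and the object decides which one to run.
class If:
    def __call__(self, condition, then_block, else_block):
        # Only one block is ever evaluated, just like built-in `if`.
        return then_block() if condition else else_block()

my_if = If()
result = my_if(2 > 1, lambda: "yes", lambda: "no")
assert result == "yes"
```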
This is why I'm interested in John Shutt's 'vau calculus', which is basically Scheme reinvented with 'macro' rather than 'function' as the basis (not actually macros, more 'a function which receives its caller's environment and a list of unevaluated arguments').
I reckon OOP could do the same.
@natecull @urusan @xnx38h Fexprs were around in Lisp long before macros were. The problem is balancing expressive power with understandability (for humans and for programs, i.e. compilers). In the end, unfolding a macro definition at compile time into a pre-established set of special forms is a lot easier than predicting what a fexpr will do at runtime.
(You can build fexprs out of macros relatively easily if you really need to, something like (macro fexpr-apply (f &rest args) `(,f ',args)).)
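To make the macro/fexpr contrast concrete, here is a toy Python evaluator in which an "operative" (fexpr) receives its raw argument forms plus the caller's environment and decides for itself what to evaluate. All the names (`eval_expr`, `op_and`, `is_operative`) are invented; this is a sketch of the idea, not Kernel:

```python
def eval_expr(expr, env):
    if isinstance(expr, str):              # a symbol: look it up
        return env[expr]
    if isinstance(expr, list):             # a combination: (head arg...)
        head = eval_expr(expr[0], env)
        if getattr(head, "is_operative", False):
            return head(expr[1:], env)     # fexpr: raw forms + caller's env
        # applicative: evaluate the arguments first
        return head(*[eval_expr(a, env) for a in expr[1:]])
    return expr                            # a literal

# An operative implementing short-circuit `and`:
def op_and(args, env):
    for a in args:
        if not eval_expr(a, env):
            return False
    return True
op_and.is_operative = True

env = {"$and": op_and, "x": 0, "boom": None}
# ($and x (boom)) short-circuits on x = 0, so (boom) is never evaluated:
assert eval_expr(["$and", "x", ["boom"]], env) is False
```

The compiler's difficulty, as described above, is exactly that `op_and` could do *anything* with its raw forms at run time.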
"In the end, unfolding a macro definition at compile time into a pre-established set of special forms is a lot easier than predicting what a fexpr will do at runtime."
I believe constructing an argument against this premise (that fexprs are hard to predict) is what John Shutt's PhD was all about.
Admittedly I don't have the ability to parse exactly what vau-calculus proves (though it's basically lambda calculus with a model of 'program text', I think).
This is his PhD here:
I wish I could understand what it lets us prove about fexprs, because I think fexprs (with suitable pre-evaluation of them) are a much more tractable basis for macros than anything else.
They have to be *lexically scoped*, of course, which Lisp's original fexprs weren't.
@natecull @urusan @xnx38h I understand from scrolling through that thesis that enforcing lexical scoping is one ingredient, but other things also need to be forbidden (or strongly discouraged), such as updating bindings outside the current scope, procedural macros (although: just use a fexpr), and quasiquotation.
In my own hobby Lisp implementation, the reason I chose to avoid fexpr-like constructs is that they require a lot more analysis in the compiler. Not impossible, just more work.
One thing that I don't understand about Shutt's Kernel language is why he doesn't use a syntactic mechanism for fexpr evaluation. It seems to me that would solve some problems.
E.g., since he loves '$' as a pseudosyntax to mark fexprs: why not make $ actual special syntax, e.g.:
(foo bar) applies function foo but
($ foo bar) applies fexpr foo (silently inserting the caller's environment as the first parameter)
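Here's a rough Python model of that `$`-as-syntax proposal: the evaluator dispatches on a literal `$` head and passes the caller's environment plus the unevaluated forms to the operative; everything else is ordinary applicative evaluation. All names are invented for illustration:

```python
def evaluate(expr, env):
    if isinstance(expr, str):
        return env[expr]
    if isinstance(expr, list):
        if expr and expr[0] == "$":        # ($ op arg...) -> fexpr call
            op = evaluate(expr[1], env)
            return op(env, *expr[2:])      # caller's env + raw argument forms
        fn = evaluate(expr[0], env)        # ordinary applicative call
        return fn(*[evaluate(a, env) for a in expr[1:]])
    return expr

# A fexpr-style `quote`: returns its argument form unevaluated.
def fx_quote(env, form):
    return form

env = {"quote": fx_quote, "inc": lambda n: n + 1, "x": 41}
assert evaluate(["inc", "x"], env) == 42           # (inc x)    -- applicative
assert evaluate(["$", "quote", "x"], env) == "x"   # ($ quote x) -- raw form
```

Note the property this buys: any combination *not* starting with $ is guaranteed to be a plain function call.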
But my brain is not big enough to argue why.
@natecull @urusan @xnx38h I think your `$' operator is called `unwrap' in Kernel. Basically you can pretend that each applicative (read: function) is actually of the form `(wrap $f)', where `wrap' is a primitive applicative, which has the effect of evaluating the arguments before passing them to `$f'. And `(unwrap (wrap $f))' is equivalent to just `$f'. Presumably in the actual implementation, for some definitions the `wrap'ped version is primitive and for some the `unwrap'ped version is.
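A toy Python model of that reading of wrap/unwrap (a sketch of my understanding, not Kernel's actual implementation; all names invented):

```python
class Applicative:
    """An operative plus the rule 'evaluate the arguments first'."""
    def __init__(self, operative):
        self.operative = operative

def wrap(op):
    return Applicative(op)

def unwrap(app):
    return app.operative

def simple_eval(expr, env):
    if isinstance(expr, str):
        return env[expr]
    if isinstance(expr, list):
        combiner = simple_eval(expr[0], env)
        if isinstance(combiner, Applicative):
            args = [simple_eval(a, env) for a in expr[1:]]
            return combiner.operative(env, args)   # sees evaluated args
        return combiner(env, expr[1:])             # bare operative: raw forms
    return expr

# $list is an operative returning its raw argument forms:
def op_list(env, forms):
    return list(forms)

env = {"$list": op_list, "list": wrap(op_list), "x": 7}
assert simple_eval(["$list", "x"], env) == ["x"]   # raw symbol comes back
assert simple_eval(["list", "x"], env) == [7]      # argument got evaluated
assert unwrap(wrap(op_list)) is op_list            # (unwrap (wrap $f)) = $f
```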
Yeah, I don't get why he does that wrap/unwrap business at all. Makes no sense to me. Doesn't seem the simplest or most useful route.
Using $ as syntax to explicitly mark the evaluation of a fexpr would, I think, give both the compiler AND the human programmer much more targeted warning that Here Be Dragons.
Like, if you see $? Then you know to be careful: everything following is HIGHLY dependent on exactly which operative follows. This is a macro. Treat nothing as if it's evaluated, or, if evaluated, as if it's evaluated in the current scope. Also, be aware that you're giving that operative full read access to your current environment. Security risk.
But sometimes you really really need expressions which are quotes, or which aren't evaluated in the current scope.
I mean the point of the exercise is that it's a language built from the ground up, it is not built over an existing macro facility. It assumes that there is no built-in macro facility. To me that's a plus because the semantics of all existing macro facilities are weird and horrid and so I don't want them in my base language.
ie, in a Scheme-like language with my $ operator as syntax, there would be no other way of defining macros. If you see an expression not starting with $, you *know* that it is a function evaluation.
This would not be true of all existing Schemes, unless one forcibly removed/disabled the built-in macro facilities.
More specifically, 'a should not be a pure computation returning a symbol with name "a", but an effectful computation that creates a new symbol object also carrying the current binding of "a".
(define x 1) (macro foo (y) `(+ x ,y)) (let (x 2) (foo x))
When expanding (foo x), we evaluate `(+ x ,y). Because we are in the scope of `foo', we remember that `x' refers to the `x' of (define x 1), not `(let (x 2))'. And the opposite for the `x' we substitute in for `,y'.
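That scoping discipline is exactly how ordinary lexical closures already behave; a Python analogy, with `foo` standing in for the macro (names invented):

```python
x = 1
def foo(y):
    return x + y          # this `x` is the definition-site x = 1

def caller():
    x = 2                 # a different, call-site x
    return foo(x)         # the argument we pass in is the caller's x

assert caller() == 3      # 1 (definition-site x) + 2 (call-site x)
```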
Hmm. Leaning hard into the Lisp idea of 'symbol', letting symbols be fully unique objects linked to their environment which can't be recreated at all from their printed string name, is interesting, but it goes against my preference for symbols to be plain text atoms, so that what you print is exactly what you get.
It's a philosophical choice, but I prefer the Prolog 'atom' to the Lisp 'symbol'. It's also why I don't like Racket's 'syntax' objects.
Basically I'm coming from the angle that if you don't have a clearly defined 100% correct fully-roundtrippable serialization for every in-RAM object, then you don't have a language, or at least not one that's useful for describing all of your computational objects.
At some point you're gonna have to export your objects out of RAM to another system, and when you do, you're going to want a serialization syntax. It might as well be the same language that created them.
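A toy illustration of that round-trip requirement in Python: a printer and a reader for nested lists of symbolic atoms, such that reading back what you printed reproduces the object exactly (a sketch handling only symbols and lists; names invented):

```python
def to_sexp(obj):
    """Print a nested list of symbols as an S-expression string."""
    if isinstance(obj, list):
        return "(" + " ".join(to_sexp(o) for o in obj) + ")"
    return obj                 # a symbol is just its own text

def from_sexp(text):
    """Read an S-expression string back into nested lists of symbols."""
    tokens = text.replace("(", " ( ").replace(")", " ) ").split()
    def parse(i):
        if tokens[i] == "(":
            items, i = [], i + 1
            while tokens[i] != ")":
                item, i = parse(i)
                items.append(item)
            return items, i + 1
        return tokens[i], i + 1
    return parse(0)[0]

obj = ["let", ["x", "2"], ["foo", "x"]]
assert from_sexp(to_sexp(obj)) == obj     # full round trip
```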
But symbols (or any other arbitrary object, including say integers) being marked up with extra 'invisible' data that's provided by their lexical scope or computational history is still a valid idea. It really needs a GUI to explore well, and that's what gives me pause. There would need to be some clearly standardised way of exporting that invisible data in some visible form, imo. Otherwise it's a big data-loss black hole.
@natecull @urusan @xnx38h More seriously, it seems that the main motivation is to make metaprogramming first-class and orthogonal. For first-classness, the thesis compares with macros, which must appear by name at compile time, whereas you can decide at run time which fexpr to apply.
For orthogonality, they compare "the head gets evaluated, then applied to the tail" with "if the head is not one of these special forms, each element of the list is evaluated, then the head is applied to the tail".
Right. Removing the idea of 'compile time' as a special privileged space is important to me, because on the Internet, there is no such thing as a generalised 'compile time'. Everything happens at runtime.
So we need a useful way of thinking about 'compilation-like' processes as things we can do at runtime, over runtime-varying data.
($let ((self (get-current-environment))) ($lambda () ($set! self counter (+ counter 1)) counter))
(I somehow have a strong urge to label this code block as "Statements dreamed up by the utterly Deranged" on one of those "Stop doing PL Theory" memes, as in https://mastodon.vierkantor.com/@Vierkantor/104423398955867191)
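Restating what that snippet does as a Python closure (assuming the surrounding code binds `counter` to 0 in the captured environment; `make_counter` and `bump` are invented names): the `$lambda` captures its defining environment and mutates the `counter` binding on each call.

```python
def make_counter():
    counter = 0
    def bump():
        nonlocal counter   # mutate the captured environment's binding,
        counter += 1       # like ($set! self counter (+ counter 1))
        return counter
    return bump

c = make_counter()
assert c() == 1
assert c() == 2
```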