big if true
Thanks Computer, I needed that
So uh hello everyone, meet my (literal) big brother @pdcull .
He's been working with at-risk teenagers in the favelas of Brazil for the last 20 years; is a certified CERT (Community Emergency Response Team) trainer and Emergency Manager; and is studying for a Master's in Emergency Management with a special focus on empowering communities to develop resilience.
He has seen a bit of crap in his time (corrupt cops, drug dealers etc) so he can *probably* cope with you all.
2018 PHONE FASHION TRENDS
This is a pretty strong look, to be honest. I think it'll be really hard to beat.
Anyway, I now have a very low-level syntax/semantics that I'm happy could represent BOTH an object's 'address' in an OOP language, AND the definition of that object, AND the code of all its 'methods'. Or at least has the potential to express that (as well as, eg, the address of a data field in a SQL table, in a JSON object, etc). So we could then do logic or maths on those addresses.
The hard part now is what can we do with that data structure? What *kind* of maths/logic can we apply?
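Here's one toy answer, as a Python sketch. The tuple encoding and the helper are entirely my own illustration, not any fixed syntax: but if an address is just nested data, then even something as dumb as a common-prefix function counts as 'maths on addresses'.

```python
# Toy encoding (mine, purely illustrative): an 'address' is a nested
# tuple, S-expression style, so different addressing schemes share a shape.

# An OOP-ish method-call address: foo.bar.baz(x).zap
oop_addr = ("foo", "bar", ("baz", "x"), "zap")

# A SQL-ish field address: db.users.row(42).name
sql_addr = ("db", "users", ("row", 42), "name")

def common_prefix(a, b):
    """One tiny example of 'logic on addresses': the shared path prefix."""
    out = []
    for x, y in zip(a, b):
        if x != y:
            break
        out.append(x)
    return tuple(out)

print(common_prefix(("foo", "bar", "baz"), ("foo", "bar", "zap")))
# -> ('foo', 'bar')
```

Once both kinds of address are the same kind of data, the same functions apply to both.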
Because from a certain point of view, we already have a universal 'dataspace address', and that's a string of OOP method calls. From a, say, Powershell shell, you can call up a COM object and feed it an object string and find yourself talking to an Amazon VM on the other side of the planet. Or at least a local object in your process which will do the talking on your behalf.
What you can't do so easily is take that 'address' - a foo.bar.baz(x).zap sort of thing - and look at IT as an object.
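A toy Python sketch of what 'looking at the address as an object' could mean: a recording proxy that captures the call chain as plain data instead of executing it. All of this is my own illustration, not an existing library.

```python
class Addr:
    """Records a chain of attribute/method accesses instead of executing
    them, so the 'address' itself becomes an inspectable value."""

    def __init__(self, path=()):
        self.path = path  # tuple of (name, args) pairs

    def __getattr__(self, name):
        # Each attribute access extends the chain with a zero-arg step.
        return Addr(self.path + ((name, ()),))

    def __call__(self, *args):
        # A call attaches arguments to the most recent step.
        head, _ = self.path[-1]
        return Addr(self.path[:-1] + ((head, args),))

    def __repr__(self):
        return ".".join(
            n if not a else f"{n}({', '.join(map(repr, a))})"
            for n, a in self.path)

a = Addr().foo.bar.baz("x").zap
print(a.path)  # the chain as plain data, open to inspection
print(a)       # foo.bar.baz('x').zap
```

The foo.bar.baz(x).zap string never runs; it just accumulates as data you can diff, store, or transmit.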
OOP style method call sequences are one of the original things that I created Term-Expressions to be able to model - basically as a formalism so I could compare and contrast various types of addressing schemes.
I haven't *done* anything with that formalism now I have it, but that was the basic underlying impetus.
A method call comes out almost exactly like a Prolog term - and a string of object method calls comes out as an array.
And *that's* why I needed a clean array syntax.
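To make that concrete, a hedged Python sketch of 'a method call as a Prolog-ish term, a call chain as an array'. The encoding is mine, not the actual Term-Expression implementation.

```python
# A term is (functor, arg, ...); a chain of method calls is just a
# list of terms. E.g. foo.bar(1).baz(x, y) becomes:
chain = [("foo",), ("bar", 1), ("baz", "x", "y")]

def to_sexpr(term):
    """Render one term in a clean array/S-expression syntax."""
    return "(" + " ".join(str(t) for t in term) + ")"

print(" ".join(to_sexpr(t) for t in chain))
# -> (foo) (bar 1) (baz x y)
```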
Thinking about Juan Campa's 'Membrane' (https://medium.com/@juancampa/web-apis-game-engines-and-the-universal-inspect-button-4c49eac1073c) and translating Membrane addresses into Term-Expressions.
An example Membrane address:
which translates into the URL
In Term-Expression syntax, the first one might be:
(github users (/one name facebook) repos (/one name react))
and the URL might come out as
(https (github com) repos facebook react)
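As a sanity check on that URL translation, a small Python sketch. The function and the exact output shape are my own choices, just mirroring the (scheme (host-parts...) path-parts...) form above.

```python
from urllib.parse import urlparse

def url_to_terms(url):
    """Re-spell a URL in the term-expression shape sketched above:
    (scheme (host-parts...) path-parts...)."""
    u = urlparse(url)
    host = tuple(u.hostname.split("."))
    path = tuple(p for p in u.path.split("/") if p)
    return (u.scheme, host) + path

print(url_to_terms("https://github.com/repos/facebook/react"))
# -> ('https', ('github', 'com'), 'repos', 'facebook', 'react')
```

The point being: once the URL is a term, it's the same kind of data as the Membrane-style address, so the two can be compared or rewritten into each other.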
like okay, even images? We might want to crop an image, etc?
Instead of taking a screenshot of our screen, bringing that into an imaging program, cropping, etc, we should be able to select a crop of an image and then save a small record saying 'X image, with Y crop settings applied...'
Okay, that COULD backfire horribly for privacy, say you crop an image and whoops the whole original gets sent, so we would need to think through some of the issues there.
But the fact that we have to resort to screenshots, that's telling us that something has gone horribly wrong.
Our data should be granular enough, and we should have the access rights and the indexing schemes and the UIs that expose that granularity, to be able to 'copy and link' rather than 'copy and paste' data at any scale we want. And have caching/transfer protocols sort out the rest of the busywork.
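A minimal Python sketch of such a 'copy and link' record for a crop. The record shape, the URL, and the `fetch` hook are all hypothetical, just to show how little data the link itself needs to carry.

```python
# A 'copy and link' record: instead of shipping pixels, ship a small
# record naming the source and the transformation to apply.
crop_link = {
    "source": "https://example.org/photos/cat.png",  # hypothetical URL
    "op": "crop",
    "box": {"x": 120, "y": 80, "w": 640, "h": 360},
}

def resolve(link, fetch):
    """A receiver resolves the link by fetching the source and applying
    the recorded operation. Caching/transfer protocols could sit behind
    `fetch`; here an image is just a list of pixel rows."""
    img = fetch(link["source"])
    b = link["box"]
    return [row[b["x"]:b["x"] + b["w"]]
            for row in img[b["y"]:b["y"] + b["h"]]]
```

Note the privacy issue from above is visible right here: `fetch` pulls the *whole* original, so access rights have to live on the source side, not in the link.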
One simple argument that the HTTP/HTML-based web is not true Hypertext in the Ted Nelson sense:
That everyone, even here on Mastodon, has to use screenshots to share complex data.
We should be able to copy and link that data as a unit, somehow, in its underlying.... whatever.... source format.
But even though we have close to (not actually, but close to) a Universal Whatever Format in HTML... we can't transclude it into our posts. Because security, or... a whole bunch of valid reasons.
Boost this if you have no idea what you're doing.
and of course it will be your ISP's server you send that HTSP message to, and it's gonna say YES YES to everything you send and THEN redirect you to its corporate partner headquarters, so...
... and this was the situation literally even with the search engines before Google took over. The web servers were lying. The web PAGES were lying about their content, in their content headers. The search ENGINES were lying by putting in for-pay search results. Google came in and seemed to be honest.
It would be awesome if we could just trust that we send, eg, an imaginary HTSP (Hypertext Search Protocol) message to a server and ask it 'hey what have you got about DANCING CATS'
but you KNOW that every lying server out there, which will be ALL of them because they will all go to the same shady SEO seminars that tell them today to put 'join our mailing list popups' on their blogs... every server will answer 'YES YES WE SUPER HAVE THAT PAGE' to every query, whether they do or not.
Anyway: solving these kinds of search reliability problems is HARD, so I think that sort of stuff needs to be 'user-code layer' somehow, ie, there needs to be a way to run arbitrary functions that do stuff on datasets; for any complex algorithm, we can't rely on baking it into the firmware once and getting it right...
but relying on a big central server to do all the search? That was the big mistake. It allowed the centralisation to take place. Lots of big money needed that to happen, of course.
Like, you might think SEO is bad now, with Google running everything? but omg it was FAR FAR WORSE before Google, in the AltaVista era, when search engines naively relied on pages putting keywords into their documents and then just STRAIGHT UP LYING AND FAKING EVERYTHING.
You have no idea, if you came onto the Web after 2000. Just how bad it was.
I guess a sensible search protocol would just burn all the lying pages with fire, and the servers they rode in on? Some kind of reputation system?
... and, yeah, I think the connections need to be two-way? Perhaps? I don't really know about that. I think the network would be more efficient if the links were two-way-ish (eg: publish-subscribe), but real networks have to deal with connections just dropping out, so the HTML approach of only one-way links was pragmatic, to a point.
I think the existence of Google is a failure. Search should have been part of the protocol from the beginning. But... there were so many pages that just lied.
Computers operational in the Homebrew Computer Club, February 1977.
Mastodon is an anagram for "not so mad"
But yeah: a soup of objects, connected by links into a graph, and those links may include arbitrary computations (so... maybe not a graph, or not a finite-sized graph). I think that's what the future looks like (or even the present, if you look at it through something like Powershell, where you can make an object out of anything).
I think those objects need to be not quite 'objects' but pure-functional, though, for safety; and I think their types need to be describable (and creatable) somehow.
There's a whole world of hurt, of course, in the details as soon as you bring 'typing' into the picture. Since there are so many different type systems, and they don't play nicely with each other!
That's kind of why I started looking at the problem of what typed objects are made OF and ended up with 'Prolog terms, but as S-expressions' as my attempt at a cut at it.
Because if there's typed data out there we need to have a way of describing the types *themselves*.
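One toy way types-as-data could look, in Python. The little term language and the checker are purely illustrative, not a proposal for how real type systems would interoperate.

```python
# If types must be describable as data, a type can itself be a term.
# E.g. 'list of pairs of (string, int)':
list_of_pairs = ("list", ("pair", "string", "int"))

def check(value, ty):
    """Toy structural checker for this toy type language."""
    if ty == "string":
        return isinstance(value, str)
    if ty == "int":
        return isinstance(value, int)
    head = ty[0]
    if head == "list":
        return isinstance(value, list) and all(check(v, ty[1]) for v in value)
    if head == "pair":
        return (isinstance(value, tuple) and len(value) == 2
                and check(value[0], ty[1]) and check(value[1], ty[2]))
    return False

print(check([("a", 1), ("b", 2)], list_of_pairs))  # -> True
```

Because the type is itself a term, it can be stored, transmitted, and compared with the same machinery as the data it describes.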
Juan's getting part of it, though:
<<the ability to reference any node in the Graph. A sort of URL on steroids. URLs can point to arbitrary resources but are limited in that they cannot point to data inside or referenced by the resource....
In Membrane, you use “Refs” which are analogous to URLs but actually designed to work with programmable interfaces, for example, Refs are typed and arguments are explicit>>
Yeah, that's sorta what I want! 'Addresses' which are functions. Or methods?
and we're all gonna shine a light together