I guess the question that occupies my mind a lot these days is:

Can we build a healthy, positive, life-affirming Internet?

I feel like large parts of our Internet infrastructure are toxic to mental health and social freedom and were designed that way on purpose, because the system seeks money, and you get more money by controlling people than by allowing them to flourish and reach their full potential. This has always been capitalism's big problem (and socialism's too).

@natecull The real issue is technological change outpacing society's speed of adaptation.

As environments change, people develop etiquette, laws, religious traditions & stories to pass on what behaviours work & which values we need to remember.

Companies have just been responding to what's favoured by the social and legal context. So there's a feedback loop with both positive and negative consequences.

Maybe we need to increase the rate of social response.

@byron
I don't really buy this. Technological change has slowed down substantially since its peak in the 70s (to the point that most of what we, as individuals and even as early adopters, run into as 'new' technology is really 70s tech that finally became profitable), & smaller groups had bigger shifts in tech for decades.

We *are* seeing the effects of certain tech at a larger scale than before, but mostly, we're seeing the effects of capital-amplifiers.

@natecull

@byron
There's no qualitative change happening in, say, ad targeting. Ad targeting works exactly the same way it did in 1995 (and exactly the same way folks were expecting it to eventually start working in the 70s, when computers & statistics were first being applied to the problem).

There's a quantitative change happening, which is that we reached the physics-theoretic peak of speed for integrated circuits 15 years ago, & we're working on getting everybody on the grid.

@natecull

@byron

And that basically means that it's becoming harder to ignore the gap between theory and practice: you have enormous amounts of computing power and enormous amounts of data and you still can't make advertising work reliably.

@natecull

@byron

And until that bubble collapses (which nobody really wants, because it'll take the global economy with it, because most of the economy is just gambling on futures of futures of futures of ad valuations for novelty t-shirts and other trash), the reaction is that everybody in that industry doubles down and makes promises about how they'll get two tenths of one percent more likelihood of a sale from three gigs more targeting data per person.

@natecull

@byron

This isn't a 'new' phenomenon at all. It's the inevitable result of following the original 1970s script. And you'll find people -- not even necessarily terribly technical people, but essayists and science fiction authors -- writing in the 60s, 70s, 80s, and 90s about this script and talking about its end-game (which we are living through), because all it takes to predict it is an unwillingness to buy into the hype.

@natecull

@enkiv2 @natecull Some parts of the dystopian imagination of the 70s and so on were well conceived, like John Brunner's idea that the successful modern humans would be the ones best at adapting to hyper rapid change, or his prescient ideas about crowdsourcing knowledge.

But those visions are always lopsided, seeing one set of forces without anticipating the counterforces equally well.

@byron @enkiv2

The 1920s reference is something I've been thinking about too. It feels like we're in a very similar spot.

In some ways, the smartphone is just the final (or interim) delivery on what 'radio' promised in the 1920s. It took a while to get batteries, transmitters, and aerials small enough, and then layering computers over the top was something nobody quite imagined then. But the definite feeling was in the air: "What if... telecommunication?"

@byron @enkiv2

And now we're like a dog who caught the car and we're sitting there, a little dazed, entire back axle between our teeth.

We wired every human brain together. Now what comes next?

@byron @enkiv2

A THOUSAND YEARS OF BLISS AND JOY AND ALL HUMAN PROBLEMS SOLVED

yep

that is definitely what is going to happen next

@byron @enkiv2

(I have no clue what is going to happen next. Hopefully we can keep the extinctions and genocides and industrial genetic engineering of slave sub-populations down to a dull roar until we figure out what it is we all really want.)

@natecull @byron @enkiv2
In my humble opinion, there's a lot of exploration to do, but we are stuck in this post-PC phase where the convenience of having our systems online & available outweighs the apparent advantages of having computing be personal, be a thing we can change & control & orchestrate & customize.

We are all trapped, bound to online cloud mainframes that offer us only a small slice of the world they contain & compute, where all the powers we receive must be baked into the application layer & there is no cloud OS we can expect.

The challenge seems obvious: we are all online, but as supplicants. We could have a million responses to disinformation, to bad actors, try & discover what works to grow healthily together, but we are locked onto these giant properties, reliant on them to give us all tools & systems for socialization in this hostile environment. We must begin to be of our own minds, bring our own minds online. #noospherics

@natecull @byron @enkiv2
Byron came gobsmackingly close to the motherboard of the truth,

> John Brunner's idea that the successful modern humans would be the ones best at adapting to hyper rapid change, or his prescient ideas about crowdsourcing knowledge.
>
> But those visions are always lopsided, seeing one set of forces without anticipating the counterforces equally well.
ruby.social/@byron/10486743480

But right now there are fewer than a dozen corporations who have their own cloud, who have the basis to begin to adapt & explore & adventure. The rest of us monkeys scratching about in the dirt can use these tools to advance ourselves, but at great expense, & with limited control & greatly restricted understanding. For corporations, these restraints are not so bad, but the de-personalization places a hard limit on the individual & their expression & adaptation.

@jauntywunderkind420 @natecull @enkiv2 We're not monkeys scratching in the dirt; computing is made cheaper by the cloud.

Still, it *must* be a short-term solution. I'd love to see a growth of mesh computing, both physical networks (wifi) and P2P cloud computing.

The value of CPU time keeps dropping, right? If you could sell spare CPU cycles to a P2P cloud network and it ran your computer down a little faster, you'd come out ahead.

And we'd get cloud computing power without the monopolies.
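
A back-of-the-envelope sketch of that break-even idea (every number below is a made-up placeholder, just to show the shape of the calculation, not a real market price):

```python
# Hypothetical break-even sketch for selling spare CPU cycles to a P2P cloud.
# Every number here is an assumption for illustration, not a measured value.

hours_sold_per_month = 200          # spare hours you'd offer to the network
price_per_cpu_hour = 0.01           # what the P2P market might pay (USD)
extra_power_kwh_per_hour = 0.05     # added draw when the CPU isn't idle
electricity_price_per_kwh = 0.15    # local utility rate (USD)
extra_wear_cost_per_month = 0.50    # faster depreciation of the machine (USD)

income = hours_sold_per_month * price_per_cpu_hour
costs = (hours_sold_per_month * extra_power_kwh_per_hour * electricity_price_per_kwh
         + extra_wear_cost_per_month)

print(f"income: ${income:.2f}, costs: ${costs:.2f}, net: ${income - costs:.2f}")
# With these made-up numbers it's roughly break-even; the argument only works
# if the market price stays above your power + wear costs.
```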

@byron
i don't demand we go p2p right now, or even do the accounting. as much as innovation, i'd like some catch-up for the rest of the world.

i would like to be able to run some servers & cloud services that others can use & share. we're starting to get to the point where i could run a #k8s cluster in some kind of multi-tenant way, but it'd be a ton of assembly & we're not really there yet.

figuring out the peering & market systems sounds good, but i see those as extras atop an un-opinionated multi-tenant cloud infra, infra we are only just beginning to be able to emerge for ourselves.
@natecull @enkiv2

@byron

Centralization creates economies of scale that are only useful when centralizing. A little bit of duplication, properly distributed, is not noticeable to those to whom it is distributed, while duplication of the whole of a network's data by some centralized owner can easily bankrupt that owner. I don't see why we should bother with this centralization at all. We have the tech to avoid it.

@jauntywunderkind420 @natecull

@byron

If you are already centralized, there are structural incentives to double down and become even more centralized, and economies of scale are part of that.

If you are not centralized (not even federated), then none of that stuff applies. Much easier to run everything off a raspberry pi hooked up to the wifi of the coffee shop down the street than pay amazon to let you access their nightmare of overlapping security groups.

@jauntywunderkind420 @natecull

@byron

As soon as you admit any centralization (even so much as a client-server model), you're trapped by an inevitable logic that leads you to exactly the things we are complaining about in "big tech", & you either go all the way and become the bad guy or you fail earlier.

If you avoid that centralization, however, you've got a lot of flexibility in creating and responding to incentives. You don't need to get subsumed by capital.

@jauntywunderkind420 @natecull

@enkiv2 @byron @jauntywunderkind420

I really hope this is true! I've always *felt* it to be true, even way back in the 80s era of cassette tapes and modem BBSes. It always felt like we were the pioneers of a new underground and there was this vast potential for radical decentralisation.

but, lol, I spent all my time online downloading games, and most of my programming time making games, and not even great games. And now I don't even do much programming, and what little I do seems to be harder.

@enkiv2 @byron @jauntywunderkind420

It's pretty dizzying when you think about the sheer computing power in just an ordinary laptop today.

I have access to a vast collection of books, photos, videos. But organizing that collection is hard, and doing meaningful and useful things with it is even harder.

Websites are pretty terrible infrastructure for spinning up new community groups. It would be great if we could get to 'Facebook group' level of 'just make something happen'. Without the spam.

@enkiv2 @byron @jauntywunderkind420

My personal attention span is shot this year. I really need to at least finish Tex/Tix 0.2. But just sitting down to code up a tiny parser drains my cognitive resources.

I wish we had computing tools that could make thinking itself easier. That's what I've always wanted. Not running all my data through some snooping cloud AI to look for patterns, but just... doing something small and local, letting me create my own patterns in my workflow.

@enkiv2 @byron @jauntywunderkind420

So much of what we do on a modern desktop seems so resistant to automation - because it's all about stitching together tasks performed in multiple app silos that don't share a common language or data model - that it just seems odd when you think about it. Why did we build the desktop this way, so non-user-programmable? Or... a scarier idea... would a really user-programmable desktop actually be a nightmare because everyone's would be subtly different?

@natecull

I'm sure everyone's would be slightly different.

This would be a nightmare for centralized control and centralized maintenance, but it would not be a nightmare for the users, because the users would have made their computers act the way they wanted.

@byron @jauntywunderkind420

@enkiv2 @byron @jauntywunderkind420

Also, if the system ran on a kind of versioning/differencing concept (which I hope it would), it shouldn't be too hard to extract out just the differences from a base system.

At least that's what I think. I don't know if the details really work, but I keep thinking: versioned filesystems, permanent undo, version control, package management, libraries - all these are really the same thing, and we should provide this as a base.
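
A rough toy sketch of the 'base plus your diff' idea, just to show the shape of it (the names and structure here are invented, not a real design):

```python
# Toy sketch: a content-addressed store where "your system" is a shared base
# snapshot plus your personal diff. Purely illustrative, not a real design.
import hashlib, json

store = {}  # hash -> serialized object

def put(obj):
    data = json.dumps(obj, sort_keys=True).encode()
    h = hashlib.sha256(data).hexdigest()
    store[h] = data
    return h

def get(h):
    return json.loads(store[h])

def diff(base, mine):
    """Only the keys the user changed or added (deletions ignored for simplicity)."""
    return {k: v for k, v in mine.items() if base.get(k) != v}

def apply(base, patch):
    merged = dict(base)
    merged.update(patch)
    return merged

base_system = {"shell": "bash", "editor": "nano", "theme": "light"}
my_system   = {"shell": "bash", "editor": "emacs", "theme": "dark"}

base_hash = put(base_system)
patch_hash = put(diff(base_system, my_system))   # tiny: just my differences

restored = apply(get(base_hash), get(patch_hash))
assert restored == my_system
print("patch:", get(patch_hash))   # {'editor': 'emacs', 'theme': 'dark'}
```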

@natecull

chiming in just to strongly affirm, versioning has got to be in the base system / operating system, yes, for sure.

for management tasks & operations, yes, but more so because user objects should have history unto themselves, because not having that is a great weakness. lots of great interrelated stuff here nate, thanks for surfacing it.

@enkiv2 @byron

@natecull @enkiv2 @jauntywunderkind420 Desktops are hard to program for many reasons. The Alto apparently was really object-oriented under the hood. But I think it's actually a hard problem on the one hand, and not enough users cared on the other. Apple did a decent job at user-oriented GUI scripting with AppleScript tbh, but it never caught on massively even within Apple fandom.

You see so many attempts at tying app silos together, incl. IFTTT, Yahoo Pipes, CORBA.

@byron @natecull @enkiv2 @jauntywunderkind420
The Alto had Smalltalk, yes (among other environments -- Mesa, for instance, appears to be in many ways more influential than Smalltalk). Unlike Smalltalk, AppleScript could only really automate input -- users could not live-edit existing running code, let alone the OS itself. Smalltalk was designed to be syntactically simple enough that children could pick it up without much instruction -- and this largely worked.

@enkiv2 @byron @natecull @jauntywunderkind420
Of course, Smalltalk-style environments didn't take off -- for a couple of reasons. 1) Apple popularized the GUI, and Steve Jobs considered user programmability a liability. 2) The Alto was a really powerful machine for 1979, and the Mac was a weak machine for 1984. Consumer GUI machines didn't have the horsepower to run acceptably with interpreted languages all the way down until the 90s.

@enkiv2 @byron @natecull @jauntywunderkind420
By that point, the sentiment that "interpreted languages are too slow for system code" had become 'common sense', & in the mid to late 90s it was basically disproven by the double hitter of Java and JavaScript -- but both those languages were largely used by an influx of new web-crazy programmers who weren't around for the 80s and early 90s flamewars about the future of Smalltalk/Lisp/whatever.

@enkiv2 @byron @natecull @jauntywunderkind420
Now, most people spend most of their time interacting with code that can be live-edited (largely because JIT allows code to be compiled selectively, blurring the line between compiled and interpreted code, not because computers are faster than they were 15 years ago). And modern Smalltalk systems are around. Squeak is pretty good. It implements Morphic, which is a really interesting GUI system.

@enkiv2 @natecull @jauntywunderkind420 It's interesting how in general, attempts at bottom-up fully object-oriented OSes just don't seem to succeed.

I mean you could say that about a lot of things--more stuff fails than succeeds in any category, so it's not a categorical coffin.

Beyond the Alto, there's also OS/2 and BeOS, both of which had rave reviews. (Haiku OS is a neat successor too.)

Consumers didn't value that, then.

But maybe future creative, digital-native generations might.

@byron @natecull @jauntywunderkind420

I mean, that's the problem right there.

You can have users, or you can have consumers. If you fail to distinguish between the two, you get Steve Jobs-style paternalism, which inevitably leads to the kind of consumer-friendly but user-hostile systems we have today.

Nobody likes appliances except appliance manufacturers.

@byron @natecull @jauntywunderkind420

Locking things down is useful in a commercial context. If users can only do things on their own machines with permission, you can sell them permission, & sell third parties a license to give permission (giving you a cut) -- the app store model that gives us one dollar "flashlight apps".

The open source movement made folks realize that soft lockdowns worked even better: give people who master unnecessarily complicated dev toolchains extra privileges.

@byron @natecull @jauntywunderkind420

So long as you scare everybody away from the dev tools except the people who already think they're hot shit, you get all the benefits of being able to squeeze them for pennies plus a dedicated fanbase of people whose egos you have stroked with a tedious puzzle.

@enkiv2 @byron @natecull there was an article a couple weeks ago talking about "app consoles" which was the scariest, most post-general-computing phrasing i had heard.

this is a line between systems/devs & users that should be worked away, should be made fuzzier. alas apple in particular seems to be building ever more totalistic divisions between the systems & the users.

@enkiv2 @natecull @jauntywunderkind420 I want a computer, of course, not an appliance, and so do you. But MOST of the non-programmers I know don't want to learn to program.

I mean look at the classic "nobody can program the VCR" trope. It was *too much* computer and *not enough* appliance.

Something Jobs *got* was that people didn't want to "program" things, they wanted to interact with them in natural ways: with swipes, with voice commands. He was right about that *especially* back then.

@enkiv2 @natecull @jauntywunderkind420 I've even had points in my life where I was doing so much non-technical work that I couldn't afford hack time with my devices and even *I* wanted appliances that would "just work."

What's changing is that programming and hacking are becoming more humanized--more Python, less Asm--which is part of what Jobs was actually doing with AppleScript and Siri-type tech. And consumers are becoming hackers, which he didn't predict at all.

It's a cool convergence.

@byron @enkiv2 @natecull my own theory comes close to something Jobs said:

> “It’s not about Personal Computer .. it’s about *Interpersonal* Computing”.

via tbl! w3.org/blog/2011/10/steve-jobs

The OS layer is, imo, frankly unimportant. All our efforts & interests need to align to a much broader vision of computing that makes the box incidental. Unless that is your starting place, you are probably not going to contribute anything relevant: you'll be remixing & shuffling the obsolete & at this point irrelevant experience.

@jauntywunderkind420 @enkiv2 @natecull Oh that's REALLY GOOD! Good share, thanks.

I bring up Jobs a lot because I think that he and FOSS in general had complementary strengths. He was trying to get to a Star Trek world where you'd chat with a computer, not program it, and tech would look like pretty, fashionable everyday things, not explicitly "technological."

And a lot of us geeks *want* to program it, and like that "tech" look that says "this can be hacked."

Again, convergence is exciting.

@enkiv2 @natecull @jauntywunderkind420 I had a neat journey with that. I started off with interpreted BASIC and it was on even the weakest machines, so thinking of it as "too slow" was kind of funny. I made games and pretty much any kind of program.

Then I did switch to compiled Pascal & C to run things faster. I developed that "interpreted languages suck" mentality--the first web code I wrote was in C.

Quickly I learned Perl was much easier, and learned JavaScript as it ate the world.

@enkiv2 @natecull @jauntywunderkind420 The new book Valley of Genius gives great perspective on this, incl. what Jobs got & didn't re the Alto. He valued a product being physically well-crafted inside & out, but not on the software end, until NeXT.

But as you say, getting the Mac to work at all on that hw was a miracle.

"Valley" also shows Jobs crystallizing what I'd call his big theme: making computers *appliances*. He didn't see coding as creative, but an *artifact* of poor product design.

@enkiv2 @natecull @jauntywunderkind420 In that way I think Jobs just represented the overall view of the tech industry: coding is just a nasty job people have to do to use computers, and if we can reduce or avoid that, more people will buy.

It turns out that when you grow up "digital native" (even I feel like I did to a degree), tech can be fun, even natural.

We see a kind of human/tech co-design.

BTW I learned *about* Smalltalk in school but never used it. Didn't seem kid-level easy though.

@byron @enkiv2 @jauntywunderkind420

I agree with just about all of this.

I do ops rather than dev, so I don't "actually program" nearly as much as I want to, but... programming is also just *hard*, and I feel like our tools still make it way harder than it needs to be.

I want to explore data sets, write queries, save them... There still don't seem to be many good tools for this, and the ones that exist are mostly aimed at big corps with HUGE datasets.
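
The kind of tool I'm wishing for could be embarrassingly small -- e.g. a single local SQLite file where saved queries live right next to the data. A sketch (the file, table, and query names are all just invented for illustration):

```python
# Sketch of "explore a data set, write queries, save them" with nothing but the
# standard library. File and table names are invented for the example.
import sqlite3

con = sqlite3.connect("my_notes.db")
con.execute("CREATE TABLE IF NOT EXISTS books (title TEXT, year INTEGER, rating INTEGER)")
con.execute("CREATE TABLE IF NOT EXISTS saved_queries (name TEXT PRIMARY KEY, sql TEXT)")

con.executemany("INSERT INTO books VALUES (?, ?, ?)",
                [("Stand on Zanzibar", 1968, 5), ("The Shockwave Rider", 1975, 4)])

# Save the query itself alongside the data, so the exploration is repeatable.
con.execute("INSERT OR REPLACE INTO saved_queries VALUES (?, ?)",
            ("favourites", "SELECT title, year FROM books WHERE rating >= 5 ORDER BY year"))
con.commit()

(sql,) = con.execute("SELECT sql FROM saved_queries WHERE name = ?", ("favourites",)).fetchone()
for row in con.execute(sql):
    print(row)
```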

@natecull @byron @jauntywunderkind420

To tangent a bit: the whole "dev" vs "ops" split (and now "devops" vs both "dev" and "ops") is an indication that the user/developer division is getting worse.

Once upon a time, users were people who used computers, and using a computer was the same as programming a computer.

Then we made "applications", so-called because they made specific ways to apply the computer to problem-solving easier.

Now there's a special class of paid expert non-programmer users.

@natecull @byron @jauntywunderkind420

And on top of that, another classification of paid expert non-programmer users who can program a bit.

Which means that application configuration, by itself, is now harder than application development was 40 years ago.

@natecull @byron @jauntywunderkind420

(Most of this difficulty is because of scale, and specifically centralization. We need experts because scale is hard, we need scale because of centralization, and we need centralization because people will pay us more than server costs to run open source programs on our machines but wouldn't pay anybody anything to run them on their own machines.)

@natecull @byron @jauntywunderkind420

(But if we make the configuration complicated enough & make it default to behaviors that only make sense on clusters of beefy datacenter racks, then nobody will run it on their machines anyway and our income is ensured!)

@urusan @natecull @byron @jauntywunderkind420

I find it hard to believe that 'computer operators' and ops are the same people, since the job has changed fundamentally since the hey-day of computer operators in the 60s and 70s. But then again, this classification includes data entry as a subclassification, so it's absurdly broad.

@enkiv2 @natecull @byron @jauntywunderkind420 Well, my point is just that there's always been a special class of paid expert non-programmer computer users. BLS was even tracking the role until very recently (the link was from 2017).

Of course ops is a different job (or really several different jobs). Here's OOH's page on the main ops job bls.gov/ooh/computer-and-infor

@natecull @enkiv2 @jauntywunderkind420 BTW on the topic of "finding the energy to code something" -- I find that with creative work, you have to acknowledge inspiration as part of the job.

That is, the job isn't just "do X" but also "stay excited about X." On the bigger level, X might not just be that project, but coding in general. You might need to take a break from that project and just *play* with code a bit, whatever that means. Having it be an obligation you're not excited about can be draining.

@natecull

Few of the things users can do today with book/photo/&c collections have much impact on the web/online. We have to keep waiting for apps to begin to have capabilities, to let them ingest our media for us. And that speaks of a very poor ecosystem.

@enkiv2 @byron

@jauntywunderkind420

They're linked, though.

Back in the day, the go-to explanations for why users didn't have control over application behavior were:
* "our source code is a valuable business secret, and keeping it secret makes us better than competitors"
* "interpreted languages are too slow"
* "end users can't be trusted with compilers"

All of that is clearly bullshit or irrelevant now.

@natecull @byron

@jauntywunderkind420

But a corporation can still take their local modifications of open source code, stick them on a machine they own or rent, and write some interpreted code to interface with them -- and boom! web app, which they can modify whenever they want and which end users can't even see the whole of.

Mandatory network connection is part of keeping users away from meaningful control of these systems.

@natecull @byron

@jauntywunderkind420

(This is, in fact, to some degree intentional, as you can see from the genealogy. The first applications to require unnecessary always-on internet, way back in the 90s, were expensive CAD systems like Maya that used this for DRM -- checking for simultaneous license use from different machines.)

@natecull @byron

@natecull @enkiv2 @jauntywunderkind420 I *definitely* agree -- I remember that same feeling that computers and networking could be freeing and a decentralizing force.

You get different forces when there's huge value to be unlocked in a "wild west" frontier: the decentralizing forces that have been powerful at times, like IRC, email, the P2P sharing revolution, hopefully the fediverse; and centralizing forces.

My worry is the "wild west" will be "won" by big gov and big corps.

@natecull @enkiv2 @jauntywunderkind420 The thing is, hierarchy is powerful, and that's what centralization is. It's why militaries tend to that structure, and corporations too.

I mean, imagine if every Linux user just agreed on ONE Linux distro to support in evangelizing to newbies? Unity has power.

Tech frontiers tend to start chaotic and create a few big winners.

But *networking* tech has the *potential* to disrupt that pattern and make decentralization cheaper, more efficient and more powerful.

@enkiv2
There's a lot to be said for systems that don't have to be online to work (net-work). Breaking that often presently-necessary coupling is, I agree, pretty important, if only because not everyone can keep a pi at a coffee shop, but they still deserve to be able to serve & help the net if they fall off it.

But past that definition, I feel like most notions of distributed/decentralized are alluring sugar-pop dreams that mask over how rudimentary even centralized tools, with full super-ops powers, are at enabling user creativity & imagination. Distributed is not entirely but largely orthogonal to a vaster, more alarming impotency.
@byron @natecull

@jauntywunderkind420 @enkiv2 @natecull Git's a good example of the way systems can be built to function both online and offline, and it's good that even web apps like Gmail try to handle temporary loss of connection.

I agree, good decentralized systems should be able to handle a variety of network levels, from "fuck it, I'm airgapped" to "I'm travelling a lot" to reliable high speed connections.

But the networks are where the most power and potential are, especially when we think of P2P clouds.

@jauntywunderkind420 @natecull @enkiv2 You're talking about it in negative terms, but as much as I prefer decentralized and private computing power, it's actually *amazing* how much power big cloud computing puts cheaply at the disposal of even individuals and small companies.

One simple example: if you only occasionally need massive spikes of CPU and RAM, the cloud can give you that without buying the hardware or waiting hours or days. E.g. for Blender renders, compiling...

@byron
i speak as much of the applications running & their massive, at-scale, inflexible, corporate-run nature as i do the physical infrastructure of the cloud.

i acknowledge the technics you speak of; you're not wrong. yet this current balance of power is grossly, grossly inhumane & deeply restrictive of the human consciousness. these social media applications we interact with are far-off, unintelligible aliens to us, deeply impersonal entities, with spectacular haunting capabilities that give enormous hard-to-recognize power to strangers.

you talk about us having technical capabilities, but i don't see them being marshaled to help create independent pervasive-onlineness that competes with the heavily saturated smattering of social networks much of humanity has integrated themselves into. agreed, access to cloud infra is good, yet it's not being used to compete. we here are some of the few counter-examples.
@natecull @enkiv2
