I guess the question that occupies my mind a lot these days is:

Can we build a healthy, positive, life-affirming Internet?

I feel like large parts of our Internet infrastructure are toxic to mental health and social freedom and were designed that way on purpose, because the system seeks money, and you get more money by controlling people than by allowing them to flourish and reach their full potential. This has always been capitalism's big problem (and socialism's too).

@natecull The real issue is technological change outpacing society's speed of adaptation.

As environments change, people develop etiquette, laws, religious traditions & stories to pass on what behaviours work & which values we need to remember.

Companies have just been responding to what's favoured by the social and legal context. So there's a feedback loop with both positive and negative consequences.

Maybe we need to increase the rate of social response.

@byron
I don't really buy this. Technological change has slowed down substantially since its peak in the 70s (to the point that most of what we, as individuals and even as early-adopters, run into as 'new' technology is really 70s tech that finally became profitable), & smaller groups had bigger shifts in tech for decades.

We *are* seeing the effects of certain tech at a larger scale than before, but mostly, we're seeing the effects of capital-amplifiers.

@natecull

@byron
There's no qualitative change happening in, say, ad targeting. Ad targeting works exactly the same way it did in 1995 (and exactly the same way folks were expecting it to eventually start working in the 70s, when computers & statistics were first being applied to the problem).

There's a quantitative change happening, which is that we reached the physics-theoretic peak of speed for integrated circuits 15 years ago, & we're working on getting everybody on the grid.

@natecull

@byron

And that basically means that it's becoming harder to ignore the gap between theory and practice: you have enormous amounts of computing power and enormous amounts of data and you still can't make advertising work reliably.

@natecull

@byron

And until that bubble collapses (which nobody really wants, because it'll take the global economy with it, because most of the economy is just gambling on futures of futures of futures of ad valuations for novelty t-shirts and other trash), the reaction is that everybody in that industry doubles down and makes promises about how they'll get two tenths of one percent more likelihood of a sale from three gigs more targeting data per person.

@natecull

@byron

This isn't a 'new' phenomenon at all. It's the inevitable result of following the original 1970s script. And you'll find people -- not even necessarily terribly technical people, but essayists and science fiction authors -- writing in the 60s, 70s, 80s, 90s about this script and talking about its end-game (which we are living through), because all it takes to predict it is an unwillingness to buy into the hype.

@natecull

@enkiv2 @natecull Some parts of the dystopian imagination of the 70s and so on were well conceived, like John Brunner's idea that the successful modern humans would be the ones best at adapting to hyper rapid change, or his prescient ideas about crowdsourcing knowledge.

But those visions are always lopsided, seeing one set of forces without anticipating the counterforces equally well.

@byron @enkiv2

The 1920s reference is something I've been thinking about too. It feels like we're in a very similar spot.

In some ways, the smartphone is just the final (or interim) delivery on what 'radio' promised in the 1920s. Took a while to get batteries, transmitters, and aerials small enough, and then layering computers over the top was something nobody quite imagined then. But the definite feeling was in the air. "what if... telecommunication?"

@byron @enkiv2

And now we're like a dog who caught the car and we're sitting there, a little dazed, entire back axle between our teeth.

We wired every human brain together. Now what comes next?

@natecull @byron @enkiv2
In my humble opinion, there's a lot of exploration to do, but we are stuck in this post-PC phase where the convenience of having our systems online & available outweighs the apparent advantages of having computing be personal, be a thing we can change & control & orchestrate & customize.

We are all trapped, bound to online cloud mainframes that offer us only a small slice of the world they contain & compute, where all powers we receive must be baked into the application layer & there is no cloud OS we can expect.

The challenge seems obvious, that we are all online, but supplicants. We could have a million responses to disinformation, to bad actors, try & discover what works to grow healthily together, but we are locked onto these giant properties, reliant on them to give us all tools & systems for socialization in this hostile environment. We must begin to be of our own minds, bring our own minds online. #noospherics

@natecull @byron @enkiv2
Byron came gobsmackingly close to the motherboard of the truth,

> John Brunner's idea that the successful modern humans would be the ones best at adapting to hyper rapid change, or his prescient ideas about crowdsourcing knowledge.
>
> But those visions are always lopsided, seeing one set of forces without anticipating the counterforces equally well.
ruby.social/@byron/10486743480

But right now there are fewer than a dozen corporations who have their own cloud, who have the basis to begin to adapt & explore & adventure. The rest of us monkeys scratching about in the dirt can use these tools to advance ourselves, but at great expense, & with limited control & greatly restricted understanding. For corporations, these restraints are not so bad, but the de-personalization places a hard limit on the individual & their expression & adaptation.

@jauntywunderkind420 @natecull @enkiv2 We're not monkeys scratching in the dirt; computing is made cheaper by the cloud.

Still, it *must* be a short-term solution. I'd love to see a growth of mesh computing, both physical networks (wifi) and P2P cloud computing.

The value of CPU time keeps dropping, right? If you could sell spare CPU cycles to a P2P cloud network and it ran your computer down a little faster, you'd come out ahead.

And we'd get cloud computing power without the monopolies.

@byron

Centralization creates economies of scale that are only useful when centralizing. A little bit of duplication, properly distributed, is not noticeable to those it is distributed to, while duplication of the whole of a network's data by some centralized owner can easily bankrupt that owner. I don't see why we should bother with this centralization at all. We have the tech to avoid it.

@jauntywunderkind420 @natecull

@byron

If you are already centralized, there are structural incentives to double down and become even more centralized, and economies of scale are part of that.

If you are not centralized (not even federated), then none of that stuff applies. Much easier to run everything off a raspberry pi hooked up to the wifi of the coffee shop down the street than pay amazon to let you access their nightmare of overlapping security groups.

@jauntywunderkind420 @natecull

@byron

As soon as you admit any centralization (even so much as a client-server model), you're trapped by an inevitable logic that leads you to exactly the things we are complaining about in "big tech", & you either go all the way and become the bad guy or you fail earlier.

If you avoid that centralization, however, you've got a lot of flexibility in creating and responding to incentives. You don't need to get subsumed by capital.

@jauntywunderkind420 @natecull

@enkiv2 @byron @jauntywunderkind420

I really hope this is true! I've always *felt* it to be true, even way back in the 80s era of cassette tapes and modem BBSes. It always felt like we were the pioneers of a new underground and there was this vast potential for radical decentralisation.

but, lol, I spent all my time online downloading games, and most of my programming time making games, and not even great games. And now I don't even do much programming, and what little I do seems to be harder.

@enkiv2 @byron @jauntywunderkind420

It's pretty dizzying when you think about the sheer computing power in just an ordinary laptop today.

I have access to a vast collection of books, photos, videos. But organizing that collection is hard, and doing meaningful and useful things with it is even harder.

Websites are pretty terrible infrastructure for spinning up new community groups. It would be great if we could get to 'Facebook group' level of 'just make something happen'. Without the spam.

@enkiv2 @byron @jauntywunderkind420

My personal attention span is shot this year. I really need to at least finish Tex/Tix 0.2. But just sitting down to code up a tiny parser drains my cognitive resources.

I wish we had computing tools that could make thinking itself easier. That's what I've always wanted. Not running all my data through some snooping cloud AI to look for patterns, but just... doing something small and local, letting me create my own patterns in my workflow.


@enkiv2 @byron @jauntywunderkind420

So much of what we do on a modern desktop seems really resistant to automation - because it's all about stitching together tasks performed in multiple app silos that don't share a common language or data model - that it just seems odd when you think about it. Why did we build the desktop this way, so non-user-programmable? Or... a scarier idea... would a really user-programmable desktop actually be a nightmare because everyone's would be subtly different?


@natecull

I'm sure everyone's would be slightly different.

This would be a nightmare for centralized control and centralized maintenance, but it would not be a nightmare for the users, because the users would have made their computers act the way they wanted.

@byron @jauntywunderkind420

@enkiv2 @byron @jauntywunderkind420

Also, if the system ran on a kind of versioning/differencing concept (which I hope it would), it shouldn't be too hard to extract out just the differences from a base system.

At least that's what I think. I don't know if the details really work, but I keep thinking: versioned filesystems, permanent undo, version control, package management, libraries, all these are really the same thing, we should provide this as a base.
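(A minimal sketch of what "extracting just the differences from a base system" could look like, in Python -- the file names, base contents, and the choice of unified diffs here are illustrative assumptions, not any real system:)

```python
# Hypothetical sketch: a user's whole customization is stored as diffs
# against a shared, versioned base -- the same primitive that filesystem
# snapshots, undo, version control, and package overlays could all share.
import difflib

BASE = {
    "menu.cfg": "open\nsave\nquit\n",          # the stock system ships this
}

user_current = {
    "menu.cfg": "open\nsave\nexport\nquit\n",  # the user added a menu item
}

def diff_from_base(path, current):
    # The customization *is* the unified diff against the base file.
    return "".join(difflib.unified_diff(
        BASE[path].splitlines(keepends=True),
        current.splitlines(keepends=True),
        fromfile="base/" + path,
        tofile="user/" + path,
    ))

for path, text in user_current.items():
    print(diff_from_base(path, text))   # this diff is all you'd need to share
```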

@natecull

chiming in just to strongly affirm, versioning has got to be in the base system / operating system, yes, for sure.

for management tasks & operations, yes, but more so because user objects should have history unto themselves, because not having that is a great weakness. lot of great interrelated stuff here nate, thanks for surfacing it.

@enkiv2 @byron

@natecull @enkiv2 @jauntywunderkind420 Desktops are hard to program for many reasons. The Alto apparently was really object-oriented under the hood. But I think it's actually a hard problem on the one hand, and not enough users cared on the other. Apple did a decent job at user-oriented GUI scripting with AppleScript tbh, but it never caught on massively even within Apple fandom.

You see so many attempts at tying app silos together, incl. IFTTT, Yahoo Pipes, CORBA.

@byron @natecull @enkiv2 @jauntywunderkind420
The Alto had Smalltalk, yes (among other environments -- Mesa, for instance, appears to be in many ways more influential than smalltalk). Unlike Smalltalk, AppleScript could only really automate input -- users could not live-edit existing running code, let alone the OS itself. Smalltalk was designed to be syntactically simple enough that children could pick it up without much instruction -- and this largely worked.

@enkiv2 @byron @natecull @jauntywunderkind420
Of course, smalltalk-style environments didn't take off -- for a couple reasons. 1) Apple popularized the GUI, and Steve Jobs considered user programmability a liability. 2) The Alto was a really powerful machine for 1979, and the Mac was a weak machine for 1984. Consumer GUI machines didn't have the horsepower to run acceptably with interpreted languages all the way down until the 90s.

@enkiv2 @byron @natecull @jauntywunderkind420
By that point, the sentiment that "interpreted languages are too slow for system code" had become 'common sense', & in the mid to late 90s it was basically disproven by the double hitter of java and javascript, but both those languages were largely used by an influx of new web-crazy programmers who weren't around for the 80s and early 90s flamewars about the future of smalltalk/lisp/whatever.

@enkiv2 @byron @natecull @jauntywunderkind420
Now, most people spend most of their time interacting with code that can be live-edited (largely because JIT allows code to be compiled selectively, blurring the line between compiled and interpreted code, not because computers are faster than they were 15 years ago). And, modern smalltalk systems are around. Squeak is pretty good. It implements Morphic, which is a really interesting GUI system.

@enkiv2 @natecull @jauntywunderkind420 It's interesting how in general, attempts at bottom-up fully object-oriented OSes just don't seem to succeed.

I mean you could say that about a lot of things--more stuff fails than succeeds in any category, so it's not a categorical coffin.

Beyond the Alto, there's also OS/2 and BeOS, both of which had rave reviews. (Haiku OS is a neat successor too.)

Consumers didn't value that, then.

But maybe future creative, digital-native generations might.

@byron @natecull @jauntywunderkind420

I mean, that's the problem right there.

You can have users, or you can have consumers. If you fail to distinguish between the two, you get Steve Jobs-style paternalism, which inevitably leads to the kind of consumer-friendly but user-hostile systems we have today.

Nobody likes appliances except appliance manufacturers.

@byron @natecull @jauntywunderkind420

Locking things down is useful in a commercial context. If users can only do things on their own machines with permission, you can sell them permission, & sell third parties a license to give permission (giving you a cut) -- the app store model that gives us one dollar "flashlight apps".

The open source movement made folks realize that soft lockdowns worked even better: give people who master unnecessarily complicated dev toolchains extra privileges.

@byron @natecull @jauntywunderkind420

So long as you scare everybody away from the dev tools except the people who already think they're hot shit, you get all the benefits of being able to squeeze them for pennies plus a dedicated fanbase of people whose egos you have stroked with a tedious puzzle.

@enkiv2 @natecull @jauntywunderkind420 I'd call this a feedback loop more than a sinister plan (although there's some sinister there too).

Companies over-invest in the sure things (most customers don't want to program, we don't want to scare them off, so we'll make those crazy geeks jump through hoops) and not enough in the long tail.

Microsoft originally got that pretty well, trying to appeal to developers; it wasn't until Jobs returned that he got it, too.

FOSS evolved in the reverse way!

@enkiv2 @byron @natecull there was an article a couple weeks ago talking about "app consoles" which was the scariest, most post-general-computing phrasing i had heard.

this is a line between systems/devs & users that should be worked away, should be made fuzzier. alas apple in particular seems to be building ever more totalistic divisions between the systems & the users.

@enkiv2 @natecull @jauntywunderkind420 I want a computer, of course, not an appliance, and so do you. But MOST of the non-programmers I know don't want to learn to program.

I mean look at the classic "nobody can program the VCR" trope. It was *too much* computer and *not enough* appliance.

Something Jobs *got* was that people didn't want to "program" things, they wanted to interact with them in natural ways: with swipes, with voice commands. He was right about that *especially* back then.

@enkiv2 @natecull @jauntywunderkind420 I've even had points in my life where I was doing so much non-technical work that I couldn't afford hack time with my devices and even *I* wanted appliances that would "just work."

What's changing is that programming and hacking are becoming more humanized--more Python, less Asm--which is part of what Jobs was actually doing with AppleScript and Siri type tech. And consumers are becoming hackers, which he didn't predict at all.

It's a cool convergence.

@enkiv2 @natecull @jauntywunderkind420 I strongly oppose locked-down platforms, and I've always found any operating system that didn't have built-in programming languages to be broken.

But the forces that cause those things aren't only nefarious, they're also based on consumer preferences and the tech learning curve too, things that open source has been slower to learn.

Yes, easy things should be easy and hard things possible. But e.g. Apple focused first on "easy" and FOSS on "possible."

@byron @enkiv2 @jauntywunderkind420

I think this is what worries me most: that the forces pushing us to locked down, snooping, censoring, distantly controlled, not-in-the-users-interests platforms *are* emergent and not top-down, caused by inherently uncontrollable complexity and insecurity. That there may *be* no way to get both "it just works" AND "I can trust it to be working for ME" in sufficiently programmable technology.

@natecull @byron @jauntywunderkind420

We might not be able to get all the way there, but we can absolutely get closer than we have (because we have gotten closer in the past).

It requires fighting against big powerful forces like "profitability" and "the existence of a software industry".

@enkiv2 @byron @jauntywunderkind420

I used to think that open source and copyleft would be enough to create a self-improvement feedback cycle that made all software inevitably evolve towards Smalltalk / Lisp Machine with accelerating interoperability, making the software industry obsolete.

But then I saw 1) Emacs and 2) Squeak. Both seemingly very walled, rigid, hard-to-evolve environments and user cultures. Not just hard, actively *refusing* evolution and interoperability.

@enkiv2 @byron @jauntywunderkind420

I still kind of cling to the foolish hope that it's "just a grammar problem", that if we just somehow get a small, safe, complete, consistent set of tools for enabling self-modification, that we'll get a new user-driven Cambrian Explosion.

But I fear that I'm way wrong, that the big forces are just too big to be solved that way.

@natecull @enkiv2 @byron
It has to go way past the language. We need digital entities, networked entities, that make sense on their own, outside of language.

@natecull @enkiv2 @byron @jauntywunderkind420 :blobcatpeek:

For emacs, it seems people would like it to grow but this is hamstrung. For example, there is that research project to compile e-lisp so they can bring more of emacs into itself. But there are onerous legalities to getting things into emacs (multi-step copyright assignments) and dealing with RMS once upon a time (e.g. complicating emacs communicating with other processes out of fear emacs would get colonized by corporates).

For squeak, well, yes. The VM god is outright hostile. https://github.com/OpenSmalltalk/opensmalltalk-vm/issues/519

@natecull @byron @enkiv2
Imo the main reason these systems beyond our visibility & control keep emerging & spreading is that there aren't alternatives. We have yet to do the wayfinding, for our selves, for how to do interpersonal computing.

Fediverse here is as bleeding edge as we've gotten. XMPP is another example: its chat (MUC) was terrible, is good now but unimplemented (MIX, XEP-0369). And xmpp just got dropped, right as it was working, & we failed to diy thereafter.

@jauntywunderkind420 @byron @enkiv2

I really like that phrase, "interpersonal computing". Seems to capture a lot of the problem space.

@jauntywunderkind420 @natecull @byron @enkiv2 I don't know that I agree with your statement, jaunty. MIX is still pretty raw, and xmpp was a good enough choice for damn near every messaging service on planet earth worth mentioning to be based on.

@Citizenzibb

there's a whole bunch of very very small subtle changes in MIX that, imo, add up to a radically better experience over MUC.

xmpp.org/extensions/xep-0369.h lists a bunch of technics. users being able to join channels on multiple clients, having reactions to messages, getting chat history (MAM) on join... hard to imagine chat without these. & the technics are, all in all, much more consistent & easy for developers to work with, & across with other specifications, imo.

MUC was a huge specification, super busy, almost all custom; MIX is simple, & broken into a bunch of small understandable features: most complexity is delivered from existing super popular standards.

but yeah MUC was better than having zero power as users, which is what we were left with.

@Citizenzibb i think there were some other good posts in this genre, maybe earlier years? but State of Mobile XMPP in 2016 is a good example of #xmpp doing some critical self assessment, & (not as visible from this post) doing a lot of hard work & new spec work to find better paths forwards. xmpp has been amazing about this.

gultsch.de/xmpp_2016.html

@byron @natecull @jauntywunderkind420

Jobs was not involved with AppleScript (or HyperCard) and would not have approved it had he been around at that time. One of the first things he did upon coming back to Apple was cancel HyperCard & OpenDoc.

Siri matches the appliance model of a computer perfectly: you can only perform specific already-approved tasks, because composition & complex commands are not possible.

@enkiv2 @natecull @jauntywunderkind420 He was also trying to get simplify offerings in a company that was dying. It sucked that he got rid of HyperCard though, that was a neat development environment for kids. My brother learned on it.

As for Siri, you can see from his '80s commercials that his end goal was a *conversational* AI assistant/agent. The limitations were just "what we can do now."

That's not to canonize Jobs, just that I'm trying to learn the best lessons there.

@byron @natecull @jauntywunderkind420

Jobs was out when the Knowledge Navigator ads came in. That was firmly a John Sculley project.

@byron @natecull @jauntywunderkind420

We should not confuse "just works" with "not hackable".

First off, nothing "just works" for all possible uses or all possible users. Hackability is what provides that flexibility. When you have hackability, then "just works" really means "sensible defaults".

Jobs' goal was personal control. This is why he banned programmability & extension cards.

@byron @natecull @jauntywunderkind420

We have the ability to not just blur the line between programming and a 'user-friendly' experience but completely demolish it. We've had that ability for decades -- if we learn from minimalist languages like smalltalk & real-time user-centric REPL design like modern unix shells have.

@byron @natecull @jauntywunderkind420

The thing is, if we empower users, most working programmers will be out of a job, because most of what we do:
1) should not be done at all
2) is made overly complicated by a social structure around languages and toolchains that treats bureaucracy and incomprehensibility as virtues
3) cannot possibly match the desired behavior of a majority of users

@enkiv2 @byron @natecull
I tend to think those examples are good but also kind of irrelevant. They had good characteristics, but only in some aspects, amid a lot of other things going on. It's too complicated to be clear.

What I do see as a pure virtue that stood for something clear was 9p. Expose your state. Let it span systems. Use common os tools to manipulate state.
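(A loose sketch of that idea in Python -- not the actual 9P protocol, and everything below is made up for illustration: a component exposes its state as a little file tree, so ordinary tools and plain reads/writes can inspect and change it:)

```python
# Hypothetical sketch of "expose your state; manipulate it with common tools".
# This is not 9P; it just mimics the spirit with ordinary files.
import os
import tempfile

state_dir = tempfile.mkdtemp(prefix="player-state-")  # stand-in for a mounted service

def expose(name, value):
    # Publish one piece of state as a plain file.
    with open(os.path.join(state_dir, name), "w") as f:
        f.write(str(value))

def read(name):
    # Any tool that can read a file can observe the state.
    with open(os.path.join(state_dir, name)) as f:
        return f.read().strip()

expose("volume", 40)
print(read("volume"))    # "40" -- cat, ls, grep, etc. would all work here too
expose("volume", 75)     # changing state needs no special API, just a write
print(read("volume"))    # "75"
```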

@jauntywunderkind420 @byron @natecull

Plan9 is definitely one of the more interesting systems here. Not so much due to 9p -- which is mostly interesting for workgroups -- but because of the mouse language, where arbitrary text can become a button, since it gets treated as a 'command' when you use a particular mouse button.

But the unix shell is a more user-centric interface than most GUIs, and exposes basic control flow and threading in an easily-understood way.

@jauntywunderkind420 @byron @natecull

To the extent that the unix shell is user-hostile, it's due to backwards compatibility. So, future systems not intended to be Bourne-compatible can learn the good bits & ditch the bad bits.

The smalltalk language showed how to make a full language out of a handful of small parts that are understandable to non-programmers. The smalltalk environment showed how to extend inter-process pipelines to a GUI.

@jauntywunderkind420 @byron @natecull

"Use common OS tools to manipulate state" is a weaker form of "use the same language, API, and idioms all the way through the stack" -- a homogeneous system, like smalltalk, lisp, and forth machines have.

This is valuable because it means that 'non-technical end users', while they may need to learn various concepts or technical details, do not need to learn a new language to investigate or manipulate *anything* in the system. They own it.

@jauntywunderkind420 @byron @natecull

(Which means if they WANT to learn concepts or technical details of their system, they can use the knowledge they already have from using the system to observe the existing behavior, read the implementation, and experiment.)
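(A hedged sketch of that flavor, in Python rather than Smalltalk -- Python only approximates a homogeneous system, and this Clock class is invented for illustration -- but it shows "inspect and re-define the running thing with the same language you already drive it with":)

```python
# Illustrative only: the same language is used to run, inspect, and change
# a live object -- no separate configuration language or build step.
class Clock:
    def label(self):
        return "14:30"                    # hard-coded 24-hour display

c = Clock()
print(c.label())                          # "14:30"

print(vars(Clock))                        # look inside the running class
Clock.label = lambda self: "2:30 PM"      # swap in the behavior you want
print(c.label())                          # existing instances pick up the change
```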

@byron @enkiv2 @natecull my own theory comes close to something Jobs said:

> “It’s not about Personal Computer .. it’s about *Interpersonal* Computing”.

via tbl! w3.org/blog/2011/10/steve-jobs

The OS layer is, imo, frankly unimportant. All our efforts & interests need to align to a much broader vision of computing that makes the box incidental. Unless that is your starting place, you are probably not going to contribute anything relevant: you'll be remixing & shuffling the obsolete & at this point irrelevant experience.

@jauntywunderkind420 @enkiv2 @natecull Oh that's REALLY GOOD! Good share, thanks.

I bring up Jobs a lot because I think that he and FOSS in general had complementary strengths. He was trying to get to a Star Trek world where you'd chat with a computer, not program it, and tech would look like pretty, fashionable everyday things, not explicitly "technological."

And a lot of us geeks *want* to program it, and like that "tech" look that says "this can be hacked."

Again, convergence is exciting.

@enkiv2 @natecull @jauntywunderkind420 I had a neat journey with that. I started off with interpreted BASIC and it was on even the weakest machines, so thinking of it as "too slow" was kind of funny. I made games and pretty much any kind of program.

Then I did switch to compiled Pascal & C to run things faster. I developed that "interpreted languages suck" mentality--the first web code I wrote was in C.

Quickly I learned Perl was much easier, and learned JavaScript as it ate the world.

@enkiv2 @natecull @jauntywunderkind420 The new book Valley of Genius gives great perspective on this, incl what Jobs got & didn't re the Alto. He valued a product being physically well-crafted inside & out, but not on the software end, until NeXT.

But as you say, getting the Mac to work at all on that hw was a miracle.

"Valley" also shows Jobs crystallizing what I'd call his big theme: making computers *appliances*. He didn't see coding as creative, but an *artifact* of poor product design.

@enkiv2 @natecull @jauntywunderkind420 In that way I think Jobs just represented the overall view of the tech industry: coding is just a nasty job people have to do to use computers, and if we can reduce or avoid that, more people will buy.

It turns out that when you grow up "digital native" (even I feel like I did to a degree), tech can be fun, even natural.

We see a kind of human/tech co-design.

BTW I learned *about* Smalltalk in school but never used it. Didn't seem kid-level easy though.

@byron @enkiv2 @jauntywunderkind420

I agree with just about all of this.

I do ops rather than dev, so I don't "actually program" nearly as much as I want to, but... Programming is also just *hard*, and I feel like our tools still make it way harder than it needs to be.

I want to explore data sets, write queries, save them... There still don't seem to be many good tools for this. And the ones that exist are mostly aimed at bigcorps with HUGE datasets.

@natecull @byron @jauntywunderkind420

To tangent a bit, the whole "dev" vs "ops" split (and now "devops" vs both "dev" and "ops") is an indication that the user/developer division is getting worse.

Once upon a time, users were people who used computers, and using a computer was the same as programming a computer.

Then we made "applications", so-called because they made specific ways to apply the computer to problem-solving easier.

Now there's a special class of paid expert non-programmer users.

@natecull @byron @jauntywunderkind420

And on top of that, another classification of paid expert non-programmer users who can program a bit.

Which means that application configuration, by itself, is now harder than application development was 40 years ago.

@natecull @byron @jauntywunderkind420

(Most of this difficulty is because of scale, and specifically centralization. We need experts because scale is hard and we need scale because centralization is hard and we need centralization because people will pay us more than server costs to run open source programs on our machines but wouldn't pay anybody anything to run it on their own machines.)

@natecull @byron @jauntywunderkind420

(But if we make the configuration complicated enough & make it default to behaviors that only make sense on clusters of beefy datacenter racks, then nobody will run it on their machines anyway and our income is ensured!)

@urusan @natecull @byron @jauntywunderkind420

I find it hard to believe that 'computer operators' and ops are the same people, since the job has changed fundamentally since the heyday of computer operators in the 60s and 70s. But then again, this classification includes data entry as a subclassification, so it's absurdly broad.

@enkiv2 @natecull @byron @jauntywunderkind420 Well, my point is just that there's always been a special class of paid expert non-programmer computer users. BLS was even tracking the role until very recently (the link was from 2017).

Of course ops is a different job (or really several different jobs). Here's OOH's page on the main ops job bls.gov/ooh/computer-and-infor

@enkiv2 @natecull @byron @jauntywunderkind420 One of the biggest forces changing the landscape of IT careers is automation. For instance, the "computer programmer" career is in massive decline, the same situation computer operators were in over the last few decades: bls.gov/ooh/computer-and-infor

It's not that programmers are going away, they're just being reclassified as "software developers" and "software engineers"
bls.gov/ooh/computer-and-infor

@urusan @natecull @byron @jauntywunderkind420

I don't know why the BLS would have a separate classification for programmer and software developer. In practice, a software engineer is just a programmer with a college degree and $5000 added to their starting salary.

(Yeah, yeah, in theory software engineers come from engineering schools and know how to do statistical analysis for quality control etc. Never met another SE in a work context with that background, though.)

@enkiv2 @natecull @byron @jauntywunderkind420 BLS says that their programmer classification typically has a bachelor's degree, that's not the difference here.

Similarly, a degree isn't necessarily required for the developer role.
