
I make my living as a systems administrator and support engineer. (And I'm looking for a new gig, if you're hiring.) That's a fancy way of saying that I solve people's computer problems. Professionally, I'm responsible for identifying and mitigating the shortcomings of various computer systems.

Guess what?
There are a lot of these shortcomings. Like, a lot. More than I ever expected.

Some of these shortcomings are legitimate bugs. Some of them are bafflingly short-sighted or poorly considered architectural decisions. Just as many are cases of a divergence between the needs of the user and the abilities of a program. Modern programs are often feature incomplete, poorly supported, and difficult or impossible to customize. Modern computers are often slow and cranky. I'm responsible for handling the fallout of this unfortunate situation.

I've seen how revolutionary a computer can be, if it is designed with the needs of the user in mind, and how disastrous the same can be when it is not. I've seen computers used to empower people, and used to oppress. I've seen computers be Good, and the consequences of when they are not.

So that's who I am, and my experience with computers so far. Those are my credentials, and my qualifications.

Before we go any further, let's talk about The Computer Chronicles.

The Computer Chronicles was a TV show that ran from the early 80s through the early 00s. Over its nearly 20-year run, The Computer Chronicles covered nearly every facet of the newly developing computer industry. It was hosted by people with Opinions.

The guests were, frequently, people who were proud of the things they made, or the software they represented.

Watching the developer of CP/M and DR DOS talk to a mainframe engineer who worked at IBM in the 50s about the future of computers as seen from the 1980s was eye-opening.

On the one hand, this show serves as an excellent introduction to, or reminder of, the capabilities of computers 35 years ago. It helps us see how far we've come in terms of miniaturization, while also demonstrating again that, in many ways, there is nothing new under the sun.

Before the advent of the internet, reporters were writing their stories on laptops and sending them in over phone lines. Twenty-five years before the release of the iPhone, HP released a computer with a touchscreen. Three years before Microsoft released the first version of Windows, Apple and VisiCorp demonstrated GUIs with features that Windows wouldn't be able to approach for another 9+ years.

And, of course, I'm reminded again of Douglas Engelbart's 1968 "Mother of all Demos", in which he demonstrated the mouse, the GUI, instant messaging, networked gaming, and basically every other important development of the following 50 years.

It took 5 years for Xerox to refine and miniaturize Engelbart's ideas to the point that they thought they could market them, and another 10 years before Apple refined and further miniaturized the same ideas and brought us the Mac.

Nothing is ever new.

The whole video of Engelbart's Online System (NLS) is available on YouTube. Some of it is *really* interesting. Most of it is unfortunately dry. It's easy to forget that this was 50 years ago, and also mind-blowing that it was only 50 years ago.

Anyway, back to Computer Chronicles. In an episode about word processors, the man they were interviewing said, "There's a lot of talk about making people more computer literate. I'd rather make computers more people literate." That's a phrase that resonated with me in a big way.

It sounds like the kind of semantic buzzword shuffling so common in standard corporate speak, but I got the impression that the guy who said it believed it. He believed that computers had gotten powerful enough that they no longer had to be inscrutable.

There were others working around the same time on similar ideas, or at least from a similar philosophy. Working to make computers, if not intuitive, at least comprehensible. I think this is a noble goal.

The computer is often thought of as a tool, but it is more like a tool shed, in which we store a collection of tools, a source of power, and a workspace.

The tools of the 60s and 70s were primitive, partially because of the limited space and limited power our tool shed could provide for them, but also because our ideas and understanding of how these tools should work were limited by the audience who was using them.

That is to say, in the 60s and 70s, computers were weak and slow, and computer users were also computer programmers. A small, tight-knit circle of developers and computer scientists were responsible for the bulk of the progress made in that time, and the idea of designing tools for non-technical users was never considered.

Computer culture had, by and large, a kind of elitism about it as a result of the expense and education required to really spend much time with a computer. This changed, slowly, starting in the mid 70s with the development of the Microcomputer Market and CP/M.

Computers became more affordable, slowly. Affordable computers became more powerful, quickly. Within 10 years, non-technical users were interacting with computers on a daily basis. It was against the beginnings of this backdrop that the phrase I mentioned earlier was coined. "Human Literate Computers" or "Human Centered Computing."

Ease of Use was the holy grail for a lot of computer companies. A computer that was so easy to use that they could sell it to grandma. But, to me at least, Human Literate and Easy to Use are distinct ideas. Many modern applications are Easy to Use. Netflix is Easy to Use. Facebook is, for all its faults, pretty easy to use. The iPhone, the iPad, and ChromeOS are super easy to use.

Well, they are easy to use as long as you use them in the prescribed way. As long as you let them tell you what you want to do, instead of the other way around.

That, IMO, is the distinction.

I think that many of the steps towards demystifying the computer of the 80s and 90s did good work, but ultimately, the computer industry left the whole idea behind, in favor of making some tasks Very Easy while making other tasks Practically Impossible, and turning everything into a surveillance device.

When I was a kid I was brought up with computers that showed you how they worked.

You booted into a command prompt or a programming language, or you could get to one, if you wanted to.

I got to play with GW-BASIC and QBasic and, a little, with HyperCard.

I got to take apart software and put it back together and make things that made people happy.

I got to make things that I needed. I got to make things that make me happy.

Today, the tools to do that are complex to compensate for the vast additional capabilities of a modern computer, but also to reinforce technical elitism.

I often wonder why Hypercard had to die.

It was because Jobs wanted the Computer to be an Appliance. A thing only used in prescribed ways.

Letting people build their own tools means letting people control their own destiny.

If I can make what I want, or if someone else can make what they want, and then I can take it apart and improve it, why would I pay for an upgrade? Why would I pay you to build something that doesn't meet my needs?

I'm mentioning HyperCard specifically because I've been relearning HyperCard recently, and it is *better* and more useful than I remember it being.

It's honestly revelatory.

HyperCard, if you're unfamiliar, is PowerPoint + instructions.

Here's a great introduction/example: loper-os.org/?p=568

The author walks you through building a calculator app in about 5 minutes, step by step.

Warning: There's a bit of ableist language tossed around in the last paragraph. Skip it, there's nothing worth reading there anyway.

You use the same kinds of tools you would use to build a slideshow, but you couple them with links, multimedia, and scripting.

Want a visual interface for your database of client data? Great! Slap together a Rolodex card, and drop in a search function.

Go from concept to presentation ready in an hour or two (or less, if you've done this before!)
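For a sense of scale: that search function is a few lines of HyperTalk attached to a button. Here's a rough sketch from memory (the field name and prompt are invented for illustration) of the kind of script you'd hang off a "Find" button on that Rolodex card:

    -- script attached to the card's "Find" button
    on mouseUp
      ask "Find which client?"          -- puts the user's answer into "it"
      if it is not empty then
        find it in field "Client Name"  -- jumps to the first matching card
      end if
    end mouseUp

Everything else on the card (the fields, the art, the layout) you just draw, the same way you'd lay out a slide.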

HyperCard was easy to use. Everyone who used it loved it. It was integral to many businesses' daily operations.

Jobs killed it because he couldn't control it.

Microsoft doesn't ship any tools for building programs with their OS anymore, either.

They used to. There was a time when you could sit down at any Windows or DOS machine and code up a program that would run on any other Windows or DOS machine.

But we can't have that anymore.

In the name of Ease of Use, they left out the Human aspect.

Use your computer how you're told to use it, and everything is easy.

Do anything new or novel and it's a struggle.

My nephew has an iPad.

He asked his dad how to write games. His dad didn't know. His dad asked me how to write games on an iPad. I told him not to bother.

My nephew asked me how to learn to write games.

I gave him a Raspberry Pi and a copy of Pico-8.

Now he writes computer games.

He couldn't do that on his iPad.

HyperCard would be a perfect fit for the iPad and iPhone.

Imagine it!

Imagine the things you could build.

But we aren't allowed to have computers that are fun to use, that are easy to build for, that are human centric, or human literate.

The last 10 years of development in computers were a mistake. Maybe longer.

Instead of making computers Do More, or making them Feel Faster, we've chased benchmarks, made them more reliant on remote servers, and made them less generally useful. We brought back the digital serfdom of the mainframe.

In the first episode of Computer Chronicles (youtube.com/watch?v=wpXnqBfgvP), the mainframe guy is real adamant about how mainframes are good and micros are bad.

The host, a microcomputer legend, disagrees pretty strongly.

Later, when they talk about the future of networking, the mainframe guy talks about it as a return to mainframes. The micro guy talks about BBSs, peer to peer networks.

The mainframe guys are winning.

(this is not to say that I think mainframes are bad. I don't. Mainframes can be really good and interesting! PLATO was wonderful, as were some of the early Unix mainframes.

But IBM style Mainframe culture is The Computer as a thing you Use but don't Control culture, and I am very against that.)

I have to step away for a while. I'll continue this later.

@ajroach42 I want to respond, elaborate, & discuss at length here. I spent about 10 months some years ago immersed in the computing literature around the history of debuggers, during which I went from EDSAC to Visual Studio, but also all the other half-dead ends of computing history such as, e.g., Lisp machines.

Naturally, I came out of it a Common Lisper, and also naturally, with Opinions about modern computing.

Up for the discussion? It could get wordy and over a few days. :)

@pnathan for sure.

I haven’t gotten in to lisp machines yet, but I’m always down for discussion.

@ajroach42 @pnathan
This thread is going to be gold :)
(I'm replying here so that I won't forget about it...)

@ciaby @pnathan I hope you enjoy! I'm looking forward to the discussion as well.

@ajroach42 @ciaby
OK, so, I'm about a decade older than you, Andrew: I taught myself QBasic in the mid 90s, got online late 90s, never really looked back.

First, I want to say this: older computer systems - considered as systems - were generally more capable.

But to be clear, they were limited in use for those who didn't take an interest in learning them. I'm talking about things that weren't Windows 3.1+.


@ajroach42 @ciaby This was the Great Debate that was largely won by Microsoft. "Everyone can 'use' a computer." That is to say, everyone can operate the appliance with preinstalled software. *everyone*. Apple pioneered the notion, but it turns out to be the preferred mode for businesses, who really rather don't like having specialized experts.

@ajroach42 @ciaby It is my contention that Windows (& *nix) computer systems are designed to be administered and managed by sysadmins, and the user experience in this case is great.

When you have sysadmins, there are no driver problems. There are no printer problems. There are no problems, as a matter of fact: it's all been taken care of by the admins.

This is exactly how executives like it.

Apple does the same, with their iPhone.

Apple is the sysadmin, metaphorically.

@ajroach42 @ciaby

Here is the fundamental conundrum of computers: to use one at an expert level - to really make the machine work for you - you must become an expert too, and usually a programmer, even ad hoc.

Efforts to avoid and deny this have occurred for *decades*.

COBOL.

Some of Engelbart's work.

Algol (ish)

Excel.

Chris Granger's 'Eve'.

Tableau

FPGA designers with CAD addons.

Embedded system CAD tooling

numerous academic papers

@ajroach42 @ciaby

all these systems collapsed at a point: the point where the fundamental reality of the problem met the fundamental reality of the machine.

programming had to occur.

Apple solved this by making so many programs available on the iThings for so many niche issues that programmers would code what was needed and the user didn't have to care anymore about surmounting the issue.

Same for businesses & windows, essentially.

@ajroach42 @ciaby

so here's the problem: you're right. computers are easier to use, for some values of 'use'.

but the truth was, back when computers were harder to use, in the 90s... people really hated learning how to use them. there was an immense demand for not having to think (there's a book called "don't make me think" about this whole problem).

so we have this weird place where no one outside of the "elite" wanted to care, and they resented being made to care.

so apple won by fulfilling that.

@ajroach42 @ciaby let's talk about lisp machines as I understand them - being born in their heyday.

lisp machines presumed the user and the programmer were the same person. user had root on everything, and everything was in lisp, and was mutable.

this worked GREAT, basically. multiprocessing, security, meh, whatever.

total control in the hands of the user. to be honest, most programmers at that time were not ready for it, didn't want it, and the machines were 10x the cost.

@ajroach42 @ciaby Also, lisp machines were made by hippie engineers who were really bad at business. so that didn't work out.

but you have this enormous tension between Lisp "we expect you to come up to our level, here's the manual, we'll answer all your Qs", and Windows/Java "here's the basics, don't poke yourself with the sharp bits"

@ajroach42 @ciaby

as an example of Lisp-world, for instance - it had debuggers that essentially ran as in-process monitors that could take and trigger recovery actions based on logical conditions - in '92. we don't have that today, and in languages which are compiled, it will never exist.
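(The closest living relative of that style is Common Lisp's condition system, where a handler acts as an in-process monitor: it watches for a logical condition and picks a recovery action offered at the fault site. A minimal sketch, with the condition and function names invented for illustration:

    ;; handler-bind is the "monitor": it sees the condition and triggers
    ;; a recovery action (a restart) established at the fault site.
    (define-condition sensor-out-of-range (error)
      ((value :initarg :value :reader sensor-value)))

    (defun read-sensor (raw)
      (restart-case
          (if (> raw 100)
              (error 'sensor-out-of-range :value raw)
              raw)
        (use-value (v) v)))   ; recovery action: substitute a value

    (defun monitored-read (raw)
      (handler-bind ((sensor-out-of-range
                       (lambda (c)
                         (declare (ignore c))
                         (invoke-restart 'use-value 100))))
        (read-sensor raw)))

    ;; (monitored-read 250) => 100, recovered inside the running process.

The handler runs in the live process, with the state of the fault still available, before any unwinding happens.)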

@pnathan @ajroach42 @ciaby
A sort of side dilemma with this is that, by turning computers into magic boxes for making increasingly complex layers of tasks accessible to average people, this understanding gap just widens. Average users become increasingly disconnected from even a baseline understanding of the processes and design patterns at work in computing, and the knowledge of the "elites" becomes ever more rarified.

@pnathan @ajroach42 @ciaby
How can there ever be reasoned popular discourse about the practical, moral, and political implications of modern computing, if you have to be a developer or programmer to even understand the basic concepts?

@cossimo @ajroach42 @ciaby this is an incredibly important point and it's part of why I reluctantly support "everyone must code" efforts in schools, despite its attachment to the jobs ideal.

it's analogous to the idea that in a lab at school, you encounter ideas of safety and ideas relevant to the discipline, even if you never do anything with it again.

but, then again, we can describe the effects of computing without being a programmer. This is, I think, the lesson of the environmental movement.

@cossimo @ajroach42 @ciaby You don't have to be a chemist to demand that a paper mill not put outputs into your drinking water.

Likewise, you don't have to be a programmer to note that Facebook & Twitter's algorithms are outrage machines and should be regulated for the good of our society.

@pnathan @ajroach42 @ciaby
Very true. The "outrage machine" is a pretty easily understandable by-product of FB and Twitter, because it is so overt. By contrast, I think average people have much less of an understanding of, for example, the APIs, tracking pixels/widgets, apps, etc., that FB and Twitter use to collect and aggregate data about them, or how that data gets used to tailor their everyday experience.

@pnathan @ajroach42 @ciaby
I think most people are still fairly ignorant (perhaps willfully so) of how closely they are tracked and how their phones make their every action a data point.

Similarly, to riff on the chemistry example, most people are blissfully ignorant about all the *stuff* that gets put in their food and most of the inhumane or unsustainable processes that are used to create it.

@pnathan @ajroach42 @ciaby
The more seamlessly invisible a technology is, the more people willfully ignore it, no matter how dangerous it is.

Anyway, didn't mean to drag this (awesome) thread off on such a tangent.

@cossimo @ajroach42 @ciaby I don't think it's precisely a tangent though: invisibility and lack of understanding - or lack of desire to understand - helped build the problem we have today.

if all users really cared deeply about understanding and collapsing the user/programmer division, then we'd probably all be using a Linux core with a Lisp machine on top; everyone would intuitively understand algorithms and how the net worked.

but they prioritize other things, WHICH IS FINE.

@pnathan @ciaby @cossimo

Lisp machines also failed economically.

CP/M was better and more widely used than DOS, but IBM took DOS anyway.

It doesn't matter what users want, if it's not offered to the users.

We have software monopolies today.

@pnathan @ciaby I'm with you on how and why this happened.

You seem to be discussing it as if it was inevitable, though. I'm firmly of the opinion that it was not inevitable, and that compromises were possible.

Right now, there is very little space for the users in the middle. It's all concentrated at the edges. You're a coder or a user. There's no middle, and there *could* be.

@ajroach42 @ciaby

Right, the middle. What was the compromise, given the users desperate not to think, though?

The effectual compromise made was Linux - that lets off the pressure from Microsoft & Apple and directs all these maker-types over to a system that fits them.

@ajroach42 @ciaby
I argue that Linux is closer to the old paradigm - users and programmers are much closer and there is a strong pressure to be "some" kind of programmer, even if it's just a scary terminal shell occasionally.

@pnathan @ciaby that’s fair.

I guess I need to add ‘design an ideal software package for Linux, and write documentation’ to my list of future projects. :-D

@pnathan @ciaby the users desperate not to think are not the only users.

My point is that we have tools for programmers, tools for users who just want to do what they are told, and nothing (or very little) for the folks in the middle.

Maybe it’s a smaller group than I think, but I doubt it.

@ajroach42 @ciaby
tools for people in the middle: what would that be?

if it's mathematics/business, that'd be excel.

if it's programming, then VBA is still a thing, yes?

why don't you home in on what you really want from a tool? what does it do? if it's 'general purpose computing', then beware - a lisp macro & a library might be the right way to go. :)

@ajroach42 @ciaby Speaking of, I'm going to focus on writing code for the next hour to grind on my stupid business idea before bed.

@pnathan @ciaby I fell asleep before we could continue this conversation, so I'm catching up today.

When I say tools for people in the middle, I mean tools for development that do a little hand-holding: HyperCard, Pico-8, GW-BASIC.

Right now, we have a culture that tells people that Programming is Hard (because it often is, even with the 'easy' tools)

Some kinds of programming could be much easier, if we'd let them.

@ajroach42
@ciaby @pnathan
Add MS Access to that list of tools in the middle.

In IT we hate it because it encourages users to have their data scattered and not backed up, and full of misspelled duplicate records. But it sure lets a certain type of user get their real work done.

@gcupc @pnathan @ciaby Access is a great example of the kind of system I think that we need to refine and produce more of.

It's not great, and it's overwhelming, but if it's done right it can be *very* useful to some use cases.

I want more of that, but with the lessons we've learned on version control and the like baked in.

@ajroach42 @ciaby I want to also argue that there's an eternal september problem inherent in the situation.

but here we have a core issue: should a user be a programmer? at all? if so, we are easing them towards the "elite", or so it would be said.

or, alternatively, is this a consciousness raising exercise where this OS - UnicornOS - raises the consciousness of the user to deeply engage with the Computer?

what should Unicorn do, anyway? See the conclusion of: graydon2.dreamwidth.org/259333

@ajroach42 @ciaby the more you ask Unicorn to interop with the existing world, the more you constrain it to the limitations and expectations of the existing system, which tends to remove agency from the operator.

I frankly think it's time to build a new OS from the assembly on up to empower people, but I'm loath to take that on when I'm dependent on a company to pay mortgage and health insurance.

@pnathan @ciaby This is a good point, but I think it deserves scrutiny.

I am employed as a support engineer and a sysadmin, and I still run into driver issues, printer issues, etc.

I take care of them, eventually, when I can.

But, even after doing this for 10 years, I still encounter problems that I can't solve (because there isn't a solution.)

but the metaphor of Apple as sysadmin, I'll accept. I disagree with someone else admining my phone, but that's another issue.

@ajroach42 @ciaby your users pay you so they don't have to care about sysadmin issues. their world is great!

@ajroach42 @ciaby I'm glossing over the 1% failures to get at the core point: sysadmins are designed into the windows and unix world so users can focus on their core competency.

@Shamar @ajroach42 @ciaby I'll eyeball your work.

people can program. people do program. where there is a will there is a way.

and there are many many ways to program.

arguably most are terrible, and the ones that condescendingly target newbies produce the worst systems overall.

@Shamar @ajroach42 @ciaby

That the complexity is not necessarily inherent to the matter.

here is where I disagree.

the complexity of understanding the "web stack" is incidental; the complexity of understanding the concept of distributed computing and comms protocols is fundamental.

or something as simple as rendering bits to the screen. raster? vector? what abstraction do you choose to execute the display mechanism? now you have a model.

@Shamar @ajroach42 @ciaby ... continuing. Next year, maybe you want a different model, so you break off and redo it a bit. Now you have to figure out how to juggle two incompatible models in your code, and you're on your way to inventing an abstract interface system.

even if you're doing assembly!
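(To make that concrete, here's a minimal, hypothetical sketch in Common Lisp of the abstract interface you end up inventing: two incompatible display models behind one generic function. The class and function names are invented for illustration.

    ;; hypothetical: one drawing interface over two incompatible display models
    (defclass raster-display () ())
    (defclass vector-display () ())

    (defgeneric draw-line (display x1 y1 x2 y2))

    (defmethod draw-line ((d raster-display) x1 y1 x2 y2)
      ;; raster model: plot pixels into a framebuffer (details elided)
      (format t "raster line (~a,~a)-(~a,~a)~%" x1 y1 x2 y2))

    (defmethod draw-line ((d vector-display) x1 y1 x2 y2)
      ;; vector model: record a path command instead (details elided)
      (format t "vector path (~a,~a)-(~a,~a)~%" x1 y1 x2 y2))

Callers say (draw-line display 0 0 10 10) and stop caring which model is underneath; that layer of indirection is the thing you invent whether you meant to or not.)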

@Shamar @ajroach42 @ciaby

here's my claim: software is crystallized thought, with all the complexities, ambiguities, and changing over time of thoughts. we can gut the whole shaky tower of modern computing, and we'll still be confronted with the core problem (even assuming a workable and standard bit of hardware for the engineering problems, themselves non-trivial sometimes)

@Shamar @ajroach42 @ciaby

EWD was probably the most astute prophet of software engineering that has lived to date.

let me challenge you: what is the secret knowledge which, knowing, would unlock the door?

@Shamar @ajroach42 @ciaby ah but that doesn't get anywhere until we start digging.

what is simple? is it the ability to point and click a mouse? is it a keyboard key?

both of those have deep wells of complexity and knowledge to make happen, despite surface simplicity.

or is it a transistor, the accumulation of which produces unspeakable complexity?

@Shamar @ciaby @pnathan The web isn't all bad, and it's not all bad technologies, but it isn't all good either.

All I'm asking is that we take a step back and examine our modern software with a more critical eye towards how we could improve it for future generations.

I'm not sure why this has become so controversial.

@pnathan @Shamar @ajroach42 @ciaby
I'm gonna repost my newly relevant diagram.

there isn't a bright line between programming and passive computer use. all UIs are programming languages. most are simply shitty, overly constrained languages that make simple tasks nearly impossible.

niu.moe/media/uHnt_DOdjxWrSE0G

@enkiv2 @pnathan @Shamar @ajroach42 @ciaby Those are good charts that depict a useful idea, but I don't think what they describe is what most people know as a “learning curve”—which would be proficiency (as a percent of the tool’s available capability, y-axis) across time spent (x-axis).

@pnathan @ciaby @ajroach42 @joeld @Shamar You're right. They describe the difficulty of solving a problem with a tool vs the inherent difficulty of that problem -- which is essentially an inverse of the learning curve for tools that can address all problems. Tools that fail by making tasks that aren't easy impossible, of course, have a misleadingly good-looking learning curve.

@pnathan @ciaby @ajroach42 @joeld @Shamar (The catch being: they can't actually do anything, so having 100% mastery over them isn't actually valuable.)

@pnathan @ciaby @ajroach42 @joeld @Shamar I cross-posted this thread to lobste.rs, producing a comment thread over there that is at turns useful and infuriating: lobste.rs/s/q31cqp/thread_abou

@pnathan @ciaby @Shamar

To the user who wants to display bits on the screen, it shouldn't matter unless/until they want to display bits in a way that one format handles over the other.

I can see how and why it matters to someone building more complex systems, but if all I want to do is have a text input box, why do I need to care about anything else you said?

@Shamar @ajroach42 @ciaby One system that I have been curious about is Oberon OS (ethoberon.ethz.ch/white.html#S). Apparently it was extremely successful but external pressures collapsed it.

@Shamar @pnathan @ciaby I never said people can't learn to program.

I'm saying that some people don't want to learn to program, and that what we call "programming" is needlessly difficult for some tasks, in the name of corporate profits.
