That is to say, in the 60s and 70s, computers were weak and slow and computer users were also computer programmers. A small, tight-knit circle of developers and computer scientists was responsible for the bulk of the progress made in that time, and the idea of designing tools for non-technical users was never considered.

Computer culture had, by and large, a kind of elitism about it as a result of the expense and education required to really spend much time with a computer. This changed, slowly, starting in the mid-70s with the development of the microcomputer market and CP/M.

Computers became more affordable, slowly. Affordable computers became more powerful, quickly. Within 10 years, non-technical users were interacting with computers on a daily basis. It was against the beginnings of this backdrop that the phrases I mentioned earlier were coined: "Human Literate Computers" or "Human Centered Computing."

Ease of Use was the holy grail for a lot of computer companies. A computer that was so easy to use that they could sell it to grandma. But, to me at least, Human Literate and Easy to Use are distinct ideas. Many modern applications are Easy to Use. Netflix is Easy to Use. Facebook is, for all its faults, pretty easy to use. The iPhone, the iPad, and ChromeOS are super easy to use.

Well, they are easy to use as long as you use them in the prescribed way. As long as you let them tell you what you want to do, instead of the other way around.

That, IMO, is the distinction.

I think that many of the steps towards demystifying the computer of the 80s and 90s did good work, but ultimately, the computer industry left the whole idea behind, in favor of making some tasks Very Easy while making other tasks Practically Impossible, and turning everything into a surveillance device.

When I was a kid I was brought up with computers that showed you how they worked.

You booted into a command prompt or a programming language, or you could get to one, if you wanted to.

I got to play with GW-BASIC and QBasic and, a little, with HyperCard.

I got to take apart software and put it back together and make things that made people happy.

I got to make things that I needed. I got to make things that make me happy.

Today, the tools to do that are complex, partly to cope with the vast additional capabilities of a modern computer, but also to reinforce technical elitism.

I often wonder why HyperCard had to die.

It was because Jobs wanted the Computer to be an Appliance. A thing only used in prescribed ways.

Letting people build their own tools means letting people control their own destiny.

If I can make what I want, or if someone else can make what they want, and then I can take it apart and improve it, why would I pay for an upgrade? Why would I pay you to build something that doesn't meet my needs?

I'm mentioning HyperCard specifically because I've been relearning HyperCard recently, and it is *better* and more useful than I remember it being.

It's honestly revelatory.

HyperCard, if you're unfamiliar, is PowerPoint + instructions.

Here's a great introduction/example: loper-os.org/?p=568

The author walks you through building a calculator app in about 5 minutes, step by step.

Warning: There's a bit of ableist language tossed around in the last paragraph. Skip it, there's nothing worth reading there anyway.

You use the same kinds of tools you would use to build a slideshow, but you couple them with links, multimedia, and scripting.

Want a visual interface for your database of client data? Great! Slap together a Rolodex card and drop in a search function.

Go from concept to presentation ready in an hour or two (or less, if you've done this before!)
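For a sense of what that involves today, here's a minimal sketch of that "Rolodex card plus search" idea in Python/tkinter. The CLIENTS data and every name in it are hypothetical, purely for illustration; this is a rough modern analogue of a HyperCard stack, not anything HyperCard itself shipped.

```python
# A toy "Rolodex card with a search function," sketched in Python/tkinter.
# All data and names here are made up for illustration.
import tkinter as tk

# Stand-in for the client database mentioned above.
CLIENTS = [
    {"name": "Ada Lovelace", "phone": "555-0101"},
    {"name": "Grace Hopper", "phone": "555-0202"},
    {"name": "Alan Kay", "phone": "555-0303"},
]

def search(event=None):
    """Show the first card whose name contains the query."""
    query = entry.get().lower()
    for client in CLIENTS:
        if query in client["name"].lower():
            card.config(text=f"{client['name']}\n{client['phone']}")
            return
    card.config(text="(no match)")

root = tk.Tk()
root.title("Rolodex")

entry = tk.Entry(root)
entry.pack(fill="x", padx=8, pady=4)
entry.bind("<Return>", search)   # search on Enter

tk.Button(root, text="Find", command=search).pack(pady=4)

card = tk.Label(root, text="Type a name and press Find",
                relief="ridge", width=30, height=4)
card.pack(padx=8, pady=8)

root.mainloop()
```

In HyperCard, the card, the field, and the button would come from the same direct-manipulation tools you'd use for a slideshow, and only the search handler itself would be script.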

HyperCard was easy to use. Everyone who used it loved it. It was integral to many businesses' daily operations.

Jobs killed it because he couldn't control it.

Microsoft doesn't ship any tools for building programs with their OS anymore, either.

They used to. There was a time when you could sit down at any Windows or DOS machine and code up a program that would run on any other Windows or DOS machine.

But we can't have that anymore.

In the name of Ease of Use, they left out the Human aspect.

Use your computer how you're told to use it, and everything is easy.

Do anything new or novel and it's a struggle.

My nephew has an iPad.

He asked his dad how to write games. His dad didn't know. His dad asked me how to write games on an iPad. I told him not to bother.

My nephew asked me how to learn to write games.

I gave him a Raspberry Pi and a copy of PICO-8.

Now he writes computer games.

He couldn't do that on his iPad.

HyperCard would be a perfect fit for the iPad and iPhone.

Imagine it!

Imagine the things you could build.

But we aren't allowed to have computers that are fun to use, that are easy to build for, that are human centric, or human literate.

The last 10 years of development in computers were a mistake. Maybe longer.

Instead of making computers Do More, or making them Feel Faster, we've chased benchmarks, made them more reliant on remote servers, and made them less generally useful. We brought back the digital serfdom of the mainframe.

In the first episode of Computer Chronicles (youtube.com/watch?v=wpXnqBfgvP), the mainframe guy is really adamant that mainframes are good and micros are bad.

The host, a microcomputer legend, disagrees pretty strongly.

Later, when they talk about the future of networking, the mainframe guy talks about it as a return to mainframes. The micro guy talks about BBSs and peer-to-peer networks.

The mainframe guys are winning.

(This is not to say that I think mainframes are bad. I don't. Mainframes can be really good and interesting! PLATO was wonderful, as were some of the early Unix mainframes.

But IBM style Mainframe culture is The Computer as a thing you Use but don't Control culture, and I am very against that.)

I have to step away for a while. I'll continue this later.

@ajroach42 I want to respond, elaborate, & discuss at length here. I spent about 10 months some years ago immersed in the computing literature around the history of debuggers, during which I went from EDSAC to Visual Studio, but also through all the other half-dead ends of computing history, e.g., Lisp machines.

Naturally, I came out of it a Common Lisper, and also naturally, with Opinions about modern computing.

Up for the discussion? It could get wordy and over a few days. :)

@pnathan for sure.

I haven't gotten into Lisp machines yet, but I'm always down for discussion.

@ajroach42 @pnathan
This thread is going to be gold :)
(I'm replying here so that I won't forget about it...)

@ciaby @pnathan I hope you enjoy! I'm looking forward to the discussion as well.

@ajroach42 @ciaby
OK, so, I'm about a decade older than you, Andrew: I taught myself QBasic in the mid 90s, got online late 90s, never really looked back.

First, I want to say this: older computer systems - considered as systems - were generally more capable.

But to be clear, they were limited in use for those who didn't take an interest in learning them. I'm talking about things that weren't Windows 3.1+.

@ajroach42 @ciaby This was the Great Debate that was largely won by Microsoft: "Everyone can 'use' a computer." That is to say, everyone can operate the appliance with preinstalled software. *Everyone*. Apple pioneered the notion, but it turns out to be the preferred mode for businesses, who would really rather not have specialized experts.

@ajroach42 @ciaby It is my contention that Windows (& *nix) computer systems are designed to be administered and managed by sysadmins, and the user experience in that case is great.

When you have sysadmins, there are no driver problems. There are no printer problems. There are no problems, as a matter of fact: it's all been taken care of by the admins.

This is exactly how executives like it.

Apple does the same, with their iPhone.

Apple is the sysadmin, metaphorically.

@pnathan @ciaby This is a good point, but I think it deserves scrutiny.

I am employed as a support engineer and a sysadmin, and I still run into driver issues, printer issues, etc.

I take care of them, eventually, when I can.

But, even after doing this for 10 years, I still encounter problems that I can't solve (because there isn't a solution.)

But the metaphor of Apple as sysadmin, I'll accept. I object to someone else administering my phone, but that's another issue.

@ajroach42 @ciaby Your users pay you so they don't have to care about sysadmin issues. Their world is great!

@ajroach42 @ciaby I'm glossing over the 1% failures to get at the core point: sysadmins are designed into the Windows and Unix world so users can focus on their core competency.

@Shamar @pnathan @ciaby I never said people can't learn to program.

I'm saying that some people don't want to learn to program, and that what we call "programming" is needlessly difficult for some tasks, in the name of corporate profits.

@Shamar @pnathan @ciaby I feel like you think this was a clever point, but I don't understand what you mean.

Programming is a specialty, and some people have other specialties. Expecting them to also become expert programmers because our current expert programmers can't be arsed to make extensible and understandable tools is unreasonable.

@Shamar @pnathan @ciaby

Some kinds of programming (just like some kinds of math) will remain hard.

But better tools are what I'm after, yeah.

@Shamar
I feel that it's not only a matter of research, but also of throwing away some tech that we take for granted (x86, for example) and rebuilding from scratch with different assumptions in mind. In the current economic system I find that quite hard to do...
@pnathan @ajroach42

@ciaby @pnathan @ajroach42

Technology can change the world for the better or for the worse.

It can disrupt "the current economic system".

And that's why they are in a hurry to keep it under control.

So, I don't think that the "current economic system" should be a problem for hackers.

We CAN throw away the web.

I really think we can (I work with browsers all day; I know the stack pretty well...).

Starting from scratch, with the lessons learned, it would take a fraction of what it took the first time.

@Shamar @ajroach42 @pnathan
That's possible, and in some ways it's already happening.
What I'm talking about, however, goes much deeper than that. I'm talking about open hardware infrastructures, where every component is documented and there are no binary blobs or proprietary firmware.
The instruction set is also very important, because what we have now (x86/amd64) is incredibly bloated and full of backward-compatibility shit.
RISC-V is a step in the right direction. If only the hardware wasn't so expensive... ;)

@Shamar @ciaby @ajroach42 Bold claim: open source or non-open source hardware doesn't matter when deployed at scale.

The essential problems today are, in a sense, all software, mediated by scale.

@Shamar @ciaby @ajroach42 that said:

we have these *intertwingled* issues: the hardware is manky, the software is manky, and the incentives to improve are perverse.

nymag.com/selectall/2018/04/da

My reckoning is that there is space today for a sort of New System, a Unicorn OS, where the whole thing is largely rebuilt. Does the web have to exist? Does TCP/IP? Are there better systems?

Here we see that we make choices: one choice prioritizes those who take the time to learn the system, and one... doesn't.

@Shamar @ciaby @ajroach42 UnicornOS: the magic OS that we're talking about, the one that solves the problem.

with a sparkling dash of rainbow over it, because, you know, it's magic. :)

@Shamar @ciaby @ajroach42

ah jeeze man, think of the sysadmin needs.

the mail servers fail. the administration is confusing because docs aren't perfect, so it gets misconfigured. the network goes down. baby pukes on server and it fails to boot. server is overloaded by volume of spam.

then the task is outsourced to a guy interested in managing the emails....... whoop whoop we're recentralizing.

@Shamar @ciaby @ajroach42 my Inner Young Geek wants to argue that genuinely configurable systems are not really used in the home, and that mail servers cross the barrier between appliance and administration-needing machine.

but let's not rabbit trail onto that. ;-)

My contention and question is more this: should we expect a member of cyberspace to be knowledgeable in minor sysadmin work?

I argue yes! We expect people to be able to refill the oil in their cars, right?

@pnathan ... no?

There's a whole industry out there of shops that only exist because people don't change their own oil.

@ajroach42 changing oil isn't refilling oil.

One you just stick a can of oil in; the other requires draining the system, changing the filter, etc. Much more specialized tooling & environment to do it right.

@Shamar @pnathan @ajroach42

in the UK they are now starting to teach this at junior school level (this is for children at biological ages 5-7, which is called Key Stage 1 here)

When I grew up in the 1980s it was only taught in high school at age 14+, to those who had opted to take Computer Studies (an introductory CS course)

bbc.co.uk/guides/z3whpv4

@pnathan

@ajroach42

A better metaphor is cooking. Everybody is expected to know enough about cooking to feed themselves. Some people cook at a much more expert level, and people who are capable of feeding themselves pay those experts to feed them occasionally. Cooking for yourself has benefits over eating out even if you aren't very good, because you can cater to unusual preferences.

@enkiv2 @pnathan @ajroach42
Cooking for yourself also keeps the cost of eating out down, because professionals are competing with free. And if all professional chefs started doing something (like cooking 'rare' burgers well-done to avoid liability), home cooking wouldn't be subject to those rules.

This is possible because cookbooks are mostly written for the intermediate, talented-amateur cook.

@enkiv2 @pnathan

And even still, we have tools (frozen dinners, spice blends, Hamburger Helper) to help folks who can't cook well still manage to cook what they want.

I want the Hamburger Helper of modern software development.

@Shamar @ciaby @pnathan Pi 0 W?

$10 + a power supply.

But you still have to deal with NAT, or you have to deal with IPv6, or you have to not deal with the internet.

@pnathan
At scale, yes.
Although I feel that software is not evolving because:
The effort to develop a new OS is too great, given the amount and complexity of modern hardware (and closed specs).
Without a new OS, you can't develop new paradigms, and so we're stuck with ideas from the 70s (Unix mostly, plus the VMS-influenced Windows).
Programming languages end up targeting the OS, and that's why we're not really progressing...
My proposal: simpler hardware, open and documented. Build on top of that. No backward compatibility. :)
@ajroach42 @Shamar

@ciaby @ajroach42 @Shamar I agree that backward compatibility has to be nixed for real research and change to occur.

now I have to debug a piece of code that is like the reification of all bad backend possibilities combined.

@Shamar @ajroach42 @pnathan
I was actually looking at it before and found the concept quite interesting :)
Can I run it in a VM easily?