Microsoft doesn't ship any tools for building programs with their OS anymore, either.
They used to. There was a time when you could sit down at any Windows or DOS machine and code up a program that would run on any other Windows or DOS machine.
But we can't have that anymore.
In the name of Ease of Use, they left out the Human aspect.
Use your computer how you're told to use it, and everything is easy.
Do anything new or novel and it's a struggle.
My nephew has an iPad.
He asked his dad how to write games. His dad didn't know. His dad asked me how to write games on an iPad. I told him not to bother.
My nephew asked me how to learn to write games.
I gave him a Raspberry Pi and a copy of PICO-8.
Now he writes computer games.
He couldn't do that on his iPad.
HyperCard would be a perfect fit for the iPad and iPhone.
Imagine the things you could build.
But we aren't allowed to have computers that are fun to use, that are easy to build for, that are human centric, or human literate.
The last 10 years of development in computers were a mistake. Maybe longer.
Instead of making computers Do More, or making them Feel Faster, we've chased benchmarks, made them more reliant on remote servers, and made them less generally useful. We brought back the digital serfdom of the mainframe.
In the first episode of Computer Chronicles (https://www.youtube.com/watch?v=wpXnqBfgvPM), the mainframe guy is really adamant about how mainframes are good and micros are bad.
The host, a microcomputer legend, disagrees pretty strongly.
Later, when they talk about the future of networking, the mainframe guy frames it as a return to mainframes. The micro guy talks about BBSs and peer-to-peer networks.
The mainframe guys are winning.
(This is not to say that I think mainframes are bad. I don't. Mainframes can be really good and interesting! PLATO was wonderful, as were some of the early Unix mainframes.
But IBM-style mainframe culture is The Computer as a thing you Use but don't Control, and I am very against that.)
I have to step away for a while. I'll continue this later.
@ajroach42 I want to respond, elaborate, & discuss at length here. Some years ago I spent about 10 months immersed in the computing literature around the history of debuggers, during which I went from EDSAC to Visual Studio, but also down all the other half-dead ends of computing history such as, e.g., Lisp machines.
Naturally, I came out of it a Common Lisper, and also naturally, with Opinions about modern computing.
Up for the discussion? It could get wordy and over a few days. :)
@pnathan for sure.
I haven’t gotten into Lisp machines yet, but I’m always down for discussion.
First, I want to say this: older computer systems - considered as systems - were generally more capable.
But to be clear, they were of limited use to those who didn't take an interest in learning them. I'm talking about systems that predate Windows 3.1.
@ajroach42 @ciaby This was the Great Debate, and it was largely won by Microsoft. "Everyone can 'use' a computer." That is to say, everyone can operate the appliance with preinstalled software. *Everyone*. Apple pioneered the notion, but it turns out to be the preferred mode for businesses, which would really rather not employ specialized experts.
When you have sysadmins, there are no driver problems. There are no printer problems. There are no problems, as a matter of fact: it's all been taken care of by the admins.
This is exactly how executives like it.
Apple does the same, with their iPhone.
Apple is the sysadmin, metaphorically.
I am employed as a support engineer and a sysadmin, and I still run into driver issues, printer issues, etc.
I take care of them, eventually, when I can.
But, even after doing this for 10 years, I still encounter problems that I can't solve (because there isn't a solution).
but the metaphor of Apple as sysadmin I'll accept. I object to someone else administering my phone, but that's another issue.
sorry for digging up this old thread, but I have one remark that's been on my mind since I saw your post:
I knew how to read and write when I was 4. I don't remember how I learned it, but I guess I wanted to learn it, or found it fun.
Are not all people like that? Do other people only learn to read when forced to at school?
Is there a correlation between programmers and people who learnt to read before school?
Would homeschooling be better? In the best case it probably would, but what about the average case and worst case? Would homeschooling-as-default reinforce the divide between the rich and the poor?
Or maybe we should go for master-and-padawan model, where you learn by helping someone do what you want to learn?
homeschooling is neither practical nor possible to do well at scale.
it relies on having at least one highly educated & disciplined parent who throws their career & potential in the trash to teach their children. to be clear, that largely means women.
I had an *excellent* homeschooling education and I have 0 desire to suggest that anyone should pursue that path who isn't wealthy already.
fix the frigging school system.
@Wolf480pl @ajroach42 @seanl @Shamar @ciaby to be even more blunt, most people aren't that unique or that interested in learning what's needed for general life success and achievement. That's the point of a regularized mandated curriculum: to ensure, on average, people know enough to be good citizens.
the legendary lack of care for education produces the antipathy towards education in the USA.
if you want excellent education, you must make rich kids go to public schools.
@pnathan @ajroach42 @seanl @Shamar @ciaby
so what you're saying is:
- to achieve success in a modern society, you need to have some basic skills that everyone is expected to have, before you reach the age of 18
- most people don't want to learn those skills before the age of 18
- we should force them to learn those skills so that they can be successful