We need dumb tech and smart users, and not the other way around
@hisham_hm Is there any industry standard for ensuring that we get smart users? Any best practices to follow?
@phoe People are smart. The tech should not "dumb them down" by acting condescendingly, cutting down on their agency and limiting their opportunities of education.
@hisham_hm I don't know, a lot of new tech is going in that direction, and it doesn't seem to work too well. Maybe it's because even the smartest users behave dumbly once in a while.
@hisham_hm Agreed! It's worsened when "smart" tech is designed to intrude on privacy and decide what people can or cannot see.
@hisham_hm You're not going to find a practical solution by praying, hoping, pretending, or declaring all the children to be above average.
The intellectual and cognitive capabilities of a population are remarkably stubborn, fixed, and finite. At best, about 10% of people really "get" computers. Usually far fewer.
Mind, I really wish this wasn't the case, and that your declaration was viable.
There are further confounding factors.
@clacke Necessary disclaimer: I'm not a cognitive scientist. Synoptic discussion means operating well outside fields of expertise.
What I've seen and studied of symbolic understanding, expression, representation, manipulation, and retrieval --- listening/reading, speaking/writing, analysis, memory --- suggests that there's a wide range of capabilities, one we're mostly unaware of because our normal interactions occur well within capability bounds.
But even with basic spoken or written language, there are people (in their native language) with very high ... or virtually no ... ability.
Most advanced countries have basic literacy rates of 95--100%. But basic literacy is simply the baseline. The US has a four-grade rating:
Below Basic: 14%
One third of US adults are at or below "basic" prose literacy.
Mind: a fair portion of these are nonnative speakers of English. Some border regions, especially in Texas, have remarkably low English literacy, though residents may be proficient in other languages.
But that's a third of the population with a major impediment to significant computer proficiency, on what is a principally text-and-language-based interface.
Keep in mind that secondary school graduation rates have been well above 90% since the 1950s. Educational access shouldn't be a major driver.
@clacke A lot of things I’ve learned from non-programming fields I’ve learned from programming. I think programming should be one of the first things taught♥
For example, yes, programming becomes better and more efficient once you know math, but math becomes easier to learn once you know programming, so programming should be taught first♥♥ Something similar goes for philosophy. SICP is the most important philosophical work of the 20th century.♥
Naw, but SICP taught me meta-programming and making DSLs. (I later read On Lisp and PAIP too.)
I originally learned programming through reading books about BASIC around five years before I ever saw a computer in real life. I was around 8 when reading those books (not having anything other than pen and paper) and 12 once I got access to DOS and QBasic. Me and my sister used to make toy games for each other. Mom would also make BASIC apps. Although she and my sister never moved beyond that. I loved it though.
SICP I was 21 and I already had some Lisp familiarity from Emacs. I’m 40 now.
SICP is more for the philosophy (meta-circular stuff, lexical scope etc and also explicating analogies of programming to other fields).
The programming kids should learn doesn’t have to be IR-transforming macros or anything, but creating and executing sets of instructions is great: algorithms! That will help a lot with math too!
@Sandra I fully agree with your opinion re math and other school stuff - how writing code gives kids ways to play with many abstract problems. I observed this recently with kids "struggling with maths" and suddenly they started playing with geometry (angles) and applied parametrized abstractions naturally without knowing what an "abstraction" really is.
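The kind of parametrized play described above can be very small. A hypothetical sketch (the function name and the polygon framing are mine, not from the thread), assuming the kids are exploring the turn angles of regular polygons:

```python
# A parametrized abstraction kids can discover while drawing shapes:
# to draw a regular polygon with n sides, you turn by 360/n degrees
# at each corner. The "abstraction" is just the parameter n.

def turn_angle(sides):
    return 360 / sides

for n in [3, 4, 5, 6]:
    print(n, "sides -> turn by", turn_angle(n), "degrees")
```

A triangle needs 120-degree turns, a square 90, and so on; changing one parameter changes the whole shape, which is the point being made about abstraction.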
@Sandra Now I am looking into ways to get there: I am 45 and still have trouble going through the SICP videos (they are excellent). I have avoided Lisp in my life, preferring to work in low-level stuff like assembly. (Only to end up writing Python that feels and looks like Scheme:)
@Sandra "On Lisp" bored me to death and couldn't get into emacs despite numerous focused attempts. But I think I learned a bit of those concepts in the meantime, for example by working with Forth and PostScript. I think playing directly with the compiler in Forth inspired my low-level assembly soul better than lisp macros.
@Sandra Shall we have a "assembly/embedded/forth" and "lisp/functional" class profiles to approach different minds? Or are those differences just a matter of exposure and experience? I think both you and me started with BASIC anyway....
I wonder if there is some research about it.
@saper In the past, and I still think this is a good idea, I’ve recommended teaching ASM and Scheme as the first two languages. Side by side. Does not have to be x86-64 ASM. A simple toy or retro ASM is better. Purpose is to teach von Neumann arch
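A toy machine of the kind suggested fits in a few lines. This is a minimal sketch with a made-up mini-ISA (LOAD/ADD/STORE/HALT and a single accumulator), not any real assembly; the point is the von Neumann idea that instructions and data live in one shared memory:

```python
# Minimal von Neumann-style toy machine: instructions and data share
# one memory. Hypothetical mini-ISA for illustration only:
#   ("LOAD", addr)  -> acc = memory[addr]
#   ("ADD", addr)   -> acc += memory[addr]
#   ("STORE", addr) -> memory[addr] = acc
#   ("HALT",)       -> stop

def run(memory):
    acc, pc = 0, 0  # accumulator and program counter
    while True:
        op, *arg = memory[pc]
        pc += 1
        if op == "LOAD":
            acc = memory[arg[0]]
        elif op == "ADD":
            acc += memory[arg[0]]
        elif op == "STORE":
            memory[arg[0]] = acc
        elif op == "HALT":
            return memory

# Program at addresses 0-3, data at 4-6: compute mem[6] = mem[4] + mem[5].
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT",), 2, 3, 0]
print(run(mem)[6])  # → 5
```

Small enough to hand-trace on paper, which is arguably the whole pedagogical value.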
@clacke You jump to blame.
I look at available data and research, and keep an open mind.
I'm not aware offhand of research into programming skill, distribution, cognitive determinants, or psychology. But it's a complex task with numerous components and variants. Even highly-proficient programmers tend to have affinity for a small set of tools and languages.
Computers are general purpose computing devices, but is the goal here for everyone to be computing things? What are we computing and why?
I know it's kind of anathema to suggest this, but is computing, and the ability/access to do computing, a general good?
We have a lot of evidence that large concentrations of computing ability produces harm centers that warp all of our social and civic functioning toward ill.
I agree that “the computing of humans” is a source of harm, and is indeed one of the problems I have in mind when I ask whether “everyone should be doing computing” is a wholesome goal.
Yet, humans and human societies are the ones building, programming, and operating the computers that are being used on humans.
"I ask whether “everyone should be doing computing” is a wholesome goal."
I feel that if everyone ISN'T doing computing, then computing is being done TO them.
ie, that learning to "compute" (ie: think logically, manage incoming information streams, make tools) is a basic defensive mental skill that is necessary to avoid being manipulated by the bad people with big computers.
But I don't know if that's actually true.
@vortex_egg Fair question.
"Computer" itself is a misnomer. Our devices are informators, they process information (hence: information technology and data processing), and are connected to communications networks and information storage.
The information roles themselves are, generally:
Interpersonal communications. Text, image, voice, video.
Document access, in the sense of #PaulOtlet: any fixed record (text, image, voice, video, ...) not primarily interactive. May be informational or entertainment. I'm including streamed/live access here.
Commerce & business: buying, selling, and transacting business (buying, selling, services, activities).
Government: Financial (mostly taxes/fees) and other informational transactions.
Personal, household, task/activity, financial management & organisation. Recordkeeping, planning, budgeting, designing, device/services management, etc.
Other technical/scientific data streams, e.g. utility, safety, environmental monitoring, healthcare.
Creation & expression.
@vortex_egg And it's not all roses.
There's disinformation, fraud, abuse, bullying, doxxing, surveillance, manipulation, harassment, distraction, information overload, malware.
But think more generally of information than of computing. What computing is involved largely relates to those information streams and records.
And increasingly, everyday life presumes some form of digital access.
"Informator" isn't original to me, though I can't find the source right now.
@vortex_egg But think of the "computer" as the interface through which information is created, accessed, updated, input, streamed, converted, stored, retrieved, duplicated, destroyed, received, transmitted, analysed, and modeled.
It replaces or modifies writing, speech, audio, video, data, records, streams, and algorithms (programs / software), as well as systems: control, management, monitoring, manipulation, alerting, logging.
How necessary all this is ... ?
Though it's ever harder to live without it.
Yes that largely aligns with how I already think about the situation at a metaphysical level: Computation is a stand-in for information, or rather en-formation - the creation of forms, the calling out of distinctions from the unformed to say "this" is not "what is not this".
Computational technologies automate and accelerate the production of forms, and forms of forms. But this process of en-forming predates its mechanization.
To the hermeticists and rosicrucians one half of reality is form en-forming... but the other half is the unformed, or perhaps we know it as "that which is formed".
The question is: In this age of proliferation of computing devices, and the unending production of forms, what is that we are en-forming so rapidly? Why and to what end?
Norbert Wiener and Shoshana Zuboff offer that we are trying to en-form human life. Instrumentalization. The human use of human beings.
This again predates computers as the interface for en-formation. But the concentration of such devices, compounded with monopolization, has channeled this process perhaps to an unprecedented scale.
So it gives me pause when we ask about making computers more "getable".
Of course we don't want to further pull the ladder up behind us on the way to the tower of the computer priesthood.
We are in a situation not unlike the Paris climate agreement, where pre-industrialized nations are effectively punished for not having already committed the abuses of petro-industrialization before the larger nations got there first.
Now that I'm reading @dredmorbius's lovely link, I was thinking about how to better tie my interjection back to @hisham_hm's original toot (about dumb tech and smart users)... and I think this comes close to my own intuitions on the matter.
Maybe there's something about the knowledge imbalance between an information technology and its user that creates a flow that can go in either direction:
Either you know more than the machine and can thus do the forming, or you know less and become formed by it.
I'm not sure if "knowledge" is not exactly the right way to frame it, but we're looking at something in that rough zone.
Dumb tech is inherently exclusionary, but we still need it. Not everyone is ready to use the console, but that doesn't mean the console shouldn't exist.
Smart tech is inherently manipulative, but we need that too. Ethical smart tech is manipulative in the same way as education, claiming to know how a person should be shaped.
People shape tech, tech shapes people. Believing we KNOW what's best for others is hubris, but guessing is our job.
@hisham_hm @kensanata @dredmorbius @vortex_egg
"To the hermeticists and rosicrucians one half of reality is form en-forming... but the other half is the unformed, or perhaps we know it as "that which is formed".
The question is: In this age of proliferation of computing devices, and the unending production of forms, what is that we are en-forming so rapidly? Why and to what end?"
Oh, that's *very* good and it's what I worry about too.
I think late Wittgenstein had similar thoughts.
Side-toot, "computer" is a misnomer in the way that "digital" is a misnomer, in that both are historical accidents instead of etymologically-germane descriptors. They end up hiding and obfuscating the subjects under inquiry instead of revealing their inner nature.
> "Computer" itself is a misnomer. Our devices are informators, they process information (hence: information technology and data processing), and are connected to communications networks and information storage.

Makes me think of the Spanish word for computer, "ordenador", a thing that sorts and organizes.
@clacke No idea if "ordinateur" is still being used in French? Used to be common during the home computer times. German has "Rechner" (calculator) or "Rechenmaschine" (calculating machine), though almost everyone will use "Computer" too, nowadays.