
We need dumb tech and smart users, and not the other way around


To make this super clear, I'm not calling users of "smart" devices dumb. People *are* smart. The tech should not try to "dumb them down" by acting condescendingly, cutting down on their agency, and limiting their opportunities for education.

@hisham_hm Is there any industry standard for ensuring that we get smart users? Any best practices to follow?

@phoe People are smart. The tech should not "dumb them down" by acting condescendingly, cutting down on their agency, and limiting their opportunities for education.

@hisham_hm I don't know, a lot of new tech is going in that direction, and it doesn't seem to work too well. Maybe it's because even the smartest users behave dumbly once in a while.

@hisham_hm Agreed! It's worsened when "smart" tech is designed to intrude on privacy and decide what people can or cannot see.

@hisham_hm You're not going to find a practical solution by praying, hoping, pretending, or declaring all the children to be above average.

Population intellectual and cognitive capabilities are remarkably stubbornly fixed and finite. At best, about 10% of people really "get" computers. Usually far fewer.

@kensanata already linked reddit.com/r/dredmorbius/comme

Mind, I really wish this wasn't the case, and that your declaration was viable.

There are further confounding factors.

@dredmorbius @kensanata @hisham_hm Are you saying we have no way of affecting how gettable computers are without dumbing them down?

@dredmorbius @hisham_hm @kensanata Now I've read it. And you really do seem to be saying that "Population intellectual and cognitive capabilities are remarkably stubbornly fixed and finite". I think the end result of who is computer-literate depends on both the computer and the literate.

Capitalism and the lean drive to the MVU created the current figure. Software that crafted interfaces around the desire paths, but still allowed users discoverable paths to upgrading themselves and access to the generative layers beneath, could increase the figure.

Not everyone can code. Not everyone should code marketable software solutions. But there are a lot of people who aren't coding today that could be, if we found the equivalent of Excel for their problem domain and made it discoverable and self-teachable to them.

@clacke Necessary disclaimer: I'm not a cognitive scientist. Synoptic discussion means operating well outside fields of expertise.

What I've seen and studied of symbolic understanding, expression, representation, manipulation, and retrieval --- listening/reading, speaking/writing, analysis, memory --- suggests that there's a wide range of capabilities, which we're mostly unaware of because our normal interactions occur well within capability bounds.

But even with basic spoken or written language, there are people (in their native language) with very high ... or virtually no ... ability.

Most advanced countries have basic literacy rates of 95--100%. But basic literacy is simply the baseline. The US has a four-level rating:

Proficient: 13%
Intermediate: 44%
Basic: 29%
Below Basic: 14%

nces.ed.gov/naal/kf_demographi

Over two-fifths (43%) of US adults are at or below "Basic" prose literacy.

Mind: A fair portion of these are nonnative speakers of English. Some border regions, especially in Texas, have remarkably low English literacy, though residents may be proficient in other languages.

But that's a third of the population with a major impediment to significant computer proficiency, on what is a principally text-and-language-based interface.

Keep in mind that secondary school graduation rates have been well above 90% since the 1950s. Educational access shouldn't be a major driver.

@hisham_hm @kensanata

1/

@clacke If you want to design a universal-access device or platform, these are the constraints you're faced with.

Keep in mind that major expansion markets, with ~ 1 billion souls each, are China, India, and Africa.

Literacy challenges are high.

@hisham_hm @kensanata

@dredmorbius @hisham_hm @kensanata What I'm seeing is that 60% have reasonable to amazing literacy and yet they aren't capable of combining simple programs into slightly less simple programs.

I blame mostly the programs and how we combine them.

@clacke A lot of things from non-programming fields I've learned through programming. I think programming should be one of the first things taught♥

For example, yes, programming becomes better and more efficient once you know math, but math becomes easier to learn once you know programming, so programming should be taught first♥♥ Something similar goes for philosophy. SICP is the most important philosophical work of the 20th century.♥

@saper

Naw, but SICP taught me meta-programming and making DSLs. (I later read On Lisp and PAIP too.)

I originally learned programming through reading books about BASIC around five years before I ever saw a computer in real life. I was around 8 when reading those books (not having anything other than pen and paper) and 12 once I got access to DOS and QBasic. Me and my sister used to make toy games for each other. Mom would also make BASIC apps. Although she and my sister never moved beyond that. I loved it though.

SICP I was 21 and I already had some Lisp familiarity from Emacs. I’m 40 now.

SICP is more for the philosophy (meta-circular stuff, lexical scope etc and also explicating analogies of programming to other fields).
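Lexical scope, one of the SICP ideas mentioned here, fits in a few lines. A minimal sketch in Python rather than Scheme (Python comes up elsewhere in the thread); the function names are mine, invented for illustration:

```python
def make_adder(n):
    # `n` is captured lexically: the returned function remembers the
    # environment where it was *defined*, not where it is *called*.
    def add(x):
        return x + n
    return add

add5 = make_adder(5)
n = 100             # a different `n` at the call site changes nothing
result = add5(3)    # the closure still sees the n=5 it closed over
```

This is the same scoping discipline the SICP meta-circular evaluator makes explicit by threading environments around.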

The programming kids should learn doesn't have to be IR-transforming macros or anything, but creating and executing sets of instructions is great: algorithms! That will help a lot with math too!

@Sandra I fully agree with your opinion re math and other school stuff - how writing code gives kids ways to play with many abstract problems. I observed this recently with kids "struggling with maths" and suddenly they started playing with geometry (angles) and applied parametrized abstractions naturally without knowing what an "abstraction" really is.
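The angle play described above can be a one-line parametrized abstraction. A hypothetical classroom sketch in Python (the function name and setup are mine, not from the thread):

```python
def interior_angle(sides):
    # Interior angle of a regular polygon: it decomposes into
    # (sides - 2) triangles of 180 degrees, shared over `sides` corners.
    return (sides - 2) * 180 / sides

# Playing with the parameter makes the abstraction tangible:
triangle = interior_angle(3)   # 60.0
square = interior_angle(4)     # 90.0
hexagon = interior_angle(6)    # 120.0
```

The kid varies the parameter and watches the geometry respond, applying an abstraction without needing the word for it.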

Contd.

@Sandra Now I am looking into ways to get there: I am 45 and still have trouble going through the SICP videos (they are excellent). I have avoided Lisp all my life, preferring to work on low-level stuff like assembly, for example. (Only to end up writing Python that feels and looks like Scheme :)

Cont.

@Sandra "On Lisp" bored me to death, and I couldn't get into Emacs despite numerous focused attempts. But I think I learned a bit of those concepts in the meantime, for example by working with Forth and PostScript. I think playing directly with the compiler in Forth inspired my low-level assembly soul better than Lisp macros did.

Contd

@Sandra Shall we have "assembly/embedded/forth" and "lisp/functional" class profiles to approach different minds? Or are those differences just a matter of exposure and experience? I think both you and I started with BASIC anyway...

I wonder if there is some research about it.

@saper In the past, and I still think this is a good idea, I've recommended teaching ASM and Scheme as the first two languages, side by side. It does not have to be x86-64 ASM; a simple toy or retro ASM is better. The purpose is to teach the von Neumann architecture.
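A toy machine like that fits on one screen. A hypothetical sketch in Python (this instruction set is invented for illustration, not any real retro ISA); the point is that instructions and data share a single memory, which is the von Neumann idea:

```python
def run(memory):
    """Execute (opcode, operand) pairs stored in `memory` until HALT.

    Code and data live in the same memory, as on a von Neumann machine."""
    acc = 0  # single accumulator register
    pc = 0   # program counter
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":     # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":    # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":  # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":   # stop, yielding the accumulator
            return acc

# Cells 0-3 hold instructions; cells 4-6 hold data. Add 2 + 3.
memory = [
    ("LOAD", 4),   # 0: acc = 2
    ("ADD", 5),    # 1: acc = 2 + 3
    ("STORE", 6),  # 2: write the result to cell 6
    ("HALT", 0),   # 3: done
    2,             # 4: data
    3,             # 5: data
    0,             # 6: result lands here
]
result = run(memory)
```

A student who then adds a jump instruction discovers loops, and one who makes the program overwrite its own cells discovers self-modifying code, both for free, because code and data share the store.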

@clacke You jump to blame.

I look at available data and research, and keep an open mind.

I'm not aware offhand of research into programming skill, distribution, cognitive determinants, or psychology. But it's a complex task with numerous components and variants. Even highly-proficient programmers tend to have affinity for a small set of tools and languages.

@hisham_hm @kensanata

@dredmorbius @clacke @hisham_hm @kensanata I have a weird question that keeps popping up when I think about this thread (and others like it): What do we expect 100% of humans to _do_ with computers?

Computers are general purpose computing devices, but is the goal here for everyone to be computing things? What are we computing and why?

@dredmorbius @clacke @hisham_hm @kensanata

I know it's kind of anathema to suggest this, but is computing, and the ability/access to do computing, a general good?

We have a lot of evidence that large concentrations of computing ability produce harm centers that warp all of our social and civic functioning toward ill.

@vortex_egg @dredmorbius @clacke @hisham_hm @kensanata You've got it the wrong way around; it's not what we use computers for; it's what computers use *us* for.

@penguin42 @dredmorbius @clacke @hisham_hm @kensanata

I agree that “the computing of humans” is a source of harm, and is indeed one of the problems I have in mind when I ask whether “everyone should be doing computing” is a wholesome goal.

Yet, humans and human societies are the ones building, programming, and operating the computers that are being used on humans.

@vortex_egg @penguin42 @dredmorbius @clacke @hisham_hm @kensanata

"I ask whether “everyone should be doing computing” is a wholesome goal."

I feel that if everyone ISN'T doing computing, then computing is being done TO them.

ie, that learning to "compute" (ie: think logically, manage incoming information streams, make tools) is a basic defensive mental skill that is necessary to avoid being manipulated by the bad people with big computers.

But I don't know if that's actually true.

@vortex_egg Fair question.

"Computer" itself is a misnomer. Our devices are informators, they process information (hence: information technology and data processing), and are connected to communications networks and information storage.

The information roles themselves are, generally:

Interpersonal communications. Text, image, voice, video.
Document access, in the sense of #PaulOtlet: any fixed record (text, image, voice, video, ...) not primarily interactive. May be informational or entertainment. I'm including streamed/live access here.
Commerce & business: buying, selling, and transacting business (goods, services, activities).
Government: Financial (mostly taxes/fees) and other informational transactions.
Personal, household, task/activity, financial management & organisation. Recordkeeping, planning, budgeting, designing, device/services management, etc.
Other technical/scientific data streams, e.g. utility, safety, environmental monitoring, healthcare.
Creation & expression.

@clacke @hisham_hm @kensanata

1/

@vortex_egg And it's not all roses.

There's disinformation, fraud, abuse, bullying, doxxing, surveillance, manipulation, harassment, distraction, information overload, malware.

But think of information generally rather than computing. What computing is involved largely relates to those information streams and records.

And increasingly, everyday life presumes some form of digital access.

"Informator" isn't original to me, though I can't find the source right now.

@clacke @hisham_hm @kensanata

@vortex_egg But think of the "computer" as the interface through which information is created, accessed, updated, input, streamed, converted, stored, retrieved, duplicated, destroyed, received, transmitted, analysed, and modeled.

It replaces or modifies writing, speech, audio, video, data, records, streams, and algorithms (programs / software), as well as systems: control, management, monitoring, manipulation, alerting, logging.

How necessary all this is ... ?

Though it's ever harder to live without it.

@clacke @hisham_hm @kensanata

@dredmorbius

Yes that largely aligns with how I already think about the situation at a metaphysical level: Computation is a stand-in for information, or rather en-formation - the creation of forms, the calling out of distinctions from the unformed to say "this" is not "what is not this".

Computational technologies automate and accelerate the production of forms, and forms of forms. But this process of en-forming predates its mechanization.

@clacke @hisham_hm @kensanata

@dredmorbius

To the hermeticists and rosicrucians one half of reality is form en-forming... but the other half is the unformed, or perhaps we know it as "that which is formed".

The question is: In this age of proliferation of computing devices, and the unending production of forms, what is it that we are en-forming so rapidly? Why and to what end?

@clacke @hisham_hm @kensanata

@dredmorbius

Norbert Wiener and Shoshana Zuboff offer that we are trying to en-form human life. Instrumentalization. The human use of human beings.

This again predates computers as the interface for en-formation. But the concentration of such devices, compounded with monopolization, has channeled this process perhaps to an unprecedented scale.

So it gives me pause when we ask about making computers more "gettable".

@clacke @hisham_hm @kensanata

@dredmorbius

Of course we don't want to further pull the ladder up behind us on the way to the tower of the computer priesthood.

We are kind of in a situation not unlike the Paris agreement for climate reductions and pre-industrialized nations that are preemptively punished for not having already exercised the abuses of petro-industrialization before the larger nations got to it first.

@clacke @hisham_hm @kensanata

@clacke

Now that I'm reading @dredmorbius's lovely link, I was thinking about how to better tie my interjection back to @hisham_hm's original toot (about dumb tech and smart users)... and I think this comes close to my own intuitions on the matter.

Maybe there's something about the knowledge imbalance between an information technology and its user that creates a flow that can go in either direction:

Either you know more than the machine and can thus do the forming, or you know less and become formed by it.

I'm not sure "knowledge" is exactly the right way to frame it, but we're looking at something in that rough zone.

@kensanata

@vortex_egg @hisham_hm @kensanata @dredmorbius There is something to be said for learning arithmetic before allowing yourself the use of an electronic calculator, and that idea scales all the way up to using a thing like Kubernetes.

@clacke
Dumb tech is inherently exclusionary, but we still need it. Not everyone is ready to use the console, but that doesn't mean the console shouldn't exist.
Smart tech is inherently manipulative, but we need that too. Ethical smart tech is manipulative in the same way as education, claiming to know how a person should be shaped.
People shape tech, tech shapes people. Believing we KNOW what's best for others is hubris, but guessing is our job.
@hisham_hm @kensanata @dredmorbius @vortex_egg


@clacke @hisham_hm @kensanata @dredmorbius @vortex_egg

"For me it's about allowing more people to form rather than be formed."

And this is also a great line! I like it.

@vortex_egg @dredmorbius @clacke @hisham_hm @kensanata

"To the hermeticists and rosicrucians one half of reality is form en-forming... but the other half is the unformed, or perhaps we know it as "that which is formed".

The question is: In this age of proliferation of computing devices, and the unending production of forms, what is it that we are en-forming so rapidly? Why and to what end?"

Oh, that's *very* good and it's what I worry about too.

I think late Wittgenstein had similar thoughts.

@dredmorbius

Side-toot, "computer" is a misnomer in the way that "digital" is a misnomer, in that both are historical accidents instead of etymologically-germane descriptors. They end up hiding and obfuscating the subjects under inquiry instead of revealing their inner nature.

@clacke @hisham_hm @kensanata

@vortex_egg Agreed.

See also: manipulate, manufacture, language (esp. in context of "computer"), battery, album (record), film, tape, disk (SSD), ...

@clacke @hisham_hm @kensanata

@dredmorbius @hisham_hm @kensanata @vortex_egg

> "Computer" itself is a misnomer. Our devices are informators, they process information (hence: information technology and data processing), and are connected to communications networks and information storage.

Makes me think of the Spanish word for computer, "ordenador", a thing that sorts and organizes.

@mithrandir @hisham_hm @kensanata @dredmorbius @vortex_egg Not in Spanish Spanish. Hadn't heard this word before, thanks.

"Used in Mexico, Puerto Rico, Venezuela, Argentina, Peru, Uruguay, Cuba, Costa Rica, Bolivia, and Paraguay."

Interesting that it's feminine. I wonder if it derives directly from "computer" for the pre-automation professional human computers, who were mainly women, rather than deriving from the modern English word.

@clacke No idea whether "ordinateur" is still used in French? It used to be common in the home computer era. German has "Rechner" (calculator) or "Rechenmaschine" (calculating machine), though almost everyone will use "Computer" too, nowadays.

@hisham_hm @kensanata @mithrandir @dredmorbius @vortex_egg

@galaxis @hisham_hm @kensanata @mithrandir @dredmorbius @vortex_egg "ordinateur" is where Wikipedia takes you if you go to the English "computer", then go to French.

@clacke Earlier English terms:

Analytical Engine
Tabulator
Electronic Brain
Data Bank
Firing Computer (war, not employment, though one wonders)
Differential Engine
Calculator
Microprocessor

@hisham_hm @kensanata @vortex_egg

@dredmorbius @hisham_hm @kensanata @vortex_egg The Swedish word "dator" for it also emphasizes the data processing aspect rather than the number-crunching bit.

@bkhl @clacke @hisham_hm @kensanata @dredmorbius @vortex_egg

Which Nordic language word is the one that translates as "numberwitch"?

(again probably from "female calculating staff", maybe?)

@bkhl @clacke @hisham_hm @kensanata @dredmorbius @vortex_egg

oh right, Icelandic

en.wiktionary.org/wiki/t%C3%B6

<< A portmanteau word, coined by Sigurður Nordal in 1965 from tala (“number”) +‎ völva (“prophetess”). >>
