Nate Cull


When designing a user interface, imagine some old woman using it, say Margaret Hamilton, and she's clicking your app's buttons and saying to you, as old people do,

"Young whippersnapper, when I was your age, I sent 24 people to the ACTUAL MOON with my software in 4K of RAM and here I am clicking your button and it takes ten seconds to load a 50 megabyte video ad and then it crashes

I'm not even ANGRY with you, I'm just disappointed."

@natecull Well, people don't optimize for resources anymore

@Zulgrib @natecull Human time is more valuable than computer time in most circumstances. Or at least, that's the way the incentives lean. :/

@natecull @Angle But what if human time is wasted waiting for results from unoptimized code?

@Zulgrib @natecull @Angle I think the usual reasoning is: "Get a better computer then."

@er1n this is both one of my biggest agony points in UI design and one reason I haven't released SotF :P I can lag it in some cases and it really bugs me because I can't figure out why, not even with flame graphs and profiler timelines

@ninjawedding i was actually thinking about your scrolling demo when i cc'd you :)

@er1n oh the scrolling thing, heh

yeah I was really (disproportionately?) happy when that worked the way I wanted it to. I mean it's just scrolling but being able to zoom through tens or hundreds of thousands of items at full framerate and bounded memory is still something I love
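(Editor's aside: the usual way to get full-framerate scrolling over huge lists with bounded memory is list virtualization — only the rows intersecting the viewport are ever materialized. A minimal sketch, assuming fixed-height rows; this is an illustration of the general technique, not how SotF actually does it:)

```python
def visible_range(scroll_y, viewport_h, row_h, total_rows):
    """Return the [first, last) row indices intersecting the viewport.

    Memory stays bounded because only these rows are ever materialized,
    no matter how large total_rows grows.
    """
    first = max(0, scroll_y // row_h)
    last = min(total_rows, (scroll_y + viewport_h) // row_h + 1)
    return first, last

# e.g. a 600px viewport over 100,000 rows of 20px each:
# only rows 2000..2030 exist as widgets at this scroll position
first, last = visible_range(scroll_y=40_000, viewport_h=600,
                            row_h=20, total_rows=100_000)
```

Variable-height rows need an extra index (e.g. prefix sums of row heights), but the bounded-memory idea is the same.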

@LottieVixen @ninjawedding @er1n profiler output visualization tool, helps you see what functions and system calls your program is spending the most time on

this, or similar (like what erin linked) :
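(Editor's aside: concretely, a flame graph is just aggregated stack samples — each sampled call stack is merged into a tree, and a frame's width is proportional to how many samples it appears in. A minimal sketch of that aggregation, with hypothetical sample data:)

```python
from collections import Counter

def aggregate(stacks):
    """Fold sampled call stacks into (frame-path -> sample count).

    Each key is a root-to-leaf prefix of a stack; its count becomes
    the width of that frame in the rendered flame graph.
    """
    widths = Counter()
    for stack in stacks:                 # stack = outermost-first tuple of frames
        for depth in range(1, len(stack) + 1):
            widths[stack[:depth]] += 1   # every prefix accumulates the sample
    return widths

samples = [
    ("main", "update", "bind"),
    ("main", "update", "bind"),
    ("main", "paint"),
]
w = aggregate(samples)
# ("main",) appears in all 3 samples; ("main", "update", "bind") in 2,
# so "bind" renders as two-thirds the width of "main"
```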

@cascode @LottieVixen @ninjawedding the difference with what i posted is that it's very good for monitoring infrequent events

@cascode @LottieVixen @er1n yep, pretty much that (I got weird lag on this status 🤷 )

first image is an example of a flame graph from the QML profiler; you can see that the majority of total time was spent updating a binding (and, amusingly, there's no sub-operation inside it). you can then look at the timeline to see what that looks like over time, which is useful for analyzing things like "what's causing jank" etc and getting an idea of what that 51.7% of time spent means in terms of frame budget
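(Editor's aside: the "frame budget" framing above is simple arithmetic — at 60 fps each frame gets about 16.7 ms, so any single operation eating a big slice of that budget shows up as jank. A quick sketch:)

```python
def frame_budget_ms(fps=60):
    """Time available per frame before the UI visibly drops frames."""
    return 1000.0 / fps

def fits_in_budget(op_ms, fps=60):
    """True if an operation leaves any headroom inside one frame."""
    return op_ms <= frame_budget_ms(fps)

# at 60 fps you get ~16.67 ms per frame; a 10 ms binding update
# leaves ~6.67 ms for layout, painting, and everything else
```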

FlameScope does look cool, I just don't know how to integrate it yet :P
@er1n oh huh

that could make it really easy then 🤔

actually I think you told me about this before lol

@ninjawedding @cascode @er1n

oh heck this looks awesome, also erm....the delay may be federation issues....*sigh*

@LottieVixen @cascode @er1n yeah the tools are great

though tbh similar tools have been in web browsers for quite a while too :)

actually I'd say that browsers have probably some of the best profiling tools out there right now? next to maybe like Instruments, bespoke profiling tools in game engines, and Telemetry, which I have never used but have seen and it looks sick

@ninjawedding @LottieVixen @cascode telemetry is theoretically amazing but it's also probably unfathomably expensive, like so expensive that they just say "contact us if you're interested"

@er1n @LottieVixen @cascode heh yeah -- when I first saw Telemetry in use I was like "oh how much could this be? certainly no worse than CLion"

lol me
@LottieVixen @er1n @cascode I think you would do fine -- Telemetry's data comes from hooks you write into the code, so they work a lot like print statements

(or so I hear)

meanwhile the Gemini Guidance Computer team laugh

"you MIT people had 4K of RAM, we had 39 whole bits AND WE WERE GRATEFUL"

ah, actually they did have 4096... 39-bit words of writeable core RAM. Weird. Was the Gemini computer *bigger* than the Apollo one ????

The Apollo LVDC is the third computer on the ship that never gets any love cos it just ran the engines and wasn't sexy

<< and the MIT Instrumentation Labs' antibodies flooded in to destroy the invader with critiques and reports negative of the IBM report. >>

lol programmers then just like today

Ah! The LVDC had no ROM at all! Good lord. The entire program sat in RAM. Aaaaaaaaaaaa

@natecull please tell me they at least had a bunch of toggle switches somewhere in case it got wiped so houston could read it back to them and they could program it back in

@jk I think if the LVDC failed you had bigger problems since it literally only ran the launch stage and that either got jettisoned or exploded in the first few minutes

but they could patch it right up til launch time, yes

<< A so-called "bugger word" has been stuck at the end of each bank—no comments on this terminology, please, since I didn't invent it; when I asked Don Eyles some question that involved them, he somewhat-laconically stated "we called them check sums">>
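(Editor's aside: the idea behind a "bugger word" is a checksum appended to each memory bank so a self-test can verify the bank's contents. A simplified sketch of the scheme — the real AGC used its own 15-bit ones'-complement arithmetic, so this is the concept, not the flight algorithm:)

```python
BANK_MASK = 0x7FFF  # 15-bit words

def bugger_word(bank_words, bank_number):
    """Checksum word appended so the whole bank sums to the bank
    number (mod 2**15). Simplified vs. the AGC's actual arithmetic."""
    total = sum(bank_words) & BANK_MASK
    return (bank_number - total) & BANK_MASK

def bank_ok(bank_with_checksum, bank_number):
    """Self-test: a corrupted word makes the sum miss the bank number."""
    return sum(bank_with_checksum) & BANK_MASK == bank_number

bank = [0x1234, 0x0FED, 0x7000]          # hypothetical bank contents
check = bugger_word(bank, bank_number=5)
```

Making each bank sum to its own bank number (rather than to zero) also catches the case where a bank is read from the wrong address.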

Huh, and if you have ROM and RAM I guess it literally is a Harvard Architecture

I never thought of that before!

@natecull john roderick often talks about his ~85 year old mother on his podcasts, and describes how she (as a programmer in the 1950s-1960s) is appalled at pretty much all bugs in computer programs and the blasé attitude developers have, saying something like "back when I worked on those machines we made sure the code was correct before we sent it to anybody, we spent months and years making sure everything was completely correct before any of it was sold"

@jk @natecull but today, it's possible to patch software faster than it used to be. But I agree that just accepting bugs in the software is stupid.

@natecull Actually, the Apollo 11 block II guidance computer had 2k of RAM (core memory) and 32k of ROM. This was a significant upgrade from the previous block I unit that had 1k of RAM (again, core memory) and 24k of ROM.

Even a VIC 20's feeble 3.5k RAM was luxurious by comparison.

@natecull The programming paradigm I've always loved is this: "The program should always act in the way that is least surprising to the user".

@natecull If I had to take a dressing down about my code's performance from Margaret fucking Hamilton, I don't think I'd be able to go back to work.

@natecull I was halfway through writing a response pointing out the fact that the AGC had 48 kwords of storage. Then I realised you explicitly said RAM. So you're right.

But, just to give me a reason to comment on one of the more interesting machines of the '60s, I will just mention that the word length of the AGC was 15 bits, so it held a bit more storage than "4K" may suggest.
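(Editor's aside: the arithmetic, for anyone counting bytes — 4,096 words of 15 bits is noticeably more raw storage than "4K" suggests if you picture 8-bit bytes:)

```python
words = 4096
bits_per_word = 15

total_bits = words * bits_per_word   # 61,440 bits in "4K" of AGC words
approx_bytes = total_bits / 8        # 7,680 bytes' worth of bits
```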

@natecull i'm disappointed too, hypothetical margaret hamilton. i'm disappointed too...

@natecull this is not even far from what would really happen

In fact: the Apollo guidance computer had awesome and groundbreaking UI.

@natecull This reminds me. I was at a 2-day course in usability by the Nielsen Norman Group.

They talked about how frustrated they were that devs didn't take them seriously when they pointed out (via usability testing) that a UI was too difficult to use. The devs said "well, the users are just not smart enough!".

To prove that the UI really was too difficult to use they did the usability test with rocket scientists from NASA and showed that even they couldn't figure out the UI.

Huh talk about Margaret Hamilton and there she is!

and there's a million things she hasn't done
but just you wait
