
May's Law:

Software efficiency halves every 18 months, compensating for Moore's Law.

@fribbledom So we're kinda screwed since hardware manufacturers aren't keeping up with Moore's Law these days, aren't we. >_>

@fribbledom No, I think it's not that simple. I think that current devs are not the equal of their intellectual forebearers.

Previous generations of devs had to work miracles in highly constrained environments, often working very hard to optimize their code effectively. In the era of "cheap" CPU and RAM, nearly everyone has forgotten the art of optimization.

Seriously, go ask a modern dev to describe the functionality of a specific CPU register of your choice and see what he says.

@drewcassidy @fribbledom Well, there is a difference between EE and computation.

To borrow a quote from a CS professor at Harvard whose name escapes me just now, "God made NAND and we made everything else."

Below the level of logic gates, you're doing EE. At the level of logic gates and above, you're doing computation.

@profoundlynerdy @fribbledom I don't remember everything I learned about registers and the like, and certainly don't keep up to date on that stuff, but I do remember lessons I learned about efficiency, which is definitely part of the craft. But the number of times I've heard, "eh, computers are getting faster"--No! This kind of laziness is what keeps the benefits of these faster computers from getting to the users. (Also, part of the fun for me is finding the elegant solution.)

@dlek @fribbledom This is all so fresh in my mind. I'm in the process of drafting a CS course now. I start with the abacus and ask "what's the simplest general-purpose digital computer you could build?"

Making me think about registers and clock cycles has forced me to revisit some assembly. I've been fiddling with MOS 6502 assembly for days and not writing as much of my course draft as I should have. Hahaha! First World problems.

@profoundlynerdy @fribbledom Oh man I should go find an excuse to write something in assembly. I love me some basic logic and computing. For some reason working within limits can really spur creativity, and not just in computing.

@dlek @profoundlynerdy @fribbledom for a miniature version, you could try any of the games by Zachtronics

@dlek @profoundlynerdy @fribbledom Well, TIS-100 might be the game for you.

A puzzle game in which you have to code assembler (well, a modified version) on a machine that computes... differently than normal.

@dlek @fribbledom Yeah, have been reading through *Retro Game Dev* (C64 edition) by Derek Morris. Check it out: amzn.com/0692980652.

MOS 6502 Assembly is surprisingly interesting.

@profoundlynerdy @fribbledom @wersimmon @Mnemonic Thanks for the recommendations, I am stoked to check them out! :)

@dlek @profoundlynerdy @fribbledom This is why I started trying to write programs for my calculator in the last week. 2.5 MiB of storage and ~20 KiB of RAM on a 15 MHz CPU. Pretty fun, especially when you've just figured out an optimization that boosts the speed by a factor of 10 with hand-tuned assembly and by counting the CPU cycles instruction-by-instruction. cybre.space/@niconiconi/100992

@dlek @profoundlynerdy @fribbledom Just write low-level stuff and you very quickly get to the limits of C and *have* to fall back to assembly.

@profoundlynerdy
You know, forcing the kids to program a game in a limited system like a C64 would be a great idea to teach them programming.

@dlek @fribbledom

@profoundlynerdy
Have you heard of @ky0ko ? Girl's writing a 6502 emulator for fun, with video and everything.

@dlek @fribbledom

@profoundlynerdy @fribbledom I think you mean "forebears".

If you asked a non-modern dev to describe the functionality of a specific JRE library function or CSS selector, they'd also have trouble. If you're measuring by knowledge in terms of what you can recite, modern devs probably know as much as, if not more than, programmers from the 1980s. But it's different stuff.

(Also, though, programming isn't about reciting facts. It's about creating things.)

@kragen @profoundlynerdy @fribbledom You’re making it sound as if one could not do both. I can write assembly and JavaScript.

@js @profoundlynerdy @fribbledom Sure, I can write assembly and JS too. I didn't mean to imply you can't do both. I just meant that nobody in 1985 was writing JS. (And don't tell me the VMS orange manuals were longer than the HTML5 spec. They weren't.)

In 1983, HDDs held about 10 MB. Talk about a dev's need to optimise resources. Nowadays, who gives a damn about the size of their desktop app?

@profoundlynerdy @fribbledom I think there's an argument to be had about dev time vs the task you're coding for.

If you only need to run a task once and/or not very often, it doesn't matter what language or tools you use so long as it's accomplished in a reasonable time.

If the task is performance-dependent or needs to run on older hardware then that's when optimisation becomes an issue developers care about.

@profoundlynerdy @fribbledom That might be true, although I think there's a big functionality/performance tradeoff. While some modern abstractions cost a lot in performance, more features get added because the work is less error-prone and there's less reinventing the wheel. Ultimately this is about what the industry is demanding from programmers rather than some innate inability to memorise registers or what have you.

@captainbland @fribbledom Also a fair argument.

That said, I still cringe whenever I hear of some company replacing their legacy COBOL applications with theoretically equivalent Java code.

While COBOL isn't bulletproof and can certainly ABEND (terminate abnormally), the solution for the Java crowd always seems to be "Hmm... I don't know, try restarting the JVM."

@profoundlynerdy @fribbledom hey now that's unfair, if your process doesn't terminate properly all you have to do is determine the correct PID from your twelve identically named "java" processes and kill it. 😂

@profoundlynerdy @fribbledom

Ok, but go back in time to 1980 and ask a programmer to describe how he'd implement an internet-enabled service with cloud backups.

You're not asking for programmers as knowledgeable as ones from the 80s, you're asking for some union of both sets of knowledge and experience.

That's going to be rare in any era.

@apLundell @fribbledom At 300-1200 bps, he wouldn't have the bandwidth to do any of that. Notwithstanding the math of the situation, I get your point.

@profoundlynerdy @fribbledom The era of specific CPU registers is long gone. Very few special purpose registers remain.

@js @fribbledom @profoundlynerdy CPUs nowadays seem to translate machine code again internally?

@veer66 @fribbledom @profoundlynerdy That as well. But the times when ax (eax, rax) was strictly the accumulator are over. We left that behind with the 8-bitters. Even in x86 this was just a name, while in the 6502, registers were still restricted to certain purposes. Today RISC has won. Lots of general purpose registers. Even x86 is translated to RISC-like instructions internally by the CPU these days.

@veer66 @js @fribbledom Sort of: on the whole, most Intel CPUs use CISC instructions as, well, basically macros for lower-level RISC-like instructions, for performance reasons. I'm not sure if all of this is defined at the microcode level or not; I'm sure some of it must be.

So, CISC persists, but RISC is the real winner here.

@profoundlynerdy @veer66 @fribbledom Almost all instructions are translated; only the most basic ones have a 1:1 translation, while most others get translated into several microcode instructions.

@profoundlynerdy @fribbledom Also, ask a modern dev about cache lines. These did not exist back then and allow a lot of optimization today. The difference is not time. The difference is people who care about performance and the details of how things work vs. people who don't give a shit and quickly want to get something done that falls apart next week.

@profoundlynerdy
I recall reading that Minecraft was horribly designed, using structs where words would suffice. As a result, performance suffered badly.
@fribbledom

@profoundlynerdy @fribbledom

I see your point but I wonder if it puts things in the proper light. Modern developers might not need to work small and efficient and can create more complex and powerful products. Super Mario 64 is 8 megs; Dark Souls III is almost 20 Gigs; neither the small nor the large is necessarily superior, their developers simply had a different environment to work with.

@fribbledom By my observations, the efficiency of software is less now that we have everything running in the cloud :)

"Software's slow... Ah, just add another instance to the cluster!"
@fribbledom "positive" thing is: it takes less time to produce unoptimized software than an efficient one, so we can make more of it 😂 😂 😂

@fribbledom
I would call this the "eh, good enough" equilibrium. With better hardware, you don't need such tightly optimized (and simple) programs for them to be reasonable to use.

@fribbledom LOL--and this is why I still code in C rather than those new-fangled languages.

@fribbledom Aw, we aren't punning,
and calling it "Leess's Law"?
(or something similar)
