May's Law:

Software efficiency halves every 18 months, compensating for Moore's Law.

@fribbledom So we're kinda screwed since hardware manufacturers aren't keeping up with Moore's Law these days, aren't we. >_>

@fribbledom No, I think it's not that simple. I think that current devs are not the equal of their intellectual forebearers.

Previous generations of devs had to work miracles in highly constrained environments, often working very hard to optimize their code effectively. In the era of "cheap" CPU and RAM, nearly everyone has forgotten the art of optimization.

Seriously, go ask a modern dev to describe the functionality of a specific CPU register of your choice and see what he says.

@drewcassidy @fribbledom Well, there is a difference between EE and computation.

To borrow a quote from a CS professor at Harvard whose name escapes me just now, "God made NAND and we made everything else."

Below the level of logic gates, you're doing EE. At the level of logic gates and above, you're doing computation.

@profoundlynerdy @fribbledom I think you mean "forebears".

If you asked a non-modern dev to describe the functionality of a specific JRE library function or CSS selector, they'd also have trouble. If you're measuring by knowledge in terms of what you can recite, modern devs probably know as much, if not more, than programmers from the 1980s. But it's different stuff.

(Also, though, programming isn't about reciting facts. It's about creating things.)

@kragen @profoundlynerdy @fribbledom You’re making it sound as if one could not do both. I can write assembly and JavaScript.

@js @profoundlynerdy @fribbledom Sure, I can write assembly and JS too. I didn't mean to imply you can't do both. I just meant that nobody in 1985 was writing JS. (And don't tell me the VMS orange manuals were longer than the HTML5 spec. They weren't.)

In 1983, HDDs held about 10 MB. Talk about the need for a dev to optimise resources. Nowadays, who gives a damn about the size of their desktop app?

@profoundlynerdy @fribbledom I think there's an argument to be had about dev time vs the task you're coding for.

If you only need to run a task once and/or not very often, it doesn't matter what language or tools you use so long as it's accomplished in a reasonable time.

If the task is performance-dependent or needs to run on older hardware then that's when optimisation becomes an issue developers care about.

@profoundlynerdy @fribbledom That might be true, although I think there's a big functionality/performance tradeoff. While some modern abstractions cost a lot in performance, more features get added because the work is less error-prone and there's less reinventing the wheel. Ultimately this is about what the industry is demanding from programmers rather than some innate inability to memorise registers or what have you.

@captainbland @fribbledom Also a fair argument.

That said, I still cringe whenever I hear of some company replacing their legacy COBOL applications with theoretically equivalent Java code.

While COBOL isn't bulletproof and can certainly ABEND (terminate abnormally), the solution for the Java crowd always seems to be "Hmm... I don't know, try restarting the JVM."

@profoundlynerdy @fribbledom hey now that's unfair, if your process doesn't terminate properly all you have to do is determine the correct PID from your twelve identically named "java" processes and kill it. 😂

@profoundlynerdy @fribbledom

Ok, but go back in time to 1980 and ask a programmer to describe how he'd implement an internet-enabled service with cloud backups.

You're not asking for programmers as knowledgeable as ones from the 80s, you're asking for some union of both sets of knowledge and experience.

That's going to be rare in any era.

@apLundell @fribbledom At 300-1200 bps, he wouldn't have the bandwidth to do any of that. Notwithstanding the math of the situation, I get your point.

@profoundlynerdy @fribbledom The era of specific CPU registers is long gone. Very few special purpose registers remain.

@js @fribbledom @profoundlynerdy CPUs nowadays seem to translate machine code again internally?

@veer66 @fribbledom @profoundlynerdy That as well. But the times when AX (EAX, RAX) was strictly the accumulator are over. We left that behind with the 8-bitters. Even in x86 this was just a name, while in the 6502, registers were still restricted to certain purposes. Today RISC has won: lots of general-purpose registers. Even x86 is translated to RISC-like instructions internally by the CPU these days.

@veer66 @js @fribbledom Sort of: on the whole, most Intel CPUs use CISC instructions as, well, basically macros for lower-level RISC-like micro-operations, for performance reasons. I'm not sure if all of this is defined at the microcode level or not; I'm sure some of it must be.

So, CISC persists, but RISC is the real winner here.

@profoundlynerdy @veer66 @fribbledom Almost all instructions are translated, only the most basic ones have a 1:1 translation, most others get translated into several microcode instructions.

@profoundlynerdy @fribbledom Also, ask a modern dev about cache lines. This did not exist back then and allows you a lot of optimization today. The difference is not time. The difference is people who care for performance and details about how things work vs. people who don’t give a shit and quickly want to get something done that falls apart next week.

I recall reading that Minecraft was horribly designed, using structs when words could suffice. As a result, performance suffered horribly.

@profoundlynerdy @fribbledom

I see your point but I wonder if it puts things in the proper light. Modern developers might not need to work small and efficient and can create more complex and powerful products. Super Mario 64 is 8 megs; Dark Souls III is almost 20 Gigs; neither the small nor the large is necessarily superior, their developers simply had a different environment to work with.

@fribbledom By my observations, the efficiency of software is less now that we have everything running in the cloud :)

"Software's slow... Ah, just add another instance to the cluster!"

I would call this the "eh, good enough" equilibrium. You have better hardware, you don't need as tightly optimized (and simple) programs to make it reasonable to use.

@fribbledom LOL--and this is why I still code in C rather than those new-fangled languages.

@fribbledom Aw, we aren't punning,
and calling it "Leess's Law"?
(or something similar)
