Why do programming languages allow newbies to make the same mistakes every year? Is it so that we, senior developers and software architects, can chuckle at them derisively and feel good about ourselves? Why don't we make languages better instead? Remove confusing and financially damaging elements?

For instance: noobs tend to store money in a floating-point data type. That leads to rounding errors. Why don't our languages pick a better type? Why don't our tutorials teach better?
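
A minimal illustration of that rounding error in Java (the class name is just for the sketch). Neither 0.10 nor 0.20 has an exact binary representation, so the error is baked in before any business logic runs:

```java
// Illustrative class name only: the classic symptom of money in doubles.
public class FloatMoney {
    public static void main(String[] args) {
        double total = 0.10 + 0.20;            // intended: 0.30
        System.out.println(total);             // prints 0.30000000000000004
        System.out.println(total == 0.30);     // prints false
    }
}
```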

@aeveltstra +1 for having to deal with money/floating point rounding errors. Ugh.

@veer66 Yeah, like Java has a BigDecimal, but that isn't the default. You wouldn't find out about it until after you got burned by floating-point rounding errors.
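
For comparison, a minimal sketch of the BigDecimal route (class name and values are illustrative):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalMoney {
    public static void main(String[] args) {
        // Construct from strings, not doubles; new BigDecimal(0.10) would
        // capture the binary rounding error before BigDecimal can be exact.
        BigDecimal price = new BigDecimal("0.10");
        BigDecimal tax = new BigDecimal("0.20");
        System.out.println(price.add(tax));    // prints 0.30, exactly

        // Division must be given a scale and rounding mode, or it throws
        // ArithmeticException on non-terminating results.
        BigDecimal third = BigDecimal.ONE
                .divide(new BigDecimal("3"), 2, RoundingMode.HALF_EVEN);
        System.out.println(third);             // prints 0.33
    }
}
```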

@veer66 That article in particular is part of the problem: nothing there stops a beginning programmer from typing money as a floating-point data type. Why do we let that happen? It's a very common mistake that takes hours if not days to track down, making it very costly to fix.

And it's not just this: the money data type is just an example. If I remember correctly, Microsoft T-SQL does offer a money data type, which makes it intuitive to choose.

@aeveltstra What data type would you use instead? All options have disadvantages. Bignum: speed; integers: limited amounts; floats: rounding errors; structs or tuples: calculation complexity and speed...

@alvarezp Certainly. Wouldn't you agree that that is an area of concern? And if senior devs know which to use when, then why can't that knowledge make it back into the language or compiler, to help beginners skip our mistakes?

@aeveltstra Yes. We are so bad at explaining things. We tend to say that floating point data types are used for decimal (as in fractional) numbers. We should say "it's used for imprecise decimal numbers" instead.

In any case, even if you declared "cents" as an integer and used it for storing money amounts, what should happen when you compute cents/2 and cents is odd?
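
That integer pitfall is easy to demonstrate (hypothetical names):

```java
public class SplitCents {
    public static void main(String[] args) {
        int cents = 101;                        // $1.01 stored as whole cents
        int half = cents / 2;                   // integer division truncates: 50
        System.out.println(half * 2 == cents);  // prints false: a cent vanished
        System.out.println(cents % 2);          // prints 1, the orphaned cent
    }
}
```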

@alvarezp Could it become a fraction? We can express fractions with integers. Mathematics has done that for centuries.
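
One way to read that suggestion, as a sketch: keep an exact numerator/denominator pair of integers. In real code you'd likely reach for an existing library (such as Apache Commons Numbers' Fraction) rather than roll your own:

```java
// Hypothetical minimal rational type: an exact numerator/denominator pair
// of longs. A production version would reduce to lowest terms, guard
// against overflow, and define rounding at the edges.
public record Ratio(long numerator, long denominator) {
    Ratio dividedBy(long divisor) {
        return new Ratio(numerator, denominator * divisor);
    }

    public static void main(String[] args) {
        Ratio cents = new Ratio(101, 1);        // 101 cents, exactly
        Ratio half = cents.dividedBy(2);        // 101/2: no cent lost
        System.out.println(half);               // Ratio[numerator=101, denominator=2]
    }
}
```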

But the money type is just one issue. There's also the problem of storing phone "numbers" in a numerical data type. And storing addresses in "normalized" fields, separating what should stay together and imposing arbitrary data storage limitations. And so, so much more.

It's as much an educational failing as one perpetuated by programming languages.

@aeveltstra I don't think it's the languages really, just that some properties of computers take longer to understand than others, like how data is represented in memory.

I suppose you could make the case that many languages prioritize memory and speed more than they need to in the modern day (e.g. floats and ints versus arbitrary-precision numbers), which causes weirdness in how they behave. But software is already getting slower and slower; I don't think we need the language itself to help.
