Hot take on economics 

@freakazoid So, #GreshamsLaw dynamics can turn up in various forms. I've tried (unsuccessfully) to catalogue them in the past.

There's fiat or imposed value, as with coin. Also with transjurisdictional standards, such as divorce law and shipping registries ("flags of convenience"). Whatever the *minimum* acceptable *somewhere* is, is acceptable *everywhere*.

There's effective perceived value -- Mencken's "Brayard", or consumer technologies, or bicycles.

@o @woozle

1/

@freakazoid Underlying quality is difficult to communicate, so some *quality indicator* is substituted. Accent. Vocabulary. Cultural myths. Clothing. Food. Table manners. Branding. Musical tastes. Books read. Schools attended. Management fads.

These signal *both* quality *and* group alignment -- and the wrong set can easily get you killed in many cases.

*Changing* signifiers is highly traumatic: culture wars and value shifts.

This also leads to cargo culting.

@o @woozle

6/

@dredmorbius @woozle @o These fall into a few different, possibly overlapping categories: implicit bias, laziness or ignorance (because the information is available but people don't bother to look or don't know it's there), and places where it's genuinely hard to know, like interviewing and managing (though there's a lot we do know about management and interviewing, so laziness and ignorance apply there).

...

@o @woozle @dredmorbius Volume also contributes to this a lot: for cheap things, the cost of research can be a significant fraction of the cost of actually buying it. This is probably why for many things there's not much of a "middle ground", just super cheap and super expensive things.

You can also get seemingly paradoxical effects where the brand with the better reputation has lower quality at a higher price point. I've noticed in general an inverse correlation between marketing and quality.

@freakazoid @dredmorbius @woozle @o

Mature markets tend to end up with two market leaders and a bunch of also-rans. In that kind of market, the #1 is often complacent and of poor quality, but the #2 tends to be better because it wants to knock the leader off the top spot.

e.g. VHS vs Betamax, Windows vs macOS, VW vs Toyota for cars, etc.

(Obviously there are counterexamples, and I think the trend is becoming less clear as markets fragment.)

@mathew @o @woozle @dredmorbius Two of the three examples you cite have strong network effects, where that's certainly true. But car manufacturers don't have this problem. Globally, in 2014 (the year I can easily find data for), the number 8 automaker by number of cars sold (Honda) sold almost 43% as many cars as the number one (Toyota). In the US, the number 7 manufacturer, Kia, sold 43% as many passenger cars as the top manufacturer, GM. And number 3, Toyota, had almost 83% of GM's sales.

@dredmorbius @woozle @o @mathew Actually, now that I think about it, VHS vs Betamax happened in a market that wasn't remotely mature, and it was a competition among standards, not companies. There were plenty of manufacturers of both tapes and players.

I'm having difficulty coming up with an example in any situation where there aren't strong network effects, at least in the US.

@mathew @o @woozle @dredmorbius During my orientation at Google, when they were talking about the datacenters, someone asked if they ever planned to open source the designs like Facebook had. The speaker replied, "Open source is what the company in second place does." That wasn't the only thing in orientation that made me think about just walking out.

@dredmorbius @woozle @o @mathew Maybe commercial airline manufacture is a good example? Boeing and Airbus are definitely the top two, and Bombardier is the only other manufacturer I can even think of, but they only do regional jets AFAIK. But I'm not sure either Boeing or Airbus ever really acts like they're either especially comfortable or hungry; the competition seems to keep both companies on their toes pretty well.

@freakazoid China and Russia both have indigenous aircraft industries, and there's Embraer of Brazil, though theirs are also largely regional / corporate jets.

There are more small- and mid-sized aircraft manufacturers.

The industry as a whole is *extremely* conservative, almost wholly governed by engineering and aeronautical constraints (there are only so many arrangements of sausages, engines, and lifting surfaces).

Plus insurance risks and regulation.

@mathew @o @woozle

@freakazoid An interesting parallel is actually cargo ship design and use in the 13th / 14th centuries, about the time the lateen rig was adopted by Europeans, a millennium or more after its appearance in the Indian Ocean and Arabia.

The problem was insurers.

Shipping is high-risk, and voyages were insured individually, as separate ventures. Insuring syndicates wouldn't take risks on new-fangled tech like lateen rigs.

As a consequence, European ships could ...

@mathew @o @woozle

@freakazoid ... not sail close to the wind *at all*, often had to wait *weeks* before entering port (for favourable winds), and were limited to sailing between May and October. November through April nothing moved by ship, which is to say: nothing moved.

@mathew @o @woozle

@dredmorbius @woozle @o @mathew Yup, sounds like the aircraft industry. Interesting to see the same conservatism develop without (much?) government regulation. Was there much competition among insurers? Were *they* regulated or otherwise privileged?

@freakazoid There are some interesting exceptions, yes, but most of them show strong evidence of forces encouraging regionalisation.

The film industry is a key case in point. Reels of film, or now, digital streams or recordings, can be transmitted virtually effortlessly worldwide. The *fixed* infrastructure of film production is largely the support industry: carpenters, casting agencies, caterers, coaches, costume & set designers, electricians.

@woozle @o @mathew

1/

@freakazoid So centres of specialisation appear.

But you also have *globalised* centres, especially in India, China, Japan, and multiple European countries.

Most of that is language, though culture also plays a major role, as do government programmes specifically encouraging and supporting an indigenous film industry -- a powerful propaganda and cultural tool, kept under local control.

The auto industry is similar, in some respects. Not because it's projectable...

@woozle @o @mathew

2/

@freakazoid ... though cars ship easily, factories don't, so it centralises, at least within countries.

(JIT and improved transport networks are changing that somewhat. Factories are more distributed in the US than they were in Detroit's heyday, but still cluster somewhat.)

But: there are both regional taste differences and economics, as well as national interests involved.

Building cars and military vehicles shares much in common, and military manufacture is ...

@woozle @o @mathew

3/

@dredmorbius @mathew @o @woozle Is physical colocation a problem? It's generally centralization of control or coordination that somehow discourages defection (cartels have tended to disintegrate rapidly historically) that is the problem, right? And often when a company does manage to dominate that's because new entrants have to face regulatory barriers to entry it didn't, like Amazon with sales taxes.

@freakazoid Colocation used to be highly important because there was a lot of interplay between the automanufacturers themselves and supplier pipelines. Sometimes meeting F2F and getting your mitts on metal is the best way to resolve stuff.

That's either not so much the case, or other factors matter more, but you still have forms of clustering which matter, ranging from support industries to education and infrastructure.

Early colo was driven by bulk materials.

@woozle @o @mathew

@freakazoid Detroit was where raw iron ore and steel could be directly offloaded via ship and cars shipped by rail to mostly Eastern markets.

Interestingly, Los Angeles once featured pretty much the largest of every factory plant *outside* the primary core group, within the US. Which is to say: at LA's distance from the Rust Belt, 2ndary localisation made sense.

@woozle @o @mathew

@freakazoid ... of strategic interest. So countries otherwise not particularly vested in car manufacturing sustain it.

Local tastes and regulations vary, so cars get built for specific regulatory and cultural markets, as well as price points -- both inputs and consumers. Hence: much more variance *between* national markets, but typically little *within* them.

Aircraft are somewhat similar, though more constrained.

@woozle @o @mathew

4/

@freakazoid @dredmorbius @woozle @o Cars may not be a two-player market, but I still maintain that VW has gotten lazy (and indeed downright criminal), let its quality slip, and failed to invest in new tech, while Toyota has focused on making better cars, even if they did make a disastrously bad move betting on hydrogen rather than battery storage. (There's probably an interesting case study there on why they went the way they did.)

@mathew @o @woozle @dredmorbius No case study needed: they did it because hydrogen is heavily subsidized in Japan.

VW's failure to invest in new tech is the case with car makers across the board. Their cheating was to try to avoid losing a bunch of car sales as diesel was essentially getting regulated out of business. Which IMO was a stupid move on the government's part since diesel has lower CO2 emissions than gasoline.

@dredmorbius @woozle @o @mathew Actually I should qualify that - it has lower emissions not because its specific CO2 is lower but because diesel engines have higher compression ratios so tend to be more efficient. You can also get more of it from oil without having to resort to cracking. But hybrids are better, so probably not stupid to regulate its emissions, really.

@freakazoid @mathew @o @woozle @dredmorbius You can also decouple compression ratio (which actually increases thermal losses the higher you go) from expansion ratio (what actually improves efficiency) through either crankshaft linkages (true Atkinson-cycle engines) or valve timing ("Atkinson"/Miller-cycle engines with late intake valve closing or an extra valve, Budack-cycle engines with early intake valve closing).

And the compression losses actually mean that, in an engine that has a full compression stroke, the optimum compression ratio is about 16:1 - anything more than that, and you start losing more to heat than you get back in expansion, as I understand it. (Diesels ran significantly more than that in the past because they needed the excess heat to reliably ignite fuel, but in the 2010s they got down to 16.5:1 for most engines.) There are gasoline engines that run 13:1 on American regular fuel in full Otto cycle operation, though, and 14:1 on American premium/European regular.
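
A rough illustration of those diminishing returns: the textbook ideal air-standard Otto-cycle efficiency is eta = 1 - r^(1 - gamma). It deliberately ignores the heat and friction losses discussed above (which are what pull the practical optimum down toward ~16:1), so treat this as a minimal sketch, not an engine model:

```python
# Ideal air-standard Otto-cycle efficiency: eta = 1 - r**(1 - gamma).
# Assumes gamma ~1.4 for air; real engines lose efficiency to heat transfer
# and friction, which this idealised formula leaves out.
GAMMA = 1.4

def otto_efficiency(r, gamma=GAMMA):
    """Thermal efficiency of an ideal Otto cycle with compression ratio r."""
    return 1.0 - r ** (1.0 - gamma)

for r in (10, 13, 16, 20):
    print(f"r = {r}:1  ->  ideal efficiency ~ {otto_efficiency(r):.1%}")
```
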
@freakazoid @dredmorbius @mathew @o @woozle And, yeah, as much as TDIClubbers like to go on about "but diesels beat their EPA mileage and hybrids don't!"... they only do that if they're cheating and/or you're driving slower on the freeway than the current EPA freeway cycle. And hybrids can beat it too if driven like that.

My pre-Dieselgate 1999 New Golf TDI (which was fairly heavily modified, but one of those mods cheated constantly, improving thermal efficiency) pretty reliably got 47-51 miles per US gallon on the highway - original EPA highway was 49, 2007 re-rated EPA highway is 44.

By comparison, on road trips, my 2016 Prius gets about 54 MPG, versus a rating of 50 highway. And, that's on a lower carbon per gallon fuel. (Better aero does help.)

And then, in the city it does decently, I typically get 40-60 on my commute (if it's spring/fall, 60, summer, 50-55, winter, 40). The TDI would be more like 30-35 MPG on that commute.
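
To put the "lower carbon per gallon" point in rough numbers, here's a back-of-envelope sketch using the highway mileage figures quoted above; the per-gallon CO2 values are the commonly cited EPA estimates, assumed here rather than taken from this thread:

```python
# Back-of-envelope CO2 per mile, using the mileage figures quoted above.
# Assumed fuel carbon contents (commonly cited EPA estimates):
#   gasoline ~8,887 g CO2/gal, diesel ~10,180 g CO2/gal.
CO2_G_PER_GALLON = {"diesel": 10_180, "gasoline": 8_887}

def grams_co2_per_mile(mpg, fuel):
    return CO2_G_PER_GALLON[fuel] / mpg

print(f"1999 Golf TDI @ 49 MPG (diesel):   {grams_co2_per_mile(49, 'diesel'):.0f} g CO2/mile")
print(f"2016 Prius    @ 54 MPG (gasoline): {grams_co2_per_mile(54, 'gasoline'):.0f} g CO2/mile")
```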

@freakazoid Diesel fuel itself has a slightly higher energy content than petrol/gasoline, the engines run at higher compression ratios, and at higher temperatures (Carnot efficiency), all of which net more mileage and lower CO2 emissions.

Emissions of *particulates* (especially PM2.5, v. bad for lungs and health), and of NOx (nitrogen oxidising at high temps and pressures), are *worse* for diesel than petrol engines.

Also possibly sulfur and other sour crude contaminants.

@mathew @o @woozle

@freakazoid Incidentally, two cases of the dynamics I've been describing:

Toyota's foray into hydrogen fuel cells is based on government policies and incentives, creating a localised specialisation.

Volkswagen's diesel emissions fraud is a #GreshamsLaw dynamic: trying to substitute a lower-value quality for a higher-value one, through fraud.

@woozle @o @mathew

@dredmorbius @mathew @o @woozle It seems like this is also the case with software. People pick software on the basis of features or price, because they have no idea how to measure quality. So there's no market for high-quality software.

A "Consumer Reports for software" might help. It could track historical bugs, usability/accessibility problems, vulnerabilities, attacks, and the maker's response to them, etc.

@freakazoid "The Tyranny of the Minimum Viable User"

old.reddit.com/r/dredmorbius/c

Since users' _capabilities_ also vary strongly, the problem goes beyond this.

You see similar types of dynamics in, e.g., "audiophile" gear, much of which seems principally engineered to separate rich idiots from their lucre.

A better comparison might be precision or highly-skilled equipment, also somewhat affected.

@woozle @o @mathew

@dredmorbius @mathew @o @woozle This is why I think that products should lift up the user, not descend to the user's level.

@freakazoid The problem, given the dynamic, is that users don't _want_ to be lifted. They want to be comforted. You can try going against the grain. The market will punish you.

I'm not saying the market is right. The market and I disagree violently.

But the market is bigger than me.

@woozle @o @mathew

@dredmorbius @mathew @o @woozle Will it? I can think of plenty of examples of brands marketing how dumb their products are ("You already know how to use it" being a well-known example), but not of the market punishing products that are self-teaching. Do you know of some?

@freakazoid P.T. Barnum's dictum isn't an absolute universal, but it's close.

You can swim upstream, but you're going to find yourself in niche space. That *may* be a *profitable* niche, but it's still a niche.

The useful thing to do is look for cases of exceptions to the rule -- where is complex, respectful, high-information-density content (or products or services) found?

Quality literature, news, education, music, information gear, etc.

@woozle @o @mathew

@dredmorbius @mathew @o @woozle Sure, but couldn't the reason for that be that our current method of creating new products doesn't tend to incorporate pedagogy as a skill, not that the market doesn't desire pedagogy?

@woozle @o @mathew @dredmorbius To expand on the "You already know how to use it" example, it could be that pedagogical Apple failed because they didn't have much business sense, and the dumbed-down Jobs Apple succeeded not because of Jobs's dumbing-down of their products but because he understood business and marketing.

@freakazoid The winner-take-all dynamic of many tech-based products (hardware, OS, software, services, social media) makes attribution highly risk prone: success succeeds, failure fails. Survivor bias is manifest.

But having witnessed enough cases directly, and studied numerous others, the general rule of "don't outsmart your market" seems to hold.

Apple's big success is smartphones. The Mac is a fairly small share of their market, though they seem to be catering to it again.

@mathew @o @woozle

@freakazoid I was looking at the specs for the upcoming Mac Pro release. It's mind-boggling.

Base model: $4000. Top of the line is 28 cores and 1.5 TB RAM, 4 TB SSD, 4xGPU. Speculation is that this will run north of $35,000, and I suspect that's low. This is a supercomputer in a mesh cage.

(I'm wondering what Linux or Windows equivalents there might be.)

Definitely drool-worthy: apple.com/mac-pro

@mathew @o @woozle

@freakazoid @dredmorbius @mathew @o

Note that even Mac mice now have 2 buttons (or so I've been told).

@woozle The latest Mac "magic mice" have *zero* buttons, though there are multiple, non-determinable, sensing zones where things may or may not happen.

@freakazoid @mathew @o

@freakazoid Mostly it's cases where one of several conditions is met:

1. The good is a signalling mechanism. I can advertise my own capabilities in a space by using (or producing) the good. Uni education especially.

2. Direct beneficial use. If the good provides a _direct_ and _quantifiable_ or _perceptible_ benefit, it may find a niche. That will by definition be limited, and faces challenges by imitative competitors and measurement difficulty/costs. Examples ...

@woozle @o @mathew

2/

@freakazoid ... include business/financial news, policy news, etc.

3. Quality professional gear: audio, photo, video gear. Linux/BSD vs. Windows/Mac, Mac vs. Windows. Small, focused, niche audiences.

4. Regulatory quality floor on goods *or* users. Commercial and civil aviation vs. automobiles. Any idiot can (and does) drive a car. Pilots are licensed. Commercial pilots are certified to specific aircraft. There is a very strong quality floor.

@woozle @o @mathew

3/

@freakazoid 5. With some limits: self-use. Especially where tools are mutually developed by specialists within a craft. Linux *used* to occupy this space; it's drifting from it. Whether there's a replacement isn't yet clear. The death of the desktop may, paradoxically, save Linux, if the idiots all use smartphones instead.

There are some parameters that may influence this. The scope of network effects especially. If intelligence counters network effects, then a ...

@woozle @o @mathew

4/

@dredmorbius

Are you saying Linux used to be self-teaching? Because in my experience, it used to be worse about that but has slowly improved (from, like, 0% to maybe 20%).

@freakazoid @o @mathew

@woozle @mathew @o @dredmorbius Linux has dramatically dumbed down over time. I wouldn't really call it "self-teaching" at any point, but it used to be that, *if you used Linux*, you made heavy use of man pages and documentation that was included with Linux. So simply having sufficient interest in using Linux to get you over the hurdles would have left you significantly more competent in using Linux than it does today.

@dredmorbius @o @mathew @woozle Today people's response to some random thing breaking in GNOME 3 or KDE (let's ignore Android and Chrome OS) seems to be about the same as it is if something breaks in Windows: format and reinstall.

Some of that is just an increase in accessibility. But it's specifically an increase in accessibility gained by dumbing down the system instead of by improving the system's self-documentation/self-teaching.

@starbreaker Debian still makes the effort. Not having manpages remains a bug (though not a release-critical one).

Red Hat has almost always been far less useful -- missing manpages and even /usr/share/doc/<packagename> entries.

Debian's dwww is hugely useful.

FreeBSD / OpenBSD manpage quality is typically higher, though they use the wrong utilities.

GNU's insistence on info is a fucking brain disease.

@woozle @freakazoid @o @mathew

@starbreaker #ItsAJokeSon: BSD manpages document BSD utilities, not GNU utilities.

The arguments are all wrong ;-)

@woozle @freakazoid @o @mathew

@woozle Not so much that, as "developed principally by its own users", much as early Unix had been (1970 - 1990 or so).

That is, "users" weren't a separate class, they were "us", from the developers standpoint.

Today, you've got a much larger nontechnical userbase. The total installed base hasn't changed much by _percentage_ but it's vastly greater by _number_ than in the late 1990s.

Self-documentation through code, manpages, info docs, & HOWTOs has varied.

@freakazoid @o @mathew

@freakazoid ... quality product stands a far better chance. If production can be readily distributed and decentralised, similarly. Open source software seems vastly more tractable than open source hardware. Fabrication, logistics, and distribution are far harder for physical commodities.

In particular, if there's no way to impose some kind of effective floor (as with pilots/aircraft, certified industrial equipment, etc.), the market will seek the minimum viable user.

@woozle @o @mathew

5/

@freakazoid To counter that, you've got to raise the bound on that minimum.

You can gatekeep the users (certification). Or you can make sufficient degrees of incompetence nonviable -- a product that harms, or at least does not help, the incompetent user is one route. This will still limit the scope of the market, but at least won't dilute the product. Call it a talent bar.

This also means a noneconomic motivation. You're not profit-maximising, but maximising for individual benefit.

@woozle @o @mathew

6/

@dredmorbius @mathew @o @woozle I don't agree with that, because it assumes that the reason for incompetence is lack of ability or desire to become competent. If it's lack of desire, let's exclude them not just from products, but from the planet, since they're ruining humanity. And I suspect lack of ability represents only a tiny fraction of the population.

But I think the real answer is that our system selects for people who are shitty at teaching.

@freakazoid So ... well, my current use of "idiots" notwithstanding, I try to avoid prejudiced language, and the whole long first part of the Reddit essay goes into detail about why simple tools are often a net win.

The problem is where the dynamic directly impedes development of useful tools, systems, goods, services, etc.

And I really _don't_ think it's something you can chalk up only to pedagogy. Put another way: we're at the end of a phenomenal 300-year ramp-up in literacy.

@woozle @o @mathew

@freakazoid Which includes a hell of a lot of pedagogical reforms (the history's interesting, esp. for trivia, or trivium, fans).

Literacy ~1700 was ~10%. By 1900 in the US/Europe, 90%+.

HS graduation in the US: 6% in 1900, 90% by 1950. Bachelor's is now 30%+ and PhD >8%. There are more PhDs in the US now than HS grads 120 years ago.

But: the quality of that HS education is also, in some measures, much lower: lower language/logic skills, better scientific knowledge.

@woozle @o @mathew

2/

@freakazoid And the general informational tools we have *are* better than 300 years ago. (In part: they deliver us those 300 year old works instantly.) But they're far short of their potential.

And I'm trying to suss out *why*.

Teaching/training is _part_ of it, and yes, the education system likely doesn't meet its potential either, but it does a tremendous amount, for a tremendous number. And hasn't _worsened_ appreciably since the 1950s.

Most variation is ...

@woozle @o @mathew

3/

@freakazoid ... actually, if you look at it, changes in either who's included in classes or testing. Increased access => falling test scores. Rising test scores => falling access. That points to some population-level intractability.

(With exceptions. "Stand and Deliver".)

But trying to make all the children above average is a Sisyphean task, and a doomed premise for progress. You've got to work with the talent you've got.

My point is to not get in its way.

@woozle @o @mathew

4/

@dredmorbius @mathew @o @woozle Replying mid-thread because I think a lot of your reasoning farther down hinges on what I believe to be a mistake in this post. The fall in test scores from "increased access" is not necessarily because the larger group is not learning as well, but because the test wasn't actually testing how effectively the students were being taught. Most of our standardized tests are really indirect tests of socioeconomic class, not of how much students are learning.

@woozle @o @mathew @dredmorbius I think that the real problem is that "education systems" are super bad at educating. They can take a subset of students who have the right background and right set of parents and get them to do well on standardized tests, but they cannot take a random person out of a population of, say, English speakers, and on net provide them significant benefit.

@dredmorbius @mathew @o @woozle The reason students at "elite schools" tend to do better is that the school only allows in students who are going to be successful no matter what. They're *filtering*, not teaching. But they're not really filtering for innate skill. They're filtering for what the student has already absorbed from the world, largely due to the circumstances of their birth.

@woozle @o @mathew @dredmorbius And the problem isn't really that teachers are incompetent, though a bureaucracy isn't capable of hiring competent people; it's that it's not possible to be competent at teaching a class of 30+ randomly selected students.

@dredmorbius @mathew @o @woozle The fundamental problem IMO is that almost all societies treat teaching and learning as just one function among many, and something that's confined to particular institutions and particular phases of a person's life.

IOW it's not just Americans who are anti-intellectual but most of human society. And the reason is that we have entrenched groups who have a vested interest in a stupid population.
