Brandon Hall (@bthall) on mastodon.social

I realized that I've been systematically doing an assignment wrong, or at least not as intended. We're supposed to choose between competing values for ourselves: deciding whether A or B is more important, after we've already decided that both A and B are important to us.

I've been going about it like "well, if A, there's a decent probability of B, so A", but that's quite beside the question of "if A, how much satisfaction do you derive, and do you derive more than if B?" 🤓 Redo!

Ah, it was only after erasing my original answers that I realized how interesting it'd be to compare them to the re-done ones. 🙃

Oh man, I could technically do a sort of mashup of the two approaches: if A, then B has a probability of .3; A has a probability of .2 and provides 3/10 satisfaction while B provides 5/10; which pairing, given their probabilities and values, is optimal? I haven't explored this realm of logic/math much, but this'd be cool for a later project. 😂
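Sketching that mashup in Python, just to see the shape of it. This is only a rough sketch: I'm assuming the .3 is the probability of B given A, and that the satisfaction scores behave like utilities you can weight by probability and add up; neither of those readings is spelled out in the assignment.

    # Weight each outcome's satisfaction by its probability and compare totals.
    # Numbers from the thread: P(A) = .2, P(B given A) = .3,
    # satisfaction(A) = 3/10, satisfaction(B) = 5/10.
    p_a = 0.2
    p_b_given_a = 0.3
    sat_a = 3 / 10
    sat_b = 5 / 10

    # Expected satisfaction of pursuing A, counting the chance that B comes along too:
    ev_a_path = p_a * sat_a + p_a * p_b_given_a * sat_b

    # Expected satisfaction of A on its own, for comparison:
    ev_a_only = p_a * sat_a

    print(f"A alone:        {ev_a_only:.3f}")
    print(f"A plus maybe B: {ev_a_path:.3f}")

Whichever total comes out higher would be the better bet in this toy version.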

@bthall This is way cool. I need to get through my book, An Introduction to Probability and Inductive Logic.

Brandon Hall @bthall

@zacts The basics I've done in Econ used some of this logic, with the payoff of A being multiplied by the probability of A, but I'm not sure how it plays out from there.
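Written out, that econ version is just expected payoff = probability × payoff, compared across the options. A tiny sketch with made-up numbers, since none of the actual coursework figures are in front of me:

    # Expected payoff the econ way: probability times payoff, for each option.
    # These numbers are invented purely for illustration.
    options = {"A": (0.2, 3.0), "B": (0.5, 5.0)}  # name: (probability, payoff)

    for name, (prob, payoff) in options.items():
        print(f"Expected payoff of {name}: {prob * payoff:.2f}")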


@bthall An Introduction to Probability and Inductive Logic is like one of the best intros to the subject.