
Throw my game to change playoff teams?

Did you randomly select it, or did you search for it and pick it out?

Obviously the odds it is a king are 1/1.
How can you possibly know the probability is 1 if you don't know whether or not it was selected randomly? You said randomness was a necessary condition for you to calculate true probabilities.
The odds of it being a king, if you see that it is a king, are 1/1 whether you picked it randomly or not.
So randomness is not a necessary condition for you to calculate a true probability. Got it. :thumbup:

 
Did you randomly select it, or did you search for it and pick it out?

Obviously the odds it is a king are 1/1.
How can you possibly know the probability is 1 if you don't know whether or not it was selected randomly? You said randomness was a necessary condition for you to calculate true probabilities.
The odds of it being a king, if you see that it is a king, are 1/1 whether you picked it randomly or not.
So randomness is not a necessary condition for you to calculate a true probability. Got it. :thumbup:
Well that depends. If you know the answer, that is kind of a different situation.

For all the odds I have discussed, they are of course based on randomness of the event, like with Craps and Roulette.

I guess randomness isn't necessary when the answer is 1 or 0, and you know this answer.

Did I miss another point? What's your point on this one?

 
I guess actually, I take that back.

When you have a king in front of you, technically you are picking that king out of a stack of ONE card. So if you randomly choose one card from a deck of one card, the odds are 1/1 (assuming you know it is a king).

 
So if you randomly choose one card from a deck of one card
That's not random.

Did I miss another point? What's your point on this one?
My point is that people (not just you, but you're one of them) need to stop making up their own definitions of words. Things like "probability," "odds," "random," "variable," etc. all have actual meanings - you can surely find them all in your stats book, assuming you didn't really burn it - yet several of you have basically disregarded what those words actually mean and instead have just adapted them to suit whatever point you're trying to make at the moment. Believe it or not, when you do that, the points you're trying to make end up making no sense.

 
So if you randomly choose one card from a deck of one card
That's not random.

Did I miss another point? What's your point on this one?
My point is that people (not just you, but you're one of them) need to stop making up their own definitions of words. Things like "probability," "odds," "random," "variable," etc. all have actual meanings - you can surely find them all in your stats book, assuming you didn't really burn it - yet several of you have basically disregarded what those words actually mean and instead have just adapted them to suit whatever point you're trying to make at the moment. Believe it or not, when you do that, the points you're trying to make end up making no sense.
You also are not asking straightforward questions some of the time, and our questions and answers are getting mixed up.

I guess from now on, ask a question and include all of it, and ask for the specific answer you are looking for, instead of taking two sentences I wrote and assuming one had to do with the other. They didn't. You asked two questions that had two different answers, and I answered them individually.

I am well aware of all those terms, what they mean, and how to use them.

I still don't know what point you are trying to make, and what you are saying I am wrong about.

If you are not saying I am wrong about anything, then what are we arguing about?

If I don't explain things the exact way necessary for someone to understand, oh well. But if it's right, it's right, and instead of attacking the approach, add to it.

 
I am well aware of all those terms, what they mean, and how to use them.
You also said that you were well aware of how probability works, and then when I posted a couple of probability problems, you got them wrong. Should we do a few more to see if that was a fluke or something?

You can't just say you're well aware of how to use all those words, when your posts indicate that there's a good chance you're not.

I could claim I'm well aware of medical terms, but would you believe me if I went around saying, "Based on your symptoms, I'm prescribing you a headache. We'll have to surgery your diagnosis." Yeah, those are all words that a doctor might use, and if you didn't know any better it might sound like I know what I'm talking about, but if you look up what those words actually mean, it's obvious that I don't.

If I don't explain things the exact way necessary for someone to understand, oh well.
That's a tragically low standard you've set for yourself.

But if it's right, its right
"Hmm, you have a migraine? You need to stitches your ulna hypodermic. That makes no sense? Oh well, whatevsss............I'm right. If you don't believe me go open a doctor book."

What if it isn't right? Then what do we do?

 
So back on topic, tanking is ok when it's your ONLY chance to make the playoffs, right?

Can you describe a situation where you have a 0% chance of making the playoffs if you don't tank?

 
So back on topic, tanking is ok when it's your ONLY chance to make the playoffs, right?

Can you describe a situation where you have a 0% chance of making the playoffs if you don't tank?
I was wondering what the scenario might be for that to be true. It's clear you could tank to keep someone else out of the play-offs but when is a loss more likely to secure a play-off berth than a win? I imagine it would take some odd tie-breaker situation but seems quite unlikely.

 
So back on topic, tanking is ok when it's your ONLY chance to make the playoffs, right?

Can you describe a situation where you have a 0% chance of making the playoffs if you don't tank?
I was wondering what the scenario might be for that to be true. It's clear you could tank to keep someone else out of the play-offs but when is a loss more likely to secure a play-off berth than a win? I imagine it would take some odd tie-breaker situation but seems quite unlikely.
The scenario's already been mentioned (total points winner automatically gets a playoff spot). But the point is that even if you're in that situation (a win knocks you out of the playoffs), you don't have a 0% chance to make the playoffs if you don't tank. You could not tank and still lose your game.

 
So back on topic, tanking is ok when it's your ONLY chance to make the playoffs, right?

Can you describe a situation where you have a 0% chance of making the playoffs if you don't tank?
I was wondering what the scenario might be for that to be true. It's clear you could tank to keep someone else out of the play-offs but when is a loss more likely to secure a play-off berth than a win? I imagine it would take some odd tie-breaker situation but seems quite unlikely.
The scenario's already been mentioned (total points winner automatically gets a playoff spot). But the point is that even if you're in that situation (a win knocks you out of the playoffs), you don't have a 0% chance to make the playoffs if you don't tank. You could not tank and still lose your game.
Could you expand on the other aspects of the situation? Not lazy but the thread is now 26 pages.

 
So back on topic, tanking is ok when it's your ONLY chance to make the playoffs, right?

Can you describe a situation where you have a 0% chance of making the playoffs if you don't tank?
I was wondering what the scenario might be for that to be true. It's clear you could tank to keep someone else out of the play-offs but when is a loss more likely to secure a play-off berth than a win? I imagine it would take some odd tie-breaker situation but seems quite unlikely.
The scenario's already been mentioned (total points winner automatically gets a playoff spot). But the point is that even if you're in that situation (a win knocks you out of the playoffs), you don't have a 0% chance to make the playoffs if you don't tank. You could not tank and still lose your game.
Could you expand on the other aspects of the situation? Not lazy but the thread is now 26 pages.
Any scenario where there are two different criteria to make the playoffs (typically, when the points leader is guaranteed a spot) can set up a situation where a team needs to lose. You have to let the points leader get one of the normal playoff spots, instead of bumping off the last wildcard (you).

 
I never said the odds don't change. I said that the original probability of selecting a king didn't change.
"It used to be 1/13. Now it's 1/2, but that doesn't change the fact that it used to be 1/13." That's some really insightful stuff there.
In regards to the Monty Hall Paradox, smarter mathematicians than you have analyzed and debated it so please spare us that you have the definitive answer to it.
There's not really any debate about how to resolve the paradox. As long as you're clear about the assumptions of the problem, the solution is easy. The trick is in clarifying the assumptions, which is the point - calculating probabilities is largely dependent on what we know (and don't know), and how we learned it, and related concepts.
But please, share with all of us your thesis on the Monty Hall Paradox.
Well, if I answered it the way you approached CalBear's scenario, I guess I'd have to say, "The probability that you'd win the car used to be 1/3. Then some stuff changed, but it still used to be 1/3. Just your 'predictive confidence' changed." Do I have that right?
The odds changed but the original probability did not. When you selected the card there were 52 possible outcomes. Flipping the cards did not change the fact that your original selection was made from 52 cards. Your probability only changes if you predict each card before you draw from the deck.
So will you give me 10:1 odds on the down card being an ace? Because the "original probability" was 13:1.

If you won't give me those odds, you might be starting to understand why the original probability is meaningless.
This is true only if you are placing a new bet based on the new odds but your "bet" or "declaration" was based on the probability of drawing a King from a 52 card deck. Drawing the cards after the fact is only serving to confirm your original selection. You could simply turn the original card over to prove it.

If the original probability is meaningless then Vegas would retroactively adjust your payouts according to current odds and not the odds of your original bet or declaration.

Probability is based on the total number of possible outcomes. In this case there are 52 possible outcomes. When you drew the original card you drew one of 52 possible outcomes. Proceeding to draw the other cards does not change the actual outcome of the drawn card, it only represents the odds that what you claim you drew from the deck is actually correct.

Odds are based on the number of positive or negative outcomes based on the remaining chances you are allotted. In this case, the odds simply represent the chance that your decision is actually correct, but they do not affect the probability of the original bet or decision, for the simple fact that you are not making a new declaration or bet of what the drawn card is. If 3 kings get immediately drawn you may no longer like your remaining odds that you drew a King but it didn't change the probability that you did in fact draw a King.

Probability and Odds represent chance in slightly different ways, similar to the differences in how mean, median and mode represent averages slightly differently.
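Since the Monty Hall problem was raised in the exchange above but never actually worked through, here is a minimal Python simulation sketch of the standard version of the puzzle. The assumptions (the host always opens a losing door you didn't pick and always offers the switch) are the usual textbook ones, not anything either poster specified:

    import random

    def monty_hall_trial(switch):
        # The car is behind one of three doors; the player picks a door at random.
        car = random.randrange(3)
        pick = random.randrange(3)
        # The host opens a door that is neither the player's pick nor the car.
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        return pick == car

    trials = 100_000
    for switch in (False, True):
        wins = sum(monty_hall_trial(switch) for _ in range(trials))
        print(f"switch={switch}: win rate ~ {wins / trials:.3f}")

Staying wins about 1/3 of the time and switching about 2/3, which is exactly the point about how the probability depends on what you know and how you learned it.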

 
Could you expand on the other aspects of the situation? Not lazy but the thread is now 26 pages.
Here's an example from earlier in the thread:

Top 3 seeds are awarded based on record, 4th seed is given to the remaining team with the most total points.

Team A is 11-2 1500 points

Team B is 10-3, 1450 points

Team C is 8-5, 1400 points

Team D is 8-5, 950 points

Team E is 7-6, 1200 points

Team C owns the tiebreaker over Team D

Team E plays Team C in the final week of the regular season.

If Team E loses, Team C gets the 3 seed and Team E gets the 4 seed.

There are lots of variations of this that can happen, and they're even more prevalent when you introduce things like Divisions, H2H tiebreakers, etc.
As CalBear noted, the gist of any of these scenarios is that there are two or more different ways to earn a playoff spot (e.g. by record, by winning your division, by total points, by H2H tiebreakers, etc.) The situation arises when you're playing a team that owns one of those tiebreakers over you - but you can avoid that tiebreaker by giving them a win so they win one of the other available playoff spots.
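To make the "a win can knock you out" pattern concrete, here is a rough Python sketch. The team names, records, points, and the points-as-record-tiebreaker rule are all hypothetical, not anyone's actual league settings; the example just assumes three playoff spots, with the top two records in and the highest-scoring remaining team taking the last spot:

    # (team, wins, losses, total points) after the final week, under two outcomes.
    def seeds(standings, record_spots=2):
        # Record-based spots first (using total points as an assumed tiebreaker),
        # then one wildcard to the highest-scoring team left over.
        by_record = sorted(standings, key=lambda t: (t[1], t[3]), reverse=True)
        locked = by_record[:record_spots]
        rest = [t for t in standings if t not in locked]
        wildcard = max(rest, key=lambda t: t[3])
        return [t[0] for t in locked] + [wildcard[0]]

    # "Rival" is the high-scoring team you play in the last week.
    if_you_lose = [("Alpha", 11, 3, 1000), ("Bravo", 10, 4, 900),
                   ("Rival", 10, 4, 1500), ("You",    8, 6, 1300)]
    if_you_win  = [("Alpha", 11, 3, 1000), ("Bravo", 10, 4, 900),
                   ("Rival",  9, 5, 1500), ("You",    9, 5, 1300)]

    print("If you lose:", seeds(if_you_lose))   # ['Alpha', 'Rival', 'You']
    print("If you win: ", seeds(if_you_win))    # ['Alpha', 'Bravo', 'Rival']

Losing hands Rival a record-based spot and leaves the points wildcard for you; winning drops Rival into the wildcard pool, where their point total bumps you out.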

 
Probability is based on the total number of possible outcomes. In this case there are 52 possible outcomes.
Not after you've flipped over 44 other cards, there aren't. That's why the probability changes.

If 3 kings get immediately drawn you may no longer like your remaining odds that you drew a King but it didn't change the probability that you did in fact draw a King.
Yes it does. I understand that you're redefining "probability" to mean something else, but that's silly. Your definition of "probability" isn't the same one that mathematicians use.

As I noted earlier, your point is basically, "The probability that the card is a King used to be 1/13. Now things have changed, but it still used to be 1/13." Well... ok. That's not exactly a useful insight.

Probability and Odds represent chance in slightly different ways, similar to the differences in how mean, median and mode represent averages slightly differently.
Probability and odds, mathematically, are the same thing. It's not like mean, median, and mode which are related, but different concepts that have different values. The relationship between probability and odds is more like the relationship between 1/2 and 0.5 - they're two different ways of representing the same exact value. Odds are calculated from the probability and vice versa. You can't change one without changing the other.

Gambling odds are something different, but CalBear's card scenario never involved you wagering on the outcome with a bookmaker.
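For what it's worth, the card disagreement is easy to check by brute force. Here is a minimal Python simulation sketch of one concrete version of the scenario (one card set aside face down, three more cards revealed, and we only keep the deals where all three revealed cards are kings; those details are my own framing of the example being argued about):

    import random

    DECK = [rank for rank in range(13) for _ in range(4)]   # rank 0 stands in for "king"

    trials, conditioned, hits = 2_000_000, 0, 0
    for _ in range(trials):
        face_down, *revealed = random.sample(DECK, 4)       # 1 card face down, 3 revealed
        if revealed.count(0) == 3:                          # keep only deals showing 3 kings
            conditioned += 1
            hits += (face_down == 0)

    print("P(face-down card is a king), no information: 1/13 ~", round(4 / 52, 4))
    print("P(face-down card is a king | 3 kings revealed) ~",
          round(hits / max(conditioned, 1), 4), "from", conditioned, "qualifying deals")

The conditioning event is rare, so the second number is noisy, but it comes out near 1/49 (about 0.02) rather than 1/13 (about 0.077): once three kings are showing, only one king is left among the 49 unseen cards, and the face-down card is equally likely to be any of them.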

 
Probability is based on the total number of possible outcomes. In this case there are 52 possible outcomes.
Not after you've flipped over 44 other cards, there aren't. That's why the probability changes.
If 3 kings get immediately drawn you may no longer like your remaining odds that you drew a King but it didn't change the probability that you did in fact draw a King.
Yes it does. I understand that you're redefining "probability" to mean something else, but that's silly. Your definition of "probability" isn't the same one that mathematicians use. As I noted earlier, your point is basically, "The probability that the card is a King used to be 1/13. Now things have changed, but it still used to be 1/13." Well... ok. That's not exactly a useful insight.
Probability and Odds represent chance in slightly different ways, similar to the differences in how mean, median and mode represent averages slightly differently.
Probability and odds, mathematically, are the same thing. It's not like mean, median, and mode which are related, but different concepts that have different values. The relationship between probability and odds is more like the relationship between 1/2 and 0.5 - they're two different ways of representing the same exact value. Odds are calculated from the probability and vice versa. You can't change one without changing the other.Gambling odds are something different, but CalBear's card scenario never involved you wagering on the outcome with a bookmaker.
A raffle has 400 tickets. One ticket will be drawn, with a probability of 1/400.

You can buy two tickets, increasing your odds of having a winning ticket, but each ticket still has a 1/400 chance of being drawn.

The probability states what the conditions are: 400 tickets, one winning ticket. Odds are a ratio that reflects a relationship based on either static or variable conditions. The key difference is that odds calculate "chances for in relation to chances against". Buying two tickets provides two chances for, which improves your odds to 2:398, but the probability of either ticket being drawn remains the same at 1/400.

You can improve your odds without changing the probability of a winning ticket.

The only way to increase the probability of a winning ticket is to change the conditions so that two tickets will be drawn, altering the probability to 2/400.

 
You're hung up on this mistaken idea that there's a meaningful difference between "probability" and "odds." I already told you they're the same thing, like 1/2 and 0.5 are the same thing.

A raffle has 400 tickets. One ticket will be drawn, with a probability of 1/400.

You can buy two tickets, increasing your odds of having a winning ticket, but each ticket still has a 1/400 chance of being drawn.
Right, that's not a contradiction. The probability (and the odds) that an individual ticket will be a winner hasn't changed, because buying a second ticket hasn't given you any additional information about which of the 400 tickets will be the winner. The probability (and the odds) that you are holding the winning ticket has now doubled, because you now have twice as many tickets.

The probability states what the conditions are: 400 tickets, one winning ticket. Odds are a ratio that reflects a relationship based on either static or variable conditions. The key difference is that odds calculate "chances for in relation to chances against". Buying two tickets provides two chances for, which improves your odds to 2:398, but the probability of either ticket being drawn remains the same at 1/400.
Once again, you're mixing two different events.

A: An individual ticket is the winner.

B: You're holding the winning ticket.

Neither the probability nor the odds of A have changed. Both the probability and odds of B have changed once you bought a second ticket.

I'm not sure how much simpler to put it - probability and odds are just two different ways of writing the same exact value. You can't change one without the other. You're comparing the odds of one event to the probability of a different event, and then claiming that the odds have changed but the probability hasn't. This kind of thing is a common mistake for people with little to no background in probability, but it's still wrong.
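A quick sketch of the same two events with exact fractions, using nothing beyond the numbers already in the posts above (400 tickets, 1 winner drawn, you holding 2 tickets):

    from fractions import Fraction

    N, winners, your_tickets = 400, 1, 2

    # Event A: one particular ticket is the winner.
    p_a = Fraction(winners, N)                         # 1/400
    odds_a = (winners, N - winners)                    # 1:399

    # Event B: you are holding the winning ticket (you bought two tickets).
    p_b = Fraction(your_tickets, N)                    # 2/400 = 1/200
    odds_b = (your_tickets, N - your_tickets)          # 2:398, i.e. 1:199

    print("Event A:", p_a, "odds", f"{odds_a[0]}:{odds_a[1]}")
    print("Event B:", p_b, "odds", f"{odds_b[0]}:{odds_b[1]}")

Buying the second ticket left both numbers for event A alone and changed both numbers for event B; for either event, the probability and the odds move together or not at all.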

 
Ignoratio Elenchi said:
You're hung up on this mistaken idea that there's a meaningful difference between "probability" and "odds." I already told you they're the same thing, like 1/2 and 0.5 are the same thing.

TheStig said:
A raffle has 400 tickets. One ticket will be drawn, with a probability of 1/400. You can buy two tickets, increasing your odds of having a winning ticket, but each ticket still has a 1/400 chance of being drawn.
Right, that's not a contradiction. The probability (and the odds) that an individual ticket will be a winner hasn't changed, because buying a second ticket hasn't given you any additional information about which of the 400 tickets will be the winner. The probability (and the odds) that you are holding the winning ticket has now doubled, because you now have twice as many tickets.
The probability states what the conditions are: 400 tickets, one winning ticket. Odds are a ratio that reflects a relationship based on either static or variable conditions. The key difference is that odds calculate "chances for in relation to chances against". Buying two tickets provides two chances for, which improves your odds to 2:398, but the probability of either ticket being drawn remains the same at 1/400.
Once again, you're mixing two different events.

A: An individual ticket is the winner.

B: You're holding the winning ticket.

Neither the probability nor the odds of A have changed. Both the probability and odds of B have changed once you bought a second ticket.

I'm not sure how much simpler to put it - probability and odds are just two different ways of writing the same exact value. You can't change one without the other. You're comparing the odds of one event to the probability of a different event, and then claiming that the odds have changed but the probability hasn't. This kind of thing is a common mistake for people with little to no background in probability, but it's still wrong.
You seem to be hung up on the idea that the additional information changed which ticket was bought or which card was drawn.

 
This thread is so long that I don't remember exactly what it was, but I remember "TheStig" failed badly on an earlier argument, to the point where everyone could see he was wrong (I think even himself) yet he wouldn't acquiesce. I think it's happening again, but he seems to be warping the discussion a bit so he can still be correct in some way.

It's just a different road down the same hill, I believe.

 
This thread is so long that I don't remember exactly what it was, but I remember "TheStig" failed badly on an earlier argument, to the point where everyone could see he was wrong (I think even himself) yet he wouldn't acquiesce. I think it's happening again, but he seems to be warping the discussion a bit so he can still be correct in some way.

It's just a different road down the same hill, I believe.
True.

I'm out

 
This thread is so long that I don't remember exactly what it was, but I remember "TheStig" failed badly on an earlier argument, to the point where everyone could see he was wrong (I think even himself) yet he wouldn't acquiesce. I think it's happening again, but he seems to be warping the discussion a bit so he can still be correct in some way.

It's just a different road down the same hill, I believe.
True.

I'm out
If this is said in earnest, then I can respect that.

:thumbup:

 
This thread is so long that I don't remember exactly what it was, but I remember "TheStig" failed badly on an earlier argument, to the point where everyone could see he was wrong (I think even himself) yet he wouldn't acquiesce. I think it's happening again, but he seems to be warping the discussion a bit so he can still be correct in some way.

It's just a different road down the same hill, I believe.
True.
:thumbup:

You could stick around and try to learn how this stuff actually works. :shrug:

 
This thread is so long that I don't remember exactly what it was, but I remember "TheStig" failed badly on an earlier argument, to the point where everyone could see he was wrong (I think even himself) yet he wouldn't acquiesce. I think it's happening again, but he seems to be warping the discussion a bit so he can still be correct in some way.

It's just a different road down the same hill, I believe.
True.
:thumbup:

You could stick around and try to learn how this stuff actually works. :shrug:
You're needling, IE.

Do you know how hard it was to get to this point of admission?

Let the man walk away. ;)

 
This thread is so long that I don't remember exactly what it was, but I remember "TheStig" failed badly on an earlier argument, to the point where everyone could see he was wrong (I think even himself) yet he wouldn't acquiesce. I think it's happening again, but he seems to be warping the discussion a bit so he can still be correct in some way.

It's just a different road down the same hill, I believe.
True.
:thumbup:

You could stick around and try to learn how this stuff actually works. :shrug:
You're needling, IE. Do you know how hard it was to get to this point of admission?

Let the man walk away. ;)
I admit I'm not above that kind of needling, but in this case I was honestly being serious. I went to the trouble to explain the mistakes he was making with his raffle ticket example, I'd be happy to continue explaining things he might be confused about. I'd rather live in a universe where TheStig learns how probability works, than a universe where TheStig disappears, never figures out why he was wrong and just repeats the same mistakes again. Getting him to admit he was wrong isn't the endgame, that's just the first step.

 
This thread is so long that I don't remember exactly what it was, but I remember "TheStig" failed badly on an earlier argument, to the point where everyone could see he was wrong (I think even himself) yet he wouldn't acquiesce. I think it's happening again, but he seems to be warping the discussion a bit so he can still be correct in some way.

It's just a different road down the same hill, I believe.
True.
:thumbup:

You could stick around and try to learn how this stuff actually works. :shrug:
You're needling, IE. Do you know how hard it was to get to this point of admission?

Let the man walk away. ;)
I admit I'm not above that kind of needling, but in this case I was honestly being serious. I went to the trouble to explain the mistakes he was making with his raffle ticket example, I'd be happy to continue explaining things he might be confused about. I'd rather live in a universe where TheStig learns how probability works, than a universe where TheStig disappears, never figures out why he was wrong and just repeats the same mistakes again. Getting him to admit he was wrong isn't the endgame, that's just the first step.
I'm not sure he noticed you lay the board down and place the pieces while the discussion was going on.

I'm sure he will come back with any questions if he so desires. I believe his endgame might be different than yours.

 
Probability is based on the total number of possible outcomes. In this case there are 52 possible outcomes.
Not after you've flipped over 44 other cards, there aren't. That's why the probability changes.
If 3 kings get immediately drawn you may no longer like your remaining odds that you drew a King but it didn't change the probability that you did in fact draw a King.
Yes it does. I understand that you're redefining "probability" to mean something else, but that's silly. Your definition of "probability" isn't the same one that mathematicians use. As I noted earlier, your point is basically, "The probability that the card is a King used to be 1/13. Now things have changed, but it still used to be 1/13." Well... ok. That's not exactly a useful insight.
Probability and Odds represent chance in slightly different ways, similar to the differences in how mean, median and mode represent averages slightly differently.
Probability and odds, mathematically, are the same thing. It's not like mean, median, and mode which are related, but different concepts that have different values. The relationship between probability and odds is more like the relationship between 1/2 and 0.5 - they're two different ways of representing the same exact value. Odds are calculated from the probability and vice versa. You can't change one without changing the other.Gambling odds are something different, but CalBear's card scenario never involved you wagering on the outcome with a bookmaker.
You are simply wrong on this point. And even if others agree with you it doesn't make you right. Probability and Odds are two different things that are calculated differently. Odds are a ratio of the probability.

Your problem, Ignoratio, is that you suffer from the Dunning–Kruger effect. I know when I'm wrong; your arrogance blinds you when you are wrong.

I have only claimed that "Odds" and "Probability" are not only defined differently but that they are calculated differently. EVERYONE in this forum has used the term interchangeably as the same thing and this has led to lots of misinformation.

I have not defined "Odds" and "Probability", I have used their PROPER definitions and meanings as a simple point of differentiation.

You may pick my examples apart as incorrect and I will not debate that, as they failed to serve my point, which is correct regardless of the examples I used.

My analogy to Mean, Median and Mode holds as all three represent in different ways what the most common number among a group of numbers is. They each have different uses yet they represent similar yet not equal values based on the same sample of numbers.

But I am done trying to explain the simple difference and misuse based on a common misunderstanding of the two terms; I'll let the Journal of Statistical Education explain it for me instead. The author's name is provided so that you may continue your argument with him.

http://www.amstat.org/publications/jse/v20n3/fulton.pdf

The problem with the "road down the same hill" is that we both started on the top of the same hill and gravity always wins.

 
Poor Stig.

Probability and Odds are two different things that are calculated differently. Odds are a ratio of the probability.
This doesn't contradict what I said. On the contrary, you're proving my point. Because odds are a ratio of the probability, how do the odds change if the probability doesn't change?

 
Poor Stig.

Probability and Odds are two different things that are calculated differently. Odds are a ratio of the probability.
This doesn't contradict what I said. On the contrary, you're proving my point. Because odds are a ratio of the probability, how do the odds change if the probability doesn't change?
Ummm, no. Your point that I addressed was bolded. But you are at least adhering to your Latin username.

 
Ignoratio Elenchi said:
Poor Stig.

TheStig said:
Probability and Odds are two different things that are calculated differently. Odds are a ratio of the probability.
This doesn't contradict what I said. On the contrary, you're proving my point. Because odds are a ratio of the probability, how do the odds change if the probability doesn't change?
Ok, I'm not a math major, so I'm going through that paper Stig linked to on the fly - just looking at the formulas and making assumptions from just eyeballing them and using basic problem solving. Most are pretty simple to unravel - others would take more time (which I'm not going to spend).

IE, I think Stig might be right. IF that paper he linked to is correct, then odds and probability are not the same (if I'm looking at those formulas and explanations correctly - which I might not be).

Do I think Stig knew he was right before he Googled and breathed a sigh of relief when that item popped up? No. I think the tail wagged the dog, here. But he might have stumbled upon something and proven you wrong.

According to the paper he linked - your "odds" and your "probability" will be one unit different from each other.

This line, about 1/2 of the way down the paper, is an extreme simplification of the formulas listed, but I think it is the most simple breakdown of the entire paper. (It is quoting a Lotto ticket and the way they listed the "odds" of winning)

Therefore, each of these permutations of 6 numbers has a probability of 1 in 18,595,558,800; but

the odds in favor of choosing any of these permutations is 1 on 18,595,558,799.
As you can see, the odds and probability are NOT mathematically the same, as you said in this quote earlier:

IE's quote:

Probability and odds, mathematically, are the same thing.
I quite enjoy following your posts, IE, so don't take this the wrong way - you're one of my favs. But...

I am getting a huge kick out of thinking that "TheStig" might have painted you into a corner here, even if it was accidental. :lol:

 
TheStig said:
Your point that I addressed was bolded.
TheStig said:
Yes it does. I understand that you're redefining "probability" to mean something else, but that's silly. Your definition of "probability" isn't the same one that mathematicians use.
You didn't address this. The way you've used the term "probability" is not consistent with the way a mathematician would use the term.

 
IE, I think Stig might be right. IF that paper he linked to is correct, then odds and probability are not the same.
They're "not the same" in the way that 1/2 and 0.5 are "not the same." They're just different ways of representing the same value.

The point is that Stig believes that odds and probability are things independent of one another. That the odds can change while the probability stays the same. This is not the case. The odds are calculated based on the probability and vice versa.

As you can see, the odds and probability are NOT mathematically the same
They're not written the same way, just like 1/2 and 0.5 are not written the same way. They are mathematically the same, just like 1/2 and 0.5 are mathematically the same.

The "difference" between probability and odds is just a rearrangement of variables. They're both strictly determined by the number of successful outcomes in the sample space.

 
IE, I think Stig might be right. IF that paper he linked to is correct, then odds and probability are not the same.
They're "not the same" in the way that 1/2 and 0.5 are "not the same." They're just different ways of representing the same value.

The point is that Stig believes that odds and probability are things independent of one another. That the odds can change while the probability stays the same. This is not the case. The odds are calculated based on the probability and vice versa.

As you can see, the odds and probability are NOT mathematically the same
They're not written the same way, just like 1/2 and 0.5 are not written the same way. They are mathematically the same, just like 1/2 and 0.5 are mathematically the same.

The "difference" between probability and odds is just a rearrangement of variables. They're both strictly determined by the number of successful outcomes in the sample space.
If you view that paper as correct, then they are most definitely not "mathematically" the same. You are looking at the same thing - agreed. But you are calculating two different things.

I believe the definition of "mathematically the same" means the numbers are precisely the same, exactly the same. According to the paper, they are not.

The paper is saying that when you use "odds" and "probability" to measure something, that you are measuring two different things - which is why the numbers aren't mathematically exact.

IE, you are excellent at doing the dance to make the stats say what you want them to say. Maybe we should call this new dance "TheStig".

 
If you view that paper as correct, then they are most definitely not "mathematically" the same. You are looking at the same thing - agreed. But you are calculating two different things.
You're representing the same quantity in two different ways. They are mathematically the same. I'm not saying the numeric portions of their representations are equal.

Perhaps a better example will help illustrate, since I probably introduced some unintended confusion with the 1/2 -> 0.5 thing.

2 gallons of milk is the "same" as 8 quarts of milk. Superficially, someone might say, "You're wrong, 2 isn't equal to 8!" Well of course not - you can't just look at the numerical portion, you have to look at the whole thing. A probability of 1/100 is the same as odds in favor of 1:99. 100 isn't equal to 99, but 1/100 probability is equal to odds in favor of 1:99. More importantly, discrete probabilities and odds form a bijection - for every discrete probability, there is one and only one odds value, and vice versa.

Stig's mistake since the beginning is claiming that when you change the sample space of a problem, the odds change but the probability doesn't change. This is impossible. Going back to the milk example, imagine you have 2 gallons of milk (which is also 8 quarts of milk). Then you add another gallon. What Stig is basically saying is that you now have 3 gallons of milk, but you still only have 8 quarts of milk.

 
IE, I think Stig might be right. IF that paper he linked to is correct, then odds and probability are not the same.
They're "not the same" in the way that 1/2 and 0.5 are "not the same." They're just different ways of representing the same value.

The point is that Stig believes that odds and probability are things independent of one another. That the odds can change while the probability stays the same. This is not the case. The odds are calculated based on the probability and vice versa.

As you can see, the odds and probability are NOT mathematically the same
They're not written the same way, just like 1/2 and 0.5 are not written the same way. They are mathematically the same, just like 1/2 and 0.5 are mathematically the same.

The "difference" between probability and odds is just a rearrangement of variables. They're both strictly determined by the number of successful outcomes in the sample space.
If you view that paper as correct, then they are most definitely not "mathematically" the same. You are looking at the same thing - agreed. But you are calculating two different things.

I believe the definition of "mathematically" the same, means the numbers are precisely the same, exactly the same. According to the paper, they are not.

The paper is saying that when you use "odds" and "probability" to measure something, that you are measuring two different things - which is why the numbers aren't mathematically exact.

IE, you are excellent at doing the dance to make the stats say what you want them to say. Maybe we should call this new dance "TheStig".
My takeaways from that article:

1. It's terrible, and I counted a few factual errors.

2. It does present cases where odds were slightly miscalculated. The ones shown have the right side of the odds ratio including all outcomes rather than just all complementary outcomes.

3. It proves IE's point. Converting from odds to probabilities and back is a matter of rearranging terms.

Probability = n / N (in English, probability is the desired outcome, n, divided by all possible outcomes, N)

Odds = n : N-n (in English, odds are the ratio of the desired outcome, n, to all other outcomes, N-n)
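For anyone who wants the "rearranging terms" spelled out, here is a tiny sketch of the two conversions, using exact fractions so nothing is lost to rounding. The function names are just illustrative:

    from fractions import Fraction

    def probability_to_odds(p):
        # p = n/N  ->  odds in favor = n : (N - n), carried here as the ratio n/(N - n)
        return p / (1 - p)

    def odds_to_probability(o):
        # odds in favor o = n/(N - n)  ->  p = n/N
        return o / (1 + o)

    for n, N in [(4, 52), (1, 400), (1, 2)]:
        p = Fraction(n, N)
        o = probability_to_odds(p)
        print(f"p = {p}  <->  odds in favor = {o.numerator}:{o.denominator}",
              "| round trip ok:", odds_to_probability(o) == p)

Every probability maps to exactly one odds value and back again, which is the sense in which the two are the same quantity written two different ways.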

 
If you view that paper as correct, then they are most definitely not "mathematically" the same. You are looking at the same thing - agreed. But you are calculating two different things.
You're representing the same quantity in two different ways. They are mathematically the same. I'm not saying the numeric portions of their representations are equal.

Perhaps a better example will help illustrate, since I probably introduced some unintended confusion with the 1/2 -> 0.5 thing.

2 gallons of milk is the "same" as 8 quarts of milk. Superficially, someone might say, "You're wrong, 2 isn't equal to 8!" Well of course not - you can't just look at the numerical portion, you have to look at the whole thing. A probability of 1/100 is the same as odds in favor of 1:99. 100 isn't equal to 99, but 1/100 probability is equal to odds in favor of 1:99. More importantly, discrete probabilities and odds form a bijection - for every discrete probability, there is one and only one odds value, and vice versa.

Stig's mistake since the beginning is claiming that when you change the sample space of a problem, the odds change but the probability doesn't change. This is impossible. Going back to the milk example, imagine you have 2 gallons of milk (which is also 8 quarts of milk). Then you add another gallon. What Stig is basically saying is that you now have 3 gallons of milk, but you still only have 8 quarts of milk.
I agree with what you say here. I just disagree with what we both say the paper is trying to point out.

I figured you would come back with an example similar to what you just posted. So, while I made some food, I was thinking of an example of how I could express how I saw the point. Here's what I came up with:

What the paper is trying to say, in my opinion, is that odds and probability are two entirely different measurements. I think they make this fairly obvious in their statements. To me, it is as if you have a grain of salt. You want to measure the chlorine in the salt. One time you measure the chlorine by its mass. You get a number. Then you measure again but this time you measure the chlorine by its volume. The chlorine stays the same, yet the way you measure it is different - producing two mathematically different answers.

In my humble opinion, this is what the mathematicians are trying to point out: odds and probability are not the same type of measurement and therefore will not produce mathematically similar answers in all (if any) instances.

So your milk example is a fine one, but for a different discussion. From what I read, they are not saying odds and probability are different ways of measuring the same thing - they are saying that odds and probability are measuring different things altogether.

I may be wrong here, in the fact that I might have misinterpreted the paper. Either way, I'm going to probably let this lie since I'm pretty sure that if we don't agree, it's because we view the conclusions of the paper in different ways.

Either way, fun discussion.

 
IE, I think Stig might be right. IF that paper he linked to is correct, then odds and probability are not the same.
They're "not the same" in the way that 1/2 and 0.5 are "not the same." They're just different ways of representing the same value.

The point is that Stig believes that odds and probability are things independent of one another. That the odds can change while the probability stays the same. This is not the case. The odds are calculated based on the probability and vice versa.

As you can see, the odds and probability are NOT mathematically the same
They're not written the same way, just like 1/2 and 0.5 are not written the same way. They are mathematically the same, just like 1/2 and 0.5 are mathematically the same.

The "difference" between probability and odds is just a rearrangement of variables. They're both strictly determined by the number of successful outcomes in the sample space.
If you view that paper as correct, then they are most definitely not "mathematically" the same. You are looking at the same thing - agreed. But you are calculating two different things. I believe the definition of "mathematically" the same means the numbers are precisely the same, exactly the same. According to the paper, they are not.

The paper is saying that when you use "odds" and "probability" to measure something, that you are measuring two different things - which is why the numbers aren't mathematically exact.

IE, you are excellent at doing the dance to make the stats say what you want them to say. Maybe we should call this new dance "TheStig".
My takeaways from that article:

1. It's terrible, and I counted a few factual errors.

2. It does present cases where odds were slightly miscalculated. The ones shown have the right side of the odds ratio including all outcomes rather than just all complementary outcomes.

3. It proves IE's point. Converting from odds to probabilities and back is a matter of rearranging terms.

Probability = n / N (in English, probability is the desired outcome, n, divided by all possible outcomes, N)

Odds = n : N-n (in English, odds are the ratio of the desired outcome, n, to all other outcomes, N-n)
I didn't read the entire thing, but it seems that what the article is basically saying is that people often use the word "odds" when they really mean "probability." They're right, it happens all the time (even in posted lotto probabilities, etc.), and it's something people have done right here in this thread many times.

For example, there have been many instances of someone saying something like, "Your odds of drawing a King are 1/13." That's incorrect. Your probability of drawing a King is 1/13. Your odds in favor of drawing a King are 1:12. I haven't made a big stink about it, because I recognize that when your average joe says, "your odds of..." they usually really mean, "the probability of."

 
IE, I think Stig might be right. IF that paper he linked to is correct, then odds and probability are not the same.
They're "not the same" in the way that 1/2 and 0.5 are "not the same." They're just different ways of representing the same value.

The point is that Stig believes that odds and probability are things independent of one another. That the odds can change while the probability stays the same. This is not the case. The odds are calculated based on the probability and vice versa.

As you can see, the odds and probability are NOT mathematically the same
They're not written the same way, just like 1/2 and 0.5 are not written the same way. They are mathematically the same, just like 1/2 and 0.5 are mathematically the same.

The "difference" between probability and odds is just a rearrangement of variables. They're both strictly determined by the number of successful outcomes in the sample space.
If you view that paper as correct, then they are most definitely not "mathematically" the same. You are looking at the same thing - agreed. But you are calculating two different things.

I believe the definition of "mathematically" the same, means the numbers are precisely the same, exactly the same. According to the paper, they are not.

The paper is saying that when you use "odds" and "probability" to measure something, that you are measuring two different things - which is why the numbers aren't mathematically exact.

IE, you are excellent at doing the dance to make the stats say what you want them to say. Maybe we should call this new dance "TheStig".
My takeaways from that article:

1. It's terrible, and I counted a few factual errors.

2. It does present cases where odds were slightly miscalculated. The ones shown have the right side of the odds ratio including all outcomes rather than just all complementary outcomes.

3. It proves IE's point. Converting from odds to probabilities and back is a matter of rearranging terms.

Probability = n / N (in English, probability is the desired outcome, n, divided by all possible outcomes, N)

Odds = n : N-n (in English, odds are the ratio of the desired outcome, n, to all other outcomes, N-n)
It could be terrible. You sound as if you're more astute on the subject than I am.

That being said, I do disagree on what the paper is trying to say. See my above post as to how I see it.

 
TheStig said:
I never said the odds don't change. I said that the original probability of selecting a king didn't change.
"It used to be 1/13. Now it's 1/2, but that doesn't change the fact that it used to be 1/13." That's some really insightful stuff there.
In regards to the Monty Hall Paradox, smarter mathematicians than you have analyzed and debated it so please spare us that you have the definitive answer to it.
There's not really any debate about how to resolve the paradox. As long as you're clear about the assumptions of the problem, the solution is easy. The trick is in clarifying the assumptions, which is the point - calculating probabilities is largely dependent on what we know (and don't know), and how we learned it, and related concepts.
But please, share with all of us your thesis on the Monty Hall Paradox.
Well, if I answered it the way you approached CalBear's scenario, I guess I'd have to say, "The probability that you'd win the car used to be 1/3. Then some stuff changed, but it still used to be 1/3. Just your 'predictive confidence' changed." Do I have that right?
The odds changed but the original probability did not. When you selected the card there were 52 possible outcomes. Flipping the cards did not change the fact that your original selection was made from 52 cards. Your probability only changes if you predict each card before you draw from the deck.
So will you give me 10:1 odds on the down card being an ace? Because the "original probability" was 13:1.

If you won't give me those odds, you might be starting to understand why the original probability is meaningless.
This is true only if you are placing a new bet based on the new odds but your "bet" or "declaration" was based on the probability of drawing a King from a 52 card deck. Drawing the cards after the fact is only serving to confirm your original selection. You could simply turn the original card over to prove it.

If the original probability is meaningless then Vegas would retroactively adjust your payouts according to current odds and not the odds of your original bet or declaration.

Probability is based on the total number of possible outcomes. In this case there are 52 possible outcomes. When you drew the original card you drew one of 52 possible outcomes. Proceeding to draw the other cards does not change the actual outcome of the drawn card, it only represents the odds that what you claim you drew from the deck is actually correct.

Odds are based on the number of positive or negative outcomes based on the remaining chances you are allotted. In this case, the odds simply represent the chance that your decision is actually correct, but they do not affect the probability of the original bet or decision, for the simple fact that you are not making a new declaration or bet of what the drawn card is. If 3 kings get immediately drawn you may no longer like your remaining odds that you drew a King but it didn't change the probability that you did in fact draw a King.

Probability and Odds represent chance in slightly different ways, similar to the differences in how mean, median and mode represent averages slightly differently.
Probability and Odds represent the same chance in slightly different ways.

Mean, median, and mode represent different things, and no one of them can be derived from either of the other two.

That is, if you tell me the odds but I really want to know the probability, then I can compute what I need from what you've given me.

If you give me the mean but I want to know the median, I'm hosed.
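A tiny illustration of that last point, with data sets chosen only to make the contrast obvious:

    from fractions import Fraction

    # Two data sets with the same mean but different medians: knowing one summary
    # statistic does not let you recover the other.
    a, b = [1, 2, 9], [3, 4, 5]
    mean = lambda xs: sum(xs) / len(xs)
    median = lambda xs: sorted(xs)[len(xs) // 2]
    print(mean(a), median(a), "|", mean(b), median(b))   # 4.0 2 | 4.0 4

    # Odds and probability, by contrast, determine each other exactly.
    p = Fraction(1, 13)
    odds = p / (1 - p)                  # 1/12
    print(odds, odds / (1 + odds))      # 1/12, and back to 1/13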

 
If you view that paper as correct, then they are most definitely not "mathematically" the same. You are looking at the same thing - agreed. But you are calculating two different things.
You're representing the same quantity in two different ways. They are mathematically the same. I'm not saying the numeric portions of their representations are equal.

Perhaps a better example will help illustrate, since I probably introduced some unintended confusion with the 1/2 -> 0.5 thing.

2 gallons of milk is the "same" as 8 quarts of milk. Superficially, someone might say, "You're wrong, 2 isn't equal to 8!" Well of course not - you can't just look at the numerical portion, you have to look at the whole thing. A probability of 1/100 is the same as odds in favor of 1:99. 100 isn't equal to 99, but 1/100 probability is equal to odds in favor of 1:99. More importantly, discrete probabilities and odds form a bijection - for every discrete probability, there is one and only one odds value, and vice versa.

Stig's mistake since the beginning is claiming that when you change the sample space of a problem, the odds change but the probability doesn't change. This is impossible. Going back to the milk example, imagine you have 2 gallons of milk (which is also 8 quarts of milk). Then you add another gallon. What Stig is basically saying is that you now have 3 gallons of milk, but you still only have 8 quarts of milk.
Opportunities Abound

Another key aspect to computing probability is factoring in the number of opportunities for something to occur. The more opportunities there are, the more likely it is that an event will occur. The more tickets a player buys or the more often a player buys them, the greater the player’s chances of winning. At the same time, the more tickets purchased, the greater the average expected loss. One thousand tickets means 1,000 opportunities to win, so that the chance of winning Lotto 6/49 goes from 1 in 14 million to 1 in 14,000. However, because the expected return is nearly always negative, the player will still lose money, on average, no matter how many tickets the player purchases (see “Playing Multiple Hands, Tickets or Bets” in “Games and Systems”). This is true whether the player buys several tickets for the same draw or one ticket for every draw. Adding more opportunities (e.g., more tickets, bingo cards or slot machines) increases a player’s chance of a win, but does not allow him/her to beat the odds.

http://www.problemgambling.ca/en/resourcesforprofessionals/pages/probabilityoddsandrandomchance.aspx

But the p-value for that effect is not the p-value for the differences in probabilities.

If you present a table of probabilities at different values of X, most research audiences will, at least in their minds, make those difference comparisons between the probabilities. They do this because they’ve been trained to do this in linear models.

These differences in probabilities don’t line up with the p-values in logistic regression models, though. And this can get quite confusing.

Second, when X, the predictor is continuous, the odds ratio is constant across values of X. But probabilities aren’t.

It works exactly the same way as interest rates. I can tell you that an annual interest rate is 8%. So at the end of the year, you’ll earn $8 if you invested $100, or $40 if you invested $500. The rate stays constant, but the actual amount earned differs based on the amount invested.

Odds ratios work the same. An odds ratio of 1.08 will give you an 8% increase in the odds at any value of X.

Likewise, the difference in the probability (or the odds) depends on the value of X.

http://www.theanalysisfactor.com/why-use-odds-ratios/
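The interest-rate analogy in that excerpt can be sanity-checked numerically. A small sketch - the 1.08 odds ratio and the baseline probabilities are just illustrative numbers, not estimates from any data:

    def apply_odds_ratio(p, odds_ratio):
        # Multiply the odds of probability p by odds_ratio and return the new probability.
        odds = p / (1 - p)
        new_odds = odds * odds_ratio
        return new_odds / (1 + new_odds)

    OR = 1.08
    for p in (0.10, 0.50, 0.90):
        p_new = apply_odds_ratio(p, OR)
        print(f"baseline p = {p:.2f} -> new p = {p_new:.4f} (difference {p_new - p:+.4f})")

The same 8% bump in the odds produces different-sized changes in the probability depending on the starting point, which is exactly why the odds ratio is the constant quantity in a logistic model while probability differences are not.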

 
IE, I think Stig might be right. IF that paper he linked to is correct, then odds and probability are not the same.
They're "not the same" in the way that 1/2 and 0.5 are "not the same." They're just different ways of representing the same value.

The point is that Stig believes that odds and probability are things independent of one another. That the odds can change while the probability stays the same. This is not the case. The odds are calculated based on the probability and vice versa.

As you can see, the odds and probability are NOT mathematically the same
They're not written the same way, just like 1/2 and 0.5 are not written the same way. They are mathematically the same, just like 1/2 and 0.5 are mathematically the same.

The "difference" between probability and odds is just a rearrangement of variables. They're both strictly determined by the number of successful outcomes in the sample space.
If you view that paper as correct, then they are most definitely not "mathematically" the same. You are looking at the same thing - agreed. But you are calculating two different things. I believe the definition of "mathematically" the same means the numbers are precisely the same, exactly the same. According to the paper, they are not.

The paper is saying that when you use "odds" and "probability" to measure something, that you are measuring two different things - which is why the numbers aren't mathematically exact.

IE, you are excellent at doing the dance to make the stats say what you want them to say. Maybe we should call this new dance "TheStig".
My takeaways from that article:

1. It's terrible, and I counted a few factual errors.

2. It does present cases where odds were slightly miscalculated. The ones shown have the right side of the odds ratio including all outcomes rather than just all complementary outcomes.

3. It proves IE's point. Converting from odds to probabilities and back is a matter of rearranging terms.

Probability = n / N (in English, probability is the desired outcome, n, divided by all possible outcomes, N)

Odds = n : N-n (in English, odds are the ratio of the desired outcome, n, to all other outcomes, N-n)
I didn't read the entire thing, but it seems that what the article is basically saying is that people often use the word "odds" when they really mean "probability." They're right, it happens all the time (even in posted lotto probabilities, etc.), and it's something people have done right here in this thread many times.

For example, there have been many instances of someone saying something like, "Your odds of drawing a King are 1/13." That's incorrect. Your probability of drawing a King is 1/13. Your odds in favor of drawing a King are 1:12. I haven't made a big stink about it, because I recognize that when your average joe says, "your odds of..." they usually really mean, "the probability of."
The bolded is interesting to me.

This could be where we are not seeing eye to eye - you being in the math world (an actuary, I believe? One of my best friends is one as well, so I know it's a math guy's job), and me not being in that world. When you say "mathematically the same," I take that to mean the numbers at the end of the equation will be the same. Yet in the bolded (above) the numbers are not the same, and you imply that this is still what you meant by mathematically the same.

Interesting. I may be disagreeing with you simply because I have a definition wrong. I'm using a layman's understanding of it - you, I think, are using the technical, more proper definition.

If so, apologies. I think you see what I was getting at, and now I think I understand your point as well.

I still might disagree on what the paper is trying to say, but I'm not versed enough in the field to even try to argue the point. I could very well be wrong. :)

 
Last edited by a moderator:
IE, I think Stig might be right. IF that paper he linked to is correct, then odds and probability are not the same.
They're "not the same" in the way that 1/2 and 0.5 are "not the same." They're just different ways of representing the same value.

The point is that Stig believes that odds and probability are things independent of one another. That the odds can change while the probability stays the same. This is not the case. The odds are calculated based on the probability and vice versa.

As you can see, the odds and probability are NOT mathematically the same
They're not written the same way, just like 1/2 and 0.5 are not written the same way. They are mathematically the same, just like 1/2 and 0.5 are mathematically the same.

The "difference" between probability and odds is just a rearrangement of variables. They're both strictly determined by the number of successful outcomes in the sample space.
If you view that paper as correct, then they are most definitely not "mathematically" the same. You are looking at the same thing - agreed. But you are calculating two different things.

I believe the definition of "mathematically the same" means the numbers are precisely the same, exactly the same. According to the paper, they are not.

The paper is saying that when you use "odds" and "probability" to measure something, you are measuring two different things - which is why the numbers aren't mathematically exact.

IE, you are excellent at doing the dance to make the stats say what you want them to say. Maybe we should call this new dance "TheStig".
My takeaways from that article:

1. It's terrible, and I counted a few factual errors.

2. It does present cases where odds were slightly miscalculated. The ones shown have the right side of the odds ratio including all outcomes rather than just all complementary outcomes.

3. It proves IE's point. Converting from odds to probabilities and back is a matter of rearranging terms.

Probability = n / N (in English, the probability is the number of desired outcomes, n, divided by the number of all possible outcomes, N)

Odds = n : N-n (in English, the odds are the ratio of the number of desired outcomes, n, to the number of all other outcomes, N-n)
It could be terrible. You sound as if you're more astute on the subject than I.

That being said, I do disagree on what the paper is trying to say. See my above post as to how I see it.
Eh, I don't think so.

On page 4, the authors illustrate:

P(A) = n(A) / N

odds in favor of A = n(A) : n(Ac)

"the odds in favor of A can be expressed as the ratio of the probability of event A and the probability of the complement of A"

N = n(A) + n(Ac) is given.

So the difference is only in the denominator: N for probability, and N-n(A) for odds.
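
And since each form determines the other, converting between them is just rearranging terms. Something like this, in Python, round-trips exactly:

def odds_in_favor(p):
    # odds in favor of A = P(A) / P(A complement)
    return p / (1 - p)

def prob_from_odds(o):
    # the same relationship rearranged: p = o / (1 + o)
    return o / (1 + o)

p = 4 / 52                                   # probability of drawing a King
o = odds_in_favor(p)                         # 1/12, i.e. odds of 1 : 12
assert abs(prob_from_odds(o) - p) < 1e-12    # converting back recovers the probability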

 
To me, it is as if you have a grain of salt. You want to measure the chlorine in the salt. One time you measure the chlorine by its mass. You get a number. Then you measure again, but this time you measure the chlorine by its volume. The chlorine stays the same, yet the way you measure it is different - producing two mathematically different answers.
This is no different than my milk example. As you said, the chlorine stays the same. The milk stays the same. The quantity that probability and odds represent stays the same. In each case, we just have two different ways of representing the same exact thing.

In your example, let's say you do the first measurement and find 2 grams of chlorine. You don't have to measure again, because you already know the volume (it's a function of the mass, plus density and temperature or whatever - it's been years since I've done any chemistry). If you add more chlorine, the mass changes and the volume changes correspondingly.

If you measure 2 gallons of milk, you don't have to measure again to figure out how many quarts you have. You already know. If you add more milk, the number of gallons changes and the number of quarts changes correspondingly.

If you measure a probability of 1/13, you don't have to measure again to figure out what the odds are. You already know. If you change the sample space of a problem by introducing new information (for example, by flipping over 44 cards), the probability changes and the odds change in lockstep.
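
If it helps to see the lockstep with actual numbers, here's a quick Bayes-style check in Python. It assumes the scenario is: draw one card face down from a full deck, then flip over 44 of the remaining 51 and none of them turns out to be a King.

from math import comb

p_king = 4 / 52                                           # 1/13 before anything is flipped

# Chance that 44 cards flipped from the remaining 51 show no King,
# depending on what the face-down card is:
no_king_seen_if_king     = comb(48, 44) / comb(51, 44)    # only 3 Kings left among the 51
no_king_seen_if_not_king = comb(47, 44) / comb(51, 44)    # all 4 Kings still among the 51

p_after = (p_king * no_king_seen_if_king) / (
    p_king * no_king_seen_if_king + (1 - p_king) * no_king_seen_if_not_king
)

print(p_after)                  # 0.5  -> probability moved from 1/13 to 1/2
print(p_after / (1 - p_after))  # 1.0  -> odds moved from 1:12 to 1:1, right along with it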

In my humble opinion, this is what the mathematicians are trying to point out: Odds and probability are not the same type of measurement and therefore will not produce mathematically similar answers in all (if any) instances.
That's not what they're saying. They're just saying that people frequently use the word "odds" when they really mean "probability." It would be like someone walking in with two gallons of milk and claiming they have 2 quarts of milk. It's not the numerical portion they got wrong, it's the units.

 
To me, it is as if you have a grain of salt. You want to measure the chlorine in the salt. One time you measure the chlorine by its mass. You get a number. Then you measure again, but this time you measure the chlorine by its volume. The chlorine stays the same, yet the way you measure it is different - producing two mathematically different answers.
This is no different than my milk example. As you said, the chlorine stays the same. The milk stays the same. The quantity that probability and odds represent stays the same. In each case, we just have two different ways of representing the same exact thing.

In your example, let's say you do the first measurement and find 2 grams of chlorine. You don't have to measure again, because you already know the volume (it's a function of the mass, plus density and temperature or whatever - it's been years since I've done any chemistry). If you add more chlorine, the mass changes and the volume changes correspondingly.

If you measure 2 gallons of milk, you don't have to measure again to figure out how many quarts you have. You already know. If you add more milk, the number of gallons changes and the number of quarts changes correspondingly.

If you measure a probability of 1/13, you don't have to measure again to figure out what the odds are. You already know. If you change the sample space of a problem by introducing new information (for example, by flipping over 44 cards), the probability changes and the odds change in lockstep.

In my humble opinion, this is what the mathematicians are trying to point out: Odds and probability are not the same type of measurement and therefore will not produce mathematically similar answers in all (if any) instances.
That's not what they're saying. They're just saying that people frequently use the word "odds" when they really mean "probability." It would be like someone walking in with two gallons of milk and claiming they have 2 quarts of milk. It's not the numerical portion they got wrong, it's the units.
I see your point (see my above post), yet I disagree that when you measure mass and volume it is the same comparison as measuring milk in quarts and gallons.

To clarify my point even more - let's say that instead of measuring mass and volume on the chlorine, you were instead measuring texture and density of the salt cube itself. This is an even more extreme example of what I figured the paper was trying to point out - that odds and probability were entirely two different things.

Now, the paper seems to be wrong (according to Dave) and so this invalidates whatever I was trying to say. My point may be wrong, I just wanted to clarify what I thought the paper was saying.

 
Last edited by a moderator:
I would pay someone money to be able to resist going back through the last couple pages here, but at some point I probably will. Odds are I will. :coffee:

 
To me, it is as if you have a grain of salt. You want to measure the chlorine in the salt. One time you measure the chlorine by its mass. You get a number. Then you measure again, but this time you measure the chlorine by its volume. The chlorine stays the same, yet the way you measure it is different - producing two mathematically different answers.
This is no different than my milk example. As you said, the chlorine stays the same. The milk stays the same. The quantity that probability and odds represent stays the same. In each case, we just have two different ways of representing the same exact thing.

In your example, let's say you do the first measurement and find 2 grams of chlorine. You don't have to measure again, because you already know the volume (it's a function of the mass, plus density and temperature or whatever - it's been years since I've done any chemistry). If you add more chlorine, the mass changes and the volume changes correspondingly.

If you measure 2 gallons of milk, you don't have to measure again to figure out how many quarts you have. You already know. If you add more milk, the number of gallons changes and the number of quarts changes correspondingly.

If you measure a probability of 1/13, you don't have to measure again to figure out what the odds are. You already know. If you change the sample space of a problem by introducing new information (for example, by flipping over 44 cards), the probability changes and the odds change in lockstep.

In my humble opinion, this is what the mathematicians are trying to point out: Odds and probability are not the same type of measurement and therefore will not produce mathematically similar answers in all (if any) instances.
That's not what they're saying. They're just saying that people frequently use the word "odds" when they really mean "probability." It would be like someone walking in with two gallons of milk and claiming they have 2 quarts of milk. It's not the numerical portion they got wrong, it's the units.
I see your point (see my above post), yet I disagree that when you measure mass and volume it is the same comparison as measuring milk in quarts and gallons.

To clarify my point even more - let's say that instead of measuring mass and volume on the chlorine, you were instead measuring texture and density of the salt cube itself. This is an even more extreme example of what I was trying to say about the paper - that odds and probability were entirely two different measurements.

Now, the paper seems to be wrong (according to Dave) and so this invalidates whatever I was trying to say. My point may be wrong, I just wanted to clarify what I thought the paper was saying.
There are some mistakes in the paper, but it's not wrong in its premise.

The probability of rolling a 6 is 1/6... one chance you do, out of 6 possible outcomes.

The odds of rolling a 6 are 1 : 5... one chance you do, and 5 you don't.

That's the distinction they seem to be highlighting.
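
A quick simulation shows the same two numbers falling out of one set of rolls (rough Python sketch):

import random

rolls = [random.randint(1, 6) for _ in range(60_000)]
sixes = rolls.count(6)

print(sixes / len(rolls))              # hovers around 1/6 -- the probability
print(sixes / (len(rolls) - sixes))    # hovers around 1/5 -- the odds in favor, as a single ratio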

 
So let's say you have the four Aces from a standard deck of playing cards. You shuffle them up thoroughly, such that every possible ordering of the four cards is equiprobable, and place the pile face down on the table.

  1. You take the top card off the pile and put it in your pocket. What is the probability that the card in your pocket is the Ace of Hearts? What are the odds in favor of it being the Ace of Hearts?
  2. You then flip over the next two cards and find that they're the Ace of Spades and the Ace of Clubs. What is the probability that the card in your pocket is the Ace of Hearts? What are the odds in favor of it being the Ace of Hearts?
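
(If you'd rather check your answers by brute force than reason them out, enumerating the 24 shuffles in Python settles both questions:)

from fractions import Fraction
from itertools import permutations

aces = ["hearts", "spades", "clubs", "diamonds"]
orderings = list(permutations(aces))          # all 24 equally likely shuffles

# 1. Before anything is revealed: how often is the pocketed (top) card the heart?
p1 = Fraction(sum(o[0] == "hearts" for o in orderings), len(orderings))

# 2. Keep only the shuffles whose next two cards are the spade and the club,
#    then ask the same question within that smaller set.
consistent = [o for o in orderings if set(o[1:3]) == {"spades", "clubs"}]
p2 = Fraction(sum(o[0] == "hearts" for o in consistent), len(consistent))

for p in (p1, p2):
    print(p, "probability; odds in favor of", p.numerator, ":", p.denominator - p.numerator)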
 
To me, it is as if you have a grain of salt. You want to measure the chlorine in the salt. One time you measure the chlorine by its mass. You get a number. Then you measure again, but this time you measure the chlorine by its volume. The chlorine stays the same, yet the way you measure it is different - producing two mathematically different answers.
This is no different than my milk example. As you said, the chlorine stays the same. The milk stays the same. The quantity that probability and odds represent stays the same. In each case, we just have two different ways of representing the same exact thing.

In your example, let's say you do the first measurement and find 2 grams of chlorine. You don't have to measure again, because you already know the volume (it's a function of the mass, plus density and temperature or whatever - it's been years since I've done any chemistry). If you add more chlorine, the mass changes and the volume changes correspondingly.

If you measure 2 gallons of milk, you don't have to measure again to figure out how many quarts you have. You already know. If you add more milk, the number of gallons changes and the number of quarts changes correspondingly.

If you measure a probability of 1/13, you don't have to measure again to figure out what the odds are. You already know. If you change the sample space of a problem by introducing new information (for example, by flipping over 44 cards), the probability changes and the odds change in lockstep.

In my humble opinion, this is what the mathematicians are trying to point out: Odds and probability are not the same type of measurement and therefore will not produce mathematically similar answers in all (if any) instances.
That's not what they're saying. They're just saying that people frequently use the word "odds" when they really mean "probability." It would be like someone walking in with two gallons of milk and claiming they have 2 quarts of milk. It's not the numerical portion they got wrong, it's the units.
No, you are wrong. Probability is based on the totality of events. Odds are based on a condition in relation to the probability.

It is a matter of tense. Odds are calculated as though something has already happened, and that information is used to predict the likely outcome of the act. Probability simply states the chance that a particular outcome will happen.

One measures the introduction of a variable or new information; the other does not.

Before you drew the card, your probability of drawing a King was 1/13. Continuing to draw cards does not change the probability that you drew a King. Drawing the cards is just a simple form of regression to prove which card you originally drew. All you are doing is going through the deck. Announcing the second and first runners-up at the Miss America Pageant did not change the probability of the winner in totality when they took the stage. You could prove the card by flipping it, or you could prove it by flipping the remaining cards first; neither act changes the probability of what was drawn.

But again, I refer you to the authors of the articles to debate your point.

 
No, you are wrong.
Whatever helps you sleep at night.

Probability is based on the totality of events. Odds are based on a condition in relation to the probability.
Can you link to the academic paper you quoted these definitions from?

It is a matter of tense. Odds are calculated as though something has already happened, and that information is used to predict the likely outcome of the act. Probability simply states the chance that a particular outcome will happen.

One measures the introduction of a variable or new information; the other does not.
No, you are wrong. (See, I can do that too!) Can you link to the academic paper that told you that odds measures the introduction of a variable or new information, while probability doesn't?

Continuing to draw cards does not change the probability that you drew a King.
It doesn't change the probability that you drew (past tense) a King. I've pointed out at least twice that we all agree the probability used to be 1/13, and even after looking at other cards, it still used to be 1/13 (probability isn't a magic time machine). It also doesn't change the odds that you drew (past tense) a King, since the odds are a direct function of the probability.

It does change the probability, and the odds, that the card you drew is (present tense) a King, because now we have more information with which to compute the probability (and the odds).

But again, I refer you to the authors of the articles to debate your point.
I haven't seen much to debate in any of the articles you posted. None of them appears to be saying what you wish they were saying.

 
Last edited by a moderator:
After looking at some of this page, odds are I will probably not take my chances on this. Seems like a losing battle where people are saying the same things in different ways. It's like watching Michael J Fox try and put the thread through the needle. I guess at some point it will match up.

 
