K^2 2,047 Posted May 29, 2007

Since debate is, at least in part, about formal logic, I think a small exercise is in order. I hope that everyone here is familiar with simple probability theory. If I flip a coin, there are two outcomes: Heads (H) or Tails (T), both equally likely. The possibility of the coin staying on its edge will be neglected. If you take two coins, there are four outcomes: HH, HT, TH, and TT, each now with 1/4 odds.

Now, imagine that I invite you to play a little game. I will flip two coins, tell you how one of them landed, and ask you to name the other one. You win if you guess correctly. For instance, if the coins landed in the combination TH, I might tell you, "One of these coins is Heads," and you would win by suggesting that the other is Tails.

What are your odds of winning? Well, there are still four possible ways the coins can fall: the same HH, HT, TH, and TT. By telling you that one of the coins is Tails, I eliminate the possibility of HH. If I tell you that one of the coins is Heads, you know that the combination is not TT. Either way, I eliminate one of the four possible combinations with the information I give you. Whether you are left with {HH, HT, TH} or {HT, TH, TT}, guessing that the second coin is the same gives you only 1/3 odds of winning, while guessing the opposite gives you 2/3. As such, you can win 2/3 of the time.

But can you? Your strategy for picking the coin is very simple: you merely name the state opposite to the one I tell you. Let's re-examine the four 2-coin states. Of them, two states have the coins landing the same, and two have them landing differently. The odds of the coins being the same are always 1/2. Before I even say anything to you, your strategy is doomed 1/2 of the time. There is no way your odds of winning are going above that 1/2. You are back down to random guessing.

The question is not which of the two answers is wrong. That should be obvious. If it isn't, try it with somebody.
The question is why the wrong answer is wrong. It was constructed the way you would normally construct a probability proof. It accounts for all possible states. And yet, it fails. I can kind of see where it fails, but I'm at a bit of a loss trying to come up with a good, clean, 1-2 paragraph explanation of it. If you want to take a stab at it, go ahead. If you have any other thoughts on this problem, or can think of something similar that fits, post them.
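One quick way to settle which answer is right is to simulate the game. Below is a minimal Python sketch (the helper name `play_round` is mine, not from the thread) of the rules as stated: two fair coins, the flipper truthfully names the face of one randomly chosen coin, and the player guesses that the hidden coin shows the opposite face.

```python
import random

def play_round(rng):
    coins = [rng.choice("HT"), rng.choice("HT")]  # two fair coins
    i = rng.randrange(2)                     # flipper reveals one coin at random
    revealed, hidden = coins[i], coins[1 - i]
    guess = "T" if revealed == "H" else "H"  # always guess the opposite face
    return guess == hidden                   # this wins iff the coins differ

rng = random.Random(0)
n = 100_000
wins = sum(play_round(rng) for _ in range(n))
print(wins / n)  # ≈ 0.5, not the 2/3 the first argument predicts
```

Running it bears out the second argument: the "guess the opposite" strategy wins only when the coins land differently, which happens half the time no matter what is announced.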
Otter 7,900 Posted May 29, 2007 I've been up for three days straight, but here's something that sticks out at me - by asking if a flip will be the same, you're essentially asking "heads or tails" again. What the other coin was has no more bearing on this flip than the price of tea in China. And that sh*t is getting expensive.
Mortukai 1 Posted May 29, 2007 It's a common mistake in probability for people to assume that subsequent outcomes are affected by previous ones. For example, the vast majority of people will assume that the probability of getting this result when flipping a coin: HHHHHHHHHHHHHHHHHHHH is much less than the probability of getting this result: HHTHTHTTHTTHTTTHTHTHTT But the reality is that the probability of getting either string of results is precisely the same. There is no paradox. By declaring that one of the results is Heads you haven't in any way affected what the other result could be. Like Otter said, the "combination" is an illusion. Having a Heads appear does not in any way affect whether the other result will be a Heads or Tails, whether flipped at the same time or sequentially. The probability for either is always 50%, regardless of the number in the combination, the number of possible combinations, when the coins are flipped in relation to each other, whether the same coin is used or two separate coins, or whatever.
K^2 2,047 Posted May 29, 2007 That was my first thought, but there is still a bit of a problem. You can only exclude one coin easily when you imply order. Here, there is no such implication. An easy way to check this: instead of saying, "One coin is Tails," I can tell you, "The combination of two coins is not HH," and ask if the coins are the same or different. It's exactly the same problem, but you do know that the probability of the coins being different, which is P(HT OR TH) = 1/2, is greater than the probability of TT = 1/4 in the initial toss. Why should the probability ratios change merely because I told you that it isn't HH? Why should the probability of TT suddenly go up to 1/2 while the probability of (HT OR TH) remains the same?

But yes, this is the right way to go at showing the problem. You just need to take a few extra steps to demonstrate that you can exclude another 2-coin state.

Mortukai, there is no fallacy in considering all coin tosses, even if they are sequential, as long as it is done right. For example, if I flip the first coin, it lands as Tails, and I ask you what the odds are of the second one landing Tails, you can happily inform me that they are 1/2, despite the odds of TT being 1/4. Your approach is basic enough in correctly assuming that previous tosses don't affect the current one. But what if you do want to consider all tosses? You still have 4 combinations then, but since you are only interested in the second coin, you group them as (HH OR TH) and (HT OR TT). Since P(AB AND CD) = (1/4)*delta(A,C)*delta(B,D), you say that P(HH OR TH) = 1/4 + 1/4 = 1/2, and P(HT OR TT) = 1/2, giving you 1/2 probability of the second coin being Tails, regardless of the first toss. It still works. Note: delta(A,B) is the Kronecker delta of A and B. delta(A,B) = 1 if A==B and 0 if A!=B.
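K^2's grouping argument can be checked by brute-force enumeration over the four equally likely outcomes (a small sketch, using exact fractions rather than floats):

```python
from itertools import product
from fractions import Fraction

# All four equally likely two-coin outcomes: HH, HT, TH, TT.
outcomes = list(product("HT", repeat=2))

# Group them by the face of the second coin, as in the post:
# (HH OR TH) versus (HT OR TT).
p_second_tails = Fraction(sum(1 for a, b in outcomes if b == "T"),
                          len(outcomes))
print(p_second_tails)  # 1/2, regardless of how the first coin landed
```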
Mortukai 1 Posted May 29, 2007 For starters, I don't know why you're treating TH and HT as two separate possibilities. These should only be separate if the order is important. Otherwise there should be only 3 possible groupings of outcomes: both heads, both tails, and a heads and a tails.

If order is important, it works like this. Say you flip and you get Tails. That excludes both HH and HT, leaving only TH and TT. ie: a 50:50 chance of heads or tails. Even with order being important, it always boils down to heads or tails, because the nature of flipping a coin is such that it cannot affect subsequent flips. If order isn't important, then TH and HT are the same outcome, so there is a 1/3 chance of any specific outcome (heads, tails, and both), and if you reveal one of the coins then you cut that down to a 1/2 chance.

No matter how you actually perform the task, the odds of me guessing right will always be 50:50. The only way you can decrease that chance is to increase the number of coins encompassed by the decision. Like, if you have to guess both at the same time, then you have a 1 in 3 chance of being right. If you have to guess the first coin, and then the second coin, then you will have a 1/4 chance. And so on as you add more coins into the decision.

But yeah, I think your problem is that you're treating HT and TH as different cases, when they aren't unless the experiment is specifically set up in a way as to make them different, such as making order important and simultaneously making you guess each coin result one after the other. But when the experiment is set up this way the ambiguity disappears and it all makes sense within the rules.
K^2 2,047 Posted May 30, 2007 The TH and HT are distinct states, though. If I ask you the odds of a "double" (both coins landing the same), and you count TH and HT as the same state, you will end up with 3 states, which gives you 2/3 odds for a double, which is wrong. You can count TH and HT as the same state for the purpose of a particular setup, but then you must account for the fact that the HT/TH state has double the weight of the TT or HH states, getting you back to 1/2 for the double. This is very similar to the idea of microscopic vs macroscopic states in statistical mechanics. You can think of HT/TH as a macroscopic state that is more likely than TT or HH, because it consists of two microscopic states: HT and TH.

You were on the better path earlier. When I tell you that one of the coins is Heads, I don't just eliminate the TT possibility, but also either TH or HT; I just don't tell you which, because I don't tell you which coin I chose to reveal. (E.g., if I were to tell you that the first coin is Heads, you would eliminate the TT and TH states.) You now know that the possible set is either {HH,HT} or {HH,TH}. Since either one gives you 1/2 odds of guessing right, which set you actually operate within is irrelevant.

The problem, however, remains when the wording is different. If I tell you directly that the state is not TT, it is not clear why either the HT or the TH state should be eliminated, despite the fact that I'm essentially telling you exactly the same thing, and all of the previous logic should apply. We know that the HT/TH possibility is twice as likely as HH. We know that TT has been thrown away. So how come the odds of HH are not 1/3 now?
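The wording-dependence K^2 is circling here can be made concrete by conditioning on two different revelation protocols (a sketch of one standard way to formalize it; the protocol labels are mine, not from the thread):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product("HT", repeat=2))  # four equally likely states

# Protocol A: the flipper reveals the face of one coin chosen at random.
# P(reveal "H" | state) is 1, 1/2, 1/2, 0 for HH, HT, TH, TT respectively.
weight = {s: Fraction(sum(c == "H" for c in s), 2) for s in outcomes}
p_hh_given_reveal_h = weight[("H", "H")] / sum(weight.values())
print(p_hh_given_reveal_h)  # 1/2

# Protocol B: we are simply told "the state is not TT",
# with no particular coin singled out.
not_tt = [s for s in outcomes if s != ("T", "T")]
p_hh_given_not_tt = Fraction(1, len(not_tt))
print(p_hh_given_not_tt)  # 1/3
```

Under protocol A, the announcement "one of them is Heads" is twice as likely to be made when the state is HH as when it is mixed, which is exactly the "I also eliminated TH or HT, I just didn't say which" effect; under protocol B the bare fact "not TT" really does leave HH at 1/3. The two wordings correspond to different protocols, so they condition differently.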
Mortukai 1 Posted May 30, 2007 (edited) If I ask you the odds of a "double" (both coins landing the same), and you count TH and HT as the same state, you will end up with 3 states, which gives you 2/3 odds for a double, which is wrong. Umm, a "double" would be either HH or TT. So it can be HH/TT, or HT/TH. Two states. It's a 50:50 chance.

But I think you're getting very confused by not thinking in terms of actual practice. If you set up an experiment where someone flipped two coins, looked at them both, and then told the subject that "it's not TT" (or "it's not HH" if it is TT), and the subject had to guess both coins from this information, the odds will sit around 1/3 for them being correct. Because they have to guess from 3 options, and HT is different from TH. In other words they have to make two guesses. The first is the position of the Heads (is it the first coin or the second one), and the second is the outcome of the other coin from a 50% chance of Heads or Tails. Basically two 50:50 guesses.

The probability of us guessing right is separate from the probability of a given specific outcome. This is what I was talking about when I collapsed TH and HT into one outcome. If we just have to guess whether the coins are the same or different, that's a 50% chance, because out of the four outcomes, half of them are different and half are the same. The chance of any individual outcome is always 25% regardless of anything. The chance of us guessing changes according to the parameters of the experiment.

Try imagining this experimental setup: You have 8 coins. You grab two of them, flip them, and then hide them under a cloth without looking at the outcome. Then you grab the next two and do the same, putting them a bit further away from the first two. And so on until you've made 4 flips of 2 coins each. Statistically, each flip had a 25% chance of being TT. Now, without looking, you exclude one pair from under the cloth and discard those coins.
Now you only have 3 pairs of coins left. Consider the difference between these probabilities:

1. The probability that one of the 3 pairs left is TT.
2. The probability that you will pick the pair that is TT.
3. The probability that one of the 3 pairs left is either HT or TH.
4. The probability that you pick a pair that is either HT or TH.

Now imagine you didn't exclude a pair, and work out the probabilities again. This may help illuminate the difference between working out probabilities of certain outcomes, and probabilities of guessing those specific outcomes. Edited May 30, 2007 by Mortukai
K^2 2,047 Posted May 30, 2007 Umm, a "double" would be either HH or TT. So it can be HH/TT, or HT/TH. Two states. It's a 50:50 chance. The TT and HH states are distinguishable, even if the coins are not. The HT and TH states are indistinguishable if you cannot distinguish the coins. There is a big difference.

For a large-scale example, imagine a cylinder filled with gas. Mentally divide it into the left half and the right half. Most likely, you will find roughly half the particles on one side, and half on the other. Now imagine I had the particles numbered 1-N. Could you tell me which of these particles are on the left, and which are on the right? No, you could not. Even if I tell you that the particles are divided exactly evenly, there are N!/(((N/2)!)^2) possible combinations. For large N, that's a lot. Now, imagine that all the particles are on one side. A statistical impossibility, I know, but if you happen to observe the cylinder in such a state, can you tell me which particles are on that side? Of course. All of them. There is only 1 possible state. If we assume that every microscopic state is equally likely, as it is, you can immediately tell me that the particles being shared equally between the two sides is N!/(((N/2)!)^2) times more likely than all the particles being on one side. Which is exactly why you never observe the latter in real life.

With two coins, it is like having just two particles, but if you can't tell which coin is which, the effect is exactly the same. You have three macroscopic states: TT, HH, and (HT/TH), the last one being twice as likely as either of the former two. [2!/((1!)^2) = 2]

If you set up an experiment, where someone flipped two coins, looked at them both, and then told the subject that "it's not TT" (or "it's not HH" if it is TT), and the subject had to guess both coins from this information, the odds will sit around 1/3 for them being correct. Because they have to guess from 3 options, and HT is different to TH.
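The microstate count K^2 quotes, N!/(((N/2)!)^2), is just the binomial coefficient C(N, N/2), and it can be checked in a couple of lines; for N = 2 it gives exactly the factor of 2 carried by the HT/TH macrostate:

```python
from math import comb, factorial

# Number of microstates with the N labelled particles split evenly,
# versus the single microstate with all of them on one side.
for n in (2, 10, 100):
    ways_even = comb(n, n // 2)
    assert ways_even == factorial(n) // factorial(n // 2) ** 2
    print(n, ways_even)  # 2 for n=2; 252 for n=10; astronomical for n=100
```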
In other words they have to make two guesses. The first is the position of the Heads (is it the first coin or the second one), and the second is the outcome of the other coin from a 50% chance of Heads or Tails. Basically two 50:50 guesses. Actually, in this setup, the odds of guessing right are 1/2 if you name a possible double, and 1/4 for each of the TH and HT combinations. In this setup, you want to always guess a double, because that gives you higher odds. But this is not what you are asked. To be equivalent to the initial experiment, you do not have to tell which is the first coin and which is the second. All you have to say is whether both are Heads, both are Tails, or one is Heads and one is Tails. If you are given no information, you should guess the latter, because that gives you 1/2 odds of winning, while naming a particular double gives you only 1/4 odds. But once you are told that it is not TT, for example, that only leaves HH or TH/HT to be guessed. The interesting bit is that now, suddenly, HH is no longer less likely than TH/HT. They both give you 1/2 odds of guessing right.

Another way to look at it: you are given information about the outcome. It excludes one of the possible states. Shouldn't that make guessing easier? However, it does nothing to improve your odds. Your best guess is 1/2 regardless of whether you are told anything or not.
Mortukai 1 Posted May 30, 2007 Lol, I don't have time to respond properly, but you just used an analogy that is inherently MORE COMPLEX than the thing you are trying to explain. Hahahha. An inability to simplify something shows a deficit in understanding. It's like trying to explain how to ride a bike by using the analogy of programming a mission to mars.
K^2 2,047 Posted May 30, 2007 The complexity increase is only in the number of "coins". (Do we really care what gives us binary states?) It's a good way to demonstrate a difference between microstates and macrostates. Besides, you already tried demonstrating examples with more coins. I just took that to the limit.
Otter 7,900 Posted May 31, 2007 So, in effect, you're comparing the probability of a series versus the probability of an instance? The confusion comes into play when you ask your witness for only half of the series - you're essentially asking him heads or tails. At this point, the first coin has been flipped, and is either an H or a T. While the options in the beginning were HT, TH, HH and TT (one in four) you've now effectively cut them in half. I still fail to see the paradox.
K^2 2,047 Posted June 1, 2007 Otter, the coins are flipped together, and you never say which coin is which. If I flip two coins and ask you to guess how they fell, what are you going to tell me? Keep in mind that there is no first or second coin. If I flip the coins and you name the combination, there is no way to check which coin you are referring to as which. So all you can tell me is whether both are Heads, both are Tails, or one is Heads and one is Tails. Of course, you are going to go for the last option, because that lets you win 1/2 of the time, while the other two options only let you win 1/4 of the time.

Now, imagine I decide to help you, and say, "Hey, look, it isn't HH!" Why shouldn't your odds of guessing correctly go up? Picture the same game with a single 6-sided die. I ask you to guess the number that will roll. Your odds are 1/6. If I tell you, "Hey, look, it's not 2!", your odds are suddenly up to 1/5, because there are only 5 possibilities left. With the coins, after I tell you, "It's not HH," why does the possibility of HT/TH stay the same, while the possibility of TT goes up? I eliminated a possible outcome. Why doesn't it help you to guess right? Shouldn't your odds go up to 2/3, since there are 3 microstates left (HT, TH, and TT), and you cover two of them by saying, "One is Heads, and one is Tails"?
Otter 7,900 Posted June 1, 2007 Shouldn't your odds go up to 2/3, since there are 3 microstates left (HT, TH, and TT), and you cover two of them by saying, "One is Heads, and one is Tails." I don't see why not. There are now three options, each of them just as likely. The odds of any combination happening remain the same, but my odds of guessing correctly improve because I now have a two in three shot.
K^2 2,047 Posted June 1, 2007 See, now you are confused. The odds of winning in that setup are still just 1/2. If you have any doubts, just try it.
Mortukai 1 Posted June 1, 2007 The odds of winning in that setup are still just 1/2. If you have any doubts, just try it. Umm, no. Ok, let's back up. You're saying that if we eliminate TT, then our odds of guessing correctly out of HH, HT, and TH is 1/2, instead of 1/3. Who's confused here?

Let's do this another way. Statistically, it should be identical to your setup, but in practice it's a little different, and can be done with one person. Ok. You are flipping two coins. When you reveal them, you have to reveal one at a time (giving us our "order", to ensure we have differentiation between HT and TH). Write down every combination you get. Every time you get TT, ignore that result. This step simulates the "it's not TT" condition in your setup. There's no guessing; you just flip, reveal, and discard all TTs. I guarantee you that you will not see HH as often as you see HT and TH combined. In fact, I'm not doing anything right now, so here I go:

HH: 11111 11111 11111
TH: 11111 11111 11111 1
HT: 11111 11111 1111

Ok, bored now. Well look at that, huh. I made 60 flips, and got 15 TTs. Out of 45 "not TT" results:

HH = 15 (~33%)
TH = 16 (~33%)
HT = 14 (~33%)
TH+HT = 30 (~66%)

So that's the probability of them happening when we exclude TT by saying "it's not TT". You cannot tell me that if I were to say "it's either HT or TH", I'd only have a 50% chance of being right. I'd have a 66% chance of being right. I don't know what sort of maths you are doing, but it's wrong.

So, what is your paradox again? Perhaps you are just not stating it clearly enough. Or perhaps the paradox only exists if you think about this whole thing in a really weird and impractical sense. Because so far both Otter and I have concluded that if you say "The first coin is Heads", then our chance of guessing right is 50%. Which it is. If you say "It's not TT", our chance of guessing right if we choose both heads and tails is 66%. Which it is. So, where are you getting confused again?
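Mortukai's hand-tally can be scaled up in a few lines under exactly the rules he states: flip two coins, discard every TT, and count what remains. A sketch:

```python
import random
from collections import Counter

rng = random.Random(1)
counts = Counter()
while sum(counts.values()) < 30_000:
    pair = rng.choice("HT") + rng.choice("HT")
    if pair != "TT":               # discard TT, as in the tally above
        counts[pair] += 1

total = sum(counts.values())
for pair in ("HH", "HT", "TH"):
    print(pair, round(counts[pair] / total, 3))  # each ≈ 1/3
```

The surviving results do split roughly 33/33/33, so in this discard-TT game "HT or TH" really is right about 66% of the time, which is the point the tally is making.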
Mortukai 1 Posted June 1, 2007 Wait, I re-read your original post and I think I figured out where you're getting messed up. You are applying a strategy which works in one instance of an event, because it relies on information relevant only to that instance of the event, to predict future events.

Here's a better example to illustrate your point: I have a deck of cards. I pick one and tell you that the card is not clubs, and that you have to pick the colour. You have a 66% chance of being right if you guess the colour opposite the suit I say it's not, ie: Red. This is true. But you are claiming there's a paradox, because in the next draw I can still draw a black card 50% of the time. You're claiming that your "strategy" of picking the opposite is a paradox and will yield you only 50% wins. This isn't a paradox; this is fallacious reasoning. Just because you think that your odds are better on an individual instance of an event does not mean they are better over a series of events.

In the above example, if I say "it's not clubs" and so you pick red, that really is your best bet, because you are just picking one out of three, and you really do have a 66% chance of being right with red. But over time, this is stupid, because I will draw a black card 50% of the time. The reason is that over time, the goalposts move. I'm not just saying "it's not clubs" all the time, because I'd have to lie 25% of the time, invalidating the game. If I don't say anything whenever it's clubs, you can guess black when it is clubs with 100% accuracy, because I have a tell when it's clubs. If I alternate which suit I say it isn't randomly, then you're a fool to listen to what I say.

And therein lies the rub. If the information you are given is only randomly telling you what the result isn't, then it is useless, and forming a long-term strategy around it would be dumb. To illustrate this point, look back on my previous post, where I flipped the coins and counted the results.
I excluded all TT combinations. If I instead had chosen the excluded combination randomly before each flip, the odds for each combination would be the same: instead of 33% for each, it'd be 25% for each, because not only does each combination have the same chance of coming up, they also have the same chance of being excluded. No paradox, just strategies which are ill-suited to certain experimental conditions.

If I take all the clubs out of the deck, and each time I pick a card and get you to guess its colour I say "it's not clubs", then you'd be a moron to not guess red, because you have a 66% chance of being right every time. If all cards were equally represented and I randomly chose which suit to tell you it wasn't each time, you'd have a 50% chance of guessing right. No paradox, just mis-matched strategies and conditions.

Also, you weren't specifying that on each flip you'd be saying "one of them is X", where X is equally randomly Heads or Tails. If you had specified that, then you would have seen what Otter and I pointed out to you in our first posts: it doesn't matter at all what you say about what one is or isn't, we are just guessing heads or tails. But then with your later posts you started implying there was an order to the information given, by saying things like "if I exclude this, shouldn't this change the odds?" The answer to that is "yes". But if you exclude "any of these", the answer is "no". So yeah, like we've been saying, there's no paradox, your strategy is just crap.
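The two dealers above can be simulated side by side. This sketch reads them literally, which involves one assumption on my part: the "random" dealer names a suit chosen independently of the card, so the announcement carries no information at all.

```python
import random

suits = ["clubs", "spades", "hearts", "diamonds"]
colour = {"clubs": "black", "spades": "black",
          "hearts": "red", "diamonds": "red"}
rng = random.Random(3)
n = 100_000

# Systematic dealer: clubs are removed from the deck and the dealer
# always says "it's not clubs"; guessing red wins 2 times in 3.
wins_sys = sum(colour[rng.choice(suits[1:])] == "red" for _ in range(n))

# Uninformative dealer: the named suit is picked independently of the
# card, so guessing the colour opposite the named suit is a coin flip.
wins_rand = 0
for _ in range(n):
    card = rng.choice(suits)
    named = rng.choice(suits)
    guess = "red" if colour[named] == "black" else "black"
    wins_rand += guess == colour[card]

print(round(wins_sys / n, 2), round(wins_rand / n, 2))  # ≈ 0.67 and ≈ 0.5
```

Under that reading, the simulation bears out the 66% vs 50% figures quoted above.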
Otter 7,900 Posted June 1, 2007 TH, HH, TT, HT are the possible combinations. If you say to me, "one of these coins is tails," then I'm left with these combinations: TH, TT, HT. I can make an informed decision because I know at least one of the two coins is tails - tails is obviously the way to go. The 50/50 odds remain the same, but now I've been slipped some insider information that improves my odds of guessing correctly. But if you tell me "this coin is tails," I'm left with: TT, HT. And there goes my advantage.
K^2 2,047 Posted June 1, 2007 Oh, darn. Now you are both confused. (Actually, managing to confuse Mortukai to such a degree feels almost like an achievement.)

Umm, no. Ok, let's back up. You're saying that if we eliminate TT, then our odds of guessing correctly out of HH, HT, and TH is 1/2, instead of 1/3. Who's confused here? Let's do this another way. Statistically, it should be identical to your setup, but in practice it's a little different, and can be done with one person. Ok. You are flipping two coins. When you reveal them, you have to reveal one at a time (giving us our "order", to ensure we have differentiation between HT and TH). Write down every combination you get. Every time you get TT, ignore that result. This step simulates the "it's not TT" condition in your setup. There's no guessing; you just flip, reveal, and discard all TTs. I guarantee you that you will not see HH as often as you see HT and TH combined. In fact, I'm not doing anything right now, so here I go: HH: 11111 11111 11111 TH: 11111 11111 11111 1 HT: 11111 11111 1111 Ok, bored now.

Different game. If the coins did fall TT, you don't ignore it. You say that the situation is not HH. As a result, no matter what you are told, you pick HT/TH from the start, and the odds remain unchanged. Try it that way. The odds are 1/2.

Here's a better example to illustrate your point: I have a deck of cards. I pick one and tell you that the card is not clubs, and that you have to pick the colour. You have a 66% chance of being right if you guess the colour opposite the suit I say it's not, ie: Red. This is true.

The difference here is that your choice might change. Imagine I was going to say "Red", and you tell me that the card is not diamonds. Oops. My odds have just gone down; I'd better go for Black, then. With coins, there is nothing you can tell me to change my mind. Once you have flipped the coins, I want to guess HT/TH, because that gives me a 1/2 chance of being right.
Regardless of whether you tell me "not TT" or "not HH", I am not changing my mind. The information does not allow me to adjust my prediction in the least. Then how can the odds change from 1/2? Seriously, play the game. Watch what happens. The odds are still 1/2.
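K^2's version of the game — the revealer always truthfully rules out one double, and the player always answers "one Heads, one Tails" — can also be simulated. A sketch; the random tie-break when the coins differ is my assumption, since the thread never pins down which double gets named in that case:

```python
import random

def round_of_game(rng):
    a, b = rng.choice("HT"), rng.choice("HT")
    # The revealer truthfully rules out a double that did not occur;
    # when the coins differ, either double could be named.
    if a == b:
        ruled_out = "TT" if a == "H" else "HH"
    else:
        ruled_out = rng.choice(["HH", "TT"])
    # The player answers "one Heads, one Tails" no matter what is
    # ruled out, so the announcement never changes the guess.
    return a != b  # win iff the coins really are mixed

rng = random.Random(2)
n = 100_000
print(sum(round_of_game(rng) for _ in range(n)) / n)  # ≈ 0.5
```

Since the announcement never alters the player's answer, the win rate is pinned to the prior probability that the coins differ, which is 1/2, just as the post says.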
Otter 7,900 Posted June 1, 2007 (edited) Edit - nevermind. Can you address my confusion, then? Hmph. Ok, what it seems to me is that Mort had it right from the beginning. You're essentially asking for one of two possible outcomes. The illusion is that four states exist, when in fact only two do - that the coins are the same, or different. Edited June 1, 2007 by Otter
K^2 2,047 Posted June 1, 2007 Well, yes, that is correct. Essentially, that's what you are doing, which is why it works out to 1/2. You can also think of it this way: when I tell you that one of the coins is Tails, not only do I exclude HH, but also one of the two, HT or TH; I just never tell you which one. But since you don't care which is which, and you would have a one-in-two guess in either case, it ends up working out to be the same.
Otter 7,900 Posted June 2, 2007 not only do I exclude HH, but also one of the two: HT or TH. Ah! That never crossed my mind either, but it's obvious. So, you figure out how to express it yet?
Mortukai 1 Posted June 2, 2007 Different game. If the coins did fall TT, you don't ignore it. You say that the situation is not HH. As a result, no matter what you are told, you pick HT/TH from the start, and the odds remain unchanged. Try it that way. The odds are 1/2. Duh. The rules are different, ergo, different game. But I was illustrating a point. A point I've been making since the beginning. The odds do not change according to your guessing. The result is independent of your guessing. I used that example because there was no guessing; I systematically took away a combination, showing that the odds do go up when you systematically remove a possibility. My next post explained how the odds do not change when you do not systematically remove a possibility.

The difference here is that your choice might change. Imagine I was going to say "Red", and you tell me that the card is not diamonds. Oops. My odds have just gone down; I'd better go for Black, then. With coins, there is nothing you can tell me to change my mind. Once you have flipped the coins, I want to guess HT/TH, because that gives me a 1/2 chance of being right. Regardless of whether you tell me "not TT" or "not HH", I am not changing my mind. The information does not allow me to adjust my prediction in the least. Then how can the odds change from 1/2? Seriously, play the game. Watch what happens. The odds are still 1/2.

How can you simultaneously make my point and miss it completely like that? The card game was not meant to duplicate the coin game, but to point out the flaw in the coin game in a more obvious way. The mechanics of the odds and the strategies are the same, though. Your coin strategy isn't to pick HT or TH; it's to pick the opposite of what you are being told one of them is / isn't. In your coin game, you say "one of these coins is Heads", so I pick tails, because I think this will give me the best odds of winning.
In the card game, you say "this card is not spades", so I pick red because this will give me the best odds of winning. In both cases you exclude 1/4 of the possibilities and make me guess something which has a 50% chance of happening a priori, but which I think changes to a 66% chance based on the information you give me. The flaw in both is that you assume that the excluded option is systematic, when it isn't. If it were systematic, then you'd be correct, and would actually have better odds by stating the opposite of what you were told the result is not. ie: red if told clubs, or HT/TH if told not TT (not TT is the same as "one is H").

Look, here are some examples to show, again, what I mean, and what you keep f*cking missing. I'll even use the coin game so you don't get extra confused.

1. I flip two coins, and look at both. I tell you one of them is Heads. You have to guess whether both coins will be the same or different. For every single coin toss, I will tell you these exact words: "One of them is Heads", or if this cannot be true, then I will re-flip the coins. You will know 100% when the coins are TT, because I re-flip them, but it doesn't matter; you know for certain that only 3 of the 4 possible results can be true. Also, you know that my information to you is completely irrelevant, because it doesn't in any way alter the game. This game is systematic, and your odds will be higher because of it.

2. I flip two coins, and look at both. I tell you one of them is Heads. You have to guess whether both coins will be the same or different. For every single coin toss, I will state "One of them is X", where X is either Heads or Tails, which will always be true and which I will choose randomly if both can be true. You will never know what the other coin might be, because there is no order to the information you are given.
Obviously 75% of the options are wrong, and the only information you are ever being given is 25% of that 75%, and the particular 25% you get fluctuates wildly within that 75%, making it effectively useless. No options are excluded. All information I give you is irrelevant because it doesn't impact the game in any way. This game is not systematic, and your odds of winning are completely independent of any information you are given. In other words, 50:50.

What is happening is that your "strategy" is being formed based on the assumption of 1, when 2 is actually the case. This is because 2 seems systematic when you only have one instance of it to judge by. If you only have one instance to judge by, your odds aren't actually any different, but you have a better chance of winning if you assume that it is 1. If it's 2 and you think it is 1, then you have a 50% chance of winning. If it's 2 and you think it is 2, you have a 50% chance of winning. If it's 1 and you think it's 1, you have a 66% chance of winning. If it's 1 and you think it's 2, then you WOULD have a 66% chance of winning, but you'll only get a 50% chance if you treat it like 2.

So in terms of chances for payoff, the best option is to always treat it like it is systematic, because if it's random, it doesn't matter how you treat it. But if it's systematic, then you can gain much more by developing a strategy that works for that. Like I said already: your guessing is independent of the actual results. Your strategy is only relevant if it matches the strategy of the game. If the game is random, any strategy you think is good will be sh*t. But it's in your best interest to think that it's good, because you lose nothing by doing so but can potentially win more.

I really don't know how much clearer I can make this. It's pretty bloody obvious. To sum it up, I guess I could say that random information about random possibilities cannot allow you to form a systematic strategy that actually works.
But systematic information about random possibilities can. And it is a good idea to treat things as systematic, because there is a profit if you are right and no loss if you are wrong, whereas if we treat things as non-systematic, there is no gain when we are right and a loss if we are wrong. If you still don't get this, then maybe maths isn't for you after all.
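Mortukai's two numbered games can be checked directly by simulation. A minimal Monte Carlo sketch, assuming the rules exactly as stated above; the function names and trial counts are mine:

```python
import random

def flip():
    return random.choice("HT")

def game1(trials=100_000):
    """Game 1 (systematic host): re-flips on TT, then truthfully says
    'One of them is Heads'. Player always bets the coins differ."""
    wins = 0
    for _ in range(trials):
        a, b = flip(), flip()
        while a == "T" and b == "T":    # host re-flips every TT
            a, b = flip(), flip()
        wins += (a != b)                 # bet on HT/TH
    return wins / trials

def game2(trials=100_000):
    """Game 2 (random host): truthfully says 'One of them is X', picking X
    at random when both coins could be named. Player bets the other coin
    is the opposite of X, i.e. that the coins differ."""
    wins = 0
    for _ in range(trials):
        a, b = flip(), flip()
        x = a if a == b else random.choice((a, b))
        # the announcement x can never change a 'coins differ' bet,
        # so the win condition does not depend on it
        wins += (a != b)
    return wins / trials

print(f"game 1: {game1():.2f}")   # ≈ 0.67, about 2/3
print(f"game 2: {game2():.2f}")   # ≈ 0.50
```

Under these rules the re-flip version really does reward betting on HT/TH (about 2/3 of the time), while the random-announcement version stays at 1/2, which is exactly the distinction between the two games being argued here.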
K^2 2,047 Posted June 2, 2007 (edited)

Mortukai, your systematic vs non-systematic is bull. Let's go back to the cards.

The reason is because over time, the goalposts move. I'm not just saying "it's not clubs" all the time, because I'd have to lie 25% of the time, invalidating the game. If I don't say anything whenever it's clubs, you can guess black when it is clubs with 100% accuracy, because I have a tell when it's clubs. If I alternate which suit I say it isn't randomly, then you're a fool to listen to what I say. And therein lies the rub. If the information you are given is only randomly telling you what the result isn't, then it is useless, and forming a long term strategy around it would be dumb.

Random card is drawn. You tell me, "The suit is not X", where X is any suit that does not match the one on the card, and you select it at random. I choose the color that is opposite of X. Full range of possibilities (card, X, result):

D, X=H, I lose. D, X=C, I win. D, X=S, I win.
H, X=D, I lose. H, X=C, I win. H, X=S, I win.
C, X=D, I win. C, X=H, I win. C, X=S, I lose.
S, X=D, I win. S, X=H, I win. S, X=C, I lose.

You alternate the suit you say it isn't randomly. I listen. I win in 2/3 of the cases. Seems you've got a flaw in your logic somewhere.

Edit: You can force me back into 50% in this game, but then you must not select the suit at random for X. You must select X to be of either color with 50/50 odds. Then my strategy sinks.

Edited June 2, 2007 by K^2
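K^2's twelve-case enumeration, and the 50/50-color fix in his edit, can both be checked numerically. A sketch assuming a uniformly drawn suit; the helper names are mine:

```python
import random

SUITS = "DHCS"   # diamonds, hearts, clubs, spades

def color(s):
    return "red" if s in "DH" else "black"

def play(choose_x, trials=100_000):
    """Card is drawn; announcer truthfully says 'the suit is not X';
    guesser picks the color opposite to X's color."""
    wins = 0
    for _ in range(trials):
        card = random.choice(SUITS)
        x = choose_x(card)
        guess = "black" if color(x) == "red" else "red"
        wins += (guess == color(card))
    return wins / trials

def uniform_x(card):
    # X drawn uniformly from the three suits that don't match the card
    return random.choice([s for s in SUITS if s != card])

def balanced_x(card):
    # K^2's edit: fix X's *color* at 50/50 first, then pick a legal suit
    c = random.choice(["red", "black"])
    return random.choice([s for s in SUITS if s != card and color(s) == c])

print(f"{play(uniform_x):.2f}")    # ≈ 0.67, matching the 2/3 enumeration
print(f"{play(balanced_x):.2f}")   # ≈ 0.50, the strategy sinks
```

The difference between the two announcers is whether the excluded suit carries color information: a uniformly chosen X is twice as likely to be the opposite color of the card, while a color-balanced X tells the guesser nothing.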
Mortukai 1 Posted June 2, 2007

Uhuh. Now do that with the coins, since you didn't like my card game from the start.
Otter 7,900 Posted June 2, 2007

Uhuh. Now do that with the coins, since you didn't like my card game from the start.

Yeah, see, that's where the "illusion" lies. Whereas you feel like you're being given the suit of the card (I say one of the coins is H, so you assume TT is out), I'm also eliminating an instance of either TH or HT. Tricksy.
K^2 2,047 Posted June 2, 2007

Uhuh. Now do that with the coins, since you didn't like my card game from the start.

I don't see what you are getting at. Your explanation of what happens with the coins was that since the information to be revealed is selected at random, it has no impact. Obviously, that does not work with the cards, so I don't see how you jump to the conclusion that it works with coins.
Mortukai 1 Posted June 3, 2007

I don't see what you are getting at. Your explanation of what happens with the coins was that since the information to be revealed is selected at random, it has no impact. Obviously, that does not work with the cards, so I don't see how you jump to the conclusion that it works with coins.

Sooo, you can't then, or you would have.

Look, the card game may well be a different case. I only came up with it to illustrate my point because it broke down the odds into different layers. Whereas with the coins each individual coin has 50% odds of being either Heads or Tails, and if you combine two coins you then get 25% odds, with the cards they each have 25% odds of being a particular suit, and we can break the suits into 50% odds by color. It illustrated my point at the time, but we're dealing with only 1 card being drawn as opposed to two coins being flipped, and information given about that card will be directly relevant to it, whereas information given about one coin has no bearing whatsoever on the other coin. So the basic mechanics don't work the same.

So yeah, perhaps in the case that there is only ever one particular element about which we are given information from a finite range, then it may well be that even random information will help. Perhaps in this case "random" non-systematic information is effectively a different systematic system for each new instance, because there is only ever one element being manipulated. But in a case where there are multiple elements, and we are only given information about some, and that information is random, obviously the random information doesn't help. It would help if it were systematic, but it doesn't if it is random.

----

If you still don't like that, then I'm going to reflect the challenge back on you: come up with a better solution. I've come up with a model that explains the data. You have not.
When you challenged its use on the card game, I adapted it to include the condition of involved elements, and maintained its rigor. When challenged to explain what is going on, you have had nothing constructive to say whatsoever. It's like you're taking the position that it IS a paradox a priori, and will do anything to maintain this idea. For a mathematician, you're certainly a sh*t one.

Even if my model is completely wrong, right now, it's the best model we've got. Like Newton's theory of gravity. Turned out to be completely wrong, but hey, that sh*t was useful. Still is for most applications, really. My model predicts not only what types of systems can yield better odds, but also what types of strategies may be employed, when a system is systematic or non-systematic, which strategy is best to use in all cases, and why we perceive the games the way we do. Your model... remains largely silent and non-committal.

So go on, come up with a better model than mine. Alternatively, come up with a system which uses multiple elements, where we are given random information about only some of them (randomly chosen), and where there IS a strategy which yields better odds than the base system would predict. If you can do that, then my model is wrong. I'll still have one-up on you, though, for having constructed a working model at all, but you'll even the score again by deconstructing it if you succeed in making a system which violates it fundamentally.
K^2 2,047 Posted June 3, 2007

Mortukai, this is mathematics. Mathematics doesn't work with models. Unlike science, where we say, "Show evidence", here we say, "Prove it". With your methodology, proving Fermat's Last Theorem would be limited to, "Well, I can't find any exceptions, so it must be true." It's actually slightly more complex, though.

Furthermore, you are not just given info about one coin. As I said a few times, being told "One of the coins is Heads" is equivalent to being told that "Not both coins are Tails", and that is information about both coins. It is information that excludes an entire 1/4 probability state, just like saying that the card is not clubs excludes an entire 1/4 probability state.

And no, the situation with coins is not a paradox in itself. We know what the correct answer is. The solution that gives you 2/3 probability of guessing is the paradox. There is an error in it somewhere, but until you find where it actually is, it is difficult to discard.

For the actual mathematical solution of the problem with two coins, it is sufficient to say that you pick the HT/TH macrostate regardless of information provided. This means that the odds of you winning are the same as if no information is provided, and that gives you 1/2 odds. This is the way the problem is actually solved. But I still can't quite pinpoint where in the 2/3 version the problem resides. Obviously, if you want to describe it in terms of HH, HT, TH, and TT states, you have to either show that two states are removed by the information provided, or keep splitting this up into smaller states with different information being provided.

P.S. The main reason I brought up this problem is to show something that has a wrong solution that appears to be sound but gives you a wrong answer. Mostly, it is just to remind people to be careful with logic in proofs.
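The "keep splitting this up into smaller states" option K^2 mentions can be carried out exactly. A sketch in exact arithmetic, assuming the announcer is truthful and names either coin with equal probability when both qualify; the state encoding is mine:

```python
from fractions import Fraction

half = Fraction(1, 2)
# (pair, announcement, probability) microstates for a truthful
# announcer who names either coin with 50/50 odds when both qualify
states = []
for a in "HT":
    for b in "HT":
        p = Fraction(1, 4)                        # each coin pair: 1/4
        if a == b:
            states.append(((a, b), a, p))         # only one true announcement
        else:
            states.append(((a, b), a, p * half))  # "one of them is a"
            states.append(((a, b), b, p * half))  # "one of them is b"

# condition on hearing "one of them is Heads"
told = [s for s in states if s[1] == "H"]
p_told = sum(s[2] for s in told)
p_differ = sum(s[2] for s in told if s[0][0] != s[0][1])
print(p_differ / p_told)  # 1/2
```

On this accounting, hearing "one of them is Heads" does remove the TT state, but it also removes half the probability weight of each of HT and TH (the times the announcer would have said Tails), so the conditional odds of the coins differing stay at exactly 1/2 rather than rising to 2/3.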
Mortukai 1 Posted June 3, 2007 (edited)

Mortukai, this is mathematics. Mathematics doesn't work with models.

HAHAHAHAAHAHA HAHAHA HA

Furthermore, you are not just given info about one coin. As I said a few times, being told "One of the coins is Heads" is equivalent to being told that "Not both coins are Tails", and that is information about both coins.

That's bullsh*t. Partial information about both coins as a whole says absolutely nothing about any individual coin. Honestly man, logic. Learn it. I have five balls. One of them is black. This is equivalent to saying "Not all five balls are white". Tell me, what color are the other 4 balls. Idiot.

And no, the situation with coins is not a paradox in itself. We know what the correct answer is. The solution that gives you 2/3 probability of guessing is the paradox. There is an error in it somewhere, but until you find where it actually is, it is difficult to discard.

See above. The error lies in assuming that any knowledge which pertains to one part will affect the other, independent part, when that knowledge and the part are random.

For the actual mathematical solution of the problem with two coins, it is sufficient to say that you pick the HT/TH macrostate regardless of information provided. This means that the odds of you winning are the same as if no information is provided, and that gives you 1/2 odds. This is the way the problem is actually solved. But I still can't quite pinpoint where in the 2/3 version the problem resides.

Firstly, no, you don't pick the HT/TH option regardless. You pick anything regardless. Statistically your odds of winning when picking HT/TH are identical to picking HH/TT. Or, if the experiment allows it, HH/TH, HH/HT, TT/TH, or TT/HT. Secondly, yes, you finally acknowledge that the information provided is completely irrelevant. Thirdly, the problem with the 2/3 version is that it is operating under false assumptions about the game and the relevance of the information to the outcome.
If the 2/3 option is dependent upon the information given (it is, entirely dependent on it), and the information is irrelevant (it is, as even you have finally figured out), then obviously the 2/3 version is simply wrong, because it is based on false foundations. The reasons WHY the foundations are false are laid out in my model and all the examples I've given. And conveniently, the reason why it doesn't SEEM false is also laid out: because that's the option with the most potential for gain.

Obviously, if you want to describe it in terms of HH, HT, TH, and TT states, you have to either show that two states are removed by the information provided, or keep splitting this up into smaller states with different information being provided.

And now you step backwards again. Tell me, why would you have to show that two states are removed by the information if the information is completely irrelevant? You're trying to make a broken strategy work. Why bother? Just make a better strategy that actually fits the setup.

The main reason I brought up this problem is to show something that has a wrong solution that appears to be sound but gives you a wrong answer. Mostly, it is just to remind people to be careful with logic in proofs.

Yes, it is important to be mindful of the logical soundness and robustness of one's arguments. This whole thing has just been an exercise in logic. I mean, really, for your specific coin example, it all just boils down to this: you tell me one of the coin results, and the other can be either Heads or Tails. My chances of guessing right have nothing to do with what the coin you told me is; it has to do with my guessing Heads or Tails. Whether you say "it's not TT" or "one is Heads", you are eliminating one of the coins. It doesn't matter which one, because one more is left. So I have to guess Heads or Tails. 50:50 chance of my being right. Nothing else matters. Kind of like Otter and I both said from the start.
As to why it appears that you can strategise it, well, that's what this whole discussion has been about. So tell me again your argument for why you think the coin strategy doesn't work? Edited June 3, 2007 by Mortukai
Otter 7,900 Posted June 3, 2007 (edited)

I think the language barrier here has clouded the fact that K was trying to "pull one over" on us from the start. He failed because he didn't spell it out properly from the get-go.

Edited June 3, 2007 by Otter