Wikipedia:Reference desk/Archives/Mathematics/2017 August 7
August 7
Coupon collector's problem variation
In this version there are two decks of n coupons each, forming disjoint sets. In each round a coupon is randomly selected from each deck and shown to you; you may then choose one of the two selected coupons to collect. At the end of each round the coupon you selected is replaced (from an infinite stock) and the decks are re-shuffled. There is strategy involved, since you make a choice each round. If you already have both coupons then the choice doesn't matter, and if you already have one coupon but not the other then you'll want to take the one you don't already have. But if both of your options are coupons you don't already have, which one should you choose? Is it always from the deck for which you have the most collected, or the least collected? --RDBury (talk) 23:39, 7 August 2017 (UTC)
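A quick way to experiment with the question is to simulate the process under a fixed rule. Below is a minimal Python sketch; the function name, the tie-breaking convention, and the specific parameters are illustrative assumptions rather than anything fixed by the problem statement.

```python
import random

def rounds_to_complete(n, prefer_bigger=True, rng=random):
    """Simulate one play-through with two decks of n coupon types each.

    When both offered coupons are new, take from the deck where we already
    hold more coupons (prefer_bigger=True) or fewer (prefer_bigger=False).
    """
    have_a, have_b = set(), set()
    rounds = 0
    while len(have_a) < n or len(have_b) < n:
        rounds += 1
        ca, cb = rng.randrange(n), rng.randrange(n)   # one uniform draw per deck
        new_a, new_b = ca not in have_a, cb not in have_b
        if new_a and new_b:
            # the only case where the choice matters
            take_a = (len(have_a) >= len(have_b)) == prefer_bigger
            if take_a:
                have_a.add(ca)
            else:
                have_b.add(cb)
        elif new_a:
            have_a.add(ca)
        elif new_b:
            have_b.add(cb)
        # if neither draw is new, the round is wasted
    return rounds

# rough comparison of the two pure strategies
n, trials = 10, 20000
for prefer_bigger in (True, False):
    avg = sum(rounds_to_complete(n, prefer_bigger) for _ in range(trials)) / trials
    print("bigger pile" if prefer_bigger else "smaller pile", avg)
```

A simulation like this only gives noisy averages, which is why the exact expectation calculations in the replies below are more decisive.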
- It seems to me that you should take the one from the deck from which you have the most collected, assuming the probabilities of drawing a given coupon are equal both within and between decks. E.g., suppose the coupons drawn from decks A and B are c_A and c_B, and suppose that c_A is the last one you have not yet collected from A. If you keep c_B and reject c_A, then the chance of getting a good draw next time from A is low since there would be only one acceptable draw. But if you accept c_A and reject c_B, you have a better chance of a good draw from B next time since there are multiple draws that would be good. That's if taking c_A finishes what you need from A. More generally, even if c_A does not complete your set from A, suppose that taking c_A gives a better probability of a good draw from somewhere next time, which can be computed using all the coupons' individual probabilities. Then I would speculate that you should take c_A. Proving this might be very hard, though, involving some dynamic programming. Loraof (talk) 20:40, 8 August 2017 (UTC)
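A small numeric illustration of this argument (the numbers are made up for the example): take n = 5 and suppose you currently hold 4 of the A coupons and 2 of the B coupons, and both offered coupons are new. The sketch below compares the chance of seeing at least one useful draw next round after each of the two possible choices.

```python
from fractions import Fraction

n = 5

def p_useful_next_round(missing_a, missing_b):
    """Chance that at least one of next round's two draws is still needed."""
    return 1 - (1 - Fraction(missing_a, n)) * (1 - Fraction(missing_b, n))

# currently missing 1 coupon from deck A and 3 from deck B, and both draws are new
print(p_useful_next_round(0, 3))   # after taking c_A (A is now complete): 3/5
print(p_useful_next_round(1, 2))   # after taking c_B instead:             13/25
```

So in this particular configuration, taking c_A leaves a better chance (3/5 versus 13/25) of a useful draw on the following round, in line with the reasoning above.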
- I would take from the deck that I have the most of. On your first draw you are presented with a choice: since you have no coupons, you need both. Which to keep? Toss a coin and go with that deck as your first choice (Deck A). The probability of your second draw producing two coupons you need is still very high, so I would continue to take from Deck A. I suspect, counter-intuitively perhaps, that it would not make any difference. 41.13.194.117 (talk) 06:53, 9 August 2017 (UTC)
- Notation: you start at (0, 0) and you are trying to get to (n, n). Assume that you always take a coupon if one is available. In case n = 2, regardless of what strategy you are using it requires an expectation of 3 steps to finish once you are at (2, 0) and an expectation of 10/3 steps to finish once you are at (1, 1). So, when you find yourself at (1, 0) with a choice to be made, it is always better to go to (2, 0) than to (1, 1). Similarly, when n = 3 it is always better to be at (3, 1) (expectation 9/2) than (2, 2) (expectation 24/5), and so from (2, 1) if given a choice you should go to (3, 1); it is always better to be at (3, 0) (e = 11/2) than (2, 1) (e >= 417/70); and it is always better to be at (2, 0) than (1, 1) (regardless of what later choices you make). In fact this is true also if your strategy is probabilistic (i.e., given a choice you flip a weighted coin). This suggests that "take from the bigger pile" really is better under all circumstances. --JBL (talk) 16:18, 9 August 2017 (UTC)
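The expectations quoted here can be reproduced exactly with a short dynamic program over the states (a, b). The sketch below is one way to set it up (the function names and the "ties go to deck A" convention are incidental choices), always taking from the fuller deck when both draws are new.

```python
from fractions import Fraction
from functools import lru_cache

def expected_steps(n):
    """Exact expected number of rounds to reach (n, n), taking from the
    fuller deck whenever both offered coupons are new (ties go to deck A)."""
    @lru_cache(maxsize=None)
    def E(a, b):
        if a == n and b == n:
            return Fraction(0)
        pA = Fraction(n - a, n)   # chance the deck-A draw is still needed
        pB = Fraction(n - b, n)   # chance the deck-B draw is still needed
        up_a = E(a + 1, b) if a < n else Fraction(0)   # term vanishes when pA == 0
        up_b = E(a, b + 1) if b < n else Fraction(0)   # term vanishes when pB == 0
        both = up_a if a >= b else up_b                # "take from the bigger pile"
        num = 1 + pA * pB * both + pA * (1 - pB) * up_a + (1 - pA) * pB * up_b
        return num / (1 - (1 - pA) * (1 - pB))
    return E

E2 = expected_steps(2)
print(E2(2, 0), E2(1, 1))                      # 3, 10/3
E3 = expected_steps(3)
print(E3(3, 1), E3(2, 2), E3(3, 0), E3(2, 1))  # 9/2, 24/5, 11/2, 417/70
```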
- This agrees with how I was thinking about the problem heuristically, in that you're always picking the most 'valuable' coupon. The argument here is that a coupon from a deck where you're missing 1 is more valuable than a coupon from a deck where you're missing 2, because you are twice as likely to get a coupon of the second type. In other words scarcity -> value -> take first. It seems difficult to make this rigorous though, and perhaps more experimental evidence is needed, say with decks of different sizes, to see if it's even valid. --RDBury (talk) 22:16, 9 August 2017 (UTC)
- Brute force computer computations for all sizes up to 40 confirm it. But my naive attempt to prove it inductively quickly ran into problems. --JBL (talk) 02:48, 10 August 2017 (UTC)
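A brute-force check along these lines can be done with the same recursion as above, replacing the fixed choice by a minimum and asserting that the fuller deck always attains it. The sketch below is a guess at how such a check might look (not the code actually used), verifying the claim for a range of small n with exact arithmetic.

```python
from fractions import Fraction

def check_bigger_pile_optimal(n):
    """Backward-induction value function V(a, b) = optimal expected remaining
    rounds; assert that whenever both draws are new and a != b, moving to the
    state that grows the bigger pile attains the minimum."""
    V = {(n, n): Fraction(0)}
    for total in range(2 * n - 1, -1, -1):      # states in decreasing a + b
        for a in range(min(n, total), max(0, total - n) - 1, -1):
            b = total - a
            pA, pB = Fraction(n - a, n), Fraction(n - b, n)
            va = V[(a + 1, b)] if a < n else None
            vb = V[(a, b + 1)] if b < n else None
            best = min(x for x in (va, vb) if x is not None)
            num = (1 + pA * pB * best
                   + pA * (1 - pB) * (va if va is not None else 0)
                   + (1 - pA) * pB * (vb if vb is not None else 0))
            V[(a, b)] = num / (1 - (1 - pA) * (1 - pB))
            if va is not None and vb is not None and a != b:
                assert (va if a > b else vb) == best, (n, a, b)
    return V[(0, 0)]

for n in range(2, 13):
    print(n, check_bigger_pile_optimal(n))   # optimal expected rounds from (0, 0)
```

Exact rational arithmetic avoids the spurious ties or near-misses that floating point could introduce in the assertion.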
- Always choosing the least likely one must be at least pretty close to optimal and I'd guess it is optimal. If the decks are of equal size and the first deck is always chosen when there is a choice, then the estimated time till the first deck is complete is nH_n (with H_n the n-th harmonic number), which will leave about nH_n - n chances for the second deck before the first is complete, so in total for both it takes something like nH_n + n rounds (in fact I think it is a bit less than that but haven't worked it out). If one deck has many more cards than the other then always choosing the larger deck when there is a choice should normally give enough gaps to fill the second deck before the first is filled.
- I get the feeling there is some nice way of getting this but can't see it. Dmcq (talk) 21:11, 10 August 2017 (UTC)
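For what the estimate above is worth numerically, here is a tiny sketch evaluating the harmonic-number quantities in it, reading the estimate as roughly nH_n rounds for the preferred deck, nH_n - n free chances for the other, and about nH_n + n rounds in total; the exact values from the dynamic program further up can be used for comparison.

```python
def harmonic(n):
    return sum(1.0 / k for k in range(1, n + 1))

for n in (5, 10, 20, 40):
    first_deck = n * harmonic(n)     # estimated rounds until the preferred deck is done
    free_chances = first_deck - n    # rounds in that span where only the other deck's coupon can help
    print(n, round(first_deck, 2), round(free_chances, 2), round(first_deck + n, 2))
```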