The Bernoulli numbers are recursive; you can also work them out through Ramanujan's congruence relations, using the even zeta values. The zeta function is connected with cot(x) through the complex numbers, and an identity makes the power series for tan(x) AND csc(x) appear in closed form. Here is the formula I proved, if you want to try tan(x): SUM from j=1 to j=Q of 2*(2^(2*j) - 1)*Zeta(2*j)*x^(2*j-1)/Pi^(2*j), where Q is the number of terms and the j-th summand is the j-th term of the expansion. Example: what is the 4th term of the tan(x) power series expansion, WITHOUT computing the terms before it? 2*(2^(2*4) - 1)/pi^8 * x^7 * Zeta(8) => 17/315 * x^7, as you can easily check by long division. Zeta(8) = pi^8/9450 (Fourier series and the identity of Parseval, or just ask Euler, who computed all even zeta values up to Zeta(26) by hand with polynomials...)
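The 17/315 coefficient above can be sanity-checked with exact rational arithmetic: since zeta(8) = pi^8/9450, the powers of pi cancel out of the claimed formula entirely (a quick sketch, not part of the original comment):

```python
from fractions import Fraction

# zeta(8) = pi^8 / 9450, so zeta(8)/pi^8 is the exact rational 1/9450
# and the j-th summand 2*(2^(2j)-1)*zeta(2j)/pi^(2j) * x^(2j-1) is rational too.
j = 4
zeta8_over_pi8 = Fraction(1, 9450)
coeff = 2 * (2**(2 * j) - 1) * zeta8_over_pi8  # coefficient of x^7
print(coeff)  # 17/315, matching the known tan(x) series term
```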
Please also make a video about the Stirling numbers in permutations and combinations, or the various ways we can distribute objects to people. It would be a great help. Thanks
2 questions: 1) Why is the probability a "constant"? The approximation to e gets better as n gets larger, so the probability keeps changing with n, approaching 1/e but never truly a constant? 2) How does this relate to the secretary problem discussed in the "choosing the best toilet" video? That is, you should reject the first 100/e% of applicants, then choose the next one that is better than all the previous, and you'll have a 100/e% chance of having the best one overall?
A great video! It's a nice throwback to Dr. Grime's video on the topic of e, and, lo and behold, there is some actual maths! Cannot wait for the Numberphile2 part. Is !n = n! - sum(k=1 -> n)[nCk * !(n-k)]? Curious to see the link to e.
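The identity asked about does hold (it just says every permutation has some set of k >= 1 fixed points with the rest deranged, except the derangements themselves); a small sketch checking it against the standard subfactorial recurrence, with `subfactorial` a hypothetical helper:

```python
from math import comb, factorial

def subfactorial(n):
    # standard recurrence: !n = (n-1) * (!(n-1) + !(n-2)), with !0 = 1, !1 = 0
    if n == 0:
        return 1
    if n == 1:
        return 0
    return (n - 1) * (subfactorial(n - 1) + subfactorial(n - 2))

for n in range(0, 10):
    rhs = factorial(n) - sum(comb(n, k) * subfactorial(n - k) for k in range(1, n + 1))
    assert rhs == subfactorial(n)
print("identity holds for n = 0..9")
```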
I recently needed to solve a question which seems to relate to a subset of derangements: those which contain no rotations of non-derangements (i.e. for a seed 1234, do not include 2341, 3412, 4123). The problem presented itself as a sort of virtual pass-the-parcel with 6 players and 6 parcels, where the goal is that no player passes to the same player twice (and, correspondingly, no player takes a parcel from the same player twice). A solution I found involved simply stepping by increasing factors, and mirroring, like so:
123456
246135
365214
412563
531642
654321
I've been looking at what happens when you sum mutual digit derangements. For example, 472 and 724 are derangements of 247, and of each other. Any other permutation, such as 274, has at least one digit (here the 2) in the same position. The sum always seems to be a multiple of 37: 247 + 472 + 724 = 1443 = 37*39. I don't think that's got anything to do with 1/e, but with what you get when you factorize repunits of varying size. 111 = 3*37, and 1111 = 11*101, so the sum of, say, 2473 and its mutual derangements 3247, 7324, 4732 is 17776 = 11*101*16. For 5-digit derangement sets the sum is a multiple of 41 and 271.
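A quick check of this divisibility claim, taking 472 and 724 as the derangements of 247 (a sketch; `mutually_deranged` is a hypothetical helper verifying that every pair in the set differs in every digit position, which is what forces each column to contribute to a repunit multiple):

```python
def mutually_deranged(nums):
    # every pair of numbers must differ in every digit position
    strs = [str(n) for n in nums]
    return all(a[i] != b[i]
               for a in strs for b in strs if a != b
               for i in range(len(a)))

three = [247, 472, 724]
four = [2473, 3247, 7324, 4732]
assert mutually_deranged(three) and mutually_deranged(four)
print(sum(three), sum(three) % 37)    # multiple of 37 (111 = 3*37)
print(sum(four), sum(four) % 1111)    # multiple of 1111 = 11*101
```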
The answer for the 7:18 question (treating each hat draw as independent, which is an approximation): The probability that a person gets his own hat among "n" hats is 1/n. The probability that a person doesn't get his own hat is (n-1)/n. The probability that "k" particular persons get their own hats is (1/n)^k, and the probability that the rest of them don't get their own hats is ((n-1)/n)^(n-k). Then the probability that those "k" persons get their own hats and the others don't is (1/n)^k * ((n-1)/n)^(n-k). The number of k-combinations of a set with "n" elements is n!/(k!(n-k)!). So the probability that exactly "k" persons get their own hats is approximately (1/n)^k * ((n-1)/n)^(n-k) * n!/(k!(n-k)!) Q.E.D.
It's actually not much more difficult to answer the question at the end. There are approximately n!/(e k!) permutations fixing exactly k positions. You can find this from the same method as used to find the original approximation, or just apply the original estimate (for example, note that for a given set of fixed positions there are ~(n-k)!/e derangements of the remaining, and there are n choose k ways to choose the fixed positions; the product of these is the above estimate). I'm not a fan of people saying this or that math is hard or complicated; it's very hard to make such a claim objective, and it just biases the hearer against the question. It's better to say "I don't know" than to say it's hard.
I am reminded of "manotazo", where players take turns laying down cards and have to slap the pile when the card laid down coincides with the name of the card. The names are spoken in order during the turns, from Ace to King and then Ace to King again. The last hand on top of the pile takes the pile. You notice derangements quite often during play.
There is a simpler (heuristic) argument: with n cards, in order to lose, the first card has to be anything but 1, which has probability (n-1)/n. The second card has to be anything but 2, and so on, for all n cards. Treating these as independent, the probability is ((n-1)/n)^n = (1 - 1/n)^n, and since (1 + x/n)^n -> e^x, the probability of losing approaches e^-1.
So I worked on a very similar problem in school. The idea is that you've got a list of questions and answers on a test that you need to match up - maybe a list of definitions to match to words from a word bank. The question I asked myself was, "If I were to guess totally randomly, how many can I expect to get right?" It turns out the answer is exactly 1, regardless of how many questions there are, provided that you give a unique answer to each of them. This, of course, is no better than if you simply guessed the same answer on all of them. Approximately 37% of all cases will yield no correct answers, and only 1 in n! of all cases will yield all correct answers, but it averages out at exactly 1 correct answer, given any number of questions.

One other thing I wanted to point out is that James was not completely clear at the beginning when he said that it's always 63%. In truth, the probability approaches 1-1/e as the number of cards approaches infinity. For cases with 4 or fewer cards, it still approaches this value, but the result is off by enough that it doesn't look like the same number:
1 card is 100%
2 cards is 50%
3 cards is 66.666...%
4 cards is 62.5%
An interesting point to note here is that all odd amounts have a probability over 1-1/e, and all even amounts have a probability under it.

I also found this value 1-1/e elsewhere in probability: if something has a 1/n chance of happening on each iteration, and you perform that iteration n times, what's the probability that it occurs at least once? For example, what's the probability of rolling a 6 on a standard 6-sided die if you roll it 6 times? The answer here is also about 1-1/e, or roughly 63%. Here too the percentage approaches 1-1/e as n approaches infinity, but it approaches it more slowly, and always from above, rather than alternating between above and below as in the matching problem.
So really, the probability is always over 1-1/e.
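The "exactly 1 expected match" claim above is easy to confirm by simulation (a sketch assuming Python's `random.shuffle` as the source of uniform random permutations; by linearity of expectation each of n positions matches with probability 1/n, so the mean is 1 for any n):

```python
import random

random.seed(1)
n, trials = 20, 100_000
total = 0
for _ in range(trials):
    perm = list(range(n))
    random.shuffle(perm)
    total += sum(1 for i, v in enumerate(perm) if i == v)  # count fixed points
print(total / trials)  # close to 1.0, regardless of n
```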
So I watched 8 minutes to see how he calculates the number of derangements, because that is the hard part, but then he explains everything except that and just throws !n = n!/e at us with no explanation.
Very nice. But what about the probability of getting 1 match, 2 matches, etc.? I know you said it is harder, but it would be interesting to know. Or any simple references to an answer? Btw, loved the addresses you chose for the envelopes!
A similar but harder question. Two packs of cards are shuffled together. What is the probability that two identical cards will lie adjacent to each other? Would love to see a solution.
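No closed form offered here, but a Monte Carlo sketch suggests the answer; interestingly, the expected number of adjacent identical pairs works out to exactly 1 (52 pairs, each adjacent with probability 103/C(104,2) = 103/5356), so a Poisson heuristic predicts roughly 1 - 1/e yet again:

```python
import random

random.seed(0)
deck = list(range(52)) * 2  # two packs; equal ints mark identical cards
trials, hits = 20_000, 0
for _ in range(trials):
    random.shuffle(deck)
    # did any two identical cards end up adjacent?
    if any(deck[i] == deck[i + 1] for i in range(len(deck) - 1)):
        hits += 1
print(hits / trials)  # empirically lands near 1 - 1/e ~= 0.63
```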
I don't think the question for exactly k fixed points is much harder; the total number of permutations of n elements having exactly k fixed points should be (n choose k) * !(n-k), which is about n!/k! * 1/e for n large compared to k.
If you want to know the probability of 2 results, 3, etc.. wouldn't it be valid to just multiply that 63% chance N times? I.e., for two coincidental matching positions, .63 * .63 = .3969 or 39.69%.. This should be valid in my mind because if you look at the new set of items to derange as a new problem, there's a 63% chance again (unless you're at
Let me try to clear things up. The probability is technically never exactly .63 but rather just approaches 1-1/e which is approximately .632 and when he says that it is 63% at 4 or above he means that, to the nearest percent, it is 63%. You must use a special formula for binomial probability to find the probability for an exact number of times. I have a video on this on my channel called the "Probability Challenge" that explains this better.
I tried to tackle this question: what is the probability of winning if the win condition is that at least two cards are in the correct position? We see in the video that for at least 1 card in the correct position, the probability is 1-(1/e). The number of ways to arrange n things so that exactly 1 item is in its correct position is the number of ways to fix a single point, which is n, multiplied by the number of derangements of the other n-1 points, which is !(n-1). n * !(n-1) = n * (n-1)!/e, or n!/e. So the probability of two or more fixed points would be 1 - (P of 0 fixed points) - (P of 1 fixed point) = 1 - ((n!/e)/n!) - ((n!/e)/n!) = 1-(2/e). If this is not correct, can someone point out where I went wrong?
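This checks out numerically; a sketch comparing the exact probability for n = 10 against the 1 - 2/e estimate (using the standard subfactorial recurrence as a hypothetical helper):

```python
from math import factorial, e

def subfactorial(n):
    if n < 2:
        return 1 - n  # !0 = 1, !1 = 0
    return (n - 1) * (subfactorial(n - 1) + subfactorial(n - 2))

n = 10
p0 = subfactorial(n) / factorial(n)          # no fixed points
p1 = n * subfactorial(n - 1) / factorial(n)  # exactly one fixed point
print(1 - p0 - p1, 1 - 2 / e)  # both are approximately 0.2642
```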
I felt so smart right until you said, "Divide by e," and then I felt super excited and confused. Surprise "e" indeed! I wish I could have seen that coming.
My thought after some calculations with low card numbers was that it would approach some particular probability with increasing card numbers. I was surprised when the video said just how rapid it is. 1 card: 100% 2 cards: 50% (1/2) 3 cards: 66% (4/6) 4 cards: 62.5% (15/24) 5 cards: 63.3% (76/120, I assume but have not confirmed) It is not that the probability is exactly 1-1/e. It is that it is as close as you can get given the finite number of permutations for any given card count.
Interestingly, the same 63/37 % applies for the decay in moving average CPU load in UNIX systems calculation (i.e. a 5-minute average CPU load contains 37% of the past and 63% of the past 5 minutes) - according to Wikipedia anyway.
I did similar math that came up with a similar result a while ago. I was doing shiny-hunting in Pokemon by breeding, and figuring out what my chances are. The raw chances are 1/512, so I asked "What's the chance I'd get one within 512 eggs?" Turns out it was the same number, around 64%. So I wondered how much that changed if I played with the numbers. I found out a limit this way, and stumbled across e. Turns out, according to Wolfram|Alpha, the problem spits out like this: 1-((x-1)/x)^x→(e-1)/e, as x→∞. That's your chance of rolling a natural 20 somewhere in 20 d20 rolls, your chance of rolling a natural 100 in 100 d100 rolls. Amazing where e pops up.
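A quick numeric sketch of the same limit described above (Python assumed), including the 1/512 shiny-hunting case:

```python
from math import e

# chance of at least one success in x tries at probability 1/x each
for x in (6, 20, 100, 512):
    p = 1 - ((x - 1) / x) ** x
    print(x, round(p, 4))
print(round(1 - 1 / e, 4))  # the limit (e-1)/e ~= 0.6321
```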
@Numberphile Where's the error in this? The chance of not getting a single match: 9/10 * 8/9 * 7/8 * 6/7 * 5/6 * 4/5 * 3/4 * 2/3 * 1/2 = 0.1, so the chance of hitting at least one match is 1 - 0.1 = 0.9.
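The error is that those fractions treat each flip as if the target card were guaranteed to still be among the undrawn cards; the true conditional probabilities don't telescope that way. A simulation sketch of the actual game shows the no-match probability is near !10/10! ≈ 0.368, not 0.1:

```python
import random

random.seed(42)
n, trials, no_match = 10, 100_000, 0
for _ in range(trials):
    deck = list(range(n))
    random.shuffle(deck)
    # a derangement: no card lands in its own position
    if all(pos != card for pos, card in enumerate(deck)):
        no_match += 1
print(no_match / trials)  # near 0.368 (= !10/10!), not 0.1
```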
I sometimes play a game with a full deck of 52 cards. I would flip one card at a time, saying ace, 2, 3, 4, 5, and so on. When I get to King, I would start over at Ace again. What is the probability of getting through the entire deck without matching?
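A Monte Carlo sketch of this game (4 copies of each of 13 ranks). With 52 flips each matching the called rank with marginal probability 1/13, the expected number of matches is 4, so a Poisson heuristic suggests a survival chance near e^-4 ≈ 0.018; the simulated value comes out a little lower:

```python
import random

random.seed(7)
deck = [r for r in range(13) for _ in range(4)]  # four cards of each rank
trials, survived = 50_000, 0
for _ in range(trials):
    random.shuffle(deck)
    # call "ace, 2, ..., king" repeatedly while flipping; any rank match loses
    if all(deck[i] != i % 13 for i in range(52)):
        survived += 1
print(survived / trials)  # roughly 0.016, in the vicinity of e**-4 ~= 0.018
```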
why 4? the exact probability is: 1-((n-1)/n)^n so if there are 5 cards it's around 67% even with 10 cards its 65% you need 65 cards for the probability to be 63%
Extra footage from this interview: ruclips.net/video/qYAWjIVY7Zw/видео.html
I absolutely LOVE when you film with James!
So here is what I have done, which seems much easier to comprehend:
Let's start with 10 cards. Every time you draw a card you have a 10% chance to win and a 90% chance to miss. You might think that the probability changes whenever you reveal the next card, but this is not true, since the decreasing number of cards still to reveal makes up for the increasing chance that the needed card is already on the table. For example, when you reveal the 4th card, the chance of hitting the 4 of hearts is:
(probability the 4 is already on the table)(chance of getting the 4 in this case) + (probability the 4 is not already on the table)(chance of getting the 4 in this case)
= (1 - (9/10 * 8/9 * 7/8)) * 0 + (9/10 * 8/9 * 7/8) * 1/7 = 0 + 1/10 = 10%
Therefore the chance of losing 10 times in a row is 0.9^10 = 34.9%.
My result was not that far from James's, but there was still a gap. For 100 cards I calculated 0.99^100 = 36.6%, which was even closer to his result but still not the same. But now that I have seen the extra footage, I know that my results are correct, and for a large number of cards we even get the same answer. What I have done is basically (1 - 1/n)^n, where n is the number of cards, and lim n->infinity (1 - 1/n)^n = 1/e.
Since James always gives the answer 1/e, he has a large error for small n, which decreases with increasing n. But to answer his question: more cards mean a higher probability of losing, but never higher than 1/e.
soeinkrankerscheiss Actually, there is a slight error in your solution. For instance, check the case n=2: it should be 1/2, but your formula gives 1/4. You can't just multiply the probabilities of the nth card not being in the nth place, since they aren't independent of each other. In the case n=2, the 1st card not being in the first place ensures that the 2nd card is in the first place instead.
The video itself specified only values of n above 4. n=2 does not apply.
soeinkrankerscheiss's solution applies to all positive integers. Likewise 민경효's point applies to all positive integers n as well.
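The gap between the two approaches in this thread can be tabulated: the exact probability of no match is !n/n!, versus the independence approximation (1 - 1/n)^n (a sketch, with an iterative subfactorial helper so large n is cheap):

```python
from math import factorial

def subfactorial(n):
    # iterative form of !n = (n-1) * (!(n-1) + !(n-2))
    a, b = 1, 0  # !0, !1
    for k in range(2, n + 1):
        a, b = b, (k - 1) * (a + b)
    return a if n == 0 else b

for n in (2, 3, 4, 10, 100):
    exact = subfactorial(n) / factorial(n)   # true no-match probability
    indep = (1 - 1 / n) ** n                 # independence approximation
    print(n, round(exact, 4), round(indep, 4))
```

For n=2 this prints 0.5 versus 0.25, exactly the discrepancy pointed out in the reply, and both columns close in on 1/e as n grows.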
Accidental Derangement would make a great name for a band...
Marc Raccioppo +
Yes!
Imma write that down, if you don't mind
All the other professors (and hosts) are really really great (seriously they're awesome) but Dr. James Grime is my absolute favorite.
The correct title for this video is “Parker permutations”
Was looking for this :D
You win.
Did you use word document?
Those left/right quotation marks are strange indeed...
Parker quotations
Need much more 'surprise e' in my life
I definitely find these videos interesting, but the main reason I watch them is to see the mathematicians get really, really excited. It's so fun.
I always knew he was a tiny person
but did you know he's especially unlucky?
He does look like Bilbo Baggins.
For some reason, I pictured Dr. James as being taller than me (I'm only 5'10"). I guess big brains makes me picture a giant.😄
TINY HANDS!!?? JAMES IS TRUMP!!!!!!
We did this in class today. If only this video released yesterday...
Lelouch Yagami watch the video tomorrow, then the video would have been released yesterday...
Well aren't you a helpful chap
What is today but yesterday's tomorrow?
What is tomorrow but today's destination?
isn't it summer break right now?
(Well it is where I live)
e is like a Spanish Inquisition
Lucas Preti Nobody expects the Spanish inquisition (first heard of it from MikeBurnFire :3)
Lucas Preti Oh no: Not the comfy chair!
Our 2.718281828... chief weapons are...
I'm past being surprised when e comes up. It's so unpredictable that it's now predictable. When in doubt, use e.
My guess for the thing in the beginning:
1: Rephrase question - if you can choose freely between all the cards not used, then put it down in a random position (where there hasn't yet been a chance), what is the chance that you'll get one in the right position?
2: For any N cards you have a 1/N chance of getting it correct on the first draw; otherwise go to step 3 with x being 1
3: If it's not correct, choose the card that should have been in that slot, and put it in a random position, this gives you a 1/(N-x) chance of being back at step 1 with the new N being (x+1) less than the original
4: If you didn't go back to step 1, repeat 3 with x one higher than last time, until you run out of cards.
We can split this into two states, having n cards and a chance to win, or having n cards and a chance to close a loop:
Win:
1: 1/1 chance to win
2: 1/2 chance to win
3: 1/3 chance to win, 2/3 chance to go to Close:2 - 1/3 + 2/3 * 1/2 = 2/3
4: 1/4 chance to win, 3/4 chance to go to Close:3 - 1/4 + 3/4 * 1/2 = 5/8
5: 1/5 chance to win, 4/5 chance to go to Close:4 - 1/5 + 4/5 * 13/24 = 19/30
(....)
Close:
2: 1/2 chance to go to Win:1, 1/2 chance to lose
3: 1/3 chance to go to Win:2, 2/3 chance to go to Close:2 - 1/3 * 1/2 + 2/3 * 1/2 = 1/2
4: 1/4 chance to go to Win:3, 3/4 chance to go to Close:3 - 1/4 * 2/3 + 3/4 * 1/2 = 13/24
5: 1/5 chance to go to Win:4, 4/5 chance to go to Close:4 - 1/5 * 5/8 + 4/5 * 13/24 = 67/120
(...)
Since both Close:2 and Win:2 have a 50% chance of winning, and all streaks of loss will eventually go to one of those two, we know for certain that there's at least 50% chance of winning with any amount of cards larger than 0.
However, continuing like this is boring, and prone to error (I probably did something wrong in all that, actually), so let's look for abstractions.
We could consider a game where you start with a stick, and a number N. There's a 1/N chance of you picking up a stick, and an (N-1)/N chance of losing a stick, if you have one. If you manage to get two sticks, you win.
This means that you have to get sticks twice in a row, except on the start, meaning we can abstract having 0 sticks to:
1/(N*(N-1)) chance of winning => 1/(N^2 -N)
(N-1)/N chance of subtracting one from N => 1-1/N
(N-2)/(N*(N-1)) chance of subtracting two from N => 1/(N-1) - 2/(N^2 - N)
Then we just start from 10 and add downwards to get the chance of having a specific amount on N:
10: 1/1
9: 9/10 (from 10)
8: 4/45 (from 10) + 9/10 * 8/9 (from 9) = 8/9
7: 9/10 * 7/72 + 8/9 * 7/8 = 623/720
6: 8/9 * 3/28 + 623/720 * 6/7 = 703/840
5: 623/720 * 5/42 + 703/840 * 5/6 = 4841/6048
4: 703/840 * 2/15 + 4841/6048 * 4/5 = 28423/37800
3: 4841/6048 * 3/20 + 28423/37800 * 3/4 = 137897/201600
2: 28423/37800 * 1/6 + 137897/201600 * 2/3 = 527383/907200
1: 137897/201600 * 1/6 + 527383/907200 * 1/2 = 1468457/3628800
Since 1 is the only loss-state, we know that there's a 1-1468457/3628800 = 2160343/3628800 ≃ 0.595332616843 chance of winning, unless I did something wrong.
I have no idea what a generic method would be though.
EDIT: Okay, turns out I was wrong.
EDIT 2: NVM, I was right, I just forgot that I was supposed to start with 1 stick, not 0: 2160343/3628800 * 10/11 + 1/11 = 2523223/3991680 = 0.6321205607664 (though that's for 11 cards)
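The final number above matches the exact derangement answer for 11 cards, 1 - !11/11!; a sketch confirming it with exact fractions (subfactorial helper assumed as before):

```python
from fractions import Fraction
from math import factorial

def subfactorial(n):
    # iterative !n = (n-1) * (!(n-1) + !(n-2)), with !0 = 1, !1 = 0
    a, b = 1, 0
    for k in range(2, n + 1):
        a, b = b, (k - 1) * (a + b)
    return a if n == 0 else b

p_win_11 = 1 - Fraction(subfactorial(11), factorial(11))
print(p_win_11, float(p_win_11))  # 2523223/3991680 ~= 0.6321205607
```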
The probability of getting a derangement with n cards is ((n-1)/n)^(n-1), which approaches 1/e. The probability for n=10 is actually 38.7%, NOT the 36.8% which is 1/e.
The man, the myth, the legend is back. Best videos when you're in them
Only watching because of James Grime
ZetDude Gaming same
I pondered this during my years of watching auto racing, it seemed that every race, at least one of the cars finished in the same position it started in. Never worked out the probability though, but it didn't come as a surprise that it's genuinely over 50%.
James Grimes + Mathematics has just taken the No 1 spot in my all time list of best things I have ever watched.
Best explanation so far on derangements. Thank you for making such a wonderful video.
What do you mean old-fashioned? Don't you check your hat at the theatre? What do you do with it? How do the people behind you see the stage?
Fraser McFadyen you get your Hat off...or don't wear one at all
Just watched an ad on the derangements video! Glad we got that sorted. Tims rejoyce.
Fascinating video! Thank you! Please do more probability videos.
I loved this explanation for Derangements by Jamie!
Yay! James Grime!!!
I might be wrong, but if instead of a discrete deck of cards you had a "continuous" deck with infinite cards (each having a number between 0 and 1, for example) then you'd always get a fixed point :D
Edit: As Donald pointed out, that's wrong since the 'shuffle' function isn't continuous.
Well, given n cards we also have at least n-1 ways of placing each card somewhere else, just covering the most simple (de)arrangement. And if n is infinity or not doesn't really matter... In most calculations where you put infinity in, the answer is "An infinite amount". But that doesn't mean there has to be a fixed point
What you have is a bijection f(x) from [0,1] to itself; you want x such that f(x) = x. Continuous means a tiny change in x causes a tiny change in f(x). If the endpoints are not included, then f(x) = x^2 has no fixed point and is continuous. If the endpoints are included, then all continuous functions have fixed points. However, this does not hold for non-continuous functions.
I suspect (but could be completely wrong) that the reason this is approximately 1/e instead of exactly 1/e is because it only reaches 1/e at infinity. In other words 5 cards would be close but not quite 1/e. 6 cards would be even closer but still not quite and so on until infinity you get true 1/e. Can anyone confirm this?
Isn't this Banach's Fixed Point theorem?
+wingracer 16 Look at the follow up video. He calculates the exact probability for n cards. 1/e shows up only at the limit.
(1 - 1/n)^n -> 1/e, as n -> infinity
For an easy example, try 0.9^10 and 0.99^100. Converges toward 0.368.
Thank you very much, I finally understand fixed points. I was scared that my time was wasted because I didn't see the term fixed point in the title or description of the video. It wasn't clickbait after all, thank you haha. :)
!n = floor(n!/e + 1/2). No need to say that it's approximately n!/e. That works for all non-negative integers except !0, which is 1 while the formula gives 0.
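A sketch verifying this rounding formula against the recurrence !n = (n-1)(!(n-1) + !(n-2)); the only escape is n = 0, where the formula yields 0 but !0 = 1:

```python
from math import factorial, floor, e

def subfactorial(n):
    a, b = 1, 0  # !0, !1
    for k in range(2, n + 1):
        a, b = b, (k - 1) * (a + b)
    return a if n == 0 else b

for n in range(0, 15):
    closed = floor(factorial(n) / e + 0.5)
    assert closed == subfactorial(n) or n == 0  # only !0 escapes the formula
print("floor(n!/e + 1/2) matches !n for n >= 1")
```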
When the number of cards is big, the probability of getting k matches follows a Poisson distribution with lambda equal to 1, which yields 1/e for 0 or 1 and 1/(2e) for two.
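A sketch comparing the exact probability of k fixed points, C(n,k)·!(n-k)/n!, with the Poisson(1) pmf e^-1/k! for n = 50 (subfactorial helper assumed):

```python
from math import comb, factorial, e

def subfactorial(n):
    a, b = 1, 0  # !0, !1
    for k in range(2, n + 1):
        a, b = b, (k - 1) * (a + b)
    return a if n == 0 else b

n = 50
for k in range(4):
    p_exact = comb(n, k) * subfactorial(n - k) / factorial(n)
    p_poisson = e**-1 / factorial(k)  # Poisson pmf with lambda = 1
    print(k, round(p_exact, 6), round(p_poisson, 6))  # columns agree closely
```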
The chance of getting at least one right approaches 1/e for n going to infinity. The formula for calculating the chance for any number n can be derived from first principles.
Finding out how likely it is for m out of n to match up should be rather straightforward, as long as n - m > 3. After all, if you have at least m-1 matching up, then the chance of m matching up should just be the same problem as 1 matching up out of the n-m+1 subset, so you can just concatenate to (1-1/e)^m. If you want to exclude any more than m matching up, you just multiply by 1/e again.
1:16
"37.....Oh that number again"
Veritasium sound in the background
You guys never fail to amaze me. Keep it up!
A new video!!! and it's from my boy grime!!!! IM SO HAPPY
This video should also have covered arrangements, which count the number of ways you can arrange n distinct objects and are essentially the inverse of derangements. The calculation is the same as for derangements, except that instead of alternating between adding and subtracting, you just add. For example, for five objects:
Number of derangements = 5! - 5(4!) + 10(3!) - 10(2!) + 5(1!) - 0! = 120 - 120 + 60 - 20 + 5 - 1 = 44
Number of arrangements = 5! + 5(4!) + 10(3!) + 10(2!) + 5(1!) + 0! = 120 + 120 + 60 + 20 + 5 + 1 = 326
The interesting thing about this is that the ratio between the number of arrangements (326) and permutations (120) is roughly the same as the ratio between the number of permutations and derangements. They both converge to e (~2.718).
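The two sums above can be sketched in a few lines (assuming the comment's definition of "arrangements" as the all-plus version of the derangement series):

```python
# Derangements: alternating binomial sum; "arrangements": same sum, all plus.
import math

def derangements(n):
    return sum((-1) ** k * math.comb(n, k) * math.factorial(n - k)
               for k in range(n + 1))

def arrangements(n):
    return sum(math.comb(n, k) * math.factorial(n - k)
               for k in range(n + 1))

print(derangements(5), arrangements(5))  # 44 326
# Both ratios approach e:
print(arrangements(10) / math.factorial(10),
      math.factorial(10) / derangements(10))
```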
I actually finished discrete mathematics this semester, and we solved this in one of the lectures. Suddenly I feel so competent, knowing what you're talking about.
If the formula for the probability of a derangement is round(n!/e)/n!, then the formula still applies at n=2, since 2/e = 0.735758882, so round(2/e)/2 = 1/2. And at n=1, round(1/e)/1 = 0; you can never get a derangement with 1 card, so the formula works for any natural number. Also, as n increases toward infinity, the result approaches 1/e, despite never being exactly 1/e.
I thought he would mention this: when you divide n! by e, you get a number between two integers. The closer one is the number of derangements; the other is the number of permutations with exactly one fixed point. For example, 4!/e is about 8.83: 9 derangements and 8 permutations with one fixed point. 5!/e gives you 44.145: 44 derangements and 45 permutations with one fixed point. For n even, the number of derangements is the larger; for n odd, it is the smaller.
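A quick numeric check of that observation (a sketch; `subfact` uses the alternating series):

```python
# n!/e always lands between !n and n * !(n-1), the number of
# permutations with exactly one fixed point.
import math

def subfact(n):
    return sum((-1) ** k * math.comb(n, k) * math.factorial(n - k)
               for k in range(n + 1))

for n in range(2, 9):
    d, one = subfact(n), n * subfact(n - 1)
    x = math.factorial(n) / math.e
    assert min(d, one) < x < max(d, one)
    print(n, d, one, round(x, 2))
```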
Interesting how both transcendental numbers can be expressed as a ratio of two quantities:
tau : Circumference/radius
e : n!/!n, n->infinity
YAY new video from Dr. James Grime
I absolutely love these videos
thanks
The Bernoulli numbers are recursive; you can also work them out through Ramanujan's congruence relations.
Use the even zeta values. The zeta function is connected with cot(x) through the complex numbers.
Using an identity, this makes the power series for tan(x) AND csc(x) appear in closed form.
Here is the formula I proved, if you want to try it for tan(x):
SUM from j=1 to j=Q of 2*(2^(2*j)-1)*Zeta(2*j)*x^(2*j-1)/Pi^(2*j), where Q is the number of terms of the expansion.
Example: what is the 4th term of the tan(x) power series expansion, WITHOUT computing the terms before it?
2*(2^(2*4)-1)/pi^8 * zeta(8) * x^7 = 17/315 * x^7, as you can easily check by long division.
Zeta(8) = pi^8/9450 (Fourier series and Parseval's identity, or just ask Euler, who computed all the even zeta values up to Zeta(26) by hand with polynomials...)
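Since Zeta(8)/pi^8 = 1/9450 exactly, the pi's cancel and the j = 4 coefficient can be checked with exact rational arithmetic (a sketch of the claim above):

```python
# Verify the x^7 coefficient of tan(x): 2*(2^(2j)-1)*Zeta(2j)/pi^(2j) at j=4.
from fractions import Fraction

j = 4
zeta_over_pi = Fraction(1, 9450)  # Zeta(8) / pi^8, exactly
coeff = 2 * (2 ** (2 * j) - 1) * zeta_over_pi
print(coeff)  # 17/315
```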
Please also make a video about the Stirling numbers in permutations and combinations, or the various ways we can distribute objects to people. It would be a great help. Thanks.
That was soooo amazing!!!!!
Dr Grime is BACK! Yesssssss
2 questions:
1). Why is the probability a "constant"? The approximation of e gets better as n gets larger, and thus the probability changes as n increases, approaching 1/e, but it is not truly a constant?
2). How does this formulation of the secretary problem relate to the secretary problem discussed in the "choosing the best toilet video"? That is, you should reject the first 100/e% of applicants then choose the next one that is better than all the previous and you'll have a 100/e% chance of having the best one overall?
A great video! It's a nice throwback to Dr. Grime's video on the topic of e, and, lo and behold, there is some actual maths! Can't wait for the Numberphile2 part. Is !n = n! - sum(k=1 -> n)[nCk * !(n-k)]? Curious to see the link to e.
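That guessed recurrence checks out numerically; rearranged, it is the standard identity n! = sum over k of C(n,k) * !(n-k). A small sketch:

```python
# !n = n! - sum_{k=1..n} C(n,k) * !(n-k), with base case !0 = 1.
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def subfact(n):
    if n == 0:
        return 1
    return math.factorial(n) - sum(math.comb(n, k) * subfact(n - k)
                                   for k in range(1, n + 1))

print([subfact(n) for n in range(8)])  # [1, 0, 1, 2, 9, 44, 265, 1854]
```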
I'm a simple man, I see a video with Dr. Grimes, I drop everything to watch it.
2:04 Right before he said "What if it's 2 cards?" I had it in my mind... Cool!
"We're gonna get deranged, let's do that."
Okay?
I recently needed to solve a question which seems to relate to a subset of derangements that contains no rotations of the seed (i.e. for a seed 1234, do not include 2341, 3412, 4123).
The problem presented itself as a sort of virtual pass-the-parcel with 6 players and 6 parcels, where the goal is that no player passes to the same player twice (and, correspondingly, no player receives a parcel from the same player twice).
A solution I found involved simply stepping by increasing factors, and mirroring, like so:
123456
246135
365214
412563
531642
654321
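One way to sanity-check that schedule: if every column is also a permutation of 1..6, then no value repeats in any position, so no player passes to the same player twice. A minimal sketch:

```python
# Each column of the schedule is itself a permutation of 1..6, so no
# position ever repeats a value: no player passes to the same player twice.
rounds = ["123456", "246135", "365214", "412563", "531642", "654321"]

for col in range(6):
    assert {row[col] for row in rounds} == set("123456")
print("valid schedule: no repeated pass in any position")
```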
I've been looking at what happens when you sum mutual digit derangements. For example, 247, 472, and 724 are derangements of each other. Any other permutation, such as 742, has at least one digit (here the 4) in the same position. The sum always seems to be a multiple of 37. I don't think that has anything to do with 1/e, but rather with what you get when you factorize repunits of varying size: 111 = 3*37, and 1111 = 11*101, so the sum of 2473 and its derangements 3247, 7324, 4732 is 17776 = 11*101*16. For 5-digit derangement sets, the sum is a multiple of 41 and 271.
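For distinct digits, the mutually deranged orderings of a 3-digit string are exactly its cyclic rotations, which is why the sum is (digit sum) times 111 and hence divisible by 37. A short sketch of the check:

```python
# Sum a number with its mutual digit derangements and factor-check it.
from itertools import permutations

def derangements_of(s):
    """Orderings of s's digits with no digit left in its original place."""
    return [''.join(p) for p in permutations(s)
            if all(a != b for a, b in zip(p, s))]

group = ["247"] + derangements_of("247")
print(group, sum(map(int, group)))  # ['247', '472', '724'] 1443
assert sum(map(int, group)) % 37 == 0  # 1443 = 13 * 111 = 13 * 3 * 37
```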
The answer to the question at 7:18:
The probability that a given person gets their own hat among n hats is:
1/n
The probability that a person does not get their own hat is:
(n-1)/n
The probability that k given persons all get their own hats is:
(1/n)^k
The probability that the remaining n-k persons do not get their own hats is:
((n-1)/n)^(n-k)
So the probability that those k persons get their own hats and the others don't is:
(1/n)^k * ((n-1)/n)^(n-k)
The number of k-combinations of a set with n elements is:
n!/(k!(n-k)!)
So, treating the hats as independent draws (an approximation), the probability that exactly k persons get their own hats is:
(1/n)^k * ((n-1)/n)^(n-k) * n!/(k!(n-k)!)
Q.E.D.
It's actually not much more difficult to answer the question at the end. There are approximately n!/(e k!) permutations fixing exactly k positions. You can find this by the same method used for the original approximation, or just apply the original estimate: for a given set of k fixed positions there are about (n-k)!/e derangements of the remaining positions, and there are n-choose-k ways to choose the fixed positions; the product of these is the estimate above. I'm not a fan of people saying this or that maths is hard or complicated; it's very hard to make such a claim objective, and it just biases the hearer against the question. It's better to say "I don't know" than to say it's hard.
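A sketch comparing the exact count C(n,k) * !(n-k) with the estimate n!/(e k!) described above:

```python
# Exact vs. approximate counts of permutations with exactly k fixed points.
import math

def subfact(n):
    return sum((-1) ** j * math.comb(n, j) * math.factorial(n - j)
               for j in range(n + 1))

n = 12
for k in range(4):
    exact = math.comb(n, k) * subfact(n - k)
    approx = math.factorial(n) / (math.e * math.factorial(k))
    print(k, exact, round(approx))

# The exact counts over all k must sum to n! (every permutation counted once):
assert sum(math.comb(n, k) * subfact(n - k) for k in range(n + 1)) \
       == math.factorial(n)
```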
I see what you did there! 1:46 #ParkerSquare
Huh? You use the same print on the background as the 'square number glitch-y' in the Perfect Square video!
_I love the little details by the way_
Under the independent-draws approximation, the convergence is slower than stated in the video: the chance of at least one match would be 65.1% for 10 cards [1-(1-1/10)^10 ≈ 0.651], whereas the exact value is already about 63.2%.
Awesome explanation
Thank you Dr. James
I like this guy's 24/7 smile
Love the videos with Dr Grime.
Simple and clever. I like this concept.
I am reminded of "manotazo", where players take turns laying down cards and have to slap the pile when the card laid down coincides with the name of the card. The names are spoken in order during turns, from ace to king and then ace to king again. The last hand on top of the pile takes the pile. You notice derangements quite often during play.
6:31 #ParkerSquare
I like that because 3 is almost 4
1:40 Thx for the example. Was so hard to figure out :D
The cool feeling you get when you see e coming right after seeing the percentages.
There is a simpler heuristic argument (the events aren't truly independent, but it gives the right limit): with n cards, in order to lose, the first card has to be anything but 1, which happens with probability (n-1)/n. The second card must be anything but 2, and so on for all n cards, giving ((n-1)/n)^n = (1 - 1/n)^n. Since (1 + x/n)^n -> e^x, the probability of losing tends to e^(-1).
I could honestly listen to James Grime talk all day long about math 😍😍😍🤓🤓🤓
So I worked on a very similar problem in school.
The idea is that you've got a list of questions and answers on a test that you need to match up; maybe you've got a list of definitions to match to words from a word bank, for example. The question I asked myself was, "If I were to guess totally randomly, how many can I expect to get right?"
It turns out, the answer to that question is exactly 1, regardless of how many questions there are, provided that you give a unique answer to each of them. This, of course, is no better than if you simply guessed the same answer on all of them. Approximately 37% of all cases will yield no correct answers, and exactly 1 of the n! cases will yield all correct answers, but it averages out at exactly 1 correct answer, given any number of questions.
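That "exactly 1 on average" claim follows from linearity of expectation, and can be checked exhaustively for small n (a sketch):

```python
# Average number of fixed points over all n! permutations is exactly 1.
from fractions import Fraction
from itertools import permutations
from math import factorial

for n in range(1, 7):
    total = sum(sum(p[i] == i for i in range(n))
                for p in permutations(range(n)))
    assert Fraction(total, factorial(n)) == 1
print("expected number of correct guesses is exactly 1 for n = 1..6")
```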
One other thing I wanted to point out is that James was not completely clear at the beginning when he said that it's always 63%. In truth, the probability approaches 1-1/e as the number of cards approaches infinity. For cases with 4 or fewer cards it still approaches this value, but the result is off by more than 1%, so it doesn't look like the same number:
1 card is 100%
2 cards is 50%
3 cards is 66.666...%
4 cards is 62.5%
An interesting point to note here is that all odd amounts will have a probability of over 1-1/e, and all even numbers will have a probability under 1-1/e
I also found the value 1-1/e elsewhere in probability: if something has a 1/n chance of happening on each iteration, and you perform n iterations, what's the probability that it occurs at least once? For example, what's the probability of rolling a 6 on a standard 6-sided die if you roll it 6 times? The answer also approaches 1-1/e, about 63%, as n approaches infinity, but it approaches it more slowly, and always from above, rather than alternating between above and below as in the card problem. So here the probability is always over 1-1/e.
it started off so random but saying real life applications, even if not very popular situations, is still very interesting.
So I watched 8 minutes to see how he calculates the number of derangements, because that is the hard part, but then he explains everything except that and just throws !n = n!/e at us with no explanation.
The whole derangement concept is an application of the inclusion-exclusion principle. I hope you are familiar with this beautiful principle.
So haven't you watched the second video?
Well that’s what the extra footage is for
He's a tiny smart hell of a mathematician and I love it ❤
Very nice. But what about the probability of getting 1 match, 2 matches, etc.? I know you said it is harder, but it would be interesting to know. Or any simple references to an answer? BTW, loved the addresses you chose for the envelopes!
A similar but harder question. Two packs of cards are shuffled together. What is the probability that two identical cards will lie adjacent to each other? Would love to see a solution.
JEE aspirant attendance here
"It looks like you've got the world's smallest hands."
Perhaps James could be President of the United States.
Id vote for him.... if I were American....
@@bwayagnesarchives or if he were american :P
Orange man bad.
Pretty fitting how the title is called "Derangements"...
??
You explained so well
I don't think the question for exactly k fixed points is much harder; the total number of permutations of n elements having exactly k fixed points should be (n choose k) * !(n-k), which is about n!/k! * 1/e for n large compared to k.
If you want to know the probability of 2 matches, 3 matches, etc., wouldn't it be valid to just multiply that 63% chance N times? I.e., for two coincidental matching positions, 0.63 * 0.63 = 0.3969, or 39.69%. This seems valid in my mind because if you look at the new set of items to derange as a new problem, there's a 63% chance again (unless you're at
Let me try to clear things up. The probability is technically never exactly 0.63; rather, it approaches 1-1/e, which is approximately 0.632, and when he says it is 63% at 4 cards or above, he means that it is 63% to the nearest percent. You must use a special formula for binomial probability to find the probability of an exact number of matches. I have a video on this on my channel, called "Probability Challenge", that explains this better.
I tried to tackle this question: what is the probability of winning if the win condition is that at least two cards are in the correct position?
We see in the video that for at least 1 card in the correct position, the probability is 1-(1/e).
The number of ways to arrange n things so that exactly 1 item is in its correct position is the number of ways to pick the fixed point, which is n, multiplied by the number of derangements of the other n-1 points, which is !(n-1). So n * !(n-1) ≈ n * (n-1)!/e = n!/e.
So the probability of two or more fixed points is approximately 1 - (P of 0 fixed points) - (P of 1 fixed point) ≈ 1 - (n!/e)/n! - (n!/e)/n! = 1 - 2/e.
If this is not correct, can someone point out where I went wrong.
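The conclusion looks right in the limit. Here is a numeric sketch comparing the exact probability of at least two fixed points with 1 - 2/e:

```python
# Exact P(at least two fixed points) vs. the limiting value 1 - 2/e.
import math

def subfact(n):
    return sum((-1) ** j * math.comb(n, j) * math.factorial(n - j)
               for j in range(n + 1))

for n in (5, 10, 20):
    nf = math.factorial(n)
    p = 1 - subfact(n) / nf - n * subfact(n - 1) / nf
    print(n, round(p, 6))

print("1 - 2/e =", round(1 - 2 / math.e, 6))  # 0.264241
```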
I'm a simple man. I hear or see James Grime on a Numberphile video, I hit "like"
I felt so smart right until you said, "Divide by e," and then I felt super excited and confused. Surprise "e" indeed! I wish I could have seen that coming.
I was just wondering when the next James Grime video would happen, and then this happened.
The n! over e result is pretty interesting, as math doesn't usually "round" in combinatorics...
I liked the video before fully watching it.
my favorite professor
As soon as he said 37% I knew e was coming!
I love when the camera guy calls out general statements like at 2:04
Can’t unsee tiny James Grime
My thought after some calculations with low card numbers was that it would approach some particular probability as the card number increases. I was surprised when the video said just how rapid the approach is.
1 card: 100%
2 cards: 50% (1/2)
3 cards: 66% (4/6)
4 cards: 62.5% (15/24)
5 cards: 63.3% (76/120, I assume but have not confirmed)
It is not that the probability is exactly 1-1/e. It is that it is as close as you can get given the finite number of permutations for any given card count.
And cluing in on the meaning behind it, it makes some sense that it approaches so rapidly; factorials increase rapidly too.
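Those small-n values can be recomputed exactly with the standard derangement recurrence !n = (n-1)(!(n-1) + !(n-2)) (a sketch; fractions keep it exact):

```python
# Win probability 1 - !n/n! for n = 1..5 cards.
from fractions import Fraction
from math import factorial

d = [1, 0]  # !0, !1
for n in range(2, 6):
    d.append((n - 1) * (d[n - 1] + d[n - 2]))

for n in range(1, 6):
    p = 1 - Fraction(d[n], factorial(n))
    print(n, p, float(p))
# 5 cards: 1 - 44/120 = 76/120 = 19/30, about 0.6333, confirming the table above.
```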
Interestingly, the same 63/37 % applies for the decay in moving average CPU load in UNIX systems calculation (i.e. a 5-minute average CPU load contains 37% of the past and 63% of the past 5 minutes) - according to Wikipedia anyway.
James is the most interesting person on numberphile. Change my mind
Dr Grime is the best.
James Bissonette? Of "History Matters" fame?
I did similar math that came up with a similar result a while ago. I was shiny-hunting in Pokemon by breeding, and figuring out my chances. The raw chance is 1/512, so I asked, "What's the chance I'd get one within 512 eggs?" Turns out it was the same number, around 63%. So I wondered how much that changed if I played with the numbers. I found a limit this way, and stumbled across e. Turns out, according to Wolfram|Alpha, the problem works out as: 1-((x-1)/x)^x -> (e-1)/e, as x -> infinity. That's your chance of rolling a natural 20 somewhere in 20 d20 rolls, or of rolling a natural 100 in 100 d100 rolls. Amazing where e pops up.
What about the probabilities of getting multiple matches? Does Euler’s number still show up in those formulas?
James was a little misleading in these videos. It's not always 37% for 4 or more elements; it approaches 1/e as the number of elements gets large.
0:17
Aaaahahah! He made the same joke as I did! Wonderful!
I'm imagining James with those cards now not as a 6'4" guy but as a tiny leprechaun holding a normal deck.
@Numberphile
Where's the error in this?
The chance of not getting a single match: 9/10 * 8/9 * 7/8 * 6/7 * 5/6 * 4/5 * 3/4 * 2/3 * 1/2 = 0.1.
So the chance of getting at least one match is 1 - 0.1 = 0.9.
Can we do more videos on the millenium problems, calculus, and topology?
I sometimes play a game with a full deck of 52 cards. I would flip one card at a time, saying ace, 2, 3, 4, 5, and so on. When I get to King, I would start over at Ace again. What is the probability of getting through the entire deck without matching?
Why 4?
Treating the draws as independent, the probability is:
1-((n-1)/n)^n
so if there are 5 cards it's around 67%,
even with 10 cards it's 65%,
and you need 65 cards for that formula to give 63%.
(The exact probability, 1 - !n/n!, converges much faster.)
I said "What about two cards?" right as he said it. I was so happy he immediately addressed it. :D