I thought of a sort of "game" that illustrates a situation where a person's probability of "winning" ranges from 0 to 1. I'm sure there's a name for it, or a better way to put it, but anyway, it goes like this:
There are N entries allowed in a contest. Every time someone submits an entry, one of the existing entries is swapped out at random and the new entry takes its place. Each person can submit more than one entry, but only one at a time. There's a glitch in the system: all N entry slots are already filled and ready for the drawing, but Tom, who currently has no entries in the contest, is still allowed to submit. What is the expected number of submissions Tom must make before all of the entries are his, giving him a 100% chance of winning?
Obviously, if Tom submits 0 entries, he has a 0% chance of winning. Since the entries are swapped out at random, he will have to submit at least N entries; however, every time he submits, he might swap out an entry he already has (once he owns k of the N slots, a submission lands on one of his own entries with probability k/N, so progress slows as he accumulates entries). In the extreme case where every new submission keeps replacing one of his own entries, he stays stuck at a 1/N probability of winning no matter how many entries he submits. So the number of submissions needed to guarantee a win ranges from N to infinity, and the expected value lies somewhere in between. The sketch below shows how a single run plays out.
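To make that slowdown concrete, here is a minimal sketch (not part of my main script below; set.seed and the variable names are just for illustration) that tracks how many slots Tom owns after each submission in a single run:

set.seed(1)                     # for reproducibility
N <- 100
owned <- logical(N)             # owned[j] is TRUE once Tom holds slot j
trajectory <- integer(0)        # slots owned after each submission
while (!all(owned)) {
  owned[sample(N, 1)] <- TRUE   # each submission overwrites a uniform random slot
  trajectory <- c(trajectory, sum(owned))
}
length(trajectory)              # total submissions in this run
plot(trajectory, type = "s", xlab = "submissions", ylab = "slots owned")

The curve climbs quickly at first and then flattens out, since late submissions mostly hit slots Tom already owns.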
For N=100, the average appears to be ~520, which I determined empirically. I am looking for a mathematical way to determine this average (the expected value) as a function of N.
For various values of N:

  N   Apparent average
 10         29
 50        220
100        520
500       3360
800       5820
I scripted up a little R simulation to estimate this value, which should help illustrate what I'm asking. (Obviously not going for code golf here.)
tres <- 1000                      # number of simulation trials
ress <- vector("numeric", tres)   # submissions needed in each trial
bigN <- 100                       # number of entry slots (N)
for (i in seq_along(ress)) {
  sdf <- rep(0, bigN)             # 0 = not Tom's entry, 1 = Tom's entry
  counts <- 0
  while (any(sdf == 0)) {
    sdf[sample(length(sdf), 1)] <- 1   # a submission overwrites a random slot
    counts <- counts + 1
  }
  ress[i] <- counts               # submissions until every slot is Tom's
}
summary(ress)
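For convenience, here is a sketch of the same simulation wrapped in a function, so the table above can be reproduced for any N (simulate_fill and n_trials are names I made up for this sketch):

simulate_fill <- function(N) {
  slots <- rep(0, N)              # 0 = not Tom's entry, 1 = Tom's entry
  counts <- 0
  while (any(slots == 0)) {
    slots[sample(N, 1)] <- 1      # overwrite a uniformly random slot
    counts <- counts + 1
  }
  counts
}

n_trials <- 1000
for (N in c(10, 50, 100, 500, 800)) {
  avg <- mean(replicate(n_trials, simulate_fill(N)))
  cat(sprintf("N = %4d  average ~ %.0f\n", N, avg))
}

With 1000 trials per N, the printed averages should land close to the table above, though they wander a bit from run to run.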