In a recent video the legendary Matt Parker claimed he kept flipping a fair (two-sided) coin until he obtained a sequence of ten consecutive 'switch flips': letting $T$ denote a tail and $H$ a head, a sequence of ten switch flips is either $THTHTHTHTH$ or $HTHTHTHTHT$. He set up a contest in which each viewer could guess once at the exact number of flips he needed to obtain such a sequence. The ten viewers with the ten closest answers would be awarded a prize.
The contest is over, so there is no incentive to keep a solution to the following problem to yourself. What is the best number to bet? Of course this depends in part on how the other viewers answer: you are more likely to win if your bet is not close to many other bets, so if a large number of viewers is mathematically inclined and bets the same number, say $1023$, then that is no longer the best (profit-maximizing) bet. I have therefore simplified to the following question: let $X$ be the random variable representing the number of flips needed until a sequence of ten consecutive switch flips is obtained; for which number $a \in \mathbb{N}$ does the expected absolute error $$ \mathbb{E}[\vert X - a \vert] $$ attain its minimum? It is well known that the minimizer $a$ is a median of (the distribution of) $X$, but how can one compute it? Numerical approximations are welcome; theoretical (generalizable) results are preferred.
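(For anyone who wants to see the "median minimizes the expected absolute error" fact in action, here is a tiny numerical illustration; the toy distribution below is made up purely for illustration and has nothing to do with the coin problem.)

```python
# Toy discrete distribution (values and probabilities invented for illustration).
vals = [1, 2, 3, 10]
probs = [0.2, 0.4, 0.2, 0.2]

def expected_abs_error(a):
    """E[|X - a|] for the toy distribution above."""
    return sum(p * abs(v - a) for v, p in zip(vals, probs))

# The CDF first reaches 1/2 at x = 2, so the median is 2,
# and that is exactly where E[|X - a|] is minimized.
best = min(range(0, 12), key=expected_abs_error)
print(best)  # -> 2
```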
I found a way to compute the expected value of $X$ itself for general $n$ (i.e. the total number of coin flips needed to get a sequence of the form $THTHTHTH\ldots$ or $HTHTHTHT\ldots$ of length $n$). Let $\mathbb{E}_i$ denote the expected number of additional coin flips needed, given that the current alternating run has length $i \in \mathbb{N}$. We immediately find $$ \mathbb{E}_0 = 1 + \mathbb{E}_1 $$ since after one flip we are certain to have a run of length $1$. Furthermore, for $1 \leq i \leq n-1$, $$ \mathbb{E}_i = \frac{1}{2}\left(\mathbb{E}_1 + 1\right) + \frac{1}{2}\left(\mathbb{E}_{i+1} + 1\right) $$ since, given a run of $i$ flips ending, say, on a tail, the next flip either extends it to a run of length $i+1$ (if it is a head, probability $\frac{1}{2}$) or starts a new run of length $1$ (if it is another tail, probability $\frac{1}{2}$). Using $\mathbb{E}_n = 0$, this gives a system of $n$ equations in the $n$ unknowns $\mathbb{E}_0, \ldots, \mathbb{E}_{n-1}$. One can easily check that the unique solution is $$ \mathbb{E}_i = 2^{n} - 2^{i}, \quad 0 \leq i \leq n. $$ Since Matt Parker started at $0$ and wanted a run of length $10$, the expected number of flips is $\mathbb{E}_0 = 2^{10} - 1 = 1023$, so this should be a reasonable bet.
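The claimed closed form can be checked mechanically by plugging it back into the system, e.g. with a few lines of Python:

```python
# Verify that E_i = 2^n - 2^i solves the system of equations, for several n.
for n in range(1, 15):
    E = [2**n - 2**i for i in range(n + 1)]
    assert E[n] == 0                      # boundary condition: done at run length n
    assert E[0] == 1 + E[1]               # the first flip always yields a run of length 1
    for i in range(1, n):
        # E_i = (E_1 + 1)/2 + (E_{i+1} + 1)/2, written without division
        assert 2 * E[i] == (E[1] + 1) + (E[i + 1] + 1)
    print(n, E[0])                        # E_0 = 2^n - 1
```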
Does anyone know how to find the distribution of $X$ (or directly the median of $X$)? Like I said, analytical solutions are of course preferred, but any kind of method - even requiring numerical computations, but preferably not Monte Carlo simulations - would be interesting to me.
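Short of an analytic answer, the distribution of $X$ can at least be tabulated exactly (no Monte Carlo needed) by iterating the obvious Markov chain whose state is the current length of the alternating run. A minimal Python sketch (the function name `median_flips` is mine; it assumes $n \geq 2$):

```python
def median_flips(n):
    """Smallest k with P(X <= k) >= 1/2, where X counts the flips needed
    to first see an alternating run of length n (assumes n >= 2).
    State i = current run length; from 1 <= i < n the next flip either
    extends the run to i+1 or restarts it at length 1, each w.p. 1/2."""
    p = [1.0] + [0.0] * (n - 1)   # p[i] = P(run length == i, target not yet seen)
    cdf = 0.0                     # running value of P(X <= k)
    k = 0
    while cdf < 0.5:
        k += 1
        q = [0.0] * n
        # a repeated outcome restarts the run at length 1; from state 0
        # (no flips yet) the first flip gives a run of length 1 for sure
        q[1] = p[0] + 0.5 * sum(p[1:])
        for i in range(1, n - 1):
            q[i + 1] = 0.5 * p[i]      # flip differs from the last one: run extends
        cdf += 0.5 * p[n - 1]          # run of length n-1 extends to n: finished
        p = q
    return k

print(median_flips(10))
```

(Floating point is fine here: all probabilities are dyadic rationals and the increments near the median are far larger than the accumulated rounding error; for a fully rigorous computation one could swap in `fractions.Fraction`.)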
EDIT: I found out that the problem can be reduced to a combinatorial one. Indeed, we have that $$ P(X \leq k) = \frac{\# \lbrace \text{sequences of length $k$ which contain a desired subsequence of length $n$}\rbrace}{2^k} $$ where $2^k$ is the total number of sequences of length $k$, since every sequence of length $k$ is equally likely to occur. Let $S_k$ be the set of sequences of $0$'s and $1$'s of length $k$ (we identify tails with $0$ and heads with $1$). We have a map $$ f: S_k \to S_{k-1} $$ where, for any sequence $s \in S_k$, the $i$th element of $f(s)$ is $1$ if $s(i) \neq s(i+1)$ and $0$ if $s(i) = s(i+1)$. A desired subsequence of length $n$ in $s$ corresponds exactly to a run of $n-1$ ones in $f(s)$, and $f$ is two-to-one: each sequence in $S_{k-1}$ has exactly two preimages, one for each choice of the first symbol of $s$. Hence $$ P(X \leq k) = \frac{\# \lbrace \text{sequences of length $k-1$ which contain $n-1$ ones in a row}\rbrace}{2^{k-1}}, $$ so it suffices to count the number of sequences of a given length which contain a run of $n-1$ equal symbols (by flipping all bits, counting runs of ones is the same as counting runs of zeroes). Any ideas on how to continue?
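One standard way to continue numerically: count the complement. If $a_m$ denotes the number of binary strings of length $m$ with no run of $r$ equal symbols in a row (runs of ones or of zeroes give the same count), then conditioning on the prefix $1^j0$ with $0 \leq j \leq r-1$ gives $a_m = a_{m-1} + a_{m-2} + \cdots + a_{m-r}$ for $m \geq r$, with $a_m = 2^m$ for $m < r$. With $r = n-1$ this yields $P(X \leq k) = 1 - a_{k-1}/2^{k-1}$, and the median is the smallest $k$ making this at least $\frac{1}{2}$. A sketch in Python (function name is mine; exact arithmetic via `fractions`, assumes $n \geq 2$):

```python
from fractions import Fraction

def median_via_counting(n):
    """Smallest k with P(X <= k) >= 1/2, using
    P(X <= k) = 1 - a_{k-1} / 2^{k-1}, where a_m counts the length-m
    binary strings with no run of r = n - 1 ones (assumes n >= 2)."""
    r = n - 1
    a = [1]                            # a[0] = 1 (the empty string)
    k = 1
    while True:
        m = k - 1
        if m >= 1:
            if m < r:
                a.append(2 ** m)       # too short to contain a run of r
            else:                      # condition on the prefix 1^j 0, 0 <= j < r
                a.append(sum(a[m - 1 - j] for j in range(r)))
        # median found once P(X <= k) = 1 - a[m]/2^m reaches 1/2
        if k >= n and Fraction(a[m], 2 ** m) <= Fraction(1, 2):
            return k
        k += 1

print(median_via_counting(10))
```

This agrees with the Markov-chain approach on small cases (e.g. the median is $2$ for $n=2$ and $5$ for $n=3$), and the recurrence makes the generalization to arbitrary $n$ immediate.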