Suppose $X_1,\ldots,X_k$ are $k$ independent geometric random variables with the same success probability $p$, and let $X=X_1+\cdots+X_k$.
Each $X_i$ satisfies $E[X_i] = 1/p$, so the expected total number of trials is $E[X]=k/p$. I would like to use the Chernoff bound to find a good upper bound on the tail of the distribution; more precisely, on the probability
$$\Pr[X > (1+\delta)E[X]]$$
If $X$ were a sum of Bernoulli random variables (taking values in $\{0,1\}$), we could simply apply the standard Chernoff bound. I tried to modify it for the geometric case, but without success. I found a similar thread, but about the sum of non-identical variables: Tail bound on the sum of independent (non-identical) geometric random variables. Although it also mentions the case of identical variables, I didn't understand that part well. Could you give me a more elaborate explanation of how to compute the tail bound in this case?
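For reference, here is a small Monte Carlo estimate of the tail probability I want to bound. This is just a sanity check, not a bound; the values $k=10$, $p=0.3$, $\delta=0.5$ are arbitrary examples I picked, and it assumes NumPy's convention that a geometric sample counts the trials up to and including the first success (so $E[X_i]=1/p$).

```python
import numpy as np

# Empirically estimate Pr[X > (1+delta) * E[X]] for
# X = X_1 + ... + X_k with X_i ~ Geometric(p), i.i.d.
# Parameter values are arbitrary examples.
rng = np.random.default_rng(0)
k, p, delta = 10, 0.3, 0.5
n_trials = 200_000

# rng.geometric counts trials up to and including the first
# success, so E[X_i] = 1/p and E[X] = k/p.
samples = rng.geometric(p, size=(n_trials, k)).sum(axis=1)
mean = k / p
tail = np.mean(samples > (1 + delta) * mean)
print(f"E[X] = {mean:.2f}, empirical tail probability = {tail:.4f}")
```

Any correct Chernoff-style bound should of course upper-bound the empirical frequency printed here (up to sampling noise).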