Thanks in advance for the help.
I have $n$ dependent and identically distributed Bernoulli r.v.s $X_1, \dots, X_n$ with success probability $p$, and I consider $X = \sum_{i=1}^n X_i$. I know its first and second moments, $\mu = \mathbb{E}[X]$ and $\nu = \mathbb{E}[X^2]$.
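To fix notation: since $X_i^2 = X_i$ for Bernoulli variables, these moments expand as
$$\mu = \mathbb{E}[X] = np \qquad \text{and} \qquad \nu = \mathbb{E}[X^2] = \mu + 2\sum_{i<j}\mathbb{E}[X_i X_j],$$
where the first identity holds by linearity of expectation regardless of the dependence. In particular, the variance $\sigma^2 = \nu - \mu^2$ is also known.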
What I'm looking for is an upper bound on the quantity $\text{Pr}\left[X \ge 1 \right]$ that is sharper than plain Markov's inequality (maybe involving $\nu$ as well?), i.e., something that does the "opposite job" of the second moment method.
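To make the comparison concrete, Markov's inequality gives the upper bound
$$\text{Pr}\left[X \ge 1 \right] \le \mu,$$
while the second moment method (Paley–Zygmund, via Cauchy–Schwarz) gives the lower bound
$$\text{Pr}\left[X \ge 1 \right] \ge \frac{\mu^2}{\nu}.$$
I would like an upper bound that, like the latter, exploits $\nu$.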
More generally, are there any known lower bounds (involving the first and second moments) on the probabilities $\text{Pr}\left[X \le (1 - \epsilon) \mu \right]$ and $\text{Pr}\left[X \le \mu - \lambda \right]$, for $\epsilon \in (0,1)$ and $0 < \lambda < \mu$?
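(Note that the two forms are equivalent under the substitution $\lambda = \epsilon \mu$, since $\text{Pr}\left[X \le (1-\epsilon)\mu \right] = \text{Pr}\left[X \le \mu - \epsilon\mu \right]$, so a bound for either form answers both.)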
I know there exists a "reverse Chernoff bound" (here) for independent Bernoulli r.v.s, but is anything known when the variables are dependent and the only available quantities are $\mu$ and $\nu$?
I should also mention that I suspect the variables are positively correlated, but this is very difficult to prove.
EDIT: if it helps, the third moment $\mathbb{E}[X^3]$ can possibly be computed too.
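By the same expansion as above, this would give access to the triple correlations: for Bernoulli variables,
$$\mathbb{E}[X^3] = 3\nu - 2\mu + 6\sum_{i<j<k}\mathbb{E}[X_i X_j X_k].$$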