Questions tagged [chernoff-bounds]

Questions about concentration inequalities for sums of independent random variables, martingales, and their applications

21 questions
8 votes · 2 answers

Chernoff bound when we only have upper bound of expectation

If $X$ is a sum of i.i.d. random variables taking values in $\{0,1\}$ and $E[X]=\mu$, the Chernoff bound tells us that $$\Pr(X\geq (1+\delta)\mu)\leq e^{-\frac{\delta^2\mu}{3}}$$ for all $0<\delta<1$. If $E[X]\leq\mu$ instead, does there exist a…
mba
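As a quick sanity check of the bound quoted in this question, one can estimate the tail probability empirically and compare it with $e^{-\delta^2\mu/3}$. This is a simulation sketch with assumed parameters ($n=200$, $p=1/2$, $\delta=0.3$), not part of the original question:

```python
import math
import random

def upper_tail_estimate(n, p, delta, trials=20000, seed=0):
    """Estimate Pr(X >= (1+delta)*mu) for X a sum of n Bernoulli(p)
    variables, and return it together with exp(-delta^2 * mu / 3)."""
    rng = random.Random(seed)
    mu = n * p
    threshold = (1 + delta) * mu
    hits = sum(
        sum(rng.random() < p for _ in range(n)) >= threshold
        for _ in range(trials)
    )
    return hits / trials, math.exp(-delta ** 2 * mu / 3)

empirical, bound = upper_tail_estimate(n=200, p=0.5, delta=0.3)
print(empirical, bound)  # empirical tail should sit below the bound
```

Here $\mu = 100$, so the bound evaluates to $e^{-3} \approx 0.05$, while the true tail probability is far smaller, which illustrates how loose the convenient form can be.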
6 votes · 1 answer

Chernoff-Hoeffding bounds for the number of nonzeros in a submatrix

Consider an $n \times n$ matrix $A$ with $k$ nonzero entries. Assume every row and every column of $A$ has at most $\sqrt{k}$ nonzeros. Permute the rows and columns of $A$ uniformly at random. Divide $A$ into $k$ submatrices of size $n/\sqrt{k}…
Matteo
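The expected number of nonzeros in any one block is $k \cdot (b/n)^2$ where $b = n/\sqrt{k}$ is the block side, i.e. exactly $1$ for a top-left block. A small simulation sketch, using an assumed nonzero pattern (a dense $\sqrt{k}\times\sqrt{k}$ corner, which satisfies the row/column constraint) and assumed sizes:

```python
import random

def nnz_in_top_block(n, positions, b, rng):
    """Permute rows and columns uniformly at random and count how many
    of the given nonzero positions land in the top-left b x b block."""
    rows = list(range(n)); cols = list(range(n))
    rng.shuffle(rows); rng.shuffle(cols)
    row_to = {old: new for new, old in enumerate(rows)}
    col_to = {old: new for new, old in enumerate(cols)}
    return sum(1 for r, c in positions if row_to[r] < b and col_to[c] < b)

rng = random.Random(1)
n, k = 16, 16                   # sqrt(k) = 4 nonzeros allowed per row/column
positions = [(r, c) for r in range(4) for c in range(4)]  # assumed pattern
b = 4                           # block side n / sqrt(k)
trials = 20000
avg = sum(nnz_in_top_block(n, positions, b, rng) for _ in range(trials)) / trials
print(avg)  # expectation is k * (b/n)**2 = 1
```

The harder part of the question is the tail, not the mean: the indicators are not independent under a permutation, so a plain Chernoff bound does not apply directly.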
5 votes · 2 answers

Chernoff-like Concentration Bounds on Permutations

Suppose I have $n$ balls. Among them, there are $m \leq n$ black balls and the other $n - m$ balls are white. Fix a random permutation $\pi$ over these balls and denote by $Y_i$ the number of black balls in the first $i$ positions of our permutation…
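For a fixed prefix length $i$, $Y_i$ is hypergeometric (draws without replacement), and such sums of negatively associated indicators are known to obey Chernoff–Hoeffding-type bounds. A simulation sketch with assumed parameters, checking $E[Y_i] = im/n$:

```python
import random

def prefix_black_counts(n, m, i, trials=10000, seed=2):
    """Sample Y_i = number of black balls among the first i positions
    of a uniformly random permutation of m black, n - m white balls."""
    rng = random.Random(seed)
    balls = [1] * m + [0] * (n - m)
    out = []
    for _ in range(trials):
        rng.shuffle(balls)
        out.append(sum(balls[:i]))
    return out

counts = prefix_black_counts(n=100, m=30, i=20)
mean = sum(counts) / len(counts)
print(mean)  # E[Y_i] = i * m / n = 6 here
```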
4 votes · 2 answers

Chernoff bound when we only have a lower bound of expectation

This question: Chernoff bound when we only have upper bound of expectation is similar, but for an upper bound of expectation. The standard Chernoff bound says that if $X$ is a sum of 0/1 random variables, then, for any $\delta \in (0,1)$, $P(X \leq…
user341502
3 votes · 1 answer

Reducing the randomness needed by a Turing machine

I am reading an article on streaming algorithms, "Turnstile streaming algorithms might as well be linear sketches" by Yi Li, Huy Nguyen, and David Woodruff. At some point they have a randomized algorithm (one that uses a tape of random bits) that…
3 votes · 1 answer

Sampling from a set of numbers with a fixed sum

Let $s = \{x_1, x_2, \ldots, x_n\}$ be a set of $n$ random non-negative integers where $\sum_i x_i = n$. And let $\{y_1, y_2, \ldots, y_{\sqrt{n}}\}$ denote a subset of size $\sqrt{n}$ of $s$, chosen uniformly at random. Defining $y$ to be $\sum_i…
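Since the sum of all $n$ entries is fixed at $n$, the expectation of $y$ is $\sqrt{n}\cdot(n/n) = \sqrt{n}$ by linearity, and the sampling is without replacement, so Hoeffding's results for sampling without replacement apply. A simulation sketch over one assumed instance:

```python
import math
import random

def subset_sum_samples(xs, k, trials, seed=3):
    """Sample sums of uniformly random size-k subsets of the list xs."""
    rng = random.Random(seed)
    return [sum(rng.sample(xs, k)) for _ in range(trials)]

n = 100
xs = [2] * 30 + [1] * 40 + [0] * 30   # an assumed instance with sum(xs) = n
k = math.isqrt(n)                      # subset size sqrt(n) = 10
samples = subset_sum_samples(xs, k, trials=20000)
mean = sum(samples) / len(samples)
print(mean)  # E[y] = k * (sum(xs) / n) = sqrt(n) = 10
```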
3 votes · 1 answer

"Practical forms" of Chernoff bound for inequality in expectation

From Wikipedia: The above formula is often unwieldy in practice, so the following looser but more convenient bounds are often used: (i) $Pr(X\geq (1+\delta)\mu)\leq e^{-\frac{\delta^2\mu}{3}}, 0<\delta<1$ (ii) $Pr(X\leq (1-\delta)\mu)\leq…
mba
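For context, the standard pair of looser bounds this excerpt refers to (in their usual textbook/Wikipedia presentation) is:

$$\Pr(X\geq (1+\delta)\mu)\leq e^{-\frac{\delta^2\mu}{3}},\qquad 0<\delta<1,$$

$$\Pr(X\leq (1-\delta)\mu)\leq e^{-\frac{\delta^2\mu}{2}},\qquad 0<\delta<1.$$

Both follow from the generic bound $\Pr(X\geq(1+\delta)\mu)\leq\left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu}$ by bounding the exponent for $\delta$ in $(0,1)$.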
3 votes · 1 answer

Expected maximum bin load, for balls in bins with equal number of balls and bins

Suppose we have $n$ balls and $n$ bins. We put the balls into the bins randomly. If we count the maximum number of balls in any bin, the expected value of this is $\Theta(\ln n/\ln\ln n)$. How can we derive this fact? Are Chernoff bounds helpful?
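The upper bound in this classic fact is typically proved with a Chernoff-plus-union-bound argument; the matching lower bound uses a second-moment or direct counting argument. A simulation sketch with an assumed $n$, just to see the scale:

```python
import random

def max_bin_load(n, rng):
    """Throw n balls into n bins uniformly at random; return the max load."""
    loads = [0] * n
    for _ in range(n):
        loads[rng.randrange(n)] += 1
    return max(loads)

rng = random.Random(4)
n = 10000
samples = [max_bin_load(n, rng) for _ in range(20)]
print(min(samples), max(samples))  # grows like ln n / ln ln n, up to constants
```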
3 votes · 1 answer

Chernoff bounds and Monte Carlo algorithms

One of Wikipedia's examples of the use of Chernoff bounds is one where an algorithm $A$ computes the correct value of a function $f$ with probability $p > 1/2$. Basically, Chernoff bounds are used to bound the error probability of $A$ using repeated…
zpavlinovic
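The amplification the excerpt describes is majority voting over independent runs: a Chernoff bound shows the majority is wrong with probability exponentially small in the number of runs. A self-contained sketch with a toy stand-in for $A$ (the function `noisy_algo` and its parameters are assumptions for the demo):

```python
import random

def majority_of_runs(algo, k, rng):
    """Run a 0/1-valued Monte Carlo algorithm k times (k odd) and return
    the majority answer; by a Chernoff bound the error probability decays
    exponentially in k when each run is correct with probability p > 1/2."""
    return 1 if 2 * sum(algo(rng) for _ in range(k)) > k else 0

def noisy_algo(rng, p=0.6):
    """Toy stand-in: returns the correct answer (1) with probability p."""
    return 1 if rng.random() < p else 0

rng = random.Random(5)
trials = 2000
err = sum(majority_of_runs(noisy_algo, 101, rng) == 0 for _ in range(trials)) / trials
print(err)  # far below the single-run error of 0.4
```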
3 votes · 1 answer

Find expectation with Chernoff bound

We have a group of employees and their company will assign a prize to as many employees as possible by finding the ones probably better than the rest. The company assigned the same $2$ tasks to every employee and scored their results with $2$ values…
2 votes · 0 answers

Prove $\forall a>0.\ BPP[a,a+\frac{1}{n}]=BPP$

I need to prove that $\forall a>0.\ BPP[a,a+\frac{1}{n}]=BPP$. Definition of $BPP[a,b]$: a language $L$ is in $BPP[a,b]$ if and only if there exists a probabilistic Turing machine $M$ such that $M$ runs in polynomial time on all inputs, and for all $x$ in $L$, $M$ …
Mugen
2 votes · 0 answers

Concentration inequality of sum of geometric random variables taken to a power

Let $X_1, \cdots, X_n$ be $n$ independent geometric random variables with success probability parameter $p = 1/2$, where $X_i = j$ means it took $j$ trials to get the first success. Let $S_d = \sum_{i=1}^n X_i^d$ and $\mu_d = \mathbb{E}\left( S_d…
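The difficulty here is that $X_i^d$ is unbounded with geometric tails, so the bounded-variable Chernoff bound does not apply directly; one typically needs a Bernstein-type or truncation argument. A simulation sketch for $d = 2$ (for $p = 1/2$ one has $E[X] = 2$, $E[X^2] = 6$, so $\mu_2 = 6n$):

```python
import random

def geometric(rng, p=0.5):
    """Number of trials up to and including the first success (support 1, 2, ...)."""
    t = 1
    while rng.random() >= p:
        t += 1
    return t

rng = random.Random(6)
n = 50000
s2_over_n = sum(geometric(rng) ** 2 for _ in range(n)) / n
print(s2_over_n)  # S_2 / n should concentrate near E[X^2] = 6
```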
2 votes · 1 answer

Can BPP be bounded around any constant other than 1/2?

A language $L$ is in BPP if there exists a randomised TM such that it outputs a correct answer with probability at least $1/2+1/p(n)$ for some polynomial $p(n)$, where $n$ is the length of the input. This probability can be amplified to…
1 vote · 0 answers

Proving correctness of a randomized algorithm that sums array elements

I am trying to prove the following algorithm to be correct:

    Sum(A[1..n], s):
        sum = 0
        for r = 1 to s:
            i = some random index i in interval [1, n]
            sum = sum + A[i]
        return n * (sum / s)

where A is an array consisting of $n$…
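The estimator is unbiased (each sample has mean $\frac{1}{n}\sum_i A[i]$, and the scaling by $n$ restores the sum), and Hoeffding/Chernoff bounds control its deviation when the entries are bounded. A runnable sketch; the test array is an assumption for the demo:

```python
import random

def randomized_sum(A, s, rng):
    """Unbiased estimate of sum(A): average s uniform samples, scale by n."""
    n = len(A)
    total = sum(A[rng.randrange(n)] for _ in range(s))
    return n * (total / s)

rng = random.Random(7)
A = list(range(1, 101))        # assumed test array; true sum = 5050
estimate = randomized_sum(A, s=5000, rng=rng)
print(estimate)  # close to 5050 with high probability
```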
1 vote · 0 answers

Chernoff bound on the maximum of multinomial distribution

I was reading some heavy hitter (HH) papers when I ran into the following reduction theorem. The theorem attempts to reduce an HH problem with a very small tail frequency $\epsilon$ to multiple HH instances with higher tail frequencies. More…
Symbol 1