Questions tagged [cumulative-distribution-functions]

For questions related to cumulative distribution functions.

The cumulative distribution function $F_X: \Bbb R \to [0,1]$ of a real-valued random variable $X$ gives, at each $x$, the probability that $X$ takes a value less than or equal to $x$: $$F_X(x) := \Bbb P (X \leq x).$$ If $X$ has a probability density $f_X:\mathbb R\to[0,\infty)$, this can be written as $$F_X(x)=\int_{-\infty}^x f_X(t)\,dt.$$

Every CDF is non-decreasing and right-continuous, and satisfies $\lim_{x\to-\infty}F_X(x)=0$ and $\lim_{x\to\infty}F_X(x)=1$. These conditions are also sufficient: any function satisfying these four conditions is the CDF of some probability distribution.
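As a quick numerical illustration of these properties (a minimal Python sketch; `normal_cdf`, the grid, and the tolerances are choices made here, not part of the tag description), the standard normal CDF $\Phi$ can be evaluated with the standard library's error function and checked on a grid:

```python
import math

def normal_cdf(x):
    """Standard normal CDF: Phi(x) = (1 + erf(x / sqrt(2))) / 2."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

xs = [x / 10 for x in range(-80, 81)]      # grid from -8 to 8
vals = [normal_cdf(x) for x in xs]

# Non-decreasing on the sampled grid
assert all(a <= b for a, b in zip(vals, vals[1:]))
# Limit behaviour: Phi(-8) is essentially 0, Phi(8) essentially 1
assert vals[0] < 1e-10 and vals[-1] > 1 - 1e-10
# Symmetry of the standard normal gives Phi(0) = 1/2
assert abs(normal_cdf(0.0) - 0.5) < 1e-12
```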

The concept of the cumulative distribution function makes an explicit appearance in statistical analysis in two (similar) ways.

  • Cumulative frequency analysis is the analysis of the frequency of occurrence of values of a phenomenon less than a reference value.
  • The empirical distribution function is a formal direct estimate of the cumulative distribution function for which simple statistical properties can be derived and which can form the basis of various statistical hypothesis tests.

Such tests can assess whether there is evidence against a sample of data having arisen from a given distribution or evidence against two samples of data having arisen from the same (unknown) population distribution.
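The empirical distribution function mentioned above, together with the one-sample Kolmogorov–Smirnov statistic built from it, can be sketched in a few lines (a minimal stdlib-Python illustration; the `ecdf` helper, the seed, and the tolerance are assumptions made here for the example):

```python
import bisect
import math
import random

def ecdf(sample):
    """Empirical distribution function: F_n(x) = (# observations <= x) / n."""
    s = sorted(sample)
    n = len(s)
    # bisect_right counts the sample points <= x, so F_n is a
    # right-continuous step function, as a CDF must be.
    return lambda x: bisect.bisect_right(s, x) / n

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

random.seed(0)  # arbitrary seed, for reproducibility
sample = [random.gauss(0.0, 1.0) for _ in range(1000)]
F_n = ecdf(sample)

# One-sample Kolmogorov-Smirnov statistic D_n = sup_x |F_n(x) - Phi(x)|,
# computed exactly from the sorted sample points.
s = sorted(sample)
n = len(s)
D = max(max((i + 1) / n - normal_cdf(x), normal_cdf(x) - i / n)
        for i, x in enumerate(s))

# For a genuinely standard-normal sample of this size, D should be small
# (the asymptotic 1% critical value is roughly 1.63 / sqrt(n), about 0.05).
assert 0.0 <= D < 0.1
```

A large value of $D$ would be evidence against the sample having arisen from the hypothesized distribution, which is exactly the kind of test described above.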

673 questions
14 votes · 3 answers

Discrete Random Variables May Have Uncountable Images

For a probability space $(\Omega, \mathcal F, P)$, I'm trying to construct a discrete random variable $X$ (one for which $P(X\in K) = 1$ for some countable set $K$) for which $\text{im}(X)$ is uncountable. $\text{im}(X_*P)$ is uncountable, where…
9 votes · 2 answers

Expected value of normal CDF

I am trying to calculate the expected value of a Normal CDF, but I have gotten stuck. I want to find the expected value of $\Phi\left( \frac{a-bX}{c} \right)$ where $X$ is distributed as $\mathcal{N}(0,1)$ and $\Phi$ is the standard normal CDF. I…
9 votes · 5 answers

Conditions for uniqueness of the median

A median of a random variable is defined as any $m \in \mathbb{R}$ such that $P(X \le m) \ge 1/2$ and $P(X \ge m) \ge 1/2$. Alternatively, in terms of the CDF $F$ of $X$ defined by $F(x) := P(X \le x)$, we need $F(m) \ge 1/2$ and $F(m^-) \le…
7 votes · 2 answers

Expectation of a monotone function of CDF: $\mathbb E \left [g(F(X)) \right ]$

If $X$ is a random variable with distribution function $F,$ then $\mathbb{E}[(F(X))^{-1/2}]$ can be computed by integration by parts, if $X$ has a continuous density $f$. What happens in the general case? Can we always compute it (or bound it)?
7 votes · 2 answers

Deceptively simple problem to find expected number of seconds it takes a particle to exit $(-1, 1)$

I've been working through past MIT Primes problems, and got stuck on 2021 Problem M4: A particle is initially on the number line at a position of $0$. Every second, if it is at position $x$, it chooses a real number $t \in [−1, 1]$ uniformly and…
7 votes · 3 answers

Conditions for weak convergence and generalized distribution functions

I am having some trouble proving Corollary 6.3.2 in Borovkov's Probability Theory (for reference, this material is on pages 147 to 149 in the book). For convenience, I provide some definitions and related theorems. Skip to the end for a…
6 votes · 1 answer

What are basic properties of first order stochastic dominance?

I have a research problem involving stochastic dominance (1st and 2nd order). I understand the definition, but I'm surprised that I've had trouble finding some basic rules governing these relationships. I would expect there would be some rules…
6 votes · 3 answers

Variance of the minimum of two r.v.'s

For two nonnegative independent r.v.'s, $X,Y$, with the same distribution and finite second moment, I'm trying to show that $Var[\min(X,Y)]\leqslant Var[X]$. Attempt 1. For the continuous case, I've written the first and second moments of…
6 votes · 1 answer

Evaluating a summation involving binomial coefficients and gaussian CDF

I am trying to evaluate the following summation: $\hspace{2in}f=\sum_{z=0}^{N-1}Q(z)\phi(z)$ Here, $\hspace{1in}Q(z)=\left(\begin{array}{c}N-1 \\ z\end{array}\right)\left(\frac{ q}{N}\right)^{z}\left(1-\frac{…
6 votes · 0 answers

Countably many discontinuities of a CDF in n-dimensions

It is well known that a cdf of a random variable in $\mathbb{R}$ has at most countably many discontinuities. How does this generalize to $n$-dimensions from the $1$-dimensional case? That is, for the $1$-dimensional case we show the following: If…
5 votes · 1 answer

Is the probability of $X$ in the interval $[0, \mathbb{E}[X]]$ at least $1/2$, if $X \sim \chi^2$?

I am working on the $\chi^2$ distribution and have the following assumption: The cumulative distribution function of a $\chi^2$ distributed random variable is greater than $\frac{1}{2}$ at the right boundary of the interval $[0,…
5 votes · 1 answer

Analyzing Cumulative Distribution Functions in Sampling Without Replacement vs. With Replacement

I am studying a population of $N$ bits, comprising $K$ ones and $N-K$ zeros. For sampling $n$ bits without replacement, the situation conforms to a hypergeometric distribution. The sum of these $n$ bits, $S_n$, yields a mean of $n\frac{K}{N}$ and a…
5 votes · 0 answers

Multidimensional version of inverse transform sampling

I want to generalize the inverse transform sampling technique to a multidimensional setting. I have only seen this for the circle, as described in the motivation below, but I can't seem to find any good resources on the generalized problem. The…
5 votes · 1 answer

A coin is thrown until two heads and two tails appear. What is the cumulative distribution function of this situation?

A coin is thrown until two heads and two tails appear. Let $Y$ be the number of throws until this happens. What is the cumulative distribution function of $Y$? What I have gotten so far: The last throw can end up being either head or tail. Let's…