Is there a theorem that states what the distribution of a function of a random variable should be given the distribution of a random variable?
For example, say $X_1, X_2, \ldots, X_n$ is a sequence of i.i.d. random variables drawn from a Bernoulli distribution with some parameter $p$, i.e.
$X_i \sim \mathrm{Bernoulli}(p)$.
Let's also define the sample mean $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n}X_i$.
What is the distribution of $\bar{X}_n$? And $n\bar{X}_n?$
My guess is that they are both approximately normal.
$\bar{X}_n \approx N\left(p, \frac{p(1-p)}{n}\right)$ for large $n$, because the CLT states that, given a large enough sample size, the distribution of sample means approaches a normal distribution regardless of the underlying distribution. In this case the underlying distribution is Bernoulli, with variance $\sigma^2 = p(1-p)$.
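A quick simulation seems to bear this out (NumPy; the variable names and the particular values of $n$, $p$, and the number of replications are my own choices):

```python
import numpy as np

# Simulate many sample means of n Bernoulli(p) draws and compare their
# empirical mean and variance to the CLT approximation N(p, p(1-p)/n).
rng = np.random.default_rng(0)
n, p, reps = 1000, 0.3, 5000

# Each row is one sample of n Bernoulli(p) variables; take row means.
means = rng.binomial(1, p, size=(reps, n)).mean(axis=1)

print(means.mean())  # should be close to p = 0.3
print(means.var())   # should be close to p(1-p)/n = 0.00021
```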
Using the delta method, I was able to arrive at
$$g(\bar{X}_n) \approx N\left(g(p), \frac{g'(p)^2\sigma^2}{n}\right)$$
and, taking $g(x) = nx$ so that $g'(p) = n$,
$$n\bar{X}_n \approx N\left(np,\; np(1-p)\right)$$
Was it appropriate to use the delta method here?
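As a sanity check on the second guess: $n\bar{X}_n$ is just the sum of the $X_i$, which is exactly $\mathrm{Binomial}(n, p)$, so its mean and variance should be $np$ and $np(1-p)$. A short simulation (again NumPy, with arbitrarily chosen values) agrees:

```python
import numpy as np

# The sum n*Xbar_n of n Bernoulli(p) variables is exactly Binomial(n, p),
# so its empirical mean and variance should match np and np(1-p).
rng = np.random.default_rng(1)
n, p, reps = 1000, 0.3, 5000

# Each row is one sample of n Bernoulli(p) variables; take row sums.
sums = rng.binomial(1, p, size=(reps, n)).sum(axis=1)

print(sums.mean())  # should be close to n*p = 300
print(sums.var())   # should be close to n*p*(1-p) = 210
```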