If $X_n$ is a sequence of iid random variables with expected value $\mu$, and $\overline{X}_n$ is the running average $$ \overline{X}_n := \frac{1}{n} \sum_{i=1}^n X_i, \tag{1} $$ then the weak form of the Law of Large Numbers states that for all $\epsilon > 0$, $$ \lim_{n \to \infty} \Pr \left( \left| \overline{X}_n - \mu \right| > \epsilon \right) = 0, \tag{2} $$ while the strong form states that $$ \Pr \left( \lim_{n \to \infty} \overline{X}_n = \mu \right) = 1. \tag{3} $$
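(For concreteness, here is a quick Monte Carlo sanity check of the weak form (2) for a continuous distribution. The choices $X_i \sim \mathrm{Uniform}(0,1)$, so $\mu = 0.5$, and the tolerance $\epsilon = 0.05$ are mine, just to make the probability in (2) visibly shrink with $n$.)

```python
# Monte Carlo sanity check of the weak LLN, condition (2).
# Assumptions (mine, not part of the question): X_i ~ Uniform(0, 1),
# so mu = 0.5, and eps = 0.05 is an arbitrary tolerance.
import numpy as np

rng = np.random.default_rng(0)
mu, eps, trials = 0.5, 0.05, 2000

for n in (10, 100, 1000):
    # For each trial, draw n iid uniforms and form the running average (1).
    means = rng.uniform(0.0, 1.0, size=(trials, n)).mean(axis=1)
    # Empirical estimate of Pr(|Xbar_n - mu| > eps).
    prob = np.mean(np.abs(means - mu) > eps)
    print(f"n = {n:5d}: Pr(|Xbar_n - mu| > eps) ~ {prob:.4f}")
```

The estimated probability drops toward zero as $n$ grows, which is exactly what (2) asserts; of course, this experiment says nothing about the almost-sure statement (3), which is the point of my question.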
I'm just an idiot physicist, so I have a very pedestrian view of random variables: I think of a random variable as basically being equivalent to its probability density function. (I know that not all random variables have PDFs. But like I said: idiot physicist.)
Is there an explicit example of a sequence $\overline{X}_n$ of continuous random variables with PDFs that satisfies condition (2) but not condition (3)? (It doesn't need to be a running average of the form (1); any sequence of random variables will do.) If so, what does the sequence of PDFs look like?
This would help make the difference between the weak and strong forms very concrete in my mind. I'm more used to thinking about sequences of functions than about sequences of random variables.
The answers to this question are useful, but (as far as I can tell) they both discuss sequences of binary random variables, which seem less natural than continuous ones in the setting of the LLN, since taking the average in (1) immediately produces non-binary values.