For each n \in \mathbb N let X_n \sim \mathrm {Po} (n). By considering the probability \mathbb P(X_n \le n), evaluate:

\displaystyle \lim_{n \to \infty} e^{-n} \left(1 + n + \frac {n^2} {2!} + \ldots + \frac {n^n} {n!}\right)

You might be surprised by the result.

You may use the central limit theorem without proof.

The answer is correct but I’m not quite sure I understand this solution. I think it’s along the right lines, though. It looks like you’re taking a limit as n \to \infty but getting a result in terms of n?
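As an aside, the limit itself can be checked numerically. Here is a sketch (not from the thread), summing each term in log space so that e^{-n} doesn’t underflow for large n:

```python
import math

def poisson_cdf_at_n(n):
    # e^{-n} * sum_{k=0}^{n} n^k / k!, i.e. P(X_n <= n) for X_n ~ Po(n),
    # with each term computed in log space for numerical stability
    return sum(math.exp(-n + k * math.log(n) - math.lgamma(k + 1))
               for k in range(n + 1))

for n in (10, 100, 1000, 10000):
    print(n, poisson_cdf_at_n(n))  # the values creep down towards 1/2
```

The printed values decrease slowly towards 1/2, consistent with the CLT argument below.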

1 Like

Yeah, I was trying to say that in the limit, the cumulative distribution function of the Poisson RV would tend to the cumulative distribution function of the normal RV with parameters n and n. I can see how what I’ve written may be confusing, though - what would be the correct notation to express this?

1 Like

I’m not sure that’s true. Remember that the CLT says that if \langle X_i\rangle is a random sample of size n drawn from X, then:

\displaystyle \frac {S_n - \mu} {\frac {\sigma} {\sqrt n}} \xrightarrow D N(0,1)

where \xrightarrow D means convergence in distribution (i.e. the limit of the cumulative distribution function of the RV on the LHS as n \to \infty is the N(0,1) cumulative distribution function), with \displaystyle S_n = \frac 1 n \sum_{i = 1}^n X_i, \mu = \mathrm E[X] and \sigma = \sqrt {\mathrm {var}(X)}.

What could X be in this instance?

[I realised I reused notation confusingly in this - the $X_n$s here should not be the same as in the question…]

1 Like

Yes, sorry. Here is a (hopefully) fixed version.
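The fixed version isn’t quoted in the thread, but presumably it takes X \sim \mathrm{Po}(1), so that X_1 + \ldots + X_n \sim \mathrm{Po}(n) and the CLT applies directly to \mathbb P(X_n \le n) \to \Phi(0) = \tfrac 1 2. A simulation sketch of that choice (Python; `poisson1` is a helper written just for this illustration):

```python
import math
import random

random.seed(0)  # reproducible run

def poisson1():
    # Knuth's method for a single Poisson(1) draw
    L, k, p = math.exp(-1.0), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

n, trials = 400, 2000
# X_1 + ... + X_n is Poisson(n), so the fraction of trials with the sum
# <= n estimates P(X_n <= n); the CLT says this tends to Phi(0) = 1/2
hits = sum(sum(poisson1() for _ in range(n)) <= n for _ in range(trials))
print(hits / trials)  # roughly 0.5, slightly above since n is finite
```

For finite n the estimate sits a little above 1/2 (the Poisson is right-skewed), matching the numerics earlier in the thread.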

2 Likes

Can’t wait to learn how to do this in Further Stats!

1 Like

Yes this works!

You don’t learn it “properly” in FS1 - mainly because you’re given no notion of “convergence” for sequences of random variables. You might meet these in a first- or second-year probability theory module, but you’ll probably only see the central limit theorem proved from scratch in the third year. (At Warwick this happens in ST318 Probability Theory.)

Basically you learn that for large n, the sample mean S_n approximately follows \displaystyle N \left(\mu, \frac {\sigma^2} n\right), whatever the distribution of X; then you plug that into a calculator to find some probability. That’s about as far as A-level goes. Since “approximates” is vacuous without at least quantifying the error, this isn’t really a proper statement of the CLT.
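That calculator recipe can be sketched in a few lines. This assumes the usual A-level route of approximating \mathrm{Po}(n) by N(n, n) with a continuity correction (not a statement of the CLT, just the plug-in approximation):

```python
import math

def Phi(z):
    # standard normal CDF, via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Po(n) is approximately N(n, n) for large n, so with a continuity
# correction P(X_n <= n) is approximately Phi((n + 0.5 - n) / sqrt(n))
for n in (10, 100, 1000):
    print(n, Phi(0.5 / math.sqrt(n)))  # tends to Phi(0) = 1/2
```

The continuity-correction term 0.5/\sqrt n vanishes as n \to \infty, which is another way of seeing why the limit is 1/2.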

1 Like

Oh, I’ll have to wait 4 years then

To be fair - you might see a sketch proof in the first year, but that proof will assume Lévy’s continuity theorem, which is intuitively plausible but whose proof is very non-trivial. Hence I say “from scratch”.

It is frustrating when things are put off like this, but luckily once you get to university it’s not done too often.

2 Likes