
Convergence of series of random variables.


Proposition

(Kolmogorov inequality for series 1) Let $X_{1},...,X_{n}$ be independent r.v. such that $$EX_{k}=0,\quad EX_{k}^{2}<\infty,\quad k=1,...,n,\qquad S_{k}=X_{1}+\cdots+X_{k}.$$ Then for any $\varepsilon>0$ $$P\left(\max_{1\le k\le n}\left\vert S_{k}\right\vert \ge\varepsilon\right)\le\frac{1}{\varepsilon^{2}}\sum_{k=1}^{n}EX_{k}^{2}.$$

Proof

We introduce the notations $$\sigma^{2}=\sum_{k=1}^{n}EX_{k}^{2}=ES_{n}^{2},\qquad A_{k}=\left\{ \left\vert S_{j}\right\vert <\varepsilon,\ j=1,...,k-1,\ \left\vert S_{k}\right\vert \ge\varepsilon\right\} ,\qquad A=\left\{ \max_{1\le k\le n}\left\vert S_{k}\right\vert \ge\varepsilon\right\} .$$ Observe that the $A_{k}$ are disjoint and $A=\bigcup_{k=1}^{n}A_{k}$. We calculate $$\sigma^{2}=ES_{n}^{2}\ge E\left[S_{n}^{2}1_{A}\right]=\sum_{k=1}^{n}E\left[S_{n}^{2}1_{A_{k}}\right]=\sum_{k=1}^{n}\left(E\left[S_{k}^{2}1_{A_{k}}\right]+2E\left[S_{k}\left(S_{n}-S_{k}\right)1_{A_{k}}\right]+E\left[\left(S_{n}-S_{k}\right)^{2}1_{A_{k}}\right]\right).$$ The term $E\left[S_{k}\left(S_{n}-S_{k}\right)1_{A_{k}}\right]$ vanishes as follows. Note that $S_{k}1_{A_{k}}$ is a function of $X_{1},...,X_{k}$ and $S_{n}-S_{k}$ is a function of $X_{k+1},...,X_{n}$ . Hence, these are independent: $$E\left[S_{k}\left(S_{n}-S_{k}\right)1_{A_{k}}\right]=E\left[S_{k}1_{A_{k}}\right]E\left[S_{n}-S_{k}\right]=0$$ because the second integral is zero: $E\left[S_{n}-S_{k}\right]=\sum_{j=k+1}^{n}EX_{j}=0$ . We continue the calculation of $\sigma^{2}$ : $$\sigma^{2}\ge\sum_{k=1}^{n}E\left[S_{k}^{2}1_{A_{k}}\right]\ge\varepsilon^{2}\sum_{k=1}^{n}P\left(A_{k}\right)=\varepsilon^{2}P\left(A\right)=\varepsilon^{2}P\left(\max_{1\le k\le n}\left\vert S_{k}\right\vert \ge\varepsilon\right),$$ where we used $\left\vert S_{k}\right\vert \ge\varepsilon$ on $A_{k}$. Division by $\varepsilon^{2}$ completes the proof.
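A quick Monte Carlo sanity check of the inequality (a sketch, not part of the proof; the uniform increments and the values of $n$, $\varepsilon$ and the number of paths are arbitrary illustrative choices):

import numpy as np

rng = np.random.default_rng(0)
n, eps, n_paths = 50, 8.0, 100_000

# X_k ~ Uniform(-1, 1): E X_k = 0, E X_k^2 = 1/3.
X = rng.uniform(-1.0, 1.0, size=(n_paths, n))
S = np.cumsum(X, axis=1)                          # partial sums S_1, ..., S_n
lhs = np.mean(np.max(np.abs(S), axis=1) >= eps)   # estimate of P(max_k |S_k| >= eps)
rhs = (n / 3.0) / eps**2                          # (sum_k E X_k^2) / eps^2
print(f"estimated P = {lhs:.5f} <= bound = {rhs:.5f}")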

Proposition

(Kolmogorov inequality for series 2) Let $X_{1},X_{2},...$ be a sequence of independent r.v. such that $\left\vert X_{n}\right\vert \le c$ a.s. for every $n$ . Then for any $\varepsilon>0$ we have $$P\left(\max_{1\le k\le n}\left\vert S_{k}\right\vert \le\varepsilon\right)\le\frac{\left(c+2\varepsilon\right)^{2}}{\sum_{k=1}^{n}\operatorname{Var}X_{k}},\qquad S_{k}=X_{1}+\cdots+X_{k}.$$
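A Monte Carlo sanity check of this bound in the form stated above (a sketch; the symmetric $\pm1$ increments and all parameters are illustrative choices, and the constant $\left(c+2\varepsilon\right)^{2}$ follows the statement given here):

import numpy as np

rng = np.random.default_rng(0)
n, c, eps, n_paths = 100, 1.0, 4.0, 100_000

# X_k = +-1 with probability 1/2: |X_k| <= c and Var X_k = 1.
X = rng.choice([-1.0, 1.0], size=(n_paths, n))
S = np.cumsum(X, axis=1)
lhs = np.mean(np.max(np.abs(S), axis=1) <= eps)   # estimate of P(max_k |S_k| <= eps)
rhs = (c + 2 * eps) ** 2 / n                      # (c + 2 eps)^2 / sum_k Var X_k
print(f"estimated P = {lhs:.5f} <= bound = {rhs:.5f}")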

Proposition

(Kolmogorov's three series theorem) Let $X_{1},X_{2},...$ be independent r.v. Fix a constant $A>0$ and define $$Y_{n}=X_{n}1_{\left\{ \left\vert X_{n}\right\vert \le A\right\} }.$$ Then the series $\sum_{n}X_{n}$ converges a.s. iff all of the following series converge:

1. $\sum_{n}P\left(\left\vert X_{n}\right\vert >A\right)$ ,

2. $\sum_{n}EY_{n}$ ,

3. $\sum_{n}\operatorname{Var}Y_{n}$ (see the numerical sketch after this list).
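As a numerical reading of the three conditions (a sketch; the random-sign model $X_{n}=\pm n^{-q}$, the cutoff $A=1$ and the exponents $q$ are illustrative choices), the conditions reduce to the behaviour of the series 3:

import numpy as np

# X_n = s_n * n**(-q) with independent random signs s_n and A = 1:
# then Y_n = X_n, the series 1 is identically 0, the series 2 is 0
# term by term, and the series 3 equals sum_n n**(-2q).
for q in (0.6, 0.4):
    n = np.arange(1, 200_001, dtype=float)
    var_Y = n ** (-2 * q)
    head, tail = var_Y[:100_000].sum(), var_Y[100_000:].sum()
    print(f"q={q}: series 3 partial sum = {head + tail:.3f}, "
          f"increment over n in (100000, 200000] = {tail:.3f}")

For $q=0.6$ the series 3 converges and the theorem gives a.s. convergence of $\sum_{n}X_{n}$; for $q=0.4$ the partial sums of the series 3 keep growing, so $\sum_{n}X_{n}$ does not converge a.s.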

Proof

Suppose that the series 1,2,3 converge. We prove that $\sum_{n}X_{n}$ converges a.s. as follows. We apply the proposition ( Kolmogorov inequality for series 1 ) to the r.v. $Y_{j}-EY_{j}$ for $j=n,...,N$ : $$P\left(\max_{n\le k\le N}\left\vert \sum_{j=n}^{k}\left(Y_{j}-EY_{j}\right)\right\vert \ge\varepsilon\right)\le\frac{1}{\varepsilon^{2}}\sum_{j=n}^{N}\operatorname{Var}Y_{j}.$$ Since the series 3 converges, the RHS vanishes as $n,N\rightarrow\infty$ : $$\lim_{n\rightarrow\infty}\sup_{N\ge n}P\left(\max_{n\le k\le N}\left\vert \sum_{j=n}^{k}\left(Y_{j}-EY_{j}\right)\right\vert \ge\varepsilon\right)=0.$$ According to the proposition ( Probability based criteria for AS convergence ) this implies that the series $\sum_{j}\left(Y_{j}-EY_{j}\right)$ converges a.s.

Since the series 2 also converges, we conclude that the series $\sum_{j}Y_{j}$ converges a.s.

Note that $$\sum_{n}P\left(X_{n}\ne Y_{n}\right)=\sum_{n}P\left(\left\vert X_{n}\right\vert >A\right)$$ and the series 1 converges. Hence, the $Y_{n}$ and $X_{n}$ are equivalent sequences and we already established that $\sum_{n}Y_{n}$ converges a.s. Hence, by the proposition ( Property of equivalent sequences of r.v. ) the $\sum_{n}X_{n}$ converges a.s.

Suppose that the series $\sum_{n}X_{n}$ converges a.s. We prove that 1,2,3 converge as follows. Since $\sum_{n}X_{n}$ converges, $X_{n}\rightarrow0$ a.s. and, therefore, $\left\vert X_{n}\right\vert$ cannot be greater than $A$ for infinitely many $n$ : $$P\left(\left\vert X_{n}\right\vert >A\text{ for infinitely many }n\right)=0.$$ The events $\left\{ \left\vert X_{n}\right\vert >A\right\}$ are independent, hence, by the proposition ( Borel-Cantelli lemma, part 2 ), the series 1 converges. Hence, the $X_{n}$ and $Y_{n}$ are equivalent and, by the proposition ( Property of equivalent sequences of r.v. ), the series $\sum_{n}Y_{n}$ converges a.s.

To prove that the series 3 converges we apply the proposition ( Kolmogorov inequality for series 2 ) to the r.v. $Y_{j}$ , $\left\vert Y_{j}\right\vert \le A$ , for $j=n,...,N$ : $$P\left(\max_{n\le k\le N}\left\vert \sum_{j=n}^{k}Y_{j}\right\vert \le\varepsilon\right)\le\frac{\left(A+2\varepsilon\right)^{2}}{\sum_{j=n}^{N}\operatorname{Var}Y_{j}}.$$ If the series 3 diverges then the RHS above tends to 0 as $N\rightarrow\infty$ . Thus, for every $n$ the tail satisfies $\sup_{k\ge n}\left\vert \sum_{j=n}^{k}Y_{j}\right\vert >\varepsilon$ a.s., and such a series cannot converge. The contradiction with the already established a.s. convergence of $\sum_{n}Y_{n}$ shows that the series 3 must converge.

Finally, since the series 3 converges, the first part of the proof shows that $\sum_{n}\left(Y_{n}-EY_{n}\right)$ converges a.s. Together with the a.s. convergence of $\sum_{n}Y_{n}$ this implies that $\sum_{n}EY_{n}$ converges, which is the series 2.
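As an illustration of the role of the truncation, let $X_{n}$ , $n\ge2$ , be independent with $P\left(X_{n}=n\right)=n^{-2}$ and $P\left(X_{n}=0\right)=1-n^{-2}$ . Then $EX_{n}=n^{-1}$ and $\sum_{n}EX_{n}=\infty$ , yet for $A=1$ we have $Y_{n}=0$ a.s., so the series 2 and 3 vanish and the series 1 equals $\sum_{n}n^{-2}<\infty$ . By the theorem, $\sum_{n}X_{n}$ converges a.s.: the conditions are stated in terms of the truncated $Y_{n}$ , not of $X_{n}$ itself.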

Proposition

(Equivalence of AS and PR convergence for series) If $X_{1},X_{2},...$ are independent r.v. then the convergence of $\sum_{n}X_{n}$ a.s. is equivalent to the convergence of $\sum_{n}X_{n}$ in pr.

Proof

Because of the proposition ( AS convergence vs convergence in pr 1 ) it suffices to prove that the convergence in pr implies the convergence a.s.

Hence, we assume that $\sum_{n}X_{n}$ converges in pr: for every $\varepsilon>0$ $$\lim_{m\rightarrow\infty}\sup_{n>m}P\left(\left\vert S_{n}-S_{m}\right\vert >\varepsilon\right)=0,\qquad S_{n}=\sum_{k=1}^{n}X_{k}.$$ We write $S_{m,n}=S_{n}-S_{m}$ , $m<n$ . Note that for any $k$ , $m<k\leq n$ , $$\left\{ \left\vert S_{m,k}\right\vert >2\varepsilon\right\} \cap\left\{ \left\vert S_{k,n}\right\vert \le\varepsilon\right\} \subset\left\{ \left\vert S_{m,n}\right\vert >\varepsilon\right\} ,$$ hence, $$\bigcup_{k=m+1}^{n}\left(\left\{ \left\vert S_{m,k}\right\vert >2\varepsilon\right\} \cap\left\{ \left\vert S_{k,n}\right\vert \le\varepsilon\right\} \right)\subset\left\{ \left\vert S_{m,n}\right\vert >\varepsilon\right\} .$$ We make the LHS sets disjoint if we add the following modification: $$B_{k}=\left\{ \left\vert S_{m,j}\right\vert \le2\varepsilon,\ j=m+1,...,k-1,\ \left\vert S_{m,k}\right\vert >2\varepsilon\right\} ,\qquad\bigcup_{k=m+1}^{n}\left(B_{k}\cap\left\{ \left\vert S_{k,n}\right\vert \le\varepsilon\right\} \right)\subset\left\{ \left\vert S_{m,n}\right\vert >\varepsilon\right\} .$$ Therefore, $$P\left(\left\vert S_{m,n}\right\vert >\varepsilon\right)\ge\sum_{k=m+1}^{n}P\left(B_{k}\cap\left\{ \left\vert S_{k,n}\right\vert \le\varepsilon\right\} \right)$$ and the $S_{k,n}$ is independent from the $X_{m+1},...,X_{k}$ and, consequently, from the $B_{k}$ . Hence, we continue $$P\left(\left\vert S_{m,n}\right\vert >\varepsilon\right)\ge\sum_{k=m+1}^{n}P\left(B_{k}\right)P\left(\left\vert S_{k,n}\right\vert \le\varepsilon\right)\ge\min_{m<k\le n}P\left(\left\vert S_{k,n}\right\vert \le\varepsilon\right)P\left(\max_{m<k\le n}\left\vert S_{m,k}\right\vert >2\varepsilon\right)$$ (the $B_{k}$ are disjoint and their union is $\left\{ \max_{m<k\le n}\left\vert S_{m,k}\right\vert >2\varepsilon\right\}$ ). We conclude, $$P\left(\max_{m<k\le n}\left\vert S_{m,k}\right\vert >2\varepsilon\right)\le\frac{P\left(\left\vert S_{m,n}\right\vert >\varepsilon\right)}{\min_{m<k\le n}P\left(\left\vert S_{k,n}\right\vert \le\varepsilon\right)}.$$ By the convergence in pr, $P\left(\left\vert S_{k,n}\right\vert \le\varepsilon\right)\ge\frac{1}{2}$ for all sufficiently large $m$ and all $n\ge k>m$ , hence $$\lim_{m\rightarrow\infty}\sup_{n>m}P\left(\max_{m<k\le n}\left\vert S_{m,k}\right\vert >2\varepsilon\right)=0.$$ The statement now follows from the proposition ( Probability based criteria for AS convergence ).
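A Monte Carlo sanity check of the product-form bound derived above, $P\left(\max_{m<k\le n}\left\vert S_{m,k}\right\vert >2\varepsilon\right)\min_{m<k\le n}P\left(\left\vert S_{k,n}\right\vert \le\varepsilon\right)\le P\left(\left\vert S_{m,n}\right\vert >\varepsilon\right)$ , taken with $m=0$ (a sketch; the Gaussian increments and all parameters are illustrative choices):

import numpy as np

rng = np.random.default_rng(1)
n, eps, n_paths = 100, 15.0, 200_000

X = rng.standard_normal((n_paths, n))
S = np.cumsum(X, axis=1)                               # S_1, ..., S_n, with m = 0
p_max  = np.mean(np.max(np.abs(S), axis=1) > 2 * eps)  # P(max_k |S_k| > 2 eps)
p_tail = np.mean(np.abs(S[:, -1]) > eps)               # P(|S_n| > eps)
p_min  = np.min(np.mean(np.abs(S[:, -1:] - S) <= eps, axis=0))  # min_k P(|S_n - S_k| <= eps)
print(f"{p_max * p_min:.5f} <= {p_tail:.5f}")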






















