A.S. convergence of sum of square-integrable independent random variables with summable variation
I'm working on the following exercise from Achim Klenke's "Probability Theory: A Comprehensive Course" (exercise 6.1.4):
Let $X_1, X_2, \ldots$ be independent, square integrable, centered random variables with $\sum_{i=1}^\infty \mathbf{Var}[X_i] < \infty$. Show that there exists a square integrable $X$ with $X = \lim_{n \to \infty} \sum_{i=1}^n X_i$ almost surely.
Writing $S_n := \sum_{i=1}^n X_i$, Chebyshev's inequality gives us (for $m < n$)
$$
\mathbf P\left[|S_m - S_n| > \epsilon\right] \leq \epsilon^{-2} \mathbf{Var}\left[ \sum_{i=m+1}^n X_i\right] = \epsilon^{-2} \sum_{i=m+1}^n \mathbf{Var}\left[X_i\right] \xrightarrow{m,n \to \infty} 0,
$$
whence $(S_n)_{n \in \mathbb N}$ is a Cauchy sequence in probability. Thus $S_n \xrightarrow{\mathbf P} X$ for some random variable $X$. Using a similar strategy, we can in fact show that $S_n \to X$ in $L^2$.
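To spell out that step: by independence and centering,
$$
\mathbf E\left[(S_n - S_m)^2\right] = \sum_{i=m+1}^n \mathbf{Var}[X_i] \xrightarrow{m,n \to \infty} 0,
$$
so $(S_n)$ is Cauchy in the complete space $L^2$ and has an $L^2$-limit, which agrees almost surely with the limit in probability; in particular $X$ is square integrable.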
Now, to prove almost sure convergence, I'd like to use the following result (Corollary 6.13 in Klenke):
Let $(E,d)$ be a separable metric space. Let $f, f_1, f_2, \ldots$ be measurable maps $\Omega \to E$. Then the following statements are equivalent.
(i)$\quad f_n \to f$ in measure as $n \to \infty$.
(ii)$\quad$For any subsequence of $(f_n)_{n \in \mathbb N}$, there exists a sub-subsequence that converges to $f$ almost everywhere.
and somehow use the fact that we're working with a sum of centered random variables to show that in fact every subsequence converges a.s. But I'm not sure how to do this since our $X_i$ are not nonnegative. I tried reconstructing the proof of this theorem, but I've only been able to show once again that there are a.e. convergent subsequences.
My other thought was to apply the Borel-Cantelli lemma to the events $B_n(\epsilon) := \left\{ |X - S_n| > \epsilon\right\}$ and prove that $\limsup_{n \to \infty} B_n(\epsilon) =: B(\epsilon)$ has probability $0$, but in the latter case I don't know how to approximate the probability of $B_n(\epsilon)$. Chebyshev doesn't seem available to us since strictly speaking we don't know what $X$ looks like, only that $S_n$ converges in $L^2$ to it. Even if we could say $X - S_n = \sum_{i=n+1}^\infty X_i$, the above approximation using Chebyshev with $|X - S_n|$ instead of $|S_m - S_n|$ would work out to
$$
\mathbf P\left[|X - S_n| > \epsilon\right] \leq \epsilon^{-2} \sum_{i=n+1}^\infty \mathbf{Var}[X_i],
$$
which, summed over $n$, would give $\epsilon^{-2} \sum_{n=1}^\infty n\,\mathbf{Var}[X_n]$, but I don't see why this series converges.
Any thoughts on how to prove $S_n to X$ almost surely?
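As a quick numerical sanity check (not a proof, and with an illustrative choice of distribution, $X_i$ uniform on $[-1/i, 1/i]$ so that the variances are summable), the partial-sum path does appear to settle down:

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative choice (not from the exercise): X_i ~ Uniform[-1/i, 1/i],
    # so Var[X_i] = 1/(3 i^2), which is summable.
    n = 2000
    i = np.arange(1, n + 1)
    X = rng.uniform(-1.0 / i, 1.0 / i)   # one sample of (X_1, ..., X_n)
    S = np.cumsum(X)                     # partial sums S_1, ..., S_n

    # The tail of the path barely moves, consistent with a.s. convergence.
    print(S[[9, 99, 999, 1999]])
    print(np.abs(S[1000:] - S[-1]).max())
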
real-analysis probability probability-theory convergence borel-cantelli-lemmas
asked Dec 16 '18 at 6:15
D Ford
3 Answers
Since
$$
\lim_{n\to\infty}\mathsf{P}\left(\sup_{k\ge n}|S_n-S_k|> \epsilon\right)=0,
$$
the set on which the sequence $\{S_n\}$ is not Cauchy,
$$
N=\bigcup_{\epsilon>0}\bigcap_{n\ge 1}\left\{\sup_{j,k\ge n}|S_j-S_k|>\epsilon\right\},
$$
is a null set ($\because \sup_{j,k\ge n}|S_j-S_k|\le 2\sup_{k\ge n}|S_n-S_k|$). So you define $X:=\lim_{n\to\infty} S_n 1_{N^c}$.
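One way to obtain the displayed limit (a sketch; the answer does not fix a particular route) is Kolmogorov's maximal inequality: for $m>n$,
$$
\mathsf{P}\left(\max_{n<k\le m}|S_k-S_n|>\epsilon\right)\le \epsilon^{-2}\sum_{i=n+1}^{m}\mathbf{Var}[X_i]\le \epsilon^{-2}\sum_{i=n+1}^{\infty}\mathbf{Var}[X_i],
$$
and letting $m\to\infty$ and then $n\to\infty$ makes the right-hand side vanish, because the tail sums of the convergent series $\sum_i \mathbf{Var}[X_i]$ go to $0$.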
edited Dec 16 '18 at 8:01
answered Dec 16 '18 at 7:15
d.k.o.
This seems a little too good to be true. Why does this not imply, for example, that every sequence of random variables that converges in probability converges almost surely? (This is a false result, e.g. $X_n \sim \mathrm{Ber}_{1/n}$.)
– D Ford
Dec 16 '18 at 14:27
@DFord "Why does this not imply..."? Why should it?
– d.k.o.
Dec 16 '18 at 18:20
Well, replace $S_n$ with any sequence of random variables $X_n$ that converges in probability to $X$. It appears as though the same argument applies: the set on which $\{X_n\}$ is not Cauchy is a null set.
– D Ford
Dec 16 '18 at 18:26
@DFord Does $\mathsf{P}(\sup_{k\ge n}|X_n-X_k|>\epsilon)$ converge to $0$ in that case?
– d.k.o.
Dec 16 '18 at 18:28
Cauchy in prob. means that $\mathsf{P}(|X_j-X_k|>\epsilon)\to 0$ as $j,k\to \infty$. For the a.s. convergence you need a stronger condition.
– d.k.o.
Dec 16 '18 at 19:00
$\operatorname{Var}\left(\sum_{i=n}^{m} X_i\right) = \sum_{i=n}^{m} \operatorname{Var}(X_i) \to 0$ as $n,m \to \infty$, so the partial sums of $\sum X_i$ form a Cauchy sequence in $L^{2}$. Hence there is a square integrable random variable $X$ such that $\sum_{i=1}^{n} X_i \to X$ in $L^{2}$. Now convergence in mean square implies convergence in probability, and for series of independent random variables convergence in probability implies almost sure convergence.
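For instance, the mean-square-to-probability step is just Chebyshev's (Markov's) inequality applied to the $L^2$ limit:
$$
P\left[|S_n - X| > \epsilon\right] \leq \epsilon^{-2}\, E\left[(S_n - X)^2\right] \xrightarrow{n\to\infty} 0.
$$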
answered Dec 16 '18 at 11:56
Kavi Rama Murthy
"For series of independent random variables, convergence in probability implies almost sure convergence." Is this a standard result? Does it have a name? I'm not familiar with this.
– D Ford
Dec 16 '18 at 14:28
It is a well-known result. A proof can be found in Chung's 'A Course in Probability Theory'. @DFord
– Kavi Rama Murthy
Dec 16 '18 at 23:22
Your try is not bad, but I doubt that those methods will give you the result. We should show that the sum of independent random variables
$$
S_n = \sum_{i=1}^n X_i \to S_\infty
$$
almost surely. But we know that in general
$$
P\left(\sum_{i=1}^\infty |X_i|<\infty\right) =1
$$
fails. This means that the series is conditionally convergent in most cases, and it is known that some kind of maximal inequality such as
$$
P\left(\max_{n\in \mathbb{N}} |S_n|>\lambda\right) \leq \frac{C}{\lambda^2}\sum_{n=1}^\infty \operatorname{Var}(X_n)
$$
provides a necessary and sufficient condition for the almost sure convergence. It is necessary since we need to control the oscillation of the sequence
$$
n\mapsto S_n(\omega)
$$
for almost all $\omega\in \Omega$. Fortunately, there are several known maximal inequalities, such as Kolmogorov's maximal inequality, Etemadi's inequality, or martingale maximal inequalities. In particular, Kolmogorov's inequality can establish that
$$
S_n \text{ converges a.s.} \iff S_n \text{ converges in probability.}
$$
(Or you can see this: https://en.wikipedia.org/wiki/Kolmogorov%27s_two-series_theorem.) If you are allowed to use more powerful tools such as the martingale convergence theorem, then
$$
S_n = \sum_{i=1}^n X_i \to S_\infty
$$
almost surely follows immediately.
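A sketch of the martingale route, assuming that theorem is available: with $\mathcal F_n := \sigma(X_1,\ldots,X_n)$, the sequence $(S_n)$ is a martingale (by independence and centering), and
$$
\sup_n E[S_n^2] = \sup_n \sum_{i=1}^n \operatorname{Var}(X_i) = \sum_{i=1}^\infty \operatorname{Var}(X_i) < \infty,
$$
so $(S_n)$ is an $L^2$-bounded martingale and therefore converges almost surely (and in $L^2$).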
edited Dec 16 '18 at 12:06
answered Dec 16 '18 at 7:15
Song
Is there a clear reason why $S_n$ converges a.s. $\iff$ $S_n$ converges in probability that follows from Kolmogorov's inequality? I'm having trouble seeing it.
– D Ford
Dec 16 '18 at 18:48
It seems that there are (at least) two of Kolmogorov's maximal inequalities. The version I referred to is this one: let $x>a>0$ and $p = \max_{j\leq n} P(|S_n-S_j|>a)$. Then $P(\max_{j\leq n}|S_j|>x) \leq \frac{1}{1-p}P(|S_n|>x-a)$. What this inequality can show is the argument in @d.k.o.'s answer. Of course, $S_n=X_1+X_2+\cdots +X_n$ and the $X_i$ are independent (they need not be identically distributed).
– Song
Dec 16 '18 at 18:53
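Spelling that connection out (a sketch based on the inequality quoted in the comment above): applied to the increments $S_j - S_n$, $n<j\le m$, which are partial sums of the independent variables $X_{n+1},\ldots,X_m$, it gives
$$
P\left(\max_{n<j\leq m}|S_j-S_n|>2\epsilon\right) \leq \frac{P(|S_m-S_n|>\epsilon)}{1-\max_{n<j\leq m}P(|S_m-S_j|>\epsilon)}.
$$
If $(S_n)$ is Cauchy in probability, then for $n$ large the denominator is at least $\tfrac12$ uniformly in $m$ and the numerator is uniformly small; letting $m\to\infty$ and then $n\to\infty$ yields $\lim_{n\to\infty} P(\sup_{k\ge n}|S_k-S_n|>2\epsilon)=0$, which is the limit displayed in d.k.o.'s answer (with $2\epsilon$ in place of $\epsilon$; since $\epsilon>0$ is arbitrary, this is the same statement).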