Showing independence of increments of a stochastic process












The textbook on stochastic calculus I am now reading says that
if $X\colon [0,\infty)\times\Omega\rightarrow\mathbb R$ is a stochastic process such that




  1. $X(t)-X(s)\sim N(0,t-s)$ for all $t \geq s \geq 0$,


  2. $E[X(t)X(s)]=\min\{s,t\}$ for all $s,t \geq 0$,



then $X$ has independent increments, i.e. for every $0 \leq t_1<\dots<t_n$, the random variables $X(t_1)$, $X(t_2)-X(t_1)$, …, $X(t_n)-X(t_{n-1})$ are independent.



Here $X(t)$ denotes the random variable $X(t)\colon\Omega\rightarrow \mathbb R$ such that $X(t)(\omega)=X(t,\omega)$.



But I suspect this is not true as stated, and that we need the additional assumption that $X$ is a Gaussian process. (With that assumption, the independence is easy to show.)
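
For concreteness, here is the easy argument I have in mind under the Gaussian-process assumption: the vector of the variables above is a linear image of $(X(t_1),\dots,X(t_n))$,
$$\begin{pmatrix}X(t_1)\\X(t_2)-X(t_1)\\\vdots\\X(t_n)-X(t_{n-1})\end{pmatrix}=\begin{pmatrix}1&&&\\-1&1&&\\&\ddots&\ddots&\\&&-1&1\end{pmatrix}\begin{pmatrix}X(t_1)\\X(t_2)\\\vdots\\X(t_n)\end{pmatrix},$$
so it is jointly Gaussian; condition 2 then makes its covariance matrix diagonal, and a jointly Gaussian vector with diagonal covariance matrix has independent components.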



Am I on the right track? If so, can you give me some counterexamples?



Or can it be shown without assuming that $X$ is a Gaussian process?



Any hint would be appreciated! Thanks and regards.










probability-theory stochastic-processes brownian-motion






asked Nov 4 '18 at 13:45 by Mhr, edited Dec 21 '18 at 1:24 by Saad

  • Seems this will return to oblivion. @Mhr, could you let me know which textbook you're referring to? Maybe there's a hint in it. – AddSup, Dec 25 '18 at 19:31

  • @AddSup Yes, also interested to get the original source to check for myself! – Ezy, Dec 29 '18 at 4:34














2 Answers












Let $t_0=0$ and look at the sequence $Y_i=X(t_i)-X(t_{i-1})$ for $i\in[1:n]$. These are Gaussian with mean $0$ and respective variances $t_i-t_{i-1}$. Now compute all the covariances; for $i<j$ (without loss of generality),
\begin{align*}
\mathbb{E}[Y_i Y_j] &= \mathbb{E}[(X(t_i)-X(t_{i-1}))(X(t_j)-X(t_{j-1}))]\\
&=\mathbb{E}[X(t_i)X(t_j)]-\mathbb{E}[X(t_{i-1})X(t_j)]-\mathbb{E}[X(t_i)X(t_{j-1})]+\mathbb{E}[X(t_{i-1})X(t_{j-1})]\\
&=t_i-t_{i-1}-t_i+t_{i-1}\\
&=0.
\end{align*}

So the covariance matrix of $(Y_1,\dots,Y_n)$ is diagonal, and hence the $Y_i$ are all independent, provided the vector is jointly Gaussian (see the caveat below). From there you should be able to conclude the statement in the question: the only difference is that the first variable is $X(t_1)=Y_1+X(0)$, and $X(0)$ is uncorrelated with all $Y_i$ for $i\in[2:n]$ (in fact, condition 2 with $s=t=0$ gives $\mathbb{E}[X(0)^2]=0$, so $X(0)=0$ almost surely).
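
As a quick numerical sanity check of this covariance algebra, here is a short NumPy sketch. It simulates standard Brownian motion, which satisfies both conditions by construction, so it only illustrates the computation above, not the implication the question asks about; the time grid, seed, and variable names are my own choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Grid 0 = t_0 < t_1 < ... < t_n and number of Monte Carlo paths.
t = np.array([0.0, 0.3, 0.7, 1.2, 2.0])
n_paths = 200_000

# Simulate standard Brownian motion: independent increments
# Y_i ~ N(0, t_i - t_{i-1}), and X(t_i) as their cumulative sums.
Y = rng.normal(0.0, np.sqrt(np.diff(t)), size=(n_paths, len(t) - 1))
X = np.cumsum(Y, axis=1)

# Condition 2: E[X(t_i) X(t_j)] should be approximately min(t_i, t_j).
print(np.round(X.T @ X / n_paths, 2))

# The increment covariance matrix should be diag(t_i - t_{i-1}),
# i.e. all the off-diagonal entries E[Y_i Y_j] computed above vanish.
print(np.round(Y.T @ Y / n_paths, 2))
```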



The only problem I can see is if $t_1=0$, but in that case you can reduce $n$ by one and drop the first element to prove your result.





All of this is true only if $Y=(Y_1,\dots,Y_n)$ is jointly Gaussian. A sufficient condition for $Y$ to be jointly Gaussian (indeed, the usual definition) is that $\mathbf a^T Y$ is a Gaussian random variable for every vector $\mathbf a$ of size $n$.
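
For completeness, here is the standard verification (a one-line computation, not specific to this problem) that joint Gaussianity of the mean-zero vector $Y$ plus a diagonal covariance matrix $\Sigma$ gives independence: the joint characteristic function factorizes,
$$\mathbb E\big[e^{i\,\mathbf a^T Y}\big]=e^{-\frac12\,\mathbf a^T\Sigma\,\mathbf a}=\prod_{k=1}^n e^{-\frac12\,\Sigma_{kk}a_k^2}=\prod_{k=1}^n\mathbb E\big[e^{i\,a_k Y_k}\big],$$
and factorization of the joint characteristic function is equivalent to the independence of $Y_1,\dots,Y_n$.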



I am not sure how to prove that, but something of use may be that $Y_i+Y_{i+1}+\dots+Y_{j-1}+Y_j = X(t_j)-X(t_{i-1})$ is Gaussian for any $i<j$.






answered Nov 4 '18 at 14:10, edited Nov 4 '18 at 14:52 – P. Quinton

  • That's just uncorrelatedness, not independence, I guess? How do you know that $(Y_1,\dots,Y_n)$ is jointly normal? – Mhr, Nov 4 '18 at 14:13



















I believe you can use a trick similar to the one used for the Lévy characterisation of BM: apply Itô's formula to the characteristic function of $X_t$ and then use iterated expectations to show the independence of the various $X_{t_i}-X_{t_j}$, again by showing that the characteristic-function expectation factorizes.



See this article for details



http://individual.utoronto.ca/normand/Documents/MATH5501/Project-3/Levy_characterization_of_Brownian_motion.pdf
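
To illustrate the criterion being invoked (not the Itô argument itself), here is a small numerical sketch: for independent variables the joint characteristic function factorizes, and one can spot-check this empirically for two Brownian increments. NumPy, the sample size, and the test points $u,v$ are my own choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two consecutive Brownian increments, e.g. Y1 = X(t1) - X(t0) and
# Y2 = X(t2) - X(t1), simulated directly as independent Gaussians.
n_paths, dt1, dt2 = 500_000, 0.5, 1.5
Y1 = rng.normal(0.0, np.sqrt(dt1), n_paths)
Y2 = rng.normal(0.0, np.sqrt(dt2), n_paths)

def cf(Z, u):
    """Empirical characteristic function E[exp(i u Z)]."""
    return np.mean(np.exp(1j * u * Z))

# Independence is equivalent to factorization of the joint characteristic
# function: E[e^{i(u Y1 + v Y2)}] = E[e^{i u Y1}] E[e^{i v Y2}] for all u, v;
# here we check a single pair (u, v) as a spot check.
u, v = 0.8, -1.3
joint = np.mean(np.exp(1j * (u * Y1 + v * Y2)))
print(abs(joint - cf(Y1, u) * cf(Y2, v)))  # ~ 0 up to Monte Carlo error
```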






answered Dec 27 '18 at 6:10 – Ezy
  • Ezy, thank you, that seems promising. But it also seems that to apply Lévy's characterization, we need $X$ to be a square-integrable continuous martingale (or a continuous local martingale with $[X]_t=t$). Could you show me how we can get this condition from the two assumptions in the original post? – AddSup, Dec 27 '18 at 8:19

  • To be specific, on page 3 of the note you linked, how do we know $f(M_t,t)$ is a martingale when $M$ is only known to satisfy the two conditions in the OP? – AddSup, Dec 27 '18 at 8:55

  • @AddSup Well, you can easily get the quadratic variation of the process from the assumption about the increments, and the square integrability from the second assumption. You also get the martingale relation from the assumption about the increments, and I believe you get the continuity from it as well. – Ezy, Dec 27 '18 at 8:56

  • Thank you. But still lost: (i) QV: the derivation of the QV of BM I know of (e.g., Revuz and Yor, p. 29) relies on the independence of increments. (ii) Square integrability: true. (iii) Martingality: no idea (the assumption is about unconditional distributions). (iv) Path continuity: no idea. – AddSup, Dec 27 '18 at 9:42

  • @AddSup For the QV part I don't think you need independence, just that moments of the increments up to order 4 factorize, which I believe you can do similarly to the derivation in the other answer above; see p. 4 of ocw.mit.edu/courses/sloan-school-of-management/… . For continuity, I agree I went too quickly, but I believe jump processes would have quadratic variation strictly higher than $t$ due to the jump contribution. I agree I need to find a more convincing proof of that (if it's correct). – Ezy, Dec 27 '18 at 12:09











