Campbell's theorem for the $n$-th power
Campbell's theorem gives a method for calculating expectations of sums of measurable functions $f(x)$ over a point process.
While working on my system model, which involves a Poisson point process, I came across the expression
$$\operatorname{E}\Big[\Big(\sum_{x \in N} f(x)\Big)^n\Big],$$
where $N$ is a Poisson point process.
Is it possible to use Campbell's theorem to find the expectation of the sum of measurable functions $f(x)$ raised to the power $n$?
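For concreteness, here is a minimal Monte Carlo sketch of the quantity I am after (purely illustrative: it assumes a homogeneous process of rate $5$ on $[0,1]$ and a placeholder $f(x)=e^{-x}$, neither of which is fixed by my model):

    import numpy as np

    rng = np.random.default_rng(0)

    def empirical_moment(f, rate, n, trials=200_000):
        """Monte Carlo estimate of E[(sum_{x in N} f(x))^n] for a homogeneous
        Poisson point process of intensity `rate` on [0, 1]."""
        totals = np.empty(trials)
        for t in range(trials):
            k = rng.poisson(rate)            # number of points falling in [0, 1]
            xs = rng.uniform(0.0, 1.0, k)    # given k, the points are i.i.d. uniform
            totals[t] = f(xs).sum()
        return float(np.mean(totals ** n))

    print(empirical_moment(lambda x: np.exp(-x), rate=5.0, n=3))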
measure-theory summation poisson-process
asked Nov 8 at 20:48 by Yuva, edited Nov 9 at 1:50 by Le Anh Dung
3 Answers
Accepted answer (Sangchul Lee, answered Nov 9 at 0:06, edited)
Notice that
$$ \mathbf{E}\left[ \exp\left\{ -s \sum_{x \in N} f(x) \right\} \right] = \exp\left\{ - \int \left(1 - e^{-sf(x)}\right) \, \Lambda(dx) \right\}. $$
So the $n$-th moment of the sum $\sum_{x \in N} f(x)$ can be computed by differentiating both sides $n$ times and plugging in $s = 0$ (the factors of $(-1)^n$ produced on each side cancel). To this end, we may invoke the following instance of Faà di Bruno's formula,
$$ \frac{d^n}{ds^n}e^{g(s)} = e^{g(s)} \sum_{\lambda \vdash n} \left( \frac{n!}{\prod_{i=1}^{n} \lambda_i!\,(i!)^{\lambda_i}} \right) \prod_{i=1}^{n} \left( g^{(i)}(s) \right)^{\lambda_i}, $$
where the sum on the right-hand side runs over all the integer partitions $\lambda$ of $n$, i.e., sequences $\lambda \in \mathbb{N}_{0}^{n}$ satisfying $\sum_{i=1}^{n} i\lambda_i = n$, with $\lambda_i$ the number of parts equal to $i$. Plugging in $g(s) = - \int \left(1 - e^{-sf(x)}\right) \, \Lambda(dx)$, we find that
\begin{align*}
\mathbf{E}\left[ \left( \sum_{x \in N} f(x) \right)^n \right]
&= \sum_{\lambda \vdash n} \left( \frac{n!}{\prod_{i=1}^{n} \lambda_i!\,(i!)^{\lambda_i}} \right) \prod_{i=1}^{n} \left( \int f(x)^i \, \Lambda(dx) \right)^{\lambda_i}.
\end{align*}
Addendum. The right-hand side admits a useful probabilistic expression: let $S_n$ be the symmetric group on the set $\{1,\dots,n\}$. For each $\pi \in S_n$ and each $i \in \{1, \dots, n\}$, define $m_i(\pi)$ as the number of $i$-cycles in $\pi$. If $\pi$ is chosen uniformly at random from $S_n$, then
\begin{align*}
\mathbf{E}\left[ \left( \sum_{x \in N} f(x) \right)^n \right]
&= n! \, \mathbb{E}\left[ \prod_{i=1}^{n} \left( \frac{1}{(i-1)!}\int f(x)^i \, \Lambda(dx) \right)^{m_i(\pi)} \right].
\end{align*}
Since the cycle structure of a random permutation is well studied, this allows one to study the asymptotic behavior of the expectation for large $n$.
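To make the formula concrete, here is a short numerical sketch (not part of the original answer; it assumes the same hypothetical setup as in the question, $\Lambda(dx) = 5\,dx$ on $[0,1]$ and $f(x)=e^{-x}$, and, instead of enumerating partitions, it uses the equivalent moment-cumulant recursion $\mu_r = \sum_{k=1}^{r} \binom{r-1}{k-1}\,\kappa_k\,\mu_{r-k}$ with $\kappa_k = \int f(x)^k\,\Lambda(dx)$). Its output can be compared against the Monte Carlo estimate in the question:

    import math

    import numpy as np
    from scipy import integrate

    RATE = 5.0  # hypothetical intensity: Lambda(dx) = RATE * dx on [0, 1]

    def f(x):
        return np.exp(-x)  # placeholder integrand, not from the question

    def kappa(k):
        """k-th cumulant of S = sum_{x in N} f(x), i.e. kappa_k = int f(x)^k Lambda(dx)."""
        val, _ = integrate.quad(lambda x: f(x) ** k, 0.0, 1.0)
        return RATE * val

    def moment(n):
        """E[S^n] from the cumulants via mu_r = sum_k C(r-1, k-1) * kappa_k * mu_{r-k}."""
        mu = [1.0]  # mu_0 = 1
        for r in range(1, n + 1):
            mu.append(sum(math.comb(r - 1, k - 1) * kappa(k) * mu[r - k]
                          for k in range(1, r + 1)))
        return mu[n]

    print(moment(3))  # should be close to the Monte Carlo estimate above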
Hey, is it possible to express this in terms of Bell polynomials, i.e.
$$\mathbf{E}\left[ \left( \sum_{x \in N} f(x) \right)^n \right] = \sum_{k=1}^{n} B_{n,k}\left(\int f(x) \, \Lambda(dx),\ \int f(x)^2 \, \Lambda(dx),\ \dots,\ \int f(x)^{n-k+1} \, \Lambda(dx)\right)?$$
– Yuva
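For what it is worth, a quick check at $n = 3$ (writing $m_i = \int f(x)^i \, \Lambda(dx)$ and using the standard incomplete Bell polynomials) gives
$$ \sum_{k=1}^{3} B_{3,k}(m_1,\dots,m_{4-k}) = B_{3,1}(m_1,m_2,m_3) + B_{3,2}(m_1,m_2) + B_{3,3}(m_1) = m_3 + 3 m_1 m_2 + m_1^3, $$
which agrees with the partition formula above and with the explicit $n=3$ computation in the next answer.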
Answer by Robert Israel (answered Nov 8 at 21:18)
You can approximate $\sum_{x \in N} f(x)$ by $\sum_j U_j f(x_j)$, where you divide your domain up into small pieces, $x_j$ is in the $j$-th piece, and $U_j$ is the number of points of your point process in the $j$-th piece. Further assume the pieces are so small that we can neglect values of $U_j$ other than $0$ and $1$.
Then, for example, $\left(\sum_{x\in N} f(x)\right)^3$ becomes
$$ \sum_{i} U_{i} f(x_i)^3 + 3 \sum_{i} \sum_{j\ne i} U_i U_j f(x_i) f(x_j)^2 + \sum_{i} \sum_{j \ne i} \sum_{k \ne i,j} U_i U_j U_k f(x_i) f(x_j) f(x_k), $$
so that, taking expectations and using the independence of the $U_j$, we get
$$ \mathbb{E}\left[\left(\sum_{x\in N} f(x)\right)^3\right] = \int f(x)^3\, d\Lambda(x) + 3 \iint f(x) f(y)^2 \, d\Lambda(x)\, d\Lambda(y) + \iiint f(x) f(y) f(z)\, d\Lambda(x)\, d\Lambda(y)\, d\Lambda(z), $$
where $\Lambda$ is the intensity measure of your point process.
Similarly for other powers: in general, for the $n$-th power you need to consider all ways of partitioning $\{1,\ldots,n\}$ into disjoint nonempty subsets.
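Following that closing remark, here is a small sketch (an illustration only, not part of the answer; it assumes the same hypothetical $\Lambda(dx) = 5\,dx$ on $[0,1]$ and $f(x)=e^{-x}$ used earlier) that enumerates the set partitions of $\{1,\ldots,n\}$ directly and sums the corresponding products of integrals:

    import math

    import numpy as np
    from scipy import integrate

    RATE = 5.0  # hypothetical intensity: Lambda(dx) = RATE * dx on [0, 1]

    def f(x):
        return np.exp(-x)  # placeholder integrand

    def m(k):
        """int f(x)^k Lambda(dx), by numerical quadrature."""
        val, _ = integrate.quad(lambda x: f(x) ** k, 0.0, 1.0)
        return RATE * val

    def set_partitions(items):
        """Yield every partition of `items` into disjoint nonempty blocks."""
        if not items:
            yield []
            return
        first, rest = items[0], items[1:]
        for smaller in set_partitions(rest):
            for i, block in enumerate(smaller):      # put `first` in an existing block
                yield smaller[:i] + [[first] + block] + smaller[i + 1:]
            yield [[first]] + smaller                # or in a block of its own

    def nth_moment(n):
        """E[(sum_{x in N} f(x))^n] as a sum over set partitions of {1, ..., n}."""
        return sum(math.prod(m(len(block)) for block in p)
                   for p in set_partitions(list(range(n))))

    print(nth_moment(3))  # same value as the other computations above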
Answer by Yuva (answered Nov 9 at 2:33)
Thank you for the answers; they are useful and gave me a different perspective. I also found a similar result to the one Lee suggested, in the text here, page 20.