Campbell's theorem for the $n$-th power























Campbell's theorem gives a method for calculating expectations of sums of measurable functions $f(x)$.



While I was working on my system model, which involves a Poisson point process, I came across the expression



$\operatorname{E}\big[\big(\sum_{x \in N} f(x)\big)^n\big],$



where $N$ is a Poisson point process.



Is it possible to use Campbell's theorem to find the expectation of the sum of measurable functions $f(x)$ raised to the power $n$?
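For concreteness, a minimal Monte Carlo sketch of this quantity is given below; the concrete choices (a homogeneous Poisson process on $[0,1]$ with intensity $2$ and $f(x) = e^{-x}$) are illustrative assumptions only, not part of the actual system model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions (not from the actual system model):
# homogeneous Poisson point process N on [0, 1] with intensity lam, and f(x) = exp(-x).
lam, n_power, n_trials = 2.0, 3, 200_000
f = lambda x: np.exp(-x)

def sample_sum():
    """Simulate one realization of N and return sum_{x in N} f(x)."""
    k = rng.poisson(lam)                 # number of points on [0, 1]
    points = rng.uniform(0.0, 1.0, k)    # their locations, uniform given k
    return f(points).sum()

samples = np.array([sample_sum() for _ in range(n_trials)])
print("Monte Carlo estimate of E[(sum_x f(x))^n]:", (samples ** n_power).mean())
```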










      measure-theory summation poisson-process






edited Nov 9 at 1:50 by Le Anh Dung
asked Nov 8 at 20:48 by Yuva






















3 Answers



























Notice that

$$ \mathbf{E}\left[ \exp\left\{ -s \sum_{x \in N} f(x) \right\} \right]
= \exp\left\{ - \int \left(1 - e^{-sf(x)}\right) \, \Lambda(dx) \right\}. $$



So the $n$-th moment of the sum $\sum_{x \in N} f(x)$ can be computed by differentiating both sides $n$ times and plugging in $s = 0$; the factor $(-1)^n$ that this produces on each side cancels. To this end, we may invoke the following instance of Faà di Bruno's formula,



$$ \frac{d^n}{ds^n}e^{g(s)} = e^{g(s)} \sum_{\lambda \vdash n} \left( \frac{n!}{\prod_{i=1}^{n} \lambda_i!\,(i!)^{\lambda_i}} \right) \prod_{i=1}^{n} \left( g^{(i)}(s) \right)^{\lambda_i}, $$



where the sum on the right-hand side runs over all integer partitions $\lambda$ of $n$, i.e., sequences $\lambda \in \mathbb{N}_{0}^{n}$ satisfying $\sum_{i=1}^{n} i\lambda_i = n$. Plugging in $g(s) = - \int \left(1 - e^{-sf(x)}\right) \, \Lambda(dx)$, whose derivatives at zero are $g^{(i)}(0) = (-1)^i \int f(x)^i \, \Lambda(dx)$, we obtain



\begin{align*}
\mathbf{E}\left[ \left( \sum_{x \in N} f(x) \right)^n \right]
&= \sum_{\lambda \vdash n} \left( \frac{n!}{\prod_{i=1}^{n} \lambda_i!\,(i!)^{\lambda_i}} \right) \prod_{i=1}^{n} \left( \int f(x)^i \, \Lambda(dx) \right)^{\lambda_i}.
\end{align*}
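As a sanity check, the partition sum can be evaluated directly. Below is a small sketch; the moments $m_i = \int f(x)^i \, \Lambda(dx)$ are computed for the illustrative (assumed) choice $f(x) = e^{-x}$ and $\Lambda = 2\,dx$ on $[0,1]$:

```python
from math import exp, factorial, prod

def partitions(n, max_part=None):
    """Yield the integer partitions of n as lists of parts, largest part first."""
    if max_part is None or max_part > n:
        max_part = n
    if n == 0:
        yield []
        return
    for first in range(max_part, 0, -1):
        for rest in partitions(n - first, first):
            yield [first] + rest

def poisson_moment(n, m):
    """E[(sum_{x in N} f(x))^n] via the partition formula; m[i] = int f^i dLambda, i = 1..n."""
    total = 0.0
    for parts in partitions(n):
        counts = [parts.count(i) for i in range(n + 1)]       # counts[i] = lambda_i
        coeff = factorial(n) / prod(
            factorial(counts[i]) * factorial(i) ** counts[i] for i in range(1, n + 1)
        )
        total += coeff * prod(m[i] ** counts[i] for i in range(1, n + 1))
    return total

# Illustrative moments: f(x) = exp(-x), Lambda = 2 dx on [0, 1]  =>  m_i = 2 (1 - e^{-i}) / i.
n = 3
m = [None] + [2.0 * (1.0 - exp(-i)) / i for i in range(1, n + 1)]
print(poisson_moment(n, m))                    # partition formula
print(m[3] + 3 * m[1] * m[2] + m[1] ** 3)      # hand expansion for n = 3; should agree
```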



Addendum. The right-hand side admits a useful probabilistic expression. Let $S_n$ be the symmetric group on the set $\{1,\cdots,n\}$. For each $\pi \in S_n$ and each $i \in \{1, \cdots, n\}$, define $m_i(\pi)$ as the number of $i$-cycles in $\pi$. If $\pi$ is chosen uniformly at random from $S_n$, then



\begin{align*}
\mathbf{E}\left[ \left( \sum_{x \in N} f(x) \right)^n \right]
&= n! \, \mathbb{E}\left[ \prod_{i=1}^{n} \left( \frac{1}{(i-1)!} \int f(x)^i \, \Lambda(dx) \right)^{m_i(\pi)} \right].
\end{align*}



Since the cycle structure of a random permutation is well studied, this makes it possible to study the asymptotic behavior of the expectation for large $n$.
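This representation can also be checked numerically by sampling random permutations and counting their cycles; a sketch, again with the illustrative moments $m_i = \int f(x)^i\,\Lambda(dx)$ for $f(x) = e^{-x}$ and $\Lambda = 2\,dx$ on $[0,1]$:

```python
import random
from math import exp, factorial

def cycle_counts(perm):
    """counts[i] = number of i-cycles of the permutation given as a list perm[j] = pi(j)."""
    n = len(perm)
    counts, seen = [0] * (n + 1), [False] * n
    for start in range(n):
        if seen[start]:
            continue
        length, j = 0, start
        while not seen[j]:
            seen[j] = True
            j = perm[j]
            length += 1
        counts[length] += 1
    return counts

random.seed(0)
n, trials = 4, 200_000
m = [None] + [2.0 * (1.0 - exp(-i)) / i for i in range(1, n + 1)]   # m_i = int f^i dLambda

acc = 0.0
for _ in range(trials):
    perm = list(range(n))
    random.shuffle(perm)                       # uniform random element of S_n
    counts = cycle_counts(perm)
    term = 1.0
    for i in range(1, n + 1):
        term *= (m[i] / factorial(i - 1)) ** counts[i]
    acc += term

print("permutation estimate:", factorial(n) * acc / trials)
# Exact value for n = 4 from the partition formula, for comparison:
print("exact:", m[4] + 4 * m[1] * m[3] + 3 * m[2] ** 2 + 6 * m[1] ** 2 * m[2] + m[1] ** 4)
```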






answered Nov 9 at 0:06 by Sangchul Lee (accepted answer)























• Hey, is it possible to express this in terms of Bell polynomials, i.e.
$\displaystyle \mathbf{E}\left[ \left( \sum_{x \in N} f(x) \right)^n \right] = \sum_{k=1}^{n} B_{n,k}\left( \int f(x) \, \Lambda(dx),\; \int f(x)^2 \, \Lambda(dx),\; \dots,\; \int f(x)^{n-k+1} \, \Lambda(dx) \right)$?
– Yuva
yesterday
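For what it is worth, the relation asked about in the comment can be checked with SymPy's partial Bell polynomials $B_{n,k}$; a sketch under the same illustrative assumptions ($f(x) = e^{-x}$, $\Lambda = 2\,dx$ on $[0,1]$):

```python
from math import exp
from sympy import bell, symbols

n = 3
# m[0], m[1], m[2] hold m_1, m_2, m_3 with m_i = int f(x)^i dLambda = 2 (1 - e^{-i}) / i.
m = [2.0 * (1.0 - exp(-i)) / i for i in range(1, n + 1)]

x = symbols(f"x1:{n + 1}")                     # symbols x1, ..., xn
values = dict(zip(x, m))

# Sum of partial Bell polynomials B_{n,k}(m_1, ..., m_{n-k+1}) over k = 1..n.
total = sum(bell(n, k, x[: n - k + 1]).subs(values) for k in range(1, n + 1))
print(float(total))                            # should equal E[(sum_x f(x))^n]
print(m[2] + 3 * m[0] * m[1] + m[0] ** 3)      # m_3 + 3 m_1 m_2 + m_1^3, for comparison
```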

































You can approximate $\sum_{x \in N} f(x)$ by $\sum_j U_j f(x_j)$, where you divide your domain up into small pieces, $x_j$ is in the $j$'th piece, and $U_j$ is the number of points of your point process in the $j$'th piece. Further assume the pieces are so small that we can neglect values of $U_j$ other than $0$ and $1$.
Then, for example, $\left(\sum_{x\in N} f(x)\right)^3$ becomes
$$ \sum_{i} U_{i} f(x_i)^3 + 3 \sum_{i} \sum_{j\ne i} U_i U_j f(x_i) f(x_j)^2
+ \sum_{i} \sum_{j \ne i} \sum_{k \ne i,j} U_i U_j U_k f(x_i) f(x_j) f(x_k), $$
so that we get
$$ \mathbb{E}\left[\left(\sum_{x\in N} f(x)\right)^3\right] = \int f(x)^3\, d\Lambda(x) + 3 \iint f(x) f(y)^2 \, d\Lambda(x)\, d\Lambda(y)
+ \iiint f(x) f(y) f(z)\, d\Lambda(x)\, d\Lambda(y)\, d\Lambda(z), $$
where $\Lambda$ is the intensity measure of your point process.
Similarly for other powers, where in general for the $n$'th power you need to consider all ways
of partitioning $\{1,\ldots,n\}$ into disjoint nonempty subsets.
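Numerically, the iterated integrals factor by Fubini into the moments $m_i = \int f(x)^i \, d\Lambda(x)$, so the third-moment formula above can be checked with a quick Riemann-sum and simulation sketch (illustrative choices: $f(x) = e^{-x}$, $\Lambda = 2\,dx$ on $[0,1]$):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative assumptions: Lambda = 2 dx on [0, 1], f(x) = exp(-x).
lam = 2.0
f = lambda x: np.exp(-x)

# Left Riemann sums for m_i = int_0^1 f(x)^i * lam dx, i = 1, 2, 3.
grid, dx = np.linspace(0.0, 1.0, 10_001), 1.0 / 10_000
m = {i: float(np.sum(lam * f(grid[:-1]) ** i) * dx) for i in (1, 2, 3)}

# The iterated integrals factor, so the formula above gives:
print("formula:   ", m[3] + 3 * m[1] * m[2] + m[1] ** 3)

# Cross-check by simulating the Poisson point process directly.
trials = 200_000
sums = np.array([f(rng.uniform(0.0, 1.0, rng.poisson(lam))).sum() for _ in range(trials)])
print("simulation:", (sums ** 3).mean())
```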






answered Nov 8 at 21:18 by Robert Israel









































Thank you for the answers; they were useful and gave a different perspective. I also found a result similar to the one Lee suggested, in the text here, page 20.






answered Nov 9 at 2:33 by Yuva




















