Expectation of Independent Bernoulli Trials

Say there is a sequence of i.i.d. Bernoulli trials $(x_1, x_2, \dots, x_N)$, and let $X \in \mathbb{R}^{N \times N}$ be the diagonal matrix with these random variables on the main diagonal, where $\mathbb{P}\left[x_k = 1\right] = p$ for $0 < k \le N$, $N \in \mathbb{N}$. I want to know what
$$
\mathbb{E}\left[ X W X \right]
$$
is, where $W \in \mathbb{R}^{N \times N} \succ 0$ is some positive definite weighting matrix. Because these are Bernoulli trials, I believe that after the expectation is executed the main diagonal is $W_{ii} \cdot p$ and the off-diagonal entries are $W_{ij} \cdot p^2$ for $i \neq j$. However, I am unsure.
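A minimal Monte Carlo sketch of this conjecture, assuming an arbitrary symmetric positive definite $W$ as a stand-in and redrawing the diagonal Bernoulli$(p)$ entries of $X$ on every trial:

```python
import numpy as np

rng = np.random.default_rng(1)
N, p, trials = 4, 0.3, 100_000

# Arbitrary symmetric positive definite weighting matrix (assumed for the sketch).
A = rng.standard_normal((N, N))
W = A @ A.T + N * np.eye(N)

# Monte Carlo estimate of E[X W X] with X = diag(x_1, ..., x_N), x_k i.i.d. Bernoulli(p).
acc = np.zeros((N, N))
for _ in range(trials):
    x = (rng.random(N) < p).astype(float)
    X = np.diag(x)
    acc += X @ W @ X
estimate = acc / trials

# Conjectured result: p * W_ii on the diagonal, p^2 * W_ij off the diagonal,
# i.e. p^2 * W + p * (1 - p) * diag(diag(W)).
conjecture = p**2 * W + p * (1 - p) * np.diag(np.diag(W))

print(np.abs(estimate - conjecture).max())  # small if the conjecture holds
```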










probability expected-value

asked Dec 5 '18 at 11:15
p32fr4


1 Answer
You can give the elements of the product $XWX$ explicitly: for the element at coordinate $(i,j)$,
\begin{align*}
(XWX)_{ij} &= \sum_{k} X_{ik} (WX)_{kj}\\
&= \sum_{k} X_{ik} \sum_{\ell} W_{k\ell} X_{\ell j}\\
&= \sum_k \sum_\ell W_{k\ell} X_{ik} X_{\ell j}.
\end{align*}

The expectation is linear, hence we only need to look at $\mathbb{E}[X_{ab} X_{cd}]$, which is $p$ if $(a,b) = (c,d)$ and $p^2$ otherwise.

Hence
\begin{align*}
\mathbb{E}[(XWX)_{ij}] &= \sum_k \sum_\ell W_{k\ell} \left[ p\,\mathbb{1}(i=\ell \land k=j) + p^2 \bigl(1 - \mathbb{1}(i=\ell \land k=j)\bigr) \right]\\
&= \sum_k \sum_\ell W_{k\ell} \left[ \mathbb{1}(i=\ell \land k=j)(p - p^2) + p^2 \right]\\
&= p(1-p) W_{ij} + p^2 \sum_k \sum_\ell W_{k\ell}.
\end{align*}

Or, in matrix notation, $\mathbb{E}[XWX] = p(1-p) W + p^2 \mathbf{1} W \mathbf{1}$, where in this case $\mathbf{1}$ is the all-ones matrix.

answered Dec 5 '18 at 11:44, edited Dec 5 '18 at 12:11
P. Quinton
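A minimal Monte Carlo sketch of this closed form, assuming every entry of $X$ is drawn as an independent Bernoulli$(p)$ variable (the independence structure used in the step $\mathbb{E}[X_{ab}X_{cd}] \in \{p, p^2\}$ above) and a symmetric positive definite $W$:

```python
import numpy as np

rng = np.random.default_rng(0)
N, p, trials = 4, 0.3, 100_000

# Symmetric positive definite W (assumed for the sketch).
A = rng.standard_normal((N, N))
W = A @ A.T + N * np.eye(N)

# Monte Carlo estimate of E[X W X], drawing every entry of X as an
# independent Bernoulli(p) variable, matching E[X_ab X_cd] in {p, p^2}.
acc = np.zeros((N, N))
for _ in range(trials):
    X = (rng.random((N, N)) < p).astype(float)
    acc += X @ W @ X
estimate = acc / trials

# Closed form from the derivation: p(1-p) W + p^2 * (1 W 1), with 1 the all-ones matrix.
ones = np.ones((N, N))
closed_form = p * (1 - p) * W + p**2 * (ones @ W @ ones)

print(np.abs(estimate - closed_form).max())  # small if the closed form holds
```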






• $X$ was a diagonal matrix, so I don't believe there is a dimensional issue. Either way, I also have a column vector multiplying each side (I didn't post this in the problem above), so it should boil down to a variation of your answer. Thank you. – p32fr4, Dec 5 '18 at 11:55

• True, I fixed my answer so that it answers the problem; the trace trick cannot be used anymore, so I had to do it explicitly. – P. Quinton, Dec 5 '18 at 12:12

• That's great, thanks. – p32fr4, Dec 5 '18 at 12:19










