What is the joint density of two random variables that are linear combinations of the same random variables?














Suppose we have random variables $W$ and $M$ that are independent standard normal random variables, and define $X$ and $Y$ as:



$X = aW + bM$ and $Y = cW + bM$



How do we find the joint density of $X$ and $Y$, i.e. $f_{X,Y}$?



I found the pdf of $X$ and the pdf of $Y$ (a linear combination of normals is normal with a new mean and variance), but I'm not sure where to go from there. I believe you can't just multiply $f_X$ and $f_Y$ together, because $X$ and $Y$ share $W$ and $M$, which makes them dependent. However, when you find $f_X$ and $f_Y$, those $M$ and $W$ terms just disappear, leaving you with only the $a$'s, $b$'s, etc.










































      statistics normal-distribution






asked Aug 24 '17 at 18:17 by RibbonSanny

2 Answers



















Linear combinations of jointly Gaussian random variables are also jointly Gaussian.

Independent Gaussians are jointly Gaussian, so $(X,Y)$ follows a joint Gaussian distribution. This distribution is specified by $E[X]$, $E[Y]$, $\sigma_X^2$, $\sigma_Y^2$, and $\sigma_{XY}$.

$E[X] = E[aW+bM] = a E[W] + b E[M] = 0$, and similarly $E[Y] = 0$.

$\sigma_X^2 = \operatorname{var}(aW+bM) = a^2 \operatorname{var}(W) + b^2 \operatorname{var}(M) = a^2+b^2$ (using the independence of $W$ and $M$), and similarly $\sigma_Y^2 = c^2+b^2$.

$\sigma_{XY} = E[XY] - E[X]E[Y] = E[XY] - 0 = E[(aW+bM)(cW+bM)] = E[acW^2 + b^2M^2 + (ab+bc)MW] = ac + b^2 + (ab+bc)E[MW] = ac+b^2$, since $E[MW] = E[M]E[W] = 0$.

Thus, $(X,Y)$ follows a normal distribution with mean zero (vector) and covariance matrix $\begin{bmatrix} \sigma_X^2 & \sigma_{XY} \\ \sigma_{XY} & \sigma_Y^2 \end{bmatrix}$.
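Explicitly, writing $\Sigma$ for this covariance matrix, and assuming $\Sigma$ is invertible, the joint density asked for is the standard bivariate normal form:

$$f_{X,Y}(x,y) = \frac{1}{2\pi\sqrt{\det\Sigma}} \exp\left(-\frac{1}{2}\begin{pmatrix}x & y\end{pmatrix}\Sigma^{-1}\begin{pmatrix}x \\ y\end{pmatrix}\right).$$

Here $\det\Sigma = \sigma_X^2\sigma_Y^2 - \sigma_{XY}^2 = (a^2+b^2)(c^2+b^2) - (ac+b^2)^2 = b^2(a-c)^2$, so invertibility requires $b \neq 0$ and $a \neq c$ (see also the remark in the answer below).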






answered Aug 24 '17 at 18:49 by Batman

Summarizing @Batman succinctly, let $A=\begin{pmatrix}a&b\\c&b\end{pmatrix}$. From $\begin{pmatrix}W\\M\end{pmatrix} \sim N(0, I)$, we have

\begin{align*}
\begin{pmatrix}X\\Y\end{pmatrix} &= A\begin{pmatrix}W\\M\end{pmatrix}\\
&\sim N(A0, AIA^T)\\
&= N\left(0, \begin{pmatrix}a^2+b^2 & ac+b^2 \\ ac+b^2 & b^2+c^2\end{pmatrix}\right).
\end{align*}

Note the implicit assumption that $A$ is full rank, i.e. $\det A = b(a-c) \neq 0$, which requires $b \neq 0$ and $a \neq c$; otherwise $(X,Y)$ is degenerate and has no joint density. To prove that a linear combination of independent normals is normal, you can use the moment generating function; see Linear combination of normal distribution.
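The MGF argument mentioned above runs, in sketch, as follows: for a single linear combination, independence of $W$ and $M$ and the standard normal MGF $E[e^{tW}] = e^{t^2/2}$ give

$$M_X(t) = E\!\left[e^{t(aW+bM)}\right] = E\!\left[e^{taW}\right]E\!\left[e^{tbM}\right] = e^{a^2t^2/2}\,e^{b^2t^2/2} = e^{(a^2+b^2)t^2/2},$$

which is the MGF of $N(0, a^2+b^2)$.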






answered Aug 24 '17 at 19:17 by Jirapat Samranvedhya
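As a quick numerical sanity check of the covariance matrix derived in both answers, here is a minimal sketch (assuming the illustrative coefficients $a=1$, $b=2$, $c=3$, which are not from the question) that samples $(W,M)$, forms $(X,Y)$, compares the empirical covariance with $AA^T$, and evaluates the joint density via scipy.stats.multivariate_normal:

    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(0)
    a, b, c = 1.0, 2.0, 3.0                 # hypothetical coefficients, for illustration only
    A = np.array([[a, b], [c, b]])

    # Sample independent standard normals (W, M) and form (X, Y) = A (W, M)^T.
    WM = rng.standard_normal((2, 1_000_000))
    XY = A @ WM

    print(np.cov(XY))   # empirical covariance of (X, Y)
    print(A @ A.T)      # theoretical covariance [[a^2+b^2, ac+b^2], [ac+b^2, b^2+c^2]]

    # The joint density f_{X,Y} is bivariate normal with mean 0 and covariance A A^T.
    f_XY = multivariate_normal(mean=[0.0, 0.0], cov=A @ A.T)
    print(f_XY.pdf([1.0, 2.0]))             # density at the point (x, y) = (1, 2)

The two printed matrices should agree to within Monte Carlo error, confirming the algebra above.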












