Poisson distribution of the sum of two independent random variables $X$, $Y$

42

$X \sim \mathcal{P}(\lambda)$ and $Y \sim \mathcal{P}(\mu)$, meaning that $X$ and $Y$ are Poisson distributed. What is the probability distribution of $X + Y$? I know it is $X+Y \sim \mathcal{P}(\lambda + \mu)$, but I don't understand how to derive it.










probability probability-theory probability-distributions






asked Oct 25 '12 at 20:00 by user31280; edited Feb 13 '13 at 6:18 by Stefan Hansen
  • Try using the method of moment generating functions :) – Samuel Reid, Nov 25 '13 at 7:03
  • All I've learned is the definition of a Poisson random variable; is there a simpler way? – user82004, Nov 25 '13 at 7:07
  • If they are independent. – Did, Nov 25 '13 at 8:14
  • Doesn’t it suffice that their covariance vanishes? – Michael Hoppe, Feb 1 '18 at 7:09


















7 Answers


















78

This only holds if $X$ and $Y$ are independent, so we suppose this from now on. We have for $k \ge 0$:
\begin{align*}
P(X+Y=k) &= \sum_{i=0}^k P(X+Y=k,\, X=i)\\
&= \sum_{i=0}^k P(Y=k-i,\, X=i)\\
&= \sum_{i=0}^k P(Y=k-i)P(X=i)\\
&= \sum_{i=0}^k e^{-\mu}\frac{\mu^{k-i}}{(k-i)!}e^{-\lambda}\frac{\lambda^i}{i!}\\
&= e^{-(\mu+\lambda)}\frac{1}{k!}\sum_{i=0}^k \frac{k!}{i!(k-i)!}\mu^{k-i}\lambda^i\\
&= e^{-(\mu+\lambda)}\frac{1}{k!}\sum_{i=0}^k \binom{k}{i}\mu^{k-i}\lambda^i\\
&= \frac{(\mu+\lambda)^k}{k!}\cdot e^{-(\mu+\lambda)}
\end{align*}
where the last equality is the binomial theorem. Hence, $X+Y \sim \mathcal{P}(\mu+\lambda)$.

answered Oct 25 '12 at 20:19 by martini









  • Thank you! But what happens if they are not independent? – user31280, Oct 25 '12 at 20:20
  • In general we can't say anything then. It depends on how they depend on one another. – martini, Oct 25 '12 at 20:22
  • Thank you! It's very simple and I feel like a complete idiot. – user31280, Oct 25 '12 at 20:40
  • Nice derivation: specifically the transformation of (a) the $i$/$k$ factorials and (b) the $\mu$/$\lambda$ polynomials into the binomial form of the polynomial power expression. – javadba, Aug 30 '14 at 20:59
  • @LiorA Yes: the $k!$ is included to combine with the rest and simplify as intended, so $1/k!$ is included to compensate. – Rolazaro Azeveires, Jan 7 '18 at 14:23
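As a quick numerical sanity check of this convolution, here is a minimal Python sketch (assuming NumPy and SciPy are available; the rates $\lambda=1.3$, $\mu=2.1$ and the truncation point are arbitrary choices). It evaluates the sum above term by term and compares it with the $\mathcal{P}(\lambda+\mu)$ pmf:

```python
import numpy as np
from scipy.stats import poisson

lam, mu = 1.3, 2.1   # arbitrary example rates
ks = np.arange(30)   # truncated support; the tail mass beyond 30 is negligible here

# P(X+Y=k) = sum_i P(X=i) P(Y=k-i), exactly the sum in the derivation above
conv = np.array([sum(poisson.pmf(i, lam) * poisson.pmf(k - i, mu)
                     for i in range(k + 1)) for k in ks])

# pmf of Poisson(lambda + mu)
direct = poisson.pmf(ks, lam + mu)

assert np.allclose(conv, direct)  # the two agree to floating-point precision
```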



















17

Another approach is to use characteristic functions. If $X\sim \mathrm{po}(\lambda)$, then the characteristic function of $X$ is (if this is unknown, just calculate it)
$$
\varphi_X(t)=E[e^{itX}]=e^{\lambda(e^{it}-1)},\quad t\in\mathbb{R}.
$$
Now suppose that $X$ and $Y$ are independent Poisson distributed random variables with parameters $\lambda$ and $\mu$ respectively. Then due to the independence we have that
$$
\varphi_{X+Y}(t)=\varphi_X(t)\varphi_Y(t)=e^{\lambda(e^{it}-1)}e^{\mu(e^{it}-1)}=e^{(\mu+\lambda)(e^{it}-1)},\quad t\in\mathbb{R}.
$$
As the characteristic function completely determines the distribution, we conclude that $X+Y\sim\mathrm{po}(\lambda+\mu)$.

answered Feb 13 '13 at 6:23 by Stefan Hansen
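As an illustration, a minimal simulation sketch (the rates, seed, sample size, and grid of $t$ values are arbitrary assumptions): compare the empirical characteristic function of simulated draws of $X+Y$ with the closed form $e^{(\mu+\lambda)(e^{it}-1)}$.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, mu, n = 1.3, 2.1, 200_000  # arbitrary rates and sample size

s = rng.poisson(lam, n) + rng.poisson(mu, n)  # independent Poisson draws, summed

t = np.linspace(-2.0, 2.0, 9)
ecf = np.exp(1j * np.outer(t, s)).mean(axis=1)  # empirical E[exp(itS)]
cf = np.exp((lam + mu) * (np.exp(1j * t) - 1))  # characteristic function of po(lam+mu)

print(np.max(np.abs(ecf - cf)))  # small, of Monte Carlo order ~1/sqrt(n)
```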





















7

You can use the probability generating function (P.G.F.). As the Poisson distribution is a discrete probability distribution, the P.G.F. fits well in this case. Let $X$ and $Y$ be independent random variables following $\mathrm{Po}(\lambda)$ and $\mathrm{Po}(\mu)$ respectively.
The P.G.F. of $X$ is
\begin{equation*}
\begin{split}
P_X[t] = E[t^X] &= \sum_{x=0}^{\infty}t^x e^{-\lambda}\frac{\lambda^x}{x!}\\
&= \sum_{x=0}^{\infty}e^{-\lambda}\frac{(\lambda t)^x}{x!}\\
&= e^{-\lambda}e^{\lambda t}\\
&= e^{-\lambda(1-t)}
\end{split}
\end{equation*}
The P.G.F. of $Y$ is
\begin{equation*}
\begin{split}
P_Y[t] = E[t^Y] &= \sum_{y=0}^{\infty}t^y e^{-\mu}\frac{\mu^y}{y!}\\
&= \sum_{y=0}^{\infty}e^{-\mu}\frac{(\mu t)^y}{y!}\\
&= e^{-\mu}e^{\mu t}\\
&= e^{-\mu(1-t)}
\end{split}
\end{equation*}

Now consider the P.G.F. of $U = X+Y$. As $X$ and $Y$ are independent,
\begin{equation*}
\begin{split}
P_U(t)=P_{X+Y}(t)=P_X(t)P_Y(t)=E[t^{X+Y}]=E[t^X t^Y] &= E[t^X]E[t^Y]\\
&= e^{-\lambda(1-t)}e^{-\mu(1-t)}\\
&= e^{-(\lambda+\mu)(1-t)}
\end{split}
\end{equation*}

This is the P.G.F. of the $\mathrm{Po}(\lambda+\mu)$ distribution. Therefore, we can say that $U=X+Y$ follows $\mathrm{Po}(\lambda+\mu)$.

edited Sep 2 '14 at 5:34; answered Jul 25 '13 at 15:52 by Ananda
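The closed form $E[t^X]=e^{-\lambda(1-t)}$ derived above is easy to sanity-check numerically; a minimal sketch (the rates, the $t$ values, and the truncation point $K$ are arbitrary assumptions):

```python
import math

def pgf_series(t, rate, K=100):
    # E[t^X] for X ~ Po(rate), truncating the series at K terms
    return sum(t**x * math.exp(-rate) * rate**x / math.factorial(x)
               for x in range(K))

for rate in (1.3, 2.1):            # arbitrary example rates
    for t in (0.0, 0.3, 0.7, 1.0):
        assert math.isclose(pgf_series(t, rate), math.exp(-rate * (1 - t)))
```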





















4

In short, you can show this by using the fact that
$$\Pr(X+Y=k)=\sum_{i=0}^k\Pr(X+Y=k,\, X=i).$$

If $X$ and $Y$ are independent, this is equal to
$$
\Pr(X+Y=k)=\sum_{i=0}^k\Pr(Y=k-i)\Pr(X=i)
$$
which is
$$
\begin{align}
\Pr(X+Y=k)&=\sum_{i=0}^k\frac{e^{-\lambda_y}\lambda_y^{k-i}}{(k-i)!}\frac{e^{-\lambda_x}\lambda_x^i}{i!}\\
&=e^{-\lambda_y}e^{-\lambda_x}\sum_{i=0}^k\frac{\lambda_y^{k-i}}{(k-i)!}\frac{\lambda_x^i}{i!}\\
&=\frac{e^{-(\lambda_y+\lambda_x)}}{k!}\sum_{i=0}^k\frac{k!}{i!(k-i)!}\lambda_y^{k-i}\lambda_x^i\\
&=\frac{e^{-(\lambda_y+\lambda_x)}}{k!}\sum_{i=0}^k{k\choose i}\lambda_y^{k-i}\lambda_x^i
\end{align}
$$
The sum part is just
$$
\sum_{i=0}^k{k\choose i}\lambda_y^{k-i}\lambda_x^i=(\lambda_y+\lambda_x)^k
$$
by the binomial theorem. So the end result is
$$
\begin{align}
\Pr(X+Y=k)&=\frac{e^{-(\lambda_y+\lambda_x)}}{k!}(\lambda_y+\lambda_x)^k
\end{align}
$$
which is the pmf of $\mathrm{Po}(\lambda_y+\lambda_x)$.

answered Nov 25 '13 at 7:54 by hejseb

  • Moderator notice: This answer was moved here as a consequence of merging two questions. This explains the small differences in notation. The OP's $\lambda$ is $\lambda_x$ here, and the OP's $\mu$ is $\lambda_y$. Otherwise there is no difference. – Jyrki Lahtonen, Apr 23 '15 at 6:55
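A Monte Carlo counterpart of this computation, sketched in Python (the rates, seed, and sample size are arbitrary assumptions): simulate $X+Y$ and compare the empirical frequencies with the pmf just derived.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)
lam_x, lam_y, n = 1.3, 2.1, 500_000  # arbitrary rates and sample size

s = rng.poisson(lam_x, n) + rng.poisson(lam_y, n)

freq = np.bincount(s) / n                                  # empirical P(X+Y=k)
pmf = poisson.pmf(np.arange(s.max() + 1), lam_x + lam_y)   # derived pmf

print(np.max(np.abs(freq - pmf)))  # Monte Carlo error, roughly 1e-3
```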





















2

Using the moment generating function.

Suppose $X \sim \mathcal{P}(\lambda)$, $Y \sim \mathcal{P}(\mu)$ and $S=X+Y$.

We know that the MGF (moment generating function) of $\mathcal{P}(\lambda)$ is $e^{\lambda(e^t-1)}$ (see the end if you need a proof).

The MGF of $S$ is then
$$\begin{align}
M_S(t)&=E[e^{tS}]\\
&=E[e^{t(X+Y)}]\\
&=E[e^{tX}e^{tY}]\\
&=E[e^{tX}]E[e^{tY}]\quad \text{given that }X, Y\text{ are independent}\\
&=e^{\lambda(e^t-1)}e^{\mu(e^t-1)}\\
&=e^{(\lambda+\mu)(e^t-1)}
\end{align}$$

Thus $S$ has a Poisson distribution with parameter $\lambda+\mu$.

MGF of the Poisson distribution

If $X \sim \mathcal{P}(\lambda)$, then by definition the probability mass function is
$$
f_X(k)=\frac{\lambda^k}{k!}e^{-\lambda},\quad k = 0,1,2,\ldots
$$
Its MGF is
$$\begin{align}
M_X(t)&=E[e^{tX}]\\
&=\sum_{k=0}^{\infty}\frac{\lambda^k}{k!}e^{-\lambda}e^{tk}\\
&=e^{-\lambda}\sum_{k=0}^{\infty}\frac{\lambda^k e^{tk}}{k!}\\
&=e^{-\lambda}\sum_{k=0}^{\infty}\frac{(\lambda e^t)^k}{k!}\\
&=e^{-\lambda}e^{\lambda e^t}\\
&=e^{\lambda e^t-\lambda}\\
&=e^{\lambda(e^t-1)}
\end{align}$$

edited Apr 5 '18 at 20:46; answered Apr 5 '18 at 20:21 by kaza
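The identity $M_X(t)M_Y(t)=M_S(t)$ can also be checked numerically; a minimal sketch (the rates, the $t$ grid, and the truncation point are arbitrary assumptions):

```python
import math

lam, mu = 1.3, 2.1  # arbitrary example rates

def mgf_series(t, rate, K=120):
    # E[exp(tX)] for X ~ P(rate), truncating the series at K terms
    return sum(math.exp(t * k) * math.exp(-rate) * rate**k / math.factorial(k)
               for k in range(K))

for t in (-0.5, 0.0, 0.5, 1.0):
    product = mgf_series(t, lam) * mgf_series(t, mu)    # M_X(t) M_Y(t)
    closed = math.exp((lam + mu) * (math.exp(t) - 1))   # M_S(t) derived above
    assert math.isclose(product, closed, rel_tol=1e-9)
```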





















1

Hint: $\sum_{k=0}^{n} P(X = k)P(Y = n-k)$

answered Oct 25 '12 at 20:20 by jay-sun

  • why this hint, why the sum? This is what I don't understand – user31280, Oct 25 '12 at 20:22
  • adding two random variables is simply convolution of those random variables. That's why. – jay-sun, Oct 25 '12 at 20:24
  • gotcha! Thanks! – user31280, Oct 25 '12 at 20:31
  • adding two random variables is simply convolution of those random variables... Sorry but no. – Did, Feb 13 '13 at 6:28
  • There is no usual sense for convolution of random variables. Either convolution of distributions or addition of random variables. – Did, Feb 13 '13 at 6:51
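In code, the hint is literally a discrete convolution of the two pmf vectors, which matches the comment that it is the distributions that get convolved. A sketch (arbitrary rates; the support is truncated where the tail mass is negligible):

```python
import numpy as np
from scipy.stats import poisson

lam, mu = 1.3, 2.1  # arbitrary example rates
ks = np.arange(30)  # truncated support

# np.convolve computes exactly sum_k P(X=k) P(Y=n-k) for each n
conv = np.convolve(poisson.pmf(ks, lam), poisson.pmf(ks, mu))[:len(ks)]

assert np.allclose(conv, poisson.pmf(ks, lam + mu))
```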



















0

Here's a much cleaner solution:

Consider two Poisson processes occurring with rates $\lambda$ and $\mu$, where a Poisson process of rate $r$ is viewed as the limit of $n$ consecutive Bernoulli trials, each with probability $\frac{r}{n}$, as $n\to\infty$.

Then $X$ counts the number of successes in the trials of rate $\lambda$ and $Y$ counts the number of successes in the trials of rate $\mu$, so the total number of successes is the same as if we had each trial succeed with probability $\frac{\lambda + \mu}{n}$, where we take $n$ large enough that the event where the $i$th Bernoulli trial in both processes is successful has negligible probability. Then we are done.
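A simulation sketch of this limit ($n$, the rates, the seed, and the number of replications are arbitrary assumptions): for large $n$, the combined Bernoulli success counts should have mean and variance close to $\lambda+\mu$, as a $\mathcal{P}(\lambda+\mu)$ variable does.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, mu, n, reps = 1.3, 2.1, 10_000, 100_000  # arbitrary rates, trials, replications

# each replication: n Bernoulli(lam/n) trials plus n Bernoulli(mu/n) trials
x = rng.binomial(n, lam / n, reps)  # successes in the rate-lam process
y = rng.binomial(n, mu / n, reps)   # successes in the rate-mu process
totals = x + y

print(totals.mean(), totals.var())  # both should be close to lam + mu = 3.4
```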













          Your Answer





          StackExchange.ifUsing("editor", function () {
          return StackExchange.using("mathjaxEditing", function () {
          StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
          StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
          });
          });
          }, "mathjax-editing");

          StackExchange.ready(function() {
          var channelOptions = {
          tags: "".split(" "),
          id: "69"
          };
          initTagRenderer("".split(" "), "".split(" "), channelOptions);

          StackExchange.using("externalEditor", function() {
          // Have to fire editor after snippets, if snippets enabled
          if (StackExchange.settings.snippets.snippetsEnabled) {
          StackExchange.using("snippets", function() {
          createEditor();
          });
          }
          else {
          createEditor();
          }
          });

          function createEditor() {
          StackExchange.prepareEditor({
          heartbeatType: 'answer',
          autoActivateHeartbeat: false,
          convertImagesToLinks: true,
          noModals: true,
          showLowRepImageUploadWarning: true,
          reputationToPostImages: 10,
          bindNavPrevention: true,
          postfix: "",
          imageUploader: {
          brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
          contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
          allowUrls: true
          },
          noCode: true, onDemand: true,
          discardSelector: ".discard-answer"
          ,immediatelyShowMarkdownHelp:true
          });


          }
          });














          draft saved

          draft discarded


















          StackExchange.ready(
          function () {
          StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f221078%2fpoisson-distribution-of-sum-of-two-random-independent-variables-x-y%23new-answer', 'question_page');
          }
          );

          Post as a guest















          Required, but never shown
























          7 Answers
          7






          active

          oldest

          votes








          7 Answers
          7






          active

          oldest

          votes









          active

          oldest

          votes






          active

          oldest

          votes









          78












          $begingroup$

          This only holds if $X$ and $Y$ are independent, so we suppose this from now on. We have for $k ge 0$:
          begin{align*}
          P(X+ Y =k) &= sum_{i = 0}^k P(X+ Y = k, X = i)\
          &= sum_{i=0}^k P(Y = k-i , X =i)\
          &= sum_{i=0}^k P(Y = k-i)P(X=i)\
          &= sum_{i=0}^k e^{-mu}frac{mu^{k-i}}{(k-i)!}e^{-lambda}frac{lambda^i}{i!}\
          &= e^{-(mu + lambda)}frac 1{k!}sum_{i=0}^k frac{k!}{i!(k-i)!}mu^{k-i}lambda^i\
          &= e^{-(mu + lambda)}frac 1{k!}sum_{i=0}^k binom kimu^{k-i}lambda^i\
          &= frac{(mu + lambda)^k}{k!} cdot e^{-(mu + lambda)}
          end{align*}
          Hence, $X+ Y sim mathcal P(mu + lambda)$.






          share|cite|improve this answer









          $endgroup$









          • 1




            $begingroup$
            Thank you! but what happens if they are not independent?
            $endgroup$
            – user31280
            Oct 25 '12 at 20:20






          • 8




            $begingroup$
            In general we can't say anything then. It depends on how they depend on another.
            $endgroup$
            – martini
            Oct 25 '12 at 20:22






          • 1




            $begingroup$
            Thank you! it's very simple and I feel like a complete idiot.
            $endgroup$
            – user31280
            Oct 25 '12 at 20:40






          • 1




            $begingroup$
            Nice derivation: specifically the transformation of (a) the i/k factorials and (b) the mu/lambda polynomials into the binomial form of the polynomial power expression.
            $endgroup$
            – javadba
            Aug 30 '14 at 20:59








          • 1




            $begingroup$
            @LiorA Yes. k! included to combine with the rest and simplify as intended, so 1/k! is included to compensate.
            $endgroup$
            – Rolazaro Azeveires
            Jan 7 '18 at 14:23
















          78












          $begingroup$

          This only holds if $X$ and $Y$ are independent, so we suppose this from now on. We have for $k ge 0$:
          begin{align*}
          P(X+ Y =k) &= sum_{i = 0}^k P(X+ Y = k, X = i)\
          &= sum_{i=0}^k P(Y = k-i , X =i)\
          &= sum_{i=0}^k P(Y = k-i)P(X=i)\
          &= sum_{i=0}^k e^{-mu}frac{mu^{k-i}}{(k-i)!}e^{-lambda}frac{lambda^i}{i!}\
          &= e^{-(mu + lambda)}frac 1{k!}sum_{i=0}^k frac{k!}{i!(k-i)!}mu^{k-i}lambda^i\
          &= e^{-(mu + lambda)}frac 1{k!}sum_{i=0}^k binom kimu^{k-i}lambda^i\
          &= frac{(mu + lambda)^k}{k!} cdot e^{-(mu + lambda)}
          end{align*}
          Hence, $X+ Y sim mathcal P(mu + lambda)$.






          share|cite|improve this answer









          $endgroup$









          • 1




            $begingroup$
            Thank you! but what happens if they are not independent?
            $endgroup$
            – user31280
            Oct 25 '12 at 20:20






          • 8




            $begingroup$
            In general we can't say anything then. It depends on how they depend on another.
            $endgroup$
            – martini
            Oct 25 '12 at 20:22






          • 1




            $begingroup$
            Thank you! it's very simple and I feel like a complete idiot.
            $endgroup$
            – user31280
            Oct 25 '12 at 20:40






          • 1




            $begingroup$
            Nice derivation: specifically the transformation of (a) the i/k factorials and (b) the mu/lambda polynomials into the binomial form of the polynomial power expression.
            $endgroup$
            – javadba
            Aug 30 '14 at 20:59








          • 1




            $begingroup$
            @LiorA Yes. k! included to combine with the rest and simplify as intended, so 1/k! is included to compensate.
            $endgroup$
            – Rolazaro Azeveires
            Jan 7 '18 at 14:23














          78












          78








          78





          $begingroup$

          This only holds if $X$ and $Y$ are independent, so we suppose this from now on. We have for $k ge 0$:
          begin{align*}
          P(X+ Y =k) &= sum_{i = 0}^k P(X+ Y = k, X = i)\
          &= sum_{i=0}^k P(Y = k-i , X =i)\
          &= sum_{i=0}^k P(Y = k-i)P(X=i)\
          &= sum_{i=0}^k e^{-mu}frac{mu^{k-i}}{(k-i)!}e^{-lambda}frac{lambda^i}{i!}\
          &= e^{-(mu + lambda)}frac 1{k!}sum_{i=0}^k frac{k!}{i!(k-i)!}mu^{k-i}lambda^i\
          &= e^{-(mu + lambda)}frac 1{k!}sum_{i=0}^k binom kimu^{k-i}lambda^i\
          &= frac{(mu + lambda)^k}{k!} cdot e^{-(mu + lambda)}
          end{align*}
          Hence, $X+ Y sim mathcal P(mu + lambda)$.






          share|cite|improve this answer









          $endgroup$



          This only holds if $X$ and $Y$ are independent, so we suppose this from now on. We have for $k ge 0$:
          begin{align*}
          P(X+ Y =k) &= sum_{i = 0}^k P(X+ Y = k, X = i)\
          &= sum_{i=0}^k P(Y = k-i , X =i)\
          &= sum_{i=0}^k P(Y = k-i)P(X=i)\
          &= sum_{i=0}^k e^{-mu}frac{mu^{k-i}}{(k-i)!}e^{-lambda}frac{lambda^i}{i!}\
          &= e^{-(mu + lambda)}frac 1{k!}sum_{i=0}^k frac{k!}{i!(k-i)!}mu^{k-i}lambda^i\
          &= e^{-(mu + lambda)}frac 1{k!}sum_{i=0}^k binom kimu^{k-i}lambda^i\
          &= frac{(mu + lambda)^k}{k!} cdot e^{-(mu + lambda)}
          end{align*}
          Hence, $X+ Y sim mathcal P(mu + lambda)$.







          share|cite|improve this answer












          share|cite|improve this answer



          share|cite|improve this answer










          answered Oct 25 '12 at 20:19









          martinimartini

          70.6k45991




          70.6k45991








          • 1




            $begingroup$
            Thank you! but what happens if they are not independent?
            $endgroup$
            – user31280
            Oct 25 '12 at 20:20






          • 8




            $begingroup$
            In general we can't say anything then. It depends on how they depend on another.
            $endgroup$
            – martini
            Oct 25 '12 at 20:22






          • 1




            $begingroup$
            Thank you! it's very simple and I feel like a complete idiot.
            $endgroup$
            – user31280
            Oct 25 '12 at 20:40






          • 1




            $begingroup$
            Nice derivation: specifically the transformation of (a) the i/k factorials and (b) the mu/lambda polynomials into the binomial form of the polynomial power expression.
            $endgroup$
            – javadba
            Aug 30 '14 at 20:59








          • 1




            $begingroup$
            @LiorA Yes. k! included to combine with the rest and simplify as intended, so 1/k! is included to compensate.
            $endgroup$
            – Rolazaro Azeveires
            Jan 7 '18 at 14:23














          • 1




            $begingroup$
            Thank you! but what happens if they are not independent?
            $endgroup$
            – user31280
            Oct 25 '12 at 20:20






          • 8




            $begingroup$
            In general we can't say anything then. It depends on how they depend on another.
            $endgroup$
            – martini
            Oct 25 '12 at 20:22






          • 1




            $begingroup$
            Thank you! it's very simple and I feel like a complete idiot.
            $endgroup$
            – user31280
            Oct 25 '12 at 20:40






          • 1




            $begingroup$
            Nice derivation: specifically the transformation of (a) the i/k factorials and (b) the mu/lambda polynomials into the binomial form of the polynomial power expression.
            $endgroup$
            – javadba
            Aug 30 '14 at 20:59








          • 1




            $begingroup$
            @LiorA Yes. k! included to combine with the rest and simplify as intended, so 1/k! is included to compensate.
            $endgroup$
            – Rolazaro Azeveires
            Jan 7 '18 at 14:23








          1




          1




          $begingroup$
          Thank you! but what happens if they are not independent?
          $endgroup$
          – user31280
          Oct 25 '12 at 20:20




          $begingroup$
          Thank you! but what happens if they are not independent?
          $endgroup$
          – user31280
          Oct 25 '12 at 20:20




          8




          8




          $begingroup$
          In general we can't say anything then. It depends on how they depend on another.
          $endgroup$
          – martini
          Oct 25 '12 at 20:22




          $begingroup$
          In general we can't say anything then. It depends on how they depend on another.
          $endgroup$
          – martini
          Oct 25 '12 at 20:22




          1




          1




          $begingroup$
          Thank you! it's very simple and I feel like a complete idiot.
          $endgroup$
          – user31280
          Oct 25 '12 at 20:40




          $begingroup$
          Thank you! it's very simple and I feel like a complete idiot.
          $endgroup$
          – user31280
          Oct 25 '12 at 20:40




          1




          1




          $begingroup$
          Nice derivation: specifically the transformation of (a) the i/k factorials and (b) the mu/lambda polynomials into the binomial form of the polynomial power expression.
          $endgroup$
          – javadba
          Aug 30 '14 at 20:59






          $begingroup$
          Nice derivation: specifically the transformation of (a) the i/k factorials and (b) the mu/lambda polynomials into the binomial form of the polynomial power expression.
          $endgroup$
          – javadba
          Aug 30 '14 at 20:59






          1




          1




          $begingroup$
          @LiorA Yes. k! included to combine with the rest and simplify as intended, so 1/k! is included to compensate.
          $endgroup$
          – Rolazaro Azeveires
          Jan 7 '18 at 14:23




          $begingroup$
          @LiorA Yes. k! included to combine with the rest and simplify as intended, so 1/k! is included to compensate.
          $endgroup$
          – Rolazaro Azeveires
          Jan 7 '18 at 14:23











          17












          $begingroup$

          Another approach is to use characteristic functions. If $Xsim mathrm{po}(lambda)$, then the characteristic function of $X$ is (if this is unknown, just calculate it)
          $$
          varphi_X(t)=E[e^{itX}]=e^{lambda(e^{it}-1)},quad tinmathbb{R}.
          $$
          Now suppose that $X$ and $Y$ are independent Poisson distributed random variables with parameters $lambda$ and $mu$ respectively. Then due to the independence we have that
          $$
          varphi_{X+Y}(t)=varphi_X(t)varphi_Y(t)=e^{lambda(e^{it}-1)}e^{mu(e^{it}-1)}=e^{(mu+lambda)(e^{it}-1)},quad tinmathbb{R}.
          $$
          As the characteristic function completely determines the distribution, we conclude that $X+Ysimmathrm{po}(lambda+mu)$.






          share|cite|improve this answer









          $endgroup$


















            17












            $begingroup$

            Another approach is to use characteristic functions. If $Xsim mathrm{po}(lambda)$, then the characteristic function of $X$ is (if this is unknown, just calculate it)
            $$
            varphi_X(t)=E[e^{itX}]=e^{lambda(e^{it}-1)},quad tinmathbb{R}.
            $$
            Now suppose that $X$ and $Y$ are independent Poisson distributed random variables with parameters $lambda$ and $mu$ respectively. Then due to the independence we have that
            $$
            varphi_{X+Y}(t)=varphi_X(t)varphi_Y(t)=e^{lambda(e^{it}-1)}e^{mu(e^{it}-1)}=e^{(mu+lambda)(e^{it}-1)},quad tinmathbb{R}.
            $$
            As the characteristic function completely determines the distribution, we conclude that $X+Ysimmathrm{po}(lambda+mu)$.






            share|cite|improve this answer









            $endgroup$
















              17












              17








              17





              $begingroup$

              Another approach is to use characteristic functions. If $Xsim mathrm{po}(lambda)$, then the characteristic function of $X$ is (if this is unknown, just calculate it)
              $$
              varphi_X(t)=E[e^{itX}]=e^{lambda(e^{it}-1)},quad tinmathbb{R}.
              $$
              Now suppose that $X$ and $Y$ are independent Poisson distributed random variables with parameters $lambda$ and $mu$ respectively. Then due to the independence we have that
              $$
              varphi_{X+Y}(t)=varphi_X(t)varphi_Y(t)=e^{lambda(e^{it}-1)}e^{mu(e^{it}-1)}=e^{(mu+lambda)(e^{it}-1)},quad tinmathbb{R}.
              $$
              As the characteristic function completely determines the distribution, we conclude that $X+Ysimmathrm{po}(lambda+mu)$.






              share|cite|improve this answer









              $endgroup$



              Another approach is to use characteristic functions. If $Xsim mathrm{po}(lambda)$, then the characteristic function of $X$ is (if this is unknown, just calculate it)
              $$
              varphi_X(t)=E[e^{itX}]=e^{lambda(e^{it}-1)},quad tinmathbb{R}.
              $$
              Now suppose that $X$ and $Y$ are independent Poisson distributed random variables with parameters $lambda$ and $mu$ respectively. Then due to the independence we have that
              $$
              varphi_{X+Y}(t)=varphi_X(t)varphi_Y(t)=e^{lambda(e^{it}-1)}e^{mu(e^{it}-1)}=e^{(mu+lambda)(e^{it}-1)},quad tinmathbb{R}.
              $$
              As the characteristic function completely determines the distribution, we conclude that $X+Ysimmathrm{po}(lambda+mu)$.







              share|cite|improve this answer












              share|cite|improve this answer



              share|cite|improve this answer










              answered Feb 13 '13 at 6:23









              Stefan HansenStefan Hansen

              20.9k73865




              20.9k73865























                  7












                  $begingroup$

                  You can use Probability Generating Function(P.G.F). As poisson distribution is a discrete probability distribution, P.G.F. fits better in this case.For independent X and Y random variable which follows distribution Po($lambda$) and Po($mu$).
                  P.G.F of X is
                  begin{equation*}
                  begin{split}
                  P_X[t] = E[t^X]&= sum_{x=0}^{infty}t^xe^{-lambda}frac{lambda^x}{x!}\
                  &=sum_{x=0}^{infty}e^{-lambda}frac{(lambda t)^x}{x!}\
                  &=e^{-lambda}e^{lambda t}\
                  &=e^{-lambda (1-t)}\
                  end{split}
                  end{equation*}
                  P.G.F of Y is
                  begin{equation*}
                  begin{split}
                  P_Y[t] = E[t^Y]&= sum_{y=0}^{infty}t^ye^{-mu}frac{mu^y}{y!}\
                  &=sum_{y=0}^{infty}e^{-mu}frac{(mu t)^y}{y!}\
                  &=e^{-mu}e^{mu t}\
                  &=e^{-mu (1-t)}\
                  end{split}
                  end{equation*}



                  Now think about P.G.F of U = X+Y.
                  As X and Y are independent,
                  begin{equation*}
                  begin{split}
                  P_U(t)=P_{X+Y}(t)=P_X(t)P_Y(t)=E[t^{X+Y}]=E[t^X t^Y]&= E[t^X]E[t^Y]\
                  &= e^{-lambda (1-t)}e^{-mu (1-t)}\
                  &= e^{-(lambda+mu) (1-t)}\
                  end{split}
                  end{equation*}



                  Now this is the P.G.F of $Po(lambda + mu)$ distribution. Therefore,we can say U=X+Y follows Po($lambda+mu$)






                  share|cite|improve this answer











                  $endgroup$


















                    7












                    $begingroup$

                    You can use Probability Generating Function(P.G.F). As poisson distribution is a discrete probability distribution, P.G.F. fits better in this case.For independent X and Y random variable which follows distribution Po($lambda$) and Po($mu$).
                    P.G.F of X is
                    begin{equation*}
                    begin{split}
                    P_X[t] = E[t^X]&= sum_{x=0}^{infty}t^xe^{-lambda}frac{lambda^x}{x!}\
                    &=sum_{x=0}^{infty}e^{-lambda}frac{(lambda t)^x}{x!}\
                    &=e^{-lambda}e^{lambda t}\
                    &=e^{-lambda (1-t)}\
                    end{split}
                    end{equation*}
                    P.G.F of Y is
                    begin{equation*}
                    begin{split}
                    P_Y[t] = E[t^Y]&= sum_{y=0}^{infty}t^ye^{-mu}frac{mu^y}{y!}\
                    &=sum_{y=0}^{infty}e^{-mu}frac{(mu t)^y}{y!}\
                    &=e^{-mu}e^{mu t}\
                    &=e^{-mu (1-t)}\
                    end{split}
                    end{equation*}



                    Now think about P.G.F of U = X+Y.
                    As X and Y are independent,
                    begin{equation*}
                    begin{split}
                    P_U(t)=P_{X+Y}(t)=P_X(t)P_Y(t)=E[t^{X+Y}]=E[t^X t^Y]&= E[t^X]E[t^Y]\
                    &= e^{-lambda (1-t)}e^{-mu (1-t)}\
                    &= e^{-(lambda+mu) (1-t)}\
                    end{split}
                    end{equation*}



                    Now this is the P.G.F of $Po(lambda + mu)$ distribution. Therefore,we can say U=X+Y follows Po($lambda+mu$)






                    share|cite|improve this answer











                    $endgroup$
















                      7












                      7








                      7





                      $begingroup$

                      You can use Probability Generating Function(P.G.F). As poisson distribution is a discrete probability distribution, P.G.F. fits better in this case.For independent X and Y random variable which follows distribution Po($lambda$) and Po($mu$).
                      P.G.F of X is
                      begin{equation*}
                      begin{split}
                      P_X[t] = E[t^X]&= sum_{x=0}^{infty}t^xe^{-lambda}frac{lambda^x}{x!}\
                      &=sum_{x=0}^{infty}e^{-lambda}frac{(lambda t)^x}{x!}\
                      &=e^{-lambda}e^{lambda t}\
                      &=e^{-lambda (1-t)}\
                      end{split}
                      end{equation*}
                      P.G.F of Y is
                      begin{equation*}
                      begin{split}
                      P_Y[t] = E[t^Y]&= sum_{y=0}^{infty}t^ye^{-mu}frac{mu^y}{y!}\
                      &=sum_{y=0}^{infty}e^{-mu}frac{(mu t)^y}{y!}\
                      &=e^{-mu}e^{mu t}\
                      &=e^{-mu (1-t)}\
                      end{split}
                      end{equation*}



                      Now think about P.G.F of U = X+Y.
                      As X and Y are independent,
                      begin{equation*}
                      begin{split}
                      P_U(t)=P_{X+Y}(t)=P_X(t)P_Y(t)=E[t^{X+Y}]=E[t^X t^Y]&= E[t^X]E[t^Y]\
                      &= e^{-lambda (1-t)}e^{-mu (1-t)}\
                      &= e^{-(lambda+mu) (1-t)}\
                      end{split}
                      end{equation*}



                      Now this is the P.G.F of $Po(lambda + mu)$ distribution. Therefore,we can say U=X+Y follows Po($lambda+mu$)






                      share|cite|improve this answer











                      $endgroup$



                      You can use Probability Generating Function(P.G.F). As poisson distribution is a discrete probability distribution, P.G.F. fits better in this case.For independent X and Y random variable which follows distribution Po($lambda$) and Po($mu$).
                      P.G.F of X is
                      begin{equation*}
                      begin{split}
                      P_X[t] = E[t^X]&= sum_{x=0}^{infty}t^xe^{-lambda}frac{lambda^x}{x!}\
                      &=sum_{x=0}^{infty}e^{-lambda}frac{(lambda t)^x}{x!}\
                      &=e^{-lambda}e^{lambda t}\
                      &=e^{-lambda (1-t)}\
                      end{split}
                      end{equation*}
                      P.G.F of Y is
                      begin{equation*}
                      begin{split}
                      P_Y[t] = E[t^Y]&= sum_{y=0}^{infty}t^ye^{-mu}frac{mu^y}{y!}\
                      &=sum_{y=0}^{infty}e^{-mu}frac{(mu t)^y}{y!}\
                      &=e^{-mu}e^{mu t}\
                      &=e^{-mu (1-t)}\
                      end{split}
                      end{equation*}



                      Now think about P.G.F of U = X+Y.
                      As X and Y are independent,
                      begin{equation*}
                      begin{split}
                      P_U(t)=P_{X+Y}(t)=P_X(t)P_Y(t)=E[t^{X+Y}]=E[t^X t^Y]&= E[t^X]E[t^Y]\
                      &= e^{-lambda (1-t)}e^{-mu (1-t)}\
                      &= e^{-(lambda+mu) (1-t)}\
                      end{split}
                      end{equation*}



                      Now this is the P.G.F of $Po(lambda + mu)$ distribution. Therefore,we can say U=X+Y follows Po($lambda+mu$)







                      share|cite|improve this answer














                      share|cite|improve this answer



                      share|cite|improve this answer








                      edited Sep 2 '14 at 5:34

























                      answered Jul 25 '13 at 15:52









                      AnandaAnanda

                      7315




                      7315























                          4












                          $begingroup$

                          In short, you can show this by using the fact that $$Pr(X+Y=k)=sum_{i=0}^kPr(X+Y=k, X=i).$$



                          If $X$ and $Y$ are independent, this is equal to
                          $$
                          Pr(X+Y=k)=sum_{i=0}^kPr(Y=k-i)Pr(X=i)
                          $$
                          which is
                          $$
                          begin{align}
                          Pr(X+Y=k)&=sum_{i=0}^kfrac{e^{-lambda_y}lambda_y^{k-i}}{(k-i)!}frac{e^{-lambda_x}lambda_x^i}{i!}\
                          &=e^{-lambda_y}e^{-lambda_x}sum_{i=0}^kfrac{lambda_y^{k-i}}{(k-i)!}frac{lambda_x^i}{i!}\
                          &=frac{e^{-(lambda_y+lambda_x)}}{k!}sum_{i=0}^kfrac{k!}{i!(k-i)!}lambda_y^{k-i}lambda_x^i\
                          &=frac{e^{-(lambda_y+lambda_x)}}{k!}sum_{i=0}^k{kchoose i}lambda_y^{k-i}lambda_x^i
                          end{align}
                          $$
                          The sum part is just
                          $$
                          sum_{i=0}^k{kchoose i}lambda_y^{k-i}lambda_x^i=(lambda_y+lambda_x)^k
                          $$
                          by the binomial theorem.
                          So the end result is
                          $$
                          begin{align}
                          Pr(X+Y=k)&=frac{e^{-(lambda_y+lambda_x)}}{k!}(lambda_y+lambda_x)^k
                          end{align}
                          $$
                          which is the pmf of $Po(lambda_y+lambda_x)$.






                          share|cite|improve this answer









                          $endgroup$













                          • $begingroup$
                            Moderator notice: This answer was moved here as a consequence of merging two questions. This explains the small differences in notation. The OP's $lambda$ is $lambda_x$ here, and OP's $mu$ is $lambda_y$. Otherwise there is no difference.
                            $endgroup$
                            – Jyrki Lahtonen
                            Apr 23 '15 at 6:55


















                          4












                          $begingroup$

                          In short, you can show this by using the fact that $$Pr(X+Y=k)=sum_{i=0}^kPr(X+Y=k, X=i).$$



                          If $X$ and $Y$ are independent, this is equal to
                          $$
                          Pr(X+Y=k)=sum_{i=0}^kPr(Y=k-i)Pr(X=i)
                          $$
                          which is
                          $$
                          begin{align}
                          Pr(X+Y=k)&=sum_{i=0}^kfrac{e^{-lambda_y}lambda_y^{k-i}}{(k-i)!}frac{e^{-lambda_x}lambda_x^i}{i!}\
                          &=e^{-lambda_y}e^{-lambda_x}sum_{i=0}^kfrac{lambda_y^{k-i}}{(k-i)!}frac{lambda_x^i}{i!}\
                          &=frac{e^{-(lambda_y+lambda_x)}}{k!}sum_{i=0}^kfrac{k!}{i!(k-i)!}lambda_y^{k-i}lambda_x^i\
                          &=frac{e^{-(lambda_y+lambda_x)}}{k!}sum_{i=0}^k{kchoose i}lambda_y^{k-i}lambda_x^i
                          end{align}
                          $$
                          The sum part is just
                          $$
                          sum_{i=0}^k{kchoose i}lambda_y^{k-i}lambda_x^i=(lambda_y+lambda_x)^k
                          $$
                          by the binomial theorem.
                          So the end result is
                          $$
                          begin{align}
                          Pr(X+Y=k)&=frac{e^{-(lambda_y+lambda_x)}}{k!}(lambda_y+lambda_x)^k
                          end{align}
                          $$
                          which is the pmf of $Po(lambda_y+lambda_x)$.






                          share|cite|improve this answer









                          $endgroup$













                          • $begingroup$
                            Moderator notice: This answer was moved here as a consequence of merging two questions. This explains the small differences in notation. The OP's $lambda$ is $lambda_x$ here, and OP's $mu$ is $lambda_y$. Otherwise there is no difference.
                            $endgroup$
                            – Jyrki Lahtonen
                            Apr 23 '15 at 6:55
















                          4












                          4








                          4





                          $begingroup$

                          In short, you can show this by using the fact that $$Pr(X+Y=k)=sum_{i=0}^kPr(X+Y=k, X=i).$$



                          If $X$ and $Y$ are independent, this is equal to
                          $$
                          Pr(X+Y=k)=sum_{i=0}^kPr(Y=k-i)Pr(X=i)
                          $$
                          which is
                          $$
                          begin{align}
                          Pr(X+Y=k)&=sum_{i=0}^kfrac{e^{-lambda_y}lambda_y^{k-i}}{(k-i)!}frac{e^{-lambda_x}lambda_x^i}{i!}\
                          &=e^{-lambda_y}e^{-lambda_x}sum_{i=0}^kfrac{lambda_y^{k-i}}{(k-i)!}frac{lambda_x^i}{i!}\
                          &=frac{e^{-(lambda_y+lambda_x)}}{k!}sum_{i=0}^kfrac{k!}{i!(k-i)!}lambda_y^{k-i}lambda_x^i\
                          &=frac{e^{-(lambda_y+lambda_x)}}{k!}sum_{i=0}^k{kchoose i}lambda_y^{k-i}lambda_x^i
                          end{align}
                          $$
                          The sum part is just
                          $$
                          sum_{i=0}^k{kchoose i}lambda_y^{k-i}lambda_x^i=(lambda_y+lambda_x)^k
                          $$
                          by the binomial theorem.
                          So the end result is
                          $$
                          begin{align}
                          Pr(X+Y=k)&=frac{e^{-(lambda_y+lambda_x)}}{k!}(lambda_y+lambda_x)^k
                          end{align}
                          $$
                          which is the pmf of $Po(lambda_y+lambda_x)$.






                          share|cite|improve this answer









                          $endgroup$



                          In short, you can show this by using the fact that $$Pr(X+Y=k)=sum_{i=0}^kPr(X+Y=k, X=i).$$



                          If $X$ and $Y$ are independent, this is equal to
                          $$
                          Pr(X+Y=k)=sum_{i=0}^kPr(Y=k-i)Pr(X=i)
                          $$
                          which is
                          $$
                          begin{align}
                          Pr(X+Y=k)&=sum_{i=0}^kfrac{e^{-lambda_y}lambda_y^{k-i}}{(k-i)!}frac{e^{-lambda_x}lambda_x^i}{i!}\
                          &=e^{-lambda_y}e^{-lambda_x}sum_{i=0}^kfrac{lambda_y^{k-i}}{(k-i)!}frac{lambda_x^i}{i!}\
                          &=frac{e^{-(lambda_y+lambda_x)}}{k!}sum_{i=0}^kfrac{k!}{i!(k-i)!}lambda_y^{k-i}lambda_x^i\
                          &=frac{e^{-(lambda_y+lambda_x)}}{k!}sum_{i=0}^k{kchoose i}lambda_y^{k-i}lambda_x^i
                          end{align}
                          $$
                          The sum part is just
                          $$
                          sum_{i=0}^k{kchoose i}lambda_y^{k-i}lambda_x^i=(lambda_y+lambda_x)^k
                          $$
                          by the binomial theorem.
                          So the end result is
                          $$
                          begin{align}
                          Pr(X+Y=k)&=frac{e^{-(lambda_y+lambda_x)}}{k!}(lambda_y+lambda_x)^k
                          end{align}
                          $$
                          which is the pmf of $Po(lambda_y+lambda_x)$.







                          share|cite|improve this answer












                          share|cite|improve this answer



                          share|cite|improve this answer










                          answered Nov 25 '13 at 7:54









                          hejsebhejseb

                          3,8421930




                          3,8421930












                          • $begingroup$
                            Moderator notice: This answer was moved here as a consequence of merging two questions. This explains the small differences in notation. The OP's $lambda$ is $lambda_x$ here, and OP's $mu$ is $lambda_y$. Otherwise there is no difference.
                            $endgroup$
                            – Jyrki Lahtonen
                            Apr 23 '15 at 6:55




















                          • $begingroup$
                            Moderator notice: This answer was moved here as a consequence of merging two questions. This explains the small differences in notation. The OP's $lambda$ is $lambda_x$ here, and OP's $mu$ is $lambda_y$. Otherwise there is no difference.
                            $endgroup$
                            – Jyrki Lahtonen
                            Apr 23 '15 at 6:55


















                          $begingroup$
                          Moderator notice: This answer was moved here as a consequence of merging two questions. This explains the small differences in notation. The OP's $lambda$ is $lambda_x$ here, and OP's $mu$ is $lambda_y$. Otherwise there is no difference.
                          $endgroup$
                          – Jyrki Lahtonen
                          Apr 23 '15 at 6:55






                          $begingroup$
                          Moderator notice: This answer was moved here as a consequence of merging two questions. This explains the small differences in notation. The OP's $lambda$ is $lambda_x$ here, and OP's $mu$ is $lambda_y$. Otherwise there is no difference.
                          $endgroup$
                          – Jyrki Lahtonen
                          Apr 23 '15 at 6:55













                          2












                          $begingroup$

                          Using Moment Generating Function.



                          If $X sim mathcal{P}(lambda)$, $Y sim mathcal{P}(mu)$ and S=X+Y.

                          We know that MGF(Moment Generating Function) of $mathcal{P}(lambda)=e^{lambda(e^t-1)}$(See the end if you need proof)

                          MGF of S would be
                          $$begin{align}
                          M_S(t)&=E[e^{tS}]\&=E[e^{t(X+Y)}]\&=E[e^{tX}e^{tY}]\&=E[e^{tX}]E[e^{tY}]quad text{given }X,Ytext{ are independent}\&=e^{lambda(e^t-1)}e^{mu(e^t-1)}\&=e^{(lambda+mu)(e^t-1)}
                          end{align}$$

                          Thus S is a Poisson Distribution with parameter $lambda+mu$.





                          MGF of Poisson Distribution



                          If $X sim mathcal{P}(lambda)$, then by definition Probability Mass Function is

                          $$begin{align}
                          f_X(k)=frac{lambda^k}{k!}e^{-lambda},quad k in 0,1,2....
                          end{align}$$
                          It's MGF is
                          $$begin{align}
                          M_X(t)&=E[e^{tX}]\&=sum_{k=0}^{infty}frac{lambda^k}{k!}e^{-lambda}e^{tk}\&=e^{-lambda}sum_{k=0}^{infty}frac{lambda^ke^{tk}}{k!}\&=e^{-lambda}sum_{k=0}^{infty}frac{(lambda e^t)^k}{k!}\&=e^{-lambda}e^{lambda e^t}\&=e^{lambda e^t-lambda}\&=e^{lambda(e^t-1)}
                          end{align}$$






                          share|cite|improve this answer











                          $endgroup$


















                            2












                            $begingroup$

                            Using Moment Generating Function.



                            If $X sim mathcal{P}(lambda)$, $Y sim mathcal{P}(mu)$ and S=X+Y.

                            We know that MGF(Moment Generating Function) of $mathcal{P}(lambda)=e^{lambda(e^t-1)}$(See the end if you need proof)

                            MGF of S would be
                            $$begin{align}
                            M_S(t)&=E[e^{tS}]\&=E[e^{t(X+Y)}]\&=E[e^{tX}e^{tY}]\&=E[e^{tX}]E[e^{tY}]quad text{given }X,Ytext{ are independent}\&=e^{lambda(e^t-1)}e^{mu(e^t-1)}\&=e^{(lambda+mu)(e^t-1)}
                            end{align}$$

                            Thus S is a Poisson Distribution with parameter $lambda+mu$.





                            MGF of Poisson Distribution



                            If $X sim mathcal{P}(lambda)$, then by definition Probability Mass Function is

                            $$begin{align}
                            f_X(k)=frac{lambda^k}{k!}e^{-lambda},quad k in 0,1,2....
                            end{align}$$
                            It's MGF is
                            $$begin{align}
                            M_X(t)&=E[e^{tX}]\&=sum_{k=0}^{infty}frac{lambda^k}{k!}e^{-lambda}e^{tk}\&=e^{-lambda}sum_{k=0}^{infty}frac{lambda^ke^{tk}}{k!}\&=e^{-lambda}sum_{k=0}^{infty}frac{(lambda e^t)^k}{k!}\&=e^{-lambda}e^{lambda e^t}\&=e^{lambda e^t-lambda}\&=e^{lambda(e^t-1)}
                            end{align}$$






                            share|cite|improve this answer











                            $endgroup$
















                              2












                              2








                              2





                              $begingroup$

                              Using Moment Generating Function.



                              If $X sim mathcal{P}(lambda)$, $Y sim mathcal{P}(mu)$ and S=X+Y.

                              We know that MGF(Moment Generating Function) of $mathcal{P}(lambda)=e^{lambda(e^t-1)}$(See the end if you need proof)

                              MGF of S would be
                              $$begin{align}
                              M_S(t)&=E[e^{tS}]\&=E[e^{t(X+Y)}]\&=E[e^{tX}e^{tY}]\&=E[e^{tX}]E[e^{tY}]quad text{given }X,Ytext{ are independent}\&=e^{lambda(e^t-1)}e^{mu(e^t-1)}\&=e^{(lambda+mu)(e^t-1)}
                              end{align}$$

                              Thus S is a Poisson Distribution with parameter $lambda+mu$.





                              MGF of Poisson Distribution



                              If $X sim mathcal{P}(lambda)$, then by definition Probability Mass Function is

                              $$begin{align}
                              f_X(k)=frac{lambda^k}{k!}e^{-lambda},quad k in 0,1,2....
                              end{align}$$
                              It's MGF is
                              $$begin{align}
                              M_X(t)&=E[e^{tX}]\&=sum_{k=0}^{infty}frac{lambda^k}{k!}e^{-lambda}e^{tk}\&=e^{-lambda}sum_{k=0}^{infty}frac{lambda^ke^{tk}}{k!}\&=e^{-lambda}sum_{k=0}^{infty}frac{(lambda e^t)^k}{k!}\&=e^{-lambda}e^{lambda e^t}\&=e^{lambda e^t-lambda}\&=e^{lambda(e^t-1)}
                              end{align}$$






                              share|cite|improve this answer











                              $endgroup$



                              Using Moment Generating Function.



                              If $X sim mathcal{P}(lambda)$, $Y sim mathcal{P}(mu)$ and S=X+Y.

                              We know that MGF(Moment Generating Function) of $mathcal{P}(lambda)=e^{lambda(e^t-1)}$(See the end if you need proof)

                              MGF of S would be
                              $$begin{align}
                              M_S(t)&=E[e^{tS}]\&=E[e^{t(X+Y)}]\&=E[e^{tX}e^{tY}]\&=E[e^{tX}]E[e^{tY}]quad text{given }X,Ytext{ are independent}\&=e^{lambda(e^t-1)}e^{mu(e^t-1)}\&=e^{(lambda+mu)(e^t-1)}
                              end{align}$$

                              Thus S is a Poisson Distribution with parameter $lambda+mu$.





                              MGF of Poisson Distribution



                              If $X sim mathcal{P}(lambda)$, then by definition Probability Mass Function is

                              $$begin{align}
                              f_X(k)=frac{lambda^k}{k!}e^{-lambda},quad k in 0,1,2....
                              end{align}$$
                              It's MGF is
                              $$begin{align}
                              M_X(t)&=E[e^{tX}]\&=sum_{k=0}^{infty}frac{lambda^k}{k!}e^{-lambda}e^{tk}\&=e^{-lambda}sum_{k=0}^{infty}frac{lambda^ke^{tk}}{k!}\&=e^{-lambda}sum_{k=0}^{infty}frac{(lambda e^t)^k}{k!}\&=e^{-lambda}e^{lambda e^t}\&=e^{lambda e^t-lambda}\&=e^{lambda(e^t-1)}
                              end{align}$$







                              share|cite|improve this answer














                              share|cite|improve this answer



                              share|cite|improve this answer








                              edited Apr 5 '18 at 20:46

























                              answered Apr 5 '18 at 20:21









                              kazakaza

                              1388




                              1388























                                  1












                                  $begingroup$

                                  hint: $sum_{k=0}^{n} P(X = k)P(Y = n-k)$






                                  share|cite|improve this answer









                                  $endgroup$













• why this hint, why the sum? This is what I don't understand
  – user31280, Oct 25 '12 at 20:22

• adding two random variables is simply convolution of those random variables. That's why.
  – jay-sun, Oct 25 '12 at 20:24

• gotcha! Thanks!
  – user31280, Oct 25 '12 at 20:31

• adding two random variables is simply convolution of those random variables... Sorry but no.
  – Did, Feb 13 '13 at 6:28

• There is no usual sense for convolution of random variables. Either convolution of distributions or addition of random variables.
  – Did, Feb 13 '13 at 6:51
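
Spelling the hint out: the event $\{X+Y=n\}$ splits over the possible values of $X$, and independence factors the joint probabilities, so

$$\begin{align}
P(X+Y=n)&=\sum_{k=0}^{n}P(X=k)\,P(Y=n-k)\\
&=\sum_{k=0}^{n}\frac{\lambda^k e^{-\lambda}}{k!}\cdot\frac{\mu^{n-k}e^{-\mu}}{(n-k)!}\\
&=\frac{e^{-(\lambda+\mu)}}{n!}\sum_{k=0}^{n}\binom{n}{k}\lambda^k\mu^{n-k}\\
&=\frac{(\lambda+\mu)^n}{n!}\,e^{-(\lambda+\mu)},
\end{align}$$

where the last step is the binomial theorem; this is exactly the $\mathcal{P}(\lambda+\mu)$ pmf.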
















Here's a much cleaner solution:

Consider two Poisson processes occurring with rates $\lambda$ and $\mu$, where a Poisson process of rate $r$ is viewed as the limit of $n$ consecutive Bernoulli trials, each with success probability $\frac{r}{n}$, as $n\to\infty$.

Then $X$ counts the successes in the rate-$\lambda$ trials and $Y$ counts the successes in the rate-$\mu$ trials, so the total number of successes is the same as if each trial succeeded with probability $\frac{\lambda + \mu}{n}$, taking $n$ large enough that the event that the $i$th Bernoulli trial succeeds in both processes has negligible probability. The total count is therefore the limit of a $\text{Binomial}\left(n, \frac{\lambda+\mu}{n}\right)$, which is precisely $\mathcal{P}(\lambda + \mu)$.

– Anon, Dec 15 '18 at 4:00
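The merging step can be illustrated numerically (a sketch assuming scipy; the rates and the cutoff $k<8$ are arbitrary choices): the $\text{Binomial}\left(n, \frac{\lambda+\mu}{n}\right)$ pmf approaches the $\mathcal{P}(\lambda+\mu)$ pmf as $n$ grows.

    from scipy import stats

    lam, mu = 2.0, 3.0
    for n in (10, 100, 10_000):
        # Merged process: n Bernoulli trials, each succeeding w.p. (lam + mu)/n.
        binom_pmf = stats.binom.pmf(range(8), n, (lam + mu) / n)
        pois_pmf = stats.poisson.pmf(range(8), lam + mu)
        max_err = max(abs(b - p) for b, p in zip(binom_pmf, pois_pmf))
        print(f"n={n}: max |Binomial - Poisson| pmf gap over k<8 is {max_err:.5f}")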

















