UMVUE Geometric Distribution

I am trying to find the UMVUE of the parameter $p$ based on $n$ i.i.d. observations $X_1,\dots,X_n$ from the geometric distribution with pmf:



$(1-p)^{x-1}p$ for $x=1,2,\dots$ and $0<p<1$



and found that:



Since $P(X_1=1)=p$, the indicator $w=I[X_1=1]$ is an unbiased estimator of $p$, and since $T=\sum_i X_i$ is a complete sufficient statistic for the geometric family, I can improve my unbiased estimator as follows:



$E[w\mid T=t] = P(X_1=1\mid T=t) = \frac{P(X_1=1)\,P\left(\sum_{i=2}^n X_i=t-1\right)}{P\left(\sum_{i=1}^n X_i=t\right)}$, where the numerator factors because $X_1$ is independent of $\sum_{i=2}^n X_i$.



So I have two questions now:

1. What is the pmf of $\sum_{i=2}^n X_i$? I know it is negative binomial but I can't write it down correctly.
2. What is the variance of this improved unbiased estimator, and does it achieve the Cramér–Rao lower bound?
Tags: self-learning, statistical-inference
asked Dec 1 '16 at 1:10 by Bassem, edited Dec 1 '16 at 1:46 by Momo

1 Answer
          For the first question only:



          $P(X_1=1)=p$



$P\left(\sum_{i=2}^{n}X_i=t-1\right)=\binom{t-2}{n-2}p^{n-1}(1-p)^{t-n}$, for $t=n,n+1,\dots$



$P\left(\sum_{i=1}^{n}X_i=t\right)=\binom{t-1}{n-1}p^{n}(1-p)^{t-n}$, for $t=n,n+1,\dots$



So the UMVUE is $\hat p=\frac{n-1}{\sum_{i=1}^{n} X_i-1}$.
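
(Filling in the intermediate step for completeness: the powers of $p$ and $1-p$ cancel in the Rao–Blackwell ratio, leaving only binomial coefficients.)

$$\hat p(t)=\frac{P(X_1=1)\,P\left(\sum_{i=2}^{n}X_i=t-1\right)}{P\left(\sum_{i=1}^{n}X_i=t\right)}=\frac{p\cdot\binom{t-2}{n-2}p^{n-1}(1-p)^{t-n}}{\binom{t-1}{n-1}p^{n}(1-p)^{t-n}}=\frac{\binom{t-2}{n-2}}{\binom{t-1}{n-1}}=\frac{n-1}{t-1}.$$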



          For CRLB you may look here.
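
(The linked page is not reproduced here, so for reference, standard facts about the geometric family rather than part of the original answer: the Fisher information of a single observation is $I(p)=\frac{1}{p^2(1-p)}$, so any unbiased estimator of $p$ from $n$ observations satisfies

$$\operatorname{Var}(\hat p)\ \ge\ \frac{1}{n\,I(p)}\ =\ \frac{p^2(1-p)}{n}.)$$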



          But for the variance of the UMVUE:



$\operatorname{Var}(\hat p)=\sum_{t=n}^\infty \left(\frac{n-1}{t-1}-p\right)^2 \binom{t-1}{n-1}p^n(1-p)^{t-n}$



I'm afraid I was not able to get a closed form; nor did it work for $E(\hat p^2)$.
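
(Lacking a closed form, the series can still be evaluated numerically. Below is a minimal Python sketch, added here rather than part of the original answer; the helper name `var_umvue` and the truncation point `t_max` are my choices, adequate because the negative binomial tail decays geometrically.)

    # Truncate the variance series of the UMVUE and compare it with
    # the Cramer-Rao lower bound p^2 (1-p) / n.
    from math import comb

    def var_umvue(n, p, t_max=20000):
        """Var(p_hat) = sum_{t>=n} ((n-1)/(t-1) - p)^2 * C(t-1, n-1) p^n (1-p)^(t-n)."""
        total = 0.0
        for t in range(n, t_max + 1):
            pmf = comb(t - 1, n - 1) * p**n * (1 - p)**(t - n)
            total += ((n - 1) / (t - 1) - p) ** 2 * pmf
        return total

    for n in (2, 5, 20):
        for p in (0.2, 0.5, 0.8):
            v, crlb = var_umvue(n, p), p**2 * (1 - p) / n
            print(f"n={n:2d} p={p}: Var={v:.6f} CRLB={crlb:.6f} ratio={v/crlb:.3f}")

(In such runs the ratio stays above 1, consistent with the bound not being attained; equality in the CRLB would require $\hat p$ to be an affine function of $\sum_i X_i$, which it is not.)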



          Maybe somebody else can step in.
answered Dec 1 '16 at 1:33 by Momo, edited Apr 13 '17 at 12:21 by Community

• Can we find a closed form for the variance or $E(\hat p^2)$ when $n=2$, that is, for $\hat p=\frac{n-1}{\sum_{i=1}^{2} X_i-1}$? And how did you develop the variance summation? Thanks. – Bassem, Dec 1 '16 at 12:24
• $Y=\sum_{i=1}^n X_i$ is Negative Binomial, $T=\frac{n-1}{Y-1}$ is unbiased ($E[T]=p$), so $\operatorname{Var}(T)=E[(T-E[T])^2]=E\left[\left(\frac{n-1}{Y-1}-p\right)^2\right]=\sum_{t=n}^\infty\left(\frac{n-1}{t-1}-p\right)^2 P(Y=t)$. – Momo, Dec 1 '16 at 14:33
• Also, for $n=2$ you have $E[\hat{p}^2]=\frac{p^2\log(1/p)}{1-p}$, so $\operatorname{Var}(\hat p)=E[\hat{p}^2]-p^2$. – Momo, Dec 1 '16 at 14:39
• Great, thanks for the help, but could you show what formula you used to find $E[\hat{p}^2]$? – Bassem, Dec 1 '16 at 23:37
• $E(\hat p^2)=\sum_{t=n}^\infty \left(\frac{n-1}{t-1}\right)^2 \binom{t-1}{n-1}p^n(1-p)^{t-n}$. So for $n=2$, $E(\hat p^2)=\sum_{t=2}^\infty \frac{1}{t-1}\, p^2(1-p)^{t-2}=\frac{p^2}{1-p}\sum_{t=2}^\infty \frac{1}{t-1} (1-p)^{t-1}=\frac{p^2}{1-p}\sum_{i=1}^\infty \frac{(1-p)^i}{i}$. The last series needs $\sum_{i=1}^\infty \frac{x^i}{i}=-\log(1-x)$, which is obtained by integrating $\sum_{i=1}^\infty u^{i-1}=\frac{1}{1-u}$ term by term from $u=0$ to $x$. You might consider upvoting and accepting the answer, if it was useful for you. – Momo, Dec 1 '16 at 23:50
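
(A quick numerical sanity check of this $n=2$ closed form; the helper name and truncation point below are my additions, not part of the comment.)

    # Verify E[p_hat^2] = p^2 log(1/p) / (1-p) for n = 2 by truncating
    # the series E[p_hat^2] = sum_{t>=2} p^2 (1-p)^(t-2) / (t-1).
    from math import log

    def e_phat_sq(p, t_max=5000):
        return sum(p**2 * (1 - p)**(t - 2) / (t - 1) for t in range(2, t_max + 1))

    for p in (0.1, 0.5, 0.9):
        print(f"p={p}: series={e_phat_sq(p):.8f} closed={p**2 * log(1/p) / (1 - p):.8f}")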