MLE of the unknown radius


























Consider this question,




Suppose that $(X_1, Y_1), (X_2, Y_2), \dots, (X_n, Y_n)$ are the
coordinates of $n$ points chosen independently and uniformly at random
within a circle with center $(0, 0)$ and unknown radius $r$. Obtain
the MLE $\hat{r}_n$ of $r$.




My attempt:



I have thought about the question but I am not able to put it formally. This is my reasoning.
$X_i$ and $Y_i$ are both uniformly distributed independent random variables on a circle of radius $r$ and center at $(0,0)$. Since $(X_i, Y_i)$ are the coordinates of the $i^\text{th}$ point, transforming these coordinates to polar coordinates we get for the $i^\text{th}$ point $(\theta_i, a_i)$ (say), where $\theta_i \sim \text{Uniform}(0, 2\pi)$ and $a_i \sim \text{Uniform}(0, r)$ independently. Then $a_{(n)} = \max_i(a_i)$ is the MLE of $r$.



Is this reasoning correct? How do I put it formally? And if not, how should this problem be solved?








































self-study estimation maximum-likelihood inference

edited Apr 19 at 6:55 by Sanket Agrawal

asked Apr 19 at 5:48 by Sanket Agrawal






















1 Answer































          Uniform sampling over geometric shapes: With these kinds of problems, where your sample points are uniformly distributed over some fixed geometric shape, it is almost always easiest to proceed by first deriving the distribution function for the relevant quantity of interest. This is quite simple because the probability of falling within any subset of the overall sampling space is equal to the proportion that this subset takes up in the total area/volume of the total sampling space.





This question is substantially simplified if we recognise that the points give information about the radius $r$ only through their distance from the centre (i.e., the zero point). Let $R_1, R_2, \ldots, R_n$ be these distance values corresponding to each of the points and let $A(r) = \pi r^2$ be the area of a circle with radius $r$. Since each sample point is uniform in the circle we must have:



$$\mathbb{P}(R_i \leq t) = \frac{A(t)}{A(r)} = \frac{\pi t^2}{\pi r^2} = \Big( \frac{t}{r} \Big)^2
\quad \quad \quad \text{for all } 0 \leq t \leq r.$$



          This gives the corresponding sample density $p_r(t) = 2t/r^2$ over the support $0 leq t leq r$. As you can see, this sampling density is not uniform over the support. This is actually unsurprising --- the probability density for the distance $R_i$ is proportional to the circumference of a circle with radius equal to that distance.
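As a quick sanity check of this distribution function (a simulation sketch, not part of the derivation; the rejection sampler and variable names below are illustrative choices), we can sample points uniformly in a disc and compare the empirical $\mathbb{P}(R_i \leq t)$ against $(t/r)^2$, and against the $t/r$ that a $\text{Uniform}(0, r)$ radius would give:

```python
import numpy as np

rng = np.random.default_rng(0)
r, n = 2.0, 100_000

# Sample points uniformly in the disc of radius r by rejection from
# the enclosing square (one of several valid uniform-in-disc samplers).
pts = rng.uniform(-r, r, size=(4 * n, 2))
dist = np.hypot(pts[:, 0], pts[:, 1])
dist = dist[dist <= r][:n]

# Empirical P(R <= t) should be close to (t/r)^2, not t/r.
t = 1.2
print(np.mean(dist <= t))  # close to (1.2 / 2)**2 = 0.36
print(t / r)               # 0.6: what a Uniform(0, r) radius would predict
```

The empirical proportion matches the quadratic CDF, confirming that the radial distance is not uniform on $(0, r)$.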





          The maximum likelihood estimator: The likelihood function for $n$ observed data points is:



$$L_\mathbf{r}(r) = \prod_{i=1}^n p_r(r_i) \propto \frac{1}{r^{2n}} \cdot \mathbb{I}(r \geq r_{(n)}).$$



We can see that the likelihood function is strictly decreasing in $r$, so the maximum likelihood estimator (MLE) occurs at the boundary point $\hat{r}_n = r_{(n)}$. That is, the MLE for the true radius is the maximum of the distances from the zero point to the sample points. Since the true parameter $r$ is at least as large as the MLE, the MLE is biased, and will tend to underestimate the true radius. Specifically, we have:



$$\begin{equation} \begin{aligned}
\mathbb{E}(\hat{R}_n) = \mathbb{E}(R_{(n)})
&= \int \limits_0^r \mathbb{P}(R_{(n)} > t) \, dt \\[6pt]
&= \int \limits_0^r (1 - \mathbb{P}(R_{(n)} \leq t)) \, dt \\[6pt]
&= \int \limits_0^r \Big( 1 - \frac{t^{2n}}{r^{2n}} \Big) \, dt \\[6pt]
&= \Bigg[ t - \frac{1}{2n+1} \cdot \frac{t^{2n+1}}{r^{2n}} \Bigg]_{t=0}^{t=r} \\[6pt]
&= r \Big( 1 - \frac{1}{2n+1} \Big) \\[6pt]
&= \frac{2n}{2n+1} \cdot r.
\end{aligned} \end{equation}$$



          Thus, a bias-corrected scaled version of the MLE is:



$$\tilde{r}_n = \frac{2n+1}{2n} \cdot r_{(n)}.$$



Incidentally, this process can easily be extended to a hypersphere in higher dimensions. In each case the MLE is the maximum distance from the zero point. For a hypersphere in $k$ dimensions, the bias-corrected scaled MLE is:



$$\tilde{r}_n = \frac{kn+1}{kn} \cdot r_{(n)}.$$
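To illustrate the bias and the correction for the circle ($k = 2$; a hedged simulation sketch with illustrative variable names), note that inverting $\mathbb{P}(R_i \leq t) = (t/r)^2$ lets us sample the distances directly as $R = r\sqrt{U}$ with $U \sim \text{Uniform}(0, 1)$:

```python
import numpy as np

rng = np.random.default_rng(1)
r_true, n, reps = 3.0, 20, 50_000

# Inverse-CDF sampling: P(R <= t) = (t/r)^2 inverts to R = r * sqrt(U).
dist = r_true * np.sqrt(rng.uniform(size=(reps, n)))

mle = dist.max(axis=1)                     # r_hat_n = r_(n), per replication
corrected = (2 * n + 1) / (2 * n) * mle    # bias-corrected scaled MLE

print(mle.mean())        # close to 2n/(2n+1) * r = (40/41) * 3, about 2.93
print(corrected.mean())  # close to r_true = 3.0
```

Averaged over many replications, the raw MLE underestimates the radius by the factor $2n/(2n+1)$ derived above, while the corrected estimator is centred on the true value.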










































edited Apr 19 at 23:25

answered Apr 19 at 8:51 by Ben











