Integral of 1/x - base of logarithm












I see a proof in https://arxiv.org/abs/1805.11965 (equation 3.36) that uses the following:

$\log x = \int_0^{\infty} ds \left(\frac{1}{1+s} - \frac{1}{s+x}\right)$.

This seems to hinge on $\int \frac{1}{x}\,dx = \log_2 x$ (the context is information theory), as opposed to $\log_e x$. Why is this true?










integration logarithms

edited Dec 11 '18 at 15:45

asked Dec 11 '18 at 15:35

user1936752
  • It isn't? It is true up to a multiplicative constant, though, since $\log_2(x) = \ln(x)/\ln(2)$.
    – Juan Sebastian Lozano, Dec 11 '18 at 15:41

  • What is the "this" in "this seems to hinge"? The equation $\log x = \int_0^{\infty} ds \left(\frac{1}{1+s} - \frac{1}{s+x}\right)$? If so, why do you say it seems to hinge on $\int \frac{1}{x}$ [sic] $= \log_2 x$? If something else in the article, what?
    – fleablood, Dec 11 '18 at 16:01

  • The notation in the paper is a little confusing: in the classical part, entropy is measured in bits and $\log$ denotes the base-2 logarithm, but part 3 deals with quantum entropy, and von Neumann's definition uses the natural log. In fact, the unit of entropy under the natural logarithm has a name, nat, nit, or nepit; see e.g. en.wikipedia.org/wiki/Nat_(unit)
    – mlerma54, Dec 11 '18 at 16:08

  • Ah @mlerma54, I think that's the error in my assumption then. I assumed quantum entropy also stayed in $\log_2$. If you put your comment as an answer, I can accept it. Thank you.
    – user1936752, Dec 11 '18 at 17:23

2 Answers

The notation in the paper is a little confusing: in the classical part, entropy is measured in bits and $\log$ denotes the base-2 logarithm, but part 3 deals with quantum entropy, and von Neumann's definition uses the natural log. In fact, the unit of entropy under the natural logarithm has a name, nat, nit, or nepit; see e.g. en.wikipedia.org/wiki/Nat_(unit)
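The bits-versus-nats point is just a rescaling: since $\log_2 x = \ln x / \ln 2$, the same entropy expressed in bits and in nats differs only by the constant factor $\ln 2$. A minimal sketch (the `entropy` helper and the example distribution are illustrative, not from the paper):

```python
import math

# Shannon entropy of a discrete distribution, in a chosen log base.
def entropy(probs, base=2.0):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]
h_bits = entropy(p, base=2.0)      # entropy with the base-2 log (bits)
h_nats = entropy(p, base=math.e)   # entropy with the natural log (nats)

# Changing the base only rescales: 1 nat = 1/ln(2) bits.
assert abs(h_bits - h_nats / math.log(2)) < 1e-9
```

So a formula derived with $\ln$ carries over to $\log_2$ up to that constant, which is why the paper can switch units between sections.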






answered Dec 11 '18 at 19:45

mlerma54


Note that for $a > 0$,
$$
\int_0^M \frac{1}{a+s}\,ds = \int_{a}^{M+a} \frac{1}{s}\,ds = \ln(M+a) - \ln a.
$$

Therefore,
$$
\int_0^M \left(\frac{1}{1+s} - \frac{1}{x+s}\right) ds =
\ln(M+1) - \ln 1 - \big(\ln(M+x) - \ln x\big).
$$

Letting $M \to \infty$, we have
$$
\int_0^\infty \left(\frac{1}{1+s} - \frac{1}{x+s}\right) ds = \ln x.
$$
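The derivation can be sanity-checked numerically: the truncated integral on $[0, M]$ should match the closed form $\ln(M+1) - (\ln(M+x) - \ln x)$, and as $M$ grows the value should approach $\ln x$, not $\log_2 x$. A sketch (the trapezoid-rule helper and the choices $x = 5$, $M = 10$ are illustrative):

```python
import math

def f(s, x):
    # The integrand 1/(1+s) - 1/(x+s).
    return 1.0 / (1.0 + s) - 1.0 / (x + s)

def partial_integral_numeric(x, M, n=100_000):
    # Composite trapezoid rule for the truncated integral on [0, M].
    h = M / n
    total = 0.5 * (f(0.0, x) + f(M, x))
    total += sum(f(i * h, x) for i in range(1, n))
    return total * h

def partial_integral_closed(x, M):
    # Closed form from the derivation: ln(M+1) - ln 1 - (ln(M+x) - ln x).
    return math.log(M + 1) - (math.log(M + x) - math.log(x))

x = 5.0
# Quadrature agrees with the closed form on a finite interval...
assert abs(partial_integral_numeric(x, 10.0) - partial_integral_closed(x, 10.0)) < 1e-6
# ...and the closed form tends to ln(x), the natural log, as M grows.
assert abs(partial_integral_closed(x, 1e9) - math.log(x)) < 1e-6
```

This confirms the identity holds with the natural logarithm, consistent with the nats convention discussed in the comments.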






answered Dec 11 '18 at 15:51

induction601