Proof of the AM-GM theorem using a Lagrangian























Given:





  1. $\prod_{i=1}^n x_i = 1$ leads to the constraint function $G(x_1,x_2,\dots,x_n)=\prod_{i=1}^n x_i-1$


($\prod_{i=1}^n x_i = x_1 x_2 \cdots x_n$)



The task is to find the minimum of the following function using conditional extrema (the induction method, which would be most convenient, is forbidden); if we prove this special case, the derivation can be generalised to prove the AM-GM theorem:




  1. $F(x_1,x_2,\dots,x_n) = \sum_{i=1}^n x_i$


($\sum_{i=1}^n x_i = x_1+x_2+\dots+x_n$)



The idea is that the result should be $\sum_{i=1}^n x_i \geqslant n$, as far as I know.



And finally, using the derivation above, prove the AM-GM theorem:



$\frac{\sum_{i=1}^n x_i}{n} \geqslant \sqrt[n]{\prod_{i=1}^n x_i}$
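For reference, here is the reduction I have in mind (a sketch, assuming all $x_i > 0$): normalise each variable by the geometric mean, so the constrained special case applies to the scaled variables:

$$y_i = \frac{x_i}{\sqrt[n]{\prod_{j=1}^n x_j}} \;\implies\; \prod_{i=1}^n y_i = 1 \;\implies\; \sum_{i=1}^n y_i \geqslant n \;\implies\; \frac{\sum_{i=1}^n x_i}{n} \geqslant \sqrt[n]{\prod_{i=1}^n x_i}.$$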



Solution so far:



What I came up with is writing down the Lagrangian:



$L = F(x_1,x_2,\dots,x_n) - \lambda G(x_1,x_2,\dots,x_n) \implies L = \sum_{i=1}^n x_i - \lambda \left(\prod_{i=1}^n x_i-1\right)$



Taking the partial derivatives:



$\frac{\partial L}{\partial x_1} = 1 - \lambda \frac{\prod_{i=1}^n x_i}{x_1}=0 \implies \lambda \frac{\prod_{i=1}^n x_i}{x_1}=1 \implies \lambda \frac{1}{x_1}=1 \implies \lambda = x_1$



$\frac{\partial L}{\partial x_2} = 1 - \lambda \frac{\prod_{i=1}^n x_i}{x_2}=0 \implies \lambda \frac{\prod_{i=1}^n x_i}{x_2}=1 \implies \lambda \frac{1}{x_2}=1 \implies \lambda = x_2$



...



$\frac{\partial L}{\partial x_n} = 1 - \lambda \frac{\prod_{i=1}^n x_i}{x_n}=0 \implies \lambda \frac{\prod_{i=1}^n x_i}{x_n}=1 \implies \lambda \frac{1}{x_n}=1 \implies \lambda = x_n$



$\frac{\partial L}{\partial \lambda} = -\prod_{i=1}^n x_i + 1 = 0 \implies \prod_{i=1}^n x_i = 1$



$\lambda = x_1 = x_2 = \dots = x_n = 1$ is our critical point.
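As a sanity check (not part of the proof), the stationarity system can be solved symbolically for a small $n$; a minimal sketch, assuming SymPy is available:

```python
# Solve the Lagrange stationarity conditions for n = 3 with SymPy.
import sympy as sp

n = 3
xs = sp.symbols(f"x1:{n + 1}", positive=True)   # x1, x2, x3 > 0
lam = sp.Symbol("lambda")

F = sum(xs)                       # objective F = x1 + x2 + x3
G = sp.prod(xs) - 1               # constraint G = x1*x2*x3 - 1
L = F - lam * G                   # Lagrangian

eqs = [sp.diff(L, x) for x in xs] + [G]          # dL/dx_i = 0 and G = 0
print(sp.solve(eqs, (*xs, lam), dict=True))
# Expected output: [{x1: 1, x2: 1, x3: 1, lambda: 1}]
```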



Taking the differential of our constraint



$dG(x_1,x_2,\dots,x_n) = 0$



$\frac{\partial G}{\partial x_1}\Delta x_1+\frac{\partial G}{\partial x_2}\Delta x_2+\dots+\frac{\partial G}{\partial x_n}\Delta x_n=0$



$\frac{\prod_{i=1}^n x_i}{x_1}\Delta x_1 + \frac{\prod_{i=1}^n x_i}{x_2}\Delta x_2+\dots+\frac{\prod_{i=1}^n x_i}{x_n}\Delta x_n=0$



Substituting the critical point $x_1=x_2=\dots=x_n=1$ (so that $\prod_{i=1}^n x_i=1$) leads to



$\Delta x_1 + \Delta x_2 +\dots+ \Delta x_n = 0$



First question.



Update: answered by Andreas below.



The second-order differential of the Lagrangian has to be positive, but I am getting a negative sign:



$d^2L = \sum_{j=1}^n\sum_{i=1}^n L_{x_j x_i}\, \Delta x_j\, \Delta x_i = -\lambda \Bigl(\prod x_i\Bigr) \sum_{j=1}^n\sum_{i=1}^n \frac{\Delta x_i\, \Delta x_j}{x_i x_j} < 0$



Here I took the second-order partial derivatives of $L$:



$\frac{\partial^2 L}{\partial x_1\, \partial x_1} = -\lambda \frac{\prod x_i}{x_1 x_1}$



$\frac{\partial^2 L}{\partial x_1\, \partial x_2} = -\lambda \frac{\prod x_i}{x_1 x_2}$



...



$\frac{\partial^2 L}{\partial x_n\, \partial x_n} = -\lambda \frac{\prod x_i}{x_n x_n}$



Second question.
Is this reasoning correct?



It still needs to be justified why the local extremum is also a global one.



If the second differential turns out to be positive, then $(1,1,\dots,1)$ is a local minimum; but a priori the function could still behave like a cubic polynomial that has no global minimum. To rule that out, we can examine the limits along the axis directions: sending all but one variable to infinity forces the remaining variable to zero on the constraint surface, while the sum still grows without bound. Since $F$ lives in the positive $n$-dimensional orthant and tends to infinity in these directions, the critical point has to be the global minimum.
I am not sure; it is just a feeling. I was also thinking of rotating the function 45 degrees towards the vertical axis and then arguing that it goes to infinity in every direction.
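A quick numerical experiment that is consistent with this (a sketch of my own, assuming SciPy is available; the solver and tolerances are arbitrary choices):

```python
# Minimize sum(x) subject to prod(x) = 1 from several random positive starts.
# If the local minimum at (1,...,1) is global, every run should converge there.
import numpy as np
from scipy.optimize import minimize

n = 4
constraint = {"type": "eq", "fun": lambda x: np.prod(x) - 1.0}
rng = np.random.default_rng(1)

for _ in range(5):
    x0 = rng.uniform(0.2, 3.0, size=n)           # random positive start
    res = minimize(lambda x: x.sum(), x0,
                   method="SLSQP",
                   bounds=[(1e-6, None)] * n,     # keep the iterates positive
                   constraints=[constraint])
    print(res.x.round(4), round(res.fun, 4))      # expect x ~ (1,...,1), fun ~ n
```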



Update: I have probably solved the global-minimum part; I will consult my professor and update the solution if my assumptions are right.



Update: I just posted the proof to the wiki: MyProof.
The idea was simply to apply the Weierstrass (extreme value) theorem to a suitable closed and bounded piece of the constraint set.
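In outline (my reading of the Weierstrass idea; the wiki proof may differ in details): for any fixed $R > n$, the sublevel set of $F$ on the constraint surface,

$$K_R = \left\{x : \prod_{i=1}^n x_i = 1,\ x_i > 0,\ \sum_{i=1}^n x_i \leqslant R\right\},$$

is compact: $\sum_i x_i \leqslant R$ with positive coordinates gives $x_i \leqslant R$, and then $x_i = 1/\prod_{j \ne i} x_j \geqslant R^{-(n-1)}$. By the Weierstrass theorem $F$ attains a minimum on $K_R$; the minimum cannot lie on the boundary piece $\sum_i x_i = R > n = F(1,\dots,1)$, so it is attained at an interior critical point, and $(1,\dots,1)$ is the only one.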










      calculus multivariable-calculus lagrange-multiplier extreme-value-theorem extreme-value-analysis






      edited Nov 27 at 8:47

























      asked Nov 21 at 10:32









      MotherLand























          1 Answer

















answered Nov 21 at 13:42 by Andreas













Taking the second derivatives and evaluating at $\lambda = x_1 = x_2 = \dots = x_n = 1$ gives
$$ L = \sum_{i=1}^n x_i - \lambda \left(\prod_{i=1}^n x_i-1\right)\\
\frac{\partial L}{\partial x_k} = 1 - \lambda \frac{\prod_{i=1}^n x_i}{x_k}\\
\frac{\partial^2 L}{\partial x_k^2} = 0\\
\frac{\partial^2 L}{\partial x_k\, \partial x_m} = - \lambda \frac{\prod_{i=1}^n x_i}{x_k x_m} = -1 \quad (k \ne m)
$$

Now you need that, for any vector $\Delta x$ with $\Delta x_1 + \Delta x_2 +\dots+ \Delta x_n = 0$, the following holds: $$\sum_{i=1}^n\sum_{j=1}^n \frac{\partial^2 L}{\partial x_i\, \partial x_j} \Delta x_i \Delta x_j > 0.$$ We have
$$\sum_{i=1}^n\sum_{j=1}^n \frac{\partial^2 L}{\partial x_i\, \partial x_j} \Delta x_i \Delta x_j = - \sum_{i=1}^n\sum_{j \neq i} \Delta x_i \Delta x_j \\
= - \sum_{i=1}^n \Delta x_i \sum_{j \neq i} \Delta x_j \\
= - \sum_{i=1}^n \Delta x_i (0 -\Delta x_i ) = \sum_{i=1}^n \Delta x_i^2 > 0.$$
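As a quick numerical cross-check of this computation (my own sketch, assuming NumPy): build the Hessian at the critical point ($0$ on the diagonal, $-1$ off it) and test the quadratic form on random vectors projected onto the hyperplane $\sum_i \Delta x_i = 0$:

```python
# Verify that the Hessian of L at (1,...,1), restricted to the tangent
# space {sum(dx) = 0}, is positive definite and equals sum(dx_i^2).
import numpy as np

n = 5
H = -(np.ones((n, n)) - np.eye(n))   # H[k, m] = -1 for k != m, 0 on the diagonal

rng = np.random.default_rng(0)
for _ in range(1000):
    dx = rng.standard_normal(n)
    dx -= dx.mean()                  # project onto the hyperplane sum(dx) = 0
    q = dx @ H @ dx                  # the second-order differential d^2 L
    assert q > 0                     # positive on the tangent space
    assert np.isclose(q, np.sum(dx ** 2))  # matches the closed form above
print("restricted Hessian is positive definite (numerically)")
```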



























• Thanks! I figured it out as well. Now I am left with the local-to-global part to think about.
  – MotherLand
  Nov 21 at 14:50










