Proving the multiplicative property of the exponential function from its limit definition


























I am trying to show that the function defined by
$$E(x)=\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n$$
satisfies the property
$$E(x)E(y)=E(x+y)$$

Assuming $E$ is well defined, I can interchange products and limits (?). We have:

$$\begin{aligned} E(x)E(y) &=\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n\lim_{n\to\infty}\left(1+\frac{y}{n}\right)^n \\ &=\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)\left(1+\frac{y}{n}\right)\right]^n \\ &=\lim_{n\to\infty}\left[1+\frac{x+y}{n}+\frac{xy}{n^2}\right]^n \\ &=\lim_{n\to\infty}\sum_{k=0}^n\binom{n}{k}\left(1+\frac{x+y}{n}\right)^{n-k}\left(\frac{xy}{n^2}\right)^k \\ &= E(x+y)+\lim_{n\to\infty}\sum_{k=1}^n\binom{n}{k}\left(1+\frac{x+y}{n}\right)^{n-k}\left(\frac{xy}{n^2}\right)^k\end{aligned}$$

This is where I am stuck. I can see that, for any fixed $k$,
$$\binom{n}{k}\left(1+\frac{x+y}{n}\right)^{n-k}\left(\frac{xy}{n^2}\right)^k=\frac{1}{k!}\left(\frac{xy}{n}\right)^k\left(1+\frac{x+y}{n}\right)^{n-k}\prod_{r=0}^{k-1}\left(1-\frac{r}{n}\right)\xrightarrow{\,n\to\infty\,} 0$$
But because the number of terms in the sum grows with $n$, I feel this termwise convergence is not sufficient to conclude that the limit of the whole sum is $0$. Is that right? If so, I'm not sure how to proceed and would appreciate some help.
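Though not a proof, a quick numerical check makes the claim and the vanishing remainder visible; a minimal sketch in Python (the helper name `E_n` is mine):

```python
import math

def E_n(x, n):
    """n-th term of the limit definition E(x) = lim (1 + x/n)^n."""
    return (1.0 + x / n) ** n

x, y = 1.3, -0.7
for n in (10, 100, 10_000):
    prod = E_n(x, n) * E_n(y, n)        # equals (1 + (x+y)/n + xy/n^2)^n
    remainder = prod - E_n(x + y, n)    # the sum over k >= 1 above
    print(n, prod, remainder)
# prod approaches e^{x+y}, and the remainder shrinks roughly like 1/n.
```

The remainder does go to $0$, but confirming that numerically is of course no substitute for the uniform estimate the question is asking about.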










real-analysis exponential-function






asked Nov 22 at 21:49
– K. 622






























If you know that $E(x)\neq 0$ for all $x\in\mathbb{R}$, then an easy proof is given by considering the sequence $x_n=\dfrac{1+(x+y)/n}{(1+x/n)(1+y/n)}$ and noting that $n(x_n-1)\to 0$, so that $x_n^n\to 1$. See this answer: math.stackexchange.com/a/3000717/72031
– Paramanand Singh
Nov 23 at 4:34
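The comment's sequence is easy to illustrate numerically (a sketch with my own variable names, not part of the linked answer):

```python
x, y = 1.3, -0.7
for n in (10, 1_000, 100_000):
    xn = (1 + (x + y) / n) / ((1 + x / n) * (1 + y / n))
    # n*(xn - 1) = (-xy/n) / ((1 + x/n)(1 + y/n)), which tends to 0
    print(n, n * (xn - 1), xn ** n)
```

As expected, $n(x_n-1)\to 0$ and $x_n^n\to 1$, which is exactly what the quotient argument needs.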


















3 Answers
























You do not need to deal with that hard expansion. Here is another way, using the squeeze theorem, for $xy\ge 0$; the case $xy\le 0$ is similar. Notice that for any $a>0$ and all large enough $n$ (namely $n>a$) we have
$$1+{x+y\over n}\le 1+{x+y\over n}+{xy\over n^2}<1+{x+y\over n}+{xy\over an}=1+{x+y+{xy\over a}\over n}$$
Raising to the $n$-th power and letting $n\to\infty$, we get
$$\lim_{n\to \infty}\left(1+{x+y\over n}\right)^n\le \lim_{n\to \infty}\left(1+{x+y\over n}+{xy\over n^2}\right)^n\le \lim_{n\to \infty}\left(1+{x+y+{xy\over a}\over n}\right)^n$$
or, using the definition,
$$E(x+y)\le E(x)E(y)\le E\left(x+y+{xy\over a}\right)$$
Since this is true for any $a>0$, letting $a\to\infty$ we obtain
$$E(x+y)\le E(x)E(y)\le E(x+y)$$
which yields
$$E(x+y)=E(x)E(y)$$
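The chain of inequalities in this answer can be spot-checked numerically for a concrete $a$ (a sketch; the values are mine, chosen with $xy\ge 0$ and $n>a$ so the middle bound applies):

```python
x, y, a = 1.5, 0.5, 100.0   # xy >= 0, as in the answer's first case
for n in (1_000, 100_000):  # n > a, so xy/n^2 < xy/(a*n)
    lower  = (1 + (x + y) / n) ** n
    middle = ((1 + x / n) * (1 + y / n)) ** n
    upper  = (1 + (x + y + x * y / a) / n) ** n
    assert lower <= middle <= upper
    print(n, lower, middle, upper)
```

Increasing $a$ pulls `upper` down toward `lower`, which is the squeeze.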







answered Nov 22 at 22:00, edited Nov 27 at 8:17
– Mostafa Ayaz












Thank you, that is easier indeed. I assume you mean $a\to\infty$ as opposed to $a\to 0$?
– K. 622
Nov 22 at 22:48










Yes, that's because of how I plugged $a$ into the inequality.
– Mostafa Ayaz
Nov 23 at 7:28










@Mostafa Ayaz: What are the arguments for letting $a$ tend to $0$? It seems that you use continuity of $E$ at a very early stage.
– Jens Schwaiger
Nov 27 at 8:14










The idea I used: $ax$ grows much more slowly than $x^2$ for any $a>0$ and for sufficiently large $x$.
– Mostafa Ayaz
Nov 27 at 8:19














I don't recommend trying to do it this way. The cleanest proof I know is a bit less direct than this. First, show that $E(x)$ is the unique solution to the differential equation $E'(x) = E(x)$ with initial condition $E(0) = 1$, and more generally that $C E(x)$ is the unique solution with initial condition $E(0) = C$.

For existence you'll want to exchange a limit and a derivative and you'll need to be careful about that, but once you've justified that exchange, $\frac{d}{dx} \left( 1 + \frac{x}{n} \right)^n = \left( 1 + \frac{x}{n} \right)^{n-1}$, so it's clear that the two limits are the same. Morally, the point is that the limit definition of $E(x)$ is attempting to solve this differential equation using Euler's method.

Uniqueness is easier: if $E_1(x)$ and $E_2(x)$ are two solutions, compute the derivative of $\frac{E_1(x)}{E_2(x)}$. (Well, first show that solutions are always positive, so we can take this quotient.)

Once you have this, everything is very easy: $E(x) E(y)$ and $E(x + y)$, as functions of $x$ with $y$ fixed, are both solutions to the differential equation $f'(x) = f(x)$ with initial condition $f(0) = E(y)$. And then we're done by uniqueness.
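The Euler's-method remark can be made concrete: $n$ Euler steps of size $x/n$ applied to $f'=f$, $f(0)=1$ multiply $f$ by $(1+x/n)$ at each step, reproducing the limit definition exactly. A small sketch (the function name is mine):

```python
import math

def euler_exp(x, n):
    # n Euler steps of size h = x/n for f' = f, f(0) = 1
    h = x / n
    f = 1.0
    for _ in range(n):
        f = f + h * f   # each step sends f to (1 + x/n) * f
    return f

x = 0.6
print(euler_exp(x, 10_000), (1 + x / 10_000) ** 10_000, math.exp(x))
```

The two computed values agree up to rounding, and both approach $e^{0.6}$ as $n$ grows.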







answered Nov 22 at 21:56
– Qiaochu Yuan












Thanks, that's a clever detour. I wasn't aware of the source of the limit definition (Euler's method), thank you for mentioning that.
– K. 622
Nov 22 at 22:50


















In the excellent book

Analysis 1, 6th corrected edition (German)
Springer-Lehrbuch. Berlin: Springer. xiv, 398 pp. (2001)

on page 78 (Exercise 14) you find the following (approximately translated):

The exponential function as the limit of $\left(1+\frac{x}{n}\right)^n$.
Show that $E(x)=\lim E_n(x)$, where $E_n(x)=\left(1+\frac{x}{n}\right)^n$, exists and that this limit equals $e^x$.

Hint: Following Example 4.7 [existence of $\lim E_n(x)$, proved by applying the AGM inequality, i.e., the inequality between the arithmetic and geometric means], the sequence is (eventually) monotonically increasing, and $\left(1+\frac{p}{n}\right)\leq\left(1+\frac{1}{n}\right)^p$ implies $E_n(p)\leq e^p$, i.e., $E_n(x)\leq E_n(y)\leq e^p$ for $-n\leq x\leq y\leq p$. The AGM inequality gives
$$\left(1+\frac{x}{n}\right)^n \left(1+\frac{y}{n}\right)^n\leq \left(1+\frac{x+y}{2n}\right)^{2n}$$
and
$$\left(1+\frac{x+y}{n-1}\right)^{n-1} \left(1+\frac{xy}{n}\right)\leq \left(1+\frac{x}{n}+\frac{y}{n}+\frac{xy}{n^2}\right)^n=\left(1+\frac{x}{n}\right)^n\left(1+\frac{y}{n}\right)^n,$$
implying $E(x)E(y)\leq E(x+y)\leq E(x)E(y)$, i.e., $E(x)E(y)=E(x+y)$.







answered Nov 23 at 14:06, edited Nov 23 at 15:29
– Jens Schwaiger





























