Prove that these sets of polynomials have real and distinct roots.


























Can anyone tell me if the following set of polynomials has a special name?
$$P_{0}(x)=1,\quad P_{1}(x)=x$$
$$P_{n}(x)=xP_{n-1}(x)-P_{n-2}(x)$$
The above gives:
$$P_{2}(x)=x^2-1;\quad P_{3}(x)=x^3-2x;\quad P_{4}(x)=x^4-3x^2+1;\quad P_{5}(x)=x^5-4x^3+3x$$
So $P_{n}(x)$ has parity $(-1)^n$. I was trying to find out whether they are orthogonal, but couldn't find a suitable weight function. My main concern is to prove that $P_{n}(x)$ has $n$ distinct real roots, all larger than or equal to $-2$.
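A quick way to experiment with the recurrence numerically (a Python/numpy sketch, not part of the question itself) is to iterate it on coefficient arrays and inspect the roots:

```python
import numpy as np
from numpy.polynomial import polynomial as Poly

# P_0 = 1, P_1 = x; coefficient arrays are low-to-high order.
p_prev, p = np.array([1.0]), np.array([0.0, 1.0])
for n in range(2, 6):
    # P_n = x*P_{n-1} - P_{n-2}: prepending a zero multiplies by x.
    p_prev, p = p, Poly.polysub(np.append(0.0, p), p_prev)
    print(f"P_{n} coefficients (low to high): {p}")
    print(f"  roots: {np.sort(np.real(Poly.polyroots(p)))}")
```

For every $n$ tried, all roots come out real, distinct, and strictly inside $(-2,2)$.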

































  • All roots cannot be less than $-2$. In fact for all your examples ($n=1..5$) all roots are larger than $-2$.
    – user
    Dec 21 '18 at 12:31










  • Yeah, sorry. I meant I needed them to be larger than $-2$.
    – Mani Jha
    Dec 21 '18 at 12:33










  • In fact you can claim that all roots are larger than $-2$ and less than $2$.
    – user
    Dec 21 '18 at 12:54










  • For general $n$? How exactly?
    – Mani Jha
    Dec 21 '18 at 13:02










  • Generally the roots of the polynomial $P_n(x)$ are $2\cos\frac{k}{n+1}\pi$, with $k=1..n$. From this it is evident that they all are real, distinct, and satisfy the aforementioned inequality.
    – user
    Dec 21 '18 at 13:19


















polynomials roots orthogonal-polynomials






edited Dec 21 '18 at 19:26 by user
asked Dec 21 '18 at 12:24 by Mani Jha
2 Answers





































Your polynomials are indeed a special case of classical orthogonal polynomials.
According to Abramowitz/Stegun 22.7.6 you have
$$P_n(x) = S_n(x) = U_n\left(\frac{x}{2}\right)$$
where $U_n(x)$ is the well-known Chebyshev polynomial of the second kind.

The weight function for the interval $(-2,2)$ is
$$w(x)=\left(1-\frac{x^2}{4}\right)^{1/2}$$

And of course this means that the roots are simple, distinct, and located in the interval $(-2,2).$ For a proof see e.g. my answer to
Proof the Legendre polynomial $P_n$ has $n$ distinct real zeros.

The orthogonality of $P_n$ follows from the corresponding property of $U_n$ and $\sin$, see e.g. https://www.sciencedirect.com/science/article/pii/0377042793901485:

With $x=\cos\theta$ and $\sin\theta = (1-x^2)^{1/2}$ you have
$$U_n(\cos\theta) = \frac{\sin\big((n{+}1)\theta\big)}{\sin\theta}$$
so $$(1-x^2)^{1/2}U_n(x) = \sin\big((n{+}1)\theta\big)$$

(Although I have not yet seen a fully formulated proof, a direct proof from the recursion could perhaps be modelled after https://planetmath.org/orthogonalityofchebyshevpolynomialsfromrecursion)
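The identity $P_n(x)=U_n(x/2)$ combined with $U_n(\cos\theta)=\sin((n{+}1)\theta)/\sin\theta$ is easy to sanity-check numerically; a small Python sketch (assumed, not from the references):

```python
import math

def P(n, x):
    """Evaluate P_n(x) from the recurrence P_n = x*P_{n-1} - P_{n-2}."""
    p_prev, p = 1.0, x          # P_0, P_1
    if n == 0:
        return p_prev
    for _ in range(n - 1):
        p_prev, p = p, x * p - p_prev
    return p

# P_n(2*cos t) should equal U_n(cos t) = sin((n+1)t)/sin(t).
for n in range(8):
    for t in (0.3, 1.1, 2.5):
        lhs = P(n, 2.0 * math.cos(t))
        rhs = math.sin((n + 1) * t) / math.sin(t)
        assert abs(lhs - rhs) < 1e-9, (n, t)
```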






answered Dec 21 '18 at 13:50, edited Dec 21 '18 at 16:39 by gammatester
























  • Is there a simple proof that $P_n(x)$ defined by the given recurrence relation are orthogonal on $(-2,2)$ with the weight function $w(x)$?
    – user
    Dec 21 '18 at 15:35


































In what follows the explicit expression for the roots of the polynomials $P_n(x)$ will be derived. The statement "the roots are all distinct, real, and less than 2 in absolute value" follows immediately.

Consider a family of $n\times n$ tridiagonal matrices:
$$\begin{align}
A^{(n)}_{ij}=&\delta_{i-j,1}+\delta_{j-i,1},
\end{align}\tag{1}$$

given below for $n=5$ as an example:
$$
A^{(5)}=\begin{pmatrix}
0&1&0&0&0\\
1&0&1&0&0\\
0&1&0&1&0\\
0&0&1&0&1\\
0&0&0&1&0
\end{pmatrix}.
$$

Lemma 1. The eigenvalues of the matrix (1) are:
$$
\begin{align} \lambda_m=2\cos\frac{\pi m}{n+1},&
\quad\text{with associated eigenvectors } u_{mk}=\sin\frac{\pi m}{n+1}k,
\end{align}\tag{2}
$$

where $m$ and $k$ run from 1 to $n$.

Though it would suffice for the proof to let the matrix $A$ act on the given vectors, we present below an extended "constructive" version.

Assume the elements of an eigenvector $u$ have the form:
$$
u_k=e^{\alpha k}+ae^{-\alpha k},\tag{3}
$$

with some parameters $a$ and $\alpha$, which are to be found.

Obviously for all $k=2\dots(n-1)$
$$
(Au)_k=\left(e^{\alpha k}+ae^{-\alpha k}\right)\left(e^\alpha+e^{-\alpha}\right)
=\left(e^\alpha+e^{-\alpha}\right)u_k.\tag{4}$$

Thus it remains only to find such $a$ and $\alpha$ that equation (4) is satisfied for $k=1$ and $k=n$ as well.

For $k=1$:
$$
e^{2\alpha}+ae^{-2\alpha}=\left(e^\alpha+e^{-\alpha}\right)\left(e^\alpha+a e^{-\alpha}\right)
\Leftrightarrow
1+a=0.\tag{5}
$$

For $k=n$:
$$
e^{\alpha (n-1)}+ae^{-\alpha(n-1)}=\left(e^\alpha+e^{-\alpha}\right)\left(e^{\alpha n}+a e^{-\alpha n}\right)
\Leftrightarrow e^{\alpha (n+1)}+ae^{-\alpha(n+1)}=0.\tag{6}
$$

It follows: $a=-1$, $\alpha=\frac{\pi m}{n+1}i$, where $m$ is an integer. Plugging these values into (3) and (4) one obtains (2).

As all $n$ eigenvalues are distinct, Lemma 1 is proved.

Lemma 2. The characteristic polynomials of the negated matrix (1):
$$
Q_n(x)\equiv\left|A^{(n)}+x I^{(n)}\right|,
$$

where $I^{(n)}$ is the $n\times n$ identity matrix, are the polynomials in question:
$$Q_n(x)=P_n(x).\tag{7}$$

For $n=1$ and $n=2$ the equality (7) is obvious. Assume that (7) is valid for all $n<N$. Then it is valid for $n=N$ as well.

Indeed, applying the Laplace expansion to the matrix $A^{(N)}$ (with $N>2$) one readily obtains:
$$
Q_N(x)=x Q_{N-1}(x)-Q_{N-2}(x)\stackrel{I.H.}{=}x P_{N-1}(x)-P_{N-2}(x)=P_N(x).\tag{8}
$$

Thus, by induction Lemma 2 is proved.

Now, as the eigenvalues of a matrix are exactly the roots of its characteristic polynomial,

Lemma 3. The roots of $P_n(\lambda)$ are:
$$
\lambda^{(n)}_m=2\cos\frac{\pi m}{n+1}, \quad m=1\dots n
$$

is a simple corollary of Lemmas 1 and 2.
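The lemmas are also easy to check numerically; a sketch (Python/numpy assumed, `n=7` chosen arbitrarily):

```python
import numpy as np
from numpy.polynomial import polynomial as Poly

n = 7
# The matrix (1): ones on the first sub- and superdiagonal, zeros elsewhere.
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)

eig = np.sort(np.linalg.eigvalsh(A))
predicted = np.sort(2 * np.cos(np.pi * np.arange(1, n + 1) / (n + 1)))
assert np.allclose(eig, predicted)          # Lemma 1

# Lemmas 2/3: the roots of P_n, built from the recurrence, give the same set.
p_prev, p = np.array([1.0]), np.array([0.0, 1.0])   # P_0, P_1, low-to-high
for _ in range(n - 1):
    p_prev, p = p, Poly.polysub(np.append(0.0, p), p_prev)
roots = np.sort(np.real(Poly.polyroots(p)))
assert np.allclose(roots, predicted)
```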






answered Dec 21 '18 at 19:20, edited Dec 23 '18 at 11:18 by user
























