If you can have multiple linearly independent eigenvectors for the same eigenvalue, why do we solve for just one?


























The way I have been taught to solve for eigenvalues and eigenvectors in linear algebra is this: compute the eigenvalues as the roots of $\det(A-\lambda I)=0$, then compute the eigenvectors for each eigenvalue by solving $(A-\lambda I)x = 0$. The way I understood it, each eigenvector served as a basis for the space containing all eigenvectors of that eigenvalue. However, I just learned that an eigenvalue can have more than one linearly independent eigenvector. So what's the point of calculating just one of these for each eigenvalue, if there are theoretically infinitely many linearly independent eigenvectors for each eigenvalue? Why do we choose just one?
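As a concrete illustration of the situation being asked about (a small numpy sketch, not part of the original post): the scalar matrix below has the single eigenvalue $2$, yet `numpy.linalg.eig` returns two linearly independent eigenvectors for it.

```python
import numpy as np

# A scalar matrix: every nonzero vector is an eigenvector for eigenvalue 2.
A = np.array([[2.0, 0.0],
              [0.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)   # the eigenvalue 2 appears with multiplicity 2

# The columns of `eigenvectors` are linearly independent,
# so this eigenvalue's eigenspace is 2-dimensional.
print(np.linalg.matrix_rank(eigenvectors))  # 2
```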



























      linear-algebra eigenvalues-eigenvectors






      asked Dec 19 '18 at 18:39









Will Burghard























          2 Answers





































If you were taught that, then you were taught wrong. If $A$ is an $n\times n$ matrix then, for each eigenvalue $\lambda$ of $A$, the dimension of the space $$E_\lambda=\{v\in\mathbb{R}^n \mid Av=\lambda v\}$$ can go from $1$ to $n$. And, of course, if it's greater than $1$, a single vector $v\in E_\lambda$ will not be a basis of it. If the dimension is $k$, every basis will have $k$ linearly independent eigenvectors (but, unlike what you wrote, never infinitely many).
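To see this range of possible dimensions numerically (a small numpy sketch, not part of the original answer): both matrices below have only the eigenvalue $2$, but the eigenspace dimension $\dim E_\lambda = n - \operatorname{rank}(A-\lambda I)$ differs.

```python
import numpy as np

def eigenspace_dim(A, lam):
    """dim E_lam = n - rank(A - lam*I)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

J = np.array([[2.0, 1.0],
              [0.0, 2.0]])   # eigenvalue 2, but only a 1-dimensional eigenspace
S = 2.0 * np.eye(2)          # eigenvalue 2 with a 2-dimensional eigenspace

print(eigenspace_dim(J, 2.0))  # 1
print(eigenspace_dim(S, 2.0))  # 2
```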




























In the low-dimensional case, you start by finding the roots of the characteristic polynomial. For each root $\lambda$, the equation $(A-\lambda I)x=0$ is a linear system, and its solutions form the eigenspace for $\lambda$. The dimension of the eigenspace is at most the multiplicity of the root $\lambda$, so in particular it is finite-dimensional. And in fact there are only finitely many eigenvalues, which means there are only finitely many linearly independent eigenvectors.



            In the high-dimensional (but still finite-dimensional) case, the principles are the same but finding the roots and solving the linear system can be impractical.



            In the infinite-dimensional case, the above steps will fail (there is no characteristic polynomial for an infinite-rank operator). In fact there is no general method for finding eigenvalues or eigenvectors of an infinite-rank operator. Sometimes there are infinitely many linearly independent eigenvectors, and sometimes there are no eigenvectors at all.
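The low-dimensional procedure described above can be sketched in numpy (an illustrative sketch, not code from the answer): take the roots of the characteristic polynomial, then get a basis of each eigenspace as the null space of $A-\lambda I$, computed here via the SVD.

```python
import numpy as np

def null_space_basis(M, tol=1e-10):
    """Columns spanning the null space of M, via the SVD."""
    _, s, vh = np.linalg.svd(M)
    rank = int(np.sum(s > tol))   # numerically nonzero singular values
    return vh[rank:].T            # right singular vectors for zero singular values

A = np.array([[3.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 1.0]])

# Roots of the characteristic polynomial det(A - lam*I) = 0.
lams = np.roots(np.poly(A))
lams = np.round(lams.real, 6)     # A is symmetric here, so the roots are real

for lam in sorted(set(lams)):
    basis = null_space_basis(A - lam * np.eye(3))
    print(lam, basis.shape[1])    # eigenvalue and its eigenspace dimension
```

Here the double root $3$ has a 2-dimensional eigenspace and the simple root $1$ a 1-dimensional one, matching the multiplicity bound stated above.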




















                  answered Dec 19 '18 at 18:50









José Carlos Santos
























                          answered Dec 19 '18 at 19:02









Ben W






























