Is there an easy way to find the (analytical form of) remaining eigenvalues of a 4x4 matrix?












Consider the following 4x4 real symmetric matrix:



$$
M = 2\,
\begin{pmatrix}
1 & c_{xy} & c_{xz} & c_{yz} \\
c_{xy} & 1 & c_{y\overline{z}} & c_{x\overline{z}} \\
c_{xz} & c_{y\overline{z}} & 1 & c_{x\overline{y}} \\
c_{yz} & c_{x\overline{z}} & c_{x\overline{y}} & 1
\end{pmatrix}
$$

where the shorthands $c_{xy} = \cos\left(\frac{x + y}{4}\right)$, $c_{x\overline{y}} = \cos\left(\frac{x - y}{4}\right)$, etc. are used.



I know the eigenvalues $\lambda$ of $M$:
$$
\lambda = 0,\ 0,\ 4 \pm 2\sqrt{1 + Q}
$$

where $Q = c_{xy}^2 + c_{x\overline{y}}^2 + c_{yz}^2 + c_{y\overline{z}}^2 + c_{xz}^2 + c_{x\overline{z}}^2 - 3$, i.e. two zero and two non-zero eigenvalues.



Question



Now my question is: if I know for a fact that two of the eigenvalues are $0$, how can I arrive simply at the other two non-zero eigenvalues shown above?
Note that I am interested in the analytical form (I know these can be solved easily with numerical methods).



I'm thinking there may be a way to reduce $M$ to a $2 \times 2$ matrix block.



My attempt



Not much of an attempt, but the usual tricks (the trace of a matrix is the sum of its eigenvalues, and the determinant is the product of its eigenvalues) don't really help here.
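For what it's worth, the trace idea can be pushed one step further: once two eigenvalues are known to vanish, the characteristic polynomial collapses to $\lambda^2\left(\lambda^2 - \operatorname{tr}(M)\,\lambda + e_2\right)$ with $e_2 = \frac{(\operatorname{tr} M)^2 - \operatorname{tr}(M^2)}{2}$, so the non-zero pair is fixed by $\operatorname{tr} M$ and $\operatorname{tr}(M^2)$ alone. A minimal numerical sanity check of this (NumPy, with arbitrarily chosen test angles, not from the original post):

```python
import numpy as np

# Arbitrary test angles, chosen only for this check.
x, y, z = 0.3, 0.7, 1.1
cp = lambda a, b: np.cos((a + b) / 4)   # c_{ab}
cm = lambda a, b: np.cos((a - b) / 4)   # c_{a b-bar}

M = 2 * np.array([
    [1.0,      cp(x, y), cp(x, z), cp(y, z)],
    [cp(x, y), 1.0,      cm(y, z), cm(x, z)],
    [cp(x, z), cm(y, z), 1.0,      cm(x, y)],
    [cp(y, z), cm(x, z), cm(x, y), 1.0],
])

# With two zero eigenvalues, the non-zero pair solves
# lambda^2 - tr(M)*lambda + e2 = 0, where e2 is the second
# elementary symmetric function of the eigenvalues.
t1 = np.trace(M)
t2 = np.trace(M @ M)
e2 = (t1**2 - t2) / 2
disc = np.sqrt(t1**2 - 4 * e2)          # equals sqrt(2*t2 - t1^2)
lam_pm = np.array([(t1 - disc) / 2, (t1 + disc) / 2])

print(np.sort(np.linalg.eigvalsh(M)))   # should show two ~0 entries, then lam_pm
print(lam_pm)
```

Since $\operatorname{tr} M = 8$ here, this reproduces the $4 \pm 2\sqrt{1+Q}$ form symbolically as well; the same two-trace computation applies to any 4x4 symmetric matrix with a known double zero eigenvalue.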



I solved for the eigenvalues symbolically using SymPy in Python (I believe one can do the same in Mathematica), but this method doesn't scale well (see the context below).



Context



The point of asking this is that I have another, more complicated 4x4 real symmetric matrix filled with trigonometric expressions, which I know has two zero eigenvalues (like $M$).



I want to find the analytical form of the other two eigenvalues, but solving symbolically for all four is too computationally expensive. I was hoping I could make use of the fact that two of the eigenvalues are known to be zero.



Hence, a useful answer would describe a method by which I can obtain the two non-zero $\lambda$'s of $M$ -- then hopefully I can generalize it to my own, more complicated matrix.










Tags: linear-algebra, matrices, eigenvalues-eigenvectors






asked Dec 5 '18 at 1:02 by Troy, edited Dec 5 '18 at 1:08






















1 Answer



















Possible idea. Since you know two eigenvalues are $0$, you know the $0$-eigenspace is $2$-dimensional. If you can find the null space (symbolically), then its orthogonal complement is $2$-dimensional and invariant under $M$; it is spanned by the two missing eigenvectors. Finding the eigenvalues should be easier in just two dimensions.
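As a numerical illustration of this idea (a sketch using the matrix $M$ from the question with arbitrarily chosen test angles): for a symmetric matrix the range is the orthogonal complement of the null space, so an orthonormal basis of the complement can be read off an SVD, and restricting $M$ to that basis gives a $2 \times 2$ matrix whose eigenvalues are the non-zero pair.

```python
import numpy as np

# Arbitrary test angles, chosen only for this illustration.
x, y, z = 0.3, 0.7, 1.1
cp = lambda a, b: np.cos((a + b) / 4)
cm = lambda a, b: np.cos((a - b) / 4)
M = 2 * np.array([
    [1.0,      cp(x, y), cp(x, z), cp(y, z)],
    [cp(x, y), 1.0,      cm(y, z), cm(x, z)],
    [cp(x, z), cm(y, z), 1.0,      cm(x, y)],
    [cp(y, z), cm(x, z), cm(x, y), 1.0],
])

# For symmetric M, range(M) = null(M)^perp.  The left singular vectors
# whose singular values are non-negligible form an orthonormal basis
# V of that complement.
U, s, _ = np.linalg.svd(M)
V = U[:, s > 1e-8 * s.max()]    # 4x2 orthonormal basis of null(M)^perp

B = V.T @ M @ V                 # restriction of M to the complement (2x2)
nonzero = np.linalg.eigvalsh(B) # the two non-zero eigenvalues of M
print(nonzero)
```

For a symbolic version, one would instead build the complement basis from the (symbolic) null-space vectors, e.g. via Gram-Schmidt in SymPy, and diagonalize the resulting $2 \times 2$ matrix by the quadratic formula; the numerical SVD above just demonstrates that the restricted problem carries exactly the two missing eigenvalues.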



















• hi, thanks for the input. Would you happen to know how to do this numerically? I can calculate the null space of $M$ easily (I get 2 vectors that should span it); how do I get the orthogonal complement from that, and how do I link it to the missing eigenvalues? It would be nice if you could show that your method reproduces my $\lambda$ expressions. – Troy, Dec 5 '18 at 1:23












• I'm glad this might be helpful, but I don't know how to do the next steps. Gram-Schmidt would find the orthogonal complement, though I think that's not numerically stable; a lot is known about numerical linear algebra, however. Perhaps get as far as you can and ask a new question (linking to this one), or seriously edit this one to show progress and the new sticking point. – Ethan Bolker, Dec 5 '18 at 1:33









































































































answered Dec 5 '18 at 1:15 by Ethan Bolker, edited Dec 5 '18 at 1:34

























































































