Proof verification for Identity matrices























So I have the following question:



Analyze the following 'Claim' (which may or may not be true) and the corresponding 'Proof', by writing 'TRUE' or 'FALSE' (together with the reason) for each step. [Note: $I_n$ is the $n \times n$ identity matrix.]



Claim: Let $A$ be any $n \times n$ matrix satisfying $A^2=I_n$. Then either $A=I_n$ or $A=-I_n$.



'Proof'.



Step 1: $A$ satisfies $A^2-I_n = 0$ (True or False)



True.



My reasoning: Clearly, this is true. $A^2=I_n$ does not hold for every matrix, but since it holds here by assumption, I should have no problem moving the identity matrix to the LHS.



Step 2: So $(A+I_n)(A-I_n)=0$ (True or false)



True.



My reasoning: Because $I_n$ is the identity matrix, there should be no issues with factoring, just as in ordinary algebra.



Step 3: $A+I_n=0$ or $A-I_n=0$



I'm not sure about this part. I'm very tempted to say this is fine, but I don't see how to justify it, if it can be justified at all.



Therefore $A=-I_n$ or $A=I_n$. (End of 'Proof'.)



Is what I am doing right so far or am I messing up somewhere?




























  • 1

    What's the square of $A = \begin{pmatrix}1 & 0 \\ 0 & -1\end{pmatrix}$?
    – md2perpe
    Nov 30 at 17:55















linear-algebra matrices proof-verification






asked Nov 30 at 7:51 by Future Math person
3 Answers

12 votes (accepted) – answered Nov 30 at 7:57 by Siong Thye Goh (edited Dec 1 at 5:08):
  • Rather than saying that we "move the identity to the LHS", it is more precise to say that we add $-I$ to both sides.


  • We have $A^2-I=(A-I)(A+I)$; we just have to expand the right-hand side to verify that.


  • For matrices, $AB=0$ doesn't imply that $A=0$ or $B=0$. For example $$\begin{bmatrix} 2 & 0 \\ 0 & 0\end{bmatrix}\begin{bmatrix} 0 & 0 \\ 0 & -2\end{bmatrix}= \begin{bmatrix} 0 & 0 \\ 0 & 0\end{bmatrix}$$


  • In particular,



$$\left(\begin{bmatrix} 1 & 0 \\ 0 & -1\end{bmatrix}+\begin{bmatrix} 1 & 0 \\ 0 & 1\end{bmatrix}\right)\left(\begin{bmatrix} 1 & 0 \\ 0 & -1\end{bmatrix}-\begin{bmatrix} 1 & 0 \\ 0 & 1\end{bmatrix}\right)= \begin{bmatrix} 0 & 0 \\ 0 & 0\end{bmatrix}$$



so we cannot conclude that $(A+I)(A-I)=0$ implies $A+I=0$ or $A-I=0$.
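The zero-divisor phenomenon in the two displays above is easy to confirm numerically. Here is a minimal sketch in Python using plain nested lists (no external libraries; the helper name `matmul` is mine):

```python
def matmul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

I2 = [[1, 0], [0, 1]]
A = [[1, 0], [0, -1]]  # satisfies A^2 = I, yet A != I and A != -I

ApI = [[A[i][j] + I2[i][j] for j in range(2)] for i in range(2)]  # A + I
AmI = [[A[i][j] - I2[i][j] for j in range(2)] for i in range(2)]  # A - I

print(matmul(A, A))      # [[1, 0], [0, 1]] -- the identity
print(matmul(ApI, AmI))  # [[0, 0], [0, 0]] -- zero, though neither factor is zero
```

The product of the two nonzero factors $A+I$ and $A-I$ is the zero matrix, which is exactly why Step 3 of the 'Proof' fails.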






  • Aha. I knew something looked fishy with that last part. Thanks!
    – Future Math person
    Nov 30 at 7:59

  • 2

    When expanding the right-hand side, be sure to respect the non-commutativity of matrix multiplication. $$(A+I)(A-I) = (A+I)A - (A+I)I$$ $$\qquad = A^2 + IA - AI - I^2 = A^2 - I^2.$$ Although the difference in this particular problem is negligible, it is not generally so.
    – Eric Towers
    Nov 30 at 13:36

  • 1

    In the third point it should be "that $A=0$ or $B=0$".
    – Lonidard
    Nov 30 at 14:29

  • Thanks for pointing that out.
    – Siong Thye Goh
    Nov 30 at 14:33

  • 1

    I would like this answer better if $A$ and $B$ were of the form $A\pm I$ so that it matches the OP's proof. Using the other answer, you could take $A=\operatorname{diag}(2,0)$ and $B=\operatorname{diag}(0,-2)$, so that they are $\operatorname{diag}(1,-1)\pm I$.
    – Teepeemm
    Nov 30 at 18:45


















9 votes – answered Nov 30 at 7:53 by Kavi Rama Murthy:
Consider any diagonal matrix $A$ with diagonal entries $\pm 1$. Show that $A^{2}=I_n$; this gives $2^{n}$ matrices whose square is $I_n$.
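As a sanity check, one can enumerate all sign-diagonal matrices for a small $n$ and verify that each one squares to $I_n$. A minimal sketch in Python (the variable names are mine):

```python
from itertools import product

def is_identity(M):
    """Return True if the square matrix M equals the identity."""
    n = len(M)
    return M == [[1 if i == j else 0 for j in range(n)] for i in range(n)]

n = 3
square_roots_of_I = 0
for signs in product((1, -1), repeat=n):
    # build the diagonal matrix with the chosen signs on the diagonal
    A = [[signs[i] if i == j else 0 for j in range(n)] for i in range(n)]
    # square it
    A2 = [[sum(A[i][k] * A[k][j] for k in range(n)) for j in range(n)] for i in range(n)]
    if is_identity(A2):
        square_roots_of_I += 1

print(square_roots_of_I)  # 8, i.e. 2**n distinct matrices with A^2 = I_3
```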






2 votes – answered Nov 30 at 15:43 by gota:
    Furthermore, if you want a concrete example of a matrix whose square is the identity but which is not itself of such a simple (diagonal) form, consider this one:

    $$\begin{bmatrix}\frac{1}{2} & \frac{3}{4} \\ 1 & -\frac{1}{2}\end{bmatrix}$$

    Such matrices are called involutory.
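This example can be checked exactly with rational arithmetic; here is a minimal sketch in Python using the standard `fractions` module:

```python
from fractions import Fraction as F

# the matrix from the answer, entered exactly
A = [[F(1, 2), F(3, 4)],
     [F(1),    F(-1, 2)]]

# square it: A2[i][j] = sum_k A[i][k] * A[k][j]
A2 = [[sum(A[i][k] * A[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

print(A2 == [[F(1), F(0)], [F(0), F(1)]])  # True: A^2 = I, so A is involutory
```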





