If the system $Ax = b$ is consistent for every $n \times 1$ matrix $b$, then $A$ is invertible. [closed]

I have trouble understanding the proof here. I spent an hour trying to understand it, but I gave up. Can anyone help me with it?

linear-algebra systems-of-equations






asked Oct 7 '15 at 2:14 by Zhi J Teoh, edited Nov 30 '18 at 21:45 by Martin Sleziak



closed as off-topic by user1551, Saad, Adrian Keister, user10354138, José Carlos Santos Nov 30 '18 at 23:59


This question appears to be off-topic. The users who voted to close gave this specific reason:


  • "This question is missing context or other details: Please improve the question by providing additional context, which ideally includes your thoughts on the problem and any attempts you have made to solve it. This information helps others identify where you have difficulties and helps them write answers appropriate to your experience level." – user1551, Saad, Adrian Keister, user10354138, José Carlos Santos

If this question can be reworded to fit the rules in the help center, please edit the question.




  • 1) Yes, that's exactly what it is. The point is that if you can solve $A\mathbf{x}=\mathbf{b}$ for any $\mathbf{b}$, you can solve it for $\mathbf{b}$ equal to any of the columns of the identity matrix. 2) They mean to build a matrix $C$ out of the columns that are the solutions from part 1). Since matrix multiplication is done one column at a time, the product $AC$ has exactly the right columns to form the identity matrix, so they have produced an inverse for $A$ (this is spelled out as an identity just after these comments).
    – John Brevik, Oct 7 '15 at 2:23

  • I think I kind of get it. The matrix $C$ isn't just a $1 \times n$ matrix, right? It's an $n \times n$ matrix, where each column has $n$ rows, basically $n$ unknowns. We take the matrix $A$ and multiply it by one of these columns to get a column of the identity matrix, then repeat to get the other columns of the identity matrix, and then we combine the results to form an augmented matrix, which really is the identity matrix. From there we prove the rest. Am I right?
    – Zhi J Teoh, Oct 7 '15 at 3:29

  • Yeah, I think that's pretty much the point. I wouldn't really call it an augmented matrix, but otherwise you're right on.
    – John Brevik, Oct 7 '15 at 11:54
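To spell out the "one column at a time" point from the first comment, write $e_1, \dots, e_n$ for the columns of the $n \times n$ identity matrix. If $Ax_i = e_i$ for each $i$, and $C$ is the matrix whose columns are $x_1, \dots, x_n$, then multiplying column by column gives
\begin{equation}
AC = A \begin{bmatrix} x_1 & x_2 & \cdots & x_n \end{bmatrix} = \begin{bmatrix} Ax_1 & Ax_2 & \cdots & Ax_n \end{bmatrix} = \begin{bmatrix} e_1 & e_2 & \cdots & e_n \end{bmatrix} = I.
\end{equation}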

1 Answer
At the beginning, they're saying that since $Ax = b$ is consistent no matter which vector $b$ is, in particular the system
\begin{equation}
Ax = \begin{bmatrix} 1 \\ 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}
\end{equation}
is consistent. In other words, there exists a vector $x_1$ such that
\begin{equation}
Ax_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}.
\end{equation}

And, since $Ax = b$ is consistent no matter what $b$ is, it must be true that the system
\begin{equation}
Ax = \begin{bmatrix} 0 \\ 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix}
\end{equation}
is consistent. In other words, there exists a vector $x_2$ such that
\begin{equation}
Ax_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix}.
\end{equation}
And so on. Now take these vectors $x_1, x_2, \ldots, x_n$ and make them the columns of a matrix $C$:
\begin{equation}
C = \begin{bmatrix} x_1 & x_2 & \cdots & x_n \end{bmatrix}.
\end{equation}
You can check that $AC = I$, the identity matrix. So $A$ is invertible.
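As a concrete sanity check of this construction, here is a minimal NumPy sketch (the particular $3 \times 3$ matrix $A$ is just an arbitrary invertible example): it solves $Ax_i = e_i$ for each column $e_i$ of the identity matrix, stacks the solutions as the columns of $C$, and confirms numerically that $AC = I$.

```python
import numpy as np

# Arbitrary invertible matrix A (illustrative choice; the argument applies to
# any A for which Ax = b is solvable for every b).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]
I = np.eye(n)

# Solve A x_i = e_i for each column e_i of the identity matrix, and use the
# solutions x_1, ..., x_n as the columns of C.
columns = [np.linalg.solve(A, I[:, i]) for i in range(n)]
C = np.column_stack(columns)

# By construction, the i-th column of AC is A x_i = e_i, so AC should equal I.
print(np.allclose(A @ C, I))  # prints True
```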






answered Oct 7 '15 at 2:25 by littleO














