Proving that $\det(A) \ne 0$ with $A$ satisfying the following conditions
I am given $A \in M_n(\mathbb{R})$ which satisfies the following conditions.

  1. $A_{i,i} \gt 0$ for all $1 \le i \le n$

  2. $A_{i,j} \le 0$ for all distinct $1 \le i, j \le n$

  3. $\sum_{j=1}^n A_{i,j} \gt 0$ for all $1 \le i \le n$

Then, I am supposed to show that $\det(A) \ne 0$.



Now, I am frankly not sure where to even start. However, I was given the following hint:

If not, there is a non-zero solution of $Ax = 0$. If $x_i$ has the largest absolute value, show that the $i$th linear equation from $Ax = 0$ leads to a contradiction.

I don't quite get how to apply this hint either. Could someone help? Thanks.










linear-algebra abstract-algebra matrices determinant

asked 2 days ago by dmsj djsl
  • It looks like a Laplacian matrix with positive weights. But a Laplacian matrix $\mathbf{L}$ satisfies $\sum_{i} L_{ij} = \sum_{j} L_{ij} = 0$, which gives $\det(\mathbf{L}) = 0$.
    – K_inverse
    2 days ago
  • @dmsj djsl If $\det(A) = 0$ then the columns of $A$ are linearly dependent, as suggested by the hint. I'd try to use this in conjunction with the three conditions to get a contradiction.
    – AnyAD
    2 days ago
  • Google "diagonally dominant matrix".
    – darij grinberg
    2 days ago
1 Answer
Let $Ax = 0$. Then, let $i = \arg\max_j |x_j|$, i.e. $i$ is such that $|x_i| \geq |x_j|$ for all $j \neq i$. By assumption, if $x \neq 0$ then $|x_i| > 0$.



Note that $Ax = 0$ implies that $A_i \cdot x = 0$, where $A_i$ denotes the $i$th row of $A$, viewed as a vector. This follows from the definition of matrix multiplication.



However, $A_i \cdot x = \sum_{j} A_{ij}x_j$. By definition, we have $|x_i| \geq |x_j|$ for all $j$, so write $$A_i \cdot x = A_{ii}x_i + \sum_{j \neq i} A_{ij}x_j$$ and use the inequality $|x+y| \geq |x| - |y|$ to see that
$$
|A_i \cdot x| \geq |A_{ii}x_i| - \left|\sum_{j \neq i} A_{ij}x_j\right|
$$



But we know that $|x_j| \leq |x_i|$ and $A_{ij} \leq 0$ for $j \neq i$, so $|A_{ij}x_j| = -A_{ij}|x_j|$ and it follows that $$\left|\sum_{j \neq i} A_{ij}x_j\right| \leq \sum_{j \neq i} -A_{ij}|x_j| \leq -|x_i|\sum_{j \neq i}A_{ij}.$$



Therefore,
$$
|A_i \cdot x| \geq |x_i|A_{ii} - \left|\sum_{j \neq i} A_{ij}x_j\right| \geq |x_i| \times \sum_{j} A_{ij} > 0,
$$
which is a contradiction, since $A_i \cdot x = 0$. Consequently, no such $x$ exists.
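
Not part of the proof, but here is a quick numerical sanity check (a Python/NumPy sketch; the random construction and seed are arbitrary) that builds a matrix satisfying the three conditions and confirms its determinant is nonzero:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 5

    # Off-diagonal entries <= 0 (condition 2).
    A = -rng.random((n, n))
    np.fill_diagonal(A, 0.0)

    # Make each diagonal entry large enough that the row sum is
    # strictly positive (conditions 1 and 3).
    slack = rng.random(n) + 0.1
    np.fill_diagonal(A, -A.sum(axis=1) + slack)

    assert np.all(np.diag(A) > 0)                  # condition 1
    assert np.all(A[~np.eye(n, dtype=bool)] <= 0)  # condition 2
    assert np.all(A.sum(axis=1) > 0)               # condition 3

    print(np.linalg.det(A))  # nonzero, as the argument above shows it must be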





More can be said. Indeed, the Gerschgorin circle theorem guarantees that every eigenvalue lies within a Gerschgorin disc, whose centre is a diagonal entry and whose radius is the sum of the absolute values of the off-diagonal entries of that row. Here the left endpoint of the $i$th disc is $A_{ii} - \sum_{j \neq i} |A_{ij}| = \sum_{j} A_{ij}$, so the theorem gives that the real part of every eigenvalue is at least the smallest value of $\sum_{j} A_{ij}$, which is greater than $0$. So this way the result is clear.
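
To see the disc bound concretely, here is a small NumPy sketch (the matrix is just an arbitrary example satisfying the three conditions): the left endpoints of the Gerschgorin discs are exactly the row sums, and every eigenvalue stays to their right.

    import numpy as np

    # An arbitrary example matrix satisfying the three conditions.
    A = np.array([[ 3., -1., -1.],
                  [-1.,  4., -2.],
                  [ 0., -1.,  2.]])

    radii = np.abs(A).sum(axis=1) - np.abs(np.diag(A))  # Gerschgorin radii
    left_ends = np.diag(A) - radii                      # = row sums, all > 0

    print(left_ends)                                    # [1. 1. 1.]
    print(np.linalg.eigvals(A).real.min())              # >= left_ends.min() > 0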



Also, a matrix satisfying the given conditions is strictly diagonally dominant, since the positive row sums give $A_{ii} > \sum_{j \neq i} |A_{ij}|$, and by the Gerschgorin circle theorem every strictly diagonally dominant matrix is non-singular (this is known as the Levy-Desplanques theorem, and it has applications in probability).
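
The dominance check itself is a one-liner; a minimal sketch (the helper name is mine, not a library function):

    import numpy as np

    def is_strictly_diagonally_dominant(A):
        """Return True if |A[i,i]| > sum of |A[i,j]|, j != i, for every row."""
        off_diag_sums = np.abs(A).sum(axis=1) - np.abs(np.diag(A))
        return bool(np.all(np.abs(np.diag(A)) > off_diag_sums))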






answered 2 days ago
астон вілла олоф мэллбэрг