Proving that $\det(A) \ne 0$ with $A$ satisfying the following conditions
I am given $A \in M_n(\mathbb{R})$ which satisfies the following conditions:
$A_{i,i} > 0$ for all $1 \le i \le n$
$A_{i,j} \le 0$ for all distinct $1 \le i, j \le n$
$\sum_{j=1}^n A_{i,j} > 0$ for all $1 \le i \le n$
Then, I am supposed to show that $\det(A) \ne 0$.
Now, I am frankly not sure where to even start. However, I was given the following hint:
If not, there is a non-zero solution of $Ax = 0$. If $x_i$ has the largest absolute value, show that the $i$th linear equation from $Ax = 0$ leads to a contradiction.
I don't quite see how to apply this hint either. Could someone help? Thanks.
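For concreteness, a quick numerical sanity check (a minimal sketch using NumPy; the specific $3 \times 3$ matrix is just an illustration) shows that one such matrix does have a non-zero determinant, though of course that is not a proof:

```python
import numpy as np

# Illustrative matrix: positive diagonal, non-positive off-diagonal entries,
# and every row sum strictly positive.
A = np.array([[ 3.0, -1.0, -1.0],
              [-0.5,  2.0, -1.0],
              [-1.0, -0.5,  2.0]])

assert np.all(np.diag(A) > 0)                 # condition 1: positive diagonal
assert np.all(A - np.diag(np.diag(A)) <= 0)   # condition 2: off-diagonal entries <= 0
assert np.all(A.sum(axis=1) > 0)              # condition 3: strictly positive row sums

print(np.linalg.det(A))  # prints 6.25 here, non-zero as the claim predicts
```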
linear-algebra abstract-algebra matrices determinant
asked 2 days ago
dmsj djsl
It looks like a Laplacian matrix with positive weights. But a Laplacian matrix $\mathbf{L}$ satisfies $\sum_{i} L_{ij} = \sum_{j} L_{ij} = 0$, which gives $\det(\mathbf{L}) = 0$.
– K_inverse
2 days ago
@dmsj djsl If $\det(A) = 0$ then the columns of $A$ are linearly dependent, as suggested by the hint. I'd try to use this in conjunction with the three conditions to get a contradiction.
– AnyAD
2 days ago
Google "diagonally dominant matrix".
– darij grinberg
2 days ago
1 Answer
Suppose $Ax = 0$ with $x \neq 0$, and let $i$ be an index such that $|x_i| \geq |x_j|$ for all $j \neq i$, i.e. $x_i$ is a coordinate of largest absolute value. Since $x \neq 0$, we have $|x_i| > 0$.
Note that $Ax = 0$ implies that $A_i \cdot x = 0$, where $A_i$ denotes the $i$th row of $A$, viewed as a vector. This follows from the definition of matrix multiplication.
Now $A_i \cdot x = \sum_{j} A_{ij}x_j$. Since $|x_i| \geq |x_j|$ for all $j$, write $$A_i \cdot x = A_{ii}x_i + \sum_{j \neq i} A_{ij}x_j$$ and use the inequality $|x+y| \geq |x| - |y|$ to see that
$$
|A_i \cdot x| \geq |A_{ii}x_i| - \left|\sum_{j \neq i} A_{ij}x_j\right|.
$$
Since $A_{ij} \leq 0$ for $j \neq i$ and $|x_j| \leq |x_i|$, it follows that $$\left|\sum_{j \neq i} A_{ij}x_j\right| \leq \sum_{j \neq i} (-A_{ij})|x_j| \leq -|x_i|\sum_{j \neq i}A_{ij}.$$
Therefore, using $|A_{ii}x_i| = A_{ii}|x_i|$ and the strictly positive row sums,
$$
|A_i \cdot x| \geq A_{ii}|x_i| - \left|\sum_{j \neq i} A_{ij}x_j\right| \geq |x_i| \sum_{j} A_{ij} > 0.
$$
This is a contradiction, since $A_i \cdot x = 0$. Consequently, no non-zero $x$ with $Ax = 0$ exists, and so $\det(A) \neq 0$.
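To make the inequality chain concrete, here is a small numerical check (a sketch using NumPy with an illustrative matrix of my own; not part of the argument itself): for random vectors $x$, the row corresponding to a largest coordinate always satisfies the derived lower bound $|A_i \cdot x| \geq |x_i| \sum_j A_{ij}$, which is strictly positive unless $x = 0$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative matrix satisfying the three conditions of the question.
A = np.array([[ 3.0, -1.0, -1.0],
              [-0.5,  2.0, -1.0],
              [-1.0, -0.5,  2.0]])
row_sums = A.sum(axis=1)

for _ in range(10_000):
    x = rng.normal(size=3)
    i = np.argmax(np.abs(x))          # index of a coordinate of largest absolute value
    lhs = abs(A[i] @ x)               # |A_i . x|
    rhs = abs(x[i]) * row_sums[i]     # |x_i| * sum_j A_ij, the lower bound from the proof
    assert lhs >= rhs - 1e-12         # the bound is never violated
```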
More can be said. Indeed, the Gerschgorin circle theorem guarantees that every eigenvalue lies within a Gerschgorin disc, whose centre is one of the diagonal entries and whose radius is the sum of the absolute values of the off-diagonal entries of that row. Under the given conditions, the disc for row $i$ is centred at $A_{ii}$ with radius $-\sum_{j \neq i} A_{ij}$, so every eigenvalue has real part at least $\min_i \sum_{j} A_{ij} > 0$. In particular $0$ is not an eigenvalue, and the result is clear this way too.
Also, any matrix satisfying the given conditions is strictly diagonally dominant with positive diagonal, and strictly diagonally dominant matrices are non-singular by the Gerschgorin circle theorem (this is the Lévy–Desplanques theorem, which has applications in probability).
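As a numerical illustration of the Gerschgorin bound (again a sketch with NumPy and an illustrative matrix, not part of the original argument): every eigenvalue's real part is at least the smallest row sum, so none of them can be $0$.

```python
import numpy as np

# Illustrative matrix satisfying the three conditions.
A = np.array([[ 3.0, -1.0, -1.0],
              [-0.5,  2.0, -1.0],
              [-1.0, -0.5,  2.0]])

centers = np.diag(A)                         # disc centres A_ii
radii = np.abs(A).sum(axis=1) - centers      # disc radii: sum of |off-diagonal| in each row
eigvals = np.linalg.eigvals(A)

# Every eigenvalue lies in some disc, so its real part is at least
# min_i (A_ii - R_i) = min_i sum_j A_ij > 0; hence 0 is not an eigenvalue.
print(eigvals, (centers - radii).min())
assert eigvals.real.min() >= (centers - radii).min() - 1e-12
```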
answered 2 days ago
астон вілла олоф мэллбэрг