Why is $\|\Sigma - I\|_2 = \max|\lambda_i|$ where $\lambda_i$ is an eigenvalue of $\Sigma - I$?
$\|\Sigma - I\|_2 = \max|\lambda_i|$, where $\lambda_i$ is an eigenvalue of $\Sigma - I$, $\Sigma$ is a diagonal matrix, and $I$ is the identity matrix.
I know that since $\Sigma - I$ is diagonal, its eigenvalues are the values along the diagonal.
I also know that the singular values of $\Sigma - I$ are the square roots of the eigenvalues of $(\Sigma - I)^T(\Sigma - I) = (\Sigma - I)^2$.
I can't see why the claim is true.
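For concreteness, both facts can be checked numerically. Here is a small sketch (assuming NumPy is available; the diagonal entries of $\Sigma$ are chosen arbitrarily for illustration):

```python
import numpy as np

# Arbitrary diagonal Sigma (entries chosen only for illustration)
Sigma = np.diag([3.0, 0.5, -1.0, 2.0])
A = Sigma - np.eye(4)  # Sigma - I is still diagonal

# Eigenvalues of a diagonal matrix are its diagonal entries
print(np.sort(np.linalg.eigvals(A).real))  # same as the sorted diagonal entries
print(np.sort(np.diag(A)))

# Singular values are the square roots of the eigenvalues of A^T A = A^2
print(np.sort(np.linalg.svd(A, compute_uv=False)))
print(np.sort(np.sqrt(np.linalg.eigvals(A.T @ A).real)))
```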
linear-algebra
edited Dec 22 '18 at 21:40
asked Dec 22 '18 at 21:26 by pablo_mathscobar
Your question is a special case of a more general fact.
– A.Γ.
Dec 22 '18 at 21:38
I don't think this is true. Take $\Sigma=\begin{bmatrix} 2 & 0\\ 0 & 2\end{bmatrix}$ for example. Then $\Sigma-I$ is the identity matrix, which has $2$-norm $\sqrt{2}$ but maximum eigenvalue $1$. I think you should consider another norm on the matrix space (for example the operator norm).
– Levent
Dec 22 '18 at 21:39
@A.Γ. You should give an official answer, even if it is essentially a reference to another question plus Levent's comment.
– Paul Frost
Dec 22 '18 at 22:46
@Levent the matrix $2$-norm is generally defined as $\sup_{x\neq 0} \lVert Ax\rVert_2/\lVert x\rVert_2$ (the operator norm induced by the Euclidean vector norm), while the Frobenius norm is the square root of the sum of the squares of the entries.
– tch
Dec 23 '18 at 16:05
@TylerChen Oh okay, thanks for pointing it out. I thought it meant the Euclidean norm.
– Levent
Dec 23 '18 at 18:09
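To make the norm distinction in the comments concrete, here is a quick numerical check of Levent's example (a sketch, assuming NumPy is available; for a matrix argument `np.linalg.norm(A, 2)` is the spectral/operator $2$-norm and `np.linalg.norm(A, 'fro')` the Frobenius norm):

```python
import numpy as np

Sigma = np.array([[2.0, 0.0],
                  [0.0, 2.0]])
A = Sigma - np.eye(2)  # A is the 2x2 identity matrix

print(np.linalg.norm(A, 2))                # operator 2-norm: 1.0
print(np.linalg.norm(A, 'fro'))            # Frobenius norm: sqrt(2) ~ 1.414
print(np.abs(np.linalg.eigvals(A)).max())  # largest eigenvalue magnitude: 1.0
```

So with the operator $2$-norm the claimed equality does hold for this example; the $\sqrt{2}$ value corresponds to the Frobenius norm.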
1 Answer
As @A.Γ noted, this is a special case of a more general fact about normal matrices, but we can prove it without that fact.
Let $A$ be a diagonal matrix with diagonal entries $\lambda_i$. We take $\lVert A \rVert_2 = \sup_{x\neq 0} \lVert Ax\rVert_2 / \lVert x\rVert_2 = \sup_{\lVert x \rVert_2 = 1} \lVert Ax\rVert_2$.
Taking $x$ to be a unit eigenvector corresponding to a largest-magnitude eigenvalue $\lambda_j$ (so $|\lambda_j| = \max|\lambda_i|$) gives
$$
\lVert Ax\rVert_2 = \lVert \lambda_j x\rVert_2 = |\lambda_j|\,\lVert x\rVert_2 = \max|\lambda_i|.
$$
So, for any matrix, the matrix 2-norm is always at least the size of the largest magnitude eigenvalue.
Now, let $x=[x_1,x_2,\ldots, x_n]^T$ with $\lVert x \rVert_2 = 1$. Then $Ax = [\lambda_1 x_1, \lambda_2 x_2, \ldots, \lambda_n x_n]^T$. Therefore,
$$
\lVert Ax\rVert_2^2 = \sum_{i=1}^{n} \lambda_i^2 x_i^2.
$$
Since $\lVert x \rVert_2 = 1$ we have that $\sum_{i=1}^{n} x_i^2 = 1$, which implies that $0 \leq x_i^2 \leq 1$. Therefore,
$$
\lVert Ax\rVert_2^2 = \sum_{i=1}^{n} \lambda_i^2 x_i^2
\leq \sum_{i=1}^{n} (\max|\lambda_i|)^2 x_i^2
= (\max|\lambda_i|)^2 \sum_{i=1}^{n} x_i^2
= (\max|\lambda_i|)^2.
$$
Therefore, for diagonal matrices, the matrix 2-norm is bounded above by the size of the largest magnitude eigenvalue, and so the two quantities must be equal.
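As a sanity check of this argument, the equality can be verified numerically for an arbitrary diagonal matrix (a sketch, assuming NumPy is available; `np.linalg.norm(A, 2)` returns the largest singular value of $A$):

```python
import numpy as np

rng = np.random.default_rng(0)
d = rng.standard_normal(6)       # arbitrary diagonal entries = eigenvalues
A = np.diag(d)

two_norm = np.linalg.norm(A, 2)  # operator 2-norm = largest singular value
max_eig = np.abs(d).max()        # largest eigenvalue magnitude

print(two_norm, max_eig)
assert np.isclose(two_norm, max_eig)  # the two quantities agree
```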
edited Dec 23 '18 at 16:34
answered Dec 23 '18 at 16:27 by tch