Bounding the operator norm of a matrix using eigenvalues
I have a symmetric matrix $A$, and I have managed to show that all eigenvalues of $A$ lie in $[-c,c]$. I want to prove that the operator norm of $A$ is at most $c$, i.e. $\|A\| \leq c$.
I could prove that $\|A\|^2 \leq c^2$ as follows:
From the hypothesis, we have $cI - A \succeq 0$ and $cI + A \succeq 0$, since the hypothesis states that $-c\,x^T x \leq x^T A x \leq c\,x^T x$. Now $(cI-A)(cI+A) = (cI+A)(cI-A) = c^2 I - A^2 \succeq 0$.
So $x^T A^2 x \leq c^2\, x^T x \implies \|Ax\|^2 \leq c^2 \|x\|^2$.
However, I don't see how to get the bound of $c$.
eigenvalues-eigenvectors norm spectral-theory
asked Dec 25 '18 at 6:28, edited Dec 25 '18 at 7:16 – User Not Found
3 Answers
Your $\preceq$'s are backwards. The hypothesis implies that $cI-A$ and $cI+A$ are both positive semidefinite, not negative semidefinite. So their product is also positive semidefinite, and this gives you the inequality $x^T A^2 x \le c^2 x^T x$. Since $x^T A^2 x = x^T A^T A x = (Ax)^T (Ax) = \|Ax\|^2$, this implies $\|Ax\|^2 \le c^2 \|x\|^2$, as desired; taking square roots gives $\|Ax\| \le c\|x\|$ for every $x$, i.e. $\|A\| \le c$. (What you wrote seemed to have lost some squares.)
The conclusion $\|Ax\| \ge c^2 \|x\|$ (or even $\|Ax\|^2 \ge c^2 \|x\|^2$) is clearly wrong, as you can see by taking $A$ to be the zero matrix.
answered Dec 25 '18 at 6:38 – Nate Eldredge
Comment: Oh wait... this just proves my claim. Thanks for going through it and correcting it. – User Not Found, Dec 25 '18 at 7:19
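A quick numerical sanity check of this argument (a sketch using NumPy; the matrix size, the seed, and taking $c$ to be the largest $|\lambda|$ are arbitrary choices made so the hypothesis holds by construction):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric matrix; choose c = max |eigenvalue|, so that
# "all eigenvalues of A lie in [-c, c]" holds by construction.
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2
c = np.max(np.abs(np.linalg.eigvalsh(A)))

# c^2 I - A^2 should be positive semidefinite (up to round-off) ...
assert np.all(np.linalg.eigvalsh(c**2 * np.eye(5) - A @ A) >= -1e-10)

# ... and hence ||Ax|| <= c ||x|| for every x.
for _ in range(1000):
    x = rng.standard_normal(5)
    assert np.linalg.norm(A @ x) <= c * np.linalg.norm(x) + 1e-10
```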
If your definition of operator norm is "maximum singular value," then simply note that the eigenvalues of $A^\top A$ lie in $[0, c^2]$, so the singular values lie in $[0, c]$.
If your definition of operator norm is $\max_{x : \|x\| = 1} \|Ax\|$, then write $A = U\Lambda U^\top$ where $U$ is orthogonal and $\Lambda$ is diagonal. Then
$$\max_{x : \|x\|=1} \|Ax\|^2 = \max_{x : \|x\|=1} x^\top A^\top A x = \max_{x : \|x\|=1} x^\top U\Lambda^2 U^\top x \overset{y=U^\top x}{=} \max_{y : \|y\|=1} y^\top \Lambda^2 y = \max_{y : \sum_i y_i^2 = 1} \sum_{i=1}^n y_i^2 \lambda_i^2.$$
WLOG let $\lambda_1^2 \ge \cdots \ge \lambda_n^2$. By Hölder's inequality, $\sum_{i=1}^n y_i^2 \lambda_i^2 \le \left(\sum_{i=1}^n y_i^2\right) \lambda_1^2$, with equality when $y = (1, 0, \ldots, 0)$.
answered Dec 25 '18 at 6:38, edited Dec 25 '18 at 6:43 – angryavian
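The two definitions can be checked to agree numerically (a sketch, assuming NumPy is available; `np.linalg.norm(A, 2)` computes the spectral norm as the largest singular value):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((6, 6))
A = (B + B.T) / 2  # symmetric

eigs = np.linalg.eigvalsh(A)
svals = np.linalg.svd(A, compute_uv=False)

# For a real symmetric matrix the singular values are the |eigenvalues|,
# so both definitions of the operator norm give max_i |lambda_i|.
assert np.allclose(np.sort(np.abs(eigs)), np.sort(svals))
assert np.isclose(np.linalg.norm(A, 2), np.max(np.abs(eigs)))
```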
If $A$ is symmetric (and real) then it has an orthonormal basis of eigenvectors $v_k$.
Then any unit $x$ can be written as $x = \sum_k x_k v_k$ with $\sum_k |x_k|^2 = 1$, and so, using the orthonormality of the $v_k$,
$\|Ax\|^2 = \|A(\sum_k x_k v_k)\|^2 = \|\sum_k x_k A v_k\|^2 = \|\sum_k x_k \lambda_k v_k\|^2 = \sum_k |x_k|^2 |\lambda_k|^2 \le c^2 \sum_k |x_k|^2 = c^2.$
answered Dec 25 '18 at 7:08 – copper.hat
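The eigenbasis expansion above can be verified numerically as well (a sketch using NumPy; `np.linalg.eigh` returns the orthonormal eigenvectors $v_k$ as the columns of `V`):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                 # real symmetric

lam, V = np.linalg.eigh(A)        # columns of V: orthonormal eigenvectors v_k
c = np.max(np.abs(lam))

x = rng.standard_normal(5)
x /= np.linalg.norm(x)            # unit vector
xk = V.T @ x                      # coordinates x_k of x in the eigenbasis

# ||Ax||^2 = sum_k x_k^2 lambda_k^2 <= c^2
lhs = np.linalg.norm(A @ x) ** 2
assert np.isclose(lhs, np.sum(xk**2 * lam**2))
assert lhs <= c**2 + 1e-10
```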
Your Answer
StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");
StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});
function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});
}
});
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3051889%2fbounding-operator-norm-of-matrix-using-eigen-values%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
3 Answers
3
active
oldest
votes
3 Answers
3
active
oldest
votes
active
oldest
votes
active
oldest
votes