Questions about left and right eigenvectors of a symmetric complex matrix

Recently, a question about right and left eigenvectors has been bothering me a lot.



Here $R$ is the matrix of right eigenvectors of $A$: every column is a right eigenvector of $A$. $L$ is the matrix of left eigenvectors of $A$: every row is a left eigenvector of $A$. And $a$ is the diagonal matrix composed of the eigenvalues of $A$. $A$ is a symmetric complex matrix.





  1. $AR=Ra$,




    • We can take the transpose of both sides (using $A=A^{T}$): $R^{T}A=aR^{T}$.

    • We can also multiply both sides by $R^{-1}$ on the left and on the right: $R^{-1}A=aR^{-1}$.

    • By definition of the left eigenvector matrix, $LA=aL$.

    • So, combining the points above, we get $R^{T}=R^{-1}=L$ (a quick numerical check of the first two identities is sketched after this list).




  2. $LA=aL$




    • We can take the conjugate transpose of both sides: $A^{\dagger}L^{\dagger}=L^{\dagger}a^{*}$.

    • So we can obtain the left eigenvector matrix simply by computing the right eigenvectors of $A^{\dagger}$ and then taking the conjugate transpose of $L^{\dagger}$.
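
A minimal numerical sketch of the checks above (NumPy and `numpy.linalg.eig` are my assumption here; the matrix used is the example $A$ given further below):

```python
import numpy as np

# The complex symmetric example matrix from the question (A = A^T, but A is not Hermitian).
A = np.array([[1 + 1j,    1 - 1j,    1 - 3.5j],
              [1 - 1j,    5 + 5.5j, -1.5 + 3j],
              [1 - 3.5j, -1.5 + 3j,  1.1 + 2j]])

# Right eigen-decomposition A R = R a: the columns of R are right eigenvectors.
eigvals, R = np.linalg.eig(A)
a = np.diag(eigvals)

# Transposed relation R^T A = a R^T (this step uses A = A^T).
print(np.allclose(R.T @ A, a @ R.T))      # True

# Inverse relation R^{-1} A = a R^{-1}.
R_inv = np.linalg.inv(R)
print(np.allclose(R_inv @ A, a @ R_inv))  # True
```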




Both methods seem to be correct, so the answers obtained in the two ways should be consistent for the same eigenvalue (in the second method we recover the same $a$ by taking the conjugate).
$$A=\begin{pmatrix}1+i & 1-i & 1-3.5i\\ 1-i & 5+5.5i & -1.5+3i\\ 1-3.5i & -1.5+3i & 1.1+2i\end{pmatrix}$$
However, the results I got are totally different. $L_{\text{one}}$ is obtained from the inverse of the right eigenvector matrix, and $L_{\text{two}}$ from the second method.




  1. Method 1:
    $$a=\{0.364039+7.3815i,\ 4.52991+3.24693i,\ 2.20606-2.12843i\}$$
    $$R=\begin{pmatrix}-0.491206-0.0760069i & 0.466312-0.324685i & 0.655799\\ 0.403372-0.138801i & 0.800166 & -0.250746+0.338948i\\ 0.716577 & -0.101062-0.121558i & 0.668775+0.119451i\end{pmatrix}$$
    $$L_{\text{one}}=\begin{pmatrix}-0.555223-0.26064i & 0.543653+0.0197978i & 0.732705-0.143397i\\ 0.664945-0.222459i & 0.952185+0.36692i & -0.127662-0.10407i\\ 0.776518+0.227821i & -0.45381+0.288362i & 0.707731-0.0116924i\end{pmatrix}$$


  2. Method 2:
    $$a^{*}=\{0.364039-7.3815i,\ 4.52991-3.24693i,\ 2.20606+2.12843i\}$$
    $$R=\begin{pmatrix}-0.491206-0.0760069i & 0.466312-0.324685i & 0.655799\\ 0.403372-0.138801i & 0.800166 & -0.250746+0.338948i\\ 0.716577 & -0.101062-0.121558i & 0.668775+0.119451i\end{pmatrix}$$
    $$L_{\text{two}}^{\dagger}=\begin{pmatrix}-0.491206-0.0760069i & 0.403372-0.138801i & 0.716577\\ 0.466312-0.324685i & 0.800166 & -0.101062-0.121558i\\ 0.655799 & -0.250746+0.338948i & 0.668775+0.119451i\end{pmatrix}$$



Can anybody help me solve this problem and point out my mistake? I would really appreciate it!
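
For reference, here is a minimal sketch of the two methods themselves (again assuming NumPy's `numpy.linalg.eig`; the software actually used above may differ). It checks that every resulting row satisfies $\ell A=\lambda\ell$ and then compares the two answers row by row:

```python
import numpy as np

A = np.array([[1 + 1j,    1 - 1j,    1 - 3.5j],
              [1 - 1j,    5 + 5.5j, -1.5 + 3j],
              [1 - 3.5j, -1.5 + 3j,  1.1 + 2j]])

# Method 1: L_one = R^{-1}, where the columns of R are the right eigenvectors of A.
eigvals, R = np.linalg.eig(A)
L_one = np.linalg.inv(R)

# Method 2: right eigenvectors of A^dagger, then the conjugate transpose.
eigvals2, V = np.linalg.eig(A.conj().T)   # A^dagger V = V a^*
L_two = V.conj().T                        # rows are left eigenvectors of A

# Every row of both matrices really is a left eigenvector of A.
print(np.allclose(L_one @ A, np.diag(eigvals) @ L_one))          # True
print(np.allclose(L_two @ A, np.diag(eigvals2.conj()) @ L_two))  # True

# Rows belonging to the same eigenvalue are proportional but need not be equal,
# because each method fixes the arbitrary scale of an eigenvector differently.
for i, lam in enumerate(eigvals):
    j = int(np.argmin(np.abs(eigvals2.conj() - lam)))  # match rows by eigenvalue
    ratio = L_one[i] / L_two[j]                        # assumes no zero entries here
    print(np.allclose(ratio, ratio[0]))                # True: same row up to a scale
```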










eigenvalues-eigenvectors

asked Jan 8 at 22:31 by mqy, edited Jan 15 at 16:34

  • Please take the time to enter your work as text instead of linking to a picture of it. Images are neither searchable nor accessible to screen readers, nor do they show up in question summaries. – amd, Jan 9 at 5:31

  • Have you actually tested your two results to see whether or not the rows are in fact independent eigenvectors of $A$? You do remember that eigenvectors are not unique, don’t you? There’s no particular reason to expect that the left eigenvectors produced by these two different methods will be identical. – amd, Jan 9 at 7:53

  • @amd Thank you for your help! Since they correspond to the same eigenvalues ($L^{\dagger}$ can be transformed back to $L$), shouldn't $L_1$ and $L_2$ (the left eigenvectors from the two methods) be linearly dependent? I thought the results are all normalized, so they should have the same values. – mqy, Jan 14 at 22:16

  • This is really hard to understand. Maybe you can put a sentence or two in the beginning to help us have some context on what assumptions about $A$, $R$, $L$, and $a$ you are making? – Morgan Rodgers, Jan 14 at 22:16

  • @MorganRodgers I'm sorry about the unclear explanation. I have edited it again. Thank you very much! – mqy, Jan 14 at 22:22