How to prove this very interesting matrix identity?
This is a very interesting identity, but I don't know how to prove it. Note that $A_1,\ldots,A_J,B \in \mathbb{R}^{n \times n}$, and that $m$ is the number of diagonal blocks in $\mathbf{A}$. Consider
$$
\mathbf{A} =
\begin{bmatrix}
A_1 & A_2 & \ldots & A_J & & & & \\
 & & & \ddots \\
 & & & & A_1 & A_2 & \ldots & A_J
\end{bmatrix} \in \mathbb{R}^{nm\times (Jnm)}
$$

$$
\mathbf{B} = \begin{bmatrix}
B & & \\
 & \ddots & \\
 & & B
\end{bmatrix} \in \mathbb{R}^{(Jnm)\times (Jnm)}
$$

$$
\mathbf{A}^\prime =
\left[\begin{smallmatrix}
A_1 & & & & A_2 & & & & \ldots & & & & A_J\\
 & A_1 & & & & A_2 & & & & \ldots & & & & A_J\\
 & & \ddots & & & & \ddots & & & & \ldots & & & & \ddots \\
 & & & A_1 & & & & A_2 & & & & \ldots & & & & A_J
\end{smallmatrix}\right] \in \mathbb{R}^{mn\times (Jmn)}
$$

$$
\mathbf{B}^\prime =
\left[\begin{smallmatrix}
\begin{bmatrix}B \\ 0 \\ \vdots \\ 0\end{bmatrix} & & & & \begin{bmatrix}0 \\ B \\ \vdots \\ 0\end{bmatrix} & & & & \ldots & & & & \begin{bmatrix}0 \\ 0 \\ \vdots \\ B\end{bmatrix}\\
 & \begin{bmatrix}B \\ 0 \\ \vdots \\ 0\end{bmatrix} & & & & \begin{bmatrix}0 \\ B \\ \vdots \\ 0\end{bmatrix} & & & & \ldots & & & & \begin{bmatrix}0 \\ 0 \\ \vdots \\ B\end{bmatrix}\\
 & & \ddots & & & & \ddots & & & & \ldots & & & & \ddots \\
 & & & \begin{bmatrix}B \\ 0 \\ \vdots \\ 0\end{bmatrix} & & & & \begin{bmatrix}0 \\ B \\ \vdots \\ 0\end{bmatrix} & & & & \ldots & & & & \begin{bmatrix}0 \\ 0 \\ \vdots \\ B\end{bmatrix}
\end{smallmatrix}\right] \in \mathbb{R}^{Jmn\times (Jmn)}
$$

It seems to be true that
$$\mathbf{A}\mathbf{B} = \mathbf{A}^\prime \mathbf{B}^\prime.$$

What I guess might work:

From $\mathbf{A}$ to $\mathbf{A}^\prime$ is a column permutation, i.e. $\mathbf{A}^\prime = \mathbf{A}\mathbf{C}$ for some permutation matrix $\mathbf{C}$. If we can also show that $\mathbf{B}^\prime = \mathbf{C}^{-1}\mathbf{B}$, this gives
$$\mathbf{A}^\prime\mathbf{B}^\prime = \mathbf{A}\mathbf{C}\mathbf{C}^{-1}\mathbf{B} = \mathbf{A}\mathbf{B}.$$

But it seems very difficult to write down $\mathbf{C}$ explicitly.

Update:

It seems that $\mathbf{C}$ can be written down, but so far I can only justify it in a hand-waving way.
  • An aside: your matrices need a bit of condensing. Hard to tell what is going on in $B$, and $B'$ goes off my screen completely. I'm not sure which options MathJax supports for doing this though :/
    – Brevan Ellefsen, Jan 8 at 22:01
linear-algebra matrix-decomposition block-matrices
edited Jan 8 at 23:27
asked Jan 8 at 21:48 by ArtificiallyIntelligence
2 Answers
Let $P$ be the $nmJ\times nmJ$ matrix consisting of an $m\times J$ grid of blocks of size $nJ\times nm$, with the blocks labeled $P_{i,j}$ for $i=1,\ldots,m$ and $j=1,\ldots,J$.

Define $P_{i,j}$ to be the block matrix of $J\times J$ blocks of size $n\times n$ whose $(j,i)$ block is the $n\times n$ identity and whose other entries are all zero.

Note that each row and column of $P$ has exactly one $1$, so $P$ is a permutation matrix. Moreover, $A' = AP$ and $B' = P^TB$, so that $A'B' = APP^TB = AB$.

For instance, here is $P$ when $J=3$, $m=4$:
$$
P =
\left[
\begin{array}{cccc|cccc|cccc}
I_n & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & I_n & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & I_n & 0 & 0 & 0 \\ \hline
0 & I_n & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & I_n & 0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & I_n & 0 & 0 \\ \hline
0 & 0 & I_n & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & I_n & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & I_n & 0 \\ \hline
0 & 0 & 0 & I_n & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 0 & I_n & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & I_n
\end{array}
\right]
$$

However, you still need to verify that $A' = AP$ and $B' = P^TB$, which requires just as much work as directly verifying that $AB = A'B'$. Note that you can simply take $I_n = 1$ and $A_i = a_i$, since block-matrix multiplication follows the same rules as ordinary matrix multiplication.
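The block construction above is easy to test numerically. The sketch below (Python with NumPy; the concrete sizes are my own choice) fills in $P$ block by block exactly as described, then checks that $P$ is orthogonal, that $A' = AP$, and that the product $A'B'$ with $B' = P^TB$ recovers $AB$.

```python
# Build the permutation P block by block, as described above:
# an m-by-J grid of (nJ x nm) blocks, where block P_{i,j} holds
# I_n in its (j, i) sub-block. Then check P P^T = I and A' = A P.
import numpy as np

rng = np.random.default_rng(1)
n, m, J = 2, 3, 4
As = [rng.standard_normal((n, n)) for _ in range(J)]
B = rng.standard_normal((n, n))

N = n * m * J
P = np.zeros((N, N))
for i in range(m):
    for j in range(J):
        r0 = i * n * J + j * n   # rows: grid row i, sub-block row j
        c0 = j * n * m + i * n   # cols: grid col j, sub-block col i
        P[r0:r0 + n, c0:c0 + n] = np.eye(n)

A_big = np.kron(np.eye(m), np.hstack(As))        # the A of the question
A_prime = np.hstack([np.kron(np.eye(m), Ak) for Ak in As])
B_big = np.kron(np.eye(J * m), B)

perm_ok = np.allclose(P @ P.T, np.eye(N))        # P is orthogonal
shuffle_ok = np.allclose(A_prime, A_big @ P)     # A' = A P
prod_ok = np.allclose(A_prime @ (P.T @ B_big), A_big @ B_big)
print(perm_ok, shuffle_ok, prod_ok)
```

The last check is of course implied by the first two, but it confirms the chain $A'B' = APP^TB = AB$ end to end on random data.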
edited Jan 9 at 3:02
answered Jan 9 at 2:33 by tch

It seems helpful to rewrite these matrices using Kronecker products. In particular, we have
$$
\mathbf A = I_m \otimes \pmatrix{A_1 & \cdots & A_J} = \sum_{k=1}^J I_m \otimes e_k^T \otimes A_k \\
\mathbf B = I_{Jm} \otimes B = I_m \otimes I_J \otimes B \\
\mathbf A' = \pmatrix{I_m \otimes A_1 & \cdots & I_m \otimes A_J} = \sum_{k=1}^J e_k^T \otimes I_m \otimes A_k \\
\mathbf B' = \pmatrix{I_m \otimes e_1 \otimes B & \cdots & I_m \otimes e_J \otimes B}
= \sum_{k=1}^J e_k^T \otimes I_m \otimes e_k \otimes B
$$
where $I$ is the identity matrix, and $e_i$ is the $i$th column of the identity matrix (in this case, of the size-$J$ identity matrix).

With all that said, we can now use the properties of the Kronecker product to compute
$$
\mathbf{AB} = \left[\sum_{k=1}^J I_m \otimes e_k^T \otimes A_k\right]\left[I_m \otimes I_J \otimes B\right]
= \sum_{k=1}^J I_m \otimes e_k^T \otimes (A_kB)
= I_m \otimes \pmatrix{A_1B & \cdots & A_JB}
$$

Unfortunately, I'm having trouble performing a similar computation on the product $\mathbf{A'B'}$. However, I still think you will find this useful.

The column permutation that takes us from $\mathbf A$ to $\mathbf A'$ can be nicely described by
$$
(e_i^{(m)} \otimes e_j^{(J)} \otimes e_k^{(n)})^T C = (e_i^{(J)} \otimes e_j^{(m)} \otimes e_k^{(n)})^T
$$
and in fact, we can deduce that $C = C^{-1}$. With that in mind, it seems that you have miscalculated $\mathbf B'$. We should have
$$
\mathbf B' = C \mathbf B = \mathbf B
$$

So you should find that $\mathbf{AB} = \mathbf{A'B}$.
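The Kronecker computation of $\mathbf{AB}$ above is straightforward to confirm numerically. The sketch below (Python with NumPy; small arbitrary sizes of my own choosing) checks that $\mathbf{AB} = I_m \otimes \pmatrix{A_1B & \cdots & A_JB}$ on random data.

```python
# Verify the mixed-product computation AB = I_m (x) (A_1 B ... A_J B)
# for random small matrices.
import numpy as np

rng = np.random.default_rng(2)
n, m, J = 2, 3, 4
As = [rng.standard_normal((n, n)) for _ in range(J)]
B = rng.standard_normal((n, n))

A_big = np.kron(np.eye(m), np.hstack(As))   # I_m (x) (A_1 ... A_J)
B_big = np.kron(np.eye(J * m), B)           # I_{Jm} (x) B
lhs = A_big @ B_big
rhs = np.kron(np.eye(m), np.hstack([Ak @ B for Ak in As]))
print(np.allclose(lhs, rhs))
```

This only exercises the first identity in the answer, not the claim about $\mathbf B'$, which would need a separate check against the block picture in the question.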
edited Jan 9 at 0:27
answered Jan 9 at 0:13 by Omnomnomnom

  • thanks! But shouldn't a column permutation matrix $C$ have its inverse equal to its transpose, i.e. $C^{-1} = C^\top$?
    – ArtificiallyIntelligence, Jan 9 at 3:09










  • @ArtificiallyIntelligence right, so $C$, as it turns out, is a symmetric matrix
    – Omnomnomnom, Jan 9 at 15:30