Existence of the Pfaffian?


























Consider a square skew-symmetric $n\times n$ matrix $A$. We know that $\det(A)=\det(A^T)=(-1)^n\det(A)$, so if $n$ is odd, the determinant vanishes.



If $n$ is even, my book claims that the determinant is the square of a polynomial function of the entries, and Wikipedia confirms this. The polynomial in question is called the Pfaffian.



I was wondering if there was an easy (clean, conceptual) way to show that this is the case, without mucking around with the symmetric group.
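For a concrete instance of the claim in the simplest even case, $n=2$ (using the usual normalization of the Pfaffian):
$$A=\begin{pmatrix}0 & a\\ -a & 0\end{pmatrix},\qquad \operatorname{Pf}(A)=a,\qquad \det(A)=a^{2}=\operatorname{Pf}(A)^{2}.$$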





























linear-algebra pfaffian














edited Dec 25 '13 at 14:24 by Grigory M
asked Jun 8 '12 at 3:10 by Potato












  • See "alternate definitions" in the Wikipedia article. The idea is that you can associate to such an $A$ an element in the exterior square $\Lambda^2(V)$ of the vector space $V$ on which $A$ acts and then take exterior powers.
    – Qiaochu Yuan, Jun 8 '12 at 3:35










  • Well, I haven't worked through the details.
    – Qiaochu Yuan, Jun 8 '12 at 3:41












  • Related: sciencedirect.com/science/article/pii/S0001870885710298
    – darij grinberg, Jul 21 '15 at 14:10




























3 Answers



















Here is an elaboration of Qiaochu's comment above:



A $2n\times 2n$ matrix $A$ induces a pairing (say on column vectors), namely
$$\langle v,w \rangle := v^T A w.$$
Thus we can think of $A$ as being an element of $(V\otimes V)^*$ (which is
the space of all bilinear pairings on $V$), where $V$ is the space of $2n$-dimensional column vectors.



If $A$ is skew-symmetric, then this pairing is anti-symmetric, and so we can actually regard $A$ as an element of $\wedge^2 V^*$. We can then take the $n$th exterior power of $A$, so as to obtain an element of $\wedge^{2n} V^*$. This latter space is $1$-dimensional, and so if we fix some appropriately normalized basis for it, the $n$th exterior power of $A$ can be thought of just as a number. This is the Pfaffian of $A$ (provided we chose the right basis for $\wedge^{2n} V^*$).
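Concretely, in the smallest nontrivial case $n=2$ (a routine computation; here $\omega=\sum_{i<j}A_{ij}\,e_i^*\wedge e_j^*$ denotes the $2$-form attached to $A$, and the Pfaffian is normalized by $\omega^{n}/n!=\operatorname{Pf}(A)\,e_1^*\wedge\cdots\wedge e_{2n}^*$):
$$\omega\wedge\omega=2\left(A_{12}A_{34}-A_{13}A_{24}+A_{14}A_{23}\right)e_1^*\wedge e_2^*\wedge e_3^*\wedge e_4^*,\qquad\text{so}\qquad \operatorname{Pf}(A)=A_{12}A_{34}-A_{13}A_{24}+A_{14}A_{23},$$
which one can check squares to the determinant of a $4\times 4$ skew-symmetric matrix.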



How does this compare to the usual description of determinants via exterior powers:



For this, we regard $A$ as an endomorphism $V \to V$, which induces an endomorphism $\wedge^{2n} V \to \wedge^{2n} V$, which is a scalar (being an endomorphism of a $1$-dimensional space); this is $\det A$.



So now we see where the formula $\det(A) = \operatorname{Pf}(A)^2$ comes from: computing the determinant involves taking a $2n$th exterior power of $A$, while computing the Pfaffian involves only taking an $n$th exterior power (because we use the skew-symmetry of $A$ to get an exterior square "for free", so to speak).



Sorting out the details of all this should be a fun exercise.






answered Jun 8 '12 at 4:22 by Matt E









  • It's clear from this argument that $\operatorname{Pf}(A)$ exists and is a polynomial of order $n$ in the elements of $A$, while $\det(A)$ is a polynomial of order $2n$. But so far I haven't been able to see that $\operatorname{Pf}(A)^2=\det(A)$ using exterior algebra methods.
    – ziggurism, Nov 14 '12 at 4:06










  • @JoeHannon It seems to me that this argument also shows that they have the same zeroes. So, if you can argue that $\operatorname{Pf}(A)$ is irreducible, then $\det(A)$ and $\operatorname{Pf}(A)$ are proportional. I agree that I have never been able to figure out a truly clean way to finish this argument, though.
    – David E Speyer, Sep 17 '15 at 15:33




















Here is an approach using (possibly complex) Grassmann variables and Berezin integration$^1$ to prove the required relation $${\rm Det}(A)~=~{\rm Pf}(A)^2. \tag{1}$$ This approach isn't purely conceptual, but at least it is easy, we don't fudge the overall sign, we don't muck around much with the symmetric group, and Grassmann variables do implement exterior calculus.




  1. Define the Pfaffian of a (possibly complex) antisymmetric matrix $A^{jk}=-A^{kj}$ (in $n$ dimensions) as$^2$
    $$ \begin{align}
    {\rm Pf}(A)&~:=~\int \!d\theta_n \ldots d\theta_1~
    e^{\frac{1}{2}\theta_j A^{jk}\theta_k}
    \cr &~=~(-1)^{[\frac{n}{2}]} \int \!d\theta_1 \ldots d\theta_n~
    e^{\frac{1}{2}\theta_j A^{jk}\theta_k}
    \cr &~=~(-1)^{\frac{n}{2}} \int \!d\theta_1 \ldots d\theta_n~
    e^{\frac{1}{2}\theta_j A^{jk}\theta_k}
    \cr &~=~i^n \int \!d\theta_1 \ldots d\theta_n~
    e^{\frac{1}{2}\theta_j A^{jk}\theta_k}
    \cr &~=~ \int \!d\theta_1 \ldots d\theta_n~
    e^{-\frac{1}{2}\theta_j A^{jk}\theta_k}.\end{align}
    \tag{2}$$

    In the last equality of eq. (2), we rotated the Grassmann variables $\theta_k\to i\theta_k$ with the imaginary unit.


  2. Define the determinant as
    $$ {\rm Det}(A)~:=~\int \!d\theta_1 ~d\widetilde{\theta}_1 \ldots d\theta_n ~d\widetilde{\theta}_n~ e^{\widetilde{\theta}_j A^{jk}\theta_k}. \tag{3}$$

    It is not hard to prove via coordinate substitution that eq. (3) indeed reproduces the standard definition of the determinant.


  3. If we make a change of coordinates
    $$ \theta^{\pm}_k~=~ \frac{\theta_k\pm \widetilde{\theta}_k}{\sqrt{2}}, \qquad k~\in~\{1,\ldots,n\},\tag{4} $$
    in eq. (3), the super-Jacobian becomes $(-1)^n$.


  4. Therefore we calculate
    $$\begin{align} {\rm Det}(A)&\stackrel{(3)+(4)}{=}~(-1)^n\int \!d\theta^+_1 ~d\theta^-_1 \ldots d\theta^+_n ~d\theta^-_n~
    e^{\frac{1}{2}\theta^+_j A^{jk}\theta^+_k
    -\frac{1}{2}\theta^-_j A^{jk}\theta^-_k}\cr
    &~~=~\int \!d\theta^-_1 \ldots d\theta^-_n~d\theta^+_n\ldots d\theta^+_1 ~~e^{\frac{1}{2}\theta^+_j A^{jk}\theta^+_k}
    e^{-\frac{1}{2}\theta^-_j A^{jk}\theta^-_k}\cr
    &~~\stackrel{(2)}{=}~{\rm Pf}(A)^2, \end{align}\tag{5}$$

    which proves eq. (1). $\Box$



--



$^1$ We use the sign convention that Berezin integration $$\int d\theta_i~\equiv~\frac{\partial}{\partial \theta_i}\tag{6} $$ is the same as differentiation with respect to $\theta_i$ acting from the left. See e.g. this Phys.SE post and this Math.SE post.



$^2$ The sign of the permutation $(1, \ldots, n)\mapsto(n, \ldots, 1)$ is given by $(-1)^{[\frac{n}{2}]}$, where $[\frac{n}{2}]$ denotes the integer part of $\frac{n}{2}$. One may show that the Pfaffian (2) vanishes in odd dimensions $n$.
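As a quick numerical sanity check of eq. (1) (not part of the derivation above; this is a naive recursive expansion of the Pfaffian along the first row, and it assumes NumPy is available):

    import numpy as np

    def pfaffian(A):
        """Pfaffian of a skew-symmetric matrix via the standard expansion
        along the first row (0-based indices):
            Pf(A) = sum_{j=1}^{m-1} (-1)**(j+1) * A[0, j] * Pf(minor),
        where the minor is A with rows and columns 0 and j deleted.
        Exponentially slow, but fine for the small checks below."""
        m = A.shape[0]
        if m == 0:
            return 1.0
        if m % 2 == 1:   # odd-dimensional skew-symmetric matrix => Pf = 0
            return 0.0
        total = 0.0
        for j in range(1, m):
            keep = [k for k in range(m) if k not in (0, j)]
            minor = A[np.ix_(keep, keep)]
            total += (-1) ** (j + 1) * A[0, j] * pfaffian(minor)
        return total

    rng = np.random.default_rng(0)
    for m in (2, 4, 6, 8):
        B = rng.standard_normal((m, m))
        A = B - B.T                      # random real skew-symmetric matrix
        assert np.isclose(np.linalg.det(A), pfaffian(A) ** 2)
    print("det(A) == Pf(A)**2 verified for sizes 2, 4, 6, 8")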






answered Mar 23 '18 at 19:53 by Qmechanic (edited Dec 22 '18 at 7:46)













  • What do the square brackets in the expression $[\frac{n}{2}]$ in equation (2) mean?
    – Mtheorist, Dec 22 '18 at 7:15

  • See footnote 2.
    – Qmechanic, Dec 22 '18 at 7:46




















By continuity, we can assume that $A$ can always be reduced to block diagonal form with blocks
$$
\left(\begin{matrix} 0 &\lambda_i\cr -\lambda_i &0 \end{matrix} \right)
$$
on the diagonal. In this case computing the determinant gives $\prod \lambda^2_i$ and computing the Pfaffian gives $\prod \lambda_i$, so the determinant is the square of the Pfaffian.
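Spelled out for a single block (a routine check, with the convention $\operatorname{Pf}\begin{pmatrix}0 & a\\ -a & 0\end{pmatrix}=a$, and using that both the determinant and the Pfaffian are multiplicative over such a block decomposition):
$$\det\begin{pmatrix}0 & \lambda_i\\ -\lambda_i & 0\end{pmatrix}=\lambda_i^{2},\qquad \operatorname{Pf}\begin{pmatrix}0 & \lambda_i\\ -\lambda_i & 0\end{pmatrix}=\lambda_i,\qquad\text{hence}\qquad \det(A)=\prod_i\lambda_i^{2}=\Bigl(\prod_i\lambda_i\Bigr)^{2}=\operatorname{Pf}(A)^{2}.$$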






answered Jul 21 '15 at 14:31 by mike stone (edited Jul 21 '15 at 16:08)













  • You did not understand one word of the question asked. We have to find a polynomial $P_n\in \mathbb{Z}[x]$, depending on $n$ and not on $A\in \operatorname{skew}_n(K)$, s.t. $\det(A)=P_n((a_{i,j})_{i<j})^2$. When you block diagonalize $A$, you kill the $(a_{i,j})$ and replace them with the $(\lambda_i)$ and with the coefficients of some orthogonal change-of-basis matrix. Unfortunately (for you) these new variables are in an algebraic extension of the underlying field (say $K$)! How are you going to link these variables to the $(a_{i,j})$? -to be continued-
    – loup blanc, Mar 24 '18 at 10:32










  • In fact, we can (we must) work in the field $K$. In particular, this result is valid over every commutative ring... Then the best would be to remove your post, which has looked out of place for $3$ years.
    – loup blanc, Mar 24 '18 at 10:32










  • The question asks us to show that $\operatorname{Pf}(A)= \epsilon^{a_1a_2\cdots a_{2n}}A_{a_1a_2}\cdots A_{a_{2n-1} a_{2n}}/(2^n n!)$ squares to the determinant. This is a polynomial in the entries of $A$. No field extension is required to construct it. Proving that it squares to the determinant can be done by tedious combinatorics, or by showing it is true for diagonalizable matrices.
    – mike stone, Mar 25 '18 at 12:43












  • You write anything; there exists no compact formula for $\operatorname{Pf}$; in fact, it can only be recursively defined. On the other hand, showing the result for the block diagonalized matrix does not prove anything.
    – loup blanc, Mar 26 '18 at 18:16










  • @loup blanc. The formula for the Pfaffian is standard: en.wikipedia.org/wiki/Pfaffian. Also recall that any real skew-symmetric matrix can be reduced to skew 2-by-2 blocks by a real orthogonal transformation: en.wikipedia.org/wiki/Skew-symmetric_matrix#Spectral_theory. Thus, if the formula holds for the block diagonal matrices (and it does), it holds for all matrices. The OP does not ask to work over an arbitrary field, and will probably be satisfied with the result over the reals or complexes, which is what I give.
    – mike stone, Mar 26 '18 at 19:01












