Trace of power of random matrix / sum of random variables with semicircle distribution
I want to calculate the expectation value of the trace of the $m$-th power of the $n\times n$ adjacency matrix $A$ of a large Erdős–Rényi random graph (without self-coupling, i.e., all diagonal elements of $A$ are equal to zero). I was planning to use the invariance of the trace under a change of basis and write
$\forall m<n:\quad \operatorname{tr}(A^m)=\sum_{i=1}^n\nu_i^m.$
Wigner's semicircle law states that in the asymptotic limit, the $(n-1)$ eigenvalues $\nu_1,\dots,\nu_{n-1}$ have the semicircle probability density function
$f(\nu) = \frac{1}{2\pi \sigma^2}\sqrt{4\sigma^2-\frac{\nu^2}{n}},$
where $\sigma^2$ is the second moment of the distribution of the off-diagonal elements of $A$.
Since $\operatorname{tr}(A)=0$ (no self-coupling), I know that $\nu_n=-\sum_{i=1}^{n-1}\nu_i$. My plan was to calculate the pdfs of $\nu_i^m$ and $\operatorname{tr}(A^m)$ via repeated convolutions of $f(\nu)$ with itself. However, I already struggle with calculating the convolution of two semicircle pdfs,
$f(\nu_i)\star f(\nu_j):=\int_{-\infty}^\infty f(\nu_i)\,f(\nu_j-\nu_i)\,d\nu_i.$
How can I calculate this convolution? Or is there a better way to calculate the expectation value of $\operatorname{tr}(A^m)$, that is, the expectation value of a non-linear function of $(n-1)$ i.i.d. random variables with a semicircle pdf?
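For a concrete handle on this convolution, it can at least be evaluated numerically on a grid. The sketch below is only illustrative: it uses the rescaled density $f(\nu)=\frac{1}{2\pi\sigma^2}\sqrt{4\sigma^2-\nu^2}$ (the $1/n$ scaling absorbed into $\nu$), an arbitrary $\sigma=1$, and a uniform grid; it checks the normalisation of the result and that the variance of the sum is $2\sigma^2$.

```python
import numpy as np

# Illustrative numerical convolution of two semicircle pdfs on a grid.
# Assumptions: sigma = 1 and f(x) = sqrt(4*sigma^2 - x^2) / (2*pi*sigma^2)
# on [-2*sigma, 2*sigma] (the n-dependent scaling of the question absorbed into x).

sigma = 1.0
R = 2 * sigma                                   # support of the semicircle is [-R, R]
x = np.linspace(-2 * R, 2 * R, 8001)            # grid wide enough for the sum nu_i + nu_j
dx = x[1] - x[0]

f = np.where(np.abs(x) <= R,
             np.sqrt(np.clip(R**2 - x**2, 0.0, None)) / (2 * np.pi * sigma**2),
             0.0)

g = np.convolve(f, f, mode="same") * dx         # pdf of nu_i + nu_j, evaluated on x

print("normalisation of f:", f.sum() * dx)      # ~1
print("normalisation of g:", g.sum() * dx)      # ~1
print("variance under g  :", (x**2 * g).sum() * dx, "(expected 2*sigma^2 =", 2 * sigma**2, ")")
```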
EDIT:
Since I am only interested in the expectation value of $\operatorname{tr}(A^m)$, I do not need a pdf for $\operatorname{tr}(A^m)$, because
$\langle \operatorname{tr}(A^m)\rangle=\sum_{i=1}^{n}\langle \nu_i^m\rangle=(n-1)\int_{-\infty}^\infty f(\nu)\,\nu^m\,d\nu+\langle \nu_n^m\rangle.$
However, I believe I still need the convolution of semicircle distributions for calculating
$\langle \nu_n^m\rangle = (-1)^m\left\langle \left(\sum_{i=1}^{n-1} \nu_i\right)^m\right\rangle.$
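As a numerical sanity check of this decomposition (an illustrative sketch, not a derivation): one can sample a $G(n,p)$ adjacency matrix, confirm $\operatorname{tr}(A^m)=\sum_i\nu_i^m$, and compare the contribution of the bulk eigenvalues with the semicircle-moment term, assuming $\sigma^2=p(1-p)$, the standard scaling in which a bulk eigenvalue has variance $n\sigma^2$, and the textbook fact that the $(2k)$-th moment of a semicircle law of variance $v$ is $C_k v^k$ (Catalan numbers). The split below takes $\nu_n$ to be the single outlying (largest) eigenvalue, which is only an assumption about what plays the role of $\nu_n$; all parameters are arbitrary.

```python
import numpy as np
from math import comb

# Illustrative check: sample G(n, p), verify tr(A^m) == sum_i nu_i^m, and
# compare the bulk-eigenvalue contribution with (n-1) * E[nu^m] under a
# semicircle law of variance n * sigma^2, sigma^2 = p * (1 - p).  The
# parameter choices and the identification of nu_n with the largest
# eigenvalue are assumptions made for this sketch.

def catalan(k):
    return comb(2 * k, k) // (k + 1)

def semicircle_moment(m, v):
    # m-th moment of a centred semicircle law with variance v
    return 0.0 if m % 2 else catalan(m // 2) * v ** (m // 2)

rng = np.random.default_rng(0)
n, p, m = 400, 0.2, 4

upper = (rng.random((n, n)) < p).astype(float)
A = np.triu(upper, 1)
A = A + A.T                                        # symmetric adjacency matrix, zero diagonal

nu = np.linalg.eigvalsh(A)                         # eigenvalues, sorted ascending
print("tr(A^m)              :", np.trace(np.linalg.matrix_power(A, m)))
print("sum_i nu_i^m         :", np.sum(nu ** m))   # same number up to rounding

bulk, top = nu[:-1], nu[-1]                        # split off the outlying eigenvalue
v = n * p * (1 - p)
print("bulk sum of nu^m     :", np.sum(bulk ** m))
print("(n-1) * E[nu^m]      :", (n - 1) * semicircle_moment(m, v))
print("outlier contribution :", top ** m)
```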
probability-theory probability-distributions random-variables random-graphs random-matrices
I think we need to know the joint distribution of $\{\nu_i\}_{i=1,\dots,n-1}$ to answer this problem. For example, if $\{\nu_i\}_{i=1,\dots,n-1}$ follow the GOE joint distribution, then we know that $\sum_{i=1}^{n-1}\nu_i$ is Gaussian (the trace of a Gaussian matrix), and $\langle \operatorname{tr}(A^m)\rangle$ will be easy to express in terms of moments of Gaussian and semicircle distributions. If the eigenvalues are independent (and if they are, I'd be interested to see the proof), then things will be complicated...
– D.A.N.
Apr 8 '16 at 22:29
@D.A.N. I am unfamiliar with the GOE joint distribution. Could you please provide some more info? I don't understand why the eigenvalues of $A$ should follow a GOE joint distribution. $A$ is the adjacency matrix of an Erdős–Rényi random graph, so the entries of $A$ are not drawn from a Gaussian distribution. The distribution of all but one of the eigenvalues of $A$ is given by the semicircle distribution $f(\nu)$, which is also not Gaussian.
– Alice Schwarze
Apr 18 '16 at 10:53
Here is a link to the joint pdf for the GOE: en.wikipedia.org/wiki/Random_matrix#Gaussian_ensembles. I'm not saying the eigenvalues are GOE either, and I have no idea what the proof is that the eigenvalues of an Erdős–Rényi graph follow the semicircle distribution. I mentioned the GOE since it is an example where the marginal distribution of an eigenvalue, in the asymptotic limit, follows a semicircle distribution, but the distribution of the trace is not a convolution of semicircle distributions since the eigenvalues are not independent. Why do you believe your eigenvalues are independent?
– D.A.N.
Apr 19 '16 at 3:05
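To illustrate the dependence point raised above (a sketch with an assumed GOE normalisation and arbitrary parameters): for the GOE the marginal eigenvalue distribution is asymptotically a semicircle, yet $\operatorname{tr}(H)$ is exactly Gaussian with variance $2n$, whereas $n$ independent semicircle variables on the same scale would give a trace variance of order $n^2$.

```python
import numpy as np

# Illustrative GOE sketch (normalisation: off-diagonal variance 1, diagonal
# variance 2).  The trace is exactly Gaussian with variance 2n, while n
# *independent* eigenvalues of variance ~n would give a trace variance ~n^2.

rng = np.random.default_rng(1)
n, trials = 200, 500

traces = np.empty(trials)
for t in range(trials):
    G = rng.normal(size=(n, n))
    H = (G + G.T) / np.sqrt(2)                  # GOE matrix
    traces[t] = np.trace(H)

print("sample trace variance:", traces.var(), "(Gaussian prediction 2n =", 2 * n, ")")
print("independent eigenvalues would give a variance of order n^2 =", n**2)

eigs = np.linalg.eigvalsh(H) / np.sqrt(n)       # rescale one spectrum to the bulk scale
print("fraction of rescaled eigenvalues in [-2, 2]:", np.mean(np.abs(eigs) <= 2.0))
```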
1 Answer
Here's my attempt at an exact answer, which is not yet complete.
(If you want an answer in the form of a limit as $n \to \infty$, take a look at the proof of Proposition 4.1 of these lecture notes, by which this answer is heavily inspired.)
It is well known that, given the adjacency matrix $A$ of a graph $G$, $\operatorname{Tr}(A^m)$ is the number of closed walks of length $m$ in $G$ (since $(A^m)_{ij}$ is the number of walks of length $m$ beginning at $i$ and ending at $j$). So here's a combinatorial approach to $\langle \operatorname{Tr}(A^m) \rangle$.
Assuming that we are dealing with a $G(n, p)$ graph on the vertex set $[n]$, we have
$\langle \operatorname{Tr}(A^m) \rangle = \sum_{i=1}^n \langle (A^m)_{ii} \rangle = \sum_{(i_1, \dots, i_m) \in [n]^m} \langle A_{i_1 i_2} A_{i_2 i_3} \dots A_{i_m i_1} \rangle.$
For $i = (i_1, \dots, i_m) \in [n]^m$, denote $A_{i_1 i_2} \dots A_{i_m i_1}$ by $A_i$ and let $E_i$ be the edge set $\{i_1 i_2, i_2 i_3, \dots, i_m i_1\}$.
Since all $A_{jk}$, $j \neq k$, are independent indicator random variables with success probability $p$, we have $\langle A_i \rangle = p^{|E_i|}$, provided $E_i$ contains no self-loops; if it does, $\langle A_i \rangle = 0$.
Let $W^m_e$ be the set of all closed walks of length $m$ on the vertex set $[n]$ with no self-loops that use exactly $e$ distinct edges. Then
$\langle \operatorname{Tr}(A^m) \rangle = \sum_{e=1}^m \sum_{W \in W^m_e} p^e = \sum_{e=1}^m p^e \, |W^m_e|,$
and all that is left to do is count how many walks there are in each $W^m_e$.
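For small $n$ and $m$, this formula can be checked by brute force against the exact expectation obtained by summing over all graphs on $[n]$; a minimal sketch (parameter choices are illustrative):

```python
import itertools
import numpy as np

# Brute-force check of  <Tr(A^m)> = sum_e p^e * |W^m_e|  for small n, m.
# walk_formula enumerates all closed walks of length m with no self-loops
# and weights each by p^(number of distinct edges); exact_expectation
# averages Tr(A^m) over all 2^(n choose 2) graphs with the G(n, p) weights.

def walk_formula(n, m, p):
    total = 0.0
    for walk in itertools.product(range(n), repeat=m):
        edges, ok = set(), True
        for k in range(m):
            u, v = walk[k], walk[(k + 1) % m]   # closing step returns to the start
            if u == v:                          # self-loop => the term contributes 0
                ok = False
                break
            edges.add(frozenset((u, v)))
        if ok:
            total += p ** len(edges)
    return total

def exact_expectation(n, m, p):
    pairs = list(itertools.combinations(range(n), 2))
    total = 0.0
    for bits in itertools.product((0, 1), repeat=len(pairs)):
        A = np.zeros((n, n))
        for (u, v), b in zip(pairs, bits):
            A[u, v] = A[v, u] = b
        k = sum(bits)
        weight = p ** k * (1 - p) ** (len(pairs) - k)
        total += weight * np.trace(np.linalg.matrix_power(A, m))
    return total

n, m, p = 5, 4, 0.3
print("walk-counting formula:", walk_formula(n, m, p))
print("exact expectation    :", exact_expectation(n, m, p))   # should agree
```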