What is the interest of the moments of a random variable?
Let $X$ be a random variable. We define the moment of order $r\in\mathbb N$ by $m_r=\mathbb E[X^r]$. I know that the moment of order $1$ is the expectation, that from the moment of order $2$ one gets the variance, and that the moments of order $3$ and $4$ can be interesting, but why are moments of general order $r$ interesting? Also, we define the moment generating function by $M_X(t)=\mathbb E[e^{tX}]$. Why such a definition, and not simply $$\sum_{r=0}^\infty m_r x^r\,?$$
(Normally, the generating function of a sequence $(u_n)$ is $f(t)=\sum_{n\in\mathbb N}u_n t^n$.) Also, what is the interest of the moment generating function? I really don't see why it is interesting (apart maybe from the fact that $M_X^{(r)}(0)=m_r$).
probability moment-generating-functions
asked Dec 25 '18 at 15:32
user623855
You are underestimating the usefulness of that last-mentioned fact, imo. MGFs (and CFs) can even be used to recover the distribution in some cases!
– LoveTooNap29
Dec 25 '18 at 16:26
The mgf's derivatives give the moments, hence the name. Your proposed alternative is the typically less useful $\Bbb E\frac{1}{1-xX}$.
– J.G.
Dec 25 '18 at 16:51
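To spell out the comment above (my own addition, exchanging expectation and summation without worrying about justification): for $x$ small enough that $|xX|<1$,
$$\sum_{r=0}^\infty m_r x^r=\sum_{r=0}^\infty \mathbb E\left[(xX)^r\right]=\mathbb E\left[\sum_{r=0}^\infty (xX)^r\right]=\mathbb E\left[\frac{1}{1-xX}\right],$$
so the ordinary power series in the moments is the expectation of a geometric series rather than of an exponential.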
1 Answer
... but why are moments of general order $r$ interesting?
One example: in statistics, moments of higher order may be needed in the method of moments.
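As a concrete illustration (my own sketch, not part of the original answer), here is the method of moments for a Gamma distribution: the first two sample moments are matched to their theoretical counterparts. The distribution and parameter values are arbitrary choices for the example.

```python
# Minimal method-of-moments sketch: estimate the shape k and scale theta of a
# Gamma(k, theta) distribution from the first two sample moments.
import numpy as np

rng = np.random.default_rng(0)
sample = rng.gamma(shape=3.0, scale=2.0, size=100_000)

m1 = sample.mean()            # first sample moment, estimates E[X] = k*theta
m2 = (sample ** 2).mean()     # second sample moment, estimates E[X^2]
var = m2 - m1 ** 2            # estimates Var(X) = k*theta^2

theta_hat = var / m1          # solve k*theta = m1 and k*theta^2 = var
k_hat = m1 / theta_hat

print(k_hat, theta_hat)       # should be close to the true values 3.0 and 2.0
```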
Why such a definition, and not simply ...
The moment generating function of a random variable is not defined merely for calculating its moments. It has other important properties, such as $\phi_{X+Y}(t)=\phi_X(t)\phi_Y(t)$ when $X$ and $Y$ are independent. (Maybe) most importantly, it characterizes the distribution!
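For completeness (this one-line check is my own addition), the multiplicative property follows directly from independence; the same computation works for the characteristic function with $e^{itX}$ in place of $e^{tX}$:
$$\phi_{X+Y}(t)=\mathbb E\left[e^{t(X+Y)}\right]=\mathbb E\left[e^{tX}e^{tY}\right]=\mathbb E\left[e^{tX}\right]\mathbb E\left[e^{tY}\right]=\phi_X(t)\,\phi_Y(t),$$
where the third equality uses the independence of $X$ and $Y$.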
Even in the study of infinite sequences, exponential generating functions can be more convenient than ordinary generating functions in some situations.
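Indeed (my own remark), expanding $e^{tX}$ and exchanging expectation and summation where this is justified shows that $M_X$ is exactly the exponential generating function of the sequence $(m_r)$:
$$M_X(t)=\mathbb E\left[e^{tX}\right]=\mathbb E\left[\sum_{r=0}^\infty \frac{(tX)^r}{r!}\right]=\sum_{r=0}^\infty \frac{m_r}{r!}\,t^r.$$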
... what is the interest of the moment generating function?
You could first read the Wikipedia article on moment generating functions. Again, this is not simply a tool for calculating moments. You may also want to take a look at a more commonly used cousin: the characteristic function, which is essentially the Fourier transform of a random variable. A classical proof of the central limit theorem uses characteristic functions.
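As a small numerical illustration (my own, with arbitrary choices of distribution and sample sizes), the empirical characteristic function of a standardized sum of i.i.d. uniform random variables is already close to the standard normal characteristic function $e^{-t^2/2}$; this is the convergence that the characteristic-function proof of the CLT establishes.

```python
# Empirical characteristic function of a standardized sum of i.i.d. uniforms,
# compared with the standard normal characteristic function exp(-t^2/2).
import numpy as np

rng = np.random.default_rng(1)
n, reps = 50, 200_000
u = rng.uniform(-1.0, 1.0, size=(reps, n))           # mean 0, variance 1/3
s = u.sum(axis=1) / np.sqrt(n / 3.0)                 # standardized sums S_n

t = np.linspace(-3.0, 3.0, 7)
emp_cf = np.exp(1j * np.outer(t, s)).mean(axis=1)    # estimates E[exp(i*t*S_n)]
print(np.max(np.abs(emp_cf - np.exp(-t ** 2 / 2))))  # small (order 1e-2 or less)
```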
edited Dec 25 '18 at 16:25
answered Dec 25 '18 at 16:15
user587192
Thank you for your answer. I saw the wiki article. They write, for example, that if $M_X(t)=M_Y(t)$ for all $t$, then $X$ and $Y$ have the same law. So why consider the characteristic function instead of the moment generating function, since both seem to have the property that $M_X=M_Y$ implies $X\sim Y$?
– user623855
Dec 25 '18 at 16:20
@user623855: in short, characteristic functions always exist for $L^1$ random variables. However, the moment generating function need not exist, because in particular it requires the existence of moments of every order.
– user587192
Dec 25 '18 at 16:22
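A standard example of this gap (my own addition to the comment): the standard Cauchy distribution has characteristic function $\varphi_X(t)=\mathbb E[e^{itX}]=e^{-|t|}$ for every $t$, yet $\mathbb E[e^{tX}]=+\infty$ for every $t\neq 0$ (indeed even $\mathbb E[|X|]=\infty$), so its moment generating function is useless while its characteristic function is perfectly well behaved.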