Calculate expectation and variance of the max of two random variables
I have the following problem:

Random variables $X$ and $Y$ have the joint distribution below, and $Z=\max\{X,Y\}$.
$$\begin{array}{c|ccc}
X\setminus Y & 1 & 2 & 3\\
\hline
1 & 0.12 & 0.08 & 0.20\\
2 & 0.18 & 0.12 & 0.30
\end{array}$$
Calculate $E[Z]$ and $V[Z]$.

I tried to calculate the expectation as the sum, over every combination of the two variables, of the max value times the probability in the table, divided by the number of possible combinations, but I am not getting the correct answer. My question: when both variables are equal, what should we take as the max value, and which probability should be used?
random-variables
asked Dec 27 '18 at 7:54 by Nour
3 Answers
$Z=1$ iff $X=Y=1$, so $P\{Z=1\}=0.12$. $Z=2$ iff $X=1,Y=2$ or $X=2,Y=1$ or $X=2,Y=2$, so $P\{Z=2\}=0.08+0.18+0.12=0.38$. $Z=3$ iff $Y=3$ (since $X$ takes only the values $1$ and $2$), so $P\{Z=3\}=0.20+0.30=0.50$. Now you have the distribution of $Z$. Can you take it from here?

answered Dec 27 '18 at 8:03 by Kavi Rama Murthy
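If you want to check the resulting distribution numerically, here is a small Python sketch (not part of the original answer; the table values come from the question, the variable names are mine) that enumerates the joint table:

```python
# Joint pmf from the question's table: keys are (x, y), values are P(X=x, Y=y).
joint = {(1, 1): 0.12, (1, 2): 0.08, (1, 3): 0.20,
         (2, 1): 0.18, (2, 2): 0.12, (2, 3): 0.30}

# Accumulate the pmf of Z = max(X, Y); a tie x == y simply contributes to z = x.
pmf_z = {}
for (x, y), p in joint.items():
    z = max(x, y)
    pmf_z[z] = pmf_z.get(z, 0.0) + p

for z in sorted(pmf_z):
    print(z, round(pmf_z[z], 4))  # 1 0.12 / 2 0.38 / 3 0.5
```

This answers the poster's question in passing: a tie $X=Y$ needs no special treatment, since $\max\{x,x\}=x$ and the cell's probability is counted exactly once.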
As shown in Kavi's answer, you can do it by first finding the distribution of $Z$; from that, the expectations $\mathbb{E}Z$ and $\mathbb{E}Z^2$ can be found.

Alternatively you can go for:
$$\mathbb{E}Z=\mathbb{E}\max\{X,Y\}=\sum_{i=1}^{2}\sum_{j=1}^{3}\max\{i,j\}\,P(X=i,Y=j)$$
and
$$\mathbb{E}Z^2=\mathbb{E}\max\{X,Y\}^2=\sum_{i=1}^{2}\sum_{j=1}^{3}\max\{i,j\}^2\,P(X=i,Y=j)$$

In this case I would go for Kavi's method, but often this way works more easily.

edited Dec 27 '18 at 9:43, answered Dec 27 '18 at 8:17 by drhab
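The double sums above can be evaluated directly, without first building the distribution of $Z$. A minimal Python sketch (my own, using the table from the question):

```python
# Joint pmf from the question's table.
joint = {(1, 1): 0.12, (1, 2): 0.08, (1, 3): 0.20,
         (2, 1): 0.18, (2, 2): 0.12, (2, 3): 0.30}

# E[max(X,Y)] and E[max(X,Y)^2] as double sums over the table cells.
ez  = sum(max(x, y)      * p for (x, y), p in joint.items())
ez2 = sum(max(x, y) ** 2 * p for (x, y), p in joint.items())
var = ez2 - ez ** 2

print(round(ez, 2), round(var, 4))  # 2.38 0.4756
```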
Consider the cumulative distribution function (cdf) of $Z$:
$$F_Z(z)=P(Z\le z)=P(X\le z \wedge Y\le z)=\int_{-\infty}^{z}dx\int_{-\infty}^{z}dy\,P(x,y)$$
You are dealing with discrete random variables with a compactly supported joint distribution of probabilities over the rectangle $[1,2]\times[1,3]$, so the integral can be turned into a discrete sum with steps $\Delta x=\Delta y=1$:
$$F_Z(z)=\int_{-\infty}^{z}dx\int_{-\infty}^{z}dy\,P(x,y)=\sum_{i=1}^{z}\sum_{j=1}^{z}P(i,j)$$
It is now clear that $F_Z(z)=1$ for $z\ge 3$ and $F_Z(z)=0$ for $z<1$, so only two values of the cdf remain to be calculated:

for $z=1$: $$F_Z(1)=P(1,1)=0.12$$
for $z=2$: $$F_Z(2)=P(1,1)+P(1,2)+P(2,1)+P(2,2)=0.50$$

Now you can obtain the probability mass function (pmf) of $Z$ by taking discrete forward differences of the cdf, $P_Z(i)=F_Z(i)-F_Z(i-1)$, starting with $P_Z(z)=0$ for $z<1$ and $P_Z(z)=0$ for $z>3$:
$$P_Z(1)=0.12-0=0.12$$
$$P_Z(2)=0.50-0.12=0.38$$
$$P_Z(3)=1.0-0.50=0.50$$
From this pmf you will get the first- and second-order moments
$$E(Z)=\sum_z z\,P_Z(z)=1\cdot 0.12+2\cdot 0.38+3\cdot 0.50=2.38$$
$$E(Z^2)=\sum_z z^2\,P_Z(z)=1\cdot 0.12+4\cdot 0.38+9\cdot 0.50=6.14$$
You classically get the variance of $Z$ as
$$V(Z)=E\big([Z-E(Z)]^2\big)=E(Z^2)-E(Z)^2=6.14-2.38^2=0.4756$$
Hope this helps.

edited Dec 27 '18 at 9:15, answered Dec 27 '18 at 9:09 by Pete
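The cdf route can also be sketched in a few lines of Python (an illustration under the question's table, not code from the answer itself):

```python
# Joint pmf from the question's table.
joint = {(1, 1): 0.12, (1, 2): 0.08, (1, 3): 0.20,
         (2, 1): 0.18, (2, 2): 0.12, (2, 3): 0.30}

def cdf_z(z):
    """F_Z(z) = P(X <= z and Y <= z), summed over the table cells."""
    return sum(p for (x, y), p in joint.items() if x <= z and y <= z)

# Forward differences of the cdf give the pmf: P_Z(z) = F_Z(z) - F_Z(z - 1).
pmf = {z: cdf_z(z) - cdf_z(z - 1) for z in (1, 2, 3)}
ez = sum(z * p for z, p in pmf.items())
var = sum(z ** 2 * p for z, p in pmf.items()) - ez ** 2
print(round(ez, 2), round(var, 4))  # 2.38 0.4756
```

The cdf trick scales well: for $Z=\max$ of independent variables the cdf factors as $F_Z(z)=F_X(z)F_Y(z)$, although here the joint table is used directly, so no independence is assumed.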