Conditional expectation of function of two RVs, one previsible
Setup:
Let $(\epsilon_n)_{n\geq 1}$ be i.i.d. Bernoulli random variables taking values in $\{-1,1\}$, write $p=\mathbb{P}(\epsilon_n=1)$ and $q=1-p$, and let $\mathscr{F}_n=\sigma(\epsilon_1,\dotsc,\epsilon_n)$. Then let $(C_n)_{n\geq 1}$ be any previsible process (i.e. $C_n$ is $\mathscr{F}_{n-1}$-measurable) and let $Z_0>0$ be such that we may define the process $(Z_n)$ by $Z_n=Z_{n-1}+\epsilon_n C_n$, with $0<C_n<Z_{n-1}$ for $n\geq 1$.
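(For intuition, here is a minimal simulation sketch of this setup; the previsible choice $C_n=Z_{n-1}/2$ and the value of $p$ below are purely illustrative assumptions, not part of the problem.)

```python
import numpy as np

# One sample path of Z_n = Z_{n-1} + eps_n * C_n, with the arbitrary
# previsible choice C_n = Z_{n-1}/2, which satisfies 0 < C_n < Z_{n-1}.
rng = np.random.default_rng(1)
p = 0.6                      # P(eps_n = +1); q = 1 - p
Z = [1.0]                    # Z_0 > 0
for n in range(10):
    C = Z[-1] / 2.0          # depends only on the past, hence previsible
    eps = 1.0 if rng.random() < p else -1.0
    Z.append(Z[-1] + eps * C)
print(Z)
```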
Specific question:
How do I justify that if $Y_n=C_n/Z_{n-1}$ then
$\mathbb{E}(\log(1+\epsilon_n Y_n)\mid\mathscr{F}_{n-1})=g(Y_n)$, where $g(x)=p\log(1+x)+q\log(1-x)$, wherever this is defined?
My thoughts:
Intuitively, $Y_n$ would be known from $\mathscr{F}_{n-1}$, since $C_n$ is $\mathscr{F}_{n-1}$-measurable and so is $Z_{n-1}$, but I have been stumped trying to translate this into a detailed argument. I understand simpler calculations where you can "pull out what is known", e.g. $\mathbb{E}(\epsilon_n C_n\mid\mathscr{F}_{n-1})=C_n(2p-1)$, because there I use directly that $C_n$ is previsible. It seems very natural to think "this RV is known, so I can just average out the other one, even when the integrand is a function of both", but I want to be sure why that is true, especially when a function of the two RVs is involved.
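(For comparison, here is how I would write out the simpler identity just mentioned, using only standard properties of conditional expectation: since $C_n$ is $\mathscr{F}_{n-1}$-measurable it can be taken out, and since $\epsilon_n$ is independent of $\mathscr{F}_{n-1}$ its conditional expectation is its mean,
$$\mathbb{E}(\epsilon_n C_n\mid\mathscr{F}_{n-1})=C_n\,\mathbb{E}(\epsilon_n\mid\mathscr{F}_{n-1})=C_n\,\mathbb{E}(\epsilon_n)=C_n\big(p\cdot 1+q\cdot(-1)\big)=C_n(2p-1).$$
My question is why the analogous argument remains valid when $\epsilon_n$ sits inside a nonlinear function together with a previsible quantity.)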
Generalized a bit:
More generally (but perhaps not totally well posed), given a Borel function $f(x,y)$, processes $X$ and $Y$, and $\mathscr{F}_n$ generated by $X$, when can one say
$\mathbb{E}(f(X_n, Y_n)\mid\mathscr{F}_{n-1})=g(Y_n)$, where $g(y)=\mathbb{E}(f(X_n,y))$? Is all that is needed that $Y_n$ is previsible? Or more? Thanks.
probability-theory
asked Dec 2 at 20:43 by LoveTooNap29
1 Answer
For every sigma-algebra $\mathcal G$, all random variables $X$ and $Y$, and every measurable function $h$ such that $h(X,Y)$ is integrable, if $X$ is independent of $\mathcal G$ and $Y$ is $\mathcal G$-measurable, then $$E(h(X,Y)\mid\mathcal G)=g(Y)$$ where the function $g$ is defined by $$g(y)=E(h(X,y)).$$
Proof: It suffices to show that, for every bounded, $\mathcal G$-measurable random variable $Z$, $$E(h(X,Y)Z)=E(g(Y)Z).$$ By hypothesis, $X$ is independent of $(Y,Z)$, hence, using Fubini's theorem, $$E(h(X,Y)Z)=\iiint h(x,y)\,z\,\mathrm dP_X(x)\,\mathrm dP_{Y,Z}(y,z)=\iint\left(\int h(x,y)\,\mathrm dP_X(x)\right)z\,\mathrm dP_{Y,Z}(y,z).$$ Each inner parenthesis equals $$\int h(x,y)\,\mathrm dP_X(x)=E(h(X,y))=g(y),$$ hence altogether $$E(h(X,Y)Z)=\iint g(y)\,z\,\mathrm dP_{Y,Z}(y,z)=E(g(Y)Z),$$ and the proof is complete.
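In the setting of the question this specializes as follows (a sketch, with $p=P(\epsilon_n=1)$ and $q=1-p$ as in the question): take $\mathcal G=\mathscr{F}_{n-1}$, $X=\epsilon_n$, $Y=Y_n=C_n/Z_{n-1}$ and $h(x,y)=\log(1+xy)$. Then $\epsilon_n$ is independent of $\mathscr{F}_{n-1}=\sigma(\epsilon_1,\dotsc,\epsilon_{n-1})$, and $Y_n$ is $\mathscr{F}_{n-1}$-measurable because $C_n$ and $Z_{n-1}$ both are, hence $$E(\log(1+\epsilon_n Y_n)\mid\mathscr{F}_{n-1})=g(Y_n),\qquad g(y)=E(\log(1+\epsilon_n y))=p\log(1+y)+q\log(1-y),$$ which is the claimed identity; the constraint $0<C_n<Z_{n-1}$ gives $0<Y_n<1$, so $\log(1\pm Y_n)$ is finite, and one should still check integrability of $\log(1+\epsilon_n Y_n)$ before applying the lemma.

If a numerical illustration helps, here is a minimal Monte Carlo sketch of the identity (not part of the argument above; the values of $p$ and $y$ are arbitrary illustrative assumptions). Conditionally on $\mathscr{F}_{n-1}$, $Y_n$ is a fixed number $y$ and only $\epsilon_n$ remains random, so the check reduces to comparing an empirical mean over $\epsilon_n$ with $g(y)$:

```python
import numpy as np

# Compare the empirical mean of log(1 + eps*y) with g(y) = p*log(1+y) + q*log(1-y),
# where eps = +1 with probability p and eps = -1 with probability q = 1 - p.
rng = np.random.default_rng(0)
p, y = 0.6, 0.3                      # illustrative values only
q = 1.0 - p
eps = rng.choice([1.0, -1.0], size=1_000_000, p=[p, q])

empirical = np.log(1.0 + eps * y).mean()
g_y = p * np.log(1.0 + y) + q * np.log(1.0 - y)
print(empirical, g_y)                # the two numbers should agree closely
```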
answered Dec 2 at 21:37 by Did (edited Dec 2 at 21:45)