$X$ and $Y$ i.i.d., $X+Y$ and $X-Y$ independent, $\mathbb{E}(X)=0$ and $\mathbb{E}(X^2)=1$. Show $X \sim N(0,1)$












$X$ and $Y$ are independent and identically distributed (i.i.d.), $X+Y$ and $X-Y$ are independent, $\mathbb{E}(X)=0$ and $\mathbb{E}(X^2)=1$. Show that $X \sim N(0,1)$.



We should use characteristic functions to prove this. Any ideas?







probability-theory normal-distribution characteristic-functions






asked Nov 7 '13 at 20:02 by user106240
edited Aug 16 '14 at 10:21 by saz












  • Sure. And yours?
    – Did
    Nov 7 '13 at 20:41










  • A good reference for someone with a lot of good ideas: Bogachev, in his book Gaussian Measures; particularly Section 1.9.
    – Tom
    Nov 7 '13 at 20:48










  • What if $X$ and $Y$ are discrete?
    – Michael Hoppe
    Nov 7 '13 at 21:56




















1 Answer
Denote by $\Phi(t) = \mathbb{E}e^{\imath \, t \cdot X}$ the characteristic function of $X$. We have



$$X = \frac{1}{2} \big((X+Y)+(X-Y) \big).$$



Thus,



$$\begin{align*} \Phi(t) &= \mathbb{E}e^{\imath \, \frac{t}{2} (X+Y)} \cdot \mathbb{E}e^{\imath \, \frac{t}{2} (X-Y)} = \left( \mathbb{E}e^{\imath \, \frac{t}{2} X} \right)^2 \cdot \mathbb{E}e^{\imath \, \frac{t}{2} Y} \cdot \mathbb{E}e^{-\imath \, \frac{t}{2} Y}, \end{align*}$$



where we used the independence of $X-Y$ and $X+Y$ as well as the independence of $X$ and $Y$. By assumption, $X \sim Y$; therefore



$$\Phi(t) = \Phi \left( \frac{t}{2} \right)^3 \cdot \Phi \left( - \frac{t}{2} \right). \tag{1}$$
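
As a quick consistency check (here $\Phi_0$ just denotes the characteristic function of $N(0,1)$): $\Phi_0(t) = e^{-t^2/2}$ does satisfy $(1)$, since

$$\Phi_0 \left( \frac{t}{2} \right)^3 \cdot \Phi_0 \left( - \frac{t}{2} \right) = e^{-3t^2/8} \cdot e^{-t^2/8} = e^{-t^2/2} = \Phi_0(t).$$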



Identity $(1)$ shows that it suffices to determine $\Phi$ on $B(0,\varepsilon)$ for some $\varepsilon>0$. Since $\Phi$ is continuous and $\Phi(0)=1$, we can choose $\varepsilon>0$ such that $\Phi(B(0,\varepsilon)) \cap \{x+\imath \, y;\, x \leq 0,\, y \in \mathbb{R}\} = \emptyset$. For $t \in B(0,\varepsilon)$ we define



$$\psi(t) := \log \Phi(t),$$

where $\log$ denotes the principal branch of the logarithm; by the choice of $\varepsilon$, the values $\Phi(t)$ stay in the open right half-plane, so $\psi$ is well defined and inherits the differentiability of $\Phi$.



Then $(1)$ reads



$$\psi(t) = 3 \psi \left( \frac{t}{2} \right) + \psi \left( - \frac{t}{2} \right). \tag{2}$$



Applying this to $-t$ yields



$$\psi(-t) = 3 \psi \left( - \frac{t}{2} \right) + \psi \left( \frac{t}{2} \right).$$



Subtracting the last two equalities we obtain



$$\delta(t) := \psi(t)-\psi(-t) = 2 \psi \left( \frac{t}{2} \right) - 2 \psi \left( - \frac{t}{2} \right) = 2 \delta \left( \frac{t}{2} \right).$$



Consequently, iterating $n$ times,



$$\frac{\delta(t)}{t} = \frac{\delta \left( \frac{t}{2^n} \right)}{\frac{t}{2^n}}. \tag{3}$$



Note that $\Phi$ is twice differentiable with $\Phi'(0) = \imath \, \mathbb{E}X = 0$ and $\Phi''(0) = -\mathbb{E}(X^2) = -1$, since $\mathbb{E}X=0$ and $\mathbb{E}(X^2)=1$. Therefore, $\delta$ and $\psi$ are also twice differentiable and we can calculate their derivatives at $t=0$ explicitly; in particular, $\delta(0)=0$, so $\delta(s)/s \to \delta'(0)$ as $s \to 0$. Letting $n \to \infty$ in $(3)$, we find



$$\frac{\delta(t)}{t} \to \delta'(0) = 2\psi'(0) = 0 \qquad \text{as} \,\, n \to \infty,$$



i.e. $\delta(t)=0$. By the definition of $\delta$ and $(2)$,



$$\psi(t) = 4 \psi \left( \frac{t}{2} \right).$$



Iterating in the same way and applying l'Hôpital's rule (using $\psi(0)=\psi'(0)=0$),



$$\frac{\psi(t)}{t^2} = \frac{\psi \left( \frac{t}{2^n} \right)}{\left( \frac{t}{2^n} \right)^2} \to \frac{1}{2} \psi''(0) = - \frac{1}{2} \qquad \text{as} \,\, n \to \infty.$$
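
For completeness, the value $\psi''(0) = -1$ used here follows from $\psi = \log \Phi$ together with $\Phi(0)=1$, $\Phi'(0)=0$ and $\Phi''(0)=-1$:

$$\psi''(0) = \frac{\Phi''(0) \, \Phi(0) - \Phi'(0)^2}{\Phi(0)^2} = \frac{(-1) \cdot 1 - 0}{1} = -1.$$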



Hence, $$\psi(t) = - \frac{t^2}{2},$$ i.e. $\Phi(t) = e^{-t^2/2}$ for all $t \in B(0,\varepsilon)$. By $(1)$, the values of $\Phi$ on $B(0,\varepsilon)$ determine its values on $B(0,2\varepsilon)$ and, inductively, on all of $\mathbb{R}$; so $\Phi(t) = e^{-t^2/2}$ everywhere, which is the characteristic function of $N(0,1)$, and therefore $X \sim N(0,1)$.
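
As a purely numerical illustration (a rough sketch assuming NumPy is available; the sample size and $t$-grid are arbitrary choices), the empirical characteristic function of standard normal samples is indeed close to $e^{-t^2/2}$:

    # Sanity check: empirical characteristic function of N(0,1) samples vs exp(-t^2/2).
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(200_000)                 # samples of X ~ N(0,1)
    t = np.linspace(-3.0, 3.0, 13)

    ecf = np.exp(1j * np.outer(t, x)).mean(axis=1)   # empirical E[exp(i t X)]
    target = np.exp(-t**2 / 2)                       # characteristic function of N(0,1)

    print(np.abs(ecf - target).max())                # small (statistical error ~ 1/sqrt(n))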



Reference:




  • Rényi, A.: Probability Theory. (Chapter VI.5, Theorem 1)






answered Nov 8 '13 at 16:53 by saz
edited Apr 19 '15 at 6:52























  • "From Φ(0)=1 we conclude by the intermediate value theorem that Φ(t)>0 for any t∈R"... Except that Φ is complex-valued, not real-valued a priori.
    – Did
    Nov 9 '13 at 1:24










  • @Did You are right. Should be fine now...
    – saz
    Nov 9 '13 at 7:41










  • I fail to see the construction of $(t_n)$. That $\Phi(t)^3\Phi(-t)$ is in $\mathbb R_-$ does not imply a priori that $\Phi(t)$ or $\Phi(-t)$ is in $\mathbb R_-$. // Note that the identity you are trying to use is $\Phi(2t)=\Phi(t)^2|\Phi(t)|^2$.
    – Did
    Nov 9 '13 at 7:50










  • @Did Yes, of course, sorry.
    – saz
    Nov 9 '13 at 9:33










  • Rrrright... +1.
    – Did
    Nov 9 '13 at 9:45










