Convexity of $f(v) = E[(\sum_i X_i v_i)^k]$ subject to $\sum_i v_i^2=1$
I'm interested in the function
$$f(v) = E\left[\Big(\sum_i X_i v_i\Big)^{2k}\right],$$
where $v$ is constrained to $\|v\|_2=1$, $k\ge1$ is an integer, and the $X_i$ are iid random variables.
For which distributions is $f$ maximized when all the weight of $v$ is concentrated on a single coordinate? That is, when $v=[1,0,\dots,0]$.
Clearly when $X_i\sim\mathcal N(0,1)$ we get $E\left[\big(\sum_i X_i v_i\big)^{2k}\right]=C_k\|v\|^{2k}_2=C_k$ for some constant $C_k$ depending on $k$. Hence for the normal distribution the 'layout' of $v$ doesn't matter. On the other hand, if $X_i$ follows an $\alpha$-stable distribution, we'd have $E\left[\big(\sum_i X_i v_i\big)^{2k}\right]=C_k\|v\|^{2k}_\alpha$, which is clearly maximized at $v=[1,0,\dots,0]$ when $\alpha<2$.
It seems to me that $f(v)$ will be convex for any distribution whose moments are at least as large as the Gaussian's. I have checked this for the exponential and chi-squared distributions as well, but I don't know how to prove it in general.
Is $f(v)$ convex for any distribution with $E\left[X^{2k}\right]\ge (2k-1)!!$ (the right-hand side being the $2k$-th moment of a standard normal distribution)?
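As a quick numerical sanity check of the conjecture, $f$ can be estimated by Monte Carlo and compared at $v=e_1$ and at the maximally spread unit vector. The sketch below is only illustrative: the helper name `f_mc`, the choice to center the exponential at mean zero, and the sample sizes are arbitrary. For the standard normal both estimates should land near $(2k-1)!!=3$ (with $k=2$), while for the centered exponential the concentrated vector should come out larger.

```python
# Monte Carlo sanity check: compare f(v) = E[(sum_i X_i v_i)^(2k)] at
# v = e_1 (all weight on one coordinate) vs. the uniformly spread unit vector.
import numpy as np

def f_mc(v, sampler, k=2, trials=200_000, seed=0):
    """Estimate E[(sum_i X_i v_i)^(2k)] for iid X_i drawn by `sampler`."""
    rng = np.random.default_rng(seed)
    X = sampler(rng, (trials, len(v)))      # rows are iid draws of (X_1, ..., X_n)
    return np.mean((X @ np.asarray(v)) ** (2 * k))

n, k = 4, 2
e1 = np.eye(n)[0]                           # v = [1, 0, ..., 0]
spread = np.ones(n) / np.sqrt(n)            # v = [1/sqrt(n), ..., 1/sqrt(n)], still ||v||_2 = 1

samplers = {
    "normal":      lambda rng, size: rng.standard_normal(size),
    "exponential": lambda rng, size: rng.exponential(1.0, size) - 1.0,   # centered to mean 0
}
for name, sampler in samplers.items():
    print(f"{name:12s} f(e1) = {f_mc(e1, sampler, k):.3f}   f(spread) = {f_mc(spread, sampler, k):.3f}")
```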
real-analysis probability statistics probability-distributions convex-analysis
asked Dec 21 '18 at 18:20 by Thomas Ahle; edited Dec 21 '18 at 19:30 by Henning Makholm
$f(v_1,\dots,v_n)$ is convex (for any random variables for which the expectations are finite) over $\mathbb{R}^n$, but not over the domain $\{v:\|v\|=1\}$. For example take the convex function $g(x_1,x_2)=(x_1+x_2)^{2k}$. Then $g(1,0)=1 < g(1/\sqrt{2},1/\sqrt{2})=\sqrt{2}^{2k}$.
– Michael, Dec 21 '18 at 18:37

In other words, convexity seems to have nothing to do with the problem. It would also help to clarify how many terms your sum is over (i.e., the dimension of $v$ is what?), and if the square is inside or outside the expectation (I assume inside). That is, does $E(\sum X_i)^2$ mean $E[(\sum X_i)^2]$ or $E[\sum X_i]^2$?
– Michael, Dec 21 '18 at 18:48

@Michael Thank you, I've added more brackets to my expectations. Regarding convexity, I do indeed mean convex over the domain. Is there a better way to state this?
– Thomas Ahle, Dec 21 '18 at 19:01

A convex function must have a convex domain. There is no function that is convex over the domain $\{v: \|v\|=1\}$.
– Michael, Dec 21 '18 at 19:02

You likely want the $X_i$ to have mean zero, since if $E[X_i]=m$ we get $f(1,0,0,\dots,0)=E[X_1^{2k}]$ and $f(1/\sqrt{n}, \dots, 1/\sqrt{n})\geq (m\sqrt{n})^{2k}$, which goes to infinity with $n$ whenever $m\neq 0$. It would also help to know whether or not $n=k$.
– Michael, Dec 21 '18 at 19:06
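The divergence claimed in the last comment is just Jensen's inequality applied to the convex map $u\mapsto u^{2k}$ (a small worked step, assuming only $E[X_i]=m$): with $S_n=\sum_{i=1}^n X_i$,
$$
f\Big(\tfrac{1}{\sqrt n},\dots,\tfrac{1}{\sqrt n}\Big)
= E\Big[\Big(\tfrac{S_n}{\sqrt n}\Big)^{2k}\Big]
\;\ge\; \Big(E\Big[\tfrac{S_n}{\sqrt n}\Big]\Big)^{2k}
= \big(m\sqrt n\big)^{2k},
$$
which grows without bound in $n$ whenever $m\neq 0$.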
1 Answer
In fact, we can show that
$$
g(v) = E\left[\Big(\sum_{i} v_i X_i\Big)^{2k}\right]^{\frac{1}{2k}}
$$
is subadditive and positively homogeneous of degree $1$ over $v\in\mathbb{R}^n$, which implies convexity. Subadditivity follows from Minkowski's inequality, which gives
$$\begin{eqnarray}
g(v+w) &=& E\left[\Big(\sum_{i} v_i X_i +\sum_{i} w_i X_i\Big)^{2k}\right]^{\frac{1}{2k}}\\
&\leq& E\left[\Big(\sum_{i} v_i X_i\Big)^{2k}\right]^{\frac{1}{2k}} + E\left[\Big(\sum_{i} w_i X_i\Big)^{2k}\right]^{\frac{1}{2k}}\\
&=& g(v) + g(w).
\end{eqnarray}$$
Homogeneity is immediate:
$$
g(tv) = E\left[\Big(\sum_{i} t v_i X_i\Big)^{2k}\right]^{\frac{1}{2k}} = t\,E\left[\Big(\sum_{i} v_i X_i\Big)^{2k}\right]^{\frac{1}{2k}} = t\,g(v),\quad\forall t\geq 0.
$$
This proves
$$
g(tv + (1-t)w) \leq g(tv) + g((1-t)w) = t\,g(v) + (1-t)\,g(w),
$$
for all $t\in [0,1]$ and $v,w\in\mathbb{R}^n$, so $g$ is convex. Since $g$ is a non-negative convex function and the map $t\in [0,\infty)\mapsto t^{2k}$ is non-decreasing and convex, $f(v) = g(v)^{2k}$ is also convex.

answered Dec 21 '18 at 19:28 by Song
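A numerical spot-check of this convexity inequality for $f$ over $\mathbb{R}^n$ (no unit-norm constraint). The distribution, dimension, and sample size below are arbitrary illustrative choices; reusing one shared sample makes the sample-average version of the inequality hold exactly, by convexity of $u\mapsto u^{2k}$ applied row by row.

```python
# Spot-check of f(t v + (1-t) w) <= t f(v) + (1-t) f(w) for arbitrary v, w in R^n.
import numpy as np

rng = np.random.default_rng(1)
n, k, trials = 5, 3, 400_000
X = rng.exponential(1.0, (trials, n)) - 1.0      # one fixed iid sample, centered Exp(1)

def f_mc(v):
    """Monte Carlo estimate of E[(sum_i X_i v_i)^(2k)] using the shared sample X."""
    return np.mean((X @ v) ** (2 * k))

v, w = rng.normal(size=n), rng.normal(size=n)    # arbitrary test vectors
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    lhs = f_mc(t * v + (1 - t) * w)
    rhs = t * f_mc(v) + (1 - t) * f_mc(w)
    print(f"t={t:.2f}  f(tv+(1-t)w) = {lhs:10.3f}  <=  t f(v)+(1-t) f(w) = {rhs:10.3f}")
```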
Convexity came without any requirement on $X$! I guess @Michael was right that convexity is not really the property I need, since my variables live on the sphere. I really appreciate the answer, but do you have any ideas how to show that the sum is maximized when $v=[1, 0,\dots]$ as in the question?
– Thomas Ahle, Dec 22 '18 at 12:22