Model Selection (k-piece-constant function)
A $k$-piece-constant function is defined by $k-1$ thresholds $-100<t_1<t_2<\cdots<t_{k-1}<100$ and $k$ values $a_1,a_2,\ldots,a_k$.
The function is defined as follows:
If $x<t_1$ then $f(x)=a_1$.
If $t_1<x<t_2$ then $f(x)=a_2$.
If $t_2<x<t_3$ then $f(x)=a_3$.
$\vdots$
If $t_{i-1}<x<t_i$ then $f(x)=a_i$.
$\vdots$
If $t_{k-1}<x$ then $f(x)=a_k$.
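For concreteness, evaluating such a function just means locating which of the $k$ intervals $x$ falls into. Here is a minimal sketch in Python/NumPy; the thresholds and values in the example are hypothetical, and the behaviour exactly at a threshold is left arbitrary, since the definition above only uses strict inequalities:

```python
import numpy as np

def piecewise_constant(thresholds, values, x):
    """Evaluate a k-piece-constant function.

    thresholds: sorted array of length k-1; values: array of length k.
    """
    # searchsorted gives 0 if x < t_1, i if t_i < x < t_{i+1}, and k-1 if x > t_{k-1}
    idx = np.searchsorted(thresholds, x)
    return np.asarray(values)[idx]

# hypothetical example: a 3-piece-constant function
t = np.array([-20.0, 50.0])      # k-1 = 2 thresholds
a = np.array([1.0, -2.0, 4.0])   # k = 3 values
print(piecewise_constant(t, a, np.array([-50.0, 0.0, 80.0])))  # one value per piece
```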
Let $f$ be a $k$-piece-constant function. Suppose you are given $n$ data points $(x_1,y_1),(x_2,y_2),\ldots,(x_n,y_n)$, each of which is generated in the following way:
1. First, $x$ is drawn according to the uniform distribution over the range $[-100,100]$.
2. Second, $y$ is chosen to be $f(x)+\omega$, where $\omega$ is drawn according to the normal distribution $N(\mu,\sigma^2)$.
You partition the data into a training set and a test set of equal sizes. For each $j=1,2,\ldots$ you find the $j$-piece-constant function $g_j$ that minimizes the root-mean-square error (RMSE) on the training set. Denote by $train(j)$ the RMSE on the training set and by $test(j)$ the RMSE on the test set.
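This whole protocol can be simulated. Below is a minimal sketch, assuming Python/NumPy; the true thresholds, values, $n$, $\mu=0$, and $\sigma$ are hypothetical choices, and the fit uses an exact dynamic-programming segmentation over the sorted training points, which is one valid way to minimize the training RMSE (minimizing RMSE is equivalent to minimizing the sum of squared errors):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(f_thresholds, f_values, n, mu=0.0, sigma=1.0):
    """Steps 1-2 above: x ~ Uniform[-100, 100], y = f(x) + omega, omega ~ N(mu, sigma^2)."""
    x = rng.uniform(-100.0, 100.0, size=n)
    y = f_values[np.searchsorted(f_thresholds, x)] + rng.normal(mu, sigma, size=n)
    return x, y

def fit_j_piece(x, y, j):
    """Exact least-squares j-piece-constant fit via dynamic programming."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    n = len(xs)
    c1 = np.concatenate(([0.0], np.cumsum(ys)))        # prefix sums of y
    c2 = np.concatenate(([0.0], np.cumsum(ys ** 2)))   # prefix sums of y^2

    def sse(a, b):
        # SSE of the best single constant (the mean) on sorted points a..b-1
        s, s2, m = c1[b] - c1[a], c2[b] - c2[a], b - a
        return s2 - s * s / m

    INF = float("inf")
    dp = np.full((j + 1, n + 1), INF)   # dp[m, i]: best SSE using m pieces on the first i points
    arg = np.zeros((j + 1, n + 1), dtype=int)
    dp[0, 0] = 0.0
    for m in range(1, j + 1):
        for i in range(m, n + 1):
            for s in range(m - 1, i):   # s = left end of the m-th segment
                cand = dp[m - 1, s] + sse(s, i)
                if cand < dp[m, i]:
                    dp[m, i], arg[m, i] = cand, s

    # backtrack the segment boundaries, then turn them into thresholds and values
    cuts, i = [], n
    for m in range(j, 0, -1):
        cuts.append(i)
        i = arg[m, i]
    cuts = [0] + cuts[::-1]             # [0, end_1, ..., end_{j-1}, n]
    values = np.array([ys[a:b].mean() for a, b in zip(cuts[:-1], cuts[1:])])
    thresholds = np.array([(xs[b - 1] + xs[b]) / 2 for b in cuts[1:-1]])
    return thresholds, values

def rmse(thresholds, values, x, y):
    pred = values[np.searchsorted(thresholds, x)]
    return np.sqrt(np.mean((pred - y) ** 2))

# hypothetical true f with k = 4 pieces
true_t = np.array([-50.0, 0.0, 60.0])
true_a = np.array([3.0, -1.0, 2.0, 5.0])
x, y = make_data(true_t, true_a, n=200, mu=0.0, sigma=1.0)
x_tr, y_tr, x_te, y_te = x[:100], y[:100], x[100:], y[100:]

for j in range(1, 11):
    t_hat, a_hat = fit_j_piece(x_tr, y_tr, j)
    print(f"j={j:2d}  train={rmse(t_hat, a_hat, x_tr, y_tr):.3f}  "
          f"test={rmse(t_hat, a_hat, x_te, y_te):.3f}")
```

The triple loop makes the fit $O(jn^2)$, which is fine at this scale; the point of the sketch is only to make the protocol in the question concrete, so that $train(j)$ and $test(j)$ can be inspected for a range of $j$.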
Which of the following statements is correct?
1. $train(j)$ is a monotonically non-increasing function.
2. $test(j)$ is a monotonically non-increasing function.
3. $test(j)$ has a minimum close to $j=k$.
4. $train(j)$ has a minimum close to $j=k$.
5. If $j>n/2$ then $train(j)=0$.
I have absolutely no clue about this question. I somewhat understand the $k$-piece-constant function (which is new to me), but I am still unable to crack it.
P.S. I do understand the idea behind training and test sets.
Tags: statistics, regression
asked Dec 14 '18 at 12:13 by Kriti Arora