Finding the UMVUE for the uniform distribution $U(\alpha, \beta)$


























Let $X = (X_1, X_2, \ldots, X_n)$ be a sample from the uniform distribution $U(\alpha, \beta)$, with $\alpha, \beta \in \mathbb{R}$, $\alpha < \beta$. I am to find UMVUEs for the parameters $\alpha$ and $\beta$.



Using the factorization theorem I showed that $T(X) = (\min\{ X_1, X_2, \ldots, X_n\}, \max\{ X_1, X_2, \ldots, X_n \})$ is a sufficient statistic.



I think that I should use the Lehmann–Scheffé theorem. Am I to solve this problem by fixing $\beta$ and computing the UMVUE for $\alpha$, and then vice versa?



I tried to find an unbiased estimator of $\alpha$ first. Treating $\beta$ as fixed, I managed to calculate that
$$\mathbb{E}(2X_1 - \beta) = \alpha.$$
Thus, by the Lehmann–Scheffé theorem, the UMVUE would be
$$\mathbb{E}(2X_1 - \beta \mid T(X)).$$
How can I find the conditional expected value when $T(X)$ is a vector?



On the other hand, I fixed $\beta$ in order to find my unbiased estimator, so should I instead calculate $\mathbb{E}(2X_1 - \beta \mid X_{(1)})$?



I am a bit confused. I would appreciate any hints or tips.
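For what it's worth, the unbiasedness claim $\mathbb{E}(2X_1 - \beta) = \alpha$ is easy to sanity-check by simulation. A throwaway sketch using NumPy, with arbitrary example values for $\alpha$ and $\beta$:

```python
import numpy as np

# Check numerically that E(2*X - beta) = alpha for X ~ U(alpha, beta).
# The values alpha = 2, beta = 5 are arbitrary, chosen only for illustration.
rng = np.random.default_rng(0)
a, b = 2.0, 5.0
x = rng.uniform(a, b, size=1_000_000)
est = 2 * x - b          # the proposed unbiased estimator of alpha
print(est.mean())        # should be close to 2.0
```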






































      statistics estimation
asked Jan 13 at 18:17 by Hendrra (edited Jan 13 at 18:25)

          1 Answer

$T=(X_{(1)},X_{(n)})$ is not only sufficient but a complete sufficient statistic, which is what is needed here, $X_{(k)}$ being the $k$th $(1\le k\le n)$ order statistic.

Since the $X_i$'s are i.i.d. $U(a,b)$ variables, the $Y_i=\frac{X_i-a}{b-a}$ are i.i.d. $U(0,1)$ variables, $1\le i\le n$.

Now it is well known that $Y_{(1)}\sim \text{Beta}(1,n)$ and $Y_{(n)}\sim\text{Beta}(n,1)$, implying $E(Y_{(1)})=\frac{1}{n+1}$ and $E(Y_{(n)})=\frac{n}{n+1}$. So all you have to do is solve for $a$ and $b$ from the equations

$$E(X_{(1)})=\frac{b-a}{n+1}+a, \qquad E(X_{(n)})=\frac{(b-a)n}{n+1}+a.$$

You would get unbiased estimators of $a$ and $b$ as functions of $T$, and those will be the corresponding UMVUEs by the Lehmann–Scheffé theorem.
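Solving that pair of linear equations for $a$ and $b$ yields the estimators $\hat a = \frac{nX_{(1)} - X_{(n)}}{n-1}$ and $\hat b = \frac{nX_{(n)} - X_{(1)}}{n-1}$, and a quick Monte Carlo run can sanity-check their unbiasedness. A sketch using NumPy (the sample size and parameter values are arbitrary illustrations):

```python
import numpy as np

# Monte Carlo check that the estimators obtained by solving the two
# moment equations,
#   a_hat = (n*X_(1) - X_(n)) / (n - 1)
#   b_hat = (n*X_(n) - X_(1)) / (n - 1)
# are unbiased for a and b when X_1, ..., X_n ~ U(a, b) i.i.d.
rng = np.random.default_rng(1)
a, b, n, reps = 2.0, 5.0, 10, 200_000
x = rng.uniform(a, b, size=(reps, n))
lo, hi = x.min(axis=1), x.max(axis=1)     # X_(1) and X_(n) for each sample
a_hat = (n * lo - hi) / (n - 1)
b_hat = (n * hi - lo) / (n - 1)
print(a_hat.mean(), b_hat.mean())         # should be close to (2.0, 5.0)
```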






answered Jan 13 at 19:08 by StubbornAtom
• Hendrra (Jan 13 at 20:19): Does that mean that my UMVUE will be a vector too?

• StubbornAtom (Jan 13 at 20:21): @Hendrra No, the UMVUEs are linear combinations of $X_{(1)}$ and $X_{(n)}$.

• Hendrra (Jan 13 at 20:24): One for $b$ and one for $a$?

• StubbornAtom (Jan 13 at 20:46): @Hendrra If you have found an unbiased estimator of your parameter based on a complete sufficient statistic, then it is bound to be the UMVUE, and it equals the conditional expectation (the latter is sometimes difficult to compute). This follows from Lehmann–Scheffé.

• StubbornAtom (Jan 13 at 21:01): @Hendrra The linear combination part is irrelevant (it's only here that the UMVUEs are linear combinations, not in general).