Expected number of record highs in an iid sequence of discrete uniform random variables


























Let $X_1, X_2, X_3, \dots$ be an endless sequence of random variables,
each uniformly distributed on the set $\{1,2,3,\dots,10\}$.
Index $i$ will be called a "King" if $X_i$ is greater than all of the numbers before it in the sequence.
Calculate the expected value of the number of indices that are crowned Kings.

Please help.










probability random-variables expected-value

asked Dec 8 '18 at 12:05 by user3184910; edited Dec 10 '18 at 1:12 by r.e.s.




















• What have you tried so far? – Aditya Dua, Dec 8 '18 at 22:29

• This question is missing context or other details: Please provide additional context, which ideally explains why the question is relevant to you and our community. Some forms of context include: background and motivation, relevant definitions, source, possible strategies, your current progress, why the question is interesting or important, etc. – Did, Dec 9 '18 at 8:53
















3 Answers




















A term whose index is what you call a "king" is ordinarily called a record, so you are asking for the expected number of records in an iid sequence of Uniform$\{1,\dots,m\}$ random variables, e.g. with $m=10$. (This assumes the terms are mutually independent.)

Shorter derivation

The probability that the $k$th distinct term is a record is ${1\over k}$ (among the first $k$ distinct values, the order of first appearance is uniformly random, so the largest of them is equally likely to occupy any of the $k$ positions), and the number of records in the infinite sequence can be written as the sum of indicators
$$\sum_{k=1}^m \mathbb{1}_{\text{$k$th distinct term is a record}}$$

so the expected number of records in the infinite sequence is

$$\begin{align}
E\left(\sum_{k=1}^m \mathbb{1}_{\text{$k$th distinct term is a record}}\right)
&=\sum_{k=1}^m E\left(\mathbb{1}_{\text{$k$th distinct term is a record}}\right)\\[2ex]
&=\sum_{k=1}^m{1\over k}\\[2ex]
&=H_m
\end{align}$$

where $H_m=\sum_{k=1}^m{1\over k}$ is the $m$th harmonic number. (E.g., $H_{10}={7381\over 2520}$.)
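
As a quick numerical check, here is a minimal Python 3 sketch (a rough illustration using only the standard library; the helper count_records is mine) that computes $H_{10}$ exactly and estimates the expected record count by simulation. Since no record can occur after the value $10$ first appears, each simulated run can stop there.

    import random
    from fractions import Fraction

    m = 10

    # Exact value of H_m = 1 + 1/2 + ... + 1/m
    H_m = sum(Fraction(1, k) for k in range(1, m + 1))
    print(H_m, float(H_m))  # 7381/2520 ≈ 2.92897

    # Monte Carlo estimate: draw until the maximum value m appears; after that,
    # no further record is possible, so each run yields an exact record count.
    def count_records(m, rng):
        best = 0
        records = 0
        while best < m:
            x = rng.randint(1, m)
            if x > best:
                best = x
                records += 1
        return records

    rng = random.Random(0)
    trials = 100_000
    print(sum(count_records(m, rng) for _ in range(trials)) / trials)  # ≈ 2.93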



Longer derivation

The probability that the $i$th term is a record is
$$\begin{align}
P_i&=\sum_{k=1}^m P\big((X_i=k)\cap(X_{j}< k \text{ for all } j<i)\big)\\[2ex]
&=\sum_{k=1}^m{1\over m}\left({k-1\over m}\right)^{i-1}
\end{align}$$

and the number of records in the infinite sequence can be written as the sum of indicators
$$\sum_{i=1}^\infty \mathbb{1}_{\text{$X_i$ is a record}}$$

so the expected number of records in the infinite sequence is

$$\begin{align}
E\left(\sum_{i=1}^\infty \mathbb{1}_{\text{$X_i$ is a record}}\right)&=\sum_{i=1}^\infty E\left(\mathbb{1}_{\text{$X_i$ is a record}}\right)\\[2ex]
&=\sum_{i=1}^\infty P_i\\[2ex]
&=\sum_{i=1}^\infty\sum_{k=1}^m{1\over m}\left({k-1\over m}\right)^{i-1}\\[2ex]
&={1\over m}\sum_{i=1}^\infty\sum_{k=1}^m\left({k-1\over m}\right)^{i-1}\\[2ex]
&={1\over m}\sum_{k=1}^m\sum_{i=1}^\infty\left({k-1\over m}\right)^{i-1}\\[2ex]
&={1\over m}\sum_{k=1}^m{m\over m-k+1}\\[2ex]
&={1\over m}\left(m\cdot \sum_{j=1}^m{1\over j}\right)\\[2ex]
&=H_m.
\end{align}$$
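
The geometric decay of $P_i$ makes the infinite sum easy to check numerically; here is a minimal Python sketch (the cutoff at $i=200$ is an arbitrary choice of mine, large enough that the tail is negligible) comparing the truncated sum against $H_m$.

    m = 10

    # P_i = sum_{k=1}^m (1/m) * ((k-1)/m)^(i-1), the probability that term i is a record
    def P(i, m):
        return sum((1 / m) * ((k - 1) / m) ** (i - 1) for k in range(1, m + 1))

    H_m = sum(1 / k for k in range(1, m + 1))
    partial = sum(P(i, m) for i in range(1, 201))  # truncate the infinite sum at i = 200
    print(partial, H_m)  # both ≈ 2.9289682539...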






answered Dec 10 '18 at 1:04 by r.e.s.; edited Dec 12 '18 at 16:07













































This problem can be modeled using an absorbing Markov chain. Let $\{Y_n:n\geqslant 1\}$ track the successive record values, with the top value $10$ made absorbing: $Y_1$ is uniformly distributed over $\{1,\ldots,10\}$ and, for $n\geqslant 1$,
$$
\mathbb P(Y_{n+1}=j\mid Y_n=i)=\begin{cases}
\frac1{10-i},& i<j\leqslant 10\\
1,& i=j=10.
\end{cases}
$$

Let $P$ be the transition matrix of this Markov chain; then $$P=\pmatrix{Q&R\\0&I}$$ where $Q$ is the substochastic matrix corresponding to transitions between transient states, $R$ the one corresponding to transitions from a transient state to an absorbing state, and $I$ the one corresponding to transitions between absorbing states. (Here we have only one absorbing state, namely $10$.) For a pair of transient states $i,j$, the probability of transitioning from $i$ to $j$ in exactly $k$ steps is the $(i,j)$ entry of $Q^k$. Summing over all $k\geqslant 0$ yields the fundamental matrix
$$
N = \sum_{k=0}^\infty Q^k.
$$

Since the rows of $Q$ sum to strictly less than one, the Neumann series converges and we have $N=(I-Q)^{-1}$, with $I$ the identity matrix. The expected number of transitions until absorption, starting from transient state $i$, is the $i^{\mathsf{th}}$ entry of the vector $t=N\cdot\mathbf 1$, where $\mathbf 1$ is a column vector whose entries are all one. Since the starting value $X_1=i$ is itself a record and each transition produces exactly one new record, the expected number of records given $X_1=i<10$ is $1+t_i$. Here
$$
\mathbf 1+t=\left(
\begin{array}{c}
\frac{9649}{2520} \\
\frac{1041}{280} \\
\frac{503}{140} \\
\frac{69}{20} \\
\frac{197}{60} \\
\frac{37}{12} \\
\frac{17}{6} \\
\frac{5}{2} \\
2 \\
\end{array}
\right),
$$

and since the initial distribution is uniform over $\{1,\ldots,10\}$, we weight each of these entries, along with $1$ (for the case $X_1=10$), by $\frac1{10}$ and sum to obtain $$\frac{7381}{2520}. $$
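
For anyone who wants to reproduce the matrix computation, here is a minimal numpy sketch (floating point rather than exact fractions, and the variable names are mine); it builds $Q$ for $m=10$, forms $N=(I-Q)^{-1}$, and recovers $7381/2520\approx 2.9290$.

    import numpy as np

    m = 10

    # Transient states are 1..m-1; Q[i-1, j-1] = 1/(m-i) for i < j <= m-1
    Q = np.zeros((m - 1, m - 1))
    for i in range(1, m):              # current transient state i
        for j in range(i + 1, m):      # transient target j (state m itself is absorbing)
            Q[i - 1, j - 1] = 1.0 / (m - i)

    N = np.linalg.inv(np.eye(m - 1) - Q)   # fundamental matrix N = (I - Q)^{-1}
    t = N @ np.ones(m - 1)                 # expected transitions until absorption, from state i

    # Expected number of records given X_1 = i is 1 + t[i-1] for i < m, and 1 for X_1 = m.
    conditional = np.append(1 + t, 1.0)
    print(conditional)          # ≈ [3.829, 3.718, 3.593, 3.45, 3.283, 3.083, 2.833, 2.5, 2.0, 1.0]
    print(conditional.mean())   # ≈ 2.928968... = 7381/2520

Exact rational arithmetic (e.g. with sympy or fractions) would reproduce the fractions listed above; floating point is used here only to keep the sketch short.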






answered Dec 8 '18 at 23:44 by Math1000

















• All these computations can be essentially bypassed and replaced by a direct proof that (and explanation why) the result is the $10$th harmonic number $$H_{10}=\sum_{k=1}^{10}\frac1k$$ – Did, Dec 9 '18 at 9:12

• @Did can you explain why? – Adddison, Dec 9 '18 at 16:00

• @Adddison Not on the page of a question so cruelly lacking personal input, sorry. – Did, Dec 9 '18 at 16:15

• @Did That is certainly a more elegant approach, though indeed I do not know how to show it. – Math1000, Dec 9 '18 at 16:36

































Given an infinite sequence of uniform draws from $\{1,\dots,10\}$, since we are only counting records, nothing changes if we delete every draw that equals a previous draw -- the number of records stays the same.

The edited sequence is finite and, with probability $1$, is a linear ordering of $\{1,\dots,10\}$ (since the probability is $0$ that any value is omitted from the original infinite sequence). Because the draws are iid uniform, all $10!$ such orderings are equally likely.

Looking at a random ordering of $\{1,\dots,10\}$, the expected number of (left-to-right) records in location $j$ is $1/j$, since among the first $j$ entries each one is equally likely to be the maximum of those $j$.

By linearity of expectation, the expected number of records in the original sequence is $1 + 1/2 + 1/3 + \dots + 1/10$.
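
This reduction is easy to check by simulation; a minimal Python sketch (the helper left_to_right_maxima is just for illustration) shuffles $\{1,\dots,10\}$ and counts left-to-right maxima, and the average is again close to $H_{10}$.

    import random

    def left_to_right_maxima(perm):
        # Count positions that beat everything before them (the records).
        best, count = 0, 0
        for x in perm:
            if x > best:
                best, count = x, count + 1
        return count

    rng = random.Random(1)
    values = list(range(1, 11))
    trials = 100_000
    total = 0
    for _ in range(trials):
        rng.shuffle(values)
        total += left_to_right_maxima(values)
    print(total / trials)  # close to H_10 = 1 + 1/2 + ... + 1/10 ≈ 2.929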






answered Dec 10 '18 at 18:11 by Ned; edited Dec 12 '18 at 15:17























    • "the second round will be bigger than the first $1/2$ the time" -- No, $P(X_2>X_1)=45/100,$ $P(X_2<X_1)=45/100,$ and $P(X_2=X_1)=10/100. $ Anyway, the question is about the expected number of records in the whole *infinite* sequence, not just in its first $10$ terms.
      – r.e.s.
      Dec 12 '18 at 2:33










    • Sorry, I didn't read it accurately, I thought they were continuous RV's and 10 was the number of trials.
      – Ned
      Dec 12 '18 at 11:11










    • (+1) for your latest version. Not realizing you'd made this edit, I added a similar version to my answer.
      – r.e.s.
      Dec 12 '18 at 16:12










    • @r.e.s. The funny thing is, this isn't the first time I misread a problem in a way which had the same answer as the actual problem. Having seen the identical answers, of course I did not go back to be sure I'd read it right.... LOL... thanks for pointing it out, I appreciate it.
      – Ned
      Dec 12 '18 at 18:57










