Defining Random Variable, Expected Value for Number of Fixed Points given a permutation











Let $n \in \mathbb N$, and let $\mathcal{K}$ be the set of all permutations of the set $\{1,\dots,n\}$. Let $\sigma \in \mathcal{K}$ be such that $\sigma: [n] \to [n]$ is a randomly selected permutation.



Define/Find: an appropriate probability space, and define the random variable $X$ as the number of fixed points (note: $i$ is a fixed point under $\sigma$ if $\sigma(i) = i$ for $i \in [n]$). Moreover, find $\mathbb E[X]$.



My ideas:
$\Omega:=\{(1,\sigma(1))\times\dots\times(n,\sigma(n))\in (\{1,\dots,n\},\{1,\dots,n\})^{n}:\sigma \in \mathcal{K}\}$ (not sure in this case; any alternatives?).



I am having difficulty defining the random variable $X$ as a single "function". As multiple functions, I would take $X=\sum_{i=1}^{n}X_{i}$, where
$$X_{i}(\omega)=\begin{cases}
1, &\text{if } i=\sigma(i)\\
0, &\text{otherwise.}
\end{cases}$$



Is there any way of defining $X$ as just one "function"?



Now onto my greatest worry, the expected value $\mathbb{E}[X]$:



Initially I thought the probability measure $\mathbb{P}$ would need to be a binomial distribution, given that we are counting "successes" across $n$ particular instances. Then I thought about the parameter $p$ and realized it would not be constant, which changed my thinking. (Does this mean $X \sim \mathrm{Binom}(n,p)$ is only valid if the $(X_{i})_{i=1,\dots,n}$ are i.i.d.?)



With this in mind, we know $\mathbb{E}[X]=\sum_{i=1}^{n}\mathbb{E}[X_{i}]$. Individually, this means $\mathbb{E}[X_{1}]=0\times \frac{n-1}{n}+1\times\frac{1}{n}$. But then for $X_{2},\dots,X_{n}$, surely the value would differ greatly depending on what "happened" with the previous random variables? How can I calculate this expected value?
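(Editorial aside, not part of the original question: the marginal probabilities worried about here can be checked by brute force. For a uniformly random permutation of a small set, $P(\sigma(i)=i)$ can be computed for every position $i$ by enumerating all permutations; each comes out to exactly $1/n$.)

```python
from itertools import permutations

n = 4
perms = list(permutations(range(n)))  # all n! permutations of {0, ..., n-1}

# P(sigma(i) = i) for each position i: fraction of permutations fixing i.
probs = [sum(p[i] == i for p in perms) / len(perms) for i in range(n)]
print(probs)  # every entry equals 1/n = 0.25
```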










Tags: real-analysis, probability, stochastic-calculus, expected-value






asked Nov 17 at 10:40 by SABOY
1 Answer (accepted)










You can take $\Omega=\mathcal K$ equipped with the $\sigma$-algebra $\wp(\mathcal K)$, where all outcomes are equiprobable, so that $P(\{\sigma\})=\frac{1}{n!}$ for every $\sigma\in\mathcal K$.

Then $X:\Omega=\mathcal K\to\mathbb R$ is prescribed by $\sigma\mapsto|\{i\in[n]\mid i=\sigma(i)\}|$.

$X$ does not have a binomial distribution (e.g. note that $P(X=n-1)=0$).

To find $\mathbb E[X]$ you can use linearity of expectation, and your setup for this (introducing the $X_i$) is fine. The $X_i$ are not independent, but that does not matter: independence is not needed for linearity of expectation. The $X_i$ all have the same distribution, which allows us to exploit symmetry as well.

Then we find:$$\mathbb E[X]=\sum_{i=1}^n\mathbb E[X_i]=n\,\mathbb E[X_1]=n\,P(\sigma(1)=1)=n\cdot\frac1n=1$$
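(Editorial aside, not part of the original answer: the result can be verified by brute force for small $n$. Averaging the number of fixed points over all $n!$ permutations gives exactly $1$ for every $n$.)

```python
from itertools import permutations

def expected_fixed_points(n):
    """Average number of fixed points over all n! permutations of {0, ..., n-1}."""
    perms = list(permutations(range(n)))
    total = sum(sum(p[i] == i for i in range(n)) for p in perms)
    return total / len(perms)

for n in range(1, 7):
    print(n, expected_fixed_points(n))  # 1.0 for every n
```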





























• I am a bit confused. If, as you state, the $X_{i}$ are not iid, then why is $\sum_{i=1}^{n}\mathbb{E}[X_{i}]=n\,\mathbb{E}[X_{1}]$? For all we know, $X_{1}$ could be defined differently from $X_{2}$.
  – SABOY
  Nov 17 at 11:06










• Do you agree that the probability of the event $\sigma(1)=1$ is the same as the probability of the event $\sigma(2)=2$? If not, then why not?
  – drhab
  Nov 17 at 11:10










          • Only if they were iid
            – SABOY
            Nov 17 at 11:12










• Correct is: "only if their indicator functions have the same distribution". And that is quite possible even when they are not independent. Strong example: if $X=Y$ then $X$ and $Y$ are not independent (so not iid), but they have the same distribution and consequently $\mathbb E[X]=\mathbb E[Y]$.
  – drhab
  Nov 17 at 11:14








• E.g. take the special case $n=2$. Then there are only two possibilities: $X_1=0=X_2$ and $X_1=1=X_2$, so $X_1=X_2$: not independent, but equally distributed.
  – drhab
  Nov 17 at 11:19
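(Editorial aside: the $n=2$ special case from the last comment can be checked directly. Both permutations of a two-element set yield $X_1 = X_2$.)

```python
from itertools import permutations

# For each permutation of {0, 1}, record the indicator pair (X_1, X_2).
pairs = [tuple(int(p[i] == i) for i in range(2)) for p in permutations(range(2))]
print(pairs)  # [(1, 1), (0, 0)]: X_1 always equals X_2
```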













answered Nov 17 at 10:57 by drhab (edited Nov 17 at 11:02)











