Defining a random variable and finding the expected value for the number of fixed points of a random permutation
Let $n \in \mathbb N$, and let $\mathcal{K}$ be the set of all permutations of the set $\{1,\dots,n\}$. Let $\sigma \in \mathcal{K}$, so that $\sigma: [n] \to [n]$ is a randomly selected permutation.
Define/Find: the appropriate probability space, define the random variable $X$ as the number of fixed points (note: $i$ is a fixed point under $\sigma$ if $\sigma(i) = i$ for $i \in [n]$), and find $\mathbb E[X]$.
My ideas:
$\Omega:=\{\big((1,\sigma(1)),\dots,(n,\sigma(n))\big)\in (\{1,\dots,n\}\times\{1,\dots,n\})^{n}:\sigma \in \mathcal{K}\}$ (not sure in this case; any alternatives?).
I am having difficulty defining the random variable $X$ as one "function". As a sum of several functions I would take $X=\sum_{i=1}^{n}X_{i}$, where
$$X_{i}(\omega)=\begin{cases} 1, &\text{if } \sigma(i)=i,\\ 0, &\text{otherwise.} \end{cases}$$
Is there any way of defining $X$ as simply one "function"?
Now onto my greatest worry, the expected value $\mathbb{E}[X]$:
Initially I thought the probability measure $\mathbb{P}$ needed to be a binomial distribution, given that we are looking at a number of "successes" within $n$ particular instances. Then I thought about the parameter $p$ and realized it would not be constant, which changed my thinking. (Does this mean $X\sim\mathrm{Binom}(n,p)$ is only valid if $(X_{i})_{i=1,\dots,n}$ are i.i.d.?)
With this in mind, we know $\mathbb{E}[X]=\sum_{i=1}^{n}\mathbb{E}[X_{i}]$. Individually, this means $\mathbb{E}[X_{1}]=0\times \frac{n-1}{n}+1\times\frac{1}{n}$. But then for $X_{2},\dots,X_{n}$, surely it would differ greatly depending on what "happened" with the previous random variables? How can I calculate this expected value?
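One way I can at least test the first step numerically is to enumerate all permutations for a small $n$ and count how often each position is fixed (a quick Python sketch by enumeration; the variable names are my own):

```python
from itertools import permutations

n = 4
perms = list(permutations(range(1, n + 1)))  # all n! permutations of [n]

# Exact P(sigma(i) = i) for each position i, by counting.
# Each position is fixed by exactly (n-1)! of the n! permutations,
# so every probability printed equals 1/n = 0.25 here.
for i in range(1, n + 1):
    p = sum(1 for perm in perms if perm[i - 1] == i) / len(perms)
    print(i, p)
```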
real-analysis probability stochastic-calculus expected-value
asked Nov 17 at 10:40
SABOY
1 Answer
You can take $\Omega=\mathcal K$ equipped with the $\sigma$-algebra $\wp(\mathcal K)$ and where all outcomes are equiprobable, so that $P(\{\sigma\})=\frac{1}{n!}$ for every $\sigma\in\mathcal K$.
Then $X:\Omega=\mathcal K\to\mathbb R$ is prescribed by $\sigma\mapsto|\{i\in[n]\mid i=\sigma(i)\}|$.
$X$ has no binomial distribution (e.g. note that $P(X=n-1)=0$).
To find $\mathbb E[X]$ you can use linearity of expectation, and your setup for this (introducing $X_i$) is okay. The $X_i$ are not independent, but that does not matter. Independence is not needed for linearity of expectation. The $X_i$ all have the same distribution, allowing us to make use of symmetry as well.
Then we find: $$\mathbb E[X]=\sum_{i=1}^n\mathbb E[X_i]=n\,\mathbb E[X_1]=n\,P(\sigma(1)=1)=n\cdot\frac{1}{n}=1.$$
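As a sanity check on this derivation, $\mathbb E[X]$ can be computed exactly for small $n$ by averaging the number of fixed points over all $n!$ permutations (a minimal Python sketch; the function name is my own):

```python
from itertools import permutations

def expected_fixed_points(n):
    """Exact E[X]: average number of fixed points over all n! permutations."""
    perms = list(permutations(range(1, n + 1)))
    total = sum(sum(1 for i, v in enumerate(p, start=1) if v == i)
                for p in perms)
    return total / len(perms)

# E[X] = 1 for every n >= 1, matching the linearity argument:
# the total count of fixed points over all permutations is n * (n-1)! = n!.
for n in range(1, 7):
    print(n, expected_fixed_points(n))
```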
I am a bit confused. If, as you state, the $X_{i}$ are not i.i.d., then why is $\sum_{i=1}^{n}\mathbb{E}[X_{i}]=n\,\mathbb{E}[X_{1}]$? For all we know, $X_{1}$ could be distributed differently from $X_{2}$.
– SABOY
Nov 17 at 11:06
Do you agree that the probability of the event $\sigma(1)=1$ is the same as the probability of the event $\sigma(2)=2$? If not, then why not?
– drhab
Nov 17 at 11:10
Only if they were i.i.d.
– SABOY
Nov 17 at 11:12
Correct is: "only if their indicator functions have the same distribution". And that is quite possible even if they are not independent. Strong example: if $X=Y$ then $X$ and $Y$ are not independent (so not i.i.d.), but they have the same distribution and consequently $\mathbb E[X]=\mathbb E[Y]$.
– drhab
Nov 17 at 11:14
E.g. take the special case $n=2$. Then there are two possibilities: $X_1=0=X_2$ and $X_1=1=X_2$, so $X_1=X_2$; hence the indicators are not independent while having equal distributions.
– drhab
Nov 17 at 11:19
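The $n=2$ special case mentioned in the comment above can be checked directly; a small Python sketch (the variable names are my own) listing both permutations together with the two indicators:

```python
from itertools import permutations

# For n = 2 there are only two permutations of (1, 2).
# In each, the indicators X_1 and X_2 agree, so they are
# perfectly dependent yet identically distributed.
for perm in permutations((1, 2)):
    x1 = int(perm[0] == 1)  # X_1 = 1 iff sigma(1) = 1
    x2 = int(perm[1] == 2)  # X_2 = 1 iff sigma(2) = 2
    print(perm, x1, x2)
```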
edited Nov 17 at 11:02
answered Nov 17 at 10:57
drhab