Conditional expectation of number of trials
Consider $n$ independent trials, each of which results in one of the outcomes $\{1, \dots, k\}$ with respective probabilities $p_1, p_2, \dots, p_k$, where these probabilities sum to $1$. Let $N_i$ denote the number of trials that result in outcome $i$, for $i = 1, \dots, k$. For $i \neq j$, find $\mathbb{E}[N_i \mid N_j > 0]$.
I tried to write it as a double sum over the values of $N_i$ and $N_j$, expanding the conditional probability as $\mathbb{P}(N_i = a \mid N_j = b) = \dfrac{\mathbb{P}(N_i = a \cap N_j = b)}{\mathbb{P}(N_j = b)}$, but nothing came of it. How should I proceed?
probability conditional-expectation conditional-probability
asked Nov 15 at 18:58
liz
2 Answers
Hint: use the Law of Total Expectation:
$$\mathsf E(N_i)=\mathsf E(N_i\mid N_j{=}0)~\mathsf P(N_j{=}0)+\mathsf E(N_i\mid N_j{>}0)~\mathsf P(N_j{>}0)$$
$$\therefore \mathsf E(N_i\mid N_j{>}0)=\dfrac{\mathsf E(N_i)-\mathsf E(N_i\mid N_j{=}0)~\mathsf P(N_j{=}0)}{\mathsf P(N_j{>}0)}$$
The terms in this fraction may be evaluated by noticing that $N_i\sim\mathcal{Binom}(n,p_i)$, $N_j\sim\mathcal{Binom}(n,p_j)$, and $(N_i\mid N_j{=}0)\sim\mathcal{Binom}(n,\tfrac{p_i}{1-p_j})$.
[When given that none of the trials result in outcome $j$, the conditional probability that a particular trial results in outcome $i$ is $p_i/(1-p_j)$.]
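Substituting $\mathsf E(N_i)=np_i$, $\mathsf E(N_i\mid N_j{=}0)=np_i/(1-p_j)$ and $\mathsf P(N_j{=}0)=(1-p_j)^n$ into the hint gives $\mathsf E(N_i\mid N_j{>}0)=np_i\,\dfrac{1-(1-p_j)^{n-1}}{1-(1-p_j)^n}$. The following Monte Carlo sketch (my addition, not part of the original answer; the function name is illustrative) checks that closed form against simulation:

```python
import random

def estimate(n, p, i, j, reps=100_000, seed=0):
    """Empirical E[N_i | N_j > 0] over `reps` simulated runs of n trials."""
    rng = random.Random(seed)
    outcomes = list(range(len(p)))
    total, hits = 0.0, 0
    for _ in range(reps):
        draws = rng.choices(outcomes, weights=p, k=n)  # n independent trials
        if j in draws:                                 # condition on N_j > 0
            hits += 1
            total += draws.count(i)                    # N_i on this run
    return total / hits

n, p, i, j = 5, [0.2, 0.3, 0.5], 0, 1
closed = n * p[i] * (1 - (1 - p[j])**(n - 1)) / (1 - (1 - p[j])**n)
print(estimate(n, p, i, j), closed)  # the two values should be close
```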
edited Nov 16 at 2:42
answered Nov 15 at 23:09
Graham Kemp
$\newcommand{\E}{\mathbb{E}}\newcommand{\P}{\mathbb{P}}$Using Bayes' rule we can write
$$
\P(N_i = k \mid N_j > 0) = \frac{\P(N_j > 0 \mid N_i = k)\,\P(N_i = k)}{\P(N_j > 0)} = \frac{\binom{n}{k}p_i^k(1-p_i)^{n-k}}{1 - (1-p_j)^n}\,\P(N_j > 0 \mid N_i = k).
$$
But when $N_i = k$, each of the remaining $n-k$ trials results in an outcome other than $i$, so each is outcome $j$ with conditional probability $p_j/(1-p_i)$. As a result
$$
\P(N_j > 0 \mid N_i = k) = 1 - \left(\frac{1-p_i-p_j}{1-p_i}\right)^{n-k},
$$
giving
$$
\P(N_i = k \mid N_j > 0) = \frac{\binom{n}{k}p_i^k\left[(1-p_i)^{n-k} - (1-p_i-p_j)^{n-k}\right]}{1 - (1-p_j)^n}.
$$
Thus, using $\sum_k k\binom{n}{k}p_i^k(1-p_i)^{n-k} = np_i$ and $\sum_k k\binom{n}{k}p_i^k(1-p_i-p_j)^{n-k} = np_i(1-p_j)^{n-1}$ (by the binomial theorem, since $p_i + (1-p_i-p_j) = 1-p_j$),
$$
\E(N_i \mid N_j > 0) = \sum_{k=0}^n k\,\P(N_i = k \mid N_j > 0) = \frac{np_i\left[1 - (1-p_j)^{n-1}\right]}{1 - (1-p_j)^n},
$$
in agreement with the Law of Total Expectation approach.
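As a numerical cross-check (my addition, not part of the original answer; function names are illustrative): summing $k$ against the conditional pmf $\P(N_i=k\mid N_j>0)=\binom{n}{k}p_i^k\left[(1-p_i)^{n-k}-(1-p_i-p_j)^{n-k}\right]/\left[1-(1-p_j)^n\right]$ reproduces the closed form $np_i\left[1-(1-p_j)^{n-1}\right]/\left[1-(1-p_j)^n\right]$:

```python
from math import comb

def conditional_pmf(n, pi, pj, k):
    """P(N_i = k | N_j > 0) from the Bayes' rule derivation."""
    return (comb(n, k) * pi**k
            * ((1 - pi)**(n - k) - (1 - pi - pj)**(n - k))
            / (1 - (1 - pj)**n))

def expectation_by_sum(n, pi, pj):
    """E(N_i | N_j > 0) by direct summation over the pmf."""
    return sum(k * conditional_pmf(n, pi, pj, k) for k in range(n + 1))

def closed_form(n, pi, pj):
    """E(N_i | N_j > 0) via the closed-form expression."""
    return n * pi * (1 - (1 - pj)**(n - 1)) / (1 - (1 - pj)**n)

n, pi, pj = 7, 0.25, 0.1
print(expectation_by_sum(n, pi, pj), closed_form(n, pi, pj))  # equal up to rounding
```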
answered Nov 15 at 22:56
cdipaolo