Approximating binomial with normal distribution: probability and density values are practically the same?
For a discrete distribution, when we plot the PMF the y-axis shows probability. For a continuous distribution, when we plot the PDF the y-axis shows density (probability is the area under the curve). So we learn that density values are not probability values.

But what happens when I approximate a binomial distribution with a normal one? Consider the example B(100, 0.5), so $\mu = 50$ and $\sigma = 5$. I computed both the binomial PMF and the normal PDF with these parameters; below is the plot. For the binomial distribution my y-axis shows probability, but for the normal distribution it shows density. Yet numerically the values practically overlap. Obviously, the probability at a single point of the normal distribution is still 0. Does this make sense to you?

Thanks!

probability probability-theory normal-distribution binomial-distribution
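A quick numerical check makes the observation concrete (a sketch using only the Python standard library; the helper names are my own):

```python
import math

n, p = 100, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))  # mu = 50, sigma = 5

def binom_pmf(k):
    # exact probability P(X = k) for X ~ B(n, p)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def normal_pdf(x):
    # density of N(mu, sigma^2) at x; a height, not a probability
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

for k in (45, 50, 55):
    print(f"k={k}:  PMF={binom_pmf(k):.5f}  PDF={normal_pdf(k):.5f}")
```

At $k=50$ this gives a PMF of about $0.0796$ against a PDF of about $0.0798$ — numerically almost identical, even though only the first is a probability.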
I think that depends on the normalization of the histogram. If you choose a bin width different from 1, you won't obtain a good fit with the Gaussian unless you normalize the histogram (usually an option in a CAS). In the latter case the probability is the area under the histogram.
– lcv
Dec 22 '18 at 12:36
In order to connect with the valid answers that you got, note that for general bins $P(X=k) \approx P(k-\Delta k/2 < Y < k+ \Delta k/2)/\Delta k$, where $\Delta k$ is the size of the bin.
– lcv
Dec 22 '18 at 12:41
I see. Thanks a lot for the help.
– John
Dec 22 '18 at 14:34
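lcv's bin-width point can be checked numerically as well (a sketch, standard library only; the bin width of 2 and the bin {50, 51} are arbitrary choices for illustration):

```python
import math

n, p = 100, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))

def binom_pmf(k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def normal_pdf(x):
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

w = 2                                         # bin width != 1
prob_in_bin = binom_pmf(50) + binom_pmf(51)   # unnormalized bar height
density_est = prob_in_bin / w                 # height after dividing by the bin width
center = 50.5                                 # midpoint of the bin {50, 51}
print(prob_in_bin, density_est, normal_pdf(center))
```

Only `density_est` (about 0.079) tracks the Gaussian density; the raw bar height is roughly twice as large, which is exactly the normalization issue described in the comment.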
asked Dec 22 '18 at 10:56 – John
2 Answers
What you observe is that $$P(X=k)\approx P\left(k-\frac12<Y<k+\frac12\right)=\int_{k-\frac12}^{k+\frac12}f_Y(y)\,dy,$$ where $X$ is binomial and $Y$ is normal.
So expressions like $\int_{k-\frac12}^{k+\frac12}f_Y(y)\,dy$, where $f_Y$ is a PDF, can indeed be recognized here as probabilities.
In a suitable situation some $x_0\in(k-\frac12,k+\frac12)$ may exist that satisfies $$P(X=k)=\int_{k-\frac12}^{k+\frac12}f_Y(y)\,dy=\int_{k-\frac12}^{k+\frac12}f_Y(x_0)\,dy=f_Y(x_0),$$ since the integral of the constant $f_Y(x_0)$ over an interval of length $1$ is just $f_Y(x_0)$. This is why the PMF values land so close to the PDF values.
This, however, does not prevent a PDF from taking values in $(1,\infty)$, which is impossible for a probability.
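A minimal illustration of that last point (my own example, not part of the original answer): a uniform density on a short interval exceeds $1$ everywhere on its support, yet every probability it assigns is still at most $1$.

```python
# Uniform(0, 0.1): constant density 1/width on the support
width = 0.1
density = 1 / width          # f(x) = 10 for x in (0, 0.1) -- a density above 1
total_prob = density * width # area under the curve, i.e. the total probability
print(density, total_prob)   # 10.0 1.0
```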
Edit: let me also note that many PMFs (though not all) naturally induce a PDF.
For instance, let $p$ denote the PMF of a random variable $X$ that takes values in $\mathbb Z$.
The function $f$ prescribed by $f(x)=p(\lfloor x\rfloor)$ is then a PDF, because it is measurable, non-negative and satisfies $$\int f(x)\,dx=\sum_{n\in\mathbb Z}p(n)=1.$$
It is the PDF of $X+U$, where $X$ and $U$ are independent and $U$ is uniformly distributed on $[0,1)$.
Plotted, this PMF and PDF look alike, except that the PMF takes positive values only on $\mathbb Z$, while the PDF is also positive on $\mathbb R\setminus\mathbb Z$.
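The $X+U$ construction can be checked by simulation (a sketch; the sample size, seed, and the bin $[50,51)$ are arbitrary choices of mine):

```python
import math
import random

random.seed(0)
n, p = 100, 0.5

def binom_pmf(k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Draw X ~ B(100, 0.5) by summing Bernoulli trials, then add U ~ Uniform[0, 1)
samples = [
    sum(random.random() < p for _ in range(n)) + random.random()
    for _ in range(20000)
]

# The fraction of samples in [50, 51) estimates the density of X + U there,
# which should match the PMF value p(50)
frac_in_bin = sum(50 <= s < 51 for s in samples) / len(samples)
print(frac_in_bin, binom_pmf(50))
```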
Thanks, but I am not sure that I fully understand your answer. In my specific case the y-axis values are the same for both the discrete and the continuous case. How can the PDF take values in $(1,\infty)$?
– John
Dec 22 '18 at 11:38
When I say that PDFs can take values in $(1,\infty)$, I am not talking specifically about the PDF in your question, but about PDFs in general.
– drhab
Dec 22 '18 at 11:41
I see. Thank you for the help.
– John
Dec 22 '18 at 14:35
@John Are you aware that you can accept answers? Accepting an answer is a way of showing appreciation to the people who take time to help you. Accept the answer that best fits your question.
– callculus
Dec 22 '18 at 18:03
Note that
$$P(X=k)\approx P\left(k-\frac12<Y<k+\frac12\right)=P\left(\frac{\left(k-\frac12\right)-50}{5}<Z<\frac{\left(k+\frac12\right)-50}{5}\right)=P\left(\frac k5-10.1<Z<\frac k5-9.9\right),$$
where $X$ is the discrete r.v., $Y$ is the continuous normal r.v. and $Z$ is the standard normal r.v.
Obviously, the probability at a single point of the normal distribution is still $0$.
The point is that the probability $P(X=k)$ of the binomial r.v. approximately corresponds to the area under the normal curve over the interval $(k-\frac12, k+\frac12)$; since that interval has width $1$, the area is numerically close to the density value itself. Standardizing makes the interval narrower (width $\frac15$) and the curve taller (by a factor of $5$), but the area does not change: it remains approximately equal to the probability that the (standard) normal random variable falls in the corresponding interval.
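The computation above can be verified numerically (a sketch; `math.erf` stands in for a normal-table lookup, and the interval endpoints implement the continuity correction from the answer):

```python
import math

n, p = 100, 0.5
mu, sigma = 50, 5

def binom_pmf(k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def norm_cdf(x):
    # CDF of N(mu, sigma^2) via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

for k in (45, 50, 55):
    area = norm_cdf(k + 0.5) - norm_cdf(k - 0.5)  # P(k - 1/2 < Y < k + 1/2)
    print(f"k={k}:  P(X=k)={binom_pmf(k):.5f}  area={area:.5f}")
```

For each $k$ the exact binomial probability and the normal-curve area agree to three or four decimal places.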
This is only valid if the bin size is 1. I think this is where the OP's confusion originates. See my comment.
– lcv
Dec 22 '18 at 12:44
Thank you both for the help.
– John
Dec 22 '18 at 14:34
answered Dec 22 '18 at 11:21, edited Dec 22 '18 at 12:03 – drhab
answered Dec 22 '18 at 12:10 – farruhota