Law of the Unconscious Statistician - is the proof on Wikipedia wrong?
I don't understand the last part of Wikipedia's proof of the LOTUS theorem. Here is the link.
After we prove that
$F_Y(y) = F_X(g^{-1}(y)),$
it says that by the chain rule we have
$F_Y(y) = f_X(g^{-1}(y)) \cdot \frac{1}{g'(g^{-1}(y))}.$
But didn't we get this by differentiating? It should be
$\frac{dF_Y(y)}{dy} = f_X(g^{-1}(y)) \cdot \frac{1}{g'(g^{-1}(y))}.$
Then the second-to-last equation would be
$\displaystyle\int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx = \int_{-\infty}^{\infty} y\, \frac{dF_Y(y)}{dy}\, dy,$
which makes sense: by definition, the expected value of a function of a random variable should equal the integral of the product of $y$ (the value of the transformed variable) and $f_Y(y)$, the PDF of the transformed variable.
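For reference, here is how I believe the corrected step reads when spelled out, assuming $g$ is strictly increasing and differentiable as in the article's setup (my own reconstruction, not a quote from the page):
$f_Y(y) = \frac{d}{dy}F_Y(y) = \frac{d}{dy}F_X(g^{-1}(y)) = f_X(g^{-1}(y)) \cdot \frac{1}{g'(g^{-1}(y))},$
and then, substituting $y = g(x)$ so that $dy = g'(x)\,dx$,
$\displaystyle\int_{-\infty}^{\infty} y\, f_Y(y)\, dy = \int_{-\infty}^{\infty} g(x)\, \frac{f_X(x)}{g'(x)}\, g'(x)\, dx = \int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx = \operatorname{E}[g(X)].$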
statistics probability-distributions expected-value
asked Nov 14 at 8:43
gallickgunner
Yes, the wiki page has mistakes.
– Kavi Rama Murthy
Nov 14 at 8:47
OK, thanks, I just wanted to confirm it.
– gallickgunner
Nov 14 at 13:14
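Below is a minimal numerical sanity check of the corrected identity, using $X \sim N(0,1)$ and the strictly increasing map $g(x) = e^x$. This is my own illustrative sketch: the choice of $g$, the SciPy quadrature, and the Monte Carlo seed are assumptions, not anything taken from the article.

    # Check numerically that E[g(X)] = \int g(x) f_X(x) dx = \int y f_Y(y) dy
    # for X ~ N(0,1) and g(x) = exp(x), where f_Y(y) = f_X(g^{-1}(y)) / g'(g^{-1}(y)).
    import numpy as np
    from scipy import integrate, stats

    f_X = stats.norm.pdf      # density of X ~ N(0, 1)
    g = np.exp                # strictly increasing transformation
    g_inv = np.log
    g_prime = np.exp

    # E[g(X)] as an integral over x
    lhs, _ = integrate.quad(lambda x: g(x) * f_X(x), -np.inf, np.inf)

    # E[Y] as an integral over y, using the change-of-variables density
    f_Y = lambda y: f_X(g_inv(y)) / g_prime(g_inv(y))
    rhs, _ = integrate.quad(lambda y: y * f_Y(y), 0, np.inf)

    # Independent Monte Carlo estimate of E[g(X)]
    rng = np.random.default_rng(0)
    mc = g(rng.standard_normal(10**6)).mean()

    print(lhs, rhs, np.exp(0.5), mc)   # all should be close to e^{1/2} ~ 1.6487

All three estimates agree with the exact value $e^{1/2}$, which is consistent with the identity holding once the chain-rule line is read as a statement about $f_Y$ rather than $F_Y$.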