Finding pdf of function of independent random variables
Suppose $X$ and $Y$ are i.i.d with a common pdf
$$f(t) = \begin{cases} \exp(-t) & \text{if } t > 0, \\ 0 & \text{otherwise}. \end{cases}$$
Show that $X + Y$ and $X/Y$ are independent.
I think the approach to this problem is to compute the joint density of $X$ and $Y$, then make a change of variables using the Jacobian to obtain the joint density of $X + Y$ and $X/Y$, use that to find the marginal densities of $X + Y$ and $X/Y$, and check whether their product equals the joint density.
I believe the joint density of $X$ and $Y$ is given by
$$\exp(-x) \cdot \exp(-y) = \exp(-(x + y)) \quad \text{if } x > 0 \text{ and } y > 0,$$
and $0$ otherwise.
Define $h(X, Y) = X + Y$ and $g(X, Y) = X/Y$. Then the Jacobian is given by
$$J(x, y) = \frac{\partial h}{\partial x} \frac{\partial g}{\partial y} - \frac{\partial h}{\partial y} \frac{\partial g}{\partial x} = \left(1\right)\left(-\frac{x}{y^{2}}\right) - \left(1\right)\left(\frac{1}{y}\right) = -\frac{x + y}{y^{2}}.$$
The joint density of $H$ and $G$ is given by
$$f_{HG}(h, g) = f_{XY}(x, y) \cdot |J(x, y)|^{-1} = \exp\left(-(x + y)\right) \cdot \frac{y^2}{x + y}.$$
I don't understand what I'm doing wrong, though, because I think the result should be expressed in terms of $h$ and $g$.
probability multivariable-calculus
asked Nov 15 at 0:05
joseph
Express $x=x(h,g)$ and $y=y(h,g)$ in terms of $h$ and $g$ and substitute.
– NCh
Nov 15 at 0:26
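A sketch of the substitution NCh suggests (the inverse transformation below is not spelled out in the comment): inverting $h = x + y$ and $g = x/y$ gives
$$x = \frac{hg}{1+g}, \qquad y = \frac{h}{1+g},$$
so that $\frac{y^2}{x+y} = \frac{h}{(1+g)^2}$, and the joint density above becomes
$$f_{HG}(h, g) = e^{-h} \cdot \frac{h}{(1+g)^2} = \left(h\,e^{-h}\right) \cdot \frac{1}{(1+g)^2}, \qquad h > 0,\ g > 0,$$
which factors into a function of $h$ alone times a function of $g$ alone, so $X+Y$ and $X/Y$ are independent.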
An alternative approach is to show that, given $X+Y=h$, you have $X$ uniformly distributed on $[0,h]$, and so $G=\frac{X}{Y}=\frac{X}{h-X}$ has a cumulative distribution function of $P\left(\frac{X}{Y} \le g\right) = \frac{g}{1+g}$ and a density of $\frac{1}{(1+g)^2}$ on the positive reals, neither of which is affected by the particular value of $h$.
– Henry
Nov 15 at 1:07
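As a quick numerical sanity check (a minimal sketch assuming NumPy is available; not part of either comment), one can simulate $H = X+Y$ and $G = X/Y$, compare the joint CDF at a point against the product of the marginal CDFs, and check Henry's formula $P(G \le g) = \frac{g}{1+g}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X, Y i.i.d. Exponential(1)
x = rng.exponential(scale=1.0, size=n)
y = rng.exponential(scale=1.0, size=n)

h = x + y   # H = X + Y
g = x / y   # G = X / Y

# If H and G are independent, P(H <= a, G <= b) = P(H <= a) * P(G <= b).
a, b = np.median(h), np.median(g)
joint = np.mean((h <= a) & (g <= b))
product = np.mean(h <= a) * np.mean(g <= b)
print(joint, product)            # both should be close to 0.25

# Henry's claim: P(G <= b) = b / (1 + b)
print(np.mean(g <= b), b / (1 + b))
```

Both printed pairs should agree up to Monte Carlo error, consistent with independence and with the stated CDF of $G$.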