Showing a function is Fréchet differentiable?
I just started learning Fréchet derivatives.
I have a function $H:\mathbb{R}^{N\times n}\to\mathbb{R}^{N\times n}$, i.e. $U^T\in\mathbb{R}^{N\times n}$, and
$$H(U^T)=GW\times (F(U))^T+S\times U^T+C,$$
where $G,W,S\in \mathbb{R}^{N \times N}$ are three matrices of size $N\times N$, $F:\mathbb{R}^n\to\mathbb{R}^n$ is a nonlinear function which maps each column vector of $U$ to the corresponding column vector of $F(U)$, and $C\in\mathbb{R}^{N\times n}$.
My question is: what property should the unknown nonlinear function $F$ satisfy to ensure that $H$ is Fréchet differentiable? What does the Fréchet derivative matrix look like? Where should I start? Thank you!
calculus real-analysis banach-spaces fixed-point-theorems frechet-derivative
What does $\times$ mean here?
– Will M.
Dec 2 '18 at 21:28
@WillM. Sorry for the confusing notation. It means ordinary matrix multiplication.
– Sherry
Dec 3 '18 at 16:11
"with $G,W,S\in \mathbb{R}^{N \times N}$ are two matrices" — did you mean three instead of two? Are these constant matrices?
– zhw.
Dec 5 '18 at 19:52
asked Nov 30 '18 at 19:04
Sherry
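Before differentiating, it may help to pin down the shapes in the definition above. Below is a minimal NumPy sketch (with an arbitrary elementwise `tanh` standing in for the unknown $F$, purely as an assumption for illustration) confirming that every term of $H(U^T)$ is $N\times n$:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 4, 3
G, W, S = (rng.standard_normal((N, N)) for _ in range(3))
C = rng.standard_normal((N, n))
U = rng.standard_normal((n, N))      # so U^T is N x n

F = np.tanh                          # placeholder for the unknown columnwise F

# GW (F(U))^T is (N x N)(N x n), S U^T is (N x N)(N x n), C is N x n
H = G @ W @ F(U).T + S @ U.T + C
print(H.shape)                       # → (4, 3)
```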
2 Answers
I'll rewrite the definition of $H$ as
$$
H(X) = GW F(X^T)^T + SX + C.
$$
Let's assume that $F$ is Fréchet differentiable at a particular point $X^T$, so that
$$
F(X^T + \Delta X^T) = F(X^T) + F'(X^T) \Delta X^T + e(\Delta X),
$$
and the error term $e(\Delta X)$ satisfies
$$
\lim_{\Delta X \to 0} \frac{\|e(\Delta X)\|}{\| \Delta X \|} = 0.
$$
Notice that
\begin{align}
H(X + \Delta X) &= GW F(X^T + \Delta X^T)^T + SX + S\Delta X + C \\
&= GW \left( F(X^T) + F'(X^T) \Delta X^T + e(\Delta X) \right)^T + SX + S \Delta X + C \\
&= \underbrace{GWF(X^T)^T + SX + C}_{H(X)} + \underbrace{GW(F'(X^T) \Delta X^T)^T + S \Delta X}_{H'(X) \Delta X} + \underbrace{GW e(\Delta X)^T}_{\text{small}}.
\end{align}
Comparing this with the equation
$$
H(X + \Delta X) \approx H(X) + H'(X) \Delta X
$$
suggests that $H$ is differentiable at $X$ and that $H'(X)$ is the linear transformation defined by
$$
\tag{1} H'(X) \Delta X = GW(F'(X^T) \Delta X^T)^T + S \Delta X.
$$
To prove that this is true, we only need to show that
$$
\tag{2} \lim_{\Delta X \to 0} \frac{\| GW e(\Delta X)^T \|}{ \| \Delta X \|} = 0.
$$
To establish (2), let $L$ be the linear transformation defined by
$$
L(v) = GW v^T.
$$
Then
\begin{align}
\frac{\| GW e(\Delta X)^T \|}{ \| \Delta X \|}
&= \frac{\| L(e(\Delta X)) \|}{\| \Delta X \|} \\
&\leq \frac{\| L \| \, \| e(\Delta X) \|}{\| \Delta X \|},
\end{align}
which approaches $0$ as $\Delta X \to 0$.
In order to reach the conclusion that $H$ is differentiable at $X$, we needed to assume that $F$ is differentiable at $X^T$.
I don't see a simpler way to express $H'(X)$, but maybe somebody else will.
edited Dec 2 '18 at 22:44
answered Dec 2 '18 at 21:57
littleO
Thanks for your help! I have a question: what is the form of $F'(X^T)^T$? Is it a matrix? Is it possible to represent it explicitly using derivatives of $f_i$, $1\le i \le n$, supposing $F=(f_1,\ldots,f_n)$?
– Sherry
Dec 3 '18 at 17:03
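Regarding the comment: when $F$ applies some $f=(f_1,\ldots,f_n)$ to each column, $F'(X^T)$ acts columnwise — it multiplies the $k$-th column of $\Delta X^T$ by the $n\times n$ Jacobian of $f$ at the $k$-th column of $X^T$. A minimal numerical sketch (the particular $f$ below is a made-up example, not from the question) checking formula (1) against finite differences:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 4, 3
G, W, S = (rng.standard_normal((N, N)) for _ in range(3))
C = rng.standard_normal((N, n))

# Hypothetical columnwise nonlinearity f: R^3 -> R^3 and its 3 x 3 Jacobian.
def f(u):
    return np.array([np.sin(u[0]), u[1] * u[2], u[2] ** 2])

def Jf(u):
    return np.array([[np.cos(u[0]), 0.0, 0.0],
                     [0.0, u[2], u[1]],
                     [0.0, 0.0, 2 * u[2]]])

def F(M):                  # apply f to each column of an n x N matrix
    return np.column_stack([f(M[:, k]) for k in range(M.shape[1])])

def dF(M, dM):             # F'(M) dM: multiply each column by the local Jacobian
    return np.column_stack([Jf(M[:, k]) @ dM[:, k] for k in range(M.shape[1])])

def H(X):                  # X is N x n, so X^T is n x N
    return G @ W @ F(X.T).T + S @ X + C

def dH(X, dX):             # formula (1): H'(X) dX = GW (F'(X^T) dX^T)^T + S dX
    return G @ W @ dF(X.T, dX.T).T + S @ dX

X, dX = rng.standard_normal((N, n)), rng.standard_normal((N, n))
eps = 1e-6
fd = (H(X + eps * dX) - H(X)) / eps
print(np.abs(fd - dH(X, dX)).max())  # finite-difference error, O(eps)
```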
They gave you the hand-wavy proof above. I am giving you the high-end proof now.
Recall from basic differential calculus:
Basic differentiation algebra: the derivative acts linearly, $(f+ \alpha g)'(x) = f'(x) + \alpha g'(x)$; the derivative of a constant function is zero; and a continuous linear function is its own derivative, $f'(x) \cdot h = f(h)$ whenever $f$ is linear and continuous.
Chain rule: if $g$ and $f$ are two functions defined on open subsets of normed vector spaces such that $f$ is differentiable at $x$ and $g$ is differentiable at $f(x)$, then the composite function $g \circ f$ is differentiable at $x$ and its derivative is the composite of the derivatives: $$(g \circ f)'(x) = g'(f(x)) \circ f'(x).$$
Abridged proof. Write $y = f(x)$ and $f(x + h) = f(x) + \underbrace{f'(x) \cdot h + o(h)}_k$, so that $$g(y + k) = g(y) + g'(y) k + o(k) = g(y) + g'(y) \cdot f'(x) \cdot h + \underbrace{g'(y) o(h) + o(k)}_{o(h)}. \square$$
To your exercise. The function $H$ is differentiable at every $U$ where the function $F$ is differentiable.
Proof. The functions $\varphi: V \mapsto GW V^\intercal$ and $\psi: U \mapsto SU^\intercal$ are linear, while the function $U \mapsto C$ is constant. Therefore, the function $H = \varphi \circ F + \psi + C$ is differentiable at all points where $F$ is differentiable (by the chain rule), and its derivative is simply $$H'(U) = \varphi'(F(U)) \circ F'(U) + \psi'(U) = \varphi \circ F'(U) + \psi.$$
If you are dealing with finite-dimensional vector spaces, find bases of each so that (denoting by $[ \cdot ]$ the matrix representation) we get $$[H'(U)]=[\varphi][F'(U)] + [\psi]. \square$$
Amendment. If the function $\varphi$ is invertible, then the differentiability of $H$ implies that of $F$, for we can write $F = \varphi^{-1} \circ (H - \psi - C)$, and $\varphi^{-1}$, being linear and continuous, is differentiable. $\square$
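The composition of matrix representations $[\varphi \circ F'(U)] + [\psi] = [\varphi][F'(U)] + [\psi]$ can be checked numerically by building each linear map's matrix in $\operatorname{vec}$ (column-stacking) coordinates. A sketch under the assumption that $F$ is an elementwise $\tanh$ (a stand-in for the unknown nonlinearity):

```python
import numpy as np

rng = np.random.default_rng(1)
N, n = 3, 2
G, W, S = (rng.standard_normal((N, N)) for _ in range(3))
C = rng.standard_normal((N, n))
A = G @ W

F = np.tanh                        # assumed elementwise nonlinearity
dF = lambda U: 1 - np.tanh(U) ** 2

def H(U):                          # U is n x N; output is N x n
    return A @ F(U).T + S @ U.T + C

def mat_of(L, shape_in):
    """Matrix of a linear map L on matrices, in vec (column-stacking) coordinates."""
    cols = []
    for j in range(shape_in[1]):
        for i in range(shape_in[0]):
            E = np.zeros(shape_in)
            E[i, j] = 1.0
            cols.append(L(E).flatten(order="F"))
    return np.column_stack(cols)

phi = mat_of(lambda V: A @ V.T, (n, N))      # [phi]: V -> GW V^T
psi = mat_of(lambda V: S @ V.T, (n, N))      # [psi]: V -> S V^T

U = rng.standard_normal((n, N))
Fprime = mat_of(lambda V: dF(U) * V, (n, N)) # [F'(U)] for elementwise F

# Finite-difference Jacobian of H at U, in the same vec coordinates
eps = 1e-6
Jfd = mat_of(lambda E: (H(U + eps * E) - H(U)) / eps, (n, N))

print(np.abs(Jfd - (phi @ Fprime + psi)).max())  # small finite-difference error
```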
answered Dec 3 '18 at 21:45
Will M.
Thanks for contributing an answer to Mathematics Stack Exchange!