Computing the least-squares solution when the eigenvalues and eigenvectors are known












Suppose a matrix $A$ has eigenvalues $0, 3, 7$ with corresponding eigenvectors $\mathbf{u}, \mathbf{v}, \mathbf{w}$. Find the minimum-length least-squares solution of $A\mathbf{x} = \mathbf{u}+\mathbf{v}+\mathbf{w}$.

This was on our engineering math final exam last year, and we tried some techniques involving the Moore-Penrose pseudoinverse, which didn't seem to work. Can someone help?
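For what it's worth, the pseudoinverse approach does work numerically. A minimal sketch: the eigenvectors below are a hypothetical concrete choice (deliberately non-orthogonal) just to make the example runnable; any linearly independent triple would do.

```python
import numpy as np

# Hypothetical eigenvectors (linearly independent, deliberately non-orthogonal)
u = np.array([1.0, 1.0, 0.0])   # eigenvalue 0
v = np.array([1.0, 0.0, 1.0])   # eigenvalue 3
w = np.array([0.0, 1.0, 1.0])   # eigenvalue 7

# Build A with the prescribed eigenpairs: A = V diag(0, 3, 7) V^{-1}
V = np.column_stack([u, v, w])
A = V @ np.diag([0.0, 3.0, 7.0]) @ np.linalg.inv(V)

b = u + v + w
x = np.linalg.pinv(A) @ b       # minimum-norm least-squares solution

# Sanity checks: the residual is orthogonal to ran(A) = span{v, w},
# and x is orthogonal to the null space span{u} (minimum length)
r = b - A @ x
assert abs(r @ v) < 1e-9 and abs(r @ w) < 1e-9 and abs(x @ u) < 1e-9
```

The two assertions are exactly the two optimality conditions: the residual must be orthogonal to the range of $A$, and the minimum-length solution must be orthogonal to the null space of $A$.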










  • What size is $A$? – AnyAD, Dec 16 '18 at 6:42










  • $A$ is a 3 by 3 matrix. All eigenvalues and eigenvectors of $A$ are known, as above. – Gratus, Dec 16 '18 at 6:46
















Tagged: linear-algebra






asked Dec 16 '18 at 6:37 – Gratus












3 Answers


















The problem here is that there is no vector $x$ that satisfies
$$
Ax = u+v+w.
$$
We can see this easily from the fact that
$$\operatorname{ran} A = \operatorname{span}\{v,w\}.$$

$\textbf{EDIT:}$ Our problem can be written as follows. First, find $y \in \operatorname{ran} A$ (that is, $y = Ax$ for some $x$) such that
$$
\lVert y-(u+v+w)\rVert \leq \lVert z-(u+v+w)\rVert, \quad \forall z \in \operatorname{ran} A.
$$
Then find $x \in \mathbb{R}^3$ such that $Ax = y$ and
$$
\lVert x \rVert \leq \lVert x' \rVert, \quad \forall x' : Ax' = y.
$$

Firstly, the solution $y$ is given by the orthogonal projection $P(u+v+w)$ of $u+v+w$ onto $\operatorname{ran} A$. What we need to do is compute $Pu$, since $P(v+w) = v+w$. To do this, let
$$
Pu = \alpha v + \beta w.
$$
It should be that
$$
u - Pu = u - (\alpha v + \beta w) \perp v, w.
$$
This gives a system of equations in $\alpha, \beta$:
$$
\langle u,v\rangle = \alpha \langle v,v\rangle + \beta \langle w,v\rangle,
$$
and
$$
\langle u,w\rangle = \alpha \langle v,w\rangle + \beta \langle w,w\rangle.
$$
(If $A$ is normal, then we have $\langle u,v\rangle = \langle u,w\rangle = \langle v,w\rangle = 0$. But this is not true in general.)

Solving this system gives $y = (\alpha+1)v + (\beta+1)w$. Finally, note that
$$
A^{-1}(\{y\}) = \left\{ \frac{\alpha+1}{3}v + \frac{\beta+1}{7}w + \gamma u \;\middle|\; \gamma \in \mathbb{R} \right\}.
$$

To minimize $\lVert x \rVert$ over $x \in A^{-1}(\{y\})$, it must be that
$$
x = \frac{\alpha+1}{3}v + \frac{\beta+1}{7}w + \gamma u \perp u.
$$

This gives us
$$
x = \frac{\alpha+1}{3}\left(v - \frac{\langle v,u\rangle}{\lVert u \rVert^2}u\right) + \frac{\beta+1}{7}\left(w - \frac{\langle w,u\rangle}{\lVert u \rVert^2}u\right).
$$






  • From the fact that one eigenvalue is zero, this seems trivial. How should I use this fact to compute the least-squares solution? – Gratus, Dec 16 '18 at 12:50

  • I meant there is no solution of $Ax = u+v+w$. Perhaps I'm missing something. What is your definition of a least-squares solution? – Song, Dec 16 '18 at 12:52

  • A least-squares solution is defined as a vector $x$ such that $\|Ax-b\|$ is minimized, as far as I know. – Gratus, Dec 16 '18 at 12:54

  • So basically what I know is that I have to find $x$ which minimizes $\|Ax-(u+v+w)\|$ from the given conditions. Additionally, if we have more than one least-squares solution, we should find out which one has minimum length; formally, $\|x\|$ should also be minimized. Some of my friends insist that the solution must be $v/3+w/7$, but we couldn't prove it. Btw, thanks for your concern :) – Gratus, Dec 16 '18 at 12:57

  • So the minimization is two-fold. I think I can help. – Song, Dec 16 '18 at 12:59



















Consider the basis made up of the eigenvectors, say $B$. Then use the fact that $A$ is diagonalisable, and apply the least-squares formula to $D[x]_B = [Ax]_B = [1\ 1\ 1]^T$.
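One caveat worth making explicit: solving $Dc = [1\ 1\ 1]^T$ in the least-squares sense coordinate-wise gives $c = (0, 1/3, 1/7)$, i.e. $x = v/3 + w/7$, but this coincides with the minimum-length least-squares solution only when the eigenbasis is orthogonal (e.g. when $A$ is normal), because otherwise the change of basis distorts the norm. A sketch under that assumption, with a hypothetical orthonormal eigenbasis:

```python
import numpy as np

# Hypothetical ORTHONORMAL eigenbasis, so A is normal and the
# coordinate-wise least-squares recipe is exact
u = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)   # eigenvalue 0
v = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)  # eigenvalue 3
w = np.array([0.0, 0.0, 1.0])                # eigenvalue 7

V = np.column_stack([u, v, w])               # orthogonal matrix
D = np.diag([0.0, 3.0, 7.0])
A = V @ D @ V.T                              # normal matrix, eigenvalues 0, 3, 7

# In B-coordinates, b = u+v+w is [1, 1, 1]; least squares on D gives
c = np.linalg.pinv(D) @ np.ones(3)           # = [0, 1/3, 1/7]
x = V @ c                                    # = v/3 + w/7

assert np.allclose(x, np.linalg.pinv(A) @ (u + v + w))
```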






  • Thanks for the lesson. However, I still don't really get how I am supposed to apply the least-squares formula to $PD[x]_B$ after we have changed basis to $u,v,w$. Would you mind elaborating a bit more? It would be very helpful. – Gratus, Dec 16 '18 at 7:01

  • @Gratus See here math.stackexchange.com/questions/72222/… – AnyAD, Dec 16 '18 at 8:55

  • In this case, it seems $A^T A$ is not invertible. Though I think I got your key idea. Thanks. – Gratus, Dec 16 '18 at 12:49



















$Ax$ belongs to the subspace $E$ spanned by $v$ and $w$.

Let us call $u_0$ the projection of $u$ onto $E$.

Then $u = u_0 + u_1$ with $u_1$ orthogonal to $E$.

By the projection theorem, the best you can obtain is $Ax = u_0 + v + w$.

As $u_0$ belongs to $E$, it can be represented as $u_0 = \alpha v + \beta w$, and one solution is
$$ x = \frac{1+\alpha}{3}v + \frac{1+\beta}{7}w $$

This simple solution minimizes the error on the output.
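By the projection argument above, the best achievable residual $\lVert Ax - b \rVert$ is exactly $\lVert u_1 \rVert$. A quick numerical check, with an assumed concrete (hypothetical, non-orthogonal) choice of $u, v, w$:

```python
import numpy as np

# Assumed concrete eigenvectors (non-orthogonal on purpose)
u = np.array([1.0, 1.0, 0.0])   # eigenvalue 0
v = np.array([1.0, 0.0, 1.0])   # eigenvalue 3
w = np.array([0.0, 1.0, 1.0])   # eigenvalue 7
V = np.column_stack([u, v, w])
A = V @ np.diag([0.0, 3.0, 7.0]) @ np.linalg.inv(V)
b = u + v + w

# u1 = component of u orthogonal to E = span{v, w}
Q, _ = np.linalg.qr(np.column_stack([v, w]))  # orthonormal basis of E
u1 = u - Q @ (Q.T @ u)

# The minimal residual ||Ax - b|| equals ||u1||
x = np.linalg.pinv(A) @ b
assert np.allclose(np.linalg.norm(A @ x - b), np.linalg.norm(u1))
```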






  • Can this solution be written in terms of $v$ and $w$? Some of my friends insist that the solution must be $\mathbf{v}/3+\mathbf{w}/7$, but we couldn't prove it. – Gratus, Dec 16 '18 at 12:52

  • I will try to complete it later. – Damien, Dec 16 '18 at 13:09











– Song: answered Dec 16 '18 at 8:58, edited Dec 16 '18 at 17:41












– AnyAD: answered Dec 16 '18 at 6:54, edited Dec 16 '18 at 8:41












– Damien: answered Dec 16 '18 at 7:21, edited Dec 16 '18 at 17:46












Thanks for contributing an answer to Mathematics Stack Exchange!

