Scalar multiplication of vectors on left or right
Vectors are often written column-wise as if they were $n\times 1$ matrices:
$$
\mathbf{v} := \begin{bmatrix}
1 \\ 2 \\ 3
\end{bmatrix}
$$
This notation implicitly identifies the vector $\mathbf{v}\in \mathbf{R}^3$ with its equivalent matrix, which represents a linear operator
$$
v: \mathbf{R}^1 \to \mathbf{R}^3 \\
v(t) := \begin{bmatrix}
t \\ 2t \\ 3t
\end{bmatrix}
$$
Thus, identifying a real scalar $\lambda\in\mathbf{R}$ with its corresponding $1$-vector, it would seem natural to write the scalar multiplication of $\mathbf{v}$ by $\lambda$ as
$$
\mathbf{v}\lambda
$$
to match the usual notation for matrix-vector multiplication, where the operator is written on the left. However, it is more common to see
$$
\lambda \mathbf{v}
$$
where the expression cannot be read as a covector-matrix multiplication, because the $1\times 1$ dimension of the scalar $\lambda$ is apparently incompatible with the $3\times 1$ matrix $\mathbf{v}$. Why is this second notation, with the scalar on the left, more common?
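The dimension bookkeeping behind this question can be checked mechanically. The following sketch (a NumPy illustration added for concreteness, not part of the original question) treats $\lambda$ as a literal $1\times 1$ matrix: the product $\mathbf{v}\lambda$ is a legal $(3\times 1)(1\times 1)$ matrix product, while $\lambda\mathbf{v}$ read as $(1\times 1)(3\times 1)$ is not.

```python
import numpy as np

v = np.array([[1], [2], [3]])  # the vector v as a 3x1 matrix
lam = np.array([[5]])          # the scalar 5 as a 1x1 matrix

# v @ lam is a legal (3x1)(1x1) matrix product, yielding 3x1:
print(v @ lam)                 # [[5], [10], [15]]

# lam @ v read as a (1x1)(3x1) product has incompatible inner dimensions:
try:
    lam @ v
except ValueError as err:
    print("incompatible:", err)

# Ordinary scalar multiplication, by contrast, works on either side:
assert np.array_equal(5 * v, v * 5)
```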
Tags: linear-algebra, notation
Vectors are more often written as row vectors, so your argument doesn't have much force. I don't think matrix notation has anything to do with the common practice of having the field in a vector space act on the left. My guess is that the practice arose from traditional notation for polynomials which puts the coefficients before the variables. If you are working with modules over noncommutative rings, then left actions and right actions are different things and need to be distinguished notationally.
– Rob Arthan
Dec 13 '18 at 22:47
asked Dec 13 '18 at 22:40
Fengyang Wang
1 Answer
We do not want to pigeonhole ourselves into thinking of scalars $\lambda$ as their corresponding $1\times 1$ matrices $[\lambda]$. It is always legal to scale any $m\times n$ matrix $A$ by $\lambda$, but not always legal to multiply $A$ by $[\lambda]$ on either side. It is better to think of scalars as being distinct from vectors and matrices.
I do not have a good explanation for why $\lambda v$ is more common than $v\lambda$, but it should not stem from thinking of a scalar as a $1\times 1$ matrix. Perhaps it is related to the convention of putting the coefficient before the monomial when writing polynomials, e.g. $5x^2$. (I see now that Rob Arthan already made this observation in a comment.)
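The first point above can be sketched concretely (a NumPy illustration added here for clarity, not part of the original answer): scaling a $2\times 3$ matrix by the scalar itself always works, while multiplying by the $1\times 1$ matrix $[\lambda]$ fails on both sides, since neither $(1\times 1)(2\times 3)$ nor $(2\times 3)(1\times 1)$ has matching inner dimensions.

```python
import numpy as np

A = np.arange(6).reshape(2, 3)   # an arbitrary 2x3 matrix
lam_matrix = np.array([[5.0]])   # the scalar 5 as a 1x1 matrix

# Scaling by the scalar itself is always legal, on either side:
assert np.array_equal(5.0 * A, A * 5.0)

# Multiplying by the 1x1 matrix fails on BOTH sides of a 2x3 matrix:
for product in (lambda: lam_matrix @ A, lambda: A @ lam_matrix):
    try:
        product()
    except ValueError:
        print("incompatible dimensions")
```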
answered Dec 13 '18 at 23:15
Mike Earnest