Scalar multiplication of vectors on left or right














Vectors are often written column-wise as if they were $n\times 1$ matrices:

$$
\mathbf{v} := \begin{bmatrix}
1 \\ 2 \\ 3
\end{bmatrix}
$$



This notation implicitly identifies the vector $\mathbf{v}\in \mathbf{R}^3$ with its equivalent matrix, which represents a linear operator

$$
v: \mathbf{R}^1 \to \mathbf{R}^3 \\
v(t) := \begin{bmatrix}
t \\ 2t \\ 3t
\end{bmatrix}
$$



Thus, identifying a real scalar $\lambda\in\mathbf{R}$ with its corresponding $1$-vector, it would seem natural to write scalar multiplication of this vector by $\lambda$ as

$$
\mathbf{v}\lambda
$$



to match the usual notation for matrix-vector multiplication, where the operator is written on the left. However, it is more common to see



$$
\lambda \mathbf{v}
$$



where the expression cannot be read as a covector-matrix multiplication, because the $1\times 1$ dimension of the scalar $\lambda$ is apparently incompatible with the $3\times 1$ matrix $\mathbf{v}$. Why is this second notation, with the scalar on the left, more common?
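The dimension bookkeeping behind this question can be made concrete in code. Below is a minimal Python sketch (an editorial illustration, not part of the original post; the `matmul` helper is a hypothetical stand-in for matrix multiplication over lists of rows):

```python
def matmul(A, B):
    """Multiply matrices given as lists of rows; raise if inner dims mismatch."""
    if len(A[0]) != len(B):
        raise ValueError("inner dimensions do not match")
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

v = [[1], [2], [3]]   # the 3x1 column vector from the question
lam = 2.0             # a real scalar

# Genuine scalar multiplication is defined on either side and agrees:
assert [[lam * x for x in row] for row in v] == [[x * lam for x in row] for row in v]

# As a 1x1 matrix, the scalar only composes on the right: (3x1)(1x1) -> 3x1.
print(matmul(v, [[lam]]))   # [[2.0], [4.0], [6.0]]

# On the left, (1x1)(3x1) has incompatible inner dimensions (1 vs 3):
try:
    matmul([[lam]], v)
except ValueError as e:
    print("left placement fails:", e)
```

This is exactly the asymmetry the question describes: only $\mathbf{v}\lambda$ type-checks if the scalar is literally read as a $1\times 1$ matrix.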



































linear-algebra notation
















asked Dec 13 '18 at 22:40









Fengyang Wang
1,284








Vectors are more often written as row vectors, so your argument doesn't have much force. I don't think matrix notation has anything to do with the common practice of having the field in a vector space act on the left. My guess is that the practice arose from traditional notation for polynomials which puts the coefficients before the variables. If you are working with modules over noncommutative rings, then left actions and right actions are different things and need to be distinguished notationally.
– Rob Arthan
Dec 13 '18 at 22:47

















1 Answer



















We do not want to pigeonhole ourselves into thinking of scalars $\lambda$ as their corresponding $1\times 1$ matrices $[\lambda]$. It is always legal to scale any $m\times n$ matrix $A$ by $\lambda$, but not always legal to multiply $A$ by $[\lambda]$ on either side. It is better to think of scalars as being distinct from vectors and matrices.



I do not have a good explanation why $\lambda v$ is more common than $v\lambda$, but it should not stem from thinking of scalars as $1\times 1$ matrices. Perhaps it is related to the convention of putting the coefficient before the monomial when writing polynomials, e.g. $5x^2$. (I see now that Rob Arthan already made this observation in a comment.)
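The answer's first observation, that scaling a matrix by a scalar is always defined while multiplying by the $1\times 1$ matrix $[\lambda]$ need not be, can be checked concretely. A minimal Python sketch (an editorial illustration, not part of the original answer; `matmul` is a hypothetical helper for matrix multiplication over lists of rows):

```python
def matmul(A, B):
    """Multiply matrices given as lists of rows; raise if inner dims mismatch."""
    if len(A[0]) != len(B):
        raise ValueError("inner dimensions do not match")
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[0, 1, 2],
     [3, 4, 5]]   # a 2x3 matrix
lam = 3.0

# Scaling by the scalar itself is always defined, regardless of shape:
scaled = [[lam * x for x in row] for row in A]
print(scaled)    # [[0.0, 3.0, 6.0], [9.0, 12.0, 15.0]]

# Multiplying by the 1x1 matrix [lam] fails on BOTH sides for a 2x3 matrix,
# since neither (1x1)(2x3) nor (2x3)(1x1) has matching inner dimensions:
for left, right in (([[lam]], A), (A, [[lam]])):
    try:
        matmul(left, right)
    except ValueError:
        print("product undefined")   # printed for both orderings
```

For a column vector ($n\times 1$) the right-hand product happens to work, which is why the identification in the question seems plausible; the general case shows it cannot be the right way to think about scalar multiplication.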




















        answered Dec 13 '18 at 23:15









Mike Earnest
23.9k





























