Sum of two vectors is the vector whose components are the sums of the components [closed]


























Consider geometric vectors, for example, where addition is defined by the parallelogram law. How can I prove that the sum of two vectors is the vector whose components, with respect to some basis, are the sums of the corresponding components of the individual vectors? More generally, how can I do the same for any vector space whose addition is consistent with the definition of a vector space but has a non-obvious formulation, such as the parallelogram law?
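For a concrete sanity check, here is a minimal numeric sketch (assuming the standard coordinates of $\mathbb R^2$; the vectors $u$ and $v$ are arbitrary choices). It constructs the parallelogram's fourth vertex purely geometrically, as the intersection of the line through the tip of $u$ parallel to $v$ with the line through the tip of $v$ parallel to $u$, and compares it with the component-wise sum:

    import numpy as np

    u = np.array([3.0, 1.0])
    v = np.array([1.0, 2.0])

    # Fourth vertex of the parallelogram: intersect the line through the tip
    # of u with direction v and the line through the tip of v with direction
    # u, i.e. solve  u + t*v = v + s*u  for t and s.
    t, s = np.linalg.solve(np.column_stack([v, -u]), v - u)
    fourth_vertex = u + t * v

    assert np.allclose(fourth_vertex, u + v)   # matches the component-wise sum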










algebra-precalculus euclidean-geometry

asked Dec 21 '18 at 11:42, edited Dec 21 '18 at 12:15 – Akhil

closed as off-topic by Kavi Rama Murthy, Saad, Adrian Keister, Lord_Farin, José Carlos Santos Dec 21 '18 at 16:41


This question appears to be off-topic. The users who voted to close gave this specific reason:


  • "This question is missing context or other details: Please provide additional context, which ideally explains why the question is relevant to you and our community. Some forms of context include: background and motivation, relevant definitions, source, possible strategies, your current progress, why the question is interesting or important, etc." – Saad, Adrian Keister, Lord_Farin, José Carlos Santos

If this question can be reworded to fit the rules in the help center, please edit the question.
















  • How does the parallelogram rule define the sum of vectors? Surely you need to prove that adding two vectors forms a parallelogram to prove that addition of vectors (defined as “translate the second vector until it starts where the first ends”) is commutative? – Dan Robertson, Dec 21 '18 at 11:49










  • Let us say that the sum of two vectors is given by the diagonal of the parallelogram formed by the two vectors. Then this definition satisfies all the properties of a vector space. But how can you guarantee that the sum of two vectors is the vector whose components are the sums of the components with respect to some basis? – Akhil, Dec 21 '18 at 12:26
















2 Answers



















Let $v,\,w$ be vectors, and write $v=\sum_i (v\cdot e_i)e_i$ etc. for basis elements $e_i$, so $$v+w=\sum_i ((v+w)\cdot e_i)e_i=\sum_i (v\cdot e_i+w\cdot e_i)e_i=\sum_i (v\cdot e_i)e_i+\sum_i (w\cdot e_i)e_i.$$ The $i$th components of the first and last expressions are the $e_i$ coefficients, respectively $(v+w)_i$ and $v_i+w_i$.
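A quick numeric check of this bilinearity argument (a sketch, assuming $\mathbb R^3$ with its standard orthonormal basis; the vectors are arbitrary):

    import numpy as np

    basis = np.eye(3)                  # rows are the orthonormal e_1, e_2, e_3
    v = np.array([1.0, -2.0, 0.5])
    w = np.array([3.0, 4.0, -1.0])

    # Components via dot products: comp(x)[i] = x . e_i.
    comp = lambda x: np.array([x @ e for e in basis])

    # (v+w) . e_i = v . e_i + w . e_i, by bilinearity of the dot product.
    assert np.allclose(comp(v + w), comp(v) + comp(w))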



Edit: we can do without dot products as long as the components $v_i$ satisfy $v=\sum_i v_i e_i$ for some linearly independent basis elements $e_i$, so the $v_i$ in that sum are unique. Then $$\sum_i (v+w)_ie_i=v+w=\sum_i v_ie_i+\sum_i w_i e_i=\sum_i (v_i+w_i)e_i\implies (v+w)_i=v_i+w_i.$$
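The same check for this dot-product-free version (a sketch with a deliberately non-orthogonal basis of $\mathbb R^2$; the coordinates of $x$ are recovered by solving $Ec=x$, which has a unique solution because the basis vectors are independent, and solving a linear system is linear in the right-hand side):

    import numpy as np

    E = np.array([[1.0, 1.0],          # columns are the basis vectors e_1, e_2
                  [0.0, 2.0]])
    v = np.array([2.0, 3.0])
    w = np.array([-1.0, 5.0])

    # coords(x) solves E @ c = x, i.e. x = c_1*e_1 + c_2*e_2.
    coords = lambda x: np.linalg.solve(E, x)

    assert np.allclose(coords(v + w), coords(v) + coords(w))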






answered Dec 21 '18 at 11:50, edited Dec 21 '18 at 14:11 – J.G.













  • Can you do the same for any arbitrary definition of addition for any vector space, given that it is consistent with the vector space definition? – Akhil, Dec 21 '18 at 12:22

  • @Akhil If you're asking me to prove components add, components will have to exist. If they do, we're in the above situation. Just about the only generalisation we're allowed is to change the identity from $\sum_i e_i e_i^T$ to $\sum_{ij} g_{ij} e_i e_j^T$, which just means something other than an orthonormal basis has been chosen. The proof that it still works then is an exercise. – J.G., Dec 21 '18 at 12:48

  • Is there a more fundamental proof that doesn't involve the dot product? – Akhil, Dec 21 '18 at 14:00

  • @Akhil See my edit. – J.G., Dec 21 '18 at 14:11

  • But the meaning of addition changes in the step $v+w = \sum_i v_i e_i + \sum_i w_i e_i = \sum_i (v_i+w_i) e_i$: it changes from the one we defined to the one that adds the components, whose validity is what I'm asking you to prove. – Akhil, Dec 21 '18 at 14:26
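To make the $\sum_{ij} g_{ij} e_i e_j^T$ remark concrete, a small sketch (assuming $g_{ij}$ denotes the inverse of the Gram matrix $G_{ij}=e_i\cdot e_j$, with an arbitrary non-orthonormal basis of $\mathbb R^2$): the weighted sum of outer products reproduces the identity, so the dot-product argument carries over to such bases.

    import numpy as np

    E = np.array([[1.0, 1.0],          # columns are the basis vectors e_1, e_2
                  [0.0, 2.0]])
    G = E.T @ E                        # Gram matrix, G[i, j] = e_i . e_j
    g = np.linalg.inv(G)

    # sum_{ij} g_{ij} * outer(e_i, e_j) should be the identity matrix.
    I = sum(g[i, j] * np.outer(E[:, i], E[:, j])
            for i in range(2) for j in range(2))

    assert np.allclose(I, np.eye(2))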






















What does the vector $(1,2)$ mean? Well, it means the sum of $1$ times the first basis vector (say $i$) and $2$ times the second (say $j$). So, e.g., you want to prove that:
$$(a,b)+(c,d)= (a i + b j) + (c i + d j) = (a + c) i + (b + d) j = (a+c, b+d).$$



So you need to prove that vector addition is commutative and associative, and that scalar multiplication distributes over addition (of scalars as well as of vectors).



Then everything should fall out.
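Spelled out with those facts, the displayed chain reads, step by step:

$$\begin{aligned}
(a,b)+(c,d) &= (a i + b j) + (c i + d j) && \text{meaning of components}\\
&= (a i + c i) + (b j + d j) && \text{commutativity and associativity}\\
&= (a+c) i + (b+d) j && \text{distributivity over scalar addition}\\
&= (a+c,\, b+d) && \text{meaning of components}
\end{aligned}$$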






answered Dec 21 '18 at 11:54 – Dan Robertson
