Why are algebraic structures preserved under intersection but not union?























In general, the intersection of subgroups/subrings/subfields/sub(vector)spaces will still be subgroups/subrings/subfields/sub(vector)spaces. However, the union will (generally) not be.



Is there a "deep" reason for this?




























  • 8




    I think you can view this phenomenon as an incarnation of forgetful functors preserving limits, but not colimits (or that forgetful functors often have left adjoints, but not right adjoints).
    – Stahl
    19 hours ago






  • 2




    I suppose generally you could just "fall out of the union" if you multiply/add/whatever objects in a union that are unable to embed in one another.
    – Munk
    18 hours ago

















abstract-algebra






asked 19 hours ago









MathematicsStudent1122

8 Answers






























I wouldn't call it "deep", but here's an intuitive reasoning.



Intersections have elements that come from both sets, so they have the properties of both sets. If, for each of the component sets, there is some element(s) guaranteed to exist within that set, then such element(s) must necessarily exist in the intersection. For example, if $A$ and $B$ are closed under addition, then any pair of elements $x,y \in A \cap B$ is in each of $A$ and $B$, so the sum $x+y$ must be in each of $A$ and $B$, and so $x+y \in A \cap B$. This line of reasoning holds for basically any "structure" property out there, simply by virtue of the fact that all elements come from a collection of sets that simultaneously have that property.



Unions, on the other hand, have some elements from only one set or the other. In a sense, these elements only have one piece of the puzzle, i.e. they only have the properties of one set rather than both. Even if the statement of those properties is the same, like "closure under addition", the actual mechanics of those properties differs from set to set, and may not be compatible. Given $x \in A$ and $y \in B$, we have $x,y \in A \cup B$, but there's no reason to believe that $x+y \in A \cup B$. Sometimes it's simply not true, such as $\Bbb{N} \cup i\Bbb{N}$, where $i\Bbb{N} = \{ z \in \Bbb{C} \mid z = in \text{ for some } n \in \Bbb{N} \}$. In this case, the closures under addition which are guaranteed for the component sets are not compatible with one another, so you get sums like $1+i$ which is in neither set. On the other hand, sometimes you do have sets with compatible structure, such as $\Bbb{N} \cup -\Bbb{N}$ (considering $0 \in \Bbb{N}$), where any sum of elements from this union still lies in the union.
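This failure is easy to see numerically. Below is a minimal Python sketch; the finite ranges are just illustrative samples of $\Bbb{N}$ and $i\Bbb{N}$, not the full sets:

```python
# Finite samples of N and iN (illustrative only; the real sets are infinite).
N = set(range(10))
iN = {n * 1j for n in range(10)}
union = N | iN

# 1 is in N and i is in iN, but their sum 1 + i lies in neither piece.
assert 1 in union and 1j in union
assert 1 + 1j not in union

# By contrast, N ∪ (-N) has compatible structure: sums stay in the union
# (checked here on the sample, staying away from the artificial cutoff at ±9).
mixed = set(range(10)) | {-n for n in range(10)}
assert all(a + b in mixed for a in mixed for b in mixed if abs(a + b) < 10)
```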























  • 4




    +1 for giving an example where the union does still preserve structure.
    – theREALyumdub
    8 hours ago































Algebraic structures are typically defined by universal statements. For example, a group is a structure $(G, \cdot, {}^{-1}, e)$, where $\cdot$ is a binary function, ${}^{-1}$ is a unary function, and $e$ is a nullary function, satisfying the following axioms:





  1. $\forall x, y, z \; (x \cdot y) \cdot z = x \cdot (y \cdot z)$.


  2. $\forall x \; x \cdot x^{-1} = x^{-1} \cdot x = e$.


  3. $\forall x \; x \cdot e = e \cdot x = x$.


Universal axioms are preserved under intersection but not under union.
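The asymmetry can be checked mechanically. Here is a small Python sketch for the additive group $\Bbb Z_{12}$ (the subgroups $H$ and $K$ are my own illustrative choices): the universally quantified closure statement passes to the intersection of two subgroups but fails for their union.

```python
def closed_under_addition(S, mod=12):
    """The universal statement: for all x, y in S, x + y (mod 12) is in S."""
    return all((x + y) % mod in S for x in S for y in S)

H = {0, 4, 8}   # subgroup of Z_12 generated by 4
K = {0, 6}      # subgroup of Z_12 generated by 6

assert closed_under_addition(H) and closed_under_addition(K)
assert closed_under_addition(H & K)      # intersection: still closed
assert not closed_under_addition(H | K)  # union: 4 + 6 = 10 falls outside
```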























  • 6




    Can you elaborate? I don't really get it. (I'm not saying you're wrong, it's just that I don't understand)
    – goblin
    15 hours ago






  • 2




    @goblin say you have a universal statement "for all x in A, Px is true"; then if B is a subset of A, you have that "for all x in B, Px is true" too, right? Since the axioms for algebraic structures are universal statements, a subset of a given structure that is closed under the operations in question will satisfy the axioms, and hence be an algebraic structure 'of the same kind'. You can check that an intersection of structures (of the same kind, provided it exists) is closed under the relevant operations, and, being a subset of them all, is thus an algebraic structure of their kind.
    – alkchf
    12 hours ago































Let $X$ be a set, and let $Y$ and $Z$ be subsets of $X$. Let $f: X^2 \to X$ be a binary function, and assume that the restrictions of $f$ to $Y$ and $Z$ are also functions (i.e. $f|_{Y^2} \subseteq Y$ and $f|_{Z^2} \subseteq Z$).



Is it the case that $f|_{(Y \cap Z)^2} \subseteq Y \cap Z$? Yes, it is: if $a, b \in Y \cap Z$ then $f(a, b) \in Y$ because $f|_{Y^2} \subseteq Y$ and $f(a, b) \in Z$ because $f|_{Z^2} \subseteq Z$, so $f(a, b) \in Y \cap Z$.



Is it the case that $f|_{(Y \cup Z)^2} \subseteq Y \cup Z$? Not necessarily: if $a \in Y$ and $b \in Z$ then we know nothing at all about $f(a, b)$.



(Almost?) any structure we would call "algebraic" has some binary function (group multiplication, vector space addition, etc.) which runs into this problem.
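A sketch of this setup in Python, with $X$ the units mod $7$ and $f$ multiplication mod $7$ (an arbitrary illustrative choice of $X$, $f$, $Y$, $Z$):

```python
# The setting of this answer: a set X, a binary f, and two f-closed subsets.
X = {1, 2, 3, 4, 5, 6}            # units mod 7
f = lambda a, b: (a * b) % 7

def f_closed(S):
    """True iff f restricted to S^2 lands back in S."""
    return all(f(a, b) in S for a in S for b in S)

Y = {1, 6}        # f|_{Y^2} lands in Y
Z = {1, 2, 4}     # f|_{Z^2} lands in Z
assert f_closed(Y) and f_closed(Z)
assert f_closed(Y & Z)       # Y ∩ Z = {1}: closure survives intersection
assert not f_closed(Y | Z)   # f(6, 2) = 5 lies in neither Y nor Z
```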















































The fact that the intersection of subgroups is itself a subgroup, the intersection of subrings is a subring, etc., is indeed an example of a more general property:


Call a set $X$ "closed under $f$", where $f$ is a function with $n$ arguments, if the domain of $f$ includes $X^n$ and if $f(x_1, x_2, \dots, x_n) \in X$ for all $(x_1, x_2, \dots, x_n) \in X^n$.


Theorem: If $X$ and $Y$ are both closed under $f$, then $Z = X \cap Y$ is closed under $f$.


Proof. Since $X$ is closed under $f$, $Z^n \subset X^n$ is part of the domain of $f$. Furthermore, since every $n$-tuple $(z_1, z_2, \dots, z_n) \in Z^n$ is in both $X^n$ and $Y^n$, and both $X$ and $Y$ are closed under $f$, it follows that $f(z_1, z_2, \dots, z_n)$ belongs to both $X$ and $Y$, and therefore to their intersection $Z$. $\square$


For example, let $(G, +, -, 0)$ be a group with the group operation $+$, negation $-$, and the zero element $0$. Consider the functions $f_+: G^2 \to G$, $f_-: G \to G$, and $f_0: G^0 \to G$ defined by $f_+(a, b) = a + b$, $f_-(a) = -a$, and $f_0(\varepsilon) = 0$ (where $\varepsilon$ denotes the zero-element tuple, the only element of $G^0 = \{\varepsilon\}$). Clearly, the subgroups of $(G, +, -, 0)$ are exactly the subsets of $G$ that are closed under $f_+$, $f_-$ and $f_0$. Thus, if $X$ and $Y$ are subgroups of $(G, +, -, 0)$, and $Z = X \cap Y$ is their intersection, then $Z$ must also be closed under $f_+$, $f_-$ and $f_0$, and thus also a subgroup.


More generally, any time we can define a "subthingy" of a "thingy" $(T, \dots)$ as a subset of $T$ that is closed under one or more functions $f: T^n \to T$, it automatically follows from this definition that the intersection of two subthingies of the same thingy must itself be a subthingy. Since most definitions of substructures of an algebraic structure are indeed naturally of this form, they do have this property.


On the other hand, for unions of substructures we have no equivalent of the theorem above, and thus the union $W = X \cup Y$ of two subthingies $X$ and $Y$ of a thingy $(T, \dots)$ is not usually a subthingy.


Probably the nearest thing we can say, kind of trivially, is that the closure $\bar W$ of $W$ (i.e. the unique smallest subset of $T$ that includes $W$ and is closed under all the relevant functions, if one exists) will be a subthingy of $T$. Which, of course, is actually true by definition for all $W \subseteq T$, not just those that arise as a union of two (or more) subthingies.


For example, the union of two subspaces of a vector space is not generally a subspace, because adding together two vectors from different subspaces can produce a vector that belongs to neither of the original subspaces. But the span of the union is indeed a subspace, as is the span of any arbitrary subset of the full vector space.
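For the vector-space example, a minimal Python sketch, representing the two axes of $\Bbb R^2$ as membership predicates (in this particular case the span of the union is all of $\Bbb R^2$):

```python
# Two subspaces of R^2, sketched as membership predicates.
on_x_axis = lambda v: v[1] == 0
on_y_axis = lambda v: v[0] == 0
in_union  = lambda v: on_x_axis(v) or on_y_axis(v)
in_span   = lambda v: True   # span of the two axes is all of R^2 here

u, w = (1.0, 0.0), (0.0, 1.0)        # one vector from each subspace
s = (u[0] + w[0], u[1] + w[1])       # their sum, (1.0, 1.0)

assert in_union(u) and in_union(w)
assert not in_union(s)               # the sum falls out of the union...
assert in_span(s)                    # ...but stays in the span, a subspace
```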















































In the intersection case, the result is again an algebraic structure. The union, by contrast, often doesn't make much sense. For example, the intersection of two lines in a plane is either a point or empty (or, if the lines coincide, a line). But the union of two lines in the plane is, typically, a cross, something like an $\times$, or, in the extreme case when the two given lines coincide, just a line. Similar comments can be made about the union of two planes in space: geometric intuition is still working. Geometric intuition is likely to stop working when it is consulted about the union of two subspaces of a $19$-dimensional space, say a $17$-dimensional one and an $18$-dimensional one. So in general:


  • a group cannot be written as the union of two proper subgroups;

  • a real vector space cannot be the union of a finite number of proper subspaces;

  • a Banach space cannot be written as a union of even countably many proper closed subspaces;


$$\vdots$$
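The first bullet can be verified by brute force for a small group. Here is a Python sketch for $\Bbb Z_6$ (for a finite nonempty subset, closure under the operation already makes it a subgroup, so that is all the test checks):

```python
from itertools import combinations

# Brute force in Z_6: no two proper subgroups cover the whole group.
G = set(range(6))

def is_subgroup(S):
    # For finite subsets of Z_6, containing 0 and being closed under + suffices.
    return 0 in S and all((a + b) % 6 in S for a in S for b in S)

subgroups = [set(S) for r in range(1, 7)
             for S in combinations(range(6), r) if is_subgroup(set(S))]
proper = [S for S in subgroups if S != G]   # {0}, {0, 3}, {0, 2, 4}

assert all(H | K != G for H in proper for K in proper)
```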















































If we have a set $S$ and a binary operator $O$, then $O$ is defined on the Cartesian product of $S$ with itself. So it comes down to the fact that $(A \cap B) \times (A \cap B) = (A \times A) \cap (B \times B)$, but $(A \cup B) \times (A \cup B) \neq (A \times A) \cup (B \times B)$.



If we're taking the intersection of $A$ and $B$, then adding elements isn't a problem: we're adding elements that are in both $A$ and $B$, so we can use either the addition defined for $A$ or the one defined for $B$. But if we have the union of $A$ and $B$, then we may have to add an element of $A$ to an element of $B$, and we can't use either previously defined addition.
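The two product identities are easy to test directly; a Python sketch with arbitrary sample sets $A$ and $B$:

```python
from itertools import product

A = {0, 1, 2}
B = {2, 3}

cart = lambda S: set(product(S, S))   # S × S as a set of pairs

# Products commute with intersection...
assert cart(A & B) == cart(A) & cart(B)

# ...but not with union: (0, 3) pairs an A-only element with a B-only element,
# so it is in (A ∪ B) × (A ∪ B) but not in (A × A) ∪ (B × B).
assert (0, 3) in cart(A | B) - (cart(A) | cart(B))
```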















































A little warning: the whole question is meaningful only if you are talking about substructures of a given structure $A$, so in what follows I will assume this.


With that in mind, you can regard substructures in two different, but equivalent, ways:


  1. as structures whose underlying set is a subset of $A$ and such that the inclusion is a homomorphism;

  2. as subsets of $A$ closed under the operations of $A$ (the underlying structure then being the one induced by $A$).


If we take the second approach, we have that a subset $S \subseteq A$ is a substructure of $A$ if and only if for every operation $f \colon A^n \to A$ we have
$$G(f) \cap (S^{n} \times A) = G(f) \cap S^{n+1},$$
where $G(f)$ denotes the graph of $f$, i.e. the set $\{(\bar a, a) \in A^{n} \times A \mid f(\bar a) = a\}$.


Using this formula, it is clear why intersection works well: if $(S_i)_i$ is a family of substructures of $A$, i.e. a family of subsets satisfying the equation above, then we have
$$
\begin{align*}
G(f) \cap \Big(\big(\bigcap_i S_i\big)^n \times A\Big) &= G(f) \cap \bigcap_i (S_i^n \times A)\\
&= \bigcap_i \big(G(f) \cap (S_i^n \times A)\big)\\
&= \bigcap_i \big(G(f) \cap S_i^{n+1}\big) \text{ (because the $S_i$ are substructures)}\\
&= G(f) \cap \Big(\bigcap_i S_i\Big)^{n+1}.
\end{align*}$$


What makes this work is the fact that products commute with intersections, i.e. the following holds:
$$\bigcap_i (A_i^1 \times \dots \times A_i^n) = \Big(\bigcap_i A_i^1\Big) \times \dots \times \Big(\bigcap_i A_i^n\Big).$$
A similar formula does not hold for unions; we only have
$$\bigcup_i (A_i^1 \times \dots \times A_i^n) \subsetneq \Big(\bigcup_i A_i^1\Big) \times \dots \times \Big(\bigcup_i A_i^n\Big).$$


So if you like, a deep reason why intersection of substructures works so well is that intersection commutes with products.
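The graph condition and its stability under intersection can be checked concretely. Here is a Python sketch with $A = \Bbb Z_6$, $f$ addition mod $6$ (so $n = 2$), and two substructures $S$ and $T$ chosen for illustration:

```python
from itertools import product

A = set(range(6))
f = lambda a, b: (a + b) % 6                        # a binary operation: n = 2
G_f = {(a, b, f(a, b)) for a, b in product(A, A)}   # graph of f, as triples

S = {0, 2, 4}   # a substructure (subgroup of Z_6)
T = {0, 3}      # another substructure

# The substructure condition: G(f) ∩ (S^n × A) = G(f) ∩ S^(n+1).
assert G_f & set(product(S, S, A)) == G_f & set(product(S, S, S))
assert G_f & set(product(T, T, A)) == G_f & set(product(T, T, T))

# The intersection S ∩ T satisfies the same condition, as the answer derives.
ST = S & T
assert G_f & set(product(ST, ST, A)) == G_f & set(product(ST, ST, ST))
```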















































Since no one has explained this from a categorical perspective yet, let me try to offer another point of view. Each of the types of objects you mention (groups, rings, fields, vector spaces) forms a concrete category. That is, every group, ring, field, or vector space is a set equipped with the data of extra structure, and the homomorphisms between them are set maps which preserve that extra structure.


Another way we might say the above is that if $\mathcal{C}$ is the category of any of the above algebraic objects and their morphisms, we have a forgetful functor
$$\begin{align*}
U : \mathcal{C} &\to \mathsf{Set}\\
A &\mapsto UA,
\end{align*}$$

which sends each algebraic structure $A$ to its underlying set $UA$ and each homomorphism of algebraic structures $f : A \to B$ to the underlying function on sets $Uf : UA \to UB.$


In each of these situations (well, except when $\mathcal{C}$ is the category of fields), the forgetful functor has a left adjoint: the free object functor. Explicitly, this means that if you're given a group, ring, or vector space (more generally module) $A$ and a set $S,$ then there is a natural bijection
$$
\{\textrm{homomorphisms of algebraic structures } f : F(S) \to A\} \cong \{\textrm{maps of sets } g : S \to UA\},
$$

where $F(S)$ denotes the free [group, ring, vector space, module...] on $S.$ This is essentially the definition of a free object: to give a homomorphism of the free group, ring, or vector space $F(S)$ on a set $S$ to another group, ring, or vector space $A,$ you need to give a map of sets $S \to UA.$ Think of $S$ as being the set of generators of $F(S),$ and the "freeness" means that there are no relations between these generators other than the relations forced by the axioms of the algebraic structure.


For example, the free vector space on a set $S$ may be described as the vector space $F(S)$ with basis $\{e_s \mid s \in S\}$ indexed by the elements of $S.$ To give a map from $F(S)$ to any other vector space $V,$ you need only specify where the basis elements $e_s$ are sent, and this is completely determined by a set map $S \to UV$ (again, $UV$ is the underlying set of the vector space $V$).


As another example, the free commutative ring on a set $S$ is the ring $\Bbb Z[x_s \mid s \in S]$, the polynomial ring over $\Bbb Z$ with one variable for each element of $S.$


Now that I've set this up, the point is that intersections are limits in the category of sets, and that forgetful functors (or more generally, right adjoints) play nicely with limits. In particular, if $S$ and $T$ are subsets of some set $X,$ then we may consider the diagram


$$
\require{AMScd}
\begin{CD}
Y @>>> S\\
@VVV @VVV \\
T @>>> X,
\end{CD}
$$

where $Y$ is some unspecified set together with maps such that the diagram commutes. The intersection $S \cap T$ has the nice property that any set $Y$ with maps to $S$ and $T$ as in the diagram will factor uniquely through a map $Y \to S \cap T.$ This is the statement that $S \cap T$ is the limit of the diagram above (without the $Y$).


By abstract nonsense arguments, right adjoints (like the forgetful functor in these cases) preserve limits. (We also sometimes have even nicer results, but let me not get in too deep.) Preservation of limits means that if we have a limit of algebras $\varprojlim A_i$ over some diagram, then the underlying set of the limit is canonically isomorphic to the limit $\varprojlim UA_i$ (in the category of sets) of the underlying sets of the algebras.


So, if you have subalgebras $A_1, A_2$ of a given algebra $A,$ and you consider the limit $B$ of these inclusions as we did for sets:
$$
\require{AMScd}
\begin{CD}
B @>>> A_1\\
@VVV @VVV \\
A_2 @>>> A,
\end{CD}
$$


then the underlying set of the limit $B$ is the limit of the underlying sets $UA_1$ and $UA_2,$ which is simply the intersection $UA_1 \cap UA_2.$


The other punchline is that the union of two sets $S$ and $T$ (which are subsets of some ambient set $X$) is the colimit of an appropriate diagram. However, if $S$ and $T$ are the underlying sets of some algebraic objects $S = UA_1,$ $T = UA_2,$ the forgetful functor does not preserve colimits (even if $X$ is the underlying set of some large algebra $X = UA$). So, unless you are surprisingly lucky, the smallest algebra containing two given algebras $A_1$ and $A_2$ (which is a colimit) will not be the same as the smallest set containing $UA_1$ and $UA_2.$


Many others have already expressed that this is also related to the fact that products and intersections commute but the same is not true of products and unions: this is also a categorical fact! Products and intersections are both examples of limits, but unions are colimits. Limits commute with limits, but limits do not necessarily commute with colimits, unless certain nice conditions hold.


All in all, the failure of existence of an algebraic structure on a union is a combination of a number of categorical facts, which are much more general than the specific situations you mention. While proving that the forgetful functors described have the properties I claim essentially comes down to making arguments like in the other answers, I prefer this perspective because taking "unions" or "intersections" is somehow an unnatural thing to do when you have things which aren't sets: you want to combine your algebraic objects in ways that result in objects with the same algebraic structure (e.g., via taking limits and colimits or using other categorical constructions). The fact that the underlying set of a limit coincides with the limit of the underlying sets is a result of nice properties that the forgetful functor in question has.


Note: I said we didn't consider fields above, and that is because the category of fields is particularly badly behaved, because fields are rather restrictive.






























              8 Answers
              8






              active

              oldest

              votes








              8 Answers
              8






              active

              oldest

              votes









              active

              oldest

              votes






              active

              oldest

              votes








              up vote
              18
              down vote













              I wouldn't call it "deep", but here's an intuitive reasoning.



              Intersections have elements that come from both sets, so they have the properties of both sets. If, for each of the component sets, there is some element(s) guaranteed to exist within that set, then such element(s) must necessarily exist in the intersection. For example, if $A$ and $B$ are closed under addition, then any pair of elements $x,yin Acap B$ is in each of $A$ and $B$, so the sum $x+y$ must be in each of $A$ and $B$, and so $x+yin Acap B$. This line of reasoning holds for basically any "structure" property out there, simply by virtue of the fact that all elements come from a collection of sets that simultaneously have that property.



              Unions, on the other hand, have some elements from only one set or the other. In a sense, these elements only have one piece of the puzzle, i.e. they only have the properties of one set rather than both. Even if the statement of those properties is the same, like "closure under addition", the actual mechanics of those properties is different from set to set, and may not be compatible. Given $xin A$ and $yin B$, we have $x,yin Acup B$, but there's no reason to believe that $x+y in Acup B$. Sometimes it's simply not true, such as $Bbb{N}cup iBbb{N}$, where $iBbb{N} = { z in Bbb{C} | z = in text{ for some } ninBbb{N} }$. In this case, the closure under addition which is guaranteed for each of the component sets is not compatible with one another, so you get sums like $1+i$ which isn't in either set. On the other hand, sometimes you do have sets with compatible structure, such as $Bbb{N}cup -Bbb{N}$ (considering $0inBbb{N}$), where any sum of elements from this union still lies in the union.






              share|cite|improve this answer

















              • 4




                +1 for the giving an example where the union does still preserve structure.
                – theREALyumdub
                8 hours ago















              up vote
              18
              down vote













              I wouldn't call it "deep", but here's an intuitive reasoning.



              Intersections have elements that come from both sets, so they have the properties of both sets. If, for each of the component sets, there is some element(s) guaranteed to exist within that set, then such element(s) must necessarily exist in the intersection. For example, if $A$ and $B$ are closed under addition, then any pair of elements $x,yin Acap B$ is in each of $A$ and $B$, so the sum $x+y$ must be in each of $A$ and $B$, and so $x+yin Acap B$. This line of reasoning holds for basically any "structure" property out there, simply by virtue of the fact that all elements come from a collection of sets that simultaneously have that property.



              Unions, on the other hand, have some elements from only one set or the other. In a sense, these elements only have one piece of the puzzle, i.e. they only have the properties of one set rather than both. Even if the statement of those properties is the same, like "closure under addition", the actual mechanics of those properties is different from set to set, and may not be compatible. Given $xin A$ and $yin B$, we have $x,yin Acup B$, but there's no reason to believe that $x+y in Acup B$. Sometimes it's simply not true, such as $Bbb{N}cup iBbb{N}$, where $iBbb{N} = { z in Bbb{C} | z = in text{ for some } ninBbb{N} }$. In this case, the closure under addition which is guaranteed for each of the component sets is not compatible with one another, so you get sums like $1+i$ which isn't in either set. On the other hand, sometimes you do have sets with compatible structure, such as $Bbb{N}cup -Bbb{N}$ (considering $0inBbb{N}$), where any sum of elements from this union still lies in the union.






              share|cite|improve this answer

















              • 4




                +1 for the giving an example where the union does still preserve structure.
                – theREALyumdub
                8 hours ago













              up vote
              18
              down vote










              up vote
              18
              down vote









              I wouldn't call it "deep", but here's an intuitive reasoning.



              Intersections have elements that come from both sets, so they have the properties of both sets. If, for each of the component sets, there is some element(s) guaranteed to exist within that set, then such element(s) must necessarily exist in the intersection. For example, if $A$ and $B$ are closed under addition, then any pair of elements $x,yin Acap B$ is in each of $A$ and $B$, so the sum $x+y$ must be in each of $A$ and $B$, and so $x+yin Acap B$. This line of reasoning holds for basically any "structure" property out there, simply by virtue of the fact that all elements come from a collection of sets that simultaneously have that property.



              Unions, on the other hand, have some elements from only one set or the other. In a sense, these elements only have one piece of the puzzle, i.e. they only have the properties of one set rather than both. Even if the statement of those properties is the same, like "closure under addition", the actual mechanics of those properties is different from set to set, and may not be compatible. Given $xin A$ and $yin B$, we have $x,yin Acup B$, but there's no reason to believe that $x+y in Acup B$. Sometimes it's simply not true, such as $Bbb{N}cup iBbb{N}$, where $iBbb{N} = { z in Bbb{C} | z = in text{ for some } ninBbb{N} }$. In this case, the closure under addition which is guaranteed for each of the component sets is not compatible with one another, so you get sums like $1+i$ which isn't in either set. On the other hand, sometimes you do have sets with compatible structure, such as $Bbb{N}cup -Bbb{N}$ (considering $0inBbb{N}$), where any sum of elements from this union still lies in the union.






              share|cite|improve this answer












              I wouldn't call it "deep", but here's an intuitive reasoning.



              Intersections have elements that come from both sets, so they have the properties of both sets. If, for each of the component sets, there is some element(s) guaranteed to exist within that set, then such element(s) must necessarily exist in the intersection. For example, if $A$ and $B$ are closed under addition, then any pair of elements $x,yin Acap B$ is in each of $A$ and $B$, so the sum $x+y$ must be in each of $A$ and $B$, and so $x+yin Acap B$. This line of reasoning holds for basically any "structure" property out there, simply by virtue of the fact that all elements come from a collection of sets that simultaneously have that property.



              Unions, on the other hand, have some elements from only one set or the other. In a sense, these elements only have one piece of the puzzle, i.e. they only have the properties of one set rather than both. Even if the statement of those properties is the same, like "closure under addition", the actual mechanics of those properties is different from set to set, and may not be compatible. Given $xin A$ and $yin B$, we have $x,yin Acup B$, but there's no reason to believe that $x+y in Acup B$. Sometimes it's simply not true, such as $Bbb{N}cup iBbb{N}$, where $iBbb{N} = { z in Bbb{C} | z = in text{ for some } ninBbb{N} }$. In this case, the closure under addition which is guaranteed for each of the component sets is not compatible with one another, so you get sums like $1+i$ which isn't in either set. On the other hand, sometimes you do have sets with compatible structure, such as $Bbb{N}cup -Bbb{N}$ (considering $0inBbb{N}$), where any sum of elements from this union still lies in the union.







              answered 15 hours ago









              AlexanderJ93

              • 4




                +1 for giving an example where the union does still preserve structure.
                – theREALyumdub
                8 hours ago
























              up vote
              10
              down vote













              Algebraic structures are typically defined by universal statements. For example, a group is a structure $(G, \cdot, {}^{-1}, e)$, where $\cdot$ is a binary function, ${}^{-1}$ is a unary function, and $e$ is a nullary function, satisfying the following axioms:

              1. $\forall x, y, z \; (x \cdot y) \cdot z = x \cdot (y \cdot z)$.

              2. $\forall x \; x \cdot x^{-1} = x^{-1} \cdot x = e$.

              3. $\forall x \; x \cdot e = e \cdot x = x$.

              Universal axioms are preserved under intersection but not under union.
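Spelled out for the intersection (a sketch of the standard argument, not part of the original answer): a universal axiom automatically restricts from a structure to any subset closed under its operations.

```latex
\text{If } H, K \le G \text{ and } x, y, z \in H \cap K, \text{ then } x, y, z \in G,
\text{ so } (x \cdot y) \cdot z = x \cdot (y \cdot z) \text{ holds simply because it holds in } G.
\text{Closure of } H \text{ and } K \text{ under } \cdot,\ {}^{-1},\ e
\text{ then keeps all results of the operations inside } H \cap K.
```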























              • 6




                Can you elaborate? I don't really get it. (I'm not saying you're wrong, it's just that I don't understand)
                – goblin
                15 hours ago






              • 2




                @goblin say you have a universal statement "for all x in A, Px is true"; then if B is a subset of A, you have that "for all x in B, Px is true" too, right? Since the axioms for algebraic structures are universal statements, a subset of a given structure that is closed under the operations in question will satisfy the axioms, and hence be an algebraic structure 'of the same kind'. You can check that an intersection of structures (of the same kind, provided it exists) is closed under the relevant operations, and, being a subset of them all, is thus an algebraic structure of their kind.
                – alkchf
                12 hours ago

























              answered 18 hours ago









              Yuval Filmus











              up vote
              6
              down vote













                Let $X$ be a set, and let $Y$ and $Z$ be subsets of $X$. Let $f: X^2 \to X$ be a binary function, and assume that $f$ maps $Y$ and $Z$ back into themselves (i.e. $f(Y^2) \subseteq Y$ and $f(Z^2) \subseteq Z$).

                Is it the case that $f((Y \cap Z)^2) \subseteq Y \cap Z$? Yes, it is: if $a, b \in Y \cap Z$ then $f(a, b) \in Y$ because $f(Y^2) \subseteq Y$ and $f(a, b) \in Z$ because $f(Z^2) \subseteq Z$, so $f(a, b) \in Y \cap Z$.

                Is it the case that $f((Y \cup Z)^2) \subseteq Y \cup Z$? Not necessarily: if $a \in Y$ and $b \in Z$ then we know nothing at all about $f(a, b)$.

                (Almost?) any structure we would call "algebraic" has some binary function (group multiplication, vector space addition, etc.) which runs into this problem.
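The two questions above can be played out on a small concrete example; a sketch of mine (the particular $X$, $f$, $Y$, $Z$ are not from the answer) using addition mod 12:

```python
# X = Z/12Z with f = addition mod 12; Y and Z are subsets closed under f.
X = set(range(12))
f = lambda a, b: (a + b) % 12

Y = {x for x in X if x % 2 == 0}   # even residues: closed, since even + even is even
Z = {x for x in X if x % 3 == 0}   # multiples of 3: likewise closed

def closed(S):
    """Does f map S x S back into S?"""
    return all(f(a, b) in S for a in S for b in S)

assert closed(Y) and closed(Z)
assert closed(Y & Z)        # the intersection {0, 6} is still closed
assert not closed(Y | Z)    # the union is not: f(2, 3) = 5 lies in neither Y nor Z
```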
















                  answered 17 hours ago









                  Christopher























                      up vote
                      3
                      down vote













                      The fact that intersections of subgroups are themselves subgroups, intersections of subrings are subrings, etc., is indeed an instance of a more general property:

                      Call a set $X$ "closed under $f$", where $f$ is a function with $n$ arguments, if the domain of $f$ includes $X^n$ and if $f(x_1, x_2, \dots, x_n) \in X$ for all $(x_1, x_2, \dots, x_n) \in X^n$.

                      Theorem: If $X$ and $Y$ are both closed under $f$, then $Z = X \cap Y$ is closed under $f$.

                      Proof. Since $X$ is closed under $f$, $Z^n \subset X^n$ is part of the domain of $f$. Furthermore, since every $n$-tuple $(z_1, z_2, \dots, z_n) \in Z^n$ is in both $X^n$ and $Y^n$, and both $X$ and $Y$ are closed under $f$, it follows that $f(z_1, z_2, \dots, z_n)$ belongs to both $X$ and $Y$, and therefore to their intersection $Z$. $\square$

                      For example, let $(G, +, 0)$ be a group with group operation $+$ and zero element $0$. Consider the functions $f_+: G^2 \to G$, $f_-: G \to G$ and $f_0: G^0 \to G$ defined by $f_+(a, b) = a + b$, $f_-(a) = -a$ and $f_0(\varepsilon) = 0$ (where $\varepsilon$ denotes the zero-element tuple, the only element of $G^0 = \{\varepsilon\}$). The subgroups of $(G, +, 0)$ are exactly the subsets of $G$ that are closed under $f_+$, $f_-$ and $f_0$. Thus, if $X$ and $Y$ are subgroups of $(G, +, 0)$, and $Z = X \cap Y$ is their intersection, then $Z$ must also be closed under all three functions, and thus also a subgroup.

                      More generally, any time we can define a "subthingy" of a "thingy" $(T, \dots)$ as a subset of $T$ that is closed under one or more functions $f: T^n \to T$, it automatically follows from this definition that the intersection of two subthingies of the same thingy must itself be a subthingy. Since most definitions of substructures of an algebraic structure are indeed naturally of this form, they do have this property.

                      On the other hand, for unions of substructures we have no equivalent of the theorem above, and thus the union $W = X \cup Y$ of two subthingies $X$ and $Y$ of a thingy $(T, \dots)$ is not usually a subthingy.

                      Probably the nearest thing we can say, kind of trivially, is that the closure $\bar W$ of $W$ (i.e. the unique smallest subset of $T$ that includes $W$ and is closed under all the relevant functions, if one exists) will be a subthingy of $T$. Which, of course, is actually true by definition for all $W \subseteq T$, not just those that arise as a union of two (or more) subthingies.

                      For example, the union of two subspaces of a vector space is not generally a subspace, because adding together two vectors from different subspaces can produce a vector that belongs to neither of the original subspaces. But the span of the union is indeed a subspace, as is the span of any arbitrary subset of the full vector space.
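When the carrier is finite, the closure $\bar W$ described above can be computed directly by iterating the operations to a fixed point; a sketch of mine (assuming a finite universe so the loop terminates):

```python
def closure(W, ops):
    """Smallest superset of W closed under every binary operation in `ops`."""
    W = set(W)
    while True:
        new = {op(a, b) for op in ops for a in W for b in W} - W
        if not new:
            return W
        W |= new

# Two subgroups of Z/12Z under addition mod 12:
add12 = lambda a, b: (a + b) % 12
Y, Z = {0, 4, 8}, {0, 6}

# Their union {0, 4, 6, 8} is not closed, but its closure is all even residues:
assert closure(Y | Z, [add12]) == {0, 2, 4, 6, 8, 10}
```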
















                          answered 12 hours ago









                          Ilmari Karonen























                              up vote
                              2
                              down vote













                              In the intersection case, the result is again an algebraic structure, but the union sometimes doesn't make much sense. For example, consider the intersection of two lines in a plane: it is either a point or empty. But the union of two lines in the plane is, typically, a cross, something
                              like an $\times$ or, in the extreme case when the two given lines coincide,
                              just a line. Similar comments can be made about the union of two
                              planes in space; geometric intuition is still working. Geometric intuition
                              is likely to stop working when it is consulted about the union of two
                              subspaces of a $19$-dimensional space, say a $17$-dimensional one and an
                              $18$-dimensional one. So in general:

                              • a group cannot be written as the union of two proper subgroups

                              • a real vector space cannot be the union of a finite
                                number of proper subspaces

                              • a Banach space cannot be written as a union of even a countable infinity of
                                proper subspaces

                              $$\vdots$$
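The first bullet can be brute-force verified for a small group; a hedged illustration of mine for $\Bbb{Z}/6\Bbb{Z}$ (the statement itself holds for every group, not just this one):

```python
from itertools import combinations

n = 6
G = set(range(n))
add = lambda a, b: (a + b) % n

def is_subgroup(S):
    """Contains 0 and is closed under addition and negation mod n."""
    return 0 in S and all(add(a, b) in S and (-a) % n in S for a in S for b in S)

# Enumerate all subgroups of Z/6Z by checking every subset:
subgroups = [set(c) for r in range(1, n + 1)
             for c in combinations(sorted(G), r) if is_subgroup(set(c))]
proper = [S for S in subgroups if S != G]

# No union of two proper subgroups recovers all of Z/6Z:
assert all(S | T != G for S, T in combinations(proper, 2))
```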
















                                  answered 18 hours ago









                                  Chinnapparaj R

If we have a set $S$ and a binary operator $O$, then $O$ is defined on the Cartesian product of $S$ with itself. So it comes down to the fact that $(A\cap B)\times (A\cap B)=(A \times A) \cap (B \times B)$, but $(A\cup B)\times (A\cup B) \neq (A \times A) \cup (B \times B)$.



If we're taking the intersection of $A$ and $B$, then adding elements isn't a problem: we're adding elements that are in both $A$ and $B$, so we can use either the addition defined on $A$ or the one defined on $B$. But if we take the union of $A$ and $B$, then we may have to add an element of $A$ to an element of $B$, and we can't use either previously defined addition.
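The set identity this answer rests on can be verified directly on small finite sets. A quick Python sketch (our own illustration) using `itertools.product`:

```python
# Check that products commute with intersection but not with union.
from itertools import product

A = {0, 1, 2}
B = {2, 3}

AxA = set(product(A, A))
BxB = set(product(B, B))

# (A ∩ B) × (A ∩ B) equals (A × A) ∩ (B × B) ...
assert set(product(A & B, A & B)) == AxA & BxB

# ... but (A ∪ B) × (A ∪ B) is strictly larger than (A × A) ∪ (B × B):
# the "mixed" pair (0, 3) lies in the left-hand side only.
assert set(product(A | B, A | B)) > AxA | BxB
assert (0, 3) in set(product(A | B, A | B)) - (AxA | BxB)
```

The mixed pairs, one coordinate from $A$ and one from $B$, are precisely where the operation on the union is undefined.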






                                          answered 10 hours ago









                                          Acccumulation

A little warning: the whole question is meaningful only if you are talking about substructures of a given structure $A$, so in what follows I will assume this.



With the remark above in mind, you can regard substructures in two different, but equivalent, ways:




1. as structures whose underlying set is a subset of $A$ and such that the inclusion is a homomorphism;

2. as subsets of $A$ closed under the operations of $A$ (the structure is then the one induced by $A$).


If we take the second approach, a subset $S \subseteq A$ is a substructure of $A$ if and only if for every operation $f \colon A^n \to A$ we have
$$G(f) \cap (S^{n} \times A) = G(f) \cap S^{n+1},$$
where $G(f)$ denotes the graph of $f$, i.e. the set $\{(\bar a, a) \in A^{n} \times A \mid f(\bar a) = a\}$.



Using this formula it is clear why intersection works well:
if $(S_i)_i$ is a family of substructures of $A$, i.e. a family of subsets satisfying the equation above, then we have
$$
\begin{align*}
G(f) \cap \Bigl(\bigcap_i S_i\Bigr)^n \times A &= G(f) \cap \bigcap_i (S_i^n \times A)\\
&= \bigcap_i \bigl(G(f) \cap (S_i^n \times A)\bigr)\\
&= \bigcap_i \bigl(G(f) \cap S_i^{n+1}\bigr) \text{ (because the $S_i$ are substructures)}\\
&= G(f) \cap \Bigl(\bigcap_i S_i\Bigr)^{n+1}.
\end{align*}$$



What makes this work is the fact that products commute with intersections, i.e. the following holds:
$$\bigcap_i (A_i^1 \times \dots \times A_i^n) = \Bigl(\bigcap_i A_i^1\Bigr) \times \dots \times \Bigl(\bigcap_i A_i^n\Bigr).$$
A similar formula does not hold for unions; in general we only have
$$\bigcup_i (A_i^1 \times \dots \times A_i^n) \subsetneq \Bigl(\bigcup_i A_i^1\Bigr) \times \dots \times \Bigl(\bigcup_i A_i^n\Bigr).$$



So if you like, a deep reason why intersections of substructures work so well is that intersection commutes with products.
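The graph criterion above can be tested mechanically on a small structure. Here is a Python sketch (our own illustration, with $A = \Bbb Z/6\Bbb Z$ and $f(a,b) = a + b \bmod 6$) checking that the criterion holds exactly for subsets closed under the operation:

```python
# Test the graph criterion G(f) ∩ (S^n × A) = G(f) ∩ S^(n+1)
# for the binary operation f(a, b) = a + b mod 6 on A = Z/6Z.
from itertools import product

A = set(range(6))
G_f = {(a, b, (a + b) % 6) for a, b in product(A, A)}   # graph of f

def satisfies_graph_criterion(S):
    """True iff G(f) ∩ (S^2 × A) == G(f) ∩ S^3, i.e. S is closed under f."""
    lhs = {t for t in G_f if t[0] in S and t[1] in S}
    rhs = {t for t in G_f if all(x in S for x in t)}
    return lhs == rhs

assert satisfies_graph_criterion({0, 3})        # a subgroup: criterion holds
assert satisfies_graph_criterion({0, 2, 4})     # another subgroup
assert not satisfies_graph_criterion({0, 2, 3}) # not closed: 2 + 3 = 5 escapes
```

The failing triple $(2, 3, 5)$ lies in $G(f) \cap (S^2 \times A)$ but not in $G(f) \cap S^3$, witnessing the failure of closure.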






                                                  answered 9 hours ago









                                                  Giorgio Mossa

Since no one has explained this from a categorical perspective yet, let me try to offer another point of view. Each of the types of objects you mention (groups, rings, fields, vector spaces) forms a concrete category. That is, every group, ring, field, or vector space is a set equipped with the data of extra structure, and the homomorphisms between them are set maps which preserve that extra structure.



Another way we might say the above is that if $\mathcal{C}$ is the category of any of the above algebraic objects and their morphisms, we have a forgetful functor
\begin{align*}
U : \mathcal{C} &\to \mathsf{Set}\\
A &\mapsto UA,
\end{align*}

which sends each algebraic structure $A$ to its underlying set $UA$ and each homomorphism of algebraic structures $f : A \to B$ to the underlying function on sets $Uf : UA \to UB.$



In each of these situations (well, except when $\mathcal{C}$ is the category of fields), the forgetful functor has a left adjoint, the free object functor. Explicitly, this means that if you're given a group, ring, or vector space (more generally module) $A$ and a set $S,$ then there is a natural bijection
$$
\{\textrm{homomorphisms of algebraic structures } f : F(S) \to A\} \cong \{\textrm{maps of sets } g : S \to UA\},
$$

where $F(S)$ denotes the free [group, ring, vector space, module...] on $S.$ This is essentially the definition of a free object: to give a homomorphism of the free group, ring, or vector space $F(S)$ on a set $S$ to another group, ring, or vector space $A,$ you need to give a map of sets $S \to UA.$ Think of $S$ as being the set of generators of $F(S),$ and the "freeness" means that there are no relations between these generators other than the relations forced by the axioms of the algebraic structure.



For example, the free vector space on a set $S$ may be described as the vector space $F(S)$ with basis $\{e_s \mid s \in S\}$ indexed by the elements of $S.$ To give a map from $F(S)$ to any other vector space $V,$ you need only specify where the basis elements $e_s$ are sent, and this is completely determined by a set map $S \to UV$ (again, $UV$ is the underlying set of the vector space $V$).
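This universal property can be made concrete in a few lines of Python (an illustrative sketch of our own; `extend_by_linearity` is a name we introduce here). Elements of the free vector space $F(S)$ are represented as dicts mapping generators to coefficients, and a set map on the generators extends uniquely to a linear map:

```python
# Sketch of the universal property of the free vector space:
# a set map g: S -> V extends uniquely by linearity to F(S) -> V.
# Vectors in V are tuples of floats; elements of F(S) are dicts
# {generator: coefficient}, i.e. formal linear combinations.

def extend_by_linearity(g):
    """Given g: S -> V as a dict, return the induced linear map
    on formal linear combinations of the generators."""
    def linear(comb):
        dim = len(next(iter(g.values())))
        out = [0.0] * dim
        for s, c in comb.items():          # sum c * g(s) componentwise
            for i, x in enumerate(g[s]):
                out[i] += c * x
        return tuple(out)
    return linear

# g sends the generators 'a', 'b' to vectors in R^2
g = {'a': (1.0, 0.0), 'b': (1.0, 1.0)}
f = extend_by_linearity(g)

assert f({'a': 2.0, 'b': -1.0}) == (1.0, -1.0)  # 2*g(a) - g(b)
assert f({'a': 1.0}) == g['a']                  # f restricted to S agrees with g
```

The second assertion is the unit of the adjunction: composing with the inclusion of generators recovers the original set map.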



As another example, the free commutative ring on a set $S$ is the ring $\Bbb Z[x_s \mid s \in S]$, the polynomial ring over $\Bbb Z$ with one variable for each element of $S.$



                                                      Now that I've set this up, the point is that intersections are limits in the category of sets, and that forgetful functors (or more generally, right adjoints) play nicely with limits. In particular, if $S$ and $T$ are subsets of some set $X,$ then we may consider the diagram



$$
\require{AMScd}
\begin{CD}
Y @>>> S\\
@VVV @VVV \\
T @>>> X,
\end{CD}
$$

where $Y$ is some unspecified set together with maps such that the diagram commutes. The intersection $S \cap T$ has the nice property that any set $Y$ with maps to $S$ and $T$ as in the diagram will factor uniquely through a map $Y \to S \cap T.$ This is the statement that $S \cap T$ is the limit of the diagram above (without the $Y$).
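In the category of sets this limit is a pullback, and for two subset inclusions it can be computed explicitly as pairs with a common image. A small Python sketch (our own illustration):

```python
# The pullback of two subset inclusions S -> X <- T, computed as
# the pairs agreeing in X, is in bijection with S ∩ T.
from itertools import product

X = set(range(10))
S = {1, 2, 3, 4}
T = {3, 4, 5}

# pullback of the inclusions: pairs (s, t) with the same image in X
pullback = {(s, t) for s, t in product(S, T) if s == t}

assert {s for s, _ in pullback} == S & T  # projection identifies it with S ∩ T
```

Projecting onto either coordinate identifies the pullback with the intersection, which is the limit described in the text.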



By abstract nonsense arguments, right adjoints (like the forgetful functor in these cases) preserve limits. (We also sometimes have even nicer results, but let me not get in too deep.) Preservation of limits means that if we have a limit of algebras $\varprojlim A_i$ over some diagram, then the underlying set of the limit is canonically isomorphic to the limit $\varprojlim UA_i$ (in the category of sets) of the underlying sets of the algebras.



So, if you have subalgebras $A_1, A_2$ of a given algebra $A,$ and you consider the limit $B$ of these inclusions as we did for sets:
$$
\require{AMScd}
\begin{CD}
B @>>> A_1\\
@VVV @VVV \\
A_2 @>>> A,
\end{CD}
$$



then the underlying set of the limit $B$ is the limit of the underlying sets $UA_1$ and $UA_2,$ which is simply the intersection $UA_1 \cap UA_2.$



                                                      The other punchline is that the union of two sets $S$ and $T$ (which are subsets of some ambient set $X$) is the colimit of an appropriate diagram. However, if $S$ and $T$ are the underlying sets of some algebraic objects $S = UA_1,$ $T = UA_2,$ the forgetful functor does not preserve colimits (even if $X$ is the underlying set of some large algebra $X = UA$). So, unless you are surprisingly lucky, the smallest algebra containing two given algebras $A_1$ and $A_2$ (which is a colimit) will not be the same as the smallest set containing $UA_1$ and $UA_2.$
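The gap between the colimit and the set-theoretic union is easy to exhibit. In $\Bbb Z/12\Bbb Z$, the subalgebra generated by the union of two subgroups (the join, a colimit) is strictly larger than the union itself; a Python sketch of our own (`generated_subgroup` is a name we introduce):

```python
# In Z/12Z, the subgroup generated by the union of two subgroups
# (a colimit in groups) is strictly larger than the set union.

def generated_subgroup(gens, n):
    """Close a subset of Z/nZ under addition mod n (the subgroup it generates)."""
    S = set(gens) | {0}
    while True:
        new = {(a + b) % n for a in S for b in S} | S
        if new == S:
            return S
        S = new

n = 12
H = generated_subgroup({4}, n)   # {0, 4, 8}
K = generated_subgroup({6}, n)   # {0, 6}

join = generated_subgroup(H | K, n)
assert H | K == {0, 4, 6, 8}
assert join == {0, 2, 4, 6, 8, 10}  # the subgroup generated by 2
assert join > H | K                 # the colimit is strictly bigger
```

The element $4 + 6 = 10$ and, from it, $2$ must be adjoined before the union closes up, which is exactly the "unlucky" case described above.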



                                                      Many others have already expressed that this is also related to the fact that products and intersections commute but the same is not true of products and unions: this is also a categorical fact! Products and intersections are both examples of limits, but unions are colimits. Limits commute with limits, but limits do not necessarily commute with colimits, unless certain nice conditions hold.



All in all, the failure of existence of an algebraic structure on a union is a combination of a number of categorical facts, which are much more general than the specific situations you mention. While proving that the forgetful functors described have the properties I claim essentially comes down to arguments like those in the other answers, I prefer this perspective because taking "unions" or "intersections" is somehow an unnatural thing to do when you have things which aren't sets: you want to combine your algebraic objects in ways that result in objects with the same algebraic structure (e.g., via taking limits and colimits or using other categorical constructions). The fact that the underlying set of a limit coincides with the limit of the underlying sets is a result of nice properties that the forgetful functor in question has.



Note: I said we didn't consider fields above, and that is because the category of fields is particularly badly behaved: fields are rather restrictive.






                                                      share|cite|improve this answer



























                                                        up vote
                                                        0
                                                        down vote













                                                        Since no one has explained this from a categorical perspective yet, let me try to offer another point of view. Each of the types of objects you mention (groups, rings, fields, vector spaces) form a concrete category. That is, every group, ring, field, or vector space is a set equipped with the data of extra structure, and the homomorphisms between them are set maps which preserve that extra structure.



                                                        Another way we might say the above is that if $mathcal{C}$ is the category of any of the above algebraic objects and their morphisms, we have a forgetful functor
                                                        begin{align*}
                                                        U : mathcal{C}&tomathsf{Set}\
                                                        A&mapsto UA,
                                                        end{align*}

                                                        which sends each algebraic structure $A$ to its underlying set $UA$ and each homomorphism of algebraic structures $f : Ato B$ to the underlying function on sets $Uf : UAto UB.$



                                                        In each of these situations (well, except when $mathcal{C}$ is the category of fields), the forgetful functor has a left adjoint - the free object functor. Explicitly, this means that if you're given a group, ring, or vector space (more generally module) $A$ and a set $S,$ then there is a natural bijection
                                                        $$
                                                        {textrm{homomorphisms of algebraic structures }f : F(S)to A}cong{textrm{maps of sets }g : Sto UA},
                                                        $$

                                                        where $F(S)$ denotes the free [group, ring, vector space, module...] on $S.$ This is essentially the definition of a free object: to give a homomorphism of the free group, ring, or vector space $F(S)$ on a set $S$ to another group, ring, or vector space $A,$ you need to give a map of sets $Sto UA.$ Think of $S$ as being the set of generators of $F(S),$ and the "freeness" means that there are no relations between these generators other than the relations forced by the axioms of the algebraic structure.



                                                        For example, the free vector space on a set $S$ may be described as the vector space $F(S)$ with basis ${e_smid sin S}$ indexed by the elements of $s.$ To give a map from $F(S)$ to any other vector space $V,$ you need only specify where the basis elements $e_s$ are sent, and this is completely determined by a set map $Sto UV$ (again, $UV$ is the underlying set of the vector space $V$).



                                                        As another example, the free commutative ring on a set $S$ is the ring $Bbb Z[x_smid sin S]$ - the polynomial ring over $Bbb Z$ with one variable for each element of $s.$



                                                        Now that I've set this up, the point is that intersections are limits in the category of sets, and that forgetful functors (or more generally, right adjoints) play nicely with limits. In particular, if $S$ and $T$ are subsets of some set $X,$ then we may consider the diagram



                                                        $$
                                                        require{AMScd}
                                                        begin{CD}
                                                        Y @>>> S\
                                                        @VVV @VVV \
                                                        T @>>> X,
                                                        end{CD}
                                                        $$

                                                        where $Y$ is some unspecified set together with maps such that the diagram commutes. The intersection $Scap T$ has the nice property that any set $Y$ with maps to $S$ and $T$ as in the diagram will factor uniquely through a map $Yto Scap T.$ This is the statement that $Scap T$ is the limit of the diagram above (without the $Y$).



                                                        By abstract nonsense arguments, right adjoints (like the forgetful functor in these cases) preserve limits. (We also sometimes have even nicer results, but let me not get in too deep.) Preservation of limits means that if we have a limit of algebras $varprojlim A_i$ over some diagram, then the underlying set of the limit is canonically isomorphic to the limit $varprojlim UA_i$ (in the category of sets) of the underlying sets of the algebras.



                                                        So, if you have subalgebras $A_1,A_2$ of a given algebra $A,$ and you consider the limit $B$ of these inclusions as we did for sets:
                                                        $$
                                                        require{AMScd}
                                                        begin{CD}
                                                        B @>>> A_1\
                                                        @VVV @VVV \
                                                        A_2 @>>> A,
                                                        end{CD}
                                                        $$



then the underlying set of the limit $B$ is the limit of the underlying sets $UA_1$ and $UA_2,$ which is simply the intersection $UA_1\cap UA_2.$



                                                        The other punchline is that the union of two sets $S$ and $T$ (which are subsets of some ambient set $X$) is the colimit of an appropriate diagram. However, if $S$ and $T$ are the underlying sets of some algebraic objects $S = UA_1,$ $T = UA_2,$ the forgetful functor does not preserve colimits (even if $X$ is the underlying set of some large algebra $X = UA$). So, unless you are surprisingly lucky, the smallest algebra containing two given algebras $A_1$ and $A_2$ (which is a colimit) will not be the same as the smallest set containing $UA_1$ and $UA_2.$
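A concrete instance of this failure, as a small Python check (the helper `is_closed_under_addition` is my own name, not from the answer): in $\Bbb Z/6,$ the subgroups $\{0,2,4\}$ and $\{0,3\}$ intersect in a subgroup, but their union is not even closed under addition:

```python
def is_closed_under_addition(subset, n):
    """Check closure of a subset of Z/n under addition mod n."""
    return all((a + b) % n in subset for a in subset for b in subset)

n = 6
A1 = {0, 2, 4}  # subgroup of Z/6
A2 = {0, 3}     # subgroup of Z/6

assert is_closed_under_addition(A1 & A2, n)      # intersection {0}: a subgroup
assert not is_closed_under_addition(A1 | A2, n)  # union: 2 + 3 = 5 escapes it
```

The smallest subgroup containing both is all of $\Bbb Z/6,$ strictly larger than the union $\{0,2,3,4\}$: the colimit in groups does not agree with the colimit (union) of the underlying sets.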



Many others have already expressed that this is also related to the fact that products and intersections commute but the same is not true of products and unions: this is also a categorical fact! Products and intersections are both examples of limits, but unions are colimits. Limits commute with limits, but limits do not in general commute with colimits unless certain nice conditions hold (for instance, filtered colimits commute with finite limits in $\mathsf{Set}$).
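The commuting statement is easy to verify on small sets; here is a hedged Python sketch (set names are mine) showing products commuting with intersections but not with unions:

```python
from itertools import product

def cart(U, V):
    """Cartesian product of two sets, as a set of pairs."""
    return set(product(U, V))

S1, T1 = {1, 2}, {2, 3}
S2, T2 = {4, 5}, {5, 6}

# Products commute with intersections (both are limits):
assert cart(S1 & T1, S2 & T2) == cart(S1, S2) & cart(T1, T2)

# ...but not with unions (a colimit): e.g. (1, 6) lies in the
# product of the unions but in neither individual product.
assert cart(S1 | T1, S2 | T2) != cart(S1, S2) | cart(T1, T2)
```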



All in all, the failure of existence of an algebraic structure on a union is a combination of a number of categorical facts, which are much more general than the specific situations you mention. While proving that the forgetful functors described have the properties I claim essentially comes down to making arguments like those in the other answers, I prefer this perspective because taking "unions" or "intersections" is somehow an unnatural thing to do when you have things which aren't sets - you want to combine your algebraic objects in ways that result in objects with the same algebraic structure (e.g., via taking limits and colimits or using other categorical constructions). The fact that the underlying set of a limit coincides with the limit of the underlying sets is a result of the nice properties that the forgetful functor in question has.



                                                        Note: I said we didn't consider fields above, and that is because the category of fields is particularly badly behaved, because fields are rather restrictive.






edited 1 hour ago

answered 8 hours ago

Stahl