linear combination of some matrices is identity matrix












Assume $T$ is an $n\times n$ matrix over a number field $\mathbb{F}$. If $\lambda$ is not an eigenvalue of $T$, we know $T-\lambda E$ is an invertible matrix, where $E$ is the identity matrix. Now suppose we have $n$ distinct numbers $\lambda_1,\cdots,\lambda_n\in\mathbb{F}$, none of which is an eigenvalue of $T$. How can we prove that there exist $n$ numbers $a_1,\cdots,a_n\in\mathbb{F}$ satisfying $$\sum_{k=1}^n a_k(T-\lambda_k E)^{-1}=E\,?$$




I don't have much of an idea. I did figure out that it suffices to prove the case $\mathbb{F}=\mathbb{C}$: once we have $n$ complex numbers satisfying the equation, viewing $\mathbb{C}$ as a linear space over $\mathbb{F}$ lets us extract $n$ numbers in $\mathbb{F}$ that also satisfy it. Also, if $p(x)$ is the characteristic polynomial of $T$ and $$p(x)=g_k(x)(x-\lambda_k)+p(\lambda_k),$$ then $$(T-\lambda_k E)^{-1}=\frac{g_k(T)}{p(\lambda_k)}.$$ But I don't know how to continue. Any help would be appreciated.
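For reference, a quick numerical sanity check supports the claim (a sketch assuming NumPy; the random matrix, the shifts placed above the spectrum, and the least-squares solve are illustrative choices, not part of the problem):

```python
import numpy as np

# Flatten each resolvent (T - lambda_k E)^{-1} into a column and solve the
# n^2 x n least-squares system for coefficients a_k matching vec(E).
rng = np.random.default_rng(0)
n = 5
T = rng.standard_normal((n, n))
E = np.eye(n)

# Shifts strictly above every real part of the spectrum, hence not eigenvalues.
lams = np.linalg.eigvals(T).real.max() + 1.0 + np.arange(n)

resolvents = [np.linalg.inv(T - lam * E) for lam in lams]
A = np.column_stack([R.ravel() for R in resolvents])
a, *_ = np.linalg.lstsq(A, E.ravel(), rcond=None)

residual = sum(ak * R for ak, R in zip(a, resolvents)) - E
print(np.abs(residual).max())  # at roundoff level if the claim holds
```

If the claim is true, this overdetermined system has an exact solution and the printed residual should be at roundoff level; no formula for the $a_k$ is needed for the check.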







linear-algebra polynomials eigenvalues-eigenvectors matrix-equations matrix-analysis






asked Dec 10 '18 at 13:16 by hctb
edited Dec 10 '18 at 17:02 by user593746












  • (You're missing a minus sign...) If you can show that the $g_k$ are independent then they must span the space of polynomials of degree at most $n-1$, since that space has dimension $n$. So there exist scalars with $\sum c_kg_k=1$ and you're done.
    – David C. Ullrich, Dec 10 '18 at 16:10


















2 Answers






Let $p(x)=\det(xI-T)$ be the characteristic polynomial of $T$, and let $q(x)=\prod_{i=1}^n(x-\lambda_i)$. Then
$$f(x)=q(x)-p(x)$$
is a polynomial of degree at most $n-1$, so
$$\frac{f(x)}{q(x)}=\sum_{i=1}^n\frac{a_i}{x-\lambda_i}$$
for some $a_1,a_2,\ldots,a_n\in \Bbb F$. To be precise, since $q(\lambda_i)=0$,
$$a_i=\frac{f(\lambda_i)}{\prod_{j\neq i}(\lambda_i-\lambda_j)}=-\frac{p(\lambda_i)}{\prod_{j\neq i}(\lambda_i-\lambda_j)}=-\frac{p(\lambda_i)}{q'(\lambda_i)}.$$
Multiplying the decomposition by $q(x)$, evaluating at $T$, and using $p(T)=0$ (Cayley–Hamilton) gives
$$q(T)=q(T)-p(T)=f(T)=\left(\sum_{i=1}^na_i(T-\lambda_i E)^{-1}\right) q(T).$$
Since $q(T)$ is invertible,
$$E=\sum_{i=1}^na_i(T-\lambda_iE)^{-1}=-\sum_{i=1}^n\frac{p(\lambda_i)}{\prod_{j\neq i}(\lambda_i-\lambda_j)}(T-\lambda_i E)^{-1}.$$
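To see the closed form in action, here is a small numerical check (a sketch assuming NumPy; the matrix and the shifts are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
T = rng.standard_normal((n, n))
E = np.eye(n)

# Distinct shifts that avoid the spectrum of T.
lams = np.linalg.eigvals(T).real.max() + 1.0 + np.arange(n)

p = lambda x: np.linalg.det(x * E - T)          # p(x) = det(xI - T)

def q_prime(i):                                 # q'(lambda_i) = prod_{j != i} (lambda_i - lambda_j)
    return np.prod([lams[i] - lams[j] for j in range(n) if j != i])

a = np.array([-p(lams[i]) / q_prime(i) for i in range(n)])   # a_i = -p(lambda_i)/q'(lambda_i)

S = sum(a[i] * np.linalg.inv(T - lams[i] * E) for i in range(n))
print(np.abs(S - E).max())   # tiny: S reproduces the identity matrix E
```

If the derivation above is correct, $S$ reproduces $E$ up to roundoff.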







answered Dec 10 '18 at 16:57 by user593746
The accepted answer is simply excellent. Partial fractions - not just for calculus!

But the accepted answer doesn't say anything about why $f/q$ has a partial-fraction decomposition as claimed. It's possible to give a proof that "partial fractions work" in $\Bbb C(x)$ using a little bit of complex analysis; in fact I'm guilty of publishing such a proof, in Complex Made Simple. Given that, and the fact that the OP specifies that $\Bbb F$ is a subfield of $\Bbb C$ and says "I figured that it's sufficient to prove the case $\Bbb F=\Bbb C$", some readers might get the idea that the argument is specific to the complex numbers.

No, it works over any field. And here, since the $\lambda_j$ are distinct, it turns out to be much simpler than I realized until yesterday, so I thought I'd share the argument.

Notation is as above, except that $\Bbb F$ is an arbitrary field. Define $$q_k(x)=\frac{q(x)}{x-\lambda_k}=\prod_{j\ne k}(x-\lambda_j).$$ We need to show that there exist scalars $a_j$ with $$f=\sum a_jq_j.$$ Letting $V$ be the space of polynomials of degree at most $n-1$, it's enough to show that

$q_1,\dots,q_n$ span $V$.

Since $\dim(V)=n$ this is the same as

$q_1,\dots,q_n$ are independent.

And that's more or less obvious: say $$\sum c_jq_j=0.$$ Noting that $q_j(\lambda_k)=0$ for $j\ne k$, evaluating at $\lambda_k$ shows that $$0=\sum_jc_jq_j(\lambda_k)=c_kq_k(\lambda_k);$$ hence $c_k=0$, since $q_k(\lambda_k)\ne0$.
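To make the spanning argument concrete, here is a small check (a sketch assuming NumPy; the nodes and the test polynomial are random illustrative choices, and the explicit coefficient read-off is just the same evaluation trick run in reverse):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
lams = np.sort(rng.standard_normal(n))        # distinct nodes lambda_1..lambda_n

def q_k(k, x):
    """q_k(x) = prod_{j != k} (x - lambda_j)."""
    return np.prod([x - lams[j] for j in range(n) if j != k])

# Evaluation matrix M[i, k] = q_k(lambda_i) is diagonal with nonzero diagonal,
# so the q_k are independent and therefore span V (which has dimension n).
M = np.array([[q_k(k, lams[i]) for k in range(n)] for i in range(n)])
print(np.linalg.matrix_rank(M))               # prints n

# Expand a random f with deg f <= n-1: evaluating f = sum_k c_k q_k at lambda_k
# kills every term but the k-th, so c_k = f(lambda_k) / q_k(lambda_k).
f = np.polynomial.Polynomial(rng.standard_normal(n))
c = np.array([f(lams[k]) / q_k(k, lams[k]) for k in range(n)])

xs = rng.standard_normal(20)                  # fresh test points
recon = sum(c[k] * np.array([q_k(k, x) for x in xs]) for k in range(n))
print(np.abs(recon - f(xs)).max())            # close to zero
```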







answered Dec 12 '18 at 14:51 by David C. Ullrich (edited Dec 12 '18 at 16:52)





























