Let $X$ be a Banach space. Then every absolutely convergent series in $X$ converges in $X$.














My attempt:

Let $\sum x_k$ be absolutely convergent in $X$ $\implies$

$\sum \|x_k\|$ converges in $\mathbb{R}$ $\implies$

$\forall \epsilon > 0, \exists N(\epsilon)$ s.t. $\forall n > N(\epsilon)$ we have
$\left|\, \left\|\sum_{k=1}^{n} x_k\right\| - L \,\right| < \epsilon$, where $L$ is the limit in $\mathbb{R}$. Now fix an $\epsilon > 0$; then there exists some $N$ s.t. for all $n > N$

$\left|\, \left\|\sum_{k=1}^{n} x_k\right\| - L \,\right| < \epsilon.$

By the reverse triangle inequality, $\left\|\sum_{k=1}^{n} x_k\right\| < \epsilon + |L| = \epsilon'$. Similarly, for $m > n > N$ we have

$\left\|\sum_{k=1}^{m} x_k\right\| < \epsilon'.$

By the triangle inequality, we get $\left|\, \left\|\sum_{k=1}^{n} x_k\right\| - \left\|\sum_{k=1}^{m} x_k\right\| \,\right| < 2\epsilon'$.

But $\left\|\sum_{k=1}^{n} x_k - \sum_{k=1}^{m} x_k\right\| = \left\|\sum_{k=n+1}^{m} x_k\right\| = \left|\, \left\|\sum_{k=1}^{n} x_k\right\| - \left\|\sum_{k=1}^{m} x_k\right\| \,\right| < 2\epsilon'$.

Hence $\left\{\sum_{k=1}^{n} x_k\right\}$ is a Cauchy sequence in $X$, thus has a limit in $X$, so $\sum x_k$ converges in $X$.

Is my proof correct?



























Tags: real-analysis, functional-analysis, proof-verification, banach-spaces, absolute-convergence






asked Dec 31 '18 at 17:32, edited Dec 31 '18 at 17:51 by Dreamer123






















2 Answers



















You've more or less got the right idea, but we've got to put the norm signs in the right place, and there is a place near the end where we need to use the triangle inequality slightly differently.

The fact that $\sum_{k=1}^{\infty} \|x_k\|$ converges and is equal to $L$ (say) means that for any $\epsilon > 0$, there exists an $N(\epsilon) \in \mathbb{N}$ such that

$$ n \geq N(\epsilon) \implies \left| \sum_{k=1}^{n} \|x_k\| - L \right| < \epsilon. $$

Following through with your original approach, we find that
$$ m > n \geq N(\epsilon) \implies \sum_{k=n+1}^{m} \|x_k\| < 2\epsilon. $$

[To spell it out, I'm using the triangle inequality like this:
$$ \sum_{k=n+1}^{m} \|x_k\| = \left| \left( \sum_{k=1}^{m} \|x_k\| - L \right) - \left( \sum_{k=1}^{n} \|x_k\| - L \right) \right| \leq \left| \sum_{k=1}^{m} \|x_k\| - L \right| + \left| \sum_{k=1}^{n} \|x_k\| - L \right| < 2\epsilon. $$
]

To show that $n \mapsto \sum_{k=1}^{n} x_k$ is a Cauchy sequence, we must use the triangle inequality like this:

$$ \left\| \sum_{k=1}^{m} x_k - \sum_{k=1}^{n} x_k \right\| = \left\| \sum_{k=n+1}^{m} x_k \right\| \leq \sum_{k=n+1}^{m} \|x_k\|. $$

So for any $\epsilon > 0$, we have

$$ m > n \geq N(\epsilon) \implies \left\| \sum_{k=1}^{m} x_k - \sum_{k=1}^{n} x_k \right\| < 2\epsilon, $$

which implies that $n \mapsto \sum_{k=1}^{n} x_k$ is Cauchy, and hence convergent.






answered Dec 31 '18 at 17:50, edited Dec 31 '18 at 18:48 by Kenny Wong
– Dreamer123 (Dec 31 '18 at 18:47): In the last two lines, the triangle inequality is in the norm of $X$, not in absolute value.

– Kenny Wong (Dec 31 '18 at 18:48): @Dreamer123 Thanks!




















No, your proof does not work, because:

1. The conclusion that you get from the reverse triangle inequality is that
$$ \left\lVert \sum_{k=1}^{n} x_k \right\rVert < \varepsilon + \lvert L \rvert. $$

2. If you define $\varepsilon' = \varepsilon + \lvert L \rvert$, then $\varepsilon'$ is not an arbitrary number greater than $0$, as it should be.

In order to get a proof, apply the Cauchy criterion for series.
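To spell out the route suggested here (a sketch added for completeness, not part of the original answer): since $\sum \|x_k\|$ converges in $\mathbb{R}$, its partial sums are Cauchy, so for every $\varepsilon > 0$ there is an $N$ with
$$ m > n \geq N \implies \sum_{k=n+1}^{m} \|x_k\| < \varepsilon, $$
and then, by the triangle inequality in $X$,
$$ \left\| \sum_{k=1}^{m} x_k - \sum_{k=1}^{n} x_k \right\| = \left\| \sum_{k=n+1}^{m} x_k \right\| \leq \sum_{k=n+1}^{m} \|x_k\| < \varepsilon, $$
so the partial sums of $\sum x_k$ are Cauchy in $X$ and converge by completeness, with no auxiliary $\varepsilon'$ needed.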






answered Dec 31 '18 at 17:38 by José Carlos Santos
– Dreamer123 (Dec 31 '18 at 17:56): Why is that? Since $\epsilon$ is chosen arbitrarily and $L$ is fixed, $\epsilon'$ is arbitrary.

– José Carlos Santos (Dec 31 '18 at 17:57): Not so, since $\varepsilon' \geqslant L$.










