Prove that two vectors are linearly independent in the vector space of vectors of length 2 with entries of...
I have the following vectors:

$V_1=(e^t,te^t),\quad V_2=(1,t)$

I want to prove that these two vectors are linearly independent in the vector space of vectors of length 2 whose entries are real-valued functions.

When I write the equation

$C_1 V_1^T + C_2 V_2^T = 0$

in matrix notation, I cannot manage to show that the coefficients must be $0$.

I do note that if I fix $t$ at a particular value, then the vectors are linearly dependent, which is somewhat odd, because I know I need to prove that for general $t$ they are not.

Would appreciate some thoughts.
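The observation about a fixed $t$ is in fact correct: for any single value of $t$ we have $V_1(t)=e^t\,V_2(t)$, so the two resulting vectors of numbers are proportional. A quick numerical sanity check of this identity (a sketch using NumPy; the particular value of $t$ is arbitrary):

```python
import numpy as np

# At any fixed t, V1(t) = e^t * V2(t), so the two R^2 vectors are proportional.
t = 1.7  # arbitrary fixed value
V1 = np.array([np.exp(t), t * np.exp(t)])
V2 = np.array([1.0, t])
print(np.allclose(V1, np.exp(t) * V2))  # True
```

This is why dependence at each fixed $t$ does not contradict independence as function-valued vectors: the proportionality factor $e^t$ changes with $t$, so no single pair of constants works for all $t$ at once.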










linear-algebra vector-spaces

asked Dec 9 at 15:47 by chemist
2 Answers
You get the linear equations:

$$\begin{cases}\text{I}&c_1e^t+c_2=0\\[4pt]\text{II}&c_1te^t+c_2t=0\end{cases}$$

The above two equations are true for any value of $\;t\in\Bbb R\;$ (I'm assuming the functions are defined on the whole real line), so take some particular values:

$$t=0\stackrel{\text{I}}{\implies} c_1+c_2=0\implies c_2=-c_1\stackrel{\text{II}}{\implies} c_1t\left(e^t-1\right)=0$$

Take now $\;t=1\;$ and you get $\;c_1(e-1)=0\implies c_1=0\;$, and thus also $\;c_2=0\;$, and we're done.
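As a sanity check, the two substitutions $t=0$ and $t=1$ used in the answer can be handed to a computer algebra system. This is just a sketch using SymPy; the symbol names are chosen here for illustration:

```python
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')

# Components of the dependence relation c1*V1 + c2*V2 = 0
eq1 = c1 * sp.exp(t) + c2          # first component:  c1*e^t + c2
eq2 = c1 * t * sp.exp(t) + c2 * t  # second component: c1*t*e^t + c2*t

# Plug in the particular values t = 0 and t = 1, as in the answer,
# and solve the resulting 2x2 linear system for the constants.
system = [eq1.subs(t, 0), eq1.subs(t, 1)]
sol = sp.solve(system, [c1, c2], dict=True)
print(sol)  # [{c1: 0, c2: 0}]
```

The system has only the trivial solution, matching the conclusion $c_1=c_2=0$.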






answered Dec 9 at 16:00 by DonAntonio
• So this is a valid proof; isn't there a more robust method for these kinds of cases? I did it, but for some reason it seems too simple for a proof.
  – chemist
  Dec 9 at 16:16










• I can't see anything "robust" in this proof. It just goes by the very simple definition ...
  – DonAntonio
  Dec 9 at 16:23










• Yeah, I get that, but what if there were, say, $k$ vectors? This would be the only way, because if I can represent it in a matrix and isolate the part that depends on $t$, then it's just matrix reduction.
  – chemist
  Dec 9 at 16:36










• @chemist I don't quite follow you. The above is a proof. Maybe there are other ones, though I suspect none as simple and elementary as this one. If there were $\;k\;$ vectors one could probably do the same, although it would probably be a little more difficult.
  – DonAntonio
  Dec 9 at 17:33
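Following the comment's idea for $k$ vectors: one possible numerical approach is to evaluate all the function-valued vectors at several sample values of $t$, stack the evaluations into a matrix, and check its rank. Full column rank certifies independence; rank deficiency at the chosen samples does not by itself prove dependence. The helper `independent` below is a hypothetical illustration, not a standard library function:

```python
import numpy as np

def independent(vectors, ts):
    """Check linear independence of function-valued vectors numerically.

    vectors: list of k callables t -> array of shape (n,)
    ts:      sample values of t; each vector's evaluations are concatenated
             into one long column, and the k columns are rank-tested.
    """
    M = np.vstack([np.concatenate([v(t) for t in ts]) for v in vectors]).T
    return np.linalg.matrix_rank(M) == len(vectors)

V1 = lambda t: np.array([np.exp(t), t * np.exp(t)])
V2 = lambda t: np.array([1.0, t])
print(independent([V1, V2], ts=[0.0, 1.0]))  # True
```

With the samples $t=0$ and $t=1$ this reproduces the argument in the answer: the stacked $4\times 2$ matrix has rank $2$ because $e\ne 1$.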

































Linear dependence of ${\bf v}_1$ and ${\bf v}_2$ in the vector space $X$ of functions ${\bf x}\colon t\mapsto\bigl(x_1(t),x_2(t)\bigr)$ means that there are fixed numbers $\lambda$, $\mu\in{\mathbb R}$, not both zero, such that $$\lambda\,{\bf v}_1(t)+\mu\,{\bf v}_2(t)={\bf 0}\qquad\forall\, t.\tag{1}$$
Inspecting the first component of $(1)$ we obtain the condition $\lambda e^t+\mu=0$. This condition cannot be satisfied for all $t$ with fixed numbers $\lambda$, $\mu$ other than $\lambda=\mu=0$. It follows that the two given vectors are linearly independent in $X$.
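The claim that $\lambda e^t+\mu=0$ cannot hold for all $t$ unless $\lambda=\mu=0$ can be checked mechanically: if the relation holds identically in $t$, its $t$-derivative $\lambda e^t$ must vanish as well. A sketch using SymPy (the symbol names are illustrative):

```python
import sympy as sp

t, lam, mu = sp.symbols('t lam mu')

# First component of the dependence relation: lam*e^t + mu = 0 for all t.
expr = lam * sp.exp(t) + mu

# If expr vanishes identically in t, so does its derivative lam*e^t.
dexpr = sp.diff(expr, t)
sol = sp.solve([expr.subs(t, 0), dexpr.subs(t, 0)], [lam, mu], dict=True)
print(sol)  # [{lam: 0, mu: 0}]
```

The derivative condition forces $\lambda=0$, and then the original condition forces $\mu=0$, as in the answer.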






answered Dec 9 at 20:27 by Christian Blatter