Extending to a local frame that agrees with a given orientation























Suppose that $(e_1, \ldots, e_k)$ is an oriented basis for $T_pM$, where $M$ is an oriented Riemannian manifold. In general, we know that we can extend to a smooth local frame $(X_1, \ldots, X_k)$ on $U \ni p$ such that $X_i\rvert_p = e_i$ for each $i$.
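
(For reference, a sketch of the standard extension, with the chart and the coefficients $a_i^j$ introduced here rather than taken from the post: pick a coordinate chart $(V, x^1, \ldots, x^k)$ around $p$, write $e_i = a_i^j\,\partial/\partial x^j\rvert_p$, and set
$$X_i\rvert_q := a_i^j\,\frac{\partial}{\partial x^j}\Big\rvert_q \qquad \text{for } q \in V,$$
with the same constant coefficients. Since the matrix $(a_i^j)$ is invertible, $(X_1\rvert_q, \ldots, X_k\rvert_q)$ is a basis of $T_qM$ for every $q \in V$, and $X_i\rvert_p = e_i$.)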



But can we further stipulate that $(X_1\rvert_q, \ldots, X_k\rvert_q)$ is oriented for each $q \in U$?



I have tried thinking of ways to shrink $U$ in a suitable way. I have also played around with using the orientation form $\omega$ on $M$ and with first applying Gram-Schmidt to $(X_1, \ldots, X_k)$, but have made no progress.
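
(For what it is worth, pointwise Gram-Schmidt at least cannot change the orientation of a frame; a sketch, with the fields $E_j$ and coefficients $b_j^i$ introduced here: the orthonormalized fields satisfy $E_j = \sum_{i \le j} b_j^i X_i$ with $b_j^j > 0$, so the transition matrix is triangular with positive diagonal and
$$\det\bigl(b_j^i\bigr) = \prod_{j=1}^{k} b_j^j > 0,$$
hence $(E_1\rvert_q, \ldots, E_k\rvert_q)$ has the same orientation as $(X_1\rvert_q, \ldots, X_k\rvert_q)$ at every $q$.)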



Any help is much appreciated.










linear-algebra differential-geometry differential-topology riemannian-geometry smooth-manifolds






asked Dec 4 at 1:01 by CuriousKid7, edited Dec 4 at 18:25






















1 Answer
















The local frame $(X_i)$ will agree with the chosen orientation form: if $\omega(e_1,\dots,e_k)>0$, then $\omega(X_1,\dots,X_k)>0$ in $U$, because $\omega(X_1,\dots,X_k)\rvert_q=0$ would imply that the $X_i$ do not form a basis of $T_qM$.
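
To spell out the sign argument (a sketch; it assumes $U$ is connected, which one can always arrange by shrinking $U$ to the connected component containing $p$): the function
$$f(q) := \omega_q\bigl(X_1\rvert_q, \dots, X_k\rvert_q\bigr)$$
is smooth and nowhere zero on $U$, since at each $q$ the vectors $X_i\rvert_q$ form a basis of $T_qM$ and $\omega_q \neq 0$. Because $f(p) = \omega_p(e_1, \dots, e_k) > 0$ and $U$ is connected, $f > 0$ on all of $U$, i.e. $(X_1\rvert_q, \dots, X_k\rvert_q)$ is positively oriented for every $q \in U$.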






answered Dec 4 at 18:47 by Federico (accepted)





















• This solution shows how to restrict $U$ appropriately, but I don't see why $\omega(X_1,\dots,X_k)\rvert_q=0$ would imply that the $X_i$ don't form a basis. It's possible that an alternating tensor vanishes at a linearly independent tuple, right?
  – CuriousKid7
  Dec 5 at 5:01












• You don't need to restrict $U$ in any way. $M$ is orientable and you have fixed an orientation form $\omega$. What I'm saying is that if $(X_i)$ is a frame inside $U$, then $\omega(X_1,\dots,X_k)$ doesn't change sign inside $U$, because if it did there would be a point at which it must vanish, and at that point $(X_i)$ would not be a frame.
  – Federico
  Dec 5 at 13:51






• "It's possible that an alternating tensor vanishes at a linearly independent tuple, right?" No, unless $\omega=0$ at that point. If it vanishes at a linearly independent tuple, it vanishes on any tuple.
  – Federico
  Dec 5 at 13:52






• Assume $\omega(e_1,\dots,e_k)=0$ for some basis $(e_i)$. Take any $k$-tuple $v_j=a^i_je_i$. Then $\omega(v_1,\dots,v_k)=\det(A)\,\omega(e_1,\dots,e_k)=0$, where $A$ is the matrix with entries $a^i_j$.
  – Federico
  Dec 5 at 13:54










• Got it, thanks for clarifying!
  – CuriousKid7
  Dec 5 at 16:27










