Removable singularities for rational functions with floating point coefficients











Suppose I am given a rational function $r(x)=p(x)/q(x)$, where $p$ is a polynomial of degree $m$ and $q$ is a polynomial of degree $n$ (over the real numbers), and the coefficients of $p$ and $q$ are floating point numbers resulting from some computational procedure (I am working on recursive filtering of measurement data, if that is of interest to anybody).



If $p$ and $q$ share a common linear factor (e.g. $x-1$), the corresponding root (in the example, $x_0=1$) is a removable singularity of $r$.



But due to the numerical processing, $p$ might have a root $x'_0=1.0001$ whereas $q$ has a root $x''_0=0.9999$, and these roots differ only because of roundoff errors. Ideally I would like to be able to remove such singularities, although they are not removable in the strict sense of the word.



So what I am essentially trying to do is compute an approximate greatest common divisor
$$\gcd_{\mathrm{approx}}(p,q,\delta)$$
that eliminates common roots only up to a precision limit $\delta$. For example, two roots are considered different if their distance exceeds the precision limit,



$$\gcd_{\mathrm{approx}}(x-1.01,\ x-0.99,\ 0.001) = 1,$$



whereas the roots are considered identical if they are less than the precision limit apart:



$$\gcd_{\mathrm{approx}}(x-1.0001,\ x-0.9999,\ 0.001) = x-1.$$



I have tried to derive this from the well-known Euclidean gcd algorithm, but I cannot make sense of the final residual that is obtained when the roots differ slightly.



I can't believe that this is an open problem...
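To make the intent concrete, here is a minimal Python/NumPy sketch of the behaviour I am after. The function name `gcd_approx` and the greedy nearest-root matching are my own illustration, not an established algorithm:

```python
import numpy as np

def gcd_approx(p, q, delta):
    """Approximate gcd of two polynomials, given as coefficient
    arrays in NumPy convention (highest degree first).

    Roots of p and q closer than delta are treated as a common
    root; the result is the monic polynomial whose roots are the
    midpoints of the matched pairs."""
    rp = list(np.roots(p))
    rq = list(np.roots(q))
    common = []
    for r in rp:
        if not rq:
            break
        # greedily match r to the nearest still-unused root of q
        j = min(range(len(rq)), key=lambda k: abs(r - rq[k]))
        if abs(r - rq[j]) < delta:
            common.append(0.5 * (r + rq.pop(j)))
    return np.atleast_1d(np.real_if_close(np.poly(common)))

# The two examples from above:
print(gcd_approx([1, -1.01], [1, -0.99], 0.001))      # roots 0.02 apart: gcd is 1
print(gcd_approx([1, -1.0001], [1, -0.9999], 0.001))  # roots merged: gcd is x - 1
```

Note that this requires computing all roots explicitly, which may or may not be acceptable cost-wise.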










  • One idea that I have seen discussed in several places is to form the Sylvester matrix of the two polynomials, compute its SVD, and use that for a rank estimation. The rank deficit is connected to the degree of the approximate common factor. From the orthogonal factors one can, apparently, read off either the common factor or the reduced polynomials.
    – LutzL
    yesterday
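    The rank-estimation step of this idea can be sketched as follows; the relative singular-value threshold `tol` is an assumption on my part, since the comment does not specify how to truncate:

```python
import numpy as np

def sylvester(p, q):
    """Sylvester matrix of p (degree m) and q (degree n), both given
    as coefficient arrays with the highest degree first."""
    m, n = len(p) - 1, len(q) - 1
    S = np.zeros((m + n, m + n))
    for i in range(n):                 # n shifted copies of p
        S[i, i:i + m + 1] = p
    for i in range(m):                 # m shifted copies of q
        S[n + i, i:i + n + 1] = q
    return S

def approx_common_degree(p, q, tol=1e-8):
    """Degree of the approximate common factor, estimated as the
    rank deficit of the Sylvester matrix: singular values below
    tol times the largest one are counted as zero."""
    s = np.linalg.svd(sylvester(p, q), compute_uv=False)
    return int(np.sum(s <= tol * s[0]))

# (x-1)(x-2) and (x-1)(x+3) share one factor; x^2 + 1 shares none:
print(approx_common_degree([1, -3, 2], [1, 2, -3]))  # 1
print(approx_common_degree([1, -3, 2], [1, 0, 1]))   # 0
```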










  • I guess I see the idea. But isn't an SVD about as costly as complete root-finding for my polynomials? So if I can afford computing all roots, I can just determine the distance between all pairs of roots directly and sort out those that are close together, right?
    – oliver
    yesterday






  • The problem of finding root clusters is closely related, as a root cluster is almost a multiple root. There you want to find and remove approximately common factors of a polynomial and its derivative. There may be more recent literature on that topic than on the approximate gcd problem. Kaltofen has written some papers on approximate polynomial algorithms.
    – LutzL
    yesterday
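    A crude illustration of this root-cluster view; the merging rule below (a running mean over roots sorted along the real axis) is my own heuristic and is only reliable for well-separated clusters:

```python
import numpy as np

def cluster_roots(p, delta):
    """Group roots of p (coefficients, highest degree first) that lie
    within delta of the running cluster centre; returns a list of
    (centre, multiplicity) pairs."""
    roots = sorted(np.roots(p), key=lambda z: (z.real, z.imag))
    clusters = []
    for r in roots:
        if clusters and abs(r - clusters[-1][0]) < delta:
            c, k = clusters[-1]
            clusters[-1] = ((c * k + r) / (k + 1), k + 1)  # running mean
        else:
            clusters.append((r, 1))
    return clusters

# two roots near 1 form one cluster, i.e. almost a double root:
p = np.poly([1.0, 1.0001, 3.0])
print(cluster_roots(p, 0.01))
```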
















polynomials greatest-common-divisor rational-functions floating-point






asked yesterday by oliver (1215)
