Removable singularities for rational functions with floating point coefficients
Suppose I am given a rational function $r(x)=p(x)/q(x)$, where $p$ is a polynomial of degree $m$ and $q$ is a polynomial of degree $n$ over the real numbers, and the coefficients of $p$ and $q$ are floating point numbers that result from some computational procedure (I am working on recursive filtering of measurement data, in case that is of interest to anybody).
If $p$ and $q$ share a common linear factor (e.g. $x-1$), the corresponding root (in the example it is $x_0=1$) represents a removable singularity of $r$.
But due to the numerical processing, $p$ might have a root $x_0' = 1.0001$ whereas $q$ has a root $x_0'' = 0.9999$, and the only reason these roots differ is roundoff error. Ideally I would like to remove such singularities, even though they are not removable in the strict sense of the word.
So what I am basically trying to do is compute an approximate greatest common divisor
$$\gcd_{\mathrm{approx}}(p, q, \delta)$$
that eliminates common roots only up to a precision limit $\delta$. Two roots are considered different if their distance exceeds the precision limit,
$$\gcd_{\mathrm{approx}}(x-1.01,\; x-0.99,\; 0.001) = 1,$$
whereas they are considered identical if they are less than the precision limit apart,
$$\gcd_{\mathrm{approx}}(x-1.0001,\; x-0.9999,\; 0.001) = x-1.$$
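To make concrete what I mean, here is a minimal sketch of such a $\gcd_{\mathrm{approx}}$ in Python/NumPy that simply computes all roots and pairs up those closer than $\delta$. The function name `gcd_approx`, the greedy nearest-neighbor pairing, and the averaging of matched roots are my own illustrative choices, not a canonical algorithm:

```python
import numpy as np

def gcd_approx(p, q, delta):
    """Approximate GCD via root matching.

    p, q  : polynomial coefficients, highest degree first (numpy style)
    delta : two roots closer than delta count as one common root
    """
    q_roots = list(np.roots(q))
    common = []
    for rp in np.roots(p):
        if not q_roots:
            break
        # nearest still-unmatched root of q
        j = min(range(len(q_roots)), key=lambda k: abs(q_roots[k] - rp))
        if abs(q_roots[j] - rp) < delta:
            common.append(0.5 * (rp + q_roots.pop(j)))  # average the pair
    # monic polynomial with the matched roots (constant 1 if none matched)
    return np.real_if_close(np.poly(common)) if common else np.array([1.0])

print(gcd_approx([1, -1.01],   [1, -0.99],   0.001))  # [1.]       -> gcd = 1
print(gcd_approx([1, -1.0001], [1, -0.9999], 0.001))  # [1., -1.]  -> gcd = x - 1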
I have tried to derive this from the well-known Euclidean gcd algorithm, but I cannot make sense of the final residual that is obtained when the roots differ slightly.
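For what it is worth, one way to make the residual test explicit is to stop the Euclidean recursion as soon as the remainder is negligible relative to $\delta$. This is a rough sketch under that assumption; the relative cutoff `delta * max|a|` is a heuristic I picked for illustration and is known to be sensitive to scaling:

```python
import numpy as np

def gcd_euclid_approx(p, q, delta):
    """Euclidean polynomial GCD that treats a 'small' remainder as zero."""
    a = np.trim_zeros(np.asarray(p, dtype=float), 'f')
    b = np.trim_zeros(np.asarray(q, dtype=float), 'f')
    while len(b) > 1:                      # b is not yet a constant
        _, r = np.polydiv(a, b)            # a = s*b + r
        r = np.trim_zeros(r, 'f')
        if r.size == 0 or np.max(np.abs(r)) < delta * np.max(np.abs(a)):
            return b / b[0]                # remainder negligible: b is the gcd
        a, b = b, r
    return np.array([1.0])                 # constant remainder: no common factor

print(gcd_euclid_approx([1, -1.01],   [1, -0.99],   0.001))  # [1.]
print(gcd_euclid_approx([1, -1.0001], [1, -0.9999], 0.001))  # approx. x - 0.9999
```

Note that this returns the last divisor (here $x - 0.9999$) rather than a factor averaged between the two nearby roots, which already hints at why interpreting the final residual is delicate.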
I can't believe that this is an open problem...
polynomials greatest-common-divisor rational-functions floating-point
One idea that I have seen discussed in several places is to take the Sylvester matrix, compute its SVD, and use that to estimate the rank. The rank deficit corresponds to the degree of the approximate common factor. From the orthogonal factors you can then read off either the common factor or the reduced polynomials.
– LutzL
yesterday
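For concreteness, a minimal sketch of the rank-deficit estimate described in this comment, assuming NumPy; the cutoff $\delta \cdot \sigma_{\max}$ for declaring a singular value negligible is an illustrative choice:

```python
import numpy as np

def sylvester(p, q):
    """Sylvester matrix of p (degree m) and q (degree n),
    coefficients given highest degree first."""
    m, n = len(p) - 1, len(q) - 1
    S = np.zeros((m + n, m + n))
    for i in range(n):                     # n shifted copies of p
        S[i, i:i + m + 1] = p
    for i in range(m):                     # m shifted copies of q
        S[n + i, i:i + n + 1] = q
    return S

def approx_gcd_degree(p, q, delta):
    """Numerical rank deficit of the Sylvester matrix, i.e. the degree
    of the approximate common factor."""
    s = np.linalg.svd(sylvester(p, q), compute_uv=False)
    return int(np.sum(s < delta * s[0]))

print(approx_gcd_degree([1, -1.0001], [1, -0.9999], 0.001))  # 1
```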
I guess I see the idea. But isn't an SVD about as costly as complete root-finding for my polynomials? So if I can afford to compute all roots, I can just determine the distances between all pairs of roots directly and sort out those that are close together, right?
– oliver
yesterday
The problem of finding root clusters is closely related, as a root cluster is almost a multiple root. There you want to find and remove approximately common factors of a polynomial and its derivative. There might be more literature published lately on that topic than on the approximate gcd problem. Kaltofen has written some papers on approximate polynomial algorithms.
– LutzL
yesterday
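To illustrate that last comment: a root cluster of a single polynomial shows up as an approximate common factor of the polynomial and its derivative, so the `gcd_approx` sketch from the question can be reused directly (a hypothetical example, assuming NumPy):

```python
import numpy as np

# p has a root cluster near x = 1 and an isolated root at x = -2
p = np.poly([1.0001, 0.9999, -2.0])

# an almost-multiple root of p is an approximate common root of p and p'
cluster = gcd_approx(p, np.polyder(p), 0.001)   # gcd_approx as sketched above
print(cluster)                                   # approx. x - 1
```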