Search results

  1. Polynomial root-finding algorithms - Wikipedia

    en.wikipedia.org/wiki/Polynomial_root-finding...

    Finding the root of a linear polynomial (degree one) is easy and needs only one division: the general equation ax + b = 0 has solution x = −b/a. For quadratic polynomials (degree two), the quadratic formula produces a solution, but its numerical evaluation may require some care for ensuring numerical stability.
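
    A minimal Python sketch of the stability point (the function name and example values are illustrative, not from the snippet): evaluating −b ± √(b²−4ac) directly cancels catastrophically when b² ≫ |4ac|, so the larger-magnitude root is computed first and the other is recovered from the product of the roots, c/a.

        import math

        def solve_quadratic(a, b, c):
            """Real roots of a*x^2 + b*x + c = 0, avoiding cancellation.

            Sketch only: assumes a != 0 and a non-negative discriminant.
            """
            sqrt_disc = math.sqrt(b * b - 4.0 * a * c)
            # Add quantities of the same sign so no significant digits cancel.
            q = -0.5 * (b + math.copysign(sqrt_disc, b))
            x1 = q / a      # larger-magnitude root
            x2 = c / q      # other root, from x1 * x2 = c / a
            return x1, x2

        # Example: x^2 + 1e8*x + 1 has a tiny root near -1e-8 that naive use
        # of the textbook formula can lose to rounding.
        print(solve_quadratic(1.0, 1e8, 1.0))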

  2. Root-finding algorithm - Wikipedia

    en.wikipedia.org/wiki/Root-finding_algorithm

    However, for polynomials specifically, the study of root-finding algorithms belongs to computer algebra, since algebraic properties of polynomials are fundamental for the most efficient algorithms. The efficiency and applicability of an algorithm may depend sensitively on the characteristics of the given functions.

  3. Durand–Kerner method - Wikipedia

    en.wikipedia.org/wiki/Durand–Kerner_method

    which may increasingly become a concern as the degree of the polynomial increases. If the coefficients are real and the polynomial has odd degree, then it must have at least one real root. To find this, use a real value of p₀ as the initial guess and make q₀ and r₀, etc., complex conjugate pairs.
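
    A hedged Python sketch of the simultaneous (Durand–Kerner) iteration; the coefficients and the common starting values below are illustrative, not from the snippet. Every approximation is updated at once using the monic polynomial's value divided by its distances to the other current approximations.

        def durand_kerner(coeffs, tol=1e-12, max_iter=500):
            """All roots of a monic polynomial (coefficients highest-degree
            first, leading coefficient 1). Sketch only, no failure handling.
            """
            n = len(coeffs) - 1
            # Usual starting values: powers of a complex number that is
            # neither real nor a root of unity, to avoid symmetric stalls.
            z = [(0.4 + 0.9j) ** k for k in range(n)]

            def p(x):
                result = 0j
                for c in coeffs:
                    result = result * x + c   # Horner evaluation
                return result

            for _ in range(max_iter):
                new_z = []
                for i, zi in enumerate(z):
                    denom = 1
                    for j, zj in enumerate(z):
                        if j != i:
                            denom *= zi - zj
                    new_z.append(zi - p(zi) / denom)
                if max(abs(a - b) for a, b in zip(new_z, z)) < tol:
                    return new_z
                z = new_z
            return z

        # Example: the three roots of x^3 - 3x^2 + 3x - 5.
        print(durand_kerner([1, -3, 3, -5]))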

  4. Geometrical properties of polynomial roots - Wikipedia

    en.wikipedia.org/wiki/Geometrical_properties_of...

    The root separation is a fundamental parameter of the computational complexity of root-finding algorithms for polynomials. In fact, the root separation determines the precision of the number representation needed to be certain of distinguishing distinct roots.
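
    For concreteness, the usual definition in standard notation (not quoted from the snippet), where α₁, …, α_k are the distinct complex roots of the polynomial p:

        \operatorname{sep}(p) = \min_{i \neq j} \left| \alpha_i - \alpha_j \right|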

  5. Jenkins–Traub algorithm - Wikipedia

    en.wikipedia.org/wiki/Jenkins–Traub_algorithm

    The Jenkins–Traub algorithm for polynomial zeros is a fast globally convergent iterative polynomial root-finding method published in 1970 by Michael A. Jenkins and Joseph F. Traub. They gave two variants, one for general polynomials with complex coefficients, commonly known as the "CPOLY" algorithm, and a more complicated variant for the special case of polynomials with real coefficients, commonly known as the "RPOLY" algorithm.

  6. Bisection method - Wikipedia

    en.wikipedia.org/wiki/Bisection_method

    For polynomials, more elaborate methods exist for testing the existence of a root in an interval (Descartes' rule of signs, Sturm's theorem, Budan's theorem). They allow extending the bisection method into efficient algorithms for finding all real roots of a polynomial; see Real-root isolation.
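
    A plain-bisection sketch in Python (the cubic and the bracketing interval are illustrative, not from the snippet): the method only needs a sign change on the interval, which is exactly what the counting tools named above are used to certify when isolating real roots.

        def bisect_root(f, lo, hi, tol=1e-12, max_iter=200):
            """Root of f in [lo, hi], assuming f(lo) and f(hi) have opposite
            signs. Plain bisection only; not a real-root-isolation routine.
            """
            f_lo = f(lo)
            if f_lo * f(hi) > 0:
                raise ValueError("f must change sign on [lo, hi]")
            for _ in range(max_iter):
                mid = 0.5 * (lo + hi)
                f_mid = f(mid)
                if f_mid == 0 or hi - lo < tol:
                    return mid
                # Keep the half on which the sign change survives.
                if f_lo * f_mid < 0:
                    hi = mid
                else:
                    lo, f_lo = mid, f_mid
            return 0.5 * (lo + hi)

        # Example: the single real root of x^3 - 2x - 5 lies in [2, 3].
        print(bisect_root(lambda x: x**3 - 2*x - 5, 2.0, 3.0))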

  7. Graeffe's method - Wikipedia

    en.wikipedia.org/wiki/Graeffe's_method

    In mathematics, Graeffe's method, or the Dandelin–Lobachevsky–Graeffe method, is an algorithm for finding all of the roots of a polynomial. It was developed independently by Germinal Pierre Dandelin in 1826 and Lobachevsky in 1834. In 1837 Karl Heinrich Gräffe also discovered the principal idea of the method.

  8. Laguerre's method - Wikipedia

    en.wikipedia.org/wiki/Laguerre's_method

    In numerical analysis, Laguerre's method is a root-finding algorithm tailored to polynomials. In other words, Laguerre's method can be used to numerically solve the equation p(x) = 0 for a given polynomial p(x).
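
    A hedged Python sketch of the Laguerre iteration for one root (the starting point and example polynomial are illustrative; safeguards such as handling a zero denominator are omitted):

        import cmath

        def laguerre(coeffs, x0=1.0, tol=1e-12, max_iter=100):
            """One root of the polynomial with coefficients `coeffs`
            (highest degree first) by Laguerre's method. Sketch only.
            """
            n = len(coeffs) - 1

            def eval_all(x):
                # Horner's scheme giving p(x), p'(x) and p''(x) in one pass.
                p = pd = pdd = 0j
                for c in coeffs:
                    pdd = pdd * x + 2 * pd
                    pd = pd * x + p
                    p = p * x + c
                return p, pd, pdd

            x = complex(x0)
            for _ in range(max_iter):
                p, pd, pdd = eval_all(x)
                if abs(p) < tol:
                    return x
                g = pd / p
                h = g * g - pdd / p
                root = cmath.sqrt((n - 1) * (n * h - g * g))
                # Take the sign giving the larger denominator (more stable).
                denom = g + root if abs(g + root) >= abs(g - root) else g - root
                x -= n / denom
            return x

        # Example: converges to the real root (about 2.0946) of x^3 - 2x - 5.
        print(laguerre([1, 0, -2, -5], 2.0))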