enow.com Web Search

Search results

  1. Macdonald polynomials - Wikipedia

    en.wikipedia.org/wiki/Macdonald_polynomials

    The transformed Macdonald polynomials $\widetilde{H}_\mu(x;q,t)$ in the formula above are related to the classical Macdonald polynomials via a sequence of transformations. First, the integral form of the Macdonald polynomials, denoted $J_\lambda(x;q,t)$, is a re-scaling of $P_\lambda(x;q,t)$ ...
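    For reference, a sketch of that re-scaling under one common convention (assumed here, not quoted from the page), written with the arm and leg statistics $a(s)$ and $l(s)$ of the cells $s$ of the partition $\lambda$:

    ```latex
    % Integral form as a re-scaling of P_lambda (assumed convention):
    J_\lambda(x;q,t) \;=\; c_\lambda(q,t)\, P_\lambda(x;q,t),
    \qquad
    c_\lambda(q,t) \;=\; \prod_{s \in \lambda} \bigl(1 - q^{a(s)}\, t^{l(s)+1}\bigr).
    ```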

  2. Polynomial interpolation - Wikipedia

    en.wikipedia.org/wiki/Polynomial_interpolation

    The original use of interpolation polynomials was to approximate values of important transcendental functions such as the natural logarithm and trigonometric functions. Starting with a few accurately computed data points, the corresponding interpolation polynomial will approximate the function at an arbitrary nearby point.
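    A minimal Python sketch of that idea (the choice of the natural logarithm, the five nodes, and the evaluation point 2.2 are illustrative assumptions, not taken from the article):

    ```python
    import numpy as np

    # Five accurately computed values of the natural logarithm ...
    nodes = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
    values = np.log(nodes)

    # ... determine a unique degree-4 interpolating polynomial.
    coef = np.polyfit(nodes, values, deg=len(nodes) - 1)

    # The interpolant approximates ln(x) at a nearby point.
    x = 2.2
    print(np.polyval(coef, x), np.log(x))
    ```

    Inside the node interval the two printed values agree closely; extrapolating far outside it degrades quickly.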

  3. Lanczos approximation - Wikipedia

    en.wikipedia.org/wiki/Lanczos_approximation

    The following implementation in the Python programming language works for complex arguments and typically gives 13 correct decimal places. Note that omitting the smallest coefficients (in pursuit of speed, for example) gives totally inaccurate results; the coefficients must be recomputed from scratch for an expansion with fewer terms.
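    The implementation the snippet refers to is cut off here; below is a sketch of the commonly quoted g = 7, nine-term variant of the Lanczos approximation (the coefficient set is assumed, not copied from the page):

    ```python
    from cmath import sin, sqrt, pi, exp

    g = 7
    # Commonly quoted Lanczos coefficients for g = 7, n = 9 (assumed set).
    p = [
        0.99999999999980993, 676.5203681218851, -1259.1392167224028,
        771.32342877765313, -176.61502916214059, 12.507343278686905,
        -0.13857109526572012, 9.9843695780195716e-6, 1.5056327351493116e-7,
    ]

    def gamma(z):
        z = complex(z)
        if z.real < 0.5:
            # Reflection formula handles the left half-plane.
            return pi / (sin(pi * z) * gamma(1 - z))
        z -= 1
        x = p[0]
        for i in range(1, g + 2):
            x += p[i] / (z + i)
        t = z + g + 0.5
        return sqrt(2 * pi) * t ** (z + 0.5) * exp(-t) * x

    print(gamma(5))      # ~24, i.e. 4!
    print(gamma(0.5))    # ~sqrt(pi)
    ```

    As the snippet warns, simply truncating the coefficient list does not work; a shorter expansion needs its own recomputed coefficients.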

  4. Propagation of uncertainty - Wikipedia

    en.wikipedia.org/wiki/Propagation_of_uncertainty

    Any non-linear differentiable function, $f(a,b)$, of two variables, $a$ and $b$, can be expanded as $f \approx f^0 + \frac{\partial f}{\partial a}a + \frac{\partial f}{\partial b}b$. If we take the variance on both sides and use the formula [11] for the variance of a linear combination of variables, $\operatorname{Var}(aX+bY) = a^2\operatorname{Var}(X) + b^2\operatorname{Var}(Y) + 2ab\operatorname{Cov}(X,Y)$, then we obtain $\sigma_f^2 \approx \left|\frac{\partial f}{\partial a}\right|^2\sigma_a^2 + \left|\frac{\partial f}{\partial b}\right|^2\sigma_b^2 + 2\frac{\partial f}{\partial a}\frac{\partial f}{\partial b}\sigma_{ab}$, where $\sigma_f$ is the standard deviation of the function $f$, $\sigma_a$ is the standard deviation of $a$, $\sigma_b$ is the standard deviation of $b$, and $\sigma_{ab} = \sigma_a\sigma_b\rho_{ab}$ is the ...
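    A small Python sketch of that first-order propagation formula (the central-difference step size, the example function a·b, and the input uncertainties are illustrative assumptions):

    ```python
    import numpy as np

    def propagate(f, a, b, sigma_a, sigma_b, sigma_ab=0.0, h=1e-6):
        """First-order uncertainty propagation for f(a, b).

        Partial derivatives are estimated by central differences; sigma_ab is
        the covariance between a and b (zero when the inputs are independent).
        """
        dfa = (f(a + h, b) - f(a - h, b)) / (2 * h)
        dfb = (f(a, b + h) - f(a, b - h)) / (2 * h)
        var_f = dfa**2 * sigma_a**2 + dfb**2 * sigma_b**2 + 2 * dfa * dfb * sigma_ab
        return np.sqrt(var_f)

    # Example: f(a, b) = a * b with independent inputs.
    print(propagate(lambda a, b: a * b, 2.0, 3.0, sigma_a=0.1, sigma_b=0.2))
    # ~0.5, matching sqrt((3 * 0.1)**2 + (2 * 0.2)**2)
    ```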

  5. Polynomial regression - Wikipedia

    en.wikipedia.org/wiki/Polynomial_regression

    A drawback of polynomial bases is that the basis functions are "non-local", meaning that the fitted value of y at a given value x = x 0 depends strongly on data values with x far from x 0. [9] In modern statistics, polynomial basis-functions are used along with new basis functions, such as splines, radial basis functions, and wavelets. These ...
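    A quick Python illustration of that non-locality (the degree, the sample data, and the size of the perturbation are all made up for the example): perturbing a single observation near x = 10 visibly shifts the fitted value at x0 = 2.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 30)
    y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

    # Ordinary least-squares fit on the degree-5 monomial basis.
    coef = np.polyfit(x, y, deg=5)
    x0 = 2.0
    print("fit at x0:           ", np.polyval(coef, x0))

    # Perturb one observation far from x0 and refit.
    y2 = y.copy()
    y2[-1] += 2.0                      # the point at x = 10
    coef2 = np.polyfit(x, y2, deg=5)
    print("fit at x0, perturbed:", np.polyval(coef2, x0))
    ```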

  6. Polynomial kernel - Wikipedia

    en.wikipedia.org/wiki/Polynomial_kernel

    For degree-$d$ polynomials, the polynomial kernel is defined as [2] $K(x,y) = (x^\mathsf{T} y + c)^d$, where $x$ and $y$ are vectors of size $n$ in the input space, i.e. vectors of features computed from training or test samples, and $c \ge 0$ is a free parameter trading off the influence of higher-order versus lower-order terms in the polynomial.
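    A short Python sketch of the kernel (the vectors, c, and d below are arbitrary choices for illustration); for c = 0 and d = 2 it agrees with the inner product of explicit degree-2 monomial feature maps:

    ```python
    import numpy as np

    def polynomial_kernel(x, y, c=1.0, d=3):
        """Degree-d polynomial kernel K(x, y) = (x.T @ y + c) ** d."""
        return (np.dot(x, y) + c) ** d

    x = np.array([1.0, 2.0, 3.0])
    y = np.array([0.5, -1.0, 2.0])
    print(polynomial_kernel(x, y, c=1.0, d=2))

    # For c = 0, d = 2 the kernel equals the dot product of explicit
    # feature maps phi(v) = (v_i * v_j for all pairs i, j).
    phi = lambda v: np.outer(v, v).ravel()
    print(np.dot(phi(x), phi(y)), polynomial_kernel(x, y, c=0.0, d=2))
    ```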

  7. Kontorovich–Lebedev transform - Wikipedia

    en.wikipedia.org/wiki/Kontorovich–Lebedev...

    In mathematics, the Kontorovich–Lebedev transform is an integral transform which uses a Macdonald function (modified Bessel function of the second kind) with imaginary index as its kernel. Unlike other Bessel function transforms, such as the Hankel transform, this transform involves integrating over the index of the function rather than its ...
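    A rough numerical sketch using mpmath, illustrating integration over the imaginary index of the Macdonald function (the normalization and the sample function g below are assumptions; several conventions for this transform appear in the literature):

    ```python
    import mpmath as mp

    def kl_integral(g, x):
        """Kontorovich-Lebedev-type integral over the index tau (one assumed
        convention): map a function g(tau) of the index to a function of x
        through the Macdonald-function kernel K_{i*tau}(x)."""
        # K_{i*tau}(x) is real for real x > 0; mp.re discards roundoff.
        integrand = lambda tau: g(tau) * mp.re(mp.besselk(1j * tau, x))
        return mp.quad(integrand, [0, mp.inf])

    # Example: a Gaussian weight over the index, evaluated at x = 2.
    print(kl_integral(lambda tau: mp.exp(-tau**2), 2.0))
    ```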

  8. Hermite interpolation - Wikipedia

    en.wikipedia.org/wiki/Hermite_interpolation

    The number of pieces of information, function values and derivative values, must add up to the number of coefficients of the interpolating polynomial. Hermite's method of interpolation is closely related to Newton's interpolation method, in that both can be derived from the calculation of divided differences. However, there are other methods for computing a Hermite interpolating polynomial.
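    A small Python sketch of the divided-difference route mentioned above (the nodes, the use of sin/cos as test data, and the helper names are illustrative assumptions): each node is listed twice, and the first-order divided difference at a repeated node is replaced by the prescribed derivative.

    ```python
    import numpy as np

    def hermite_newton_coeffs(xs, ys, dys):
        """Divided-difference coefficients for the Hermite interpolant
        matching values ys and first derivatives dys at the nodes xs."""
        z = np.repeat(xs, 2)                      # each node listed twice
        n = len(z)
        q = np.zeros((n, n))
        q[:, 0] = np.repeat(ys, 2)
        # First-order differences: use the derivative where nodes coincide.
        for i in range(1, n):
            if z[i] == z[i - 1]:
                q[i, 1] = dys[i // 2]
            else:
                q[i, 1] = (q[i, 0] - q[i - 1, 0]) / (z[i] - z[i - 1])
        # Higher-order divided differences as usual.
        for j in range(2, n):
            for i in range(j, n):
                q[i, j] = (q[i, j - 1] - q[i - 1, j - 1]) / (z[i] - z[i - j])
        return z, np.diag(q)

    def hermite_eval(z, coeffs, x):
        """Evaluate the Newton-form polynomial with nodes z and coefficients coeffs."""
        result = coeffs[-1]
        for k in range(len(coeffs) - 2, -1, -1):
            result = result * (x - z[k]) + coeffs[k]
        return result

    # Example: match sin and its derivative cos at two nodes.
    xs = np.array([0.0, np.pi / 2])
    z, c = hermite_newton_coeffs(xs, np.sin(xs), np.cos(xs))
    print(hermite_eval(z, c, 1.0), np.sin(1.0))
    ```

    The same repeated-node idea is available in SciPy as scipy.interpolate.KroghInterpolator, where repeating a node lets you supply successive derivative values.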