enow.com Web Search

Search results

  1. Test functions for optimization - Wikipedia

    en.wikipedia.org/wiki/Test_functions_for...

    Convergence rate, precision, robustness, and general performance are the criteria of interest. Here, some test functions are presented with the aim of giving an idea of the different situations that optimization algorithms have to face when coping with these kinds of problems. In the first part, some objective functions for single-objective optimization cases are presented.
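
    As a concrete illustration of the kind of benchmark the article catalogs, below is a minimal Python sketch of the Rosenbrock function, a commonly used single-objective test problem; the choice of Rosenbrock here is an assumption made for illustration, since the snippet does not single out any particular function.

    ```python
    # Rosenbrock "banana" function: a classic single-objective test problem.
    # Its narrow curved valley makes convergence slow for many optimizers.
    def rosenbrock(x, a=1.0, b=100.0):
        """For a = 1, the global minimum is 0 at x = (1, ..., 1)."""
        return sum(b * (x[i + 1] - x[i] ** 2) ** 2 + (a - x[i]) ** 2
                   for i in range(len(x) - 1))

    if __name__ == "__main__":
        print(rosenbrock([1.0, 1.0]))   # 0.0 at the global minimum
        print(rosenbrock([0.0, 0.0]))   # 1.0, away from the minimum
    ```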

  2. Steffensen's method - Wikipedia

    en.wikipedia.org/wiki/Steffensen's_method

    The version of Steffensen's method implemented in the MATLAB code in the article can be derived using Aitken's delta-squared process for accelerating the convergence of a sequence. To compare the following formulae to the formulae in the section above, notice that x_n = p − p_n.
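
    The article's reference implementation is in MATLAB; as a rough Python sketch of the same root-finding idea (a re-implementation of the standard formulation, not a transcription of the article's code), one might write:

    ```python
    # Steffensen's derivative-free root-finding iteration.
    def steffensen(f, x0, tol=1e-12, max_iter=100):
        """Find a root of f near x0 without derivatives.

        Uses g(x) = f(x + f(x)) / f(x) - 1 as a finite-difference
        substitute for f'(x); near a simple root this converges
        quadratically, like Newton's method.
        """
        x = x0
        for _ in range(max_iter):
            fx = f(x)
            if abs(fx) < tol:
                return x
            g = f(x + fx) / fx - 1.0   # slope estimate
            x = x - fx / g             # Newton-like update
        return x

    if __name__ == "__main__":
        import math
        # Root of cos(x) - x is about 0.739085...
        print(steffensen(lambda x: math.cos(x) - x, 0.5))
    ```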

  3. Simulated annealing - Wikipedia

    en.wikipedia.org/wiki/Simulated_annealing

    Simulated annealing is a probabilistic technique for approximating the global optimum of a given function. It is a metaheuristic for combinatorial problems with many local optima, such as the traveling salesman problem: the system is gradually cooled while worse solutions are occasionally accepted, with a probability that shrinks as the temperature drops.
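
    A minimal Python sketch of that accept-worse-solutions-while-cooling loop, using a made-up one-dimensional objective and an illustrative geometric cooling schedule (both are assumptions, not details from the article), could look like:

    ```python
    import math
    import random

    def anneal(f, x0, t0=10.0, cooling=0.99, steps=2000, step_size=0.5):
        """Minimize f by random perturbations with Metropolis acceptance."""
        x, fx = x0, f(x0)
        best_x, best_f = x, fx
        t = t0
        for _ in range(steps):
            candidate = x + random.uniform(-step_size, step_size)
            fc = f(candidate)
            # Always accept improvements; accept worse moves with
            # probability exp(-(fc - fx) / t), which shrinks as t cools.
            if fc < fx or random.random() < math.exp(-(fc - fx) / t):
                x, fx = candidate, fc
                if fx < best_f:
                    best_x, best_f = x, fx
            t *= cooling  # gradually cool the system
        return best_x, best_f

    if __name__ == "__main__":
        # Toy objective with many local minima; global minimum near x = -0.5.
        bumpy = lambda x: x * x + 10 * math.sin(3 * x) + 10
        print(anneal(bumpy, x0=8.0))  # often lands near the global basin, seed-dependent
    ```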

  4. Wolfe conditions - Wikipedia

    en.wikipedia.org/wiki/Wolfe_conditions

    The Wolfe conditions are inequalities used for performing an inexact line search in optimization methods. The first, the Armijo rule (sufficient decrease condition), ensures that the step produces a sufficient decrease of the objective function; the second, the curvature condition, ensures that the slope has been reduced sufficiently.
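
    A small Python sketch of checking both conditions for a candidate step length follows; the quadratic objective and the constants c1 = 1e-4, c2 = 0.9 are illustrative defaults, not values taken from the article.

    ```python
    import numpy as np

    def satisfies_wolfe(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
        """Return True if step length alpha meets both (weak) Wolfe conditions."""
        fx, gx = f(x), grad(x)
        x_new = x + alpha * p
        slope = gx @ p                                 # directional derivative at x (< 0 for descent)
        armijo = f(x_new) <= fx + c1 * alpha * slope   # sufficient decrease of f
        curvature = grad(x_new) @ p >= c2 * slope      # slope reduced enough
        return armijo and curvature

    if __name__ == "__main__":
        f = lambda x: float(x @ x)       # simple quadratic objective
        grad = lambda x: 2.0 * x
        x = np.array([1.0, 1.0])
        p = -grad(x)                     # steepest-descent direction
        for alpha in (1.0, 0.25, 0.01):
            print(alpha, satisfies_wolfe(f, grad, x, p, alpha))
    ```

    With these numbers, the overly long step fails the sufficient-decrease test and the overly short step fails the curvature test; only the middle value passes both, which is exactly the band of acceptable step lengths the conditions are meant to pin down.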

  5. Conjugate gradient method - Wikipedia

    en.wikipedia.org/wiki/Conjugate_gradient_method

    Learn about the conjugate gradient method, an iterative algorithm for solving linear systems with symmetric positive-definite matrices. Find out how it works, how to derive it, and how to implement it.
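
    As a sketch of the iterative form, here is a compact Python implementation of the standard conjugate gradient recurrences for a small symmetric positive-definite system (written from the textbook formulation, not copied from the article):

    ```python
    import numpy as np

    def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
        n = len(b)
        x = np.zeros(n) if x0 is None else x0.astype(float)
        r = b - A @ x             # initial residual
        p = r.copy()              # initial search direction
        rs_old = r @ r
        for _ in range(max_iter or n):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)       # exact minimizer along p
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p   # A-conjugate update of the direction
            rs_old = rs_new
        return x

    if __name__ == "__main__":
        A = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive-definite
        b = np.array([1.0, 2.0])
        print(conjugate_gradient(A, b))           # approx [0.0909, 0.6364]
    ```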

  6. Mittag-Leffler function - Wikipedia

    en.wikipedia.org/wiki/Mittag-Leffler_function

    The Mittag-Leffler function is a special function of a complex argument that depends on two parameters and can interpolate between a Gaussian and a Lorentzian function. It is important in fractional calculus and the theory of viscoelasticity.
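
    For a rough sense of the function, a Python sketch that truncates the defining series E_{a,b}(z) = sum_{k>=0} z^k / Gamma(a k + b) works for modest |z| (a naive truncation for illustration, not how production implementations evaluate it):

    ```python
    import math

    def mittag_leffler(z, a=1.0, b=1.0, terms=50):
        """Two-parameter Mittag-Leffler function via direct series truncation."""
        return sum(z ** k / math.gamma(a * k + b) for k in range(terms))

    if __name__ == "__main__":
        # E_{1,1}(z) reduces to exp(z).
        print(mittag_leffler(1.0, 1.0, 1.0), math.exp(1.0))
        # E_{2,1}(-z**2) reduces to cos(z).
        z = 0.7
        print(mittag_leffler(-z * z, 2.0, 1.0), math.cos(z))
    ```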

  7. Convergence tests - Wikipedia

    en.wikipedia.org/wiki/Convergence_tests

    Learn how to test for the convergence, divergence, or absolute convergence of an infinite series using various criteria and examples. Find out how to apply the limit, ratio, root, integral, p-series, direct comparison, limit comparison, Cauchy condensation, Abel's, alternating series, Dirichlet's, Cauchy's, Stolz–Cesàro, Weierstrass M-test, and other tests.
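
    As a purely numerical illustration of the first of those criteria, the ratio test, here is a Python sketch that approximates L = lim |a_{n+1} / a_n| by sampling the ratio at a moderately large index; this only illustrates the criterion, since a genuine application of the test requires the exact limit.

    ```python
    def ratio_at(term, n=200):
        """Heuristic estimate of the ratio-test limit from a single large index."""
        return abs(term(n + 1) / term(n))

    if __name__ == "__main__":
        print(ratio_at(lambda n: 0.5 ** n))            # -> 0.5   (< 1, series converges)
        print(ratio_at(lambda n: 1.0 / n))             # -> ~1.0  (test is inconclusive)
        print(ratio_at(lambda n: n ** 3 / 3.0 ** n))   # -> ~1/3  (< 1, series converges)
    ```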

  8. Generalized hypergeometric function - Wikipedia

    en.wikipedia.org/wiki/Generalized_hypergeometric...

    A generalized hypergeometric function is defined by a power series in which the ratio of successive coefficients is a rational function of n. It includes the hypergeometric and confluent hypergeometric functions as special cases, and has many applications in mathematics and physics.
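
    That defining property translates directly into a term-by-term evaluation; below is a Python sketch of the recurrence, valid only inside the series' region of convergence and checked against two classical special cases (the sketch is an illustration of the definition, not a substitute for a library implementation):

    ```python
    import math

    def hyper(a_params, b_params, z, terms=200):
        """Evaluate pFq(a_params; b_params; z) by summing the series term by term."""
        total, term = 0.0, 1.0
        for k in range(terms):
            total += term
            num = 1.0
            for a in a_params:
                num *= (a + k)
            den = 1.0
            for b in b_params:
                den *= (b + k)
            term *= num / den * z / (k + 1)   # rational-in-k ratio of successive terms
        return total

    if __name__ == "__main__":
        z = 0.5
        # 0F0(;; z) is exp(z).
        print(hyper([], [], z), math.exp(z))
        # 2F1(1, 1; 2; z) is -ln(1 - z) / z.
        print(hyper([1.0, 1.0], [2.0], z), -math.log(1 - z) / z)
    ```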