enow.com Web Search

Search results

  1. Convex function - Wikipedia

    en.wikipedia.org/wiki/Convex_function

    The function f(x) = x^2 has f″(x) = 2 > 0, so f is a convex function. It is also strongly convex (and hence strictly convex too), with strong convexity constant 2. The function f(x) = x^4 has f″(x) = 12x^2 ≥ 0, so f is a convex function. It is strictly convex, even though the second derivative is not strictly positive at all points.
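
    A quick numerical check of these two examples (a sketch in Python; the sample grid, tolerance, and step count are arbitrary choices, not taken from the article):

    ```python
    # Check the convexity inequality f(t*x + (1-t)*y) <= t*f(x) + (1-t)*f(y)
    # on a grid of points for f(x) = x**2 and f(x) = x**4.
    import itertools

    def looks_convex(f, points, steps=10):
        for x, y in itertools.product(points, repeat=2):
            for i in range(steps + 1):
                t = i / steps
                if f(t * x + (1 - t) * y) > t * f(x) + (1 - t) * f(y) + 1e-9:
                    return False
        return True

    points = [i / 10 for i in range(-30, 31)]      # grid on [-3, 3]
    print(looks_convex(lambda x: x ** 2, points))  # True
    print(looks_convex(lambda x: x ** 4, points))  # True, even though f''(0) = 0
    ```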

  2. List of convexity topics - Wikipedia

    en.wikipedia.org/wiki/List_of_convexity_topics

    Convex function - a function in which the line segment between any two points on the graph of the function lies on or above the graph. Closed convex function - a convex function all of whose sublevel sets are closed sets. Proper convex function - a convex function whose effective domain is nonempty and which never attains minus infinity. Concave ...
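
    The "line segment on or above the graph" condition written out (a standard formulation, not quoted from the page):

    ```latex
    % Convexity of f on a convex domain: for all x, y and all t in [0, 1],
    % the chord value dominates the function value.
    \[
      f\bigl(t\,x + (1-t)\,y\bigr) \;\le\; t\,f(x) + (1-t)\,f(y),
      \qquad \forall\, x, y,\ \forall\, t \in [0,1].
    \]
    ```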

  3. Proper convex function - Wikipedia

    en.wikipedia.org/wiki/Proper_convex_function

    For every proper convex function f : R^n → [−∞, +∞], there exist some b ∈ R^n and r ∈ R such that f(x) ≥ x · b − r for every x. The sum of two proper convex functions is convex, but not necessarily proper. [4] For instance, if the sets A and B are non-empty convex sets in the vector space X, then the characteristic functions I_A and I_B are proper convex functions, but if A ∩ B = ∅ then I_A + I_B is identically equal to +∞.
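
    The characteristic-function example spelled out (the symbols I_A and I_B are one common notation, assumed here for illustration):

    ```latex
    % Characteristic (indicator) function of a non-empty convex set A:
    % proper and convex; if A and B are disjoint, the sum is +infinity everywhere.
    \[
      I_A(x) =
      \begin{cases}
        0 & x \in A,\\
        +\infty & x \notin A,
      \end{cases}
      \qquad
      A \cap B = \varnothing \;\Longrightarrow\; (I_A + I_B)(x) = +\infty \ \text{for all } x.
    \]
    ```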

  4. Subgradient method - Wikipedia

    en.wikipedia.org/wiki/Subgradient_method

    When the objective function is differentiable, subgradient methods for unconstrained problems use the same search direction as the method of steepest descent. Subgradient methods are slower than Newton's method when applied to minimize twice continuously differentiable convex functions.
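
    A minimal subgradient-method sketch for a nondifferentiable convex objective (the function |x - 3|, the 1/k step rule, and the iteration count are illustrative choices, not details from the article):

    ```python
    # Subgradient method on f(x) = |x - 3|; a subgradient is sign(x - 3)
    # (any value in [-1, 1] is a valid subgradient at the kink x = 3).
    def subgradient_method(x0=0.0, iters=200):
        x, best = x0, x0
        for k in range(1, iters + 1):
            g = 1.0 if x > 3 else (-1.0 if x < 3 else 0.0)  # a subgradient at x
            x = x - (1.0 / k) * g                           # diminishing step size
            if abs(x - 3) < abs(best - 3):                  # track the best iterate
                best = x
        return best

    print(subgradient_method())  # close to the minimizer x = 3
    ```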

  5. Convex optimization - Wikipedia

    en.wikipedia.org/wiki/Convex_optimization

    For example, the problem of maximizing a concave function f can be re-formulated equivalently as the problem of minimizing the convex function −f. The problem of maximizing a concave function over a convex set is commonly called a convex optimization problem. [8]
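
    One way to use this reformulation in code (a sketch; scipy.optimize.minimize is assumed to be available, and the concave objective is made up for illustration):

    ```python
    # Maximize the concave g(x) = 1 - (x - 2)**2 by minimizing the convex -g.
    from scipy.optimize import minimize

    def g(x):
        return 1.0 - (x[0] - 2.0) ** 2      # concave, maximized at x = 2

    result = minimize(lambda x: -g(x), x0=[0.0])
    print(result.x, g(result.x))            # roughly [2.0] and 1.0
    ```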

  6. Convex combination - Wikipedia

    en.wikipedia.org/wiki/Convex_combination

    A conical combination is a linear combination with nonnegative coefficients. When a point x is to be used as the reference origin for defining displacement vectors, then x is a convex combination of n points x_1, x_2, …, x_n if and only if the zero displacement is a non-trivial conical combination of their respective displacement vectors relative to x.
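
    A small sketch of forming and validating a convex combination (the points and weights are arbitrary examples):

    ```python
    # A convex combination sum(w_i * p_i) needs w_i >= 0 and sum(w_i) == 1.
    def convex_combination(points, weights):
        assert all(w >= 0 for w in weights), "coefficients must be nonnegative"
        assert abs(sum(weights) - 1.0) < 1e-12, "coefficients must sum to 1"
        dim = len(points[0])
        return tuple(sum(w * p[i] for w, p in zip(weights, points)) for i in range(dim))

    # A point inside the triangle with vertices (0,0), (1,0), (0,1).
    print(convex_combination([(0, 0), (1, 0), (0, 1)], [0.2, 0.3, 0.5]))  # (0.3, 0.5)
    ```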

  7. Logarithmically convex function - Wikipedia

    en.wikipedia.org/.../Logarithmically_convex_function

    A logarithmically convex function f is a convex function since it is the composite of the increasing convex function exp and the function log ∘ f, which is by definition convex. However, being logarithmically convex is a strictly stronger property than being convex.
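
    Why log-convexity implies convexity, written out (a standard argument, not quoted from the page):

    ```latex
    % Write f = exp(log f).  If g = log f is convex, then for t in [0, 1]:
    % the first inequality uses convexity of g and monotonicity of exp,
    % the second uses convexity of exp.
    \[
      f(t x + (1-t) y) = e^{g(t x + (1-t) y)}
        \le e^{t g(x) + (1-t) g(y)}
        \le t\, e^{g(x)} + (1-t)\, e^{g(y)}
        = t f(x) + (1-t) f(y).
    \]
    ```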

  8. Frank–Wolfe algorithm - Wikipedia

    en.wikipedia.org/wiki/Frank–Wolfe_algorithm

    The Frank–Wolfe algorithm is an iterative first-order optimization algorithm for constrained convex optimization. Also known as the conditional gradient method, [1] reduced gradient algorithm and the convex combination algorithm, the method was originally proposed by Marguerite Frank and Philip Wolfe in 1956. [2]
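
    A compact sketch of the Frank–Wolfe iteration (the quadratic objective, probability-simplex feasible set, and 2/(k + 2) step size are standard illustrative choices, not details quoted from the article):

    ```python
    # Frank–Wolfe (conditional gradient) for min ||x - target||^2 over the
    # probability simplex {x >= 0, sum(x) = 1}.  The linear subproblem
    # min_s <grad, s> over the simplex is solved by a vertex e_i, i = argmin grad_i.
    def frank_wolfe(target, iters=100):
        n = len(target)
        x = [1.0 / n] * n                                        # feasible start
        for k in range(iters):
            grad = [2.0 * (x[i] - target[i]) for i in range(n)]
            i_min = min(range(n), key=lambda i: grad[i])
            s = [1.0 if i == i_min else 0.0 for i in range(n)]   # simplex vertex
            gamma = 2.0 / (k + 2.0)                              # standard step size
            x = [(1 - gamma) * x[i] + gamma * s[i] for i in range(n)]
        return x

    print(frank_wolfe([0.7, 0.2, 0.1]))  # moves toward the target, which lies in the simplex
    ```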