enow.com Web Search

Search results

  1. Bregman divergence - Wikipedia

    en.wikipedia.org/wiki/Bregman_divergence

    In mathematics, specifically statistics and information geometry, a Bregman divergence or Bregman distance is a measure of difference between two points, defined in terms of a strictly convex function; they form an important class of divergences. When the points are interpreted as probability distributions – notably as either values of the ...
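
    The standard definition alluded to here is D_F(p, q) = F(p) - F(q) - <∇F(q), p - q> for a strictly convex, differentiable generator F. A minimal NumPy sketch (the function names and numbers are illustrative, not from the article) showing two familiar special cases:

        import numpy as np

        def bregman_divergence(F, grad_F, p, q):
            # D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>
            p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
            return F(p) - F(q) - np.dot(grad_F(q), p - q)

        # F(x) = ||x||^2 generates the squared Euclidean distance.
        sq_norm = lambda x: np.dot(x, x)
        print(bregman_divergence(sq_norm, lambda x: 2.0 * x, [1.0, 2.0], [0.0, 1.0]))  # 2.0

        # Negative entropy on the probability simplex generates the KL divergence.
        neg_entropy = lambda x: np.sum(x * np.log(x))
        p, q = np.array([0.6, 0.4]), np.array([0.5, 0.5])
        print(bregman_divergence(neg_entropy, lambda x: np.log(x) + 1.0, p, q))
        print(np.sum(p * np.log(p / q)))  # same value: D_KL(p || q)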

  2. Total variation distance of probability measures - Wikipedia

    en.wikipedia.org/wiki/Total_variation_distance...

    Total variation distance is half the absolute area between the two curves (half the shaded area in the article's figure). In probability theory, the total variation distance is a distance measure for probability distributions. It is an example of a statistical distance metric, and is sometimes called the statistical distance, statistical difference or variational ...
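
    On a finite set the two standard characterizations agree: half the L1 distance between the probability vectors, and the largest discrepancy in probability assigned to any event. A short Python check with made-up numbers (not from the article):

        import itertools
        import numpy as np

        # Two made-up distributions on a 3-point space.
        P = np.array([0.5, 0.3, 0.2])
        Q = np.array([0.2, 0.3, 0.5])

        # "Half the absolute area between the curves": half the L1 distance.
        tv_half_l1 = 0.5 * np.abs(P - Q).sum()

        # Supremum of |P(A) - Q(A)| over all events A (brute force on a finite space).
        events = itertools.chain.from_iterable(
            itertools.combinations(range(3), r) for r in range(4))
        tv_sup = max(abs(P[list(A)].sum() - Q[list(A)].sum()) for A in events)

        print(tv_half_l1, tv_sup)  # both 0.3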

  3. Kullback–Leibler divergence - Wikipedia

    en.wikipedia.org/wiki/Kullback–Leibler_divergence

    Kullback–Leibler divergence. In mathematical statistics, the Kullback–Leibler (KL) divergence (also called relative entropy and I-divergence[1]), denoted D_KL(P ∥ Q), is a type of statistical distance: a measure of how one reference probability distribution P is different from a second probability distribution Q.[2][3] Mathematically, it is defined as shown below.
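
    In the discrete case the definition reads (the continuous case replaces the sum by an integral over densities):

        D_{\mathrm{KL}}(P \parallel Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}

    It is nonnegative and zero exactly when P = Q, but it is not symmetric and does not satisfy the triangle inequality, which is why it is called a divergence rather than a metric.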

  4. Hellinger distance - Wikipedia

    en.wikipedia.org/wiki/Hellinger_distance

    Hellinger distance. In probability and statistics, the Hellinger distance (closely related to, although different from, the Bhattacharyya distance) is used to quantify the similarity between two probability distributions. It is a type of f-divergence. The Hellinger distance is defined in terms of the Hellinger integral, which was introduced by ...
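
    The usual discrete form (with an integral over densities in the continuous case) is:

        H^2(P, Q) = \frac{1}{2} \sum_{x} \bigl(\sqrt{P(x)} - \sqrt{Q(x)}\bigr)^2 = 1 - \sum_{x} \sqrt{P(x)\, Q(x)}

    Here the sum of square roots is the Bhattacharyya coefficient, which makes the relation to the Bhattacharyya distance explicit; H takes values in [0, 1].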

  5. Quantum Fisher information - Wikipedia

    en.wikipedia.org/wiki/Quantum_Fisher_information

    For the case of unitary dynamics, the quantum Fisher information is the convex roof of the variance. Based on this, one can obtain lower bounds on it from given operator expectation values using semidefinite programming. The approach considers an optimization on the two-copy space.
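
    For reference, one common form of the quantum Fisher information (a standard formula, not quoted in the snippet) for a state ρ = Σ_k λ_k |k⟩⟨k| evolving unitarily under a generator A is:

        F_Q[\rho, A] = 2 \sum_{k, l:\ \lambda_k + \lambda_l > 0} \frac{(\lambda_k - \lambda_l)^2}{\lambda_k + \lambda_l} \, |\langle k | A | l \rangle|^2

    Under the common normalization in which F_Q equals 4(ΔA)^2 for pure states, the convex-roof statement above corresponds to F_Q[ρ, A]/4 being the smallest average variance Σ_k p_k (ΔA)^2_{ψ_k} over pure-state decompositions of ρ.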

  6. Pinsker's inequality - Wikipedia

    en.wikipedia.org/wiki/Pinsker's_inequality

    Pinsker's inequality. In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullback–Leibler divergence. The inequality is tight up to constant factors.
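
    With the total variation distance δ(P, Q) normalized to [0, 1] and the KL divergence measured in nats, the inequality reads:

        \delta(P, Q) \le \sqrt{\tfrac{1}{2}\, D_{\mathrm{KL}}(P \parallel Q)}

    Different texts place the factor 1/2 differently depending on how they normalize the total variation distance, so the constant should be checked against the convention in use.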

  7. f-divergence - Wikipedia

    en.wikipedia.org/wiki/F-divergence

    f-divergence. In probability theory, an f-divergence is a certain type of function that measures the difference between two probability distributions P and Q. Many common divergences, such as KL-divergence, Hellinger distance, and total variation distance, are special cases of f-divergence.
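
    Using the standard discrete form D_f(P ∥ Q) = Σ_x Q(x) f(P(x)/Q(x)) for a convex f with f(1) = 0, a short Python sketch (illustrative helper name and distributions, assuming Q is strictly positive) recovers the three special cases named above:

        import numpy as np

        def f_divergence(f, P, Q):
            # D_f(P || Q) = sum_x Q(x) * f(P(x) / Q(x)), assuming Q(x) > 0 everywhere.
            P, Q = np.asarray(P, dtype=float), np.asarray(Q, dtype=float)
            return np.sum(Q * f(P / Q))

        P = np.array([0.5, 0.3, 0.2])
        Q = np.array([0.2, 0.3, 0.5])

        # f(t) = t*log(t)            -> KL divergence
        # f(t) = |t - 1| / 2         -> total variation distance
        # f(t) = (sqrt(t) - 1)^2 / 2 -> squared Hellinger distance
        print(f_divergence(lambda t: t * np.log(t), P, Q))
        print(f_divergence(lambda t: np.abs(t - 1) / 2, P, Q))
        print(f_divergence(lambda t: (np.sqrt(t) - 1) ** 2 / 2, P, Q))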

  8. Convex function - Wikipedia

    en.wikipedia.org/wiki/Convex_function

    (Figure in the article: an example of a function that is not convex.) In mathematics, a real-valued function is called convex if the line segment between any two distinct points on the graph of the function lies above or on the graph between the two points. Equivalently, a function is convex if its epigraph (the set of points on or above the graph of the function) is a convex set.
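
    The chord condition described above can be written as: for all x, y in the domain and all t in [0, 1],

        f(t x + (1 - t) y) \le t\, f(x) + (1 - t)\, f(y)

    This is the property required of the generator f of an f-divergence (together with f(1) = 0) and, in its strict form, of the generator F of a Bregman divergence.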