In mathematics, specifically statistics and information geometry, a Bregman divergence or Bregman distance is a measure of difference between two points, defined in terms of a strictly convex function; they form an important class of divergences. When the points are interpreted as probability distributions – notably as either values of the ...
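A minimal sketch of how a Bregman divergence is computed from a strictly convex function F via D_F(p, q) = F(p) − F(q) − ⟨∇F(q), p − q⟩. Python and NumPy are assumed here (the excerpt names no tools); the two choices of F below recover the squared Euclidean distance and, for normalized vectors, the KL divergence.

```python
import numpy as np

def bregman_divergence(F, grad_F, p, q):
    """D_F(p, q) = F(p) - F(q) - <grad F(q), p - q> for a strictly convex F."""
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

# F(x) = ||x||^2 yields the squared Euclidean distance ||p - q||^2.
sq_norm = lambda x: np.dot(x, x)
sq_norm_grad = lambda x: 2.0 * x

# F(x) = sum_i x_i log x_i (negative entropy) yields the generalized KL divergence,
# which equals KL(p || q) when p and q are normalized distributions.
neg_entropy = lambda x: np.sum(x * np.log(x))
neg_entropy_grad = lambda x: np.log(x) + 1.0

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])

print(bregman_divergence(sq_norm, sq_norm_grad, p, q))          # ||p - q||^2
print(bregman_divergence(neg_entropy, neg_entropy_grad, p, q))  # KL(p || q)
```

With the negative entropy, the general formula expands to Σ p log(p/q) plus a term Σ(q − p) that vanishes for normalized distributions, which is exactly the KL divergence.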
[Figure: the total variation distance is half the absolute area between the two density curves (half the shaded area).]
In probability theory, the total variation distance is a distance measure for probability distributions. It is an example of a statistical distance metric, and is sometimes called the statistical distance, statistical difference or variational distance.
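For discrete distributions the definition reduces to TV(P, Q) = ½ Σ_x |P(x) − Q(x)|; a minimal sketch, Python/NumPy assumed:

```python
import numpy as np

def total_variation(p, q):
    """TV(P, Q) = 0.5 * sum_x |P(x) - Q(x)| for discrete distributions on the same support."""
    return 0.5 * np.sum(np.abs(np.asarray(p, dtype=float) - np.asarray(q, dtype=float)))

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]
print(total_variation(p, q))  # 0.2
```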
Kullback–Leibler divergence. In mathematical statistics, the Kullback–Leibler (KL) divergence (also called relative entropy and I-divergence[1]), denoted $D_{\text{KL}}(P \parallel Q)$, is a type of statistical distance: a measure of how one reference probability distribution P is different from a second probability distribution Q.[2][3] Mathematically, for discrete distributions it is defined as
$$D_{\text{KL}}(P \parallel Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}.$$
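A minimal sketch of the discrete formula above; SciPy's scipy.stats.entropy returns the same quantity (in nats) when given two distributions, which serves as a cross-check (Python/NumPy/SciPy assumed):

```python
import numpy as np
from scipy.stats import entropy

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_x P(x) * log(P(x) / Q(x)), assuming Q(x) > 0 wherever P(x) > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with P(x) = 0 contribute nothing
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # direct formula
print(entropy(p, q))        # scipy.stats.entropy(p, q) gives the same value in nats
```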
Hellinger distance. In probability and statistics, the Hellinger distance (closely related to, although different from, the Bhattacharyya distance) is used to quantify the similarity between two probability distributions. It is a type of f-divergence. The Hellinger distance is defined in terms of the Hellinger integral, which was introduced by ...
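For discrete distributions the Hellinger distance can be computed directly from the equivalent form H(P, Q) = (1/√2) · ‖√P − √Q‖₂, which always lies in [0, 1]; a small sketch, Python/NumPy assumed:

```python
import numpy as np

def hellinger(p, q):
    """H(P, Q) = (1 / sqrt(2)) * || sqrt(P) - sqrt(Q) ||_2 for discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.linalg.norm(np.sqrt(p) - np.sqrt(q)) / np.sqrt(2.0)

print(hellinger([0.2, 0.5, 0.3], [0.4, 0.4, 0.2]))  # a value in [0, 1]
```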
For the case of unitary dynamics, the quantum Fisher information is the convex roof of the variance. Based on this, lower bounds on it can be obtained from given operator expectation values using semidefinite programming. The approach considers an optimization on the two-copy space.
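The semidefinite-programming lower bound described above is not reproduced here; as a simpler reference point, the sketch below uses the standard closed-form expression for the quantum Fisher information under unitary dynamics generated by an operator A, F_Q = 2 Σ_{k,l} (λ_k − λ_l)² / (λ_k + λ_l) · |⟨k|A|l⟩|², which reduces to four times the variance of A for a pure state (Python/NumPy assumed).

```python
import numpy as np

def quantum_fisher_information(rho, A, tol=1e-12):
    """Closed-form QFI for unitary dynamics generated by A:
    F_Q = 2 * sum_{k,l} (lam_k - lam_l)^2 / (lam_k + lam_l) * |<k|A|l>|^2,
    summing only over eigenvalue pairs with lam_k + lam_l > 0."""
    lam, vecs = np.linalg.eigh(rho)
    A_eig = vecs.conj().T @ A @ vecs  # A in the eigenbasis of rho
    F = 0.0
    for k in range(len(lam)):
        for l in range(len(lam)):
            s = lam[k] + lam[l]
            if s > tol:
                F += 2.0 * (lam[k] - lam[l]) ** 2 / s * abs(A_eig[k, l]) ** 2
    return F

# Pure-state check: the QFI should equal 4 * Var(A).
psi = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho = np.outer(psi, psi.conj())
A = np.array([[1.0, 0.0], [0.0, -1.0]])  # Pauli-Z as the generator
var = (psi.conj() @ A @ A @ psi - (psi.conj() @ A @ psi) ** 2).real
print(quantum_fisher_information(rho, A), 4 * var)  # both 4.0
```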
Pinsker's inequality. In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullback–Leibler divergence. The inequality is tight up to constant factors.
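Pinsker's inequality states TV(P, Q) ≤ √(D_KL(P‖Q) / 2). A quick numerical illustration on random discrete distributions (Python/NumPy assumed; the helper functions mirror the sketches above):

```python
import numpy as np

def tv(p, q):
    """Total variation distance for discrete distributions."""
    return 0.5 * np.sum(np.abs(p - q))

def kl(p, q):
    """KL divergence in nats, assuming q > 0 wherever p > 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

rng = np.random.default_rng(0)
for _ in range(1000):
    p = rng.dirichlet(np.ones(5))
    q = rng.dirichlet(np.ones(5))
    # Pinsker: TV(P, Q) <= sqrt(KL(P || Q) / 2)
    assert tv(p, q) <= np.sqrt(kl(p, q) / 2.0) + 1e-12
print("Pinsker's inequality held on all sampled pairs")
```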
f-divergence. In probability theory, an f-divergence is a certain type of function that measures the difference between two probability distributions P and Q. Many common divergences, such as KL-divergence, Hellinger distance, and total variation distance, are special cases of f-divergence.
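A sketch of the generic discrete form D_f(P‖Q) = Σ_x Q(x) f(P(x)/Q(x)), with generator choices f(t) = t log t, f(t) = ½|t − 1| and f(t) = ½(√t − 1)² recovering the KL divergence, total variation distance and squared Hellinger distance respectively (Python/NumPy assumed; the Hellinger convention matches the (1/√2)·‖√P − √Q‖₂ definition above):

```python
import numpy as np

def f_divergence(f, p, q):
    """D_f(P || Q) = sum_x Q(x) * f(P(x) / Q(x)), assuming Q(x) > 0 everywhere."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(q * f(p / q))

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])

kl         = f_divergence(lambda t: t * np.log(t), p, q)               # Kullback-Leibler divergence
tv         = f_divergence(lambda t: 0.5 * np.abs(t - 1.0), p, q)       # total variation distance
hellinger2 = f_divergence(lambda t: 0.5 * (np.sqrt(t) - 1.0) ** 2, p, q)  # squared Hellinger distance
print(kl, tv, hellinger2)
```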
[Figure: example of a function that is not convex.]
In mathematics, a real-valued function is called convex if the line segment between any two distinct points on the graph of the function lies above or on the graph between the two points. Equivalently, a function is convex if its epigraph (the set of points on or above the graph of the function) is a convex set.
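A small numerical illustration of the chord condition f(λx + (1 − λ)y) ≤ λf(x) + (1 − λ)f(y): the sketch below samples point pairs and mixing weights, so a passing result is only evidence on the sampled grid, not a proof (Python/NumPy assumed).

```python
import numpy as np

def looks_convex(f, xs, lambdas=np.linspace(0.0, 1.0, 11), tol=1e-9):
    """Numerically test f(l*x + (1-l)*y) <= l*f(x) + (1-l)*f(y) on sampled pairs.
    Only checks the sampled points; it cannot prove convexity."""
    for x in xs:
        for y in xs:
            for lam in lambdas:
                lhs = f(lam * x + (1.0 - lam) * y)
                rhs = lam * f(x) + (1.0 - lam) * f(y)
                if lhs > rhs + tol:
                    return False
    return True

xs = np.linspace(-3.0, 3.0, 25)
print(looks_convex(lambda x: x ** 2, xs))  # True:  x^2 is convex
print(looks_convex(np.exp, xs))            # True:  e^x is convex
print(looks_convex(np.sin, xs))            # False: sin is not convex on [-3, 3]
```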