In probability theory, a Chernoff bound is an exponentially decreasing upper bound on the tail of a random variable based on its moment generating function. The minimum of all such exponential bounds forms the Chernoff or Chernoff–Cramér bound, which may decay faster than exponentially (e.g. sub-Gaussian).
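As a minimal numerical sketch of this minimization over exponential bounds (assuming, purely for illustration, a standard normal variable; the function name chernoff_bound_normal and the grid of t values are not from the source):

```python
# Hedged sketch: P(X >= a) <= min_{t > 0} exp(-t*a) * M_X(t).
# For X ~ N(0, 1), M_X(t) = exp(t^2 / 2), and the minimizer t = a gives the
# sub-Gaussian bound exp(-a^2 / 2).
import numpy as np
from scipy.stats import norm

def chernoff_bound_normal(a, t_grid=np.linspace(1e-3, 10.0, 5000)):
    # exp(-t*a) times the MGF of N(0, 1), minimized over the grid of t
    return np.min(np.exp(-t_grid * a + t_grid**2 / 2.0))

for a in (1.0, 2.0, 3.0):
    print(a, chernoff_bound_normal(a), np.exp(-a**2 / 2.0), norm.sf(a))
```

For a = 2, for example, the optimized bound exp(−2) ≈ 0.135 lies well above the true tail probability Q(2) ≈ 0.0228, as expected of an upper bound.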
Therefore, the theorem above gives a tighter bound than the Ahlswede–Winter result. The chief contribution of (Ahlswede & Winter 2003) was the extension of the Laplace-transform method used to prove the scalar Chernoff bound (see Chernoff bound#Additive form (absolute error)) to the case of self-adjoint matrices.
Chernoff bound: The probability that a Poisson binomial distribution gets large can be bounded using its moment generating function as follows (valid when s ≥ μ and for any t > 0): Pr(S ≥ s) ≤ exp(−st) E[exp(tS)] = exp(−st) ∏_{i=1}^{n} (1 − p_i + p_i e^t).
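A hedged sketch of this bound for a Poisson binomial sum S = ∑ᵢ Xᵢ with Xᵢ ~ Bernoulli(pᵢ), minimizing exp(−st) ∏ᵢ (1 − pᵢ + pᵢ eᵗ) over a grid of t > 0 (the probabilities and the function name are illustrative):

```python
# Chernoff bound for a Poisson binomial sum, minimized over a grid of t > 0.
import numpy as np

def poisson_binomial_chernoff(p, s, t_grid=np.linspace(1e-3, 5.0, 2000)):
    p = np.asarray(p, dtype=float)
    # log MGF: sum_i log(1 - p_i + p_i * e^t), evaluated at every grid point t
    log_mgf = np.sum(np.log(1.0 - p[:, None] + p[:, None] * np.exp(t_grid[None, :])), axis=0)
    return np.min(np.exp(-s * t_grid + log_mgf))

p = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
mu = sum(p)                                   # mean of the sum, mu = 3.6
print(poisson_binomial_chernoff(p, s=mu + 2.0))
```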
In probability theory and statistics, the moment-generating function of a real-valued random variable is an alternative specification of its probability distribution. Thus, it provides the basis of an alternative route to analytical results compared with working directly with probability density functions or cumulative distribution functions.
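As a brief symbolic illustration of this alternative route (the Exponential(λ) example and symbol names are assumptions, not taken from the source), the moments E[Xⁿ] can be read off as derivatives of the moment-generating function at t = 0:

```python
# Moments from the MGF: E[X^n] = d^n/dt^n M(t) evaluated at t = 0.
# Example: Exponential(lam) has MGF lam / (lam - t) for t < lam.
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
M = lam / (lam - t)                           # MGF of Exponential(lam)
moments = [sp.simplify(sp.diff(M, t, n).subs(t, 0)) for n in range(1, 4)]
print(moments)                                # [1/lam, 2/lam**2, 6/lam**3]
```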
The proof of Hoeffding's inequality is similar to those of other concentration inequalities such as Chernoff bounds. [9] The main difference is the use of Hoeffding's lemma: suppose X is a real random variable such that X ∈ [a, b] almost surely; then, for all s ∈ ℝ, E[exp(s(X − E[X]))] ≤ exp(s²(b − a)²/8).
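A hedged numerical sketch of the resulting inequality for i.i.d. variables bounded in [0, 1], where it reads P(Sₙ − E[Sₙ] ≥ s) ≤ exp(−2s² / ∑ᵢ (bᵢ − aᵢ)²) = exp(−2s²/n); the choice of uniform variables, sample sizes, and names are illustrative:

```python
# Compare the Hoeffding bound with a Monte Carlo estimate of the tail probability.
import numpy as np

n, s = 200, 15.0
rng = np.random.default_rng(1)
samples = rng.uniform(0.0, 1.0, size=(50_000, n)).sum(axis=1)   # realizations of S_n
empirical = np.mean(samples - n * 0.5 >= s)                      # E[S_n] = n / 2
hoeffding = np.exp(-2.0 * s**2 / n)                              # each (b_i - a_i)^2 = 1
print(f"empirical: {empirical:.3e}, Hoeffding bound: {hoeffding:.3e}")
```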
In probability theory and statistics, the Poisson distribution (/ˈpwɑːsɒn/) is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time if these events occur with a known constant mean rate and independently of the time since the last event. [1]
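As a short illustration (using the standard pmf λᵏ e^(−λ) / k!, a known formula not quoted in the excerpt above), the probability of observing k events in one interval with mean rate λ can be computed as:

```python
# Poisson pmf: P(k events) = lam**k * exp(-lam) / k!
import math

def poisson_pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

lam = 3.0                                     # illustrative: 3 events expected per interval
print([round(poisson_pmf(k, lam), 4) for k in range(8)])
```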
The Chernoff bound of the Q-function is Q(x) ≤ exp(−x²/2) for x > 0. As in the one-dimensional case, there is no simple analytical formula for the Q-function. Nevertheless, ...
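A small hedged check of this bound, comparing Q(x) = ½ erfc(x/√2) with exp(−x²/2) for a few positive x (the sample points are arbitrary):

```python
# The Chernoff bound exp(-x^2 / 2) upper-bounds Q(x) for x > 0.
import math

for x in (0.5, 1.0, 2.0, 3.0):
    q = 0.5 * math.erfc(x / math.sqrt(2.0))
    print(f"x={x}: Q(x)={q:.3e}, Chernoff bound={math.exp(-x**2 / 2.0):.3e}")
```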
This is a generalization of Hoeffding's inequality, since it can handle random variables that satisfy not only an almost-sure bound but both an almost-sure bound and a variance bound. Chernoff bounds have a particularly simple form in the case of a sum of independent variables, since E[exp(t ∑ᵢ Xᵢ)] = ∏ᵢ E[exp(t Xᵢ)].
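A hedged Monte Carlo sketch of this factorization for two independent Bernoulli variables (parameters and seed are illustrative):

```python
# E[exp(t*(X+Y))] = E[exp(t*X)] * E[exp(t*Y)] when X and Y are independent.
import numpy as np

rng = np.random.default_rng(2)
t = 0.7
x = rng.binomial(1, 0.3, size=1_000_000)      # Bernoulli(0.3)
y = rng.binomial(1, 0.6, size=1_000_000)      # Bernoulli(0.6), independent of x
lhs = np.mean(np.exp(t * (x + y)))            # Monte Carlo estimate of E[e^{t(X+Y)}]
rhs = np.mean(np.exp(t * x)) * np.mean(np.exp(t * y))
print(lhs, rhs)                               # should agree up to sampling noise
```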