enow.com Web Search

Search results

  1. Markov's inequality - Wikipedia

    en.wikipedia.org/wiki/Markov's_inequality

    Markov's inequality (and other similar inequalities) relate probabilities to expectations, and provide (frequently loose but still useful) bounds for the cumulative distribution function of a random variable. Markov's inequality can also be used to upper bound the expectation of a non-negative random variable in terms of its distribution function.
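
    For reference, the standard statement of the inequality (the usual textbook form, added here for context rather than quoted from the article): if X is a non-negative random variable and a > 0, then

        \Pr(X \ge a) \le \frac{\mathbb{E}[X]}{a}.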

  2. Second moment method - Wikipedia

    en.wikipedia.org/wiki/Second_moment_method

    In mathematics, the second moment method is a technique used in probability theory and analysis to show that a random variable has positive probability of being positive. More generally, the "moment method" consists of bounding the probability that a random variable fluctuates far from its mean, by using its moments.
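
    The core estimate behind the method, stated here for reference (a standard consequence of the Cauchy–Schwarz inequality, not quoted from the article): for a non-negative random variable X with finite second moment,

        \Pr(X > 0) \ge \frac{(\mathbb{E}[X])^2}{\mathbb{E}[X^2]}.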

  3. Markov brothers' inequality - Wikipedia

    en.wikipedia.org/wiki/Markov_brothers'_inequality

    In mathematics, the Markov brothers' inequality is an inequality, proved in the 1890s by brothers Andrey Markov and Vladimir Markov, two Russian mathematicians. This inequality bounds the maximum of the derivatives of a polynomial on an interval in terms of the maximum of the polynomial. [1]
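
    For reference, the first-derivative case (Andrey Markov's original result; Vladimir Markov's extension covers higher derivatives): if P is a polynomial of degree n, then

        \max_{x \in [-1,1]} |P'(x)| \le n^2 \max_{x \in [-1,1]} |P(x)|.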

  4. Coupon collector's problem - Wikipedia

    en.wikipedia.org/wiki/Coupon_collector's_problem

    A simple proof by martingales is in the next section. Donald J. Newman and Lawrence Shepp gave a generalization of the coupon collector's problem when m copies of each coupon need to be collected. Let T_m be the first time m copies of each coupon are collected.
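
    For context, the classical single-copy case (m = 1): with n distinct coupons drawn uniformly at random, the expected collection time T satisfies

        \mathbb{E}[T] = n H_n = n \left( 1 + \tfrac{1}{2} + \cdots + \tfrac{1}{n} \right) = n \ln n + \gamma n + \tfrac{1}{2} + o(1),

    where H_n is the n-th harmonic number and \gamma is the Euler–Mascheroni constant.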

  5. Chebyshev's inequality - Wikipedia

    en.wikipedia.org/wiki/Chebyshev's_inequality

    The term Chebyshev's inequality may also refer to Markov's inequality, especially in the context of analysis. They are closely related, and some authors refer to Markov's inequality as "Chebyshev's First Inequality," and the similar one referred to on this page as "Chebyshev's Second Inequality."
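
    For reference, the standard two-sided form of Chebyshev's inequality (not quoted from the article): if X has mean \mu and finite variance \sigma^2, then for any k > 0,

        \Pr(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}.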

  6. Chernoff bound - Wikipedia

    en.wikipedia.org/wiki/Chernoff_bound

    Chernoff bounds may also be applied to general sums of independent, bounded random variables, regardless of their distribution; this is known as Hoeffding's inequality. The proof follows a similar approach to the other Chernoff bounds, but applying Hoeffding's lemma to bound the moment generating functions (see Hoeffding's inequality).
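
    Hoeffding's inequality in its usual one-sided form (standard statement, included for reference): if X_1, \dots, X_n are independent with a_i \le X_i \le b_i and S_n = X_1 + \cdots + X_n, then for any t > 0,

        \Pr\left( S_n - \mathbb{E}[S_n] \ge t \right) \le \exp\left( - \frac{2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right).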

  7. Chebyshev–Markov–Stieltjes inequalities - Wikipedia

    en.wikipedia.org/wiki/Chebyshev–Markov...

    In mathematical analysis, the Chebyshev–Markov–Stieltjes inequalities are inequalities related to the problem of moments that were formulated in the 1880s by Pafnuty Chebyshev and proved independently by Andrey Markov and (somewhat later) by Thomas Jan Stieltjes. [1]

  8. Chapman–Kolmogorov equation - Wikipedia

    en.wikipedia.org/wiki/Chapman–Kolmogorov_equation

    In mathematics, specifically in the theory of Markovian stochastic processes in probability theory, the Chapman–Kolmogorov equation (CKE) is an identity relating the joint probability distributions of different sets of coordinates on a stochastic process.
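
    For a time-homogeneous Markov chain with n-step transition probabilities p^{(n)}_{ij}, the equation takes the familiar discrete form (standard statement, added for reference):

        p^{(m+n)}_{ij} = \sum_{k} p^{(m)}_{ik} \, p^{(n)}_{kj}, \qquad \text{equivalently } P^{(m+n)} = P^{(m)} P^{(n)} \text{ in matrix form.}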