enow.com Web Search

Search results

  1. Exponential distribution - Wikipedia

    en.wikipedia.org/wiki/Exponential_distribution

    In probability theory and statistics, the exponential distribution or negative exponential distribution is the probability distribution of the distance between events in a Poisson point process, i.e., a process in which events occur continuously and independently at a constant average rate; the distance parameter could be any meaningful mono-dimensional measure of the process, such as time ...
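
    A minimal sketch of this relationship (assuming NumPy is available; not taken from the article): for a homogeneous Poisson process, the gaps between consecutive events are exponentially distributed with mean 1/rate.

        import numpy as np

        rng = np.random.default_rng(0)
        rate, T = 2.0, 10_000.0
        n_events = rng.poisson(rate * T)                 # number of events in [0, T]
        times = np.sort(rng.uniform(0.0, T, n_events))   # event times, uniform given the count
        gaps = np.diff(times)                            # distances between consecutive events
        print(gaps.mean(), 1.0 / rate)                   # both should be close to 0.5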

  2. Laplace distribution - Wikipedia

    en.wikipedia.org/wiki/Laplace_distribution

    In probability theory and statistics, the Laplace distribution is a continuous probability distribution named after Pierre-Simon Laplace. It is also sometimes called the double exponential distribution, because it can be thought of as two exponential distributions (with an additional location parameter) spliced together along the abscissa, although the term is also sometimes used to refer to ...
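
    A minimal sketch of the "two exponentials spliced together" view (assuming NumPy; not taken from the article): the difference of two independent exponentials with scale b has a Laplace(0, b) distribution, so its variance should match 2*b**2.

        import numpy as np

        rng = np.random.default_rng(0)
        b = 1.5                                            # Laplace scale parameter
        x = rng.exponential(b, 100_000) - rng.exponential(b, 100_000)
        y = rng.laplace(loc=0.0, scale=b, size=100_000)
        print(x.var(), y.var(), 2 * b**2)                  # all close to 4.5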

  3. Kolmogorov–Smirnov test - Wikipedia

    en.wikipedia.org/wiki/Kolmogorov–Smirnov_test

    Illustration of the Kolmogorov–Smirnov statistic: the red line is a model CDF, the blue line is an empirical CDF, and the black arrow is the KS statistic. The Kolmogorov–Smirnov test (K–S test or KS test) is a nonparametric test of the equality of continuous (or discontinuous, see Section 2.2), one-dimensional probability distributions that can be used to test whether a sample came from a ...
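
    A minimal sketch of the one-sample case (assuming SciPy; the function name is SciPy's, not the article's): testing whether a sample came from a standard normal reference distribution.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        sample = rng.normal(size=500)
        res = stats.kstest(sample, "norm")   # one-sample KS test against N(0, 1)
        print(res.statistic, res.pvalue)     # a large p-value gives no evidence against normality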

  4. Double exponential distribution - Wikipedia

    en.wikipedia.org/.../Double_exponential_distribution

    Laplace distribution, or bilateral exponential distribution, consisting of two exponential distributions glued together on each side of a threshold; Gumbel distribution, the cumulative distribution function of which is an iterated exponential function (the exponential of an exponential function).
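
    A minimal sketch of the "iterated exponential" remark (assuming NumPy/SciPy; not taken from the article): the Gumbel CDF written directly as exp(-exp(-(x - mu) / beta)) matches SciPy's gumbel_r.

        import numpy as np
        from scipy.stats import gumbel_r

        mu, beta = 1.0, 2.0
        x = np.linspace(-5.0, 15.0, 9)
        manual = np.exp(-np.exp(-(x - mu) / beta))                         # exponential of an exponential
        print(np.allclose(manual, gumbel_r.cdf(x, loc=mu, scale=beta)))   # True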

  5. Statistical distance - Wikipedia

    en.wikipedia.org/wiki/Statistical_distance

    A metric on a set X is a function (called the distance function or simply distance) d : X × X → R+ (where R+ is the set of non-negative real numbers). For all x, y, z in X, this function is required to satisfy the following conditions: d(x, y) ≥ 0 (non-negativity); d(x, y) = 0 if and only if x = y (identity of indiscernibles) ...
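
    A minimal sketch (assuming NumPy; using the total variation distance, one statistical distance that is a metric, which is an assumption not spelled out in this snippet): numerically checking the conditions quoted above, together with the standard symmetry and triangle-inequality conditions, for d(P, Q) = 0.5 * sum |P_i - Q_i| on small discrete distributions.

        import numpy as np

        def tv_distance(p, q):
            """Total variation distance between two discrete distributions."""
            return 0.5 * np.abs(np.asarray(p) - np.asarray(q)).sum()

        p = np.array([0.5, 0.3, 0.2])
        q = np.array([0.4, 0.4, 0.2])
        r = np.array([0.2, 0.2, 0.6])
        assert tv_distance(p, q) >= 0                                         # non-negativity
        assert tv_distance(p, p) == 0                                         # identity of indiscernibles
        assert tv_distance(p, q) == tv_distance(q, p)                         # symmetry
        assert tv_distance(p, r) <= tv_distance(p, q) + tv_distance(q, r)     # triangle inequality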

  6. Relationships among probability distributions - Wikipedia

    en.wikipedia.org/wiki/Relationships_among...

    The square of a standard normal random variable has a chi-squared distribution with one degree of freedom. If X is a Student’s t random variable with ν degrees of freedom, then X² is an F(1, ν) random variable. If X is a double exponential random variable with mean 0 and scale λ, then |X| is an exponential random variable with mean λ.
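
    A minimal Monte Carlo sketch of two of these relationships (assuming NumPy/SciPy; not taken from the article): the square of a standard normal should pass a KS test against chi-squared with 1 degree of freedom, and the absolute value of a mean-0 Laplace variable with scale lam should pass one against an exponential with mean lam.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        z = rng.normal(size=200_000)
        print(stats.kstest(z**2, "chi2", args=(1,)).pvalue)            # Z^2 vs. chi2(1)

        lam = 2.0
        x = rng.laplace(scale=lam, size=200_000)
        print(stats.kstest(np.abs(x), "expon", args=(0, lam)).pvalue)  # |X| vs. Exp(mean lam)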

  7. Convolution of probability distributions - Wikipedia

    en.wikipedia.org/wiki/Convolution_of_probability...

    The probability distribution of the sum of two or more independent random variables is the convolution of their individual distributions. The term is motivated by the fact that the probability mass function or probability density function of a sum of independent random variables is the convolution of their corresponding probability mass functions or probability density functions respectively.
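
    A minimal discrete sketch (assuming NumPy; not taken from the article): the probability mass function of the sum of two fair dice is the convolution of the two individual PMFs.

        import numpy as np

        die = np.full(6, 1 / 6)          # PMF of one die, on the values 1..6
        total = np.convolve(die, die)    # PMF of the sum, on the values 2..12
        print(total)                     # 1/36, 2/36, ..., 6/36, ..., 2/36, 1/36
        print(total.sum())               # approximately 1.0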

  8. Sign test - Wikipedia

    en.wikipedia.org/wiki/Sign_test

    The sign test is a statistical test for consistent differences between pairs of observations, such as the weight of subjects before and after treatment. Given pairs of observations (such as weight pre- and post-treatment) for each subject, the sign test determines if one member of the pair (such as pre-treatment) tends to be greater than (or less than) the other member of the pair (such as ...
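
    A minimal sketch with made-up paired weights (assuming SciPy; carrying the test out via a binomial test on the signs is a common implementation choice, not something stated in this snippet): count the positive pre-minus-post differences, drop ties, and test whether positive and negative signs are equally likely.

        import numpy as np
        from scipy.stats import binomtest

        pre  = np.array([72.1, 80.4, 65.3, 90.0, 77.7, 84.2, 69.9, 75.5])   # hypothetical pre-treatment weights
        post = np.array([70.0, 78.9, 66.0, 85.1, 75.2, 80.0, 70.3, 73.8])   # hypothetical post-treatment weights
        diffs = pre - post
        diffs = diffs[diffs != 0]                        # ties carry no sign information
        n_pos = int((diffs > 0).sum())
        result = binomtest(n_pos, n=len(diffs), p=0.5)   # H0: positive and negative signs equally likely
        print(n_pos, len(diffs), result.pvalue)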