Illustration of the Kolmogorov–Smirnov statistic: the red line is a model CDF, the blue line is an empirical CDF, and the black arrow is the KS statistic. In statistics, the Kolmogorov–Smirnov test (also K–S test or KS test) is a nonparametric test of the equality of continuous (or discontinuous, see Section 2.2), one-dimensional probability distributions.
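A minimal sketch of how those pieces fit together, assuming a standard-normal model CDF and an arbitrary sample (both are illustrative choices, not taken from the source):

```python
# Sketch: empirical CDF of a sample, a model CDF, and the KS statistic as the
# largest vertical gap between them. Sample size and distributions are
# arbitrary illustrative choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.normal(loc=0.0, scale=1.0, size=50)

x = np.sort(sample)
ecdf = np.arange(1, len(x) + 1) / len(x)   # empirical CDF evaluated at the sample points
model_cdf = stats.norm.cdf(x)              # model CDF (standard normal)

# KS statistic: check the gap just after and just before each jump of the ECDF.
d_plus = np.max(ecdf - model_cdf)
d_minus = np.max(model_cdf - (ecdf - 1.0 / len(x)))
ks_stat = max(d_plus, d_minus)
print(ks_stat)
```

For the same sample, `scipy.stats.kstest(sample, 'norm')` should report the same statistic.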
Kolmogorov–Smirnov test: this test only works if the mean and the variance of the normal distribution are assumed known under the null hypothesis; Lilliefors test: based on the Kolmogorov–Smirnov test, adjusted for the case where the mean and variance are also estimated from the data; Shapiro–Wilk test; and Pearson's chi-squared test.
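A hedged sketch of the distinction drawn above, using SciPy's `kstest`; the N(0, 1) null hypothesis and the sample are illustrative assumptions:

```python
# The plain KS test is only valid when the null distribution's parameters are
# fixed in advance, not estimated from the same data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=200)

# Valid use: mean and standard deviation are assumed known (0 and 1).
stat, p = stats.kstest(data, 'norm', args=(0.0, 1.0))
print(stat, p)

# Problematic use (p-values too conservative): plugging in estimates from the
# data itself; this is the case the Lilliefors test is designed to correct.
stat_bad, p_bad = stats.kstest(data, 'norm', args=(data.mean(), data.std(ddof=1)))
print(stat_bad, p_bad)
```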
The sup-norm in this expression is called the Kolmogorov–Smirnov statistic for testing the goodness-of-fit between the empirical distribution function F̂_n and the assumed true cumulative distribution function F. Other norm functions may reasonably be used here instead of the sup-norm.
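The expression itself did not survive extraction; in standard notation the statistic referred to is usually written as

```latex
% Standard form of the Kolmogorov–Smirnov statistic (notation reconstructed;
% \hat{F}_n is the empirical distribution function of n observations).
D_n = \sup_{x} \left| \hat{F}_n(x) - F(x) \right|
```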
The Kolmogorov–Smirnov test is based on cumulative distribution functions and can be used to test whether two empirical distributions are different or whether an empirical distribution is different from an ideal distribution. The closely related Kuiper's test is useful if the domain of the distribution is cyclic, as in day of the week ...
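A hedged sketch of the two uses described above with SciPy; the exponential samples and the Exp(1) reference distribution are arbitrary illustrations, and Kuiper's test itself is not shown here:

```python
# Two-sample KS test (two empirical distributions) and one-sample KS test
# (one empirical distribution against an ideal, fully specified distribution).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
a = rng.exponential(scale=1.0, size=300)
b = rng.exponential(scale=1.2, size=250)

# Are the two empirical distributions different?
print(stats.ks_2samp(a, b))

# Is the empirical distribution of `a` different from an ideal Exp(1)?
print(stats.kstest(a, 'expon'))
```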
The Kolmogorov–Smirnov test and the Kuiper test are more powerful when the sample size is small, particularly when Stephens's corrective factor is used. [54] These tests may be unduly conservative when applied to discrete distributions. Values for the Benford test have been generated by Morrow. [55]
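A hedged illustration of a KS-type comparison against the Benford first-digit distribution; the log-normal sample is an arbitrary stand-in for Benford-like data, and neither Stephens's corrective factor nor Morrow's critical values are reproduced here:

```python
# KS-type statistic comparing observed first-digit proportions to the Benford
# distribution. As noted above, applying the KS test directly to a discrete
# distribution like this tends to be unduly conservative.
import numpy as np

def first_digit(x):
    s = f"{abs(x):e}"          # scientific notation: the leading digit is the first character
    return int(s[0])

rng = np.random.default_rng(3)
data = rng.lognormal(mean=0.0, sigma=2.0, size=1000)   # roughly Benford-like data

digits = np.array([first_digit(v) for v in data])
observed = np.array([(digits == d).mean() for d in range(1, 10)])
benford = np.log10(1 + 1 / np.arange(1, 10))

# Maximum gap between the two cumulative distributions over digits 1..9.
ks_like = np.max(np.abs(np.cumsum(observed) - np.cumsum(benford)))
print(ks_like)
```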
Lilliefors test is a normality test based on the Kolmogorov–Smirnov test. It is used to test the null hypothesis that data come from a normally distributed population when the null hypothesis does not specify which normal distribution; i.e., it does not specify the expected value and variance of the distribution.[1]
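A hedged sketch of the Lilliefors test using the implementation in statsmodels; the sample is an arbitrary normal draw whose mean and variance are left to be estimated from the data:

```python
# Lilliefors test: the null normal distribution's mean and variance are
# estimated from the data, and the KS critical values are adjusted accordingly.
import numpy as np
from statsmodels.stats.diagnostic import lilliefors

rng = np.random.default_rng(4)
data = rng.normal(loc=5.0, scale=2.0, size=100)   # arbitrary normal sample

stat, p = lilliefors(data, dist='norm')
print(stat, p)
```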
Kolmogorov–Smirnov test: tests whether a sample is drawn from a given distribution, or whether two samples are drawn from the same distribution. Kruskal–Wallis one-way analysis of variance by ranks: tests whether more than two independent samples are drawn from the same distribution.
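A hedged sketch of both tests using SciPy; the three normal samples with shifted means are arbitrary illustrative data:

```python
# Two-sample KS test, and Kruskal-Wallis test across three samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
g1 = rng.normal(0.0, 1.0, size=40)
g2 = rng.normal(0.3, 1.0, size=40)
g3 = rng.normal(0.6, 1.0, size=40)

print(stats.ks_2samp(g1, g2))      # are two samples drawn from the same distribution?
print(stats.kruskal(g1, g2, g3))   # are more than two samples drawn from the same distribution?
```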
Kolmogorov–Smirnov estimation: Another method for the estimation of the power-law exponent, which does not assume independent and identically distributed (iid) data, uses the minimization of the Kolmogorov–Smirnov statistic D between the cumulative distribution functions of the data and the power law, D = max_{x ≥ x_min} |S(x) − P(x)|, where S is the empirical CDF of the data and P is the CDF of the fitted power law; the estimated exponent is the value that minimizes D.
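A hedged sketch of this estimation idea, assuming x_min is known and using a simple grid search over the exponent; dedicated implementations (e.g. the powerlaw package) are more careful:

```python
# Choose the power-law exponent alpha that minimizes the KS distance between
# the data's empirical CDF and the fitted power-law CDF, with x_min fixed.
import numpy as np

def ks_distance(data, alpha, x_min):
    x = np.sort(data)
    ecdf = np.arange(1, len(x) + 1) / len(x)
    model = 1.0 - (x / x_min) ** (1.0 - alpha)   # continuous power-law CDF for x >= x_min
    return np.max(np.abs(ecdf - model))

rng = np.random.default_rng(6)
x_min, true_alpha = 1.0, 2.5
# Inverse-transform sampling from a continuous power law with exponent true_alpha.
u = rng.uniform(size=5000)
data = x_min * (1.0 - u) ** (-1.0 / (true_alpha - 1.0))

# Grid search for the exponent minimizing the KS distance.
alphas = np.linspace(1.5, 3.5, 201)
best = min(alphas, key=lambda a: ks_distance(data, a, x_min))
print(best)
```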