enow.com Web Search

Search results

  1. Quantile regression - Wikipedia

    en.wikipedia.org/wiki/Quantile_regression

    Quantile regression is a type of regression analysis used in statistics and econometrics. Whereas the method of least squares estimates the conditional mean of the response variable across values of the predictor variables, quantile regression estimates the conditional median (or other quantiles) of the response variable.
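
    As a minimal sketch of the idea (not code from the article), the conditional median can be fit by minimizing the pinball (check) loss; the data and variable names below are illustrative assumptions.

    ```python
    # Sketch: fit a conditional-median (q = 0.5) linear model by minimizing the
    # pinball (check) loss, which is the criterion quantile regression uses.
    # The data are synthetic; nothing here comes from the article.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, 200)
    y = 2.0 + 0.5 * x + rng.standard_t(df=2, size=200)   # heavy-tailed noise

    def pinball_loss(beta, q=0.5):
        resid = y - (beta[0] + beta[1] * x)
        return np.mean(np.where(resid >= 0, q * resid, (q - 1) * resid))

    fit = minimize(pinball_loss, x0=[0.0, 0.0], method="Nelder-Mead")
    print("median-regression intercept and slope:", fit.x)
    ```

    Changing q (for example to 0.9) estimates the corresponding conditional quantile instead of the median.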

  2. Conditional expectation - Wikipedia

    en.wikipedia.org/wiki/Conditional_expectation

    The unconditional expectation of rainfall for an unspecified day is the average of the rainfall amounts for those 3652 days. The conditional expectation of rainfall for an otherwise unspecified day known to be (conditional on being) in the month of March is the average of daily rainfall over all 310 days of the ten-year period that fall in March.
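
    To make the arithmetic concrete, here is a small sketch with made-up rainfall data (the day counts will not exactly match the article's 3652/310): conditioning on "the day is in March" simply means averaging over the March subset.

    ```python
    # Sketch of the rainfall example with synthetic data: the conditional
    # expectation given "March" is the plain average over the March rows only.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    days = pd.date_range("2000-01-01", "2009-12-31", freq="D")   # a ten-year period
    rain = pd.Series(rng.gamma(shape=2.0, scale=1.5, size=len(days)), index=days)

    unconditional = rain.mean()                              # E[rainfall]
    conditional_march = rain[rain.index.month == 3].mean()   # E[rainfall | day in March]
    print(unconditional, conditional_march)
    ```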

  3. Repeated median regression - Wikipedia

    en.wikipedia.org/wiki/Repeated_median_regression

    In robust statistics, repeated median regression, also known as the repeated median estimator, is a robust linear regression algorithm. The estimator has a breakdown point of 50%. [1] Although it is equivariant under scaling, or under linear transformations of either its explanatory variable or its response variable, it is not equivariant under affine transformations that combine both variables.
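
    A minimal sketch of the estimator as usually defined (synthetic data, not code from the article): for each point, take the median of the slopes to every other point, then take the median of those medians.

    ```python
    # Sketch of the repeated median slope: median over i of (median over j != i
    # of the pairwise slope between points i and j). Intercept via residual median.
    import numpy as np

    rng = np.random.default_rng(2)
    x = np.arange(50, dtype=float)
    y = 1.0 + 0.3 * x + rng.normal(0, 0.5, 50)
    y[:10] += 20.0                                # gross outliers

    def repeated_median_slope(x, y):
        n = len(x)
        inner_medians = []
        for i in range(n):
            slopes = [(y[j] - y[i]) / (x[j] - x[i]) for j in range(n) if j != i]
            inner_medians.append(np.median(slopes))
        return np.median(inner_medians)

    slope = repeated_median_slope(x, y)
    intercept = np.median(y - slope * x)
    print(slope, intercept)
    ```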

  4. Linear regression - Wikipedia

    en.wikipedia.org/wiki/Linear_regression

    The Theil–Sen estimator is a simple robust estimation technique that chooses the slope of the fit line to be the median of the slopes of the lines through pairs of sample points. It has similar statistical efficiency properties to simple linear regression but is much less sensitive to outliers.
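
    A minimal sketch of the pairwise-slope idea described above (synthetic data; scipy.stats.theilslopes provides a ready-made implementation):

    ```python
    # Sketch: the Theil–Sen slope is the median of the slopes over all pairs
    # of sample points; a median of residuals gives a matching intercept.
    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(3)
    x = np.linspace(0, 10, 60)
    y = 4.0 + 1.5 * x + rng.normal(0, 0.3, 60)
    y[::15] += 25.0                               # a few large outliers

    pair_slopes = [(y[j] - y[i]) / (x[j] - x[i]) for i, j in combinations(range(len(x)), 2)]
    slope = np.median(pair_slopes)
    intercept = np.median(y - slope * x)
    print(slope, intercept)
    ```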

  5. Rao–Blackwell theorem - Wikipedia

    en.wikipedia.org/wiki/Rao–Blackwell_theorem

    A Rao–Blackwell estimator δ₁(X) of an unobservable quantity θ is the conditional expected value E(δ(X) | T(X)) of some estimator δ(X) given a sufficient statistic T(X). Call δ(X) the "original estimator" and δ₁(X) the "improved estimator". It is important that the improved estimator be observable, i.e. that it does not depend on θ.
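
    A standard worked example (assumed here, not taken from the article): for Poisson data, Rao–Blackwellizing the crude unbiased estimator 1{X₁ = 0} of θ = e^(−λ) by conditioning on the sufficient statistic T = ΣXᵢ gives (1 − 1/n)^T, and a quick simulation shows the variance reduction.

    ```python
    # Sketch: Rao–Blackwellization for theta = P(X = 0) = exp(-lambda), X ~ Poisson.
    # Crude estimator: delta(X) = 1{X_1 = 0}. Improved: E[delta | T] = (1 - 1/n)**T.
    import numpy as np

    rng = np.random.default_rng(4)
    lam, n, reps = 2.0, 20, 5000
    samples = rng.poisson(lam, size=(reps, n))

    crude = (samples[:, 0] == 0).astype(float)          # delta(X)
    improved = (1 - 1 / n) ** samples.sum(axis=1)       # E[delta(X) | T(X)]

    print("true theta:", np.exp(-lam))
    print("crude     mean, variance:", crude.mean(), crude.var())
    print("improved  mean, variance:", improved.mean(), improved.var())
    ```

    Both estimators are unbiased, but the conditioned one depends only on T and has markedly smaller variance, as the theorem guarantees.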

  6. M-estimator - Wikipedia

    en.wikipedia.org/wiki/M-estimator

    Such an estimator is not necessarily an M-estimator of ρ-type, but if ρ has a continuous first derivative with respect to θ, then a necessary condition for an M-estimator of ψ-type to be an M-estimator of ρ-type is ψ(x, θ) = ∇_θ ρ(x, θ). The previous definitions can easily be extended to finite samples.
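
    As an illustration of the ρ-type/ψ-type relationship (an assumed example, not from the article), the Huber location estimator takes ρ to be the Huber loss in the residual r = x − θ and ψ its derivative in r, and the ψ-type estimate solves Σᵢ ψ(xᵢ − θ) = 0; sign conventions for the derivative vary between sources.

    ```python
    # Sketch: Huber location M-estimator. psi is the derivative of the Huber rho
    # with respect to the residual; the estimate is the root of sum(psi(x - t)) = 0.
    import numpy as np
    from scipy.optimize import brentq

    rng = np.random.default_rng(5)
    x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(20.0, 1.0, 5)])  # 5% outliers
    k = 1.345                                     # common Huber tuning constant

    def psi(r):
        return np.clip(r, -k, k)                  # derivative of the Huber loss

    theta_hat = brentq(lambda t: psi(x - t).sum(), x.min(), x.max())
    print("Huber location estimate:", theta_hat, "  sample mean:", x.mean())
    ```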

  7. Method of moments (statistics) - Wikipedia

    en.wikipedia.org/wiki/Method_of_moments_(statistics)

    An example application of the method of moments is to estimate polynomial probability density distributions. In this case, an approximating polynomial of order n is defined on an interval [a, b]. The method of moments then yields a system of equations, whose solution involves the inversion of a Hankel matrix. [2]
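
    A minimal sketch of how such a system arises (the interval, degree, and data below are assumptions): matching the empirical moments of the data to the moments of a polynomial density p(x) = Σ c_k x^k on [a, b] gives a linear system whose matrix depends only on j + k, i.e. a Hankel matrix.

    ```python
    # Sketch: fit polynomial "density" coefficients by moment matching.
    # Moment equations: sum_k c_k * H[j, k] = mu_j, where
    #   H[j, k] = integral_a^b x**(j + k) dx = (b**(j+k+1) - a**(j+k+1)) / (j + k + 1)
    # is a Hankel matrix and mu_j are empirical moments of the data.
    import numpy as np

    rng = np.random.default_rng(6)
    a, b, degree = 0.0, 1.0, 4
    data = rng.beta(2.0, 5.0, size=10_000)           # synthetic sample on [0, 1]

    mu = np.array([np.mean(data ** j) for j in range(degree + 1)])   # empirical moments

    p = np.arange(degree + 1)
    s = p[:, None] + p[None, :]                      # j + k
    H = (b ** (s + 1) - a ** (s + 1)) / (s + 1)      # Hankel matrix
    coeffs = np.linalg.solve(H, mu)

    print("coefficients c_0..c_4:", coeffs)
    print("estimated density at x = 0.3:", np.polynomial.polynomial.polyval(0.3, coeffs))
    ```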

  8. Kernel regression - Wikipedia

    en.wikipedia.org/wiki/Kernel_regression

    In statistics, kernel regression is a non-parametric technique to estimate the conditional expectation of a random variable. The objective is to find a non-linear relation between a pair of random variables X and Y.
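
    A minimal sketch of one common estimator of this kind, the Nadaraya–Watson form (the Gaussian kernel, bandwidth, and data below are assumptions): the fitted value at a point is a kernel-weighted average of the observed responses.

    ```python
    # Sketch: Nadaraya–Watson kernel regression. The estimate at x0 is a
    # weighted average of y, with weights from a Gaussian kernel in (x0 - x).
    import numpy as np

    rng = np.random.default_rng(7)
    x = np.sort(rng.uniform(0, 2 * np.pi, 200))
    y = np.sin(x) + rng.normal(0, 0.2, 200)

    def nadaraya_watson(x0, x, y, bandwidth=0.3):
        weights = np.exp(-0.5 * ((x0 - x) / bandwidth) ** 2)   # Gaussian kernel
        return np.sum(weights * y) / np.sum(weights)

    grid = np.linspace(0, 2 * np.pi, 9)
    print([round(nadaraya_watson(g, x, y), 3) for g in grid])
    ```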