enow.com Web Search

Search results

  1. Cook's distance - Wikipedia

    en.wikipedia.org/wiki/Cook's_distance

    In statistics, Cook's distance or Cook's D is a commonly used estimate of the influence of a data point when performing a least-squares regression analysis. [1] In a practical ordinary least squares analysis, Cook's distance can be used in several ways: to indicate influential data points that are particularly worth checking for validity; or to indicate regions of the design space where it ...
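
    A minimal NumPy sketch of how Cook's distance can be computed from an ordinary least-squares fit (not taken from the article; the simulated data, the injected gross error, and the 4/n cutoff are illustrative assumptions):

    ```python
    # Cook's distance from the hat matrix and residuals of an OLS fit (NumPy only).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 50
    x = rng.normal(size=n)
    y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
    y[0] += 8.0                                   # hypothetical gross error in one response

    X = np.column_stack([np.ones(n), x])          # design matrix with intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS coefficients
    H = X @ np.linalg.solve(X.T @ X, X.T)         # hat (projection) matrix
    h = np.diag(H)                                # leverages
    e = y - X @ beta                              # residuals
    p = X.shape[1]
    s2 = e @ e / (n - p)                          # mean squared error

    cooks_d = e**2 / (p * s2) * h / (1 - h)**2    # Cook's distance of each observation
    print("points worth checking:", np.where(cooks_d > 4 / n)[0])  # a common rule of thumb
    ```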

  2. Leverage (statistics) - Wikipedia

    en.wikipedia.org/wiki/Leverage_(statistics)

    Specifically, for some n × p matrix X, the squared Mahalanobis distance of x_i (where x_i^T is the i-th row of X) from the vector of means μ̂ = (1/n) Σ_{i=1}^n x_i of length p, is D^2(x_i) = (x_i − μ̂)^T S^{-1} (x_i − μ̂), where S is the estimated covariance matrix of the x_i's. This is related to the leverage h_ii of the hat matrix of X after appending a column vector of 1's to it.
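
    A small numerical sketch of that relationship on simulated data, using the identity D^2(x_i) = (n − 1)(h_ii − 1/n) for leverages taken from the hat matrix of X with a column of ones appended; the code below checks it numerically, and the data are illustrative:

    ```python
    # Squared Mahalanobis distances of the rows of X versus leverages from [1, X].
    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 40, 3
    X = rng.normal(size=(n, p))

    mu = X.mean(axis=0)
    S = np.cov(X, rowvar=False)                   # sample covariance (divisor n - 1)
    diff = X - mu
    d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(S), diff)   # squared Mahalanobis distances

    X1 = np.column_stack([np.ones(n), X])         # append a column of 1's
    H = X1 @ np.linalg.solve(X1.T @ X1, X1.T)     # hat matrix
    h = np.diag(H)                                # leverages

    print(np.allclose(d2, (n - 1) * (h - 1 / n))) # True: D^2(x_i) = (n - 1)(h_ii - 1/n)
    ```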

  3. Regression diagnostic - Wikipedia

    en.wikipedia.org/wiki/Regression_diagnostic

    A regression diagnostic may take the form of a graphical result, informal quantitative results or a formal statistical hypothesis test, [2] each of which provides guidance for further stages of a regression analysis.

  4. Studentized residual - Wikipedia

    en.wikipedia.org/wiki/Studentized_residual

    where t is a random variable distributed as Student's t-distribution with ν − 1 degrees of freedom. In fact, this implies that t_i²/ν follows the beta distribution B(1/2, (ν − 1)/2). The distribution above is sometimes referred to as the tau distribution; [2] it was first derived by Thompson in 1935. [3]
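
    A Monte-Carlo sketch of the claim that t_i²/ν follows B(1/2, (ν − 1)/2) for an internally studentized residual, taking ν = n − m residual degrees of freedom; the model, sample size, and number of replications are illustrative assumptions:

    ```python
    # Compare the simulated distribution of t_i^2 / nu with Beta(1/2, (nu - 1)/2).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n, m, reps = 20, 2, 5000
    nu = n - m                                    # residual degrees of freedom
    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    h = np.diag(X @ np.linalg.solve(X.T @ X, X.T))

    samples = []
    for _ in range(reps):
        y = 1.0 + 2.0 * x + rng.normal(size=n)    # errors really are normal here
        e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
        s2 = e @ e / nu
        t0 = e[0] / np.sqrt(s2 * (1 - h[0]))      # internally studentized residual of point 0
        samples.append(t0**2 / nu)

    # Kolmogorov-Smirnov comparison against Beta(1/2, (nu - 1)/2); a large p-value is expected.
    print(stats.kstest(samples, 'beta', args=(0.5, (nu - 1) / 2)))
    ```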

  5. DFFITS - Wikipedia

    en.wikipedia.org/wiki/DFFITS

    Although the raw values resulting from the equations are different, Cook's distance and DFFITS are conceptually identical and there is a closed-form formula to convert one value to the other. [3]
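
    A numerical sketch of that conversion on simulated data: with p fitted parameters, full-sample error variance s², and leave-one-out variance s_(i)², Cook's distance satisfies D_i = DFFITS_i² · s_(i)² / (p s²); the data here are illustrative and the identity is checked directly:

    ```python
    # Cook's distance and DFFITS side by side, plus the closed-form conversion between them.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 30
    x = rng.normal(size=n)
    y = 0.5 + 1.5 * x + rng.normal(size=n)

    X = np.column_stack([np.ones(n), x])
    p = X.shape[1]
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ beta
    h = np.diag(X @ np.linalg.solve(X.T @ X, X.T))
    s2 = e @ e / (n - p)                                   # full-sample error variance
    s2_i = ((n - p) * s2 - e**2 / (1 - h)) / (n - p - 1)   # leave-one-out variance estimates

    cooks_d = e**2 / (p * s2) * h / (1 - h)**2
    t_ext = e / np.sqrt(s2_i * (1 - h))                    # externally studentized residuals
    dffits = t_ext * np.sqrt(h / (1 - h))

    print(np.allclose(cooks_d, dffits**2 * s2_i / (p * s2)))  # True: interconvertible
    ```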

  6. Beta regression - Wikipedia

    en.wikipedia.org/wiki/Beta_regression

    Beta regression is a form of regression which is used when the response variable, y, takes values within (0, 1) and can be assumed to follow a beta distribution. [1] It is generalisable to variables that take values in an arbitrary open interval (a, b) through transformations. [1]
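
    A tiny sketch of the transformation mentioned above; the endpoints a and b and the sample values are hypothetical:

    ```python
    # Rescale a response observed on an open interval (a, b) onto (0, 1),
    # so that a beta distribution can be assumed for the rescaled values.
    import numpy as np

    a, b = 2.0, 10.0                         # assumed interval endpoints
    y = np.array([2.5, 4.0, 7.3, 9.9])       # illustrative responses in (a, b)
    y01 = (y - a) / (b - a)                  # now lies in (0, 1)
    print(y01)
    ```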

  7. Robust Regression and Outlier Detection - Wikipedia

    en.wikipedia.org/wiki/Robust_Regression_and...

    The book has seven chapters. [1] [4] The first is introductory; it describes simple linear regression (in which there is only one independent variable), discusses the possibility of outliers that corrupt either the dependent or the independent variable, provides examples in which outliers produce misleading results, defines the breakdown point, and briefly introduces several methods for robust ...

  8. Projection matrix - Wikipedia

    en.wikipedia.org/wiki/Projection_matrix

    For linear models, the trace of the projection matrix is equal to the rank of the design matrix X, which is the number of independent parameters of the linear model. [8] For other models such as LOESS that are still linear in the observations y, the projection matrix can be used to define the effective degrees of freedom of the model.
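
    A quick numerical sketch of the trace/rank statement for an ordinary least-squares projection matrix; the design matrix is illustrative and deliberately includes one linearly dependent column:

    ```python
    # trace(H) equals rank(X) for the orthogonal projector H onto the column space of X.
    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.normal(size=(25, 4))
    X = np.column_stack([X, X[:, 0] + X[:, 1]])   # 5 columns, but rank 4

    H = X @ np.linalg.pinv(X)                     # projection (hat) matrix onto col(X)
    print(round(np.trace(H), 6), np.linalg.matrix_rank(X))   # -> 4.0 4
    ```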