enow.com Web Search

Search results

  1. Multivariate adaptive regression spline - Wikipedia

    en.wikipedia.org/wiki/Multivariate_adaptive...

    In statistics, multivariate adaptive regression splines (MARS) is a form of regression analysis introduced by Jerome H. Friedman in 1991. [1] It is a non-parametric regression technique and can be seen as an extension of linear models that automatically models nonlinearities and interactions between variables.
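
    As a rough illustration of the hinge basis functions MARS builds, the sketch below fits a piecewise-linear model with a single hand-picked knot; the knot location and the toy data are assumptions for illustration, and the real algorithm selects knots automatically through its forward and backward passes.

    import numpy as np

    # Illustrative only: MARS builds pairs of hinge basis functions
    # max(0, x - t) and max(0, t - x) and chooses the knots t adaptively;
    # here a single knot is fixed by hand to show the basis expansion itself.
    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 200)
    y = np.where(x < 4, 2.0 * x, 8.0 + 0.5 * (x - 4)) + rng.normal(0, 0.3, x.size)

    knot = 4.0  # assumed knot; the full algorithm searches over candidate knots
    basis = np.column_stack([
        np.ones_like(x),            # intercept
        np.maximum(0, x - knot),    # right hinge
        np.maximum(0, knot - x),    # left hinge
    ])
    coef, *_ = np.linalg.lstsq(basis, y, rcond=None)
    print("fitted coefficients:", coef)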

  2. Optimal experimental design - Wikipedia

    en.wikipedia.org/wiki/Optimal_experimental_design

    The earliest optimal designs were developed to estimate the parameters of regression models with continuous variables, for example, by J. D. Gergonne in 1815 (Stigler). In English, two early contributions were made by Charles S. Peirce and Kirstine Smith.

  3. Autoregressive moving-average model - Wikipedia

    en.wikipedia.org/wiki/Autoregressive_moving...

    ARMA is appropriate when a system is a function of a series of unobserved shocks (the MA or moving average part) as well as its own behavior. For example, stock prices may be shocked by fundamental information as well as exhibiting technical trending and mean-reversion effects due to market participants.
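
    A minimal simulation sketch of that idea for an ARMA(1,1) recursion, y_t = φ·y_{t-1} + ε_t + θ·ε_{t-1}, with the parameter values chosen purely for illustration:

    import numpy as np

    # Illustrative ARMA(1,1): the AR term phi*y[t-1] captures the series' own
    # behavior, the MA term theta*eps[t-1] carries over the previous unobserved shock.
    rng = np.random.default_rng(1)
    phi, theta, n = 0.7, 0.4, 500    # assumed parameter values
    eps = rng.normal(0.0, 1.0, n)    # unobserved white-noise shocks
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + eps[t] + theta * eps[t - 1]
    print("sample mean:", y.mean(), "sample std:", y.std())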

  4. Nonparametric regression - Wikipedia

    en.wikipedia.org/wiki/Nonparametric_regression

    In Gaussian process regression, also known as Kriging, a Gaussian prior is assumed for the regression curve. The errors are assumed to have a multivariate normal distribution and the regression curve is estimated by its posterior mode. The Gaussian prior may depend on unknown hyperparameters, which are usually estimated via empirical Bayes. The ...
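
    A minimal numpy sketch of the posterior mean described above, using a squared-exponential kernel with the length-scale and noise level fixed by hand rather than estimated by empirical Bayes:

    import numpy as np

    def rbf(a, b, length=1.0):
        # Squared-exponential (RBF) covariance between two sets of 1-D inputs.
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / length) ** 2)

    rng = np.random.default_rng(2)
    x_train = np.sort(rng.uniform(0, 5, 15))
    y_train = np.sin(x_train) + rng.normal(0, 0.1, x_train.size)
    x_test = np.linspace(0, 5, 100)

    noise = 0.1 ** 2                       # assumed noise variance, not estimated
    K = rbf(x_train, x_train) + noise * np.eye(x_train.size)
    K_star = rbf(x_test, x_train)
    # Posterior mean of the GP (coincides with the posterior mode for a Gaussian posterior).
    mean = K_star @ np.linalg.solve(K, y_train)
    print(mean[:5])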

  5. Central composite design - Wikipedia

    en.wikipedia.org/wiki/Central_composite_design

    In statistics, a central composite design is an experimental design, useful in response surface methodology, for building a second order (quadratic) model for the response variable without needing to use a complete three-level factorial experiment.
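
    For two coded factors, such a design combines the 2² factorial corner points, axial (star) points at distance α from the center, and replicated center points; a small sketch, with α = √2 and the number of center replicates assumed for illustration:

    import numpy as np
    from itertools import product

    # Central composite design for k = 2 coded factors (illustrative values).
    k = 2
    alpha = np.sqrt(2)                  # rotatable choice of alpha for k = 2
    n_center = 3                        # assumed number of center replicates

    factorial = np.array(list(product([-1, 1], repeat=k)))   # corner points
    axial = np.vstack([a * np.eye(k)[i]                       # star points
                       for i in range(k) for a in (-alpha, alpha)])
    center = np.zeros((n_center, k))                          # center runs

    design = np.vstack([factorial, axial, center])
    print(design)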

  6. Recursive least squares filter - Wikipedia

    en.wikipedia.org/wiki/Recursive_least_squares_filter

    The smaller the forgetting factor λ is, the smaller the contribution of previous samples to the covariance matrix. This makes the filter more sensitive to recent samples, which means more fluctuations in the filter coefficients.
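
    A minimal sketch of the recursion with forgetting factor λ, here used to identify a two-tap FIR filter from noisy measurements; the filter length, λ, and initialization are assumptions for illustration:

    import numpy as np

    rng = np.random.default_rng(3)
    true_w = np.array([0.8, -0.3])           # assumed "unknown" 2-tap filter
    lam, delta, n_taps = 0.98, 100.0, 2      # forgetting factor and init scale

    w = np.zeros(n_taps)
    P = delta * np.eye(n_taps)               # inverse correlation matrix estimate
    x = rng.normal(size=500)
    d = np.convolve(x, true_w)[: x.size] + rng.normal(0, 0.05, x.size)

    for t in range(n_taps, x.size):
        u = x[t - n_taps + 1: t + 1][::-1]   # most recent samples first
        k = P @ u / (lam + u @ P @ u)        # gain vector
        e = d[t] - w @ u                     # a priori error
        w = w + k * e                        # weight update
        P = (P - np.outer(k, u @ P)) / lam   # update of the inverse correlation matrix
    print("estimated taps:", w)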

  7. Non-linear least squares - Wikipedia

    en.wikipedia.org/wiki/Non-linear_least_squares

    Consider a set of data points (x_1, y_1), (x_2, y_2), …, (x_m, y_m) and a curve (model function) ŷ = f(x, β) that in addition to the variable x also depends on n parameters β = (β_1, β_2, …, β_n), with m ≥ n. It is desired to find the vector of parameters β such that the curve best fits the given data in the least squares sense, that is, so that the sum of squares S = Σ_{i=1}^{m} r_i² is minimized, where the residuals (in-sample prediction errors) r_i are given by r_i = y_i − f(x_i, β) for i = 1, …, m.
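
    A minimal sketch of the idea using scipy.optimize.curve_fit, which minimizes this sum of squared residuals numerically; the exponential model form and starting values are assumptions for illustration:

    import numpy as np
    from scipy.optimize import curve_fit

    def model(x, a, b):
        # Model function ŷ = f(x, β) with parameters β = (a, b).
        return a * np.exp(-b * x)

    rng = np.random.default_rng(4)
    x_data = np.linspace(0, 4, 40)
    y_data = model(x_data, 2.5, 1.3) + rng.normal(0, 0.05, x_data.size)

    # curve_fit minimizes the sum of squared residuals sum_i (y_i - f(x_i, β))^2.
    beta_hat, cov = curve_fit(model, x_data, y_data, p0=[1.0, 1.0])
    print("estimated parameters:", beta_hat)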

  8. System identification - Wikipedia

    en.wikipedia.org/wiki/System_identification

    The field of system identification uses statistical methods to build mathematical models of dynamical systems from measured data. [1] System identification also includes the optimal design of experiments for efficiently generating informative data for fitting such models as well as model reduction.
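
    A minimal sketch of the idea for a first-order ARX model, y_t = a·y_{t-1} + b·u_{t-1} + e_t, identified by ordinary least squares from simulated input-output data; the model order and the true parameter values are assumptions for illustration:

    import numpy as np

    rng = np.random.default_rng(5)
    a_true, b_true, n = 0.9, 0.5, 300        # assumed "unknown" system
    u = rng.normal(size=n)                   # measured input signal
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = a_true * y[t - 1] + b_true * u[t - 1] + rng.normal(0, 0.02)

    # Regress y_t on (y_{t-1}, u_{t-1}) to identify the parameters from data.
    phi = np.column_stack([y[:-1], u[:-1]])  # regressor matrix
    theta, *_ = np.linalg.lstsq(phi, y[1:], rcond=None)
    print("estimated (a, b):", theta)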