enow.com Web Search

Search results

  1. Distribution (mathematics) - Wikipedia

    en.wikipedia.org/wiki/Distribution_(mathematics)

    Gårding (1997) comments that although the ideas in the transformative book by Schwartz (1951) were not entirely new, it was Schwartz's broad attack and conviction that distributions would be useful almost everywhere in analysis that made the difference. A detailed history of the theory of distributions was given by Lützen (1982).

  2. Generalized function - Wikipedia

    en.wikipedia.org/wiki/Generalized_function

    An influential book on operational calculus was Oliver Heaviside's Electromagnetic Theory of 1899. When the Lebesgue integral was introduced, there was for the first time a notion of generalized function central to mathematics. An integrable function, in Lebesgue's theory, is equivalent to any other which is the same almost everywhere. That ...

  3. Schwartz kernel theorem - Wikipedia

    en.wikipedia.org/wiki/Schwartz_kernel_theorem

    In mathematics, the Schwartz kernel theorem is a foundational result in the theory of generalized functions, published by Laurent Schwartz in 1952. It states, in broad terms, that the generalized functions introduced by Schwartz (Schwartz distributions) have a two-variable theory that includes all reasonable bilinear forms on the space of test functions.
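    A common formulation of the theorem, in standard notation assumed here rather than quoted from the article: for every continuous bilinear form $B$ on $\mathcal{D}(U) \times \mathcal{D}(V)$ there exists a unique distribution $K \in \mathcal{D}'(U \times V)$ such that
        \[ B(\varphi, \psi) = \langle K, \varphi \otimes \psi \rangle \quad \text{for all } \varphi \in \mathcal{D}(U),\ \psi \in \mathcal{D}(V), \]
    so every such bilinear form is represented by a single "kernel" distribution in two variables.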

  4. Distribution (number theory) - Wikipedia

    en.wikipedia.org/wiki/Distribution_(number_theory)

    In algebra and number theory, a distribution is a function on a system of finite sets into an abelian group which is analogous to an integral: it is thus the algebraic analogue of a distribution in the sense of generalised function. The original examples of distributions occur, unnamed, as functions φ on Q/Z satisfying [1]
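    The snippet is cut off before the defining relation. In the standard formulation (an assumption here, not quoted from the result), an ordinary distribution is a function $\varphi$ on $\mathbb{Q}/\mathbb{Z}$ satisfying
        \[ \sum_{r=0}^{N-1} \varphi\!\left(\frac{x + r}{N}\right) = \varphi(x) \quad \text{for all integers } N \ge 1 \text{ and all } x \in \mathbb{Q}/\mathbb{Z}. \]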

  5. Probability theory - Wikipedia

    en.wikipedia.org/wiki/Probability_theory

    The modern mathematical theory of probability has its roots in attempts to analyze games of chance by Gerolamo Cardano in the sixteenth century, and by Pierre de Fermat and Blaise Pascal in the seventeenth century (for example the "problem of points"). [3] Christiaan Huygens published a book on the subject in 1657. [4]

  6. Information geometry - Wikipedia

    en.wikipedia.org/wiki/Information_geometry

    The modern theory is largely due to Shun'ichi Amari, whose work has been greatly influential on the development of the field. [4] Classically, information geometry considered a parametrized statistical model as a Riemannian manifold. For such models, there is a natural choice of Riemannian metric, known as the Fisher information metric.
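    The Fisher information metric mentioned here has the standard coordinate expression (a well-known formula, added for reference rather than taken from the snippet):
        \[ g_{jk}(\theta) = \mathbb{E}_\theta\!\left[ \frac{\partial \log p(X;\theta)}{\partial \theta_j}\, \frac{\partial \log p(X;\theta)}{\partial \theta_k} \right], \]
    where $p(x;\theta)$ is the parametrized density of the statistical model.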

  7. Central moment - Wikipedia

    en.wikipedia.org/wiki/Central_moment

    In probability theory and statistics, a central moment is a moment of a probability distribution of a random variable about the random variable's mean; that is, it is the expected value of a specified integer power of the deviation of the random variable from the mean. The various moments form one set of values by which the properties of a ...
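    In symbols (the standard definition, added for clarity): the $n$-th central moment of a random variable $X$ with mean $\mu = \mathbb{E}[X]$ is
        \[ \mu_n = \mathbb{E}\big[(X - \mu)^n\big], \]
    so the second central moment is the variance.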

  8. Zipf–Mandelbrot law - Wikipedia

    en.wikipedia.org/wiki/Zipf–Mandelbrot_law

    In probability theory and statistics, the Zipf–Mandelbrot law is a discrete probability distribution. Also known as the Pareto–Zipf law, it is a power-law distribution on ranked data, named after the linguist George Kingsley Zipf, who suggested a simpler distribution called Zipf's law, and the mathematician Benoit Mandelbrot, who subsequently generalized it.
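    For reference, the standard parametrization (not taken from the snippet) gives the probability mass function on ranks $k = 1, \dots, N$ as
        \[ f(k; N, q, s) = \frac{1}{H_{N,q,s}} \cdot \frac{1}{(k+q)^{s}}, \qquad H_{N,q,s} = \sum_{i=1}^{N} \frac{1}{(i+q)^{s}}, \]
    which reduces to Zipf's law when $q = 0$.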