enow.com Web Search

Search results

  1. Markov model - Wikipedia

    en.wikipedia.org/wiki/Markov_model

    A Markov decision process is a Markov chain in which state transitions depend on the current state and an action vector that is applied to the system. Typically, a Markov decision process is used to compute a policy of actions that will maximize some utility with respect to expected rewards.
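
    A minimal sketch of how such a policy could be computed with value iteration; the toy states, actions, transitions, and rewards below are invented for illustration and are not from the article:

      # Value iteration on a toy MDP (all states, actions, rewards hypothetical).
      # P[s][a] lists (probability, next_state) pairs; R[s][a] is the reward.
      P = {
          0: {"stay": [(1.0, 0)], "go": [(0.8, 1), (0.2, 0)]},
          1: {"stay": [(1.0, 1)], "go": [(1.0, 0)]},
      }
      R = {0: {"stay": 0.0, "go": 1.0}, 1: {"stay": 2.0, "go": 0.0}}
      gamma = 0.9  # discount factor

      V = {s: 0.0 for s in P}
      for _ in range(200):  # fixed iteration count; enough to converge here
          V = {s: max(R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a])
                      for a in P[s])
               for s in P}

      # Greedy policy: the action maximizing expected discounted return.
      policy = {s: max(P[s], key=lambda a: R[s][a]
                       + gamma * sum(p * V[t] for p, t in P[s][a]))
                for s in P}
      print(V, policy)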

  2. Markov chain - Wikipedia

    en.wikipedia.org/wiki/Markov_chain

    In probability theory and statistics, a Markov chain or Markov process is a stochastic process describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
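
    As a concrete illustration of that definition, here is a minimal Python sketch that simulates a chain from a transition matrix; the two-state "weather" chain and its probabilities are made up:

      import random

      # Hypothetical transition probabilities; each row sums to 1.
      T = {"sunny": {"sunny": 0.9, "rainy": 0.1},
           "rainy": {"sunny": 0.5, "rainy": 0.5}}

      def step(state):
          # The next state depends only on the current state (Markov property).
          r, acc = random.random(), 0.0
          for nxt, p in T[state].items():
              acc += p
              if r < acc:
                  return nxt
          return nxt  # guard against floating-point round-off

      state, path = "sunny", ["sunny"]
      for _ in range(10):
          state = step(state)
          path.append(state)
      print(path)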

  3. Markovian arrival process - Wikipedia

    en.wikipedia.org/wiki/Markovian_arrival_process

    In queueing theory, a discipline within the mathematical theory of probability, a Markovian arrival process (MAP or MArP [1]) is a mathematical model for the time between job arrivals to a system. The simplest such process is a Poisson process, where the time between each arrival is exponentially distributed.
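
    For the Poisson special case mentioned above, a sketch of sampling arrival times from exponentially distributed gaps; the rate and time horizon are arbitrary:

      import random

      rate = 2.0     # arbitrary arrival rate (events per unit time)
      horizon = 5.0  # simulate arrivals over [0, horizon)

      # In a Poisson process, gaps between arrivals are i.i.d. Exponential(rate).
      t, arrivals = 0.0, []
      while True:
          t += random.expovariate(rate)  # exponential interarrival time
          if t >= horizon:
              break
          arrivals.append(t)
      print(arrivals)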

  4. Markov property - Wikipedia

    en.wikipedia.org/wiki/Markov_property

    A process with this property is said to be Markov or Markovian and is known as a Markov process. Two famous classes of Markov process are the Markov chain and Brownian motion. Note that there is a subtle and important point that is often overlooked in the plain-English statement of the definition: namely, that the state space of ...
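
    In symbols, for a discrete-time process the Markov property is the standard conditional-independence statement (textbook form, not quoted from the article):

      \Pr(X_{n+1} = x \mid X_n = x_n, \ldots, X_0 = x_0) = \Pr(X_{n+1} = x \mid X_n = x_n)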

  5. Stochastic process - Wikipedia

    en.wikipedia.org/wiki/Stochastic_process

    Markov processes are stochastic processes, traditionally in discrete or continuous time, that have the Markov property: the next value of the process depends only on the current value and is conditionally independent of all earlier values. In other words, the behavior of the process in the future is ...

  6. Gauss–Markov process - Wikipedia

    en.wikipedia.org/wiki/Gauss–Markov_process

    Gauss–Markov stochastic processes (named after Carl Friedrich Gauss and Andrey Markov) are stochastic processes that satisfy the requirements for both Gaussian processes and Markov processes. [1][2] A stationary Gauss–Markov process is unique up to rescaling; such a process is also known as an Ornstein–Uhlenbeck process.
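
    A minimal sketch of simulating an Ornstein–Uhlenbeck process with the Euler–Maruyama discretization; the parameter values here are arbitrary:

      import math
      import random

      theta, mu, sigma = 1.0, 0.0, 0.3  # arbitrary reversion rate, mean, volatility
      dt, n = 0.01, 1000

      # dX = theta * (mu - X) dt + sigma dW, discretized step by step.
      x, path = 0.5, [0.5]
      for _ in range(n):
          x += theta * (mu - x) * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
          path.append(x)
      print(path[-1])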

  7. Markov decision process - Wikipedia

    en.wikipedia.org/wiki/Markov_decision_process

    The "Markov" in "Markov decision process" refers to the underlying structure of state transitions that still follow the Markov property. The process is called a "decision process" because it involves making decisions that influence these state transitions, extending the concept of a Markov chain into the realm of decision-making under uncertainty.

  8. Discrete-time Markov chain - Wikipedia

    en.wikipedia.org/wiki/Discrete-time_Markov_chain

    (Figure: a Markov chain with two states, A and E.) In probability, a discrete-time Markov chain (DTMC) is a sequence of random variables, known as a stochastic process, in which the value of the next variable depends only on the value of the current variable, and not on any variables in the past.
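
    For a two-state chain like the A/E example in the figure caption, the stationary distribution has a closed form; a sketch with invented transition probabilities:

      # Two-state chain on {A, E}; p and q are made-up transition probabilities.
      p = 0.3  # P(A -> E)
      q = 0.4  # P(E -> A)

      # Solving pi = pi P for the 2x2 matrix [[1-p, p], [q, 1-q]] gives:
      pi_A = q / (p + q)  # long-run fraction of time spent in A
      pi_E = p / (p + q)  # long-run fraction of time spent in E
      print(pi_A, pi_E)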