enow.com Web Search

Search results

  1. Stopping time - Wikipedia

    en.wikipedia.org/wiki/Stopping_time

    Example of a stopping time: a hitting time of Brownian motion. The process starts at 0 and is stopped as soon as it hits 1. In probability theory, in particular in the study of stochastic processes, a stopping time (also Markov time, Markov moment, optional stopping time or optional time [1]) is a specific type of “random time”: a random variable whose value is interpreted as the time at ...
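
    The hitting time in this example is easy to simulate. A minimal sketch, assuming an Euler discretization of standard Brownian motion with step dt (the function name and parameters are illustrative, not from the article):

        import numpy as np

        def hitting_time(level=1.0, dt=1e-3, t_max=100.0, rng=None):
            """Simulate Brownian motion started at 0 and return the first time
            it reaches `level` (a stopping time), or None if t_max is hit first."""
            rng = rng or np.random.default_rng()
            x, t = 0.0, 0.0
            while t < t_max:
                x += np.sqrt(dt) * rng.standard_normal()  # increment ~ N(0, dt)
                t += dt
                if x >= level:
                    return t  # decided using only the path observed up to time t
            return None

        print(hitting_time())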

  2. Markov property - Wikipedia

    en.wikipedia.org/wiki/Markov_property

    The term strong Markov property is similar to the Markov property, except that the meaning of "present" is defined in terms of a random variable known as a stopping time. The term Markov assumption is used to describe a model where the Markov property is assumed to hold, such as a hidden Markov model.

  3. Markov chain - Wikipedia

    en.wikipedia.org/wiki/Markov_chain

    Markov chains and continuous-time Markov processes are useful in chemistry when physical systems closely approximate the Markov property. For example, imagine a large number n of molecules in solution in state A, each of which can undergo a chemical reaction to state B with a certain average rate. Perhaps the molecule is an enzyme, and the ...
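
    As a hedged illustration of the chemistry example above, the A -> B conversion of n independent molecules with a fixed average rate can be sketched with exponential (memoryless) reaction times; the rate and function name below are arbitrary choices, not values from the article:

        import numpy as np

        def fraction_in_A(n=10_000, rate=0.5, t=2.0, rng=None):
            """Each molecule converts A -> B after an Exp(rate) waiting time,
            independently of the others.  Returns the fraction still in state A
            at time t, which should be close to exp(-rate * t)."""
            rng = rng or np.random.default_rng()
            conversion_times = rng.exponential(scale=1.0 / rate, size=n)
            return float(np.mean(conversion_times > t))

        print(fraction_in_A(), np.exp(-0.5 * 2.0))  # empirical vs. exact survival probability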

  4. Markov model - Wikipedia

    en.wikipedia.org/wiki/Markov_model

    In a Markov chain, state depends only on the previous state in time, whereas in a Markov random field, each state depends on its neighbors in any of multiple directions. A Markov random field may be visualized as a field or graph of random variables, where the distribution of each random variable depends on the neighboring variables with which ...
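
    A minimal sketch of the neighbour dependence described above, assuming a binary (Ising-like) Markov random field on a periodic grid; the coupling value and helper name are illustrative only:

        import numpy as np

        def gibbs_sweep(field, coupling=1.0, rng=None):
            """One Gibbs-sampling sweep over a grid of +/-1 variables: each site is
            resampled conditioned only on its four nearest neighbours, which is the
            'depends on its neighbours' structure of a Markov random field."""
            rng = rng or np.random.default_rng()
            h, w = field.shape
            for i in range(h):
                for j in range(w):
                    s = (field[(i - 1) % h, j] + field[(i + 1) % h, j]
                         + field[i, (j - 1) % w] + field[i, (j + 1) % w])
                    p_up = 1.0 / (1.0 + np.exp(-2.0 * coupling * s))
                    field[i, j] = 1 if rng.random() < p_up else -1
            return field

        rng = np.random.default_rng(0)
        field = rng.choice([-1, 1], size=(16, 16))
        gibbs_sweep(field, rng=rng)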

  5. Markov renewal process - Wikipedia

    en.wikipedia.org/wiki/Markov_renewal_process

    The process is Markovian only at the specified jump instants, justifying the name semi-Markov. [1] [2] [3] (See also: hidden semi-Markov model.) A semi-Markov process in which all the holding times are exponentially distributed is called a continuous-time Markov chain. In other words, if the inter-arrival ...
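
    The equivalence stated above can be made concrete: simulating a semi-Markov process whose holding times are all exponential yields a continuous-time Markov chain. A sketch under assumed, illustrative parameters (the embedded jump matrix P and exit rates below are made up):

        import numpy as np

        def simulate_ctmc(P, rates, start=0, t_end=10.0, rng=None):
            """Hold in each state for an exponential time with that state's rate,
            then jump according to the embedded discrete-time chain P.  With
            exponential holding times this is a continuous-time Markov chain."""
            rng = rng or np.random.default_rng()
            t, state, path = 0.0, start, [(0.0, start)]
            while t < t_end:
                t += rng.exponential(1.0 / rates[state])    # exponential holding time
                state = rng.choice(len(rates), p=P[state])  # embedded jump chain
                path.append((t, state))
            return path

        P = np.array([[0.0, 0.5, 0.5],
                      [1.0, 0.0, 0.0],
                      [0.3, 0.7, 0.0]])
        rates = np.array([1.0, 2.0, 0.5])
        print(simulate_ctmc(P, rates)[:5])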

  6. Stochastic process - Wikipedia

    en.wikipedia.org/wiki/Stochastic_process

    The Brownian motion process and the Poisson process (in one dimension) are both examples of Markov processes [193] in continuous time, while random walks on the integers and the gambler's ruin problem are examples of Markov processes in discrete time. [194] [195] A Markov chain is a type of Markov process that has either discrete state space or ...
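
    As a small, hedged example of the discrete-time case mentioned above, here is the gambler's ruin walk on the integers; the stake, target and win probability are arbitrary illustrative values:

        import numpy as np

        def gamblers_ruin(stake=5, target=10, p_win=0.5, rng=None):
            """Random walk on {0, ..., target}: +1 with probability p_win, else -1,
            stopped at 0 (ruin) or at target.  The next step depends only on the
            current fortune, so this is a discrete-time Markov chain."""
            rng = rng or np.random.default_rng()
            x = stake
            while 0 < x < target:
                x += 1 if rng.random() < p_win else -1
            return x == target

        wins = sum(gamblers_ruin() for _ in range(10_000))
        print(wins / 10_000)  # for a fair game this should be near stake / target = 0.5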

  7. Markovian arrival process - Wikipedia

    en.wikipedia.org/wiki/Markovian_arrival_process

    The Markov-modulated Poisson process or MMPP is a Markovian arrival process (MAP) in which an underlying continuous-time Markov chain switches between m Poisson processes. [8] If each of the m Poisson processes has rate λ_i and the modulating continuous-time Markov chain has m × m transition rate matrix R, then the MAP representation is D_1 = diag(λ_1, …, λ_m), D_0 = R − D_1.
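
    A hedged simulation sketch of an MMPP: a continuous-time Markov chain with rate matrix R selects the current phase, and arrivals occur as a Poisson process with that phase's rate. The two-phase R and the rates below are invented for illustration:

        import numpy as np

        def simulate_mmpp(R, lam, t_end=10.0, rng=None):
            """Return arrival times of a Markov-modulated Poisson process whose
            modulating chain has rate matrix R and whose phase i emits arrivals
            at rate lam[i]."""
            rng = rng or np.random.default_rng()
            R, lam = np.asarray(R, float), np.asarray(lam, float)
            phase, t, arrivals = 0, 0.0, []
            while t < t_end:
                stay = rng.exponential(1.0 / -R[phase, phase])  # time spent in current phase
                window = min(stay, t_end - t)
                n = rng.poisson(lam[phase] * window)            # arrivals while in this phase
                arrivals.extend(np.sort(t + rng.uniform(0.0, window, size=n)))
                t += stay
                jump = np.clip(R[phase], 0.0, None)             # off-diagonal jump rates
                phase = rng.choice(len(lam), p=jump / jump.sum())
            return arrivals

        R = np.array([[-0.5,  0.5],
                      [ 1.0, -1.0]])
        print(len(simulate_mmpp(R, lam=[1.0, 5.0])))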

  8. Continuous-time Markov chain - Wikipedia

    en.wikipedia.org/wiki/Continuous-time_Markov_chain

    Another discrete-time process that may be derived from a continuous-time Markov chain is a δ-skeleton: the (discrete-time) Markov chain formed by observing X(t) at intervals of δ units of time. The random variables X(0), X(δ), X(2δ), ... give the sequence of states visited by the δ-skeleton.
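
    The δ-skeleton has a simple closed form: if Q is the chain's transition rate matrix, the skeleton's one-step transition matrix is the matrix exponential exp(δQ). A minimal sketch with an arbitrary 3-state generator (not taken from the article):

        import numpy as np
        from scipy.linalg import expm

        Q = np.array([[-2.0,  1.0,  1.0],   # generator of a continuous-time Markov chain
                      [ 0.5, -1.0,  0.5],
                      [ 1.0,  1.0, -2.0]])
        delta = 0.25

        P_skeleton = expm(delta * Q)        # transition matrix of X(0), X(δ), X(2δ), ...
        print(P_skeleton)
        print(P_skeleton.sum(axis=1))       # each row sums to 1: a valid stochastic matrix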