enow.com Web Search

Search results

  1. Markov Chains and Mixing Times - Wikipedia

    en.wikipedia.org/wiki/Markov_Chains_and_Mixing_Times

    The mixing time of a Markov chain is the number of steps needed for this convergence to happen, to a suitable degree of accuracy. A family of Markov chains is said to be rapidly mixing if the mixing time is a polynomial function of some size parameter of the Markov chain, and slowly mixing otherwise. This book is about finite Markov chains ...
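
    A minimal sketch of that definition in code, assuming a small made-up 3-state transition matrix (not from the book): compute the worst-case total variation distance to the stationary distribution after t steps, and report the first t at which it falls below ε.

    ```python
    import numpy as np

    # Hypothetical irreducible, aperiodic 3-state chain (rows sum to 1).
    P = np.array([[0.5, 0.3, 0.2],
                  [0.2, 0.6, 0.2],
                  [0.3, 0.3, 0.4]])

    # Stationary distribution pi: left eigenvector of P for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.isclose(w, 1.0)][:, 0])
    pi /= pi.sum()

    def mixing_time(P, pi, eps=0.25, t_max=1000):
        """First t with max_x TV(P^t(x, .), pi) <= eps."""
        Pt = np.eye(len(pi))
        for t in range(1, t_max + 1):
            Pt = Pt @ P
            # Worst case over starting states x of the TV distance to pi.
            if 0.5 * np.abs(Pt - pi).sum(axis=1).max() <= eps:
                return t
        return None

    print(mixing_time(P, pi))
    ```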

  2. Markov chain - Wikipedia

    en.wikipedia.org/wiki/Markov_chain

    Another discrete-time process that may be derived from a continuous-time Markov chain is a δ-skeleton—the (discrete-time) Markov chain formed by observing X(t) at intervals of δ units of time. The random variables X(0), X(δ), X(2δ), ... give the sequence of states visited by the δ-skeleton.

  3. Continuous-time Markov chain - Wikipedia

    en.wikipedia.org/wiki/Continuous-time_Markov_chain

    Another discrete-time process that may be derived from a continuous-time Markov chain is a δ-skeleton—the (discrete-time) Markov chain formed by observing X(t) at intervals of δ units of time. The random variables X(0), X(δ), X(2δ), ... give the sequence of states visited by the δ-skeleton.
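
    One concrete way to see the δ-skeleton, sketched under the assumption that the continuous-time chain is specified by a generator matrix Q (the 2-state Q below is invented for illustration): the skeleton X(0), X(δ), X(2δ), ... is itself a discrete-time Markov chain whose one-step transition matrix is the matrix exponential of δQ.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Hypothetical generator of a 2-state CTMC (rows sum to 0).
    Q = np.array([[-1.0,  1.0],
                  [ 2.0, -2.0]])
    delta = 0.5

    # The delta-skeleton X(0), X(delta), X(2*delta), ... is a DTMC
    # with one-step transition matrix P_delta = exp(delta * Q).
    P_delta = expm(delta * Q)
    print(P_delta)              # stochastic: each row sums to 1
    print(P_delta.sum(axis=1))
    ```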

  4. Markov chain mixing time - Wikipedia

    en.wikipedia.org/wiki/Markov_chain_mixing_time

    In probability theory, the mixing time of a Markov chain is the time until the Markov chain is "close" to its steady-state distribution. More precisely, a fundamental result about Markov chains is that a finite-state irreducible aperiodic chain has a unique stationary distribution π and, regardless of the initial state, the time-t distribution of the chain converges to π as t tends to infinity.
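
    "Close" here is usually measured in total variation distance; the result quoted above is commonly written as follows (the notation d(t) and t_mix(ε) is standard but assumed here, not taken from the page):

    ```latex
    d(t) = \max_{x} \bigl\| P^{t}(x,\cdot) - \pi \bigr\|_{\mathrm{TV}}
         = \max_{x} \frac{1}{2} \sum_{y} \bigl| P^{t}(x,y) - \pi(y) \bigr|
         \longrightarrow 0 \quad (t \to \infty),
    \qquad
    t_{\mathrm{mix}}(\varepsilon) = \min \{\, t : d(t) \le \varepsilon \,\}.
    ```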

  5. Discrete-time Markov chain - Wikipedia

    en.wikipedia.org/wiki/Discrete-time_Markov_chain

    A Markov chain with two states, A and E. In probability, a discrete-time Markov chain (DTMC) is a sequence of random variables, known as a stochastic process, in which the value of the next variable depends only on the value of the current variable, and not on any variables in the past.
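
    A minimal simulation of such a two-state chain (the transition probabilities between A and E below are assumed for illustration; they are not in the snippet). The Markov property is visible in the loop: the next state is drawn from a row of P selected by the current state alone, with no reference to earlier states.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    states = ["A", "E"]
    P = np.array([[0.6, 0.4],   # assumed probabilities from A
                  [0.3, 0.7]])  # assumed probabilities from E

    x = 0  # start in state A
    path = [states[x]]
    for _ in range(10):
        # The next state depends only on the current state x, not on the past.
        x = rng.choice(2, p=P[x])
        path.append(states[x])
    print(" -> ".join(path))
    ```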

  6. Markov renewal process - Wikipedia

    en.wikipedia.org/wiki/Markov_renewal_process

    The process is Markovian only at the specified jump instants, justifying the name semi-Markov.[1][2][3] (See also: hidden semi-Markov model.) A semi-Markov process (defined in the above bullet point) in which all the holding times are exponentially distributed is called a continuous-time Markov chain. In other words, if the inter-arrival ...
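
    A sketch of that characterization, with made-up rates: hold in the current state for an exponentially distributed time, then jump according to an embedded discrete chain. With exponential holding times, this semi-Markov construction is exactly a continuous-time Markov chain.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    rates = np.array([1.0, 2.0])    # assumed holding-time rates per state
    jump = np.array([[0.0, 1.0],    # embedded jump chain (no self-jumps)
                     [1.0, 0.0]])

    t, x, horizon = 0.0, 0, 5.0
    while t < horizon:
        # The exponential holding time is what makes this a CTMC
        # rather than a general semi-Markov process.
        t += rng.exponential(1.0 / rates[x])
        print(f"state {x} occupied until t = {min(t, horizon):.3f}")
        x = rng.choice(2, p=jump[x])
    ```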

  7. Hitting time - Wikipedia

    en.wikipedia.org/wiki/Hitting_time

    The hitting times and stopping times of three samples of Brownian motion. In the study of stochastic processes in mathematics, a hitting time (or first hit time) is the first time at which a given process "hits" a given subset of the state space. Exit times and return times are also examples of hitting times.
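
    A Monte Carlo sketch with an invented example (simple symmetric random walk from 0, target set {-3, 3}): the hitting time is the first step at which the walk lands in the target set; since that set is the boundary of the interval (-3, 3), this hitting time is also an exit time.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def hitting_time(target, max_steps=100_000):
        """First time a simple symmetric random walk from 0 enters `target`."""
        x = 0
        for t in range(1, max_steps + 1):
            x += rng.choice([-1, 1])
            if x in target:
                return t
        return None  # not hit within the cap

    samples = [hitting_time({-3, 3}) for _ in range(1000)]
    print(np.mean(samples))  # expected exit time of (-3, 3) from 0 is 3**2 = 9
    ```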

  8. Stopping time - Wikipedia

    en.wikipedia.org/wiki/Stopping_time

    Example of a stopping time: a hitting time of Brownian motion. The process starts at 0 and is stopped as soon as it hits 1. In probability theory, in particular in the study of stochastic processes, a stopping time (also Markov time, Markov moment, optional stopping time or optional time[1]) is a specific type of “random time”: a random variable whose value is interpreted as the time at ...
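
    A sketch of exactly that example, with an assumed discretization step: a Brownian path started at 0 and stopped the first time it reaches 1. The stopping-time property is visible in the loop, since the decision to stop at step n uses only the path up to step n.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def first_hit_of_one(dt=1e-3, max_steps=1_000_000):
        """Discretized Brownian motion from 0, stopped on first reaching 1.
        Stopping is decided from the path so far only (a stopping time)."""
        b = 0.0
        for n in range(1, max_steps + 1):
            b += rng.normal(0.0, np.sqrt(dt))
            if b >= 1.0:
                return n * dt
        return None  # the hit occurs a.s., but its expected time is infinite

    print(first_hit_of_one())
    ```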