
Markov chain


https://en.wikipedia.org/wiki/Markov_chain


https://zh.wikipedia.org/wiki/马尔科夫链

In probability theory and related fields, a Markov process, named after the Russian mathematician Andrey Markov, is a stochastic process that satisfies the Markov property[1][2] (sometimes characterized as "memorylessness"). Loosely speaking, a process satisfies the Markov property if one can make predictions for the future of the process based solely on its present state just as well as one could knowing the process's full history, hence independently of such history; i.e., conditional on the present state of the system, its future and past states are independent.

A Markov chain is a type of Markov process that has either a discrete state space or a discrete index set (often representing time), but the precise definition of a Markov chain varies.[3] For example, it is common to define a Markov chain as a Markov process in either discrete or continuous time with a countable state space (thus regardless of the nature of time),[4][5][6][7] but it is also common to define a Markov chain as having discrete time in either a countable or a continuous state space (thus regardless of the state space).[3]

 

A Markov chain, also called a discrete-time Markov chain (DTMC[1]), is named after the Russian mathematician Andrey Markov (Russian: Андрей Андреевич Марков). It is a stochastic process that moves through a state space by transitions from one state to another. The process is required to be "memoryless": the probability distribution of the next state depends only on the current state, and the events that preceded it in the sequence are irrelevant. This specific kind of "memorylessness" is called the Markov property. Markov chains have many applications as statistical models of real-world processes.
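As a concrete illustration of the "memoryless" property described above, here is a minimal sketch of simulating a discrete-time Markov chain from a transition matrix. The two weather-like states and the probabilities in the matrix are made-up examples, not taken from the articles above.

```python
# Minimal sketch of a discrete-time Markov chain (DTMC).
# The states (0 = "sunny", 1 = "rainy") and the transition matrix are
# hypothetical examples chosen only for illustration.
import random

# Transition matrix P: P[i][j] is the probability of moving from state i
# to state j. Each row must sum to 1.
P = [
    [0.9, 0.1],   # from sunny: 90% stay sunny, 10% become rainy
    [0.5, 0.5],   # from rainy: 50% become sunny, 50% stay rainy
]

def step(state):
    """Draw the next state using only the current state (the Markov property)."""
    r = random.random()
    cumulative = 0.0
    for next_state, p in enumerate(P[state]):
        cumulative += p
        if r < cumulative:
            return next_state
    return len(P[state]) - 1  # guard against floating-point round-off

def simulate(start, n_steps):
    """Generate a trajectory of n_steps transitions starting from `start`."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate(0, 10))  # e.g. [0, 0, 0, 1, 1, 0, ...]
```

Note that `step` looks only at the current state; no history is stored or consulted, which is exactly the memorylessness the definition above describes.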

At each step of a Markov chain, the system either moves from one state to another or stays in its current state, according to a probability distribution. A change of state is called a transition, and the probabilities associated with the different state changes are called transition probabilities. A random walk is an example of a Markov chain: at each step the state is a node of a graph, and the walk can move to any adjacent node, each neighbour being chosen with equal probability (regardless of the path taken so far).
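The random-walk example can be sketched in a few lines as well. The four-node graph below is a hypothetical example; at every step the walk jumps to a uniformly chosen neighbour of the current node, so the next position depends only on the current one.

```python
# Sketch of a simple random walk on an undirected graph, illustrating the
# random-walk example above; the 4-node cycle graph is hypothetical.
import random

# Adjacency list: each node maps to its neighbours.
graph = {
    "A": ["B", "D"],
    "B": ["A", "C"],
    "C": ["B", "D"],
    "D": ["C", "A"],
}

def random_walk(start, n_steps):
    """At every step, jump to a uniformly chosen neighbour of the current node.
    The choice depends only on the current node, never on the path so far,
    so the walk is a Markov chain on the graph's vertices."""
    node = start
    path = [node]
    for _ in range(n_steps):
        node = random.choice(graph[node])
        path.append(node)
    return path

print(random_walk("A", 8))  # e.g. ['A', 'B', 'C', 'B', 'A', 'D', ...]
```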
