A Markov process whose time parameter takes discrete values. Synonym: Markov chain.
Words linked to "Markoff chain": Markoff process, Markov process, Markov chain
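The definition above can be illustrated with a minimal sketch of a discrete-time Markov chain: the next state depends only on the current state, via a fixed table of transition probabilities. The two-state weather chain and its probabilities below are hypothetical examples, not part of the dictionary entry.

```python
import random

# Hypothetical two-state chain: each row gives the transition
# probabilities out of one state (rows sum to 1).
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def walk(start, n, seed=0):
    """Simulate n steps of the chain from `start`, returning the state path."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(n):
        # The Markov property: the next state is sampled using only
        # the current state's transition row.
        states, probs = zip(*TRANSITIONS[state])
        state = rng.choices(states, weights=probs, k=1)[0]
        path.append(state)
    return path
```

For example, `walk("sunny", 10)` returns a list of 11 states, starting at `"sunny"`, where each step was drawn from the current state's row of the table.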