MARKOV CHAIN
Dictionary entry overview: What does Markov chain mean?
• MARKOV CHAIN (noun)
The noun MARKOV CHAIN has 1 sense:
1. a Markov process in which the parameter takes discrete time values
Familiarity information: MARKOV CHAIN used as a noun is very rare.
Dictionary entry details
Sense 1
Meaning:
A Markov process in which the parameter takes discrete time values
Classified under:
Nouns denoting natural processes
Synonyms:
Markoff chain; Markov chain
Hypernyms ("Markov chain" is a kind of...):
Markoff process; Markov process (a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state)
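The defining property above (the distribution of future states depends only on the present state) can be illustrated with a small simulation. The following is a minimal sketch, not part of the dictionary entry: a hypothetical two-state weather chain with made-up transition probabilities, stepped forward in discrete time.

```python
import random

# Hypothetical transition table (assumed for illustration): each row maps
# the current state to (next_state, probability) pairs summing to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state using only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Run the chain for n discrete time steps from the given start state."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

Because the parameter (time) advances in discrete steps and each transition looks only at the current state, this qualifies as a Markov chain rather than a general continuous-time Markov process.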