Is a Markov chain the same as a finite state machine?

Is a finite state machine just an implementation of a Markov chain? What are the differences between the two?

Conlin answered 2/2, 2011 at 21:52 Comment(2)
You may think of a Markov chain as an FSM in which transitions are probability-driven. – Focalize
Please read these papers: Links between Probabilistic Automata and Hidden Markov Models (by Pierre Dupont) info.ucl.ac.be/~pdupont/pdupont/pdf/HMM_PA_pres_n4.pdf [The Handbook of Brain Theory and Neural Networks] Hidden Markov Models and other Finite State Automata for Sequence Processing citeseerx.ist.psu.edu/viewdoc/… – Karlee

Markov chains can be represented by finite state machines. The idea is that a Markov chain describes a process in which the transition to a state at time t+1 depends only on the state at time t. The main thing to keep in mind is that the transitions in a Markov chain are probabilistic rather than deterministic, which means that you can't always say with perfect certainty what will happen at time t+1.

The Wikipedia article on finite-state machines has a subsection on finite Markov-chain processes; I'd recommend reading it for more information. The Wikipedia article on Markov chains also has a brief passage describing the use of finite state machines to represent a Markov chain. It states:

A finite state machine can be used as a representation of a Markov chain. Assuming a sequence of independent and identically distributed input signals (for example, symbols from a binary alphabet chosen by coin tosses), if the machine is in state y at time n, then the probability that it moves to state x at time n + 1 depends only on the current state.
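
To make the distinction concrete, here is a minimal sketch (the weather states and probabilities are invented for illustration, not taken from the answer) of a Markov chain run as a state machine whose next state is sampled from a transition table rather than fixed:

```python
import random

# Sketch of a Markov chain run as a state machine with probabilistic
# transitions. The states and numbers are invented for illustration.
# Row-stochastic table: P(next state | current state).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state; it depends only on the current state."""
    dist = transitions[state]
    return random.choices(list(dist), weights=list(dist.values()), k=1)[0]

state = "sunny"
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(path)  # e.g. ['sunny', 'sunny', 'rainy', ...] -- differs on every run
```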

Cavie answered 2/2, 2011 at 21:56 Comment(3)
Actually, what you are claiming here about a Markov chain is not 100% correct. What you refer to here is a first-order Markov process. For a second-order Markov process, the next state depends on the states of the latest two time steps, and so on. A state machine is a special case of a Markov chain, since a Markov chain is stochastic in nature, whereas a state machine, as far as I know, is deterministic. – Farseeing
Unqualified, the term Markov chain means a discrete-time stochastic process with the Markov property, i.e. the next state depends only on the current state, not on earlier ones. The original poster didn't ask about higher-order Markov processes, so they aren't really relevant here. Finite state machine is generally a catch-all term for a finite automaton; these can be either deterministic or non-deterministic in nature. – Rowdyish
What is the quote saying about IID? Where is this IID assumption required? It's kind of confusing. In any case, to clarify, a Markov chain will not produce IID output in general. Note that the quoted passage has since been removed from the Wikipedia article; the editor didn't explain their reasoning. – Marcin

Whilst a Markov chain is a finite state machine, it is distinguished by its transitions being stochastic, i.e. random, and described by probabilities.
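
As a hedged illustration of that distinction (the turnstile states, inputs and probabilities below are made up): a deterministic FSM maps a (state, input) pair to exactly one successor, while a Markov chain maps a state to a probability distribution over successors:

```python
import random

# Hypothetical turnstile example; the states, inputs and probabilities are
# made up for illustration.

# Deterministic FSM: each (state, input) pair has exactly one successor.
fsm = {
    ("locked", "coin"): "unlocked",
    ("locked", "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}

# Markov chain: each state has a probability distribution over successors.
chain = {
    "locked":   {"unlocked": 0.3, "locked": 0.7},
    "unlocked": {"locked": 0.9, "unlocked": 0.1},
}

print(fsm[("locked", "coin")])  # always "unlocked"

dist = chain["locked"]
nxt = random.choices(list(dist), weights=list(dist.values()), k=1)[0]
print(nxt)  # "unlocked" about 30% of the time, "locked" about 70%
```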

Istria answered 2/2, 2011 at 21:56 Comment(1)
Can I say "stochastic finite state automaton"? – Dingdong

The two are similar, but the other explanations here are slightly wrong. Only FINITE Markov chains can be represented by an FSM; Markov chains allow for an infinite state space. As was pointed out, the transitions of a Markov chain are described by probabilities, but it is also important to mention that the transition probabilities can only depend on the current state. Without this restriction, it would just be called a "discrete-time stochastic process".
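
A small sketch of the point about infinite state spaces (a standard example, not taken from the answer): a simple random walk on the integers satisfies the Markov property, yet its state space is countably infinite, so no finite state machine can represent it:

```python
import random

# A simple random walk on the integers: the next state depends only on the
# current state (Markov property), but the state space is all of Z, so no
# finite state machine can enumerate it.
state = 0
for _ in range(20):
    state += random.choice([-1, +1])
print(state)
```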

Rowdyish answered 5/8, 2011 at 16:55 Comment(4)
Actually, I believe it would be called "non-stationary". – Invocation
@Michael I might be wrong, because I have been out of this topic for a while, but I thought "stationary" was about time dependence. I might be mistaken, but that seems orthogonal. – Rowdyish
"process" is typically used to express continuous time version of the term "chain" (reference: probability theory: a concise course, Rozanov) and a FSM can be represented infinitely or event-driven or non-deterministic. The only other dependence I can imagine besides state would be time.Invocation
@Michael "process" is a generic term. It can be continuous time or discrete time. An FSM can't be represented infinitely it has the word finite in its name. The link you provided even says that is not a finite state machine. You brought up time dependence not me, but in discrete time processes the sequence index is normally considered the time. In that sense, yes a discrete time stochastic process would be non-stationary, but that is not descriptive enough, since it could also potentially be continuous time. I was going for a superset, not the complement in my naming.Rowdyish

I believe this should answer your question:

https://en.wikipedia.org/wiki/Probabilistic_automaton

And you are on to the right idea: they are almost the same, with subsets, supersets and modifications depending on what adjective describes the chain or the automaton. Automata typically take an input as well, but I am sure there have been papers utilizing "Markov chains" with inputs.
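
For illustration only (the states, inputs and probabilities below are assumptions, not taken from the linked article), a probabilistic automaton differs from a plain Markov chain in that the distribution over next states depends on the current state and an input symbol:

```python
import random

# Hypothetical probabilistic automaton: the distribution over next states
# depends on the current state *and* an input symbol. All names and numbers
# are made up for illustration.
automaton = {
    ("idle", "go"):      {"running": 0.9, "idle": 0.1},
    ("idle", "stop"):    {"idle": 1.0},
    ("running", "go"):   {"running": 1.0},
    ("running", "stop"): {"idle": 0.8, "running": 0.2},
}

def step(state, symbol):
    dist = automaton[(state, symbol)]
    return random.choices(list(dist), weights=list(dist.values()), k=1)[0]

state = "idle"
for symbol in ["go", "go", "stop"]:
    state = step(state, symbol)
    print(symbol, "->", state)
```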

Think Gaussian distribution vs. normal distribution: same idea, different fields. Automata belong to computer science, Markov belongs to probability and statistics.

Invocation answered 3/7, 2018 at 16:21 Comment(0)

I think most of the answers are not appropriate. A Markov process is generated by a (probabilistic) finite state machine, but not every process generated by a probabilistic finite state machine is a Markov process. For example, hidden Markov processes are basically the same as processes generated by probabilistic finite state machines, but not every hidden Markov process is a Markov process.
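
A rough sketch of the kind of process described here (all probabilities are made up): a two-state hidden Markov chain emits symbols, and the emitted sequence on its own is in general not Markov, because the hidden state carries memory that a single observed symbol does not:

```python
import random

# Hypothetical two-state hidden Markov model; all probabilities are made up.
# The hidden state sequence is Markov; the emitted symbol sequence, viewed on
# its own, generally is not, because the hidden state carries extra memory.
hidden_transitions = {"H1": {"H1": 0.95, "H2": 0.05},
                      "H2": {"H1": 0.05, "H2": 0.95}}
emissions = {"H1": {"a": 0.9, "b": 0.1},
             "H2": {"a": 0.1, "b": 0.9}}

def sample(dist):
    return random.choices(list(dist), weights=list(dist.values()), k=1)[0]

state, observed = "H1", []
for _ in range(30):
    observed.append(sample(emissions[state]))
    state = sample(hidden_transitions[state])
print("".join(observed))  # long runs of 'a' and 'b', reflecting the hidden state
```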

Tantamount answered 6/7, 2019 at 22:45 Comment(0)

Leaving the inner working details aside, a finite state machine is like a plain value, while a Markov chain is like a random variable (probability added on top of the plain value). So the answer to the original question is no, they are not the same. In the probabilistic sense, a Markov chain is an extension of a finite state machine.

Braunite answered 11/12, 2013 at 11:59 Comment(0)
