Any business examples of using Markov chains?

14

17

What business cases are there for using Markov chains? I've seen the usual toy example of a Markov chain trained on someone's blog and used to write a fake post. I'd like some practical examples, though: something useful in business, predicting the stock market, or the like...

Edit: Thanks to all who gave examples; I upvoted each one, as they were all useful.
Edit 2: I selected the answer with the most detail as the accepted answer, but I upvoted all of the answers.

Commentative answered 24/9, 2008 at 17:29 Comment(1)
Fake blogs ARE practical examples. They are used to promote web sites in search engines. – Tortfeasor
6

There is a class of optimization methods based on Markov chain Monte Carlo (MCMC). These have been applied to a wide variety of practical problems, for example signal and image processing (data segmentation and classification), speech and image recognition, and time-series analysis; many similar examples come out of computer vision and pattern recognition.
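
As a rough illustration of the mechanics, here is a minimal random-walk Metropolis-Hastings sampler in Python; the target density and step size are made up for the example and it is not tied to any particular application above:

    import math
    import random

    def metropolis_hastings(log_target, start, steps=10000, step_size=1.0):
        # Random-walk Metropolis-Hastings: draws samples that are
        # approximately distributed according to exp(log_target(x)).
        x = start
        samples = []
        for _ in range(steps):
            proposal = x + random.gauss(0.0, step_size)
            # Accept with probability min(1, target(proposal) / target(x)).
            if random.random() < math.exp(min(0.0, log_target(proposal) - log_target(x))):
                x = proposal
            samples.append(x)
        return samples

    # Example: sample from a standard normal (log-density up to a constant).
    samples = metropolis_hastings(lambda x: -0.5 * x * x, start=0.0)
    print(sum(samples) / len(samples))  # should be close to 0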

Seth answered 30/9, 2008 at 14:23 Comment(0)
13

The obvious one: Google's PageRank.
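
For intuition about the connection: PageRank models a "random surfer" whose next page is chosen from the links on the current page (with occasional random jumps), i.e. a Markov chain over pages, and a page's rank is its probability under the chain's stationary distribution. A toy power-iteration sketch in Python (the three-page link graph is made up, and this is not Google's actual implementation):

    # Toy PageRank: states are pages, transitions follow links, and the
    # rank vector is the stationary distribution of the resulting chain.
    links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
    pages = list(links)
    damping = 0.85
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(50):  # power iteration converges to the stationary distribution
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank

    print(rank)  # approximate stationary distribution = PageRank scores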

Decree answered 24/9, 2008 at 17:32 Comment(1)
Can you explain how Markov chains are obviously associated with Google's PageRank? – Commentative
9

Hidden Markov models are built on Markov chains and are used extensively in speech recognition and, especially, bioinformatics.

Maugham answered 24/9, 2008 at 17:53 Comment(0)
7

I've seen spam email that was clearly generated using a Markov chain -- certainly that qualifies as a "business use". :)
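
The generator behind this kind of text is tiny. A word-level sketch in Python (the "corpus" here is made up; a real spammer would train on scraped text):

    import random
    from collections import defaultdict

    corpus = ("the quick brown fox jumps over the lazy dog "
              "the quick fox naps under the brown dog").split()

    # Record, for every word, the words observed to follow it.
    transitions = defaultdict(list)
    for current_word, next_word in zip(corpus, corpus[1:]):
        transitions[current_word].append(next_word)

    # Walk the chain: each next word depends only on the current one.
    word = random.choice(corpus)
    output = [word]
    for _ in range(15):
        followers = transitions.get(word)
        if not followers:
            break
        word = random.choice(followers)
        output.append(word)

    print(" ".join(output))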

Finis answered 24/9, 2008 at 17:33 Comment(0)
6

We use log-file chain-analysis to derive and promote secondary and tertiary links to otherwise-unrelated documents in our help-system (a collection of 10m docs).

This is especially helpful in bridging otherwise separate taxonomies, e.g. SQL docs vs. IIS docs.
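
A hypothetical sketch of the idea in Python: treat each logged session as a walk through documents, count document-to-document transitions, and surface the most likely successors of a document as "related" links (the session data and document names below are invented):

    from collections import Counter, defaultdict

    sessions = [
        ["sql-intro", "sql-joins", "iis-setup"],
        ["sql-intro", "sql-joins", "sql-indexes"],
        ["iis-setup", "iis-bindings", "sql-joins"],
    ]

    # Count how often users moved from one document to another.
    transitions = defaultdict(Counter)
    for session in sessions:
        for src, dst in zip(session, session[1:]):
            transitions[src][dst] += 1

    def related_links(doc, n=2):
        # Most frequent next documents, used as secondary links.
        return [d for d, _ in transitions[doc].most_common(n)]

    print(related_links("sql-joins"))  # bridges the SQL and IIS doc sets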

Resident answered 24/9, 2008 at 17:41 Comment(0)
5

I know AccessData uses them in their forensic password-cracking tools. It lets you explore the more likely password phrases first, resulting in faster password recovery (on average).
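
The general idea (a sketch only, not AccessData's actual implementation): learn character-transition probabilities from known passwords, then score and order candidate guesses so the statistically likely ones are tried first.

    import math
    from collections import defaultdict

    # Character bigram counts from a (made-up) list of known passwords.
    training = ["password", "passw0rd", "letmein", "dragon", "sunshine"]
    counts = defaultdict(lambda: defaultdict(int))
    for pw in training:
        for a, b in zip(pw, pw[1:]):
            counts[a][b] += 1

    def log_score(candidate):
        # Higher score = more plausible under the learned transition model.
        score = 0.0
        for a, b in zip(candidate, candidate[1:]):
            total = sum(counts[a].values())
            # Add-one smoothing so unseen transitions are merely unlikely.
            score += math.log((counts[a][b] + 1) / (total + 256))
        return score

    candidates = ["zzzzzz", "passion", "qwerty"]
    for guess in sorted(candidates, key=log_score, reverse=True):
        print(guess, round(log_score(guess), 2))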

Rickard answered 24/9, 2008 at 17:59 Comment(0)
5

There are some commercial ray-tracing systems that implement Metropolis Light Transport (invented by Eric Veach, who essentially applied Metropolis-Hastings to ray tracing), and bidirectional and importance-sampling path tracers also use Markov chains.

These terms are all googlable; I've omitted further explanation for the sake of this thread.

Teplica answered 24/3, 2009 at 16:34 Comment(0)
5

Markov chains are used by search companies like Bing to infer the relevance of documents from the sequence of clicks users make on the results page. The user behaviour in a typical query session is modelled as a Markov chain, with particular behaviours as state transitions: for example, if a document is relevant, the user may still go on to examine more documents, but with a smaller probability; if it is not relevant, he will examine more documents with a much larger probability.
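
A greatly simplified sketch of that idea in Python (not Bing's actual model; the click sessions are invented): treat "session ends right after this document was clicked" as an absorbing transition and read a high stop-after-click rate as a satisfaction signal.

    from collections import defaultdict

    # Each session is the ordered list of documents the user clicked.
    sessions = [
        ["doc1", "doc3"],
        ["doc1"],
        ["doc2", "doc1"],
        ["doc1"],
    ]

    clicks = defaultdict(int)
    stops = defaultdict(int)
    for session in sessions:
        for i, doc in enumerate(session):
            clicks[doc] += 1
            if i == len(session) - 1:  # the user stopped after this click
                stops[doc] += 1

    for doc in sorted(clicks):
        print(doc, "P(stop after click) =", round(stops[doc] / clicks[doc], 2))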

Archine answered 28/10, 2010 at 8:9 Comment(0)
3

We plan to use it for predictive text entry on a handheld device for data entry in an industrial environment. In a situation with a reasonable vocabulary size, transitions to the next word can be suggested based on frequency. Our initial testing suggests that this will work well for our needs.
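
A minimal sketch of that kind of suggestion logic in Python (the sample log entries and vocabulary are invented):

    from collections import Counter, defaultdict

    # Previously entered phrases from the (made-up) data-entry domain.
    log_entries = [
        "replace filter on pump three",
        "replace filter on compressor",
        "inspect filter on pump three",
    ]

    # Count which word follows which, i.e. a first-order word chain.
    next_words = defaultdict(Counter)
    for entry in log_entries:
        words = entry.split()
        for a, b in zip(words, words[1:]):
            next_words[a][b] += 1

    def suggest(word, n=3):
        # Offer the most frequent successors as type-ahead candidates.
        return [w for w, _ in next_words[word].most_common(n)]

    print(suggest("filter"))  # ['on']
    print(suggest("on"))      # ['pump', 'compressor']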

Achondrite answered 24/3, 2009 at 16:52 Comment(0)
2

IBM has CELM. Read more about it here.

Detumescence answered 5/11, 2008 at 3:56 Comment(0)
2

I recently stumbled on a blog example of using Markov chains for creating test data...

http://github.com/emelski/code.melski.net/blob/master/markov/main.cpp

Commentative answered 16/9, 2009 at 0:24 Comment(1)
The link provided is just the source code of a Markov generator; the associated blog post is at blog.electric-cloud.com/2009/09/15/… – Cloudburst
2

A Markov model is a way of describing a process that moves through a series of states.

HMMs can be applied in many fields where the goal is to recover a data sequence that is not directly observable (but on which some other observable data depends).

Common applications include:

  • Cryptanalysis
  • Speech recognition
  • Part-of-speech tagging
  • Machine translation
  • Stock prediction
  • Gene prediction
  • Alignment of bio-sequences
  • Gesture recognition
  • Activity recognition
  • Detecting the browsing pattern of a user on a website
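
To make the "recover a hidden sequence" point concrete, here is a minimal Viterbi decoder in Python; the states, observations, and probabilities are toy values (real applications learn them from data):

    states = ["Rainy", "Sunny"]
    start_p = {"Rainy": 0.6, "Sunny": 0.4}
    trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
               "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
    emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
              "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
    observations = ["walk", "shop", "clean"]

    # V[t][s]: probability of the best hidden-state path ending in s at time t.
    V = [{s: start_p[s] * emit_p[s][observations[0]] for s in states}]
    path = {s: [s] for s in states}
    for obs in observations[1:]:
        V.append({})
        new_path = {}
        for s in states:
            prob, prev = max((V[-2][p] * trans_p[p][s] * emit_p[s][obs], p)
                             for p in states)
            V[-1][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path

    best = max(states, key=lambda s: V[-1][s])
    print(path[best])  # most likely hidden state sequence for the observations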

Cottager answered 19/6, 2011 at 7:44 Comment(1)
You can refer to ganeshtiwaridotcomdotnp.blogspot.com/2011/06/… on using HMMs for speech recognition; it combines a Gaussian mixture model and a hidden Markov model for joint speech and speaker recognition. – Cottager
1

Markov chains can be used to simulate user interaction, e.g. when browsing a service.

A friend of mine wrote plagiarism detection using Markov chains as his diploma thesis (he said the input data had to be whole books for it to succeed).

It may not be very "business", but Markov chains can also be used to generate fictitious geographical and personal names, especially in RPG games.
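
A character-level sketch of such a name generator in Python (the seed names are invented; "^" and "$" mark the start and end of a name):

    import random
    from collections import defaultdict

    seed_names = ["Aldoria", "Belmora", "Caldrith", "Dorvane", "Elmira"]

    # Character-to-character transitions, with start/end markers.
    transitions = defaultdict(list)
    for name in seed_names:
        chars = ["^"] + list(name.lower()) + ["$"]
        for a, b in zip(chars, chars[1:]):
            transitions[a].append(b)

    def make_name(max_len=12):
        c, out = "^", []
        while True:
            c = random.choice(transitions[c])
            if c == "$" or len(out) >= max_len:
                break
            out.append(c)
        return "".join(out).capitalize()

    print([make_name() for _ in range(5)])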

Tortfeasor answered 23/12, 2010 at 11:37 Comment(0)
1

Markov chains are used in life insurance, particularly in the permanent disability model. There are three states:

  • 0 - The life is healthy
  • 1 - The life becomes disabled
  • 2 - The life dies

In a permanent disability model the insurer may pay some sort of benefit if the insured becomes disabled and/or the life insurance benefit when the insured dies. The insurance company would then likely run a Monte Carlo simulation based on this Markov chain to determine the likely cost of providing such insurance.
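
A toy Monte Carlo costing of such a three-state chain in Python (all transition probabilities, benefit amounts, and the time horizon are invented for illustration):

    import random

    # States: 0 = healthy, 1 = permanently disabled, 2 = dead.
    yearly_transitions = {
        0: [(0, 0.90), (1, 0.05), (2, 0.05)],
        1: [(1, 0.85), (2, 0.15)],
    }
    DISABILITY_BENEFIT = 10_000   # paid each year spent disabled
    DEATH_BENEFIT = 100_000       # paid once, on death

    def simulate_policy(years=30):
        state, cost = 0, 0.0
        for _ in range(years):
            r, cumulative = random.random(), 0.0
            for next_state, p in yearly_transitions[state]:
                cumulative += p
                if r < cumulative:
                    state = next_state
                    break
            if state == 1:
                cost += DISABILITY_BENEFIT
            elif state == 2:
                cost += DEATH_BENEFIT
                break
        return cost

    trials = [simulate_policy() for _ in range(20_000)]
    print("expected benefit cost ~", round(sum(trials) / len(trials)))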

Alonso answered 21/4, 2013 at 21:51 Comment(0)
