markov-chains Questions

1

Solved

I've plotted a Markov chain in R, but I dislike the rather huge arrowheads that the plot function draws. Is there a way to make the heads smaller? library( markovchain ) transition.matrix &...
Isaak asked 10/10, 2015 at 14:49

4

Solved

I have a Markov chain given as a large sparse scipy matrix A. (I've constructed the matrix in scipy.sparse.dok_matrix format, but converting to other formats or constructing it as csc_matrix is fine....
Remise asked 23/1, 2014 at 12:56
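Not enough of the question survives to say exactly what is being computed, but a common task with a large sparse transition matrix in scipy is finding the stationary distribution. A minimal sketch of that, assuming A is row-stochastic (the 3-state matrix below is only a stand-in for the real one):

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    # Toy 3-state chain, built as dok_matrix and converted to CSC as in the question.
    A = sp.dok_matrix((3, 3))
    A[0, 1] = 1.0
    A[1, 0] = 0.5; A[1, 2] = 0.5
    A[2, 0] = 1.0
    A = A.tocsc()

    # The stationary distribution pi solves pi A = pi, i.e. A.T pi = pi,
    # so take the leading left eigenvector and normalize it to sum to 1.
    vals, vecs = spla.eigs(A.T, k=1, which='LM')
    pi = np.real(vecs[:, 0])
    pi = pi / pi.sum()
    print(pi)   # approximately [0.4 0.4 0.2] for this toy chain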

1

Solved

I have a Python dictionary with state transition probabilities of a Markov-chain model. dict_m = {('E', 'F'): 0.29032258064516131, ('D', 'F'): 0.39726027397260272, ('D', 'D'): 0.30136986301369861,...
Loco asked 13/8, 2014 at 9:15
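The excerpt is cut off, but a frequent follow-up with a dict keyed by (current, next) pairs is simulating the chain. A minimal sketch assuming that goal; the probabilities below are shortened stand-ins for the ones quoted above:

    import random

    dict_m = {('E', 'F'): 0.29, ('E', 'E'): 0.71,
              ('D', 'F'): 0.40, ('D', 'D'): 0.30, ('D', 'E'): 0.30,
              ('F', 'E'): 1.00}

    def next_state(current, transitions):
        # Gather every (next_state, probability) pair leaving `current`, then sample one.
        pairs = [(nxt, p) for (cur, nxt), p in transitions.items() if cur == current]
        states, probs = zip(*pairs)
        return random.choices(states, weights=probs, k=1)[0]

    walk = ['D']
    for _ in range(10):
        walk.append(next_state(walk[-1], dict_m))
    print(walk)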

2

Solved

I have a very large absorbing Markov chain (scales to problem size -- from 10 states to millions) that is very sparse (most states can react to only 4 or 5 other states). I need to calculate one r...
Veats asked 29/7, 2012 at 0:29
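The quantity being calculated is cut off above, but for a large sparse absorbing chain the usual trick is the same for most such quantities: avoid forming the dense fundamental matrix N = (I - Q)^-1 and solve a sparse linear system instead. A minimal sketch for the expected number of steps to absorption, which is only one plausible reading of the truncated question:

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    # Q is the transient-to-transient block of the transition matrix (toy 3-state
    # example; the missing row mass goes to the absorbing states).
    Q = sp.csc_matrix(np.array([[0.0, 0.5, 0.0],
                                [0.3, 0.0, 0.3],
                                [0.0, 0.4, 0.0]]))
    n = Q.shape[0]

    # Expected steps to absorption t satisfies (I - Q) t = 1; solve it sparsely.
    t = spla.spsolve(sp.identity(n, format='csc') - Q, np.ones(n))
    print(t)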

3

Solved

I was thinking of creating a chatbot using something like Markov chains, but I'm not entirely sure how to get it to work. From what I understand, you create a table from data with a given word and ...
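A minimal sketch of the "table" idea mentioned above, assuming a first-order word-level model: record which words follow each word in the training text, then generate by repeatedly sampling a follower (the training string here is just a placeholder):

    import random
    from collections import defaultdict

    text = "the cat sat on the mat and the cat slept"
    words = text.split()

    table = defaultdict(list)          # word -> list of words that followed it
    for cur, nxt in zip(words, words[1:]):
        table[cur].append(nxt)

    word = random.choice(words)
    out = [word]
    for _ in range(8):
        followers = table.get(word)
        if not followers:              # dead end: nothing ever followed this word
            break
        word = random.choice(followers)
        out.append(word)
    print(' '.join(out))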

1

I'm studying Reinforcement Learning and reading Sutton's book for a university course. Besides the classic DP, MC, TD and Q-Learning algorithms, I'm reading about policy gradient methods and genetic...

2

Solved

How do Markov chains work? I have read the Wikipedia article on Markov chains, but the thing I don't get is memorylessness. Memorylessness states that: The next state depends only on the current state and ...
Shropshire asked 15/12, 2013 at 14:26
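For reference, the property the excerpt is quoting is usually written as a conditional-probability statement: the distribution of the next state given the entire history equals the distribution given the current state alone,

    P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i),

so once X_n is known, the earlier states carry no additional information about X_{n+1}.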

1

I have written a C++ program that simulates a certain process I'm studying. It outputs discrete "states" at each timestep of the simulation. For example: a b c b c b would be the output of a simula...
Hathcock asked 27/10, 2013 at 10:38
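The rest of the question is cut off, but a natural next step with such output is estimating a transition matrix from the observed sequence. A minimal sketch of that, assuming the simulator's states arrive as a whitespace-separated string:

    from collections import Counter, defaultdict

    seq = "a b c b c b".split()

    counts = defaultdict(Counter)          # counts[cur][nxt] = number of cur -> nxt transitions
    for cur, nxt in zip(seq, seq[1:]):
        counts[cur][nxt] += 1

    # Normalize each row to get empirical transition probabilities.
    probs = {cur: {nxt: c / sum(row.values()) for nxt, c in row.items()}
             for cur, row in counts.items()}
    print(probs['b']['c'])                 # estimated P(next = c | current = b)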

3

Solved

I need a simple random English sentence generator. I need to populate it with my own words, but it needs to be capable of making longer sentences that at least follow the rules of English, ev...
Jeffryjeffy asked 18/3, 2009 at 19:48

2

Solved

I have a series of n=400 sequences of varying length containing the letters ACGTE. For example, the probability of having C after A can be calculated from the set of empirical sequ...
Heinz asked 15/7, 2013 at 21:49
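A minimal sketch of the empirical estimate mentioned above, assuming the transition probabilities are obtained by pooling letter-pair counts over all sequences and normalizing by the count of the preceding letter (the three short strings stand in for the real 400 sequences):

    from collections import Counter

    seqs = ["ACGTE", "AACCG", "ACACT"]     # placeholders for the 400 empirical sequences

    pair_counts = Counter()                # counts of (current letter, next letter)
    first_counts = Counter()               # counts of the current letter when it has a successor
    for s in seqs:
        for cur, nxt in zip(s, s[1:]):
            pair_counts[(cur, nxt)] += 1
            first_counts[cur] += 1

    # P(C | A): fraction of A's that are immediately followed by C.
    print(pair_counts[('A', 'C')] / first_counts['A'])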

1

Solved

A Markov chain is composed of a set of states which can transition to other states with a certain probability. A Markov chain can be easily represented in Neo4J by creating a node for each state, ...
Fidelia asked 17/5, 2013 at 4:7

14

Solved

What business cases are there for using Markov chains? I've seen the sort of toy use of a Markov chain applied to someone's blog to write a fake post. I'd like some practical examples, though. E.g...
Commentative asked 24/9, 2008 at 17:29

3

Solved

I have a Markov chain that I would like to represent graphically in javascript. I need to represent the nodes, links, and transition probabilities. Perhaps something like one of these two dia...
Gregoriagregorian asked 25/8, 2011 at 3:6

2

Solved

I would like to modify the script below so that it creates paragraphs out of a random number of the sentences generated by the script. In other words, concatenate a random number (like 1-5) of sent...
Hypercorrection asked 20/10, 2012 at 21:13
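The script being modified is not shown, but the paragraph step itself is small. A minimal sketch, assuming `sentences` is the list of sentences the generator produces:

    import random

    sentences = ["First sentence.", "Second sentence.", "Third one.",
                 "And another.", "Yet another.", "Last sentence."]

    paragraphs = []
    i = 0
    while i < len(sentences):
        n = random.randint(1, 5)                     # 1-5 sentences per paragraph
        paragraphs.append(' '.join(sentences[i:i + n]))
        i += n
    print('\n\n'.join(paragraphs))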

1

Solved

I use the formula exp(X) as the rate for a Markov chain. So the ratio of selecting one link over another is exp(X1)/exp(X2). My problem is that sometimes X is very large, so exp(X) will exceed the ...
Daughterinlaw asked 17/8, 2012 at 19:10
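The standard fix, sketched below, is to avoid exponentiating the raw values: the selection ratio exp(X1)/exp(X2) equals exp(X1 - X2), and a whole set of rates can be normalized after subtracting the maximum (the softmax / log-sum-exp trick), so nothing overflows:

    import numpy as np

    X = np.array([1000.0, 998.0, 990.0])    # exp(1000) would overflow a double

    shifted = X - X.max()                    # largest entry becomes 0
    probs = np.exp(shifted) / np.exp(shifted).sum()
    print(probs)                             # same probabilities as exp(X) / sum(exp(X)), but stable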

3

Solved

Background I have an ordered set of data points stored as a TreeSet<DataPoint>. Each data point has a position and a Set of Event objects (HashSet<Event>). There are 4 possible Event ob...
Belly asked 15/8, 2012 at 11:4

1

Solved

A first-order transition matrix of 6 states can be constructed very elegantly as follows: x = [1 6 1 6 4 4 4 3 1 2 2 3 4 5 4 5 2 6 2 6 2 6]; % the Markov chain tm = full(sparse(x(1:end-1),x(2:end...
Recipe asked 17/6, 2012 at 14:51
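The quoted code is MATLAB; for comparison, the same accumulate-by-index trick in Python (just an equivalent sketch, not the asker's code) looks like this:

    import numpy as np

    x = np.array([1, 6, 1, 6, 4, 4, 4, 3, 1, 2, 2, 3, 4, 5, 4, 5, 2, 6, 2, 6, 2, 6])

    tm = np.zeros((6, 6))
    np.add.at(tm, (x[:-1] - 1, x[1:] - 1), 1)   # states are 1-based, array indices 0-based
    print(tm)   # raw transition counts; divide each row by its sum for probabilities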

1

I have 11 states, and a transition probability matrix, but I don't have emissions as my model is not hidden. It consists only of states (1, 2, 3, ..., 11). I want to generate random states based on my...
Schaerbeek asked 15/6, 2012 at 17:22
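A minimal sketch of the simulation step, assuming a row-stochastic transition matrix P and no emissions (the 3x3 matrix stands in for the asker's 11x11 one): at each step, draw the next state from the row of the current state.

    import numpy as np

    rng = np.random.default_rng()

    P = np.array([[0.1, 0.6, 0.3],
                  [0.4, 0.4, 0.2],
                  [0.5, 0.0, 0.5]])

    state = 0
    states = [state]
    for _ in range(20):
        state = rng.choice(P.shape[0], p=P[state])   # sample next state from the current row
        states.append(state)
    print(states)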

3

Solved

I am writing a program that attempts to duplicate the algorithm discussed at the beginning of this article: http://www-stat.stanford.edu/~cgates/PERSI/papers/MCMCRev.pdf. F is a function from char...
Impediment asked 14/9, 2011 at 21:36

1

Solved

While we were all twiddling our thumbs, a 17-year-old Canadian boy has apparently found an information retrieval algorithm that: a) performs with twice the precision of the current, and widely-us...

3

Solved

I tried Google and found little that I could understand. I understand Markov chains at a very basic level: it's a mathematical model that depends only on the previous input to change states... so sort o...
Wisniewski asked 31/3, 2011 at 15:54

2

Solved

Let's imagine, I have two English language texts written by the same person. Is it possible to apply some Markov chain algorithm to analyse each: create some kind of fingerprint based on stat...
Adduct asked 22/1, 2011 at 23:22

2

I am making a registration form, and because some people will enter gibberish in the Secret Answer input (I do that myself), I would like to test that value to see if it's likely to be a good answer. I ha...
Virulence asked 12/1, 2011 at 18:58
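One common approach, sketched below under the assumption that a character-level Markov model is the intended test: train letter-to-letter transition probabilities on ordinary English text, then score the answer by its average log-probability; gibberish tends to hit rare or unseen transitions and scores much lower on a realistic training corpus (the one-line corpus here is only a placeholder):

    import math
    from collections import Counter

    training = "this is a small sample of ordinary english text used for training"

    pair_counts = Counter(zip(training, training[1:]))   # (current char, next char) counts
    char_counts = Counter(training[:-1])                 # current-char counts

    def avg_log_prob(s, alpha=0.5):
        # Laplace-style smoothing (alpha) so unseen pairs don't give log(0).
        vocab = len(set(training))
        total = 0.0
        for cur, nxt in zip(s, s[1:]):
            p = (pair_counts[(cur, nxt)] + alpha) / (char_counts[cur] + alpha * vocab)
            total += math.log(p)
        return total / max(len(s) - 1, 1)

    print(avg_log_prob("my first dog"))      # English-like answer
    print(avg_log_prob("xqzjkv wprtf"))      # keyboard-mash answer; compare the two scores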

1

Solved

I am rewriting a Monte Carlo simulation model in MATLAB with an emphasis on readability. The model involves many particles, represented as (x,y,z), following a random walk over a small set of state...
Witcher asked 24/9, 2010 at 2:34
