Using Markov chains (or something similar) to produce an IRC bot

I tried Google and found little that I could understand.

I understand Markov chains at a very basic level: it's a mathematical model that changes states based only on the previous input... so, sort of an FSM with weighted random chances instead of different criteria?

I've heard that you can use them to generate semi-intelligent nonsense, given sentences of existing words to use as a dictionary of sorts.

I can't think of search terms to find this, so can anyone link me to something or explain how I could produce something that gives a semi-intelligent answer? (If you asked it about pie, it shouldn't start going on about the Vietnam War it had heard about.)

I plan on:

  • Having this bot idle in IRC channels for a bit
  • Stripping any usernames out of the lines and storing them as sentences or whatever (rough sketch below)
  • Over time, using this as the basis for the above.
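For the username-stripping step, this is roughly what I have in mind (just a sketch; the "<nick> message" log format and the channel.log filename are assumptions on my part):

    import re

    # Assumes log lines look like "<nick> message"; other formats need other patterns.
    NICK_PREFIX = re.compile(r"^<[^>]+>\s*")

    def strip_nick(line: str) -> str:
        """Drop a leading "<nick> " prefix from an IRC log line."""
        return NICK_PREFIX.sub("", line).strip()

    # Collect the cleaned lines as raw material for the chain.
    sentences = [strip_nick(line) for line in open("channel.log", encoding="utf-8")]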
Wisniewski answered 31/3, 2011 at 15:54 Comment(0)

Yes, a Markov chain is a finite-state machine with probabilistic state transitions. To generate random text with a simple, first-order Markov chain:

  1. Collect bigram (adjacent word pair) statistics from a corpus (collection of text).
  2. Make a Markov chain with one state per word. Reserve a special state for end-of-text.
  3. The probability of jumping from state/word x to y is the probability of the word y immediately following x, estimated from relative bigram frequencies in the training corpus.
  4. Start with a random word x (perhaps determined by how often that word occurs as the first word of a sentence in the corpus). Then pick a state/word y to jump to randomly, taking into account the probability of y following x (the state transition probability). Repeat until you hit end-of-text.
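A minimal sketch of those four steps in Python (an illustration only: the toy corpus is made up, sentence splitting is reduced to whitespace splitting, and __END__ is an arbitrary name for the reserved end-of-text state):

    import random
    from collections import defaultdict

    END = "__END__"  # the reserved end-of-text state

    def train(sentences):
        """Steps 1-3: count bigrams; chain[x][y] = how often y followed x."""
        starts = []                                    # first words of sentences, for step 4
        chain = defaultdict(lambda: defaultdict(int))
        for sentence in sentences:
            words = sentence.split()
            if not words:
                continue
            starts.append(words[0])
            for x, y in zip(words, words[1:] + [END]):
                chain[x][y] += 1
        return starts, chain

    def generate(starts, chain, max_words=30):
        """Step 4: random walk, picking successors proportionally to their counts."""
        word = random.choice(starts)
        output = [word]
        for _ in range(max_words):
            successors = chain.get(word)
            if not successors:
                break
            word = random.choices(list(successors), weights=list(successors.values()))[0]
            if word == END:
                break
            output.append(word)
        return " ".join(output)

    starts, chain = train(["the pie was delicious", "the pie was terrible", "good pie is rare"])
    print(generate(starts, chain))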

If you want to get something semi-intelligent out of this, then your best shot is to train it on lots of carefully collected texts. The "lots" part makes it produce proper sentences (or plausible IRC speak) with high probability; the "carefully collected" part means you control what it talks about. Introducing higher-order Markov chains also helps in both areas, but requires more storage for the necessary statistics. You may also want to look into things like statistical smoothing.
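To make the higher-order point concrete: in a second-order chain the state is the last two words rather than one, so the only structural change to the sketch above is keying the table on word pairs (again just a sketch, with smoothing left out):

    from collections import defaultdict

    END = "__END__"

    def train_second_order(sentences):
        """Like train() above, but the state is a pair of adjacent words."""
        chain = defaultdict(lambda: defaultdict(int))
        for sentence in sentences:
            words = sentence.split()
            if len(words) < 2:
                continue
            for x, y, z in zip(words, words[1:], words[2:] + [END]):
                chain[(x, y)][z] += 1   # key on the word pair (x, y)
        return chain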

However, having your IRC bot actually respond to what is said to it takes a lot more than Markov chains. One way to do it is to run text categorization (aka topic spotting) on what is said, then pick a domain-specific Markov chain for text generation. Naïve Bayes is a popular model for topic spotting.
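As a rough sketch of that pipeline, assuming scikit-learn for the Naive Bayes part (the topics and training lines are made-up placeholders, and you would train one Markov chain per topic as sketched above):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    # Toy topic-spotting model; in practice you would train on your collected texts.
    training_lines = ["I baked an apple pie", "best pie crust recipe", "the war ended in 1975"]
    topics         = ["food",                 "food",                  "history"]

    vectorizer = CountVectorizer()
    classifier = MultinomialNB()
    classifier.fit(vectorizer.fit_transform(training_lines), topics)

    def pick_chain(message, chains_by_topic):
        """Spot the topic of an incoming message and return that topic's Markov chain."""
        topic = classifier.predict(vectorizer.transform([message]))[0]
        return chains_by_topic[topic]   # then run generate() on this chain, as sketched above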

Kernighan and Pike explore various implementation strategies for Markov chain algorithms in The Practice of Programming. These, and natural language generation in general, are covered in great depth by Jurafsky and Martin, Speech and Language Processing.

Subversion answered 31/3, 2011 at 16:15 Comment(1)
As for generating text related to the question asked, you may want to look at MegaHAL. It extracts keywords from the input and then uses different Markov chains to expand these keywords into sentences, forwards and backwards from the keyword(s). – Albumen
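A tiny sketch of that MegaHAL-style idea (an illustration only, not MegaHAL's actual implementation): train one chain on the corpus as it is and a second on reversed sentences, then grow a sentence outwards from a keyword in both directions:

    import random
    from collections import defaultdict

    def train_chain(sentences, reverse=False):
        """Bigram counts; with reverse=True the chain runs right-to-left."""
        chain = defaultdict(lambda: defaultdict(int))
        for sentence in sentences:
            words = sentence.split()
            if reverse:
                words = words[::-1]
            for x, y in zip(words, words[1:]):
                chain[x][y] += 1
        return chain

    def walk(chain, word, max_words=15):
        """Follow successor counts from word until a dead end or the length cap."""
        out = []
        while word in chain and len(out) < max_words:
            successors = chain[word]
            word = random.choices(list(successors), weights=list(successors.values()))[0]
            out.append(word)
        return out

    def expand(keyword, forward_chain, backward_chain):
        before = walk(backward_chain, keyword)[::-1]   # words to the left of the keyword
        after = walk(forward_chain, keyword)           # words to the right of it
        return " ".join(before + [keyword] + after)

    corpus = ["apple pie is delicious", "I baked a pie yesterday"]
    print(expand("pie", train_chain(corpus), train_chain(corpus, reverse=True)))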

You want to look for Ian Barber's Text Generation articles (phpir.com). Unfortunately the site is down or offline; I have a copy of his text and can send it to you.

Stepper answered 31/3, 2011 at 16:8 Comment(0)

It seems to me you are trying multiple things at the same time:

  1. extracting words/sentences by idling in IRC
  2. building a knowledge base
  3. listening to some chat and parsing keywords
  4. generating sentences related to those keywords

Those are basically very different tasks. Markov models are often used for machine learning. I don't see much learning in your tasks, though.

larsmans' answer shows how to generate sentences from word-based Markov models. You can also train the weights to favor the word pairs other IRC users used. Nonetheless, this will not generate keyword-related sentences, because building/refining a Markov model is not the same as "driving" it.

You might try hidden Markov models (HMMs), where the visible output is the keywords and the hidden states are made from those word pairs. You could then dynamically favor sentences more appropriate to specific keywords.

Masterpiece answered 31/3, 2011 at 16:22 Comment(1)
1, 2, and 3 are the ones I am pretty sure I can do. 4 is the one I want Markov chains for. – Wisniewski
