
Generating pseudo random text with Markov chains using Python

By: Shabda Raaj

First, the definition from Wolfram:

A Markov chain is a collection of random variables {X_t} (where the index t runs through 0, 1, ...) having the property that, given the present, the future is conditionally independent of the past.

Wikipedia is a little clearer:

...a Markov chain is a stochastic process with the Markov property ... [Which means] state changes are probabilistic, and the future state depends on the current state only.

Markov chains have various uses, but for now let's see how they can be used to generate gibberish that might look legitimate.

The algorithm is:

  1. Have a text which will serve as the corpus from which we choose the next transitions.
  2. Start with two consecutive words from the text. The last two words constitute the present state.
  3. Generating the next word is the Markov transition. To generate the next word, look in the corpus and find which words are present after the given two words. Choose one of them randomly.
  4. Repeat step 3 until text of the required size is generated.

The code for this is below: a minimal sketch of markovgen.py, matching the interface used in the sample session that follows (a Markov class built from a file object, with a generate_markov_text() method). The default size of 25 words and the whitespace tokenization are assumptions.
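
import random

class Markov(object):

    def __init__(self, open_file):
        self.cache = {}
        self.open_file = open_file
        self.words = self.file_to_words()
        self.word_size = len(self.words)
        self.database()

    def file_to_words(self):
        # Read the whole corpus and split on whitespace, keeping punctuation.
        self.open_file.seek(0)
        return self.open_file.read().split()

    def triples(self):
        # Yield consecutive (w1, w2, w3) word triples; for "What a lovely day"
        # that is (What, a, lovely) and (a, lovely, day).
        for i in range(len(self.words) - 2):
            yield (self.words[i], self.words[i + 1], self.words[i + 2])

    def database(self):
        # Map each two-word state to the list of words seen after it.
        # Duplicates are kept, so random.choice is frequency weighted.
        for w1, w2, w3 in self.triples():
            self.cache.setdefault((w1, w2), []).append(w3)

    def generate_markov_text(self, size=25):
        # Seed with a random consecutive word pair, then walk the chain.
        seed = random.randint(0, self.word_size - 3)
        w1, w2 = self.words[seed], self.words[seed + 1]
        gen_words = []
        for i in range(size):
            gen_words.append(w1)
            suffixes = self.cache.get((w1, w2))
            if suffixes is None:
                break  # hit the very end of the corpus; no known successor
            w1, w2 = w2, random.choice(suffixes)
        gen_words.append(w2)
        return ' '.join(gen_words)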

For a sample output, we take the text of My Man Jeeves by P. G. Wodehouse from Project Gutenberg.

In [1]: file_ = open('/home/shabda/jeeves.txt')

In [2]: import markovgen

In [3]: markov = markovgen.Markov(file_)

In [4]: markov.generate_markov_text()
Out[4]: 'Can you put a few years of your twin-brother Alfred,
who was apt to rally round a bit. I should strongly advocate
the blue with milk'

[The files you need to run this are jeeves.txt and markovgen.py]

How is this a Markov algorithm?

  • The last two words are the current state.
  • The next word depends on the last two words only, that is, on the present state only.
  • The next word is randomly chosen from a statistical model of the corpus.

Here is some sample text:

"The quick brown fox jumps over the brown fox who is slow jumps over the brown fox who is dead."

This gives us a corpus like:

{('The', 'quick'): ['brown'],
 ('brown', 'fox'): ['jumps', 'who', 'who'],
 ('fox', 'jumps'): ['over'],
 ('fox', 'who'): ['is', 'is'],
 ('is', 'slow'): ['jumps'],
 ('jumps', 'over'): ['the', 'the'],
 ('over', 'the'): ['brown', 'brown'],
 ('quick', 'brown'): ['fox'],
 ('slow', 'jumps'): ['over'],
 ('the', 'brown'): ['fox', 'fox'],
 ('who', 'is'): ['slow', 'dead.']}

Now if we start with "brown fox", the next word can be "jumps" or "who". If we choose "jumps", the current state becomes "fox jumps", the next word is "over", and so on.
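
As a quick check, this small snippet builds the same dictionary and picks a next word (the helper build_corpus is hypothetical, written just for this example; it is not part of markovgen.py):

import random

def build_corpus(text):
    # Map each consecutive word pair to the list of words that
    # follow it in the text, duplicates included.
    words = text.split()
    corpus = {}
    for w1, w2, w3 in zip(words, words[1:], words[2:]):
        corpus.setdefault((w1, w2), []).append(w3)
    return corpus

corpus = build_corpus("The quick brown fox jumps over the brown fox "
                      "who is slow jumps over the brown fox who is dead.")
print(corpus[('brown', 'fox')])                 # ['jumps', 'who', 'who']
print(random.choice(corpus[('brown', 'fox')]))  # 'jumps' or 'who', at random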

Hints

  • The larger the text we choose, the more choices there are at each transition, and the better the generated text looks.
  • The state can be set to depend on one word, two words, or any number of words. As the number of words in each state increases, the generated text becomes less random (a sketch follows this list).
  • Don't strip the punctuation etc. It leads to a more representative corpus and better random text.
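
For example, here is a sketch of a corpus builder with a configurable state size (n = 2 reproduces the two-word states used above; the function name and whitespace tokenization are assumptions):

def build_corpus_n(words, n=2):
    # Map each n-word state (a tuple of n consecutive words) to the
    # list of words that follow that state in the corpus.
    corpus = {}
    for i in range(len(words) - n):
        state = tuple(words[i:i + n])
        corpus.setdefault(state, []).append(words[i + n])
    return corpus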

Markov chains have many other uses in finance, statistics and mathematics. In fact, Google's PageRank algorithm is essentially a Markov chain over the web!



Topics: seo, algorithms

Comments

Paddy3118

You don't seem to use the relative frequency of next words to weight the random choice?

- Paddy.

shabda

Paddy: The words are added to the list the number of times they appear. For example, a key might be

('milk', 'honey'):['story', 'book', 'book'],
........

When you do a random.choice, "book" has twice the probability of "story", hence the choice is frequency weighted.
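
As a toy check of the weighting (made-up counts, not from the post):

import collections
import random

choices = ['story', 'book', 'book']
# Over many draws, 'book' comes up about twice as often as 'story'.
counts = collections.Counter(random.choice(choices) for _ in range(9000))
print(counts)  # roughly Counter({'book': 6000, 'story': 3000})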

Paddy3118

Thanks, I missed that!

sonu 22nd July, 2009

A Markov chain is a collection of random variables, and it can be used to generate gibberish. In a Markov chain the future depends only on the present, and that is what makes it possible to pick the next word.

brian metz 18th Aug., 2009

Do you remember a program from waaaaay back called "text mangler"? It used Markov chains (I guess), and it was almost creepy how the sentences came out. It was hours of pointless and amazing fun... I haven't been able to get the same results with all the other newer generators... Thanks for this - I'm going to get all texty with this tonight!

Sebastian Paaske Tørholm

For fun, here's a Markov-chain-based word generator to make words that have a realistic structure, based on existing words.

http://gist.github.com/360212

Usage is pretty much the same.

Michael T 24th Nov., 2011

Really good description and clear examples.
Big help in understanding.
Thanks.

Daniel Ellis 7th Aug., 2013

I did a quick modification that lets you specify chain size. Changing the chain size to 4 makes the text less random, but also makes it seem like less gibberish. It looks like it tends to make the 25 words come from 2 or 3 separate sentences:

https://gist.github.com/dellis23/6174914

Brendan

When I run the code I get an error:

import markovgen

ImportError                               Traceback (most recent call last)
----> 1 import markovgen

ImportError: No module named markovgen

I have the code in IPython. Would that cause the problem?


Reactions

aduric

NLTK has a pretty good text generation library. It generates text based on any corpus you give it.

This comment was originally posted on Reddit

jcsalterego

Not ground-breaking by any means, but a nice, concrete application of Markov chains.

This comment was originally posted on Hacker News

davidw

This is relevant, if you want to do something fancier: http://megahal.alioth.debian.org/How.html

BTW, has anything better than Megahal turned up in the meantime, that is also open source?

This comment was originally posted on Hacker News

Dylan Richard

Markov chains can be hilarious.

This comment was originally posted on FriendFeed

Caligula

This is a big reason why speech recognition works. You have a list of probabilities of sequences of words occurring (a language model), and it decides on the next word based on the last few previous words. So if you have the word Super, odds are the next word is Man and not Microphone.

This comment was originally posted on Hacker News

alxv

In "The Practice of Programming", Brian Kernighan and Rob Pike use the Markov Chain Algorithm to illustrate a number of lessons about software design and implementation. I used it as my "Hello world" program when learning a new programming language.In Python, the implementation is surprisingly succinct:

import sys import random import collections MAXGEN = 10000 NONWORD = "" suffixchain = collections.defaultdict(list) w1 = w2 = NONWORD for line in sys.stdin: for word in line.split(): suffixchain[(w1, w2)].append(word) w1, w2 = w2, word suffixchain[(w1, w2)].append(NONWORD) w1 = w2 = NONWORD for i in range(MAXGEN): suffixes = suffixchain[(w1, w2)] next = random.choice(suffixes) if next is NONWORD: break print next, w1, w2 = w2, nextTo use it, just run: $ python markov.py < english_corpus.txt

This comment was originally posted on Hacker News

SuperMicrophone

I resent that.

This comment was originally posted on Hacker News

DarkXanthos

This should be called how to use a library in Python. Pretty weak IMO.

This comment was originally posted on Reddit

DarkXanthos

Here’s one I wrote that actually shows how they’re coded: http://justinbozonier.posterous.com/markov-chains-in-python

This comment was originally posted on Reddit

fas2

I was once given homework in an information theory lecture to implement such a random text generator (a k-th order Markov chain, with k configurable). As I had just started with Python, I thought it would be fun to do it in a one-liner. It happened, though it was a really long line and a total lambda mess.

This comment was originally posted on Reddit
