## A Simple Markov Chain Text Generator Example Implementation

There are plenty of articles describing what a Markov Chain is, so I won’t delve into the details here, but few actually show how to implement one. This article therefore focuses on a simple implementation of a Markov Chain text generator with a configurable block length. The sample text used on this page is an excerpt of “Alice in Wonderland”, included as a hidden PRE tag in the HTML below.

## Overview

A Markov Chain text generator produces pseudo-random text that, given a suitable block size, can look quite natural. For example, the letter “A” in English is followed by other letters with certain probabilities. So, if the last letter printed was “A”, the generator will choose a random letter to follow it, but the randomness is weighted according to previously analyzed text. With a larger block size, say three, the letters “Mar” are followed by three more letters with some other probability. For instance, if this article were analyzed, the generator would likely choose “kov” as the next three characters, producing the word “Markov”. We will refer to these letters, or sequences of letters, as nodes. It should be obvious that the longer the sequence used for nodes, the more natural the generated text will look, but also the more memory will be used.
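To make this concrete, here is a minimal standalone sketch (not part of the generator itself; the function name and sample string are made up for illustration) that counts which characters follow a given block in a sample string. These counts are exactly the transition weights described above:

```js
// Count which single characters follow a given block in a text,
// illustrating how transition weights arise from analyzed text.
function successorCounts(text, block) {
  const counts = {};
  for (let i = 0; i + block.length < text.length; i++) {
    if (text.substring(i, i + block.length) === block) {
      const next = text[i + block.length];
      counts[next] = (counts[next] || 0) + 1;
    }
  }
  return counts;
}

console.log(successorCounts("banana bandana", "a"));
// { n: 4, ' ': 1 } — "a" is followed by "n" four times, by a space once
```

In other words, after “a” this tiny sample would produce “n” four times out of five, and a space one time out of five.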

## Implementation Steps

I’ll break the implementation down into a few steps as follows:

• Choosing a block size.
• Specifying the text.
• Breaking the text into blocks.
• Creating a list of transitions from one block to another.
• Choosing a random starting node for the generation of text.
• Choosing a weighted node to follow the current node.
• Repeating the previous step as many times as desired.

## Storing Node Weights and Selected Weighted Random Values

Two of the hardest problems are defining a method of storing the weights for the transitions between the nodes, and also selecting the next node based on those weights. We can solve both problems at the same time rather simply. To start, all of the nodes will be stored as an associative array, with the characters of the node serving as the index. Each element in that array will also be an array which contains all of the nodes which follow the initial node, including repeats. For example, the entry for the node “ah” is below (based on “Alice in Wonderland”):

```js
nodes["ah"] = ["’l", " w", " m", ", ", ", ", ": ", " h", ", ", "’s", "!”", "! ", "’l", " s", " a", ", ", " o", " o", " i", " o"];
```

As you can see, the node “ah” is most often followed by a comma and a space; next most common is a space and the letter “o”. By storing the data this way we don’t have to compute weights at all, and selecting a weighted node is as simple as choosing a random value from the array. The downside is that this uses more memory than storing just the weights would.
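As a quick standalone illustration (the function name is made up, and the injectable `rand` parameter exists only so the example is deterministic), a uniform pick over the follower array is already a weighted pick, because repeated entries occupy more slots:

```js
// A uniform random pick over the follower array implements weighted
// selection: an entry appearing k times in an array of length n is
// chosen with probability k / n.
function pickWeighted(followers, rand = Math.random) {
  return followers[Math.floor(followers.length * rand())];
}

const followers = [", ", ", ", ", ", " o", " o", " w"];
console.log(pickWeighted(followers, () => 0));    // ", " (first slot)
console.log(pickWeighted(followers, () => 0.99)); // " w" (last slot)
```

Here “, ” occupies three of the six slots, so under a real `Math.random()` it would be chosen half the time.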

## Block Size and Text

We’ll assume we have a class that implements the Markov Chain text generator. We can pass in both the text and the desired block size to its constructor:

```js
function generateMarkovText() {
  const text = document.getElementById("aliceinwonderlandfull").innerHTML;
  const generator = new MarkovChainTextGenerator(text, 2);
  document.getElementById("output").textContent = generator.generateText(1000);
}
```

## Analyzing the Text

```js
constructor(text, blockSize) {
  this.nodes = {};
  this.buildChain(text, blockSize);
}

buildChain(text, blockSize) {
  for (let i = blockSize; i < text.length - blockSize + 1; i++) {
    const last = text.substring(i - blockSize, i);
    const next = text.substring(i, i + blockSize);
    this.addTransition(last, next);
  }
}

addTransition(last, next) {
  if (this.nodes[last] === undefined) {
    this.nodes[last] = [];
  }
  this.nodes[last].push(next);
}
```

The constructor accepts the text to analyze and the block size as parameters, then hands them off to the buildChain() method. The buildChain() method breaks the text up into its overlapping blocks and calls addTransition() for each pair. The addTransition() method checks whether the node already exists, creating an empty array if it doesn’t, then appends the new successor to that array.

### The Text Generator

That handles everything needed to build the actual Markov Chain. The next step is to use the chain to produce some random text.

First, we’ll choose a starting node at random using chooseRandomNode(). Then, on each iteration, we’ll choose a random index between 0 and the length of the current node’s follower array, append the chosen successor to the text we’re generating, and keep looping until we have all the text we want.

It’s possible that a node will be a dead end, meaning it has no recorded transition to another node. What this really means is that you’ve reached the end of the text. For example, the “Alice in Wonderland” text ends with “THE END”, which appears nowhere else in the text, so for a block size of 7 there will be no node following it. There are multiple ways of handling this situation; for our purposes we will just choose another random node and continue on. This is what the check for undefined in the generateText() method is for.
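A tiny standalone example (using the same block-splitting loop as buildChain(), but with a made-up four-character text) shows how such a dead end appears:

```js
// Building a chain from "abcd" with block size 2 yields a single
// transition "ab" -> "cd"; the final block "cd" has no successors.
const nodes = {};
const text = "abcd";
const blockSize = 2;
for (let i = blockSize; i < text.length - blockSize + 1; i++) {
  const last = text.substring(i - blockSize, i);
  const next = text.substring(i, i + blockSize);
  (nodes[last] = nodes[last] || []).push(next);
}
console.log(nodes["ab"]); // ["cd"]
console.log(nodes["cd"]); // undefined — time to restart from a random node
```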

The completed class is as follows:

```js
class MarkovChainTextGenerator {
  constructor(text, blockSize) {
    this.nodes = {};
    this.buildChain(text, blockSize);
  }

  buildChain(text, blockSize) {
    for (let i = blockSize; i < text.length - blockSize + 1; i++) {
      const last = text.substring(i - blockSize, i);
      const next = text.substring(i, i + blockSize);
      this.addTransition(last, next);
    }
  }

  addTransition(last, next) {
    if (this.nodes[last] === undefined) {
      this.nodes[last] = [];
    }
    this.nodes[last].push(next);
  }

  chooseRandomNode() {
    const keys = Object.keys(this.nodes);
    return keys[Math.floor(keys.length * Math.random())];
  }

  generateText(textLength) {
    let currentNode = this.chooseRandomNode();
    let text = currentNode;

    for (let i = 0; i < textLength; i++) {
      if (this.nodes[currentNode] === undefined) {
        // Dead end: restart from another random node.
        currentNode = this.chooseRandomNode();
      }
      const nextIndex = Math.floor(this.nodes[currentNode].length * Math.random());
      currentNode = this.nodes[currentNode][nextIndex];

      text = text + currentNode;
    }
    return text;
  }
}
```
