Long Short-Term Memory Neural Networks and Guitar Tablature

While messing around with LSTM neural networks, I had the idea to train one on some guitar tabs and see what happened.

Data

My dataset was a collection of guitar tabs for The Who’s album, Tommy.

Some stats:
Text total length: 277,155 characters
Distinct chars: 91
Total sequences: 92,369

So yes, it’s a tiny little dataset.
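For what it’s worth, TFLearn ships a data utility that chops a text file into exactly this kind of character-level dataset. Here’s a minimal sketch, assuming the tabs are concatenated into one file — the filename is a placeholder, and the sequence length of 25 with a step of 3 are TFLearn’s defaults, which line up roughly with the sequence count above:

    from tflearn.data_utils import textfile_to_semi_redundant_sequences

    maxlen = 25  # characters per input sequence (TFLearn's default)

    # "tommy_tabs.txt" is a placeholder for the concatenated tab file.
    # With ~277k characters and a step of 3, this yields ~92k sequences.
    X, Y, char_idx = textfile_to_semi_redundant_sequences(
        "tommy_tabs.txt", seq_maxlen=maxlen, redun_step=3)

    print("Distinct chars:", len(char_idx))
    print("Total sequences:", len(X))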

Training

I’m using TFLearn’s implementation of an LSTM neural net. My setup is as follows:

Input data -> LSTM layer (512 units, returning the full sequence as a 3D tensor) -> dropout (50%) -> second LSTM layer (512 units, returning only the last output as a 2D tensor) -> dropout (50%) -> fully connected layer with softmax activation.
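Continuing from the data-loading sketch above, that stack looks roughly like this in TFLearn. This is a sketch based on TFLearn’s standard character-generation setup; the Adam optimizer, gradient clipping value, and batch size are my assumptions, not confirmed details of the original run:

    import tflearn

    # One-hot encoded character sequences: [batch, maxlen, vocab size].
    net = tflearn.input_data(shape=[None, maxlen, len(char_idx)])

    # First LSTM returns the full sequence (3D tensor) so the second
    # LSTM can read every timestep.
    net = tflearn.lstm(net, 512, return_seq=True)
    net = tflearn.dropout(net, 0.5)  # 50% dropout (the arg is the keep prob)

    # Second LSTM returns only its last output (2D tensor).
    net = tflearn.lstm(net, 512)
    net = tflearn.dropout(net, 0.5)

    # Softmax over the 91-character vocabulary predicts the next char.
    net = tflearn.fully_connected(net, len(char_idx), activation='softmax')
    net = tflearn.regression(net, optimizer='adam',
                             loss='categorical_crossentropy')

    # SequenceGenerator wraps the graph for training and sampling.
    model = tflearn.SequenceGenerator(net, dictionary=char_idx,
                                      seq_maxlen=maxlen, clip_gradients=5.0)
    model.fit(X, Y, batch_size=128, n_epoch=1)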

Results

After the first epoch I sampled with a temperature of 1.0, which means drawing straight from the model’s predicted character distribution; relative to lower temperatures, that favors novel sequences over ones that closely repeat the training data.
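Sampling is just a call to the generator from the sketch above; the seed string here is a made-up scrap of tab, not the one actually used:

    # temperature=1.0 samples straight from the softmax distribution;
    # lower values sharpen it toward the most likely next character.
    seed = "e|-----------------------"  # 25 chars, matching seq_maxlen
    print(model.generate(600, temperature=1.0, seq_seed=seed))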
