
## LSTM language modeling on Penn Treebank dataset

This example mainly demonstrates:

  1. How to train an RNN with persistent state that carries over between iterations. Here the state is simply kept inside the graph (see the first sketch after this list).
  2. How to use a TF reader pipeline instead of a DataFlow, for both training & inference (see the second sketch further down).
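
A minimal sketch of the first point, assuming the TF 1.x API this example is written against: the LSTM state lives in non-trainable variables inside the graph, so each `sess.run()` continues from where the previous one ended. The names and shapes below are illustrative, not the example's actual code.

```python
import tensorflow as tf

BATCH, STEPS, HIDDEN, VOCAB = 20, 35, 650, 10000

inputs = tf.placeholder(tf.int32, [BATCH, STEPS])
embedding = tf.get_variable('embedding', [VOCAB, HIDDEN])
emb = tf.nn.embedding_lookup(embedding, inputs)

cell = tf.nn.rnn_cell.LSTMCell(HIDDEN)

# Persistent state: plain variables, excluded from gradient updates.
state_c = tf.get_variable('state_c', [BATCH, HIDDEN], trainable=False,
                          initializer=tf.zeros_initializer())
state_h = tf.get_variable('state_h', [BATCH, HIDDEN], trainable=False,
                          initializer=tf.zeros_initializer())
init_state = tf.nn.rnn_cell.LSTMStateTuple(state_c, state_h)

outputs, final_state = tf.nn.dynamic_rnn(cell, emb, initial_state=init_state)

# Write the final state back, so the next sess.run() starts where this one ended.
with tf.control_dependencies([tf.assign(state_c, final_state.c),
                              tf.assign(state_h, final_state.h)]):
    outputs = tf.identity(outputs)

# Reset op to call at the start of each epoch.
reset_state = tf.group(state_c.initializer, state_h.initializer)
```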

It trains a language model on the PTB dataset, roughly equivalent to the PTB example in tensorflow/models with its "medium" config, and reaches the same performance and speed as the original example.

Note that the data-loading code is copied directly from the TensorFlow example.
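
For the second point, a hedged sketch (again assuming TF 1.x; `ptb_input_fn` is an illustrative helper, not part of the example) of feeding data through a TF pipeline instead of a DataFlow: the flat word-id sequence is reshaped into consecutive `(input, target)` batches and exposed as graph tensors.

```python
import numpy as np
import tensorflow as tf

BATCH, STEPS = 20, 35

def ptb_input_fn(word_ids, batch=BATCH, steps=STEPS):
    """Turn the flat id sequence into an endless stream of (input, target) batches."""
    data = np.asarray(word_ids, dtype=np.int32)
    num_steps_total = (len(data) - 1) // (batch * steps) * steps
    x = data[:batch * num_steps_total].reshape(batch, num_steps_total)
    y = data[1:batch * num_steps_total + 1].reshape(batch, num_steps_total)
    # Split the long sequences into consecutive chunks of `steps` time steps.
    xs = np.stack(np.split(x, num_steps_total // steps, axis=1))  # (N, batch, steps)
    ys = np.stack(np.split(y, num_steps_total // steps, axis=1))
    return tf.data.Dataset.from_tensor_slices((xs, ys)).repeat()

# These tensors can be wired straight into the model, replacing a feed_dict.
dataset = ptb_input_fn(np.arange(100000) % 10000)
inputs, targets = dataset.make_one_shot_iterator().get_next()
```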

To train:

```
./PTB-LSTM.py
```