Attention-based RNN model for Spoken Language Understanding (Intent Detection & Slot Filling)

TensorFlow implementation of attention-based LSTM models for sequence classification (intent detection) and sequence labeling (slot filling).
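The joint architecture this README describes can be sketched in a few lines of modern Keras: a shared bidirectional LSTM encoder feeds a per-token softmax head for slot filling and an attention-pooled softmax head for intent detection. This is a minimal illustration, not the repository's code (which targets a much older TensorFlow API); all layer sizes are assumptions.

import tensorflow as tf
from tensorflow.keras import layers

vocab_size, num_slots, num_intents = 10000, 120, 26   # illustrative sizes
max_len, embed_dim, hidden_dim = 50, 128, 100

words = layers.Input(shape=(max_len,), dtype="int32", name="words")
x = layers.Embedding(vocab_size, embed_dim)(words)     # padding handling omitted for brevity
h = layers.Bidirectional(layers.LSTM(hidden_dim, return_sequences=True))(x)

# Slot-filling head: a softmax over slot tags at every time step.
slots = layers.TimeDistributed(
    layers.Dense(num_slots, activation="softmax"), name="slots")(h)

# Intent head: score each encoder state, normalize the scores over time,
# and take the attention-weighted sum of states as an utterance vector.
scores = layers.Dense(1)(h)                  # (batch, max_len, 1)
alphas = layers.Softmax(axis=1)(scores)      # attention weights over time
context = layers.Dot(axes=1)([alphas, h])    # (batch, 1, 2 * hidden_dim)
context = layers.Flatten()(context)
intent = layers.Dense(num_intents, activation="softmax", name="intent")(context)

model = tf.keras.Model(words, [slots, intent])
model.compile(optimizer="adam",
              loss={"slots": "sparse_categorical_crossentropy",
                    "intent": "sparse_categorical_crossentropy"})
model.summary()

Training both heads against a shared encoder is what the joint task below refers to; the intent and tagging tasks correspond to training one head alone.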

Setup

Requires Python and TensorFlow.

Usage:

data_dir=data/ATIS_samples
model_dir=model_tmp
max_sequence_length=50  # max length for train/valid/test sequence
task=joint  # available options: intent; tagging; joint
bidirectional_rnn=True  # available options: True; False

python run_multi-task_rnn.py --data_dir $data_dir \
      --train_dir $model_dir \
      --max_sequence_length $max_sequence_length \
      --task $task \
      --bidirectional_rnn $bidirectional_rnn
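The ATIS samples follow the usual convention for this task: per split, parallel files of tokenized utterances, IOB slot-tag sequences, and one intent label per line. The reader below is a hypothetical helper under that assumption; the file names are illustrative, not taken from this repository.

from pathlib import Path

def read_split(data_dir, split):
    # Hypothetical helper, not part of this repository. Assumes three
    # parallel files per split: tokenized utterances, IOB slot tags,
    # and one intent label per line; file names are assumptions.
    base = Path(data_dir) / split
    words = (base / f"{split}.seq.in").read_text().splitlines()
    slots = (base / f"{split}.seq.out").read_text().splitlines()
    labels = (base / f"{split}.label").read_text().splitlines()
    assert len(words) == len(slots) == len(labels)
    return [(w.split(), s.split(), l) for w, s, l in zip(words, slots, labels)]

# One ATIS-style example in this format:
#   tokens: "flights from boston to denver"
#   slots:  "O O B-fromloc.city_name O B-toloc.city_name"
#   intent: "atis_flight"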

Reference

Bing Liu and Ian Lane, "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling," Interspeech 2016.

Contact

Feel free to email liubing@cmu.edu with any questions or bug reports regarding the code.
