@aleksas
Created December 5, 2018 01:01
Try t2t on lt
# Problem, model, and hyperparameter set for tensor2tensor
PROBLEM=encoder_character_stressor
MODEL=transformer_encoder
HPARAMS=transformer_base

# Directories for generated data, raw downloads, and checkpoints
DATA_DIR=$HOME/t2t_data
TMP_DIR=/tmp/t2t_datagen
TRAIN_DIR=$HOME/t2t_train/$PROBLEM/$MODEL-$HPARAMS

BATCH_SIZE=2048
WORKER_GPU=2
TRAIN_STEPS=500000
USR_DIR=.  # directory containing the user-defined problem module

mkdir -p "$DATA_DIR" "$TMP_DIR"

# Launch TensorBoard in the background to monitor training
tensorboard --logdir "$TRAIN_DIR" &
# Generate the training data for the problem
t2t-datagen \
  --data_dir=$DATA_DIR \
  --tmp_dir=$TMP_DIR \
  --t2t_usr_dir=$USR_DIR \
  --problem=$PROBLEM
# Train the model. Note: batch_size is an hparams override, not a
# top-level t2t-trainer flag, so it goes into --hparams.
t2t-trainer \
  --data_dir=$DATA_DIR \
  --problem=$PROBLEM \
  --model=$MODEL \
  --hparams_set=$HPARAMS \
  --output_dir=$TRAIN_DIR \
  --tmp_dir=$TMP_DIR \
  --t2t_usr_dir=$USR_DIR \
  --worker_gpu=$WORKER_GPU \
  --train_steps=$TRAIN_STEPS \
  --hparams="batch_size=$BATCH_SIZE,eval_drop_long_sequences=True"
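Once training has produced checkpoints in $TRAIN_DIR, the model can be tried interactively with t2t-decoder. This is a sketch: the input/output filenames (input.txt, output.txt) and the beam settings are assumptions, not part of the original gist.

```shell
# Sketch: decode with the trained stressor model.
# input.txt (hypothetical) holds one source line per example;
# predictions are written to output.txt.
t2t-decoder \
  --data_dir=$DATA_DIR \
  --problem=$PROBLEM \
  --model=$MODEL \
  --hparams_set=$HPARAMS \
  --output_dir=$TRAIN_DIR \
  --t2t_usr_dir=$USR_DIR \
  --decode_hparams="beam_size=4,alpha=0.6" \
  --decode_from_file=input.txt \
  --decode_to_file=output.txt
```

The decode hparams trade speed for quality: a larger beam_size explores more candidates, and alpha controls the length penalty.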