Merge pull request #106 from lissyx/fix-amp-usage

Make automatic mixed precision a default flag

lissyx authored Mar 4, 2020
2 parents 0859bfe + 39a032d commit 3c8aee8
Showing 3 changed files with 17 additions and 4 deletions.
7 changes: 7 additions & 0 deletions DeepSpeech/CONTRIBUTING.md
@@ -35,6 +35,13 @@ Some parameters for the model itself:
 - `dropout` to define the dropout applied
 - `lm_alpha`, `lm_beta` to control language model alpha and beta parameters

+Pay attention to automatic mixed precision: it will speed up the training
+process (both by itself and because it allows larger batch sizes). However,
+it is only advisable while experimenting with hyper-parameters: the selection
+of the best evaluating model seems to vary much more when AMP is enabled than
+when it is disabled. So use it with caution when tuning parameters, and
+disable it when making a release.
+
 Default values should provide a good experience.

 The default batch size has been tested with this mix of datasets:
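The note above maps onto the new `amp` build argument introduced in this commit: AMP stays off by default (`amp=0`) and is opted into per build. A hypothetical invocation (the image tag and build context are assumptions, not from this commit):

```shell
# Hypothetical: build a training image with AMP enabled for
# hyper-parameter exploration; drop the build-arg again for release builds.
docker build -f Dockerfile.train --build-arg amp=1 -t deepspeech-train-fr .
```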
6 changes: 4 additions & 2 deletions DeepSpeech/Dockerfile.train
@@ -8,15 +8,16 @@ ARG kenlm_branch=2ad7cb56924cd3c6811c604973f592cb5ef604eb

 ARG model_language=fr

-ARG batch_size=96
+ARG batch_size=64
 ARG n_hidden=2048
-ARG epochs=60
+ARG epochs=30
 ARG learning_rate=0.0001
 ARG dropout=0.3
 ARG lm_alpha=0.65
 ARG lm_beta=1.45
 ARG beam_width=500
 ARG early_stop=1
+ARG amp=0

 ARG lm_evaluate_range=
 ARG english_compatible=0
@@ -51,6 +52,7 @@ ENV LM_EVALUATE_RANGE=$lm_evaluate_range
 ENV ENGLISH_COMPATIBLE=$english_compatible

 ENV EARLY_STOP=$early_stop
+ENV AMP=$amp

 ENV PATH="$VIRTUAL_ENV/bin:$PATH"
8 changes: 6 additions & 2 deletions DeepSpeech/fr/train.sh
@@ -23,10 +23,15 @@ pushd $HOME/ds/
   EARLY_STOP_FLAG="--noearly_stop"
 fi;

+AMP_FLAG=""
+if [ "${AMP}" = "1" ]; then
+  AMP_FLAG="--automatic_mixed_precision True"
+fi;
+
 python -u DeepSpeech.py \
   --show_progressbar True \
   --use_cudnn_rnn True \
-  --automatic_mixed_precision True \
+  ${AMP_FLAG} \
   --alphabet_config_path /mnt/models/alphabet.txt \
   --lm_binary_path /mnt/lm/lm.binary \
   --lm_trie_path /mnt/lm/trie \
@@ -57,7 +62,6 @@ pushd $HOME/ds/
   --beam_width ${BEAM_WIDTH} \
   --lm_alpha ${LM_ALPHA} \
   --lm_beta ${LM_BETA} \
-  ${EARLY_STOP_FLAG} \
   --load "best" \
   --checkpoint_dir /mnt/checkpoints/ \
   --export_dir /mnt/models/ \
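End to end, the `amp` build argument becomes the `AMP` environment variable, which train.sh expands into an optional flag. A minimal sketch of that plumbing, runnable outside Docker (the fallback default of 0 here is an assumption mirroring `ARG amp=0`):

```shell
# Sketch: the AMP env var (default 0, i.e. disabled) becomes an optional flag.
AMP="${AMP:-0}"

AMP_FLAG=""
if [ "${AMP}" = "1" ]; then
  AMP_FLAG="--automatic_mixed_precision True"
fi

# With AMP unset or 0, AMP_FLAG stays empty, so the flag is omitted entirely
# and DeepSpeech.py falls back to its own default (AMP disabled).
echo "AMP_FLAG=${AMP_FLAG}"
```

Because the unset case expands to an empty string rather than a bare flag, the `python -u DeepSpeech.py` command line stays valid in both configurations.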
