InvalidArgument Error for DeepSpeech

I am trying to train on my own data, and I wrote a shell script as follows:
#!/bin/sh
set -xe
if [ ! -f DeepSpeech.py ]; then
    echo "Please make sure you run this from DeepSpeech's top level directory."
    exit 1
fi;
if [ -d "${COMPUTE_KEEP_DIR}" ]; then
    checkpoint_dir=$COMPUTE_KEEP_DIR
else
    checkpoint_dir=$(python -c "from xdg import BaseDirectory as xdg; print(xdg.save_data_path('DeepSpeech/$yihong'))")
fi
python -u DeepSpeech.py \
    --train_files ~/DeepSpeech/data/yihong/train/train.csv \
    --dev_files ~/DeepSpeech/data/yihong/dev/dev.csv \
    --test_files ~/DeepSpeech/data/yihong/test/test.csv \
    --train_batch_size 8820 \
    --dev_batch_size 2520 \
    --test_batch_size 1260 \
    --n_hidden 512 \
    --epoch 50 \
    --validation_step 1 \
    --early_stop True \
    --earlystop_nsteps 6 \
    --estop_mean_thresh 0.1 \
    --estop_std_thresh 0.1 \
    --dropout_rate 0.22 \
    --learning_rate 0.00095 \
    --report_count 100 \
    --use_seq_length False \
    --export_dir ~/DeepSpeech/data/yihong/results/model_export/ \
    --checkpoint_dir "$checkpoint_dir" \
    --alphabet_config_path ~/DeepSpeech/data/yihong/alphabet.txt \
    --lm_binary_path /home/nvidia/DeepSpeech/data/yihong/lm.binary \
    --lm_trie_path /home/nvidia/DeepSpeech/data/yihong/trie \
    "$@"
However, I got this error:
InvalidArgumentError (see above for traceback): Multiple OpKernel registrations match NodeDef 'node tensors/component_0 (defined at /home/tom/DeepSpeech/util/feeding.py:95)': 'op: "Const" device_type: "GPU" constraint { name: "dtype" allowed_values { list { type: DT_INT32 } } }' and 'op: "Const" device_type: "GPU" constraint { name: "dtype" allowed_values { list { type: DT_INT32 } } } host_memory_arg: "output"'
[[node tensors/component_0 (defined at /home/tom/DeepSpeech/util/feeding.py:95) ]]
It seems like a TensorFlow seq2seq problem.
Can anyone help me figure this out?
Thank you

This looks like it's unrelated to DeepSpeech and is more likely a TensorFlow installation problem, such as having two versions installed at the same time. Check with a clean virtualenv: maybe you have tensorflow and tensorflow-gpu, or tensorflow and tf-nightly, or something like that.
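For example, something along these lines can show whether multiple TensorFlow packages are installed and give you a clean environment to retest in. This is only a rough sketch: the virtualenv path and the tensorflow-gpu version are placeholders, so use whatever your DeepSpeech checkout actually requires.

# List every TensorFlow-related package in the current environment.
# More than one entry (e.g. tensorflow and tensorflow-gpu) can lead to
# duplicate kernel registrations like the error above.
pip list 2>/dev/null | grep -i tensorflow

# Start over from a clean virtualenv and install a single TensorFlow build.
# The path and version below are assumptions for illustration only.
virtualenv -p python3 ~/tmp/deepspeech-venv
source ~/tmp/deepspeech-venv/bin/activate
pip install 'tensorflow-gpu==1.13.1'
pip install -r requirements.txt

# Confirm exactly one TensorFlow is importable and check its version.
python -c "import tensorflow as tf; print(tf.__version__)"

If the error disappears in the clean environment, the old environment almost certainly had conflicting TensorFlow installs.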