Native_client/deepspeech_utils.so - undefined reference to symbol 'roundf@@GLIBC_2.2.5'

I am trying to build the native_client binaries from source, as documented in “native_client/README.md”.

In the section “Building”, I am instructed to run the following command:
bazel build --config=monolithic -c opt --copt=-O3 --copt=-fvisibility=hidden //native_client:libdeepspeech.so //native_client:deepspeech_utils //native_client:generate_trie

The target //native_client:deepspeech_utils gives the following error:
undefined reference to symbol 'roundf@@GLIBC_2.2.5'

So I add “-lm” to “linkopts” in “native_client/BUILD”; I then get:
undefined reference to `main'
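For reference, the linkopts change I tried looks roughly like this (a sketch only; the real deepspeech_utils target in native_client/BUILD has its actual srcs/deps, which I elide here):

```python
# Sketch of the edit to native_client/BUILD (hypothetical attribute values;
# only the linkopts line is the point).
cc_library(
    name = "deepspeech_utils",
    # srcs/hdrs/deps as in the real BUILD file ...
    linkopts = ["-lm"],  # link against libm so roundf@@GLIBC_2.2.5 resolves
)
```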

So I add “-c” to “linkopts” in “native_client/BUILD”; I then get:
output ‘native_client/libdeepspeech_utils.so’ was not created

I don’t know how it is even supposed to be created… However, libdeepspeech_utils.a is created, so it seems we might be building a static library instead of a shared one?

Why is this target using “cc_library” and not “tf_cc_shared_object” as for “libdeepspeech.so” ?

Why isn’t it named “libdeepspeech_utils.so”, with the .so extension?

It seems that almost nobody has ever used bazel except to build TensorFlow, and the documentation is very poor, so it is hard to find any help.

Can you share more details on exactly what you are doing ?

What kind of details do you need ?

I want to be able to modify the C++ source code, so I am trying to build the existing code. I have cloned mozilla/tensorflow and simply followed the instructions in “native_client/README.md”.

Both mozilla/tensorflow and mozilla/deepspeech are on the HEAD of their master branch.

Which system are you building for? If you are using DeepSpeech/master, you should be using tensorflow/r1.6.

I am building for Linux 64-bit with CUDA. I have just switched to tensorflow/r1.6 and seem to have the same issue.

I have also hit the following error (which has already been reported):

Illegal ambiguous match on configurable attribute “deps” in @jpeg//:jpeg:
@jpeg//:k8
@jpeg//:armeabi-v7a

So I have commented out the armeabi-v7a part.

It seems really spurious. Which distro are you targeting? Which version of bazel are you using?

If you are building for CUDA, then you also need --config=cuda …

bazel release 0.12.0
#43~16.04.1-Ubuntu

Can you try and stick to bazel v0.10.0?

Also, why do you need to rebuild? Our binaries should work well on 16.04.

ok

Yes, I can run DeepSpeech with the pre-built binaries. But I suspect there might be a bug in the binaries related to this issue https://github.com/mozilla/DeepSpeech/issues/1156.

Well, building yourself, you will get the same bug :slight_smile:

Well, actually, I would only need to recompile libctc_decoder_with_kenlm.so for now, which works. I only have trouble building libdeepspeech_utils.so.

Yeah, but to be able to fix it, I need to be able to build :wink:

Still, I’m inclined to tell you “works for me”, which is not helpful, so there must be something wrong in your environment, somehow.

Knowing that it works on another environment already helps. I have downgraded to bazel 0.10, we’ll see…

ok, with bazel 0.10 and “--config=cuda --config=monolithic”, everything builds!
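For anyone else landing here, the full command that worked in the end (assuming CUDA and bazel 0.10.0) was:

```
bazel build --config=cuda --config=monolithic -c opt --copt=-O3 --copt=-fvisibility=hidden //native_client:libdeepspeech.so //native_client:deepspeech_utils //native_client:generate_trie
```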

The python binding of libctc_decoder_with_kenlm.so seems broken with DeepSpeech.py, but it might eventually work.

Thanks ! :slight_smile:

libctc_decoder_with_kenlm.so should work; what is the error you have?

DeepSpeech.py says:

AttributeError: 'module' object has no attribute ctc_beam_search_decoder_with_lm

The command arguments are not relevant, since it works with the pre-built libctc_decoder_with_kenlm.so, and this function (ctc_beam_search_decoder_with_lm) is always needed.

I can’t find where the function is defined, though. I suppose there should be a Python binding from some C++ function in libctc_decoder_with_kenlm.so to the Python function ctc_beam_search_decoder_with_lm.
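For what it’s worth, my understanding is that tf.load_op_library exposes each registered op under a snake_case version of its CamelCase op name, so a CTCBeamSearchDecoderWithLM op should surface as ctc_beam_search_decoder_with_lm. A rough approximation of that name conversion (my sketch, not TensorFlow’s exact code):

```python
import re

def op_name_to_python(name):
    # Approximation of the CamelCase -> snake_case conversion TensorFlow
    # applies when generating Python wrapper names for custom ops.
    s = re.sub(r"([A-Z]+)([A-Z][a-z])", r"\1_\2", name)  # split acronym runs: CTCBeam -> CTC_Beam
    s = re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s)        # split word boundaries: mSearch -> m_Search
    return s.lower()

print(op_name_to_python("CTCBeamSearchDecoderWithLM"))
# -> ctc_beam_search_decoder_with_lm
```

So if the attribute is missing, the op registration itself is probably not making it into (or out of) the library as expected.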

EDIT: this page https://www.tensorflow.org/extend/adding_an_op describes the mechanism used to add a custom operation. For instance, my libctc_decoder_with_kenlm.so does contain the following symbols:

000000000031bd70  w    F .text  0000000000001be2              _ZN28CTCBeamSearchDecoderWithLMOp7ComputeEPN10tensorflow15OpKernelContextE
0000000000311730  w    F .text  000000000000025a              _ZN28CTCBeamSearchDecoderWithLMOpD2Ev
0000000000313450  w    F .text  00000000000019d3              _ZN28CTCBeamSearchDecoderWithLMOpC2EPN10tensorflow20OpKernelConstructionE
0000000000ce43c8  w    O .data.rel.ro   0000000000000018              _ZTI28CTCBeamSearchDecoderWithLMOp
0000000000ce4550  w    O .data.rel.ro   0000000000000038              _ZTV28CTCBeamSearchDecoderWithLMOp
0000000000313450  w    F .text  00000000000019d3              _ZN28CTCBeamSearchDecoderWithLMOpC1EPN10tensorflow20OpKernelConstructionE
0000000000311730  w    F .text  000000000000025a              _ZN28CTCBeamSearchDecoderWithLMOpD1Ev
000000000095cd20  w    O .rodata        000000000000001f              _ZTS28CTCBeamSearchDecoderWithLMOp
0000000000311990  w    F .text  0000000000000262              _ZN28CTCBeamSearchDecoderWithLMOpD0Ev

This person had the same issue:

Oh, did you by any mistake build it with --config=monolithic ?