
I am trying to convert a frozen graph file (.pb) to a TensorFlow Lite FlatBuffer file (.tflite) on a Raspberry Pi 3B+ running Linux. TensorFlow was installed for Python using pip3; TensorFlow Lite was built from source as a static library. The TensorFlow version installed in Python is 1.11.0.

I am getting this error: AttributeError: module 'tensorflow.contrib.lite.python.lite' has no attribute 'TFLiteConverter'.

Here is the Python code (derived from https://www.tensorflow.org/lite/convert/python_api):

#!/usr/bin/python3
import tensorflow as tf

print("tf version = " + tf.__version__)

graph_def_file = "/home/pi/sols/demo/src/image_classification/network/fruit_models/frozen_graph.pb"
input_arrays = ["X"]          # name of the input tensor
output_arrays = ["softmax"]   # name of the output tensor

converter = tf.contrib.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file, input_arrays, output_arrays)
tflite_model = converter.convert()
open("converted_model.tflite", "wb").write(tflite_model)

Please help!

3 Answers


TFLiteConverter and TocoConverter seem to be problematic on every OS except macOS. You can still convert the model to TensorFlow Lite using the following steps:

  1. Create a new Google Colab notebook
  2. Write the code to convert the model: import TensorFlow, create the converter, and call convert() (a sketch is shown below).
  3. Upload the model in the notebook.
  4. Run the notebook.
  5. Download the generated TFLite file.

You can use this notebook.
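For reference, the conversion cell in such a notebook would look roughly like this. This is only a sketch: it assumes a TF 1.x Colab runtime (1.13 or later, where tf.lite.TFLiteConverter.from_frozen_graph is available), and the file name and tensor names are the ones from the question.

import tensorflow as tf

# Upload frozen_graph.pb to the Colab session first; adjust the path
# and tensor names to match your model.
graph_def_file = "frozen_graph.pb"
input_arrays = ["X"]
output_arrays = ["softmax"]

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file, input_arrays, output_arrays)
tflite_model = converter.convert()

with open("converted_model.tflite", "wb") as f:
    f.write(tflite_model)

When the notebook has run, download converted_model.tflite and copy it to the Raspberry Pi.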


Comments

Thanks, but it didn't help. The call converter.convert() gives this error: RuntimeError: TOCO failed. See console for info. ... tensorflow/contrib/lite/toco/graph_transformations/resolve_constant_random_uniform.cc:85] RandomUniform op outputting "dropout/random_uniform/RandomUniform" is truly random (using /dev/random system entropy). Therefore, cannot resolve as constant. Set "seed" or "seed2" attr non-zero to fix this. Any idea how this can be fixed?
Did you try it in Google Colab?
Yes, I did. The above error message is from Google Colab.
Did you use LSTM, BatchNormalization, or some other layer rather than Dense or Conv layers?

I think the problem is that the docs reflect the latest release. In 1.11.0 the converter class probably had a different name and was only later renamed to TFLiteConverter.
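If that is the case, a single script that runs on both 1.11 and later 1.x releases could fall back to the old name when the new one is missing. This is only a sketch, assuming tf.contrib.lite is available (i.e. any TF 1.x), with the path and tensor names taken from the question:

import tensorflow as tf

lite = tf.contrib.lite
# TFLiteConverter only appears in newer releases (around 1.12); older
# ones expose the same functionality as TocoConverter.
Converter = getattr(lite, "TFLiteConverter", lite.TocoConverter)

converter = Converter.from_frozen_graph(
    "/home/pi/sols/demo/src/image_classification/network/fruit_models/frozen_graph.pb",
    ["X"], ["softmax"])
tflite_model = converter.convert()
open("converted_model.tflite", "wb").write(tflite_model)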



It seems to depend on the TensorFlow version you are using:

>>> import tensorflow as tf
>>> dir(tf.contrib.lite)
['DecodeError', 'Interpreter', 'OpHint', 'PY3', 'TocoConverter', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__spec__', '_freeze_graph', '_freeze_saved_model', '_get_tensors_from_tensor_names', '_global_variables_initializer', '_graph_pb2', '_import_graph_def', '_is_frozen_graph', '_keras', '_session', '_set_tensor_shapes', '_signature_constants', '_tag_constants', '_tensor_name', '_text_format', '_tf_graph_util', 'absolute_import', 'build_toco_convert_protos', 'constants', 'convert_op_hints_to_stubs', 'division', 'print_function', 'toco_convert', 'toco_convert_protos']
>>> tf.__version__
'1.10.0'

So in older versions the class is called TocoConverter:

https://www.tensorflow.org/api_docs/python/tf/contrib/lite/TocoConverter
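So on 1.10/1.11 the same conversion should work by just swapping the class name, e.g. (this is a sketch; the arguments are copied from the question's snippet):

import tensorflow as tf

# Same call as in the question, but through the pre-1.12 class name.
converter = tf.contrib.lite.TocoConverter.from_frozen_graph(
    "frozen_graph.pb", ["X"], ["softmax"])
tflite_model = converter.convert()
open("converted_model.tflite", "wb").write(tflite_model)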

