How Do I Convert a TensorFlow 2.0 Estimator Model to TensorFlow Lite?
The code I have below produces a regular TensorFlow model, but when I try to convert it to TensorFlow Lite it doesn't work. I followed this documentation: https
Solution 1:
Try to use a concrete function:
export_dir = "tmp"
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
    tf.feature_column.make_parse_example_spec(feat_cols))
estimator.export_saved_model(export_dir, serving_input_fn)
# Convert the model.
saved_model_obj = tf.saved_model.load(export_dir="tmp/1571728920/")
concrete_func = saved_model_obj.signatures['serving_default']
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
# print(saved_model_obj.signatures.keys())
# converter.optimizations = [tf.lite.Optimize.DEFAULT]
# converter.experimental_new_converter = True
tflite_model = converter.convert()
serving_default is the default signature key in a SavedModel.
If that doesn't work, try uncommenting converter.experimental_new_converter = True and the two commented lines above it.
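Once convert() succeeds, the returned bytes can be written to disk and exercised with tf.lite.Interpreter. A minimal sketch of that round trip, using a tiny Keras model as a stand-in for the exported estimator (the save/load steps are the same either way):

```python
import tensorflow as tf

# Stand-in model (not the estimator from the question): one Dense layer
# taking 4 features, just so the conversion can run end to end.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

# Convert to TF Lite flatbuffer bytes.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Write the flatbuffer to disk (hypothetical filename).
with open("model.tflite", "wb") as f:
    f.write(tflite_model)

# Load it back and prepare it for inference.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
```

The interpreter's input details report the expected tensor shape, which is a quick sanity check that the conversion preserved the model's signature.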
Short explanation
Based on the Concrete functions guide:
Eager execution in TensorFlow 2 evaluates operations immediately, without building graphs. To save the model you need a graph wrapped in a Python callable: a concrete function.
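As a small illustration of that idea (the function below is hypothetical, not part of the question's model): tracing a tf.function with a fixed input signature yields a concrete function, i.e. a single graph-backed callable that tf.lite.TFLiteConverter.from_concrete_functions can consume.

```python
import tensorflow as tf

# An eager Python function wrapped into a graph via tf.function.
@tf.function
def double(x):
    return x * 2.0

# Tracing with a fixed dtype/shape produces one concrete function:
# a graph specialized for float32 vectors of length 2.
concrete = double.get_concrete_function(tf.TensorSpec([2], tf.float32))

# The concrete function is still callable like the original...
result = concrete(tf.constant([1.0, 2.0]))

# ...and is exactly what the TF Lite converter expects.
tflite_bytes = tf.lite.TFLiteConverter.from_concrete_functions([concrete]).convert()
```

This is the same mechanism the answer relies on: saved_model_obj.signatures['serving_default'] is just such a concrete function, pre-built when the estimator was exported.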