
@bholota
Last active March 7, 2018 13:17
import numpy as np
import tensorflow as tf

# manually put back imported modules - bug?
import tempfile
import subprocess
tf.contrib.lite.tempfile = tempfile
tf.contrib.lite.subprocess = subprocess

# Trivial graph: add the two elements of a 2-element float input vector.
with tf.name_scope("inputs"):
    X = tf.placeholder(tf.float32, [2], name="X-input")
with tf.name_scope("outputs"):
    Y = tf.add(X[0], X[1], name="Y-output")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Convert the GraphDef to a TensorFlow Lite flatbuffer via TOCO.
    tflite_model = tf.contrib.lite.toco_convert(sess.graph_def, [X], [Y])
    with open('fm.lite', "wb") as f:
        f.write(tflite_model)

bholota commented Mar 7, 2018

I'm getting java.lang.IllegalArgumentException: Invalid handle to Interpreter. when trying to use fm.lite in an Android app.

Android code:

void run(MappedByteBuffer byteBuffer) {
        Interpreter interpreter = new Interpreter(byteBuffer);
        float[] input = new float[]{1.0f, 1.0f};
        float output = 0.0f;
        interpreter.run(input, output);
        Log.e("TAG", "OUTPUT: " + output);
}
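One thing worth noting about the snippet above, independent of the invalid-handle error: Interpreter.run(Object input, Object output) writes the result into the output container in place, so the output must be an array (e.g. float[] or float[][]), never a primitive float, because Java passes primitives by value and the interpreter's write is lost. A minimal, library-free sketch of the difference (OutputDemo is a hypothetical name for illustration):

```java
public class OutputDemo {
    // Writing to a primitive parameter only changes the callee's copy;
    // the caller's variable is untouched.
    public static void writePrimitive(float out) {
        out = 2.0f; // lost when the method returns
    }

    // Writing through an array reference is visible to the caller --
    // this is the container shape Interpreter.run expects for its output.
    public static void writeArray(float[] out) {
        out[0] = 2.0f;
    }

    public static void main(String[] args) {
        float scalar = 0.0f;
        writePrimitive(scalar);

        float[] boxed = new float[]{0.0f};
        writeArray(boxed);

        System.out.println(scalar + " " + boxed[0]); // prints "0.0 2.0"
    }
}
```

So even once the model loads, `float output = 0.0f` would stay 0.0f; declaring `float[] output = new float[1]` and reading `output[0]` after `run` is the shape that works.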

It works properly with some sample models found on GitHub (after altering the input/output sizes).

I also tried freezing the graph, saving it as a .pb, and then converting it with toco:

toco --input_file=frozen_model.pb \
     --input_format=TENSORFLOW_GRAPHDEF \
     --output_format=TFLITE \
     --output_file=frozen_model.lite \
     --inference_type=FLOAT \
     --input_arrays=inputs/X-input \
     --output_arrays=outputs/Y-output \
     --input_shapes=2 --output_shapes=1
Unfortunately, the same error is thrown.
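One way to narrow down whether the converted file itself is the problem: a valid TFLite model is a flatbuffer carrying the file identifier "TFL3" at byte offset 4. If the bytes the Interpreter receives lack that identifier, loading fails and can surface as the invalid-handle error; on Android a common culprit is the asset being compressed by aapt before it is memory-mapped. A quick stand-alone check (class and file names are assumptions for illustration):

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class TfliteMagicCheck {
    // A TFLite flatbuffer stores its 4-byte file identifier "TFL3"
    // immediately after the 4-byte root offset, i.e. at offset 4.
    public static boolean hasTfliteMagic(byte[] model) {
        return model.length >= 8
                && model[4] == 'T' && model[5] == 'F'
                && model[6] == 'L' && model[7] == '3';
    }

    public static void main(String[] args) throws IOException {
        // Path is an assumption; pass the real model file as an argument.
        String path = args.length > 0 ? args[0] : "fm.lite";
        File f = new File(path);
        if (!f.exists()) {
            System.out.println("file not found: " + path);
            return;
        }
        byte[] model = Files.readAllBytes(f.toPath());
        System.out.println("TFLite magic present: " + hasTfliteMagic(model));
    }
}
```

If the check fails on the bytes read back on the device but passes on the file on disk, the asset is being mangled in packaging rather than in conversion.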
