Created October 9, 2019 03:45
# tflite instance norm generation (Python)
```python
## tested on tf 2.0
from tensorflow import keras
import tensorflow as tf
from tensorflow.keras import layers
import tensorflow_addons as tfa

# Simple MLP with an InstanceNormalization layer from TensorFlow Addons
inputs = keras.Input(shape=(784,), name='digits')
x = tfa.layers.normalizations.InstanceNormalization()(inputs)
outputs = layers.Dense(10, activation='softmax', name='predictions')(x)
model = keras.Model(inputs=inputs, outputs=outputs, name='3_layer_mlp')
model.summary()
model.save('instance_norm.h5')

# Convert the Keras model to TFLite and write it to disk
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open('instance_norm.tflite', 'wb') as f:
    f.write(tflite_model)
```
Thanks a lot!
Hi chunseoklee,
Did you test the converted model on device or with a Python test?
Even though the conversion succeeds, the model breaks during inference because of the instance normalization op.
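A quick way to check whether the converted model actually runs (rather than just converts) is to feed it one input through the TFLite Python interpreter. A minimal sketch, assuming the `instance_norm.tflite` file produced by the script above; the random input is only a placeholder:

```python
import numpy as np

def run_tflite(model_path="instance_norm.tflite"):
    """Run one random input through a converted .tflite model."""
    import tensorflow as tf  # imported here so the helper stays self-contained
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    x = np.random.rand(*input_details[0]["shape"]).astype(np.float32)
    interpreter.set_tensor(input_details[0]["index"], x)
    interpreter.invoke()  # an unsupported op would raise here, not at convert time
    return interpreter.get_tensor(output_details[0]["index"])
```

If the instance-norm op is unsupported by the interpreter build, the failure shows up at `invoke()`, which is why conversion alone is not a sufficient test.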
I do not recall. If instance norm is split into base ops (like add, mul, ...), I am sure it will run on devices with the latest TFLite interpreter.
For broadcast binaryop :