@thiagopbueno
Last active September 17, 2020 00:13
TensorFlow Autograph

Tips

  • Pass numeric arguments as tf.Tensor objects (to avoid building new graphs and unnecessary retracing)
  • Beware of hidden side effects
  • Pay attention to undefined values (special care with if/else statements and asymmetric returns)
  • Pay attention to lexical/dynamical scopes
  • Do not create object attributes inside autographed functions (mutations lead to undefs)
  • Avoid using Python lists; prefer tf.TensorArray
  • To accumulate results from a dynamically unrolled loop, use tf.TensorArray
  • Use tf.TensorSpec(shape=..., dtype=...) in input_signature to avoid proliferation of graphs
  • Avoid creating stateful objects inside autographed functions (define tf.Variable outside and pass as argument or alternatively rely on lexical scope and closures)
  • Only use Python side effects to debug your traces
  • Iterate over Python data by wrapping it in a tf.data.Dataset (e.g., tf.data.Dataset.from_generator(...) or tf.data.Dataset.from_tensors(...)) and leveraging the for x in y idiom; if possible read data from files via TFRecordDataset/CsvDataset/etc.
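A minimal sketch of the tf.TensorArray tip above (function name is illustrative): accumulating results from a dynamically unrolled loop with tf.TensorArray instead of a Python list.

```python
import tensorflow as tf

@tf.function
def cumulative_squares(n):
  # tf.TensorArray accumulates values across iterations of a dynamically
  # unrolled loop; appending to a Python list here would not survive tracing.
  ta = tf.TensorArray(tf.int32, size=0, dynamic_size=True)
  for i in tf.range(n):  # tensor iterable -> tf.while_loop
    ta = ta.write(i, i * i)
  return ta.stack()

print(cumulative_squares(tf.constant(4)).numpy())  # [0 1 4 9]
```

Passing `n` as a tf.Tensor (rather than a Python int) also follows the first tip: new Python values would otherwise trigger retracing.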

Notes

  • Ordering of stateful operations in a tf.function replicates the semantics of Eager mode; there's no need to add manual control dependencies
  • In a Python if statement, if the condition is a tf.Tensor then the original statement is converted to tf.cond. Otherwise, the conditional is simply executed during tracing
  • In a Python if statement, if one branch creates a tf.Tensor used downstream, the other branch must also create it
  • If you have a break or early return clause that depends on a tf.Tensor, the top-level condition or iterable should also be a tensor
  • The shapes/dtypes of all loop variables must stay consistent across iterations
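A sketch of the tf.cond note above (function name is illustrative): because the condition is a tf.Tensor, AutoGraph converts the `if` to tf.cond, and both branches must define the value used downstream.

```python
import tensorflow as tf

@tf.function
def relu_like(x):
  # x > 0 is a tf.Tensor, so this `if` becomes tf.cond at trace time;
  # both branches must define `y`, or tracing fails with an undefined value.
  if x > 0:
    y = x
  else:
    y = tf.zeros_like(x)
  return y

print(relu_like(tf.constant(-3.0)).numpy())  # 0.0
```

If the condition were a plain Python bool instead, only the taken branch would be traced into the graph.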

Code

Set autograph verbosity level

tf.autograph.set_verbosity(level, alsologtostdout=False)  # level: 0 (off) through 10 (most verbose)
export AUTOGRAPH_VERBOSITY=5  # alternatively, set via the environment variable

Force eager execution

# all tf.function graphs will be eagerly executed
tf.config.experimental_run_functions_eagerly(True)

# this function does not get autographed (no control-flow conversion)
@tf.function(autograph=False)
def f():
  pass
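A sketch of what opting out implies (function name is illustrative): with autograph=False the body is traced as-is, so Python control flow on tensors is not converted; use TF ops such as tf.cond or tf.where directly.

```python
import tensorflow as tf

# With autograph=False a Python `if tensor:` would fail during tracing,
# so tensor-dependent branching is expressed with tf.where instead.
@tf.function(autograph=False)
def clip_negative(x):
  return tf.where(x < 0, tf.zeros_like(x), x)

print(clip_negative(tf.constant([-1.0, 2.0])).numpy())  # [0. 2.]
```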

Concrete functions

@tf.function
def f(x):
  pass
  
f_int32 = f.get_concrete_function(tf.TensorSpec(shape=[None], dtype=tf.int32))
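A runnable sketch of the pattern above (the `double` function is illustrative): a tf.TensorSpec with shape=[None] yields one concrete function that covers int32 vectors of any length, instead of one graph per shape.

```python
import tensorflow as tf

@tf.function
def double(x):
  return x * 2

# One traced graph per input signature; shape=[None] matches any vector length.
f_int32 = double.get_concrete_function(tf.TensorSpec(shape=[None], dtype=tf.int32))

print(f_int32(tf.constant([1, 2, 3])).numpy())  # [2 4 6]
```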

Inspection in AutoGraph

# f must be the plain Python function; for a tf.function, pass f.python_function
print(tf.autograph.to_code(f))

Detecting static unrolling of for loops

def test_dynamically_unrolled(f, *args):
  g = f.get_concrete_function(*args).graph
  if any(node.name == 'while' for node in g.as_graph_def().node):
    print("{}({}) uses tf.while_loop.".format(
        f.__name__, ', '.join(map(str, args))))
  elif any(node.name == 'ReduceDataset' for node in g.as_graph_def().node):
    print("{}({}) uses tf.data.Dataset.reduce.".format(
        f.__name__, ', '.join(map(str, args))))
  else:
    print("{}({}) gets unrolled.".format(
        f.__name__, ', '.join(map(str, args))))
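A self-contained sketch of the distinction the helper above detects (function names are illustrative): iterating over a Python `range` is statically unrolled at trace time, while iterating over `tf.range` becomes a tf.while_loop.

```python
import tensorflow as tf

@tf.function
def for_in_range():
  x = 0
  for i in range(5):  # Python iterable: statically unrolled at trace time
    x += i
  return x

@tf.function
def for_in_tfrange():
  x = tf.constant(0)
  for i in tf.range(5):  # tensor iterable: converted to tf.while_loop
    x += i
  return x

# Same check the helper performs: the converted loop shows up as a
# 'while' node in the traced graph.
g = for_in_tfrange.get_concrete_function().graph
print(any(node.name == 'while' for node in g.as_graph_def().node))  # True
```

With the helper above, `for_in_range` would report "gets unrolled" and `for_in_tfrange` would report "uses tf.while_loop".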
