SciTech-BigDataAIML-Tensorflow-Introduction to graphs and tf.function

  • Graphs are data structures that contain:

    1. a set of tf.Operation objects,
      which represent units of computation;
    2. and tf.Tensor objects,
      which represent the units of data that flow between operations.
  • Graphs are defined in a tf.Graph context. Since these graphs are data structures, they can be saved, run, and restored all without the original Python code.
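
For concreteness, here is a minimal sketch (the function name dense_layer and the shapes are illustrative assumptions, not from the original post) of how a Python function wrapped in tf.function is traced into a tf.Graph, and how that graph's tf.Operation and tf.Tensor objects can be inspected:

```python
import tensorflow as tf

# A plain Python function; tf.function wraps it so calls are traced into a graph.
@tf.function
def dense_layer(x, w, b):
    return tf.nn.relu(tf.matmul(x, w) + b)

x = tf.ones([1, 3])
w = tf.ones([3, 2])
b = tf.zeros([2])

# Calling the wrapped function traces it and then runs the resulting graph.
print(dense_layer(x, w, b))

# Retrieve the concrete (traced) function for these inputs and inspect its tf.Graph.
concrete_fn = dense_layer.get_concrete_function(x, w, b)
graph = concrete_fn.graph

# tf.Operation objects are the units of computation in the graph ...
print([op.name for op in graph.get_operations()])
# ... and tf.Tensor objects are the units of data flowing between them.
print([t for op in graph.get_operations() for t in op.outputs][:3])
```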

  • The benefits of graphs:
    In short, graphs are extremely useful and let your TensorFlow run fast, run in parallel, and run efficiently on multiple devices.
    However, you still want to define your machine learning models (or other computations) in Python for convenience, and then automatically construct graphs when you need them.

    1. TensorFlow uses graphs as the format for saved models when it exports them from Python (see the sketch after this list).
    2. With a graph, you have a great deal of flexibility. You can use your TensorFlow graph in environments that don't have a Python interpreter, like mobile applications, embedded devices, and backend servers.
    3. Graphs are also easily optimized, allowing the compiler to do transformations like:
      - Statically infer the value of tensors by folding constant nodes in your computation ("constant folding").
      - Separate sub-parts of a computation that are independent and split them between threads or devices.
      - Simplify arithmetic operations by eliminating common subexpressions.
      There is an entire optimization system, Grappler, to perform this and other speedups.
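
To make the first two points concrete, here is a hedged sketch (the module name Adder, the method name add_one, and the /tmp/adder path are assumptions for illustration) of exporting a traced graph as a SavedModel and loading it back without the original Python function:

```python
import tensorflow as tf

class Adder(tf.Module):
    # A tf.function attached to a tf.Module; its traced graph is what gets exported.
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
    def add_one(self, x):
        return x + 1.0

module = Adder()
tf.saved_model.save(module, "/tmp/adder")        # graphs are the export format

# The reloaded object carries the serialized graphs, not the Python function bodies,
# so the same artifact can be served from a backend or converted for mobile/embedded use.
restored = tf.saved_model.load("/tmp/adder")
print(restored.add_one(tf.constant([1.0, 2.0])))  # tf.Tensor([2. 3.], shape=(2,), dtype=float32)
```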