SciTech-BigDataAIML-Tensorflow-Introduction to modules, layers, and models

Introduction to modules, layers, and models
Model: To do machine learning in TensorFlow, you are likely to need to define, save, and restore a model.

A model is, abstractly:
- A function that computes something on tensors (a forward pass)
- Some variables that can be updated in response to training
In this guide, you will go below the surface of Keras to see how TensorFlow models are defined.
This looks at how TensorFlow collects variables and models, as well as how they are saved and restored.

- Most models are made of layers. Layers are functions with a known mathematical structure that can be reused and have trainable variables.
- In TensorFlow, most high-level implementations of layers and models are built on the same foundational class: tf.Module.
- Modules and, by extension, layers are deep-learning terminology for "objects": they have internal state, and methods that use that state.
- Note: tf.Module is the base class for both tf.keras.layers.Layer and tf.keras.Model, so everything you come across here also applies in Keras. For historical compatibility reasons, Keras layers do not collect variables from modules, so your models should use only modules or only Keras layers. However, the methods shown below for inspecting variables are the same in either case.
- By subclassing tf.Module, any tf.Variable or tf.Module instances assigned to this object's properties are automatically collected. This allows you to save and load variables, and also create collections of tf.Modules.
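The automatic collection is what makes saving and loading work: anything a checkpoint can find by walking the module's attributes can be written to disk and restored later. A minimal sketch using tf.train.Checkpoint (the module and variable names here are illustrative, not from the original text):

```python
import os
import tempfile
import tensorflow as tf

class SimpleModule(tf.Module):
  def __init__(self, name=None):
    super().__init__(name=name)
    self.a_variable = tf.Variable(5.0, name="train_me")

module = SimpleModule(name="simple")
module.a_variable.assign(42.0)

# Write the module's collected variables to a checkpoint file.
path = os.path.join(tempfile.mkdtemp(), "ckpt")
tf.train.Checkpoint(module=module).write(path)

# Restore into a fresh instance: the variable value comes back.
restored = SimpleModule(name="simple")
tf.train.Checkpoint(module=restored).read(path)
print(restored.a_variable.numpy())  # 42.0
```

Because tf.Module tracks variables by attribute name, the restored instance only needs the same structure as the saved one, not the same Python object.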
TensorFlow Modules

Building Modules
Here's an example of a very simple tf.Module that operates on a scalar tensor:
import tensorflow as tf

class SimpleModule(tf.Module):
  def __init__(self, name=None):
    super().__init__(name=name)
    self.a_variable = tf.Variable(5.0, name="train_me")
    self.non_trainable_variable = tf.Variable(5.0, trainable=False, name="do_not_train_me")

  def __call__(self, x):
    return self.a_variable * x + self.non_trainable_variable

simple_module = SimpleModule(name="simple")
simple_module(tf.constant(5.0))
There is nothing special about __call__ except that it makes the module act like a Python callable; you can invoke your models with whatever functions you wish.
You can set the trainability of variables on and off for any reason, including freezing layers and variables during fine-tuning.
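A quick way to see which variables were collected, and which of them are trainable, is through the trainable_variables and variables properties that tf.Module provides:

```python
import tensorflow as tf

class SimpleModule(tf.Module):
  def __init__(self, name=None):
    super().__init__(name=name)
    self.a_variable = tf.Variable(5.0, name="train_me")
    self.non_trainable_variable = tf.Variable(5.0, trainable=False, name="do_not_train_me")

module = SimpleModule(name="simple")

# Both variables are collected, but only the trainable one
# appears in trainable_variables.
print("trainable variables:", module.trainable_variables)
print("all variables:", module.variables)
```

An optimizer applies gradients only to trainable_variables, which is why flipping trainable=False is enough to freeze a variable during fine-tuning.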
This is an example of a two-layer linear layer model made out of modules.
# First a dense (linear) layer:
class Dense(tf.Module):
  def __init__(self, in_features, out_features, name=None):
    super().__init__(name=name)
    self.w = tf.Variable(
        tf.random.normal([in_features, out_features]), name='w')
    self.b = tf.Variable(tf.zeros([out_features]), name='b')

  def __call__(self, x):
    y = tf.matmul(x, self.w) + self.b
    return tf.nn.relu(y)

# And then the complete model, which makes two layer instances and applies them:
class SequentialModule(tf.Module):
  def __init__(self, name=None):
    super().__init__(name=name)
    self.dense_1 = Dense(in_features=3, out_features=3)
    self.dense_2 = Dense(in_features=3, out_features=2)

  def __call__(self, x):
    x = self.dense_1(x)
    return self.dense_2(x)

# You have made a model!
my_model = SequentialModule(name="the_model")

# Call it, with random results
print("Model results:", my_model(tf.constant([[2.0, 2.0, 2.0]])))
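Because SequentialModule assigns its Dense instances to attributes, tf.Module discovers them as submodules and gathers their variables recursively. A short sketch of inspecting the nested structure:

```python
import tensorflow as tf

class Dense(tf.Module):
  def __init__(self, in_features, out_features, name=None):
    super().__init__(name=name)
    self.w = tf.Variable(
        tf.random.normal([in_features, out_features]), name='w')
    self.b = tf.Variable(tf.zeros([out_features]), name='b')

  def __call__(self, x):
    return tf.nn.relu(tf.matmul(x, self.w) + self.b)

class SequentialModule(tf.Module):
  def __init__(self, name=None):
    super().__init__(name=name)
    self.dense_1 = Dense(in_features=3, out_features=3)
    self.dense_2 = Dense(in_features=3, out_features=2)

  def __call__(self, x):
    return self.dense_2(self.dense_1(x))

my_model = SequentialModule(name="the_model")

# The two Dense instances are discovered as submodules...
print("Submodules:", my_model.submodules)

# ...and all four variables (w and b from each layer) are collected.
for var in my_model.variables:
  print(var.name, var.shape)
```

This recursive collection is what lets you checkpoint or export the whole model by handling only the top-level module.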