[Practice] TensorFlow Course Note 1
Preface
This note follows the open course at https://github.com/instillai/TensorFlow-Course. Written by Harry, Dec. 8th.
Automatic Differentiation
Gradient Calculation
There are two main ways to compute a gradient in TensorFlow. The first wraps the input in a tf.Variable, which the tape tracks automatically:
```python
import tensorflow as tf

x = tf.constant([2.0])
x = tf.Variable(x)

with tf.GradientTape(persistent=False, watch_accessed_variables=True) as grad:
    f = x ** 2

# Print gradient output
print('The gradient df/dx where f=(x^2):\n', grad.gradient(f, x))
```
The second uses a plain tensor and tells the tape to watch it explicitly:
```python
import tensorflow as tf

x = tf.constant([2.0])

with tf.GradientTape(persistent=False, watch_accessed_variables=True) as grad:
    grad.watch(x)
    f = x ** 2

# Print gradient output
print('The gradient df/dx where f=(x^2):\n', grad.gradient(f, x))
```
Both print the same result:
```
The gradient df/dx where f=(x^2):
 tf.Tensor([4.], shape=(1,), dtype=float32)
```
TensorFlow recommends the first method, which makes x a tf.Variable.
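Beyond a single source, `grad.gradient` also accepts a list of variables and returns one gradient per entry. A minimal sketch (the function f = x*y and the variable names are illustrative):

```python
import tensorflow as tf

x = tf.Variable(2.0)
y = tf.Variable(3.0)

with tf.GradientTape() as grad:
    f = x * y  # df/dx = y and df/dy = x

# Passing a list of sources returns a matching list of gradients
df_dx, df_dy = grad.gradient(f, [x, y])
print(df_dx.numpy())  # 3.0
print(df_dy.numpy())  # 2.0
```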
Class: tf.GradientTape
Argument: persistent
It is a boolean argument; here is an example with persistent=True:
```python
import tensorflow as tf

x = tf.Variable([2.0])

with tf.GradientTape(persistent=True, watch_accessed_variables=True) as grad:
    f = x ** 2
    h = x ** 3

# Print gradient output: persistent=True allows calling gradient() twice
print('The gradient df/dx where f=(x^2):\n', grad.gradient(f, x))
print('The gradient dh/dx where h=(x^3):\n', grad.gradient(h, x))
```
```
The gradient df/dx where f=(x^2):
 tf.Tensor([4.], shape=(1,), dtype=float32)
The gradient dh/dx where h=(x^3):
 tf.Tensor([12.], shape=(1,), dtype=float32)
```
- True: gradient() may be called more than once on the same tape within the GradientTape block's recording.
- False: gradient() may be called only once; the tape's resources are released after the first call.
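With persistent=False, a second gradient() call on the same tape raises a RuntimeError because the tape's resources are released after the first call. A minimal sketch of that failure mode:

```python
import tensorflow as tf

x = tf.Variable(2.0)

with tf.GradientTape(persistent=False) as grad:
    f = x ** 2
    h = x ** 3

# First call works because the tape has not been consumed yet
g1 = grad.gradient(f, x)
print('df/dx =', g1.numpy())  # 4.0

# Second call on a non-persistent tape raises RuntimeError
second_call_failed = False
try:
    grad.gradient(h, x)
except RuntimeError:
    second_call_failed = True
print('second call failed:', second_call_failed)
```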
Argument: watch_accessed_variables
- True: the tape tracks any accessed tf.Variable automatically.
- False: nothing is tracked automatically; you must track x explicitly via grad.watch(x).
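With watch_accessed_variables=False, even a tf.Variable is ignored unless you call grad.watch on it, and the gradient for an unwatched source comes back as None. A minimal sketch:

```python
import tensorflow as tf

x = tf.Variable(2.0)
y = tf.Variable(3.0)

with tf.GradientTape(watch_accessed_variables=False) as grad:
    grad.watch(x)        # only x is tracked
    f = x ** 2 + y ** 2

df_dx, df_dy = grad.gradient(f, [x, y])
print(df_dx.numpy())  # 4.0
print(df_dy)          # None: y was never watched
```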