Auto Differentiation

http://d2l.ai/chapter_preliminaries/autograd.html

Hi, I have 2 questions:

  • In question 3: as I understand it, the derivative of a vector with respect to a vector should be a matrix; since f is a vector-valued function of a, I expected a matrix but got a vector instead, which I don't understand (see the Jacobian sketch at the end of this post).
  • In question 5: I have a small problem: my intention is to plot x_grad after the GradientTape, but I keep getting an error. Hope somebody can help me out.
import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np
import math

x = tf.range(-10, 10, 1, dtype=tf.float32)
x = tf.Variable(x)

with tf.GradientTape() as t:
    y = np.sin(x)

x_grad = t.gradient(y, x)

# Plotting
x = np.arange(-10, 10, 0.1)
plt.figure(1)
plt.plot(x, np.sin(x), 'r')
plt.plot(x, np.array(x_grad), 'g')
plt.show()

# Even when I try to assign values to y, I get a None result
y = tf.Variable(tf.zeros_like(x))
with tf.GradientTape() as t:
    for i in range(tf.size(y)):
        y[i].assign(math.sin(x[i]))
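
Here is a minimal sketch of what I am seeing for question 3 (f = a * a is just a stand-in, not the exercise's exact function): t.gradient returns a vector, while t.jacobian seems to give the matrix I expected, so maybe t.gradient implicitly sums the components of a vector-valued target?

import tensorflow as tf

a = tf.Variable(tf.range(4, dtype=tf.float32))

with tf.GradientTape(persistent=True) as t:
    f = a * a                  # stand-in for a vector-valued function of a

# t.gradient sums over the components of f, so it gives a vector (here 2*a)
print(t.gradient(f, a))        # [0. 2. 4. 6.]
# t.jacobian gives the full matrix df_i/da_j (here a 4x4 diagonal matrix)
print(t.jacobian(f, a))
del t                          # release the persistent tape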

Thank you :smiley:

For question 5, you can plot x_grad directly with plt.plot, without converting it to a NumPy array first.
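
For example, something like this should work (a small sketch, assuming x_grad was computed from tf.math.sin rather than np.sin):

import tensorflow as tf
import matplotlib.pyplot as plt

x = tf.Variable(tf.range(-10.0, 10.0, 0.1))
with tf.GradientTape() as t:
    y = tf.math.sin(x)
x_grad = t.gradient(y, x)

# matplotlib converts the eager tensor on its own, so no NumPy conversion is needed
plt.plot(x.numpy(), x_grad, 'g')
plt.show()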

Hi @randomonlinedude, to calculate second-order gradients, you can use tf.hessians() :wink:
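
A nested GradientTape is another way to get a second derivative in eager mode, e.g. for the sin example in this exercise (a small sketch):

import tensorflow as tf

x = tf.Variable(tf.range(-10.0, 10.0, 0.1))

with tf.GradientTape() as t2:
    with tf.GradientTape() as t1:
        y = tf.math.sin(x)
    dy_dx = t1.gradient(y, x)      # first derivative: cos(x)

d2y_dx2 = t2.gradient(dy_dx, x)    # second derivative: -sin(x)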

For question 5, I also ran into some similar problems, like:

TypeError: ResourceVariable doesn't have attribute ....
y is a None object.

I think the reason is that NumPy ndarrays are different from TensorFlow tensors and have different attributes, so we can't mix them up. For example, if y is computed as a NumPy array, t.gradient(y, x) will return a None object.
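
For instance, here is a small sketch of the difference (reproducing the behaviour described above):

import numpy as np
import tensorflow as tf

x = tf.Variable(tf.range(-10.0, 10.0, 0.1))

with tf.GradientTape(persistent=True) as t:
    y_np = np.sin(x)         # computed by NumPy: the tape cannot record this
    y_tf = tf.math.sin(x)    # computed by a TensorFlow op: recorded on the tape

print(t.gradient(y_np, x))   # None, since y_np is not connected to the tape
print(t.gradient(y_tf, x))   # cos(x) values
del t
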
Also, you can refer to my code below:

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt

x = tf.range(-10, 10, 0.1)
x = tf.Variable(x)

with tf.GradientTape() as t:
    y = tf.math.sin(x)   # a TensorFlow op, so the tape records it

x_grad = t.gradient(y, x)

# plotting
x = np.arange(-10, 10, 0.1)
plt.figure(1)
plt.plot(x, np.sin(x), color='r')
plt.plot(x, x_grad.numpy(), color='g')
plt.show()

It works on my local machine.
