Auto Differentiation

Hi, I have 2 questions:

  • In question 3: as I understand it, the derivative of a vector with respect to a vector should be a matrix. Since f is a vector-valued function of a, I expected a matrix, but I got a vector instead, which I don't understand.
  • In question 5: I have a small problem: I want to plot x_grad after computing it with GradientTape, but I keep getting an error. I hope somebody can help me out.
    import matplotlib.pyplot as plt
    import numpy as np
    import tensorflow as tf

    x = tf.Variable(tf.range(-10, 10, 1, dtype=tf.float32))

    with tf.GradientTape() as t:
        y = tf.sin(x)  # tf.sin stays on the tape; np.sin returns a NumPy array, so the gradient is None

    x_grad = t.gradient(y, x)

    plt.plot(x.numpy(), np.sin(x.numpy()), 'r')

    # Even when I try to assign values to y, I get a None result
    y = tf.Variable(tf.zeros_like(x))
    with tf.GradientTape() as t:
        for i in range(tf.size(y)):
Thank you :smiley:
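On question 3, a minimal sketch of what is likely happening (the setup here is assumed, not taken from the exercise): `GradientTape.gradient` of a vector-valued `y` implicitly sums over the outputs, so it returns a vector shaped like `x`, while `tape.jacobian` returns the full matrix of partial derivatives.

```python
import tensorflow as tf

x = tf.Variable([1.0, 2.0, 3.0])
with tf.GradientTape(persistent=True) as t:
    y = x * x  # a vector-valued function of a vector

# gradient() differentiates tf.reduce_sum(y), so the result is a vector, shape (3,)
grad = t.gradient(y, x)
# jacobian() gives the full matrix dy_i/dx_j, shape (3, 3)
jac = t.jacobian(y, x)
```

Here `grad` is `[2., 4., 6.]` and `jac` is the diagonal matrix with those entries, which matches the "derivative of a vector over a vector is a matrix" expectation.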

For question 5, you can plot x_grad directly with plt.plot; there is no need to convert it through NumPy first.
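A minimal working version of the question-5 snippet, assuming the goal is to plot sin(x) and its gradient on the same axes (the headless `Agg` backend is just so the sketch runs anywhere):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this line when running interactively
import matplotlib.pyplot as plt
import tensorflow as tf

x = tf.Variable(tf.range(-10.0, 10.0, 0.1))
with tf.GradientTape() as t:
    y = tf.sin(x)  # must be tf.sin: np.sin leaves the tape and gradient() returns None

x_grad = t.gradient(y, x)  # elementwise d sin(x)/dx = cos(x)

plt.plot(x.numpy(), y.numpy(), label="sin(x)")
plt.plot(x.numpy(), x_grad.numpy(), label="gradient")
plt.legend()
```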

Hi @randomonlinedude, to calculate the second gradients, you can use tf.hessians() :wink:
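In eager TF2, the same second derivative can also be computed with nested GradientTapes; a minimal sketch (the choice of sin here is just an example):

```python
import tensorflow as tf

x = tf.Variable([1.0, 2.0, 3.0])
with tf.GradientTape() as t2:
    with tf.GradientTape() as t1:
        y = tf.sin(x)
    dy_dx = t1.gradient(y, x)     # first derivative: cos(x)
d2y_dx2 = t2.gradient(dy_dx, x)   # second derivative: -sin(x)
```

The outer tape watches the operations the inner tape's gradient computation performs, which is why differentiating `dy_dx` again works.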