I’m at Chapter 4.1.
Following the example of plotting ReLU, I wonder how to plot PReLU. I found that `torch.prelu` is a built-in function; however, I cannot find it in the official documentation, only `nn.PReLU`, so I don’t know the correct usage, mainly how to pass the weight to it:
```python
x = torch.arange(-8.0, 8.0, 0.1, requires_grad=True)
weight = torch.ones_like(x) * 0.1
y = torch.prelu(x, weight)
d2l.plot(x.detach(), y.detach(), 'x', 'prelu(x)', figsize=(5, 2.5))
```
I want all the weights to be 0.1, but the code above doesn’t work. Thank you.
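Update: my guess (not confirmed by the docs, since I can’t find a page for `torch.prelu`) is that the weight should be a one-element tensor holding a single negative slope, rather than one weight per input element. This sketch reflects that assumption:

```python
import torch

# Assumption: torch.prelu takes a 1-element weight tensor
# (one shared negative slope), not an elementwise weight.
x = torch.arange(-8.0, 8.0, 0.1)
weight = torch.tensor([0.1])  # single negative-slope value
y = torch.prelu(x, weight)    # y = x for x >= 0, 0.1 * x for x < 0
```

With this, plotting via `d2l.plot(x, y, 'x', 'prelu(x)', figsize=(5, 2.5))` should work the same way as the ReLU example.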
Besides, why can’t I find documentation for `torch.prelu`? What’s the difference between `torch.prelu` and `nn.PReLU`?