PyTorch RuntimeError
RuntimeError: bool value of Tensor with more than one value is ambiguous
I hit this error while running the code below. After searching online, I found that changing the check to if w1.grad is not None: makes the code run.
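The error itself comes from evaluating a Tensor with more than one element in a boolean context, which PyTorch refuses to do because the truth value would be ambiguous. A minimal sketch of the behaviour, using a throwaway 3-element tensor (the name t is just for illustration):

import torch

t = torch.randn(3)            # a tensor with more than one element
try:
    if t:                     # asking for its truth value is ambiguous
        pass
except RuntimeError as e:
    print(e)                  # "bool value of Tensor ... is ambiguous" (exact wording varies by version)

# The fix tests for None instead, which never tries to convert the Tensor to a bool:
print(t is not None)          # True, and no error is raised

The full script that triggered the error: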
# =========== PyTorch: Autograd ===========
# A PyTorch Variable is a node in a computational graph
# x.data is a Tensor
# x.grad is a Variable of gradients (same shape as x.data)
# x.grad.data is a Tensor of gradients
import torch
from torch.autograd import Variable

N, D_in, H, D_out = 64, 1000, 100, 10
x = Variable(torch.randn(N, D_in), requires_grad=False)
y = Variable(torch.randn(N, D_out), requires_grad=False)
w1 = Variable(torch.randn(D_in, H), requires_grad=True)
w2 = Variable(torch.randn(H, D_out), requires_grad=True)

learning_rate = 1e-6
for t in range(500):
    y_pred = x.mm(w1).clamp(min=0).mm(w2)
    loss = (y_pred - y).pow(2).sum()
    # should be: if w1.grad is not None:
    if w1.grad:
        w1.grad.data.zero_()
    if w2.grad:
        w2.grad.data.zero_()
    loss.backward()
    w1.data -= learning_rate * w1.grad.data
    w2.data -= learning_rate * w2.grad.data
print(loss)
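Why the script only crashes after the first iteration: on the first pass w1.grad is still None, so if w1.grad: quietly evaluates to False, but once loss.backward() has run, w1.grad holds a D_in x H Tensor and the same if raises the RuntimeError above. A minimal sketch of the corrected check in an isolated toy loop (the 4x3 weight and the 1e-2 step size are placeholders, not taken from the original script):

import torch
from torch.autograd import Variable

w1 = Variable(torch.randn(4, 3), requires_grad=True)   # small weight, size is arbitrary

for step in range(3):
    loss = w1.pow(2).sum()
    if w1.grad is not None:    # None on the first step, a 4x3 Tensor afterwards
        w1.grad.data.zero_()   # reset so gradients do not accumulate across steps
    loss.backward()
    w1.data -= 1e-2 * w1.grad.data
print(loss)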