Backward propagation paths with multiple losses

Reposted from: https://www.jb51.net/article/213149.htm

1. Multiple losses

x = torch.tensor(2.0, requires_grad=True)
y = x**2
z = x
# backward through the first loss
y.backward()
x.grad        # tensor(4.)
# backward through the second loss
z.backward()
x.grad        # tensor(5.)  <- gradients accumulate

 

Official documentation:

torch.autograd.backward(tensors, grad_tensors=None, retain_graph=None, create_graph=False, 
grad_variables=None)

Computes the sum of gradients of given tensors w.r.t. graph leaves. The graph is differentiated using the chain rule.

Gradients computed along different paths are accumulated into the tensor's .grad.
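Because of this accumulation, summing the losses and calling backward() once gives the same gradient as calling backward() on each loss separately. A minimal sketch (the shared intermediate h is illustrative, not from the original snippet); note that when the losses share an intermediate tensor, the first separate backward() call needs retain_graph=True:

```python
import torch

# Route 1: one backward pass over the summed loss
x = torch.tensor(2.0, requires_grad=True)
(x**2 + x).backward()          # d(x**2 + x)/dx = 2x + 1 = 5 at x = 2

# Route 2: separate backward calls through a shared intermediate.
# Because y and z both reuse h, the first call must keep the graph alive.
x2 = torch.tensor(2.0, requires_grad=True)
h = x2 * 1.0                   # shared non-leaf tensor (illustrative)
y = h**2
z = h
y.backward(retain_graph=True)  # x2.grad is now 4
z.backward()                   # adds dz/dx2 = 1, giving 5 in total
```

Both routes leave a gradient of 5 on the leaf; without retain_graph=True the second backward() would fail because the shared part of the graph is freed after the first call.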

2. Clearing gradients

Update 2022-05-20 ————————————————————————

https://blog.csdn.net/weixin_44132485/article/details/102869555

tensor.grad.zero_()  (the older Variable API is merged into Tensor since PyTorch 0.4, so e.g. x.grad.zero_())
or
optimizer.zero_grad()
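A short sketch of why clearing matters between backward passes (the loss and variable names are illustrative):

```python
import torch

w = torch.tensor(1.0, requires_grad=True)

# Without clearing, two backward passes accumulate: 3 + 3 = 6
for _ in range(2):
    (3 * w).backward()
g_accumulated = w.grad.clone()   # tensor(6.)

# Zeroing the gradient restores the fresh per-step value
w.grad.zero_()
(3 * w).backward()
g_fresh = w.grad.clone()         # tensor(3.)
```

In a real training loop one usually calls optimizer.zero_grad() at the top of each iteration, which zeroes .grad for every parameter the optimizer manages.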

 

posted @ 2021-11-11 22:20 lypbendlf