Summary: The role of the retain_graph parameter. Official definition: retain_graph (bool, optional) – If False, the graph used to compute the grad will be freed. Note that in nearly all cases setting this option to True is not needed and often can be worked around in a much more efficient way.
posted @ 2019-03-18 22:22 yangyuwen_yang
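A minimal sketch of what retain_graph changes in practice: by default the autograd graph is freed after the first backward pass, so a second backward through the same graph raises a RuntimeError unless retain_graph=True was passed. (The tensor values below are illustrative, not taken from the post.)

```python
import torch

x = torch.tensor([2.0], requires_grad=True)
y = x ** 2  # dy/dx = 2x = 4

# retain_graph=True keeps the graph alive for a second backward pass.
y.backward(retain_graph=True)
print(x.grad)  # tensor([4.])

# Without retain_graph=True above, this call would raise:
# "Trying to backward through the graph a second time ..."
y.backward()
print(x.grad)  # tensor([8.])  -- gradients accumulate across calls
```

Note that gradients accumulate into `.grad`, so the second call adds another 4 rather than overwriting the first result.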
Summary: The backward function. Official definition: torch.autograd.backward(tensors, grad_tensors=None, retain_graph=None, create_graph=False, grad_variables=None) Computes the sum of gradients of given tensors with respect to graph leaves.
posted @ 2019-03-18 16:09 yangyuwen_yang
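A short sketch of the grad_tensors argument from the definition above: when the output tensor is non-scalar, autograd needs a vector for the vector-Jacobian product, and passing ones yields the plain sum of gradients. (The example values here are assumptions for illustration.)

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * 2  # non-scalar output; y.backward() alone would error

# grad_tensors supplies the vector v in the vector-Jacobian product v^T J;
# ones_like(y) gives the ordinary elementwise gradient dy/dx = 2.
torch.autograd.backward([y], grad_tensors=[torch.ones_like(y)])
print(x.grad)  # tensor([2., 2., 2.])
```

Calling `y.backward(torch.ones_like(y))` is equivalent shorthand for the same thing.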