Dive into Deep Learning: Basic Tensor Operations

Basic Tensor Operations

Tensors

A tensor can be viewed as a multi-dimensional array, a matrix generalized to higher dimensions; it supports a wide range of data manipulations and mathematical operations.

import torch
torch.tensor([[1., -1.], [1., -1.]])  # build a 2x2 tensor from a nested Python list

Creating a tensor

tensor([[ 1., -1.],
        [ 1., -1.]])
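
As a brief aside (a sketch of other common constructors, not in the original post):

torch.zeros(2, 2)          # 2x2 tensor of zeros
torch.arange(4)            # tensor([0, 1, 2, 3])
torch.randn(2, 2)          # 2x2 tensor of standard-normal samples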
a = torch.randn(2, 3)      # 2x3 tensor drawn from a standard normal distribution
torch.sigmoid(a)           # returns a new tensor; a itself is unchanged
a

Operating on tensors

tensor([[-0.1690, -0.2554, -0.4382],
        [-1.0814, -2.1793, -1.0939]])
torch.argmax(a)            # index of the maximum in the flattened tensor
torch.argmax(a, dim=1)     # index of the maximum within each row
torch.sum(a)               # sum of all elements
tensor(-5.2172)
torch.index_select(a, dim=1, index=torch.tensor([0, 2]))  # select columns 0 and 2
tensor([[-0.1690, -0.4382],
        [-1.0814, -1.0939]])
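
As a side note (a sketch, not in the original post), the same column selection can also be written with ordinary indexing:

a[:, [0, 2]]               # equivalent to the index_select call above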
loss1 = torch.tensor(3)
loss1.shape                # torch.Size([]): a zero-dimensional (scalar) tensor
loss1.item()               # convert the one-element tensor to a plain Python number

Extracting a value

3
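
Note that item() only works on one-element tensors; a small sketch (the name t is illustrative):

t = torch.tensor([1, 2, 3])
# t.item() would raise a RuntimeError here, since t has more than one element
t.tolist()                 # [1, 2, 3]: use tolist() for multi-element tensors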

Automatic differentiation

a = torch.randn(2, 3, requires_grad=True)
loss = a.sum()             # scalar output
loss.backward()            # populate a.grad with d(loss)/da
a.grad
tensor([[1., 1., 1.],
        [1., 1., 1.]])
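
Since loss is the sum of all entries of a, each entry contributes exactly once, so every partial derivative d(loss)/d(a[i][j]) equals 1, which is why the gradient is all ones.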
a = torch.randn(2, 3, requires_grad=True)
a

Enabling gradient tracking

tensor([[-0.2077, -0.9880, -0.1491],
        [ 0.5216, -0.2129,  0.4189]], requires_grad=True)
loss = a.sum(dim=0)        # sum over rows -> one value per column, shape (3,)
loss.shape
loss
tensor([ 0.3139, -1.2009,  0.2697], grad_fn=<SumBackward1>)

When the output is a vector rather than a scalar, backward needs an explicit gradient argument; the vector passed in can be viewed as the partial derivatives flowing down from the layer above (the upstream gradient in a vector-Jacobian product).
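
A minimal sketch continuing the code above (assuming no earlier backward call on this a): passing an all-ones vector as the gradient argument reproduces the gradient of loss.sum().

loss.backward(gradient=torch.ones(3))  # one upstream-gradient entry per column sum
a.grad                     # all ones: each a[i][j] feeds exactly one column sum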

a = torch.randn(2, 3, requires_grad=True)
loss = a.abs().sum()       # sum of absolute values, a positive scalar
while loss < 100:          # a 0-dim tensor compares like a Python number
    loss = loss * 2
loss
tensor(181.8438, grad_fn=<MulBackward0>)
loss.backward()
a.grad
tensor([[ 64., -64., -64.],
        [-64., -64.,  64.]])
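
Here the loop happened to double loss six times before it reached 100 (2^6 = 64), and the derivative of |a| is sign(a), so every gradient entry is ±64 with the sign of the corresponding element of a.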
b = a.detach()             # b shares a's storage but is detached from the graph
b.requires_grad
False

Calling the detach method explicitly tells the autograd engine to exclude a tensor from the tracked computation graph.
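
A related sketch (not in the original post): gradient tracking can also be suspended for a whole block with the torch.no_grad() context manager.

with torch.no_grad():      # operations in this block are not recorded by autograd
    c = a * 2
c.requires_grad            # False: c was created outside the graph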
