April 2022 Archive

Summary: batch_size, seq, num_layers = 2, 3, 2; input_size, hidden_size = 2, 3; input = torch.randn(batch_size, seq, input_size); h_0 = torch.zeros(num_layers, batc… Read more
posted @ 2022-04-27 11:30 华小电 Views(50) Comments(0) Recommended(0)
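The excerpt above is cut off mid-expression. A minimal runnable sketch of what it appears to set up (an nn.RNN with batch_first=True and a zero initial hidden state; the exact layer configuration in the original post is an assumption) could look like this:

```python
import torch
import torch.nn as nn

# Shapes taken from the excerpt above
batch_size, seq, num_layers = 2, 3, 2
input_size, hidden_size = 2, 3

# Assumption: the post uses nn.RNN with batch_first=True,
# so the input layout is (batch, seq, feature)
rnn = nn.RNN(input_size=input_size, hidden_size=hidden_size,
             num_layers=num_layers, batch_first=True)

input = torch.randn(batch_size, seq, input_size)
# The initial hidden state has shape (num_layers, batch_size, hidden_size)
h_0 = torch.zeros(num_layers, batch_size, hidden_size)

output, h_n = rnn(input, h_0)
print(output.shape)  # torch.Size([2, 3, 3]) -> (batch, seq, hidden_size)
print(h_n.shape)     # torch.Size([2, 2, 3]) -> (num_layers, batch, hidden_size)
```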
Summary: rnn = nn.RNN(input_size=4, hidden_size=3, num_layers=2, batch_first=True, bidirectional=True); input = torch.randn(1, 5, 4); output, h_n = rnn(input); prin… Read more
posted @ 2022-04-25 09:04 华小电 Views(313) Comments(0) Recommended(0)
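Completing that truncated snippet as a sketch (the print statements are an assumption about where the excerpt breaks off), the bidirectional output shapes come out as follows:

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=4, hidden_size=3, num_layers=2,
             batch_first=True, bidirectional=True)

input = torch.randn(1, 5, 4)   # (batch, seq, feature)
output, h_n = rnn(input)

# With bidirectional=True the feature dimension of output is 2 * hidden_size,
# and h_n stacks num_layers * num_directions hidden states.
print(output.shape)  # torch.Size([1, 5, 6])
print(h_n.shape)     # torch.Size([4, 1, 3])
```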
Summary: batch_first – If True, then the input and output tensors are provided as (batch, seq, feature) instead of (seq, batch, feature). Note that this does n… Read more
posted @ 2022-04-25 08:49 华小电 Views(637) Comments(0) Recommended(0)
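The note the excerpt cuts off at is that batch_first only changes the input/output layout, not the hidden-state layout. A quick illustration:

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=4, hidden_size=3, num_layers=1, batch_first=True)

x = torch.randn(2, 5, 4)   # (batch, seq, feature) because batch_first=True
output, h_n = rnn(x)

print(output.shape)  # torch.Size([2, 5, 3]) -- batch dimension comes first
print(h_n.shape)     # torch.Size([1, 2, 3]) -- still (num_layers, batch, hidden_size)
```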
Summary: # Define LSTM class Lstm(nn.Module): def __init__(self, input_size, hidden_size=2, output_size=1, num_layers=1): super().__init__() self.layer1 = nn.L… Read more
posted @ 2022-04-07 20:18 华小电 Views(27) Comments(0) Recommended(0)
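The class definition is cut off after `self.layer1 = nn.L`. A plausible completion, assuming the usual LSTM-plus-linear-head pattern suggested by the constructor signature (the layer names beyond `layer1` and the forward logic are assumptions, not the original post's code):

```python
import torch
import torch.nn as nn

# Sketch of the truncated class under the assumptions stated above
class Lstm(nn.Module):
    def __init__(self, input_size, hidden_size=2, output_size=1, num_layers=1):
        super().__init__()
        self.layer1 = nn.LSTM(input_size, hidden_size, num_layers, batch_first=True)
        self.layer2 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        # x: (batch, seq, input_size)
        out, _ = self.layer1(x)   # out: (batch, seq, hidden_size)
        return self.layer2(out)   # (batch, seq, output_size)

model = Lstm(input_size=3)
y = model(torch.randn(4, 10, 3))
print(y.shape)  # torch.Size([4, 10, 1])
```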
