April 2022 Archive
Abstract: batch_size, seq, num_layers = 2, 3, 2 input_size, hidden_size = 2, 3 input = torch.randn(batch_size, seq, input_size) h_0 = torch.zeros(num_layers, batc…
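The abstract above is cut off mid-line. A minimal runnable sketch consistent with the variables it shows, assuming the standard `(num_layers, batch, hidden_size)` shape for `h_0` and a `batch_first=True` RNN (the construction itself is not visible in the truncated abstract):

```python
import torch
import torch.nn as nn

# Dimensions taken verbatim from the abstract.
batch_size, seq, num_layers = 2, 3, 2
input_size, hidden_size = 2, 3

input = torch.randn(batch_size, seq, input_size)
# Initial hidden state: (num_layers, batch, hidden_size) -- assumed
# completion of the truncated torch.zeros(...) call.
h_0 = torch.zeros(num_layers, batch_size, hidden_size)

rnn = nn.RNN(input_size=input_size, hidden_size=hidden_size,
             num_layers=num_layers, batch_first=True)
output, h_n = rnn(input, h_0)

print(output.shape)  # torch.Size([2, 3, 3]) -> (batch, seq, hidden_size)
print(h_n.shape)     # torch.Size([2, 2, 3]) -> (num_layers, batch, hidden_size)
```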
Abstract: rnn = nn.RNN(input_size=4, hidden_size=3, num_layers=2, batch_first=True, bidirectional=True) input = torch.randn(1,5,4) output, h_n = rnn(input) prin…
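This abstract is also truncated before the `print` calls. Completing it with the shapes a bidirectional, two-layer `nn.RNN` actually produces:

```python
import torch
import torch.nn as nn

# Construction and input taken verbatim from the abstract.
rnn = nn.RNN(input_size=4, hidden_size=3, num_layers=2,
             batch_first=True, bidirectional=True)
input = torch.randn(1, 5, 4)        # (batch, seq, feature)
output, h_n = rnn(input)

# With bidirectional=True the feature dimension of output doubles,
# and h_n stacks num_layers * 2 directional hidden states.
print(output.shape)  # torch.Size([1, 5, 6]) -> (batch, seq, 2 * hidden_size)
print(h_n.shape)     # torch.Size([4, 1, 3]) -> (num_layers * 2, batch, hidden_size)
```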
Abstract: batch_first – If True, then the input and output tensors are provided as (batch, seq, feature) instead of (seq, batch, feature). Note that this does n…
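A short sketch of the `batch_first` behaviour the abstract quotes: the flag changes the layout of `input` and `output`, but the returned hidden state keeps the `(num_layers, batch, hidden_size)` layout either way. The module sizes below are illustrative, not from the original post:

```python
import torch
import torch.nn as nn

x = torch.randn(2, 5, 4)                       # (batch, seq, feature)

rnn_bf = nn.RNN(input_size=4, hidden_size=3, batch_first=True)
out_bf, h_bf = rnn_bf(x)

rnn_sf = nn.RNN(input_size=4, hidden_size=3)   # default: seq-first
out_sf, h_sf = rnn_sf(x.transpose(0, 1))       # (seq, batch, feature)

print(out_bf.shape)  # torch.Size([2, 5, 3]) -> (batch, seq, hidden_size)
print(out_sf.shape)  # torch.Size([5, 2, 3]) -> (seq, batch, hidden_size)
# h_n is unaffected by batch_first: (num_layers, batch, hidden_size)
print(h_bf.shape)    # torch.Size([1, 2, 3])
print(h_sf.shape)    # torch.Size([1, 2, 3])
```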
Abstract: # Define LSTM class Lstm(nn.Module): def __init__(self, input_size, hidden_size=2, output_size=1, num_layers=1): super().__init__() self.layer1 = nn.L…
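The class definition above breaks off at `self.layer1 = nn.L…`. A plausible, runnable completion assuming `layer1` is an `nn.LSTM` followed by a linear head (everything past the truncation point, including the `forward` pass, is an assumption, not the original post's code):

```python
import torch
import torch.nn as nn

# Define LSTM
class Lstm(nn.Module):
    def __init__(self, input_size, hidden_size=2, output_size=1, num_layers=1):
        super().__init__()
        # layer1 follows the abstract; the linear head and forward
        # method below are assumed completions of the truncated post.
        self.layer1 = nn.LSTM(input_size, hidden_size, num_layers,
                              batch_first=True)
        self.layer2 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        out, (h_n, c_n) = self.layer1(x)  # out: (batch, seq, hidden_size)
        return self.layer2(out)           # (batch, seq, output_size)

model = Lstm(input_size=4)
y = model(torch.randn(8, 5, 4))
print(y.shape)  # torch.Size([8, 5, 1])
```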