Working with Sequences

A few questions about the MLP:

def get_net():
    net = nn.Sequential(nn.Linear(4, 10), nn.ReLU(), nn.Linear(10, 1))
    net.apply(init_weights)
    return net
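Note that `init_weights` is not shown in this snippet. A minimal sketch of what it might look like (my own assumption: Xavier-uniform initialization for the linear layers; the chapter's actual initializer may differ):

```python
import torch
from torch import nn

def init_weights(module):
    # Hypothetical sketch: Xavier-uniform init for every nn.Linear.
    # net.apply(init_weights) calls this on each submodule in turn.
    if isinstance(module, nn.Linear):
        nn.init.xavier_uniform_(module.weight)
        nn.init.zeros_(module.bias)

net = nn.Sequential(nn.Linear(4, 10), nn.ReLU(), nn.Linear(10, 1))
net.apply(init_weights)
print(net(torch.randn(2, 4)).shape)  # torch.Size([2, 1])
```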
  1. Is it correct to call this multilayer perceptron an RNN? Or does calling something an RNN depend only on it having a sliding-window training and label set?

  2. tau is 4 in this case, correct? What do the two 10s mean contextually?
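On question 2, a quick shape check may help (my own sketch, not chapter code): the 4 matches tau, the window of past observations fed in as features, and the two 10s are the output width of the first linear layer and the input width of the second, i.e. the size of the single hidden layer.

```python
import torch
from torch import nn

tau = 4                  # window length = input features per sample
hidden = 10              # width of the single hidden layer
net = nn.Sequential(nn.Linear(tau, hidden), nn.ReLU(), nn.Linear(hidden, 1))

x = torch.randn(3, tau)  # 3 windows of tau past observations
h = net[0](x)            # first Linear: (3, 4) -> (3, 10)
y = net(x)               # full net:     (3, 4) -> (3, 1)
print(h.shape, y.shape)  # torch.Size([3, 10]) torch.Size([3, 1])
```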

A few questions about the max-steps section:

  1. Are you predicting a sequence of length step size, or are you shifting each window by the step size?
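For what it's worth, here is how I would illustrate one of the two readings (my own sketch, not chapter code, and an assumption about the intended semantics): the window can still shift by one position each time, with only the label moving k steps past the end of the window.

```python
import torch

T, tau, k = 12, 4, 3
x = torch.arange(T, dtype=torch.float32)

# Windows shift by 1; each label is the value k steps after the
# window's last observation x[t + tau - 1], i.e. x[t + tau + k - 1].
feats = torch.stack([x[i : T - tau - k + 1 + i] for i in range(tau)], 1)
labels = x[tau + k - 1 :]
print(feats.shape, labels.shape)  # torch.Size([6, 4]) torch.Size([6])
print(feats[0], labels[0])        # window [0,1,2,3] -> label 6.0
```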

I'm confused about this code in Chapter 9.1. If I understand correctly, our features should be T - tau fragments, each of length tau; why does the code here build tau fragments, each of length T - tau?

def get_dataloader(self, train):
    features = [self.x[i : self.T-self.tau+i] for i in range(self.tau)]
    self.features = torch.stack(features, 1)
    self.labels = self.x[self.tau:].reshape((-1, 1))
    i = slice(0, self.num_train) if train else slice(self.num_train, None)
    return self.get_tensorloader([self.features, self.labels], train, i)
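A small reproduction (my own sketch) may resolve the confusion: the list comprehension does build tau tensors of length T - tau, but each one is a *column* holding one time offset. Stacking them along dim=1 turns the tau columns into T - tau rows, so each row of `features` is one tau-length window after all.

```python
import torch

T, tau = 10, 4
x = torch.arange(T, dtype=torch.float32)

# Each list entry is a column of length T - tau (one offset i);
# stacking along dim=1 yields T - tau rows of length tau,
# each row being one window [x_t, ..., x_{t+tau-1}].
features = torch.stack([x[i : T - tau + i] for i in range(tau)], 1)
labels = x[tau:].reshape(-1, 1)
print(features.shape)  # torch.Size([6, 4])
print(features[0])     # tensor([0., 1., 2., 3.])
print(labels[0])       # tensor([4.])
```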

My solutions to the exercises of 9.1:


torch.stack converts the features list, which holds 4 tensors of length 996, into a single tensor of shape (996, 4), i.e. 996 samples with 4 features each.
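This is easy to verify directly (a quick sketch with T = 1000 and tau = 4, matching those sizes):

```python
import torch

T, tau = 1000, 4
x = torch.randn(T)
cols = [x[i : T - tau + i] for i in range(tau)]  # list of 4 tensors, each length 996
stacked = torch.stack(cols, 1)                   # stack along dim=1 -> (996, 4)
print(len(cols), cols[0].shape)  # 4 torch.Size([996])
print(stacked.shape)             # torch.Size([996, 4])
```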