Object-Oriented Design for Implementation

https://d2l.ai/chapter_linear-regression/oo-design.html

Hello everyone, I am trying to run the code in this chapter on Colab, but it always fails with an error. Of course, I have already installed the d2l package with !pip install d2l==1.0.0-alpha0.

Please help! Thanks.

This is because of a recent update in Colab, I guess; it was working before, but now you need to install matplotlib_inline manually. ModuleNotFoundError when running the official pytorch colab notebook · Issue #2250 · d2l-ai/d2l-en · GitHub tracks the problem, and we will make a patch release soon that fixes it by depending on matplotlib_inline.

For now you can pip install matplotlib_inline to fix the bug.


For those who have issues with the d2l library, try this command and then restart the kernel:

pip install --upgrade d2l==1.0.0a0

There is no need to restart the kernel after the recent release.

Just use d2l==1.0.0-alpha1.post0

Here are my answers to the exercises:
ex.1
Use ? or ?? in a Jupyter notebook.

ex.2
We can't access attributes a and b without calling save_hyperparameters().
Running ??d2l.HyperParameters.save_hyperparameters gives:

    def save_hyperparameters(self, ignore=[]):
        """Save function arguments into class attributes.
    
        Defined in :numref:`sec_utils`"""
        frame = inspect.currentframe().f_back
        _, _, _, local_vars = inspect.getargvalues(frame)
        self.hparams = {k:v for k, v in local_vars.items()
                        if k not in set(ignore+['self']) and not k.startswith('_')}
        for k, v in self.hparams.items():
            setattr(self, k, v)

As shown in the definition of this method, it first collects the caller's arguments into self.hparams (key k, value v), then uses setattr() to attach each one as an attribute of B.
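To see the mechanism in action, here is a self-contained sketch: a minimal stand-in for d2l.HyperParameters (copying only the method shown above), plus a hypothetical subclass B whose __init__ just calls save_hyperparameters():

```python
import inspect

class HyperParameters:
    """Minimal sketch of d2l.HyperParameters (only the method shown above)."""
    def save_hyperparameters(self, ignore=[]):
        # f_back is the caller's frame, i.e. the __init__ that invoked us,
        # so local_vars holds the arguments that were passed to __init__
        frame = inspect.currentframe().f_back
        _, _, _, local_vars = inspect.getargvalues(frame)
        self.hparams = {k: v for k, v in local_vars.items()
                        if k not in set(ignore + ['self']) and not k.startswith('_')}
        for k, v in self.hparams.items():
            setattr(self, k, v)

class B(HyperParameters):  # hypothetical class for illustration
    def __init__(self, a, b, c):
        self.save_hyperparameters(ignore=['c'])

b = B(a=1, b=2, c=3)
print(b.a, b.b)          # 1 2 -- a and b became attributes automatically
print(hasattr(b, 'c'))   # False -- 'c' was excluded via ignore
```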


Can someone explain why we can't access attributes a and b without calling save_hyperparameters()?
Is it because the attributes a and b have never been set in __init__, whereas when we call save_hyperparameters(), this loop (shown below) sets them?

    for k, v in self.hparams.items():
        setattr(self, k, v)

As you said, without the loop in save_hyperparameters, the attributes are never initialized.
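A tiny demonstration of the point, using hypothetical classes (not from d2l): if __init__ never stores its arguments, they are gone once it returns, while setattr is exactly what makes them accessible later:

```python
class WithoutSave:  # hypothetical: arguments are received but never stored
    def __init__(self, a, b):
        pass  # a and b vanish when __init__ returns

class WithSave:     # hypothetical: does per-argument what the loop above does
    def __init__(self, a, b):
        setattr(self, 'a', a)  # equivalent to self.a = a
        setattr(self, 'b', b)

x = WithoutSave(1, 2)
print(hasattr(x, 'a'))  # False -- accessing x.a would raise AttributeError

y = WithSave(1, 2)
print(y.a, y.b)         # 1 2
```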

Would love it if someone could enlighten me on how this piece of code works (part of the Trainer's fit function):

        l = self.loss(self(*batch[:-1]), batch[-1])

How come we can call self()? And what is *batch[:-1]? What does the * mean?
Thank you.

Calling self(X) invokes nn.Module.__call__, which (after running any hooks) calls module.forward(X), so self(X) behaves the same as self.forward(X); see https://stackoverflow.com/questions/73991158/pytorch-lightning-whats-the-meaning-of-calling-self
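Both pieces can be illustrated without PyTorch. Below is a simplified, pure-Python sketch (the real nn.Module.__call__ also runs hooks): __call__ is what makes an instance callable and it dispatches to forward, and the * operator unpacks a sequence into positional arguments:

```python
class Module:
    """Simplified sketch of nn.Module's callable behavior (no hooks)."""
    def __call__(self, *args, **kwargs):
        return self.forward(*args, **kwargs)

class Scale(Module):  # hypothetical module for illustration
    def __init__(self, factor):
        self.factor = factor
    def forward(self, x):
        return self.factor * x

net = Scale(3)
print(net(10) == net.forward(10))  # True -- self(X) routes through __call__

# A batch is laid out as (feature_1, ..., feature_n, label), so batch[:-1]
# is everything except the label, and * spreads it as positional arguments:
batch = (10, 99)            # one feature, then the label
pred = net(*batch[:-1])     # same as net(batch[0]) here
label = batch[-1]
print(pred, label)          # 30 99
```

So in the Trainer line above, self(*batch[:-1]) runs the forward pass on all inputs in the batch, and batch[-1] is the target passed to the loss.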
