Residual Networks (ResNet) and ResNeXt

http://d2l.ai/chapter_convolutional-modern/resnet.html

I'm not sure if you meant to say “Why can we not increase” vs “Why cannot we” for exercise 5 of this section. :slight_smile:

It’s “Why can’t we” now :grinning:

Awesome :muscle: :muscle:

That's an interesting question.
My take is that we cannot just increase the complexity of the function class, because we cannot be sure the change moves us in the right direction (i.e., we cannot be sure that a larger class brings us closer to the optimal function). As a result, we might end up seriously overfitting the data or learning arbitrary patterns that do not make sense for the given problem.
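To make this a bit more concrete, here is a sketch using roughly the chapter's notation (take the symbols $L$, $X$, $Y$, $\mathcal{F}_i$ as borrowed from there, not as a new result):

```latex
% Best achievable function within a class \mathcal{F}, for loss L on data (X, Y):
f^*_{\mathcal{F}} = \operatorname*{argmin}_{f \in \mathcal{F}} L(X, Y, f)

% If the classes are nested, \mathcal{F}_1 \subseteq \cdots \subseteq \mathcal{F}_n,
% enlarging the class can only help (or at least not hurt) on the training objective:
L(X, Y, f^*_{\mathcal{F}_1}) \ge L(X, Y, f^*_{\mathcal{F}_2}) \ge \cdots \ge L(X, Y, f^*_{\mathcal{F}_n})

% Without nesting, there is no such guarantee: the "bigger" class's minimizer
% f^*_{\mathcal{F}_{i+1}} may actually be further from the target function than f^*_{\mathcal{F}_i}.
```

That is why ResNet designs the layers so that each added block can at least represent the identity, which keeps the function classes nested.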


The residual block can be seen as an improved block built from the first two paths of the Inception block.
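For reference, here is a minimal PyTorch sketch of such a residual block, loosely along the lines of the chapter's `Residual` class (the channel sizes and the optional 1x1 shortcut conv below are just illustrative):

```python
import torch
from torch import nn
from torch.nn import functional as F

class Residual(nn.Module):
    """A 3x3 conv main path plus an identity (or 1x1 conv) shortcut."""
    def __init__(self, in_channels, out_channels, use_1x1conv=False, strides=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, out_channels, kernel_size=3,
                               padding=1, stride=strides)
        self.conv2 = nn.Conv2d(out_channels, out_channels, kernel_size=3,
                               padding=1)
        # Optional 1x1 conv on the shortcut to match shapes when the main
        # path changes the number of channels or the spatial size.
        self.conv3 = (nn.Conv2d(in_channels, out_channels, kernel_size=1,
                                stride=strides) if use_1x1conv else None)
        self.bn1 = nn.BatchNorm2d(out_channels)
        self.bn2 = nn.BatchNorm2d(out_channels)

    def forward(self, X):
        Y = F.relu(self.bn1(self.conv1(X)))
        Y = self.bn2(self.conv2(Y))
        if self.conv3:
            X = self.conv3(X)
        return F.relu(Y + X)  # add the shortcut, then apply the activation

# Shape check: with an identity shortcut the input shape is preserved.
X = torch.rand(4, 3, 6, 6)
print(Residual(3, 3)(X).shape)  # torch.Size([4, 3, 6, 6])
```

Unlike Inception, the two paths are summed rather than concatenated, and the shortcut path is kept (near-)identity so the block only has to learn the residual.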