In net.initialize(), we did not pass an initializer, which means we use the default, init.Uniform(scale=0.07). On the other hand, we assume the distribution of the data is Gaussian, so why don't we use init.Normal() to initialize the parameters? Correct me if I am wrong: is there a reason, or am I missing something important about the distribution of the data? Thank you.
You are right about that.
Check here: http://preview.d2l.ai/d2l-en/master/chapter_generative-adversarial-networks/gan.html
We can use init.Normal() to initialize the parameters.
Check my answer…
You mean that the learned weights should follow a Gaussian distribution?
I think the weights will eventually fit the training set even if they are initialized with zeros; that's the point of "self-learning" in ML. So it's also viable to initialize with a Gaussian. Note that the data here is only assumed to be i.i.d., and it is the noise that is assumed to be Gaussian.
In other words, you can use the default initialization instead of Normal to initialize the weights in the MLP sections (I did that too).
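To make the difference concrete, here is a small NumPy sketch of what the two initializers draw (an assumption: it mimics init.Uniform(scale=0.07), i.e. U(-0.07, 0.07), and init.Normal(sigma=0.01), i.e. N(0, 0.01²), with sigma=0.01 chosen only as an illustrative value; the layer shape is also made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (256, 784)  # hypothetical hidden-layer weight shape for an MLP

# Mimic the default init.Uniform(scale=0.07):
# every weight is drawn from U(-0.07, 0.07)
w_uniform = rng.uniform(-0.07, 0.07, size=shape)

# Mimic init.Normal(sigma=0.01):
# every weight is drawn from N(0, 0.01^2)
w_normal = rng.normal(0.0, 0.01, size=shape)

# Both are small, zero-centered random values; only the shape
# of the distribution differs (bounded flat vs. unbounded bell).
print(w_uniform.min(), w_uniform.max())   # stays inside [-0.07, 0.07]
print(w_normal.mean(), w_normal.std())    # mean near 0, std near 0.01
```

In Gluon the Normal variant would be written as net.initialize(init.Normal(sigma=0.01)). Since both choices just break symmetry with small zero-centered values, training converges to essentially the same solution either way, which is why the default Uniform is fine even under a Gaussian noise assumption.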