TransWikia.com

Generative network understanding

Data Science Asked by thanatoz on June 10, 2021

I was going through the GAN notebook by fchollet on Generative Adversarial Networks where, in the Generator network, he creates a Dense layer with $16 \times 16 \times 128$ units (where 128 is the number of channels).

  • How exactly does latent_dim=32 become of shape $16 \times 16$ in the network?

  • How are these values decided?


One Answer

  • latent_dim does not become of shape 16*16. The line

x = layers.Dense(128 * 16 * 16)(generator_input)

means: the input of size 32 (the latent_dim) is fully connected to a layer of size 16*16*128, which is then reshaped into a 16x16 feature map with 128 channels.

  • These values are decided by the data scientist; they are hyper-parameters. There is a lot of research on how to tune hyper-parameters for GANs. People generally use a size of 100 for latent_dim, but it may depend on the complexity of the images you are trying to generate.
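To make the shape transformation concrete, here is a minimal NumPy sketch (not the notebook's actual Keras code) of what a Dense(128 * 16 * 16) layer followed by a reshape does to a 32-dimensional latent vector. The weight matrix `W` and the random inputs are illustrative stand-ins for the learned parameters:

```python
import numpy as np

latent_dim = 32   # size of the random latent vector fed to the generator
channels = 128    # number of feature-map channels chosen in the notebook

# A Dense(128 * 16 * 16) layer is essentially a learned matrix multiply:
# it maps the 32-dim latent vector to a 32768-dim vector.
W = np.random.randn(latent_dim, channels * 16 * 16)
z = np.random.randn(latent_dim)   # one latent sample
x = z @ W                         # shape: (32768,)

# The flat vector is then reshaped into a 16x16 feature map with
# 128 channels, which later transposed convolutions upsample further.
feature_map = x.reshape(16, 16, channels)
print(feature_map.shape)          # (16, 16, 128)
```

So the latent vector never "becomes" 16x16 by itself; the Dense layer expands it to 16*16*128 values, and the reshape only reinterprets that flat vector as a spatial grid.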

Answered by vico on June 10, 2021
