
Adding a new block/inputs to a non-sequential network

I am designing a progressive GAN and have been stuck on an issue for a couple of days now. I have successfully made my generator grow, but increasing the size of my discriminator is not as easy. In my discriminator, I decided to try implementing an ADA layer (like in the generator in StyleGAN3); however, I have so far been unsuccessful in connecting the old layers to the input from a new block. The main problem is the non-sequential nature of the discriminator, since I need to feed multiple inputs into the multiplication and addition layers. Below is the code that constructs my discriminator; I believe my code for adding a block to the discriminator is wholly non-functional, so it is not included.

import tensorflow as tf

def construct_disc(label_dim=50):
    # Kernel initializer
    init = tf.keras.initializers.HeUniform(seed=1)

    # Discriminator inputs: an 8x8 RGB image and a label vector
    im_in = tf.keras.layers.Input(shape=(8, 8, 3))
    lab_in = tf.keras.layers.Input(shape=(label_dim,))

    # Style vector describer: maps the label to a style code
    D = tf.keras.layers.Dense(3 * 3 * 3,
                              activation=tf.keras.layers.LeakyReLU(alpha=0.2),
                              kernel_initializer=init)(lab_in)
    D = tf.keras.layers.Dense(3 * 3 * 3,
                              activation=tf.keras.layers.LeakyReLU(alpha=0.2),
                              kernel_initializer=init)(D)
    D = tf.keras.layers.Dense(3 * 3 * 3,
                              activation=tf.keras.layers.LeakyReLU(alpha=0.2),
                              kernel_initializer=init)(D)

    # Conv block begins
    G = tf.keras.layers.Conv2D(128, 1, padding='same',
                               activation=tf.keras.layers.LeakyReLU(alpha=0.2),
                               kernel_initializer=init)(im_in)
    G = tf.keras.layers.Conv2D(128, 3, padding='same',
                               activation=tf.keras.layers.LeakyReLU(alpha=0.2),
                               kernel_initializer=init)(G)

    # Dense style interpreter (part of the block): a per-sample scale (W)
    # and bias (B), broadcast over the feature map
    W = tf.keras.layers.Dense(1)(D)
    W = W[:, :, tf.newaxis, tf.newaxis]
    B = tf.keras.layers.Dense(1)(D)
    B = B[:, :, tf.newaxis, tf.newaxis]
    G = tf.math.multiply(W, G)
    G = tf.add(G, B)

    G = tf.keras.layers.Conv2D(128, 3, padding='same',
                               activation=tf.keras.layers.LeakyReLU(alpha=0.2),
                               kernel_initializer=init)(G)
    # Block ends here

    G = tf.keras.layers.AveragePooling2D(2)(G)
    G = tf.keras.layers.Conv2D(128, 3, padding='same',
                               activation=tf.keras.layers.LeakyReLU(alpha=0.2),
                               kernel_initializer=init)(G)
    G = tf.keras.layers.Flatten()(G)
    out = tf.keras.layers.Dense(1, activation='sigmoid',
                                kernel_initializer=init)(G)

    model = tf.keras.Model([im_in, lab_in], out)

    # Compile the model; the output is already a sigmoid probability,
    # so the loss must be computed with from_logits=False
    opt = tf.keras.optimizers.Adam(learning_rate=0.0002, beta_1=0.5)
    model.compile(loss=tf.keras.losses.BinaryCrossentropy(from_logits=False),
                  optimizer=opt)
    return model
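One way to wire a new, higher-resolution block into a non-sequential graph like this is to exploit the fact that Keras layers are callable objects that keep their trained weights: give each layer in construct_disc an explicit name= argument, fetch it from the old model with get_layer(), and call it on new tensors, rebuilding the multiply and add connections by hand. The following is a minimal sketch under those assumptions, not the poster's own growth code: the helper name grow_disc and every layer name it looks up ('style_1' through 'style_3', 'mod_w', 'mod_b', 'block_conv1', 'block_conv2', 'tail_conv', 'd_out') are hypothetical and would have to be added to construct_disc first. It grows the discriminator from 8x8 to 16x16 and discards the old 1x1 fromRGB convolution, whose 3-channel input no longer matches the 128-channel features produced by the new block.

import tensorflow as tf

def grow_disc(old_model, label_dim=50):
    # Hypothetical helper: prepend a 16x16 block to the 8x8 discriminator.
    # The layer names below are assumptions and must match the name=
    # arguments given to the corresponding layers in construct_disc.
    init = tf.keras.initializers.HeUniform(seed=1)

    # New, larger inputs
    im_in = tf.keras.layers.Input(shape=(16, 16, 3))
    lab_in = tf.keras.layers.Input(shape=(label_dim,))

    # New fromRGB and conv at the higher resolution, then pool back to 8x8
    G = tf.keras.layers.Conv2D(128, 1, padding='same',
                               activation=tf.keras.layers.LeakyReLU(alpha=0.2),
                               kernel_initializer=init)(im_in)
    G = tf.keras.layers.Conv2D(128, 3, padding='same',
                               activation=tf.keras.layers.LeakyReLU(alpha=0.2),
                               kernel_initializer=init)(G)
    G = tf.keras.layers.AveragePooling2D(2)(G)

    # Re-wire the old style pathway: calling a layer fetched with
    # get_layer() on a new tensor reuses its trained weights
    D = lab_in
    for name in ('style_1', 'style_2', 'style_3'):
        D = old_model.get_layer(name)(D)
    W = old_model.get_layer('mod_w')(D)[:, :, tf.newaxis, tf.newaxis]
    B = old_model.get_layer('mod_b')(D)[:, :, tf.newaxis, tf.newaxis]

    # Reconnect the rest of the old block by hand, skipping the old
    # fromRGB conv (the new block already outputs 128 channels)
    G = old_model.get_layer('block_conv1')(G)
    G = tf.math.multiply(W, G)
    G = tf.add(G, B)
    G = old_model.get_layer('block_conv2')(G)

    G = tf.keras.layers.AveragePooling2D(2)(G)
    G = old_model.get_layer('tail_conv')(G)
    G = tf.keras.layers.Flatten()(G)
    out = old_model.get_layer('d_out')(G)

    model = tf.keras.Model([im_in, lab_in], out)
    opt = tf.keras.optimizers.Adam(learning_rate=0.0002, beta_1=0.5)
    model.compile(loss=tf.keras.losses.BinaryCrossentropy(from_logits=False),
                  optimizer=opt)
    return model

In a full progressive setup the new block would usually get its own modulation Dense layers and be faded in through a weighted skip connection from a pooled copy of the image, but the manual re-wiring above is the part that the non-sequential graph makes hard to automate with a simple loop over model.layers.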

submitted by /u/Yo1up
