
My network was outputting the same value for every input, so I added BatchNormalization() as the final layer of the model, and now the output actually varies with the input. I feel like I shouldn't be doing this, but I don't know why. Does anyone know if this is OK?

Adding batch norm to the layer before it doesn't help either.
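
To show what I mean by "the same value for every input", this is roughly the check I'm running (a minimal sketch; `x_batch` stands in for a batch of my real inputs and `model` is the model built below):

```python
import numpy as np

# If the model has collapsed to a constant output, predictions
# across a varied batch will have (near) zero spread.
preds = model.predict(x_batch)  # x_batch: [batch, timesteps, features]
print("min/max/std:", preds.min(), preds.max(), preds.std())
```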

```python
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, BatchNormalization, Dropout


def build_model(x_shape):
    model = Sequential()

    # inputs: a 3D tensor with shape [batch, timesteps, feature]
    # https://keras.io/api/layers/recurrent_layers/lstm/
    dropout = 0.1
    recurrent_dropout = 0  # must stay 0 to keep the cuDNN kernel
    model.add(LSTM(25, input_shape=x_shape[1:], activation='tanh',
                   return_sequences=True, dropout=dropout,
                   recurrent_dropout=recurrent_dropout))
    # model.add(BatchNormalization())  # (A)
    model.add(LSTM(15, activation='tanh', return_sequences=False,
                   dropout=dropout, recurrent_dropout=recurrent_dropout))
    # model.add(BatchNormalization())  # (B)
    model.add(Dense(10, activation='gelu'))
    model.add(BatchNormalization())  # (C)
    # model.add(Dropout(0.1))
    model.add(Dense(1, activation='sigmoid'))
    # model.add(BatchNormalization())  # (D)

    loss = tf.keras.losses.MeanAbsoluteError()
    model.compile(loss=loss,
                  metrics=['accuracy', tf.keras.metrics.MeanAbsoluteError()],
                  optimizer=tf.keras.optimizers.Adam())
                  # lr=LEARN_RATE_LSTM, decay=LEARN_RATE_LSTM_DECAY
    return model
```
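
For completeness, this is roughly how the function gets called; the shape here is just an illustration, my real `x` is a [batch, timesteps, features] array:

```python
import numpy as np

# Illustrative only: 32 samples, 50 timesteps, 8 features.
x = np.random.rand(32, 50, 8).astype('float32')
model = build_model(x.shape)
model.summary()
```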

submitted by /u/Abradolf--Lincler
