How can I add weight decay to the optimizer (e.g. Adam) in the TensorFlow Object Detection API?
When setting the optimizer in the pipeline config, the options are:
optimizer {
  adam_optimizer: {
    epsilon: 1e-7  # Match tf.keras.optimizers.Adam's default.
    learning_rate: {
      manual_step_learning_rate {
        initial_learning_rate: 1e-3
        schedule {
          step: 90000
          learning_rate: 1e-4
        }
        schedule {
          step: 120000
          learning_rate: 1e-5
        }
      }
    }
  }
}
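Note that the `adam_optimizer` message in the Object Detection API's optimizer proto does not expose a weight-decay field. In that API, weight decay is conventionally applied as L2 regularization inside the model's `hyperparams` block rather than on the optimizer. A sketch of what that looks like in a typical SSD pipeline config (the `weight` value here is illustrative, not a recommendation):

```
model {
  ssd {
    box_predictor {
      convolutional_box_predictor {
        conv_hyperparams {
          regularizer {
            l2_regularizer {
              weight: 0.00004  # Illustrative L2 weight; tune for your model.
            }
          }
        }
      }
    }
  }
}
```

Be aware that L2 regularization and decoupled weight decay (as in AdamW) are not mathematically equivalent for adaptive optimizers like Adam; if you need true decoupled weight decay you would have to use an optimizer that implements it, such as `tf.keras.optimizers.AdamW`, outside the stock config options.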
submitted by /u/giakou4