
BatchNormalization layer is causing ValueError: tf.function only supports singleton tf.Variables created on the first call

I’m training a deep-and-wide model whose convolutional side is built from Inception blocks. I needed to add some BatchNormalization layers to stop exploding gradients, and now I get a ValueError pointing at the BatchNormalization layer creating multiple variables. I can’t find anyone else with this problem, so I don’t know what’s causing it. If I switch to eager mode, the error doesn’t come up during training, but it then prevents me from saving the model. Any ideas on what’s causing this?
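For reference, one pattern known to trigger this exact ValueError (though not necessarily what’s happening here) is instantiating a layer inside `call()`, so that fresh `tf.Variable`s are created every time `tf.function` retraces. A minimal sketch of the broken pattern and its fix, with hypothetical class names:

```python
import tensorflow as tf

class BrokenBlock(tf.keras.layers.Layer):
    """BUG: a new BatchNormalization layer (and new variables)
    is created on every call, which violates tf.function's
    singleton-variable requirement."""
    def call(self, x):
        bn = tf.keras.layers.BatchNormalization()
        return bn(x)

class FixedBlock(tf.keras.layers.Layer):
    """Fix: create the layer once in __init__ so its variables
    are created exactly once, on the first call."""
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.bn = tf.keras.layers.BatchNormalization()

    def call(self, x, training=False):
        return self.bn(x, training=training)
```

Running in eager mode masks the problem because no tracing occurs, which matches the reported behavior of the error only resurfacing at save time.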

submitted by /u/SaveShark
