I have a piece of code that I believe was written for TF2, but based on a repo written in TF1.
I am trying to run it in TF 1.15.
It explicitly invokes a piece of LSTM code that causes an error (unknown parameter “scope”):
net = tf.keras.layers.LSTM(32, return_sequences=True, dropout=0.4, recurrent_dropout=0.4)(net, scope='lstm1', training=is_training)
net = tf.keras.layers.LSTM(32, dropout=0.4, recurrent_dropout=0.4)(net, scope='lstm2', training=is_training)
All of the other layers have their scope parameter handled as part of a custom layer definition (with tf.variable_scope(scope, reuse=reuse_weights) as sc).
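For reference, the pattern those layers seem to follow is roughly this (the dense_layer name and the dense layer inside are placeholders I'm using to illustrate the shape of it, not the repo's actual code):

import tensorflow as tf  # TF 1.x, graph mode

def dense_layer(net, units, scope, reuse_weights=False):
    # variables created inside this block end up under `scope`
    with tf.variable_scope(scope, reuse=reuse_weights) as sc:
        return tf.layers.dense(net, units)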
If I simply drop the scope param from the LSTM calls, the kernel fails. I believe the problem is the lack of a custom layer definition for the LSTM layers with the scope defined accordingly, but I'm not totally sure; a sketch of what I'm considering is below.
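What I'm thinking of trying is removing the unsupported scope kwarg from the call and wrapping the Keras LSTM in its own tf.variable_scope, mirroring the other layers. A minimal sketch, assuming a wrapper like this is acceptable (the lstm_layer name and signature are my own, and I haven't verified that the resulting variable names line up with what the rest of the repo expects):

import tensorflow as tf  # TF 1.15, graph mode

def lstm_layer(net, units, scope, is_training, return_sequences=False):
    # Keras layers take no `scope` kwarg, so the scope is opened here instead
    with tf.variable_scope(scope):
        lstm = tf.keras.layers.LSTM(units, return_sequences=return_sequences, dropout=0.4, recurrent_dropout=0.4)
        return lstm(net, training=is_training)

net = lstm_layer(net, 32, 'lstm1', is_training, return_sequences=True)
net = lstm_layer(net, 32, 'lstm2', is_training)

Would that be the right direction, or is there a cleaner way to get scoping for tf.keras.layers.LSTM in TF1?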